I would contest this and say that Europe (Mistral) and other US companies (like Meta, whose Llama series seeded much of what's happening in China now) were chasing ChatGPT very closely before DeepSeek/Alibaba. Even South Korea (LG's EXAONE) and many smaller labs are putting up competition, often building on international work.
Also, locally runnable DeepSeek is nothing like GPT-4. The 32B is smart, but it just doesn't have the world knowledge the full 671B model has, and the 671B isn't practical to run locally.
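For a sense of scale, here's a rough back-of-the-envelope for weight memory alone, assuming 4-bit quantization (my assumption; real runtimes add KV cache, activations, and overhead on top):

    # Rough weight memory at 4-bit quantization (assumption; ignores KV cache and runtime overhead).
    def weight_gib(params_billion: float, bits_per_param: int = 4) -> float:
        return params_billion * 1e9 * bits_per_param / 8 / 2**30

    print(f"32B:  ~{weight_gib(32):.0f} GiB")   # ~15 GiB  -> fits on a single 24 GB consumer GPU
    print(f"671B: ~{weight_gib(671):.0f} GiB")  # ~312 GiB -> multi-GPU server territory

So even before you think about speed, the full model is out of reach for a typical home setup.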
…Sorry for being so nitpicky; I do agree with the sentiment.
That essentially wastes electricity for OpenAI (assuming you aren't paying for the response), and it's just "filler" data for them to train on.