What if you could just hook your AI system up to some other AI system and drain everything? No weights access required. Just train on the raw inputs/outputs.
What stops this from being the future?
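For concreteness, here is a minimal sketch of what that kind of black-box distillation might look like: query a "teacher" model's public API, collect prompt/response pairs, and use them as supervised fine-tuning data for a smaller "student" model. This is only an illustration of the idea, not anyone's actual pipeline; the model name, prompts, and file path are all made up.

```python
# Black-box distillation sketch: no access to the teacher's weights or logits,
# only the text it returns over its public API.
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

prompts = [
    "Explain the difference between a process and a thread.",
    "Write a Python function that reverses a linked list.",
    # ... in practice, many thousands of prompts covering the target domain
]

with open("distill_data.jsonl", "w") as f:
    for prompt in prompts:
        # Ask the teacher model (illustrative name) and keep only its text output.
        resp = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": prompt}],
        )
        answer = resp.choices[0].message.content
        # Each line becomes one supervised example for fine-tuning the student.
        f.write(json.dumps({"prompt": prompt, "completion": answer}) + "\n")
```

The resulting JSONL can then go into any standard instruction-tuning setup (e.g. a Hugging Face SFT pipeline) to train the low-cost student, which is the whole point of the "no moat" worry above.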
It never ends. There's no moat. One day your at-home GPU will be able to distill an entire hyperscaler's worth of expertise.
Does the capital outlay get them anything at all apart from a temporary lead?
You still can't download a car, but your use of the product might end up training a low-cost competitor.
I've had Claude 3.5, Grok 3, and DeepSeek each claim that they were made by OpenAI.
I wonder whether OpenAI has permission from all the authors of the works it "inappropriately distilled". A pirate has no right to complain about the safety of navigation.
I think there should be a system whereby, if country A illegally uses works from country B to develop an AI, it loses copyright protection in country B.
Whistleblower: Huawei cloned Qwen and DeepSeek models, claimed as own - https://news.ycombinator.com/item?id=44482051 - July 2025 (58 comments)
Also:
Huawei Whistleblower Alleges Pangu AI Model Plagiarized from Qwen and DeepSeek - https://news.ycombinator.com/item?id=44506350 - July 2025 (1 comment)
Pangu's Sorrow: The Sorrow and Darkness of Huawei's Noah Pangu LLM R&D Process - https://news.ycombinator.com/item?id=44485458 - July 2025 (2 comments)
Huawei's Pangu Pro MoE model is likely derived from Qwen model - https://news.ycombinator.com/item?id=44461094 - July 2025 (1 comment)
Huawei releases an open weight model trained on Huawei Ascend GPUs - https://news.ycombinator.com/item?id=44441089 - July 2025 (333 comments)