Here's a summary of the themes from the Hacker News discussion:
Open Weights vs. Closed API Strategy
A central point of contention is the decision by Qwen (and potentially other Chinese AI companies) to move away from a fully open-weights strategy towards a closed API or restricted release model. Many users express disappointment, viewing open weights as a key strength and an enabler of further research.
- "It doesn't seem to have open weights, which is unfortunate." - rushingcreek
- "One of Qwen's strengths historically has been their open-weights strategy, and it would have been great to have a true open-weights competitor to 4o's autoregressive image gen." - rushingcreek
- "There are so many interesting research directions that are only possible if we can get access to the weights." - rushingcreek
- "The way they win is to be open. I don't get why China is shutting down open source." - echelon
- "Both Alibaba and Tencent championed open source (Qwen family of models, Hunyuan family of models), but now they've shut off the releases." - echelon
The business rationale for this shift is debated. Some argue that releasing weights freely while also offering a hosted API is a poor business strategy, since competitors can host the same weights at similar operating costs without bearing the initial training expense.
- "The problem with giving away weights for free while also offering a hosted API is that once the weights are out there, anyone else can also offer it as a hosted API with similar operating costs, but only the releasing company had the initial capital outlay of training the model. So everyone else is more profitable! That's not a good business strategy." - yorwba
- "Even for a big cloud provider, putting out model weights and hoping that people host with them is unlikely to be as profitable as gating it behind an API that guarantees that people using the model are using their hosted version. How many people self-hosting Qwen models are doing so on Aliyun?" - yorwba
Others suggest that closing models is a natural progression once a company considers its models competitive, or simply a way to monetize the investment.
- "New entrants may keep releasing weights as a marketing strategy to gain name recognition, but once they have established themselves (and investors start getting antsy about ROI) making subsequent releases closed is the logical next step." - yorwba
- "Yes that I can totally believe. Standard corporation behaviour (Chinese or otherwise)." - natrys (referring to withholding superior models from Western releases)
Concerns About Training Data and Model Quality
There are significant concerns raised about the training data used for Qwen's new models, with several users pointing to the "orange tint" in generated images as evidence of training on OpenAI's outputs. This leads to discussions about the originality of the data and the overall quality compared to existing models.
- "It's also very clearly trained on OAI outputs, which you can tell from the orange tint to the images[0]." - Jackson__
- "Did they even attempt to come up with their own data?" - Jackson__
- "So it is trained off of OAI, as closed off as OAI and most importantly: worse than OAI. What a bizarre strategy to gate-keep this behind an API." - Jackson__
- "What do you mean Tencent just shut off the Hunyuan releases? There was another open weights release just today: [link]" - logicchains (countering the idea of a complete shutdown)
- "They're still clearly training on Western outputs, though." - echelon
The "orange tint" observation becomes a recurring point of debate.
- "Huh, so orange tint = openAI output? Maybe their training process ended up causing the model to prefer that color balance." - vachina
- "Here's an extreme example that shows how it continually adds more orange: [link]" - Jackson__
- "It's really too close to be anything but a model trained on these outputs, the whole vibe just screams OAI." - Jackson__
This also ties into the desire for open weights: users believe access to the weights would let the community investigate potential training-data issues directly.
- "That form of collapse might just be inherent to the methodology. Releasing the weights would be nice so people can figure out why" - acheong08
The Nuance of "Open Weights" vs. "True Open Source"
A philosophical debate emerges regarding the definition of "open weights" and its distinction from "true open source" software. Some argue that releasing weights with commercial restrictions or for specific uses is not genuinely open, while others see it as a pragmatic compromise to encourage development.
- "But if you're suggesting they should do open weights, doesn't that mean people should be able to use it freely?" - diggan
- "You're effectively suggesting "trial-weights", "shareware-weights", "academic-weights" or something like that rather than "open weights", which to me would make it so that you can use them for whatever you want, just like with "open source" software. But if it misses a large part of what makes "open source" open source, like "use it for whatever you want", then it kind of gives the wrong idea." - diggan
- "I am personally in favor of true open source (e.g. Apache 2 license), but the reality is that these model are expensive to develop and many developers are choosing not to release their model weights at all." - rushingcreek
- "I think that releasing the weights openly but with this type of dual-license (hence open weights, but not true open source) is an acceptable tradeoff to get more model developers to release models openly." - rushingcreek
- "But isn't that true for software too? Software is expensive to develop, and lots of developers/companies are choosing not to make their code public for free. Does that mean you also feel like it would be OK to call software "open source" although it doesn't allow usage for any purpose? That would then lead to more "open source" software being released, at least for individuals and researchers?" - diggan
- "I mean it wasn't binary earlier, it was "to get more model developers to release", so not a binary choice, but a gradient I suppose. Would you still make the same call for software as you do for ML models and weights?" - diggan
- "I wouldn't equate model weights with source code. You can run software on your own machine without source code, but you can't run an LLM on your own machine without model weights." - hmottestad
This discussion touches on the licensing models used in software development, with comparisons to proprietary licenses and business source licenses.
The State of Open Source in China and Shifting Trends
Some users perceive a deliberate move by Chinese tech giants away from open-sourcing their advanced AI models, leading to speculation about a coordinated shift. This is contrasted with past periods where Chinese companies were seen as champions of open source for AI development.
- "The era of open weights from China appears to be over for some reason. It's all of a sudden and seems to be coordinated." - echelon
- "Alibaba just shut off the Qwen releases" - echelon
- "Tencent just shut off the Hunyuan releases" - echelon
- "Bytedance just released Seedream, but it's closed" - echelon
- "It's seems like it's over." - echelon
- "If you have worked or lived in China, you will know that Chinese open-source software industry is a totally shitshow. The law in China offers little protection for open-source software. Lots of companies use open-source code in production without proper license, and there is no consequence. Western internet influencers hype up Chinese open-source software industry for clicks while Chinese open-source developers are struggling." - rfv6723
However, other users push back against this narrative, citing recent releases and arguing that some companies have always had closed-weight variants.
- "What are you talking about? Feels like a very strong claim considering there are ongoing weight releases, wasn't there one just today or yesterday from a Chinese company?" - diggan
- "Alibaba from beginning had some series of models that are always closed-weights (-max, -plus, *-turbo etc. but also QvQ), It's not a new development, nor does it prevent their open models. And the VL models are opened after 2-3 months of GA in API." - natrys
- "Literally released one today: [link]" - logicchains (regarding Tencent Hunyuan)
- "Deepseek R1 0528, the flagship Chinese model is open source. Qwen3 is open source. HIdream models are also open source" - jacooper
There is also pushback against downplaying the importance of Chinese open weights, with users arguing they are among the few high-quality options available for self-hosting.
- "That kind of downplays that Chinese open weights are basically the only option for high quality weights you can run yourself, together with Mistral. It's not just influencers who are "hyping up Chinese open-source" but people go where the options are." - diggan