Alibaba's small, open source Qwen3.5-9B beats OpenAI's gpt-oss-120B and can run on standard laptops

Despite political turmoil in the U.S. AI sector, AI advances in China are continuing apace.

Earlier today, e-commerce giant Alibaba's Qwen Team of AI researchers, a group focused primarily on developing and releasing to the world a growing family of powerful and capable Qwen open source language and multimodal AI models, unveiled its newest batch, the Qwen3.5 Small Model Series, which consists of:

Qwen3.5-0.8B & 2B: Two models optimized for "tiny" and "fast" performance, intended for prototyping and deployment on edge devices where battery life is paramount.

Qwen3.5-4B: A strong multimodal base for lightweight agents, natively supporting a 262,144-token context window.

Qwen3.5-9B: A compact reasoning model that outperforms its 13.5x-larger U.S. rival, OpenAI's open source gpt-oss-120B, on key third-party benchmarks, including multilingual knowledge and graduate-level reasoning.

To put this into perspective, these models are among the smallest general purpose models recently shipped by any lab, comparable to MIT offshoot LiquidAI's LFM2 series, which likewise spans several hundred million to a few billion parameters, rather than the estimated trillion parameters (model settings) reportedly used for the flagship models from OpenAI, Anthropic, and Google's Gemini series.
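One spec worth unpacking: the 262,144-token context window quoted for Qwen3.5-4B is not an arbitrary figure but a power of two, 2^18, which is what vendors typically market as a "256K" context window. A quick arithmetic check:

```python
# The Qwen3.5-4B context window quoted above is 262,144 tokens.
# That figure is exactly 2**18, commonly labeled a "256K" window.
context_window = 262_144

print(context_window == 2 ** 18)   # True
print(context_window // 1024)      # 256, i.e. "256K" tokens
```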

The weights for the models are available right now globally under Apache 2.0 licenses (perfect for enterprise and commercial use, including customization as needed) on Hugging Face and ModelScope.

The technology: hybrid efficiency and native multimodality

The technical foundation of the Qwen3.5 small series is a departure from standard Transformer architectures.
