DeepSeek and Qwen AI Models Launch as Ubuntu Snaps: What You Need to Know

Canonical has simplified the deployment of AI models on Ubuntu by introducing new ‘optimised inference snaps’ tailored for Intel and Arm-based Ampere hardware. Beta versions of DeepSeek R1 and Qwen 2.5 VL are the first large language models (LLMs) to reach the Snap Store in this form. Each snap automatically selects an inference engine and configuration suited to the architecture of the device it runs on.

Overview of the Models

DeepSeek R1 is an open-source reasoning model from the Chinese AI company DeepSeek, with particular strengths in mathematics, coding, and complex reasoning tasks. Qwen 2.5 VL, developed by Alibaba Cloud, is an open-source multimodal model that handles text, images, and video. Both models run locally, with no cloud API calls or subscription fees.

Jon Seager, the VP of Engineering at Canonical, highlighted the aim to make silicon-optimized AI models accessible to everyone. Jeff Wittich, Chief Product Officer at Ampere, remarked that this initiative would allow enterprises to quickly deploy and scale preferred AI models on Ampere systems using Ubuntu’s AI-ready ecosystem.

For developers, these pre-tuned LLMs make it easier to bring local inference to applications and server workloads: rather than researching how to deploy the right quantized variant of a model by hand, a single snap install command suffices.
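In practice, that workflow might look like the sketch below. The snap package names here are assumptions for illustration, so confirm the exact names on the Snap Store before installing:

```shell
# Package names are hypothetical - check the Snap Store listing first.
if command -v snap >/dev/null 2>&1; then
    snap find deepseek              # search the store for matching snaps
    sudo snap install deepseek-r1   # snapd fetches the build optimised for this CPU
    sudo snap install qwen-vl       # likewise for Qwen 2.5 VL
else
    echo "snapd is not available on this system"
fi
```

Because the optimisation is handled at install time, the same command works unchanged on an Intel workstation or an Ampere server.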

The architecture-optimized models are designed to run faster and more efficiently on compatible hardware, and selection is automatic: users don’t need to work out which build to install, as the snap picks the variant best suited to the device.

Industry Implications

While the introduction of these snaps signals Canonical’s commitment to AI on Ubuntu, it does not indicate a complete pivot toward heavy AI integration in the desktop environment. The existing Intel NPU driver and OpenVINO AI plugins are available as snaps, but Canonical primarily aims to facilitate easier access to well-known open-source LLMs optimized for specific hardware.

It is worth noting that these models are standalone: they do not come pre-installed or integrated into the Ubuntu desktop. Canonical’s focus appears to be on giving developers and enterprises optimized models that make the most of their hardware, rather than adding AI features to the user interface itself.

For more details, check the Canonical blog post.

