Microsoft is bringing the DeepSeek R1 7B and 14B distilled models to Copilot+ PCs via Azure AI Foundry, following its earlier release of NPU-optimized builds of the DeepSeek-R1 1.5B model. The models will run on Copilot+ PCs powered by Qualcomm Snapdragon X, Intel Core Ultra 200V, and AMD Ryzen processors, whose NPUs can execute more than 40 trillion operations per second and are designed to run AI models locally with minimal drain on battery and system resources. To perform well within those constraints, the DeepSeek models use 4-bit block-wise quantization together with int4 per-channel quantization.

Developers can access all distilled variants of the DeepSeek models through the AI Toolkit extension for VS Code, enabling local deployment and experimentation. By combining on-device compute with Azure's cloud resources, Copilot+ PCs support what Microsoft describes as a new paradigm of continuous computing for AI applications.
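To make the quantization claim concrete, here is a minimal sketch of the idea behind block-wise int4 quantization: weights are split into fixed-size blocks, and each block gets its own scale so an outlier in one block does not degrade precision elsewhere. This is an illustration of the general technique only, not Microsoft's actual scheme; the function names, the block size of 32, and the symmetric rounding are all assumptions for the example.

```python
import numpy as np

def quantize_int4_blockwise(weights, block_size=32):
    """Illustrative block-wise int4 quantization (not Microsoft's scheme).

    Splits a 1-D float array into fixed-size blocks and quantizes each
    block symmetrically into the int4 range [-8, 7] with its own scale.
    """
    n = len(weights)
    pad = (-n) % block_size  # pad so the array divides evenly into blocks
    w = np.pad(weights.astype(np.float32), (0, pad))
    blocks = w.reshape(-1, block_size)
    # One scale per block, chosen so the largest magnitude maps to 7.
    scales = np.abs(blocks).max(axis=1, keepdims=True) / 7.0
    scales[scales == 0] = 1.0  # avoid division by zero in all-zero blocks
    q = np.clip(np.round(blocks / scales), -8, 7).astype(np.int8)
    return q, scales

def dequantize_int4_blockwise(q, scales, n):
    """Recover approximate float weights from int4 codes and block scales."""
    return (q.astype(np.float32) * scales).reshape(-1)[:n]
```

The per-block scale is what distinguishes this from naive whole-tensor quantization: reconstruction error in each block is bounded by half that block's scale, so precision adapts to the local weight distribution.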