What is AI Distillation?

Distillation, also known as model or knowledge distillation, is a process where knowledge is transferred from a large, complex AI ‘teacher’ model to a smaller and more efficient ‘student’ model. Doing ...
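To make the definition concrete, here is a minimal sketch of the classic soft-target distillation recipe in PyTorch. Everything in it (the tiny teacher and student networks, the temperature of 4.0, the 50/50 loss weighting, the dummy batch) is an illustrative assumption, not drawn from any of the articles excerpted here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

temperature = 4.0   # softens both distributions; higher = softer targets
alpha = 0.5         # balance between distillation loss and hard-label loss

# Illustrative teacher (large) and student (small) classifiers.
teacher = nn.Sequential(nn.Linear(784, 1024), nn.ReLU(), nn.Linear(1024, 10))
student = nn.Sequential(nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 10))

x = torch.randn(32, 784)              # dummy batch of inputs
labels = torch.randint(0, 10, (32,))  # dummy ground-truth labels

with torch.no_grad():                 # the teacher is frozen
    teacher_logits = teacher(x)
student_logits = student(x)

# Soft targets: KL divergence between temperature-softened distributions.
# The T^2 factor keeps gradient magnitudes comparable across temperatures.
soft_loss = F.kl_div(
    F.log_softmax(student_logits / temperature, dim=-1),
    F.softmax(teacher_logits / temperature, dim=-1),
    reduction="batchmean",
) * temperature ** 2

# Hard targets: ordinary cross-entropy against the true labels.
hard_loss = F.cross_entropy(student_logits, labels)

loss = alpha * soft_loss + (1 - alpha) * hard_loss
loss.backward()  # gradients flow only into the student
```

The temperature matters because it lets the student learn from the teacher's relative confidence across all classes, including the wrong ones, rather than only from its top prediction.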
Model distillation is one of the technology trends that has reached the level of maturity Gartner’s 2025 Hype Cycle for artificial intelligence (AI) calls the “Slope of Enlightenment”.
David Sacks, U.S. President Donald Trump's AI and crypto czar, says OpenAI has evidence that Chinese company DeepSeek used a technique called "distillation" to build a rival model. OpenAI ...
Chinese artificial intelligence lab DeepSeek roiled markets in January, setting off a massive tech and semiconductor selloff after unveiling AI models that it said were cheaper and more efficient than ...
Whether it’s ChatGPT over the past couple of years or DeepSeek more recently, the field of artificial intelligence (AI) has seen rapid advancements, with models becoming increasingly large and ...
Google’s AI chatbot Gemini has become the target of a large-scale information heist, with attackers hammering the system with ...
The updates could help OpenAI compete better with rivals such as Anthropic, Google, and AWS, which already offer similar capabilities. In what can only be seen as OpenAI’s efforts to catch up with ...
On Thursday, Google announced that “commercially motivated” actors have attempted to clone knowledge from its Gemini AI chatbot by simply prompting it. One adversarial session reportedly prompted the ...
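The “cloning by prompting” described here is, in principle, just distillation run against a hosted model: collect a teacher model’s answers to many prompts, then use the pairs as supervised training data for a student. The rough sketch below illustrates that loop; the endpoint URL, payload shape, response field, and file name are all hypothetical placeholders and do not describe Gemini’s or any real API.

```python
import json
import requests

# Hypothetical teacher endpoint; "example.com" is a placeholder,
# not a real service.
ENDPOINT = "https://example.com/v1/chat"

prompts = [
    "Explain photosynthesis in two sentences.",
    "Summarize the causes of the 2008 financial crisis.",
]

with open("distillation_dataset.jsonl", "w") as f:
    for prompt in prompts:
        resp = requests.post(ENDPOINT, json={"prompt": prompt}, timeout=30)
        answer = resp.json().get("completion", "")  # assumed response field
        # Each (prompt, teacher answer) pair becomes one supervised
        # training example for fine-tuning a smaller student model.
        f.write(json.dumps({"prompt": prompt, "completion": answer}) + "\n")
```

At scale, this is why providers treat high-volume, systematic prompting as a distillation risk: the resulting dataset lets a third party train a cheaper model that imitates the teacher's behavior.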