Microsoft launches Phi-3-mini, its smallest AI model.
Microsoft has introduced Phi-3-mini, the latest version of its lightweight AI model and the first in a planned series of smaller models.
The new model has 3.8 billion parameters and was trained on a smaller dataset than large language models such as GPT-4.
Availability on Leading Platforms:
Phi-3-mini is now available on major platforms including Azure, Hugging Face, and Ollama.
Microsoft plans to follow with Phi-3 Small (7 billion parameters) and Phi-3 Medium (14 billion parameters).
Performance and Efficiency:
The Verge reports that Microsoft claims Phi-3 performs better than its predecessors, delivering responses comparable to those of models ten times its size.
This positions Phi-3-mini as a highly capable alternative to larger models like GPT-3.5.
Insights from Eric Boyd:
Eric Boyd, corporate vice president of Microsoft Azure AI Platform, said that Phi-3-mini matches the capabilities of larger models like GPT-3.5 but in a more compact form.
He emphasized the cost-effectiveness and superior performance of smaller AI models, particularly on personal devices such as phones and laptops.
Innovative Training Methodology:
Boyd explained that developers trained Phi-3 with a unique “curriculum” approach, drawing inspiration from how children learn.
The team used simplified vocabulary and sentence structures across a wide range of topics, employing an LLM (large language model) to generate “children’s books” that could teach Phi-3 in an accessible way.
“There aren’t enough children’s books out there, so we took a list of more than 3,000 words and asked an LLM to make ‘children’s books’ to teach Phi,” Boyd said.