News

Phi-2 is a generative AI model with 2.7 billion parameters, used for research and development of language models. While large language models can reach hundreds of billions of parameters, Microsoft ...
Older large language model versions, such as GPT-3.5 Turbo version 0301 and GPT-4 version 0314, are set to "expire no earlier than October 15th, 2023," per Microsoft's "Azure OpenAI Service Models ...
Microsoft this week made its latest small language model (SLM), Phi-2, available in the Azure AI Studio model catalog. Currently, the SLM is only available for ...
2023 was very much the year of the large language model. OpenAI’s GPT models, Meta’s Llama, Google’s PaLM, and Anthropic’s Claude 2 are all large language models, or LLMs, with many ...
Today, Microsoft introduced Phi-4, a state-of-the-art 14B-parameter small language model (SLM) that even beats OpenAI's GPT-4 large language model on the MATH and GPQA benchmarks.
However, Microsoft also had a surprise up its sleeve: the next release of Bing will feature a new next-generation large language model the company claims is "much more powerful" than ChatGPT and ...
Microsoft is reportedly working on its own large language model (LLM), MAI-1, which could rival cutting-edge AI models such as Google's Gemini, OpenAI's GPT-4, and Anthropic's Claude.
Microsoft has released the 'Phi-3' family of language models, which deliver strong performance at a small scale. The smallest model in the family, Phi-3-mini, is an open model and can be used for ...
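As a rough illustration (not taken from the article): since Phi-3-mini is released as an open model, it can be loaded with the Hugging Face transformers library along the following lines. The model ID "microsoft/Phi-3-mini-4k-instruct" and the prompt are assumptions for this sketch.

```python
# Minimal sketch: loading the openly released Phi-3-mini weights with
# Hugging Face transformers. Model ID below is an assumed example.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"  # assumed public model ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short completion from a sample prompt.
inputs = tokenizer("Explain what a small language model is.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```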
OpenAI is also reportedly partnering with Microsoft, a tech giant, to build a new $100bn data centre. Based on the numbers alone, it seems as though the future will hold limitless exponential growth.
Microsoft Research today open-sourced a tool for training large models and introduced Turing NLG, a Transformer-based model with 17 billion parameters.