
Microsoft is Baking a New AI Model to Take on Google, OpenAI

by ccadm



Microsoft is reportedly developing a new in-house AI model, its first since it invested more than $10 billion in OpenAI to use the startup’s AI models in its own products and stay ahead of competitors.

MAI-1 is an in-house project of Microsoft

The new model is said to be large enough to compete with those of Alphabet, Google’s parent company, and OpenAI.

The model is called MAI-1 internally at Microsoft, and the recently hired Mustafa Suleyman is leading the project, according to a report by The Information. Suleyman co-founded DeepMind, which Google later acquired, and most recently served as CEO of another AI startup, Inflection.

The report claims that MAI-1 is not Inflection’s AI model, though Microsoft may use training datasets and other technology from the startup; the new model is entirely separate from those Inflection has already released.

The point to remember here is that before hiring Suleyman, Microsoft paid around $650 million for the rights to Inflection’s technology and also hired most of its staff.

MAI-1 is said to be a different animal altogether, much larger than the smaller open-source models Microsoft has trained before. According to people familiar with the matter, the model will need more computing resources and training data, making it significantly more expensive to develop.

A much larger undertaking

For comparison, smaller models trained by companies like Mistral and Meta have around 70 billion parameters (a rough gauge of an AI model’s size), while OpenAI’s GPT-4 is said to have more than 1 trillion. MAI-1 will reportedly have around 500 billion parameters.
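To get a feel for what those parameter counts mean in practice, here is a back-of-envelope sketch of the memory needed just to hold each model’s weights, assuming 16-bit (2 bytes per parameter) precision; this is an illustrative assumption, and the parameter figures are the reported estimates above, not confirmed numbers. Actual training needs far more memory for gradients, optimizer state, and activations.

```python
# Rough memory footprint of model weights alone, assuming fp16 (2 bytes/param).
# Training requires several times this for gradients and optimizer state.
def weight_memory_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    """Gigabytes needed to store the weights of a model of the given size."""
    return params_billions * 1e9 * bytes_per_param / 1e9

for name, size_b in [("Mistral/Meta ~70B", 70), ("MAI-1 ~500B (reported)", 500), ("GPT-4 ~1T (reported)", 1000)]:
    print(f"{name}: ~{weight_memory_gb(size_b):,.0f} GB of weights")
```

At roughly a terabyte of weights alone, a 500-billion-parameter model cannot fit on a single GPU, which is consistent with the report that Microsoft is procuring large server clusters for training.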

But parameters alone are not a reliable measure of an AI model’s capability: Microsoft’s recent open-source Phi series is said to perform on par with models ten times its size.

The Phi-3 Mini, released by Microsoft last month, is the smallest of that series, and the company is aiming at a broader customer base with these cheaper options.

It is also reported that Microsoft has been procuring large clusters of servers equipped with Nvidia graphics processors, along with massive datasets, to train the model.

When Microsoft announced it was building OpenAI’s ChatGPT technology into Bing search, many analysts called it an “iPhone moment” and were overly bullish on its impact; initially, it gave Bing only one or two extra percentage points of search share against Google.

However, the most recent Search Engine Journal report for April shows that, for the first time, Google’s search share dropped by more than 4% in a single month, with competitors Bing and Yahoo gaining share, particularly in the US market.

The billions Microsoft poured into OpenAI now look like they are starting to pay off, and given the success of its own Phi models, investing in a larger model of its own makes sense too.




