Last week OpenAI announced its development of SearchGPT, an AI-powered search engine. OpenAI is now taking direct aim at Alphabet/Google's primary source of revenue and profits. Readers will recall that in 2023 Microsoft had hoped that an AI-powered version of its Bing search engine would achieve this lofty goal, but it failed miserably. Now that Microsoft and OpenAI have become friendly competitors, OpenAI has set its sights on this same Holy Grail.
Why now? Why has OpenAI allocated staff and resources to produce SearchGPT at this time instead of GPT-5? According to a recent report in The Information, OpenAI could lose $5 billion this year. If so, then OpenAI would focus on developing whatever would bring in the most cash as fast as possible.
- "Why OpenAI Could Lose $5 Billion This Year", Amir Efrati and Aaron Holmes, The Information, 7/24/24
Perhaps super salesman/CEO Sam Altman has calculated that an effective SearchGPT will attract far more enterprise customers from Alphabet/Google than another "new and improved" version of OpenAI's GPT model. Indeed, SearchGPT merely needs to be better than the confusing AI-powered search mishmash that has become Alphabet/Google's latest source of GenAI embarrassment.
In the past, OpenAI has not announced impressive new developments until they were available to the tech press, asking other interested users to add their contact info to a waitlist. The tech press then offered preliminary assessments of the innovation. But none of the authors of the review articles about SearchGPT seemed to have had any hands-on experience with it, so this time everybody is on the waitlist.
The Microsoft/OpenAI partnership has always offered its GenAI models to two markets. On the one hand, the partners created an array of impressive GenAI services that were powered by their underlying GPT models. They then provided end users with ChatGPT assistants that converted plain-English requests for these services into the complex calls to the APIs of their models. Note: Microsoft also embedded its models into most of its existing products, e.g., Office, thereby making these products smarter.
On the other hand, the partners also provided developers with direct access to the APIs of their underlying closed models, access that enabled the developers to create their own arrays of impressive GenAI functions ... but developers could not modify the underlying closed models.
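As an illustration, consider what that kind of API access looks like in practice. The following is a minimal sketch, not a definitive implementation: it assumes the official openai Python package and an OPENAI_API_KEY environment variable, and the model name and prompt are assumptions chosen for the example.

```python
# Minimal sketch: calling a closed model through its hosted API.
# The weights never leave the provider's servers; the developer only
# sees the API. Model name and prompt are assumptions for illustration.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize last week's AI news in one sentence."},
    ],
)

print(response.choices[0].message.content)
```

The point of the sketch is simply that developers consume the model's behavior through an API; the closed model itself stays on OpenAI's (and Microsoft's) servers.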
Google followed Microsoft's strategies, i.e., it provided end users with Gemini chatbots that converted English-language requests into calls to the APIs of its underlying models; it embedded its models into most of its existing products; and it provided developers with direct access to the APIs of its models, but developers could not modify these closed models.
Meta developed celebrity "sound-alike" chatbots for non-developers, which relatively few people used; and it embedded its models into its existing products, e.g., Facebook and WhatsApp. However, Meta also published open versions of its models that developers could access via APIs. Developers could also modify these open models. Indeed, Meta's deep commitment to open source development leads it to anticipate that the open source community will make substantial improvements to its open models. Improved models will, in turn, generate more customers and more revenue for Meta's existing products.
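By contrast, a developer working with Meta's open models downloads the weights themselves and can inspect, fine-tune, or otherwise modify them. Here is a minimal sketch under stated assumptions: it uses the Hugging Face transformers library, assumes access to Meta's gated model repository, and loads the 8B-parameter Llama 3.1 variant rather than the 405B flagship, which requires a multi-GPU cluster.

```python
# Minimal sketch: running an open-weights Llama model locally.
# Assumes transformers + torch, access to the gated meta-llama repo
# on Hugging Face, and a GPU with enough memory for the 8B variant.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Meta-Llama-3.1-8B-Instruct"  # assumed repo name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

prompt = "The key difference between open and closed models is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the weights are local, the same developer could go on to fine-tune, quantize, or otherwise adapt the model, which is exactly the kind of community improvement Meta appears to be counting on.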
Recently Microsoft and Google have claimed that their immense investments in their closed models have started to pay off, but the investment community has remained skeptical. Meta has yet to make any claims that its investments have become profitable. When it released its first 2024 quarterly report three months ago, this blog's TL;DR offered the following comments:
- "Unfortunately, the news for Meta's stockholders was mixed. Evidently its soaring profits had little or no connection to its previous investments in generative AI. So when CEO Mark Zuckerberg provided "guidance" that Meta would continue to make massive investments in GenAI in the next three months, the value of its stock collapsed."
Meta's new Llama 3.1, its biggest model yet at 405 billion parameters, represents its continued commitment to this strategy. One therefore anticipates that its stock will fall once again when Meta releases its second quarterly report later this week.
- "Google’s parent company [Alphabet, Inc] narrowly topped revenue and profit expectations, driven by its search engine and cloud unit, and it said A.I. investments were “driving new growth.”"
- “The benefits [Google] is seeing in [Google Cloud Platform] on AI productization still seems difficult to discern, as is the full payoff that Google should see (we do not think potential revenue benefits will arrive until [the first half of 2025] at the earliest),"
- OpenAI
"OpenAI announces SearchGPT, its AI-powered search engine ", Kylie Robison, The Verge, 7/25/24 ***
-- This story also covered by Wired, TechCrunch, Engadget, Wall Street journal, VentureBeat, CNET, Gizmodo, BBC, ... and OpenAI - LLM News
"Meta releases the biggest and best open-source AI model yet" [Llama 3.1, 405b parameters], Alex Heath, The Verge, 7/23/24 ***
-- This story also covered by TechCrunch, Engadget, VentureBeat, The Information
- AI Big Tech quarterly earnings ***
+++ Alphabet/Google ...
"Alphabet Reports 29% Jump in Profit as A.I. Efforts Begin to Pay Off", Nico Grant, NY Times, 7/23/24
-- This story also covered by Yahoo!/finance, Reuters