
Opinion: Will generative AI complete the cloud transition? One prominent executive thinks so

There have been a lot of pronouncements about the capabilities of generative artificial intelligence in recent months, but one that Box Inc. Chief Executive Aaron Levie made recently merited a further look.

In a brief interview after Box’s (BOX, +0.14%) most recent earnings report, Levie expanded on comments he made to Wall Street analysts that AI would be the “nail in the coffin” for legacy data centers, as well as the push many companies need to move their data to the cloud. His theory is that generative AI systems have been designed to run in the cloud, as we saw with OpenAI’s beta testing involving millions of users who were accessing ChatGPT via the cloud.

“The classic way a large enterprise would manage its files, its contacts and marketing materials and project files, inside of a legacy data center or similar server environment, those files will be largely inaccessible by these latest AI models,” he said. “It would be so cost-inefficient and counterproductive to try and bring AI through your on-prem data center in any kind of reasonable way.”

“There are super nerdy kind of sub reasons, but it all amounts to this tech is really built for cloud systems, not legacy file systems,” Levie said.

It is not that surprising that an executive for a cloud-based storage company would see the biggest trend in technology at the moment from this perspective. If true, though, it could be a huge boon for cloud companies such as Amazon.com Inc. (AMZN, -0.66%), Microsoft Corp. (MSFT, +0.47%) and Alphabet Inc. (GOOG, +0.16%; GOOGL, +0.07%), which have seen cloud growth moderate in recent quarters as companies look to cut costs.

More from Therese: Nvidia created an AI bubble, and software stocks are already paying the price

So I contacted Maribel Lopez, principal analyst at Lopez Research, for an independent opinion, and she said the truth of the matter is “a really nuanced point”: it is not a simple either-or; it depends.

“He has a point about your data being locked in places where it cannot be accessed by large language models…. It means that people have to modernize their apps. It does not mean that everything has to go to the cloud.”

Lopez said that large language models such as OpenAI’s have to run in the cloud because they are so big. She said that companies, though, are looking at ways to create their own large language models on premises, with their own data kept on site.

“Maybe your database is 1/20 of the size of OpenAI, that could be done on premises,” she said. “A lot of it depends on how big your data stack is.”
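To make that point concrete, here is a minimal, hypothetical sketch of what running a much smaller model “on premises” can look like in practice, using the open-source Hugging Face transformers library on local hardware. The model name and prompt are illustrative placeholders, not anything from the article.

```python
# Hypothetical sketch: loading a small open-weight language model on a company's
# own hardware, so prompts and data never leave the building.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "gpt2"  # placeholder for whichever smaller open model a company might self-host

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

prompt = "Summarize our Q2 contract renewals:"  # illustrative internal query
inputs = tokenizer(prompt, return_tensors="pt")

# Generation runs entirely on local hardware; no data is sent to a cloud API.
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The trade-off Lopez describes follows directly from this pattern: the smaller the model and the data stack behind it, the more plausible it is to host the whole thing locally.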

Building and running a bigger data stack on premises requires a lot of money, however, and some companies are reluctant to spend on pricey tech infrastructure right now. And while the cloud seems less expensive, Lopez said those cost savings are wildly exaggerated, and that having everything in the cloud is also expensive.

See also: Congress and tech seem open to regulating AI efforts, but that doesn’t mean it will happen

The answer probably lies somewhere in the middle: more companies will move to the cloud, but will also maintain some local centers of computing power and storage for security and legal reasons. This is known as hybrid cloud, and it is the most popular approach in the corporate world right now. Forrester Research’s State of the Cloud in the U.S. report last year concluded that 94% of U.S. corporate enterprise infrastructure decision-makers are using some form of cloud deployment, with a majority opting for hybrid cloud or multi-cloud.

For example, Lopez said that Intel Corp. (INTC, -1.51%) and Boston Consulting Group worked together to put the consulting firm’s vast half-century of proprietary data to better use so that it can be more easily accessed by employees. With a custom natural-language chatbot interface powered by a supercomputer built on Intel Xeon processors and AI-optimized hardware accelerators and software, BCG employees were able to use semantic search to retrieve and summarize data buried in long, multi-page documents, the companies said.

Now they are offering this capability to other enterprise customers, with the pitch that this on premises approach will keep “private data in the isolation of their trusted environments.”
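For readers curious what that kind of semantic search looks like under the hood, here is a minimal, hypothetical sketch using the open-source sentence-transformers library. It is not the actual Intel/BCG system; the documents, query and model name are illustrative placeholders.

```python
# Hypothetical sketch of semantic search over an internal document trove:
# embed the documents once, embed each query, and rank by similarity.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # small embedding model that can run on-prem

documents = [
    "2019 client study on retail supply-chain consolidation...",
    "Internal memo: pricing strategy for industrial equipment...",
    "Case summary: post-merger integration in financial services...",
]

# Embed the corpus once, then embed each employee query as it arrives.
doc_embeddings = model.encode(documents, convert_to_tensor=True)
query_embedding = model.encode("post-merger integration lessons", convert_to_tensor=True)

# Rank documents by cosine similarity to the query and return the best match.
scores = util.cos_sim(query_embedding, doc_embeddings)[0]
best = int(scores.argmax())
print(documents[best], float(scores[best]))
```

The same pattern scales up by swapping in a larger embedding model and a vector database, but the core idea, embedding documents and ranking them against a query, stays the same whether it runs in the cloud or on premises.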

In addition, there are other reasons companies may continue to maintain a hybrid or multi-cloud approach. As InfoWorld pointed out last month, one fear is that generative AI could generate fake identities and fake data that could fool a cloud computing system, leading to potential data-security breaches. There will also likely be systems-integration issues, with generative AI algorithms being incompatible with some cloud systems, and the potential to tax cloud systems as these huge language models use up available computing power.

Generative AI could be a net positive for cloud computing, but it doesn’t sound as all-or-nothing as Levie suggests. Still, it will likely be a needed boon for the cloud companies, as they sell their compute power as a way to play what is being hailed as the next big thing in Silicon Valley.

Don’t miss: Vision Pro could be Apple’s biggest hit since iPhone, but that won’t be known for years

