Enterprises embrace generative AI, but challenges remain

July 9, 2024 4:10 PM

Less than two years after the release of ChatGPT, enterprises are showing keen interest in using generative AI in their operations and products. A new survey conducted by Dataiku and Cognizant, polling 200 senior analytics and IT leaders at enterprise companies globally, reveals that most organizations are spending heavily on generative AI, whether they are still exploring use cases or have already put them into production.

However, the path to full adoption and productivity is not without hurdles, and those challenges create opportunities for companies that provide generative AI services.

Significant investments in generative AI

The survey results announced at VB Transform today highlight substantial financial commitments to generative AI initiatives. Nearly three-fourths (73%) of respondents plan to spend more than $500,000 on generative AI in the next 12 months, with almost half (46%) allocating more than $1 million. 

However, only one-third of the surveyed organizations have a specific budget dedicated to generative AI initiatives. More than half are funding their generative AI projects from other sources, including IT, data science or analytics budgets. 

It is not yet clear how diverting money to generative AI affects the departments that would otherwise have received those budgets, and the return on investment (ROI) for these expenditures also remains uncertain. Still, there is optimism that the added value will eventually justify the costs, as advances in large language models (LLMs) and other generative models show no sign of slowing.

“As more LLM use cases and applications emerge across the enterprise, IT teams need a way to easily monitor both performance and cost to get the most out of their investments and identify problematic usage patterns before they have a huge impact on the bottom line,” the study reads in part.
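
Whether that monitoring happens through a vendor dashboard or a lightweight in-house tracker, it can start simply. Below is a minimal sketch of per-use-case usage logging in Python; the blended per-token rate, team names and review threshold are illustrative assumptions rather than figures from the survey or from any provider.

```python
# Minimal sketch: per-request usage logging so an IT team can spot costly
# usage patterns early. The blended rate, team names and threshold are
# illustrative assumptions, not figures from the survey or any provider.
from collections import defaultdict

ASSUMED_COST_PER_1K_TOKENS = 0.02  # placeholder blended rate in dollars

usage_by_team = defaultdict(lambda: {"calls": 0, "tokens": 0, "cost": 0.0})

def record_llm_call(team: str, prompt_tokens: int, completion_tokens: int) -> None:
    """Accumulate token counts and estimated spend per team or use case."""
    total = prompt_tokens + completion_tokens
    stats = usage_by_team[team]
    stats["calls"] += 1
    stats["tokens"] += total
    stats["cost"] += total / 1000 * ASSUMED_COST_PER_1K_TOKENS

def report(threshold_dollars: float = 100.0) -> None:
    """Print a spend summary and flag teams that cross a review threshold."""
    for team, stats in sorted(usage_by_team.items(), key=lambda kv: -kv[1]["cost"]):
        flag = "  <-- review" if stats["cost"] > threshold_dollars else ""
        print(f"{team}: {stats['calls']} calls, {stats['tokens']} tokens, ${stats['cost']:.2f}{flag}")

record_llm_call("customer-support", prompt_tokens=1_500, completion_tokens=400)
record_llm_call("marketing-copy", prompt_tokens=300, completion_tokens=800)
report(threshold_dollars=0.03)
```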

A previous survey by Dataiku shows that enterprises are exploring all kinds of applications, ranging from enhancing customer experience to improving internal operations such as software development and data analytics.

Persistent challenges in implementing generative AI

Despite the enthusiasm around generative AI, integration is easier said than done. Most respondents reported infrastructure barriers that keep them from using LLMs the way they would like. On top of that, they face other challenges, including regulatory compliance with regional legislation such as the EU AI Act and internal policy constraints.

Operational costs of generative models also remain a barrier. Hosted LLM services such as Microsoft Azure ML, Amazon Bedrock and OpenAI API remain popular choices for organizations exploring generative AI and putting it into production. These services are easy to use and abstract away the technical difficulties of setting up GPU clusters and inference engines. However, their token-based pricing model also makes it difficult for CIOs to manage the costs of generative AI projects at scale.
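
The difficulty is less about any single request than about how per-token charges compound across workloads. The short sketch below illustrates the arithmetic; the model names, rates and traffic volumes are purely illustrative assumptions, not any provider's actual price list.

```python
# Minimal sketch: projecting monthly spend under token-based pricing.
# The per-token rates, model names and workload figures are illustrative
# assumptions, not quotes from any provider's price list.

ILLUSTRATIVE_RATES = {
    # (input, output) cost in dollars per 1,000 tokens -- assumed values
    "hosted-llm-large": (0.010, 0.030),
    "hosted-llm-small": (0.0005, 0.0015),
}

def monthly_cost(model: str, requests_per_day: int,
                 avg_input_tokens: int, avg_output_tokens: int) -> float:
    """Estimate monthly spend for one use case on a token-priced API."""
    in_rate, out_rate = ILLUSTRATIVE_RATES[model]
    per_request = (avg_input_tokens / 1000) * in_rate + (avg_output_tokens / 1000) * out_rate
    return per_request * requests_per_day * 30

# One internal assistant handling 20,000 requests a day:
print(f"large model: ${monthly_cost('hosted-llm-large', 20_000, 1_500, 400):,.0f}/month")
print(f"small model: ${monthly_cost('hosted-llm-small', 20_000, 1_500, 400):,.0f}/month")
```

Small changes in prompt length, traffic or model choice can shift the projection by an order of magnitude, which is why costs that look trivial in a pilot can balloon once a use case reaches production scale.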

Alternatively, organizations can use self-hosted open-source LLMs, which can meet the needs of enterprise applications and significantly cut inference costs. But they require upfront spending and in-house technical talent that many organizations don’t have.
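
For teams that do have the hardware and the skills, a self-hosted prototype can be compact. The following is a minimal sketch using the open-source Hugging Face Transformers library; the model name is just an example of an open-weight model, and running it practically assumes a GPU and the accelerate package.

```python
# Minimal sketch: serving an open-weight model in-house with Hugging Face
# Transformers. The model name is an example; a team would pick an openly
# licensed model that fits its GPU budget, license terms and latency targets.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "mistralai/Mistral-7B-Instruct-v0.2"  # example open-weight model

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_NAME,
    device_map="auto",   # requires the `accelerate` package; a GPU is assumed for practical speed
    torch_dtype="auto",
)

def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Run one prompt through the locally hosted model and return the completion."""
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens so only the newly generated text is returned.
    return tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)

print(generate("Summarize the main risks of rolling out a new CRM in three bullet points."))
```

The per-token fees disappear, but the GPU capacity, scaling, monitoring and security work moves in-house, which is exactly the upfront spending and talent gap the survey points to.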

Tech stack complications further hinder generative AI adoption. A staggering 60% of respondents reported using more than five tools or pieces of software for each step in the analytics and AI lifecycle, from data ingestion to MLOps and LLMOps. 

Data challenges

The advent of generative AI hasn’t eliminated pre-existing data challenges in machine learning projects. In fact, data quality and usability remain the biggest data infrastructure challenges faced by IT leaders, with 45% citing it as their main concern. This is followed by data access issues, mentioned by 27% of respondents. 

Most organizations are sitting on a rich pile of data, but their data infrastructure was built before the age of generative AI, often without machine learning in mind. The data frequently lives in separate silos and is stored in incompatible formats; it needs to be preprocessed, cleaned, anonymized, and consolidated before it can be used for machine learning. Data engineering and data ownership management remain major challenges for most machine learning and AI projects.
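
As a concrete illustration of that consolidation work, the sketch below merges two hypothetical silos stored in different formats, normalizes a join key and replaces a direct identifier with a pseudonymous hash. The file names, columns and hashing approach are assumptions for illustration only.

```python
# Minimal sketch: consolidating siloed records before they feed an AI pipeline.
# File names, columns and the hash-based anonymization are hypothetical.
import hashlib
import pandas as pd

# Two silos, two formats: a CRM export (CSV) and a support-ticket dump (JSON).
crm = pd.read_csv("crm_export.csv", parse_dates=["signup_date"])
tickets = pd.read_json("support_tickets.json")

# Basic cleaning: drop rows missing the join key, normalize email casing.
crm = crm.dropna(subset=["customer_email"])
for df in (crm, tickets):
    df["customer_email"] = df["customer_email"].str.strip().str.lower()

# Simple anonymization: replace the direct identifier with a stable hash.
def pseudonymize(email: str) -> str:
    return hashlib.sha256(email.encode("utf-8")).hexdigest()[:16]

for df in (crm, tickets):
    df["customer_id"] = df["customer_email"].map(pseudonymize)
    df.drop(columns=["customer_email"], inplace=True)

# Consolidate into one table keyed on the pseudonymous ID.
combined = crm.merge(tickets, on="customer_id", how="inner")
print(combined.head())
```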

“Even with all of the tools organizations have at their disposal today, people still have not mastered data quality (as well as usability, meaning is it fit for purpose and does it suit the users’ needs?),” the study reads. “It’s almost ironic that the biggest modern data stack challenge is … actually not very modern at all.”

Opportunities amid challenges

“The reality is that generative AI will continue to shift and evolve, with different technologies and providers coming and going. How can IT leaders get in the game while also staying agile to what’s next?” said Conor Jensen, Field CDO of Dataiku. “All eyes are on whether this challenge — in addition to spiraling costs and other risks — will eclipse the value production of generative AI.”

As generative AI continues to transition from exploratory projects to the technology underlying scalable operations, companies that provide generative AI services can support enterprises and developers with better tools and platforms.

As the technology matures, there will be plenty of opportunities to simplify the tech and data stacks for generative AI projects to reduce the complexity of integration and help developers focus on solving problems and delivering value.

Enterprises can also prepare for the wave of generative AI technologies even if they are not exploring the technology yet. By running small pilot projects and experimenting with new technologies, organizations can find pain points in their data infrastructure and policies and start preparing for the future. At the same time, they can begin building in-house skills so they have more options and are better positioned to harness the technology’s full potential and drive innovation in their respective industries.
