Your business is going to rely on hundreds of AI models. Here's why

Jul 11, 2024 Hi-network.com
Image: shulz/Getty Images

Just as most organizations prefer multi-cloud options and multiple databases suited for various purposes, there's an artificial intelligence (AI) model for every purpose. A survey of over 1,000 IT decision-makers has found the most advanced AI adopters are leveraging hundreds of models simultaneously. 

We're now in the age of "multi-model AI." The average number of distinct AI models currently in operation stands at 158, and projections suggest this will rise to 176 within the next year, according to the survey, conducted by S&P Global Market Intelligence and underwritten by Vultr.

Also: Do AI tools make it easier to start a new business? 5 factors to consider

The most advanced users reported an average of 175 models in use, projected to grow 14% to 200 models within the year. Respondents at the second-highest level of AI maturity projected 18% year-over-year growth in the number of models. Two-thirds (66%) of the managers surveyed are building their own models or using open-source ones.

There are practical reasons for deploying multiple models across various use cases. A report out of MIT, for example, describes a system that uses three models trained on language, vision, and action data to help robots develop and execute plans for household, construction, and manufacturing tasks. "Each foundation model employed captures a different part of the decision-making process and then works together when it's time to make decisions," the MIT researchers stated.
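The MIT report describes the architecture rather than publishing code, but a minimal sketch of the pattern -- with hypothetical language_model, vision_model, and action_model objects standing in for three separately trained foundation models -- might look like this:

```python
# Sketch only: the three model objects are hypothetical stand-ins, not the MIT system.
# Each one covers a different part of the decision-making process.

from dataclasses import dataclass


@dataclass
class Observation:
    instruction: str      # natural-language task, e.g. "clear the kitchen table"
    camera_frame: bytes   # raw image from the robot's camera


def plan_step(obs: Observation, language_model, vision_model, action_model) -> str:
    """Combine three specialized models into one decision for the next robot action."""
    # 1. The language model breaks the instruction into sub-goals.
    subgoals = language_model.decompose(obs.instruction)

    # 2. The vision model grounds those sub-goals in the current scene.
    scene = vision_model.describe(obs.camera_frame)

    # 3. The action model chooses the next low-level command given both.
    return action_model.next_action(subgoals=subgoals, scene=scene)
```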

What's emerging is an "ensemble" approach to AI, in which several models contribute simultaneously to every output, as described by Erica Dingman in a post for MovableInk. "The difference between a single model and an ensemble model approach is similar to that of a single violin versus an entire orchestra," she said.

"While each instrument delivers value, several working together create something truly magical." In addition, employing diverse data sets and "an ensemble of models that are constantly being updated and trained" can help to reduce or eliminate bias in AI outputs.

Also: When's the right time to invest in AI? 4 ways to help you decide

The wide distribution and diversity of the systems that support, or are powered by, AI models also drive this proliferation. For example, AI is increasingly moving to the edge, the S&P Global Market Intelligence and Vultr survey shows.

"Distributed AI architectures, where the edge is a key component of applications that span an organization's infrastructure, appear likely to become the new norm," the survey's authors said. The vast majority (85%) of IT decision-makers surveyed say this shift is likely or extremely likely to occur in their environments, with 32% considering this change "extremely likely." 

The study's authors teased out what they defined as the organizations leading the AI pack -- those with "transformational AI practices." Half of these transformers report performing "significantly better" than industry peers, compared with organizations at the lower "operational" maturity level. Almost all transformers said they improved their 2022-to-2023 year-over-year performance in customer satisfaction (90%), revenue (91%), cost reduction/margin expansion (88%), risk (87%), marketing (89%), and market share (89%).

Also: Agile development can unlock the power of generative AI - here's how

Across all organizations surveyed, AI spending is expected to outpace general IT spending, the survey suggests. Close to nine in ten enterprises (88%) intend to increase AI spending in 2025, with 49% expecting moderate to significant increases. 

Behind the soaring numbers, however, are the challenges associated with demands on existing IT infrastructures. "When looking at high-demand AI activities such as real-time inferencing, respondents expressed concern that existing infrastructure would not be up to the task," said the survey authors. The top three concerns were insufficient CPU or GPU resources (65%), data locality issues (53%), and storage performance issues (50%).

"We saw this echoed in the qualitative data, with interviewees expressing concerns about scheduling delays in higher-capacity GPU instances in public cloud, and potential impacts on data availability," the authors said. 

Also: Beyond programming: AI spawns a new generation of job roles

"We also noted growing concerns about the impact of infrastructure cost. Cost often becomes a more pressing concern once projects are running in production. Historically, organizations have had limited ability to effectively forecast costs."
