A survey shows that AI projects have grown tenfold over the past year

We know AI is riding the crest of the hype cycle, but we don’t hear enough about the problems and headaches it brings with it. A new survey estimates that the number of completed or nearly completed AI projects has increased tenfold in the past 12 months. That’s good news, but it also means IT teams are straining to keep up: companies need more people with the right skills to pull it all together, and CEOs and managers need to make sure that AI securely delivers what the business needs.

Talent shortages, integration challenges, and governance requirements are the main points of contention, a recent survey of 700 IT managers and executives published by Juniper Networks finds.

The most eyebrow-raising finding: the share of AI applications that are completed or nearly completed has grown from 6% a year ago to 63% today. The survey also shows rising enthusiasm for full-scale adoption of AI, versus the narrower use cases that dominated last year’s edition. The percentage of IT leaders who say they plan to deploy fully enabled AI use cases with widespread adoption in the future jumped from 11% to 27%.

The long-standing build-or-buy dilemma has resurfaced with AI projects. Companies are divided over adopting off-the-shelf AI solutions versus building systems in-house. Nearly 4 in 10 respondents (39%) indicate that their organizations mix off-the-shelf AI solutions with fully self-built ones, while 3 in 10 say they use either only off-the-shelf solutions or only fully self-built ones.

Building AI solutions internally brings its own set of challenges. More than half (53%) of IT leaders surveyed say the reliability of internally built AI applications is the biggest challenge, followed by integration with existing systems (46%), finding AI-capable talent (44%), and development time (44%).

Finding or nurturing the right talent to develop, operate, and leverage AI is a problem facing many IT leaders today. The survey identified three areas tied as the top investment priorities (21% each): hiring the right people to operate and develop AI capabilities, increasing the training of AI models, and expanding the capabilities of an existing AI tool into new business units.

Some 42% say their data and analytics are integrated and leading the way for AI adoption. A similar number have created an artificial intelligence center of excellence.

The most important steps to enable workforce adoption of AI include providing tools and opportunities to apply newly acquired AI skills (43%), updating performance metrics to include AI (40%), developing a workforce plan that identifies new skills and roles (39%), and changing learning and development frameworks (39%).

To meet this challenge, 39% of IT leaders are implementing and using low-code or no-code AI-enabled development tools, and about a third (34%) are adopting AI modeling automation tools.

In 2021, companies’ AI challenges centered on developing models and standardizing data. In 2022, those challenges remain, but developing governance policies (35%) and maintaining AI systems (34%) have grown in importance.

With great AI capabilities comes great responsibility. Only 9% of IT leaders consider their AI governance and policies, such as appointing a company-wide AI leader or establishing responsible AI standards and processes, to be fully mature. Still, more leaders see AI governance as a priority: 95% agree that having proper AI governance is key to staying ahead of future legislation, up from 87% in 2021. Almost half (48%) of respondents believe more needs to be done to make AI governance effective.

Some 44% report that they have established responsible AI ethics standards and processes. The same share have appointed a company-wide AI leader to oversee AI strategy and governance.

IT leaders identify the main risks of inadequate AI oversight as accelerated hacking, or what the survey authors call “AI terrorism” (55%), and privacy, cited by an equal 55%. Compliance with regulations (49%) and loss of human capabilities (48%) were also cited as serious risks.