A recent Salesforce survey of 600 IT leaders reveals a new mandate from their bosses: Incorporate generative artificial intelligence (GenAI) into the technology stack -- and fast. But the response from IT professionals is "not so fast" -- highlighting concerns about resources, data security, and data quality.
Also: Even more businesses will use AI and data to boost sales and services this year
Nearly three in five IT professionals say business stakeholders hold unreasonable expectations regarding the speed and agility of new technology implementations. In fact, the IT leadership survey reveals almost nine in 10 IT professionals can't support the deluge of AI-related requests they receive at their organization.
A 2024 study found that 90% of IT leaders say it's tough to integrate AI with other systems. AI adoption has exploded, amplifying the need for a coherent IT strategy, but building one is easier said than done. MuleSoft's ninth annual Connectivity Benchmark Report was produced from interviews with 1,050 IT leaders (management positions or above) across the globe, all at public- and private-sector organizations with at least 1,000 employees. The report's executive summary suggests:
Business success and growth are dependent upon trust, data, AI, and automation. Businesses today are competing in an experience-led economy that is based on trust, personalization, speed, and intelligence. Most people are concerned about the implications of GenAI for data security, ethics, and bias. In fact, 81% of customers want a human to be in the loop, reviewing and validating generative AI outputs. The road to implementation and adoption of AI in a secure, trustworthy, scalable, and stakeholder value-driven model will require a lot more than just solid technology and processes. What's needed most is "deployment empathy."
Also: Will AI hurt or help workers? It's complicated
To better understand how large, complex organizations successfully deploy and adopt new technologies to turbocharge their value-creation capabilities, Constellation Research CEO Ray Wang and I invited three business technology leaders to our weekly podcast, DisrupTV. We discussed GenAI -- and the need for organizations to adopt and practice deployment empathy when launching new AI efforts -- with Teresa Carlson, Rhonda Vetere, and Dr. David Bray.
Deployment empathy embodies putting people first, managing change thoughtfully, creating psychological safety, reassuring anxious workers, and collaborating across sectors to co-create solutions tailored for shared benefit. The practice of deployment empathy centers around the principle that empathetic leadership will enable a smooth, productive transition amid the disruption created by GenAI's impacts on companies, customers, employees, citizens, communities, and societies.
Teresa Carlson is a technology executive and leader with more than 20 years of experience helping governments and enterprises adopt new technologies like cloud computing and AI. Teresa started and led Amazon Web Services' worldwide public sector business, helping more than 5,000 government agencies and 10,000 education institutions adopt cloud technologies. She also served as president and chief commercial officer at Flexport, a supply chain and logistics company; corporate vice president at Microsoft; and president and chief growth officer at Splunk.
Currently, Teresa is a strategic advisor to technology companies and government organizations. She serves on the boards of Finch AI, Cura, and others. Teresa is also vice chair of the White House Historical Association and an Atlantic Council board member.
During the lively group discussion, Teresa emphasized the critical importance of radical collaboration between public and private sector entities when working to adopt emerging technologies like AI in government settings. This entails deeply listening to agencies' specific needs, co-designing responsible solutions tailored for them, and ensuring full interoperability with legacy systems.
Also: Want to work in AI? How to pivot your career in 5 steps
Why is deployment empathy so essential? Government and enterprise environments typically do not reward risk-taking and innovation. Championing deployment empathy requires recognizing that risk-reward environment. Leaders in these risk-averse cultures must create incentives and psychological safety so teams feel comfortable trying new things like AI. This involves transparency, setting clear guidelines, and managing change thoughtfully.
Also bringing deep technology and leadership expertise, Rhonda Vetere is a global executive who has led major digital transformation initiatives across industries. She is also an accomplished triathlete and author who applies athletic approaches to business leadership and strategic advisory roles. She has worked as a CIO, CTO, and digital transformation leader at large companies like HP Enterprise, Barclays, and JPMorgan.
She is the author of the book "Grit and Grind: 10 Principles for Living an Extraordinary Life," which focuses on achieving one's full potential. Rhonda serves on boards and is a strategic advisor to companies globally on digital transformation and emerging technologies like AI.
As part of the discussion, Rhonda noted that many employees feel anxious about AI automation's potential impact on jobs and skills. Business leaders should be fully transparent about where AI automation makes sense while clearly communicating reskilling plans.
Why is deployment empathy needed now? When deploying AI, leaders should start conversations by discussing where humans fit into the process rather than leading with the technology. Championing AI adoption requires a human-centric mindset focused on the impact on people and jobs -- deployment empathy.
As part of the discussion trio, Dr. David Bray is an acclaimed technology leader with extensive experience guiding organizations through complex, high-risk situations. He is an award-winning, recognized expert on issues such as leadership during turbulent times, digital transformation, resilience, countering disinformation, and responsible adoption of emerging technologies like AI.
He has served in multiple leadership roles dealing with crisis situations and challenges, including bioterrorism preparedness and response and leading two bipartisan National Commissions on R&D, as well as work with the US intelligence community, the FCC, and the Department of Defense. David is co-chair for the Loomis Council and distinguished fellow at the Stimson Center.
Also: Workers with AI skills can expect higher salaries - depending on their role
David observed that AI and related technologies are catalyzing seismic societal changes in how we work and live, at a pace exceeding our ability to adapt policies, social contracts, and organizational change management practices. He noted that this calls into question existing social contracts around displaced workers and economic opportunity, which is why leadership paired with deployment empathy is more important than ever in our GenAI era.
How do leaders embody deployment empathy authentically? David noted that leaders must provide a steady "non-anxious presence" amid uncertainty to reassure people worried about job loss. This empathetic leadership is crucial. He also noted that AI journeys for companies, governments, and society will involve both short-term sprints and longer-term marathons. While moving fast, leaders cannot forgo security, customer value, and business continuity. We must balance thoughtfulness, empathy, and care for people with the urgency to innovate for shared prosperity.
During the discussion, Teresa highlighted the need for clear governance frameworks and guidelines around ethical, fair, transparent, and legally compliant AI deployment in the public sector. This responsible AI approach builds trust and mitigates risks as government agencies adopt AI.
Rhonda also suggested creating formal programs to identify roles needing upskilling, "ringfencing" those employees, providing educational resources, and guaranteeing jobs after reskilling is complete. This thoughtful change management reduces anxiety and distrust, while promoting psychological safety.
Also: Beyond programming: AI spawns a new generation of job roles
David also highlighted that -- now more than ever -- leaders need to provide a steady, "non-anxious presence" to reassure people that it will be OK through this transition, even if the outcomes remain uncertain. This means openly acknowledging people's fears, showing genuine empathy, communicating transparently, and co-creating solutions.
Together, the speakers emphasized that responsible and ethical AI adoption requires empathetic change management and responsible governance frameworks. Leaders should promote psychological safety through transparency, reskilling support, reassurance during uncertain transitions, and co-designing solutions tailored to people's needs.
Deployment empathy also means recognizing that AI adoption, and managing its unintended consequences, requires cross-sector collaboration among government, academia, civil society groups, and business. We need new social contracts for labor displacement and other seismic economic shifts catalyzed by AI.
Also: AI is changing cybersecurity and businesses must wake up to the threat
Cumulatively, the speakers highlighted the societal leadership challenges posed by AI and emphasized that leaders have a duty to support their workforce through AI adoption with empathy, communication, and responsible governance. Leaders must champion deployment empathy both inside and outside their organizations. Combined with radical collaboration, clear governance guardrails, and compassionate communication, deployment empathy can help leaders guide their organizations through the AI-driven transformation in a productive way.
This article was co-authored by Dr. David Bray, Principal & CEO at LeadDoAdapt (LDA) Ventures.