Key Challenges in AI Adoption for Businesses

Author: Andrew Simmons


Continuing from part one of the webinar, which explored how AI is disrupting industries and redefining business models, this second section focuses on the challenges businesses face when adopting AI. Adopting AI is a complex journey that requires more than just technology: it demands cultural shifts, proactive leadership, and robust data governance.

 

Hear from:

 

  • Phil Le-Brun, Enterprise Strategist at Amazon Web Services, Speaker 
  • Mary Purk, Executive Director and Co-Founder of AI at The Wharton School, Speaker 
  • Andrew Simmons, Retail and CPG Practice Lead at Wavicle Data Solutions, Moderator 

 

In this discussion, these AI experts explore issues like leadership hesitation, cultural resistance, and the need for robust data governance, and they share their insights on overcoming these barriers and building a foundation for successful AI integration.

 

Watch the complete webinar here or keep scrolling to read a transcript of the discussion between Phil, Mary, and Andrew:

 

 

Andrew: I’ll start with you, Phil, as we discuss some of the challenges to harnessing AI’s power. Something I hear from the C-suite in my practice is – and perhaps it’s a legacy of other technological innovations in the last 25 years – “I think I’m going to let the market figure this out a little bit more. I want to be a fast follower or even a slower follower. I’m going to keep my powder dry.” What would your message be to leaders taking that stance regarding what they should be doing? Is that the right mental model here, given the 18- to 24-month inflection point Mary describes as an imperative?

 

Phil: Quite simple. Start being a leader. A leader isn’t a title. It’s what you do. How much longer do we have to wait to capitalize on what we have at our fingertips? Twenty years ago, people were waiting to see what happened with the cloud. Fifteen years ago, people were waiting to see what happened with data. Now, it’s like, how much longer are we going to wait?

 

We know there are certain imperatives around getting ready for the future. I’m not a great believer in using crystal balls. The only thing that happens when you play with crystal balls is you cut your hands on broken glass. But one fair prediction is whatever you want to do in the future, whether it’s generative AI, AI, ML, insights into your business, quantum, or whatever the future holds, you’re going to need a solid foundation for data, and you’re going to have to be able to use that data. So why would you not make that no-regrets decision today to make sure you have the foundation in place? Yes, there’s a technology piece to that, making sure that you have data in a single place that is secure and accessible.

  

Those are solved problems. There are organizations out there that have been doing that for 20 or 30 years now, so why would we wait to resolve that in an organization today? But the bigger part we see is actually more on the culture, leadership, organization, and skill set.

 

We know quantifiably that the majority of data issues are things like leadership behavior: the leader who stands there and says, “We’re going to be a data-enabled organization, but I’ve been here 20 years, so I know what the answer is,” immediately dissolving that desire for employees to go off and actually use data. Or data being trapped in silos because folks believe that’s their power base. Or, most obviously, data literacy. We train people, and we are brought up as leaders, to be versed in descriptive analytics: How many tires or Big Macs did I sell last Wednesday? That doesn’t mean you can use predictive or prescriptive analytics: How many tires am I going to sell next Wednesday, and what can I do about it?

 

So, it’s about raising the bar on data literacy across the entire organization and creating a culture where people can take that data not just to get insights, because insights on their own are only mildly interesting, but to drive those insights into action, pushing the data to the frontline as much as possible. There are many other things that can be done, but it starts with creating a culture where people want to use the data, not just to answer questions, but to find questions they hadn’t even thought about asking previously.

  

Andrew: Mary, Phil indicated a couple of things I’d love to get your responses to as a follow-up, but I’ll start here. He made the point that having a data foundation from which to build and scale AI in the future is a no-regrets imperative. When you think about the challenges, from your vantage point, for organizations deploying and building out that data foundation, what are you seeing out there?

 

Mary: Well, the good news is that everyone realizes how important data is, and there is this connection between data and models. And you know, garbage in, garbage out. So, there is that basis of data literacy.

   

But to Phil’s point, he says, “We need to raise the bar and explain to people that those are table stakes.” As Reid Hoffman said on an HBR IdeaCast podcast, “Wayne Gretzky skates to where the puck is going to be, not where it is.” That’s how I would even take data literacy. You need to educate your teams to where you need them to be, not where they are, and that’s how you can get into predictive and prescriptive analytics.

 

Frankly, we should change those names to make them a little more user-friendly. But you know, that essentially is what it is, and that is within the culture. I also appreciate that Phil said you want to encourage your teams and employees to use the data. So that’s where this generative AI comes in, because its adoption was a phenomenon: how many people downloaded it, like a million or five million downloads in the first week, then 100 million.

  

Andrew: It gave TikTok a run for its money.

  

Mary: It was just crazy good. Leaders should take that upon themselves and say, “Oh my God, we have all this excitement. Now, let’s really add in the data piece, because these models only work as well as the data.” We now have the LLMs we might be bringing to our enterprise. But then we have all our proprietary data. And why is that so important? As Phil said, we could figure out if we can sell more Big Macs on the South Side of Chicago on a Wednesday morning because of XXX, so why aren’t we doing that?

 

People can say, “I can solve these questions now because it’s not so hard to get the data, but I need to double-check and make sure that data’s good, and that’s why I’m important.” That’s the human in the loop, and again, it’s the model, the data, and the people together that create good decisions.

 

That’s why I love data. Luckily, we have so much of it that we can finally do something about it, and it’s not going to be cumbersome. Thanks to companies like AWS and Wavicle, we should be able to use data really easily. And now we have our models, so work should really be a lot of fun.

   

Phil: I just want to pick up on something you said, Mary, because it really struck a chord. A lot of leadership’s role now is setting the right problems to solve and then letting people get on with it. That’s what makes work fun. The idea that the leaders determine what needs to be done because they have all the knowledge worked well in the 19th century. Now, it’s the leaders coming up with those big hairy problems and giving them to teams who have the data to really look at, “What hypothesis do I need to come up with? What experiments do I need to run?” That makes work fun and is massively impactful in terms of the buy-in from your own employees.

  

Mary: Do you know what else that means? People are going to be looking for different types of leaders. You have to have that ego in check, you have to be vulnerable, and you have to know that you’re going to win as a team. You have to be willing to give authentic congratulations to individuals rather than keeping the credit to yourself. That, to me, is also going to be really critical within leadership.

  

Andrew: You know, Phil, one other thing Mary indicated that resonated with me, and that I’d like to get you to comment on, is descriptive reporting. It has been useful over the last 25 years, whether you’re reporting to the street or trying to figure out a sales commission to the dollar, for example.

  

This is my opinion, but I think some of the implicit anxiety that we’ve injected into organizations and data professionals is that you have to be scared of data, and it has to be right. It’s got to go through dev, QA, pre-prod, pre-prod two, pre-prod three, get released, then QA again before anybody can use it. And part of AI is – Mary, as you indicated – it’s so exciting getting into a culture that is data-friendly and wants to use data. So, Phil, I’d be interested in your thoughts about how we can decrease anxiety and increase discovery in organizations when it comes to using their data.

  

Phil: I must have become a geek over the past few years because I love playing around with data. It’s amazing what you can find if you go into it with an open mind. So, a couple of things spring to mind. You know, I think the era of big data should have taught us something, which is that data quality is really important. When it comes to threats to life and such, you want the data to be accurate, but for much of what we do, what we’re looking for is big trends.

 

Machine learning is ideal for that. It’s less about the individual data points; it’s more about what you can observe in the trends. I see a lot of individuals who get that and start to apply the creative side of their minds: What questions should I be asking, rather than what answer am I seeking? And then, you’re absolutely right: make data fun to use.

 

We talk about building data into the average employee’s workflow, rather than making them go down a particular route of “you have to use this tool, get this person’s permission, and get sign-off.” I know it’s an overused phrase, but democratize access to data as much as you can. I think Stanley McChrystal, in Team of Teams, talked about sharing data until it was nearly illegal.

 

So, take the approach of providing as much data as possible and then giving people the tools they are comfortable with. That could be QuickSight, Power BI, SAS, R, or Python. We did that at McDonald’s, and it was really interesting: if you make it easy for people to use data because they’re using tools familiar to them, they’re more inclined to do so. It’s obvious, isn’t it? But it’s basic change management: How do you enable users to get to the data, as opposed to making them jump through the hoops you’ve put in the way of them getting to it?

  

Mary: Can I just comment on one thing? What Phil said is so key. He just said it’s basic change management, and that’s what’s great about what’s happening with generative AI and the fact that a lot of us already have our data on the cloud. People have all these clouds, and they have these security systems and everything. You have to be secure about your models, whether they’re going to be enterprise, on-prem, or all these other things, and you have people who make those decisions. But it’s basic change management. Change is hard, but it’s not like you need to get a PhD in, you know, mechanics or physics or whatever. A lot of people can make this happen and be transformative in their company. That’s what will be really exciting to see.

 

Andrew: Absolutely. Mary, one quick follow-up question, just to put a button on this particular subject. When you think about data foundations, many organizations, to both of your points, have been in that space for 25 to 30 years and have programs to improve data quality and access through data governance, et cetera. What role do you see those traditional data governance programs playing in enabling quality data on a data foundation and growing into the AI governance space?

  

Mary: Well, first and foremost, that is one term that companies need to educate themselves about, as well as their employees and their board – all the board members. It’s not just one committee around governance; it permeates across all the different departments.

 

I’m sure this is happening across many companies, but I have specifically seen Walmart put out the governance they’re doing across all their different functions. They have a responsible AI governance policy that they are having all employees sign, and part of it touches on different areas, such as data security. But it also talks about what they’re going to be doing to protect customers’ data.

 

That, to me, is an action: they decided to educate their whole ecosystem about what they’re going to do with governance. It can take different shapes and forms based on your industry, employees, and customers. But it needs to be much broader – a literacy around governance – and it’s inextricably tied to data, which is the lifeblood of almost all companies at this point.

  

So that’s how I see governance now. Can they do different things? Can they have an AI committee that dictates how you prioritize your use cases? There’s a wide variety of things that AI governance can be part of. You just have to decide how you set up the governance structure, and how you can expand it so that it’s not just within the audit function or the CIO’s office, but permeates across the organization. Everyone’s affected by HR, so it’s a similar thing with governance. That’s what I would say about that.

  

Andrew: Absolutely. That’s a terrific example – governance federated across the organization, just like HR.

 

Stay tuned for part three of this webinar series, where the panel will discuss how to prepare for an AI-driven future. You can also read more of our AI thought leadership here.