
Culture Eats AI for Breakfast

July 24, 2024 | Jamie Notter

Posted in AI

Everyone is jumping on the AI bandwagon, but very few organizations have considered whether their workplace culture is compatible with AI. We all know that “culture eats strategy for breakfast,” meaning that even the most brilliant strategy can fail if your culture does not value the behaviors that strategy needs to succeed. The same is true with AI. Remember, you built your culture before AI (as we know it today) existed, so odds are it’s messing with your ability to leverage AI. Here are some examples.

Inability to experiment. Most cultures today value the concept of innovation, but not the practices behind it. Running experiments and beta testing are not as common as they should be, because most cultures value people who are right, not people who try new things and sometimes fail. We call this pattern “incomplete innovation,” and it messes with AI at a fundamental level, because the only way to leverage AI is to experiment. Anyone who has used ChatGPT knows this: the first attempts typically generate horrible results. If your culture isn’t okay with failure, then most people will try AI a few times and give up.


Fixing things and stopping things. Another culture pattern we identified in our research involves agility: we’re good at moving forward quickly, yet we don’t take time to fix things that are broken or stop things that are no longer adding value. We call this pattern “heavy agility” (both heavy agility and incomplete innovation are covered in detail in our new book, Culture Change Made Easy). This, too, messes with AI, particularly when it comes to data. For years we tolerated multiple systems that didn’t talk to each other or that handled data in different ways, and we settled for manual workarounds. Now that is biting us, because AI needs data to work (so you’d better start fixing that right away).


Difficult conversations. This isn’t one of our culture patterns, but our research data suggest that many cultures struggle with difficult conversations. They avoid conflict in general, and senior leaders aren’t transparent about the tough choices they are making. This makes change much more difficult, and it messes with AI. We often see different departments working on AI applications independently, afraid to have the tough conversation about how to prioritize AI across the organization, and as a result they duplicate efforts or miss opportunities.

We’re developing some new materials related to culture and AI, so stay tuned for more on this. If you’d like to schedule a call with me to talk specifics about how your culture could be interfering with AI, you can do that here.

Jamie Notter

Jamie is a co-founder and culture strategist at PROPEL, where he helps leaders create amazing workplace cultures that drive greater performance and impact. He brings thirty years of experience to his work designing and managing culture, and has specialized along the way in areas like conflict resolution and generations. Jamie is the co-author of four popular business books, including the award-winning Non-Obvious Guide to Employee Engagement, and his fall 2023 release, Culture Change Made Easy. He holds a Master’s in conflict resolution from George Mason and a certificate in Organization Development from Georgetown, where he serves as adjunct faculty.