Everyone is jumping on the AI bandwagon, but very few organizations have considered whether their workplace culture is compatible with AI. We all know that “culture eats strategy for breakfast”: even the most brilliant strategy can fail if your culture doesn’t value the behaviors that strategy requires. The same is true with AI. Remember, you built your culture before AI (as we know it today) existed, so odds are it’s messing with your ability to leverage AI. Here are some examples.
Inability to experiment. Most cultures today value the concept of innovation, but not the practices. Running experiments and beta testing are not as common as they should be, because most cultures reward people who are right, not people who try new things and sometimes fail. This messes with AI at a fundamental level, because the only way to leverage AI is to experiment. Anyone who has used ChatGPT knows this: the first attempts typically generate horrible results. If your culture isn’t okay with failure, most people will try AI a few times and give up.
Fixing things and stopping things. Another culture pattern we identified in our research is around agility: we’re good at moving forward quickly, yet we don’t take time to fix things that are broken or stop things that are no longer adding value. We call this pattern “heavy agility” (both heavy agility and incomplete innovation are covered in detail in our new book, Culture Change Made Easy). This, too, messes with AI, particularly when it comes to data. For years we tolerated multiple systems that didn’t talk to each other or that handled data inconsistently, and we settled for manual workarounds. Now that is biting us, because AI needs consistent, connected data to work (so you’d better start fixing that right away).
Difficult conversations. This isn’t one of our culture patterns, but our research data suggests many cultures struggle with difficult conversations. They avoid conflict in general, and senior leaders aren’t transparent about the tough choices they’re making. This makes change much more difficult, and it messes with AI. We often see different departments working on AI applications independently, afraid to have the tough conversation about how to prioritize AI efforts, and as a result they duplicate work or miss opportunities.
We’re developing some new materials related to culture and AI, so stay tuned for more on this. If you’d like to schedule a call with me to talk specifics about how your culture could be interfering with AI, you can do that here.