Do as You Should, Not as You Can
Let's keep it a stack. Artificial intelligence (AI) is poised to revolutionize communication and entertainment. And, to be clear, this post is neither intended nor designed to be anti-AI; its purpose is to underscore how powerful a tool AI can be. In that vein, it must also remind us that as the technology advances quickly, we need to examine AI's impact on creatives and the wider communication industry.
The [Technical] Power of AI
AI has been transforming creative output for decades. Its ability to speed the path from idea to design element can greatly benefit artistic work, a point reinforced by AI's prominence in today's conversation. However, when businesses join that conversation, their approach often seems short-sighted, even as they hail AI as the wave of the future. Consequently, it feels as if we need to, once again, revive the discourse about the ethical boundaries and responsibilities that come with using AI in the creative process. Case in point: Alicia Keys at the Super Bowl LVIII Halftime Show.
In Situ: In the Key(s) of Asking Questions About the Halftime Show Replay
If you were one of the 129.3 million viewers of the Super Bowl LVIII Halftime Show, you likely noticed Keys' first notes were less than pristine. Hey, she's human. Yet if you caught the replay published by the NFL and Apple Music, you may be wondering what in the blue hell I'm talking about: in that version, the wobbly opening has been smoothed over. Although an alteration like this is likely innocuous in this situation, retouching her performance into something she didn't actually deliver should make us think about how we address foreseeable ethical quandaries in the future. Once again, I'm not here to ring the alarm about this instance. But the fact that we can do this implies the need for boundaries, and not setting them has big implications when we hold such a powerful tool.
Colonization by Way of Technology: Anticipating and Preventing Abuse
I've seen a version of what may happen in the near future play out before. My warning is about the specter of technological colonialism: a cautionary tale meant to head off exploitation by advanced technology. We must examine the insidious prospect of unauthorized appropriation and exploitation, and it is time to identify proactive steps to curtail such abuses. Doing so will prove crucial to artistic and cultural expression and to creativity as a whole.
Nurturing Responsible AI Application: A Collective Obligation
The next step is simple: a collective obligation to foster responsible AI application. By definition, this will require content creators, technology developers, communication industry professionals, and other creatives to prioritize ethical considerations. Working as a true cooperative serves everyone in the long run, and nurturing a culture of accountability and ethical discernment in this way is key.
Brand Health and Ethical AI Application: Upholding Integrity and Fairness
This would not be brand prix if I didn't mention how all of this applies to a brand. Why? Because the ramifications of AI's intervention in content creation and portrayal will affect your brand's integrity. Futureproofing standards around IP protection, fair use, and licensing is a start, but it should not stop there. Brands need to fortify their own frameworks around ethical activity, transparency, integrity, and fairness. We're already seeing AI experiments in music and entertainment, which means we have some catching up to do.
Cultural Fluency + AI Application: Fostering Respect and Sensitivity
Fostering cultural sensitivity and embracing it as an ethic of respect in AI applications is instrumental. I'm only one of many who have been calling for the development of learning built specifically around cultural fluency (note: if you're interested in collaborating to build something, get in touch with me). We have had entirely too many instances of tech platforms being culturally obtuse in the past. Thus, cultivating an ethos of respect for culture is a great step toward navigating the pitfalls posed by AI's improper use.
Closing Call to Action: Raise the Profile of an Ongoing, Inclusive Dialogue
AI is upon us. Studies in AI hyperrealism are already showing how hard it is for us to tell the difference between what is real and what is not. Therefore, it is incumbent upon us not only to elevate these conversations about the good and the bad of AI, but also to transform this discourse into plans that nurture an ecosystem of responsible application. It will take a concerted effort that champions responsibility, respects artistic integrity, and fosters cultural sensitivity to create a paradigm where AI fortifies, rather than compromises, our creativity. It doesn't have to be a pipe dream. This is something we all can do in the name of the best outcomes for everyone.