AI for Cultural Ecosystems: Ally or Enemy?

What AI Means for Artists and Creativity 

AI has transformed creative practices by providing innovative tools and techniques that enhance artistic creation, streamline workflows, and inspire new forms of expression. Two aspects of AI and art are centrally important: 

  • AI is used for creative purposes and generating artworks.

  • AI is used for art analysis and employed on digitised artwork collections for user-friendly collections management and data sign-posting, as shown by the recent collaboration between the London Museum and OpenAI.

In music, artists like Holly Herndon have been experimenting with AI for more than a decade, while tools like AIVA, Amper AI, Endel, and BandLab represent the vanguard of the AI composer toolkit, combining various musical elements to generate a song on demand. Along similar lines, some companies sell royalty-free library music for use in games, ads, and other online content, where generative AI is frequently employed. In the hands of artists, AI has the potential to be another step towards greater accessibility of music production tools. As with digital software before it, AI can empower creators to inexpensively produce and digitally distribute their music worldwide. At the same time, AI can challenge traditional notions of artistic creation and expand the boundaries of human creativity, for better or for worse.

There is also a growing number of partnerships between AI companies and major record labels, such as the collaboration between Endel and Universal Music Group. In 2023, they signed a deal to create 50 AI-generated, “wellness-themed” albums. One outcome of this collaboration was a series of remixes of Roberta Flack’s GRAMMY Award-winning cover of "Killing Me Softly With His Song" for its 50th anniversary. The practice of taking old recordings and finding new ways to monetize them is likely to become increasingly common. However, as significant as AI is today, it cannot replace the relationship between an artist and their fans.

Criticism and Regulation

Organisations like PRS for Music are playing a pivotal role in stewarding the responsible development of AI in music. PRS’s AI principles establish a balanced approach: they avoid outright rejection of AI while implementing pragmatic governance to mitigate risks such as plagiarised outputs, diminished human agency, or creator exploitation. The focus remains on empowering and protecting PRS’s human composer members as they thoughtfully determine if and how to incorporate AI’s emerging creative capacities. These principles aim to provide initial guidance, mandating transparency through clear attribution when AI is involved, while still reserving full royalty payments and membership benefits for human creators who maintain oversight of the process. AI raises ethical and legal concerns about some practices, including the potential infringement of intellectual property rights through indiscriminate scraping of copyrighted content. 

Goldman Sachs Report Says AI Could Put 300 Million Jobs at Risk

An increasing number of organisations challenge the technology's perceived value. Even banking institutions like Goldman Sachs argue that Generative AI has limited potential and is not worth its high costs. Outside of ChatGPT, few AI products are widely used, and there are serious concerns about the sustainability and scalability of AI, including the need to rebuild America's power grid for further growth. 

The demands AI places on the environment cannot be overstated. A recent report published by the Harvard Business Review notes that the training process for a single AI model “can consume thousands of megawatt hours of electricity and emit hundreds of tons of carbon. This is roughly equivalent to the annual carbon emissions of hundreds of households in America.” The report further notes that the significant amount of water required to cool data centers puts an unprecedented strain on already threatened fresh water supplies. As the demand for AI continues to grow, so too will the potential for significant environmental pollution and degradation. With a single query to ChatGPT requiring the same amount of energy as it takes to light a lightbulb for 20 minutes, it is perhaps unsurprising that many tech firms have scaled back their climate commitments in favor of AI development. Given the serious, and so far largely unaddressed, threats the environment already faces, reasonable minds should question whether the continued development of AI is a wise decision.

Billie Eilish, Nicki Minaj, and Stevie Wonder are just some of the many signatories of an open letter calling for protections against AI.

Artists have also expressed serious and justifiable concerns over the future of AI in the music industry. An open letter published last year by advocacy group Artist Rights Alliance called for protections “against the predatory use of AI to steal professional artists’ voices and likenesses, violate creators’ rights, and destroy the music ecosystem.” The letter was co-signed by over 200 prominent musicians, including Billie Eilish, Nicki Minaj, Stevie Wonder, Chappell Roan, Kacey Musgraves, The Cure’s Robert Smith, and more. Elsewhere, Nick Cave has warned against the use of AI for song-generation, arguing that “songs arise out of suffering, by which I mean they are predicated upon the complex, internal human struggle of creation [...] algorithms don’t feel. Data doesn’t suffer. ChatGPT has no inner being, it has been nowhere, it has endured nothing, it has not had the audacity to reach beyond its limits, and hence it doesn’t have the capacity for a shared transcendent experience, as it has no limitations from which to transcend.”

In the US, legislation that is responsive to these concerns has been aimed at protecting artists' interests. Federal regulations such as the No Fakes Act, the No AI Fraud Act, and the Music Modernization Act have sought to give artists more control over the use of their voice and likeness, address AI usage of artist likenesses, and establish mechanisms for artists to collect royalty payments, respectively, with mixed results. The most robust legislation has primarily been enacted at the state level. Notably, Tennessee became the first state to protect artists from AI impersonation with the passage of the ELVIS Act in March 2024.

AI for Cultural Ecosystems

As a company working towards a better integration of cultural ecosystems into place development, Sound Diplomacy’s interest in AI goes beyond enhancing creative practices and productivity. For instance, can AI-driven tools offer policymakers insights into the impact of the cultural and creative sector, aiding in informed decision-making regarding funding and support? Moreover, can AI assist policymakers in identifying and tackling issues related to diversity, equity, and inclusion within their cultural sector? To do so, the best approach would be to develop tailored AI tools for different places, instead of generic models like ChatGPT.

Simultaneously, we can imagine that AI could help to address the rising calls for greater transparency, diversity, and accountability. Here are some ways in which AI can support cultural ecosystems on these topics:

  • Expand access with the digitisation, categorisation, and preservation of cultural artefacts and works of art.

  • Bolster cross-cultural communication, for example through automatic language translation and automatic subtitling of cultural content. 

  • Make interfaces more intuitive for users with visual or hearing impairments, and make culture more accessible overall (for further reading, see our post on accessibility and cultural events).

As a result, public and private actors aiming to enhance places and cultural ecosystems might consider the following recommendations:

  • Promote AI and data literacy through courses and conferences for internal teams within local institutions, as well as representatives from urban agencies, cultural organizations, cultural and creative industries, artists, and civil society. These should cover topics on current AI applications, normative ecosystems, underlying technologies, advantages, technical and social challenges, and key players involved.

  • Encourage the development and use of new and diverse in-house AI technologies, addressing cultural biases, bottlenecks, and economic concentration. This may also include the creation of AI tech clusters within local creative hubs, tailored to more local contexts.

  • Establish an internal advisory tech committee within cities and regions, focused on AI and digital technologies to guide regulation and strategic developments that integrate cultural activities. Foster cooperation between city offices and cultural institutions.

  • Create a charter against AI-enabled harassment, malicious use of artistic work, and other detrimental actions, as a foundation for a democratic information ecosystem at the local level. Such a charter should encourage the creation of open datasets that are freely accessible, ethically and locally sourced, ensuring compliance with data protection standards.

The full effect of AI on human creativity has yet to be determined, and it is unlikely that either decidedly utopian or dystopian predictions can fully account for what the future might hold. While a wide swath of international musicians have expressed reservations about AI, dedicated and detailed legislation to structure its use is already being drafted and implemented. Used properly, AI offers significant potential for increased accessibility for creators and audiences alike. In order for AI to ultimately yield a public good, efforts must be made to ensure its sustainability, its accessibility, its locality, its fairness, and its dedicated use for the growth, rather than restriction, of communal creativity worldwide.
