What will AI mean for the future of the design industry?

A debate around Artificial Intelligence (AI) and the role it will play in the future of the creative industries has some experts suggesting it will merely sweep up “uninteresting” repetitive work, while others see it as an existential threat to the industry.

The House of Lords Communications and Digital Committee met this week to hear from cross-sector leaders as part of the Creative Futures inquiry.

The session, chaired by Baroness Stowell of Beeston, heard from invited witnesses Paul Fleming, general secretary of Equity, the trade union for the performing arts and entertainment industries; Dan Conway, chief executive of the Publishers Association; and Dr Andres Guadamuz, reader in intellectual property law at the University of Sussex. The second part of the session, which received much press attention, saw the robot artist Ai-Da and her creator Aidan Meller appear, allowing the technology to “speak for itself”.

While AI was presented as a new technology, historical precedent was also mentioned: Paul Fleming referred to the 1930s and the rise of Hollywood, but also to the Luddites’ concerns about the “deceitful use of new technology” to undermine conditions and pay in the 19th century. Baroness Harding, meanwhile, took the committee back to 1436 and the advent of the printing press, saying that history shows us “that you can’t actually stop the new technology coming, but what you can do is either choose to embrace it or not”.

Where are we now?

The committee sought to explore current uses of AI in the creative industries, and to hear from industry leaders on its potential and risks for the future.

Fleming mentioned the use of AI in video game design, while Conway of the Publishers Association discussed how it was being used “across the value chain”, including in stock management and customer demand prediction.

The rapid pace of improvement in AI technology was at the centre of the session, so discussion turned to ways in which policy could mitigate harms without stymying potential growth.

Conway suggested that in many cases AI “[saves] the human creator from the jobs they don’t want to do”, such as an academic researcher using AI to collect all available information on a subject.

Fleming countered this with the example of radio jingles, which are increasingly synthesised but have traditionally paid human creators well: “it’s quite repetitive work so you have to pay quite a lot to incentivise an artist to do it”.

He highlights why this matters to the creative sector, which, despite being “worth more to the British economy than banking”, remains low-paying.

“Those areas of work which sustain [creative workers] through periods of low pay are about to be removed entirely from the market because of the speed of AI intervention”, Fleming says.

Intellectual Property

Regarding intellectual property (IP), Dr Guadamuz explains that UK IP law is unique in several ways. Under the Copyright, Designs and Patents Act 1988 (CDPA), the person who made the arrangements necessary for a computer-generated work to come into being holds the copyright in the work. But while the UK was the first to implement this, with other countries following suit, he adds that there is a lack of case law on the issue.

The other side of IP, Guadamuz explains, is liability: the question, “Could a robot infringe copyright?”

He raises the issue of datasets used to train AI, such as LAION (Large-scale Artificial Intelligence Open Network), which in its latest release contains over five billion image-text pairs, images and their captions scraped from the internet.

In UK law, text and data mining (TDM) is currently permitted only for research purposes. In the European Union, the 2019 Digital Single Market Directive includes an exception covering commercial purposes, leaving creators whose images, designs and artworks exist on the internet responsible for opting out of these datasets.

The UK IP consultation from June 2022 suggested going a step further, allowing an exception for all commercial purposes without an opt-out. This, Dr Guadamuz explains, is intended to encourage companies to establish their AI data-mining operations in the UK.

There were concerns about whether current law can keep up with future uses of AI, and about the proposed changes.

Conway suggests that the strength of the UK’s creative industries is down to balance, and “an appropriate legal framework that allows a market for creativity, allows a market for ideas, and allows creative businesses to grow and thrive”.

Fleming adds that if IP law is too weak, creators will choose to work outside the UK to protect their rights.

Quality over quantity

Many of the arguments for the increased use of AI involve growth. The committee discussed an “AI boom”, facilitated by the speed at which AI can replicate laborious human tasks.

The committee and witnesses expressed fear that this would result in a glut of “passable” works, with Baroness Featherstone stating: “If that’s cheaper than human beings, then what’s going to happen?”

A key question throughout was whether AI would replace or enhance human creativity. Baroness Bull suggests: “Yes, you can churn stuff out from a machine, or you can use the machine to create something that’s only possible with the machine and the human partnership.”

Fleming argues that AI creation is simply responding to the market. The question, he suggests, will become: “do we have a sufficient framework to be adequately subsidising human content, to intervene in that market”.

AI and data bias

Lord Foster of Bath suggests that data bias risks producing “more and more of exactly the same stuff” and missing out on creativity entirely.

On the other hand, Fleming stated that a bigger worry was that AI would compound structural bias: rather than opening up opportunities for deaf and disabled members and Black creators to access the market, bias could do the opposite.

“I think the real question of whether AI is a success for this country is does the creative workforce become more diverse, and does it get to share more equitably in the fruits that it creates.

“If we are in the situation where what we have is a shrinking creative workforce, or certainly a less diverse creative workforce, and if we have a creative workforce that remains as precarious, or becomes more precarious than it is now, then the AI intervention has failed”, he says.

Technology speaks for itself

Ai-Da Robot and Ai-Da Robot director Aidan Meller at the House of Lords. Photo: Elliott Franks.

The second half of the session featured Ai-Da, the ultra-realistic robot artist, and its director, Aidan Meller. When asked about the reasons for creating Ai-Da, Meller described it as an art project and “provocation” that came about after he became increasingly concerned about the lack of “much needed discussion and debate” on the impact of AI technology. He wondered if it would be possible “to critique and comment” by “the technology speaking for itself”.

Ai-Da was also asked questions by the panel, which had been pre-submitted to allow them to be processed, Meller explains. One question was about the differences between what Ai-Da creates and what can be created by a human. Ai-Da described how she uses cameras in her eyes, robot arms and several different algorithms, before adding, “how this differs to humans, is consciousness. I do not have subjective experiences, despite being able to talk about them, I am, and depend on, computer programmes and algorithms. Although not alive, I can still create art”.

Banner image: Ai-Da Robot at the House of Lords. Photo by Elliott Franks.
