Generative AI needs an ethical reboot

Friend or Foe? Attitudes to Generative Artificial Intelligence Among the Creative Community

New research among UK creatives reveals more than three quarters of content rights holders are aware that AI is using their work without permission or payment

  • Almost half of respondents think that employees will try to pass off generative AI content as their own work
  • 53% said that AI should be ‘paused’ to enable the sector to ‘catch up’ on regulation
  • Key findings from the report will feed into CLA’s strategic response around data mining rights

New research from Copyright Licensing Agency (CLA) reveals that creatives, including authors, photographers and visual artists, are conflicted about whether artificial intelligence (AI) is a good thing for their livelihoods and the wider UK creative economy.

Commenting on the research, Mat Pfleger, Chief Executive at CLA, said:

“Our research revealed a general belief that AI will help advance the creative industries​, but there are also major concerns. Chief among these is that generative AI is not ethical in how it is currently working.”

When CLA asked creators and rights holders if they were aware of their work being used to train large language models without their consent, 100% of the publishers said they were. Across the wider group of respondents, the figure remained high at 76%.

This clearly demonstrates that the owners and operators of generative AI are currently not compensating copyright holders fairly, or at all. The question was then whether respondents believed this to be a threat to their own livelihoods and to the wider creative sector’s ability to continue contributing significantly to the UK economy.

79% of respondents believed that the UK’s ability to earn from its creativity will be impacted by AI. Only 6% of those polled were not at all worried by this and saw only an upside.

Stop and think?

Alongside worries about loss of income, there were also concerns about integrity. Almost half of respondents thought that people will try to pass off generative AI content as their own work. 70% of publishers were also concerned that generative AI will become too much of a crutch and that people won’t think for themselves.

Given these concerns, CLA asked whether the development of artificial intelligence should be stopped to protect the creative sector. 53% said that AI should be ‘paused’ to enable the sector to ‘catch up’ on regulation; 25% said it should be stopped altogether.

Trust the system

Despite the preference for a pause, there was confidence that the UK’s creative industries will be compensated fairly for their works being used by artificial intelligence: 77% of respondents were confident, and 27% very confident, that AI firms will eventually deliver fair compensation.

Commenting on the next steps around fair compensation, Mat Pfleger continued:

“Large language models ingest the work of creators in order to be trained. At present, only the owners of the AI are benefitting financially, and that is bad for the creative industries at an individual and collective level. In August, we developed principles for safe, ethical, and legal generative AI development. Now, we’re developing licences to ensure that creators have choice and can receive fair compensation where their content has been or is being used to train generative AI.”
