The International Federation of Reproduction Rights Organisations (IFRRO) and its member organisations of authors, artists and publishers acknowledge the EU’s proposed AI Act. However, to effectively address the risks faced by authors and publishers, it is important to clarify the terminology relating to the “obligations of the provider of a foundation model” regarding generative AI (Article 28b(4)).
IFRRO is the leading organisation representing collective management organisations for text and image materials (known as Reproduction Rights Organisations, or RROs). IFRRO’s mission is to develop and promote effective collective rights management to ensure that the copyrights of authors and publishers are valued through the lawful and remunerated use of text and image-based works. With over 150 members from over 80 countries, IFRRO plays a key role in the global copyright ecosystem. [more: IFRRO.org]
The EU’s proposed AI Act is an important piece of legislation at a time when AI technologies are developing rapidly and are increasingly being deployed.
How the AI Act can help address current challenges:
It is essential that the AI Act supports the cultural and creative sector so it can grow and thrive, sustaining culture, economic growth and job creation. Respect for copyright and licensing is essential, as is requiring transparency with regard to the use of copyrighted material in the training of AI systems. The integrity of copyright and licensing systems, which enable the remuneration of rightsholders, must be preserved.
With the important support of EU decision-makers, the AI Act, if further strengthened, can help address some of the key challenges currently faced by authors, artists, publishers and other rightsholders – and more broadly our economy and democratic society – for example, by ensuring:
– Full respect for the “acquis communautaire” in relation to copyright rules as well as the Charter of Fundamental Rights of the EU.
– Legal access / respect for opt-outs: ensure that AI models are only trained on legally obtained data sources, which are transparently reported, and that rightsholder opt-outs are fully supported, implemented and respected in all jurisdictions.
– Transparency relating to ‘input’: ensure providers of foundation models provide details of the copyrighted materials used, for what purpose and from where they were collected, and exclude sources that provide illegal access to copyrighted material.
– Transparency relating to ‘output’: inform citizens when works are AI-generated so they can distinguish between human creation and AI-generated works – promoting trust in all types of works. This indication also helps to avoid unfounded copyright claims.
We welcome that the EP has recognised the challenge posed by generative AI systems and has taken steps to address the risks faced by authors and publishers, in particular by introducing specific obligations for generative AI and emphasising the need to respect copyright law. However, in order to effectively address these risks, it is important that law-makers further strengthen and clarify the EP’s proposals, including some of the terminology relating to the “obligations of the provider of a foundation model” regarding generative AI (Article 28b(4)).
In particular, it would be necessary to develop further the obligation under Article 28b(4)(c) to “make publicly available a sufficiently detailed summary of the use of training data protected under copyright law”. The notion of a “sufficiently detailed summary” is currently unclear and, without a clear, consistent and practical implementation framework, could open significant loopholes in practice, leaving citizens unaware of the content used to produce an AI-generated work and rightsholders unable to enforce their rights. This could increase the probability of false or misleading content going unchecked, which could cause harm to citizens and society. It must be clear which works have been used for training, where they were collected and for what purpose.
Over the last decade, an immeasurable number of works has been scraped and (mis)used by AI developers without the permission of authors, artists, publishers and other rightsholders, and without remuneration. This is unacceptable, and it is vital that the damage already done to authors and publishers – as well as the damage they face in coming years without an immediate and robust regulatory response – is fully taken into account in the course of discussions on the AI Act.
We call upon EU law-makers to keep in mind the value of the cultural and creative sector for our culturally diverse and democratic society in Europe as they work towards a final agreement on the AI Act.
Learn more about the EWC Campaign agAInstWritoids