CARFAC-RAAV’s recommendations regarding AI and visual artists


Canadian Artists’ Representation / Le Front des artistes canadiens (CARFAC) and le Regroupement des artistes en arts visuels du Québec (RAAV) are the national associations representing visual and media artists. Together we work to affirm the economic and legal rights of artists, many of whom have been deeply impacted by the nonconsensual use of their intellectual property by companies who use generative artificial intelligence (AI).

In 2023, the Government of Canada produced a Guide on the use of Generative AI, which describes what generative AI is and introduces some of the main challenges, concerns, and potential solutions that have been identified. On January 15, 2024, we submitted briefs to the Government in response to a consultation on copyright in the age of generative AI. Our submission draws on over 220 unique responses from artists to a national survey regarding their concerns about generative AI products; community dialogues at virtual panel events organized by CARFAC and RAAV in Ontario, Quebec, and Saskatchewan; and hundreds of hours connecting with artists and stakeholders across the cultural sector in Canada and abroad.


AI and the potential for copyright infringement

We are concerned about how generative AI companies interpret the laws that apply to their business models. Midjourney recently updated its terms of service to clarify that users may not use the product to violate others’ intellectual property rights, and that doing so may result in the company taking legal action against the user. Yet the company has itself been accused of copyright infringement on multiple occasions. The terms of service go on to state that the company does not guarantee that the service does not infringe on copyright.

Additionally, OpenAI recently launched a program called Copyright Shield, which promises to pay legal costs for its developer customers who face lawsuits over intellectual property claims. Assertions that these business practices comply with the Copyright Act are inconsistent with such protective measures. They also raise the question of why some AI companies are signing licensing deals with major publishing and media providers while arguing against the use of licensing models for all content being used as training data.


Clarity around copyright and Text and Data Mining (TDM)

Using artwork obtained through TDM to train generative AI products without allowing artists to provide consent, negotiate compensation, or determine if/how they will be credited violates those artists’ rights under the Copyright Act. The Federal Government’s consultation is an opportunity to educate generative AI companies and ensure they comply with the law.

Questions about moral rights are notably absent from the government’s survey; however, the violation of artists’ moral rights is inevitable based on the current business models used by most generative AI companies. It is common for generative AI models to distort original works which may harm the reputation of artists, and artists do not commonly have the choice to be credited or remain anonymous. Generative AI also enables an environment in which artists are unable to protect their works from association with causes, products, services, or institutions to which they are personally opposed. 

Eighty-two percent of artists responding to our survey indicated they were very or extremely concerned that their artwork is used without consent to train generative AI products. This concern is so deep and widespread that independent countermeasures are required. For example, the University of Chicago has developed Glaze, a tool that artists can use to protect their works online from becoming AI training data. While these efforts are appreciated, protecting the intellectual property rights of artists against non-consensual use by some of the world’s largest corporations should be done through copyright law and federal regulation. It should not rely solely on these independent initiatives.    

Terms such as “training data” are used frequently, and this language can devalue a creator’s intellectual property. For artists, this is not “training data”; this is their life’s work.


Challenges faced by rights holders in licensing their works for TDM activities

The plurality of stakeholders, the legal uncertainty, the need for more transparency in data management systems, and the opacity of AI systems present insurmountable obstacles for artists seeking to defend their copyrights without support. The critical challenge currently faced by Canadian rights holders in licensing their works is the inability to determine which copyright-protected works have trained generative AI products; this opacity prevents parties from negotiating licensing terms and stifles the development of emerging licensing markets.

It is also challenging to establish that the infringing party had access to the original work, that the original work was the source of the copy, and that the work was significant in informing the creation of the new content produced. However, we understand that AI developers and researchers in the sector document their training data. Greater transparency of this data with rights holders is therefore technically feasible.

Most artists have not yet had the opportunity to negotiate licenses for their works already used to train generative AI models. Though many mainstream generative AI companies do not employ licensing models, such business frameworks exist within the AI industry. Getty Images, for example, has released an AI Image Generator trained exclusively on its content, and Getty compensates creators for the use of their work in its AI model.

Resistance from generative AI companies to engage in licensing negotiations with the arts sector is another critical challenge in establishing a market-based approach to consent and compensation for artwork used in TDM. Meta, for example, has argued that imposing a licensing regime after the fact would cause chaos for the industry and result in little benefit for artists, given the insignificance of their respective works within larger datasets. 

Already, however, we are seeing contradictions in these arguments. OpenAI recently entered a licensing deal with Axel Springer, the parent company of Business Insider and Politico; such an agreement could become the norm. Companies that regularly violate the Copyright Act should not benefit from an exemption on the grounds that those actions have already occurred. Even when the financial value of an individual work is small, this does not preclude the rights of artists to provide consent and receive payment for the use of that work.

The Canadian Government should avoid entertaining arguments that complying with the Copyright Act and obtaining prior consent from artists would slow the development process of generative AI products. While AI technology may be complex, basic principles of fairness, justice, and asking permission before taking things are straightforward and baked into Canadian laws and social values.

Generative AI companies are rightfully excited about the products they produce and understandably feel a sense of urgency to accelerate the development of those products. Artists are no different; we must regard their needs with the same level of importance, innovation, and urgency. Neither group can be permitted to operate outside of the law or develop their products in ways that harm individuals or society at large.  


Amending the Copyright Act to clarify the scope of permissible TDM activities

The Copyright Act is sufficient and applicable to protect the rights of creators in the context of generative AI. There is no reason to believe that current copyright laws do not, or should not, apply. A situation in which private companies use, without permission, the copyrighted works of Canadian artists to develop and grow the value of their commercial products is precisely the kind of scenario that the Copyright Act should prevent.

The Federal Government should refrain from implementing new fair dealing exceptions for TDM. Doing so would devastate the economic environment for Canadian artists – many of whom live at or under the poverty line. A TDM exception would result in long-term negative social and cultural externalities, including compromising the global competitiveness of Canadian arts and culture and harming small creative businesses. 


Proposed obligations on AI developers to keep records of copyright-protected content

The current environment must enable rights holders to determine if their works have been used to train generative AI models. An opaque operating model both encourages the unauthorized use of Canadian artists’ works by AI developers and prevents licensing negotiations from taking place. We, therefore, recommend that generative AI companies be required to publish records of copyright-protected works that have trained AI models.

Developers and researchers in the generative AI sector are already documenting their training data, for example, using model cards. Model cards can record structured information, such as the names of domains where training data is collected. Therefore, introducing a record-keeping obligation should not entail additional costs for the AI industry and would provide much-needed transparency.


Remuneration for the use of a given work in TDM activities

Canadian artists face growing labour disruptions due to the proliferation and use of AI-assisted and AI-generated content by businesses and consumers. This trend is unsurprising, as organizations that once licensed the use of original works can now use generative AI to meet their needs without paying creators.

Artists and generative AI companies should negotiate remuneration for licenses without government intervention. This could be facilitated by collective licensing options that simplify the process so artists do not have to negotiate with companies on an individual basis; this option is already being explored in several countries.

The Government can enable a market-based solution by ensuring that the generative AI companies operating within Canada comply with current Canadian copyright law without exception, and that records of copyright-protected works that trained AI products are made public. Generative AI has negatively impacted labour opportunities in our industry. The Federal Government can contribute to stabilizing this fallout by ensuring that generative AI companies operating in Canada adopt appropriate licensing models.


Authorship and ownership of content generated by AI 

Existing copyright laws are sufficient to address authorship and ownership, and no legal amendments are required. As the Supreme Court of Canada noted in CCH Canadian Ltd. v. Law Society of Upper Canada, “An original work must be the product of an author’s exercise of skill and judgment. The exercise of skill and judgment required to produce the work must not be so trivial that it could be characterized as a purely mechanical exercise”.

These same criteria should apply when evaluating the granting of copyright to AI-produced or AI-assisted works. Entering a series of text prompts into an AI Image Generator is decidedly a “purely mechanical exercise” that does not require the user to exercise “skill and judgment.”

There may be, however, other situations where AI-generated or AI-assisted artwork meets the criteria for copyright protection. For example, suppose an artist designs an AI model and trains that model with their artwork so that the model can understand and interact with the training data in unique ways specified by the artist. In that case, the content resulting from this process may meet the copyright criteria.

The question of authorship of AI-generated works is essential but difficult to consider in a landscape where private companies use Canadian artists’ intellectual property without consent, credit, or compensation to develop their products and increase the value of those products. It is concerning to suggest that generative AI companies could continue developing their products using Canadian artists’ works without authorization while the question of whether the resulting content should receive copyright protection is simultaneously being considered.

The impact this would have on the creative economy in Canada would be profound and difficult to predict, though it is essential to highlight the specific effects on Indigenous artists and communities. As the theft of original Indigenous cultural expressions is already widespread, its unauthorized use by generative AI companies is unconscionable and contrary to notions of Truth and Reconciliation. Moreover, including counterfeit Indigenous artwork in training datasets accelerates the spread of counterfeit imagery, and the generation of AI content based on authentic or fake Indigenous artwork cannot be permitted. This issue deserves a separate analysis and consultation, with thorough consideration of the work already done by the Indigenous Protocol and AI Working Group.


Infringement and liability

When generative AI companies use Canadian art to train their models and build the commercial value of their products without consent, they commit copyright infringement and must assume liability for those actions.

Requiring generative AI companies to keep and publish records of copyright-protected works used to train their models will address the large-scale copyright infringement that has already happened and provide parties involved with the information needed to negotiate terms for using those works. This will enable the development of licensing markets and strengthen Canada’s creative economy while potentially accelerating growth and competition within the AI industry itself.



RAAV and CARFAC do not wish to hinder the advancement of AI, but we need to preserve the balance that the Copyright Act underpins, and we must uphold the interests of copyright holders. Indeed, we see the potential of AI: if adequately regulated, it could fuel creativity, promote content discoverability, and equip creators to defend their rights.

Nevertheless, it is essential to be aware of the negative impacts that AI can have on all sectors, on the foundations of our society, and the rights of artists. As generative AI profoundly impacts the cultural industries, creators must be centrally involved in developing the governance and policy frameworks affecting our sector, and invited to participate in the design of future consultations. 

In summary, our primary concern is to ensure compliance with the Copyright Act. The “3 Cs” principle (consent, credit, and compensation) must guide the Government’s actions in this public consultation and any potential amendments to the Copyright Act. Our requests are consistent with those currently being made by artists in other countries.

Creators’ consent must be obtained, and the Government must not undercut their options to be paid when their content is used for text and data mining purposes. We also recommend that a transparency obligation be imposed on users. Specifically, this framework should require disclosure of any works used in the context of generative AI. Such a mechanism is feasible and poses no technical difficulties for generative AI companies. Rather, it would lay the foundation for fair and equitable remuneration for artists and copyright holders.