AI Guiding Principles
Recent advancements in AI technology, including the proliferation of generative AI tools, present unique opportunities and challenges for the Australian screen industry.
Screen Australia’s approach to AI will be guided by the key principles below. This is a rapidly evolving area, and the agency is closely monitoring the developing regulatory landscape in Australia and overseas, including the National framework for the assurance of artificial intelligence in government and the Digital Transformation Agency’s Policy for the responsible use of AI in government. These Guiding Principles are intended to complement the Commonwealth Government’s AI regulations and policies, and the agency reserves the right to update them as new developments arise.
- Talent, creativity, culture and the individual. We will prioritise the human talent, creativity and culture that are the heart of Australia’s screen industry and the content it creates. This includes ensuring that the rights of screen practitioners are adequately protected, including in relation to the use of their personal information and intellectual property in training data, prompts or any generated outputs from AI systems. Indigenous Cultural and Intellectual Property (ICIP) rights must be respected and protected in any use of AI.
- Transparency. Use of AI should be based on trust, which in turn requires transparency. Screen Australia, its stakeholders and the wider industry should be informed about how and when AI may be used, for what purposes and who may be impacted. Audiences should similarly be informed about the use of AI in screen content they consume.
- Ethical use of AI. Screen Australia supports the ethical use of AI systems and encourages the application of Australia’s AI Ethics Principles in the design, development and implementation of AI.
- Diversity, equity and inclusion. We encourage active consideration of how AI tools may be used to increase diversity, equity and inclusion. Their use should not result in discrimination against any individual, community or group, or perpetuate societal injustices.
- Fairness. In keeping with the ethical use of AI, negotiations between all parties involved in screen projects should be consultative, with consent obtained from all impacted screen practitioners and other rightsholders. The remuneration and terms upon which screen practitioners may consent to the use of AI in relation to their content, likeness or performance must be fair.
- Responsibility and accountability. Responsibility must be taken for any use of AI systems. This includes ensuring that the proposed use is informed and that there is sufficient governance and oversight, with clear lines of accountability. Appropriate risk assessment, due diligence and security measures must be implemented, particularly in relation to the handling of data, intellectual property, and personal and confidential information. Processes should be put in place to continually test and challenge the use and outcomes of AI systems.
Changes will be introduced to Screen Australia’s application processes in due course, requiring the disclosure of any use of AI tools in application forms and supporting materials. Applicants are advised to watch for updates regarding these changes on Screen Australia’s funding pages.