European guidelines on the responsible use of generative AI in research

Our association was an invited contributor to the development of European guidelines on the responsible use of generative AI in research published by the European Commission on 20 March.
22 March 2024

The European Commission has issued new guidelines on the use of artificial intelligence (AI) in research, encouraging its use but warning of potential risks to the scientific process. The guidelines recommend that researchers avoid substantial use of generative AI tools in sensitive activities that could affect other researchers or organisations, such as peer review and the evaluation of research proposals. This is intended to limit the risk of unfair assessment arising from bias in the data sets used to train the tools, and to prevent unpublished work from being fed into AI models without the originator’s consent.

The guidelines aim to provide funding agencies, research organisations, and researchers with recommendations on promoting the responsible use of generative AI. They highlight the potential of AI to improve the effectiveness and pace of research and verification processes, produce texts in multiple languages, and summarise and contextualise a broad range of knowledge sources. The guidelines call for transparency about which generative AI tools have been used in research processes and remind researchers that they remain accountable for scientific output generated with AI. Research organisations should provide or facilitate training on using AI and monitor the use of AI systems within their organisations. To ensure data protection and confidentiality, universities and other organisations should, wherever possible, implement locally hosted or cloud-based generative AI tools that they govern themselves.

“The ability to quickly generate vast amounts of content (e.g. text, code, data, images) underlines the disruptive potential – both positive and negative – of generative AI tools such as ChatGPT. Enabling and facilitating positive developments, while mitigating against potential risks, is a key challenge in such a fast-moving field. Through CESAER, we can have a clear impact together including by contributing directly to the drafting of European guidelines. This strengthens our capacity to support our own researchers in navigating this space, as well as our key partners such as research funders and policymakers,” said Marjo Rauhala (Head of Responsible Research Practices at TU Wien and member of Task Force Openness of Science & Technology), who contributed to the drafting team led by the European Commission, in coordination with our Task Force Openness of Science & Technology.

For more information, please contact Secretary General Mattias Björnmalm.

The image accompanying this article was generated by an AI.
