AI tools

With the following information, BTU Cottbus-Senftenberg provides academic staff with guidance on the use of generative AI tools in academic work. BTU Cottbus-Senftenberg does not reject the use of AI tools in the university context on principle, but supports their sensible, critical, reflective and responsible use, without neglecting problematic implications such as issues of copyright law, data protection law and research ethics. It follows the DFG's guidelines for dealing with generative models for text and image creation as well as the general principles of good scientific practice, in particular transparency as an element of scientific honesty.

The use of generative AI tools should be based on professional and methodological reflection, and risks as well as ethical and legal aspects should be considered appropriately. Because the topic is developing dynamically, this information will be updated regularly; for the time being, BTU refrains from issuing a university-specific guideline with rigid specifications and regulations. BTU is also currently reviewing how generative AI tools can be provided to BTU teaching staff and employees in a legally compliant way.

Legal aspects of the use of AI tools

In dealing with the personal data of third parties in connection with AI-based tools, compliance with data protection law must be ensured, in particular the General Data Protection Regulation (GDPR), the Federal Data Protection Act (BDSG) and the Brandenburg Data Protection Act (BbgDSG). This applies both to information uploaded to AI-based tools and to the content these tools generate. Personal data may only be entered into generative AI tools if the software operators neither make this data accessible to third parties nor use it as training data. Beyond data protection, entering confidential information, sensitive research data or internal documents is likewise not permitted.

It is recommended to prefer AI tools that minimise data collection and, where possible, to adjust the data protection settings so that, for example, chats are not saved and chat histories are not used as training data.

Researchers who, after reviewing the content, use the results of AI-based tools in their own work are themselves responsible for any incorrect or distorted AI-generated content, incorrect references, copyright infringements or plagiarism.

Authorship of AI-generated output

AI-supported programmes for text production cannot be considered authors or creators of the text they generate within the meaning of the German Copyright and Related Rights Act (UrhG); users of such programmes can. The decisive factor here is a significant degree of intellectual contribution.1

Scientific work with AI-based tools

AI tools can be used in a variety of ways in scientific work. For example, they can help to optimise research workflows, solve complex problems more efficiently, gain new insights, develop innovative solutions and improve research results. Alongside commercial tools, there is a growing number of open-source offerings, including some that can be run locally. The choice of an AI tool also depends on the intended use; sites such as www.futurepedia.io, www.hcilab.org/ai-tools-directory/, www.advanced-innovation.io/ki-tools and theresanaiforthat.com provide an overview of AI tools. However, errors, superficiality, bias and so-called hallucinations can never be ruled out, so the results should always be scrutinised critically. The field is also developing very quickly, meaning that new AI tools and possibilities are constantly being introduced. The tools named here are therefore listed without endorsement; before using any of them, it is advisable to check the costs, terms of use and data protection.

Possibilities for using AI tools in scientific work:

  • Literature research and analysis: help in identifying, collecting and analysing relevant scientific literature to present the state of research on a particular topic and formulate new research questions
      • Examples of AI tools: Elicit, Research Rabbit, Perplexity, Semantic Scholar, SciSpace, Consensus, Iris.ai, Keenious, ChatPDF
  • Experimental design and execution: AI-supported simulations and modelling tools can help to plan, conduct and analyse experiments to test hypotheses and gain new insights
      • Examples: Google DeepMind's "AlphaFold" algorithm; deep learning models to predict the activity and toxicity of new compounds in drug discovery
  • Data analysis and interpretation: analysing large amounts of data, automating repetitive tasks, identifying patterns and trends and deriving insights, helping to test hypotheses and uncover correlations
      • Examples of AI tools: Formula Bot, IBM Watson Analytics, DataRobot, RapidMiner, Julius, TensorFlow, H2O.ai, KNIME
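To make the "identifying patterns" item above concrete, the following is a minimal, illustrative sketch of k-means clustering, a classic pattern-finding technique that data-analysis platforms such as KNIME or RapidMiner offer as one building block among many. It is a toy implementation (deterministic initialisation from the first k points, fixed iteration count), not production code and not specific to any of the tools listed:

```python
from statistics import fmean

def kmeans(points, k, iters=20):
    """Toy k-means: repeatedly assign each point to its nearest centroid,
    then move each centroid to the mean of its assigned points."""
    centroids = list(points[:k])  # deterministic init: first k points
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # index of the centroid with the smallest squared distance to p
            nearest = min(
                range(k),
                key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])),
            )
            clusters[nearest].append(p)
        # recompute centroids; keep the old one if a cluster went empty
        centroids = [
            tuple(fmean(dim) for dim in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

# Two well-separated groups of 2-D measurements
data = [(1.0, 1.1), (0.9, 0.8), (1.2, 1.0),
        (8.0, 8.2), (7.9, 8.1), (8.3, 7.8)]
centroids, clusters = kmeans(data, k=2)
```

Real tools apply far more sophisticated variants of such methods at scale; the sketch only shows, mechanically, what "identifying patterns in data" can mean.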