Keep your data safe when using AI tools: when you use Generative AI (GenAI) tools, you’re not just generating text or images — you’re also sharing information. This guide shows you what’s safe to share, what to avoid, and which tools offer the best protection. It will help you protect your personal details, assessment output, and ideas while staying within university rules.
Important note:
Always follow your assessment brief. It will specify whether you can use Generative AI (GenAI) in your work, and whether this must be acknowledged. Requirements may vary by module or programme. If there is any conflict between this guidance and your assessment brief, the brief takes priority. For details of the University’s official position, see Using AI: Rules and Responsibilities.
Your assessment output and personal data are valuable – and once shared with AI tools, you may lose control over them.
Risks to your data:
- Your text can be stored permanently and used to train the AI.
- Parts of your work might appear in responses to other users' questions.
- Personal details can be exposed or misused by the company or hackers.
- You may lose ownership of your ideas through the tool's terms and conditions.
Examples in practice:
- A draft contract or research data could later appear in someone else’s AI response.
- Personal details you shared might be exposed to others.
Sharing your work with AI tools doesn’t just affect your privacy — it can also create academic risks and, in some cases, unfair disadvantages for certain groups of students.
- Submitting work that contains AI-generated content without acknowledgement/referencing or use of an SP&D digital sticker is misconduct. For full details on acceptable and unacceptable use, see our page on AI: Rules and Responsibilities.
- If you share substantial sections of your work with public tools, parts of it could later appear in AI outputs to other users. This creates a risk that your own work might resemble AI-generated text, which could lead to confusion or plagiarism concerns even if you did not use AI directly.
- Sharing substantial work undermines your ability to prove it's original.
Protecting your data helps you avoid plagiarism accusations, protect your intellectual property, maintain academic integrity, and follow university rules.
Before pasting text into an AI tool, think carefully about what you’re sharing. Once entered, your work or personal details may no longer be under your control. Avoid sharing:
- Unpublished or substantial work such as assessments or dissertations.
- Unpublished research, results, or datasets (e.g. survey data, lab results, interview transcripts).
- Confidential or personal details about yourself or others (e.g. names, IDs, contact details).
- Creative work such as designs, code, or artwork (e.g. game prototypes, source code, draft illustrations).
- Anything that could breach copyright or professional standards.
Instead, if your assessment brief allows it, share only short extracts or summaries when you want help with phrasing, feedback, or brainstorming. If in doubt, keep extracts short and use university-supported tools.
Not all AI tools offer the same level of protection. From a data privacy perspective, start with the options that Abertay supports directly, and only use public tools with caution.
Microsoft Copilot Chat – provided by Abertay with enterprise security. This means:
- Your conversations are not used to train Microsoft's AI models.
- Data is processed within Abertay's secure systems.
- Enterprise Data Protection (EDP): When you use Microsoft Copilot Chat with your Abertay login, your conversations stay within the university’s secure Microsoft 365 system. They are protected in the same way as your Outlook emails or Teams chats, and they are not shared publicly or used to train Microsoft’s AI models.
- You have stronger legal protections under the university's data agreements.
- Privacy settings are managed centrally to protect all users.
- The university can provide support with access, security, or data protection issues.
Even with these protections, you must not upload confidential or sensitive personal information (such as health details, financial data, or information about other people). These protections make Copilot safer than public tools, but they don’t override your responsibility to protect sensitive data.
Adobe Express – covered by the University’s Adobe licence and powered by Adobe’s Firefly AI engine:
- Images and prompts are protected under Adobe’s enterprise agreement when you log in with your Abertay account.
You should still avoid uploading personal or sensitive information into prompts.
Academic search tools – GenAI is increasingly built into academic platforms and databases, such as Statista or Semantic Scholar. These are usually safer to use for research and reading because they don’t involve uploading your own work. For more detail, see our pages on AI for Academic Reading and AI for Researching a Topic.
Public AI tools (e.g. ChatGPT, Gemini) – can be helpful for exploring ideas, but they carry higher privacy risks. Always follow the guidance in What not to share before using them.
Even when using public AI tools for appropriate tasks, you can reduce privacy risks by adjusting your settings.
In ChatGPT
- Open Settings → Data controls.
- Turn off “Improve the model for everyone.”
- This stops your conversations being used to train the AI.
In other tools
- Look for options related to “data sharing,” “model training,” “history,” or “privacy.”
- Disable any setting that allows the company to use your content for training or analytics.
- Review settings regularly, as policies and interfaces can change.
Even with privacy settings enabled, treat public AI tools as if anything you enter could become public. Avoid sharing personal, sensitive, or unpublished work.
Your ideas and your assessments are your intellectual property (IP). Public AI tools may claim rights over what you share with them.
- Read the tool’s Terms & Conditions carefully.
- Check if the tool claims ownership of your input or output.
- Prioritise use of Abertay-supported tools with stronger protections.
See also: AI Tools: Terms & Conditions and Abertay's Intellectual Property Policy.
Data privacy isn't just a university rule – it's also covered by UK data protection law. The UK GDPR and the Data Protection Act 2018 give you rights over your own personal data, but also responsibilities when handling other people’s information.
- Don't share anyone else's personal information (names, addresses, contact details).
- Be especially careful with sensitive data (health information, financial details, research data).
- Check if you need consent before sharing information about others.
This applies whatever stage of study you’re at — undergraduate, postgraduate, or research — and whatever subject you're studying. When in doubt, keep personal information out of AI tools entirely.
Before you paste anything into an AI tool, stop and ask yourself:
- Am I about to share anyone’s name, contact details, or other personal information?
- Am I about to share information that could identify me, others, or my location?
- Am I sharing more than a short extract from my own work (e.g. over ~200 words)?
- Am I about to paste in a journal article, book chapter, or other licensed academic content?
- Am I about to use a tool without checking its privacy settings in the last month?
- Am I about to share research data without clear permission or ethics approval?
- Am I about to use a tool that is not Abertay-supported when an Abertay option is available?
If you answer “yes” to any of these, don’t share the content. Personal details, research data, and licensed academic materials should never go into AI tools — including Copilot. In a few cases (like long extracts or unchecked settings), you may be able to adjust and try again, but if in doubt, leave it out.
Need help?
If you have questions or would like more support, email studyskills@abertay.ac.uk
Last modified by Student and Academic Services