Using generative artificial intelligence (AI) for learning

City is exploring the opportunities and challenges presented by generative AI and, whilst we will not prohibit its use, we expect students to use these tools in an ethical and responsible way. Our guidance for students is outlined below.

Note: this guidance was last reviewed 1 November 2023 and may be updated or amended as discussions about the use of AI progress.

What is generative AI?

"Generative AI is a broad label that's used to describe any type of artificial intelligence (AI) that can be used to create new text, images, video, audio, code or synthetic data." (Techopedia, 2023)

Generative AI tools work by using sophisticated machine learning algorithms that learn from vast amounts of data to produce new content that is similar to what they've learned.

Essentially, these algorithms analyse existing content, such as text or images, and then use that information to generate new content on their own.

However, there are limitations; for example, generative AI may not always be accurate or reliable, and there is a risk of bias in the data it is trained on.

One of the tools that has risen to prominence recently is ChatGPT by OpenAI, which uses a chatbot interface for users to engage with the system; however, other tools are available, including those embedded in search engines and computer coding editors.

What is City’s position on the use of generative AI by students?

Like other universities, City is exploring the opportunities and challenges presented by generative AI. Whilst we will not prohibit the use of generative AI, it is important that work generated by AI is not falsely passed off as a student’s own work (see next section).

Where generative AI is used to enhance learning, we expect students to use these tools in an ethical and responsible way. For example, used responsibly, generative AI tools could support effective study strategies. Learning to use AI and understanding its strengths and weaknesses may also become a useful employability skill, and you may find that some modules embed the use of AI into assessments and other activities.

At City, a working group is reviewing how and when students can make the best use of generative AI tools and advising staff on how to incorporate AI, ethically and educationally, into assessments.

Generative AI tools have important limitations, as they have no understanding of the content they produce. The content is generated by predicting the next plausible word or sentence, so the result might not be factually correct and might contain inaccuracies, biases, offensive content or fake references and citations. AI-generated computer code may contain security issues, bugs or illegal use of software libraries.

Care must be taken to understand these limitations. It is important to check the factual accuracy of the content produced and to use AI-generated content in conjunction with the learning resources provided by your module tutor and other trustworthy sources.

Use of generative AI and academic integrity, plagiarism and misconduct

If your module leader has given you instructions on the use of AI for a particular assessment, you should follow these. In the absence of any such specific instructions, the key point is that when you hand in work for an assessment, you are stating that it is your own original work.

The University Regulations on Academic Misconduct have not changed. The University requires that all work submitted for assessment is your own original work, or in the case of group work, the original work of group members.

The purpose of any assessment is to measure what you have learnt and understood from the course being assessed. It is not possible to assess your understanding if you plagiarise, copy and paste text, or use AI, paraphrasing software or essay mills. The University takes cases of academic misconduct very seriously and seeks at all times to rigorously protect its academic standards.

Under the Academic Integrity and Misconduct Policy, unauthorised use of generative AI tools could be considered:

  • Plagiarism – where text used in the assessment is the direct output from an AI tool. Plagiarised text which has been amended using further AI tools (e.g. paraphrasing) remains plagiarised. The sources at the end of this page give guidance on appropriate citation of external material.
  • Falsification or fabrication – where the outputs from using generative AI tools have produced sources or data that do not exist.
  • Contract-cheating – where generative AI tools have been used to produce the whole assessment.

Any students suspected of committing academic misconduct will be actively investigated and, if the misconduct is confirmed, the full sanctions outlined in our Academic Integrity and Misconduct Policy will apply.

Acknowledging the use of generative AI in your work

If your module tutor has specifically permitted you to use generative AI for your assessment, you should follow their guidance on ensuring that this use is acknowledged and cited correctly. For example, you may be asked to acknowledge how you have used generative AI and what you have used it for, and to include a copy of the output generated, including relevant prompts.

In terms of citations, if the AI-generated output is available online, you can cite it as you normally would. If the output is not available online, you are advised to cite it as ‘personal communication’. Please see this guidance on citing generative AI from Cite Them Right Online (City login required).

It is your responsibility to ensure the reader of your work can clearly distinguish between original content created by you and content generated or inspired by generative AI and other sources.

What is City doing about generative AI?

City has established a group to consider the opportunities and challenges presented by generative AI.

The group is co-chaired by Dr Julie Voce (LEaD) and Dr Simon Hayley (Bayes Business School).

The group will consider the following:

  • Promoting a better understanding of generative AI amongst staff and students in order to encourage responsible, effective and ethical use of these tools.
  • Keeping City’s academic integrity policy and other regulations and guidance under review and up-to-date.
  • Determining the implications for learning, teaching and assessment and improvements to assessment design.
  • Discussing when and how generative AI can be used by students and staff as a support tool.

If you would like to contribute to student discussions about the use of generative AI, please email Julie Voce.

Support and guidance

Support is available to assist you with academic skills and writing: