Generative Artificial Intelligence in the Classroom: FAQs

The latest generation of Artificial Intelligence (AI) systems is impacting teaching and learning in many ways, presenting both opportunities and challenges for how our course instructors and students engage in learning. At the University of Toronto, we remain committed to providing students with transformative learning experiences and to supporting instructors as they adapt their pedagogy in response to this emerging technology.

Many generative AI systems have become available, including Microsoft Copilot, ChatGPT, Gemini, and others. These AI tools use predictive technology to create or revise written products of all kinds, including essays, computer code, lesson plans, poems, reports, and letters. They also summarize text, respond to questions, and so on. The products that the tools create are generally of good quality, although they can have inaccuracies. We encourage you to try these systems to test their capabilities and limitations.

Sample Syllabus Statements

Revised April 2024: The University has created sample statements for instructors to include in course syllabi and course assignments to help shape the message to students about what AI technology is, or is not, allowed. These statements may be used for both graduate and undergraduate level courses.

You may also want to include a statement to the effect that students may be asked to explain their work at a meeting with the instructor. While you can call a student in for such a discussion whether or not you include such a statement on your syllabus, mentioning it there may help remind students that they are responsible for the work they submit for credit.

Microsoft Copilot

In December 2023, Microsoft Copilot (formerly Bing AI) became available to all U of T faculty, librarians, and staff. This protected version is now also available to U of T students. Copilot is an enterprise version of an AI-powered chatbot and search engine which better protects the privacy and security of end users (when users are signed into their U of T account). Copilot, like other generative AI tools, may provide information that is not correct (“hallucinations”), and it is up to each individual user to determine if the results are acceptable. For information and instructions on accessing the enterprise edition, please read and adhere to the Microsoft Copilot guidelines for use.


If you are an instructor who is interested in using generative AI with students or to develop course materials, review the FAQ below for considerations.

Frequently Asked Questions

About Generative AI

Updated: April 10, 2024

Large Language Models are trained to predict the next word in a sentence, given the text that has already been written. Early attempts at addressing this task (such as the next-word prediction on a smartphone keyboard) are only coherent within a few words, but as the sentence continues, these earlier systems quickly digress. A major innovation of models such as GPT is their ability to pay attention to words and phrases which were written much earlier in the text, allowing them to maintain context for much longer and in a sense remember the topic of conversation. This capacity is combined with a training phase that involves looking at billions of pages of text. As a result, models like ChatGPT, Gemini, and their underlying foundational models are good at predicting what words are most likely to come next in a sentence, which results in generally coherent text.
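
As a rough illustration only (a toy sketch in Python, not how production models are built), next-word prediction amounts to repeatedly choosing a likely continuation given the words so far. The vocabulary and probabilities below are invented for the example:

    # Toy next-word predictor: given the previous word, pick the most
    # probable continuation. Real LLMs learn probabilities over huge
    # vocabularies from billions of pages and attend to far more context.
    bigram_probs = {
        "the": {"student": 0.5, "course": 0.3, "assignment": 0.2},
        "student": {"submitted": 0.6, "asked": 0.4},
        "submitted": {"the": 0.7, "an": 0.3},
    }

    def generate(start: str, length: int = 6) -> str:
        """Greedily append the most probable next word at each step."""
        words = [start]
        for _ in range(length):
            options = bigram_probs.get(words[-1])
            if not options:
                break
            words.append(max(options, key=options.get))
        return " ".join(words)

    print(generate("the"))  # "the student submitted the student submitted the"

The repetitive output illustrates why models that only look at the last word quickly lose the thread; the attention mechanism in transformer-based models lets them condition on much earlier parts of the text, which is what keeps their output on topic.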

One area where generative AI tools sometimes struggle is in stating facts or quotations accurately. This means that these tools sometimes generate claims that sound real, but to an expert are clearly wrong.

The best way to become familiar with the capabilities and limitations of the tools is to try them. Their capabilities continue to grow, so we recommend continuing to engage with the tools to keep your knowledge of their abilities current.

Updated: April 10, 2024

Instructors are welcome and encouraged to test Microsoft Copilot, ChatGPT, Gemini, Perplexity, and other tools that are currently free to use. You can also test other AI tools to assess their capabilities, for instance to see how they respond to the assignments used in your courses, how they improve the readability and grammar of a paragraph, or how they answer typical questions students may have about course concepts. Experimentation is also useful for assessing the limits of a tool. However, confidential information should never be entered into unprotected AI tools. Content entered into ChatGPT, Gemini, or other public, unprotected tools may become part of the tool’s dataset. Note that information entered into the protected version of Microsoft Copilot is not used for training.

Updated: April 4, 2023

This is a threshold question that instructors may want to consider. Mainstream media has been covering this issue extensively, and alternate viewpoints are widely available.  

Given that generative AI systems are trained on materials that are available online, it is possible that they will repeat biases present online. OpenAI has invested substantial effort into addressing this problem, but it remains a danger with these types of systems. You may also want to familiarize yourself with questions about how the technology was developed and trained (e.g., who were the people who trained it?), how we use the responses it provides, and the long-term impacts of these technologies on the world.

The Provost is consulting with faculty and staff experts on these larger questions involving generative AI systems, and welcomes debate and discussion on these issues.

Updated: April 10, 2024

There remains significant legal uncertainty concerning the use of generative AI tools in regard to copyright. This is an evolving area, and our understanding will develop as new policies, regulations, and case law become settled. Some of the concerns surrounding generative AI and copyright include: 

  • Input: The legality of the content used to train AI models is unknown in some cases. A number of lawsuits originating in the US allege that generative AI tools infringe copyright, and it remains unclear if and how the fair use doctrine applies. In Canada, there is also uncertainty regarding the extent to which existing exceptions in the copyright framework, such as fair dealing, apply to this activity.
  • Output: Authorship and ownership of works created by AI are unclear. Traditionally, Canadian law has held that an author must be a natural person (a human) who exercises skill and judgement in the creation of a work. Because generated content can involve varying degrees of human input, it is unclear in Canada how the appropriate author and owner of such works will be determined. More recently, the US Copyright Office has published the following guide addressing these issues: Copyright Registration Guidance for Works Containing AI-Generated Materials.

If you have further questions about copyright, please view the U of T Libraries webpage, Generative AI tools and Copyright Considerations, for the latest information.

Student Use of Generative AI

Updated: April 10, 2024

Yes. Instructors may wish to use the technology to demonstrate how it can be used productively, or what its limitations are. The U of T Teaching Centres are continuing to develop more information and advice about how you might use generative AI as part of your learning experience design.

You can ask your students to use the protected version of Microsoft Copilot. However, keep in mind that asking or requiring your students to access other tools is complicated by the fact that they have not been vetted by the University for privacy or security. The University generally discourages the use of such systems for instruction until we are assured that the system is protecting any personal data (e.g., the email address used to register on the system). These tools should be considered with the same cautions as other third-party applications that ingest personal data.

If you decide to ask or encourage students to use an AI system in your courses, there are a few issues to consider before you do so:

  • Never input confidential information or student work into an unprotected/unvetted AI tool. All content entered may become part of the tool’s dataset and may inadvertently resurface in response to other prompts.
  • Note that if you ask ChatGPT or other tools whether they wrote something, like a paragraph or other work, they will not give you an accurate answer.
  • There may be some students who are opposed to using AI tools. Instructors should consider offering alternative forms of assessment for those students who might object to using the tools, assuming that AI is not a core part of the course.
  • Instructors should consider indicating on their syllabus that AI tools may be used in the course and, as relevant, identify restrictions to this usage in relation to learning outcomes and assessments.
  • Be aware that not all text that generative AI technology produces is factually correct. You may wish to experiment with ChatGPT and other tools to see what kinds of errors they generate; citations are often fabricated, and inaccurate prompts are sometimes taken as fact.
  • Different tools will create different responses to the same prompt, with varying quality. You may want to try several different systems to see how they respond.
  • There is a risk that Large Language Models may perpetuate biases inherent in the material on which they were trained.
  • OpenAI and other companies may change their terms of use without notice. If you plan on using a system in the classroom, consider having a back-up plan. Because of the University’s relationship with Microsoft, use of Microsoft Copilot may help you avoid unexpected and disruptive changes in terms of use.

Updated: April 10, 2024

The University expects students to complete assignments on their own, without any outside assistance, unless otherwise specified. However, for the purposes of transparency and clarity for students, instructors are strongly encouraged to go further and to specify what tools may be used, if any, in completing assessments in their courses. Written assignment instructions should indicate what types of tools are permitted; vague references to not using ‘the internet’ will generally not suffice today.

If you are permitting, or even encouraging, students to use generative AI tools for developing their assignments, be explicit about this on the syllabus. Consider which tools and which uses are acceptable. Can students use them to critique their work? For editing? For creating an outline? For summarizing sources? For searching the literature (e.g., using Semantic Scholar)? You may also want to ask students to reflect on how they used the tools to improve their writing/learning process.

If adding a prohibition on AI tools to assignment instructions, it is best to state that the ‘use of generative AI tools’ is prohibited, rather than naming one particular tool, such as ChatGPT. There are many generative AI tools available today.

The University has created sample language that instructors may include in their course syllabi to clarify for students if the use of generative AI tools for completing course work is acceptable, or not, and why.

We also encourage instructors to include information on assignment instructions to explicitly indicate whether the use of generative AI is acceptable or not.

If an instructor indicates that use of AI tools is not permitted on an assessment, and a student is later found to have used such a tool on the assessment, the instructor should consider meeting with the student as the first step of a process under the Code of Behaviour on Academic Matters.  

Some students may ask if they can create their assignment outline or draft using generative AI and then simply edit the generated first draft; consider what your response to this question will be before discussing the assignment with your students, and perhaps address it in advance.

You may wish to consider some of the tips for assessment design available on the Centre for Teaching Support & Innovation’s webpage, Generative AI in the Classroom. You might also consider meeting with, or attending a workshop at, your local Teaching Centre to get more information about assignment design. Consider what your learning goals are for the assignment, and how you can best achieve those considering this new technology.

Updated: April 10, 2024

If an instructor specified that no outside assistance was permitted on an assignment, the University would typically consider a student’s use of generative AI to be use of an “unauthorized aid” under the Code of Behaviour on Academic Matters, or as “any other form of cheating”. We are in an interim period where students are receiving conflicting instructions in their various courses as to whether they can use AI or not. We therefore encourage all instructors to be very transparent and clear as to whether use of AI is permitted on any given assessment.

Updated: June 7, 2023

The University does not support the use of AI-detection software programs on student work. None of these programs have been found to be sufficiently reliable, and they are known to incorrectly flag instances of AI use in human-written content. Some AI-detection programs assess whether a piece of writing was generated by AI based simply on its level of sophistication.

Sharing your students’ work with these software programs without their permission also raises a range of privacy and ethical concerns.

However, instructors are encouraged to continue to use their traditional methods for detection of potential academic misconduct, including meeting with a student to discuss their assignment in person.

Updated: April 10, 2024

Yes. If you use multiple-choice quizzes/tests, assume that generative AI systems will be able to answer the questions unless they pertain to the specifics of a classroom discussion, the content of which cannot be found on the internet. Some instructors may wish to test the capability of generative AI systems by using their multiple-choice/short answer assessments as prompts, and reviewing responses from a variety of tools (e.g., ChatGPT, Microsoft Copilot, Gemini, Perplexity, Poe Assistant, etc.).  

Updated: April 10, 2024

Talking to students about generative AI tools and their limitations lets them know that you are well aware of the technology, generates interesting discussion, and helps set guidelines. Let students know clearly, both verbally and in assignment instructions, which tools may or may not be used to complete the assignment. Advise students of the limitations of the technology and its propensity to generate erroneous content.

Please note that detection of student use, especially if these tools are used to their best effect, is not possible. Like use of the internet, generative AI use will become ubiquitous.

Visit the Centre for Teaching Support & Innovation’s webpage, Generative AI in the Classroom, for course and assessment design considerations.

Updated: September 29, 2023

Students and faculty can refer to the U of T Libraries Citation Guide for Artificial Intelligence Generative Tools, which provides guidance on how to cite generative AI use in MLA, APA and Chicago Style.

Updated: July 17, 2023

The School of Graduate Studies (SGS) has posted Guidance on the Appropriate Use of Generative Artificial Intelligence in Graduate Theses which will be of interest to graduate students, supervisors, supervisory committee members, Graduate Chairs and Graduate Units.

Updated: April 10, 2024

No. Large Language Model (LLM) technology is at the heart of a variety of generative AI products that are currently available, including writing assistant programs (e.g., Microsoft Copilot, Gemini, and a huge number of others), image creation programs (e.g., DALL-E 3, Midjourney), and programs that assist people who are writing computer code (e.g., GitHub Copilot). It is also possible for you to build a system that uses this underlying technology (GPT-4 or another model) if you are interested in doing so, as sketched below.
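
For illustration, a minimal sketch of building on an underlying model through the OpenAI Python SDK is shown here. The model name, prompts, and assistant role are placeholders, and the sketch assumes the openai package is installed and an API key is configured in your environment:

    # Minimal sketch: calling an underlying model via the OpenAI Python SDK.
    # Assumes `pip install openai` and an OPENAI_API_KEY environment variable;
    # the model name and prompts are illustrative placeholders only.
    from openai import OpenAI

    client = OpenAI()  # reads the API key from the environment

    response = client.chat.completions.create(
        model="gpt-4",  # or another model available to your account
        messages=[
            {"role": "system", "content": "You are a concise teaching assistant."},
            {"role": "user", "content": "List three limitations of generative AI for coursework."},
        ],
    )

    print(response.choices[0].message.content)

Other providers offer similar APIs; the same cautions about terms of use and never entering confidential or student data into unvetted services apply.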

It is also worth noting that a variety of products (online and mobile apps) have popped up which use GPT-4, Gemini, or other AI models and require paid subscriptions. Some add features such as editing tools and templates. However, others offer nothing beyond the free versions and are meant to fool people into paying for a service that is currently free.

Instructor Use of Generative AI

Updated: April 10, 2024

Currently, Microsoft Copilot is the recommended generative AI tool to use at U of T. When a user signs in using University credentials, Microsoft Copilot conforms to U of T’s privacy and security standards (i.e., it does not share any data with Microsoft or any other company). It is also free to use. Microsoft Copilot uses OpenAI’s GPT-4 model and performs comparably to ChatGPT. For more information about Copilot, refer to CTSI’s Copilot Tool Guide.

Updated: September 29, 2023

The question of copyright ownership remains one of the biggest unknowns when using generative AI tools. The ownership of outputs produced by generative AI is unsettled in law at the current time. If, as an instructor, you would like to use generative AI tools for content generation in your course, consider the following before doing so:

  • Understand that while you can use these tools to create content, you may not own or hold copyright in the works generated.
  • Be mindful of what you input into tools: never input confidential information or intellectual property you do not have the rights or permission to use (e.g., do not submit student work or questions without their permission). All content entered may become part of the tool’s dataset and may inadvertently resurface in response to other prompts (tools like the protected version of Microsoft Copilot are an exception to this).
  • Review the terms of service of each tool, which will establish terms of use and ownership of inputs and outputs (for example, view the Terms of Use for OpenAI). Note that terms of use are subject to change without notice.
  • Be explicit about how you have used these tools in the creation of your work.

View the U of T Libraries webpage, Generative AI tools and Copyright Considerations, for more information.

Updated: January 27, 2023

Please note that the instructor is ultimately responsible for ensuring the grade accurately reflects the quality of the student’s work, regardless of the tool used. The University asks that you not submit student work to any third-party software system for grading, or any other purpose, unless the software is approved by the University. A completed assignment, or any student work, is the student’s intellectual property (IP), and should be treated with care.

The University currently has several licensed software tools available for facilitating grading, such as SpeedGrader and Crowdmark. These systems safeguard the student’s IP while also supporting the grading process. In the future these types of systems may include AI-powered grading assistance.

Updated: April 10, 2024

A Provostial Advisory Group on Generative AI in Teaching and Learning was struck in spring 2023 to identify areas in teaching and learning that require an institutional response or guidance. One such example is providing instructors with sample language to include in their course syllabi to clarify for students if the use of generative AI tools for completing course work is acceptable, or not, and why. A Generative AI in Teaching and Learning Working Group, chaired by the Centre for Teaching Support & Innovation, coordinates and plans for instructor resources needed to support generative AI in the classroom. There are also groups around the university (e.g., the libraries) that are tracking the technology and identifying opportunities and issues that we will need to confront.

Decisions regarding the use of generative AI tools in courses will remain with instructors based on the type of course and assessments within them. Regardless of your stance on this technology, it is important that you discuss it with your students, so they understand the course expectations.

Have feedback or want more information?

If you have any suggestions for teaching and learning resources that would be helpful to you as a course instructor, or if you have any other questions about generative AI at U of T that are not addressed through this FAQ, contact us now: