Generative Artificial Intelligence in the Classroom

The latest generation of Artificial Intelligence (AI) systems will affect teaching and learning in many ways, presenting both opportunities and challenges for how our course instructors and students engage in learning. At the University of Toronto, we remain committed to providing students with transformative learning experiences and to supporting instructors as they adapt their pedagogy in response to this emerging technology.

While many generative AI systems have recently become available, ChatGPT is currently the most prominent, garnering worldwide media attention. This AI tool uses predictive technology to create or revise written products of all kinds, including essays, computer code, lesson plans, poems, reports, and letters. The products the tool creates are generally of good quality, although they can contain inaccuracies. We encourage you to try the system to test its capabilities and limitations.

In this FAQ, ChatGPT refers to the free, online AI chat system that utilizes the OpenAI GPT technology. Please note that this is only one of a variety of generative AI tools currently available.

Sample Syllabus Statements

April 2023: The University has created sample statements for instructors to include in course syllabi and course assignments to help communicate to students what use of AI technology is, or is not, allowed. These statements may be used for both graduate and undergraduate level courses.

SGS Guidelines

July 2023: The School of Graduate Studies (SGS) announced new Guidance on the Appropriate Use of Generative Artificial Intelligence in Graduate Theses which will be of interest to graduate students, supervisors, supervisory committee members, Graduate Chairs and Graduate Units.

Copyright Considerations

September 2023: There remains significant legal uncertainty concerning the use of generative AI tools in regard to copyright. This is an evolving area, and our understanding will develop as new policies, regulations, and case law become settled. Some of the concerns surrounding generative AI and copyright include: 

  • Input: The legality of the content used to train AI models is unknown in some cases. A number of lawsuits originating in the US allege that generative AI tools infringe copyright, and it remains unclear if and how the fair use doctrine applies. In Canada, there also remains uncertainty regarding the extent to which existing exceptions in the copyright framework, such as fair dealing, apply to this activity.
  • Output: Authorship and ownership of works created by AI are unclear. Traditionally, Canadian law has indicated that an author must be a natural person (human) who exercises skill and judgement in the creation of a work. Since generated content is likely to involve varying degrees of human input, it is unclear in Canada how the appropriate author and owner of such works will be determined. More recently, the US Copyright Office has published the following guide addressing these issues: Copyright Registration Guidance for Works Containing AI-Generated Materials.

If you have further questions about copyright, please view the U of T Libraries webpage, Generative AI tools and Copyright Considerations for the latest information.

If you are an instructor who is interested in using generative AI to develop course materials, review the FAQ below for considerations.

Frequently Asked Questions

Updated: May 26, 2023

Instructors are welcome and encouraged to test ChatGPT, use of which is currently free upon registration. You can also test other similar AI tools to assess their capability, for instance to see if they can respond to the assignments used in your courses, or the way in which they improve the readability and grammar of a paragraph. Experimentation is also useful to assess the limits of the tool. However, confidential information should never be entered into an AI tool such as ChatGPT. All content entered may become part of the tool’s dataset and may inadvertently resurface in response to other prompts.

Please note that due to high demand, access to ChatGPT is, at times, unavailable.

Updated: April 4, 2023

Large Language Models, like ChatGPT, are trained to predict the next word in a sentence, given the text that has already been written. Early attempts at addressing this task (such as the next-word prediction on a smartphone keyboard) are only coherent within a few words, but as the sentence continues, these earlier systems quickly digress. A major innovation of models such as GPT is their ability to pay attention to words and phrases which were written much earlier in the text, allowing them to maintain context for much longer and in a sense remember the topic of conversation. This capacity is combined with a training phase that involves looking at billions of pages of text. As a result, models like ChatGPT, and its underlying technology GPT-3 (and now, GPT-4), are good at predicting what words are most likely to come next in a sentence, which results in generally coherent text.
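To make the next-word-prediction idea above concrete, the following is a minimal, hypothetical sketch in Python of a toy "bigram" predictor that chooses each next word based only on the single preceding word. It illustrates the general principle, not how GPT is actually implemented; GPT's key advance, as noted above, is attending to much longer context.

```python
# Toy next-word predictor for illustration only: it conditions on just the
# previous word (a bigram model), unlike GPT, which attends to long context.
from collections import Counter, defaultdict

corpus = (
    "the students read the paper and the students wrote a summary "
    "the instructor read the summary and the instructor gave feedback"
).split()

# Count how often each word follows each other word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the word most frequently observed after `word`."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

# Generate greedily from a seed word. With only one word of context, the
# output quickly loops and loses the thread, which is the failure mode
# described above for short-context predictors.
word, generated = "the", ["the"]
for _ in range(8):
    word = predict_next(word)
    if word is None:
        break
    generated.append(word)

print(" ".join(generated))
```

Running this produces repetitive output along the lines of "the students read the students read …", whereas a long-context model can keep text on topic for paragraphs at a time.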

One area where generative AI tools sometimes struggle is in stating facts or quotations accurately. This means that models like GPT-4 sometimes generate claims that sound real, but to an expert are clearly wrong.

A related area where ChatGPT seems to struggle is in the discussion of any event or concept that has received relatively little attention in online discourse. To assess these limitations, you could try asking the system to generate your biography. Unless numerous accurate biographies of you are available online, ChatGPT is unlikely to generate a comprehensively correct one.

Updated: April 4, 2023

This is a threshold question that instructors may want to consider. Mainstream media have been covering this issue extensively, and alternative viewpoints are widely available.

Given that generative AI systems are trained on materials that are available online, it is possible that they will repeat biases present online. OpenAI has invested substantial effort into addressing this problem, but it remains a danger with these types of systems. You may also want to familiarize yourself with questions about the way the technology was developed and trained (e.g., who were the people who trained it?), the way we use the responses it provides, and the long-term impacts of these technologies on the world.

The Provost is consulting with faculty and staff experts on these larger questions involving ChatGPT and other generative AI systems, and welcomes debate and discussion on these issues.

Updated: September 29, 2023

The question of copyright ownership remains one of the biggest unknowns when using generative AI tools. The ownership of outputs produced by generative AI is unsettled in law at the current time. If, as an instructor, you would like to use generative AI tools for content generation in your course, consider the following before doing so:

  • Understand that while you can use these tools to create content, you may not own or hold copyright in the works generated.
  • Be mindful of what you input into tools: never input confidential information or intellectual property you do not have the rights or permission to use (e.g., do not submit student work or questions without their permission). All content entered may become part of the tool’s dataset and may inadvertently resurface in response to other prompts.
  • Review the terms of service of each tool, which will establish terms of use and ownership of inputs and outputs (for example, view the Terms of Use for OpenAI). Note that terms of use are subject to change without notice.
  • Be explicit about how you have used these tools in the creation of your work.

View the U of T Libraries page, Generative AI tools and Copyright Considerations, for more information.

Updated: May 26, 2023

Yes. Some instructors may wish to use the technology to demonstrate how it can be used productively, or what its limitations are. The U of T Teaching Centres have developed more information and advice about how you might use generative AI as part of your learning experience design.

Keep in mind that asking or requiring your students to access these tools is complicated by the fact that they have not been vetted by the University for privacy or security. The University generally discourages the use of such systems for instruction until we are assured that the system is protecting any personal data (e.g., the email address used to register on the system). If a version of GPT becomes part of the Office365 suite, it may become available to all students and faculty as part of that licensed software. However, until the University formally approves the use of specific generative AI software, these tools should be considered with the same cautions as other third-party applications that ingest personal data.

If you decide to ask or encourage students to use this or other AI systems in your courses, there are a few issues to consider before you do so:

  • Never input confidential information into an AI tool such as ChatGPT. All content entered may become part of the tool’s dataset and may inadvertently resurface in response to other prompts.
  • There may be some students who are opposed to using AI tools. Instructors should consider offering alternative forms of assessment for those students who might object to using the tools, assuming that AI is not a core part of the course.
  • Instructors should consider indicating on their syllabus that AI tools may be used in the course and, as relevant, identify restrictions to this usage in relation to learning outcomes and assessments.
  • Be aware that not everything that generative AI technology produces is correct. You may wish to experiment with ChatGPT to see what kinds of errors it generates; citations are often fabricated, and false premises in prompts are sometimes accepted as fact.
  • There is a risk that ChatGPT may perpetuate biases inherent in the material on which it was trained.
  • ChatGPT has at times been overwhelmed by user demand and unavailable for hours at a time. OpenAI may also change its terms of use without notice. If you plan on using it ‘live’ in the classroom, consider having a back-up plan.

Updated: April 24, 2023

The University expects students to complete assignments on their own, without any outside assistance, unless otherwise specified. Instructors are strongly encouraged to speak to their students about what tools, if any, are permitted in completing assessments. Written assignment instructions should indicate what types of tools are permitted; vague references to ‘the internet’ will generally not suffice today. If adding a prohibition on AI tools to assignment instructions, it is best to suggest that the ‘use of generative AI tools’ is prohibited, as opposed to the use of one particular tool, such as ChatGPT. There are many generative AI tools available today.

The University has created sample language that instructors may include in their course syllabi to clarify for students if the use of generative AI tools for completing course work is acceptable, or not, and why. The document is available for download at the top of this webpage. We also encourage instructors to include information on assignment instructions to explicitly indicate whether the use of generative AI is acceptable or not.

If an instructor indicates that use of AI tools is not permitted on an assessment, and a student is later found to have used such a tool on the assessment, the instructor should consider meeting with the student as the first step of a process under the Code of Behaviour on Academic Matters.  

Some students may ask if they can create their assignment outline or draft using ChatGPT, and then simply edit the generated first draft. Consider what your response to this question might be before discussing the assignment with your students, and perhaps address it in advance.

You may wish to consider some of the tips for assessment design below. You might also consider meeting with, or attending a workshop at, your local Teaching Centre to get more information about assignment design. Consider what your learning goals are for the assignment, and how you can best achieve them in light of this new technology.

Updated: March 2, 2023

If an instructor specified that no outside assistance was permitted on an assignment, the University would typically consider use of ChatGPT and other such tools to be use of an “unauthorized aid” under the Code of Behaviour on Academic Matters, or as “any other form of cheating”. Such a categorization is in keeping with how the University has classified use of other generative and unauthorized technology tools, such as Chegg, in the past.

Updated: June 7, 2023

The University discourages the use of AI-detection software programs on student work. Such software programs have been found to be unreliable and to incorrectly flag instances of AI use in human-written content. Some AI-detection software programs assess whether a piece of writing was generated by AI based simply on its level of sophistication. Using such detection programs could negatively impact students if they were improperly accused of using an AI tool.

Sharing your students’ work with these software programs without their permission also raises a range of privacy and ethical concerns.

However, instructors are encouraged to continue to use their traditional methods for detection of potential academic misconduct, including meeting with a student to discuss their assignment in person.

Updated: January 27, 2023

Yes. If you use multiple-choice quizzes/tests, assume that generative AI systems will be able to answer the questions unless they pertain to highly specific subjects; new knowledge; or the specifics of classroom discussions, the content of which cannot be found on the internet. Some instructors may wish to test this by using their multiple-choice/short answer assessments as prompts, and reviewing ChatGPT’s responses.  

Updated: March 2, 2023

Talking to students about ChatGPT and its limitations will let students know that you are well aware of the technology, and will likely generate interesting discussion and help to set guidelines for students. Let students know clearly, both verbally and in assignment instructions, what tools may or may not be used to complete the assignment. Advise students of the limitations of the technology, and its propensity to generate erroneous content.

If you choose not to allow use of AI tools on your assignments, here are some tips for designing assignments that generative AI systems will have difficulty responding to:

  • ask students to respond to a specific reading, particularly one published within the last year that may not be on the internet or may not have generated much commentary online. Generative systems struggle to create accurate responses to prompts for which there is little or no information on the internet.
  • ask students to create a video or recording that explains or expands on their work.
  • use a flipped classroom approach, and/or assign group work to be completed in class, with each member contributing.
  • ask students to create a first draft of an assignment, or an entire assignment, by hand in class. (Consider the accessibility needs of students who may require accommodations.)
  • call on students in class to explain or justify elements of their work.
  • ask students to use ChatGPT to generate material, and then ask them to critique GPT’s response.
  • talk to your colleagues about ideas for your discipline. Different disciplines, such as computer science, history, language studies and visual studies may be developing new norms of pedagogy.

Updated: January 27, 2023

The University asks that you not submit student work to any third-party software system for grading, or any other purpose, unless the software is approved by the University. A completed assignment, or any student work, is the student’s intellectual property (IP), and should be treated with care.

The University currently has several licensed software tools available for facilitating grading, such as SpeedGrader and Crowdmark. These systems safeguard the student’s IP while also supporting the grading process. In the future these types of systems may include AI-powered grading assistance. Please note that the instructor is ultimately responsible for ensuring the grade accurately reflects the quality of the student’s work, regardless of the tool used.

Updated: September 29, 2023

They can refer to the U of T Libraries Citation Guide for Artificial Intelligence Generative Tools, which provides guidance on how to cite generative AI use in MLA, APA and Chicago Style.

Updated: April 4, 2023

No. Large Language Model (LLM) technology is at the heart of a variety of generative AI products that are currently available, including writing assistant programs (e.g., Bing, Jasper, Notion AI and Caktus ai), image creation programs (e.g., DALL-E 2, Midjourney, Stable Diffusion) and programs to assist people who are creating computer code (e.g., GitHub Copilot). It is also possible for you to build a system which utilizes this underlying technology (GPT-4 or another model) if you are interested in doing so. 
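If you are curious about what building on the underlying technology can look like in practice, the following is a minimal, hypothetical sketch using the OpenAI Python SDK. It assumes the openai package (version 1.x) is installed and an API key is available in the OPENAI_API_KEY environment variable; the model name and prompts are illustrative placeholders, not recommendations. The same cautions noted elsewhere in this FAQ apply: never send confidential information or student work to such a service.

```python
# Minimal sketch, not a vetted or endorsed workflow: send a prompt to a hosted
# model through the OpenAI Python SDK (v1.x). Assumes OPENAI_API_KEY is set in
# the environment; the model name and messages are illustrative placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4",  # or another model available to your account
    messages=[
        {"role": "system", "content": "You are a concise writing assistant."},
        {"role": "user", "content": "Suggest three discussion questions about academic integrity."},
    ],
)

print(response.choices[0].message.content)
```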

It is also worth noting that a variety of products (online and mobile apps) have appeared that use GPT-4 or other AI models and require paid subscriptions. Some add additional features such as editing tools and templates. However, others do nothing more than the free version does and are meant to fool people into paying for a service that is currently free.

Updated: September 12, 2023

A Provostial Advisory Group on Generative AI in Teaching and Learning was struck in spring 2023 to identify areas in teaching and learning that require an institutional response or guidance. One such example is providing instructors with sample language to include in their course syllabi to clarify for students if the use of generative AI tools for completing course work is acceptable, or not, and why. The document is available for download at the top of this webpage. A Generative AI in Teaching and Learning Working Group, chaired by the Centre for Teaching Support & Innovation, has recently been established to coordinate and plan for instructor resources needed to support generative AI in the classroom.

Decisions regarding the use of generative AI tools in courses will remain with instructors based on the type of course and assessments within them. Regardless of your stance on this technology, it is important that you discuss it with your students, so they understand the course expectations.

Have feedback or want more information?

If you have any suggestions for teaching and learning resources that would be helpful to you as a course instructor, or if you have any other questions about generative AI at U of T that are not addressed through this FAQ, contact us now: