Generative Artificial Intelligence in the Classroom: FAQs

Updated June 18, 2025

The latest generation of Artificial Intelligence (AI) systems is impacting teaching and learning in many ways, presenting both opportunities and challenges for our course instructors and students. At the University of Toronto, we remain committed to providing students with transformative learning experiences and to supporting instructors as they adapt their pedagogy in response to this emerging technology.

Many generative AI (GenAI) language and coding systems have become available, including ChatGPT, Gemini (from Google), Claude (from Anthropic), Microsoft Copilot, Cursor and others.  

These tools are built on Large Language Models (LLMs), which use predictive technology to create or revise written products of all kinds, including essays, computer code, lesson plans, poems, reports, and letters. They can also summarize text, respond to questions, and so on. The products that the tools create are generally of good quality, although they can contain inaccuracies. We encourage you to try these systems to test their capabilities and limitations. Faculty, librarians, staff and students at U of T have access to enterprise editions of Microsoft Copilot and ChatGPT Edu, which conform to the University's usual privacy and data protections.

An institutional website on artificial intelligence was launched in early 2024. This site provides a space for U of T community members and the public to find academic and research opportunities at the University, information on technologies currently in use, institutional guidelines and policies, and updates on new artificial intelligence activities across the University. Visit https://ai.utoronto.ca/.  

AI Taskforce Report

U of T's institutional AI Task Force report was released in June 2025. The product of the Task Force includes the main report and the reports of its six Working Groups, each of which contributed to the work of the Task Force. The report identifies recommendations and strategies for the University. Information about the implementation of the recommendations will be posted on the same website as it unfolds.

Open Educational Resources

There is a proliferation of teaching and learning resources available for instructors related to Generative AI. To find out more, visit the following websites: 

DLI: Open Educational Resources for AI literacy. These modules can be customized or used as-is. They are designed to be used in a Quercus course website, or downloaded for use in PowerPoint or Word documents. 

CTSI: Teaching with Generative AI at U of T including the AI Virtual Tutor Initiative to support instructors in designing and implementing course-specific virtual tutors powered by Generative AI.

UTSC: Using GenAI: Faculty 

UTM: Building AI Literacy  

A&S: Assessment Design and Generative AI; Generative AI Policies and Resources 

School of Graduate Studies: Guidance on the Appropriate Use of Generative Artificial Intelligence in Graduate Theses 

Libraries: GenAI tools and Copyright Considerations; GenAI Digital Literacies Framework  

Centre for Research & Innovation Support (CRIS): GenAI Considerations in Academic Research 

Information Security: Use artificial intelligence intelligently 

Sample Syllabus Statements

Revised June 2025: Given the state of AI technology, and specifically Large Language Models, traditional assignments may no longer be effective for assessing student learning. If you have been teaching during the last two years, this may already be obvious. Increasingly, students in many fields are also expected to have some understanding of how to work with AI as they move into the next stage of their education and career. For these reasons, we recommend rethinking assignments if they are no longer working for the classes you teach. The Teaching and Learning Centres can help you with this process, as can colleagues who are experimenting with new ways to approach teaching and assessment.

The University also recommends that you include a statement on your syllabus that informs students about your expectations with respect to the use of AI. We have created sample statements for instructors to include in course syllabi and course assignments to help shape the message to students about what AI technology is, or is not, allowed. These statements may be used for both graduate and undergraduate level courses. Note: some units and divisions may have also developed guidelines for their instructors.

If you have included syllabus statements in previous offerings of the course, be aware that GenAI technology is advancing very rapidly, and the current state-of-the-art may be more sophisticated compared to when the course was last offered. Check that your syllabus statements still seem appropriate in light of current capabilities, as well as emerging norms or practices within your division or unit, and the needs of your students. 

You may also want to include a statement to the effect that students may be asked to explain their work at a meeting with the instructor. While you can call a student in for such a discussion whether or not you include such a statement on your syllabus, reiterating it there may help remind students that they are responsible for the work they submit for credit.

Microsoft Copilot, ChatGPT Edu and other AI tools

U of T faculty, librarians, and staff have access to protected versions of Microsoft Copilot, ChatGPT Edu, and Scopus AI, with more coming in the future. The only systems of this type currently available to students are Scopus AI and Microsoft Copilot. Microsoft Copilot is currently free to use, while ChatGPT Edu requires a subscription, which can be purchased through the Library Licensed Software Office. A list of tools and their current status can be found on the CTSI website.

These tools better protect users' privacy and security when users are signed into their U of T accounts. The prompts and other materials that a user provides are not used to train the underlying Large Language Model (LLM). Like other generative AI tools, these systems may provide information that is not correct (“hallucinations”), and it is up to each individual user to determine whether the results are acceptable. For information and instructions on accessing the enterprise editions, please read and adhere to the Microsoft Copilot guidelines for use and/or the ChatGPT Edu information.

If you are an instructor who is interested in using generative AI with students or to develop course materials, review the FAQ below for considerations. 

Frequently Asked Questions

About Generative AI

Updated: June 18, 2025 

Large Language Models are trained to predict the next word in a sentence, given a prompt, any uploaded material or material from a web search (if enabled), and the text that has already been written. Early attempts at this task (such as the next-word prediction on a smartphone keyboard) were only coherent within a few words; as the sentence continued, these earlier systems quickly digressed. A major innovation of models such as GPT is their ability to pay attention to words and phrases that were written much earlier in the text, allowing them to maintain context for much longer and, in a sense, remember the topic of conversation. This capacity is combined with a training phase that involves looking at billions of pages of text. As a result, tools like ChatGPT, Gemini, and Claude, and their underlying foundation models, are good at predicting which words are most likely to come next in a sentence, which results in generally coherent text.

Now the technology has progressed to the point where the “deep research” modes in Gemini and ChatGPT can produce coherent papers that are more than five pages long, with verifiable links to sources of information. The more context the user provides, in the form of a detailed prompt, documents, notes, images, weblinks and so on, the more likely the resulting work will be coherent, on point, and mostly accurate.

However, one area where generative AI tools sometimes struggle is in stating facts or quotations accurately. This means that these tools sometimes generate claims that sound real, but to an expert are clearly wrong. These errors can be subtle and nuanced, or blatantly obvious. 

The best way to become familiar with the capabilities and limitations of the tools is to try them. Their capabilities continue to grow, so we recommend continuing to engage with the tools to keep your knowledge of their abilities current.    

Updated:  June 18, 2025 

Instructors are welcome and encouraged to test publicly available free versions of ChatGPT, Gemini, Claude, Cursor, Semantic Scholar, Perplexity and other tools, as long as they do not put sensitive information into these unprotected systems. You can also test AI tools to assess their capabilities, for instance to see how they respond to the assignments used in your courses, how they improve the readability and grammar of a paragraph, or how they answer typical questions students may have about course concepts. Experimentation is also useful for assessing the limitations of the tools.

However, confidential information should never be entered into unprotected AI tools. Content entered into ChatGPT, Gemini, or other public, unprotected tools may become part of the tool's training data. The protected tools available through the University keep users' prompts and other information private and secure.

Updated:  June 18, 2025 

This is a threshold question that instructors may want to consider.  Mainstream media has been covering this issue extensively, and alternate viewpoints are widely available.  There is also a growing body of research literature on this topic. 

Given that generative AI systems are trained on materials that are available online, it is possible that they will repeat and amplify biases present online. OpenAI and other companies have invested substantial effort in addressing this problem, but biases remain inherent in these types of systems. You may also want to familiarize yourself with questions about the way the technology was developed and trained (e.g., who were the people who trained it? Whose work was it trained on?), the way we use the responses it provides, and the long-term impacts of these technologies on the world. There are also concerns about the environmental impacts of these technologies, given the amount of electricity required (especially for model training) and the water used for cooling the data centres needed to train and run these systems. Even as the technology becomes more energy efficient, the absolute volume of use is rising globally.

The Provost is seeking input from faculty and staff experts on these larger questions involving generative AI systems and welcomes debate and discussion on these issues. 

Updated: June 18, 2025 

There remains significant legal uncertainty concerning the use of generative AI tools in regard to copyright. The best source of current information is the U of T Libraries webpage, Generative AI tools and Copyright Considerations.

This is an evolving area, and our understanding will develop as new policies, regulations, and case law become settled. Some of the concerns surrounding generative AI and copyright include:

  • Input: The legality of the content used to train AI models is unknown in some cases. There are a number of lawsuits originating from the US that allege the development of Generative AI tools infringed on copyright and it remains unclear if and how the fair use doctrine can or will be applied. In Canada, there also remains uncertainty regarding the extent to which existing exceptions in the copyright framework, such as fair dealing, apply to this activity. 
  • Output: Authorship and ownership of works created by AI is unclear. Traditionally, Canadian law has indicated that an author must be a natural person (human) who exercises skill and judgement in the creation of a work. As there are likely to be varying degrees of human input in generated content, it is unclear in Canada how it will be determined who the appropriate author and owner of works are. More recently, the US Copyright Office has published a multi-part report on Copyright and Artificial Intelligence.   

Student Use of Generative AI

Updated:  June 18, 2025 

Yes. Instructors may wish to use the technology to demonstrate how it can be used productively, or what its limitations are. The U of T Teaching Centres are continuing to develop more information and advice about how you might use generative AI as part of your learning experience design. 

We also encourage you to consider making use of the new modules on AI literacy for students that are available from the Digital Learning Innovations office: Open Educational Resources for AI literacy. These modules, which were developed in collaboration with faculty at U of T, can be customized or used as-is. They are designed to be used in a Quercus course website, or downloaded for use in PowerPoint or Word documents. 

You can ask your students to use the protected version of Microsoft Copilot or Scopus AI (through the U of T Library). However, keep in mind that asking or requiring your students to access other tools is complicated by the fact that they have not been vetted by the University for privacy or security.  The University generally discourages the use of such systems for instruction until we are assured that the system is protecting any personal data (e.g., the email address used to register on the system). These tools should be considered with the same cautions as other third-party applications that ingest personal data. 

If you decide to ask or encourage students to use an AI system in your courses, there are a few issues to consider before you do so: 

  • Never input confidential information or student work into an unprotected/unvetted AI tool. All content entered may become part of the tool’s training set. 
  • Note that if you ask ChatGPT or other tools whether they wrote something, like a paragraph or other work, they will not give you an accurate answer. 
  • There may be some students who are opposed to using AI tools. Instructors should consider offering alternative forms of assessment for those students who might object to using the tools, assuming that AI is not a core part of the course. 
  • Instructors should consider indicating on their syllabus what AI tools may be used in the course and, as relevant, identify restrictions to this usage in relation to learning outcomes and assessments. 
  • Be aware that not all text that generative AI technology produces is factually correct. You may wish to experiment with ChatGPT and other tools to see what kinds of errors they generate; citations are sometimes fabricated, and inaccurate prompts are sometimes taken as fact.
  • Different tools will create different responses to the same prompt, with varying quality. You may want to try several different systems to see how they respond. 
  • There is a risk that Large Language Models may perpetuate biases inherent in the material on which they were trained. 
  • Companies may change their terms of use without notice. If you plan on using a system in the classroom, consider having a back-up plan. Because of the University’s relationship with Microsoft, use of Microsoft Copilot may help you avoid unexpected and disruptive changes in terms of use or model availability. 

New: August 29, 2024

Undergraduate and graduate students may use GenAI tools as learning aids, for example, to summarize information or test their understanding of a topic. These tools should be used in a manner similar to consulting library books, online sources, peers, or a tutor. Such uses are generally acceptable even if an instructor has stated that AI tools are not otherwise permitted in the course. These uses typically do not need to be cited or disclosed.

However, if students use information from GenAI tools that would normally require citation (e.g., quoting, paraphrasing, or reproducing text or ideas), they must cite their use according to guidelines provided by U of T Libraries or as directed by their instructor.

Instructors may require students to describe their use of GenAI in assignments, but this must be clearly stated in the syllabus and assignment instructions.

Students should be aware of the limitations of GenAI tools, including the potential for generating inaccurate or biased information, and must adhere to copyright and intellectual property guidelines as outlined in the FAQ, “Can students use GenAI to generate study and learning aids derived from course materials?”

New: August 29, 2024

Students may use AI tools to generate learning aids (e.g., quizzes, flashcards, summaries) as long as they comply with copyright and intellectual property guidelines and the appropriate use guidelines in the FAQ, “Are students permitted to use AI tools as a learning aid?”

What materials can be used to generate learning aids?

As a general rule, course materials should not be uploaded to GenAI systems, because some AI tools use uploaded documents or prompts to train their models. Students may violate copyright laws if they submit materials to third-party tools without owning the IP or copyright. Course materials developed by instructors (e.g., syllabus, lecture notes, slides) are their intellectual property, and course texts (e.g., textbooks, readings) are likely copyrighted.

However, students can use the institutionally-approved version of Microsoft Copilot to generate study and learning aids, because of the data protection offered through this tool. 

To use the institutionally-approved version of Microsoft Copilot:

  • Log in to the U of T-approved version at copilot.microsoft.com using your U of T credentials.
  • Verify data protection by checking for the green shield and checkmark symbol in the upper right-hand corner. This symbol indicates data protection but does not guarantee accuracy or quality of the information.

Use AI-generated materials for personal use only—do not share them with others or commercialize them.

Students should also:

  • Let instructors know if they identify particularly useful approaches to using GenAI to support their learning, or if they have questions about the learning materials they generate.
  • Obtain explicit instructor permission to record or capture class lectures. Recording or using AI to capture transcripts without permission is not allowed.

Updated:  June 18, 2025 

The University expects students to complete assignments on their own, without any outside assistance, unless otherwise specified. However, for the purposes of transparency and clarity for students, instructors are strongly encouraged to go further and to specify what tools may be used, if any, in completing assessments in their courses. Written assignment instructions should indicate what types of tools are permitted; vague references to not using ‘the internet’ will generally not suffice.   

If you are permitting, or even encouraging, students to use generative AI tools for developing their assignments, be explicit about this on the syllabus. Consider what tools and what use is acceptable. Can students use it for critiquing their work? For editing? For creating an outline? For summarizing sources? For searching the literature? (noting that Scopus AI is freely available to students through the U of T Library). You may also want to ask students to reflect on how they used the tools to improve their writing/learning process. 

The University has created sample language that instructors may include in their course syllabi to clarify for students if the use of generative AI tools for completing course work is acceptable, or not, and why. We also encourage instructors to include information on assignment instructions to explicitly indicate whether the use of generative AI is acceptable or not. 

If an instructor indicates that use of AI tools is not permitted on an assessment, and a student is later found to have used such a tool in a manner that was not permitted, the instructor should meet with the student as the first step of a process under the Code of Behaviour on Academic Matters.   

However, given the state of the technology, and its likely future evolution, you may find that traditional forms of assessment, even with syllabus rules in place, are simply ineffective at gauging student learning. The Centre for Teaching Support & Innovation's Generative AI in the Classroom webpage has some tips, and you might also consider meeting with, or attending a workshop at, your local Teaching Centre to get more information about assignment design. We also encourage you to work with colleagues in your field as new norms of practice emerge. Consider your learning goals and your students' needs, and how you can best achieve those goals given this new technology.

Updated: August 29, 2024

Beyond use of GenAI tools as a general learning aid (see “Are students permitted to use AI tools as a learning aid?” above), if an instructor specified that no outside assistance was permitted on an assignment, the University would typically consider a student's use of generative AI to be use of an “unauthorized aid” under the Code of Behaviour on Academic Matters. Instructors should keep in mind that students might receive a range of instructions across different courses about what constitutes appropriate use of AI within each course. We therefore encourage all instructors to be very transparent and clear as to whether, and in what ways, use of AI is permitted on any given assessment.

Updated:  June 18, 2025 

The University does not support the use of AI-detection software programs on student work. None of these software programs have been found to be sufficiently reliable, and they are known to incorrectly flag instances of AI use in human-written content. Detection tools often use metrics like sentence complexity, vocabulary richness, and syntactic variety to estimate whether text was likely written by a human or AI. If writing is too simple or too uniform, or unusually polished with a lack of variation, it may cause a detector to flag the work as AI-generated. 

In addition, sharing your students’ work (i.e., their intellectual property) with these software programs without their permission also raises a range of privacy and ethical concerns.  

However, instructors are encouraged to continue to use their traditional methods for detection of potential academic misconduct, including meeting with a student to discuss their work in person, using short in-person assessments that ask students to expand on the ideas in their out-of-class assignments, and so on.

Updated: April 10, 2024

Yes. If you use multiple-choice quizzes/tests, assume that generative AI systems will be able to answer the questions unless they pertain to the specifics of a classroom discussion, the content of which cannot be found on the internet. Some instructors may wish to test the capability of generative AI systems by using their multiple-choice/short answer assessments as prompts, and reviewing responses from a variety of tools (e.g., ChatGPT, Microsoft Copilot, Gemini, Perplexity, Poe Assistant, etc.).  

Updated: June 18, 2025 

Talk to students about your expectations for generative AI use, and any limitations you have established in your course policies. Describe how you understand the impact of generative AI on particular learning outcomes and, if you have set limits, your rationale for setting limitations on its use. Let students know clearly, both verbally and in assignment instructions, what tools may or may not be used to complete the assignment. Advise students of the limitations of the technology, and its propensity to generate sometimes erroneous and/or sub-par content. You might also show students what AI produces when responding to an assignment prompt, and how it differs from your expectations of quality work. 

Increasingly, prevention and detection of AI use will not be feasible. It is important that instructors consider how students may use AI tools to ensure that assessments continue to provide meaningful checkpoints on student learning. 

Visit the Centre for Teaching Support & Innovation’s webpage, Generative AI in the Classroom, for course and assessment design considerations. 

Updated: September 29, 2023

Students and faculty can refer to the U of T Libraries Citation Guide for Artificial Intelligence Generative Tools, which provides guidance on how to cite generative AI use in MLA, APA and Chicago Style.

Updated: April 25, 2025 

The School of Graduate Studies (SGS) has posted Guidance on the Appropriate Use of Generative Artificial Intelligence in Graduate Theses which will be of interest to graduate students, supervisors, supervisory committee members, Graduate Chairs and Graduate Units. 

Updated: June 18, 2025 

There is a growing number of tools. ChatGPT, Google Gemini, and Claude are just a few of the popular conversational AI tools that allow students to interact with Large Language Models (LLMs). In addition, there are many tools specifically designed for coding, such as Cursor and Windsurf. Some LLMs, such as Llama, can also be downloaded and run locally on a laptop. These tools can be used to generate, edit or analyze text and code, and in many cases also images, structured data, and video. LLMs can also be used for search and academic research.

Other text, image, video or code generation or editing capabilities might be embedded within other software or systems (e.g. Grammarly, Google AI overviews, photo editing software, etc.). Users may not even be aware that they are interacting with generative AI when using these embedded capabilities. 

It is also worth noting that a variety of products (online and mobile apps) have appeared that use GPT-4, Gemini or other AI models and require paid subscriptions. Some add features such as editing tools and templates. Others, however, do nothing more than the free versions and are meant to fool people into paying for a service that is currently free.

Instructor Use of Generative AI

Updated: June 18, 2025 

Faculty, staff, and librarians should either sign into Microsoft Copilot with their U of T credentials, or purchase a subscription for ChatGPT Edu through the Library Licensed Software Office (LSO), for AI use that involves personal or institutional data or information, copyrighted materials, or other professional or academic uses where data security and privacy are important. Please refer to the U of T Information Security guidelines on Using Artificial Intelligence Intelligently for more details about AI information security.

For more information on what tools are currently available, and to whom, please visit the CTSI website that lists currently approved tools.

Updated: June 18, 2025

GenAI tools may assist instructors in developing or updating course materials. 

Instructors who make substantive use of these tools are encouraged to acknowledge this in their syllabus, or course documents, similar to how they would acknowledge materials borrowed or adapted from a colleague. This serves as a model for students regarding the expected use of GenAI tools. 

Note that copyright ownership of outputs produced by generative AI is currently unsettled in law. Instructors considering using these tools should: 

  • Understand that while they can create content with these tools, they may not own or hold copyright over the generated works. 
  • Avoid inputting confidential information or intellectual property they do not have rights to use (e.g., student work or questions without permission). If entered into an unprotected tool, the content may become part of the tool’s training set. If entered into a protected tool, such as the U of T Microsoft Copilot or ChatGPT Edu tools, professional etiquette dictates asking permission from the owner before using the content in this way. 
  • Review each tool’s terms of service, which govern the use and ownership of inputs and outputs. Note that if the tool is not licensed by the University, these terms can change without notice. 

For more information, refer to U of T Libraries’ Generative AI tools and Copyright Considerations.

Updated: June 18, 2025

Please note that the instructor is ultimately responsible for ensuring the grade accurately reflects the quality of the student’s work, regardless of the tools used. The University asks that you not submit student work to any third-party software system for grading, or any other purpose, unless the software is approved by the University.  A completed assignment, or any student work, is the student’s intellectual property (IP), and should be treated with care. 

The University currently has several licensed software tools available for facilitating grading, such as SpeedGrader and Crowdmark. These systems safeguard students' IP while also supporting the grading process. In the future these types of systems may include AI-powered grading assistance. If instructors use generative AI to support them in providing formative feedback, they are responsible for structuring and reviewing that feedback to ensure it follows effective practices: feedback should be timely, constructive, and appropriate in volume.

TAs must consult with the course instructor before using generative AI to assess or provide feedback on student work.  

This is an area that is evolving quickly, and divisions at the University may be considering new rules as practices and tools emerge. Therefore, we recommend that you talk with your Chair, Director, or the appropriate Vice Dean before implementing AI into your grading processes. 

Updated: June 18, 2025 

In May 2024, the U of T Artificial Intelligence Task Force was established to develop a vision and strategy to guide the University’s AI activities, and to guide the integration of AI within our teaching, learning, and administrative processes and frameworks, ensuring alignment with our core values and mission. The Task Force and its working groups have prepared reports and developed recommendations for the U of T community. Information and updates about the Task Force are available on the U of T Artificial Intelligence site. 

In addition, a Generative AI in Teaching and Learning Working Group, chaired by the Centre for Teaching Support & Innovation, coordinates and plans for the instructor resources needed to support generative AI in the classroom. There are also groups around the University (e.g., the libraries) that are tracking the technology and identifying opportunities and issues that we will need to confront.

Offices and divisions across the University have developed guidance for AI use in teaching, research and other academic activities. We share some of these below:  

DLI: Open Educational Resources for AI literacy. These modules can be customized or used as-is. They are designed to be used in a Quercus course website, or downloaded for use in PowerPoint or Word documents.  

CTSI: Teaching with Generative AI at U of T  

UTSC: Using GenAI: Faculty  

UTM: Building AI Literacy   

A&S: Assessment Design and Generative AI; Generative AI Policies and Resources  

School of Graduate Studies: Guidance on the Appropriate Use of Generative Artificial Intelligence in Graduate Theses  

Libraries: GenAI tools and Copyright Considerations; GenAI Digital Literacies Framework   

Centre for Research & Innovation Support (CRIS): GenAI Considerations in Academic Research  

Information Security: Use artificial intelligence intelligently  

Have feedback or want more information?

If you have any suggestions for teaching and learning resources that would be helpful to you as a course instructor, or if you have any other questions about generative AI at U of T that are not addressed through this FAQ, contact us now: