Artificial intelligence (AI)
Be part of the conversation on AI in teaching and learning!
This new web space was designed to provide a hub for thinking about the impact of artificial intelligence on teaching and learning in higher education.
You will find a selection of recent publications, links to training activities, resources, and above all, a forum to actively participate in the conversations pertaining to AI in a university setting. Engage with your colleagues, ask questions, suggest interesting resources, share your personal experiences and the strategies you’re using in your teaching.
To add to the conversation, please scroll down to the Post a message section. You can also include a link, an image or a video by clicking on the corresponding icon. To attach a document, click on Upload an attachment (accepted formats: doc, docx, jpg, jpeg, odt, pdf, png, ppt, pptx, xls, xlsx, zip; maximum size: 30 MB). You will be invited to fill in your name and email so we can contact you if necessary. When you are ready, click the Submit button.
Given that AI technology is rapidly evolving, the content on this page will be updated regularly, and resources will continue to be added to help those who are teaching.
Special events series
The Teaching and Learning Support Service (TLSS) is launching a new series of events around artificial intelligence (AI) in university teaching and learning. Based on a participatory and collaborative approach, we are pleased to welcome a variety of collaborators to share their expertise and experiences in AI in teaching and learning over the next few months.
We also offer webinars organized by other academic institutions or organizations of interest.
These webinars are offered as free additional resources. The opinions expressed in them do not necessarily reflect the views of the TLSS or the University of Ottawa.
For many people in the world of education, helping to democratize our students’ access to knowledge, skills, employment and well-being is a key part of what we do and why we are so committed to our work...
This event is organized and managed by Contact North. You will be redirected during the registration process. If you require technical support, please communicate directly with Contact North. This webinar is offered as a free additional resource. The opinions expressed in it do not necessarily reflect the views of the TLSS or the University of Ottawa.
The launch last November of the latest version of OpenAI's ChatGPT application has created unprecedented media hype. What makes this tool so different is the breadth of its capabilities, which range from providing an immediate answer to a simple, factual question to writing essays and generating computer code. Over 100 million users have tested the application in just a few months.
Want to learn more? Check out the resources we have selected for you!
Does ChatGPT Change... Everything? | The Agenda
Date: January 16, 2023
Duration: 31:53 min
What is Chat GPT? OpenAI's Chat GPT Explained
Date: December 14, 2022
Duration: 9:16 min
BBC Science Focus – February 2, 2023
Alvin Powell The Harvard Gazette – February 14, 2023
Katie Metzler & ChatGPT
Social Science Space – December 7, 2022
Below we have provided some simple suggestions from various authors that can be easily implemented to address some of the challenges associated with the emergence of artificial intelligence in teaching and learning.
Course design and teaching
Creating a positive, inclusive, and safe learning environment needs to start in the very first class. It is important to talk to students about the elements that will maximize their learning experience. There is often a list of items that must be made explicit (organization of the course, intended learning outcomes, expected behaviours, communication guidelines, evaluations, etc.). We now need to add the use of artificial intelligence to the list, and in particular the boundaries that must be respected in your course. Cohen (2023), for example, explains why these conversations are important.
Students tend to approach new technologies through a lens of play and experimentation, seeking to uncover capabilities and limitations through trial and error. Despite the notion of students as "digital natives," it is important to recognize that students do not inherently understand how to use tools like ChatGPT for academic purposes. It’s from this perspective that ChatGPT opens opportunities for educators to teach students about these tools—to have important conversations with students about the powers, limitations, and ethical uses of advanced technological tools in education contexts.
Principles of academic integrity must be clearly outlined and openly discussed with students. It may be important to rethink key messages and include the new reality of artificial intelligence. Since AI may be used in different ways in different courses, these conversations will be essential to guide students.
As Mills and Goodlad (2023) note, in some contexts, students may be asked to confirm that the work submitted is their own and not that of another person or an automated system. They continue by stating that “this practice has long been used to deter plagiarism and can be adapted to include text generation”. Here is an example they suggest.
I certify that this assignment represents my own work. I have not used any unauthorized or unacknowledged assistance or sources in completing it, including free or commercial systems or services offered on the internet.
In a very different context, D'Agostino (2023) shares the experiences of a group of faculty members that illustrate how important the uses of AI and especially the conversations around academic integrity are. The experience shared by an English professor from North Carolina State University highlights how students might receive competing messages.
For the past few semesters, I’ve given students assignments to “cheat” on their final papers with text-generating software. In doing so, most students learn—often to their surprise—as much about the limits of these technologies as their seemingly revolutionary potential. Some come away quite critical of AI, believing more firmly in their own voices. Others grow curious about how to adapt these tools for different goals or about professional or educational domains they could impact. Few believe they can or should push a button to write an essay. None appreciates the assumption they will cheat.
Grappling with the complexities of “cheating” also moves students beyond a focus on specific tools, which are changing stunningly fast and towards a more generalized AI literacy.
The writing process is a component that contributes to learning. Again, it is important to help students understand this fundamental idea. Mills and Goodlad (2023) provide a simple summary of arguments that could be discussed with students.
Make explicit that the goal of writing is neither a product nor a grade but, rather, a process that empowers critical thinking. Writing, reading, and research are entwined activities that help people to communicate more clearly, develop original thinking, evaluate claims, and form judgments.
The assessment of learning has always been an aspect that raises questions and challenges in an educational context. The arrival of new technological tools based on artificial intelligence, such as ChatGPT, only adds another layer to a process that is already complex and difficult to navigate for many educators.
In the long term, this new disruption will undoubtedly lead to a rethinking of the importance placed on assessment and best practices (pedagogical alignment, authentic assessment, etc.) that should already be in place in a university setting. In the short term, however, we must face this new reality and adapt accordingly to prevent the assessments we propose to students from becoming completely meaningless.
Mills and Goodlad (2023) offer (current) strategies that can be considered when assessment tasks require students to develop text.
- Requiring the use of sources and citations. Currently, an application like ChatGPT is not very reliable when it comes to citing sources. The results are often inaccurate and in some cases completely fabricated. Requiring students to produce texts that are supported by relevant academic sources and cited appropriately is a practice that is already part of the university culture.
- Designing a question or proposing an analysis based on an image, diagram, audio or video excerpt. Currently, this kind of analysis cannot be effectively handled by an application like ChatGPT.
- Designing a question or proposing an analysis based on a recent event. Note that the current version of ChatGPT is based on data prior to 2021. Where possible, base your questions or topics on recent research or current events, since doing so will prevent students from effectively using the application.
- Designing a question or proposing an analysis based on discussions that took place in class. Ideas, arguments, or conclusions drawn from classroom activities are another way to personalize student work and make the use of AI-based technologies less appealing.
- Designing a question or proposing an analysis that requires a comparison (between the positions of two or more authors, between an author's view and the students' personal experiences, between an author's view and a class discussion, etc.). Again, this type of task that requires intersecting or contrasting ideas often produces rather superficial results in current artificial intelligence-based applications. The complexity of the reasoning needed requires thinking skills that are, for the moment at least, essentially human.
The few examples proposed above clearly demonstrate that the assessment of student learning and competencies should be based on the use of higher-order thinking skills (making comparisons, classifying, sequencing, analyzing cause and effect, hypothesizing, critiquing, etc.). Since AI-based applications can now play a role in students' learning experiences, it is important to question some of our current assessment practices and to refocus our efforts on more authentic assessment models. The assessment of learning must therefore be based on skills that are still specific to human thinking.
Kevin Jacob Kelley
Inside Higher ED – January 19, 2023
University Affairs – February 3, 2023
Inside Higher ED – January 31, 2023
The emergence of artificial intelligence-based technologies has quickly raised concerns about situations that call for rigorous integrity principles. As with past technological innovations (calculators, spell checkers, translation tools, etc.), these new technologies invite us to reconsider how their use can be aligned with the principles of integrity that are essential in an academic setting.
Some authors urge caution. Eaton's (2022) comments are along these lines.
- Using artificial intelligence for school work does not automatically equate to misconduct.
- Artificial intelligence can be used ethically for teaching, learning, and assessment.
- Trying to ban the use of artificial intelligence in school is not only futile, it is irresponsible.
- Human imagination and creativity are not threatened by artificial intelligence.
- Assessments must be fit for purpose and should align with learning outcomes.
- Artificial intelligence is not going anywhere. We must learn to work with new technology, not against it.
For now, existing policies can support faculty members who may face situations involving these new technological tools and issues of academic integrity.
Information Matters – February 16, 2023
TechCrunch – February 16, 2023
The Globe and Mail – January 31, 2023
Below are additional resources from other academic institutions or interest groups that are also concerned about the issue of artificial intelligence in post-secondary teaching and learning.
Do you have resources related to AI in teaching and learning that you would like to share with the university community? You can send them to the TLSS team, and we will add them to this page.
How AI Tools such as ChatGPT are changing conversations in Higher Education | UBC
Date: February 28, 2023
Duration: 82:13 min
ChatGPT and the Future of Education | GRAILE AI
Date: February 8, 2023
Duration: 58:32 min
Higher Education’s Thoughtful Response to Robot Writing | Studiosity
Date: February 6, 2023
Duration: 61:09 min
AI and Academia: The End of the Essay? | Maple League of Universities
Date: January 31, 2023
Duration: 62:28 min
TLSS special series
AI in University Teaching and Learning
Event date: February 9, 2023
Hosts: Panel of professors
Transcript to come...
From Contact North special series
Content Creation Using AI: How AI Can Be Used to Build Courses and Learning Experiences
Event date: Wednesday, March 29, 2023
Host: Stephen Murgatroyd
Chief Innovation Officer, Contact North | Contact Nord
Stephen Murgatroyd, Ph.D., is a frequent keynote speaker and presenter at higher-education conferences around the world. He has consulted for governments in Chile, the US, the UK, Canada, Australia and New Zealand and has been an active management consultant for over 30 years.
Webinar offered by the Service de soutien à l'enseignement of Laval University.
ChatGPT et l'IA en enseignement universitaire - Partage de pratique (en français)
Event Date: April 5, 2023
Guest speakers: Stéphane Hamel, MBA, OMCP Lecturer, Faculty of Administrative Sciences and Alexandre Lepage, M.A., Ph.D. in Educational Sciences Lecturer, Faculty of Educational Sciences.
FAQ for professors
There is also an AI FAQ for students on the Academic Integrity for Students webpage (last section of the page).
While many generative AI systems have recently become available, ChatGPT is currently the most prominent, garnering worldwide media attention. This AI tool uses predictive technology to create or revise texts, including essays, computer code, lesson plans, poems, reports and letters. The products created are generally of good quality, although they may contain inaccuracies. We encourage you to try the system to test its capabilities and limitations.
In this FAQ, ChatGPT refers to the free, online AI chat system that uses OpenAI GPT technology. It is just one of several generative AI tools currently available.
Click on a question below to see the answer...
How can I test out ChatGPT to see its capability?
Instructors are welcome and encouraged to test ChatGPT, which is currently free to use upon registration. You can also test other similar AI tools to assess their capabilities: for instance, to see whether they can complete the assignments used in your courses, or how well they improve the readability and grammar of a paragraph. Experimentation is also useful for assessing the limits of the tool.
Please note that due to high demand, access to ChatGPT is at times unavailable.
Is ChatGPT accurate and reliable?
Large Language Models, like ChatGPT, are trained to predict the next word in a sentence, given the text that has already been written. Early attempts at addressing this task (such as the next-word prediction on a smartphone keyboard) are only coherent within a few words, but as the sentence continues, these earlier systems quickly digress. A major innovation of models such as GPT is their ability to pay attention to words and phrases which were written much earlier in the text, allowing them to maintain context for much longer and in a sense remember the topic of conversation. This capacity is combined with a training phase that involves looking at billions of pages of text. As a result, models like ChatGPT, and its underlying technology GPT-3, are good at predicting what words are most likely to come next in a sentence, which results in generally coherent text.
One area where generative AI tools often fail is in repeating facts or quotations. To a model trained to sound convincing, the only important aspect of a fact is that it sounds like a fact. This means that models like GPT-3 frequently generate claims that sound real, but to an expert are clearly wrong.
Related areas where ChatGPT seems to struggle include citations and discussion of any event or concept that has received relatively little attention in online discourse. To assess these limitations, you could try asking the system to generate your biography. Unless there are numerous accurate biographies of yourself online, ChatGPT is unlikely to generate a comprehensively correct biography.
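The next-word-prediction idea described above can be illustrated with a toy bigram model in Python. This is a deliberately simplified sketch (comparable to the smartphone-keyboard example, not to GPT itself, which uses neural networks with attention over long contexts rather than word-pair counts); all names and the sample corpus are illustrative.

```python
from collections import Counter, defaultdict

def train_bigram(corpus):
    """Count, for each word, which words follow it and how often."""
    counts = defaultdict(Counter)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(counts, word):
    """Return the most frequent next word, like a keyboard suggestion."""
    if word not in counts:
        return None
    return counts[word].most_common(1)[0][0]

corpus = "the cat sat on the mat and the cat slept"
model = train_bigram(corpus)
print(predict_next(model, "the"))  # prints "cat" ("cat" follows "the" twice, "mat" once)
```

Because this model sees only one word of context, its output digresses quickly, which is exactly the limitation that attention-based models were designed to overcome.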
What are the ethical considerations regarding the use of generative AI systems?
This is a threshold question that instructors may want to consider. Mainstream media has been covering this issue extensively, and alternate viewpoints are widely available.
Given that ChatGPT is trained on materials that are available online, it is possible that it will repeat biases present online. OpenAI has invested substantial effort into addressing this problem, but it remains a danger with these types of systems. You may also want to familiarize yourself regarding questions about the way the technology was developed and trained (e.g., who were the persons who trained it?), the way we use the responses it provides, and the long-term impacts of these technologies on the world.
The Provost is consulting with faculty and staff experts on these larger questions involving ChatGPT and other generative AI systems, and welcomes debate and discussion on these issues.
Can I use generative AI tools for pedagogical purposes in my classroom?
Yes. Some instructors may wish to use the technology to demonstrate how it can be used productively or what its limitations are. The TLSS is developing more information and advice about how you might use generative AI as part of your learning experience design.
Remember that asking or requiring your students to access these tools is complicated because the University has not vetted them for privacy or security. The University generally discourages using such systems for instruction until we are assured that the system protects personal data (e.g., the email address used to register on the system). If, as media reports have suggested, a version of GPT becomes part of the Office365 suite, it may become part of the software suite available to all students and faculty. However, until the University formally approves the use of specific generative AI software, these tools should be considered with the same caution as other third-party applications that ingest personal data.
If you decide to ask or encourage students to use this or other AI systems in your courses, there are a few issues to consider before you do so:
- There may be some students who are opposed to using AI tools. Instructors should consider offering alternative forms of assessment for those students who might object to using the tools, assuming that AI is not a core part of the course.
- Instructors should consider indicating on their syllabus that AI tools may be used in the course and, as relevant, identify restrictions to this usage regarding learning outcomes and assessments.
- Be aware that not everything that generative AI technology produces is correct. You may wish to experiment with ChatGPT to see what errors it generates; citations are often fabricated, and inaccurate prompts are sometimes taken as fact.
- There is a risk that ChatGPT may produce plagiarized text or perpetuate biases inherent in the material on which it was trained.
Are students permitted to use AI tools to complete assessments?
The University expects students to complete assignments independently, without any outside assistance, unless otherwise specified. Instructors are strongly encouraged to speak to their students about what tools are permitted to complete assessments. Written assignment instructions should indicate what tools are permitted; vague references to ‘the internet’ will generally not suffice today.
Suppose an instructor indicates that AI tools are not permitted on an assessment, and a student is later found to have used such a tool on the assessment. In that case, the instructor should consider meeting with the student as the first step of a process under the Code of Behaviour on Academic Matters.
Some students may ask if they can create their assignment outline or first draft using ChatGPT and then edit the generated text; decide in advance what your response to this question will be, and consider addressing it proactively when discussing the assignment with your students.
You may wish to consider some of the tips for assessment design below. You might also consider meeting with or attending a workshop at your local Teaching Centre to get more information about assignment design. Consider your learning goals for the assignment and how you can best achieve them in light of this new technology.
Would the University classify use of generative AI systems as an academic offence?
If an instructor specified that no outside assistance was permitted on an assignment, the University would typically consider the use of ChatGPT and other such tools to be use of an “unauthorized aid” under Academic Regulation I-14, or “any other form of cheating”.
It is also vital to note that because generative AIs are trained on existing data, they are at risk of ‘generating’ text that was in fact written by a real person in the past. This can result in a student unintentionally plagiarizing a source on which the model was trained.
Can I or should I use one of the new AI-detectors such as GPTZero?
The University discourages the use of AI detectors on student work. The quality of such detectors has not yet been confirmed, and AI technology is developing quickly enough that the detectors are unlikely to keep pace with it. For instance, some detectors base their assessment of whether a piece of writing was generated by AI on its level of sophistication; assuming that a relatively simply phrased assignment is the work of an AI tool would have significant negative impacts on students. Instead, consider some of the tips below on assessment design. You may also wish to submit your assessment topic to ChatGPT to see what type of answer comes out.
Sharing your students’ work with these detectors without their permission also raises a range of privacy and ethical concerns. The University has noted that companies like Turnitin (which owns Ouriginal) are working on their own versions of detectors.
How can I prevent students from using ChatGPT or similar tools on my assignments?
Talking to students about ChatGPT and its limitations will let students know that you are well aware of the technology and will likely generate interesting discussion and help to set guidelines for students. Let students know clearly, both verbally and in assignment instructions, what tools may or may not be used to complete the assignment. Advise students of the limitations of the technology and its propensity to generate erroneous content.
If you choose not to allow the use of AI tools on your assignments, here are some tips for designing assignments that generative AI systems will have difficulty responding to:
- ask students to respond to a specific reading, particularly one from the past year that may not be on the internet or may not have generated much commentary online. Generative systems struggle to create accurate responses to prompts for which there is little or no information on the internet.
- ask students to create a video or recording that explains or expands on their work.
- use a flipped classroom approach and/or assign group work to be completed in class, with each member contributing.
- ask students to create a first draft of an assignment, or an entire assignment, by hand in class. (Consider the accessibility needs of students who may require accommodations.)
- call on students in class to explain or justify elements of their work.
- ask students to use ChatGPT to generate material, and then ask them to critique GPT’s response.
- request citations in all written assignments, and if feasible, spot check them—the accuracy of ChatGPT’s citations is one of its gravest shortcomings.
- talk to your colleagues about ideas for your discipline. Different disciplines, such as computer science, history, language studies and visual studies may be developing new norms of pedagogy.
Can generative AI systems respond to multiple-choice or short answer questions?
Yes. If you use multiple-choice quizzes/tests, assume that generative AI systems will be able to answer the questions unless they pertain to highly specific subjects; new knowledge; or the specifics of classroom discussions, the content of which cannot be found on the internet. Some instructors may wish to test this by using their multiple-choice/short answer assessments as prompts, and reviewing ChatGPT’s responses.
Can I use ChatGPT or other AI tools to assess (i.e., grade) student work?
The University asks that you not submit student work to any third-party software system for grading, or any other purpose, unless the software is approved by the University. A completed assignment is the student’s intellectual property (IP), and should be treated with care.
If a student is permitted to use ChatGPT on my assignment, how should they cite it?
This question is still being actively debated by the global academic community. We expect to see standards of practice emerge in the coming months. The MLA has some guidance on the general question of citing AI output here. However, this guidance predates ChatGPT, and may become obsolete as these new tools take on a greater presence in academic writing.
What recommendations should I offer my students for creating a ChatGPT account (or any other account)?
When creating an account (ChatGPT or otherwise), websites may ask for personal information such as phone number, address, email, etc. If this is the case with any applications you wish to use with your students, you must first consult uOttawa IT to request a security assessment.
In the case of ChatGPT, only an email address and password are requested, so no such request is needed. However, we recommend informing your students that they should not use their uoAccess credentials: the email and password are stored on the provider's server, which could be hacked. Creating a separate account for each application limits the risk of access to multiple systems in the event of a breach.
Is ChatGPT the only generative AI system that my students might be using?
No. Large Language Model (LLM) technology is at the heart of a variety of generative AI products that are currently available, including writing assistant programs (e.g., Jasper, Writer, Moonbeam), image creation programs (e.g., DALL-E 2, Midjourney, Stable Diffusion) and programs to assist people who are creating computer code (e.g., GitHub Copilot).
It is also possible to build a system that uses this underlying technology (GPT-3 or another LLM) if you are interested in doing so. Note as well that a variety of products (online and mobile apps) have appeared that use GPT-3 or other LLM technology and require paid subscriptions. Some add features such as editing tools and templates; others do nothing more than the free version and are meant to fool people into paying for a service that is currently free.