Generative AI and Academic Integrity

The goal of this guide is to help students learn about the relationship between generative AI and academic integrity.

Potential harms

The use of generative AI to complete assignments comes with potential harms to students’ learning.

Below is a list of potential harms. Can you think of others that might belong on this list?

(Note: This list was generated by ChatGPT, with just a few small changes. Consider: why might the author of this guide have used ChatGPT for this task? How does it make you feel about the quality and authority of the information?)

  1. Academic Misconduct Risk: One of the most significant concerns is the potential for students to misuse these tools by copying or paraphrasing content generated by the AI without proper attribution. This can lead to academic misconduct and severe consequences.
  2. Reduced Critical Thinking: Overreliance on AI tools may discourage students from engaging in critical thinking, research, and problem-solving on their own. They might become overly dependent on the AI for answers.
  3. Loss of Writing Skills: Excessive use of AI for writing and editing may hinder the development of students' writing and editing skills. They may rely on AI to fix errors instead of learning to identify and correct them themselves.
  4. Misleading Information: AI tools like ChatGPT can sometimes provide inaccurate or outdated information. Students must verify the information they receive and not blindly trust AI-generated content.
  5. Privacy Concerns: Using AI tools may involve sharing sensitive or personal information, which raises privacy concerns, especially if the service provider mishandles or misuses data.
  6. Dependency: Students who use AI extensively may become overly reliant on these tools, making it difficult to complete assignments when the tool is unavailable.
  7. Bias and Stereotyping: AI models like ChatGPT can perpetuate biases present in their training data, potentially leading to biased or inappropriate content generation, which is especially problematic in sensitive topics or essays on social issues.
  8. Inauthentic Work: If a significant portion of an assignment is generated by AI, it may not reflect the student's true knowledge or abilities, potentially leading to disparities between their perceived and actual academic skills.
  9. Detriment to the Learning Process: Students who use AI tools as shortcuts may miss out on the valuable learning experiences that come from struggling with challenging assignments, seeking help from professors or peers, and refining their skills through practice.
  10. Lack of Human Interaction: Overuse of AI tools might reduce human interaction and collaboration, which are essential aspects of the learning experience in college.