Generative AI Ethics & Privacy

As generative AI tools proliferate in academia, experts caution users to be aware of ethical considerations such as equitable access, data control & privacy, bias in underlying datasets, and convincing fakes. Whiting School of Engineering (WSE) encourages faculty and staff to learn about AI ethics and discuss these concerns with students. If you plan to use generative AI tools in your course(s), it is crucial that you inform students of potential risks, share appropriate policies, and implement strategies for mitigating these issues.

Generative AI Risks

Equitable Access

Some students cannot afford high-speed internet or subscription-based technologies, including AI tools with paid tiers, so design activities that do not disadvantage these students. 

Data Privacy

Understand what data you share with generative AI tools, especially Personally Identifiable Information (PII); data entered into these tools may be retained and used to train the underlying public models.

Inherent Bias

Be aware that datasets used to train generative AI tools contain bias, stereotypes, and other harmful content that do not reflect diversity at JHU and may misrepresent individuals and populations.

Convincing Fakes

Recognize that generative AI tools can produce convincing fakes that mimic real content, including fabricated people and names, fabricated scholarly citations (article titles and content), and inaccurate “facts.”

Learning Opportunities and Strategies for Mitigating Risks

Despite risks involved with using generative AI, there are many ways to incorporate these tools into teaching and learning—and they may even provide benefits to accessibility (e.g., automated alt text, audio descriptions, document tagging, etc.), personalized learning, and student support (e.g., 24/7 real-time AI teaching assistants). If you intend to use these tools in your course(s), be aware of the risks and consider best practices for mitigating potential issues.

Generative AI is not something we should seek to eliminate; instead, we should cultivate AI literacy. Although there are risks, this technology is an astounding resource that can be leveraged if we understand its capabilities and design curricula that focus on effective teaching and learning.

Course Policies on AI

If you plan to have students use AI tools in your course(s), WSE recommends that you clearly outline course policies and communicate concerns about data privacy, collection and storage.

Teaching Tips & Best Practices

  • Focus on effective teaching and learning
  • Specify course policies on the use of generative AI tools
  • Communicate concerns about ethical considerations and risks 
  • Create a culture of academic honesty to reduce the risk of misconduct 

Individual users are responsible for reviewing technology terms of service and privacy policies.

Beyond clearly outlining course policies, WSE recommends communicating concerns about PII and data privacy, including how data is collected and stored. You should also allow students to share their concerns with you and opt out of sharing their personally identifiable information, which may require providing alternative activities, assignments, or approaches for students who opt out. Consider including language from the widely shared syllabus template below.

Be sure to advise students against sharing their personally identifiable information and other sensitive data without appropriate consent, which includes understanding the technology terms of service and privacy policies. Students should be aware that any data submitted to generative AI tools may be used to train the underlying models and appear in future outputs.

Syllabus Template

“In this course, you (students) will be utilizing [specify the tool or platform], which serves the purpose of [explain why and how the students will be using the tool]. During the account creation process, you will be required to provide your name and other identifying information. The tool is hosted on servers located in [specify location]. By using this service, you consent to the storage of your information in [the specified location]. If you prefer not to provide your consent, please contact me [instructor’s email] to discuss alternative arrangements.”

AI for Activities or Assignments

If you intend to make AI tool usage mandatory for activities or assignments, WSE recommends providing alternatives for students who do not wish to share personally identifiable information. Alternatives may include: 

  • Having students use a pseudonym and/or generic email address
  • Having faculty create a shared account for students to use (that is not tied to the faculty member’s personal account)
  • Providing alternative assignments or activities that allow students to achieve the same learning goals without using these tools

Before implementing any of these options, it is essential to review the Terms of Use. For example, OpenAI’s Terms of Use currently state that access credentials or accounts should not be shared with individuals outside of the organization, and users are responsible for all activities conducted using their credentials.  

Ensuring Equitable Access

If you do use AI tools in your courses, WSE recommends taking steps to ensure equitable access for all students. Some generative AI tools, including ChatGPT, offer paid service tiers (e.g., ChatGPT Plus, which can write and execute code, among other enhanced features). Faculty should design course activities in ways that do not disadvantage students who cannot afford paid or premium versions.

Addressing Bias and Harmful Content

Generative AI tools are based on Large Language Models (LLMs) that ingest vast amounts of data. When the underlying data contains biased, discriminatory, abusive, or unreliable content, the outputs of these tools can be similarly problematic, offensive, and/or inaccurate. WSE recommends that faculty and staff acknowledge these concerns, discuss them with students, and implement strategies for mitigating these issues when using AI tools in teaching and learning.