Data Privacy Guidelines

As AI tools become increasingly available, Weber State University is committed to ensuring that their use across campus is responsible, ethical, and aligned with institutional values. This page outlines core guidelines for using AI tools, with practical advice for students, faculty, and staff. Whether you're learning, teaching, researching, or supporting campus operations, these principles help you get the most out of AI—safely and effectively.

Why It Matters

AI tools can be powerful learning and productivity aids, but they also come with risks. Inaccurate outputs, plagiarism, misuse of personal data, or policy violations can have serious consequences.

These guidelines help ensure that:

  • Student learning is authentic and supported by academic integrity.
  • Faculty use of AI promotes innovation while protecting student data.
  • Staff members use AI tools to enhance work without compromising university standards.

General Guidelines for All Users

Data Considerations When Using AI Tools

At Weber State University, we encourage thoughtful and responsible use of AI tools to support academic, professional, and operational success. Users are expected to apply caution and good judgment when working with AI technologies.

A key best practice is to carefully consider the type of data you share with AI tools. Data ranges from public information, like published course catalogs or official press releases, to more sensitive or confidential information, such as student records, employee files, or financial and health data.

General Guidance

Public data (low sensitivity): Typically safe to use with AI tools, especially information already available to the public.

⚠️ Non-public, sensitive data (moderate sensitivity): Use with caution, and only with approved AI platforms. Includes private or protected information within the university, such as student coursework or internal communications.

🚫 Non-public, restricted data (high sensitivity): Do not use with AI platforms. Examples include Social Security numbers, financial details, health records, and any data that could cause significant harm if exposed.

Important Reminder

Be aware that AI platforms are available in different versions; make sure you are using a version appropriate for the sensitivity of the data involved. If you are unsure how sensitive your data is, err on the side of caution: treat it as sensitive and seek guidance from your supervisor, department lead, or the AI Task Force before sharing it with an AI service.

We will continue to update these guidelines as our data governance framework evolves. Stay tuned for more detailed classifications and best practices in the near future.

For Students

As a student, you can use AI tools to help you study, explore, and create, but only when you use them appropriately. These guidelines are here to help you uphold academic integrity while using AI to support your learning.

• Check with your instructor before using AI on any coursework.
• Understand what is considered plagiarism or misrepresentation when using AI.
• Use AI as a learning aid, not a shortcut.
• Protect your own data: never share personal or academic records with an AI tool.
• Visit the AI for Students section for training and examples.

For Faculty

Faculty are essential in shaping responsible AI use on campus. This section offers best practices for classroom integration, policy-setting, and ethical guidance as you explore how AI tools can support teaching and research.

  • Clearly communicate your AI policies in syllabi and course expectations.
  • Evaluate how AI tools can support teaching goals without replacing authentic learning.
  • Be cautious when asking students to use AI—review terms of use, data handling, and accessibility.
  • Use CETL’s AI resources for classroom integration strategies.

For Staff

Staff members are using AI more often in administrative work, communications, and support roles. These guidelines are designed to help you use AI safely and efficiently while protecting institutional data and meeting compliance standards.

• Use AI tools to enhance productivity, not to handle confidential data.
• Review IT Division News for safe AI adoption scenarios.
• Ensure compliance with FERPA and internal data security policies.
• Confirm tool approval with the Information Security Office if in doubt.

Still Have Questions?

AI at Weber State is a shared journey, and we're learning together. These guidelines will continue to evolve along with the technology. If you're ever unsure about how or when to use AI tools in a specific context, contact the AI Task Force at ai-taskforce@weber.edu for guidance and support.