Purpose
This guidance establishes best practices and requirements for the ethical, responsible, safe, and legal use of Artificial Intelligence (AI) technologies across academic, research, and administrative functions at the University of Wisconsin-Eau Claire. It is intended to foster understanding and education about AI technologies across the campus community, protect UW-Eau Claire data and systems, and support access for those who wish to use these tools. This guidance will be reviewed, updated, and continuously improved as faculty, staff, and students explore AI technologies and learn more about their use in our campus community.
Scope
This guidance is for all UW-Eau Claire students, faculty, staff, researchers, and third-party partners who use AI technologies or develop AI systems within the University’s technology ecosystem.
Procurement
Any AI technology procured at UW-Eau Claire must be reviewed by UW-Eau Claire Procurement and Learning & Technology Services to (1) verify that the end user license agreement allows legal use of the product by a public institution and (2) verify that the AI technology is properly obtained, licensed, and/or procured. Purchases must follow all University procurement rules and may not be placed on credit cards unless written pre-approval is obtained from UW-Eau Claire Procurement or Learning & Technology Services.
Faculty or staff members interested in procuring AI technologies as part of curriculum, research, or operations at UW-Eau Claire should follow the process outlined in the Purchasing: Software knowledge base article or email ltsconsulting@uwec.edu to get started.
Personal Liability
Generative AI tools such as Claude and Gemini use click-through agreements (for example, Anthropic's Claude terms of use or Google's AI service agreements), which are legally binding contracts. Individuals who accept click-through agreements without delegated signature authority may face personal consequences, including responsibility for compliance with the terms and conditions. Faculty and staff should be aware that agreeing to an AI tool's terms of service may create personal legal obligations.
University Data Usage
Any AI technology that will access, process, or store University data requires prior review and approval by Learning & Technology Services and must go through the risk assessment process required by the Universities of Wisconsin information security policies.
University data is defined as information collected and maintained by the University of Wisconsin-Eau Claire. This includes, but is not limited to, student demographics, academic records, financial information, research data, and personally identifiable information (PII).
In particular, the AI technology will be reviewed to verify:
- Its data is securely used, transmitted, and stored.
- It is protected against unauthorized access, tampering, and misuse via secure login, including multi-factor authentication (MFA).
- Its security measures, including encryption and access controls, have been implemented.
- It complies with all relevant data protection laws and regulations.
In addition:
- Generated data ownership and copyright must comply with SYS 1310.
- Faculty, staff, and students must be informed when they are interacting with AI technology.
Installation of AI Technology on University Property
Any AI technology that requires downloading and installation must be reviewed and approved by Learning & Technology Services to verify that it is legal, secure, and safe to install on UW-Eau Claire property, such as computers, servers, and the high-performance computing cluster.
Conditional Usage of Generative AI
Generative AI tools that require no procurement, University data, or installation on University property may be evaluated by faculty and staff for conditional use in accordance with the "Generative AI Tools: Guidelines for Conditional Usage" reference document.
Prohibited Generative AI Usage
Never input personal, confidential, proprietary, or sensitive information into public generative AI tools—assume all data may be exposed or misused. This includes student records subject to FERPA and any information classified as Medium or High Risk per SYS 1031.
Similarly, public instances of generative AI tools should not be used to generate output that would be considered confidential or sensitive. Examples include, but are not limited to, proprietary or unpublished research; legal analysis or advice; recruitment, personnel, or disciplinary decision-making; completion of academic work in a manner not allowed by the instructor; creation of non-public instructional materials; and grading.
Curriculum and Research Usage
University employees are responsible for knowing and following their department's or unit's rules for using AI tools, in addition to adhering to UW-Eau Claire and Universities of Wisconsin policies. Be sure you understand these rules before using AI for University work. If you are unsure, ask your supervisor or department head for clarification.