CU faculty, staff and students push back against university-controlled AI rollout

Courtesy of Paul Wedlake, CU Denver

Hundreds of faculty members, students and staff across University of Colorado campuses are pushing back against a new OpenAI system launching March 31.

In February, the university entered a three-year, $2 million-a-year agreement, renewable annually, to provide ChatGPT Edu across the system to more than 100,000 students, staff and faculty.

Hundreds have signed a letter of dissent arguing that the rollout lacked transparency and technical oversight. Others say campus leaders haven’t adequately addressed concerns about student privacy, academic integrity, corporate influence and environmental sustainability.

Faculty at the CU Denver and Boulder campuses say the decision was reached without consulting campus experts in AI, ethics or education.

“It appears to all of us that this decision was based on following the AI hype rather than on any empirically supported educational benefits to students, staff, and faculty,” the petition reads.

Faculty say the rollout raises broader questions about how artificial intelligence will reshape teaching and research, how learning will be measured in the AI era, and how much influence technology companies should have inside universities.

Why the institution says campuses need their own AI system

Right now, research labs may use AI tools to flesh out ideas or to run journal articles and grant proposals through the systems. University officials argue that a private system better protects research and other proprietary information than public AI systems do.

Data show about 28,000 people at CU Boulder, including 3,000 faculty and staff, are already logging in to public AI tools with their university email accounts.

“But using institutional data on the public platform can expose students, faculty, staff and the university to security risks,” said CU Boulder chancellor Justin Schwartz, adding that a private system ensures sensitive data stays off public training models.

They also said the initiative ensures that all students have access to the same tools.

“Equitable access to this emerging technology is essential for our students and employees,” said CU President Todd Saliman, noting that the system removes financial barriers.

Officials said the new system won’t change university policies, including academic freedom, student codes of conduct or how data is governed. They also stress that using AI in classrooms or research is optional.

Users must complete a training course on ethics and privacy before gaining access to the new system. Additional resources will be available on how to use AI for research, teaching or administrative work.

Faculty pushback

While the university claims data won’t train OpenAI’s public models, faculty worry OpenAI will retain de-identified data to develop or improve the company’s services, essentially commercializing student activity.

“Even if identifiable student data cannot be used directly … aggregated or anonymized versions of student and faculty interactions can be used for product development,” said Joanne Addison, a CU Denver professor of English and former chair of the faculty assembly.

Though the contract states that student data remains the property of the university and prohibits selling it, faculty worry that it allows disclosure to law enforcement.

Surveillance concerns

CU Boulder Provost Ann Stevens said in a campus email that interactions are private and can’t be monitored by the university’s IT department. Faculty argue, however, that chat logs could be subject to Colorado public records requests, ending any expectation of privacy.

This could create risks for politically sensitive research. It also places responsibility for all input content on the university, which is problematic in cases of copyright or confidentiality violations or improper use of AI.

Stevens said the company must notify CU of any request for university data held in the OpenAI tool before responding. She didn’t specify whether faculty or students would be notified if their data were requested by law enforcement.

Harm to student learning

Faculty critics say the growing use of generative AI is already affecting classrooms. They fear AI shortcuts will encourage surface-level thinking and erode critical thinking skills, and they worry that the university’s adoption of the system normalizes that behavior.

Some instructors say they are already seeing AI-generated assignments, flawed grading and inaccurate content produced by the tools. They say they’re being forced to act as AI police despite having no reliable tools to detect cheating.

“The university’s sponsorship of such a product without attending to existing issues in our educational system presents a laissez-faire attitude to learning, laying sole responsibility at the feet of educators whose jobs are already often under-resourced,” the letter of dissent reads.

In a CU Denver faculty meeting, instructors questioned how the university will determine whether AI improves learning, since grades and pass rates may simply reflect AI doing much of the work.

Chancellor Kenneth Christensen stressed that faculty still control AI use in their classrooms, but said that responsible AI use, and how to combine AI with human judgment, may become new teaching responsibilities.

“We can’t hide from AI. AI is a part of our lives, AI is a part of the workforce, which means it’s going to be a long-standing part of our students’ lives,” he said.

Amy Hasinoff, chair of the learning technology committee, said she’s hearing from both faculty and students a desire to question that assumption.

“That in fact, do we have a responsibility to embrace AI? It feels like that’s a conversation that ... hasn’t happened yet.”

Financial concerns

The faculty also questioned the agreement’s cost.

While the CU system covers the initial $2 million a year, campuses could bear future costs – a major concern as UCCS faces a $12 million budget shortfall.

Critics say there’s no evidence AI improves academic outcomes and argue funds should instead support financial aid, graduate pay or mental health services.

Some faculty have also raised questions about OpenAI’s leadership and political donations.

Environmental impacts

Critics also cite tremendous water and power demands of AI, which disproportionately harm lower-income communities. Officials say CU will align AI adoption with its sustainability goals.

Christensen, a mechanical engineer, said studies show that the daily energy use of an active AI user making 15 prompts is equivalent to that of a short car commute.

“We navigate competing priorities every day based on our informed understanding of risks and consequences,” he said in a campus email. “The same restraint must guide how we integrate these AI tools across our institution.”

Faculty demands

Faculty critics are calling for a faculty-led process to define ethical use guidelines and AI literacy training.

They also want clear policies governing grading, a shift away from “productivity” metrics toward measures of students’ ability to learn, and a detailed plan for managing AI data.

Provost Stevens noted the three-year contract is a “pragmatic step to reduce risk now.”

“It does not preclude other tools, future partnerships or alternative models, nor does it signal endorsement of a single vendor as a long-term solution,” she said.

She acknowledged that faculty, staff and students were not broadly consulted before the contract was finalized but promised they would be going forward.

CU Denver faculty voiced concerns that the university isn’t ready to implement the new system. Christensen pointed to three new AI fellows charged with implementing the policy recommendations made last year and a soon-to-be-hired AI coordinator.

Renewing the contract annually will hinge on faculty feedback and evidence of academic value.