Why Faculty Control Matters More Than Any AI Policy You Could Write


Every university has an AI policy by now. Most of them amount to the same thing: a document that tries to draw lines around what students can and cannot do with AI. What actually changes outcomes is not the policy. It's whether faculty have the tools to shape how AI interacts with their students inside the learning environment. When a professor can set the parameters of the AI, control its behavior per assignment, and decide what kind of support students receive, the AI becomes an extension of their teaching rather than a workaround for it.
That distinction is why structured use of AI improves learning outcomes, whereas unstructured use often undermines them.
What does "faculty control" actually look like in practice?
It means the institution and the professor, not the vendor, decide how the AI behaves in the course.
With a platform like Nectir AI, an instructor can configure their AI Course Assistant to:
- Use the Socratic method instead of giving direct answers
- Restrict the AI to only reference uploaded course materials (syllabi, lecture notes, rubrics)
- Block the AI from writing essays, solving homework, or completing assignments on behalf of the student
- Redirect students to office hours or tutoring after a set number of exchanges
- Align the AI's tone and pedagogy with how the instructor actually teaches
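The options above can be pictured as a simple mapping from instructor settings to system-prompt rules. The sketch below is purely illustrative: the function, field names, and settings are invented for this example and are not Nectir AI's actual configuration interface, which the article describes as a prompt library and natural-language settings.

```python
# Hypothetical sketch only: how faculty-set guardrails might translate
# into rules for an AI assistant's system prompt. All names here are
# invented for illustration, not Nectir AI's real schema.

def build_system_prompt(settings: dict) -> str:
    """Render instructor settings into one rule per line."""
    rules = []
    if settings.get("socratic_method"):
        rules.append("Guide students with questions; never give direct answers.")
    if settings.get("course_materials_only"):
        rules.append("Answer only from the uploaded course materials "
                     "(syllabus, lecture notes, rubrics).")
    for task in settings.get("blocked_tasks", []):
        rules.append(f"Refuse requests to {task} on the student's behalf.")
    limit = settings.get("max_exchanges")
    if limit:
        rules.append(f"After {limit} exchanges, direct the student to "
                     "office hours or tutoring.")
    return "\n".join(f"- {rule}" for rule in rules)

# Example: one instructor's guardrails for a single course.
course_settings = {
    "socratic_method": True,
    "course_materials_only": True,
    "blocked_tasks": ["write essays", "solve homework", "complete assignments"],
    "max_exchanges": 10,
}
print(build_system_prompt(course_settings))
```

The point of the sketch is that each instructor supplies a different `course_settings`, so the same platform produces a differently behaved assistant in every classroom.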
This is fundamentally different from what happens when a student uses a generic chatbot. Those tools are designed to answer the question as efficiently as possible. A faculty-controlled AI is designed to support the learning process the professor has built.
As Professor Adam Hathaway at Chabot College put it:
"This is a 24/7 program, so even in the middle of the night, students can get accurate feedback. It's also very customizable. If you sent this over to ChatGPT, you'd get very different answers. [Nectir AI] locks them into a system that I get to control."
Why can't a campus-wide AI policy achieve the same thing?
Because a policy is a set of rules. Faculty control is a set of tools.
A policy might say, "Students may use AI for brainstorming but not for drafting final submissions." That's unenforceable at scale. It also assumes every course and every assignment should have the same AI rules, which any instructor will tell you is not the case.
A chemistry professor might want AI to help students work through practice problems step by step. An English professor might want AI to check thesis alignment against a rubric without ever touching the student's prose. A nursing instructor might want AI to simulate patient scenarios but never provide clinical diagnoses.
Faculty control means each of those instructors can configure AI to match their pedagogy, their course, and their standards. The result is that AI use looks different in every classroom, because it should.
Does giving faculty control over AI require technical expertise?
No. That is one of the most common misconceptions, and one of the biggest barriers to adoption.
Nectir AI is built so that any instructor can configure their Course Assistant through a prompt library and natural-language settings. A professor doesn't need to write code or understand large language models. They need to know how they want students to engage with support, and the platform translates that into AI behavior.
In the California Community Colleges pilot, 260 instructors adopted Nectir AI across 84 campuses. 86% rated the teaching experience as "excellent" or "very good," and nearly half gave Nectir a perfect 10/10 recommendation. Faculty adoption at that scale doesn't happen when the tool is hard to use. It happens when the tool respects the instructor's expertise and gives them real control.
What do the results look like when faculty are in control?
The data is clear. When faculty set the guardrails, students learn more and engage more deeply:
- A peer-reviewed study at Los Angeles Pacific University found a 7.5% increase in GPA campuswide after one term of using Nectir AI with faculty-set guardrails
- A 13% rise in average final scores
- A 36% boost in intrinsic motivation to learn
- 74% of students reported a better learning experience, including improved understanding of complex content and critical thinking
- At Foothill-De Anza College, student AI adoption surged from 18–30% to 73% when instructors actively introduced and supported the tool
As Professor Adoria Williams, Head Librarian at Merritt College, shared:
"[Nectir AI] has helped those students who might have families, full-time jobs. I teach my students how to use [AI] responsibly. I have also trained [my Nectir AI Course Assistant] to use my course material and not give students the answers, but to lead them down the path of critical thinking and analysis."
How does faculty control connect to the bigger picture of AI in education?
The conversation about AI in education tends to split into two camps: schools that want to embrace AI and schools that want to restrict it. Faculty control is the answer to both.
When faculty control the AI, adoption doesn't feel like a risk. It feels like an upgrade to the teaching tools they already use. And when the AI is grounded in the instructor's content and pedagogy, it doesn't replace the teacher. It scales their ability to support every student.
As I shared in an interview with Forbes, TAs and teachers should remain in the classroom forever. We are not trying to replace the human educators on the front lines; we are trying to scale their ability to help every single student.
Frequently Asked Questions About Nectir AI
What is Nectir AI? Nectir AI is AI infrastructure purpose-built for schools. It gives colleges, universities, and high schools the ability to deploy AI Assistants across their campus that are fully controlled by faculty and administrators, integrated with existing learning management systems, and compliant with FERPA and SOC 2 standards. Unlike consumer AI tools, Nectir is designed from the ground up to support structured, pedagogy-driven AI use in academic settings.
Nectir AI is currently trusted by 80,000+ students across 100+ campuses, including a landmark partnership with California Community Colleges, which serves 2.1 million students across 116+ campuses. The platform is fully FERPA- and SOC 2-compliant, and gives every instructor full control over how AI shows up in their classroom.
How is Nectir AI different from consumer AI tools? With Nectir AI, faculty can control the AI's behavior: they set the prompt, define what the AI can and cannot do, choose the teaching method (such as Socratic questioning), and establish boundaries that keep students learning actively rather than passively receiving answers. Student data stays private and is never shared with model providers.
Can Nectir AI work with my school's existing LMS? Yes. Nectir integrates with existing learning management systems, enabling schools to deploy AI infrastructure without overhauling their current technology stack. Faculty can build AI Assistants that reference their own course materials, syllabi, and rubrics within the platform they already use.
How can I learn more about Nectir? Want to see what faculty-controlled AI looks like on your campus? Schedule a demo, and our team will walk you through how Nectir works and what it looks like at schools like yours.
