How Palomar College Reported 73.5% of Students Saw Improved Learning

Kavitta Ghai
February 17, 2025

As education stands at the crossroads of tradition and innovation, recent research from the largest AI pilot in higher education history is illuminating how AI can fundamentally enhance learning in ways previously unimaginable. This groundbreaking pilot measured the impact of AI on student outcomes at 83 community colleges across California, including Palomar College, where instructors captured detailed feedback from students using Nectir’s AI course assistants. The results were overwhelmingly positive: 73.5% of participating students reported enhanced learning experiences, with nearly a quarter specifically saying the AI helped them grasp challenging concepts better. This isn’t just another tech trend – it’s evidence of a pivotal shift in how students engage with course material.

Moreover, a landmark study at Los Angeles Pacific University (LAPU) mirrors these findings on a larger scale, showing measurable gains in grades and student motivation when AI assistants are integrated into classes. Together, the Palomar and LAPU insights offer a compelling roadmap for AI adoption in education, demonstrating that when implemented thoughtfully, AI tools can be catalysts for deeper understanding, critical thinking, and student success.

Overwhelmingly Positive Feedback at Palomar College

In the Palomar College pilot (part of the California Community College Nectir AI Pilot Program), 68 students were given access to AI course assistants integrated into their LMS via the Nectir AI platform. These assistants were trained on the syllabus, assignments, and other course materials uploaded by the instructor, and students could turn to them around the clock for classroom support and resources. After using these personalized AI helpers, students reported a wide range of benefits that significantly improved their learning experience.

Key findings from the student feedback include:

  • Better Understanding of Complex Content (22.1%) – Over one-fifth of the students said the AI course assistant made complex academic content more accessible. They valued the AI’s ability to break down difficult language and concepts, which was especially useful for interpreting dense readings like historical documents or scientific texts. In essence, the AI acted as a 24/7 tutor, rephrasing and simplifying challenging material whenever students needed help.
  • Improved Critical Thinking (19.1%) – Nearly a fifth of respondents noted that interacting with the AI course assistant improved their critical thinking skills. The AI didn’t just hand out answers; it prompted students with reflective questions and guided them to figure out solutions on their own. Students appreciated being led to answers through reasoning rather than being given the solution outright. This kind of Socratic questioning by the AI encouraged deeper analysis and helped students learn how to think through problems, not just what to think.
  • Supportive Learning Environment (17.6%) – Many students felt the AI course assistant created a more supportive and confident learning environment. They highlighted the AI’s role as a reliable information source that they could turn to anytime. Because the AI was always available and consistent in its guidance (while still maintaining academic integrity checks), it helped students feel less alone in their study process. It’s like having a teaching assistant on call that never gets tired of answering questions. This consistent support also helped some students develop more independent learning skills, since they could ask the AI and get immediate feedback or clarification on their ideas.
  • Higher Efficiency and Engagement (14.7%) – About one in seven students reported more efficient and engaging study sessions when using the AI course assistant. Routine tasks like searching through textbooks or lecture notes for an explanation were sped up, as the AI could immediately provide relevant information or examples. Students described their learning as more enjoyable and interactive; for instance, they could have a back-and-forth dialogue with the AI about a topic, which kept them engaged. The 24/7 availability of the AI assistant also meant that even outside of class or late at night, they had help at hand – promoting consistent study habits and reducing frustration when they encountered roadblocks at odd hours.

Overall, the feedback from Palomar College students reveals a strong positive sentiment toward AI in the classroom. In total, roughly three-quarters (73.5%) of the students reported that the Nectir AI assistant improved their learning outcomes – whether through better understanding, improved critical thinking, or greater engagement. Only a small minority (about 5.9%) still had concerns after using the AI. These numbers underscore that the introduction of an AI course assistant, far from confusing or distracting students, actually helped the vast majority of them learn more effectively.
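For readers wondering where the headline number comes from: the 73.5% figure is simply the sum of the four benefit categories listed above. A quick check, using the percentages reported in this article:

```python
# Share of Palomar students reporting each benefit (figures from this article)
benefits = {
    "better understanding of complex content": 22.1,
    "improved critical thinking": 19.1,
    "supportive learning environment": 17.6,
    "higher efficiency and engagement": 14.7,
}

# Sum the categories; round to one decimal place to avoid float noise
total_positive = round(sum(benefits.values()), 1)
print(total_positive)  # 73.5
```

Note that each student is counted in one primary category, which is why the categories sum cleanly to the overall figure.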

From Skepticism to Trust: Changing Student Attitudes

One of the most insightful outcomes of the Palomar pilot was how it changed students’ attitudes towards AI over time. Before using the AI assistant, some students were unsure or skeptical about whether it would genuinely help or just be a gimmick. In fact, 11.8% of the respondents admitted they were initially skeptical about AI in an educational setting. However, after hands-on experience, these initially doubtful students ended up reporting positive experiences. This conversion is telling – it suggests that exposure and proper use can turn skeptics into believers. As the survey indicates, many who started with doubts found value once they saw the AI in action in their coursework.

There were also students with mixed experiences (8.8%) and a very small group (5.9%) who maintained some concerns or reservations. Those who had mixed feelings often appreciated some benefits but also pointed out areas for improvement – for example, instances where the AI might have misunderstood a question or where they weren’t sure how to phrase a query to get a useful response. Meanwhile, the few who remained concerned primarily worried about the limitations or potential misuse of AI (such as over-reliance on it, or fears about academic integrity). Importantly, their concerns tended to focus on how the AI was implemented rather than outright opposition to the concept of AI in education.

What’s encouraging is that the vast majority of students moved toward a positive view. The Palomar feedback paints a picture of a shift from hesitation to appreciation: students realized the AI assistant wasn’t there to do their work for them, but to help them learn more effectively. As noted in the study, students recognized the AI as a facilitator rather than a replacement for real learning. They saw that AI isn’t about getting easy answers – it’s about getting better answers to their questions and even better questions to think about. This addresses a common concern educators have: that students might use AI to cheat or shortcut learning. The Palomar students, on the whole, did not see the AI this way. In fact, they appreciated that Nectir’s AI would pose follow-up questions and encourage them to dig deeper, which actually enhanced their learning and critical thinking.

One student reflected that while generic tools like ChatGPT sometimes “hallucinate” incorrect information or make it too easy to cheat, the course-specific Nectir AI was different – it provided reliable information and guided questioning that made the assignment “much more relaxed and enjoyable”, helping them understand the material without doing the work for them. This kind of feedback suggests students are developing a sophisticated understanding of AI’s proper role: they see it as a guide and mentor rather than an answer vending machine.

The evolution of student perception at Palomar is a powerful lesson for broader AI adoption: initial skepticism is natural, but when AI tools are implemented thoughtfully and students are educated on how to use them, most will embrace the technology and recognize its value. It reinforces that trust in AI in education is built through experience. As students see that AI assistants maintain academic integrity (by not just handing out answers) and actually help them learn more deeply, their buy-in increases.

Corroborating Evidence from LAPU: AI Adoption at Scale

Palomar’s pilot might have involved a relatively small group of students, but a much larger study at Los Angeles Pacific University (LAPU) shows that the positive impact of AI in classrooms isn’t isolated. LAPU, a fully online university serving primarily working adults, undertook a comprehensive research study of Nectir AI’s course assistants across many courses and students. The scale of this study was significant – it analyzed 2,094 student-course enrollments, encompassing 1,340 unique students across 99 courses. This wasn’t just a tiny experiment; it was a broad implementation, making the findings hard to ignore. The student population was diverse (78% part-time, a majority female, and nearly half Hispanic, among other demographics), meaning the results are likely applicable to a wide range of institutions and learners.

The results from LAPU’s implementation of Nectir AI were strikingly positive, echoing the student sentiments from Palomar but also demonstrating tangible academic improvements. According to the study (currently under peer review), students who actively used the AI assistant saw higher academic performance on average than those who did not. In courses with AI assistants, the average course GPA was 3.28 compared to 3.05 in courses without the AI, a 0.23 grade point increase. In practical terms, that gap is meaningful – it could be the difference between a B and a B+ average. By another measure, after one term with campus-wide AI integration, LAPU reported roughly a 20% improvement in students’ GPAs relative to prior performance. While the exact figures depend on how you calculate the baseline, the clear takeaway is that grades improved notably when AI assistants were in the mix.
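For readers who want to verify the arithmetic, here is the raw GPA gap and its relative size under the simplest possible baseline (the average GPA in non-AI courses). As the article notes, the study’s ~20% figure is computed against students’ prior performance, a different baseline than the one used in this quick check:

```python
# Average course GPAs reported in the LAPU study (figures from this article)
gpa_with_ai = 3.28
gpa_without_ai = 3.05

# Absolute gap in grade points
difference = round(gpa_with_ai - gpa_without_ai, 2)

# Relative gain, using the non-AI course average as the baseline
relative_gain_pct = round(difference / gpa_without_ai * 100, 1)

print(difference)        # 0.23
print(relative_gain_pct) # 7.5
```

Under this simple cross-course baseline the gain is about 7.5%, which illustrates why the article cautions that the headline percentage depends on which baseline you choose.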

It wasn’t just grades that improved. LAPU also tracked other indicators of student success and found significant gains among students who used the AI tutors. There was a 13% rise in average final assessment scores in AI-supported classes, suggesting students not only scored higher overall but also mastered the material more effectively by the end of the term. Perhaps most impressively, student motivation got a considerable boost: LAPU observed a 36% increase in the motivation levels of students using AI, and a 13% improvement in self-efficacy (students’ confidence in their own ability to learn and succeed). These are critical factors — motivated students with higher self-confidence tend to engage more and persist through challenges, which likely contributes to the better grades we just noted.

These LAPU findings validate what Palomar students reported anecdotally. At Palomar, students said the AI made learning more engaging and less intimidating; at LAPU, we see that translate into measurable upticks in motivation and confidence. Palomar students felt they understood content better; at LAPU, final exam scores rose. In short, the qualitative improvements students described at Palomar are backed by quantitative results at LAPU. This corroboration is important: it shows that AI course assistants like Nectir’s don’t just make students feel like they’re learning more – students actually learn more, as evidenced by performance metrics.

It’s also worth noting that LAPU’s context — a 100% online university — demonstrates how AI can be a game-changer in digital learning environments. Online students often juggle jobs, family, and irregular study hours. The LAPU study highlighted that with AI tutors available, students could get help at 2 AM if needed, or pause and resume learning at their own pace without losing support. This flexible, on-demand assistance is something traditional teaching support can’t always provide, but AI made it possible. As one administrator at LAPU put it, the AI assistants act like always-available teaching aides who never tire, never judge, and never lose patience, engaging students in dialogue and guiding them to understanding whenever they need help. This kind of scalable support is particularly valuable in an online setting and for non-traditional students.

The LAPU study also underscores that successful AI adoption can be scaled beyond a single pilot class. In just 6 weeks, LAPU went from a small pilot in two courses to full campus-wide deployment of AI assistants. They integrated custom AI tutors (nicknamed "Spark" and others) into every course, showing that with the right planning and collaboration (in LAPU’s case, working closely with Nectir’s team), even a large institution can roll out AI across the curriculum quickly and effectively. The fact that they saw positive results across 225 course sections suggests that AI tools can generalize across subjects—from psychology to literature to science—and still provide benefit. It’s not just tech-savvy students or certain disciplines that gain; the advantages seem to cut across various areas of study.

Of course, even at LAPU not every single student jumped on board immediately – some students were slower to embrace the AI, which is natural. Research into the LAPU rollout found that a subset of students didn’t use the AI tool initially for reasons like not understanding how it could help, preferring to figure things out on their own, or being unsure about the technology. However, by identifying these barriers (e.g., lack of awareness or training), the university could address them through better communication and guidance. The overall trend was clear: when students understood the value and had the AI available, a large majority used it and benefited. In the end, the data from LAPU provides robust evidence that aligns with the positive feedback from Palomar – giving us confidence that AI course assistants can significantly enhance student success at scale.

Embracing the Future of AI-Augmented Learning

The data and experiences from Palomar College and LAPU converge on a clear message: when thoughtfully integrated, AI tools like Nectir’s course assistants can significantly enrich the educational experience.

Crucially, these findings show that AI in the classroom is not about replacing teachers or diluting education but enhancing the learning process and empowering students. With evidence of success now in hand, AI is no longer just a buzzword or experiment in academia – it’s a proven catalyst for student success and a powerful ally in the mission to educate.

Want a one-pager of this case study? Click here for a PDF you can download and share.


This is the future of education.

Join 45,000+ students, faculty, and staff using Nectir.