Dartmouth’s engineering program embraces artificial intelligence

February 6, 2026

HANOVER — Dartmouth has encouraged students to embrace artificial intelligence across disciplines, and a new concentration in its engineering program is the latest example.

Student interest in AI studies is strong, and so is the need for professionals in the field, said Doug Van Citters, dean of the Thayer School of Engineering. According to the World Economic Forum, 94% of employers face a shortage of AI talent.

“We really want to be leaders in how we teach not just AI, but teach responsible use, and its responsible construction,” said Van Citters. 

The AI concentration for engineering students is meant to provide an educational foundation that helps students avoid jumping into AI career paths without proper background, Gene Santos, director of the Master of Engineering Program at Dartmouth, said in a Thursday interview.

While many of the required AI courses are already offered and taught by faculty, part of introducing the concentration involves expanding their scope. Beyond learning how to apply AI to various problems, students will also learn about user and programmer intentions, ethics and limitations, said Santos.

Associate Professor Thomas Thesen shows a chest X-ray from the case study for which his Geisel School of Medicine students are trying to formulate a diagnosis and treatment plan during a problem-based learning class at Dartmouth Hitchcock Medical Center in Lebanon, N.H., on Wednesday, Feb. 4, 2026. Thesen introduced the AI Patient Actor, which was developed in his neuroscience learning lab, to the group for the first time during the class, and they were able to practice asking questions and eliciting answers to help inform their diagnosis. JAMES M. PATTERSON / Valley News

Funding comes primarily from the college itself, drawing on a mix of the operating budget, special endowments and engineering scholarships, said Santos. With the program in its early stages, it remains unclear what its funding needs will be moving forward.

The next step is recruiting incoming students, Van Citters said Thursday. The electrical and computer engineering applicant pool is strong, and many of the relevant classes are already available and oversubscribed, he said.

While AI innovations are evolving rapidly, there are concerns on campus about the mixed messages that encouraging AI use can send students.

Pretend patients

At the Geisel School of Medicine, students engage with an AI patient actor tool to practice communication skills required to interact with real patients.

The newest version was released last spring, said Dr. Thomas Thesen, an associate professor of medical education who helped develop the tool, in a Tuesday interview on campus. 

Previously, human actors were hired and asked to follow a script, an approach Thesen said raised issues with scalability and feedback.

John Ejiogu, a first-year Geisel School of Medicine student, reads a slide revealing new information about a case study of an emergency department patient as he and his classmates try to develop a differential diagnosis during a class at Dartmouth Hitchcock Medical Center in Lebanon, N.H., on Wednesday, Feb. 4, 2026. JAMES M. PATTERSON / Valley News

The tool is an openly accessible website that has logged more than 10,000 student encounters and sign-ups from 155 educators, said Thesen.

Students can choose a condition and personality for the AI chatbot, and work with the “patient” by speaking into the computer. Feedback is returned based on a rubric that can be altered or replaced by teachers. 

“There’s a whole skill set about breaking bad news, like a cancer diagnosis or a son died, or something like that,” Thesen said. “There are whole frameworks on how you structure this conversation in the best way. So the AI is good for that.”

One limitation of the tool is the lack of real social interaction. Students who get nervous talking to people do not get authentic practice for helping patients when they can work with the chatbot alone, in a comfortable environment.

Body language and eye contact also play a role in these interactions, said Thesen. For these reasons, Geisel is not looking to completely replace real people with the tool. A team of students interested in health equity is also working to correct potential racial biases within the system.

“Large language models are built with all the internet data, the good, the bad and the ugly. So they reflect how society thinks, and society is racist, right?” said Thesen.

Moving forward, Thesen hopes the tool can start analyzing tone of voice, which also plays a role in this type of communication. 

Thesen considers the tool a locally grown product: computer science students in the DALI Lab, a collaborative studio for digital solutions, developed it. One of those students is Colin Wolfe, who worked on the tool’s voice-to-voice function last spring.

“It’s nice to know that some of your work has been deployed and is helping Geisel med students,” said Wolfe. 

Working in pairs, first-year medical students, from left, Temel Moore, Nick Manickas, John Ejiogu, and Bailey Chan, discuss new information revealed as they progress through a hypothetical emergency department visit based on a case study during a problem-based learning class at Dartmouth Hitchcock Medical Center in Lebanon, N.H., on Wednesday, Feb. 4, 2026. The class normally progresses through a series of slides building on the information available to help rule out incorrect diagnoses, but Associate Professor Thomas Thesen included the AI Patient Actor in the class, giving the students their first opportunity to use the tool to ask direct questions of the patient. JAMES M. PATTERSON / Valley News

Wolfe is a junior in computer science and engineering. His courses in AI currently function more as electives, but through the concentration, they could apply more directly to his degree. 

As a teaching assistant for a Software Design and Implementation course, Wolfe is concerned about the increasing use of AI for activities such as writing code and comments, adding that it is noticeable when students use generative AI dishonestly.

“It is sort of disheartening because people come to my office hours and sometimes they don’t understand a reasonably important foundational piece of knowledge they should know starting from like week one,” said Wolfe, referring to those who generate responses to coursework. 

While AI policies are up to each professor, it can be hard to prove that AI was used. To combat this, there has been a rise in handwritten exams in the computer science department, said Wolfe.

Concerns about academic integrity

James Dobson, an associate professor in the Department of English and Creative Writing, teaches a course titled “Critical AI,” where students can “apply cultural critique to artificial intelligence while learning the fundamentals of how these technologies work and how they fail,” according to the syllabus.

AI has value across the humanities in addition to STEM disciplines, Dobson said Thursday in a phone interview. The technology offers cutting-edge tools, but with them come concerns from students and faculty over intellectual property and future career opportunities.

Thomas Thesen, an associate professor at the Geisel School of Medicine, right, leans in to look at the notes of first-year student Stephen Batter, left, during a problem-based learning class at Dartmouth Hitchcock Medical Center in Lebanon, N.H., on Wednesday, Feb. 4, 2026. JAMES M. PATTERSON / Valley News

Faculty members in the humanities have traditionally opposed AI in favor of classic teaching methods. With student use of generative AI increasing, Dobson points to two unfavorable options for mitigating inappropriate use: requiring that assessments be completed in class, or encouraging the use of AI in a particular form.

While paper examinations may be ideal in some cases, they create a time constraint for English students writing essays, said Dobson. Some students consider generative AI a useful tool for organizing their thoughts. Others prefer not to use it at all, though that population of students is smaller than the population of faculty who oppose it.

There is also a group of students who feel pressured to use the tools to keep up with their peers, Dobson said.

Another issue Dobson identified is the mixed messages students receive about appropriate use. Students may have one professor who does not allow AI in any form and another who requires it as a step in the submission process, such as for revising work. That inconsistency can make it difficult for students to develop a general sense of what counts as appropriate use.

“We have uncertainty on the faculty side about what we think is appropriate. Students are using it all over the place with little sense of how it works and the tools they need to have some agency involved,” said Dobson.

Despite reservations about AI technology and ethics across the board, Santos foresees a continued interest in using AI to further education in different areas of study, with the concentration being a step in that direction.

“It’s the 70th anniversary where AI started at Dartmouth. Let’s strike while the iron is hot and we can be seen much more,” said Santos.

Pro-AI article prompts controversy

One of the college’s AI programs has been involved in a recent controversy.

Fourth-year student Teddy Roberts recently revealed he was compensated for a November op-ed in The Dartmouth promoting Evergreen.AI, the school’s AI wellness platform created to support student success. 

Initially, the Office of Communications reached out to Roberts and other students employed by Evergreen to request that they help get the word out about the platform.

While the Office of Communications suggested an op-ed in the student paper as a way to promote the platform, it was not aware that an Evergreen.AI staff member had instructed Roberts to log those hours, Kathryn Kennedy, associate vice president of the Office of Communications, said by email.

“We do not pay students, faculty, staff, or external contributors to author opinion columns for any media outlet. It is, however, our role to provide guidance to our community on interacting with news media, as we did in this instance,” she wrote in the emailed statement. 

An October email thread regarding the op-ed, obtained by the Valley News from the Office of Communications, shows Roberts asking which topics and perspective he should focus on in his writing. Jana Barnello, director of media relations and communications strategies, responded with question prompts about his involvement with Evergreen.AI and why the project is important.

Barnello checked in again on Oct. 14, and Roberts requested more time. Barnello responded, saying “When you’re ready, we’re here to help sharpen the arguments even further!”

After sharing the op-ed and receiving feedback on Oct. 16, Roberts wrote back saying that he had accepted most of the notes. Evergreen subsequently paid him, which goes against The Dartmouth’s ethics code for published content.

In a Jan. 29 op-ed published by The Dartmouth titled “Why I actually care about Evergreen.AI,” Roberts expressed the value of the platform while calling on leadership to better connect with its audience by engaging with student staff members and leaning into real student experiences.

“I am not writing this to burn Evergreen to the ground; I’m writing to save it from becoming just another well-funded, well-intentioned, but ultimately hollow initiative. I have immense hope for this project, but hope isn’t enough,” wrote Roberts. “We don’t need more oversight from above; we need the safety and autonomy to speak the truth from below.”

Students are at the heart of Evergreen.AI and often share their efforts with the wider community, including the press, Lisa Marsch, the Evergreen project leader, said in an email statement on Thursday.

“We greatly value their honest input and encourage them to speak with whomever they want, however they want, about Evergreen. And like any employee, they get paid for their time,” said Marsch.
