Yale College seniors Elizabeth Dejanikus and Madeline Levin — neither of whom identifies as a techie — are poised to contribute to the academic literature on the rise of artificial intelligence (AI).
Both are completing scholarly papers that examine the regulation of AI in the United States, in the absence of comprehensive federal legislation, and the establishment of ethical boundaries for the rapidly advancing technology. Dejanikus, who is double majoring in political science and the humanities, is writing about individual states’ options for regulating AI. Levin, an English major, is exploring the mechanisms through which federal agencies could set rules governing the technology’s use.
Their classmate Tyler Schroder, a computer science major, is the lead author on a study that offers recommendations to tech companies and government regulators for improving the security of brain-computer interfaces — devices implanted inside people’s skulls that process brain activity and interact with external software to perform specific tasks.
The three undergraduates were director’s fellows during the fall of 2024 at Yale’s Digital Ethics Center, a multidisciplinary hub established in 2023 for studying the governance, ethical, legal, and social implications of digital innovation and technologies, and their human, societal, and environmental impact.
In pursuing its mission, the center supports forward-looking research by early-career and young scholars, from first-year undergraduates to postdocs, that acknowledges the benefits of digital technologies while addressing the potential risks or shortcomings they pose. For example, Dejanikus, Levin, and Schroder are coauthors on a recent study that assessed and validated the ability of Wikipedia’s editing mechanism to detect misinformation in the massively popular online encyclopedia.
Yale philosopher Luciano Floridi, the center’s founding director, encourages undergraduate fellows to develop their own research projects and — with help from the graduate students and postdocs on the center’s research team — guides them through the process of generating scholarly work suitable for publication in leading journals.
“The undergraduates on our team are extremely bright,” said Floridi, professor in the practice in the Cognitive Science Program in Yale’s Faculty of Arts and Sciences, and a pioneer of the philosophy of information. “I’m confident they will produce publishable papers that expand knowledge in our field. It is a pleasure to help them realize their potential as researchers and I’m grateful for their contributions to the center’s work.”

Director Luciano Floridi, center, has fostered a collaborative environment at the Digital Ethics Center where young scholars receive support while pursuing forward-looking research.
The undergraduates, who remain affiliated with the center now that their fellowships have ended, join a team of researchers supported by the center who, over its first year, produced a lengthy list of studies covering a broad range of topics, including papers that examine the potential role of AI in the health care sector; the regulation of autonomous weapons systems; global AI governance; and principles for responsible quantum innovation.
Through a generous gift from Tarek Sherif ’84, the center recently established the de Vries-Sherif Program on the Future of Humanity and Technology to support teaching and research examining the intersection between the human condition and technological innovation.
As the center grows, its work, and opportunities for undergraduates to contribute to it, will expand, Floridi said.
He envisions establishing similar programs within the center to tackle questions concerning technology’s impact on health care, democracy, the environment, and other areas of societal importance.
“We will create these pillars dedicated to specific questions or research areas, but they will not become silos,” he said. “The research we produce will always require cross-disciplinary cooperation, which allows us to bring multiple perspectives to the work and attract talented scholars from all levels with a variety of interests.”
English majors in the digital sphere
Yale College offers students many opportunities to be involved in scholarly research, and it is not uncommon for undergraduates to be listed as coauthors of peer-reviewed studies. But the center also gives students the chance to lead projects with guidance from seasoned scholars, Schroder said.
“Being able to sell a faculty member on a new project idea is something that you can do at the center,” he said. “I’m not sure there are many other places on campus that offer undergraduates the opportunity to forge their own path while working closely with mentors. You’re treated as the project lead here. That’s pretty special.”
Dejanikus became aware of the center in the fall of 2023, when a professor in a class she was taking at the Yale Jackson School of Global Affairs recommended students attend one of the workshops Floridi regularly hosts on campus.
“I understood about half of what he was saying, but I understood enough to realize that the center was doing really important work that hadn’t really been done at Yale before,” she said.
Last summer, Dejanikus completed an internship in the legislative affairs office at the Department of Defense in Washington, D.C., where she learned about artificial intelligence regulations proposed by the Pentagon and federal government more broadly. Levin also spent the summer in Washington, interning in the Consumer Protection Branch of the Department of Justice. (The two were roommates.) She became interested in AI regulation while working with attorneys in the branch’s emerging technology division.
The pair started working as director’s fellows at the center in the fall. Both appreciate the center’s supportive, multidisciplinary atmosphere, where computer scientists work shoulder to shoulder with political scientists and legal experts.
“The center nicely blends scholarly research with practical policy work,” said Levin, who plans on attending law school. “It’s a place where an English major interested in public policy can become involved in the digital sphere. It’s been a nice bridging exercise.”
Schroder was introduced to the center in the fall of 2023, when Floridi gave a talk during a meeting of the Department of Computer Science.
“I was like, ‘Wow, I got to get in on this,’” said Schroder, who soon after began assisting with research on who should control undersea internet cables, ethical questions concerning brain implants, and other topics involving cutting-edge technology.
Generous mentors
Before the students embarked on their projects, Floridi briefed them on the basics of conducting academic research and composing a scholarly article. He also spoke to them about finding interesting questions to pursue, with an eye towards identifying nascent issues likely to increase in relevance as technologies develop and new ones emerge. The students have monthly one-on-one meetings with him to discuss their progress and any problems they have encountered, and he is available to meet with them or review drafts of their work as needed.
“Luciano is an incredibly generous mentor,” Levin said.
Additionally, each undergraduate is assigned to a postdoctoral researcher, who helps guide their work. They also consult frequently with the center’s other research fellows, present drafts of their papers to the center’s research team, and field critiques from their colleagues.
It is daunting, but also a valuable experience that prepares them to produce high-quality scholarship, the undergraduates said.
“When you write a research paper for class, you’re basically trying to present some new spin on material somebody else produced,” said Dejanikus, who is considering pursuing an academic career. “Here, it feels like we’re creating knowledge that will be useful to scholars in this field and people outside of academia.”
Her paper studies state efforts to regulate AI and where lawmakers are looking for guidance as they craft legislation. For example, she’s interested in the influence of the recently enacted AI regulations in the European Union (EU). (Floridi advised EU policymakers on the legislation.) She’s also closely following California’s ongoing efforts to pass an AI law, given the state’s status as a hotbed of digital innovation.
Levin’s study suggests that, in regulating AI, federal agencies will prioritize enforcing existing rules rather than establishing new ones. Congress hasn’t yet passed legislation that would enable rulemaking, and the current majority of the U.S. Supreme Court is often skeptical of government agencies wielding rulemaking authority.
The enforcement-based strategy raises concerns that it will create a patchwork regulatory environment, Levin said, in which tech companies must navigate different or conflicting approaches by various agencies. In the paper, she cites the Securities and Exchange Commission’s recent prosecutions of cryptocurrency companies as an example of the enforcement-based approach creating confusion.
“It was very controversial,” said Levin. “I think the companies felt like the rules were ambiguous and were unclear whether or not digital assets count as securities.”
Brain-computer interfaces (BCIs) — the technology Schroder is studying — are mostly used for medical and therapeutic purposes, such as restoring the function of stroke victims’ limbs or helping people with paralysis control prosthetic hands. (Research in this area is happening at Yale.) But the technology’s potential applications include enabling people to control computer software, such as video games or military drones and other weapons systems, with their thoughts.
Schroder’s paper provides recommendations intended to help BCI manufacturers produce more secure devices and regulators understand where more guidance is needed to protect patient safety and data confidentiality. He is a coauthor on a separate paper that reevaluates the existing regulatory approach to BCIs and makes recommendations on how manufacturers and policymakers can better address the unique ethical, legal, and social risks of the technology.
Both studies deal with an inconsistency in the regulation of BCIs: The hardware implanted in the brain is tightly regulated while the software that the devices communicate with is subjected to minimal regulatory oversight.
“Some of our recommendations basically apply strong software regulations in place elsewhere to brain-computer interfaces,” said Schroder, who has twice presented the paper on which he is the lead author at academic conferences. “We’re also asking what rules we could set today to prevent BCI devices from getting mass-recalled 10 years from now.”
His experience at the center has equipped him with the skills he’ll need after he graduates, he said.
“It’s going to let me hit the ground running instead of trying to figure things out,” said Schroder, who plans to work in the tech industry or in federally funded research and development after graduation. “The questions we’re investigating here right now are only going to grow more complicated, so we’re not in danger of running out of ethical problems to tackle anytime soon.”