ASU PhD Student Myke Cohen Bridges Music, Psychology and AI in Research for a Safer Future

Myke Cohen never planned on a career in engineering. But a fascination with how people experience the world, from music to technology, opened the door to a different pursuit: understanding human cognition. That unexpected shift eventually led him to Arizona State University, where he now applies insights from psychology and human factors to help shape the future of national security.

Cohen is a doctoral student researcher on the CHIMERAS project, which stands for Collaborative Human-AI Interdependence Models for Explainable, Resilient, and Accurate Screening. Led by Dr. Nancy Cooke, with co-principal investigators Dr. Erin Chiou and Dr. Mickey Mancenido, CHIMERAS is funded by the Department of Homeland Security through the Center for Accelerating Operational Efficiency. The project examines how artificial intelligence can collaborate more effectively with human screeners in high-stakes environments, such as airports and border checkpoints.

The Human Side of Technology

For Cohen, collaboration and working in teams often feel like playing in a band. “You’ve got people with different strengths, instruments, and it only works when you listen to each other,” he said. “Research and work are the same in this regard. It’s not about being the loudest or smartest; it’s about listening and growing together.”

CHIMERAS doesn’t aim to replace human workers with AI—instead, it seeks to enhance their performance. Cohen’s research focuses on how people interpret AI recommendations, especially under time pressure. By exploring explainable AI features and decision-making workflows, the project helps ensure that AI tools are not only accurate but also trustworthy and supportive of human expertise.

“We’re not trying to replace human screeners—we’re trying to build systems that support them, that help them feel confident in their decisions,” Cohen said.

Two experimental platforms anchor the CHIMERAS study: Facewise, which supports facial identification, and AIID, which supports X-ray baggage screening. Cohen works with both, designing experiments that simulate real-world screening tasks to observe how people interact with system feedback. The goal is to identify when and how AI explanations aid or obstruct decision-making, and to improve the coordination between human and machine.

From Music to Engineering

Cohen’s unconventional journey into engineering began with a simple but powerful question: What makes a musical experience meaningful? That curiosity led him to psychology, where he discovered a deep interest in how people interact with complex systems.

“I didn’t expect to end up in engineering—but once I saw how psychology could inform system design, especially with AI, I was hooked,” he said.

At ASU, Cohen found an ideal research home. He credits his mentors with helping him bridge disciplinary boundaries and apply human factors principles in ways that have real-world impact.

“Working with Dr. Chiou and Dr. Cooke has been transformative,” Cohen said. “They’ve helped me see how my ideas can actually shape real-world systems.”

Building Trust in AI

One of the central challenges in Cohen’s work is explainability—ensuring that AI tools offer feedback humans can understand and trust. His research shows that trust in AI is more than a technical issue.

“Trust in AI isn’t just about performance metrics—it’s emotional, it’s relational. People need to feel like the system is on their team,” he said.

That emotional dimension is particularly important in high-pressure settings like airport security, where screeners must make fast, accurate decisions. Through CHIMERAS, Cohen helps explore how different AI design elements can foster that sense of partnership and improve decision quality.

Looking Ahead

As CHIMERAS continues to analyze how human-AI teams operate, Cohen is already thinking about the broader implications. His dissertation work with the CHIMERAS project focuses on how interdependence between humans and AI can be measured and improved over time.

“Explainability is a moving target,” he said. “What makes a system feel ‘understandable’ to one person might not work for someone else—and that’s the real challenge.”

For Cohen, the work is deeply rewarding. It allows him to blend his interests in cognition, communication, and design, while contributing to systems that could one day make air travel and border security safer for millions of people.

“The CHIMERAS project has pushed me to think about the human side of tech design in a much deeper way,” he said.

The work isn’t all algorithms and experimental design, though. Cohen also mentors junior lab members and recently had his first paper accepted for publication. As AI continues to shape the future of national security, researchers like Myke Cohen are making sure the human element remains at the center of innovation.

Cohen offered these words for students and early-career researchers navigating their own academic paths:

“When I started my PhD, I thought it would be a solitary journey—that I’d have to convince others to get on board with my ideas before I could pursue them,” he said. “But I learned that research is really a community effort. You grow by talking to people, sharing ideas, and sometimes going through side paths that end up shaping your main work.”