Can artificial intelligence detect deception? Should it? And what role does human bias play in how machines learn to "read" us?
These questions drove Sayde King's doctoral research at the University of South Florida, where her work on AI deception detection led to a job offer before graduation. On Saturday, she crossed the commencement stage with a PhD in computer science and engineering. Next, she will join the National Research Council of the National Academies of Sciences, Engineering, and Medicine (NASEM) as a postdoctoral associate at the U.S. Air Force Research Laboratory's 711th Human Performance Wing.
Her dissertation explored the complex intersection of facial analysis, truthfulness, and AI, digging deep into existing research on AI-enabled deception detection systems.
Her findings? Humans lie. But AI is still not reliable enough to detect those lies consistently.
Can AI Detect Deception?
"There's a lot of interest in the idea that computers might be able to tell when someone is lying," King said. "There is some work that does really well in one area, like crime, for example, but we're not there yet when it comes to detecting deception across multiple domains."
The topic was familiar to King long before it became a research focus: her mother works in a crime scene lab. She first explored it for a class survey paper, which sparked her interest in affective computing and the idea of building expertise in the field.
"I started reading studies where facial recognition and micro-expression analysis were being tested in mental health screening," she said.
That raised deeper questions: Could AI detect lying or misrepresentation in high-stakes mental health contexts, such as screening for post-traumatic stress disorder or suicide prevention? And, more importantly, should it?
"There are big questions here: Is it ethical to use AI to try to detect whether someone is being deceptive when talking about something like suicidal ideation? What are the consequences if the tool is wrong?"
The Student-Faculty Research Connection

Working closely with faculty member Tempestt Neal, founder and director of USF's Cyber Identity and Behavior Research Lab, King examined research on AI-driven deception detection using video, audio, and physiological data. She used software tools to extract signals such as facial cues, and she evaluated the claims, validity, and ethical concerns surrounding AI systems that attempt to "read" facial expressions or biological signals to infer truthfulness.
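The article does not name the specific tools King used, but as a purely illustrative sketch, per-frame facial landmarks of the kind such pipelines rely on can be pulled from a recorded session with off-the-shelf libraries such as OpenCV and MediaPipe Face Mesh. The file name and the simple mouth-opening cue below are hypothetical, not taken from her study.

import cv2
import mediapipe as mp

# Illustrative only: "interview.mp4" and the lip-distance cue are hypothetical.
mp_face_mesh = mp.solutions.face_mesh

cap = cv2.VideoCapture("interview.mp4")
cues = []  # one crude facial cue per frame

with mp_face_mesh.FaceMesh(static_image_mode=False, max_num_faces=1) as face_mesh:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV reads frames as BGR
        results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_face_landmarks:
            lm = results.multi_face_landmarks[0].landmark
            # Landmarks 13 and 14 sit on the inner upper and lower lip;
            # their vertical distance is a rough mouth-opening signal.
            cues.append(abs(lm[13].y - lm[14].y))

cap.release()
print(f"extracted {len(cues)} frame-level cues")

A real system would track many such signals over time and across modalities; the point of the sketch is only that "facial cues" ultimately become numeric features a model can examine.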
"Our lab is driven by the belief that AI systems must go beyond data processing, feature engineering, and modeling; they must engage with the human side," Neal said. She said the group asks questions such as "Why should anyone care? When is the technology useful, intrusive, or undesirable?"
"In areas like deception detection, the stakes aren't just technical, they're personal, ethical, and societal. Understanding human behavior is what allows us to design AI that helps rather than harms, and Sayde's research embodied exactly what we value: not just building AI tools, but questioning their role, their limits, and their consequences in high-stakes environments." Neal believes the work highlights both the promise and the limitations of AI in understanding human behavior. "It's a reminder that technology must be developed with both accuracy and ethics in mind."
King found that many of these tools made bold claims, and that AI models have performed well at emotion recognition and at detecting lies in specific settings. "But lies across multiple contexts is challenging," she said.
In the lab, King collected data from experiments to understand how people behave when attempting to deceive. She asked student participants a mix of personal questions, some innocuous, some more revealing, and instructed them to answer some truthfully and others deceptively. The sessions were recorded to capture nonverbal cues and biological data. She then interviewed each participant to confirm which answers were deceptive, creating a labeled dataset for analysis.
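To give a concrete sense of how such a labeled dataset supports analysis, here is a minimal sketch of training and evaluating a classifier on per-question features, with each participant's answers kept in a single cross-validation fold. The feature names, file, and model choice are hypothetical and are not drawn from King's published work; her actual methods appear in the publications noted below.

# Illustrative only: hypothetical features and model, not King's published pipeline.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GroupKFold, cross_val_score

df = pd.read_csv("session_features.csv")   # one row per answered question
X = df[["blink_rate", "gaze_aversion", "heart_rate_mean", "speech_pause_ratio"]]
y = df["label"]                             # 1 = deceptive, 0 = truthful (from the follow-up interview)
groups = df["participant_id"]               # keep each participant's answers in one fold

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, groups=groups, cv=GroupKFold(n_splits=5))
print(f"mean cross-validated accuracy: {scores.mean():.2f}")

Grouping by participant matters here: if the same person's answers appear in both training and test folds, the model can learn individual mannerisms rather than general cues of deception, inflating the reported accuracy.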
She collaborated with Neal and fellow lab members to analyze the data, contributing to three peer-reviewed publications between 2022 and 2024, including articles in IEEE Access, Frontiers in Research Metrics and Analytics, and JMIR Formative Research.
"I wasn't planning on doing a PhD, but when I got involved in the lab with Dr. Neal, I really enjoyed the research," she said. "I loved being able to dive deep into questions that didn't have clear answers."
A 国产短视频Scholar from Start to Finish
King's academic journey has been rooted in 国产短视频from the beginning. She earned her bachelor's degree in computer science in 2016 and her master's in 2019. During her master's studies, she developed an interest in computer vision, artificial intelligence, and systems that interact with human behavior, which became the foundation for her doctoral research.
As a Sloan Scholar in USF鈥檚 Alfred P. Sloan Foundation University Center of Exemplary Mentoring program, King received support designed to help underrepresented students succeed in STEM doctoral programs. She also mentored others as a teaching assistant in undergraduate computing courses.
"Doctoral work can be isolating," she said. "I think it's important to make sure students feel seen and supported, especially when they're just starting out."
King credits her mentors for inspiring her academic path. 鈥淚鈥檝e been lucky to have mentors who saw potential in me even when I doubted myself,鈥 she said. 鈥淒r. Neal, especially, gave me space to ask hard questions and pursue research that mattered. That kind of support changed the course of my career.鈥
"Sayde was one of the first students to join my lab when I arrived at 国产短视频in 2018," Neal said. "She started as an undergraduate research volunteer and grew into a scholar who just earned her PhD. I'm incredibly proud of her journey, and I know she's going to do great things well beyond USF."
What's Next
At the U.S. Air Force Research Laboratory, King will bring technical knowledge and a human-centered mindset to her postdoctoral research. Her work will focus on how people interact with automated systems, an area where her background in deception detection and facial analysis may offer valuable insights.
"It's a great opportunity to continue exploring the intersection of people and technology," she said. "I'm excited to apply what I've learned in a real-world setting where the stakes are high, and the potential for impact is, too."
"Sometimes you start out following one question," she added, "and you realize it opens the door to dozens more. That's what I love about research."