January 28, 2026
A major push by Mizzou Engineering researchers will make advanced instrumentation as accessible as cloud computing.

At Mizzou Engineering, we continually push the boundaries of what’s possible. Now a team of our researchers has developed a powerful new platform that uses artificial intelligence (AI) to help researchers access instruments outside their physical labs and use them more effectively.
Scanning electron microscopy (SEM) has become essential across materials science, biotechnology, and nanotechnology because it reveals structures and properties that cannot be observed with conventional imaging.
But many researchers lack immediate access to SEMs and other advanced instruments because these tools are expensive, specialized and often housed only in centralized core facilities. In recent years, internet-enabled connectivity has begun to broaden access to shared scientific resources.
“Research networks and supercomputing resources allow scientists to access instruments outside their physical labs,” said Curators’ Distinguished Professor Prasad Calyam of the Department of Electrical Engineering and Computer Science. “Remote instrumentation removes geographic and resource barriers, allowing researchers to pursue cutting-edge work regardless of where they’re based.”
Significant challenges remain. Even when centralized instruments can be accessed remotely, novice operators often struggle with complex setup and calibration, resulting in inconsistent data quality and inefficient experimentation.
“Researchers told us the manual setup process was cumbersome, slow and prone to misadjustments,” said Mauro Lemus Alarcon, a doctoral student in Calyam’s lab. “Even experienced users who stepped away for a few months had to re‑learn the instrument.”
These challenges raised a pivotal question for Calyam’s team: If humans struggled with the precision and speed required for SEM imaging, could an AI learn to do it better?
That idea led the researchers in a new direction. In 2021, the group applied a foundational reinforcement learning method called Q-learning to carbon nanotube (CNT) growth experiments, offering the first proof that automated remote SEM control and image analysis were possible.
But the Q-learning model had major limitations. It learned slowly and inconsistently, and it could not reliably optimize zoom, focus, contrast and other SEM parameters across CNT images. A more dependable approach was needed.
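To see why tabular Q-learning struggles here, consider a minimal sketch of the method. The discretized states and actions below are hypothetical stand-ins (the study's actual state and action spaces are not described here); the point is that Q-learning must maintain one table entry per state-action pair, which scales poorly to continuous SEM settings like zoom and focus.

```python
import random

# Hypothetical discretized SEM control problem (not the study's actual spaces):
# states could be coarse bins of an image-quality score, actions could be
# small adjustments such as "zoom up", "zoom down", "focus up", "focus down".
STATES = range(10)
ACTIONS = range(4)

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2  # learning rate, discount, exploration
Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}

def choose_action(state):
    """Epsilon-greedy: mostly exploit the best-known action, sometimes explore."""
    if random.random() < EPSILON:
        return random.choice(list(ACTIONS))
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def q_update(state, action, reward, next_state):
    """Standard tabular Q-learning update toward the bootstrapped target."""
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
```

Because every continuous parameter must be forced into discrete bins, the table either grows unmanageably large or loses the precision that SEM control demands, which is consistent with the slow, unreliable learning the team observed.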
Finding the optimal combination
Now, the researchers have advanced from basic automation to truly intelligent control. They describe their new tool, the Remote Instrumentation Science Environment (RISE), in an article published in IEEE Transactions on Human-Machine Systems.
Whereas Q-learning works best in simple, discrete environments, RISE employs multiagent reinforcement learning (RL) methods designed for complex, continuous systems. These agents learn how different SEM settings influence microscopic image quality and iteratively recommend improved parameters.
“We trained a model by giving positive or negative feedback based on its predictions, rewarding good predictions and penalizing bad ones,” Upasana Roy, a doctoral student in Calyam’s lab, said.
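The feedback Roy describes can be illustrated with a toy reward function. The target parameter values and the normalization below are made up for illustration; the idea is simply that proposals near a desired operating point score positively and poor proposals score negatively, steering the agent's future recommendations.

```python
# Illustrative reward signal: score a proposed SEM parameter set against a
# hypothetical target. The target values here are invented for the sketch.
TARGET = {"zoom": 5000.0, "focus": 0.62, "contrast": 0.48}

def reward(proposed):
    """Return a reward that is high when the proposal is near the target.

    Each parameter's error is normalized by the target magnitude so that
    zoom (thousands) and contrast (fractions) contribute comparably.
    """
    total_error = sum(
        abs(proposed[k] - TARGET[k]) / abs(TARGET[k]) for k in TARGET
    )
    return 1.0 - total_error  # positive rewards a good proposal, negative penalizes

good = reward({"zoom": 4950.0, "focus": 0.60, "contrast": 0.50})
bad = reward({"zoom": 2000.0, "focus": 0.30, "contrast": 0.90})
```

In a full RL setup this scalar would come from automated image-quality analysis rather than a fixed target, but the training signal works the same way: reward good predictions, penalize bad ones.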
Tests were promising. Compared to earlier approaches, one of the RL agents achieved 63% higher training reward, over 600% improvement in testing reward and nearly 75% faster convergence.
To complement RL, the team added another tool: imitation learning (IL). Unlike RL, which learns through trial and error, IL learns by observing human experts. Using a gradient-boosting model trained on real SEM images and expert decisions, the IL agent predicts SEM parameters directly.
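A minimal sketch of that imitation-learning setup: fit a gradient-boosting regressor to pairs of image features and expert-chosen parameters, then predict a parameter directly for a new image. The features and "expert" values below are synthetic stand-ins; the study trained on real SEM images and real expert decisions.

```python
# Imitation learning via gradient boosting: learn to mimic expert parameter
# choices from image features. All data here is synthetic for illustration.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
# Pretend features extracted from SEM images, e.g. sharpness, brightness,
# edge density (hypothetical feature names).
X = rng.uniform(0, 1, size=(200, 3))
# Pretend expert focus decisions as a simple function of the features.
y_focus = 0.5 * X[:, 0] + 0.3 * X[:, 1]

model = GradientBoostingRegressor(n_estimators=100, max_depth=2, random_state=0)
model.fit(X, y_focus)

# Predict the focus setting for one new (here, previously seen) image.
predicted_focus = model.predict(X[:1])[0]
```

Unlike the RL agents, this model never explores: it learns only from demonstrations, which is why it can be far more data-efficient when good expert examples exist.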
In a striking result, IL reduced zoom prediction errors by about 61%, focus errors by nearly 80%, and contrast errors by nearly 74%, dramatically outperforming RL-only approaches.
“That doesn’t mean IL always outperforms RL — different algorithms work better for different datasets — but in this case, IL was more effective,” Roy said.
Human in the loop
One of RISE’s most innovative features is its chatbot-guided interface. Many researchers — especially students and collaborators in other fields — do not have deep backgrounds in remote instrumentation. The chatbot allows users to interact with the system using natural language, expediting the process.
“New users often spend days learning how to adjust parameters,” Lemus Alarcon said. “With RISE, they can get useful results within hours.”
The chatbot presents the recommended parameters to the researcher, who confirms or rejects them. Confirmed settings are applied to the instrument, and the cycle repeats until the researcher is satisfied.
“This is a human-in-the-loop workflow: The system gives actionable information, and the human chooses the next step,” Calyam said.
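The confirm-or-reject cycle can be sketched as a small control loop. The function and callback names below are hypothetical, not RISE's actual API; the structure simply mirrors the workflow described above, where the instrument is only updated after a human accepts a proposal.

```python
# Hypothetical human-in-the-loop session loop (not RISE's real interface):
# the agent proposes SEM parameters, the human confirms or rejects, and the
# instrument is updated only on confirmation.
def run_session(propose, ask_user, apply_settings, max_rounds=10):
    """Loop until the user accepts a proposal or the round budget runs out."""
    for _ in range(max_rounds):
        params = propose()          # agent recommends a parameter set
        if ask_user(params):        # human confirms (True) or rejects (False)
            apply_settings(params)  # only then is the instrument updated
            return params
    return None                     # no proposal accepted

# Usage with stub callbacks: the user rejects the first proposal, accepts the second.
proposals = iter([{"zoom": 1000}, {"zoom": 1200}])
answers = iter([False, True])
applied = []
accepted = run_session(lambda: next(proposals), lambda p: next(answers), applied.append)
```

Keeping the human decision between proposal and application is what makes the workflow "human-in-the-loop": the system informs, but never acts unilaterally.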
RISE has been implemented and validated in a realistic testbed using actual SEM instruments and over 800 CNT images.
“Because our dataset is small, the system remains a prototype,” Roy said. “With more real data, the model will improve. For now, it provides direction where previously there was none.”
As for next steps, the researchers aim to improve generalizability across different instruments and scientific domains, reduce the training data requirements for RL, expand the system to consider additional SEM chamber conditions, and integrate large language models (LLMs) for richer chatbot support and contextual analysis.
As these enhancements progress, the broader vision comes into focus: Redefining how researchers interact with scientific instruments to create a flexible, secure and intelligent environment that enables scientists at all experience levels to accelerate discovery.
“We want to leverage Mizzou’s world-class resources to build state-of-the-art image analytics infrastructure that supports both AI development and AI-assisted workflows,” Calyam said.
RISE promises to help transform remote experimentation across materials science, biomedical imaging, chemistry and beyond — ushering in a new era of automated, AI-powered scientific discovery.
In addition to Lemus Alarcon, Calyam and Roy, other researchers involved in the study include students Minasadat Attari, Andrew Hellman, Anirudh Kambhampati and Ramakrishna Surya; and faculty Filiz Bunyak, Sheila Grant, Matthew Maschmann and Kannappan Palaniappan.
The research was funded by grants from the National Science Foundation (CMMI-2026847 and OAC-2322063) and the United States Army Engineer Research and Development Center (W912HZ24C0022).
Discover more ways Mizzou Engineering researchers are unlocking solutions to real-world problems.