New capability lets scientists simulate and visually inspect automated experiments before robots run them
Combined simulation and visualization layer bridges AI-generated protocols and real laboratory execution
Before a robotic experiment begins, researchers must understand exactly how the robot will execute each step. Opentrons Labworks, Inc., a laboratory robotics company enabling AI-driven autonomous science, today announced Protocol Visualization for Opentrons Flex®, a new simulation and visualization capability within the Opentrons software environment. The feature allows scientists to simulate and visually inspect robotic protocols in a dynamic virtual environment prior to running them on a Flex system. Within the interface, users can observe pipette movements, liquid handling actions, labware positions, and module status throughout an automated workflow. For researchers developing automated experiments, previewing protocol execution may help identify potential issues before reagents and instrument time are committed.
Artificial intelligence tools are increasingly used to propose experiments and generate automated protocols through natural language prompts or programmatic interfaces. These systems can design workflows containing thousands of robotic actions. However, those workflows must still be translated into reliable physical instructions that operate correctly inside laboratory automation systems. The new simulation and visualization environment introduces an inspection layer between AI-generated experimental plans and robotic execution by allowing scientists to step through each action of a protocol before it runs on a robot. Researchers can navigate workflows step by step or move quickly across the entire timeline to examine how robotic actions will unfold. For organizations deploying AI-assisted experimentation, the ability to review robotic execution pathways may improve oversight of automated experimental design.
Simulation and visualization are supported for protocols authored across the Opentrons software ecosystem, including OpentronsAI, the Python Protocol API, and the Protocol Designer application. The visualization environment tracks pipette positioning, liquid volumes, tip usage, and labware interactions while maintaining a continuous view of the Flex deck configuration. Scientists can inspect workflows containing thousands of actions and observe changes in liquid levels at microliter scale. The system also includes a Slot Spotlight view that provides additional detail for individual deck locations, allowing users to monitor well volumes and module conditions throughout a run. For laboratories developing complex automation workflows, this level of inspection may support faster debugging and protocol refinement.
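To make the idea of state tracking concrete, the sketch below shows one way a simulation layer of this kind could replay protocol steps against an in-memory deck model, tracking well volumes and tip usage as each action is applied. This is a minimal conceptual illustration in plain Python, not Opentrons code: the `DeckSimulator` class and its `aspirate`/`dispense` methods are hypothetical names invented for this example and are not part of the actual Opentrons API.

```python
from dataclasses import dataclass, field

# Hypothetical sketch only -- DeckSimulator and its methods are
# illustrative names, not part of the Opentrons software ecosystem.

@dataclass
class DeckSimulator:
    """Replays protocol steps, tracking well volumes (uL) and tip usage."""
    wells: dict = field(default_factory=dict)  # well name -> volume in uL
    tips_used: int = 0
    pipette_volume: float = 0.0                # liquid currently held by the pipette

    def load_liquid(self, well: str, volume_ul: float) -> None:
        """Record the starting liquid volume in a well."""
        self.wells[well] = self.wells.get(well, 0.0) + volume_ul

    def pick_up_tip(self) -> None:
        """Count a fresh tip against the tip-rack inventory."""
        self.tips_used += 1

    def aspirate(self, well: str, volume_ul: float) -> None:
        """Draw liquid from a well; flag over-aspiration before any run."""
        available = self.wells.get(well, 0.0)
        if volume_ul > available:
            raise ValueError(
                f"{well} holds only {available} uL; cannot aspirate {volume_ul} uL"
            )
        self.wells[well] = available - volume_ul
        self.pipette_volume += volume_ul

    def dispense(self, well: str, volume_ul: float) -> None:
        """Deposit held liquid into a destination well."""
        if volume_ul > self.pipette_volume:
            raise ValueError("pipette does not hold that much liquid")
        self.pipette_volume -= volume_ul
        self.wells[well] = self.wells.get(well, 0.0) + volume_ul

# Replay a tiny two-step transfer and inspect the resulting deck state.
sim = DeckSimulator()
sim.load_liquid("A1", 100.0)
sim.pick_up_tip()
sim.aspirate("A1", 25.0)
sim.dispense("B1", 25.0)
print(sim.wells)      # {'A1': 75.0, 'B1': 25.0}
print(sim.tips_used)  # 1
```

Stepping through a workflow in this way is what lets problems such as over-aspirating an empty well surface during review rather than mid-run, which is the kind of pre-execution inspection the feature is described as providing.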
“AI can now design experiments and generate robotic protocols, but scientists still need to understand how those experiments will execute in the physical world,” said James Atwood, Chief Executive Officer of Opentrons. “This capability gives researchers a dynamic way to simulate and inspect robotic execution before an experiment begins, creating a clearer bridge between computational design and physical laboratory workflows.”
By design, visualization operates directly within the Opentrons App and requires only a protocol file to run. Scientists can review workflows offline without connecting to a robot, enabling protocol development and troubleshooting while automation systems are running other experiments. This capability allows researchers to iterate on experimental workflows without interrupting active laboratory operations. For teams managing shared robotic infrastructure, offline inspection may support faster development of automated protocols.
“Our focus is building the execution layer that connects AI-generated experimental plans to real laboratory experiments,” said Atwood. “As AI systems propose more experiments, researchers need infrastructure that makes those experiments understandable, inspectable, and repeatable before they reach the bench.”
The release arrives amid rapid progress in connecting artificial intelligence systems with laboratory automation. Opentrons has recently expanded its work integrating AI platforms with robotic experimentation infrastructure, including collaborations used to train physical AI systems for laboratory environments. In this context, simulation and visualization tools help researchers interpret how computationally generated experimental plans translate into robotic actions within standardized laboratory hardware. For organizations exploring autonomous experimentation, inspection layers that expose robotic execution details may support more transparent and reliable AI-driven research workflows.
The new capability will be available through Opentrons App version 9.0, scheduled for release in April 2026. The feature is designed for use with the Opentrons Flex robotic platform and supports protocols authored across the Opentrons software ecosystem. For laboratories adopting AI-assisted experimentation, the update provides a new tool for reviewing automated workflows before they are executed on physical laboratory systems.
