Catalyzing Chemistry: Ultra High-Throughput Physical Experimentation for Advanced Materials Discovery & Optimization

Z. Batts
Dot Energy LLC,
United States

Keywords: High-Throughput Experimentation; Advanced Materials Discovery; Modular Reactor Systems; Parallel Experimentation; Design of Experiments (DoE); AI-Orchestrated Experimentation

Summary:

Catalysts are fundamental to modern society, enabling processes across energy, chemicals, and materials. Approaches to discovering new catalysts typically fall into two camps: computational modeling and physical experimentation. Each presents significant limitations: computational models often lack real-world validation, while traditional experimentation is slow and resource-intensive. Our work combines the two approaches, pairing a modular, high-throughput physical experimentation platform with a proprietary machine learning-based algorithm in a closed loop to build a framework for catalyst discovery and optimization. In this poster, we present the design and development progress of a modular, ultra high-throughput experimentation platform prototype. The prototype’s primary function is to demonstrate an architecture emphasizing modularity and scalability, repeatable high-quality data generation, and native integration with AI/ML workflows. Although the prototype will first be used for performance testing of CO2 adsorbents, the generalized design is applicable to any homogeneous or heterogeneous catalytic reaction across a wide range of temperatures and pressures.

Modularity and Scalability: A common infrastructure supports the integration of independent, hot-swappable reactor modules within a server-rack design inspired by the Open Compute Project ORV3 architecture. Each reactor has blind-matable fluid, electrical, and communication connections that enable rapid reactor replacement (i.e., “plug-and-play”). System scalability is achieved through a rack-based architecture in which multiple reactors are integrated into standardized modules, and modules are aggregated within a rack. System capacity can be tailored by adding or removing modules within available space, infrastructure, and experimental constraints.
This approach enables highly parallel experimentation while preserving consistent hardware, control, and data acquisition paradigms across the platform.

Repeatable, High-Quality Data at Scale: Each reactor module is instrumented to capture key performance variables (e.g., temperature, pressure, and flow) relevant to sorbent qualification. The platform is intentionally designed to generate large volumes of repeatable, high-quality data through standardized reactor hardware, controls, and data acquisition. As the system scales via parallel operation and repeatable module fabrication, experimental throughput increases while marginal hardware and operational costs decrease, driving the lowest achievable cost per datapoint without compromising data fidelity.

Integration with AI/ML: The platform is designed as an execution layer for AI- and algorithm-driven experimentation. The system exposes a programmable interface (API) through which external orchestration agents (e.g., machine learning-based optimizers or human users) can specify experimental conditions, sequencing, and priorities. The platform manages experiment execution and data acquisition, cleaning, and management, while remaining agnostic to the user’s optimization or decision-making algorithms. This allows users to pair their own AI-driven orchestration logic with a scalable, deterministic experimental backend, positioning the system as enabling infrastructure for closed-loop and self-driving laboratory workflows.

Initial prototype development is underway, with validation phases planned to demonstrate reactor density, modular scalability, data integration, and compatibility with algorithm-driven experimental orchestration. The goal is to engage academic and industrial laboratories, technology developers, and researchers interested in high-throughput experimentation, automation, and data-centric approaches to advanced materials design.
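The separation described above, in which the platform executes and measures while an external agent decides what to run next, can be sketched in a few lines of Python. All names here (ExperimentRequest, MockPlatform, the toy uptake model) are illustrative assumptions, not the platform’s actual API:

```python
# Hypothetical sketch of the orchestration pattern described above.
# The class and field names are illustrative assumptions, not the
# platform's real interface; the "measured" CO2 uptake is a toy
# function standing in for physical data acquisition.
from dataclasses import dataclass


@dataclass
class ExperimentRequest:
    """Conditions an external agent submits for one reactor run."""
    reactor_id: str
    temperature_c: float
    pressure_bar: float
    flow_sccm: float
    priority: int = 0


class MockPlatform:
    """Stand-in for the execution layer: queues requests, runs them
    in priority order, and returns cleaned result records."""

    def __init__(self):
        self.queue = []

    def submit(self, req: ExperimentRequest) -> None:
        self.queue.append(req)

    def run_all(self) -> list:
        results = []
        # Higher-priority experiments execute first.
        for req in sorted(self.queue, key=lambda r: -r.priority):
            # Toy placeholder for a real sorbent measurement.
            uptake = max(0.0, 2.0 - 0.01 * req.temperature_c
                         + 0.1 * req.pressure_bar)
            results.append({"reactor": req.reactor_id,
                            "co2_uptake_mmol_g": round(uptake, 3)})
        self.queue.clear()
        return results


# The decision-making logic stays on the user's side of the API:
# here, a simple sweep over temperature followed by a greedy pick.
platform = MockPlatform()
for i, temp in enumerate([25.0, 50.0, 75.0]):
    platform.submit(ExperimentRequest(f"R{i}", temp, 1.0, 50.0))

results = platform.run_all()
best = max(results, key=lambda r: r["co2_uptake_mmol_g"])
```

In a closed-loop workflow, the final two lines would instead feed `results` back into the user’s optimizer (Bayesian, DoE-based, or otherwise), which generates the next batch of `ExperimentRequest` objects, illustrating how the platform stays agnostic to the decision-making algorithm.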