AI optical microscope analyzes 2D materials as precisely as human experts

Date: October 27, 2025

Credit: ACS Nano (2025). DOI: 10.1021/acsnano.5c09057

Haozhe “Harry” Wang’s electrical and computer engineering lab at Duke welcomed an unusual new lab member this fall: artificial intelligence. Using publicly available AI foundation models such as OpenAI’s ChatGPT and Meta’s Segment Anything Model (SAM), Wang’s team built ATOMIC (Autonomous Technology for Optical Microscopy & Intelligent Characterization)—an AI microscope platform that can analyze materials as accurately as a trained graduate student in a fraction of the time.

“The system we’ve built doesn’t just follow instructions, it understands them,” Wang said. “ATOMIC can assess a sample, make decisions on its own and produce results as well as a human expert.”

Published on October 2 in the journal ACS Nano, the findings point to a new era of autonomous research, where AI systems work alongside humans to design experiments, run instruments and interpret data.

How ATOMIC works

Wang’s group studies two‑dimensional (2D) materials, crystals only one or a few atoms thick that are promising candidates for next-generation semiconductors, sensors and quantum devices. Their exceptional electrical properties and flexibility make them ideal for electronics, but fabrication defects can compromise these advantages. Determining how the layers stack and whether they contain microscopic defects requires laborious work and years of training.

“To characterize these materials, you usually need someone who understands every nuance of the microscope images,” Wang said. “It takes graduate students months to years of high-level science classes and experience to get to that point.”

To speed up the process, Wang’s team linked an off‑the‑shelf optical microscope to ChatGPT, allowing the model to handle basic operations such as moving the sample, focusing the image and adjusting light levels. Layered on top was SAM, an open‑source vision model designed to identify discrete objects in an image; for materials samples, those objects include defect‑containing regions and pristine areas.
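One way to picture that wiring, as a rough sketch rather than the lab’s actual code: the OpenAI Python SDK’s function-calling interface lets the language model choose which instrument command to issue next. The move_stage and set_focus drivers below are hypothetical stand-ins for whatever the microscope’s real control library exposes.

```python
# Sketch only: the microscope "drivers" are hypothetical placeholders, and the
# prompt and tool set are illustrative, not the ATOMIC configuration.
import json
from openai import OpenAI

client = OpenAI()

def move_stage(x_um: float, y_um: float):
    print(f"[stub] moving stage to ({x_um}, {y_um}) um")  # hypothetical driver

def set_focus(z_um: float):
    print(f"[stub] setting focus height to {z_um} um")    # hypothetical driver

TOOLS = [
    {"type": "function", "function": {
        "name": "move_stage",
        "description": "Translate the sample stage, in micrometers.",
        "parameters": {"type": "object",
                       "properties": {"x_um": {"type": "number"},
                                      "y_um": {"type": "number"}},
                       "required": ["x_um", "y_um"]}}},
    {"type": "function", "function": {
        "name": "set_focus",
        "description": "Move the objective to a focus height, in micrometers.",
        "parameters": {"type": "object",
                       "properties": {"z_um": {"type": "number"}},
                       "required": ["z_um"]}}},
]

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user",
               "content": "Center the flake in the field of view, then refocus."}],
    tools=TOOLS,
)

# Dispatch whichever instrument calls the model decided to make.
dispatch = {"move_stage": move_stage, "set_focus": set_focus}
for call in response.choices[0].message.tool_calls or []:
    dispatch[call.function.name](**json.loads(call.function.arguments))
```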

Together, the two AIs formed a powerful tool in the lab, a kind of virtual lab mate that could see, analyze and act on its own.

Turning general-purpose AI into a reliable scientific partner, however, required significant customization from the Wang lab. SAM could recognize regions within the microscopic images, yet it struggled with overlapping layers, a common issue in materials research. To overcome that, they added a topological correction algorithm to refine those regions, isolating single-layer areas from multilayer stacks.
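A minimal sketch of that two-step idea, assuming the open-source segment_anything package and a publicly released SAM checkpoint; the nesting-based cleanup at the end is a simple stand-in for the paper’s topological correction, not the published algorithm.

```python
# Sketch only: SAM segments the micrograph zero-shot, then smaller masks that
# sit inside larger ones (e.g., multilayer terraces on a monolayer) are carved
# out of their parents so each pixel ends up in exactly one region.
import cv2
import numpy as np
from segment_anything import sam_model_registry, SamAutomaticMaskGenerator

image = cv2.cvtColor(cv2.imread("flake.png"), cv2.COLOR_BGR2RGB)
sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
masks = SamAutomaticMaskGenerator(sam).generate(image)

regions = []
for m in sorted(masks, key=lambda m: m["area"], reverse=True):
    seg = m["segmentation"].copy()
    for inner in masks:
        if inner is m or inner["area"] >= m["area"]:
            continue
        overlap = np.logical_and(seg, inner["segmentation"]).sum()
        if overlap > 0.9 * inner["area"]:   # inner mask lies inside this region
            seg[inner["segmentation"]] = False
    regions.append(seg)
```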

Finally, the team asked the system to sort the isolated regions by their optical characteristics, which ChatGPT could do autonomously.
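Continuing the sketch above, and only as an illustration of what “sorting by optical characteristics” can mean in code: each region is summarized by its mean optical contrast against the substrate and then ranked, with a plain numerical sort standing in for the classification that ATOMIC delegates to the language model.

```python
# Sketch only: `regions` is the list of boolean masks from the previous
# snippet, and the contrast measure here is a simple illustrative choice.
import cv2
import numpy as np

image = cv2.cvtColor(cv2.imread("flake.png"), cv2.COLOR_BGR2RGB).astype(float)
substrate = np.median(image.reshape(-1, 3), axis=0)  # rough substrate color

def optical_contrast(mask: np.ndarray) -> float:
    """Mean per-channel contrast of a masked region relative to the substrate."""
    region_color = image[mask].mean(axis=0)
    return float(np.abs((region_color - substrate) / (substrate + 1e-6)).mean())

# Thicker stacks generally show stronger optical contrast, so this ordering
# runs roughly from monolayer regions to multilayer ones.
for i, mask in enumerate(sorted(regions, key=optical_contrast)):
    print(f"region {i}: contrast {optical_contrast(mask):.3f}, pixels {int(mask.sum())}")
```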

The results were remarkable: Across a range of 2D materials, the AI microscope matched or outperformed human analysis, identifying layer regions and subtle defects with up to 99.4% accuracy. The system maintained this performance even with images captured under imperfect conditions, such as overexposure, poor focus or low light, and in some cases spotted imperfections invisible to the human eye.
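Accuracy figures like these come from comparing the system’s per-pixel labels with expert annotations. The metrics below are standard ways to do that kind of scoring and are shown only as a hedged illustration; the paper spells out the actual evaluation protocol.

```python
# Sketch only: pixel accuracy and intersection-over-union against a
# human-labeled reference mask, with toy data to show the call pattern.
import numpy as np

def pixel_accuracy(predicted: np.ndarray, human: np.ndarray) -> float:
    """Fraction of pixels where the AI's label matches the expert's label."""
    return float((predicted == human).mean())

def region_iou(predicted: np.ndarray, human: np.ndarray) -> float:
    """Intersection-over-union for one binary region mask."""
    union = np.logical_or(predicted, human).sum()
    return float(np.logical_and(predicted, human).sum() / union) if union else 1.0

rng = np.random.default_rng(0)
pred, truth = rng.random((2, 256, 256)) > 0.5  # placeholder masks, not real data
print(pixel_accuracy(pred, truth), region_iou(pred, truth))
```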

“The model could detect grain boundaries at scales that humans can’t easily see,” said Jingyun “Jolene” Yang, a Ph.D. student in Wang’s lab and first author on the paper. “It’s not magic, however. When we zoom in, ATOMIC can see on a pixel-by-pixel level, making it a great tool for our lab.”

By locating and categorizing microscopic defects, the system helps Wang’s group determine the number of layers in a 2D material and pinpoint pristine regions suitable for follow‑up studies. Those high‑quality areas can then be used for other research in Wang’s lab, such as soft robotics and next-generation electronics.

Even more impressive, the system required no specialized training data. Traditional deep‑learning approaches need thousands of labeled images. Wang’s “zero‑shot” method leveraged the pre‑existing intelligence of foundation models, trained on broad swaths of human knowledge, to adapt instantly.

Welcome your new lab mate: Artificial intelligence
Haozhe “Harry” Wang and Jingyun “Jolene” Yang. Credit: Duke University

What it means for researchers

For Wang, the excitement isn’t just about speed. It’s also about teaching his students to use the technologies at their disposal and become modern-day researchers.

“In the last year, AI has advanced a lot and Dr. Wang said if we do not embrace this era and make use of these AI tools, they may replace us,” Yang said. “We tested the ATOMIC system on many samples and different conditions, and it’s quite robust.”

Wang sees potential applications across disciplines, from chemistry to biology, where tedious optical analysis often slows progress. Simplifying those workflows could open advanced research to students, industry engineers or anyone with curiosity and a microscope.

At the same time, Wang stresses the importance of keeping humans in the loop. Foundation models can behave unpredictably, sometimes generating different results for identical prompts. His group tested thousands of repetitions to assess robustness and found that while minor variations occur, overall accuracy remains high.
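A hedged sketch of what a repetition test of that kind might look like; run_atomic below is a hypothetical wrapper around one full segment-and-classify pass, simulated here with a small random spread rather than real model calls.

```python
# Sketch only: repeat the (simulated) pipeline many times and report the
# mean score and its spread, the quantities a robustness check cares about.
import random
import statistics

def run_atomic(image_path: str) -> float:
    # Hypothetical stand-in for one full ATOMIC pass on one image; a real
    # check would call the foundation models and score against a reference.
    return min(1.0, random.gauss(0.99, 0.003))

scores = [run_atomic("flake.png") for _ in range(1000)]
print(f"mean accuracy {statistics.mean(scores):.3f}, "
      f"spread {statistics.pstdev(scores):.4f}")
```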

“The goal isn’t to replace expertise; it’s to amplify it,” Wang said. “We still need humans to interpret what the AI finds and decide what it means. But once you have a partner that can complete weeks of analysis in mere seconds, the possibilities for new discoveries are exponential.”

More information:
Jingyun Yang et al, Zero-Shot Autonomous Microscopy for Scalable and Intelligent Characterization of 2D Materials, ACS Nano (2025). DOI: 10.1021/acsnano.5c09057

Provided by
Duke University




