AI microscope autonomously performs tasks like a scientist, only much faster
Duke engineers create an AI microscope that autonomously analyzes 2D materials with near-human accuracy – revolutionizing materials research.

Edited By: Joseph Shavit

Haozhe Wang’s electrical and computer engineering lab at Duke welcomed an unusual new lab member this fall: artificial intelligence. (CREDIT: Duke, Pratt School of Engineering)
Within a modest engineering laboratory at Duke University, a new type of researcher is quietly at work next to an optical microscope. This researcher has no need for coffee, never tires, and requires no months of training to distinguish the subtle differences in color and contour that reveal the secrets of materials only a few atoms thick. The new lab assistant is an artificial intelligence.
Developed by Haozhe “Harry” Wang and his team, this artificial-intelligence-based microscope is called ATOMIC — short for Autonomous Technology for Optical Microscopy and Intelligent Characterization. It can identify, characterize, and classify ultra-thin materials almost as well as a trained graduate student.
However, it performs these tasks orders of magnitude faster. The realization of the ATOMIC system is a watershed moment for how science will be done, combining human curiosity with machine intuition.
New Methods for Characterizing the Thinnest Materials on Earth
Wang’s team works on two-dimensional (2D) materials, crystals just a few atoms thick that hold promise for faster electronics, sensitive sensors, and powerful quantum devices. These materials—graphene and molybdenum disulfide, for example—are remarkable in their promise yet difficult to work with, because even a small error can destroy their exceptional properties.
“Usually characterizing these materials takes someone who can identify every detail of the microscope images,” Wang said. “It takes graduate students months to years of experience to get to that place.”
Typically, this means a researcher manually analyzing thousands of images, describing flakes, and determining which flakes are usable. Each flake can differ in thickness, size, and smoothness. The method is tedious, and even minor errors can require the experiment to start over.
The ATOMIC system alters that. It leverages what researchers have termed zero-shot autonomous microscopy, a process in which the AI can recognize patterns and materials it has never seen. Instead of relying on thousands of labeled observations, the system uses what researchers call foundation models.
These are very large, pre-trained neural networks that already contain knowledge about a broad range of visual and language information. In the case of the autonomous microscope, these foundation models allow an AI model to interpret microscope images in a flexible way similar to how language models are trained to process words.
The Creation of an Autonomous Microscope
To test their idea, Wang's research team connected an off-the-shelf microscope to ChatGPT from OpenAI and the Segment Anything Model (SAM) from Meta. ChatGPT handled the conversational and logical reasoning aspects of operating the microscope, including adjusting the focus, lighting, and camera position. SAM took care of identifying individual flakes and defects in each sample image.
Together, the two models formed a closed loop in which the microscope could collect images, interpret them, and decide what to observe next without any human supervision. Wang likened it to a self-driving assistant that didn't just take orders but could also make decisions based on the visual context of the images.
Wang explained: "The system we built doesn't just follow directions; it understands them. For example, ATOMIC can evaluate a sample, make decisions unprompted, and gather and generate results comparable to what a human expert would do."
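The paper does not publish ATOMIC's code, but the closed loop described above (segment the image, score what was found, decide where to look next) can be sketched roughly as follows. Every function name, data structure, and threshold here is an illustrative stand-in, not the team's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Flake:
    area_um2: float   # flake area in square microns
    contrast: float   # optical contrast, a rough proxy for thickness

def segment_flakes(image):
    """Stand-in for a SAM-style segmentation step that proposes flake masks."""
    return [Flake(area_um2=m["area"], contrast=m["contrast"]) for m in image]

def score_region(flakes):
    """Stand-in for the reasoning step: prefer large, monolayer-like flakes.
    The contrast window below is invented for this example."""
    return sum(f.area_um2 for f in flakes if 0.05 < f.contrast < 0.15)

def autonomous_scan(regions):
    """Visit every region, always moving to the most promising one next."""
    results = {}
    while regions:
        # Pick the unvisited region whose segmentation scores highest.
        name, image = max(
            regions.items(),
            key=lambda kv: score_region(segment_flakes(kv[1])),
        )
        results[name] = score_region(segment_flakes(image))
        del regions[name]
    return results
```

The key point the sketch captures is the loop itself: the output of one imaging step (segmented flakes and their scores) feeds directly into the decision about where to image next, with no human in between.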
The research, published in the journal ACS Nano, demonstrated that the AI microscope could detect and categorize thin flakes by number of layers, as well as assess quality, with up to 99.4% accuracy. Unlike conventional algorithms, which fail to perform well in low-light and out-of-focus scenarios, ATOMIC exhibited exceptional sensitivity in difficult situations, sometimes identifying defects that are invisible to the naked eye.
Seeing the Impossible
Ph.D. student Jingyun "Jolene" Yang, the first author of the paper, said she could hardly believe how sensitive the system was. "This model can identify grain boundaries at scales that are difficult for people to see," she said. "It's not magic. ATOMIC is, in essence, scanning the image on a pixel-by-pixel basis, so it's a valuable tool for our lab."
The AI microscope does not just gather images; it interprets them. It counts every flake, estimates its thickness from color and contrast, and assesses how uniform it is, all qualities that determine whether a flake is suitable for future experiments. It then updates its internal map and moves on, scanning whichever region it judges most likely to be productive.
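The contrast-to-thickness step described above can be illustrated with a toy calibration. The contrast bands below are invented for this example; real values depend on the material, the substrate, and the illumination:

```python
# Hypothetical calibration: map optical-contrast readings to layer counts.
CONTRAST_BANDS = [
    (0.00, 0.08, 1),  # monolayer
    (0.08, 0.16, 2),  # bilayer
    (0.16, 0.30, 3),  # trilayer or thicker
]

def layers_from_contrast(contrast):
    """Return the estimated layer count for one contrast reading."""
    for lo, hi, layers in CONTRAST_BANDS:
        if lo <= contrast < hi:
            return layers
    return None  # outside the calibrated range

def uniformity(contrasts):
    """Fraction of readings that agree with the most common layer estimate,
    a simple stand-in for the flake-uniformity score."""
    estimates = [layers_from_contrast(c) for c in contrasts]
    best = max(set(estimates), key=estimates.count)
    return estimates.count(best) / len(estimates)
```

A flake whose readings mostly fall in one band scores high on uniformity, which is exactly the property that makes it a good candidate for a device.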
In benchmark tests, ATOMIC outperformed traditional deep-learning models trained on narrow datasets, achieving more than 90 percent accuracy across materials such as graphene, tungsten disulfide, and hexagonal boron nitride. Most impressively, it needed no retraining when shown new, unlabeled samples.
The payoff was staggering: throughput for imaging increased nearly tenfold. What took days to complete can now be finished in hours.
What Happens When AI Collaborates with Scientists
Wang's goal for ATOMIC was not to replace scientists but to extend what they can do. The AI handles the repetitive work, such as moving slides, adjusting focus, and flagging flakes worth studying, which frees human researchers to think, interpret, and innovate in ways automation cannot.
The system's autonomous decision-making also points toward a broader trend: autonomous laboratories. In such a lab, AI would not only collect data but also generate hypotheses, run experiments, and adjust its approach on the fly.
The microscope's key innovation is a model that combines vision and language. Because the AI understands text prompts, a researcher can literally tell the microscope what to look for. A scientist can type "Find monolayer graphene flakes," and the system interprets the request as a human colleague would.
By grounding language and images in a shared understanding of purpose, ATOMIC bridges the gap between language and perception, an approach that could transform scientific discovery.
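In spirit, the prompt-driven search amounts to translating a free-text request into a filter over the flakes the microscope has catalogued. The real system routes such prompts through a vision-language model; the toy keyword parser below is purely illustrative:

```python
# Illustrative only: a toy mapping from a text request to a flake filter.
# ATOMIC uses a vision-language model for this; nothing here is its real API.

def parse_request(prompt):
    """Return a predicate matching flakes described by a simple prompt."""
    p = prompt.lower()
    if "monolayer" in p:
        return lambda flake: flake["layers"] == 1
    if "bilayer" in p:
        return lambda flake: flake["layers"] == 2
    return lambda flake: True  # no recognized constraint: keep everything

def find_flakes(prompt, flakes):
    """Apply the parsed request to a list of catalogued flakes."""
    keep = parse_request(prompt)
    return [f for f in flakes if keep(f)]
```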
Limits and the Road Forward
While the results are encouraging, the researchers acknowledge that ATOMIC’s accuracy still depends heavily on the diversity of data used to train its foundation models. The system might misinterpret unfamiliar surface textures, or read a change in lighting as a change in the material itself.
In its current form, ATOMIC also relies on optical microscopy, which cannot match the atomic-scale resolution of cutting-edge tools such as transmission electron microscopy.
Future versions may combine optical, structural, and spectroscopic data to build a more complete picture of each sample's state. Wang's team also plans modular upgrades that would let the system shift automatically from wide-field scans to high-resolution scans of specific regions of interest.
These refinements may even lead to a fully autonomous lab filled with microscopes, sensors, and robotic arms. "We are headed to a world where AI microscopes would be able to explore materials landscapes faster than any human could," Wang said.
Pragmatic Considerations
The implications of ATOMIC's development do not end at Duke. By cutting the time and specialized skill needed to analyze two-dimensional materials, autonomous microscopy could speed the design of next-generation semiconductors, energy devices, and quantum sensors, from the chips in smartphones to new battery technologies.
Industries could use a similar framework for real-time quality control, catching microscopic defects before they become costly problems. In research, the framework gives scientists more freedom to be creative and interpret data rather than spend their time collecting it.
In the long run, the same kind of autonomy could extend to telescopes, gene sequencers, and robotic labs: anywhere science relies on the ability to see the unseen.
Research findings are available online in the journal ACS Nano.
Mac Oliveau
Science & Technology Writer
Mac Oliveau is a Los Angeles–based science and technology journalist for The Brighter Side of News, an online publication focused on uplifting, transformative stories from around the globe. Passionate about spotlighting groundbreaking discoveries and innovations, Mac covers a broad spectrum of topics—from medical breakthroughs and artificial intelligence to green tech and archeology. With a talent for making complex science clear and compelling, they connect readers to the advancements shaping a brighter, more hopeful future.
Joseph Shavit
Science News Writer, Editor-At-Large and Publisher
Joseph Shavit, based in Los Angeles, is a seasoned science journalist, editor and co-founder of The Brighter Side of News, where he transforms complex discoveries into clear, engaging stories for general readers. With experience at major media groups like Times Mirror and Tribune, he writes with both authority and curiosity. His work spans astronomy, physics, quantum mechanics, climate change, artificial intelligence, health, and medicine. Known for linking breakthroughs to real-world markets, he highlights how research transitions into products and industries that shape daily life.



