Editorial Feature

An Introduction to the Robots that Use 3D Microscopy


Traditionally, the microscope slide has been flat: not quite two-dimensional, but certainly only a snapshot of a sliver of material. The world, however, presents itself in three dimensions, and the ability to examine “whole” objects rather than single slices has become increasingly important.

The application of 3D microscopy is booming, largely as a result of a corresponding increase in computational power.

The term 3D microscopy covers a range of imaging technologies, including ultrasound tomography, micro-computed tomography (micro-CT), micro-positron emission tomography (micro-PET), and photoacoustic imaging. Historically, these techniques captured large volumes at low resolution, in contrast to high-resolution methods that remained confined to two dimensions.

Unsurprisingly, these technologies are most useful across a variety of medical and surgical diagnostic specialties. Some can be employed during live surgery, while others are better suited to the laboratory.

Medical Science and 3D Robotic Microscopes

The term robotic 3D microscopy often refers to robot-assisted microscopy. A surgical robotic visualization system such as the ZEISS KINEVO 900, used in neurosurgery, employs 3D 4K Digital Hybrid Visualization that allows the whole surgical team to view the operation in detail. When the KINEVO 900 is used as an exoscope with the 3D view, it enables microvascular suturing and other microsurgical manipulations.

It also limits the need for repositioning and refocusing during the procedure, saving vital time in the operating room.

Synaptive’s Modus V™ is a robotic digital microscope (essentially a 3D microscope mounted on a robotic arm) that combines state-of-the-art optics, video processing, and robotic automation to give surgeons and clinicians a completely new approach to surgical operations.

Stereoscopic 3D visualization provides natural depth perception and realistic surgical views that give the surgeon enhanced visual information on an eye-level display during difficult surgical tasks. Modus V has a 12.5X optical zoom, a significantly larger volume of view than a standard microscope, and enhanced video processing to provide unobstructed, clear views for any surgical approach.

The surgeon can control the system settings from within the sterile operating area, including optics and robotics, using voice commands for an uninterrupted surgical workflow. It uses Synaptive’s tracked surgical instruments feature together with Modus V hands-free robotic movement to provide AutoFocus control, eliminating the need for manual focus adjustments during the surgery.


Meanwhile, the team at 3Scan, a company founded by MIT graduates Cody Daniel and Todd Huffman, is bringing high-throughput robotic 3D imaging to pathology, where it has huge potential to deliver detailed 3D images containing considerably more information than standard 2D microscopy.

Pathology typically involves cutting tissue samples by hand, placing each sample between two pieces of glass, and studying it under a microscope. A human can typically process about 12 sample slices per hour. 3Scan speeds this process up considerably.

Its knife-edge scanning microscope (KESM) tool uses an automated diamond knife to cut samples at 1,000 slices per hour, instantly scanning an image of each slice. The scans are layered to create a 3D tissue model with micron-scale resolution.
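The layering step can be sketched in a few lines. The following is an illustrative sketch only, not 3Scan's actual pipeline: it assumes each scanned slice arrives as a 2D array of equal shape and stacks them, in cutting order, into a 3D volume.

```python
import numpy as np

def stack_slices(slices):
    """Stack sequential 2D slice images (all the same shape) into a 3D
    volume, in cutting order. Illustrative only."""
    return np.stack(list(slices), axis=0)  # shape: (n_slices, height, width)

# Illustrative run: 1,000 synthetic 64x64-pixel slices -> a (1000, 64, 64) volume
demo = stack_slices(np.random.rand(64, 64) for _ in range(1000))
print(demo.shape)  # (1000, 64, 64)
```

In practice, each voxel of such a volume would map to a physical location in the tissue, with the slice thickness setting the resolution along the cutting axis.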

A Novel Application for 3D Robotic Microscopy

While most 3D robotic microscopy is used in the medical field, 3D robotic microscopes are being used in a very different setting in order to monitor the health of the world’s oceans.

Plankton are natural indicators of change in the health of water bodies and make a major contribution to regulating the quality of air and water on Earth. They also form the base of the ocean food chain.

Worryingly, plankton populations have fallen by over 40% since 1950, according to research published in the journal Nature in 2010, possibly due to rising global temperatures. Studying them remains a challenge, however, since it requires collecting sufficient samples and shipping them to a laboratory without damage.

IBM’s solution to this problem is to deploy artificial intelligence (AI)-controlled swimming robotic microscopes, networked via the cloud, to track and monitor plankton behavior in their natural environment. The microscopes rely on an imager chip that captures the shadow of the plankton as it swims over the chip, “generating a digital sample of its health, without the need for focusing (since there’s no lens),” IBM said.
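The lensless shadow-capture idea can be illustrated with a simple sketch. This is not IBM's implementation; it assumes the chip reports a frame of pixel intensities and estimates an organism's footprint by counting pixels that are darker than the background.

```python
import numpy as np

def shadow_area(frame, background, threshold=0.2):
    """Estimate the shadow footprint (in pixels) of an organism passing
    over a lensless imager chip. A pixel counts as shadow when it is
    darker than the background by more than `threshold`. Illustrative only."""
    shadow = (background - frame) > threshold
    return int(shadow.sum())

# Illustrative: a uniform bright background with a dark 5x5 "plankton" shadow
bg = np.ones((32, 32))
fr = bg.copy()
fr[10:15, 10:15] = 0.3          # the organism blocks light over 25 pixels
print(shadow_area(fr, bg))      # 25
```

A series of such footprints over time would form the raw signal from which health indicators could be derived.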

In addition, high-performance, low-powered AI technology could analyze and interpret the data locally, “reporting any abnormalities in real-time so they could be acted upon immediately.”
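Local anomaly reporting of this kind can be approximated with a simple statistical check. The sketch below is a stand-in for on-device AI, not IBM's method: it flags a new measurement as abnormal when it falls far outside the running history.

```python
import statistics

def is_abnormal(history, new_value, z=3.0):
    """Flag a new measurement as abnormal when it lies more than `z`
    standard deviations from the history. A simple stand-in for
    on-device anomaly detection; illustrative only."""
    if len(history) < 2:
        return False
    mean = statistics.fmean(history)
    sd = statistics.stdev(history)
    return sd > 0 and abs(new_value - mean) > z * sd

readings = [24, 26, 25, 27, 25, 26, 24, 25]   # e.g. shadow footprints over time
print(is_abnormal(readings, 25))   # False: within the normal range
print(is_abnormal(readings, 80))   # True: far outside the normal range
```

Running such a check locally means only the anomalies need to be transmitted, which matters for low-power devices operating far from shore.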

It isn’t a surprise that microscopy is being transformed by the application of three-dimensionality and AI.

As the technology matures, both in resolution and in the algorithms that interpret its images, it will become an increasingly powerful tool, not only in medical and biological science but also in monitoring the health of the planet at the smallest scales.

References and Further Reading

Robot Microscopes, AI Bias Among IBM’s 5 ‘Life in 2023’ Predictions. Robotics Business Review. Available from: https://www.roboticsbusinessreview.com/ai/robot-microscopes-ai-bias-ibm-life-in-2023/

Cleaning our oceans with AI-powered robot microscopes. Queen Elizabeth Prize for Engineering. Available from: https://qeprize.org/news/cleaning-our-oceans

Modus V™. Synaptive. Available from: https://www.synaptivemedical.com/products/modus-v/

How a robotic operative microscope affects neurosurgery. Med-Tech Innovation. Available from: https://www.med-technews.com/features/laboratory-evaluation-of-a-robotic-operative-microscope/

Collins, J.T. et al. (2019). Robotic microscopy for everyone: the OpenFlexure Microscope. Biomedical Optics Express. DOI: 10.1364/BOE.385729

See it in action: Robotic microscope puts pathology into 3D perspective. MIT Alumni Association. Available from: https://www.youtube.com/watch?v=9wZuSWP7txw

Written by

Stephen Edgar

Steve Edgar is an Information Design professional with a degree in graphic communication and a master's in digital design.


Please use one of the following formats to cite this article in your essay, paper or report:

  • APA

    Edgar, Stephen. (2020, May 04). An Introduction to the Robots that Use 3D Microscopy. AZoRobotics. Retrieved on June 14, 2024 from https://www.azorobotics.com/Article.aspx?ArticleID=376.

