Cardiologist Blair Suter, MD Utilizes New VR Technology for Medical Education

Summary

Ohio State cardiologist Dr. Blair Suter is exploring virtual reality for better visualization of medical scans. With syGlass, he’s creating immersive learning experiences for medical trainees that can help improve procedural planning in the future.

Blair Suter, MD, is a cardiologist at The Ohio State University Wexner Medical Center who is board-certified in pediatrics, internal medicine and internal medicine/cardiovascular disease. Earlier this year, Dr. Suter approached the EdTech Incubator (ETI) team to see how its technology could help him better visualize cardiac multi-modality imaging with virtual reality.

The Exploration

As a new faculty member in the division of cardiology, Dr. Suter originally learned about the ETI through the curriculum cohort in Ohio State’s FAME (Faculty Advancement, Mentoring and Education) program. The cohort spent time in the 3D Printing Lab, Virtual Reality Zone and Anatomy Visualization Zone to learn what technology was available and how the free resource could be used by faculty. After the tour and demonstrations, Dr. Suter reached out to the ETI Coordinator, Mo Duncan, for guidance on utilizing the technology.

When asked about his initial motivation for this project exploration, Suter said, “Using Cardiac CT, we look at structural heart disease, including devices implanted in structural and congenital heart disease. Oftentimes, those things are difficult to visualize when you are just getting started, so what we’re looking at here is trying to do 3D modeling in virtual reality to try to make that easier for learners and possibly for clinical application as well.”

The Solution

After hearing about what Dr. Suter wanted to explore, Duncan recommended syGlass, new software available for 3D modeling visualization in virtual reality. For his current pilot study, Suter and his cardiac fellow, Omar Latif, MD, are using a patient case where a WATCHMAN device was malpositioned.

The WATCHMAN is a small device that fits directly into the left atrial appendage (LAA) in the heart and closes it off to prevent blood clots in that location that could lead to a stroke. It’s intended as a one-time surgical solution that allows patients to avoid blood-thinning medications and the bleeding risks that come with them. However, if the device is malpositioned, the patient is once again at a higher risk for blood clots and stroke.

Using syGlass, Suter uploaded a patient’s de-identified CT scan and was able to see their heart in 3D while wearing a Varjo XR-3 virtual reality headset. While looking at the scan in 3D, Suter manipulated the image by rotating it and slicing through the heart at various angles for a better look at the device placement. After determining the best visual manipulation to see the device, Suter was able to record a video while in the headset. Learners can watch the video and see what Suter saw in the headset, while also listening to him explain the malpositioning and how this visualization can be utilized in procedural planning for correcting the placement.

Suter stated, “We’re on the first step, but myself and the cardiology fellow I’m working with, Omar, have talked about all kinds of different cardiac applications that VR modeling can have, in education but clinically as well.”

The Experience

“After the initial time I was here for the demo, I came over here and spent about 30 minutes and was able to manipulate the models in that time with me never having used this before and learning how to use this software and equipment,” said Suter. “This is just my second time using it and I could navigate it with 10 minutes of preparation.”

Suter is currently using his video recording with the CT scan of the WATCHMAN device as a prototype for a much larger vision. He plans to record a series of videos to build a library for medical students, residents and fellows to have immersive learning experiences with cardiac imaging.

Learn more about syGlass in our recent news article: Ohio State Approves syGlass Software for Virtual Reality.

Image: Dr. Blair Suter standing and smiling in front of an anatomical heart on a display.

Interdisciplinary Collaboration Utilizing VR for Human Anatomy Exploration

Summary

In an innovative effort to enhance the educational experience of her critical care nutrition course, Kristen Roberts, PhD, RD, LD, partnered with Laura Boucher, PhD, AT, ATC, to teach students on the path to becoming dietitians. The advanced knowledge of human anatomy that students glean from Boucher gives them a better understanding of the muscle and fat variations that helps them master the physical assessment skills needed to obtain their credential and perform their job duties as clinical dietitians.

The Exploration

Initially, the collaboration between Roberts and Boucher, both clinical associate professors at The Ohio State University, involved a class of dietetics students watching Boucher teach with a prosected donor body. While both professors agree that nothing can replace the experience of seeing anatomical structures in a real body, there were drawbacks to the approach for this discipline. Some dietetics students were up front and asked a lot of questions during the dissection, but many of them had no prior dissection experience and weren’t comfortable engaging.

Roberts and Boucher reached out to Mo Duncan, coordinator of the EdTech Incubator (ETI), seeking the cutting-edge educational technology tools in its Anatomy Visualization Zone. They were interested in replacing the donor prosection teaching with a 2D dissection on the Sectra table and 3D visualization of a full body in virtual reality (VR), with hopes that they could engage more students while decreasing costs compared to using a donor prosection.

The Solution

The transition to the Anatomy Visualization Zone from the anatomy lab brought encouraging improvements to the learning experience. Technology used in the zone includes:

Sectra table with VH Dissector software in 2D:

  • The Sectra table is a 56x32-inch interactive screen that allows users to interact with 3D images, manipulating the size and angle for a better view.
  • The VH Dissector software shows a 2D digitized, full-body CT scan that allows users to isolate anatomical structures and simulate a dissection.

Varjo headset and VH Dissector software in VR:

  • The Varjo XR-3 is a high-resolution mixed reality headset with the additional benefit of limiting motion sickness.
  • The VH Dissector software in VR shows the same digitized, full-body CT scan as the Sectra table but in 3D. The image shows the body vertically in front of you with the same ability to isolate anatomical structures and simulate a dissection. As one user is immersed in the VR experience, what they see in the headset displays on the classroom TV.

Quest 3 headsets with 3D Organon software:

  • 3D Organon is compatible with Meta Quest 3 VR headsets and shows a recreated, anatomically correct model that allows users to isolate different systems, regions and structures.

During the lab time, students were divided among three stations. At one station, students explored the vascular system with Quest 3 headsets; the second station was an interactive, virtual dissection led by Boucher. Boucher showed musculoskeletal anatomy and different cross-sections and thicknesses of fat, providing a deeper understanding of anatomical variations as students asked questions. She reported that more students felt comfortable being close to the virtual dissection and therefore more students were engaged in the learning experience.

“It worked really well,” said Boucher, “and we've had a lot of years of the donor body experience, but we want to come back to the [ETI] space. And I think that's a testament to the potential that's there and the feedback from the students.” 

The third station involved students independently looking at the same body that Boucher showed on the Sectra table using the Varjo headset. Students could isolate structures and explore different parts of the body while others could watch the experience on a large TV screen.

The Experience

Duncan provided guidance and training during two preparation sessions before students came in for their lab. There were a couple of technical speed bumps to work through with the vascular system stations, and Roberts and Boucher plan to cut out those stations the next time they use the space so students can get more time at Boucher’s virtual dissection station.

When asked about using the anatomy lab versus the Anatomy Visualization Zone, Roberts said, “I think there were trade-offs to both, but to me, the overarching positive with the [Anatomy Visualization] space was that I actually think their learning was enhanced because of how the time was spent.”

“I think this fall I'm going to try to go into the Anatomy Visualization area to take advantage of a similar setup with the MAT [Master of Athletic Training] students, because I think they might get more out of their general medical conditions course where we teach abdomen, thorax, pelvis and heart and lung anatomy,” said Boucher. “This experience has shown me what other opportunities there are and where I can apply the same type of learning in a class to have a better experience that might be more meaningful.”

Image: Kristen Roberts and Laura Boucher standing and smiling in front of a large digital interactive display of human anatomy.

Transforming Surgical Education with an Interactive Virtual Reality Resource Library

Summary

At the crossroads of medicine and cutting-edge technology, the Anatomy Laboratory Toward Visuospatial Surgical Innovation in Otolaryngology and Neurosurgery (ALT-VISION) at The Ohio State University is making transformative strides in surgical education. Leading the way with his 3D-models-turned-virtual-reality experience is Moataz Abouammo, MD, MSc, associate researcher and coordinator at ALT-VISION.

Abouammo collaborated with Je Beom Hong, MD, Ohio State Department of Neurosurgery research fellow, and Rebecca Leme Gallardo, MD, Ohio State Department of Otolaryngology research fellow.

The Exploration  

Abouammo initially developed a comprehensive library containing more than 200 high-fidelity 3D models, simulating a wide range of endoscopic endonasal, transcranial and orbital surgical approaches. After rigorous surgical anatomical dissections, he used 3D photogrammetry — an advanced technique that compiles 3,000 to 4,000 photographs per specimen — to create precise, scalable reconstructions of surgical anatomy. These models were incorporated into the Atlas of Endoscopic Sinus and Ventral Skull Base Surgery.

Conceived as an interactive 3D dissection manual, the project aimed to provide detailed anatomical models for various endoscopic and skull base surgical approaches. But the vision extended beyond static models; the goal was to leverage technology to create a transformative educational tool that could enhance learning for trainees and practicing surgeons alike.

After Abouammo reached out to the ETI team, ETI Coordinator Mo Duncan and Learning and Development Specialist Thomas Ellsworth were able to troubleshoot and map out a path forward to take the 3D models even further.

The Solution

The collaborative effort resulted in scanning the previously developed 3D models and integrating them into a VR framework, crafting an immersive educational platform. Trainees and surgeons can now explore surgical approaches step by step, manipulate anatomical structures and visualize complex spatial relationships in a safe, interactive and three-dimensional space. This leap in educational technology bridged the gap between traditional dissection training and real-world surgical scenarios, enabling learners to rehearse procedures and build both confidence and precision without the risks associated with live surgery.

Looking forward, the team is poised to expand the project further. “Building on our success, we’re exploring ways to enhance the VR experience by layering the 3D models — allowing users to interactively peel back anatomical structures and visualize deeper surgical planes with precision,” said Abouammo. “We also plan to refine animations to better demonstrate critical steps in each approach, making the educational tool even more intuitive and dynamic. Furthermore, we are aiming to significantly expand the surgical scope to include maxillofacial, facial plastics, transoral and neck approaches.”

These advancements could revolutionize how surgeons train, offering a level of detail and interactivity that surpasses traditional methods. “With the ETI’s expertise in immersive technology and our shared commitment to innovation, I’m confident we can develop the next generation of surgical simulation tools, with potential applications in preoperative planning, interactive education for residents and medical students, patient education and beyond,” said Abouammo.

The Experience

Feedback from surgeons and trainees has been overwhelmingly positive. The immersive learning environment not only improved anatomical understanding but also increased trainees’ confidence and skill in complex procedures. Early results indicate that the tool is transforming surgical training by making learning more engaging, accessible, fun and effective. As long-term clinical impact is studied, the immediate educational benefits are clear — and the team looks forward to continuing their collaboration with the ETI to push the boundaries of what is possible in surgical education.

Abouammo shared the following: “Working with the ETI team has been an incredible experience — their expertise, creativity and collaborative spirit were instrumental in bringing this project to life. They were not only highly skilled in troubleshooting technical challenges but also deeply invested in the project’s success. They treated the project as if it were their own, constantly brainstorming ways to enhance its educational impact.

Thanks to their expertise, we were able to deliver a cutting-edge VR training tool that received outstanding feedback from surgeons and trainees alike. I’d jump at the opportunity to work with them again on future projects.”

Kyle K. VanKoevering, MD, FACS, associate professor, otolaryngology — head and neck surgery at Ohio State College of Medicine, praised Abouammo’s innovative thinking. "This novel, immersive technology allows learners to experience surgical skull base anatomy in a way that is unlike anything available,” said VanKoevering. “More portable and accessible than any cadaveric dissections, these 3D models have the potential to revolutionize anatomic education."

Image: Rebecca Leme Gallardo, Moataz Abouammo and Je Beom Hong smiling and posing together wearing Ohio State branded scrubs.