Anatomical Visualization
Bring surgical plans into the operating room
Iris is an anatomical visualization service. Using data from diagnostic CT imaging, it creates a segmented 3D model of a patient's anatomy, which surgeons use to consult with patients and prepare for cases.
The model can be viewed standalone in a mobile app or loaded into a da Vinci surgical system during surgery.
I have been the design lead on Iris 2.0, managing all design efforts and delivery, since joining the company in July 2019. I led the design for the 2.0 release through December 2020 and now oversee a designer's work on subsequent updates.
The first launch to a limited set of users is scheduled for May 2022.
Project
Area of Focus
Experience Continuity
To make the experience effortless and seamless, I designed and facilitated the conversations that delivered these features:
Single sign-on (SSO) integration to allow effortless account authentication
Consistent App and Web experience
An overall UI redesign to match the Intuitive brand language, from the software to the surgical system
An offline experience that accommodates unstable Wi-Fi in the operating room
Simplify Order Management
We learned from pilot feedback that the order status was not clear enough. After consolidating user and field feedback, I streamlined the experience by:
Revising order statuses and labels
Simplifying the overall web and mobile information architecture so users can identify and locate a patient model quickly
Working with the Human Factors Engineer and the Clinical Development Engineer to make the case to the regulatory team for removing an unnecessary verification step from the workflow
Improve Order Workflow
Users requested these features after using the pilot app and site:
Expand to include common PACS search criteria
Upload scan via web
Allow a delegate user (an assistant) to access Iris and order on the surgeon's behalf
Improve 3D Model Comprehension and Manipulation
The core of the service is the 3D model. To help surgeons prepare for a case and eventually use the models during a da Vinci surgery, I worked on improving the interactions so they flow seamlessly and can be driven "blindly" when needed.
Selected Interactions
Adjust single anatomy transparency
When users long-press any anatomy in the view, they get immediate feedback on the adjustment. When the transparency is lowered all the way, the parenchyma rendering turns into a silhouette outline that better highlights the segmented borders.
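The transparency-to-silhouette behavior described above can be sketched as a small state function. This is a hypothetical illustration, not Iris source code; the names `adjustTransparency` and `AnatomyRender` are assumptions for the sketch.

```typescript
// Hypothetical sketch: applying a long-press drag delta to an anatomy's
// opacity, switching to a silhouette outline once opacity reaches zero
// so the segmented borders remain visible.
interface AnatomyRender {
  opacity: number;                  // 0 (fully transparent) to 1 (opaque)
  mode: "solid" | "silhouette";
}

function adjustTransparency(current: number, delta: number): AnatomyRender {
  // Clamp the adjusted opacity to the valid [0, 1] range.
  const opacity = Math.min(1, Math.max(0, current + delta));
  // At zero opacity, render the parenchyma as a silhouette outline
  // instead of hiding it entirely.
  return { opacity, mode: opacity === 0 ? "silhouette" : "solid" };
}
```

Keeping the silhouette as a distinct render mode (rather than simply hiding the anatomy) preserves the spatial reference the surgeon needs while de-emphasizing the tissue itself.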
Model transparency master sheet
Without changing the desired angle of the model, users can adjust the transparency of, or toggle, any anatomy at the same time, which is helpful when anatomies hidden behind others are hard to select.
Change view
Users can save up to 5 views for planning and reference during surgery. I chose a double-tap gesture that turns the entire screen into the menu, letting users swipe between the 3D views and the original medical image views without having to visually locate any UI, so the interaction works continuously with blind drive.
Blind drive
Before the models are fully integrated into the robotic system, surgeons still need to interact with the phone directly to manipulate them. During surgery, the surgeon's head is in the console, and they may prefer not to pull it out just to interact with the phone. Double-tapping the phone screen turns the entire screen into a large slider, which lets users switch views "blindly" by simply swiping, without looking at the device.
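The double-tap-then-swipe pattern above can be modeled as a tiny state machine. This is a hypothetical sketch under my own assumptions (class and method names are invented), not Iris's implementation.

```typescript
// Hypothetical sketch of the "blind drive" interaction: a double-tap
// toggles blind-drive mode, after which horizontal swipes cycle through
// the saved views without any visual targeting.
type ViewId = string;

class BlindDrive {
  private active = false;
  private lastTapTime = Number.NEGATIVE_INFINITY;
  private index = 0;

  constructor(private views: ViewId[], private doubleTapMs = 300) {}

  // Call on every tap; two taps within doubleTapMs toggle blind-drive
  // mode. Returns whether blind-drive mode is now active.
  tap(timestampMs: number): boolean {
    const isDoubleTap = timestampMs - this.lastTapTime <= this.doubleTapMs;
    this.lastTapTime = timestampMs;
    if (isDoubleTap) this.active = !this.active;
    return this.active;
  }

  // Call on a completed horizontal swipe; dx > 0 means swipe right.
  // Cycles to the next/previous saved view while blind drive is active.
  swipe(dx: number): ViewId {
    if (this.active) {
      const step = dx > 0 ? 1 : -1;
      this.index = (this.index + step + this.views.length) % this.views.length;
    }
    return this.views[this.index];
  }
}
```

Treating the whole screen as the gesture target means the surgeon needs no spatial precision at all, which is the property that makes the interaction workable without looking.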
User Feedback on the 3D models (Lung)
Nice borders near nodules
“I like this view of just lung segments to see the planes relative to the nodule without the vessels. Very unique view to see borders.”
– Dr. Dan Oh
Distinctive fissures
“The contouring of the fissure here is nice when viewed along its plane.”
– Dr. Dan Oh
Appreciating the airways with segmented borders
“I really like this view of the airways with vessels invisible and shadow of the segmental borders.”
– Dr. Dan Oh
User Feedback on Experience
Iris received an excellent SUS score of 88; the industry average is 68, and Intuitive's flagship surgical system scored 82.
“It’s a user-friendly interface,
I think you can do it without any training at all.”
– Surgeon, from a formative study
The System Usability Scale (SUS) provides an overall measurement of usability as defined by ISO 9241, a multi-part standard from the International Organization for Standardization covering the ergonomics of human-computer interaction.
That definition of usability comprises the following characteristics:
Effectiveness — Can users successfully achieve their objectives?
Efficiency — How much effort and how many resources are expended in achieving those objectives?
Satisfaction — Was the experience satisfactory?
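For context on how a score like 88 is produced: the standard SUS questionnaire has 10 items rated 1–5, and the raw ratings are converted to a 0–100 score. The sketch below shows the standard scoring rule (it is generic SUS math, not anything specific to the Iris study; the function name is my own).

```typescript
// Standard SUS scoring: odd-numbered items are positively worded
// (contribution = rating - 1), even-numbered items are negatively
// worded (contribution = 5 - rating). The sum of contributions
// (0–40) is scaled by 2.5 to yield a 0–100 score.
function susScore(responses: number[]): number {
  if (responses.length !== 10) {
    throw new Error("SUS requires exactly 10 item responses");
  }
  const contributions = responses.map((rating, i) =>
    i % 2 === 0 ? rating - 1 : 5 - rating // index 0, 2, … are odd-numbered items
  );
  return contributions.reduce((sum, c) => sum + c, 0) * 2.5;
}
```

A fully neutral response pattern (all 3s) yields 50, which is why the industry average of 68 and Iris's 88 both sit well above the scale's midpoint.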