Navlab
Graduated from IntuitiveX
Innovative intellectual property, artificial intelligence and a mission to improve surgical workflow.
Summary
Our mission is to improve surgical workflows and patient outcomes through strategic IP development.
By finding the “White Spaces” in surgical navigation, the NavLab team strategically created a
12-asset IP portfolio covering all aspects of surgical workflow including:
  • AR/VR technology tools for surgical planning and intraoperative assistance
  • AI implementation for both traditional and robotic surgical systems
  • Surgical Navigation
  • Point-of-View of robot end-effector for MIS surgery using AR/VR
  • Multi-user interfaces (surgical team user roles, remote surgery)
The problem
Current surgical technology, such as diagnostic imaging, electronic medical records (EMRs) and other patient-specific data, is built for reference during and after surgery but has extreme limitations. The surgeon never has direct control of information such as X-rays, MRI images, CT scans or other key patient data needed for reference during an operation. A surgeon can do general pre-operative planning, such as reviewing images or making rough sketches, but these materials are not accessible while scrubbed into surgery.

Additionally, there is no systematic method for analyzing the placement of surgical implants and comparing it to the preoperative plan. As a result, there is no system from which a surgeon or machine can learn and improve upon what has been executed.
Who we are
We develop IP for improved surgical workflow - from planning to performance - ultimately improving patient outcomes.

Artificial intelligence (AI) is increasingly vital to diagnosis, image analysis and surgical planning. The future surgical theater will unarguably include a virtual surgical tool set that augments hand-held surgical instruments and complements the AR/VR-assisted surgical process. Through intelligent surgical planning, a surgeon can capitalize on augmented/virtual reality systems within the operating room, as well as on intelligent, robot-assisted surgery.

The human-machine interface integrates computer algorithms with surgeon input to create intelligent software that learns and improves outcomes with continued use. The ability to directly overlay interactive models of patient imaging, along with the surgeon's virtual surgical plan, allows not only for rehearsing the surgery before entering the operating room but also for more accurate and minimally invasive surgical performance.

Additionally, this technology will augment the surgeon's intraoperative knowledge of the anatomy, allowing them to adjust the placement of hand-held instruments or robot-assisted surgical tools along suggested ideal pathways based on image analysis.
The team
CEO: Justin Esterberg, MD
CMO: Jeffrey Roh, MD, MBA
COO: Simon Robinson, MBA
CLO: Mark Han, Esq.
Strategic advisor
John Cronin: IP Capital Group