Type: Review article
Published: 15-03-2019

 

Imaging and printing in plastic and reconstructive surgery part 2: emerging techniques

Michael P Chae MBBS BMedSc,1,2,3 David J Hunter-Smith MBBS MPH FRACS,1,2,3 Warren M Rozen MBBS PhD FRACS1,2,3

1 Department of Plastic, Reconstructive and Hand Surgery, Peninsula Health, Frankston, Victoria, AUSTRALIA

2 Peninsula Clinical School, Central Clinical School at Monash University, The Alfred Centre, Melbourne, Victoria, AUSTRALIA

3 Department of Surgery, School of Clinical Sciences at Monash University, Monash Medical Centre, Clayton, Victoria, AUSTRALIA

OPEN ACCESS

Correspondence

Name: Warren Rozen

Address: Monash University Plastic and Reconstructive Surgery Group (Peninsula)
Peninsula Health, Department of Surgery
2 Hastings Road
Frankston, Victoria, 3199
AUSTRALIA

Email: warrenrozen@hotmail.com

Telephone: +61 3 9784 8416

Citation: Chae MP, Hunter-Smith DJ, Rozen WM. Imaging and printing in plastic and reconstructive surgery part 2: emerging techniques. Aust J Plast Surg. 2019;2(1):69–77

Accepted for publication: 21 June 2018

Copyright © 2019. Authors retain their copyright in the article. This is an open access article distributed under the Creative Commons Attribution Licence which permits unrestricted use, distribution and reproduction in any medium, provided the original work is properly cited.


Abstract

Background: In the second of a two-part series, we evaluate emerging three-dimensional (3D) imaging and printing techniques based on computed tomography angiography (CTA) and magnetic resonance angiography (MRA) for use in plastic and reconstructive surgery.

Method: A review of the published English literature dating from 1950 to 2017 was undertaken using databases such as PubMed, MEDLINE®, Web of Science and EMBASE.

Results: Image-guided navigation systems using fiducial markers have demonstrated utility in numerous surgical disciplines, including perforator-based flap surgery. However, these systems have largely been superseded by augmented reality (AR) and virtual reality (VR) technologies with superior convenience and speed. With the added benefit of tactile feedback, holograms also appear promising but have yet to be developed beyond the prototypic stage. Aided by a growing volume of digitalised clinical data, machine learning (ML) promises significant benefits for future image-based decision-making processes.

Conclusion: Most studies of image-guided navigation systems, AR, VR, holograms and ML have been presented in small case series and they remain to be analysed using outcomes-based validation studies. However, together they illustrate an exciting future where clinicians will be armed with intuitive technologies for surgical planning and guidance.

Key words: image processing, plastic and reconstructive surgery, augmented reality, virtual reality, holograms


Introduction

A plethora of imaging modalities has been used in plastic and reconstructive surgery to aid preoperative planning, intraoperative guidance and medical education.1,2 Conventional tomographic imaging modalities such as computed tomography angiography (CTA) and magnetic resonance angiography (MRA) remain relatively affordable and commonly accessible.3–8 As a result, clinicians have investigated novel technologies to expand their use such as image-guided navigation systems, augmented reality (AR), virtual reality (VR), holograms and machine learning (ML).

In the second of a two-part series, we evaluate emerging 3D imaging and printing techniques based on CTA and MRA.

Method

We reviewed the published English literature from 1950 to 2017 from well-established databases such as PubMed, MEDLINE®, Web of Science and EMBASE. We included all studies that analyse 3D imaging and printing techniques used in surgery, especially plastic and reconstructive surgery. We used search terms such as ‘3D imaging’, ‘CTA’, ‘MRA’, ‘3D image software’, ‘simulation surgery’, ‘stereotactic navigation-assisted surgery’, ‘augmented reality’, ‘virtual reality’, ‘hologram’, ‘automation’, ‘machine learning’, ‘artificial intelligence’, ‘preoperative planning’, ‘intraoperative guidance’, ‘education’, ‘training’ and ‘customised implant’. We also retrieved secondary references found through bibliographical linkages.

Through our literature review, we qualitatively analysed hardware and software programs used for image-guided navigation-assisted surgery, AR, VR, holograms and ML, evaluating their cost (affordability arbitrarily being defined as costing less than AU$500) and up-to-date clinical applications. Papers were assessed using Oxford Centre for Evidence-Based Medicine levels of evidence.9

Results and discussion

Recent technological advances have led to the use of image-guided navigation systems, AR, VR, holograms and ML in surgical planning.

Image-guided navigation systems

An image-guided navigation system tracks surgical instruments in real time and maps their location onto preoperative CTA or MRI images so that they can be viewed intraoperatively.10,11 The earliest systems used external stereotactic frames fixed to the skull or other bony landmarks.12 Modern frameless navigation systems using fiducial markers,13 surface landmarks14 and surface-matching laser registration15 are faster, safer and more convenient.16 As a result, stereotactic navigation is used routinely in neurosurgery,17 spinal surgery,18 orthopaedic surgery,19 craniofacial surgery,20 ear, nose and throat surgery21 and endovascular surgery.22

In plastic surgery, Rozen et al demonstrated that registration systems using fiducial markers—six to seven in deep inferior epigastric artery perforator (DIEP) and nine to 10 in anterolateral thigh (ALT) flaps—are reliable for viewing CTA-derived perforator anatomy.23–26 Durden et al developed a novel electrocautery pen attached to a stereotactic frame and reported a global error of 2.1–2.4 mm during DIEP flap harvest.27 However, the longer, heavier diathermy handle may compromise surgical dexterity and requires its large reference frame to be fixed to the operating table.
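At the heart of these fiducial-based approaches is a point-to-point rigid registration step: computing the rotation and translation that best align the marker positions digitised on the patient with the same markers identified on preoperative CTA. As a minimal illustrative sketch only (not any vendor's implementation), the following Python code estimates this transform using the well-known SVD-based least-squares (Kabsch) method and reports the residual fiducial registration error; the six-fiducial layout, pose and noise level are hypothetical.

```python
import numpy as np

def register_rigid(image_pts, patient_pts):
    """Least-squares rigid registration (Kabsch/SVD method).

    image_pts, patient_pts: (N, 3) arrays of corresponding fiducial
    coordinates in image space and physical (tracker) space.
    Returns rotation R and translation t mapping image -> patient.
    """
    ci = image_pts.mean(axis=0)                    # centroids
    cp = patient_pts.mean(axis=0)
    H = (image_pts - ci).T @ (patient_pts - cp)    # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Correction term to avoid a reflection (det(R) must be +1)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cp - R @ ci
    return R, t

# Hypothetical example: six abdominal-wall fiducials (mm), as might be
# used for a DIEP flap; all coordinates are illustrative only.
img = np.array([[0, 0, 0], [60, 0, 0], [0, 80, 0],
                [60, 80, 5], [30, 40, 10], [30, 0, 5]], float)

# Simulate the same fiducials digitised on the patient with ~1 mm noise.
rng = np.random.default_rng(0)
theta = np.radians(10)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
pat = img @ R_true.T + np.array([100, 50, 20]) + rng.normal(0, 1, img.shape)

R, t = register_rigid(img, pat)
fre = np.sqrt(np.mean(np.sum((img @ R.T + t - pat) ** 2, axis=1)))
print(f"Fiducial registration error: {fre:.2f} mm")
```

Residuals of this kind underlie figures such as the 2.1–2.4 mm global error reported by Durden et al, although the clinically relevant quantity is the error at the target perforator itself, which the fiducial residual only approximates.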

In an interesting application, Chao et al developed a robot (KUKA Lightweight Robot; KUKA, Augsburg, Germany) that can perform osteotomy on a 3D-printed acrylic fibula with the aid of stereotactic navigation.28 Across the 18 robotic osteotomies executed, they reported an average linear variation of 1.3 ± 0.4 mm and an angular variation of 4.2 ± 1.7 degrees. It remains to be seen how this can be translated in vivo but its potential is intriguing.

Overall, navigation systems are seldom used in soft-tissue surgery due to the lack of reliable bony landmarks and have been superseded by augmented and virtual reality platforms (see Table 1).

Table 1: Summary of image-guided navigation systems used in reconstructive plastic surgery
Product | Manufacturer | Free | Open-source | Clinical application
BrainLAB | BrainLAB AG (Feldkirchen, Germany) | No | No | DIEP, ALT, DCIA
StealthStation | Medtronic Inc. (Minneapolis, MN, USA) | No | No | DIEP
KUKA Lightweight Robot | KUKA (Augsburg, Germany) | No | No | Free fibular flap
DIEP: deep inferior epigastric artery perforator; ALT: anterolateral thigh; DCIA: deep circumflex iliac artery

Augmented reality, virtual reality and holograms

In comparison with two-dimensional (2D) imaging modalities, AR and VR provide natural 3D visual perception, while holograms can additionally offer haptic feedback. In AR, first described by Boeing engineers Caudell and Mizell in 1992,29 real-time virtual images are superimposed on one’s view of the real environment.30 These images can be displayed either directly onto an object in real life, also known as the projection method, or indirectly on a portable device such as a head-mounted display or smartphone.31 In contrast, in VR, one’s entire visual perception is completely shrouded by a computer-simulated graphics environment.30

Virtual reality

Virtual reality is an attractive platform on which to generate anatomically accurate surgical simulations in order to perform preoperative planning or medical training and enable visual communication with multidisciplinary team members and patients.32 Arora et al have shown that mental practice using VR simulators can significantly improve the surgical skills of novice surgeons in laparoscopic cholecystectomy (p<0.05).33 However, currently most VR surgical simulators are pre-programmed, offer only limited interactions and exhibit such low image quality that it impedes the immersive experience.34–36

Augmented reality

Augmented reality produces an extended ‘layer’ or field of view that leads to intuitive real-time 3D visualisation of anatomical structures. Currently, most AR devices are expensive, slow and complicated. Nonetheless, their potential application has been explored in numerous surgical specialities including calibrating stereotactic instruments in neurosurgery,37 fashioning craniofacial implants in maxillofacial surgery,38 enhancing visualisation in laparoscopic surgery39 and sentinel lymph node biopsy in head and neck cancer40 and breast cancer surgery.41 In plastic and reconstructive surgery, AR appears to be most useful for preoperative planning, intraoperative image navigation and surgical training (see Table 2 and Figure 1).

Table 2: Summary of augmented reality devices used in reconstructive plastic surgery
Product | Manufacturer | Type/function | Affordability† | Clinical application
Hardware
PicoPix PPX2480 | Koninklijke Philips NV (Amsterdam, The Netherlands) | Projector | Yes | DIEP, inguinal lymph nodes
PRJ-5 | Sanwa Electronic (Osaka, Japan) | Projector | Yes | DIEP
nVisor ST60 | NVIS Inc. (Reston, VA, USA) | Head-mounted display | No | Thoracodorsal artery perforator flap
Projective Imaging System | University of Science and Technology of China (Anhui, People's Republic of China) | Projector | N/A | Skin flap perfusion
Google Glass | Alphabet (Mountain View, CA, USA) | Head-mounted display | No | Rhinoplasty
Software
VitreaAdvanced fX Workstation | Vital Images (Minnetonka, MN, USA) | 3D rendering | No | DIEP, inguinal lymph nodes
OsiriX | Pixmeo (Geneva, Switzerland) | 3D rendering | Yes | DIEP
ARToolKit | ARToolworks (Seattle, WA, USA) | AR virtual modelling | Yes | Thoracodorsal artery perforator flap
Bespoke software written in OpenCV | University of Science and Technology of China (Anhui, People's Republic of China) | AR virtual modelling | N/A | Skin flap perfusion
†Affordability is determined by whether the hardware or software costs less than AU$500 outright or per year in subscription, or is free. DIEP: deep inferior epigastric artery perforator; N/A: not available. Source: Hummelink et al,43 Hummelink et al,44 Hummelink et al,45 Sotsuka et al,46 Jiang et al,47 Gan et al,48 Peregrin.52

Figure 1. Projection of ALT perforators preoperatively using CTA-based direct augmented reality, performed using OsiriX software (Pixmeo) and a Philips PicoPix pocket projector (Koninklijke Philips NV, Amsterdam, The Netherlands). The purple line delineates traditional anatomical landmarks. The mark on the line correlated exactly with the location of the ALT perforator. ALT: anterolateral thigh; CTA: computed tomographic angiography

Hummelink et al described a projection-based direct AR technique using an affordable hand-held projector and proprietary software suites in three case series.43–45 In the first series, they projected a 3D-reconstructed CTA image of DIEPs onto the abdominal wall and demonstrated its high accuracy (84.3% vs 56.9%, p=0.03).45 In the following series, they extended this application by including the location of inguinal lymph nodes.44 In the latest series, they calculated the required flap volume and dimensions using 3D surface scanning and projected the combined 3D-reconstructed image to aid flap design and planning.43 One of the major limitations of this technique is operator dependence, since the projector must be held steadily above the patient at the correct height without significant tremor. Sotsuka et al attempted to resolve this by mounting the projector onto a fixed stand,46 but the reliability of this approach remains to be seen.
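The operator dependence of these projection techniques follows directly from the underlying geometry: the projector behaves as an inverse pinhole camera, so each CTA-derived 3D point maps to a projector pixel through the device's intrinsic parameters and its pose relative to the patient, and any unplanned change in height or tilt shifts the projected mark on the skin. A minimal Python sketch of this mapping, with entirely hypothetical calibration values:

```python
import numpy as np

def project_points(pts_3d, K, R, t):
    """Map 3D points (patient coordinates, mm) to projector pixels.

    K: 3x3 projector intrinsic matrix; R, t: pose of the patient
    coordinate frame in the projector frame (ideal pinhole model,
    no lens-distortion terms).
    """
    cam = pts_3d @ R.T + t             # transform into projector frame
    uvw = cam @ K.T                    # perspective projection
    return uvw[:, :2] / uvw[:, 2:3]    # divide by depth -> pixel (u, v)

# Hypothetical calibration: 1280x720 projector, ~1000 px focal length.
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])
R = np.eye(3)                          # projector looking straight down
t = np.array([0.0, 0.0, 800.0])        # held ~80 cm above the abdomen

# Illustrative perforator exit point segmented from CTA (mm).
perforator = np.array([[25.0, -40.0, 0.0]])
print(project_points(perforator, K, R, t))  # pixel to illuminate
```

Re-running the sketch with a slightly different pose shows how readily the projected mark drifts, which is precisely the error that a fixed stand is intended to eliminate.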

In animal studies, Jiang et al developed a highly accurate (3.5 mm) direct AR technique for raising thoracodorsal artery perforator flaps that does, however, require invasive positioning of the image registration system via percutaneous screws.47 Gan et al developed a compact direct AR technique consisting of a mini-projector and a near-infrared camera to detect skin perfusion after tail-vein injection of indocyanine green (ICG) dye.48 However, their system is too small for clinical application.

Recent advances in augmented reality and virtual reality

Conventional AR devices require large stereoscopic towers for image registration and viewing, which are inconvenient and occupy space in the operating theatre. However, wearable devices can now carry sufficient computing power for AR, and several investigators have developed bespoke wearable devices for surgical application.49,50 Mela et al report a device capable of projecting fluorescent angiography, 2D ultrasound and 3D CTA with depth perception in a compact, user-friendly interface.49 Similarly, Liu et al developed a compact, wireless, battery-operated device for hands-free viewing of fluorescent angiography for sentinel lymph node biopsy and tumour cell localisation.50 The latest and most promising wearable AR device was Google Glass (Alphabet, Mountain View, CA, USA). In plastic and reconstructive surgery, clinicians reported its benefits for viewing images and recording videos.51–54 Unfortunately, in 2015 Google Glass was taken off the market due to persistent software bugs and privacy concerns.55

Recently, researchers in Australia have developed a high-resolution, immersive 3D AR and VR environment using integrated supercomputers and multiple projectors with a cylindrical matrix of stereoscopic panels.56 This bespoke CAVE2™ (Monash University, Clayton, Victoria, Australia) consists of 80 high-resolution, stereo-capable displays producing an 8-metre diameter, 320-degree panoramic view (see Figure 2). Medical images can be processed relatively easily by a dedicated laboratory technician and the clinician can view them realistically in a 3D manner as if they are ‘walking through’ the anatomy. Currently, the set-up is too large to be portable and it is also expensive, but as lithium-ion batteries improve and technology becomes more mobile, the potential of such technology being transferred to a portable head-mounted display appears enticing.

Figure 2. Evolution of 3D imaging and printing techniques from 3D-reconstructed images and a basic 3D-printed model (A1–4) to clinically useful 3D printing applications in perforator-based breast reconstructive surgery (B1–6) and advanced image analysis technology such as augmented reality and the CAVE2™ facility (C1–6). (A1) 2D-reconstructed CTA image of the abdominal wall vasculature. (A2) 3D-reconstructed CTA image of the same patient in A1. (A3) Segmented image of the DIEA of the same patient in A1. (A4) 3D-printed model of the DIEA in A3. (B1) Segmented image of the abdominal wall and DIEA that spurred the idea of creating a template for preoperative planning. (B2) A patient-specific bespoke ‘DIEP template’ is 3D printed and placed on the patient’s abdominal wall to help locate the DIEA perforator and its pedicle. (B3) This information is used for flap design. As 3D printing techniques advanced, we were able to create both a standard DIEP template (B4) and a ‘perforasome template’ (B5), which can additionally identify each perforasome (B6). (C1–6) Augmented reality can significantly reduce the time and labour cost involved in 3D printing by enabling direct viewing and real-time interaction with the image data. (C1) Our published direct ARC (augmented reality CTA) technique set-up using a hand-held projector demonstrating 2D-reconstructed (C2) and 3D-reconstructed (C3) images on the patient’s abdomen. As technology advances, we envision that greater software processing power will enable display of greater anatomical information, such as the intramuscular course of a DIEP (C4), and translation into a user-friendly, interactive platform for clinicians (C5). (C6) The latest CAVE2™ facility (Monash University, Clayton, Victoria, Australia), housing an 84-million-pixel stereoscopic display with powerful real-time motion tracking capability, will enable interactive, seamless visualisation of relevant anatomy for preoperative planning and collaborative discussion. CTA: computed tomographic angiography; DIEA: deep inferior epigastric artery; DIEP: deep inferior epigastric artery perforator. Reprinted with permission, Chae et al.78–80

Holograms

A hologram displays reflective, auto-stereoscopic (that is, requiring no wearable device) 3D visuals composed of hogels (holographic elements, the analogue of pixels or voxels), each of which can contain up to one million different perspective views. Hackett et al evaluated the role of holograms in teaching cardiac anatomy to 19 volunteers (10 intervention versus nine control) and found superior overall test performance after their use (89% vs 68%, p<0.05).57 Furthermore, volunteers demonstrated a trend towards lower mental effort required in learning (4.9 vs 6.0, p=0.16). Recently, Makino et al have added tactile feedback to holograms by using concentrated ultrasonic energy.58 However, this technology has yet to advance beyond the prototypic stage.

Machine learning

Machine learning is a branch of artificial intelligence that uses a computer algorithm to aid clinical decision-making and to predict clinical outcomes based on knowledge acquisition from data mining of historical examples without explicit programming.59–61 The algorithm statistically analyses each hypothesis, compares multiple combinations and yields data models that are descriptive or predictive in nature. Machine learning has already transformed popular search engines such as Google (Google LLC, Mountain View, California, USA)62 and speech recognition software on smartphones such as Siri (Apple Inc, Cupertino, California, USA).63 Owing to an ever-growing volume of digitalised clinical data, ML presents a superior form of data interpretation to traditional statistical methods.64

Machine learning techniques can be classified according to their mathematical structure: predictive, where learning is supervised using pre-labelled data sets;65 descriptive, where learning is unsupervised and similar data points are clustered;66 and reinforcement, where the algorithm learns ideal behaviour through simple reward feedback on its actions.67 It can, however, be difficult for clinical investigators without statistical training to analyse how an algorithm has reached its conclusion.68,69 As a result, when using ML, clinicians need to collaborate with data scientists who can accurately evaluate the validity of the output obtained.70
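To make the supervised (‘predictive’) case concrete, the sketch below trains a logistic regression classifier on a synthetic, pre-labelled data set and reports sensitivity and specificity, the metrics quoted in the studies discussed next. It is a generic illustration using the scikit-learn library, not the pipeline of any study cited here.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

# Synthetic stand-in for pre-labelled clinical features:
# 500 cases, 20 features, binary outcome (e.g., lesion benign/malignant).
X, y = make_classification(n_samples=500, n_features=20,
                           n_informative=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Supervised learning: fit on labelled examples, predict unseen cases.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
tn, fp, fn, tp = confusion_matrix(y_test, clf.predict(X_test)).ravel()

print(f"sensitivity: {tp / (tp + fn):.2%}")  # true-positive rate
print(f"specificity: {tn / (tn + fp):.2%}")  # true-negative rate
```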

In the last decade, investigators have applied ML as a diagnostic and predictive tool to clinical challenges in various fields within plastic surgery. In melanoma detection, Safran et al conducted a systematic review of 50 different ML screening techniques and found a mean sensitivity of 87.60 per cent (95% confidence interval: 72.72–100) and a mean specificity of 83.54 per cent (60.92–100).71 Encouragingly, there was no statistically significant difference between ML and dermoscopy examination by experienced professionals.72 In craniofacial surgery, Mendoza et al used a statistical shape model to help diagnose non-syndromic craniosynostosis from CT.73 The algorithm yielded a sensitivity of 92.3 per cent and a specificity of 98.9 per cent, similar to those of trained radiologists.

In burns surgery, Yeong et al developed an ML algorithm to analyse reflectance spectrometry images and assess burn area and depth.74 They demonstrated an average predictive accuracy of 86 per cent. In free flap reconstruction, Kiranantawat et al developed an ML-based smartphone application, SilpaRamanitor, that can predict vascular compromise from 2D photographs with an overall sensitivity of 94 per cent, a specificity of 98 per cent and an accuracy of 95 per cent.75 In hand surgery, Conforth et al developed an algorithm capable of estimating the likelihood of tissue-engineered peripheral nerve graft take with 92.59 per cent accuracy.76 In aesthetic surgery, Gunes et al developed an automated classifier of facial beauty by analysing 165 images of attractive female faces as graded by human referees.77

Conclusion

Most studies of image-guided navigation systems, AR, VR, holograms and ML have been presented as small case series and they remain to be analysed using outcomes-based validation studies. Image-guided navigation systems are used less frequently in soft tissue surgery than in orthopaedic surgery and neurosurgery, due to the lack of reliable landmarks available for image registration. Augmented reality platforms such as CAVE2™, which enable intuitive, real-time 3D visualisation of anatomical structures, appear promising. Machine learning is a rapidly emerging, disruptive technology that may become highly useful as a diagnostic and predictive tool. Together, these illustrate an exciting future where clinicians will be armed with numerous intuitive technologies for surgical planning and guidance.

Disclosure

The authors have no financial or commercial conflicts of interest to disclose.

References

  1. Chae MP, Rozen WM, McMenamin PG, Findlay MW, Spychal RT, Hunter-Smith DJ. Emerging applications of bedside 3D printing in plastic surgery. Front Surg. 2015;2:25.
  2. Pratt GF, Rozen WM, Chubb D, Ashton MW, Alonso-Burgos A, Whitaker IS. Preoperative imaging for perforator flaps in reconstructive surgery: a systematic review of the evidence for current techniques. Ann Plast Surg. 2012;69:3–9.
  3. Smit JM, Dimopoulou A, Liss AG, Zeebregts CJ, Kildal M, Whitaker IS, Magnusson A, Acosta R. Preoperative CT angiography reduces surgery time in perforator flap reconstruction. J Plast Reconstr Aesthet Surg. 2009;62:1112–117.
  4. Rozen WM, Anavekar NS, Ashton MW, Stella DL, Grinsell D, Bloom RJ, Taylor GI. Does the preoperative imaging of perforators with CT angiography improve operative outcomes in breast reconstruction? Microsurgery. 2008;28:516–23.
  5. Masia J, Clavero JA, Larranaga JR, Alomar X, Pons G, Serret P. Multidetector-row computed tomography in the planning of abdominal perforator flaps. J Plast Reconstr Aesthet Surg. 2006;59:594–99.
  6. Alonso-Burgos A, Garcia-Tutor E, Bastarrika G, Cano D, Martinez-Cuesta A, Pina LJ. Preoperative planning of deep inferior epigastric artery perforator flap reconstruction with multislice-CT angiography: imaging findings and initial experience. J Plast Reconstr Aesthet Surg. 2006;59:585–93.
  7. Masia J, Kosutic D, Cervelli D, Clavero JA, Monill JM, Pons G. In search of the ideal method in perforator mapping: noncontrast magnetic resonance imaging. J Reconstr Microsurg. 2010;26:29–35.
  8. Rozen WM, Stella DL, Bowden J, Taylor GI, Ashton MW. Advances in the pre-operative planning of deep inferior epigastric artery perforator flaps: magnetic resonance angiography. Microsurgery. 2009;29:119–23.
  9. Centre for Evidence-Based Medicine. OCEBM levels of evidence [PDF on internet] Oxford, United Kingdom: CEBM [Updated 1 May 2016; cited 1 October 2014]. Available from: https://www.cebm.net/2016/05/ocebm-levels-of-evidence/.
  10. Iseki H, Kawamura H, Tanikawa T, Kawabatake H, Taira T, Takakura K, Dohi T, Hata N. An image-guided stereotactic system for neurosurgical operations. Stereotact Funct Neurosurg. 1994;63:130–38.
  11. Unsgaard G, Ommedal S, Rygh OM, Lindseth F. Operation of arteriovenous malformations assisted by stereoscopic navigation-controlled display of preoperative magnetic resonance angiography and intraoperative ultrasound angiography. Neurosurgery. 2007;61:407–15; discussion 415–16.
  12. Jensen RL, Stone JL, Hayne R. Use of the Horsley-Clarke stereotactic frame in humans. Stereotact Funct Neurosurg. 1995;65:194–97.
  13. Woerdeman PA, Willems PW, Noordmans HJ, Tulleken CA, van der Sprenkel JW. Application accuracy in frameless image-guided neurosurgery: a comparison study of three patient-to-image registration methods. J Neurosurg. 2007;106:1012–016.
  14. West JB, Fitzpatrick JM, Toms SA, Maurer CR Jr, Maciunas RJ. Fiducial point placement and the accuracy of point-based, rigid body registration. Neurosurgery. 2001;48:810–16; discussion 816–17.
  15. Raabe A, Krishnan R, Wolff R, Hermann E, Zimmermann M, Seifert V. Laser surface scanning for patient registration in intracranial image-guided surgery. Neurosurgery. 2002;50:797–801.
  16. Quinones-Hinojosa A, Ware ML, Sanai N, McDermott MW. Assessment of image guided accuracy in a skull model: comparison of frameless stereotaxy techniques vs. frame-based localization. J Neurooncol. 2006;76:65–70.
  17. Spivak CJ, Pirouzmand F. Comparison of the reliability of brain lesion localization when using traditional and stereotactic image-guided techniques: a prospective study. J Neurosurg. 2005;103:424–27.
  18. Papadopoulos EC, Girardi FP, Sama A, Sandhu HS, Cammisa FP Jr. Accuracy of single-time, multilevel registration in image-guided spinal surgery. Spine J. 2005;5:263–67; discussion 268.
  19. Kraus MD, Dehner C, Riepl C, Scholl H, Gebhard F. A novel method of image-based navigation in fracture surgery. Arch Orthop Trauma Surg. 2012;132:741–50.
  20. Weissler JM, Sosin M, Dorafshar AH, Garcia JR. Combining Virtual surgical planning, intraoperative navigation, and 3-dimensional printing in prosthetic-based bilateral microtia reconstruction. J Oral Maxillofac Surg. 2017;75:1491–497.
  21. Klapan I, Vranjes Z, Risavi R, Simicic L, Prgomet D, Glusac B. Computer-assisted surgery and computer-assisted telesurgery in otorhinolaryngology. Ear Nose Throat J. 2006;85:318–21.
  22. de Lambert A, Esneault S, Lucas A, Haigron P, Cinquin P, Magne JL. Electromagnetic tracking for registration and navigation in endovascular aneurysm repair: a phantom study. Eur J Vasc Endovasc Surg. 2012;43:684–89.
  23. Rozen WM, Buckland A, Ashton MW, Stella DL, Phillips TJ, Taylor GI. Image-guided, stereotactic perforator flap surgery: a prospective comparison of current techniques and review of the literature. Surg Radiol Anat. 2009;31:401–08.
  24. Rozen WM, Ashton MW, Stell, DL, Phillips TJ, Taylor GI. Stereotactic image-guided navigation in the preoperative imaging of perforators for DIEP flap breast reconstruction. Microsurgery. 2008;28:417–423.
  25. Rozen WM, Ashton MW, Stella DL, Ferris S, White DC, Phillips TJ, Taylor GI. Developments in perforator imaging for the anterolateral thigh flap: CT angiography and CT-guided stereotaxy. Microsurgery. 2008;28:227–32.
  26. Ting JW, Rozen WM, Niumsawatt V, Baillieu C, Leung M, Leong JC. Developments in image-guided deep circumflex iliac artery flap harvest: a step-by-step guide and literature review. J Oral Maxillofac Surg. 2014;72:186–97.
  27. Durden F, Carruthers KH, Haran O, Kocak E. Intraoperative navigation-assisted identification of deep inferior epigastric artery perforators. Plast Reconstr Surg. 2012;129:880e-882e.
  28. Chao AH, Weimer K, Raczkowsky J, Zhang Y, Kunze M, Cody D, Selber JC, Hanasono MM, Skoracki RJ. Pre-programmed robotic osteotomies for fibula free flap mandible reconstruction: a preclinical investigation. Microsurgery. 2016;36:246–49.
  29. Caudell TP, Mizell DW. Augmented reality: an application of heads-up display technology to manual manufacturing processes. In: Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences; 1992 Jan 7–10; Kauai, Hawaii, USA. Los Alamitos, California, USA: IEEE Computer Society Press 1992 vol II: 659–69.
  30. Milgram P, Kishino F. A taxonomy of mixed reality visual displays. IEICE Trans Inf Syst. 1994;E77-D(12):1321–329.
  31. Mann S. Wearable computing: a first step toward personal imaging. IEEE Comput. 1997;30:25–32.
  32. Pensieri C, Pennacchini M. Overview: virtual reality in medicine. J Virtual Worlds Res. 2014;7:1–34.
  33. Arora S, Aggarwal R, Sirimanna P, Moran A, Grantcharov T, Kneebone R, Sevdalis N, Darzi A. Mental practice enhances surgical technical skills: a randomized controlled study. Ann Surg. 2011;253:265–70.
  34. Sutton C, McCloy R, Middlebrook A, Chater P, Wilson M, Stone R. MIST VR. A laparoscopic surgery procedures trainer and evaluator. Stud Health Technol Inform. 1997;39:598–607.
  35. Hyltander A, Liljegren E, Rhodin PH, Lonroth H. The transfer of basic skills learned in a laparoscopic simulator to the operating room. Surg Endosc. 2002;16:1324–328.
  36. Fairhurst K, Strickland A, Maddern G. The LapSim virtual reality simulator: promising but not yet proven. Surg Endosc. 2011;25:343–55.
  37. Zeng B, Meng F, Ding H, Wang G. A surgical robot with augmented reality visualization for stereoelectroencephalography electrode implantation. Int J Comput Assist Radiol Surg. 2017;12(8):1355–368.
  38. Murphy RJ, Liacouras PC, Grant GT, Wolfe KC, Armand M, Gordon CR. A craniomaxillofacial surgical assistance workstation for enhanced single-stage reconstruction using patient-specific implants. J Craniofac Surg. 2016;27:2025–030.
  39. Marzano E, Piardi T, Soler L, Diana M, Mutter D, Marescaux J, Pessaux P. Augmented reality-guided artery-first pancreatico-duodenectomy. J Gastrointest Surg. 2013;17:1980–983.
  40. Profeta AC, Schilling C, McGurk M. Augmented reality visualization in head and neck surgery: an overview of recent findings in sentinel node biopsy and future perspectives. Br J Oral Maxillofac Surg. 2016;54:694–96.
  41. Tagaya N, Aoyagi H, Nakagawa A, Abe A, Iwasaki Y, Tachibana M, Kubota K. A novel approach for sentinel lymph node identification using fluorescence imaging and image overlay navigation surgery in patients with breast cancer. World J Surg. 2011;35:154–58.
  42. Kim Y, Kim H, Kim YO. Virtual reality and augmented reality in plastic surgery: a review. Arch Plast Surg. 2017;44:179–87.
  43. Hummelink S, Verhulst AC, Maal TJJ, Hoogeveen YL, Schultze Kool LJ, Ulrich DJO. An innovative method of planning and displaying flap volume in DIEP flap breast reconstructions. J Plast Reconstr Aesthet Surg. 2017;70:871–75.
  44. Hummelink S, Schultze Kool LJ, Ulrich DJ. Displaying inguinal lymph nodes before transplantation in a deep inferior epigastric perforator flap breast reconstruction using an innovative projection method. J Plast Reconstr Aesthet Surg. 2016;69:376–80.
  45. Hummelink S, Hameeteman M, Hoogeveen Y, Slump CH, Ulrich DJ, Schultze Kool LJ. Preliminary results using a newly developed projection method to visualize vascular anatomy prior to DIEP flap breast reconstruction. J Plast Reconstr Aesthet Surg. 2015; 68(3):390–394.
  46. Sotsuka Y, Matsuda K, Fujita K, Fujiwara T, Kakibuchi M. Image overlay of deep inferior epigastric artery in breast reconstruction. Plast Reconstr Surg Glob Open. 2014;2:e235.
  47. Jiang T, Zhu M, Zan T, Gu B, Li Q. A novel augmented reality-based navigation system in perforator flap transplantation: a feasibility study. Ann Plast Surg. 2017;79:192–96.
  48. Gan Q, Wang D, Ye J, Zhang Z, Wang X, Hu C, Shao P, Zu RX. Benchtop and animal validation of a projective imaging system for potential use in intraoperative surgical guidance. PLoS One. 2016;11:e0157794.
  49. Mela CA, Papay FA, Liu Y. Intraoperative fluorescence imaging and multimodal surgical navigation using goggle system. Methods Mol Biol. 2016;1444:85–95.
  50. Liu Y, Bauer AQ, Akers WJ, Sudlow G, Liang K, Shen D, Berezin MY, Culver JP, Achilefu S. Hands-free, wireless goggles for near-infrared fluorescence and real-time image-guided surgery. Surgery. 2011;149:689–98.
  51. Rahman OF, Nahabedian MY, Sinkin JC. Augmented reality and wearable technology in image-guided navigation and preoperative planning. Plast Reconstr Surg Glob Open. 2016;4:e1057.
  52. Peregrin T. Surgeons see future applications for Google Glass. Bull Am Coll Surg. 2014;99:9–16.
  53. Berger AJ, Gaster RS, Lee GK. Development of an affordable system for personalized video-documented surgical skill analysis for surgical residency training. Ann Plast Surg. 2013;70:442–46.
  54. Sinkin JC, Rahman OF, Nahabedian MY. Google Glass in the operating room: the plastic surgeon’s perspective. Plast Reconstr Surg. 2016;138:298–302.
  55. Bilton N. Why Google Glass broke [Internet]. New York: The New York Times. 4 February 2015 [Cited 12 February 2019]. Available from: https://www.nytimes.com/2015/02/05/style/why-google-glass-broke.html.
  56. Vohl D, Barnes DG, Fluke CJ, Poudel G, Georgiou-Karistianis N, Hassan AH, Benovitski Y, Wong TH, Kaluza OL, Nguyen TD, Bonnington CP. Large-scale comparative visualisation of sets of multidimensional data. PeerJ Comp Sci. 2016;2:e88.
  57. Hackett M. Medical holography for basic anatomy training. In: Proceedings of the Interservice/Industry Training, Simulation, and Education Conference; 2013 Dec 2–6; Orlando, Florida, USA. Arlington, Virginia, USA: I/ITSEC 2013: Paper No 13207.
  58. Makino Y, Furuyama Y, Inoue S, Shinoda H. HaptoClone (Haptic-Optical Clone) for mutual tele-environment by real-time 3D image transfer with midair force feedback. In: Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems; 2016 May 7–12; San Jose, California, USA. New York, New York: ACM 2016: 1980–1990.
  59. Ghahramani Z. Probabilistic machine learning and artificial intelligence. Nature. 2015;521:452–59.
  60. Obermeyer Z, Emanuel EJ. Predicting the future: big data, machine learning, and clinical medicine. N Engl J Med. 2016;375:1216–219.
  61. Jordan MI, Mitchell TM. Machine learning: trends, perspectives, and prospects. Science. 2015;349:255–60.
  62. Metz C. AI is transforming Google search. The rest of the web is next [Internet]. San Francisco, California, United States: Wired, 2 April 2016 [Cited 12 February 2019]. Available from: https://www.wired.com/2016/02/ai-is-changing-the-technology-behind-google-searches/.
  63. Siri Team. Deep learning for Siri’s voice: on-device deep mixture density networks for hybrid unit selection synthesis. Apple Machine Learning J. 2017;1(4).
  64. Murdoch TB, Detsky AS. The inevitable application of big data to health care. JAMA. 2013;309:1351–352.
  65. Meyfroidt G, Guiza F, Ramon J, Bruynooghe M. Machine learning techniques to examine large patient databases. Best Pract Res Clin Anaesthesiol. 2009;23:127–43.
  66. Deo RC. Machine learning in medicine. Circulation. 2015;132:1920–930.
  67. Furnkranz J, Gamberger D, Lavrac N. Foundations of rule learning. Heidelberg, Germany: Springer, 2012.
  68. Waschkowski F, Hesse S, Rieck AC, Lohmann T, Brockmann C, Laube T, Bornfeld N, Thumann G, Walter P, Mokwa W, Johnen S, Roessler G. Development of very large electrode arrays for epiretinal stimulation (VLARS). Biomed Eng Online. 2014;13(1):11.
  69. Imhoff M, Kuhls S. Alarm algorithms in critical care monitoring. Anesth Analg. 2006;102:1525–537.
  70. Kanevsky J, Corban J, Gaster R, Kanevsky A, Lin S, Gilardino M. Big data and machine learning in plastic surgery: a new frontier in surgical innovation. Plast Reconstr Surg. 2016;137:890e–97e.
  71. Safran T, Viezel-Mathieu A, Corban J, Kanevsky A, Thibaudeau S, Kanevsky J. Machine learning and melanoma: the future of screening. J Am Acad Dermatol. 2018; 78(3):620–21.
  72. Rajpara SM, Botello AP, Townend J, Ormerod AD. Systematic review of dermoscopy and digital dermoscopy/artificial intelligence for the diagnosis of melanoma. Br J Dermatol. 2009;161:591–604.
  73. Mendoza CS, Safdar N, Okada K, Myers E, Rogers GF, Linguraru MG. Personalized assessment of craniosynostosis via statistical shape modeling. Med Image Anal. 2014;18:635–46.
  74. Yeong EK, Hsiao TC, Chiang HK, Lin CW. Prediction of burn healing time using artificial neural networks and reflectance spectrometer. Burns. 2005;31:415–20.
  75. Kiranantawat K, Sitpahul N, Taeprasartsit P, Constantinides J, Kruavit A, Srimuninnimit V, Punyahotra N, Chatdokmaiprai C, Numhom S. The first smartphone application for microsurgery monitoring: SilpaRamanitor. Plast Reconstr Surg. 2014;134:130–39.
  76. Conforth M, Meng Y, Valmikinathan C, Xiaojun Y. Nerve graft selection for peripheral nerve regeneration using neural networks trained by a hybrid ACO/PSO method. In: Proceedings of the 6th Annual IEEE Symposium on Computational Intelligence in Bioinformatics and Computational Biology; 2009 30 Mar–2 Apr; Nashville, Tennessee, USA: CIBCB Press 2009:208–14.
  77. Gunes H, Piccardi M. Assessing facial beauty through proportion analysis by image processing and supervised learning. Int J Hum-Comput St. 2006;64:1184–199.
  78. Chae MP, Hunter-Smith DJ, Chung RD, Smith JA, Rozen WM. 3D-printed, patient-specific, DIEP template for preoperative planning autologous breast reconstruction: prospective case series in 20 patients. Microsurgery. Forthcoming 2019.
  79. Chae MP, Hunter-Smith DJ, Rostek M, Smith JA, Rozen WM. Enhanced Preoperative deep inferior epigastric artery perforator flap planning with a 3D-printed perforasome template: technique and case report. Plast Reconstr Surg Glob Open. 2018;6:e1644.
  80. Chae MP, Ganhewa D, Hunter-Smith DJ, Rozen WM. Direct augmented reality computed tomographic angiography technique (ARC): an innovation in preoperative imaging. Eur J Plast Surg. 2018;41:415–20.