Introduction
Since the advent of free vascularised tissue transfer approximately 40 years ago,1 significant refinements, particularly in the preoperative planning phase, have improved patient outcomes.2 Techniques to identify perforators have evolved from handheld Doppler ultrasound3,4 to computed tomographic angiography (CTA) and magnetic resonance angiography (MRA). Accurate identification is important because of the high degree of anatomical variation and the intricacy of the surgical dissection required. Individual patient anatomy may vary because of differences in development and subsequent changes from previous surgery and scarring. Methods of preoperative perforator mapping have been developed to account for such variations.5,6 This approach reduces operative time and the risk of unanticipated anatomy that may compromise the perforator flap, thereby increasing patient safety and the success of free tissue transfer.
Current imaging modalities are limited by being displayed on two-dimensional platforms, such as a computer screen or X-ray film. Clinicians have therefore sought novel ways of using CTA or MRA data to improve preoperative planning and surgical efficiency, such as augmented reality (AR), in which additional data are overlaid onto the patient to help trace the path of vessels through various tissues more accurately.
Given the rapidly evolving use of AR to identify and dissect perforators in pedicled and free flap reconstruction, we conducted a systematic review of the literature to determine its usefulness in flap surgery.
Methods
This manuscript critically appraises the current literature on the use of AR to assist surgeons in surgical planning and dissection, in accordance with PRISMA guidelines (Figure 1). A systematic search of the EMBASE and PubMed® bibliographic databases from January 2000 to May 2021 was performed by the primary author to identify all English-language literature using the keywords: ‘flap’ and ‘mixed reality’, ‘flap’ and ‘augmented reality’, ‘flap’ and ‘virtual reality’, ‘flap’ and ‘holographic/hologram’. We also identified relevant articles through bibliographic linkage.
To be included in this systematic review, the augmentation technology must have been used to identify perforating vessels by overlaying data onto tissues. Articles were excluded if they did not address the above inclusion criteria, were conference abstracts, or were previous systematic reviews.
A total of 103 articles were identified. Duplicate articles were removed prior to review. Four articles were identified from bibliographic linkages. Two reviewers independently screened the titles for topic relevance, and the abstract of each citation was analysed to identify manuscripts suitable for review. After discussion, the full text of concordant papers was screened for review by RP and WR. Any disagreements were resolved by discussion.
As most articles did not report on standardised parameters such as time to perforator identification or dissection time, no statistical analysis was performed. Rather, qualitative aggregation of data was performed.
Results
Following article appraisal, eleven articles were included in the systematic review (Table 1). All articles were single institution case reports or series.
Method of augmented reality overlay
Five of the eleven papers used augmented reality headwear to overlay data onto the patient.7,12,14,16,17 Other methods of projection included a portable projector mounted on a fixed stand and aligned with anatomical landmarks for preoperative marking,9–11 and the use of a smartphone to overlay flap data onto the patient.8,13 One author reported that it took approximately 10 minutes from setting up the equipment to mapping out the perforators.15
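To make the overlay step concrete, the sketch below is a minimal, hypothetical example in Python with OpenCV, not drawn from any of the included studies, showing how a two-dimensional perforator map could be warped onto a camera image once corresponding anatomical landmarks have been marked in both views; all coordinates and file names are placeholders.

```python
# Minimal, hypothetical sketch: warping a 2D perforator map onto a camera
# image using landmark correspondences (coordinates and file names are
# placeholders, not data from any included study).
import cv2
import numpy as np

# Landmark positions marked on the rendered perforator map (pixels)
map_points = np.array([[120, 80], [560, 90], [540, 410], [130, 400]], dtype=np.float32)
# The same anatomical landmarks as located in the camera image (pixels)
image_points = np.array([[210, 150], [610, 170], [590, 470], [220, 450]], dtype=np.float32)

# Planar homography relating the map to the camera view
H, _ = cv2.findHomography(map_points, image_points)

perforator_map = cv2.imread("perforator_map.png")   # hypothetical rendered map
camera_image = cv2.imread("camera_image.png")       # hypothetical photograph

# Warp the map into the camera view and blend it over the photograph
warped = cv2.warpPerspective(perforator_map, H,
                             (camera_image.shape[1], camera_image.shape[0]))
overlay = cv2.addWeighted(camera_image, 0.7, warped, 0.3, 0)
cv2.imwrite("overlay.png", overlay)
```

In practice the landmark correspondences would come from fiducial markers or palpable bony landmarks rather than hard-coded coordinates, but the alignment step is essentially the same planar registration.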
Flap harvest time
Two studies analysed perforator identification time using augmented versus non-augmented reality methods.11,13 Pereira and colleagues used a smartphone to superimpose pedicle data onto the camera image, aligned over anatomical landmarks, and compared this with handheld Doppler and intraoperative findings, noting a 20 per cent reduction in flap harvest time, on average from 90 minutes to 72 minutes.13 Hummelink and colleagues used a proprietary device that aligned the image using markers placed on the patient corresponding to anatomical landmarks. In a randomised controlled trial with Doppler ultrasound planning as the comparator, they reported that deep inferior epigastric perforator (DIEP) flap harvest was on average 19 minutes (12%) faster than with preoperative ultrasound mapping (136 minutes versus 155 minutes respectively).11
Perforator identification accuracy and time
Several studies compared the accuracy of, and time taken for, perforator identification. Cifuentes and colleagues reported that perforators identified through AR using thermography correlated with Doppler sound signals, with a median time to perforator identification of 3.5 minutes.8 Hummelink and colleagues demonstrated a projection-based AR technique to identify perforators using a handheld projector and proprietary software in two case series.9,10 In the first series, comparing AR with Doppler ultrasound in DIEP flap reconstruction,9 AR identified a greater number of perforators than Doppler ultrasound, and with higher accuracy (84.3% versus 56.9%, p = 0.03). In their second series,10 using similar methods, the authors identified a corresponding Doppler signal at 41 of 42 projected perforator locations. Nuri and colleagues reported that their AR-identified perforator corresponded with its actual location, which was 1 cm from where Doppler ultrasound had localised it, although this was in only one patient.16 Comparing AR-localised perforators with intraoperative findings, Pereira and colleagues demonstrated 100 per cent correlation.13 Jiang and colleagues developed and tested a proof-of-concept AR system in dogs, determining a system accuracy of 3.5 mm and finding it useful for raising thoracodorsal artery perforator flaps.12 More recently, Wesselius and colleagues demonstrated a technique in which the AR system compensated for patient position in real time using quick response (QR) markers attached to the patient.17
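As an illustration of the marker-based realignment principle described by Wesselius and colleagues, the sketch below is a minimal, hypothetical example only, assuming a single printed QR marker and OpenCV's built-in QR detector; it is not the authors' published implementation. The overlay transform is re-estimated on every camera frame from the detected marker corners.

```python
# Minimal, hypothetical sketch: re-aligning a pre-registered perforator map on
# every camera frame using a QR marker attached to the patient (placeholder
# camera index, file name and calibration corners; not the published method).
import cv2
import numpy as np

detector = cv2.QRCodeDetector()

# Marker corner positions recorded when the map was first aligned (calibration)
reference_corners = np.array([[100, 100], [200, 100], [200, 200], [100, 200]],
                             dtype=np.float32)
perforator_map = cv2.imread("aligned_map.png")  # map aligned to the calibration frame
capture = cv2.VideoCapture(0)

while True:
    ok, frame = capture.read()
    if not ok:
        break
    found, corners = detector.detect(frame)      # locate the four marker corners
    if found and corners is not None:
        # Homography from the calibration pose to the current patient pose
        H, _ = cv2.findHomography(reference_corners, corners.reshape(-1, 2))
        if H is not None:
            warped = cv2.warpPerspective(
                perforator_map, H, (frame.shape[1], frame.shape[0]))
            frame = cv2.addWeighted(frame, 0.7, warped, 0.3, 0)
    cv2.imshow("AR overlay", frame)
    if cv2.waitKey(1) == 27:                     # Esc to quit
        break

capture.release()
cv2.destroyAllWindows()
```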
Surgeon feedback
The use of AR in operations performed in theatre was recently demonstrated in a case series by Pratt and colleagues,14 in which AR aided the surgeon in identifying and dissecting vessels and navigating anatomy when harvesting both pedicled and free flaps to repair lower limb tissue defects. In all cases in this series, the surgeon noted no discrepancy between the AR projection of the vasculature and its actual surgical position, and preferred AR over ultrasound for perforator identification.
Discussion
Pioneered by Boeing scientists Caudell and Mizell,18 AR is a technology that augments the user’s visual field with superimposed real-time images, displayed either directly on the object in real life, known as the projection method, or indirectly on a portable device such as a head-mounted display (HMD) or a smartphone.19 Its use has been well documented in the literature. Examples include calibrating stereotactic instruments in neurosurgery,20 planning21 and fashioning craniofacial implants in maxillofacial surgery,22,23 enhancing visualisation in laparoscopic surgery24 and identifying sentinel lymph nodes in breast cancer surgery.25 In plastic surgery, it has previously appeared most broadly useful for preoperative planning, intraoperative navigation and surgical training.26 As proofs of concept, several authors, including our group, have demonstrated the usefulness of AR on living tissue7,27 to assist surgical planning (Figure 2a–d).
Given the high resolution provided by CTA, it is logical that AR would identify more perforators in a shorter period of time, and with a higher degree of accuracy, than conventional techniques such as handheld Doppler ultrasound, thereby reducing the operative time associated with perforator dissection. Although one study reported that setting up the AR projection was fast, at approximately 10 minutes,15 no study analysed the difference in set-up and perforator identification time between AR and conventional Doppler ultrasound. Any additional set-up time may negate the benefit of the time saved during flap harvest using AR, which was faster by 12 per cent11 and 20 per cent13 when compared with other methods, with similar complication rates between the technologies used.
While several articles used open-source software, several authors used proprietary software for data analysis and rendering, and no article analysed the cost of AR technology. Proprietary software coupled with novel hardware could impose a significant cost burden on clinicians who choose to adopt such technologies, and may be a barrier to the uptake of AR as an adjunct for flap reconstruction.
All of the articles analysed presented, to varying degrees, a method of data acquisition, analysis and rendering. However, only one article addressed how to overcome the limitation of being able to project AR onto static tissue only. As identified by Hummelink and colleagues, a major limitation of image projection relates to the operator, since the projector must be held steadily above the patient at the correct height, without movement, to give an accurate image of the anatomy.9 Even where the projector or patient was in a fixed position, no software was able, in real time, to automatically segment raw data and perform volumetric rendering for model alignment that corrects for tissue deformation, such as when tissue is pressed on during dissection. Of note, 3D visual technologies are being used anecdotally in other surgical specialities, such as orthopaedics, and while high-level evidence is lacking in these fields, there is increasing interest in their usefulness to aid the surgeon.
Another limitation of projector-based AR not addressed by the articles is interference from ambient lighting. Projectors in general produce only several hundred lumens of luminous flux. Even when projected onto an area of less than one square metre, this translates to a low illuminance compared with theatre lighting, which has been reported at anywhere from 1,000 to 100,000 lux.28 This is a significant impediment, as it means that, to achieve adequate visualisation, the lights in the operating theatre must be significantly dimmed or turned off for the projection to be visible, which may disrupt theatre flow and compromise efficiency.8
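As a worked illustration (assuming, hypothetically, a 300-lumen projector and a 0.5 m² projection area, neither figure drawn from the included articles), illuminance $E$ is luminous flux $\Phi$ divided by area $A$:

$$E = \frac{\Phi}{A} = \frac{300\ \text{lm}}{0.5\ \text{m}^2} = 600\ \text{lx}$$

which falls below even the lower end of the 1,000 to 100,000 lux reported for theatre lighting.28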
Beyond perforator mapping with AR technology, Hummelink and colleagues, in their second series of six patients,10 also performed volumetric analysis of the breast using 3D scanning to calculate the flap volume required and hence project the ideal flap design onto the patient, an approach not previously explored in the literature. Although novel and encouraging, current 3D surface imaging technologies have a reported error rate of 13–16 per cent,29 and such techniques should therefore be used with caution, as flaps may be undersized relative to what is required.
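To illustrate the magnitude of this error (assuming, hypothetically, a planned flap volume of 500 ml, a figure not drawn from the included studies):

$$0.13 \times 500\ \text{ml} = 65\ \text{ml} \quad\text{to}\quad 0.16 \times 500\ \text{ml} = 80\ \text{ml}$$

a discrepancy large enough to leave the harvested flap meaningfully undersized.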
Conclusion
Advances in the computer modelling of imaging data have given holographic projection and AR great potential to influence the future of surgery. While the literature demonstrates that established methods of data rendering and projection onto patients can achieve true AR, there is a lack of objective outcome data demonstrating its usefulness in assisting surgeons compared with previous methods of perforator identification or dissection. This lack of evidence, combined with the absence of a cost analysis, is the main obstruction to this technology being more widely adopted.
Patient consent
Patients/guardians have given informed consent to the publication of images and/or data.
Disclosure
The authors have no financial or commercial conflicts of interest to disclose.
Funding declaration
The authors received no financial support for the research, authorship, and/or publication of this article.
Revised: 2 February 2021; 25 May 2021