
A Platform for Tracking Surgeon- and Observer-Gaze as a Surrogate for Attention in Ophthalmic Surgery

  • Rogerio G. Nespolo
    Affiliations
    Department of Ophthalmology and Visual Sciences - Illinois Eye and Ear Infirmary, University of Illinois Chicago, Chicago, IL 60612, USA

    Richard and Loan Hill Department of Biomedical Engineering, University of Illinois Chicago, Chicago, IL 60612, USA
  • Emily Cole
    Affiliations
    Richard and Loan Hill Department of Biomedical Engineering, University of Illinois Chicago, Chicago, IL 60612, USA
  • Daniel Wang
    Affiliations
    Richard and Loan Hill Department of Biomedical Engineering, University of Illinois Chicago, Chicago, IL 60612, USA
  • Darvin Yi
    Correspondence
    These authors contributed equally to the manuscript and research described herein as corresponding authors.
    Affiliations
    Department of Ophthalmology and Visual Sciences - Illinois Eye and Ear Infirmary, University of Illinois Chicago, Chicago, IL 60612, USA

    Richard and Loan Hill Department of Biomedical Engineering, University of Illinois Chicago, Chicago, IL 60612, USA
  • Yannek I. Leiderman
    Correspondence
    These authors contributed equally to the manuscript and research described herein as corresponding authors.
    Affiliations
    Department of Ophthalmology and Visual Sciences - Illinois Eye and Ear Infirmary, University of Illinois Chicago, Chicago, IL 60612, USA

    Richard and Loan Hill Department of Biomedical Engineering, University of Illinois Chicago, Chicago, IL 60612, USA
Open Access | Published: November 07, 2022 | DOI: https://doi.org/10.1016/j.xops.2022.100246

      Abstract

      Purpose

      To develop and validate a platform that can extract eye gaze metrics from surgeons observing cataract and vitreoretinal procedures, and to enable post-hoc data analysis to assess potential discrepancies in eye movement behavior according to surgeon experience.

      Design

      Experimental, prospective, single-center study.

      Participants

      Eleven (11) ophthalmic surgeons observing deidentified vitreoretinal and cataract surgical procedures performed at a single university-based medical center.

      Methods

      An open-source platform was developed to extract gaze coordinates and metrics from ophthalmic surgeons via a computer vision algorithm in conjunction with a neural network to track and segment instruments and tissues, identifying areas of attention in the visual field of study subjects. Eleven surgeons provided validation data by watching videos of six heterogeneous vitreoretinal and cataract surgical phases.

      Main Outcome Measures

      Accuracy and distance traveled by the eye gaze of participants and overlap of the participants’ eye gaze with instruments and tissues while observing surgical procedures.

      Results

The platform demonstrated repeatability of >94% when acquiring the eye gaze behavior of subjects. Attending ophthalmic surgeons and clinical fellows exhibited a lower overall cartesian distance traveled than resident physicians in ophthalmology (P < 0.02). Ophthalmology residents and clinical fellows exhibited more fixations on the display area where surgical device parameters were superimposed than attending surgeons (P < 0.05). There was a trend toward greater gaze overlap with the instrument tooltip location among resident physicians compared with attending surgeons and fellows (41.42% vs 34.8%, P > 0.2). The number and duration of fixations did not vary substantially among groups (P > 0.3).

      Conclusions

      The platform proved effective in extracting gaze metrics of ophthalmic surgeons. These preliminary data suggest that surgeon gaze behavior differs according to experience.


      Abbreviations and acronyms:

      AOI (Area of Interest), OR (Operating Room), CNN (Convolutional Neural Network), AUPR (Area Under the Precision-Recall Curve)

      Introduction

Surgical procedures are dynamic events comprising complex visual scenes that require the surgeon to observe evolving anatomy, tissue-instrument interactions, and the effects of surgical instrumentation, while anticipating subsequent events. When performing or observing a surgical procedure, surgeons may exhibit divergent visuospatial attention and awareness corresponding to each subject's experience, specialization, training, or environment (Liu et al.; Atkins et al.; Law et al.; Cai et al.).
Gaze tracking has been employed in diverse disciplines, including autonomous driving, learning and knowledge transfer, surgery, and other activities that rely on visual processing, with the goal of better understanding how humans perform such tasks by systematically assessing gaze as a surrogate for attention (Orquin et al.; Vinuela-Navarro et al.; Emhardt et al.; Papesh et al.; Merali et al.; Richstone et al.; Pugh et al.; Braunagel et al.; Aksum et al.).

Eye movement studies have yielded significant findings on human behavior in healthcare: prior work characterized differences in eye-movement patterns among radiologists, surgeons, and nurses, and the resulting data improved training in diagnostics, patient interaction, and intraoperative situational awareness by informing novice healthcare providers of the areas of interest (AOIs) of experienced professionals (Liu et al.; Khan et al.; Bottorff). Few studies in ophthalmology have focused on gaze tracking in clinical care, and most have assessed the interpretation of static images used for diagnostic purposes. Shahimin et al. analyzed differences in eye movements between ophthalmologists and optometrists assigning diagnoses to digital fundus photographs, using fixed AOIs such as the optic disc, macula, and blood vessels. Shirley et al. found that trainees could improve their search strategy after analyzing gaze data from expert clinicians viewing retinal images. The extraction of metrics and features in eye-tracking research during surgery remains limited: contemporary platforms do not integrate with relevant analysis tools or accommodate heterogeneous methods of gaze acquisition, and commercially available solutions require costly equipment and licensed software. An open-source platform for intraoperative gaze analysis has the potential to facilitate research into the complex and dynamic environment of ophthalmic microsurgery.
In this work, we developed a novel open-source platform that processes eye-tracking data from a diverse set of eye-tracker devices, extracting standard gaze metrics together with dynamically generated AOIs during vitreoretinal procedures via deep learning neural networks. To validate the platform, ophthalmic surgeons observed critical or complex elements of cataract and vitreoretinal procedures. By linking gaze coordinates with detected AOIs, including surgical instrumentation and intraocular anatomic landmarks, we aim to establish a platform for investigating visual attention during ophthalmic surgery by generating semantically rich feature vectors for real-time or post hoc analysis of complex intraoperative visual scenes.

      Methods

      Acquisition of gaze coordinates

The data flow of the proposed platform for the acquisition and feature extraction of eye gaze data is displayed in Figure 1. The platform is designed to be compatible with eye trackers that provide gaze coordinates in real time, such as eye-tracker bars, glasses, and webcam-based gaze tracking (Figure 1a). Constraints related to display size, resolution, and distance of the subject from the screen depend solely on the technical restrictions of the hardware employed.
Figure 1. Method for attention tracking during ophthalmic microsurgery using gaze tracking: (a) gaze tracking is performed using any of a variety of commercially available solutions; (b) detection and segmentation of the instrument tooltip and optic disc via an instance segmentation neural network, integrating spatial data with gaze coordinates; (c) automated extraction of advanced features from gaze data allows for analysis of attention during ophthalmic surgery.
An eye tracker [Tobii 4C Eye Tracker, Tobii AB, Sweden] was attached to a portable computer (system configuration details in supplemental material section 1, available at www.ophthalmologyscience.org) displaying video from intraocular surgical procedures, and gaze tracking was performed in order to measure and analyze visual attention. Screen recorder software [OBS Studio, OBS Project] acquired the gaze location frame by frame based on a trackable bubble overlay [Tobii Experience and Tobii Ghost, Tobii AB, Sweden] at a frequency of 60 Hz (eFigure 1, available at www.ophthalmologyscience.org). The user did not have access to the spatiotemporal data, and the trackable bubble overlay was recorded in a separate file. A computer vision algorithm was developed to extract the cartesian coordinates of the bubble from the recorded video, employing the color threshold and blob detection methods of the OpenCV computer vision library. Supplemental material section 1 details the extraction and calibration of the eye-gaze acquisition method, and the code can be accessed in supplemental material section 2 (available at www.ophthalmologyscience.org).
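As a rough illustration of this extraction step, the sketch below recovers per-frame bubble coordinates via OpenCV color thresholding and blob detection. It is a minimal sketch rather than the platform's released code: the function name and HSV bounds are illustrative, and it assumes the overlay color does not otherwise appear in the surgical scene.

```python
import cv2
import numpy as np

def extract_gaze_coordinates(video_path, hsv_lo=(90, 180, 180), hsv_hi=(110, 255, 255)):
    """Recover per-frame gaze coordinates from a screen recording containing
    a colored gaze-bubble overlay. hsv_lo/hsv_hi bound the overlay color in
    HSV space and must be tuned to the overlay actually rendered."""
    cap = cv2.VideoCapture(video_path)
    coords = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, np.array(hsv_lo), np.array(hsv_hi))
        # Treat the largest blob of overlay-colored pixels as the bubble.
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        blob = max(contours, key=cv2.contourArea) if contours else None
        m = cv2.moments(blob) if blob is not None else {"m00": 0}
        if m["m00"] > 0:
            coords.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
        else:
            coords.append((np.nan, np.nan))  # bubble absent (blink or tracking loss)
    cap.release()
    return np.array(coords)  # one (x, y) pixel pair per frame
```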

      Identification of dynamic Area of Interest (AOI) with instance segmentation via deep learning

As a proof of concept for generating dynamic spatial data of surgical elements, a deep learning instance segmentation model based on YOLACT (Bolya et al.) was trained to identify three surgical instruments and one retinal landmark: the vitrector, intraocular microvitrectomy forceps, and endolaser tooltips, together with the optic disc (eFigure 2; code and implementation details in supplemental material sections 2 and 3, available at www.ophthalmologyscience.org). The data used to train the model consisted of frames extracted from 100 pars plana vitrectomy surgeries in which the instrument tooltips and optic disc were annotated. Supplemental material section 3 and eFigure 3 describe the model training and evaluation (available at www.ophthalmologyscience.org).
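To show how a segmentation output can become a dynamic AOI, the model-agnostic sketch below reduces a per-frame binary instance mask to a centroid and tests gaze overlap against it. It assumes only a NumPy mask; the 130 px radius reflects the 3° visual angle criterion described for this setup later in the methods.

```python
import numpy as np

def aoi_centroid(mask):
    """Centroid (x, y) of a binary instance mask (H x W array), or None
    if the AOI is not present in the frame."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

def gaze_in_aoi(gaze_xy, centroid_xy, radius_px=130):
    """True when the gaze point lies within radius_px of the AOI centroid;
    130 px corresponds to 3 degrees of visual angle for this display setup."""
    if centroid_xy is None:
        return False
    return np.hypot(gaze_xy[0] - centroid_xy[0],
                    gaze_xy[1] - centroid_xy[1]) <= radius_px
```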

Advanced feature extraction

An open-source algorithm that evaluates gaze metrics based on the velocity of eye movements was employed to detect saccades and fixations (saccade/fixation detection in R; code implementation in supplemental material section 2, available at www.ophthalmologyscience.org). The algorithm builds on the saccade detection method proposed by Engbert and Kliegl and was configured to process gaze data acquired at a sampling rate of 60 Hz. It defines a fixation as the period between two detected saccades; if the spatial distance between saccades is near zero, the movement is classified as a blink or artifact. The beginning of each saccade is determined when the eye movement velocity exceeds a threshold of 20°/s. Involuntary microsaccadic movements, identified by amplitudes of up to 120 arcminutes, are discarded.
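The platform uses an existing R implementation for this step; purely as an illustration of the velocity-threshold idea under the stated 60 Hz, 20°/s, and 120-arcminute parameters, a Python sketch (with function and variable names of our choosing) might look like the following.

```python
import numpy as np

def detect_saccades(x_deg, y_deg, fs=60.0, vel_thresh=20.0, micro_amp_deg=2.0):
    """Velocity-threshold saccade labeling in the spirit of Engbert & Kliegl.
    x_deg, y_deg: gaze position in degrees of visual angle, sampled at fs Hz.
    Returns (start, end) sample indices of detected saccades; fixations are
    the intervals between them. micro_amp_deg = 2 deg corresponds to the
    120-arcminute microsaccade cutoff."""
    vx = np.gradient(x_deg) * fs
    vy = np.gradient(y_deg) * fs
    fast = np.hypot(vx, vy) > vel_thresh
    saccades = []
    start = None
    for i, f in enumerate(np.append(fast, False)):  # sentinel closes a trailing run
        if f and start is None:
            start = i
        elif not f and start is not None:
            amp = np.hypot(x_deg[i - 1] - x_deg[start], y_deg[i - 1] - y_deg[start])
            if amp > micro_amp_deg:  # drop microsaccades and noise
                saccades.append((start, i - 1))
            start = None
    return saccades
```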
To identify smooth pursuits that attend surgical instruments, we based our algorithm on a set of eye movement parameters established by previous research (Larsson et al.). Smooth pursuits can be distinguished from saccadic movements by the relationship between gaze and target: smooth pursuits occur in response to the velocity of the target, whereas saccades are generated in reaction to the spatial translation of the target itself (Fuchs). In this study, the surgeon's gaze must lie within the AOI (instrument tooltip), defined as a distance of 3° or less of visual angle between the gaze point and the cartesian coordinates of the tooltip. The algorithm then compares the velocities of the instrument tooltip and the surgeon's gaze: smooth pursuit of the instrument is detected if both eye and object move at an average speed between 0.35°/s (15 px/s) and 100°/s (433 px/s). We based these parameters on the work of Larsson et al., who identified an upper limit of 100°/s for smooth pursuit in humans; because the literature establishes no lower boundary for smooth pursuit velocity, we chose 0.35°/s to minimize noise during data analysis.
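A minimal sketch of these pursuit criteria, assuming per-frame gaze and tooltip pixel coordinates and a conversion factor of roughly 43.3 px per degree (130 px = 3° for this setup); windowed averaging of the speeds is omitted for brevity.

```python
import numpy as np

def detect_smooth_pursuit(gaze, tooltip, fs=60.0, px_per_deg=43.3,
                          lo=0.35, hi=100.0, max_dist_deg=3.0):
    """Flag frames as candidate smooth pursuit of the instrument tooltip.
    gaze, tooltip: (N, 2) float arrays of pixel coordinates per frame.
    Criteria follow the text: gaze within max_dist_deg of the tooltip, and
    both gaze and tooltip moving between lo and hi deg/s."""
    gaze_speed = np.hypot(*np.gradient(gaze, axis=0).T) * fs / px_per_deg
    tip_speed = np.hypot(*np.gradient(tooltip, axis=0).T) * fs / px_per_deg
    dist_deg = np.hypot(*(gaze - tooltip).T) / px_per_deg
    return ((dist_deg <= max_dist_deg)
            & (gaze_speed > lo) & (gaze_speed < hi)
            & (tip_speed > lo) & (tip_speed < hi))
```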
The platform outputs a feature vector that includes the distance traveled by the observer's gaze and the magnitudes of the velocity and acceleration vectors, computed as moving averages over 100-ms subsets. It also reports the distance to the centroid of the instrument tooltip, the distance to anatomic elements such as the optic disc, and detected fixation and smooth pursuit eye movements (Figure 1c).
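A sketch of how such a feature vector could be assembled with Pandas; the 6-sample window is our rounding of 100 ms at 60 Hz, and the synthetic gaze trace stands in for real platform output.

```python
import numpy as np
import pandas as pd

# Illustrative gaze trace (pixels) at 60 Hz; replace with platform output.
rng = np.random.default_rng(0)
gaze = np.cumsum(rng.normal(0, 2, size=(600, 2)), axis=0) + 960

fs = 60.0
df = pd.DataFrame(gaze, columns=["x", "y"])
step = np.hypot(df["x"].diff(), df["y"].diff())   # per-frame displacement (px)
df["velocity"] = step * fs                        # px/s
df["acceleration"] = df["velocity"].diff() * fs   # px/s^2
# 100 ms at 60 Hz spans ~6 samples (assumed rounding of the paper's window).
features = df[["velocity", "acceleration"]].rolling(6, min_periods=1).mean()
features["distance_traveled"] = step.cumsum()
```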

      Validating the platform by extracting features from subjects

The acquisition of gaze data from ophthalmic surgeons observing intraocular surgery was approved by the University of Illinois at Chicago Institutional Review Board. To extract the cartesian coordinates of participants' gaze, the eye tracker (operating distance between 50 cm and 95 cm) was interfaced with a 15.6-inch video display at a resolution of 1920x1080 pixels (additional technical details of the display hardware may be found in supplemental material section 1, available at www.ophthalmologyscience.org). The viewing distance was fixed at 65 cm from the display, and surgeons were asked to use optical correction appropriate to their needs for a computer monitor at this distance. The surgical videos employed for gaze data acquisition consisted of six video segments of comparable duration, one for each of the following surgical phases: for cataract surgery, uncomplicated capsulorrhexis, nuclear disassembly and phacoemulsification, and cortex removal (from three different cataract procedures); for pars plana vitrectomy, core vitrectomy, membrane peeling, and endolaser application (from a set of three vitreoretinal procedures). The durations and sample frames from each video clip are available in supplemental material section 4, eTable 2, and eFigure 4 (available at www.ophthalmologyscience.org). Validation was performed by eleven ophthalmic surgeons grouped into three distinct levels of experience (Table 1).
Table 1. Distribution of participants

Position | Average years of experience (SD) | No. of participants | Subspecialties
Attending physician | 24.83 (23.82) | 3 | 2 Retina; 1 Cornea
Clinical fellow | 4.8 (1.82) | 5 | 4 Retina; 1 Comprehensive
Resident physician | 1 (0) | 3 | 1 Comprehensive; 1 Not applicable

      Statistical Analysis

Statistical analysis of the generated data was performed using Pandas and scikit-learn, open-source libraries for data manipulation and analysis. Descriptive statistics (mean ± standard deviation [SD]) were calculated for the extracted features. The level of statistical significance was set at P < 0.05 using a two-tailed Student t test for each pair of surgeon groups (eTable 3 in the supplemental material, available at www.ophthalmologyscience.org).
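A minimal sketch of the pairwise comparison: the study reports using Pandas and scikit-learn, and here SciPy's ttest_ind (two-tailed by default) stands in as a common equivalent. The group values below are hypothetical placeholders, not study data.

```python
from itertools import combinations
from scipy import stats

# Hypothetical per-subject totals (e.g., gaze distance traveled, px) by group.
groups = {
    "attending": [9758, 9812, 9640],
    "fellow": [14037, 13950, 14210, 13880, 14105],
    "resident": [18581, 18460, 18702],
}
for a, b in combinations(groups, 2):
    t, p = stats.ttest_ind(groups[a], groups[b])  # two-tailed Student t test
    print(f"{a} vs {b}: t = {t:.2f}, P = {p:.4f}")
```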

      Results

      Extraction of fixation values

Table 2 gives the number and duration of fixations extracted by the algorithm for all three groups of surgeons during each surgical phase. Although there was a trend toward attending surgeons performing fewer fixations than residents during most surgical phases, this difference did not attain statistical significance (P > 0.3, eTable 2), and the average duration of fixations was similar among groups.
Table 2. Characteristics of fixation between attending, fellow, and resident surgeons (mean ± SD)

Surgical phase | Attending: No. of fixations | Attending: Avg. duration (ms) | Fellow: No. of fixations | Fellow: Avg. duration (ms) | Resident: No. of fixations | Resident: Avg. duration (ms)
Capsulorrhexis | 25 ± 1 | 378 ± 37 | 24 ± 4 | 362 ± 68 | 26 ± 5 | 378 ± 18
Phacoemulsification | 30 ± 1 | 360 ± 34 | 35 ± 9 | 340 ± 22 | 44 ± 6 | 325 ± 53
Cortex removal | 34 ± 17 | 305 ± 28 | 31 ± 7 | 401 ± 24 | 42 ± 3 | 360 ± 73
Core vitrectomy | 41 ± 2 | 341 ± 17 | 41 ± 7 | 366 ± 57 | 50 ± 3 | 360 ± 39
Membrane peeling | 36 ± 7 | 287 ± 30 | 31 ± 4 | 280 ± 60 | 37 ± 1 | 354 ± 86
Endolaser photocoagulation | 43 ± 7 | 356 ± 17 | 34 ± 9 | 343 ± 28 | 51 ± 7 | 360 ± 1
      Table 3 shows the average gaze distance traveled per observer during each surgical phase (cartesian distance in pixels). Attending and fellow surgeons had an overall lower gaze travel distance when compared to residents during all surgical phases, regardless of subspecialty (P < 0.02, eTable 2).
Table 3. Average distance traveled in cartesian coordinates for each group of surgeons (pixels, mean ± SD)

Surgical phase | Attendings | Fellows | Residents
Capsulorrhexis | 9758 ± 1279 | 14037 ± 2209 | 18581 ± 3116
Phacoemulsification | 9528 ± 1339 | 17089 ± 3483 | 24219 ± 4250
Cortex removal | 16002 ± 3513 | 21400 ± 3533 | 27148 ± 4623
Core vitrectomy | 18591 ± 3169 | 22559 ± 3308 | 33148 ± 4793
Membrane peeling | 13683 ± 3701 | 20316 ± 3949 | 28294 ± 4770
Endolaser photocoagulation | 15709 ± 3512 | 20564 ± 3233 | 35457 ± 4772

      Surgeons’ dwell time within areas of interest

To investigate how visual attention differed among the three groups of surgeons with respect to the machine parameter information superimposed on screen, the platform quantified the number of fixations in the area where parameters such as vacuum level and laser energy are displayed (Table 4). Experienced surgeons had fewer fixations in the area of the overlayed instrument parameters than fellows and residents (P < 0.05, eTable 2). eFigure 5 shows the extent of the superimposed information considered for analysis (available at www.ophthalmologyscience.org).
Table 4. Number of fixations for each group within the AOI for overlayed surgical device parameters (IOP, vacuum, vitrector cut rate, and laser shot count/energy)

Group | No. of fixations in the superimposed information area (mean ± SD)
Attendings | 3.333 ± 4.714
Residents | 91.333 ± 21.892
Fellows | 81.8 ± 33.358

      Identifying surgeons’ attention to surgical instruments and anatomic elements

The platform combined the cartesian coordinates of the surgical instruments and the optic disc area detected by the neural network with the gaze coordinates. With this combination, the platform estimated the time spent by each individual's gaze within the selected AOIs, also known as dwell time. Table 5 shows the percentage of dwell time and the average distance to the instrument tooltip and optic disc for each surgical phase. Residents' gaze visited the instrument tooltip and its surroundings consistently more often, by up to 6 percentage points, than that of attendings and fellows. However, no statistical significance was detected when comparing dwell time percentages among the three groups (P > 0.2, eTable 2).
Table 5. Dwell time percentage of participants within the AOI (surgical instrument tooltip), with the corresponding average distance (pixels) to the tracked object (mean ± SD)

AOI | Attendings: Dwell time | Attendings: Avg. distance | Fellows: Dwell time | Fellows: Avg. distance | Residents: Dwell time | Residents: Avg. distance
Vitrector tooltip | 21.86% ± 7.94% | 334 ± 230 | 21.97% ± 2.29% | 356 ± 216 | 27.74% ± 1.05% | 378 ± 230
Membrane peeling forceps | 45.88% ± 1.05% | 310 ± 196 | 44.95% ± 17.20% | 338 ± 203 | 51.11% ± 2.38% | 369 ± 218
Endolaser tooltip | 39.76% ± 2.00% | 279 ± 140 | 30.05% ± 8.50% | 284 ± 142 | 45.41% ± 0.77% | 270 ± 161
Optic disc | 4.58% ± 0.84% | 371 ± 202 | 5.14% ± 6.81% | 418 ± 210 | 4.39% ± 6.08% | 406 ± 206

      Data visualization

To facilitate data visualization and analysis, we created graphical representations of gaze behavior among subjects: eFigure 6 shows the gaze trajectory for the three groups, eFigure 7 the change in cartesian coordinates during the experiment, and eFigure 8 the gaze velocity per participant during all surgical phases. eFigure 9 and Supplemental Video 1 demonstrate the capability to track and observe the gaze path of multiple participants simultaneously. eFigure 10 exemplifies the ability of our platform to visualize participants' eye movement patterns relative to the position of the instrument tooltip or any feature tracked by the instance segmentation neural network. The horizontal line indicates the threshold at which gaze overlapped with the AOI (vitrector tooltip), defined as a distance of ≤3° of visual angle (130 px), consistent with established norms for gaze overlap (Gegenfurtner et al.; Holmqvist et al.) and with the hardware limitation specified by the eye tracker manufacturer (maximum resolution of 1.5° of visual angle, or 75 px with our setup). Moreover, eFigure 11 provides a graphical representation of the identification of potential smooth pursuit movements, following the requirements for smooth pursuit recognition detailed in the methods section; Supplemental Video 2 shows the smooth pursuits indicated by the red rectangle in eFigure 11 (available at www.ophthalmologyscience.org). All parameters can be altered in the source code according to the eye-tracking device, screen size, and resolution used during gaze acquisition: screen pixels can be converted to visual degrees via equation 1:
size_degrees = arctan((h/2) / d) / (r/2)    (1)

where size_degrees is the visual angle subtended by a single pixel (in degrees), h is the screen display height in centimeters, d the distance from the subject to the screen in centimeters, and r the vertical resolution of the display in pixels.
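Equation 1 can be implemented directly; in the sketch below, the 19.4 cm display height is our assumption for a 15.6-inch 16:9 panel and is not taken from the paper.

```python
import math

def degrees_per_pixel(h_cm: float, d_cm: float, r_px: int) -> float:
    """Equation 1: approximate visual angle (degrees) subtended by one pixel,
    treating half the screen height as subtending arctan((h/2)/d)."""
    return math.degrees(math.atan((h_cm / 2) / d_cm)) / (r_px / 2)

# Assumed setup: ~19.4 cm panel height, 65 cm viewing distance, 1080 px vertical.
deg_per_px = degrees_per_pixel(19.4, 65.0, 1080)
print(f"{deg_per_px:.4f} deg/px -> 3 deg ~ {3 / deg_per_px:.0f} px")
```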

      Discussion

Gaze behavior has received increasing attention in surgical procedures such as laparoscopy and in the interpretation of still images in radiology and pathology (Liu et al.; Shahimin et al.). To our knowledge, there have been no reports assessing gaze behavior during ophthalmic microsurgery. Potential barriers include the significant cost of research-grade eye-tracking equipment and the proprietary software used for eye-tracking analysis, which may inhibit investigation. We have developed an open-source, hardware-agnostic platform for acquiring and processing eye-tracking data during ophthalmic microsurgery. Our platform extracts a series of features by linking the surgeon's gaze, as a proxy for visual attention, with a deep learning neural network that can incorporate multiple AOIs, enabling the investigation of surgeons' gaze behavior during cataract and vitreoretinal procedures.
The eye-tracking data acquired during the calibration phase of this study show that the method employed to validate our platform is consistent within the constraints of the hardware used. Feit et al. demonstrated significant variability in precision and accuracy for both commercial and scientific-grade eye tracker models, including discrepancies with the accuracy reported by manufacturers. In our study, we achieved an overlap of greater than 94% when directing a subject to follow an object on the screen, demonstrating the reliability of extracting gaze position using a commercially available eye-tracker device and computer vision algorithms (supplemental material section 1, eTable 1). In addition, the fixation values extracted by our platform are compatible with previous studies: across a wide set of activities, mean fixation durations in adult subjects range from 150 to 400 ms (Galley et al.; Negi and Mitra), comparable to our platform's output as shown in Table 2. Finally, it is important to note that our platform is hardware-agnostic: increased reliability of acquired gaze data can likely be achieved with state-of-the-art eye-tracking hardware, i.e., research-grade equipment such as eye-tracking glasses that can also identify subjects' attention in 3D environments (Li et al.), whereas lower data fidelity is expected with low-cost, lower-resolution equipment such as webcams. Surgeons' gaze extraction via diversified equipment is the subject of ongoing studies by our group.
The dynamic generation of AOIs in real time via deep learning neural networks enabled our platform to store eye-gaze data and the spatial position of surgical elements concurrently for subsequent analysis. Deep learning neural networks have been widely applied to image segmentation in clinical applications, including the diagnosis and classification of pathologies (Benjamens et al.; Vieira et al.; Sengupta et al.) and real-time surgical guidance (Garcia Nespolo et al.; Rieke et al.; Morita et al.; Zhao et al.). Our instance segmentation neural network tracked three different instruments and one anatomic element during three surgical phases, reaching AUPR values of up to 98% and demonstrating the capability of identifying AOIs in real time. The network can also be trained to track and segment additional surgical landmarks according to the needs and goals of researchers; the elements selected in this work serve as a proof of concept. The identification of salient elements in surgical frames is directly related to the data with which we trained the neural network: a low signal-to-noise ratio in surgical video, such as under suboptimal lighting conditions or vitreous hemorrhage, may degrade detection and segmentation performance. Likewise, heterogeneous instrument designs, such as different forceps shapes for internal limiting membrane (ILM) peeling, may lead to a drop in performance. Network validation for implementation in a clinical setting may require a larger and more heterogeneous set of training cases. In this preliminary work, we leveraged our algorithm to demonstrate the potential of dynamically generating multiple AOIs together with the acquisition of eye gaze data to create two tools of potential utility: automated detection of overlap between gaze and the location of surgical instrumentation, and identification of likely episodes of smooth pursuit during intraocular surgery.
The preliminary data generated in our study indicate a potential relationship between surgeons' procedural experience and gaze behavior. Table 2 suggests that experience may affect the number and length of fixations, with fewer fixations performed by experienced surgeons. These data may be inconsistent with some studies of eye movement patterns among surgeons: previous research has suggested that experts, compared with nonexperts, exhibit more focused attention and more elaborate visual representation during the performance of a task, reflected in increased fixation rates and a higher proportion of fixations within an area of interest (Pugh et al.; Shirley et al.; Fichtel et al.).

However, Sodergren et al. suggested that, even though experts can demonstrate a greater perceptual span and more rapid information acquisition with more fixations of shorter length, in some situations this group may concentrate their gaze only on pertinent areas of view, with the AOI having a prominent place in the analysis. The average gaze distance traveled by each group (Table 3) suggests that expert surgeons may focus on more relevant loci without loss of attention, represented by decreased cartesian distances traveled by their gaze. In contradistinction, trainee resident physicians may shift their gaze more frequently to regions not considered relevant or critical by experienced surgeons. In addition, we observed that experienced surgeons had fewer fixations in the area where overlayed surgical device parameters were displayed on screen, indicating that this superimposed information may draw the attention of novice surgeons away from instrument maneuvers and tissue handling. Alternatively, less experienced physicians may rely more heavily on real-time feedback from surgical device parameters than more experienced surgeons do, for the phases and types of procedures analyzed herein.
The concurrent assessment of gaze coordinates with spatial data generated by the instance segmentation neural network allowed our platform to estimate the percentage of time each individual's gaze attended the surgical instruments and the optic disc. One interpretation of the values in Table 5 is that more experienced surgeons frequently scan peripheral areas and may anticipate subsequent movements of intraocular instruments or the effects of instrument maneuvers on the tissue environment. In contrast, novice surgeons may tend to focus their visual attention more uniformly on surgical instruments, specifically the tooltip, where instrument-tissue interactions are most likely to occur. No significant divergence in visual attention was identified among groups with respect to the optic disc area.
Identifying potential smooth pursuit movements is another advantage of tracking features with a neural network while synchronously acquiring gaze data. Following previously published requirements for a gaze movement to be considered a smooth pursuit (Larsson et al.), our platform detects these movements and indicates their presence on a timeline. This approach may, in the future, relieve the need for manual annotation in smooth pursuit studies and help investigate how smooth pursuit movements differ between groups of subjects; however, a deeper evaluation of the accuracy of these detections against manually annotated data remains necessary.
The multiple potential applications of this platform may be broadly classified into several related areas: (i) facilitating research on attention and awareness during intraocular surgery; (ii) monitoring surgeon gaze and acquiring data in furtherance of surgical training; and (iii) monitoring attention and awareness during surgery in real time to potentially improve safety and efficacy. We propose that promoting an open-source, hardware-agnostic platform for acquiring gaze data from relatively inexpensive consumer-grade eye trackers will facilitate the study of attention and awareness during microsurgery. Current eye-tracking platforms suitable for research are often costly, depend on data analysis software that is not open source, and, with few exceptions, require data generated by the manufacturer (Niehorster et al.). Training in ophthalmic microsurgery could be facilitated by gaze metrics that distinguish patterns among expert and novice surgeons. These data could inform a learner as to how experienced ophthalmic surgeons attend to the visual surgical landscape, or perhaps how the experienced surgeon reacts to impending intraoperative complications (e.g., before a posterior capsular rupture or iatrogenic retinal break). Fichtel et al. reported that gaze patterns from expert surgeons identified the occurrence of adverse events in observed laparoscopic procedures. Ultimately, with the advent of three-dimensional visualization systems, in which the surgeon performs procedures using heads-up technology (Romano et al.), and of head-mounted wearable eye-tracker devices (Ye et al.), future studies can assess surgeon attention and gaze in real time during the surgical procedure.
The gaze behavior of surgeons observing a procedure retrospectively may differ from that of a surgeon actively performing the procedure and managing equipment parameters, which may in turn differ from that of a surgeon supervising a trainee in real time. Atkins et al. demonstrated similarity in eye movements when comparing "doing" versus "watching" simulated laparoscopic surgery, with overlap values of up to 82% and a consistent delay in reaction time of over 600 milliseconds; nonetheless, future studies will be required to assess these differences. Another potential limitation of our study is the hardware employed for gaze acquisition: although the primary goal of our platform is to offer an accessible solution for image processing of gaze data and extraction of features, the relatively low sample rate and missing technical capabilities of selected consumer-grade eye trackers (such as head tracking to compensate for pitch and yaw of the surgeon's head during gaze acquisition) may affect the fidelity of our results. A method that relies on computer vision may also introduce errors: the raw data generated by the eye tracker are inaccessible, and gaze data processed by the manufacturer's software may be subject to substantial transformation, resulting in delay or lower tracking accuracy compared with research-grade equipment. Finally, previous studies indicate that factors such as age and eye color may affect the accuracy of collected data (Hessels et al.; Dowiasch et al.); to assess for such confounding variables, we calibrated the platform with the manufacturer's software and performed a reproducibility test in which all subjects followed objects on screen.
In summary, we have developed an eye-tracking platform that integrates open-source software for data analysis with deep learning neural networks that generate dynamic AOIs. To validate our approach in this proof-of-principle study, we extracted gaze patterns from experienced ophthalmic surgeons and trainees of varying degrees of experience observing cataract and vitreoretinal procedures. Preliminary data indicated quantifiable differences in eye movement behavior among groups. Additional studies are ongoing to elucidate the role of visual attention during ophthalmic microsurgery.

      Supplementary data

      References

1. Liu S, Donaldson R, Subramaniam A, et al. Developing expert gaze pattern in laparoscopic surgery requires more than behavioral training. J Eye Mov Res. 2021;14. doi:10.16910/JEMR.14.2.2

2. Atkins MS, Tien G, Khan RSA, Meneghetti A, Zheng B. What do surgeons see: capturing and synchronizing eye gaze for surgery applications. Surg Innov. 2012;20(3):241-248. doi:10.1177/1553350612449075

3. Law B, Atkins MS, Kirkpatrick AE, Lomax AJ, Mackenzie CL. Eye gaze patterns differentiate novice and experts in a virtual laparoscopic surgery training environment. In: Proceedings of the Eye Tracking Research & Applications Symposium (ETRA 2004); 2004. doi:10.1145/968363

4. Cai LZ, Kwong JW, Azad AD, Kahn D, Lee GK, Nazerali RS. Where do we look? Assessing gaze patterns in cosmetic face-lift surgery with eye tracking technology. Plast Reconstr Surg. 2019;144:63-70. doi:10.1097/PRS.0000000000005700

5. Orquin JL, Ashby NJS, Clarke ADF. Areas of interest as a signal detection problem in behavioral eye-tracking research. J Behav Decis Mak. 2016;29:103-115. doi:10.1002/BDM.1867

6. Vinuela-Navarro V, Erichsen JT, Williams C, Woodhouse JM. Quantitative characterization of smooth pursuit eye movements in school-age children using a child-friendly setup. Transl Vis Sci Technol. 2019;8(5):8. doi:10.1167/TVST.8.5.8

7. Emhardt SN, Kok EM, Jarodzka H, Brand-Gruwel S, Drumm C, van Gog T. How experts adapt their gaze behavior when modeling a task to novices. Cogn Sci. 2020;44:e12893. doi:10.1111/COGS.12893

8. Papesh MH, Hout MC, Guevara Pinto JD, Robbins A, Lopez A. Eye movements reflect expertise development in hybrid search. Cogn Res Princ Implic. 2021;6:1-20. doi:10.1186/S41235-020-00269-8

9. Merali N, Veeramootoo D, Singh S. Eye-tracking technology in surgical training. J Invest Surg. 2017;32(7):587-593. doi:10.1080/08941939.2017.1404663

10. Richstone L, Schwartz MJ, Seideman C, Cadeddu J, Marshall S, Kavoussi LR. Eye metrics as an objective assessment of surgical skill. Ann Surg. 2010;252:177-182. doi:10.1097/SLA.0B013E3181E464FB

11. Pugh CM, Ghazi A, Stefanidis D, Schwaitzberg SD, Martino MA, Levy JS. How wearable technology can facilitate AI analysis of surgical videos. Ann Surg Open. 2020;1:e011. doi:10.1097/AS9.0000000000000011

12. Braunagel C, Kasneci E, Stolzmann W, Rosenstiel W. Driver-activity recognition in the context of conditionally autonomous driving. In: IEEE Conference on Intelligent Transportation Systems (ITSC); 2015:1652-1657. doi:10.1109/ITSC.2015.268

13. Aksum KM, Magnaguagno L, Bjørndal CT, Jordet G. What do football players look at? An eye-tracking analysis of the visual fixations of players in 11 v 11 elite football match play. Front Psychol. 2020;11:2624. doi:10.3389/FPSYG.2020.562995

14. Khan RSA, Tien G, Atkins MS, Zheng B, Panton ONM, Meneghetti AT. Analysis of eye gaze: do novice surgeons look at the same location as expert surgeons during a laparoscopic operation? Surg Endosc. 2012;26:3536-3540. doi:10.1007/S00464-012-2400-7

15. Bottorff JL. Development of an observational instrument to study nurse-patient touch. J Nurs Meas. 1994;2:7-24. doi:10.1891/1061-3749.2.1.7

16. Shahimin MM, Razali A. An eye tracking analysis on diagnostic performance of digital fundus photography images between ophthalmologists and optometrists. Published online 2019. doi:10.3390/ijerph17010030

17. Shirley K, Williams M, McLaughlin L, Parker N, Bond R. Impact of an educational intervention on eye gaze behaviour in retinal image interpretation by consultant and trainee ophthalmologists. Health Informatics J. 2020;26:1419-1430. doi:10.1177/1460458219881337

18. Bolya D, Zhou C, Xiao F, Lee YJ. YOLACT++: better real-time instance segmentation. IEEE Trans Pattern Anal Mach Intell. Published online 2020. doi:10.1109/TPAMI.2020.3014297

19. Engbert R, Kliegl R. Microsaccades uncover the orientation of covert attention. Vision Res. 2003;43:1035-1045. doi:10.1016/S0042-6989(03)00084-1

20. Larsson L, Nyström M, et al. Smooth pursuit detection in binocular eye-tracking data with automatic video-based performance evaluation. J Vis. 2016;16(15):20. doi:10.1167/16.15.20

21. Fuchs AF. Saccadic and smooth pursuit eye movements in the monkey. J Physiol. 1967;191:609. doi:10.1113/JPHYSIOL.1967.SP008271

22. Gegenfurtner A, Lehtinen E, Säljö R. Expertise differences in the comprehension of visualizations: a meta-analysis of eye-tracking research in professional domains. Educ Psychol Rev. 2011;23:523-552. doi:10.1007/S10648-011-9174-7

23. Holmqvist K, Nyström M, Andersson R, Dewhurst R, Jarodzka H, van de Weijer J. Eye Tracking: A Comprehensive Guide to Methods and Measures. 2011. https://books.google.com/books/about/Eye_Tracking.html?id=5rIDPV1EoLUC

24. Feit AM, Williams S, Toledo A, et al. Toward everyday gaze input: accuracy and precision of eye tracking and implications for design. Published online 2017. doi:10.1145/3025453.3025599

25. Galley N, Betz D, Biniossek C. Fixation durations - why are they so highly variable? In: Advances in Visual Perception Research. Nova Science Publishers; 2015:83-106.

26. Negi S, Mitra R. Fixation duration and the learning process: an eye tracking study with subtitled videos. J Eye Mov Res. 2020;13:1-15. doi:10.16910/JEMR.13.6.1

27. Li TH, Suzuki H, Ohtake Y. Visualization of user's attention on objects in 3D environment using only eye tracking glasses. J Comput Des Eng. 2020;7(2):228-237. doi:10.1093/JCDE/QWAA019

28. Benjamens S, Dhunnoo P, Meskó B. The state of artificial intelligence-based FDA-approved medical devices and algorithms: an online database. NPJ Digit Med. 2020;3:1-8. doi:10.1038/s41746-020-00324-0

29. Vieira PM, Freitas NR, Lima VB, Costa D, Rolanda C, Lima CS. Multi-pathology detection and lesion localization in WCE videos by using the instance segmentation approach. Artif Intell Med. 2021;119:102141. doi:10.1016/j.artmed.2021.102141

30. Sengupta S, Singh A, Leopold HA, Gulati T, Lakshminarayanan V. Ophthalmic diagnosis using deep learning with fundus images: a critical review. Artif Intell Med. 2020;102. doi:10.1016/j.artmed.2019.101758

31. Garcia Nespolo R, Yi D, Cole E, Valikodath N, Luciano C, Leiderman YI. Evaluation of artificial intelligence-based intraoperative guidance tools for phacoemulsification cataract surgery. JAMA Ophthalmol. 2022;140(2):170-177. doi:10.1001/JAMAOPHTHALMOL.2021.5742

32. Rieke N, Tan DJ, Amat di San Filippo C, et al. Real-time localization of articulated surgical instruments in retinal microsurgery. Med Image Anal. 2016;34:82-100. doi:10.1016/j.media.2016.05.003

33. Morita S, Tabuchi H, Masumoto H, Yamauchi T, Kamiura N. Real-time extraction of important surgical phases in cataract surgery videos. Sci Rep. 2019;9. doi:10.1038/s41598-019-53091-8

34. Zhao Z, Chen Z, Voros S, Cheng X. Real-time tracking of surgical instruments based on spatio-temporal context and deep learning. Comput Assist Surg. 2019;24:20-29. doi:10.1080/24699322.2018.1560097

35. Fichtel E, Lau N, Park J, et al. Eye tracking in surgical education: gaze-based dynamic area of interest can discriminate adverse events and expertise. Surg Endosc. 2018;33(7):2249-2256. doi:10.1007/S00464-018-6513-5

36. Sodergren MH, Orihuela-Espina F, Clark J, Darzi A, Yang GZ. A hidden Markov model-based analysis framework using eye-tracking data to characterise re-orientation strategies in minimally invasive surgery. Cogn Process. 2009;11(3):275-283. doi:10.1007/S10339-009-0350-3

37. Niehorster DC, Hessels RS, Benjamins JS. GlassesViewer: open-source software for viewing and analyzing data from the Tobii Pro Glasses 2 eye tracker. Behav Res Methods. 2020;52:1244-1253. doi:10.3758/S13428-019-01314-1

38. Romano MR, Cennamo G, Comune C, et al. Evaluation of 3D heads-up vitrectomy: outcomes of psychometric skills testing and surgeon satisfaction. Eye. 2018;32(6):1093-1098. doi:10.1038/s41433-018-0027-1

39. Ye Z, Li Y, Fathi A, et al. Detecting eye contact using wearable eye-tracking glasses. In: UbiComp '12: Proceedings of the 2012 ACM Conference on Ubiquitous Computing; 2012:699-704. doi:10.1145/2370216.2370368

40. Atkins MS, Jiang X, Tien G, Zheng B. Saccadic delays on targets while watching videos.

41. Hessels RS, Andersson R, Hooge ITC, Nyström M, Kemner C. Consequences of eye color, positioning, and head movement for eye-tracking data quality in infant research. Infancy. 2015;20:601-633. doi:10.1111/INFA.12093

42. Dowiasch S, Marx S, Einhäuser W, Bremmer F. Effects of aging on eye movements in the real world. Front Hum Neurosci. 2015;9:46. doi:10.3389/FNHUM.2015.00046