Usage of any items from the University of Cumbria's institutional repository Insight must conform to the following fair usage guidelines.

Similar documents
Is image manipulation necessary to interpret digital mammographic images efficiently?

Breast screening: visual search as an aid for digital mammographic interpretation training

Oculomatic Pro. Setup and User Guide. 4/19/ rev

D-Lab & D-Lab Control Plan. Measure. Analyse. User Manual

E X P E R I M E N T 1

Intuitive Workflow by Barco. Designed for the way you work, naturally.

RECOMMENDATION ITU-R BT

CARESTREAM DIRECTVIEW Elite CR System

2-/4-Channel Cam Viewer E- series for Automatic License Plate Recognition CV7-LP

Monitor QA Management i model

Introduction to Computer Graphics

Coronis 5MP Mammo. The standard of care for digital mammography

Durham Magneto Optics Ltd. NanoMOKE 3 Wafer Mapper. Specifications

Understanding PQR, DMOS, and PSNR Measurements

Hospital Wide. Healthcare Display Solutions DICOM Displays, Large Screen Displays and Projectors

What to consider when choosing a mammography display

Overview of Graphics Systems

Design of VGA Controller using VHDL for LCD Display using FPGA

Laser Beam Analyser Laser Diagnostic System. If you can measure it, you can control it!

Achieve Accurate Critical Display Performance With Professional and Consumer Level Displays

2.2. VIDEO DISPLAY DEVICES

Guidance for Quality Assurance of PACS Diagnostic Display Devices

3/2/2016. Medical Display Performance and Evaluation. Objectives. Outline

Objectives: Topics covered: Basic terminology Important Definitions Display Processor Raster and Vector Graphics Coordinate Systems Graphics Standards

Installation / Set-up of Autoread Camera System to DS1000/DS1200 Inserters

Dektak Step by Step Instructions:

DirectView Elite CR System. Improve workflow, productivity, and patient throughput.

Precise Digital Integration of Fast Analogue Signals using a 12-bit Oscilloscope

Guidelines for Assuring Softcopy Image Quality

-Technical Specifications-

Rec. ITU-R BT RECOMMENDATION ITU-R BT PARAMETER VALUES FOR THE HDTV STANDARDS FOR PRODUCTION AND INTERNATIONAL PROGRAMME EXCHANGE

Basic Pattern Recognition with NI Vision

G-106Ex Single channel edge blending Processor. G-106Ex is multiple purpose video processor with warp, de-warp, video wall control, format

2D/3D Multi-Projector Stacking Processor. User Manual AF5D-21

White Paper. Uniform Luminance Technology. What's inside? What is non-uniformity and noise in LCDs? Why is it a problem? How is it solved?

Evaluation report. Eizo RadiForce G33-N 3MP greyscale flat panel liquid crystal display (LCD) CEP 07003

Vicon Valerus Performance Guide

Computer Graphics: Overview of Graphics Systems

Interface Practices Subcommittee SCTE STANDARD SCTE Measurement Procedure for Noise Power Ratio

G-106 GWarp Processor. G-106 is multiple purpose video processor with warp, de-warp, video wall control, format conversion,

Cisco Telepresence SX20 Quick Set - Evaluation results main document

Part 1: Introduction to Computer Graphics

Calibrating and Profiling Your Monitor

CLIPSTER. 3D LUT File Generation with the Kodak Display Manager. Supplement

Table of content. Table of content Introduction Concepts Hardware setup...4

World First Slim Cassette Type Digital Mammo. Upgrade Solution

Stimulus presentation using Matlab and Visage

CARESTREAM VITA/VITA LE/VITA SE CR System Long Length Imaging User Guide

Smart Traffic Control System Using Image Processing

RECOMMENDATION ITU-R BT Studio encoding parameters of digital television for standard 4:3 and wide-screen 16:9 aspect ratios

PITZ Introduction to the Video System

Introduction. Edge Enhancement (SEE). Advantages of Scalable SEE. Lijun Yin. Scalable Enhancement and Optimization. Case Study:

Coronis Uniti (MDMC-12133) 12MP diagnostic display system for PACS and breast imaging

Optimizing the Workflow of Radiologists

MX215. Your advantages. 2MP Medical-Display

VeriLUM 5.2. Video Display Calibration And Conformance Tracking. IMAGE Smiths, Inc. P.O. Box 30928, Bethesda, MD USA

Nio. Industry-standard diagnostic display systems

Glossary Unit 1: Introduction to Video

Chapter 1. Introduction to Digital Signal Processing

Audio and Video II. Video signal +Color systems Motion estimation Video compression standards +H.261 +MPEG-1, MPEG-2, MPEG-4, MPEG-7, and MPEG-21

BEAMAGE 3.0 KEY FEATURES BEAM DIAGNOSTICS PRELIMINARY AVAILABLE MODEL MAIN FUNCTIONS. CMOS Beam Profiling Camera

COMPOSITE VIDEO LUMINANCE METER MODEL VLM-40 LUMINANCE MODEL VLM-40 NTSC TECHNICAL INSTRUCTION MANUAL

CHARACTERIZATION OF END-TO-END DELAYS IN HEAD-MOUNTED DISPLAY SYSTEMS

BRIGHT BRIGHTER BRIGHTEST ONE ILLUMINATOR DESIGN THREE LIGHT SOURCES. featuring the EvenVue Reflector System

Chapter 3 Fundamental Concepts in Video. 3.1 Types of Video Signals 3.2 Analog Video 3.3 Digital Video

Acquisition Control System Design Requirement Document

PulseCounter Neutron & Gamma Spectrometry Software Manual

User's Manual. Log Scale (/LG) GX10/GX20/GP10/GP20/GM10 IM 04L51B01-06EN. 3rd Edition

Software Quick Manual

Quick-Start for READ30

VARIOUS DISPLAY TECHNOLOGIES

Archiving: Experiences with telecine transfer of film to digital formats

The Versatile and Powerful ACLxy. ACLxy

BitWise (V2.1 and later) includes features for determining AP240 settings and measuring the Single Ion Area.

User Manual 15" LCD Open frame SAW Touch Monitor KOT-0150US-SA4W. Table of Contents

The human factors surrounding system change in breast cancer screening: a case study

PHY221 Lab 1 Discovering Motion: Introduction to Logger Pro and the Motion Detector; Motion with Constant Velocity

PRODUCT GUIDE CEL5500 LIGHT ENGINE. World Leader in DLP Light Exploration. A TyRex Technology Family Company

V9A01 Solution Specification V0.1

Available online at ScienceDirect. Procedia Manufacturing 3 (2015)

NOTICE: This document is for use only at UNSW. No copies can be made of this document without the permission of the authors.

Understanding Compression Technologies for HD and Megapixel Surveillance

3rd Party Interfaces. Version Installation and User Guide

AUTOMATIC LICENSE PLATE RECOGNITION(ALPR) ON EMBEDDED SYSTEM

These are used for producing a narrow and sharply focused beam of electrons.

Intelligent Monitoring Software IMZ-RS300. Series IMZ-RS301 IMZ-RS304 IMZ-RS309 IMZ-RS316 IMZ-RS332 IMZ-RS300C

In-process inspection: Inspector technology and concept

Torsional vibration analysis in ArtemiS SUITE 1

LedSet User's Manual V Official website: 1 /

PYROPTIX TM IMAGE PROCESSING SOFTWARE

Display Quality Assurance: Recommendations from AAPM TG270 for Tests, Tools, Patterns, and Performance Criteria

Lab Determining the Screen Resolution of a Computer

Statement SmartLCT User's Manual Welcome to use the product from Xi'an NovaStar Tech Co., Ltd. (hereinafter referred to as NovaStar). It is our great

DICOM Correction Item

DRAFT. Proposal to modify International Standard IEC

Television History. Date / Place E. Nemer - 1

Comp 410/510. Computer Graphics Spring Introduction to Graphics Systems

Transcription:

Dong, Leng, Chen, Yan, Gale, Alastair and Phillips, Peter (2016) Eye tracking method compatible with dual-screen mammography workstation. Procedia Computer Science, 90. 206-211.

Downloaded from: http://insight.cumbria.ac.uk/2438/

Usage of any items from the University of Cumbria's institutional repository Insight must conform to the following fair usage guidelines.

Any item and its associated metadata held in the University of Cumbria's institutional repository Insight (unless stated otherwise on the metadata record) may be copied, displayed or performed, and stored in line with the JISC fair dealing guidelines (available here) for educational and not-for-profit activities provided that:

- the authors, title and full bibliographic details of the item are cited clearly when any part of the work is referred to verbally or in the written form
- a hyperlink/URL to the original Insight record of that item is included in any citations of the work
- the content is not changed in any way
- all files required for usage of the item are kept together with the main item file.

You may not:

- sell any part of an item
- refer to any part of an item without citation
- amend any item or contextualise it in a way that will impugn the creator's reputation
- remove or alter the copyright statement on an item.

The full policy can be found here. Alternatively contact the University of Cumbria Repository Editor by emailing insight@cumbria.ac.uk.

Available online at www.sciencedirect.com

ScienceDirect

Procedia Computer Science 90 (2016) 206-211

International Conference On Medical Imaging Understanding and Analysis 2016, MIUA 2016, 6-8 July 2016, Loughborough, UK

Eye Tracking Method Compatible with Dual-Screen Mammography Workstation

Leng Dong a,*, Yan Chen a, Alastair Gale a, Peter Phillips b

a Loughborough University, Loughborough, LE11 3TU, UK
b University of Cumbria, Lancaster, LA1 3JD, UK

* Corresponding author: Leng Dong. Tel.: +44-01509-635737; E-mail address: l.dong@lboro.ac.uk

Abstract

In this paper a new approach is proposed to track the perceptual behaviour of radiologists when they examine mammographic images displayed on large dual clinical monitors. Radiologists inevitably zoom and pan such large images using the DICOM viewing software. This manipulation of the displayed images makes eye tracking difficult to perform, and the size of the dual clinical monitors makes existing eye tracking techniques generally inadequate. Hence a method using the Smart Eye Pro eye tracker and optical character recognition techniques was designed to relate radiologists' recorded eye gaze behaviour on the monitors to the actual zoomed and panned medical image areas. This then allows clinical studies involving radiologists interacting with these mammographic images to be carried out successfully.

© 2016 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/). Peer-review under responsibility of the Organizing Committee of MIUA 2016. doi:10.1016/j.procs.2016.07.013

Keywords: Medical Imaging; Image Perception; Eye Tracking; DICOM; Mammography; Optical Character Recognition

1. Introduction

Medical imaging research often examines the performance of radiologists when they examine different types of images. A fairly common behavioural approach is to use eye tracking, where the radiologist's visual search behaviour is recorded as they examine the displayed images, and information concerning where they did or did not fixate in the image is then linked to their performance. Typically, three types of user error are elaborated by such an approach: errors made where the radiologist clearly did not look at or near an abnormality (a visual search error); abnormalities looked at but not detected (a detection error); and abnormalities looked at and reported as detected (e.g. micro-calcifications identified) but whose information is then not interpreted appropriately (an interpretation error).

In general, eye tracking can be undertaken using several different commercially available systems, typically either head-mounted glasses or systems affixed beneath the medical monitor. For most medical images either approach can work well; however, particular difficulties arise with mammography because of the large clinical monitor sizes. By monitoring a radiologist's eye movements on a digital mammography image it is possible to tease out performance differences between naïve breast screeners and experienced breast radiologists, as well as how errors occur and why experienced radiologists perform better.

Previous research has examined visual behaviour when participants read mammographic images on a clinical dual-screen workstation [1]. This was achieved by using a head-mounted eye tracker to monitor the participants' eye movements. As that study reported, difficulties can occur when using a head-mounted eye tracker with such large displays; for example, the participant's head movements may cause the head-mounted scene camera to lose track of the display. Also, owing to the complex set-up of the digital mammographic workstation, popular remote eye trackers that mount below the monitors cannot be configured compatibly with the dual-screen display system: the large size of the dual monitors exceeds the size of the visual display that these eye trackers can record accurately.

An additional problem arises with the medical images themselves, which are viewed using a DICOM viewer. Radiologists typically zoom and pan images to examine fine details; this is especially the case with mammographic images, where the interest is often in perceiving whether very small calcifications are present and where these manipulations are a key factor. Currently popular eye trackers only allow an observer's eye gaze position to be recorded according to a fixed coordinate system (i.e. the clinical display) which is defined and calibrated before the actual recording takes place, and any area of interest in the displayed image is then normally defined in relation to the full image being examined. Therefore, when zooming and panning are performed, it is difficult to relate the recorded eye gaze position (which is based on the coordinate model defined by the clinical display) accurately to the actual gaze location on the image.

Based on the above issues, a new approach is presented which adapts a current eye tracking method to a clinical mammographic workstation and allows for accurate eye tracking while permitting appropriate panning and zooming of the medical images.

2. Experimental Set Up

Fig. 1. GE Digital Mammography Workstation with Synedra View DICOM viewer

The workstation used was a GE Digital Mammography Workstation (Fig. 1) consisting of a desktop tower PC with one monitor and two 5MP diagnostic displays. The hardware configuration of this mammography workstation is no different from a standard compatible PC except that a special graphics card supporting the dual-screen output of 12-bit greyscale images is installed on the motherboard.

Synedra View was installed to provide PACS archiving and DICOM image display; the software is approved for clinical use. For our purposes it is useful because it permits the user to manipulate each mammography image (on a different monitor) without affecting the image on the other screen, and when the user operates the zooming and panning controls the related zooming and panning information is displayed at the bottom right corner of each display. Linking that information to the user's eye gaze data makes it possible to calculate accurate gaze locations in relation to the displayed images. The following section describes a computer graphics method designed to collect that screen information.

3. Manipulation Behaviour Analysis Tool

In order to gauge where a user is looking when they are examining a mammographic image, it is important to know whether the image being viewed is the full-size image or some zoomed or panned part of it. Unfortunately, there is no direct electronic way to acquire that information. However, because the zooming and panning information is displayed at the bottom left corner of the diagnostic display, a screen capture technique can be used to record it; then, with the help of an optical character recognition technique, the images containing this zooming and panning information can be read and saved in plain text format for subsequent detailed analysis.

3.1. Sub Screen Capture Tool

An application called Sub Screen Capture Tool [2] had previously been developed by one of the authors to capture a specified area of a computer screen repeatedly at a user-defined frequency. Here, the zooming and panning information displayed by the Synedra View software was the screen target to capture, and the default screenshot interval was set to 10 ms. Fig. 2(a) shows the graphical user interface of Sub Screen Capture, where the precise area of the Synedra display to be captured was defined; Fig. 2(b) shows an example of a captured image. In this example, "Z: 2.84" indicates that the zoom factor was 2.84, meaning the image displayed on the workstation was zoomed to 2.84 times its original size. The additional information "P: 724/-3261" gives the position (724, -3261), in x, y coordinates measured in screen pixels, of the displayed image centre, where the origin is the centre of the workstation screen.

(a) (b) Fig. 2. (a) Sub Screen Capture GUI (b) example of captured picture

As a user viewed images on the workstation and panned and zoomed them, the tool captured an image every 10 ms and saved it as a jpg file; a capture loop of this kind can be approximated in MATLAB as sketched below.
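The following is a minimal sketch of such a periodic sub-screen capture loop, not the authors' Sub Screen Capture Tool itself. The capture rectangle (capLeft, capTop, capWidth, capHeight) and the shot count nShots are assumed to have been set beforehand; the sketch uses the Java AWT Robot class that MATLAB exposes.

% Minimal capture-loop sketch (illustrative; not the authors' tool).
% capLeft/capTop/capWidth/capHeight: assumed screen rectangle covering
% the Synedra View zoom/pan overlay. nShots: assumed number of captures.
robot  = java.awt.Robot;
region = java.awt.Rectangle(capLeft, capTop, capWidth, capHeight);
for k = 1:nShots
    jimg = robot.createScreenCapture(region);    % Java BufferedImage (INT_RGB)
    pix  = typecast(jimg.getRaster().getDataBuffer().getData(), 'uint8');
    img  = permute(reshape(pix, 4, capWidth, capHeight), [3 2 1]);
    img  = img(:, :, [3 2 1]);                   % reorder BGRA bytes to RGB
    ts   = java.lang.System.currentTimeMillis(); % UNIX timestamp in ms
    imwrite(img, sprintf('capture_%d.jpg', ts));
    pause(0.1);                                  % roughly the 10 Hz rate used later
end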

3.2. Optical Character Recognition (OCR)

Fig. 3. Flowchart of the OCR process

An optical character recognition (OCR) application was then used to extract the required information from each captured image and save it to a text file. The OCR process involves several steps, including segmentation, feature extraction, and classification [3]. A Matlab script using the Image Processing Toolbox was developed to achieve this; the flowchart in Fig. 3 outlines the procedure. The captured JPG image is first read into memory and stored as a matrix in which each element corresponds to a pixel and holds its 8-bit greyscale value. As can be seen in Fig. 4(a), the captured image is likely to have a noisy background, so a threshold filter was applied to remove pixels with low greyscale values and clean up the image, keeping only the bright text pixels (Fig. 4(b)). To reduce processing time, the filtered image was then converted to a binary black-and-white image. After this initialisation, each row of the matrix is scanned horizontally to locate blank rows; when a blank row is found, the sub-image between the previously identified blank row and the current one is extracted as a single text line (Fig. 4(c)). A similar scan is then performed in the vertical direction to extract individual characters (Fig. 4(d)).

(a) (b) (c) (d) (e) Fig. 4. An example demonstrating the OCR process

After a character is extracted, the correlation coefficient between it and each existing template is computed, and the template with the largest coefficient gives the recognised character. Finally, the recognised digits and decimal points were written to a tab-delimited text file (Fig. 4(e)), yielding three numbers that represent the zoom factor and the panning x and y values respectively.
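The scanning and template-matching procedure just described can be sketched compactly in MATLAB. This is an illustrative reconstruction rather than the authors' script: the function name, the threshold value, and the templates/labels inputs (a cell array of binary glyph images and the corresponding character list) are all assumptions.

% Illustrative OCR sketch (not the authors' script). templates: assumed
% cell array of binary glyph images; labels: assumed char vector giving
% the character for each template, e.g. '0123456789.-/:ZP'.
function txt = recogniseOverlay(jpgFile, templates, labels)
    img = imread(jpgFile);
    if ndims(img) == 3, img = rgb2gray(img); end
    bw  = img > 128;                    % keep bright text pixels (assumed threshold)
    txt = '';
    rowInk = any(bw, 2);                % horizontal scan for non-blank rows
    rEdge  = diff([0; rowInk; 0]);
    rTop   = find(rEdge == 1);
    rBot   = find(rEdge == -1) - 1;
    for i = 1:numel(rTop)               % each text line
        line   = bw(rTop(i):rBot(i), :);
        colInk = any(line, 1);          % vertical scan for character columns
        cEdge  = diff([0, colInk, 0]);
        cL = find(cEdge == 1);
        cR = find(cEdge == -1) - 1;
        for j = 1:numel(cL)             % each character
            glyph = double(line(:, cL(j):cR(j)));
            best = 1; bestC = -Inf;
            for t = 1:numel(templates)  % correlate against each template
                c = corr2(imresize(glyph, size(templates{t})), double(templates{t}));
                if c > bestC, bestC = c; best = t; end
            end
            txt = [txt, labels(best)];  %#ok<AGROW>
        end
    end
end

Parsing the returned txt for the Z and P fields then yields the zoom factor and pan values as plain numbers.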

3.3. Coordinate Mapping

This information is then used to determine what is displayed on screen at any point in time. Let the panning information be denoted by (x, y) and the zoom factor by z; let the width and length of the mammography image be w and l, and the resolution of the mammography diagnostic screen be p × q. The centre (X, Y) of the zoomed area on the original image is then

X = x/z + w/2
Y = y/z + l/2

The zoomed area is represented by the rectangle that has the same length-to-width ratio as the diagnostic screen, with width W and length L given by

W = p/z
L = q/z
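Written out as code, the mapping is a small MATLAB helper. The function name is ours, and the gaze-mapping comment at the end follows from the same conventions (pan measured in screen pixels, origin at the screen centre) rather than from anything stated explicitly in the paper.

% Coordinate mapping sketch (illustrative function name).
% x,y: pan in screen pixels; z: zoom factor; w,l: image width/length;
% p,q: screen resolution. Returns the visible area on the original image.
function [X, Y, W, L] = zoomedArea(x, y, z, w, l, p, q)
    X = x/z + w/2;   % centre of the zoomed area
    Y = y/z + l/2;
    W = p/z;         % visible rectangle keeps the screen aspect ratio
    L = q/z;
end

% e.g. with the Fig. 2 values: [X, Y, W, L] = zoomedArea(724, -3261, 2.84, w, l, p, q)
% A gaze sample (gx, gy), in screen pixels from the screen centre, would
% then fall on the image at roughly (X + gx/z, Y + gy/z).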

3.4. Visualization

To test the accuracy of this mapping, a calibration image was drawn with five grey circles, one at each of the four corners and one at the centre. This image was displayed on the mammography diagnostic screen; each circle was zoomed to fit the screen, and the zoom and position information were manually recorded. The recorded information was then plotted onto the original image using the coordinate mapping equations. Fig. 5 shows the calibration image with the five zoomed areas, each represented by a red rectangle showing the content displayed on the screen after zooming and panning were applied. All of the zoomed areas exactly found their target circles, indicating that the approach satisfies the accuracy requirement.

Fig. 5. Calibration image with the five zoom areas

4. Smart Eye Pro System

The Smart Eye Pro system is a head and gaze tracking system well suited to demanding environments, such as a vehicle cockpit where a lot of head movement occurs, and it is flexible enough to cope with most research projects. The system measures the subject's head pose and produces eye gaze direction in full 3D; eyelid opening and pupil dilation measurements can also be obtained. It can be used with up to six cameras with different lenses, allowing for a very large field of view. For this research, a Smart Eye Pro system consisting of three cameras (Fig. 6(a)) was used to track a user's eye movements on the two digital mammography screens; this set-up provides a large enough tracking angle for the dual diagnostic screens.

The disadvantage of the Smart Eye Pro system is that it provides little support for on-screen tracking: the user has to create a real-world model of the computer screens with very accurate spatial data. This is even more challenging here because the surface of each diagnostic screen is not fully flat. A real-world model of the two diagnostic screens was created by measuring the distance between the cameras and the surface of the screens, with the pixel used as the unit of length for the coordinate data. Measuring the angle of the screen surface is also a complex process, and since the surface of the diagnostic screen is actually curved, some inaccuracy is inevitable when tracking eye gaze. Fig. 6(b) shows the real-world model created using pixel information: the two rectangles represent the two diagnostic screens, the origin is the central camera, and the two yellow circles are the two LED flashes installed on the two wing cameras. Fig. 6(c) shows the monitoring panel when the eye tracking mode is on. The Smart Eye Pro software analyses the participant's gaze direction from the three cameras and determines the intersection point on the real-world model of the diagnostic screens; the log information is shown in the dialog window at the bottom left corner, from which the coordinate data of the intersection points can be exported.

(a) (b) (c) Fig. 6. (a) Smart Eye Pro system consisting of three cameras (b) real-world model (c) Smart Eye Pro software monitoring panel

5. Timestamp matching

The Smart Eye Pro system provides a way to obtain a participant's gaze position on the screen coordinate plane, and the zoom analysis tool described earlier obtains the position of the image displayed on the screen after zooming and panning are applied. By combining the two, it is possible to locate a participant's gaze position on the mammography image even when the image is zoomed and panned. The Sub Screen Capture tool operates at a frequency of 10 Hz and the Smart Eye Pro log file uses 60 Hz, so the two streams can be matched by reducing the Smart Eye Pro log to 10 Hz. Both sets of data use the UNIX timestamp, which makes the matching straightforward; a sketch of this step follows.
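As a minimal illustration of the matching step, the following MATLAB lines align each 10 Hz capture with its nearest 60 Hz gaze sample. The variable names are ours: gazeT and gazeXY are assumed to hold the Smart Eye Pro timestamps and gaze coordinates, and capT the capture timestamps, all as UNIX times parsed from the two logs.

% Timestamp matching sketch (illustrative variable names). For each
% 10 Hz capture time, pick the nearest 60 Hz gaze sample.
idx        = interp1(gazeT, 1:numel(gazeT), capT, 'nearest', 'extrap');
gazeAt10Hz = gazeXY(idx, :);
% Each row of gazeAt10Hz now pairs with the zoom/pan values OCR'd from
% the capture taken at the same capT entry, so the coordinate mapping
% of Section 3.3 can place the gaze point on the mammogram.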
6. Discussion and conclusion

This paper addresses how to accurately track the visual search behaviour of radiologists as they examine mammographic images displayed on large dual clinical monitors. The radiologist will inevitably zoom and pan these images, and such movement of the displayed target images is a problem for eye tracking. Whilst various eye tracking techniques exist, none can easily handle such large displays while also allowing the radiologist to move their head freely, hence the Smart Eye Pro system was used here. Relating the recorded eye gaze behaviour on the monitors to the actual zoomed and panned displayed image was handled by the newly designed method. Overall, the techniques described here allow clinical studies of how radiologists interact with these large mammographic images to be carried out successfully.

Some limitations were also identified. The mammography workstation used in this study consists of two curved CRT (cathode ray tube) monitors, and as demonstrated in Fig. 6(b) only flat surfaces can be defined as the tracking target in the Smart Eye Pro configuration tool. This will inevitably cause some inaccuracy when relating the eye movement data to the simulated coordinate map; it is hoped that replacing the CRT monitors with flat panel displays will yield better eye movement data accuracy. Another limitation is that two processes (screen capture and OCR) are used to monitor participants' manipulation behaviour, which is an indirect way of extracting the information and consumes considerable computing power. Future improvements include seeking closer cooperation with manufacturers or DICOM viewer developers to find a more direct way to acquire this information.

References

1. Chen, Y., 2010. An investigation of workstation image manipulation usage when examining FFDM images. Breast Cancer Research, 3(12).
2. Phillips, P., 2012. Sub Screen Capture Tool.
3. Schantz, H. F., 1982. The history of OCR, optical character recognition. Recognition Technologies Users Association, Manchester.
4. Smart Eye, 2016. Smart Eye Pro. [Online] Available at: http://smarteye.se/products/smart-eye-pro/ [Accessed May 2016].