Wolfgang Fuhl

Eberhard-Karls-Universität Tübingen
Wilhelm-Schickard-Institut für Informatik
Lehrstuhl Technische Informatik
Sand 14, C206
72076 Tübingen

Phone: +49 (7071) 29-70492

E-Mail: fuhl

Office hours: by appointment

Contact form

Teaching

  • Lab course "Programmieren mobiler eingebetteter Systeme", WS14/15, WS15/16, WS16/17
  • Programming project "Drahtloses Informations- und Aktionssystem", SS14, SS15, SS16
  • Supervision of the exercises and creation of the exercise sheets for the first lecture of "Eye Movements and Visual Perception", WS14/15
  • Seminar: "Machine Learning and Artificial Neural Networks in Biomedical Applications", WS14/15
  • Compilation of all topics for the seminar "Computational models of visual attention", SS15
  • Proseminar: "Technische Anwendungen der Informatik: Hard- und Software aktueller Eye-Tracking-Systeme", SS16
  • Seminar: "Computational models of visual attention", SS16

Publications

 All publications in BibTeX format

    2018

      June 2018
      • W. Fuhl, S. Eivazi, B. Hosp, A. Eivazi, W. Rosenstiel, E. Kasneci
         BORE: Boosted-oriented edge optimization for robust, real time remote pupil center detection
        ACM Symposium on Eye Tracking Research & Applications
      • W. Fuhl, D. Geisler, T. Santini, T. Appel, W. Rosenstiel, E. Kasneci
         CBF: Circular binary features for robust and real-time pupil center detection
        ACM Symposium on Eye Tracking Research & Applications
      • T. Santini, W. Fuhl, E. Kasneci
         PuReST: Robust Pupil Tracking for Real-Time Pervasive Eye Tracking
        Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications (ETRA) -- To appear
      • W. Fuhl, T. Kübler, H. Brinkmann, R. Rosenberg, W. Rosenstiel, E. Kasneci
         Region of interest generation algorithms for eye tracking data
        Third Workshop on Eye Tracking and Visualization (ETVIS), in conjunction with ACM ETRA
      Publications without month specified, 2018
      • T. Santini, W. Fuhl, E. Kasneci
          PuRe: Robust Pupil Detection for Real-Time Pervasive Eye Tracking
        Elsevier Computer Vision and Image Understanding, To Appear

      2017

        Publications without month specified, 2017
        • S. Eivazi, A. Hafez, W. Fuhl, H. Afkari, E. Kasneci, M. Lehecka, R. Bednarik
            Optimal eye movement strategies: a comparison of neurosurgeons' gaze patterns when using a surgical microscope
          Acta Neurochirurgica, 159 (6), p.959–966
        November 2017
        • W. Fuhl, T. Santini, E. Kasneci
           Fast camera focus estimation for gaze-based focus control
          CoRR,
        • W. Fuhl, T. Santini, G. Kasneci, E. Kasneci
           PupilNet v2.0: Convolutional Neural Networks for Robust Pupil Detection
          CoRR,
        May 2017
        • T. Santini, W. Fuhl, E. Kasneci
           CalibMe: Fast and Unsupervised Eye Tracker Calibration for Gaze-Based Pervasive Human-Computer Interaction
          Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems
        • W. Fuhl, T. C. Kübler, D. Hospach, O. Bringmann, W. Rosenstiel, E. Kasneci
            Ways of improving the precision of eye tracking data: Controlling the influence of dirt and dust on pupil detection
          Journal of Eye Movement Research, 10 (3)
        March 2017
        • W. Fuhl, T. Santini, E. Kasneci
            Fast and Robust Eyelid Outline and Aperture Detection in Real-World Scenarios
          IEEE Winter Conference on Applications of Computer Vision (WACV 2017)
        • Shahram Eivazi, Michael Slupina, Wolfgang Fuhl, Hoorieh Afkari, Ahmad Hafez, Enkelejda Kasneci
            Towards automatic skill evaluation in microsurgery
          Proceedings of the 22nd International Conference on Intelligent User Interfaces, IUI 2017, ACM
        • Shahram Eivazi, Wolfgang Fuhl, Enkelejda Kasneci
            Towards Intelligent Surgical Microscopes: Surgeons' Gaze and Instrument Tracking
          Proceedings of the 22nd International Conference on Intelligent User Interfaces, IUI 2017, ACM
        February 2017
        • W. Fuhl, T. Santini, D. Geisler, T. Kübler, E. Kasneci
            EyeLad: Remote Eye Tracking Image Labeling Tool
          12th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2017)
        • T. Santini, W. Fuhl, D. Geisler, E. Kasneci
            EyeRecToo: Open-Source Software for Real-Time Pervasive Head-Mounted Eye-Tracking
          12th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2017)
        • D. Geisler, W. Fuhl, T. Santini, E. Kasneci
            Saliency Sandbox: Bottom-Up Saliency Framework
          12th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2017)

        2016

          December 2016
          • W. Fuhl, T. Santini, C. Reichert, D. Claus, A. Herkommer, H. Bahmani, K. Rifai, S. Wahl, E. Kasneci
               Non-Intrusive Practitioner Pupil Detection for Unmodified Microscope Oculars
            Elsevier Computers in Biology and Medicine, 79, p.36–44
          October 2016
          • T.C. Kübler, W. Fuhl, R. Rosenberg, W. Rosenstiel, E. Kasneci
              Novel methods for analysis and visualization of saccade trajectories
            ECCV Workshop VISART 2016
          September 2016
          • W. Fuhl, D. Geisler, T. Santini, E. Kasneci
              Evaluation of State-of-the-Art Pupil Detection Algorithms on Remote Eye Images
            ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct publication -- PETMEI 2016
          • W. Fuhl, T. Santini, D. Geisler, T. Kübler, W. Rosenstiel, E. Kasneci
              Eyes Wide Open? Eyelid Location and Eye Aperture Estimation for Pervasive Eye Tracking in Real-World Scenarios
            ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct publication -- PETMEI 2016
          • H. Bahmani, W. Fuhl, E. Gutierrez, E. Kasneci, S. Wahl
             Feature-based attentional influences on the accommodation response
            Vision Sciences Society Annual Meeting Abstract
          June 2016
          • Wolfgang Fuhl, Marc Tonsen, Andreas Bulling, Enkelejda Kasneci
              Pupil detection for head-mounted eye tracking in the wild: An evaluation of the state of the art
            Machine Vision and Applications, p.1-14
          • W. Fuhl, T. Santini, G. Kasneci, E. Kasneci
             PupilNet: Convolutional Neural Networks for Robust Pupil Detection
            CoRR,
          March 2016
          • T. Santini, W. Fuhl, T. Kübler, E. Kasneci
              Bayesian Identification of Fixations, Saccades, and Smooth Pursuits
            Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications (ETRA), p.163--170
          • W. Fuhl, T. Santini, T. Kübler, E. Kasneci
              ElSe: Ellipse Selection for Robust Pupil Detection in Real-World Environments
            Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications (ETRA), p.123--130
          February 2016
          • T. Santini, W. Fuhl, T. C. Kübler, E. Kasneci
              EyeRec: An Open-source Data Acquisition Software for Head-mounted Eye-tracking
            Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP), Vol.3: VISAPP, p.386--391

          2015

            September 2015
            • W. Fuhl, T. C. Kübler, K. Sippel, W. Rosenstiel, E. Kasneci
                ExCuSe: Robust Pupil Detection in Real-World Scenarios
              16th International Conference on Computer Analysis of Images and Patterns (CAIP 2015)
            August 2015
            • W. Fuhl, T. C. Kübler, K. Sippel, W. Rosenstiel, E. Kasneci
               Arbitrarily shaped areas of interest based on gaze density gradient
              European Conference on Eye Movements, ECEM 2015
            March 2015
            • E. Kasneci, T.C. Kübler, C. Braunagel, W. Fuhl, W. Stolzmann, W. Rosenstiel
                Exploiting the potential of eye movements analysis in the driving context
              15. Internationales Stuttgarter Symposium Automobil- und Motorentechnik, Springer Fachmedien Wiesbaden
            January 2015
            • K. Sippel, T.C. Kübler, W. Fuhl, G. Schievelbein, R. Rosenberg, W. Rosenstiel [Best paper award]
                Eyetrace2014: Eyetracking Data Analysis Tool
              8th International Conference on Health Informatics, Healthinf 2015
            Publications without month specified, 2015
            • T. C. Kübler, K. Sippel, W. Fuhl, G. Schievelbein, J. Aufreiter, R. Rosenberg, W. Rosenstiel, E. Kasneci
                 Analysis of eye movements with Eyetrace
              Biomedical Engineering Systems and Technologies. Communications in Computer and Information Science (CCIS). Springer International Publishing

Open student theses

1. Vein extraction and eye rotation determination
2. EyeTrace CUDA extension
3. 3D eyeball generation based on vein motion

Vein extraction and eye rotation determination

The first step is setting up a recording environment with a fixed subject position. This
environment is used for data acquisition with predefined head rotations of the subjects. Based on this data, an algorithm has to be developed that measures the eyeball rotation of the subject. The resulting angle is then compared against and validated with the known head rotation; a sketch of this angle estimation step is given below.
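
A minimal sketch of such an angle estimation in C++, assuming OpenCV is available; the choice of BRISK features, the RANSAC-fitted similarity transform, and all function names are illustrative, not a prescribed solution:

// Sketch: estimate the in-plane eye rotation between a reference image and a
// test image from matched scleral vein features (OpenCV assumed available).
#include <opencv2/opencv.hpp>
#include <cmath>
#include <cstdio>
#include <vector>

// Illustrative helper: returns the rotation angle (degrees) of a similarity
// transform fitted to matched keypoints, or NAN if no model could be fitted.
double estimateEyeRotation(const cv::Mat& reference, const cv::Mat& rotated)
{
    cv::Ptr<cv::Feature2D> detector = cv::BRISK::create();  // vein texture features
    std::vector<cv::KeyPoint> kpRef, kpRot;
    cv::Mat descRef, descRot;
    detector->detectAndCompute(reference, cv::noArray(), kpRef, descRef);
    detector->detectAndCompute(rotated, cv::noArray(), kpRot, descRot);
    if (descRef.empty() || descRot.empty()) return NAN;

    cv::BFMatcher matcher(cv::NORM_HAMMING, /*crossCheck=*/true);
    std::vector<cv::DMatch> matches;
    matcher.match(descRef, descRot, matches);

    std::vector<cv::Point2f> src, dst;
    for (const cv::DMatch& m : matches) {
        src.push_back(kpRef[m.queryIdx].pt);
        dst.push_back(kpRot[m.trainIdx].pt);
    }
    if (src.size() < 3) return NAN;

    // Robust similarity fit; the rotation is read from the 2x3 matrix
    // [a -b tx; b a ty] as atan2(b, a).
    cv::Mat M = cv::estimateAffinePartial2D(src, dst, cv::noArray(), cv::RANSAC);
    if (M.empty()) return NAN;
    return std::atan2(M.at<double>(1, 0), M.at<double>(0, 0)) * 180.0 / CV_PI;
}

int main(int argc, char** argv)
{
    if (argc < 3) return 1;
    cv::Mat ref = cv::imread(argv[1], cv::IMREAD_GRAYSCALE);
    cv::Mat rot = cv::imread(argv[2], cv::IMREAD_GRAYSCALE);
    if (ref.empty() || rot.empty()) return 1;
    std::printf("estimated eye rotation: %.2f deg\n", estimateEyeRotation(ref, rot));
    return 0;
}

The angle returned for each image pair could then be compared directly against the predefined head rotation used during recording.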

EyeTrace CUDA extension

EyeTrace is a software tool for gaze data visualization and analysis. Due to the increasing amount of data, these visualizations require more and more computation time. In this thesis, the existing visualizations should be implemented using CUDA for GPU computation. This also includes a data storage model that makes it possible to move the data between the GPU and the host computer.
Since not all computers have a CUDA-capable card, the module should also support CPU computation and should choose between the two automatically; a sketch of this selection is given below.
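
A minimal sketch of the automatic GPU/CPU selection, assuming the CUDA runtime is available at build time; the heat-map accumulation only stands in for an actual EyeTrace visualization, and all names are illustrative:

// Sketch: run a gaze heat-map accumulation on the GPU when a CUDA device is
// present, otherwise fall back to the CPU (error handling omitted for brevity).
#include <cuda_runtime.h>
#include <vector>

// Hypothetical GPU kernel: accumulates gaze points into a width x height grid.
__global__ void accumulateKernel(const float2* gaze, int n, float* grid,
                                 int width, int height)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    int x = static_cast<int>(gaze[i].x);
    int y = static_cast<int>(gaze[i].y);
    if (x >= 0 && x < width && y >= 0 && y < height)
        atomicAdd(&grid[y * width + x], 1.0f);
}

// CPU fallback with the same semantics.
static void accumulateCpu(const std::vector<float2>& gaze, std::vector<float>& grid,
                          int width, int height)
{
    for (const float2& g : gaze) {
        int x = static_cast<int>(g.x), y = static_cast<int>(g.y);
        if (x >= 0 && x < width && y >= 0 && y < height)
            grid[y * width + x] += 1.0f;
    }
}

// True if at least one usable CUDA device is present.
static bool cudaAvailable()
{
    int count = 0;
    return cudaGetDeviceCount(&count) == cudaSuccess && count > 0;
}

// grid must be pre-sized to width * height; both paths accumulate into it.
void accumulate(const std::vector<float2>& gaze, std::vector<float>& grid,
                int width, int height)
{
    if (!cudaAvailable()) {                 // no CUDA-capable card: use the CPU path
        accumulateCpu(gaze, grid, width, height);
        return;
    }
    float2* dGaze = nullptr;
    float*  dGrid = nullptr;
    cudaMalloc(&dGaze, gaze.size() * sizeof(float2));
    cudaMalloc(&dGrid, grid.size() * sizeof(float));
    cudaMemcpy(dGaze, gaze.data(), gaze.size() * sizeof(float2), cudaMemcpyHostToDevice);
    cudaMemcpy(dGrid, grid.data(), grid.size() * sizeof(float), cudaMemcpyHostToDevice);

    int threads = 256;
    int blocks  = static_cast<int>((gaze.size() + threads - 1) / threads);
    accumulateKernel<<<blocks, threads>>>(dGaze, static_cast<int>(gaze.size()),
                                          dGrid, width, height);
    cudaMemcpy(grid.data(), dGrid, grid.size() * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dGaze);
    cudaFree(dGrid);
}

Here cudaGetDeviceCount decides at run time whether the GPU path can be used, and the host/device copies around the kernel launch mark the points where the proposed data storage model would move the data between GPU and host.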

3D Eyeball generation based on vein motion

The first step is robust feature extraction. This can be done using SURF, SIFT, BRISK,
or MSER features if they prove sufficient. These features have to be matched to features found in
consecutive images. Based on the displacement, a 3D model has to be computed, which is afterwards used for gaze position estimation. A sketch of the frame-to-frame matching step is given below.
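
A minimal sketch of that matching step, again assuming OpenCV; BRISK stands in for any of the feature types listed above, and the structure and function names are illustrative:

// Sketch: match vein features between two consecutive eye images and collect
// the per-feature displacements that a 3D eyeball model would later be fitted to.
#include <opencv2/opencv.hpp>
#include <vector>

struct Displacement {
    cv::Point2f from;   // feature position in the previous frame
    cv::Point2f to;     // matched position in the current frame
};

std::vector<Displacement> matchConsecutive(const cv::Mat& prev, const cv::Mat& curr)
{
    cv::Ptr<cv::Feature2D> detector = cv::BRISK::create();   // SIFT/SURF/MSER-based features are alternatives
    std::vector<cv::KeyPoint> kpPrev, kpCurr;
    cv::Mat descPrev, descCurr;
    detector->detectAndCompute(prev, cv::noArray(), kpPrev, descPrev);
    detector->detectAndCompute(curr, cv::noArray(), kpCurr, descCurr);

    std::vector<Displacement> result;
    if (descPrev.empty() || descCurr.empty()) return result;

    // Ratio test on the two nearest neighbours to discard ambiguous matches.
    cv::BFMatcher matcher(cv::NORM_HAMMING);
    std::vector<std::vector<cv::DMatch>> knn;
    matcher.knnMatch(descPrev, descCurr, knn, 2);
    for (const std::vector<cv::DMatch>& pair : knn) {
        if (pair.size() == 2 && pair[0].distance < 0.75f * pair[1].distance) {
            result.push_back({kpPrev[pair[0].queryIdx].pt,
                              kpCurr[pair[0].trainIdx].pt});
        }
    }
    return result;   // input for the subsequent 3D model estimation
}

The collected displacement vectors are the input from which the 3D eyeball model, and later the gaze position, would be estimated.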