This is the final master's thesis project for IT University of Technology. The thesis investigates an innovative, secure, and seamless way of performing verification using an eye tracking technique.
A few words about the idea
Eye biometrics is a fairly new and rapidly growing field. It offers a vast array of possible use cases, including inventive security measures. The main objective of this project was to implement a fast, secure, and user-friendly eye verification solution. Combining multiple complementary biometrics can also provide higher recognition accuracy than any individual biometric alone. The suggested system could serve as the password input step in a security procedure where the username is determined by another solution, such as iris or face recognition. The verification process is based on the user following a moving object on a screen while their eyes are being tracked. Data from the eye tracker is extracted and matched to predefined trajectories, thus providing the expected password input. The overall simplified structure of the system is shown in the picture below.
In the main application view, blobs can be anything from a specific color to a symbol or an image. The system is very straightforward to use and provides seamless verification without eye tracker calibration (which is a huge advantage). It is also very versatile and can be applied to various use cases with different levels of security. The procedure leaves room for a vast array of possible improvements. The trajectories can be randomly generated or follow specific predefined conditions. There is also the possibility of using a polarized security screen with glasses, or virtual reality goggles, in order to decrease the chance of malicious extraction of the eye movement and, with it, the password.
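As a rough illustration of how a turn-based blob trajectory could be generated: the sketch below builds a list of 2D waypoints from a sequence of turns. The turn encoding (`F`/`L`/`R`) and unit step size are illustrative assumptions, not the exact scheme from the paper.

```python
# Sketch: build a blob trajectory from a sequence of pre-determined turns.
# The encoding ("F" = forward, "L"/"R" = rotate 90 degrees then move) and
# the step size are assumptions for illustration only.

def trajectory_from_turns(turns, start=(0.0, 0.0), step=1.0):
    """Turn a string of moves into a list of 2D waypoints."""
    x, y = start
    dx, dy = 1.0, 0.0  # initial heading: to the right
    points = [(x, y)]
    for t in turns:
        if t == "L":    # rotate heading 90 degrees counter-clockwise
            dx, dy = -dy, dx
        elif t == "R":  # rotate heading 90 degrees clockwise
            dx, dy = dy, -dx
        # move one step along the current heading
        x, y = x + dx * step, y + dy * step
        points.append((x, y))
    return points
```

Randomly generated trajectories would then amount to sampling a random turn string per blob and converting it to waypoints the same way.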
Technical mumbo jumbo
A detailed, step-by-step technical implementation can be found in the original paper. To sum up, the whole system consisted of these steps:
- The GUI was built with the Kivy library in Python.
- Blob movement trajectories were created from pre-determined turns, shown in the picture below.
- While the blobs are moving, the user follows them with their eyes.
- Eye tracking data is recorded with a Tobii eye tracker through the Ctypes library, which wraps the natively compiled Windows libraries.
- Eye tracking data is matched against the predefined trajectories using the Hausdorff distance to find the closest match.
- This way the user can enter very short inputs (a single letter) or full words using the alphabet view.
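The matching step above can be sketched as follows. This is a minimal symmetric Hausdorff distance in pure Python; the candidate trajectories and labels in the usage example are made up for illustration (the thesis compares several measures, so the exact variant may differ).

```python
import math

def hausdorff(a, b):
    """Symmetric Hausdorff distance between two 2D point sequences."""
    def directed(p, q):
        # For each point in p, find the distance to its nearest point in q,
        # then take the worst (largest) of those nearest distances.
        return max(min(math.dist(u, v) for v in q) for u in p)
    return max(directed(a, b), directed(b, a))

def match_input(gaze, candidates):
    """Return the label of the candidate trajectory closest to the gaze path."""
    return min(candidates, key=lambda label: hausdorff(gaze, candidates[label]))
```

For example, a noisy horizontal gaze path is matched to the horizontal candidate:

```python
candidates = {"A": [(0.0, 0.0), (1.0, 0.0)],   # horizontal stroke
              "B": [(0.0, 1.0), (1.0, 1.0)]}   # parallel stroke, higher up
gaze = [(0.1, 0.05), (0.9, -0.02)]             # noisy samples near "A"
match_input(gaze, candidates)                  # -> "A"
```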
Again, full details on how I came up with this structure, how I tested different comparison measures, and how pixel upscaling comes into play can be found in the original paper's model section.
How does it look in action?
The final usage of the system can be seen in the picture below. In this example the short alphabet was used, with only one letter input per turn. This version was the most intuitive and provided the widest variety of use cases.
The green lines show the original input from the eye tracker (normalized). The eye tracker was not calibrated beforehand, and this is one of the biggest advantages of the system: different users can use the same uncalibrated device straight away, without spending any time on setup.
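Normalization of the raw gaze samples could look something like min-max scaling into the unit square; this is a sketch under that assumption, and the exact normalization used in the thesis may differ.

```python
def normalize(points):
    """Min-max scale 2D gaze samples into the unit square [0, 1] x [0, 1]."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]

    def scale(v, lo, hi):
        # Guard against a degenerate axis where all samples coincide.
        return (v - lo) / (hi - lo) if hi > lo else 0.0

    return [(scale(x, min(xs), max(xs)), scale(y, min(ys), max(ys)))
            for x, y in points]
```

Scaling both the gaze path and the candidate trajectories to the same range keeps the Hausdorff comparison independent of screen resolution and of where on the screen the user happened to look.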
The research proposed a novel approach to person verification using an eye tracking technique. The method was expected to be a reliable, intuitive, and user-friendly security solution, and the results indicated that it is a feasible concept with some limitations. Several improvements could be investigated, with different interface configurations, various eye tracker devices, and alternative input matching approaches, in order to determine the optimal parameters of the system and thus reduce the error rate and user annoyance. The solution can also be adapted to various applications outside the security field where the error rate is not a pressing issue. Procedures like browsing a car control menu while driving, or responding to a message while wearing augmented reality goggles, could benefit from the fast and reliable input matching achieved in this research.
The full project paper can be found here.