Today Sergei, CTO at BASIS ID, talks about one element of user verification: the liveness check. This is the stage of the identity verification process that confirms that the user who wants to use a service is a real, living person who is actually sitting at the computer or holding the smartphone.
You have surely heard the stories of phones that could be unlocked by showing the camera a photo of the owner: it was enough to place a realistic, good-quality photo of the person in front of the camera, instead of the living person, for the phone to "recognize" its owner.
Sometimes we also encounter users who try to cheat their way through the verification process using the same trick: they record a video while showing the camera a picture or a document that most likely does not belong to the person trying to pass verification. Of course, all modern verification systems have a more advanced liveness check than older smartphones did; at the very least, the requirement to turn your head in both directions was added. This resolves any doubts during manual verification, but what about fully automated verification using machine learning at a company that processes huge numbers of users per minute, as we do at BASIS ID?
The most primitive solution is simply to detect a multitude of moving points in the frame, assume from their motion that the person in the video is alive, and then match the recognized face against the photograph in the document.
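To make this concrete, here is a minimal sketch of that naive approach using simple frame differencing. It is purely illustrative, not our production code: NumPy arrays stand in for grayscale video frames, and the function names and thresholds are made up for the example.

```python
import numpy as np

def motion_score(prev_frame: np.ndarray, curr_frame: np.ndarray, threshold: int = 25) -> float:
    """Fraction of pixels that changed noticeably between two grayscale frames."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return float((diff > threshold).mean())

def looks_alive(frames: list, min_moving_fraction: float = 0.01) -> bool:
    """Naive liveness guess: enough pixels move between every pair of consecutive frames."""
    scores = [motion_score(a, b) for a, b in zip(frames, frames[1:])]
    return all(s > min_moving_fraction for s in scores)

# Toy demo: a static "photo" produces no motion; a shifting pattern does.
static = [np.zeros((4, 4), dtype=np.uint8)] * 3
moving = [np.roll(np.eye(4, dtype=np.uint8) * 255, k, axis=1) for k in range(3)]
print(looks_alive(static))  # False
print(looks_alive(moving))  # True
```

The weakness is obvious: a photo waved in front of the camera also produces plenty of moving pixels, which is exactly why this approach alone is not enough.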
It sounds simple; however, the task here splits into two separate parts, motion recognition and face recognition, and keeping them separate cannot completely eliminate errors, especially against a more sophisticated way of placing an image in the frame. At BASIS ID we have developed a truly advanced liveness check system that links all the elements of the liveness check together and virtually eliminates the possibility of error. This is our know-how, of course, but we are happy to explain how it works.
The algorithm of a truly reliable liveness check system
So, in the approach described above, a certain array of points is detected and then checked to see whether it is moving. We approached this a little differently. A human face contains a set of elements that let us not only track movement but also compare the video, right during processing, with the photo we received or with the image of the person in their document.
First of all, the distance between the user's eyes and the distances from the eyes to the temples are estimated. These elements, along with a set of other points, are compared to the user's photo. Then, as the person turns their head, the apparent distance between the eyes decreases, while the distance from the center of one eye to the temple increases. Thus we perceive and continuously track the movement, which rules out substituting a different person or, for example, the trick of placing a photo of one person over another person's face.
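Why do these distances change in opposite directions? A toy geometric sketch makes it clear. The landmark coordinates below are entirely made up for illustration (the real system uses many more points): we place eyes and temples in 3D, rotate the head about the vertical axis, and project onto the image plane.

```python
import math

# Hypothetical 3D landmark positions in arbitrary units: x = left/right, z = depth.
# Eyes sit forward of the temples, which curve back around the head.
LANDMARKS = {
    "left_eye":     (-3.0,  0.0),
    "right_eye":    ( 3.0,  0.0),
    "left_temple":  (-6.0, -2.0),
    "right_temple": ( 6.0, -2.0),
}

def project_x(x: float, z: float, yaw_deg: float) -> float:
    """Horizontal image coordinate after rotating the head by yaw_deg about the vertical axis."""
    yaw = math.radians(yaw_deg)
    return x * math.cos(yaw) + z * math.sin(yaw)

def projected_distance(a: str, b: str, yaw_deg: float) -> float:
    """Apparent horizontal distance between two landmarks at a given head yaw."""
    xa = project_x(*LANDMARKS[a], yaw_deg)
    xb = project_x(*LANDMARKS[b], yaw_deg)
    return abs(xa - xb)

# Facing the camera, then turned 30 degrees:
print(projected_distance("left_eye", "right_eye", 0))    # 6.0
print(projected_distance("left_eye", "right_eye", 30))   # ~5.2 (inter-eye distance shrinks)
print(projected_distance("left_eye", "left_temple", 0))  # 3.0
print(projected_distance("left_eye", "left_temple", 30)) # ~3.6 (eye-to-temple distance grows)
```

A flat photo rotated in front of the camera cannot reproduce this pattern: all its projected distances shrink by the same factor, whereas on a real 3D head some distances grow while others shrink, which is the kind of consistency a tracker can check on every frame.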
In this way, head turns in both directions are tracked, while the obtained data is continuously compared against the original image from the document.
In the process, several dozen frames are analyzed, from which the system automatically determines not only whether the person in the video is alive, but also whether they are the same person as in the document. In addition to evaluating the distances, blinks are analyzed at more than 120 frames per second, which lets us be sure the person is moving and gives additional confirmation of their liveness.
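Blink analysis is commonly done with an eye aspect ratio (EAR) over six eye landmarks: the ratio of the eye's vertical openness to its width drops sharply when the eye closes. The sketch below is a generic illustration of that idea, not the BASIS ID implementation; the threshold, landmark names, and the sample EAR trace are all assumed values.

```python
import math

def dist(p, q):
    """Euclidean distance between two 2D points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def eye_aspect_ratio(p1, p2, p3, p4, p5, p6):
    """EAR over six eye landmarks: vertical openness over horizontal width.
    p1/p4 are the eye corners; p2, p3, p5, p6 are the upper and lower lids."""
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

def count_blinks(ear_series, threshold=0.2, min_frames=2):
    """Count dips of the EAR below `threshold` lasting at least `min_frames` frames.
    At high frame rates even a fast blink spans several frames, which filters noise."""
    blinks, run = 0, 0
    for ear in ear_series:
        if ear < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:
        blinks += 1
    return blinks

# Toy per-frame EAR trace: eye open (~0.3), one blink, open again.
trace = [0.31, 0.30, 0.12, 0.08, 0.11, 0.29, 0.30]
print(count_blinks(trace))  # 1
```

Sampling this signal at a high frame rate matters: a blink lasts on the order of 100–300 ms, so at 120+ frames per second it spans many frames and is hard to miss or to fake with a static image.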
Of course, the liveness check is only one part of the identity verification process. Verification includes many more elements of due diligence, and we believe we have managed to create the most advanced verification tool, not only technically but also in terms of customer experience. After all, our task is to make the world not only safer but also more convenient.