Accuracy

We achieve a perfect score on several public datasets (NUAA, CASIA-FASD, MSU...). These datasets are too small to train a deep model on, but they are large enough to guide its design.

Our deep learning model was trained on our own, far more challenging dataset. It reaches 99.62% accuracy on the validation set and 99.27% on the test set.

The model was developed using Keras with a TensorFlow backend and has 80M parameters.
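
The exact architecture is not public; purely as an illustration, the sketch below shows what a Keras/TensorFlow binary genuine-vs-spoof classifier can look like. The layer sizes, the input shape, and the `build_liveness_model` name are our assumptions, not the shipped 80M-parameter model.

```python
from tensorflow import keras
from tensorflow.keras import layers

def build_liveness_model(input_shape=(224, 224, 3)):
    """Hypothetical genuine-vs-spoof classifier. The real 80M-parameter
    architecture is not public; this is only a shape-compatible sketch."""
    inputs = keras.Input(shape=input_shape)
    x = inputs
    # Small convolutional trunk: each block halves the spatial resolution.
    for filters in (32, 64, 128, 256):
        x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
        x = layers.MaxPooling2D()(x)
    x = layers.Flatten()(x)
    x = layers.Dense(512, activation="relu")(x)
    # Single sigmoid unit: probability that the face is genuine, not a spoof.
    outputs = layers.Dense(1, activation="sigmoid")(x)
    model = keras.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_liveness_model()
model.summary()  # prints the layer stack and total parameter count
```

Given a labeled hold-out set, `model.evaluate(x_test, y_test)` would return the loss together with an accuracy metric of the kind quoted above.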

You can test the accuracy with your own images at https://www.doubango.org/webapps/face-liveness/.

The competition

Our demo web app runs in an unconstrained environment: you can drag and drop any image, use any resolution (4K or more), use your front or back camera, and put the camera at any distance…

Other companies would not offer their products in such an unconstrained environment, for several reasons:
    1. Some implementations are texture-based, using LBP variants or deep learning, which requires the face to be very close to the camera. Some companies clearly state on their websites that “it will result in poor/undefined behavior” if you don’t follow this rule; this is why they require you to put your face inside a box. We also recommend a close face, but not doing so will still result in predictable behavior (a minimal LBP sketch follows this list).

    2. They will not let you choose the resolution. It’s very hard to distinguish a spoof from a genuine face when the resolution and image quality are very high, which is why some companies will not let you choose resolutions higher than HD (see the downscaling sketch after this list).

    3. Some will not let you choose the back camera. The back camera has a higher resolution, which makes it easier to take a good picture of a spoof. The front camera has a lower resolution, and it’s very uncomfortable to take a picture of someone else with it without introducing distortions: try using the front camera to fit someone else’s face in a tiny box without distortion and you’ll feel the pain. Also, using the front camera at close range makes the screen light reflect on the paper or screen used for the spoof, which makes spoofs easier to detect. Dim the screen light and you’ll be able to crack some implementations.
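
To make item 1 concrete, here is a minimal sketch of LBP texture features as used in classic anti-spoofing pipelines. The `lbp_histogram` helper, the scikit-image dependency, and the parameter choices are illustrative assumptions, not any particular vendor’s implementation.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(gray_face, points=8, radius=1):
    """Uniform LBP histogram of a grayscale face crop (2D numpy array).

    Texture-based anti-spoofing feeds such histograms to a classifier
    (e.g. an SVM). Print and replay artifacts shift these micro-texture
    statistics, but only if the crop is close and sharp enough to
    resolve them; hence the mandatory tight face box.
    """
    lbp = local_binary_pattern(gray_face, points, radius, method="uniform")
    n_bins = points + 2  # "uniform" LBP produces P + 2 distinct codes
    hist, _ = np.histogram(lbp, bins=n_bins, range=(0, n_bins), density=True)
    return hist
```

A distant or low-resolution face blurs these micro-textures away, which is why texture-based methods demand a tightly cropped, close face.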
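And to illustrate item 2, a tiny sketch of the resolution cap itself, assuming Pillow; the `cap_to_hd` name and the 1280-pixel limit are our assumptions:

```python
from PIL import Image

def cap_to_hd(image_path, max_side=1280):
    """Downscale an image so its longest side is at most ~HD,
    mimicking vendors that refuse or resize anything larger."""
    img = Image.open(image_path).convert("RGB")
    img.thumbnail((max_side, max_side), Image.LANCZOS)  # keeps aspect ratio
    return img
```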