P306 - BIOSIG 2020 - Proceedings of the 19th International Conference of the Biometrics Special Interest Group
Listing by date of issue, showing entries 1-10 of 33.
- Conference contribution: Effects of sample stretching in face recognition (BIOSIG 2020 - Proceedings of the 19th International Conference of the Biometrics Special Interest Group, 2020). Hedberg, Mathias Fredrik. Face stretching can occur both intentionally and unintentionally when preparing a face sample for enrollment in a face recognition system. In this paper we assess what effects horizontal and vertical stretching have on face recognition algorithms. Basic closed-set identification tests revealed that holistic face recognition algorithms performed poorly compared to feature-based recognition algorithms when classifying non-stretched samples against templates based on stretched samples. (An illustrative stretching sketch follows this listing.)
- Conference contribution: End-to-end Off-angle Iris Recognition Using CNN Based Iris Segmentation (BIOSIG 2020 - Proceedings of the 19th International Conference of the Biometrics Special Interest Group, 2020). Jalilian, Ehsaneddin; Karakaya, Mahmut; Uhl, Andreas. While deep learning techniques are increasingly becoming the tool of choice for iris segmentation, there is as yet no comprehensive recognition framework dedicated to off-angle iris recognition using such modules. In this work, we investigate the effect of different gaze angles on CNN-based off-angle iris segmentation and the resulting recognition performance, and introduce an improvement scheme to compensate for some of the segmentation degradations caused by off-angle distortions. We also propose an off-angle parameterization algorithm to re-project off-angle images back to a frontal view. Building on these, we further investigate whether (i) improving the segmentation outputs and/or correcting the iris images before or after segmentation can compensate for off-angle distortions, or (ii) the generalization capability of the network can be improved by training it on iris images of different gaze angles. In each experimental step, segmentation accuracy and recognition performance are evaluated, and the results are analyzed and compared. (A sketch of a generic re-projection step follows this listing.)
- Conference contribution: BIOSIG 2020 - Complete volume (BIOSIG 2020 - Proceedings of the 19th International Conference of the Biometrics Special Interest Group, 2020)
- Conference contribution: Watchlist Adaptation: Protecting the Innocent (BIOSIG 2020 - Proceedings of the 19th International Conference of the Biometrics Special Interest Group, 2020). Günther, Manuel; Dhamija, Akshay Raj; Boult, Terrance E. One of the most important government applications of face recognition is the watchlist problem, where the goal is to identify a few people enlisted on a watchlist while ignoring the majority of innocent passersby. Since watchlists change dynamically and training times can be expensive, deployed approaches use pre-trained deep networks only to provide deep features for face comparison. Because these networks were never specifically trained on the operational setting or on faces from the watchlist, the system will often confuse innocent non-watchlist subjects with watchlist identities, leading to difficult situations, e.g., being detained at the airport until their identity is resolved. We develop a novel approach that takes an existing pre-trained face network and uses adaptation layers trained with our recently developed Objectosphere loss to provide an open-set recognition system that is rapidly adapted to the gallery while ignoring non-watchlist faces as well as any background detections from the face detector. While our adapter network can be trained quickly without re-training the entire representation network, it can also significantly improve the performance of any state-of-the-art face recognition network such as VGG2. We experiment with the largest open-set face recognition dataset, the UnConstrained College Students (UCCS) dataset. It contains real surveillance camera stills including both known and unknown subjects, as well as many non-face regions from the face detector. We show that the Objectosphere approach is able to reduce the feature magnitude of unknown subjects as well as of background detections, so that we can apply a specifically designed similarity function on the deep features of the Objectosphere network, which works much better than the direct prediction of the very same network. Additionally, our approach outperforms the VGG2 baseline by a large margin by rejecting the non-face data, and also outperforms prior state-of-the-art open-set recognition algorithms on the VGG2 baseline data. (A hedged sketch of a magnitude-weighted similarity follows this listing.)
- Conference contribution: Can Generative Colourisation Help Face Recognition? (BIOSIG 2020 - Proceedings of the 19th International Conference of the Biometrics Special Interest Group, 2020). Drozdowski, Pawel; Fischer, Daniel; Rathgeb, Christian; Geissler, Julian; Knedlik, Jan; Busch, Christoph. Generative colourisation methods can be applied to automatically convert greyscale images into realistic-looking colour images. In a face recognition system, such techniques might be employed as a pre-processing step in scenarios where one or both of the face images to be compared are only available in greyscale format. In an experimental setup that reflects these scenarios, we investigate whether generative colourisation can improve face sample utility and the overall biometric performance of face recognition. To this end, subsets of the FERET and FRGCv2 face image databases are converted to greyscale and colourised using two versions of the DeOldify colourisation algorithm. Face sample quality assessment is done using the FaceQnet quality estimator. Biometric performance measurements are conducted for the widely used ArcFace system with its built-in face detector and reported according to standardised metrics. The obtained results indicate that, for the tested systems, the application of generative colourisation improves neither face image quality nor recognition performance. However, generative colourisation was found to aid face detection and the subsequent feature extraction of the used face recognition system, which results in a decrease of the overall false reject rate.
- Conference contribution: Minutiae-based Finger Vein Recognition Evaluated with Fingerprint Comparison Software (BIOSIG 2020 - Proceedings of the 19th International Conference of the Biometrics Special Interest Group, 2020). Castillo-Rosado, Katy; Linortner, Michael; Uhl, Andreas; Mendez-Vasquez, Heydi; Hernandez-Palancar, José. Finger vein recognition is a biometric authentication technique based on the vein patterns of human fingers. Although classical approaches are based on correlation, the topology of vein patterns allows the use of minutiae points for their representation. Minutiae points are the features most commonly used to represent ridge patterns in fingerprints. It has been shown in the literature that minutiae can be used for finger vein comparison, but low image quality causes many spurious minutiae to be extracted. In this work, a preprocessing method is presented that combines classical digital image processing methods and level set theory in order to extract a set of the most reliable minutiae. The experiments were performed on two publicly available databases, and different comparison methods were used to test how representative the extracted minutiae set is. The results show that even though the number of extracted minutiae is only around 15-30, effective identification is possible. (A sketch of crossing-number minutiae detection follows this listing.)
- Conference contribution: Efficiency Analysis of Post-quantum-secure Face Template Protection Schemes based on Homomorphic Encryption (BIOSIG 2020 - Proceedings of the 19th International Conference of the Biometrics Special Interest Group, 2020). Kolberg, Jascha; Drozdowski, Pawel; Gomez-Barrero, Marta; Rathgeb, Christian; Busch, Christoph. Since biometric characteristics are not revocable and biometric data is sensitive, privacy-preserving methods are essential for operating a biometric recognition system. More precisely, the biometric information protection standard ISO/IEC IS 24745 requires that biometric templates are stored and compared in a secure domain. Using homomorphic encryption (HE), we can ensure permanent protection, since mathematical operations on the ciphertexts directly correspond to those on the plaintexts. Thus, HE allows the distance between two protected templates to be computed in the encrypted domain without degrading the biometric performance of the corresponding system. In this paper, we benchmark three post-quantum-secure HE schemes, and thereby show that a face verification in the encrypted domain requires only 50 ms transaction time and a template size of 5.5 KB. (A conceptual sketch of encrypted distance computation follows this listing.)
- Conference contribution: Fisher Vector Encoding of Dense-BSIF Features for Unknown Face Presentation Attack Detection (BIOSIG 2020 - Proceedings of the 19th International Conference of the Biometrics Special Interest Group, 2020). González-Soler, Lázaro J.; Gomez-Barrero, Marta; Busch, Christoph. The task of determining whether a sample stems from a real subject (i.e., it is a bona fide presentation) or from an artificial replica (i.e., it is an attack presentation) is a mandatory requirement for biometric capture devices and has received a lot of attention in the recent past. Nowadays, most face Presentation Attack Detection (PAD) approaches report good detection performance when evaluated on known Presentation Attack Instruments (PAIs) and acquisition conditions, in contrast to more challenging scenarios where unknown attacks are included in the evaluation. For those more realistic scenarios, existing approaches are in many cases unable to detect unknown PAI species. In this work, we introduce a new feature space based on Fisher vectors, computed from compact Binarised Statistical Image Features (BSIF) histograms, which allows finding semantic feature subsets from known samples in order to enhance the detection of unknown attacks. This new representation, evaluated on three freely available facial databases, yields results among the top state of the art: a BPCER100 under 17% together with an AUC over 98% can be achieved in the presence of unknown attacks. (A first-order Fisher vector sketch follows this listing.)
- Conference contribution: 3D Face Recognition For Cows (BIOSIG 2020 - Proceedings of the 19th International Conference of the Biometrics Special Interest Group, 2020). Yeleshetty, Deepak; Spreeuwers, Luuk; Li, Yan. This paper presents a method to recognize cows using 3D point clouds of their faces. The face is chosen because of the rigid structure of the skull compared to other body parts. The 3D face point clouds are acquired using a newly designed dual 3D camera setup. After registering the 3D faces to a specific pose, the cow's ID is determined by running the Iterative Closest Point (ICP) method on the probe against all point clouds in the gallery. The root mean square error (RMSE) between the ICP correspondences is used to identify the cows: the smaller the RMSE, the more likely the probe belongs to the same cow. In a closed set of 32 cows with 5 point clouds per cow in the gallery, ICP recognition achieves an almost perfect identification rate of 99.53%. (A sketch of the ICP-based identification rule follows this listing.)
- Conference contribution: Toward to Reduction of Bias for Gender and Ethnicity from Face Images using Automated Skin Tone Classification (BIOSIG 2020 - Proceedings of the 19th International Conference of the Biometrics Special Interest Group, 2020). Molina, David; Causa, Leonardo; Tapia, Juan. This paper proposes and analyzes a new approach for reducing the gender classification bias caused by skin tone in face images, based on transfer learning with fine-tuning. The ethnicity categorization was developed with an objective method instead of the subjective Fitzpatrick scale: a K-means method was used to categorize the face colours using clusters of RGB pixel values. In addition, a new database was collected from the internet and will be made available upon request. Our method outperforms the state of the art and reduces the gender classification bias using the skin-type categorization. The best results were achieved with the VGGNET architecture, with 96.71% accuracy and a 3.29% error rate. (A K-means skin-tone clustering sketch follows this listing.)
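
Illustrative sketch for "Effects of sample stretching in face recognition": the entry studies horizontal and vertical stretching of face samples before enrollment. The snippet below is only a minimal example of such per-axis resampling with OpenCV; the scale factors are arbitrary illustration values, not taken from the paper.

```python
# Minimal sketch of per-axis face stretching (not the paper's code);
# the horizontal/vertical factors are arbitrary illustration values.
import cv2

def stretch_face(image, horizontal=1.0, vertical=1.0):
    """Resample a face crop with independent horizontal and vertical scale factors."""
    h, w = image.shape[:2]
    new_size = (int(round(w * horizontal)), int(round(h * vertical)))  # (width, height)
    return cv2.resize(image, new_size, interpolation=cv2.INTER_LINEAR)

# e.g. a 20% horizontal stretch of an enrollment sample:
# stretched = stretch_face(cv2.imread("face.png"), horizontal=1.2, vertical=1.0)
```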
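
Illustrative sketch for "End-to-end Off-angle Iris Recognition Using CNN Based Iris Segmentation": the entry re-projects off-angle iris images back to a frontal view. The paper's parameterization algorithm is not reproduced here; the snippet only shows a generic perspective correction with OpenCV, assuming four corresponding points between the off-angle and frontal iris are available.

```python
# Generic perspective re-projection sketch (not the paper's parameterization
# algorithm): warp an off-angle iris image towards a frontal view given four
# assumed point correspondences.
import cv2
import numpy as np

def reproject_to_frontal(off_angle_img, src_pts, dst_pts, out_size):
    """src_pts/dst_pts: (4, 2) corresponding points; out_size: (width, height)."""
    H = cv2.getPerspectiveTransform(np.float32(src_pts), np.float32(dst_pts))
    return cv2.warpPerspective(off_angle_img, H, out_size)
```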
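
Illustrative sketch for "Watchlist Adaptation: Protecting the Innocent": the entry reports that the Objectosphere loss drives the deep-feature magnitude of unknown faces and background detections towards zero. The similarity below is not the paper's exact function; it is a hedged sketch of that idea, attenuating cosine similarity by the probe's feature norm so that low-magnitude (likely unknown) probes fall below the watchlist threshold.

```python
# Hedged sketch of a magnitude-aware open-set similarity (not the paper's
# exact function): unknowns with small deep-feature norms score low everywhere.
import numpy as np

def magnitude_weighted_similarity(probe_feat, gallery_feats):
    """probe_feat: (d,) deep feature; gallery_feats: (n, d) enrolled watchlist features."""
    probe_norm = np.linalg.norm(probe_feat)
    cosine = gallery_feats @ probe_feat / (
        np.linalg.norm(gallery_feats, axis=1) * probe_norm + 1e-12)
    return cosine * probe_norm  # attenuate scores of low-magnitude probes

def watchlist_decision(scores, threshold):
    """Return (best_gallery_index, score) or (None, score) if the probe is rejected."""
    best = int(np.argmax(scores))
    return (best, scores[best]) if scores[best] >= threshold else (None, scores[best])
```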
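
Illustrative sketch for "Minutiae-based Finger Vein Recognition Evaluated with Fingerprint Comparison Software": the entry represents vein patterns by minutiae points. The snippet implements the standard crossing-number rule on a skeletonised binary vein image (endings have crossing number 1, bifurcations 3); it does not reproduce the paper's level-set preprocessing.

```python
# Standard crossing-number minutiae detection on a binary vein skeleton
# (sketch only; the paper's level-set preprocessing is not reproduced).
import numpy as np
from skimage.morphology import skeletonize

def crossing_number_minutiae(binary_veins):
    """binary_veins: 2-D boolean array, True on vein pixels."""
    skel = skeletonize(binary_veins).astype(np.uint8)
    minutiae = []
    # 8-neighbourhood visited in circular order around the centre pixel
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for r in range(1, skel.shape[0] - 1):
        for c in range(1, skel.shape[1] - 1):
            if not skel[r, c]:
                continue
            ring = [int(skel[r + dr, c + dc]) for dr, dc in offsets]
            cn = sum(abs(ring[i] - ring[(i + 1) % 8]) for i in range(8)) // 2
            if cn == 1:
                minutiae.append((r, c, "ending"))
            elif cn == 3:
                minutiae.append((r, c, "bifurcation"))
    return minutiae
```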
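
Conceptual sketch for "Efficiency Analysis of Post-quantum-secure Face Template Protection Schemes based on Homomorphic Encryption": the entry relies on the fact that operations on ciphertexts mirror operations on plaintexts, so a distance can be evaluated on protected templates. The snippet uses the additively homomorphic Paillier scheme from the python-paillier (`phe`) package, which is not one of the post-quantum schemes benchmarked in the paper; it is chosen only because it makes the ciphertext arithmetic easy to show.

```python
# Conceptual sketch only: squared Euclidean distance evaluated on an encrypted
# template. Paillier (pip install phe) is NOT post-quantum and is used here
# purely to illustrate that ciphertext arithmetic mirrors plaintext arithmetic.
import numpy as np
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Enrolment: store the reference template x only in encrypted form,
# together with Enc(||x||^2), so no plaintext feature is kept server-side.
x = np.random.rand(8)                                   # toy face embedding
enc_x = [public_key.encrypt(float(v)) for v in x]
enc_x_sq = public_key.encrypt(float(np.dot(x, x)))

# Verification: compare a fresh plaintext probe y against the protected template.
# ||x - y||^2 = ||x||^2 - 2*x.y + ||y||^2, evaluated on ciphertexts:
y = x + np.random.normal(0, 0.05, size=x.shape)         # genuine-looking probe
enc_dist = enc_x_sq + float(np.dot(y, y))
for enc_xi, yi in zip(enc_x, y):
    enc_dist += enc_xi * float(-2.0 * yi)

# Only the final, encrypted distance is decrypted for the threshold decision.
print("decrypted distance:", private_key.decrypt(enc_dist))
print("plaintext distance:", float(np.sum((x - y) ** 2)))
```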
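
Illustrative sketch for "Fisher Vector Encoding of Dense-BSIF Features for Unknown Face Presentation Attack Detection": a first-order Fisher vector encoder over local descriptors (such as densely sampled BSIF histograms), assuming a diagonal-covariance GMM fitted with scikit-learn. Second-order terms and the usual power/L2 normalisation are omitted, so this is a simplification rather than the paper's pipeline.

```python
# Simplified first-order Fisher vector encoding over local descriptors,
# assuming a diagonal-covariance GMM; second-order statistics and the common
# power/L2 normalisation are omitted for brevity.
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_gmm(train_descriptors, n_components=64):
    gmm = GaussianMixture(n_components=n_components, covariance_type="diag",
                          random_state=0)
    return gmm.fit(train_descriptors)

def fisher_vector(descriptors, gmm):
    """descriptors: (N, d) local features of one image -> (K*d,) encoding."""
    gamma = gmm.predict_proba(descriptors)               # (N, K) responsibilities
    sigma = np.sqrt(gmm.covariances_)                    # (K, d) diagonal std devs
    n = descriptors.shape[0]
    parts = []
    for k in range(gmm.n_components):
        diff = (descriptors - gmm.means_[k]) / sigma[k]  # (N, d) whitened residuals
        grad_mu = (gamma[:, [k]] * diff).sum(axis=0) / (n * np.sqrt(gmm.weights_[k]))
        parts.append(grad_mu)
    return np.concatenate(parts)
```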
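
Illustrative sketch for "3D Face Recognition For Cows": the closed-set identification rule described in the entry, i.e. register the probe against every gallery point cloud with ICP and return the identity with the smallest registration RMSE. Open3D is assumed to be available; the identity-initialised ICP and the correspondence threshold are placeholders, since the paper first registers all faces to a specific pose.

```python
# Closed-set identification sketch: smallest ICP registration RMSE over the
# gallery wins. Gallery layout, threshold and identity initialisation are
# illustrative assumptions, not the paper's exact configuration.
import numpy as np
import open3d as o3d

def to_cloud(points_xyz):
    """points_xyz: (N, 3) array -> Open3D point cloud."""
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(points_xyz)
    return pcd

def icp_rmse(probe, gallery_cloud, max_corr_dist=5.0):
    result = o3d.pipelines.registration.registration_icp(
        probe, gallery_cloud, max_corr_dist, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.inlier_rmse

def identify(probe_xyz, gallery):
    """gallery: dict mapping cow ID -> list of (N, 3) point-cloud arrays."""
    probe = to_cloud(probe_xyz)
    best_id, best_rmse = None, np.inf
    for cow_id, clouds in gallery.items():
        for xyz in clouds:
            rmse = icp_rmse(probe, to_cloud(xyz))
            if rmse < best_rmse:
                best_id, best_rmse = cow_id, rmse
    return best_id
```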
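
Illustrative sketch for "Toward to Reduction of Bias for Gender and Ethnicity from Face Images using Automated Skin Tone Classification": the objective skin-tone categorization step, clustering RGB statistics of face crops with K-means instead of assigning Fitzpatrick labels manually. The mean-RGB feature and the number of clusters are assumptions made for illustration, not values from the paper.

```python
# K-means skin-tone categorization sketch; the mean-RGB feature and the
# cluster count are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans

def mean_rgb(face_images):
    """face_images: list of HxWx3 uint8 RGB face crops -> (n, 3) mean colours."""
    return np.array([img.reshape(-1, 3).mean(axis=0) for img in face_images])

def skin_tone_clusters(face_images, n_clusters=3, seed=0):
    features = mean_rgb(face_images)
    kmeans = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed)
    labels = kmeans.fit_predict(features)
    return labels, kmeans.cluster_centers_   # cluster index per face, RGB centroids
```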