
CS5332 Biometric Authentication

Week 13

Emerging biometrics

  • Iris and face recognition on the move
    • Prototype developed by HTX with NEC

HTX

  • Fingerprint anti-spoofing using "liveness" detection, e.g. using microscopic pores instead of minutiae
  • Iris anti-spoofing using "liveness" detection
  • Revocable biometrics:
    • Enrollment of a virtual face, that is synthesized from a real face and a token using a "Face Renderer" (paper)
    • Goal: irreversibility, revocability, unlinkability
    • It effectively functions as a hash of the facial features, though the goal is really to add a layer of obfuscation that can still interface with the face verifier.
  • Face morphing attacks: (webpage)
    • Blending images of two separate people, such that both people still pass the threshold for face authentication
      • More than 40 morphing cases detected in Slovenia [Tork2021]
    • Research study in Singapore (SMU)
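The revocable-template idea can be sketched with a simplified BioHashing-style construction (an illustration only, not the Face Renderer scheme from the paper): a user-specific token seeds a random projection of the feature vector, so the stored template can be revoked by issuing a new token. All names and sizes here are made up.

```python
import numpy as np

def cancelable_template(features, token, dim=32):
    rng = np.random.default_rng(token)        # token seeds the projection
    P = rng.normal(size=(dim, len(features)))
    return (P @ features > 0).astype(int)     # binarized random projection

face = np.random.default_rng(7).normal(size=128)  # stand-in feature vector

t1 = cancelable_template(face, token=1234)
t2 = cancelable_template(face, token=1234)  # same token: identical template
t3 = cancelable_template(face, token=9999)  # new token: unlinkable template

print((t1 == t2).all(), (t1 == t3).mean())
```

Revocability comes from re-issuing the token; unlinkability from the near-50% bit agreement between templates under different tokens.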

History

  • Continuous authentication
    • Monitoring of keystroke dynamics (typing behaviours), webcam, mouse w/ fingerprint sensor
    • Touch dynamics, e.g. touch, swipes, gestures
  • New biometrics:
    • EEG
    • Face dynamics (rather than facial features)
    • Bone vibration between phone and accelerometer
    • Eye saccades
    • Ear canal acoustics
    • Toothprint acoustics
  • Privacy preserving technology
    • Anonymization
    • K-anonymity
    • Zero-knowledge proofs

Week 12

Public holiday.


Week 11

Term paper project details

  • Deadline: 5 May 2024
  • 84 past case studies available.

Privacy

4 notions of privacy, and the first three are impacted by biometrics:

  • Territorial: Physical space, e.g. where you live, and personal space (e.g. face + surveillance)
  • Informational: Digital information (e.g. collection and use for identification)
  • Bodily: Physical safety (e.g. fingerprint, retina, DNA)
  • Communications: Eavesdropping

Information privacy can be broken down into five sub-areas:

  • Unauthorized collection, i.e. consent
  • Unnecessary collection
  • Unauthorized use
  • Unauthorized disclosure
  • Function creep
    • Using data beyond its original intention, e.g. office absence monitoring -> pay cuts, retina scans -> denial of insurance due to detected medical conditions, TraceTogether -> criminal investigations

  • Data protection laws are not enough. Should covert surveillance be banned (i.e. should disclosure that surveillance is taking place be required)? What level of privacy should one expect in a public place?
  • Government is usually the largest biometrics deployer, but is typically self-regulated. Can influence industry through regulation, transparency and accountability.
    • Problem: Threats thwart these ideals. Easy to strip away privacy for emergency laws in the name of national security. Hard to rescind laws when threats subside.
  • Industry must self-regulate, because privacy laws are usually not specific enough, and take a long time to pass.
    • e.g. Sydney-based Biometric Institute's Privacy Code for self-regulation (but this was revoked in 2012)
  • International cooperation needed to control transborder flows of data (need international standards and dialogues), due to frequent movement of travellers
    • e.g. EU GDPR, CCPA, China PIPL, APEC Privacy Framework

Different dimensions of privacy:

Identity

Identity is a social construct for according rights and privileges to members of an organization, assigning roles and responsibilities, establishing accountability, and building trust.

  • For many applications, identity is not required, only credentials/authorization (e.g. is the user authorized to enter?). Proof of authorization can be achieved using zero-knowledge proofs.
  • Sometimes identity is not sufficient, and the presence of the person must be proven (proof of presence), e.g. wedding witnesses, polling stations.
  • Biometrics may soon be able to detect mental/emotional states, i.e. proof of intention, e.g. neuro-economics.
  • Schools of thought:
    • James Rule: Privacy issues are ethical and political, not technological.
    • Youths living openly on social networking: Privacy is dead, get used to it.

Standards

For interoperability of biometrics:

  • For Singapore, IT Standards Committee (ITSC) supported by SPRING Singapore and IMDA, with 300+ members from public and private sectors, IHLs, RIs
  • Biometrics technical committee (2004) participates as a P member in ISO/IEC SC37, founding member of Asian Biometric Consortium.

  • Superseded by Identification Technology Technical Committee (ITTC), which has a larger scope, including smart cards. Lecturer has since left the group.
  • Some standards for interoperability come from ISO/IEC 19794; this example is from Part 2 on Finger Minutiae:

  • ISO/IEC 29109 provides conformance testing methodologies.
  • Recent focuses in ISO/IEC SC37 WG3 on biometric sample quality, and spoofing detection.

Week 10

Attacking biometrics

Other than the usual network-related attacks, the focus here is more on biometrics-related attacks:

  • Biometric sample (1):
    • Coercive attack (force legitimate user to authenticate) [counter: supervised authentication]
    • Presentation attack (changing appearance to match, e.g. photograph or gummy fingers, where latent prints are lifted using fingerprint kits)
  • Replay attack (1,2,4) [counter: liveness detection, e.g. motion, shape, voice recording of prompt]
    • Breathing on latent prints to trigger capacitive sensors using heat and humidity
    • Generating a synthetic "master" fingerprint from partial prints, to match a significant number of users, so that devices can be unlocked with high probability.
    • Use of fake iris, e.g. contact lenses, hole in photograph of iris for liveness detection
  • Trojan horse attack (3,5): Under specific trigger conditions, present a pre-selected feature or high score, via malware
  • Collusion (8): Colluding with user/operator to bypass authentication, e.g. override mode for exceptional situations
  • Denial attack: Prevent legitimate user from authentication, e.g. damaging collection sensor
  • Back-end attacks:
    • Attacking enrollment (6,9)
    • Attack database (10)
    • Attack communication channel (7)
    • Attack application (11)
  • Hill-climbing attack: An adversarial attack where the matching score of the injected biometric signal is retrieved and used to improve the signal iteratively
    • The signal change can be subtle, e.g. scars, makeup, expressions. Procedure: Use a base facial dataset of attacker, then synthesize a photo that would successfully authenticate. This synthesized photo is used to inform the attacker on the changes to make for a live attack.
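The score-feedback loop of a hill-climbing attack can be sketched against a toy black-box matcher. Everything below is illustrative: the matcher, its cosine-similarity score, and the perturbation schedule are all assumptions, not a real system.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical black-box matcher: the attacker only observes a similarity
# score in [0, 1] against a hidden enrolled template.
_template = rng.normal(size=16)

def match_score(sample):
    cos = sample @ _template / (np.linalg.norm(sample) * np.linalg.norm(_template))
    return (cos + 1) / 2

def hill_climb(start, steps=2000, sigma=0.05):
    best, best_score = start.copy(), match_score(start)
    for _ in range(steps):
        candidate = best + rng.normal(scale=sigma, size=best.shape)
        s = match_score(candidate)
        if s > best_score:               # keep only score-improving changes
            best, best_score = candidate, s
    return best, best_score

start = rng.normal(size=16)              # attacker's unrelated starting sample
final, score = hill_climb(start)
print(f"score improved from {match_score(start):.3f} to {score:.3f}")
```

The counter seen in practice is to quantize or rate-limit the returned score, so the attacker cannot observe small improvements.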

Didn't attend the live lecture, so couldn't see the guest lecturer ahhhh.


Week 9

Face recognition

Human face recognition ability is extremely robust (~100% accurate and fast), but only for familiar faces; machines do better for unfamiliar faces. Babies are also born hard-wired for facial recognition, and monkeys have a "grandmother neuron" that fires when seeing the face of another monkey.

  • Detection of faces is different from recognition (i.e. prosopagnosia, where patient can detect but not recognise the face)
  • Face recognition by humans is difficult for upside-down faces, photographic negatives, and other-race faces, and it is easier to recognize distinctive faces than average ones.
  • Appearance depends on age, hairstyle, cosmetics, facial expression, illumination, image quality, pose.
    • No consensus on which features are best (no features are found to be invariant against all appearance variations, including against head tilts).
      • 3D face recognition is not better than 2D recognition.
    • Best performing algorithms use appearance, not geometry (e.g. ratios of distances)
    • Best performing: ArcFace, which uses deep learning.
  • Eigenfaces are commonly used to capture dominant variations across different faces.
    • Sensitive to alignment of eyes.

Performance:

  • Error rates have dramatically reduced over the years, according to the Face Recognition Vendor Test: at FAR 0.1%, the FRR dropped from 80% in 1993 to 1% in 2006.
    • Still under controlled lighting, but improvement from 2002 due to algorithm improvements, higher resolution imagery (6 Mpixels), and greater lighting consistency
  • Today, lighting and facial expression variations are less of a problem. Bigger problems are head pose variations, elapsed time (between enrollment and identification), and low image quality
  • Largely solved for frontal view: using neural networks (Rowley, Sung), Adaboost (Viola, Jones), Wavelets + Bayes (Schneiderman). Other state of the art includes ArcFace for angular face recognition.
    • Generally good for access control (large face, controlled lighting) but not for surveillance (poor resolution, wide angle, non-cooperative subjects)
    • For latter, e.g. IJB-C dataset (~200 subjects, different settings, mounted cameras and drones)

Image types evaluated: mugshot, webcam, profile, visa/border/kiosk.

Biometrics applications

  • Biometrics suffer from failure to provide ad-hoc transfer of authority (e.g. passing a key to unlock)
  • Evaluation of whether biometrics should be used can be checked using a Spider Chart:
    • Urgency: Is the authentication mission-critical? Is the protected resource high value?
    • Scope: How many users, and how long will the system be in use?
    • Receptiveness: How do users perceive biometrics from this application?
    • Exclusivity: Are there alternative technologies available?
    • Effectiveness: Can biometrics solve the authentication problem, or is it only for deterrence?

Example: Citizen-facing applications, where individual uses biometrics as a citizen of a country for criminal identification, citizen identification, surveillance. Government is the deployer, and the system is usually large in scope, highly exclusive and mandatory.

Usecase                                       | Urgency | Scope | Receptiveness | Exclusivity | Effectiveness
Citizen identification                        | 6 (alternative tech like passports available) | 9 | 7 | 9 | 8
Criminal identification                       | 8 | 5 (only for criminals) | 9 | 10 | 9
Surveillance (e.g. passive biometrics, voice) | 8 | 8 | 7 | 10 | 4

Example: Employee-facing applications, where individual is an employee for PC/network access, physical access, attendance. Employer is the deployer, usually mandatory, smaller in scope, closed-world identification. Identity is already established via other processes.

Usecase                                     | Urgency | Scope | Receptiveness | Exclusivity | Effectiveness
PC/Network access                           | 7 | 8 | 8 | 6 | 8
Physical access (replacing passwords, keys) | 6 | 8 | 7 | 5 | 6

Example: Customer-facing for retail/ATM/POS, e-commerce/telephony. Organization providing service/product is deployer. Usually not mandatory, limited in scope to company.

Usecase              | Urgency | Scope | Receptiveness | Exclusivity | Effectiveness
Retail/ATM/POS       | 5 | 9 | 8 | 5 | 7
E-commerce/Telephony | 8 | 9 | 9 | 6 | 7

Example: Non-security applications, e.g. video/photo indexing, customization of product/service, targeted advertising, entertainment. In the context of faster searching of large archives.


Week 8

Midterm exam.


Week 7

Fingerprints

Three different types of fingerprint impressions (fingerprints being friction ridges): rolled, plain, and latent. There is also an additional category for miniature fingerprint sensors (e.g. on phones/laptops), which capture a partial print instead.

Physiology

Some techniques to improve fingerprint images:

  • Increasing contrast using local contrast enhancement (i.e. histogram equalization)
  • Compute the orientation field, or Gabor filters-based contextual filtering
  • Binarization from greyscale images, and subsequently thinning into a single pixel width (buzzwords: dilation, erosion)

Processing

State-of-the-art template matchers as of 2018, using the NIST Fingerprint Vendor Technology Evaluation (FpVTE) dataset. Note that this is under ideal fingerprint-capture conditions, i.e. testing the algorithm alone; real-life deployment will increase the errors.

  • A bit more modern is MINEX (Minutiae Interoperability Exchange), with associated reports for each algorithmic submission.
  • Current state of the art is around 0.03% FNMR @ 1% FMR with 2 pooled fingers, and 0.5% FNMR @ FMR 0.01% with 1 native finger.
  • FNMR: False Non-Matching Rate

  • FIDO has an accredited biometric laboratory list, that performs certification of biometric devices, not just the algorithms themselves.

Iris

Uses:

  • Immigration & Checkpoints Authority (ICA) previously released iris authentication for motorcyclists in 2007. Phased out due to slow acquisition (e.g. outdoor lighting, removal of helmet visors and glasses), and replaced with fingerprints instead.
  • Automated Passenger In-car Clearance System (APICS), 1st iteration uses fingerprints, 2nd iteration uses iris as well. Around 2022.

Physiology

The standard strategy is to illuminate the iris with near-IR light and capture the image with an IR camera. Near-IR is preferred because its wavelength is longer than that of any iris pigment, so even dark brown irises appear brightly illuminated.

Acquisition and challenges

Processing by first generation iris systems

Performance, evaluation and certification:

  • NIST performs evaluation and certification of iris algorithms under the IREX III (2012) specification.
    • Error rates roughly 1.5% for a single iris and 0.7% for double irides.
    • "Mate" and "nonmate" denote whether the compared iris belongs to the same person or to an outsider.
    • Matching on the order of 20ms - 1000ms.
  • Typically better than face recognition: at FPR of 1/16e9, iris has lower misses than face (2% vs 20%).
    • About factor of four on average.
  • Iris algorithms tend to be faster on average, but there are large speed variations. Face algorithms also show better-than-linear dependence of speed on enrolled population size.
  • Iris tends to change over time:


Week 6

Multilayer perceptron

Support Vector Machines

The sklearn page seems to be a pretty good resource. Because it is inherently a 2-class classifier, to extend to C classes:

  • One-vs-one strategy: Build individual SVMs to compare between all possible pairs, e.g. for A,B,C classes, build A/B, B/C, A/C. Note that this roughly scales quadratically.
    • Since voting may not resolve by majority rule, the distance metric is important to discriminate between tied classes.
  • One-vs-rest strategy: Build SVMs to check each classes against everything else, e.g. build A/non-A, B/non-B, C/non-C
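Both strategies are available as wrappers in scikit-learn; a minimal sketch on a toy 3-class problem (the data here is synthetic and illustrative):

```python
from sklearn.datasets import make_blobs
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.svm import SVC

X, y = make_blobs(n_samples=150, centers=3, random_state=0)

# One-vs-one: C(C-1)/2 binary SVMs, here A/B, B/C, A/C.
ovo = OneVsOneClassifier(SVC(kernel="linear")).fit(X, y)

# One-vs-rest: C binary SVMs, here A/non-A, B/non-B, C/non-C.
ovr = OneVsRestClassifier(SVC(kernel="linear")).fit(X, y)

print(len(ovo.estimators_), len(ovr.estimators_))  # both 3 when C = 3
print(ovo.score(X, y), ovr.score(X, y))
```

Note that `SVC` already applies one-vs-one internally for multiclass input; the explicit wrappers just make the number of underlying binary SVMs visible (for C = 3 the two strategies coincidentally build the same count).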

Principal Component Analysis

Has alternative names, e.g. Discrete Karhunen-Loève Transform, Hotelling Transform.

  • PCA not great at handling shadows, because shadows are also interpreted as features.
  • Alternatives: Fisher, Laplacian, Independent Component Analysis, DFT
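The eigenfaces use of PCA (capturing dominant variations) can be sketched with plain NumPy. The "faces" here are synthetic vectors with 5 dominant directions of variation, so all sizes are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, k = 100, 64, 5               # 100 samples, 64 "pixels", keep 5 components

# Synthetic data with 5 dominant directions of variation plus small noise.
basis = rng.normal(size=(5, d))
X = rng.normal(size=(n, 5)) @ basis + 0.1 * rng.normal(size=(n, d))

mean = X.mean(axis=0)
Xc = X - mean                      # centre the data first
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
components = Vt[:k]                # the "eigenfaces" for face data

codes = Xc @ components.T          # project onto the top-k subspace
Xhat = codes @ components + mean   # reconstruct from k coefficients
err = np.linalg.norm(X - Xhat) / np.linalg.norm(X)
print(f"relative reconstruction error with {k} components: {err:.3f}")
```

The shadow problem mentioned above shows up in exactly this pipeline: illumination variation contributes large variance, so it gets captured by the top components just like identity does.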

Week 5

Naive Bayes classifier

Bayesian statistics and using maximum a posteriori. Encapsulated in the equation below. Note that here this is for a single datapoint with D features (with the assumption that the features are statistically independent, to allow the reduction of the full likelihood into the product of likelihoods):

\begin{align*} w^* &= \text{argmax}_{w_j} P(w_j|x) \\ &\propto{} \text{argmax}_{w_j} P(x|w_j)P(w_j) \\ &= \text{argmax}_{w_j} P(w_j) \prod_{i=1}^D P(x_i |w_j) \end{align*}

Some key points:

  • Bayes' classification is essentially optimal, but practically has errors since we do not know the true conditional probabilities.
  • Log of MAP used to avoid underflow problem.
  • Naive Bayes classifier by sklearn.
  • A benefit: the original training data does not need to be stored, only the computed PDFs (to evaluate for a new observation) and lookup tables for the priors.
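The MAP rule above, with logs to avoid underflow, fits in a few lines of NumPy. This is a Gaussian naive Bayes sketch on synthetic two-class data (the class means and sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
# Two classes with different feature means; features treated as independent.
X = np.vstack([rng.normal(0.0, 1.0, size=(200, 3)),
               rng.normal(2.0, 1.0, size=(200, 3))])
y = np.array([0] * 200 + [1] * 200)

# "Training" stores only priors, means, and variances, not the raw data.
classes = [0, 1]
priors = [np.mean(y == c) for c in classes]
means = [X[y == c].mean(axis=0) for c in classes]
vars_ = [X[y == c].var(axis=0) for c in classes]

def log_gauss(x, mu, var):
    return -0.5 * (np.log(2 * np.pi * var) + (x - mu) ** 2 / var)

def predict(x):
    # argmax_j of log P(w_j) + sum_i log P(x_i | w_j)
    scores = [np.log(p) + log_gauss(x, m, v).sum()
              for p, m, v in zip(priors, means, vars_)]
    return int(np.argmax(scores))

print(predict(np.array([0.1, -0.2, 0.3])))  # near class-0 mean -> 0
print(predict(np.array([1.9, 2.1, 2.2])))   # near class-1 mean -> 1
```

sklearn's `GaussianNB` implements the same idea with a fitted-model interface.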

KNN classifier

Here, all the training data needs to be stored for calculation of some distance metric with the new observation. Not an optimal classifier, but with enough training data:

$$ P(\text{error}_{KNN}) \le{} 2P(\text{error}_{Bayes}) $$

The accuracy of this classifier is sensitive to the distance metric used, and this classifier is easy to implement but cannot scale.

  • KNN classifier implicitly estimates the decision boundary.
  • Some examples of distance metrics, including Manhattan and Euclidean geometry (both are special cases of the p-norm). Note that all the dimensions must be of similar scale, otherwise the dimension with the largest scale will dominate the distance.
    • Normalization is important!

p-norm here:

$$ D_p (\mathbf{x}, \mathbf{y}) = \left(\sum_{i=1}^n w_i|x_i - y_i|^p\right)^{1/p} $$
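The weighted p-norm above, plus a quick numeric illustration of why features must be on similar scales before computing distances (the height/weight numbers are made up):

```python
import numpy as np

def p_norm(x, y, p=2, w=None):
    x, y = np.asarray(x, float), np.asarray(y, float)
    w = np.ones_like(x) if w is None else np.asarray(w, float)
    return float((w * np.abs(x - y) ** p).sum() ** (1 / p))

a, b = [1.0, 2.0], [4.0, 6.0]
print(p_norm(a, b, p=1))  # Manhattan: |3| + |4| = 7
print(p_norm(a, b, p=2))  # Euclidean: sqrt(9 + 16) = 5

# Without normalization, the large-scale feature dominates the distance:
u, v = [1700.0, 70.0], [1800.0, 71.0]   # (height in mm, weight in kg)
print(p_norm(u, v))                      # ~100.005, almost all from height
```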


Week 4

Basically motivation for classification, and the use of linear/nonlinear classifiers. A small note on decision boundaries, and the use of less correlated metrics for adding dimensionality for pattern recognition.

  • Using Bayes' is an example of statistical pattern recognition.
  • g(x) = P(w1|x) - P(w2|x) is called the discriminant function, e.g. g(x)>0 ? w1 : w2

The concept of risk can be extended from the classification task, i.e. actions to be taken such as rejection (rather than performing classification). Then the overall risk is defined as:

\begin{align*} R &= \sum_{i=1}^a R(\alpha_i|x) \\ &= \sum_{i=1}^a \sum_{j=1}^C \lambda(\alpha_i|w_j)P(w_j|x) \end{align*}

where the latter step is applicable because there are C discrete classes. $$\lambda$$ is defined as the loss incurred for taking a particular action given the class. In an example with two actions:

Conditional risk:

\begin{align*} R(\alpha_1|x) &= \lambda_{11} P(w_1|x) + \lambda_{12}P(w_2|x) \\ R(\alpha_2|x) &= \lambda_{21} P(w_1|x) + \lambda_{22}P(w_2|x) \end{align*}

Decision rule:

\begin{align*} R(\alpha_1|x) &< R(\alpha_2|x) \\ (\lambda_{21}-\lambda_{11})P(x|w_1)P(w_1) &> (\lambda_{12}-\lambda_{22})P(x|w_2)P(w_2) \\ \frac{P(x|w_1)}{P(x|w_2)} &> \frac{\lambda_{12}-\lambda_{22}}{\lambda_{21}-\lambda_{11}} \cdot{} \frac{P(w_2)}{P(w_1)} \end{align*}

This derivation of the likelihood ratio makes the RHS independent of the input $$x$$.
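A quick numeric check of the likelihood-ratio rule, with made-up losses, priors, and likelihoods (all values illustrative):

```python
# Losses lambda_ij: loss for taking action alpha_i when the true class is w_j.
lam11, lam12 = 0.0, 10.0   # action alpha_1 (decide w1): costly if truth is w2
lam21, lam22 = 1.0, 0.0    # action alpha_2 (decide w2): mild cost if truth is w1
p_w1, p_w2 = 0.3, 0.7      # priors

# RHS of the decision rule, independent of the input x:
threshold = (lam12 - lam22) / (lam21 - lam11) * (p_w2 / p_w1)
print(threshold)           # 10 * (0.7 / 0.3) ~= 23.33

# For an observed x with these likelihoods, compare the likelihood ratio:
p_x_w1, p_x_w2 = 0.9, 0.03
ratio = p_x_w1 / p_x_w2    # 30
print("decide w1" if ratio > threshold else "decide w2")
```

The asymmetric losses push the threshold far above 1: deciding w1 requires strong evidence because a wrong w1 decision is ten times costlier.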

To evaluate the accuracy of the classifier, we can perform N-fold cross validation. Procedure:

  1. Divide the dataset into N equal folds.
  2. Train on the N-1 remaining folds, and test on the held-out fold.
  3. Rotate so that each fold serves as the test set exactly once.
  4. Average the N error estimates for the overall figure.
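N-fold cross validation is available off the shelf; a sketch with scikit-learn (the dataset and classifier are arbitrary placeholders, not from the lecture):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
# 5-fold CV: each sample is held out exactly once across the 5 splits.
scores = cross_val_score(KNeighborsClassifier(n_neighbors=5), X, y, cv=5)
print(scores.mean())  # average accuracy over the 5 held-out folds
```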

Week 3

Above is a diagram representing a biometric system. Relatively straightforward: Biometric samples are obtained and preprocessed (for noise, distortions, etc.), then feature extraction is done and stored as a template.

Important errors:

  • Verification
    • False Reject Rate (FRR), false negative rate for genuine users. Quantifies user inconvenience.
    • False Accept Rate (FAR), false positive rate for non-users. Quantifies security risk.
    • ROC curve given by (FAR,1-FRR), i.e. (FPR,TPR)
      • Alternative is Equal Error Rate (EER) where FAR = FRR
  • Enrollment
    • Failure to Enroll Rate (FTE), etc. for fingerprints
  • Identification
    • Misclassification Rate (vis a vis Accuracy Rate)
    • Open-set identification vs closed-set identification (whether the user to be identified needs to have been enrolled prior)
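The FRR/FAR tradeoff and the EER above can be computed directly from score distributions. A sketch with synthetic genuine/impostor scores (the Gaussian score model is an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
genuine = rng.normal(0.7, 0.1, size=1000)   # match scores for genuine users
impostor = rng.normal(0.3, 0.1, size=1000)  # match scores for impostors

thresholds = np.linspace(0, 1, 501)
frr = np.array([(genuine <= t).mean() for t in thresholds])  # false rejects
far = np.array([(impostor > t).mean() for t in thresholds])  # false accepts

i = int(np.argmin(np.abs(far - frr)))       # threshold where FAR ~= FRR
eer = (far[i] + frr[i]) / 2
print(f"EER ~= {eer:.3f} at threshold {thresholds[i]:.2f}")
```

Sweeping the threshold and plotting (FAR, 1-FRR) gives exactly the ROC curve described above.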

Note the following syntactic sugar to represent "If score S > threshold, then accept, else reject":

A figure of merit is the expected cost of the system by assigning a cost to each FRR and FAR.

Errors for identification come in three types (again, adjustable based on threshold set):

  • False Positive Identification Rate (FPIR), similar to FPR
  • False Negative Identification Rate (FNIR), similar to FNR
  • Mixed-up Rate (MXR), rate of misidentifying one enrolled user as another

Detection Error Tradeoff (DET) curves demonstrate the tradeoff between FNIR and FPIR. Interpretations of DET and regimes for different use cases of algorithms operating at different parts of DET:

State-of-the-art systems. Takeaways: (1) face recognition is much better than fingerprint,

Different technologies comparison from 1). No such thing as the best biometric:

  • Intrinsic properties:
    • Universality: How common across human population?
    • Uniqueness: How unique to one person?
    • Permanence: Does it change over time?
  • Technology-dependent:
    • Collectability: How easy to acquire?
    • Performance: Accuracy, speed, robustness?
  • User perception:
    • Acceptability: How well do users accept the system?
    • Circumvention: How easy to fool system?

Week 2

Basic linear algebra, plus fundamental subspaces (basis?) and null spaces.

  1. Col space: $$Col(A) = \{x : x = \sum_i \lambda_i a_i \}$$, i.e. all linear combinations of the columns $$a_i$$
  2. Null space: $$Null(A) = \{x:Ax=0\}$$, i.e. the set of all vectors that represent the linear combination of column vectors that yield the zero vector.
    1. Dimensionality of null space $$Null(A)$$ = Number of columns in matrix $$A$$ - Number of independent columns in matrix $$A$$
  3. Row space: $$Row(A) = Col(A^T)$$
  4. Left nullspace: $$Null(A^T)$$
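The rank-nullity relation in 2.1 can be checked numerically with NumPy (the matrix below is arbitrary, built so that one column is dependent):

```python
import numpy as np

# Arbitrary matrix where the third column is the sum of the first two.
A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [1., 0., 1.]])

rank = np.linalg.matrix_rank(A)
nullity = A.shape[1] - rank        # columns minus independent columns
print(rank, nullity)               # 2, 1

# A null-space vector: coefficients combining the columns to zero.
x = np.array([1., 1., -1.])        # col1 + col2 - col3 = 0
print(np.allclose(A @ x, 0))       # True
```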

Eigenvalues and eigenvectors again:

  1. Eigenvalue equation, given the eigenvector matrix $$E$$: $$Ax_i = x_i\lambda_i \Rightarrow AE = E\Lambda$$
  2. Spectral theorem: If A is real and symmetric ($$A=A^T$$), then $$E$$ is orthogonal ($$E^TE = I$$) and $$\Lambda$$ is real.
    1. The main result is $$E^{-1} = E^T$$: since $$E$$ is an n-by-n matrix, explicit inversion would be an O(n^3) operation, while transposition is essentially free.
    2. $$A = E\Lambda{}E^T$$
  3. Determinant $$det(A) \equiv |A|$$ is the product of eigenvalues.
  4. Trace $$Tr(A)$$ is the sum of diagonals (eigenvalues for eigenvalue matrix)
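The spectral theorem and the determinant/trace facts above can be verified numerically for a random symmetric matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.normal(size=(4, 4))
A = (M + M.T) / 2                  # real symmetric matrix

lam, E = np.linalg.eigh(A)         # eigh returns real lam, orthogonal E

print(np.allclose(E.T @ E, np.eye(4)))              # E^T E = I
print(np.allclose(E @ np.diag(lam) @ E.T, A))       # A = E Lambda E^T
print(np.allclose(np.prod(lam), np.linalg.det(A)))  # det = product of eigenvalues
print(np.allclose(lam.sum(), np.trace(A)))          # trace = sum of eigenvalues
```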

Bayes' Theorem: $$P(B|A) = \frac{P(A|B)P(B)}{P(A)}$$


Week 1

Module information

  • Lecturer: Terence Sim
  • Lectures are recorded, released one day later
  • No official office hours
  • Coursework: 3 assignments, 1 midterm, 1 term paper
  • Texts: Jain (2011) "Introduction to Biometrics", Li and Jain (2019), "Encyclopedia of Biometrics"
  • TODO: Populate biography on Canvas. Guy on the left: Ritu.

Expectations

Introduction

Different factors exist for authentication, in order to perform identification: something you have, something you know, or something you are.

Two different roles for biometric authentication. The task of a biometric system is easily identified by the desired output, e.g. iPhone FaceID is a verification task (the claimed ID "owner of phone" is implicit).

  1. Identification: Who am I?
    • Biometrics -> Identity
  2. Verification: Am I who I claim to be?
    • Biometrics + claimed identity -> Accept/Reject

Neither is a trivial problem, and which is tougher to implement correctly is an open research question. Note that each can be built based off the other, see below:

Building one on top of the other

Modalities of biometrics

Mainly categorized into two large groups:

  1. Physiological: Static properties, e.g. fingerprint, face, iris/retina, hand geometry, DNA
  2. Behavioral: Includes some temporal aspect, e.g. voice, gait, keystroke dynamics, written signatures

Other exotic physiological techniques: thermograms, earlobe shape, palm/finger vein, body odor, ear canal.

Fingerprints
Based on minutiae (ridge discontinuities). Many different sensors available, including optical (bright/dark fringes), capacitance (treating air gap between skin and surface as a capacitor), pressure and ultrasound.
High accuracy (0.1% false negative, 1% false positive, FpVTE 2003), with large databases available and cheap sensors. Wear-and-tear in fingerprint + sensor hygiene concerns.
1)
Jain, A. K. (2004), "Biometric recognition: how do I know who you are?", Proceedings of the 12th IEEE Signal Processing and Communications Applications Conference, 28-30 April 2004, pp. 3-5
projects/cybersecurity/cs5332/start.txt · Last modified: 6 months ago (17 April 2024) by justin