Biometrics: How to plan your very own project

Good planning before you implement biometrics is key if you want to reap the benefits of this technology. Martin Jacobs explains why

With the need for better security and risk-management procedures both at a corporate and governmental level, biometric uptake is finally starting to gather pace.

Government projects have provided proof-of-concept of large-scale identification, and as wider implementation drives down costs, demand for the technology will increase.

It is important that any organization implementing a biometric project does so in a joined-up fashion. From the start, it should consider the underlying requirements, including implementation plans and the potential impact of growth, mergers and de-mergers, joint ventures and technology refresh. Limited-vision pilots can easily turn into compatibility millstones unless they are managed as part of a properly considered strategic program.

The first thing to bear in mind is that biometrics is a constituent technology. The implementation of a biometric project must be part of a well-planned and well-coordinated security program. It would simply be foolhardy to have iris recognition systems on the front door if the back door is left unlocked.

Once companies are certain that the implementation of a biometric function will complement and improve existing security, however, there are a number of key points that must be addressed. From the implications of geographical range to the management of inevitable exceptions, some of these considerations are common to most security technologies, while others are peculiar to biometrics.

Standards for biometric data are crucial in all projects, but they become even more important when projects span different geographies. Characteristics vary between different pieces of equipment, and there is no universal plug-and-play that will achieve common match accuracy, suit all usage levels and be universally convenient. The full range of application conditions and cultural acceptability needs to be considered before selecting a particular performance level.

Also, be prepared for the logistical exercise around data capture, which is likely to be one of the largest costs. Arranging to have thousands of people scanned for one or multiple biometrics might not sound like a difficult task, but costs can grow exponentially with the number of people involved.

Don't ignore the alternatives

Remember too that there will always be some people for whom a particular biometric will never work satisfactorily, if at all. So consider alternative technologies to give flexibility and the ability to cope with temporary issues, such as bandages or equipment malfunction. There are advantages in allowing a choice of biometric. Enrolling with two or more biometrics in one session is not twice the net real cost, and the flexibility obtained could be valuable where there are substantially different environments within one organization.

Make sure the system is easy to use. Most people are tolerant of the enrolment process if it is handled efficiently, but will not tolerate subsequent match failures or slow capture and matching techniques.

As with any security system, the reference material for determining permissions needs to be collated and replicated to all reference servers. This is a more complex issue than centralised password management, particularly as refresh systems are needed (all biometrics will change during a lifetime).

A degree of redundancy is needed to cope with network or server failures, as well as failures of reading and enrolment equipment. How far you need to go depends on your risk assessments for the various locations, but you need to be wary of lowering standards as a result of system failures. The best way to maintain standards of security is to ensure that the mechanisms run reliably and smoothly.

One-to-one or one-to-many?

It is also important to be clear about the purpose of an implementation. Will the biometric data be used just to confirm that the user is who they say they are – a relatively low-stress database search – or will it require the user to be matched against a data set in order to provide independent identification?

Issues to consider include whether a declared identity (such as a name or logon ID) or the biometric itself is the primary identifier, whether the declared identity is an adequate reference for establishing a biometric cross-check, and whether establishing the uniqueness of a person is important (as in policing and fraud detection).
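The distinction between confirming a declared identity and independently identifying a person can be sketched in a few lines of Python. This is a toy illustration only – the templates, distance function and threshold are invented for the example, and real systems use vendor-specific template formats and far more sophisticated matching:

```python
# Toy sketch of 1:1 verification vs 1:N identification.
# Templates here are simple feature vectors; the threshold value is illustrative.

def distance(a, b):
    """Euclidean distance between two toy templates."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

THRESHOLD = 0.5  # matching latitude: tuning it trades false accepts for false rejects

def verify(claimed_id, probe, enrolled):
    """One-to-one: compare the probe against the claimed identity's template only."""
    reference = enrolled.get(claimed_id)
    return reference is not None and distance(probe, reference) <= THRESHOLD

def identify(probe, enrolled):
    """One-to-many: search the whole enrolled data set for the best match."""
    best = min(enrolled.items(), key=lambda kv: distance(probe, kv[1]), default=None)
    if best and distance(probe, best[1]) <= THRESHOLD:
        return best[0]
    return None

enrolled = {"alice": [0.1, 0.9], "bob": [0.8, 0.2]}
probe = [0.12, 0.88]  # a fresh capture, close to alice's stored template
print(verify("alice", probe, enrolled))  # True
print(identify(probe, enrolled))         # alice
```

Note that `identify` must compare the probe against every stored template, which is why one-to-many matching carries the heavier processing requirement discussed below.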

One-to-many or one-to-one?

If you look at the wider context of matching, then all permission control uses a one-to-many approach for the primary identifier. The issue is whether fuzzy matching is needed or acceptable – for declared identity, an exact match is normal, but biometrics will always require some latitude in matching with a consequent processing requirement.

The "intelligent camera" approach featured in so many spy-thrillers looks attractive from many points of view, but requires significant processing power and quality enrolment to achieve a high enough accuracy.

The technological implications of one-to-many against one-to-one are significant, and impact on qualitative aspects of the capture and matching processes, the equipment and software used and, of course, on the matching algorithms used.

While immigration and policing might need provable uniqueness of identity worldwide, most other applications can manage with much simpler matching criteria. This is very much related to scale – allowing three people entry to a secured area is very different from checking 30,000 staff or three million customers.

For any biometric matching, there is a relationship between the quality of the initial data capture and the reliability of subsequent matches. Low-cost readers might provide an adequate match, provided the initial capture was of sufficient quality, while higher-quality biometric equipment can cost less to support, because it generates fewer match failures caused by limited definition and discrimination. This is also affected by the multi-factor argument. Capturing two different biometrics can be treated as providing two separate factors, because where both can be used, the discrimination threshold on each can be lowered.
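The multi-factor argument can be made concrete with a small sketch. The scores and thresholds below are invented for illustration; the point is only that two independent relaxed checks can be combined into a decision at least as strict as one tight check:

```python
# Illustrative sketch of the multi-factor argument: with two independent
# biometrics available, each per-factor threshold can be relaxed while the
# combined decision remains strict. All numbers here are invented.

SINGLE_THRESHOLD = 0.90   # strict threshold when only one biometric is used
RELAXED_THRESHOLD = 0.75  # looser per-factor threshold when two are combined

def accept_single(score):
    """Accept on one biometric only if it clears the strict threshold."""
    return score >= SINGLE_THRESHOLD

def accept_combined(score_a, score_b):
    """Require both relaxed checks to pass; for independent factors the joint
    false-accept rate can still be lower than one strict check alone."""
    return score_a >= RELAXED_THRESHOLD and score_b >= RELAXED_THRESHOLD

# A fingerprint scoring 0.82 fails the single strict check...
print(accept_single(0.82))          # False
# ...but passes when a face score of 0.80 is also available.
print(accept_combined(0.82, 0.80))  # True
```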

It is also important to consider the nature of what you are trying to protect and how it is accessed. Protecting a physical item is different to protecting data. Access to physical property might require access at one point only, while access to data systems could need access points on every PC in the organization. Potentially, biometric verification could be used universally to verify internet transactions, in which case a given accuracy factor has to be allowed for.

One issue to consider is that, unlike passwords, biometrics do alter over time. There is evidence to suggest, for example, that facial and fingerprint characteristics are subject to gradual change. This would mean that data would have to be recaptured at regular intervals – although realistically this would stand at ten years or more – which would have a significant cost implication on any project. On current evidence, it seems that iris patterns might be more stable through a person's lifetime, although the technologies are evolving so fast that it will probably be advantageous to refresh any biometric reference for one reason or another during a person's life.

When planning a scheme, it is important to recognise that the longer the operational timescale is expected to be, the more likely it is that the technology will require upgrading or that superior standards might be introduced that conflict with those originally selected for the project. This is an important consideration and one that should not be overlooked, despite the complexity that it will add to any project going forwards. How many IT solutions are stable for more than five years, let alone a lifetime?

Exception management

As with many "real-life" problems, exception management is perhaps one of the key factors to be assessed. Any exception requires a backup system to be in place to deal with the failure, which has vast implications for both the cost and security of the technology.

For example, if a single biometric is required to gain access to a building and fails, will this result in human intervention? Is a guard needed to open the door, or can you realistically refuse entry to the essential maintenance engineer or the CEO? The way exceptions are handled in practice can make or break the scheme by reducing efficiency or effectiveness, or both, which would call into question the justification for having the biometric security layer.

Exceptions can be minimized through using two or three factors of identification, or through better quality data capture, but the larger the scale of the project, the higher the likelihood that this will become an issue. What happens when the biometric fails is critical to consider at the outset of planning a project, especially if the option to refuse access simply does not exist – for legal and discriminatory reasons, for example. Exception management must be built into the project, or the scheme risks failing from the off.
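An exception-management policy amounts to defining, in advance, what happens at each failure point rather than leaving an undefined state. A minimal sketch, with a hypothetical escalation path (secondary token, then human intervention) chosen purely for illustration:

```python
# Sketch of an exception-management policy: a primary biometric check with
# explicitly defined fallbacks. The escalation path here is an assumption
# for illustration, not a recommended configuration.

def request_entry(biometric_ok, has_valid_token):
    """Decide access when the biometric check may fail."""
    if biometric_ok:
        return "granted"
    if has_valid_token:
        # A secondary factor covers temporary issues such as bandages
        # or a reader fault, without lowering the overall standard.
        return "granted-with-token"
    # Last resort: escalate to a human decision rather than
    # silently weakening security or flatly refusing entry.
    return "refer-to-guard"

# The maintenance engineer with a bandaged finger still gets in:
print(request_entry(biometric_ok=False, has_valid_token=True))  # granted-with-token
```

The point of the sketch is that every branch returns a defined outcome; the cost of the scheme includes staffing and securing each fallback path.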

The business case

The business case for a biometric implementation is quite limited at present, and primarily applies to situations where tokens and passwords are impractical or are insufficient evidence of identity. But as costs come down and the use of biometrics becomes more visible, the tables are turned – why issue tokens and passwords when your biometrics are always with you?

The convenience and overhead factors will probably be the strongest drivers in the market. Other aspects of the business case include the fall in supporting costs around security – no more password resets, for example, and a lower level of operational failures.

In the short term, it is the potential for the highest levels of personal identification that is driving the uptake of biometrics, but the familiarity that brings will power the next phase. How soon will it be before consumers prefer to put a finger on a pad or glance at a camera rather than remember a PIN while watching who's watching their fingers?

A combination of lower costs, greater experience by integrators and familiarity for users will lead to more widespread use in the public as well as the corporate marketplaces. Chip-and-finger as an alternative to chip-and-pin could be a godsend to those with a bad memory, and removes the risk of being overlooked. Meanwhile, when you look at the dashboard ready to start the car, will it look back at you to make sure it recognises your eyes, face or hands?

And as the technology gets even cheaper, then it's time to think small and numerous – the domestic market. If the next generation of home computer has a finger-reading mouse and a camera built in, then the PC might be able to switch profile to whichever member of the family sits in front of it, in the blink of an eye (or a nod of the head or touch on the mouse).

Now that's security.

Martin Jacobs is principal consultant at Atos Origin

Copyright © SC Magazine, US edition
