
Objective analysis of available biometric technologies


Date added: 17-06-26






This project has two aims.

The first is to provide an objective analysis of available biometric technologies, to identify their strengths and weaknesses, and to investigate a broad range of application scenarios in which biometric techniques outperform traditional recognition and verification methods.

The second aim is to develop a product. Nowadays most online banking and financial organizations are trying to migrate their existing online banking to open-source Java or some other open-source platform, so that it becomes more reliable, more secure and harder for an attacker to compromise. Most systems still rely on typing a login ID and password, which is not secure at all: anybody can steal a password with a hidden keystroke logger or similar software, and users must also remember many passwords and user IDs for different web services. Statistical observation suggests that more than 70% of people write down their username and password, which can then be stolen, lost or misused by others. If organizations could integrate secure fingerprint (or other biometric) functionality, the result would be more secure, more reliable and more convenient for the user.

To address this problem I have tried to develop a model of a secure web service integrated with fingerprint recognition, in which users no longer need to remember or enter a username or password. Although many password-replacement fingerprint products are available on the market, to my knowledge no such software works with a completely platform-independent (Java-based) secure web service. I have used the platform-independent Java 2 Platform, Enterprise Edition (J2EE), NetBeans, the JBoss server, a SQL database and an open-source bio-SDK to develop this model.


Although this web service integrates only fingerprint functionality, owing to limitations of hardware and other resources, the report also critically investigates the strengths and security holes of other biometric modalities, so that such functionality can be added in future.

Another constraint on this report is time. Many features could be added to strengthen and secure the system, such as better algorithms to close the security holes of the fingerprint software. Change is an inevitable part of software and web-service development, but many changes have been deliberately avoided here because they would not have added value to the principal purpose of this project.

Problem Areas for the Project

Biometrics is a young technology, so the relevant hardware is not readily available in the local market and is too expensive to buy personally.

Unfortunately there is no biometric hardware in the CMS hardware lab, nor any biometric software or equipment. Hardware was requested for this thesis, but the university did not agree to buy or arrange anything related to biometrics.

Several companies in the biometrics field were approached personally for help or for information about their products, but they declined for marketing reasons.

There were no biometrics-related books in the university library, and the library was unable to obtain any.

Without such technical and theoretical support it was very hard to develop new ideas and build a new biometrics-related product.

Some biometric hardware was therefore bought personally for this thesis, and the work was completed with the extraordinary help, advice and encouragement of the supervisor.

Section One: Background Literature of Biometrics

Chapter 2:

Background Literature of Biometrics

Nowadays biometrics is a well-known term in information technology. The word comes from Greek: bio means life and metrics means measurement, so biometrics relates to the measurement of a living thing. In information technology, however, it denotes an automated process in which a human is recognised or identified by his or her physiological or behavioural characteristics. A specific physiological characteristic is collected, quantified, measured, compared with previously stored characteristics, and a decision is made. It is thus a process of identification, not an innovation.

2.1 A short history of biometrics:

In everyday life a person is recognised or identified by face, body structure, height, colour, hair and so on, so in that sense the history of biometric identifiers is as old as mankind. In ancient East Asia, potters pressed their fingerprints into their products as a mark of individual identity. In ancient Egypt, people used characteristics such as complexion, eye colour, hair and height to identify trusted traders. For a long time, however, biometrics was not considered a field of study.

In the late 1880s biometrics gained interest as a field of study, largely thanks to Alphonse Bertillon, an anthropologist and police clerk who tried to distinguish convicted criminals from others. He first observed that certain physical measurements of an adult human are invariant over time, and that the combination of these measurements differs from person to person, so it can be used to recognise one individual from another (Scottish Criminal Record Office, 2002a). His theory became known as Bertillonage or anthropometry, and for a time it was well regarded and thought to be well established. The main measurements he suggested are shown in picture 2.1. In 1903, however, the theory was shown to fail for identical twins: a pair of identical twins was found who, according to the theory, were a single person. New characteristics were therefore sought for identification.

Sir Edward Henry is said to have been the first to take an interest in fingerprints for the purpose of identification. As Inspector General of the Bengal police, he ordered in 1896 that prisoners' fingerprints be recorded as an identification measure, and he introduced a classification system for fingerprints. In 1901 Sir Henry joined Scotland Yard as Assistant Commissioner, and a fingerprint bureau was established soon afterwards. The failure of the anthropometry system then made the fingerprint system well known, and fingerprints began to be used for personal identification. The system is still used in much the same way today.

Automated fingerprint reading was first introduced in the early 1970s. The first fingerprint-measurement device, known as the Identimeter, was used in 1972 at Shearson Hamill, a Wall Street company, for time-keeping and monitoring.

Interest in biometric systems grew steadily. Falling computer-hardware costs and improved algorithms increased research into biometrics.

2.2 Biometric characteristics:

2.2.1 General requirements for a characteristic using as a biometric identifier:

As discussed in the history section, several characteristics have been considered as identifiers of humans, but many were rejected. According to Amberg (2003), a characteristic can be considered a biometric identifier if it meets certain requirements: universality (every human should have the characteristic), uniqueness (it should differ from person to person), permanence (it should not change over time) and collectability (it should be obtainable and measurable). Some additional requirements can also be applied: performance (accuracy should be high and resource needs minimal), acceptability (it should be acceptable everywhere, including to future users), fraud resistance (it should offer a high security level and resist fraudulent use) and cost-effectiveness (the benefit to users should be many times greater than the cost of use).

2.2.2 Classification of the characteristics which can be used as biometric identifiers:

Biometric characteristics or identifiers can be categorized into two groups: physiological and behavioural.

Physiological type: These characteristics relate to the human body or anatomy. Fingerprints, DNA analysis and the face are frequently used identifiers of this type, and the retina and iris are promising for the future. Physiological characteristics can be divided into genotypic and phenotypic. A group of people can share the same genotypic characteristic; blood group and DNA analysis are the two most commonly used examples. By contrast, a phenotypic characteristic belongs to a single individual, so it differs from person to person; fingerprints, the retina and the iris are of this type.

Behavioural characteristics: These relate to human behaviour. The signature is the most commonly used characteristic of this type; voice analysis and keystroke dynamics are two others now in use. Such characteristics are indirect measurements of the human body. Because they are learned or trained, they can vary over time, but once a person reaches a certain age the change in behaviour is negligible, so characteristics of this type can serve as identifiers. The frequently used biometric characteristics are shown in 2.2.

2.2.3 Contrast of the biometrics characteristics:

A comparison of biometric characteristics is given in table 2.1.

Table 2.1: A comparison of biometric characteristics (Jain et al. 1999)

Table 2.1 suggests that physiological characteristics perform better than behavioural ones.

It can also be seen from table 2.1 that some biometric traits, such as the iris, DNA, body odour and fingerprints, can be regarded as more universal, unique and permanent than others. The iris, DNA and body odour are promising but need further research and experiment; their cost is high, so they are not cost-effective. At present, therefore, the fingerprint is one of the most widely accepted biometric traits.

2.3 Establish Identity

Society has changed significantly. In the past everyone in a community knew everyone else; globalization has changed that. People are now interconnected electronically and mobile all around the world, so establishing identity has become one of the most important tasks.

2.3.1 Resolving identity of an individual:

Two fundamental problems arise here: authentication and identification.

Authentication problem: Also known as verification, this problem arises when confirming or denying a claimed identity. When a person claims an identity, the process requires a comparison between the submitted biometric sample and the sample stored for the claimed identity; this is called a 'one-to-one' comparison. An ATM (automatic teller machine) is an example. The ATM solves the authentication problem in two stages: first, possession of a valid ATM card; second, knowledge of the PIN (Personal Identification Number). Anyone who knows another person's PIN and possesses the corresponding ATM card can claim the identity of the card's owner, and this kind of fraud has been increasing. According to Jain et al. (1999), ATM-related fraud in the USA was valued at 3 billion US dollars in 1996. Biometric systems, on the other hand, offer a way to overcome this authentication problem.

Recognition problem: Also known as the identification problem, this arises when a person must be identified from a set of templates in a database: the person's data is compared against the data in the database in a 'one-to-many' comparison. An example helps to clarify the concept: to identify a criminal, law-enforcement officials sometimes lift fingerprints or other data from the crime scene and compare them with the stored data of known criminals, which may allow them to identify the perpetrator.
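The contrast between the two comparison modes can be sketched in Java. This is a minimal illustration with invented names, using exact string equality in place of a real biometric matcher, which would compute similarity scores over feature vectors:

```java
import java.util.Map;
import java.util.Optional;

// Sketch of the two comparison modes. A String stands in for stored
// biometric template data; a real matcher scores similarity rather
// than testing equality.
public class MatchModes {
    // Verification: 'one-to-one' -- compare the sample against the
    // single template stored for the claimed identity.
    public static boolean verify(Map<String, String> db, String claimedId, String sample) {
        String template = db.get(claimedId);
        return template != null && template.equals(sample);
    }

    // Identification: 'one-to-many' -- search the whole database for
    // a template that matches the sample.
    public static Optional<String> identify(Map<String, String> db, String sample) {
        return db.entrySet().stream()
                 .filter(e -> e.getValue().equals(sample))
                 .map(Map.Entry::getKey)
                 .findFirst();
    }
}
```

Note that `identify` must scan every enrolled template, which is why one-to-many systems need a central database, whereas `verify` touches only one record.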

According to the UK Biometrics Working Group (2002), not all biometric matters are covered by the terms verification and identification, so three more pairs of terms have been introduced: (1) positive claim of identity and negative claim of identity, (2) explicit claim of identity and implicit claim of identity, and (3) genuine claim of identity and impostor claim of identity.

A positive claim of identity is also known as positive identification. In this process the claimant's identity must have been enrolled previously and be known to the system. For example, an online email customer enters a login name and password, and the system compares the combination against previously stored customer data; if the combination matches, the user is verified. The process needs only the login name and password, so the email provider does not know who is actually using the account.

A negative claim of identity is known as negative identification. Here the claimant asserts that his or her identity has not been stored before: a person can enrol only once, after which the identity is stored and cannot be enrolled again. American Social Security is an example of this kind: according to Jain et al. (1999), around a billion US dollars is taken from the US social-security welfare system annually through the use of multiple identities.

In an explicit claim of identity, a person unambiguously declares an identity to the system; the claim may be positive or negative. The submitted identity is compared with the stored data in a one-to-one comparison (described in the authentication section). Using an ATM card is an example of a positive explicit claim of identity. For a negative explicit claim, consider an airport with a face-recognition system: if a passenger resembles a known terrorist, the system raises an alarm, and the passenger must then make an explicit negative claim of identity, whereupon his or her other identifiers, such as fingerprint or iris, are compared against the known terrorist's in a one-to-one comparison.

An implicit claim of identity can likewise be positive or negative. In this process a person's identity is compared on a 'one-to-many' basis against all stored identities.

When someone honestly claims to be himself or herself, this is called a genuine claim of identity (UK Biometric Working Group, 2002); the submitted identity truly matches the stored identity.

An impostor claim of identity is one in which someone deceitfully or falsely claims to be someone else (UK Biometric Working Group, 2002); the submitted identity does not match the stored identity.

2.3.2 Verification Technique:

According to Mitnick (2002), verification techniques can be divided into three types: (1) knowledge-based, (2) token-based and (3) biometric-based.

Knowledge based verification system:

This process uses secret information (a password, PIN, memorable word, etc.) that only the person with the original identity is supposed to know. People carry their memorable secrets with them wherever they travel, so this technique is suitable for use from a distance or a remote location.

This type of authentication has serious drawbacks, however. Using Trojan horses and spyware, a hacker can learn another person's secret information; such programs can even email captured keystrokes. Knowledge-based verification is therefore not a secure system. People often use a familiar name as their secret, which others may be able to guess; they often keep the same secret for a long time, or never change the initial one, making it easier to crack. Many attack methods have been developed, such as dictionary attacks, hybrid methods and brute-force attacks.

In comparison with the other technologies it is cheap, although its level of security is limited.

Token based verification system:

In this system the claimant must possess something (a token) that is used together with secret information; an ATM card is an example. It can be considered more secure than the knowledge-based process, because if the token is lost or stolen its user can report it.

Biometric verification system:

In this system the user's distinguishing biometric characteristics, such as fingerprint, face or signature, are used to represent the user. Because these characteristics travel with the user, they are more secure than the other two systems, and use by an unauthorized person is practically impossible. The system is, however, relatively costly.

In reality no system is fully secure; all three have serious drawbacks. Secret information can be hacked, an unauthorised person can steal and use a token, and biometric information can be copied and later replayed (Woodward et al. 2003). To counter these drawbacks, multiple verification systems can be combined. The ATM card is an example of combining knowledge-based and token-based verification; if an iris scanner becomes available in future, using it together with the ATM card would be more secure still.
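Layering the three factors can be sketched as follows. All names here are illustrative, not from the project's actual code; the point is simply that each factor is checked independently and all presented factors must pass, so a stolen token or a guessed PIN alone is not enough:

```java
// Sketch of multi-factor verification: knowledge (PIN), token (card)
// and biometric (match score) are checked separately, and access is
// granted only when every factor passes.
public class MultiFactor {
    public static boolean checkKnowledge(String pin, String storedPin) {
        return storedPin.equals(pin);
    }

    public static boolean checkToken(String cardId, String registeredCardId) {
        return registeredCardId.equals(cardId);
    }

    // A biometric factor passes when its matching score reaches the
    // system's decision threshold.
    public static boolean checkBiometric(double matchScore, double threshold) {
        return matchScore >= threshold;
    }

    public static boolean verify(boolean knowledge, boolean token, boolean biometric) {
        return knowledge && token && biometric;
    }
}
```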

2.4 The components of a general biometric system and their function:

A general biometric system can be divided into five subsystems: (1) data acquisition, (2) data transmission, (3) signal processing, (4) data storage and (5) decision making. A general biometric system is shown in 2.2.

Data acquisition system: Every biometric trait is assumed to have two properties: uniqueness and repeatability. Uniqueness means that every person's biometric trait is different and will not be the same for two people; repeatability means that the trait stays the same over time. In the acquisition subsystem, sensors measure the user's biometric characteristics. These measurements, called samples, have definite attributes; the manner of presentation and the quality of the reader can both affect sample quality.

Data transmission system: In most cases data collection and processing do not happen in the same place, so one subsystem's function is to transfer the data. The transmission subsystem compresses and expands the data depending on sample size, using standard protocols: the JPEG format is used when sending facial images, WSQ for fingerprint data and CELP for voice.

Signal processing system: This has three parts: (1) feature extraction, (2) quality control and (3) pattern matching. In feature extraction, the relevant biometric data is separated from the background information of the sample, a process called segmentation; in a face-detection system, for example, the facial image is separated from the wall or other background. After extraction the quality is checked, and if it is very poor another sample is requested. Pattern matching then takes place, followed by decision making. Depending on the function of the overall system, feature data from the pattern-matching stage may also be stored in the storage subsystem.

Data storage system: Feature data from the pattern-matching stage is stored in the storage subsystem as templates, mainly for comparison with incoming features. If the overall system performs one-to-one matching, storage can be decentralized; if it performs one-to-many matching, a central database is needed.

Decision making system: The quality score and matching score are sent from the processing subsystem to the decision-making subsystem, which decides whether the sample is accepted or rejected. The policy depends on the system's security requirements: as the number of false non-match incidents rises, the number of false matches falls.
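The decision step described above can be sketched as a simple threshold rule. This is an illustrative sketch, not the project's actual code; the two thresholds encode the site's security policy, and raising the match threshold trades false matches for false non-matches:

```java
// Sketch of the decision-making subsystem: a sample first has to pass
// a quality check (otherwise a new sample is requested), then its
// matching score is compared against the policy threshold.
public class DecisionModule {
    public static String decide(double qualityScore, double matchScore,
                                double qualityThreshold, double matchThreshold) {
        if (qualityScore < qualityThreshold) {
            return "RETRY";   // quality control failed: ask for another sample
        }
        return matchScore >= matchThreshold ? "ACCEPT" : "REJECT";
    }
}
```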

2.5 Performance of a biometric system:

The main focus of a biometric system is to ensure security: only authorised users should be accepted and unauthorised users denied. Processing speed is usually given lower priority. The main factors of a biometric system are described by terms such as Failure to Enrol Rate (FTE), Failure to Acquire Rate (FTA), False Acceptance Rate (FAR), False Rejection Rate (FRR), False Match Rate (FMR) and False Non-Match Rate (FNMR).

False Match Rate (FMR): This represents the most serious type of fault in a biometric system. It occurs when one person's biometric information matches another person's identity: the signal-processing subsystem produces a high matching score against a non-corresponding template.

False Non-Match Rate (FNMR): Here an authorised person's biometric features fail to produce a matching score high enough to qualify; this is the opposite of FMR. One of the main causes of FNMR is poor quality of the biometric features.

Comparison of FMR and FNMR for different biometric systems: The main aim of a biometric security system is to reduce the False Match Rate; on the other hand, reducing the False Non-Match Rate makes the system faster and more convenient. There is always a trade-off between FMR and FNMR; the relationships for different biometric systems are shown in 2.4. A high False Match Rate is not acceptable, but at low FMR the False Non-Match Rate is considerably higher in every system.

Failure to Enrol Rate (FTE): Sometimes the biometric system cannot make a valid template for some users. Biometric characteristics are universal, but there are exceptions: for example, the fingerprints of a very small number of people who use their hands heavily, such as construction workers or carpenters, cannot be enrolled. The Failure to Enrol Rate is the ratio of the number of people whose biometric features cannot be enrolled to the total number of people using the system. A practical test in which FTE was measured for different systems is shown in 2.5 (Mansfield et al. 2001).

Failure to Acquire Rate (FTA): Sometimes the system cannot acquire data of the desired quality because of the reader or sensor, instrument problems, environmental problems, noisy data, background data and so on. Simply put, the Failure to Acquire Rate represents those biometric samples whose quality score is too low to pass to the decision-making stage.

False Acceptance Rate (FAR) and False Rejection Rate (FRR): These two terms correspond to the False Match Rate and False Non-Match Rate, but FAR and FRR describe the whole biometric system, whereas FMR and FNMR describe a single matching process. FAR and FRR must therefore include the system's Failure to Acquire Rate. According to Mansfield et al. (2001), the relationships can be stated as follows:

FAR(τ) = (1 − FTA) × FMR(τ)

FRR(τ) = (1 − FTA) × FNMR(τ) + FTA

Here, FAR - False Acceptance Rate

τ - decision threshold

FTA - Failure to Acquire Rate

FMR - False Match Rate

FRR - False Rejection Rate

FNMR - False Non-Match Rate

Each point on a receiver operating characteristic (ROC) curve corresponds to a particular decision-threshold score with its own False Rejection Rate and False Acceptance Rate. For forensic purposes the False Rejection Rate should be lowest; for high-security access the False Acceptance Rate should be lowest.
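The two relations above are easy to check numerically. With illustrative figures (not measured data) of FTA = 0.02, FMR = 0.001 and FNMR = 0.05, they give FAR = 0.98 × 0.001 = 0.00098 and FRR = 0.98 × 0.05 + 0.02 = 0.069:

```java
// Direct translation of the Mansfield et al. (2001) relations between
// system-level rates (FAR, FRR) and matcher-level rates (FMR, FNMR).
public class ErrorRates {
    // FAR(t) = (1 - FTA) * FMR(t): a sample can only be falsely
    // accepted if it was acquired at all.
    public static double far(double fta, double fmr) {
        return (1 - fta) * fmr;
    }

    // FRR(t) = (1 - FTA) * FNMR(t) + FTA: every acquisition failure
    // also counts as a rejection of a genuine user.
    public static double frr(double fta, double fnmr) {
        return (1 - fta) * fnmr + fta;
    }
}
```

Note the asymmetry: a high FTA pushes FAR down but adds directly to FRR, which is why acquisition failures mostly inconvenience genuine users rather than helping impostors.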

Section Two: Biometric Technology

2.1 Physiological Biometric

This section discusses the fingerprint pattern, hand geometry, iris pattern, and facial, retinal and vascular characteristics as possible biometric identifiers.

2.1.1 Fingerprint Pattern

The fingerprint is the oldest, most popular and most widely publicly accepted mature biometric identifier. It meets the necessary criteria for a biometric identifier: universality, distinctiveness, persistence and collectability.

Fingerprints are impressions of the friction ridges on the surface of the hand. In most applications, and in this thesis as well, the primary concern is the ridges located above the end joints of the fingers. In certain forensic applications, however, the area of interest is broader, including the fingers, the palm and the writer's palm (Woodward et al. 2003).

Since the early 1970s the Federal Bureau of Investigation (FBI) has pursued extensive research and development on fingerprint identification, with the main aim of creating an automated fingerprint identification system (AFIS) to support forensic work (Ruggles 1996).

Feature and Technology

There are two main elements of fingerprint matching: minutiae matching and pattern matching.

The figure below illustrates the primary technique, which analyzes the basic minutia types.

Pattern matching takes a macroscopic overview, focusing on the integral flow of ridges, which can be categorized into three groups: loops, whorls and arches. Every individual fingerprint should fit into one of these three categories, as shown below.

Nowadays most applications depend on minutiae matching. When a fingerprint scanner captures a typical fingerprint image, around 30 to 60 minutia points can be identified. The Federal Bureau of Investigation (FBI) has confirmed that it is not possible for two individuals, even monozygotic twins, to have more than eight minutiae in common. For matching, minutiae are examined by type, shape, coordinate location (x, y) and direction. The automated minutiae-matching process based on these attributes is shown below:

The first example shows an input image (left) being matched against a stored template (right). 39 minutiae were detected in the input, while the template contained 42; the matching algorithm identified 36 matching data points.

(Source: Prabhakar 2001)

In the second example, 64 minutiae were detected in the input image (left) while the template (right) contained 65; the algorithm identified 25 completely non-matching data points.
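Matching by type, location and direction can be sketched as follows. This toy matcher (invented names, simplistic tolerances) only counts template points that have a nearby input point of the same type; a real algorithm such as the one above must first align the two prints by rotation and translation:

```java
import java.util.List;

// Toy minutiae matcher: a template minutia counts as matched when an
// input minutia of the same type lies within a distance tolerance and
// an angle tolerance. Greatly simplified for illustration.
public class MinutiaeMatcher {
    public record Minutia(String type, double x, double y, double angleDeg) {}

    public static int countMatches(List<Minutia> input, List<Minutia> template,
                                   double distTol, double angleTol) {
        int matches = 0;
        for (Minutia t : template) {
            for (Minutia i : input) {
                double dist = Math.hypot(t.x() - i.x(), t.y() - i.y());
                double angleDiff = Math.abs(t.angleDeg() - i.angleDeg());
                if (t.type().equals(i.type()) && dist <= distTol && angleDiff <= angleTol) {
                    matches++;
                    break; // each template point is matched at most once
                }
            }
        }
        return matches;
    }
}
```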

A scanning or capture device is needed to obtain such images. Since the 1970s a great deal of research has gone into developing and improving such devices; as a result optical, capacitive, ultrasonic, thermoelectric, radio-frequency and touchless scanners have been invented, and most of them have become inexpensive and widely available.

Optical device / scanner: The first method of capturing fingerprint images was optical scanning, whose principle of operation is frustrated total internal reflection. The finger is placed on a glass platen and illuminated by laser light; the surface of the finger reflects a certain amount of light depending on the depth of the ridges and valleys, and the reflectance is captured by a CCD (charge-coupled device) camera, an array of light-sensitive diodes called photosites (O'Gorman 1999).

The big advantage of such devices is that they are the cheapest of all automated biometric devices and are available in the local market. The disadvantage is that they can easily be fooled by impostors: a latent fingerprint left on the scanning surface is a major drawback, since anybody can collect the latent image from it to spoof the device.

The Digital Persona optical scanner was used to integrate fingerprint-scanning support into the product of this project, using the popular U.are.U fingerprint recognition system depicted below. In October 2003 the US Department of Defense chose Digital Persona scanners to secure desktop network access in its offices in Washington, D.C. (Digital Persona 2009).

Capacitive scanner / devices: Since their first appearance in 1990, these devices have become very popular. A capacitive scanner is a solid-state device incorporating a sensing surface composed of an array of about 100,000 conductive plates covered by a dielectric surface. When a user touches the sensor, the skin acts as the other plate of each capacitor. The capacitance measured at a capacitor decreases with growing distance between the plates, so the capacitance measured at the ridges of a fingerprint is higher than that measured at the valleys. These measurements are then analyzed in a way similar to a sonar scan of the ocean floor, resulting in a video signal depicting the surface of the fingerprint (O'Gorman 1999).

The advantage of capacitive scanners is their very high accuracy. Another big advantage is that they are much harder to fool than optical scanners, since the process requires living tissue. Because users must touch the silicon chip itself, solid-state scanners are susceptible to electrostatic discharge (ESD), but recent chip designs were developed specifically to withstand high levels of ESD and frequent handling; a modern capacitive-device manufacturer such as Veridicom claims its chips will survive around one million touches (Ryan 2002).

Thermoelectric device: This is silicon-based. It measures the temperature difference between the ridges touching the sensor surface and the valleys distant from it (O'Gorman 1999).

Although thermal scanning is very promising, it is still an uncommon method. A company named Atmel is a proponent of this technique: it uses a finger-sweep method to capture the fingerprint on a tiny silicon chip called the FingerChip, still regarded as the smallest solid-state scanner in the world. A big advantage is its self-cleaning property, since latent prints are erased by the sweep across the platen. Because it is so tiny, it can be installed in small devices such as PDAs, and at less than 5 US dollars per chip it is also inexpensive. Hewlett-Packard currently uses the FingerChip in its iPAQ h5000 series, as shown in the figure.

In 1996, the Ultra Scan company brought ultrasonic technology to market. It works in much the same way as an optical scanner, except that the surface of the finger is scanned by an ultrasonic beam. Ultrasonic sensors are rarely affected by oil or dirt on the surface of the finger, because they measure the range, and thus the ridge depth, of the echo signal captured at the receiver. The picture below shows that an ultrasonic scanner works fine even through a marker pen's mark, where an optical scanner fails. Another big advantage is that such a high-precision scanner can capture even a child's fingerprint accurately and reliably, which is not possible with any other scanner.

Radio frequency (RF) imaging was invented in 1999 and first launched in the market by AuthenTec via its TruePrint technology. In this method the sensor reconstructs the image from the arrangement of skin layers underneath the skin surface.

These subsurface layers are the basis of the fingerprint patterns and are rarely affected by damage or wear to the finger surface (AUTHENTEC 2003a). RF technology therefore bridges the FTE gap resulting from worn fingerprints and also lowers the usual FTA rates, so the method can be used dependably even under extreme circumstances involving dirt and heavy manual activity, such as in an automotive shop. Moreover, this technology allows a protective coating on the silicon chip, giving it higher ESD, chemical and scratch resistance than any other solid-state sensor. By mid-2003, AuthenTec, regarded as the leading solid-state fingerprint chip supplier, had delivered more than one million fingerprint sensors to the market.

Touchless scanner: it is very similar to an optical sensor, except that the finger to be examined is placed over a hole at a distance of about 2-3 inches from a precision glass optic. The advantage of not leaving any latent prints on the glass platen is offset by the possibility of bad images resulting from dust and dirt falling through the hole (Bergdata Biometrics 2003).

In conclusion, all sensor technologies have advantages and disadvantages. Three important factors, cost, size and accuracy, usually decide which technology will enjoy long-term commercial success in the biometric security verification market.

Strengths and Weaknesses

Fingerprint technologies are still regarded as the leading technology in the biometric market. Several core strengths have helped fingerprint methods reach this top position.

Capturing digital fingerprint images is one of the oldest biometric identification technologies. A great deal of research has been done on this technology and method, so there are now many cheap, reliable and reasonably accurate products of various types on the market.

Every individual in the world has unique fingerprints; even monozygotic twins have distinct fingerprint patterns.

Fingerprint methods and technology can be used for identification, authentication and other purposes as well.

Some fingerprint matching algorithms achieve a quite low EER (Equal Error Rate).

Fingerprint technology is very easy to use and understand, and it can be successfully implemented in many application fields.

Besides its many strengths, it has some weaknesses as well.

Since fingerprint technology is so well known and widely familiar to the general public, a large amount of general and critical information on how to spoof and defeat it is freely available on the internet and from other sources.

In 2002, Tsutomu Matsumoto, a professor at Yokohama National University in Japan, showed how to fool some optical and capacitive sensors using simple, readily available techniques. To fool the sensors he used cheap, easy-to-handle materials such as gelatine and free moulding plastic. His experiment showed that a single latent print is sufficient to create a fake fingerprint.

In some cases a dismembered finger could be used to gain access as well.

When an individual uses a fingerprint scanner, every capture device except the slide scanner and the touchless sensor leaves behind a latent fingerprint. Anybody who wants to can use these latent prints to recreate exactly the same fake fingerprint, as Professor Matsumoto showed in his experiment.

Fingerprint-integrated Smart Card

The world's first fingerprint-enabled, self-authenticating smart card was developed by a company named Biometric Associates, Inc. (BAI). The card technology was developed in accordance with the relevant ISO terms and conditions to increase its acceptability.

The description above covers the fingerprint-integrated smart card and its components. The BAI authenticator module is a self-contained system comprising all the fingerprint verification steps. It is a tiny module containing an embedded capacitive sensor array chip, so a number of points of attack between the usual system elements can be eliminated.

Only the legitimate user can unlock the protected functions of such a smart card, using a live-scanned fingerprint. A higher security level is maintained than with an ordinary smart card, which usually relies on a knowledge-based authentication technique.

To reduce cost, the BAI cards have been designed for mass production. Another good feature of this smart card is that every card can store up to ten fingerprint templates, so users do not need to remember which fingers they enrolled during registration. Not all of the enrolled fingers' templates have to match; if one finger matches, the user can be authenticated and gain access. So there is nothing to worry about if a finger has been injured.

Nowadays such smart cards are used in many places, mostly in the financial sector where electronic transactions are essential. Recently the UK Border Agency has integrated such smartcard chip technology into the new UK e-passport and national ID card.

2.1.2 Hand Geometry

Hand geometry scanning devices were widely used in the past, but after 2001 their use declined dramatically because of limitations in integrating them with applications. Despite these limitations, such devices are still widely used in some sectors for identification because of their resistance to fraud, their speed and their ease of use.

Feature and Technology

Recognition Systems Inc. is regarded as the main founder of commercial hand geometry devices. Such a scanner uses a CCD camera and infrared (IR) light-emitting diodes, with mirrors and reflectors, to capture black-and-white images of the individual hand silhouetted against a 32,000-pixel field. The optics produce top and side images of the hand. After capturing the image, the device performs ninety-six kinds of measurement, which are stored in a 9-byte template during enrolment. To use such a device, users place their hand on the scanner surface about three consecutive times. Proper finger position is ensured by pins projecting from the platen, as shown in the figure. The central processing unit and the software then average the three measurement sessions and produce a template that is stored for future verification.

To initiate identity verification, the user first enters a PIN using a keypad. After authenticating the identification number, the device retrieves the matching template for comparison. The image of the hand is captured, processed and then matched against the template. Whether the matching score is acceptable is determined by a preset threshold.

A big weakness of such devices is that they offer only authentication, with no identification functionality: the user makes an explicit positive claim about their identity before the matching process can take place. Hands are sufficiently unique for such verification purposes. However, hand geometry-based scanners cannot be used for recognition, since geometrical measurements of human body parts do not produce enough evidence to establish identity, as the failure of anthropometry in the early 1900s already proved.

Despite this big weakness, such systems are still widely used for verification. For example, at San Francisco airport around eighteen thousand employees use hand-reader systems for access to the tarmac, and hand readers were used at the 1996 Olympic Games to secure entry into the Olympic village.

Walt Disney World in Orlando, Florida uses the hand geometry system on a large scale for quickly authenticating returning pass holders; the process takes only about eleven seconds and is very easy for everybody to understand.

In conclusion, hand geometry devices and systems can provide reliable and accurate verification, mainly for the purposes of:

Attendance and time keeping: hand geometry systems are very useful for preventing so-called buddy-punching (punching in or out for an absent co-worker), thereby ensuring accurate payroll for employees.

Access control: such technology prevents users from entering with someone else's card or personal identification number. Sensitive sites such as nuclear power stations routinely use such employee recognition systems.

Strengths and Weaknesses

Hand geometry scanners have been very successful because:

The scanning process is very easy, which helps convince more users; even dirty hands can be scanned. Only elderly users suffering from arthritis may have problems placing their hand appropriately on the scanner's surface.

The technology is considered much more fraud resistant, as making an accurately measured fake hand and submitting the sample unnoticed would not be easy.

The hand geometry template is very small, only 9 bytes, which makes it easy to store even on a magnetic stripe card. By comparison, an iris pattern template usually takes 512 bytes of space, and a voice template occupies around 1,500 to 3,000 bytes.

Furthermore, hand geometry scanning carries no forensic connotations, unlike some biometric systems such as retinal scanning, whose associations greatly reduce users' willingness to adopt them.

However, since the beginning of this millennium the use of hand geometry scanners has decreased dramatically because:

The devices are too large, so they are not suitable for installation everywhere or for integration into every application or system.

The practical application of hand geometry technology is very limited because it can only be operated in verification mode, performing 1:1 matches.

The price of a hand geometry scanner is still high, around US$1,400 to US$2,000, whereas biometric fingerprint scanners can be found at much lower prices.

2.1.3 Iris Patterns

Since iris pattern recognition has no long history, it is still regarded as an infant technology, and a great deal of research and development is still going on.

In 1994, Dr. Daugman of the University of Cambridge became the first scientist to devise a method of recognizing individuals by their iris patterns. Iridian Technologies acquired the rights to Dr. Daugman's algorithm to develop and market iris scanning devices. Several testing organizations and companies have tested iris devices for implementation in different sectors; interestingly, none has yet found a false match. The technology is considered highly accurate, non-invasive and speedy (IRIDIAN TECHNOLOGIES 2008a).

The accuracy and performance of iris recognition is better than that of any other biometric technology. Although such devices are still expensive, prices are falling as production volumes grow.

Feature and Technology

The iris is the round, pigmented tissue surrounding the pupil of the eye that controls the amount of light entering the eye. It is located behind the cornea, the transparent front layer of the eye, and in front of the lens (WOODWARD ET AL. 2003).

With this technology the iris pattern can be scanned from up to 1 metre away. It can reveal nearly 250 independent degrees of freedom (DoF) of textural variation across individuals (DAUGMAN 1999), whereas a fingerprint has only about 50 DoF. Iris patterns form randomly before birth, the texture phase sequence does not depend on ancestral genes, and even the left and right eyes of the same person have different iris patterns.

The technology uses invisible infrared wavelengths to scan the iris and reveal its rich and complex texture. The image is captured by a monochrome CCD camera. The two-dimensional (2D) modulations that create iris patterns are encoded into a so-called "IrisCode": complex-valued 2D Gabor wavelets demodulate the patterns and represent them as phasors in the complex plane (DAUGMAN 2004a). 2048 phase bits (256 bytes) are calculated for each iris.
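In Daugman's method, two IrisCodes are compared by their normalized Hamming distance over the 2048 phase bits: codes from different eyes disagree on roughly half of their bits, while codes from the same eye disagree on far fewer. The following is a minimal sketch of that comparison; the bit patterns in `main` are toy values for illustration, not real IrisCodes.

```java
import java.util.BitSet;

public class IrisMatch {
    // Fraction of the nBits IrisCode bits on which two codes disagree.
    static double hammingDistance(BitSet a, BitSet b, int nBits) {
        BitSet diff = (BitSet) a.clone();
        diff.xor(b);                        // bits set where the codes disagree
        return (double) diff.cardinality() / nBits;
    }

    public static void main(String[] args) {
        BitSet codeA = new BitSet(2048);
        BitSet codeB = new BitSet(2048);
        codeA.set(0, 1024);                 // toy pattern, not a real IrisCode
        codeB.set(512, 1536);
        // Different irises cluster near a distance of 0.5;
        // same-eye comparisons score far lower.
        System.out.println(hammingDistance(codeA, codeB, 2048));
    }
}
```

In the real system a decision threshold on this distance separates "same eye" from "different eyes" matches.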

A test of statistical independence is the core element of iris recognition technology. The test involves so many DoF that IrisCodes from two different irises are almost certain to pass it, while two IrisCodes generated from different images of the same eye are almost certain to fail it.

Strength and Weakness

There are many advantages to using iris patterns as a biometric identifier.

Iris patterns are unique and independent, and they remain almost unchanged throughout a person's natural life, barring injury or certain diseases.

Since image capture takes place with around one metre between the eye and the camera, the technology causes no barrier or discomfort, and users can keep wearing glasses or contact lenses (coloured or clear).

In both identification and authentication it works very fast, reliably and accurately.

Since the iris is an internal organ, it is very difficult to spoof. Even if somebody wears contact lenses printed with a pre-defined iris pattern, the device cannot be fooled, as it can easily detect the 2D Fourier-domain artefacts of the printing technique.

Weaknesses of iris pattern recognition technology:

Capturing a suitable image is not easy, as it depends on several factors such as darkness, lighting, CCD camera quality and the user's position relative to the scanning device. Another big problem is that eyelids and eyelashes may obscure the iris, and injury or disease can also affect it.

Another challenge in image capture is pupillary dilation and constriction, which temporarily deforms the iris non-elastically (DAUGMAN 1999).

Aniridia patients cannot be identified by the iris. Aniridia is a very rare medical condition in which a person lacks one or both irises (CHIRILLO AND BLAUL 2003). This genetic defect occurs in one in 75,000 persons.

Some people do not feel comfortable with iris scanning technology, fearing that it might damage their eyes, spread disease, or threaten their personal privacy and freedom.

2.1.4 Retinal / Choroidal Pattern

It is assumed that retinal technology is very similar to, and sometimes more accurate than, iris technology. Commercially, however, retina-based devices or systems are still not available in the market.

Feature and Technology

Vein patterns known as the choroidal vasculature, situated behind the retina at the back of the eye, are used for this biometric identifier. Like iris patterns, these vascular patterns are very unique and reliable and remain unchanged throughout a human's natural life cycle.

A CCD camera is used to capture choroidal images. The device uses the reflectance of infrared illumination, which makes the retina transparent. Retinal blood vessels are not externally visible, so users need to cooperate to make them visible: they must position their eyes at a specific distance from the image-capturing device. Capturing the image may take up to five seconds, and any eye movement during that time can render the image useless. To avoid this problem five images need to be taken, which can be irritating and time consuming for users.

Strength and Weakness

Some positive points regarding this technology are:

The pattern of the choroidal veins is very unique and reliable and remains unchanged throughout a human's natural life cycle.

The accuracy of this technology is very high.

Many data points can be gathered into small templates of 48 to 96 bytes.

Since the retina is an internal organ, systems based on its appearance are hard to spoof.

Disadvantages of using the choroidal vasculature for identification include (BIOMETRIC TECHNOLOGY 2002 and CHIRILLO AND BLAUL 2003):

The required equipment is expensive, when compared to other biometric methods.

Image acquisition is intrusive, demanding high levels of user cooperation and concentration. No serious market presence is expected until an acceptable image capture process and device are available.

Retinal scans can reveal information about the medical condition of the user.

Many people fear eye damage through the use of the technology. However, correctly designed scanners use a frequency range that is not considered harmful to the eye.

2.2 Behavioural Biometrics

Verification of voice, signature and keystroke dynamics are the three best-developed biometric methods based on behavioural characteristics.

2.2.1 Voice Pattern

Voice verification and speech recognition are a natural way to identify somebody: most of the time, we can recognize our friends or family members on the phone just by their voice. In automated telephone banking two technologies work together: voice identification and speech recognition.

Feature and Technology

Men's vocal cords usually vibrate at around 80 Hz and women's at around 400 Hz. When an individual speaks, these vibrations are controlled and modified by the tongue, pharynx, larynx, mouth and jaw. These characteristics make a human voice unique and suitable for identification (BIOMETRIC TECHNOLOGY 2002).

In such a system, the captured voice is analysed and compared with the stored 70-80 bps voice template taken during enrolment. Voice quality, duration, loudness and pitch are considered during the matching process.

Although speaker recognition systems provide a user-friendly interface for identification and authentication, their EER (Equal Error Rate) is quite poor at around 2%, so the technology is not suitable where security is the main issue.

Strength and Weakness

Thanks to its user-friendly, easy-to-use characteristics, voice verification is a famous and well-known technology in the biometric sector.

Speech-based biometric systems are usually well accepted by users, due to their non-invasive, easy-to-use functionality.

Voice recognition systems are comparatively cheaper than most biometric systems, since no expensive hardware is needed; existing phone systems, microphones and software are enough.

Such speech-processing technology also has many weaknesses:

Voice patterns are naturally not highly repeatable. During recognition and verification, many errors can occur because of sickness, noisy background sound, extreme emotional states or aging.

As the EER (Equal Error Rate) of this technology is quite poor at around 2%, it is not suitable where security is the main concern.

It is easier to deceive such a system: an impostor can record a legitimate user's voice in some way and try to masquerade as that user to fool the system.

2.2.2 Signature Dynamics

The signature is the easiest and most widely used biometric trait, in use since ancient times. Signatures are still used almost everywhere to validate documents and finalize contracts.

Feature and Technology

Signature verification depends on factors such as the geometry, shape and curvature of single letters and complete words. It also depends on dynamic characteristics like pressure, speed, stroke direction, and pen-up and pen-down movements. Nowadays these characteristics can be captured digitally using an e-pad and stylus.

Such devices have several built-in sensors to measure the dynamic features of a signature. A sample of the signals captured by another system, called HESY, is shown below.

The same person's signature can vary from time to time, so there is no proof of the permanence and uniqueness of signature dynamics.

Strength and Weakness

Signature authentication is still widely used and accepted all over the world.

Imitating the dynamic data of a signature (pressure, speed, stroke direction, pen-up and pen-down movements, etc.) is very difficult and sometimes impossible.

Although ink signatures are still used worldwide, signature dynamics-based biometric products are not very popular in the market, as there is no proof of the permanence and uniqueness of signature dynamics.

There is also no proof that the existing algorithms are accurate.

2.3 Issues of Privacy

User privacy is one of the main issues raised against some biometric identification systems. American courts define three different types of privacy: informational, decisional and physical. Informational privacy seems the main concern for individuals, who sometimes feel uncomfortable disclosing their private information and remain hesitant about who controls it and about to whom, how, why, when and how much information they should disclose.

Biometric opponents raise some of the following concerns against implementing biometric systems:

Biometric identification could threaten anonymity and autonomy, because while using a biometric system users leave their information on the system, which could later be used by an impostor to steal a legitimate user's identity.

After a device captures a biometric identifier it creates template data, which could easily be copied and shared across several databases; from there an impostor may obtain data to regenerate a legitimate user's sensitive documents such as a passport, national ID, tax documents or driving licence. So strict rules and regulations must be applied, so that collected biometric information is used only for its originally intended purposes, with the full awareness of the user.

A biometric trait may carry indications of medical conditions that the user may not want to reveal publicly. So it is very important to make sure that a biometric template does not contain any excess data unnecessary for the required identification, and to ensure that raw images and data are properly destroyed after the template is generated.

Some users believe that using biometric products may cause serious health problems, so clear information on health issues should be provided to users to dispel this fear.

Section Four: Product

4.1 Critical review of the product

Java servlets, an element of the J2EE platform, are used to build this product. A servlet is a kind of server-side applet: it runs inside a web server in the same way that applets run inside a web browser. To run a Java servlet you need a web server that understands servlets and knows how to deal with them, such as Tomcat or JBoss. For this product JBoss is used, because it is a more robust server used in industry for large-scale applications. When deploying a new version of a servlet to JBoss there is no need to restart the server, which usually has to be done with Tomcat. Servlets are usually deployed in a WAR (Web Application Archive) file, which also contains the JSP files; JBoss can work directly from a WAR file, whereas Tomcat needs to unpack it first into a directory with the same name. In this product, Olympics_F.war is deployed on the JBoss server.

JDBC (the Java Database Connectivity API) is used to connect the servlets to the database. JDBC is a Java API that works as a piece of middleware: it establishes connections to a database or database management system (DBMS), executes SQL queries and processes the results. A "DriverManager.getConnection" call creates the connection that manages communication between the specific database and the client application, including the passing of SQL statements.
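As a sketch, the connection and query flow might look like the code below. Only the Derby URL scheme and the default port 1527 come from this document; the database name ("olympics"), the table and the column names are illustrative assumptions, not the product's actual values.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class LoginDao {
    // Derby's network server listens on port 1527 by default;
    // "olympics" below is a placeholder database name.
    static String derbyUrl(String host, int port, String db) {
        return "jdbc:derby://" + host + ":" + port + "/" + db;
    }

    // Checks a login against the database, as the servlet's DAO does.
    // Table and column names are hypothetical.
    static boolean isValidLogin(String loginId, String password) throws SQLException {
        String url = derbyUrl("localhost", 1527, "olympics");
        try (Connection con = DriverManager.getConnection(url);
             PreparedStatement ps = con.prepareStatement(
                 "SELECT 1 FROM users WHERE login_id = ? AND password = ?")) {
            ps.setString(1, loginId);
            ps.setString(2, password);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next();          // a matching row means valid credentials
            }
        }
    }

    public static void main(String[] args) {
        System.out.println(derbyUrl("localhost", 1527, "olympics"));
    }
}
```

Using a PreparedStatement with placeholders, rather than concatenating the login values into the SQL string, also protects against SQL injection.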

A question could arise: why is build.xml used rather than simply allowing the user to surf to the web page via the WAR file or the name of the servlet? The answer is to increase security. If the actual name of the servlet were exposed, it would be much easier for a hacker to break into the system and change the servlet. The build.xml file provides much more than simple redirection: it can specify security configurations, filter information before calling a servlet, define security roles, set when a session should time out, and so on.

For this web service a Derby database is used, linked on a separate port (default 1527) via the JDBC driver, which also makes the database itself difficult for a hacker to attack.

To strengthen this web service, JPA (the Java Persistence API) is used. It automatically maps between an object-oriented view of the data and the relational database view, which makes the application easier to maintain if the database structure changes. It works with different persistence providers, such as Hibernate (the default for JBoss).

A set of operations grouped as transactions is used in this web service to ensure database integrity. JTA (the Java Transaction API, a standard Java interface to a transaction manager) can cope with transactions distributed across multiple databases. Here the servlets use JTA-style transaction management.

The source directory is structured into a business package, an integration package and a presentation package: the transfer object is in the business package, the DAO object is in the integration package and the servlet files are in the presentation package.

The entire contents of the integration and business layers are packaged into a lib.jar file, which is put into the output/lib directory.

A web development team can contain a variety of skill sets, such as specialists in programming, multimedia and web design, who may not understand each other's areas. So the composite view pattern is used to modularize the page layout and reduce the need to cut and paste HTML around the system. Tiles, a very powerful and useful technology typically used with the Struts framework, is used here to make changes in one place, to create a common look and feel for the whole website or application, and to create reusable static and dynamic view components.

A plain web service or application often has no centralized point of contact for request handling. When a user accesses a view directly, without going through a centralized mechanism, several problems can occur. Firstly, each view has to provide its own system services, e.g. security checking, which often results in duplicated code. Secondly, view navigation is left to the views, which can result in commingled view content and view navigation.

In addition, distributed control is more difficult to maintain, since changes will often need to be made in numerous places.

In this product the front controller pattern solves these problems by providing a central place to handle system services such as security and business logic across multiple requests. Centralized access to the application means that requests can be easily tracked and logged.

All presentation requests are sent via the front controller code before being forwarded for processing.

This will provide support for the handling of the requests, including invoking security services such as authentication and authorization, delegating business processing, managing the choice of an appropriate view, handling errors, and managing the selection of content creation strategies. In addition, auditing a single entrance into the application requires fewer resources than distributing security checks across all pages.

In this product every action is routed through the "FrontControllerServlet" instead of calling the web page directly.

When the front controller is called, the page responds by sending a code in a hidden field; e.g. in ViewEventPage_Body.jsp the hidden field is called actionID and its value is viewMatches. The front controller picks up the value of the hidden field and uses it to decide where the request has come from and where it should go.

All incoming HTTP requests are sent to the front controller, and the front controller servlet delegates control to the application controller. It reads protocol-specific data, adds it to a request context object and then looks up the appropriate action to process the request. The action executes code to perform the task and returns a mapping key indicating the next view to show. The application controller then looks up the right page from the mapping and passes it back to the front controller, which forwards the HTTP request.
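The dispatch decision described above can be sketched as a simple lookup from the actionID to the next step. Only the actionID values "loginservlet" and "viewMatches" come from this document; the target names and the fallback-to-login behaviour are illustrative assumptions, and the real front controller works with servlet request objects rather than plain strings.

```java
import java.util.HashMap;
import java.util.Map;

public class FrontControllerDispatch {
    // actionID (from the hidden form field) -> next step to forward to.
    // The right-hand targets are hypothetical names for illustration.
    private static final Map<String, String> ACTIONS = new HashMap<>();
    static {
        ACTIONS.put("loginservlet", "LoginServlet");
        ACTIONS.put("viewMatches", "ViewMatchesPage.jsp");
    }

    // Unknown actions fall back to the login page, so no request
    // bypasses the central entry point.
    static String resolveAction(String actionId) {
        return ACTIONS.getOrDefault(actionId, "LoginPage.jsp");
    }

    public static void main(String[] args) {
        System.out.println(resolveAction("viewMatches"));
    }
}
```

Keeping the mapping in one table is what makes the centralized logging, auditing and security checks described above possible.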

To increase the security level and prevent any unauthorized entry, even via the front controller, filter technology is applied in this product. A security filter class, LoginFilter_v1.java, is declared and placed in the presentation.controller package.

To implement the filter, the <filter> elements (filter, filter-mapping, etc.) are declared in the web.xml file, which is situated in the meta-data package. Which requests are to be caught is determined by the URL pattern specified in the <filter-mapping> element; in this case the URL pattern is "/*", which catches every request to the application.

Since all URLs are trapped by the above filter, LoginFilter_v1.java detects when there is no FrontControllerServlet in the URL and automatically diverts the request to LoginPage.jsp, which presents the user ID, password and other fields. The form action sends the request to the FrontControllerServlet; the hidden field sets the actionID to loginservlet, and that value is used by the front controller to decide what to do next.
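The filter's routing rule can be sketched as a pure decision method. The servlet name and the LoginPage.jsp target come from the text; the real LoginFilter_v1 also consults the session to see whether the user is already logged in, so this is a simplified assumption of its core check.

```java
public class LoginFilterRule {
    // Mirrors LoginFilter_v1's routing decision: any request whose URL
    // does not go through the FrontControllerServlet is diverted to the
    // login page instead of being served directly.
    static String target(String requestUri) {
        if (requestUri.contains("FrontControllerServlet")) {
            return requestUri;             // let the front controller handle it
        }
        return "/LoginPage.jsp";           // divert direct page access
    }

    public static void main(String[] args) {
        System.out.println(target("/Olympics_F/ViewEventPage_Body.jsp"));
    }
}
```

In the real filter this decision would be made inside doFilter(), using the request's URI and a RequestDispatcher to perform the diversion.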

When the front controller receives the actionID "loginservlet", set by the LoginPage.jsp page, it calls LoginServlet.java to process the login.

A RequestDispatcher known as the controller is declared and set up to forward control back to the front controller.

A call is made to the DAO object to check whether the loginID and password are valid; it compares them to the login values stored in the database.

If the login is OK, it sets an attribute in the request object, known as subAction, to "loginok" and also sets an attribute in the user session called loginID to the user ID that the person logged in with.

The front controller will later need this to check that the person has logged in before it allows them to surf to any other page.

If any error occurs, it sets the subAction to "loginerror" and also sets the "messageText" attribute to an appropriate error message for LoginPage.jsp to pick up and display to the user.
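The login outcome handling above reduces to choosing a subAction value for the front controller. A minimal sketch using the "loginok" / "loginerror" values from the text; in the real servlet these are set as request attributes rather than returned from a method.

```java
public class LoginOutcome {
    // subAction values read by the front controller, as described above.
    static final String OK = "loginok";
    static final String ERROR = "loginerror";

    // Maps the DAO's validity check onto the subAction string.
    static String subActionFor(boolean credentialsValid) {
        return credentialsValid ? OK : ERROR;
    }

    public static void main(String[] args) {
        System.out.println(subActionFor(true));
    }
}
```

Keeping these outcome strings as named constants avoids the silent failures that a typo in a string literal would cause when the front controller compares them.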

4.2 Existing Products: Problems and Comparison

In a web development team there could be a variety of skill sets like specialist in programming, multimedia, web design etc. and they may not understand each others area. So composite view pattern is used to modularise the page layout and reduce the need to cut and past HTML around the system. Tiles which is a very powerful and useful technology and typically used with struts framework used in here to make the changes in one place , to create a common look & shape for the whole website or application and to create reusable static and dynamic view components.

An ordinary web service or application has no centralised point of contact for request handling. When users access views directly without going through a centralised mechanism, several problems can occur. First, each view has to provide its own system services, such as security checking, which often results in duplicated code. Second, view navigation is left to the views themselves, which can result in commingled view content and view navigation.

In addition, distributed control is more difficult to maintain, since changes will often need to be made in numerous places.

The front controller pattern solves these problems by providing a central place to handle system services such as security and business logic across multiple requests. Centralised access to an application also means that requests can be easily tracked and logged.

All presentation requests are therefore sent through the front controller code before being forwarded on for processing.

4.3 Universality

As an enterprise application, this product/web service can be distributed, talk to a database, serve a large number of users, and be internet-facing. It can tolerate wide traffic fluctuations and heavy server usage; it is reliable and maintainable and supports failover. It can handle many concurrent sessions, can increase capacity through clustering, and can avoid bottlenecks.

4.4 Acceptability

Java EE (enterprise edition) is designed to support applications that implement enterprise services for customers, employees, suppliers etc. who make demands on, or contributions to, the enterprise. These applications are inherently complex, potentially accessing data from a variety of sources and distributing applications to a variety of clients. To better control and manage these applications, the business functions to support these various users are conducted in the middle tier. The Java Enterprise Edition (Java EE) application model defines architecture for implementing services as multi-tier applications that deliver the scalability, accessibility, and manageability needed by enterprise-level applications.

(Ref: http://java.sun.com/javaee/5/docs/tutorial/doc )

4.5 Distinctive

This web application is a multi-tiered (three-tiered) application because it is distributed over three locations: the client machines, the Java EE server machine, and the database or legacy machine at the back end.

It is very difficult for a programmer to build everything from scratch. Because the infrastructure support is already provided, developers can concentrate on the business code; this project uses Java EE as the supporting framework. The middleware provides services such as security, persistence, transactions and messaging.

4.6 Integration of open source bio-sdk

The fingerprint verification process compares the captured fingerprint data with the stored fingerprint templates generated during enrolment and then computes the degree of match or non-match. The following sections describe the procedure of the fingerprint verification system.

Java API functions have been used for the fingerprint enrolment workflow of this web service. The workflow performs: enrolment of a fingerprint, enrolment of a fingerprint with UI support, verification of a fingerprint, verification of a fingerprint with UI support, and fingerprint data object serialization and deserialization.

Verification of fingerprint

This section describes how verification of the fingerprint is performed. Java API class functions are used to perform the tasks in the workflow.

Graphical User Interfaces

This SDK provides the class com.digitalpersona.onetouch.ui.swing.DPFPEnrollmentControl to supply the graphical user interface functionality. The class contains the constructor, properties, and event handlers.

The table below depicts the interaction between the user and the graphical user interface during fingerprint enrolment.

Graphical User Interface: DPFPVerificationControl. In the GUI, the connection status of the fingerprint reader, the production of feedback and the verification of the fingerprint are controlled by the DPFPVerificationControl object, as depicted in the table below:

Setting the False Accept Rate

The SDK used with this product allows users and developers to specify a false accept rate (FAR).

False Accept Rate (FAR)

The FAR, which can be treated as the security level, is the proportion of fingerprint verification operations by unauthorized users that incorrectly return a comparison decision of match. The FAR is the ratio of the expected number of false accept errors to the total number of impostor verification attempts, or the probability that a biometric system will falsely accept an unauthorized user. For example, a probability of 0.001 (or 0.1%) means that out of 1,000 verification attempts by unauthorized users, the system is expected to return 1 incorrect match decision. Decreasing the probability to, say, 0.0001 (or 0.01%) changes this ratio from 1 in 1,000 to 1 in 10,000.

The FAR trades off against the False Reject Rate (FRR): increasing the FAR decreases the FRR, and decreasing the FAR increases the FRR. Usually a low FAR (a high security level) is chosen to increase the security of the system, but it can sometimes be relaxed where easy access is the main concern.
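The FAR arithmetic in the example above can be checked with a few lines of Java. This is a simple worked illustration using the numbers from the text (0.001 and 0.0001); nothing here is SDK-specific.

```java
public class FarArithmetic {

    /** Expected number of false accepts over a given number of impostor attempts. */
    static long expectedFalseAccepts(double far, long attempts) {
        // Round to the nearest whole accept to avoid floating-point noise
        return Math.round(far * attempts);
    }

    public static void main(String[] args) {
        System.out.println(expectedFalseAccepts(0.001, 1000));   // 1 in 1,000
        System.out.println(expectedFalseAccepts(0.0001, 10000)); // 1 in 10,000
    }
}
```

Both calls print 1: a FAR of 0.001 yields one expected false accept per 1,000 impostor attempts, and a FAR of 0.0001 yields one per 10,000.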

Representation of Probability

The fingerprint recognition engine that is used to integrate this web service is compatible with BioAPI 1.1, BioAPI 2.0, and UPOS standard.

PROBABILITY_ONE provides a convenient way of using this representation. PROBABILITY_ONE has the value 0x7FFFFFFF (where the prefix 0x denotes base-16 notation), which is 2147483647 in decimal notation. If the probability P is encoded by the integer value INT_N, then INT_N = P × PROBABILITY_ONE.

Probability P should always be in the range from 0 to 1. Some common representations of probability are listed in column one of the table below. The value in the third row represents the current default value used by the DigitalPersona Fingerprint Recognition Engine, which offers a mid-range security level. The value in the second row represents a typical high FAR/low security level, and the value in the fourth row represents a typical low FAR/high security level. The resultant value of INT_N is represented in column two, in decimal notation.

Table: Common values of probability and the values of resultant INT_N
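The encoding described above can be sketched as follows: a probability P in [0, 1] is stored as the 32-bit integer INT_N = round(P × PROBABILITY_ONE), with PROBABILITY_ONE = 0x7FFFFFFF. The sample probabilities below are illustrative only, not the SDK's own table entries.

```java
public class ProbabilityEncoding {

    static final int PROBABILITY_ONE = 0x7FFFFFFF; // 2147483647 in decimal

    /** Encodes a probability in [0, 1] as its fixed-point integer INT_N. */
    static int encode(double p) {
        if (p < 0.0 || p > 1.0) {
            throw new IllegalArgumentException("P must be in [0, 1]");
        }
        return (int) Math.round(p * PROBABILITY_ONE);
    }

    public static void main(String[] args) {
        System.out.println(encode(1.0));    // 2147483647 (PROBABILITY_ONE itself)
        System.out.println(encode(0.001));  // 2147484
        System.out.println(encode(0.0001)); // 214748
    }
}
```

Decoding goes the other way: P = INT_N / PROBABILITY_ONE, which is why P is always in the range 0 to 1.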

Specifying the FAR

To set a user-defined value of the FAR, the setFARRequested method is called:

matcher.setFARRequested(DPFPVerification.MEDIUM_SECURITY_FAR);

Achieved FAR

The actual value of the FAR achieved for a particular verification operation can be retrieved using the getFalseAcceptRate method of the DPFPVerificationResult interface. This value is typically much smaller than the requested FAR due to the accuracy of the fingerprint recognition engine. The requested FAR specifies the maximum value of the FAR to be used by the engine in making the verification decision; the actual FAR achieved by the engine when conducting a legitimate comparison is usually a much lower value. The engine implementation may choose the range and granularity of the achieved FAR. The achieved FAR can be retrieved as follows:

DPFPVerification verification = DPFPGlobal.getVerificationFactory().createVerification();

DPFPVerificationResult result = verification.verify(featureSet, template);

int FAR = result.getFalseAcceptRate();

4.7 Drawbacks and how they are overcome

Because the front controller centralises control, it can also introduce a single point of failure. In practice this is rarely a problem, as multiple controllers typically exist, either within a single server or in a cluster.

4.8 Future development

In future, it can be expected that biometric hardware will be integrated with most systems, so organizations may find it easier to convert their existing systems, or to develop new ones, using J2EE or whatever later versions appear. Organizations will certainly try to develop more secure, anti-hacking, biometric-integrated robust systems. Building such systems involves a great deal of work, such as improving the biometric software algorithms and the J2EE platform itself. Nowadays most fingerprint scanner software can be spoofed using various techniques, so much remains to be done to prevent such spoofing and increase the security level.

4.9 Installation and working procedure of that system

A. Creating Database

The database of this service is based on the sample Derby database supplied with the Sun Application Server. Set it up as follows.

1. Start NetBeans

2. Select the Services tab

3. Right-click on the "Java DB" tab

4. Select Create Database

5. Write the database name as “prjDB”

6. Enter both the User Name and the Password as "admin".

7. Click OK

8. An entry for the database should appear under the Databases node in the Services tab

9. Right-click on the database entry (jdbc:derby://localhost:1527/prjDB [admin on ADMIN] ) and select Connect

10. Again right click on “jdbc:derby://localhost:1527/prjDB [admin on ADMIN]” and select execute command

11. Copy and paste the SQL commands from the tblSQL file, which is in the project folder.

12. Press Ctrl+Shift+E, or click the green Run SQL button just to the right of the Connection box, to run the SQL.

B. Adding tables and data to the database

1. Having run the SQL script, we should see the following tables under our database node: STWDMATCHALLOT, STEWARD, SPORT, and LOGIN.

2. If we right-click on each table, we can add columns, view data, etc.

C. Run the Jboss 4.2.3 server

1. Open the Jboss 4.2.3 server folder

2. Double-click the run file in the bin directory to start the JBoss server.

D. Run the servlets

1. Open the project in Net Beans.

2. Expand the Source Packages node so that each of the Java files in the packages business, integration, presentation and presentation.controller can be seen.

3. Double-click build.xml and, in the third line, set the correct server path (<property name="server.root" value="C:/jboss-4.2.3/"/>).

4. Save the project after any change.

5. Right click on project node and select Run to Run the project.

E. Show in the browser

1. Open any browser and type http://localhost:8080/ to check that the server is running on the local machine. [The port number 8080 is the default.]

2. When we run the project, the output dialogue box shows that a file is deployed to the JBoss server; in this case it is "Olympics_F".

3. So we can write http://localhost:8080/Olympics_F/ in the address bar of the browser to start the service. Note that JBoss and Tomcat are case-sensitive, so the WAR file name must be typed exactly.

4. Note that login IDs and passwords can be added, deleted or modified in the SQL table.

F. Setup Biometric Fingerprint scanner for that service

1. To avoid typing in the credentials and to increase security, any standard biometric scanner can be integrated with this service.

2. I am working to develop and integrate some more algorithms to increase its security level.

3. In this case I have tested with a DigitalPersona scanner and the open source SDK.

4.10 Design Documentation:

4.10.1 Use case diagram

Admin and Steward Users

Class Diagram and the List of Classes and Interfaces that have used in this web application with the description of the methods:

All the get functions retrieve the appropriate field values using hash mapping. For example, HashMap<String,String> getSports() retrieves the sports field values from the corresponding table, Sport, which is mapped in the sport.java file.

The boolean checkLogin(String loginID, String password, String Status) function checks whether the login details are correct according to the database; for a successful login it returns true.
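The checkLogin contract described above can be sketched as follows. The real dao queries the LOGIN table; here a hypothetical in-memory map stands in for it so the sketch runs on its own. The method name and parameters follow the text.

```java
import java.util.HashMap;
import java.util.Map;

public class LoginCheckSketch {

    // loginID -> "password|status", standing in for rows of the LOGIN table
    private final Map<String, String> loginTable = new HashMap<>();

    void addUser(String loginID, String password, String status) {
        loginTable.put(loginID, password + "|" + status);
    }

    /** Returns true only when the stored password and status both match. */
    boolean checkLogin(String loginID, String password, String status) {
        String stored = loginTable.get(loginID);
        return stored != null && stored.equals(password + "|" + status);
    }

    public static void main(String[] args) {
        LoginCheckSketch dao = new LoginCheckSketch();
        dao.addUser("admin", "admin", "steward");
        System.out.println(dao.checkLogin("admin", "admin", "steward")); // true
        System.out.println(dao.checkLogin("admin", "wrong", "steward")); // false
    }
}
```

In the real application the map lookup is replaced by a parameterised SQL query against the LOGIN table created in section 4.9.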

4.10.2 Compact Model:


http://www.velocityreviews.com/forums/t679698-executing-external-program-through-jsp.html [java code taken for bio sdk]

http://www.digitalpersona.com/embedded/one-touch-sdk [Digital Persona One Touch
Software Development Kits (SDKs)]

