Application Independent ICC to Terminal Interface Requirements.

Book 1. Application Independent ICC to Terminal Interface Requirements. Describes the minimum requirements for microprocessor-based cards (ICC, Integrated Circuit Card) and terminals that ensure that the terminal and the card can interact, regardless of which card application is used.
The book defines requirements for the electromechanical characteristics of the card (the size and location of the contacts, the height of the contact module, the characteristics of the supplied power, the clock frequency, the card reset signal, the resistance between a pair of contacts of the card and the terminal) and describes all the stages the card goes through in the course of operation, from its activation to its deactivation. Book 1 also contains a description of the asynchronous data transfer protocols T=0 and T=1 used between the card and the terminal.
A separate section of the book describes the card's file structure, data elements, and commands. In particular, it describes the data elements and data objects used, the command structure, the methods of accessing files, and the procedure used to select an application.
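As an illustration of the command structure mentioned above, the sketch below builds an ISO 7816-4 SELECT-by-name command APDU of the kind used to select an application; the AID shown is a made-up placeholder, not a real payment application identifier.

```python
# Hypothetical AID used purely for illustration; real applications are selected
# by their registered application identifiers.
AID = bytes.fromhex("A0000000990101")

apdu = bytes([
    0x00,        # CLA - class byte (plain, no secure messaging)
    0xA4,        # INS - SELECT instruction
    0x04,        # P1  - select by DF name (AID)
    0x00,        # P2  - first or only occurrence
    len(AID),    # Lc  - length of the command data field
]) + AID + bytes([0x00])   # Le - maximum expected length of the response

print(apdu.hex(" "))   # 00 a4 04 00 07 a0 00 00 00 99 01 01 00
```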
Book 2. Security and Key Management. Describes the minimum requirements for the logical security functions of microprocessor cards and electronic terminals used in operations. Book 2 contains a description of the procedures for static and dynamic card authentication, PIN encryption, ensuring the integrity and confidentiality of the information exchanged between the card and the Issuer, and key management principles and policies. In addition, it describes the cryptographic algorithms used for signing data, verifying signatures and recovering data from a signature, encrypting data, calculating data integrity codes (Message Authentication Codes), and deriving card keys and session keys.
Book 3. Application Specification. Contains a description of the data items, files, and commands associated with executing the transaction. It provides a list of payment application functions with a description of the data elements and commands used to perform these functions, as well as a description of the sequence of events and commands that occur during the processing of a transaction.
Book 4. Cardholder, Attendant, and Acquirer Interface Requirements. Contains a description of terminal types and their capabilities, the functional requirements for terminals that are to perform operations with EMV-compatible cards, and the requirements for the physical characteristics of terminals. Book 4 also describes the architecture of the terminal software, including data management principles and the requirements for the “terminal – cardholder” and “terminal – servicing Bank” interfaces.
The specifications in this book contain the requirements for implementing microprocessor cards. These requirements complement the rules of payment systems for terminals that accept magnetic stripe cards, and make it possible to accept magnetic stripe cards and microprocessor cards on the same terminal.
Based on the EMV standard, the largest payment systems, VISA and MasterCard, have released specifications for their microprocessor card applications. Their latest versions are known as M/Chip 4 (MasterCard) and VIS 1.4 (VISA). In addition, the VISA payment system offers its banks specifications for Java applets of its applications that are compatible with the GlobalPlatform/Java Card operating environment. Versions 2.4.0, 2.4.1, and 2.5.0 of the VISA payment system applets are available on the market.
It should be noted that the differences in the applications of the leading payment systems are quite noticeable. They are primarily concerned with:

  • formats and semantics of key data elements (for example, Issuer Application Data, Card Verification Results, Issuer Authentication Data, card risk management parameters) used in applications;

  • command response formats (in M/Chip 4, responses to the GENERATE AC, GET PROCESSING OPTIONS, and INTERNAL AUTHENTICATE commands use format 2, while in VIS 1.4 responses to the last two commands must use format 1, and the response to the GENERATE AC command uses format 1 for SDA and DDA cards and format 2 for CDA cards);

  • and even the set of commands used (for example, the EXTERNAL AUTHENTICATE command is not used in VIS 1.4 but is used in M/Chip 4), as well as the method of processing the Issuer's commands (VISA cards support the execution of such commands only after processing the second GENERATE AC command).
In July 2005, EMVCo released a draft of the Common Payment Application (CPA), which is supported by both payment systems. A Bank that chooses the CPA application as its universal payment application largely preserves, and in some respects extends, the functionality of the M/Chip 4 and VIS 1.4 applications; at the same time it simplifies risk management and personalization of microprocessor cards for itself, and unifies transaction processing on its host (making the transaction processing algorithm independent of which payment system a transaction belongs to).
Among the standards related to microprocessor cards, we should mention the PC/SC Workgroup Specifications, which describe the interaction of a personal computer application with a microprocessor card. From the very beginning, the microprocessor card was regarded as a secure, general-purpose computing platform, so the need for interaction between the card and the computer was obvious.
In September 1996, the PC/SC Workgroup was formed by manufacturers of computers, software, and microprocessor cards. It developed the open PC/SC Workgroup Specifications, which define a model of interaction between a computer program and a microprocessor card. According to this model, several card readers can be connected to a computer using different physical interfaces (for example, RS-232C, PS/2, PCMCIA, etc.). The model defines a computer software module that controls access to card and reader resources (the ICC Resource Manager), as well as modules that provide services to computer applications (ICC Service Providers). These services include performing cryptographic operations, implementing file access methods, authentication, and so on.
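As a minimal sketch of this model, the following example uses the pyscard Python library, which sits on top of the platform's ICC Resource Manager, to connect to a card and send the SELECT command from the earlier example; it assumes a PC/SC reader with a card inserted.

```python
# Requires the pyscard package and a PC/SC reader with a card inserted.
from smartcard.System import readers
from smartcard.util import toHexString

available = readers()                        # readers enumerated via the ICC Resource Manager
connection = available[0].createConnection()
connection.connect()                         # power up the card and perform the ATR exchange

# SELECT by AID (same hypothetical AID as in the earlier example)
select_apdu = [0x00, 0xA4, 0x04, 0x00, 0x07,
               0xA0, 0x00, 0x00, 0x00, 0x99, 0x01, 0x01, 0x00]
data, sw1, sw2 = connection.transmit(select_apdu)
print("Response:", toHexString(data), "Status:", f"{sw1:02X} {sw2:02X}")
```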
Today, PC/SC specifications are widely used in the field of information technology. However, the trend toward using the USB protocol in microprocessor cards as a means of communicating with an external computer suggests that in the future the PC/SC specifications will give way to direct communication between the card and the computer, for example over the TCP/IP protocol. (For more information, see section 2.8.)
In conclusion, we give a brief overview of the state of affairs in the field of biometric identification/authentication, which should eventually find application as a means of authenticating the holder of a microprocessor card.
Development of technologies for pattern recognition based on various biometric characteristics began long ago, in the early 1960s; however, practical results have been obtained mainly in recent years. The power of modern computers and improved algorithms for processing biometric information have made it possible to create products whose technical characteristics and prices make them interesting and accessible to a wide range of users.
For biometric identification/authentication of an individual (object), six technologies are most often used, which differ in the type of biometric information used in them:
fingerprints;
hand geometry;
facial features (based on optical and infrared images);
the iris of the eye;
voice;
signature.
Biometric technologies are easy to use and give accurate results. At the same time, they all rely on common approaches. Biometric identification/authentication of an object is performed in several stages:
scanning the object and creating its image (the current template);
comparing the current template with the reference template, an image of the object stored in the authentication center (the authentication center can also be a microprocessor card).

The BioAPI version 2.0 standard, also known as ISO/IEC 19784-1, defines the following steps in the biometric identification/authentication process (illustrated in the sketch after the list):
control of sensors — physical devices that capture biometric data from an object;
algorithms for processing various types of object images in order to create the current and reference templates;
algorithms for finding a match between the current template and the reference template;
managing access to the template database.
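The following schematic Python sketch mirrors the four steps listed above; it is not the BioAPI interface itself, and all class and method names are illustrative.

```python
# Schematic rendering of the four steps above; NOT the BioAPI C interface.
from abc import ABC, abstractmethod

class Sensor(ABC):
    @abstractmethod
    def capture(self) -> bytes:
        """Control the physical device and return a raw biometric sample."""

class TemplateExtractor(ABC):
    @abstractmethod
    def extract(self, sample: bytes) -> bytes:
        """Process a raw sample into a (current or reference) template."""

class Matcher(ABC):
    @abstractmethod
    def match(self, current: bytes, reference: bytes) -> bool:
        """Decide whether the current template matches the reference template."""

class TemplateStore(ABC):
    @abstractmethod
    def load(self, subject_id: str) -> bytes:
        """Managed access to the reference-template database."""

def verify(subject_id: str, sensor: Sensor, extractor: TemplateExtractor,
           matcher: Matcher, store: TemplateStore) -> bool:
    current = extractor.extract(sensor.capture())
    reference = store.load(subject_id)
    return matcher.match(current, reference)
```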
Information about this standard can be found on the Internet at: www.bioapi.org.
Another important standard in the field of biometric identification is the ANSI X9.84 standard, “Biometric Management and Security for the Financial Services Industry”, used in the banking industry to identify customers and employees.
Since biometric data is public (a person can be photographed, their voice recorded, and their signature and fingerprints obtained), the X9.84 standard sets out requirements for protecting biometric data throughout its processing, from the moment the information is collected to its analysis. The standard describes mechanisms for maintaining the integrity and confidentiality of biometric data, as well as for authenticating its source, at all stages of processing.
The X9.84 standard also defines the format of identification data and the method of storing and accessing biometric data templates in accordance with the X.509 v.3 standard.
Finally, mention should be made of the CBEFF (Common Biometric Exchange File Format) standard, which has recently become international standard ISO/IEC 19785-1. This standard defines the data format (a set of mandatory and optional elements) used for exchanging biometric information between the programs that process it. Thus, the standard makes it possible for different biometric applications to interoperate by defining a common language for this interaction.
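As a rough illustration of the idea, the sketch below models a simplified CBEFF-style biometric information record as a descriptive header, an opaque biometric data block, and an optional security block; the field names, types, and values are ours and do not reproduce the exact encodings defined by the standard.

```python
# Simplified model of a CBEFF-style biometric information record (BIR).
from dataclasses import dataclass
from typing import Optional

@dataclass
class BiometricInformationRecord:
    format_owner: int                 # header: organisation that defined the data block format
    format_type: int                  # header: which of that owner's formats is used
    biometric_type: str               # header: e.g. "fingerprint", "iris", "face"
    creation_date: str                # header: when the sample/template was produced
    bdb: bytes                        # biometric data block: the template, opaque to CBEFF
    security_block: Optional[bytes] = None   # optional integrity/authenticity data

record = BiometricInformationRecord(
    format_owner=0x0101,              # placeholder value
    format_type=0x0001,               # placeholder value
    biometric_type="fingerprint",
    creation_date="2007-01-01",
    bdb=b"\x00" * 512,                # placeholder template bytes
)
```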
Using biometric methods allows you to build a three-factor model of authentication of the person performing the operation, which significantly increases the security of card transactions. This model is based on the following security elements:
a card confirming that the person performing the transaction has a certain instrument issued by an authorized Bank, the authenticity of which is proved during the operation;
a PIN code, a secret shared by the cardholder and the Issuer, knowledge of which must be confirmed by the person performing the transaction;
biometric information received from the person performing the transaction, which must correspond to the cardholder’s biometric data.
Instead of the three-factor model described above, a two-factor model can be used, whose components are a microprocessor card and biometric information. This approach is quite acceptable when the capture of the cardholder's biometric data is itself performed at an acceptable level of security. For example, it is practically impossible to substitute someone else's biometric data at an ATM, since physical access to the computing resources of this device is restricted.
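The decision logic behind the three-factor model and its two-factor variant can be illustrated as follows; the boolean inputs are hypothetical stand-ins for the real card authentication, PIN verification, and biometric matching steps.

```python
# The boolean inputs stand in for the outcomes of the real checks.

def three_factor_ok(card_authenticated: bool, pin_verified: bool, biometric_match: bool) -> bool:
    """Possession (card) + knowledge (PIN) + inherence (biometrics)."""
    return card_authenticated and pin_verified and biometric_match

def two_factor_ok(card_authenticated: bool, biometric_match: bool) -> bool:
    """Card + biometrics, for environments where biometric capture itself is trusted."""
    return card_authenticated and biometric_match
```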
Depending on the type of biometric data, the size of the current template can range from 10 KB (a single fingerprint) and 15-20 KB (a facial image) up to 30 KB (an iris image of one eye). For visual verification of biometric data, the template size ranges from 1-2 KB (a facial image) to 5 KB (a fingerprint).
When biometric methods are used with the reference template stored on the microprocessor card, the speed of data exchange between the card and the terminal becomes critical. This is due to the relatively large size of the template (up to 30 KB) and the often high-volume, conveyor-like nature of identification/authentication procedures (for example, when checking electronic passports). Therefore, to implement applications that use biometric methods, cards with a contactless (radio) and/or USB interface should be used.
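A back-of-the-envelope calculation, assuming nominal bit rates and ignoring protocol overhead, illustrates why interface speed matters for templates of this size; the figures are order-of-magnitude estimates only.

```python
# Nominal bit rates, no protocol overhead; order-of-magnitude estimates only.
TEMPLATE_BITS = 30 * 1024 * 8          # a ~30 KB iris template

interfaces = {
    "ISO 7816 contact, default 9600 bit/s": 9_600,
    "ISO 14443 contactless, 106 kbit/s":    106_000,
    "ISO 14443 contactless, 848 kbit/s":    848_000,
    "USB full speed, 12 Mbit/s":            12_000_000,
}

for name, bps in interfaces.items():
    print(f"{name}: {TEMPLATE_BITS / bps:.2f} s")
# ISO 7816 contact, default 9600 bit/s: 25.60 s
# ISO 14443 contactless, 106 kbit/s: 2.32 s
# ISO 14443 contactless, 848 kbit/s: 0.29 s
# USB full speed, 12 Mbit/s: 0.02 s
```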
The reliability of object identification/authentication using biometric methods is usually measured by the TAR (True Accept Rate). It is the probability that the current template will be correctly recognized by the matching algorithm in the authentication center's database, provided that the current template is obtained from an object whose reference template is stored in the database or on the card. Note that this probability also covers the event of obtaining a low-quality current template. Thus, TAR characterizes the reliability of the main biometric identification processes: obtaining the current template and matching it against the reference template in the database.
Obviously, TAR can be made equal to 1 if no limit is placed on the probability of an erroneous match, i.e., the probability that a match between the current and reference templates is declared even though they belong to different objects. The probability of accepting an incorrect match is called the FAR (False Accept Rate) and is usually bounded from above by 0.01%. Table 1.1 gives the TAR values obtained under the condition that FAR < 0.01%.
The TAR values shown in the second and third columns of Table 1.1 differ. The second column corresponds to the case when a single image of the identified object is captured to obtain the current template, while the third column gives the TAR values when several images of the object are used to obtain the current template.

It should be noted that, in terms of mathematical statistics, the quantities 100% - TAR and FAR are characteristics of the hypothesis-testing algorithm and are called the probabilities of errors of the first and second kind, respectively.
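For illustration, the sketch below shows one way TAR and FAR could be estimated empirically from labeled genuine and impostor comparison attempts; the scores, threshold, and function names are invented for this example.

```python
# Invented scores and threshold; the matcher accepts when the score reaches the threshold.

def empirical_rates(genuine_scores, impostor_scores, threshold):
    """Return (TAR, FAR) for a matcher that accepts when score >= threshold."""
    tar = sum(s >= threshold for s in genuine_scores) / len(genuine_scores)
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return tar, far

tar, far = empirical_rates(
    genuine_scores=[0.91, 0.87, 0.42, 0.95],   # 0.42: a low-quality capture, counted against TAR
    impostor_scores=[0.12, 0.33, 0.08, 0.29],
    threshold=0.5,
)
print(f"TAR = {tar:.0%}, FAR = {far:.0%}")     # TAR = 75%, FAR = 0%
# 1 - TAR is the error of the first kind (false reject); FAR is the error of the
# second kind, bounded in the text above by 0.01%.
```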
The template sizes given above are typical when complete images of an object are stored, i.e., images sufficient for applying a variety of reliable matching algorithms to compare the current and reference templates. In other words, images of the given size are sufficient for reliable identification/authentication of the object with the appropriate comparison algorithms. When a specific matching algorithm is targeted, the template size can be reduced significantly (to 250 bytes - 2 KB, depending on the type of biometric data), because the “criteria” such an algorithm compares are small. Thus, by committing to a specific comparison algorithm, the amount of stored template data can be reduced, which is important for a microprocessor card.
