How can I read smart card id? [duplicate] - java

Which APDU command gets the 7 bytes of the card ID?
I use the T=CL (ISO 7816) protocol over an ISO 14443 layer. On card detection I can see only 4 bytes of the card ID.
I found that this should be the APDU command to get the card ID:
0xFF, 0xCA, 0x00, 0x00, 0x00
but the result of this command is 6E 00, which according to the APDU status-word specifications means "Class not supported".
Then I found that the APDU command might instead be:
0x00, 0xCA, 0x00, 0x00, 0x00
This command returns 6A 88,
where 6A XX means "Wrong parameter(s) P1-P2" and 88 means "Referenced data not found".
What do you think about it?
Thank you!
P.S. All commands are given as: CLA, INS, P1, P2, LenData, Data.
My other commands work normally (such as selecting an applet and working with it); the problem is only with getting the card ID.

The answer given before is wrong. This is because we are not talking about an ISO 7816 command here but an internal command of the PC/SC API.
The APDU "0xFF 0xCA 0x00 0x00 0x00" is in fact correct, and I have cards for which I get a 7-byte answer. Please note that this will only work with contactless (RFID) cards, because this UID is part of the radio protocol. Please note further that some chips return a new random UID after each power-up. This is true, for example, of my passport chip as well as my German national identity card, and it is a countermeasure to prevent tracking of card holders. In theory such random UIDs shall begin with 0x08, but this is not always the case.
As the UID is an "internal" value of the protocol, the APDU in question is NOT sent to the card; it is only an internal command (of the PC/SC interface) to get the UID from the card reader driver. CLA 0xFF is generally not in normal use, as it is reserved for "Protocol Parameter Selection" (PPS). PC/SC abuses this CLA for internal commands.
The command here is the PC/SC-internal "Get Data" command, specified in Part 3, Section 3.2.2.1.3 of the PC/SC specification. Here P1 and P2 have special predefined meanings, so there is no point in trying different values. The standard only defines P1=0, P2=0 for getting the UID and P1=1, P2=0 for "all historical bytes from the ATS of an ISO 14443 A card without CRC". Other values are not supported.
Interestingly, the answer 0x6A 0x88 is not defined in the standard. 0x6A 0x81 would mean "Function not supported", which would be the case for cards that don't have a UID (the standard mentions ISO 7816-10 contact cards). The two other defined answers (0x62 0x82 and 0x6C 0xXX) indicate a mismatch between the requested and actual response lengths and won't occur here, because we simply request data of any length by specifying 0 in the last byte of the request.
Why it isn't working for the submitter I don't know. For me it works; some cards return 4 bytes, others return 7 bytes.
See the PC/SC standard, Part 3 in particular, here: http://www.pcscworkgroup.com/specifications/specdownload.php
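A minimal sketch of sending this pseudo-APDU with the Java Smartcard IO API, assuming a PC/SC-compliant reader with at least one card in the field:

```java
import java.util.List;
import javax.smartcardio.Card;
import javax.smartcardio.CardChannel;
import javax.smartcardio.CardTerminal;
import javax.smartcardio.CommandAPDU;
import javax.smartcardio.ResponseAPDU;
import javax.smartcardio.TerminalFactory;

public class GetUid {
    // PC/SC "Get Data" pseudo-APDU: CLA=FF INS=CA P1=00 P2=00, Ne=256 (encoded as Le=00)
    static CommandAPDU getUidApdu() {
        return new CommandAPDU(0xFF, 0xCA, 0x00, 0x00, 256);
    }

    public static void main(String[] args) throws Exception {
        List<CardTerminal> terminals = TerminalFactory.getDefault().terminals().list();
        CardTerminal terminal = terminals.get(0);   // first attached reader
        Card card = terminal.connect("*");          // any available protocol
        CardChannel channel = card.getBasicChannel();
        ResponseAPDU response = channel.transmit(getUidApdu());
        if (response.getSW() == 0x9000) {
            // 4-byte or 7-byte UID, depending on the card
            for (byte b : response.getData()) System.out.printf("%02X ", b);
            System.out.println();
        } else {
            System.out.printf("Error: SW=%04X%n", response.getSW());
        }
        card.disconnect(false);
    }
}
```

Whether this works depends on the reader driver, not the card, since the command is answered by the driver itself.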

0xCA is the GET DATA command. You must supply a TLV tag in P1-P2.
ISO 7816-6 "Interindustry data elements for interchange" has a list of these tags, but none of them corresponds unambiguously to "card ID". I suggest that you try all values of P2, with P1 equal to 0x00, 0x5F, or 0x7F, to find out which data elements are supported by your card.
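The suggested probing can be sketched as follows (hardware-dependent; the P1/P2 ranges are taken from the advice above, and the probe simply reports which tags the card answers):

```java
import javax.smartcardio.CardChannel;
import javax.smartcardio.CommandAPDU;
import javax.smartcardio.ResponseAPDU;

public class GetDataProbe {
    // ISO 7816-4 GET DATA with a two-byte tag in P1-P2, Ne=256
    static CommandAPDU getData(int p1, int p2) {
        return new CommandAPDU(0x00, 0xCA, p1, p2, 256);
    }

    // Try each candidate tag and print those the card supports.
    static void probe(CardChannel channel) throws Exception {
        int[] p1Values = { 0x00, 0x5F, 0x7F };
        for (int p1 : p1Values) {
            for (int p2 = 0; p2 <= 0xFF; p2++) {
                ResponseAPDU r = channel.transmit(getData(p1, p2));
                if (r.getSW() == 0x9000) {
                    System.out.printf("Tag %02X%02X -> %d bytes%n", p1, p2, r.getData().length);
                }
            }
        }
    }
}
```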

I think your second command is correct, but the card has not been programmed with an application Id.
For 6A88 the BasicCard manual says: "The built-in command GET APPLICATION ID returns this error code if no application ID was configured in the BasicCard".

This is a very frequently discussed problem.
0xFF, 0xCA, 0x00, 0x00, 0x00 is the correct PC/SC command to get the card UID.
If you get a 6E 00 response, then your driver has a bug. Update the driver or try another reader.

I tried:
byte data[] = new byte[]{};
CommandApdu((byte)0xA0, (byte)0xC0, (byte)0x00, (byte)0x00, data)
I got SW1=(byte)0x9F, SW2=(byte)0xXX.
9F XX means "Command successfully executed; 'xx' bytes of data are available and can be requested using GET RESPONSE."
The exceptions are 9F 00 and 9F 04, which mean:
9F 00 = PIN blocked and Unblock Try Counter is 3
9F 04 = PIN not successfully verified, PIN blocked and Unblock Try Counter is 3
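The GET RESPONSE command described above can be sketched like this; the CLA of 0xA0 mirrors the snippet in this answer (GSM-style), and SW2 from the previous response is used as Le:

```java
import javax.smartcardio.CommandAPDU;

public class GetResponse {
    // GET RESPONSE: same CLA as the preceding command (0xA0 in the snippet
    // above), INS = C0, Le = the SW2 value the card returned.
    static CommandAPDU getResponse(int sw2) {
        return new CommandAPDU(0xA0, 0xC0, 0x00, 0x00, sw2);
    }

    public static void main(String[] args) {
        // e.g. after receiving 9F 10, fetch the 0x10 pending bytes
        for (byte b : getResponse(0x10).getBytes()) System.out.printf("%02X ", b);
        System.out.println();
    }
}
```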

Related

Why does the APDU response to a READ BINARY command return 233 bytes and not 256?

I am working with smart cards in Java, trying to read a file using APDU commands.
The "SELECT FILE" command executes correctly (it answers 90 00), but when reading the file in a loop, each iteration returns 233 bytes instead of 256, and when I assemble the file it does not work.
ResponseAPDU answer = channel.transmit(new CommandAPDU(new byte[]{(byte) 0x00, (byte) 0xB0, (byte) 0X00, (byte) 0x00, (byte) 0x00}));
System.out.println(answer.toString());
Prints: ResponseAPDU: 233 bytes, SW=9000
Thanks in advance for the help
Several possible reasons:
the file might just have 233 bytes
there is a trusted channel or some other kind of secure messaging involved and TLV format overhead as well as MAC consume the missing bytes
toString might be confused by non-printable characters
reader or card might be configured to a lower i/o buffer size
If by "assembling" you mean concatenating the results, try to adjust the start offset to the number of bytes you actually got.
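The last suggestion can be sketched as a read loop that advances the offset by the number of bytes actually received rather than a fixed 256. This assumes a plain READ BINARY with the offset encoded in P1-P2 (15-bit offset), which may not hold if secure messaging is in play:

```java
import java.io.ByteArrayOutputStream;
import javax.smartcardio.CardChannel;
import javax.smartcardio.CommandAPDU;
import javax.smartcardio.ResponseAPDU;

public class ReadFile {
    // READ BINARY with the current offset encoded in P1-P2 (15-bit offset)
    static CommandAPDU readBinary(int offset, int ne) {
        return new CommandAPDU(0x00, 0xB0, (offset >> 8) & 0x7F, offset & 0xFF, ne);
    }

    // Concatenate chunks, advancing by what was actually returned (e.g. 233).
    static byte[] readAll(CardChannel channel) throws Exception {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        int offset = 0;
        while (true) {
            ResponseAPDU r = channel.transmit(readBinary(offset, 256));
            if (r.getSW() != 0x9000 || r.getData().length == 0) break;
            out.write(r.getData());
            offset += r.getData().length;   // not a fixed 256
        }
        return out.toByteArray();
    }
}
```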

Authenticating Ultralight EV1 with PC/SC reader

I have a problem trying to authenticate a MIFARE Ultralight EV1 card using a PC/SC reader (specifically an ACR1222L) in Java. I'm able to write to and read from an unprotected tag using the corresponding APDUs for ISO 14443-3 tags. However, I can't find a way to run the PWD_AUTH command, since it is not part of the ISO 14443-3 standard. Is it possible to run this command (or any native command, for that matter)?
I have tried sending the following APDU {e0 00 00 24 07 1b ff ff ff ff 63 00} where 1b is the native command, ff ff ff ff is the password and 63 00 is the CRC_A of the command plus password. I have also tried without the CRC, switching the order of the parameters, etc., but so far I could not get it working.
I also tried wrapping the APDU (as described in https://stackoverflow.com/a/41729534/3613883). I got it working with a Desfire EV1 card but it doesn’t work with the ultralight EV1 (since it doesn’t support ISO7816-4 obviously).
So, is there a way to authenticate an Ultralight EV1 card using a PC/SC reader?
First of all, MIFARE Ultralight EV1 does not speak APDUs. Instead it uses commands based directly on the framing defined in ISO/IEC 14443-3. Since ISO/IEC 14443-3 only defines the framing and the anti-collision/enumeration commands, any protocol on top of that (e.g. the MIFARE Ultralight/NTAG command sets) is proprietary.
The correct command for password authentication using the password FF FF FF FF would be:
byte[] tagCommand = new byte[] { (byte)0x1B, (byte)0xFF, (byte)0xFF, (byte)0xFF, (byte)0xFF };
Note that the CRC will typically be handled by the contactless frontend chip, so you don't need to append it manually.
With the ACR1222L, there are multiple different ways to exchange such proprietary commands:
You can use PC_to_RDR_Escape (note that that's only available if you installed the original ACR driver package for the reader). Assuming that you are using the Java Smartcard IO API, you would do that using the method Card.transmitControlCommand():
byte[] response = card.transmitControlCommand(SCARD_CTL_CODE(3500), command);
The definition of the method SCARD_CTL_CODE can be found in this post.
The command needs to be a byte array that contains an APDU header for the pseudo-APDU that passes raw commands to the contactless frontend chip and the actual command for the contactless frontend chip. Since the ACR1222L is based on an NXP PN532(?), the command for the contactless frontend chip would be the InDataExchange command (see the user manual):
byte[] interfaceCommandHeader = new byte[] { (byte)0xD4, (byte)0x40, (byte)0x01 };
byte[] interfaceCommand = Arrays.copyOf(interfaceCommandHeader, interfaceCommandHeader.length + tagCommand.length);
System.arraycopy(tagCommand, 0, interfaceCommand, interfaceCommandHeader.length, tagCommand.length);
Depending on how the reader actually activates the card, you might need to use the InCommunicateThru command instead of InDataExchange:
byte[] interfaceCommandHeader = new byte[] { (byte)0xD4, (byte)0x42 };
byte[] interfaceCommand = Arrays.copyOf(interfaceCommandHeader, interfaceCommandHeader.length + tagCommand.length);
System.arraycopy(tagCommand, 0, interfaceCommand, interfaceCommandHeader.length, tagCommand.length);
The pseudo APDU header can be added by:
byte[] commandHeader = new byte[] { (byte)0xE0, (byte)0x00, (byte)0x00, (byte)0x24, (byte)0x00 };
byte[] command = Arrays.copyOf(commandHeader, commandHeader.length + interfaceCommand.length);
System.arraycopy(interfaceCommand, 0, command, commandHeader.length, interfaceCommand.length);
command[4] = (byte)(interfaceCommand.length & 0x0FF); // update Lc field
Another option is to send commands directly using PC_to_RDR_XfrBlock. This maps to CardChannel.transmit() in the Java Smartcard IO API:
ResponseAPDU responseApdu = cardChannel.transmit(commandAPDU);
The manual of your reader is not quite clear on whether the same pseudo-APDU header can be used over that interface. However, if you look into Appendix H, you'll find a different header for wrapping into a pseudo-APDU (the ACR122U legacy mode). So you could use the following:
CommandAPDU commandAPDU = new CommandAPDU(0xFF, 0x00, 0x00, 0x00, interfaceCommand);
Note that, again, you have to wrap the tag command into the InDataExchange command for the contactless frontend chip.
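Putting the pieces above together, the double wrapping (tag command inside PN532 InDataExchange, inside the reader's pseudo-APDU) can be sketched as one helper; this follows the legacy-mode FF 00 00 00 header variant and is a sketch, not a verified ACR1222L recipe:

```java
import java.util.Arrays;

public class UltralightAuth {
    // Wrap a raw tag command first in PN532 InDataExchange (D4 40 01),
    // then in the reader's pseudo-APDU (FF 00 00 00 Lc), per the answer above.
    static byte[] wrap(byte[] tagCommand) {
        byte[] inDataExchange = new byte[] { (byte) 0xD4, (byte) 0x40, (byte) 0x01 };
        byte[] inner = Arrays.copyOf(inDataExchange, inDataExchange.length + tagCommand.length);
        System.arraycopy(tagCommand, 0, inner, inDataExchange.length, tagCommand.length);

        byte[] apdu = new byte[5 + inner.length];
        apdu[0] = (byte) 0xFF;            // pseudo-APDU CLA (ACR122U legacy mode)
        apdu[4] = (byte) inner.length;    // Lc field
        System.arraycopy(inner, 0, apdu, 5, inner.length);
        return apdu;
    }

    public static void main(String[] args) {
        // PWD_AUTH (0x1B) with password FF FF FF FF; CRC is added by the frontend chip
        byte[] pwdAuth = { (byte) 0x1B, (byte) 0xFF, (byte) 0xFF, (byte) 0xFF, (byte) 0xFF };
        for (byte b : wrap(pwdAuth)) System.out.printf("%02X ", b);
        System.out.println();
    }
}
```

The resulting byte array would then be passed to CardChannel.transmit() (or transmitControlCommand() with the escape header instead).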

javax.smartcardio case 4 APDU vanishing - 6700 response - warning

Using javax.smartcardio classes for smartcard programming, I encountered a persistent error - getting back 6700 (invalid length) and similar error codes from the card when the code looked fine. Example code:
req = new CommandAPDU(0x00, 0xA4, 0x04, 0x00, aid, 0x00);
This is supposed to construct a case 4 APDU. Why does the card respond as if I were missing something?
Short answer
Use aid, 0x100 instead of aid, 0x00.
Long answer (better get some coffee):
That's because of the confusion between Ne and Le. Ne is the maximum number of bytes that can be returned to the terminal; it is a number without a specific representation. Le, however, is the encoding or representation of Ne in bytes.
Now ISO/IEC 7816-4 has a little trick: Le is absent (no bytes) for an ISO case 1 or case 3 command, i.e. a command without response data (RDATA). So defining Le = 00 to mean "no response data" would be redundant. Instead, 7816-4 uses Le = 00 to mean Ne = 256. Similarly, Le = 0000 (or Le = 000000) means Ne = 65536, i.e. 2^16. The two- and three-byte encodings are only used for extended-length APDUs.
As you can see in the CommandAPDU constructor, however, you have to specify Ne, not Le. What you specified is therefore the same as saying that there is no response data. So the APDU will not be interpreted as an ISO case 4 command, and the command will fail (correctly, in this case; 6700 is exactly what you should expect).
So just specify how many bytes you expect. If the value is larger than 256, then an extended-length APDU will be required (or command chaining, but that's a topic in itself). Ne < 0 or Ne > 64Ki is of course not supported.
Note that many protocol descriptions, including the Java Card API, got the distinction between Ne and Le wrong (this has been fixed in Java Card API v3.0.5, by the way). That's kind of strange, as there are many issues with 7816-4, but this is not one of them. It's specified pretty clearly.
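The difference can be made concrete with the CommandAPDU constructors (the AID value below is an arbitrary example):

```java
import javax.smartcardio.CommandAPDU;

public class Case4Demo {
    public static void main(String[] args) {
        byte[] aid = { (byte) 0xA0, 0x00, 0x00, 0x00, 0x62, 0x03, 0x01 }; // example AID
        // Ne = 256 encodes as Le = 0x00 -> a proper ISO case 4 short APDU
        CommandAPDU case4 = new CommandAPDU(0x00, 0xA4, 0x04, 0x00, aid, 256);
        // omitting Ne -> Le absent, ISO case 3 (no response data expected)
        CommandAPDU case3 = new CommandAPDU(0x00, 0xA4, 0x04, 0x00, aid);
        System.out.println(case4.getBytes().length); // header(4) + Lc(1) + aid(7) + Le(1) = 13
        System.out.println(case3.getBytes().length); // header(4) + Lc(1) + aid(7) = 12
    }
}
```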

Reading phantom NFC tags via javax.smartcardio

I have an old NFC reader for the tikitag web service (which was later renamed touchatag, and finally abandoned around 2012). Since the website is no longer available, I could no longer find the original tikitag/touchatag drivers. After some searching, I found this NFC reader is a generic ACS ACR122U USB reader, and installed a suitable driver from here. My system is Windows 7 (64-bit).
First, I tried the NFC Tools library for high-level read and write access to NFC tags. I got an error saying an unsupported tag was encountered; although no tag was present on the reader, or even remotely nearby. It seems other developers also encountered the same error with this library, as shown here. Note this tag is detected ad infinitum (so, it does not just disappear after being detected once).
I copied the required low-level code into a separate class (i.e., independent from the NFC Tools library). You can find this code below (similar code can also be found in tutorials):
import java.util.List;
import javax.smartcardio.Card;
import javax.smartcardio.CardTerminal;
import javax.smartcardio.TerminalFactory;
import org.nfctools.utils.NfcUtils;

public class NdefTest {
    public static void main(String[] args) throws Exception {
        TerminalFactory factory = TerminalFactory.getDefault();
        List<CardTerminal> terminals = factory.terminals().list();
        CardTerminal terminal = terminals.get(0);
        if (terminal.waitForCardPresent(5000)) {
            Card card = terminal.connect("T=0");
            System.out.println(NfcUtils.convertBinToASCII(card.getATR().getHistoricalBytes()));
        }
    }
}
This code detects the exact same "phantom" tag as when using the NFC Tools library. Therefore, this issue seems unrelated to the NFC Tools library (as implied by the library developer in response to the error report). Either I'm missing something, or the issue is either related to the installed driver, the NFC reader hardware, or some unfixed bug in javax.smartcardio (listed in order of likelihood).
I have tried uninstalling the aforementioned driver and letting Windows 7 install a suitable driver on its own (called "Microsoft Usbccid Smartcard Reader (WUDF)"), which results in the same errors as described above. I have not tried another reader, since I only have the one.
(Note: the name of this NFC reader in the Windows device overview is "CCID USB Reader", instead of "ACS ACR122" or something related. Don't know whether this is important, just thought I would mention it.).
Has anyone encountered this issue, and managed to resolve it?
UPDATE
Ok, I've tried sending a CLF command to the reader after the simulated tag has been detected; namely, getting the ATS of the connected PICC (p. 11 of the ACR122U manual):
TerminalFactory factory = TerminalFactory.getDefault();
List<CardTerminal> terminals = factory.terminals().list();
// (this is the correct terminal)
CardTerminal terminal = terminals.get(0);
if (terminal.waitForCardPresent(5000)) {
    Card card = terminal.connect("*");
    CardChannel channel = card.getBasicChannel();
    // (I tried both 0x00 and 0x01 as P1, as well as 0x05 for Le)
    CommandAPDU getAts = new CommandAPDU(0xFF, 0xCA, 0x00, 0x00, 0x04);
    ResponseAPDU response = channel.transmit(getAts);
    System.out.println(response.getSW1());
    System.out.println(response.getSW2());
}
But I keep getting an error response code (0x63 0x00). Any ideas on what I could be doing wrong?
The problem you encounter is that this version of the ACR122U reader uses PC/SC (CCID) in a somewhat non-standard way.
The "card" that you detect with the PC/SC API is actually either a dummy card simulated by the reader (to allow the PC/SC API to open a connection even if no card is present) or a smartcard chip in the reader's SAM slot (contact card present inside the reader's casing).
In either case, this reader uses PC/SC only as a transport protocol for native commands of the contactless frontend chip used within this reader (NXP PN532). Thus, if you want to use the reader's contactless functionality, you have to use the CLF's native command set. See the ACR122U API documentation or the libnfc implementation for further details.
(all credit goes to Michael Roland; this post is meant as a solution summary)
Ok Michael, given the example in your last comment, I finally understand what you mean by using the PC/SC protocol for tunneling CLF commands. I tested some of the commands in the PN532 documentation, and they return valid results. (However, the command you gave as an example didn't work and crashed the reader; it had to be reset.)
For instance, to get the firmware version:
CommandAPDU commApdu = new CommandAPDU(0xFF, 0x00, 0x00, 0x00,
new byte[] { (byte)0xD4, (byte)0x02 });
InDataExchange command:
CommandAPDU commApdu = new CommandAPDU(0xFF, 0x00, 0x00, 0x00,
new byte[] { (byte)0xD4, (byte)0x40, 0x01 });
I found the NFCIP library, which supports sending byte arrays between peers (examples are ACS ACR122 and Nokia 6131) using the InDataExchange command. When reading the PN532 documentation (p. 131), it seems that this command allows reading tags as well. Michael, do you happen to know of any library that uses these low-level commands with the goal of reading (different types of) tags?

Is UTF to EBCDIC Conversion lossless?

We have a process which communicates with an external system via MQ. The external system runs on a mainframe machine (IBM z/OS), while we run our process on a CentOS Linux platform. So far we never had any issues.
Recently we started receiving messages from them with non-printable EBCDIC characters embedded in the message. They use the characters as a compressed ID, 8 bytes long. When we receive it, it arrives on our queue encoded in UTF-8 (CCSID 1208).
They need the original 8 bytes back in order to identify our response messages. I'm trying to find a solution in Java to convert the ID back from UTF-8 to EBCDIC before sending the response.
I've been playing around with the JTOpen library, using the AS400Text class to do the conversion. Also, the counterparty has sent us a snapshot of the ID in bytes. However, when I compare the bytes after conversion, they are different from the original message.
Has anyone ever encountered this issue? Maybe I'm using the wrong code page?
Thanks for any input you may have.
Bytes from counterparty(Positions [5,14]):
00000 F0 40 D9 F0 F3 F0 CB 56--EF 80 04 C9 10 2E C4 D4 |0 R030.....I..DM|
Program output:
UTF String: [R030ôîÕ؜IDMDHP1027W 0510]
EBCDIC String: [R030ôîÃÃÂIDMDHP1027W 0510]
NATIVE CHARSET - HEX: [52303330C3B4C3AEC395C398C29C491006444D44485031303237572030353130]
CP500 CHARSET - HEX: [D9F0F3F066BE66AF663F663F623FC9102EC4D4C4C8D7F1F0F2F7E640F0F5F1F0]
Here is some sample code:
private void readAndPrint(MQMessage mqMessage) throws IOException {
    mqMessage.seek(150);
    byte[] subStringBytes = new byte[32];
    mqMessage.readFully(subStringBytes);
    String msgId = toHexString(mqMessage.messageId).toUpperCase();
    System.out.println("----------------------------------------------------------------");
    System.out.println("MESSAGE_ID: " + msgId);
    String hexString = toHexString(subStringBytes).toUpperCase();
    String subStr = new String(subStringBytes);
    System.out.println("NATIVE CHARSET - HEX: [" + hexString + "] [" + subStr + "]");
    // Transform to EBCDIC
    int codePageNumber = 37;
    String codePage = "CP037";
    AS400Text converter = new AS400Text(subStr.length(), codePageNumber);
    byte[] bytesData = converter.toBytes(subStr);
    String resultedEbcdicText = new String(bytesData, codePage);
    String hexStringEbcdic = toHexString(bytesData).toUpperCase();
    System.out.println("CP500 CHARSET - HEX: [" + hexStringEbcdic + "] [" + resultedEbcdicText + "]");
    System.out.println("----------------------------------------------------------------");
}
If a MQ message has varying sub-message fields that require different encodings, then that's how you should handle those messages, i.e., as separate message pieces.
But as you describe this, the entire message needs to be received without conversion. The first eight bytes need to be extracted and held separately. The remainder of the message can then have its encoding converted (unless other sub-fields also need to be extracted as binary, unconverted bytes).
For any return message, the opposite conversion must be done. The text portion of the message can be converted, and then that sub-string can have the original eight bytes prepended to it. The newly reconstructed message then can be sent back through the queue, again without automatic conversion.
Your partner on the other end is not using the messaging product correctly. (Of course, you probably shouldn't say that out loud.) There should be no part of such a message that cannot automatically survive intact in both directions. Instead of an 8-byte binary field, it should be represented as something like a 16-byte hex representation of the 8-byte value, to give one example. In hex, there would be no conversion problem in either direction along the route.
It seems to me that the special 8 bytes are not actually EBCDIC characters but simply 8 bytes of binary data. If that is the case, then I believe, as mentioned in another answer, that you should handle those 8 bytes separately, without allowing them to be converted to UTF-8 and then back to EBCDIC for further processing.
Depending on the EBCDIC variant you are using, it is quite possible that a byte in EBCDIC does not convert to a meaningful UTF-8 character, and hence you will fail to recover the original byte by converting the received UTF-8 character back to EBCDIC.
A brief search on Google gives me several EBCDIC tables (e.g. http://www.simotime.com/asc2ebc1.htm#AscEbcTables). You can see there are many values in EBCDIC that have no character assigned. Hence, when they are converted to UTF-8, you may not assume each of them will map to a distinct Unicode character. Your proposed way of processing is therefore going to be very dangerous and error-prone.
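The loss can be demonstrated directly: forcing arbitrary bytes through a text decode/encode cycle is not reversible. This small sketch (an illustration of the general hazard, not the exact MQ conversion chain) uses a few of the counterparty's raw ID bytes from the hex dump above:

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class LossyRoundTrip {
    public static void main(String[] args) {
        // Some of the counterparty's raw ID bytes (from the hex dump above):
        byte[] original = { (byte) 0xCB, 0x56, (byte) 0xEF, (byte) 0x80, 0x04, (byte) 0xC9 };
        // Decoding arbitrary binary as text replaces invalid sequences with U+FFFD ...
        String asText = new String(original, StandardCharsets.UTF_8);
        // ... so re-encoding does not give the original bytes back.
        byte[] roundTripped = asText.getBytes(StandardCharsets.UTF_8);
        System.out.println(Arrays.equals(original, roundTripped)); // false
    }
}
```

This is why the ID must be carried as raw bytes (or a hex/base64 text representation), never passed through a charset conversion.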
