How to decode ResponseAPDU to XML from an Austrian e-card? - java

I am trying to read information from an Austrian e-card to get the first name and last name.
What works so far: I can access the card, send APDU commands, and receive the information as a byte array.
How can I convert the received byte array to XML to extract the needed data?
Here is the code:
import java.util.List;
import javax.smartcardio.Card;
import javax.smartcardio.CardChannel;
import javax.smartcardio.CardException;
import javax.smartcardio.CardTerminal;
import javax.smartcardio.CommandAPDU;
import javax.smartcardio.ResponseAPDU;
import javax.smartcardio.TerminalFactory;

public class Main2 {
    public static void main(String[] args) {
        TerminalFactory factory = TerminalFactory.getDefault();
        List<CardTerminal> terminals;
        try {
            terminals = factory.terminals().list();
            CardTerminal terminal = terminals.get(0);
            Card card = terminal.connect("*");
            CardChannel channel = card.getBasicChannel();
            // Select the application by its AID (P1 = 0x04: select by DF name)
            byte[] aid = { (byte) 0xD0, 0x40, 0x00, 0x00, 0x17, 0x01, 0x01, 0x01 };
            ResponseAPDU resp = channel.transmit(new CommandAPDU(0x00, 0xA4, 0x04, 0x00, aid));
            System.out.println("Response: " + resp.toString());
            // Select the personal data file (FID EF01)
            byte[] aid2 = { (byte) 0xEF, 0x01 };
            resp = channel.transmit(new CommandAPDU(0x00, 0xA4, 0x02, 0x04, aid2));
            System.out.println("Response: " + resp.toString());
            // Read the data from the file
            resp = channel.transmit(new CommandAPDU(0x00, 0xB0, 0x00, 0x00, 0xFF));
            System.out.println("Response: " + resp.toString());
            System.out.println("Response String: " + new String(resp.getData()));
            card.disconnect(false);
        } catch (CardException e) {
            e.printStackTrace();
        }
    }
}

I'm not sure how to transform that data into an XML structure (and according to what schema). However, the byte array that I received from my SV card looks like an ASN.1 DER encoded TLV structure:
30 xxxx                 SEQUENCE
    30 18               SEQUENCE
        06 08           OBJECT IDENTIFIER
            2A28000A01040101
            => OID 1.2.40.0.10.1.4.1.1 (SV number)
        31 0C           SET
            12 0A       NumericString
                nnnnnnnnddddmmmmyyyy
                => SV number: NNNN DDMMYY
    30 0F               SEQUENCE
        06 08           OBJECT IDENTIFIER
            2A28000A01040103
            => OID 1.2.40.0.10.1.4.1.3 (Card sequence number)
        31 03           SET
            02 01       INTEGER
                xx
                => Card sequence number: xx
    30 xx               SEQUENCE
        [...]
    30 xx               SEQUENCE
        06 03           OBJECT IDENTIFIER
            55042A
            => OID 2.5.4.42 ({joint-iso-itu-t(2) ds(5) attributeType(4) givenName(42)})
        31 xx           SET
            0C xx       UTF8String
                4D69636861656C
                => Given name: "Michael"
    30 xx               SEQUENCE
        06 03           OBJECT IDENTIFIER
            550404
            => OID 2.5.4.4 ({joint-iso-itu-t(2) ds(5) attributeType(4) surname(4)})
        31 xx           SET
            0C xx       UTF8String
                526F6C616E64
                => Surname: "Roland"
    30 xx               SEQUENCE
        [...]
    30 1D               SEQUENCE
        06 08           OBJECT IDENTIFIER
            2B06010505070901
            => OID 1.3.6.1.5.5.7.9.1 ({iso(1) identified-organization(3) dod(6) internet(1) security(5) mechanisms(5) pkix(7) pda(9) dateOfBirth(1)})
        31 11           SET
            18 0F       GeneralizedTime
                yyyyyyyymmmmdddd3132303030305A
                => Date of birth: YYYY-MM-DD 12:00:00Z
    30 0F               SEQUENCE
        06 08           OBJECT IDENTIFIER
            2B06010505070903
            => OID 1.3.6.1.5.5.7.9.3 ({iso(1) identified-organization(3) dod(6) internet(1) security(5) mechanisms(5) pkix(7) pda(9) gender(3)})
        31 03           SET
            13 01       PrintableString
                4D
                => Gender: M (male)
So this seems to follow something like the following ASN.1 notation:
SVPersonGrunddaten ::= SEQUENCE OF Attribute

Attribute ::= SEQUENCE {
    attributeName   OBJECT IDENTIFIER,
    attributeValue  SET OF AttributeType }

AttributeType ::= CHOICE {
    numericString    NumericString,
    integer          INTEGER,
    utf8String       UTF8String,
    time             GeneralizedTime,
    printableString  PrintableString }

Where the attributes for the given name and the surname are:

givenName Attribute ::= {
    attributeName 2.5.4.42,
    attributeValue { utf8String "Given Name" }
}

surname Attribute ::= {
    attributeName 2.5.4.4,
    attributeValue { utf8String "Surname" }
}
So in order to get the given name and the surname, you would parse the TLV structure, search for the OIDs of those two elements, and decode the associated values as UTF-8 strings.
Note that simply assuming the fields are always at the same fixed positions does not seem to be a good idea. For instance, there is a field 30 xx ... (a field of type Attribute) before the given-name field that seems to be present only if an academic/professional title (e.g. "Dr." in my case) is printed on the card. Similarly, there is another optional field for academic suffixes (such as "M.Sc.") that is present only if such a suffix is printed on the card. Though all other fields were always in the same order on my cards, I'm not sure that's even required.
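As a rough illustration, here is a minimal sketch of that OID-based lookup using the BouncyCastle ASN.1 classes (the method name findUtf8Attribute is mine, not part of any API):

import org.bouncycastle.asn1.ASN1Encodable;
import org.bouncycastle.asn1.ASN1ObjectIdentifier;
import org.bouncycastle.asn1.ASN1Primitive;
import org.bouncycastle.asn1.ASN1Sequence;
import org.bouncycastle.asn1.ASN1Set;
import org.bouncycastle.asn1.DERUTF8String;

// Returns the UTF8String value of the attribute with the given OID, or null
public static String findUtf8Attribute(byte[] der, String oid) throws Exception {
    ASN1Sequence outer = ASN1Sequence.getInstance(ASN1Primitive.fromByteArray(der));
    for (ASN1Encodable element : outer) {
        ASN1Sequence attribute = ASN1Sequence.getInstance(element);
        ASN1ObjectIdentifier name = ASN1ObjectIdentifier.getInstance(attribute.getObjectAt(0));
        if (name.getId().equals(oid)) {
            ASN1Set values = ASN1Set.getInstance(attribute.getObjectAt(1));
            return DERUTF8String.getInstance(values.getObjectAt(0)).getString();
        }
    }
    return null;
}

With that, findUtf8Attribute(data, "2.5.4.42") should yield the given name and findUtf8Attribute(data, "2.5.4.4") the surname, regardless of whether the optional title attributes are present.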

Thanks for the hint. Here is code to decode the DER byte array to strings, using BouncyCastle's ASN.1 classes:
import java.io.IOException;
import org.bouncycastle.asn1.ASN1InputStream;
import org.bouncycastle.asn1.ASN1Primitive;
import org.bouncycastle.asn1.ASN1Sequence;
import org.bouncycastle.asn1.ASN1Set;
import org.bouncycastle.asn1.DERNumericString;
import org.bouncycastle.asn1.DERPrintableString;
import org.bouncycastle.asn1.DERUTF8String;
import org.bouncycastle.asn1.DLSequence;
import org.bouncycastle.asn1.DLSet;

ASN1InputStream input = new ASN1InputStream(resp.getData());
ASN1Primitive p;
try {
    while ((p = input.readObject()) != null) {
        // System.out.println("DEBUG: " + ASN1Dump.dumpAsString(p));
        // Note: the fixed indices below assume the optional title
        // attributes mentioned above are absent
        ASN1Sequence asn1 = ASN1Sequence.getInstance(p);
        // Sozialversicherungsnummer (social security number)
        ASN1Sequence seq = DLSequence.getInstance(asn1.getObjectAt(0));
        ASN1Set svn = DLSet.getInstance(seq.getObjectAt(1));
        DERNumericString svnObject = DERNumericString.getInstance(svn.getObjectAt(0));
        System.out.println("SVN: " + svnObject.getString());
        // Vorname (given name)
        seq = DLSequence.getInstance(asn1.getObjectAt(2));
        svn = DLSet.getInstance(seq.getObjectAt(1));
        DERUTF8String stringObject = DERUTF8String.getInstance(svn.getObjectAt(0));
        System.out.println("Vorname: " + stringObject.getString());
        // Nachname (surname)
        seq = DLSequence.getInstance(asn1.getObjectAt(3));
        svn = DLSet.getInstance(seq.getObjectAt(1));
        stringObject = DERUTF8String.getInstance(svn.getObjectAt(0));
        System.out.println("Nachname: " + stringObject.getString());
        // Geschlecht (gender)
        seq = DLSequence.getInstance(asn1.getObjectAt(5));
        svn = DLSet.getInstance(seq.getObjectAt(1));
        DERPrintableString charObject = DERPrintableString.getInstance(svn.getObjectAt(0));
        System.out.println("Geschlecht: " + charObject.getString());
    }
} catch (IOException e) {
    e.printStackTrace();
}

Related

CryptoJS encrypt HMACSha256 different than Java

I'm trying to convert this CryptoJS code to Kotlin:
const hash = CryptoJS.HmacSHA256(message, key);
const signature = CryptoJS.enc.Hex.stringify(hash);
Here is the Kotlin code that is supposed to be equivalent to the snippet above:
private fun generateSignature(key: String, payload: String): String {
    val algorithm = "HmacSHA256"
    return Mac.getInstance(algorithm)
        .apply { init(SecretKeySpec(key.toByteArray(), algorithm)) }
        .run { doFinal(payload.toByteArray()) }
        .let { HexUtils.toHexString(it) }
}
But it is not working at all: they generate different results. CryptoJS produces an array with 8 positions, while the Java code produces an array with 32 positions.
I don't know what I'm doing wrong. I need to make my Kotlin code work exactly like the JavaScript one.
Update: I can't change the JavaScript side. I have to do exactly the same thing in Kotlin.
Update 2: Here is a test where the JS code and the Kotlin code generate different results.
Input:
key = 's21fk4vb-5415-46c7-aade-303dcf432bb4'
message = 'POST,/wallets/3323461f96-bdf3-4e03-bc93-7da1fb27aee7/withdraw/,1573148023809,{"amount":"1.0","bank":{"bank":"789","agency":"456","account":"12378","accountDigit":"6","name":"joao","taxId":"33206913098","holderType":"personal"}}'
Results with the JS code:
Result of encrypt in bytes (a WordArray of 8 32-bit words):
{sigBytes: 32, words: [2102759135, -196086391, -2099697915, -1620551271,
                       2463524, 1757965357, -1039993965, -1798822705]}
Bytes to hex:
7d558edff44ff58982d927059f6859990025972468c86c2dc202f39394c824cf
Results with the Kotlin code:
Result of encrypt in bytes (byte[32]):
[82, -110, -100, -128, -63, 22, -103, -31, 83, -125, -72, 109, -91, -69, 54, -41,
 27, -107, -60, -110, -57, -29, -20, -32, -66, 88, 87, -50, -47, -18, -96, 25]
Bytes to hex:
52929c80c11699e15383b86da5bb36d71b95c492c7e3ece0be5857ced1eea019
No SHA-256 hash can have only 8 byte positions; the output, as the name suggests, is 256 bits, i.e. 32 bytes. What I suspect is happening is that the input of stringify is presumed to already be bytes, while CryptoJS functions return a WordArray of 32-bit words. As 8 * 32 = 256, this seems reasonable.
So I presume you can simply fix this by using a method of the WordArray itself instead, for instance hash.toString(CryptoJS.enc.Hex).
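As a quick sanity check on the JVM side, plain JCE (no extra libraries) always produces a 32-byte tag for HMAC-SHA256; a minimal sketch with made-up key and message values:

import java.nio.charset.StandardCharsets;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

public class HmacLengthCheck {
    public static void main(String[] args) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec("key".getBytes(StandardCharsets.UTF_8), "HmacSHA256"));
        byte[] tag = mac.doFinal("message".getBytes(StandardCharsets.UTF_8));
        // 32 bytes = 256 bits, i.e. the 8 32-bit words that CryptoJS reports
        System.out.println(tag.length); // prints 32
    }
}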

ECDSA sign with BouncyCastle and verify with Crypto++

Here is the Java code:
public static String sign(String data) throws Exception {
    KeyPair keyPair = loadKeyPair(System.getProperty("user.dir"), "ECDSA");
    Signature signature = Signature.getInstance("SHA256withECDSA", "BC");
    signature.initSign(keyPair.getPrivate(), new SecureRandom());
    byte[] message = data.getBytes();
    signature.update(message);
    byte[] sigBytes = signature.sign();
    String signatureStr = new BigInteger(1, sigBytes).toString(16);
    return signatureStr;
}
Then the C++ code to verify signatures:
bool VerifyMessage( const ECDSA<ECP, SHA256>::PublicKey& key, const string& message, const string& signature )
{
    bool result = false;
    // Hex encoded version, more readable
    std::string decodedSignature;
    StringSource(signature, true,
        new HexDecoder(
            new StringSink(decodedSignature)));
    StringSource(decodedSignature+message, true,
        new SignatureVerificationFilter(ECDSA<ECP,SHA256>::Verifier(key),
            new ArraySink((byte*)&result, sizeof(result))));
    return result;
}
I thought I needed to hex-encode my signature, but that didn't solve my problem. I've written a C++ version of the sign method using Crypto++, and its signatures verify fine. So why is the signature not verified when I use the Java code? Thanks.
... why when I use the java code, the signature is not verified?
OpenSSL and Java use an ASN.1/DER encoding for the signature, and Crypto++ uses IEEE P1363's format for the signature.
ASN.1: SEQUENCE ::= { r INTEGER, s INTEGER }
P1363: [byte array r][byte array s]
You need to convert between the formats. Crypto++ provides DSAConvertSignatureFormat to convert between formats. There is an example on the Crypto++ wiki at Elliptic Curve Digital Signature Algorithm | OpenSSL and Java Interop.
Here is the Crypto++ code from the wiki. It uses OpenSSL and its command line tools rather than Java. There is no material difference because OpenSSL and Java output signatures in ASN.1/DER format.
#include "cryptlib.h"
#include "eccrypto.h"
#include "files.h"
#include "dsa.h"
#include "sha.h"
#include "hex.h"
#include <iostream>
using namespace CryptoPP;
int main(int argc, char* argv[])
{
// Load DER encoded public key
FileSource pubKey("secp256k1-pub.der", true /*binary*/);
ECDSA<ECP, SHA1>::Verifier verifier(pubKey);
// Java or OpenSSL created signature. It is ANS.1
// SEQUENCE ::= { r INTEGER, s INTEGER }.
const byte derSignature[] = {
0x30, 0x44, 0x02, 0x20, 0x08, 0x66, 0xc8, 0xf1,
0x6f, 0x15, 0x00, 0x40, 0x8a, 0xe2, 0x1b, 0x40,
0x56, 0x28, 0x9c, 0x17, 0x8b, 0xca, 0x64, 0x99,
0x37, 0xdc, 0x35, 0xad, 0xad, 0x60, 0x18, 0x4d,
0x63, 0xcf, 0x4a, 0x06, 0x02, 0x20, 0x78, 0x4c,
0xb7, 0x0b, 0xa3, 0xff, 0x4f, 0xce, 0xd3, 0x01,
0x27, 0x5c, 0x6c, 0xed, 0x06, 0xf0, 0xd7, 0x63,
0x6d, 0xc6, 0xbe, 0x06, 0x59, 0xe8, 0xc3, 0xa5,
0xce, 0x8a, 0xf1, 0xde, 0x01, 0xd5
};
// P1363 'r || s' concatenation. The size is 32+32 due to field
// size for r and s in secp-256. It is not 20+20 due to SHA-1.
SecByteBlock signature(verifier.SignatureLength());
DSAConvertSignatureFormat(signature, signature.size(), DSA_P1363,
derSignature, sizeof(derSignature), DSA_DER);
// Message "Attack at dawn!"
const byte message[] = {
0x41, 0x74, 0x74, 0x61, 0x63, 0x6b, 0x20, 0x61,
0x74, 0x20, 0x64, 0x61, 0x77, 0x6e, 0x21, 0x0a
};
// https://www.cryptopp.com/wiki/Elliptic_Curve_Digital_Signature_Algorithm
bool result = verifier.VerifyMessage(message, sizeof(message), signature, signature.size());
if (result)
std::cout << "Verified message" << std::endl;
else
std::cout << "Failed to verify message" << std::endl;
return 0;
}
And here is the result of running the test program.
$ ./test.exe
Signature (64):
0866C8F16F1500408AE21B4056289C178BCA649937DC35ADAD60184D63CF4A06784CB70BA3FF4FCE
D301275C6CED06F0D7636DC6BE0659E8C3A5CE8AF1DE01D5
Verified message
Here is the setup I used to reproduce it: cat test.txt | openssl dgst -ecdsa-with-SHA1 -sign sample.key -keyform DER > test.sig. It is from #DivB's question at ECDSA sign with OpenSSL, verify with Crypto++.
$ cat test.txt
Attack at dawn!
$ hexdump -C test.txt
00000000 41 74 74 61 63 6b 20 61 74 20 64 61 77 6e 21 0a |Attack at dawn!.|
00000010
# Create private key in PEM format
$ openssl ecparam -name secp256k1 -genkey -noout -out secp256k1-key.pem
$ cat secp256k1-key.pem
-----BEGIN EC PRIVATE KEY-----
MHQCAQEEIO0D5Rjmes/91Nb3dHY9dxmbM7gVfxmB2+OVuLmWMbGXoAcGBSuBBAAK
oUQDQgAEgVNEuirUNCEVdf7nLSBUgU1GXLrtIBeglIbK54s91HlWKOKjk4CkJ3/B
wGAfcYKa+DgJ2IUQSD15K1T/ghM9eQ==
-----END EC PRIVATE KEY-----
# Convert private key to ASN.1/DER format
$ openssl ec -in secp256k1-key.pem -inform PEM -out secp256k1-key.der -outform DER
$ dumpasn1 secp256k1-key.der
0 116: SEQUENCE {
2 1: INTEGER 1
5 32: OCTET STRING
: ED 03 E5 18 E6 7A CF FD D4 D6 F7 74 76 3D 77 19
: 9B 33 B8 15 7F 19 81 DB E3 95 B8 B9 96 31 B1 97
39 7: [0] {
41 5: OBJECT IDENTIFIER secp256k1 (1 3 132 0 10)
: }
48 68: [1] {
50 66: BIT STRING
: 04 81 53 44 BA 2A D4 34 21 15 75 FE E7 2D 20 54
: 81 4D 46 5C BA ED 20 17 A0 94 86 CA E7 8B 3D D4
: 79 56 28 E2 A3 93 80 A4 27 7F C1 C0 60 1F 71 82
: 9A F8 38 09 D8 85 10 48 3D 79 2B 54 FF 82 13 3D
: 79
: }
: }
# Create public key from private key
$ openssl ec -in secp256k1-key.der -inform DER -pubout -out secp256k1-pub.der -outform DER
$ dumpasn1 secp256k1-pub.der
0 86: SEQUENCE {
2 16: SEQUENCE {
4 7: OBJECT IDENTIFIER ecPublicKey (1 2 840 10045 2 1)
13 5: OBJECT IDENTIFIER secp256k1 (1 3 132 0 10)
: }
20 66: BIT STRING
: 04 81 53 44 BA 2A D4 34 21 15 75 FE E7 2D 20 54
: 81 4D 46 5C BA ED 20 17 A0 94 86 CA E7 8B 3D D4
: 79 56 28 E2 A3 93 80 A4 27 7F C1 C0 60 1F 71 82
: 9A F8 38 09 D8 85 10 48 3D 79 2B 54 FF 82 13 3D
: 79
: }
# Sign the message using the private key
$ cat test.txt | openssl dgst -ecdsa-with-SHA1 -sign secp256k1-key.der -keyform DER > test.sig
# Dump the signature as hex
$ hexdump -C test.sig
00000000 30 44 02 20 08 66 c8 f1 6f 15 00 40 8a e2 1b 40 |0D. .f..o..#...#|
00000010 56 28 9c 17 8b ca 64 99 37 dc 35 ad ad 60 18 4d |V(....d.7.5..`.M|
00000020 63 cf 4a 06 02 20 78 4c b7 0b a3 ff 4f ce d3 01 |c.J.. xL....O...|
00000030 27 5c 6c ed 06 f0 d7 63 6d c6 be 06 59 e8 c3 a5 |'\l....cm...Y...|
00000040 ce 8a f1 de 01 d5 |......|
00000046
# Dump the signature as ASN.1/DER
$ dumpasn1 test.sig
0 68: SEQUENCE {
2 32: INTEGER
: 08 66 C8 F1 6F 15 00 40 8A E2 1B 40 56 28 9C 17
: 8B CA 64 99 37 DC 35 AD AD 60 18 4D 63 CF 4A 06
36 32: INTEGER
: 78 4C B7 0B A3 FF 4F CE D3 01 27 5C 6C ED 06 F0
: D7 63 6D C6 BE 06 59 E8 C3 A5 CE 8A F1 DE 01 D5
: }
By the way, another way around your problem (one that notably lets you avoid the command line) would be to modify the Java code so that it produces the R and S values directly, as well as to reconstruct the DER encoded values.
For example, you can extract the R and S values from the Java signature using these methods:
public static BigInteger extractR(byte[] signature) throws Exception {
    int startR = (signature[1] & 0x80) != 0 ? 3 : 2;
    int lengthR = signature[startR + 1];
    return new BigInteger(Arrays.copyOfRange(signature, startR + 2, startR + 2 + lengthR));
}

public static BigInteger extractS(byte[] signature) throws Exception {
    int startR = (signature[1] & 0x80) != 0 ? 3 : 2;
    int lengthR = signature[startR + 1];
    int startS = startR + 2 + lengthR;
    int lengthS = signature[startS + 1];
    return new BigInteger(Arrays.copyOfRange(signature, startS + 2, startS + 2 + lengthS));
}
These methods are notably used in Wycheproof to work with the BigIntegers directly.
They allow you to reconstruct in Java the P1363 encoding used by Crypto++, but be careful not to forget the left padding of the byte arrays with zeros. (Otherwise you may have problems whenever the R or S byte array is smaller than the expected length.)
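For illustration, here is a minimal sketch of that P1363 concatenation with the zero padding (toP1363 and copyRightAligned are my names, not library methods; fieldSize would be 32 for a 256-bit curve):

public static byte[] toP1363(BigInteger r, BigInteger s, int fieldSize) {
    byte[] out = new byte[2 * fieldSize];
    copyRightAligned(r.toByteArray(), out, 0, fieldSize);
    copyRightAligned(s.toByteArray(), out, fieldSize, fieldSize);
    return out;
}

// Right-align src within out[offset .. offset+len): drops a leading sign byte
// when src is longer than len, and left-pads with zeros when it is shorter.
private static void copyRightAligned(byte[] src, byte[] out, int offset, int len) {
    int n = Math.min(src.length, len);
    System.arraycopy(src, src.length - n, out, offset + len - n, n);
}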
And you can also reconstruct the DER encoded signature from big integers using:
public static byte[] derSign(BigInteger r, BigInteger s) throws Exception {
    byte[] rb = r.toByteArray();   // includes a leading sign byte when needed
    byte[] sb = s.toByteArray();
    int off = 4 + rb.length;       // start of the 's' TLV
    int tot = off + sb.length;     // length of the SEQUENCE body
    byte[] der = new byte[tot + 2];
    der[0] = 0x30;                 // SEQUENCE
    der[1] = (byte) (tot & 0xff);
    der[2] = 0x02;                 // INTEGER (r)
    der[3] = (byte) (rb.length & 0xff);
    System.arraycopy(rb, 0, der, 4, rb.length);
    der[off] = 0x02;               // INTEGER (s)
    der[off + 1] = (byte) (sb.length & 0xff);
    System.arraycopy(sb, 0, der, off + 2, sb.length);
    return der;
}
As you can see, these methods might be translated into C++ code, since they are really basic byte manipulations, but that's another story ;)
Building on top of Lery's excellent answer, I found myself wanting a fixed 64-byte P1363-style signature. The Java solution posted is great, but the r and s values may carry leading sign bytes and can therefore result in a 64-66 byte signature.
In this Kotlin function, I extract the r and s values and normalize each to exactly 32 bytes, which gives the 64-byte signature I wanted.
fun generateSignatureFromKeystore(message: ByteArray, privateKey: PrivateKey): ByteArray {
    // BouncyCastle's signing doesn't work with Android Keystore's ECPrivateKey
    val signatureConfig = Signature.getInstance("SHA256withECDSA").apply {
        initSign(privateKey)
        update(message)
    }
    val signature = signatureConfig.sign()
    // Convert the ASN.1/DER signature to IEEE P1363: skip an extra length byte
    // if the outer SEQUENCE uses the long length form (high bit set)
    val startR = if (signature[1].toUnsignedInt().and(0x80) != 0) 3 else 2
    val lengthR = signature[startR + 1].toUnsignedInt()
    val r = signature.copyOfRange(startR + 2, startR + 2 + lengthR).toFixedLength(32)
    val startS = startR + 2 + lengthR
    val lengthS = signature[startS + 1].toUnsignedInt()
    val s = signature.copyOfRange(startS + 2, startS + 2 + lengthS).toFixedLength(32)
    return r + s
}

private fun Byte.toUnsignedInt(): Int = toInt().and(0xFF)

// Drop a leading sign byte or left-pad with zeros so the value is exactly n bytes
private fun ByteArray.toFixedLength(n: Int): ByteArray = when {
    size == n -> this
    size > n -> copyOfRange(size - n, size)
    else -> ByteArray(n - size) + this
}

How can I write bits to byte array in java?

I am trying to create an MPEG-TS presentation timestamp. It is 5 bytes long. I found a solution in the source code of the VLC player. It looks like this (C code):
bits_write( &bits, 4, i_pts_dts ); // '0010' or '0011'
bits_write( &bits, 3, i_pts >> 30 );
bits_write( &bits, 1, 0x01 ); // marker
bits_write( &bits, 15, i_pts >> 15 );
bits_write( &bits, 1, 0x01 ); // marker
bits_write( &bits, 15, i_pts );
bits_write( &bits, 1, 0x01 ); // marker
i_header_size -= 0x5;
That means I must assemble 5 bytes from 40 bits.
For example, I need the 5 bytes for the decimal number 2350. Binary view:
1001 0010 1110
After the VLC manipulation I must have this binary view:
0010 000 1 000000000000000 1 000100100101110 1
Hex view:
21 00 01 12 5D
How can I do it in Java?
Also I found Java-solution on GitHub: https://github.com/taktod/myLib/blob/master/myLib.MIT/myLib.container.mpegts/src/main/java/com/ttProject/container/mpegts/field/PtsField.java
But this implementation is too complicated. For a one-time operation it requires creating too many helper classes like Bit1, Bit2, Bit3, etc.
This is simple bit manipulation; each PTS fragment has to be shifted one bit left to make room for the marker bit that follows it:
int dts = 2;      // must be 2 or 3
long pts = 2350;  // must be less than 2^33 = 8,589,934,592

// Layout: '001x' PTS[32:30] marker | PTS[29:22] | PTS[21:15] marker |
//         PTS[14:7] | PTS[6:0] marker
byte[] output = new byte[] {
    (byte) (dts << 4 | (pts >> 29) & 0x0E | 1),
    (byte) (pts >> 22),
    (byte) ((pts >> 14) & 0xFE | 1),
    (byte) (pts >> 7),
    (byte) (pts << 1 | 1)
};
for (byte b : output)
    System.out.printf("%02x ", b); // prints: 21 00 01 12 5d

SmartCard: APDU command READ_BINARY returns error: Wrong parameters P1-P2

I am trying to read a file that lives directly under the MF: the EF.DIR file. I have the file's SFID, so I don't use SELECT FILE first (since it shouldn't be necessary).
I might be having some problems with understanding the P2 parameter (the offset). I have read a couple of explanations, but I still don't get which offset is meant. I tried all the numbers from 0 to 8 just in case; none worked.
CLA = 0x00
INS_READ = 0xB0
P1_READ = 0x9E (per the datasheet: bit 8 = 1, bits 7-6 = 00, bits 5-1 = SFID)
P2 = 0x04 (I figured the offset should be bits 0 to 4, i.e. the SFID)
Le = 0 (per the datasheet, this should mean that any size will be returned)
This is my code:
byte[] readBinary = { CLA, INS_READ, P1_READ, (byte) 0x04, (byte) 0x00 };
ResponseAPDU read = channel.transmit(new CommandAPDU(readBinary));
String responseReadToString = read.toString();
String responseReadHex = Integer.toHexString(read.getSW()).toUpperCase(); // status word as hex
System.out.println("Response Read: " + responseReadToString + "\n"
        + "Response Read (HEX): " + responseReadHex);
The output I get in the console is:
Response Read: ResponseAPDU: 2 bytes, SW=6b00
Response Read (HEX): 6B00
The explanation of SW1-SW2 = 6B00 is:
Incorrect parameters P1-P2
I really don't know what is wrong, and it's really hard to find support on smartcards online, so hopefully someone who knows this better can help me out. I also tried using SELECT FILE first and then READ BINARY after it (without the SFID in the P1 parameter, of course), but that responded with "No EF is set as current".
Any help, guys?
The offset is the position/start point from which you begin reading.
Example: Data = [0x00 0x01 0x02 0x03 0x04 0x05]
When you issue a READ BINARY with offset = 2, the returned data will be [0x02 0x03 0x04 0x05].
As you probably want to read the whole EF.DIR file, the offset shall be zero.
For reading EF.DIR you can send any of the following (a javax.smartcardio sketch of the first variant follows below):
00 B0 9E 00 00 (READ BINARY with the SFID 0x1E in P1, offset 0 in P2)
or
00 B1 2F 00 04 54 02 00 00 00 (READ BINARY with odd INS, addressing file ID 2F00, with an offset data object '54 02 0000' in the command data)
or
00 A4 02 0C 02 2F 00 (SELECT EF with file ID 2F00)
00 B0 00 00 00 (READ BINARY from offset 0)
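For the first variant, a minimal javax.smartcardio sketch (reusing the channel from the question; Ne = 256 lets the card return up to 256 bytes):

// READ BINARY with an SFID: P1 = 0x80 | 0x1E (SFID of EF.DIR), P2 = offset 0
CommandAPDU readEfDir = new CommandAPDU(0x00, 0xB0, 0x9E, 0x00, 256);
ResponseAPDU resp = channel.transmit(readEfDir);
System.out.printf("SW=%04X, %d data bytes%n", resp.getSW(), resp.getData().length);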

PNG True color with Alpha decoding

I am writing a PNG decoder and I am encountering some weirdness. Going through the PNG file format spec, I have managed to correctly decode PNGs of type indexed color + tRNS (alpha) and truecolor + tRNS (alpha). However, I am currently not sure why I cannot decode a truecolor-with-alpha PNG. I have verified that my inflate of the IDAT chunk is correct. Here's what the chunks look like:
Width: 256
Height: 256
Bit Depth: 8
Color Type: 6
Compression: 0
Filter: 0
Interlace: 0
Length: 25
Type: tEXt
Data: 53 6f 66 74 77 61 72 65 00 41 64 6f 62 65 20 49 6d 61 67 65 52 65 61 64 79
CRC: 71c9653c
Length: 20690
Type: IDAT
Data: 78 da ec 7d 09 9c 1c 57 99 df f7 7a a6 e7 3e 5a a3 fb ...
CRC: 21550259
The actual data is too long to print here. Here's my decoding logic, please correct me if I'm wrong:
1. Inflate all the bytes given in the IDAT chunk.
2. Un-filter the inflated data. In this case, all filters are of type 0 and therefore we simply discard the filter byte.
3. Since this is color type 6, a pixel is represented by RGBA channels with 1 byte each. This means we need to interpret 4 bytes at a time. The following code is used:
ByteBuffer image = BufferUtil.getDirectByteBuffer(data.length, ByteOrder.BIG_ENDIAN);
int i = 0;
while (i < data.length) {
    int color = ((data[i] & 0xff) << 24) | ((data[i+1] & 0xff) << 16)
              | ((data[i+2] & 0xff) << 8) | (data[i+3] & 0xff);
    image.putInt(color);
    i += 4;
}
What's strange is that I get mostly RRGGBBAA = 0x00000000 data resulting in a clear image with little color.
The problem is that you are neglecting to apply the per-scanline filtering.
From the image provided, the decompressed data looks like:
1 ffffffff 0 0 0 ...
2 0 0 0 0 0 0 ...
...
The first value in each line is the filter method used for that scanline [http://www.w3.org/TR/PNG/#9Filters]. After processing, the scanlines will look like:
ffffffff ffffffff ffffffff ...
ffffffff ffffffff ffffffff ...
...
Here is some example code that handles filter methods 0 (None), 1 (Sub) and 2 (Up):
private static void processScanLine(byte filterValue, byte[] scanLine, byte[] previousScanLine) {
    switch (filterValue) {
        case 0: // None
            break;
        case 1: // Sub: add the byte 4 positions back (bpp = 4 for 8-bit RGBA)
            for (int i = 4; i < scanLine.length; i++) {
                scanLine[i] = (byte) (scanLine[i] + scanLine[i - 4]);
            }
            break;
        case 2: // Up: add the corresponding byte of the previous scanline
            for (int i = 0; i < scanLine.length; i++) {
                scanLine[i] = (byte) (scanLine[i] + previousScanLine[i]);
            }
            break;
    }
}
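For context, here is a rough sketch of how processScanLine might be driven over the inflated IDAT bytes for an 8-bit RGBA image (width, height and inflated are assumed to come from your decoder; each row is one filter byte followed by width * 4 data bytes):

int stride = width * 4;              // 4 one-byte channels per pixel
byte[] previous = new byte[stride];  // all zeros above the first row
for (int row = 0; row < height; row++) {
    int base = row * (stride + 1);   // +1 for the leading filter byte
    byte filter = inflated[base];
    byte[] line = java.util.Arrays.copyOfRange(inflated, base + 1, base + 1 + stride);
    processScanLine(filter, line, previous);
    // line now holds raw RGBA bytes; feed it to the pixel loop from the question
    previous = line;
}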
