OpenSSL override RSA_sign to be performed by Smart Card - java

I'm trying to get OpenSSL working with Java and native C for my Android application.
What I have done so far:
I initialised OpenSSL like this:
ret = SSL_library_init();
SSL_load_error_strings();
ctx = SSL_CTX_new(SSLv23_method());
ret = SSL_CTX_use_certificate(ctx, sc_cert); // sc_cert is the Smart Card's auth certificate -> this is working!
_ssl = SSL_new(ctx);
Now I tried to set the rsa_sign callback to my own function:
RSA_METHOD *rsameth = RSA_get_default_method();
rsameth -> rsa_verify = &sc_rsa_verify; // Just returns 1, but never gets called.
rsameth -> rsa_sign = &sc_rsa_sign;
rsameth -> flags |= RSA_FLAG_SIGN_VER; // If I use 0x1FF my function gets called, why?
RSA_set_default_method(rsameth);
_rsa = RSA_new(); // handle error
// No need to do this: RSA_set_default_method already did that!
//_rsa -> meth = rsameth;
//_rsa -> flags |= RSA_FLAG_SIGN_VER;
//RSA_set_method(_rsa, rsameth);
ret = SSL_use_RSAPrivateKey(_ssl, _rsa);
RSA_set_default_method(rsameth);
Now my last steps:
sbio = BIO_new_socket(sock, BIO_NOCLOSE); // Sock had been created before and is working!
SSL_set_bio(_ssl, sbio, sbio);
if(_session) SSL_set_session(_ssl, _session);
ret = SSL_connect(_ssl);
Now after SSL_connect I get either:
No error: when my own RSA_sign (sc_rsa_sign) was NOT called
Or: error:1409441B:SSL routines:SSL3_READ_BYTES:tlsv1 alert decrypt error, when my own RSA_sign (sc_rsa_sign) WAS called
Now you can take a look inside my own RSA_sign (sc_rsa_sign) function:
jbyteArray to_crypt = (*_env) -> NewByteArray(_env, m_length);
(*_env) -> SetByteArrayRegion(_env, to_crypt, 0, m_length, m);
// Jump into Java and do the crypt on card. This is working!
jbyteArray crypted = (*_env) -> CallObjectMethod(_env, _obj, _callback_cryptoncard, to_crypt);
// I also read that *siglen should be RSA_size(rsa), so rsa -> n must not be NULL here. But it is! What is wrong here?
//int size = RSA_size(rsa);
//sigret = malloc(size);
// Obtain bytes from Java. Working (right size and crypted)!
*siglen = (*_env) -> GetArrayLength(_env, crypted);
sigret = (*_env) -> GetByteArrayElements(_env, crypted, NULL);
//(*_env) -> ReleaseByteArrayElements(_env, crypted, sigret, 0);
return 1;
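For context, here is a rough sketch of what the Java-side callback referenced by _callback_cryptoncard could look like; the class, method name, and smart-card API below are placeholders for illustration, not the actual implementation:
// Hypothetical Java callback invoked from sc_rsa_sign via JNI.
public byte[] cryptOnCard(byte[] digestInfo) {
    // Forward the PKCS#1 DigestInfo block to the smart card and return the raw signature,
    // which must be exactly RSA_size(rsa) bytes long.
    return smartCardChannel.signWithAuthKey(digestInfo);
}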
That's all I did so far. I have been struggling with this for weeks now! I hope somebody can help me!

I found the mistake (embarrassing):
sigret = (*_env) -> GetByteArrayElements(_env, crypted, NULL);
overwrote the pointer. I changed it to:
unsigned char *sigrettemp = (*_env) -> GetByteArrayElements(_env, crypted, NULL);
memcpy(sigret, sigrettemp, *siglen); // copy into the buffer OpenSSL provided instead of overwriting the pointer
and everything is working fine now!

Related

CreateProcessAsUserW error code 6 Invalid Handle JNA

I'm using JNA to call the Windows API. I want to start a process (it doesn't matter which) as a specific user. The two API calls I use are:
LogonUserW
CreateProcessAsUserW
LogonUserW succeeds, but CreateProcessAsUserW fails with Error 6. According to the Windows System Error Codes Doc, this corresponds to "ERROR_INVALID_HANDLE".
As far as I can tell, the only handle I pass in is the user handle. I don't see what could be wrong with that. According to the LogonUserW doc,
In most cases, the returned handle is a primary token that you can use in calls to the CreateProcessAsUser function. However, if you specify the LOGON32_LOGON_NETWORK flag, LogonUser returns an impersonation token that you cannot use in CreateProcessAsUser unless you call DuplicateTokenEx to convert it to a primary token.
However, I don't use LOGON32_LOGON_NETWORK.
Some of the struct parameters have handles, but I either pass NULL or they are populated by the API call instead of by me.
Here's the meat of my code:
final PointerByReference userPrimaryToken =
new PointerByReference();
System.out.printf(
"ptr.peer = %d\n",
Pointer.nativeValue(userPrimaryToken.getValue())
);
final boolean logonOk = MyWinBase.INSTANCE.LogonUserW(
toCString(<my-username>), // hidden
toCString("ANT"),
toCString(<my-password>), // hidden
/* This logon type is intended for batch servers, where
processes may be executing on behalf of a user without their
direct intervention. This type is also for higher
performance servers that process many plaintext
authentication attempts at a time, such as mail or web
servers.*/
WinBase.LOGON32_LOGON_BATCH,
WinBase.LOGON32_PROVIDER_DEFAULT,
userPrimaryToken
);
System.out.printf("ok = %b\n", logonOk);
System.out.printf(
"ptr.peer = %d\n",
Pointer.nativeValue(userPrimaryToken.getValue())
);
final STARTUPINFOW.ByReference startupInfoW =
new STARTUPINFOW.ByReference();
startupInfoW.cb = startupInfoW.size();
startupInfoW.lpReserved = Pointer.NULL;
startupInfoW.lpDesktop = Pointer.NULL;
startupInfoW.lpTitle = Pointer.NULL;
startupInfoW.dwFlags
= startupInfoW.dwX = startupInfoW.dwY
= startupInfoW.dwXSize = startupInfoW.dwYSize
= startupInfoW.dwXCountChars = startupInfoW.dwYCountChars
= startupInfoW.dwFillAttribute
= startupInfoW.wShowWindow
= 0;
startupInfoW.cbReserved2 = 0;
startupInfoW.lpReserved2 = Pointer.NULL;
startupInfoW.hStdInput = startupInfoW.hStdOutput
= startupInfoW.hStdError
= Pointer.NULL;
final PROCESS_INFORMATION.ByReference processInformation =
new PROCESS_INFORMATION.ByReference();
processInformation.hProcess = processInformation.hThread
= Pointer.NULL;
processInformation.dwProcessId = processInformation.dwThreadId
= 0;
final boolean createProcessOk = MyProcessThreadsApi.INSTANCE
.CreateProcessAsUserW(
userPrimaryToken.getPointer(),
toCString("C:\\Windows\\System32\\cmd.exe"),
// execute and terminate
toCString("/c whoami > whoami.txt"),
Pointer.NULL,
Pointer.NULL,
false,
WinBase.CREATE_UNICODE_ENVIRONMENT,
new PointerByReference(),
Pointer.NULL,
startupInfoW,
processInformation
);
System.out.printf("ok = %b\n", createProcessOk);
System.out.printf(
"dwProcessId = %d\n", processInformation.dwProcessId
);
System.out.printf(
"last err code = %d\n",
ErrHandlingApi.INSTANCE.GetLastError()
);
Here's my output:
ptr.peer = 0
ok = true
ptr.peer = 1040
ok = false
dwProcessId = 0
last err code = 6
Any suggestions?
Looking at this piece of code:
final PointerByReference userPrimaryToken = ...;
The online documentation says it represents a pointer to a pointer, void** in C notation:
https://java-native-access.github.io/jna/4.2.1/com/sun/jna/ptr/PointerByReference.html
The documentation for LogonUser says it expects a PHANDLE, a pointer to a HANDLE, which is effectively a pointer to a pointer, since a HANDLE is itself a pointer (it is declared as typedef void *HANDLE;).
https://learn.microsoft.com/en-us/windows/win32/api/winbase/nf-winbase-logonuserw
BOOL LogonUserW(
....
DWORD dwLogonProvider,
PHANDLE phToken
);
But the documentation for CreateProcessAsUser specifies that this function accepts a HANDLE, not a PHANDLE:
https://learn.microsoft.com/en-us/windows/win32/api/processthreadsapi/nf-processthreadsapi-createprocessasuserw
BOOL CreateProcessAsUserW(
HANDLE hToken,
LPCWSTR lpApplicationName,
....
);
So I would expect you to pass getValue rather than getPointer. By using getPointer you get the pointer itself, which in your case is most probably the pointer to the pointer. I don't know JNA, but these expectations come from knowledge of the WinAPI:
final boolean createProcessOk = MyProcessThreadsApi.INSTANCE
.CreateProcessAsUserW(
userPrimaryToken.getValue(),
....
);
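To make the getValue/getPointer distinction concrete, here is a small sketch of the JNA semantics (using the userPrimaryToken from the question's code; the comments reflect my reading of the API):
// LogonUserW writes the HANDLE into the native memory cell that userPrimaryToken wraps.
Pointer hToken = userPrimaryToken.getValue();    // the HANDLE itself - what CreateProcessAsUserW expects
Pointer pHandle = userPrimaryToken.getPointer(); // the address of that cell (a PHANDLE) - what LogonUserW expects
// Passing pHandle where a HANDLE is required hands Windows a pointer-to-handle instead of a handle,
// which plausibly explains the ERROR_INVALID_HANDLE (6).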

java.lang.NullPointerException: lock == null from InputStreamReader

So I'm trying to parse a Wavefront .obj file to be displayed with OpenGL ES; the thing is, I'm getting the NullPointerException as if the file did not exist or were empty.
I tried two different ways of parsing the file, made sure there were no empty lines in it, and put it in different folders (assets, src root, res, etc...), but the result is the same. Maybe the error I'm getting has more to do with the OpenGL part of the code? But I'm kinda lost, because apparently it should work...
I also tried buffering the file outside the function; the same thing happened. In another question here, the person's problem had to do with "trying to update the UI from a worker thread". Async did not help me here.
I got the code idea from this blog: http://etcodehome.blogspot.com/2011/07/android-rendering-3d-blender-models.html
And the file to base my work on from here: https://github.com/MartianIsMe/earth-live-wallpaper/blob/d71902aa642bad0c10fc46d6839ced6e15995f7b/%20earth-live-wallpaper/SLWP/src/com/seb/SLWP/DeathStar.java
fun loadObjFile() {
try {
var str: String
var tmp: Array<String>
var ftmp: Array<String>
var v: Float
val vlist = ArrayList<Float>()
val nlist = ArrayList<Float>()
val fplist = ArrayList<Fp>()
val mContext: Context? = null
//val inb: BufferedReader = File("androidmodel.obj").bufferedReader()
//val inputString = inb.use { it.readText() }
val inb = BufferedReader(InputStreamReader(mContext?.getAssets()?.open
("src/main/res/androidmodel.obj")), 1024) //Error is here at com.example.xxx.MyGLRenderer.loadObjFile
while (inb.readLine().also { str = it } != null) {
tmp = str.split(" ".toRegex()).toTypedArray()
//Parse the vertices
if (tmp[0].equals("v", ignoreCase = true)) {
for (i in 1..3) {
v = tmp[i].toFloat()
vlist.add(v)
}
}
//Parse the vertex normals
if (tmp[0].equals("vn", ignoreCase = true)) {
for (i in 1..3) {
v = tmp[i].toFloat()
nlist.add(v)
}
}
//Parse the faces/indices
if (tmp[0].equals("f", ignoreCase = true)) {
for (i in 1..3) {
ftmp = tmp[i].split("/".toRegex()).toTypedArray()
val chi = ftmp[0].toInt() - 1.toLong()
var cht = 0
if (ftmp[1] != "") cht = ftmp[1].toInt() - 1
val chn = ftmp[2].toInt() - 1
fplist.add(Fp(chi, cht, chn))
}
NBFACES++
}
}
val vbb = ByteBuffer.allocateDirect(fplist.size * 4 * 3)
vbb.order(ByteOrder.nativeOrder())
mVertexBuffer = vbb.asFloatBuffer()
val nbb = ByteBuffer.allocateDirect(fplist.size * 4 * 3)
nbb.order(ByteOrder.nativeOrder())
mNormBuffer = nbb.asFloatBuffer()
for (j in fplist.indices) {
mVertexBuffer?.put(vlist[(fplist[j].Vi * 3).toInt()])
mVertexBuffer?.put(vlist[(fplist[j].Vi * 3 + 1).toInt()])
mVertexBuffer?.put(vlist[(fplist[j].Vi * 3 + 2).toInt()])
mNormBuffer?.put(nlist[fplist[j].Ni * 3])
mNormBuffer?.put(nlist[fplist[j].Ni * 3 + 1])
mNormBuffer?.put(nlist[fplist[j].Ni * 3 + 2])
}
mIndexBuffer = CharBuffer.allocate(fplist.size)
for (j in fplist.indices) {
mIndexBuffer?.put(j.toChar())
}
mVertexBuffer?.position(0)
mNormBuffer?.position(0)
mIndexBuffer?.position(0)
} catch (e: IOException) {
e.printStackTrace()
}
}
private class Fp
(var Vi: Long, var Ti: Int, var Ni: Int)
The problem is that you pass null into InputStreamReader. The path to the asset is wrong.
First of all, the file should be located under the assets directory, which sits at the same level in the directory hierarchy as the java and res folders.
Second, you should pass a path relative to the assets directory. So if your file is located directly under assets, the relative path is "androidmodel.obj", and creating the input stream will look like this:
InputStreamReader(mContext?.getAssets()?.open("androidmodel.obj"))
But I strongly recommend checking for null, because if mContext is null the issue will return:
mContext?.getAssets()?.open("androidmodel.obj")?.let { nonNullAsset ->
InputStreamReader(nonNullAsset)
}
The ?.let { part is crucial, as it runs the let block only if the object is not null.
If there is no assets directory, just create it as a plain directory and it will be picked up by the IDE automatically.
Update
As the NPE still occurs, the only reason left is a null value in the mContext variable. Make sure it is initialized.
After a little more digging, I can say that this was the issue from the beginning. Passing a wrong file path to the assets.open(fileName) function would result in a FileNotFoundException; so even though the path you use is wrong, you never even reached the point of opening the file, because the context is null.

SHA256withECDSA Signature not Verifying, Android

Update: The critical aspect is that kp (the key pair) is generated "outside" this code. This code is in an onClick handler, whereas kp is defined in the code that sets up the onClick. It shouldn't matter, but that seems to be the problem, and that's inexplicable.
What's wrong with the following code? It always prints (logs) false for the verification result b, even though the data is the same string "foo" and the signature sig is the same one that was generated earlier in the code.
val sig = Signature.getInstance("SHA256withECDSA").run {
initSign(kp.private)
update("foo".toByteArray())
sign()
}
Log.d(tag, "sig: " + sig.toString())
val o = Signature.getInstance("SHA256withECDSA")
o.initVerify(kp.public)
o.update("foo".toByteArray())
val b = o.verify(sig)
Log.d(tag, b.toString())
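For comparison, the same flow verifies when a single key pair instance is used for both operations. A minimal self-contained Java sketch (standard JCA calls, nothing Android-specific assumed) that prints true:
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.Signature;

public class EcdsaCheck {
    public static void main(String[] args) throws Exception {
        // Generate one EC key pair and reuse it for both signing and verifying.
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("EC");
        kpg.initialize(256);
        KeyPair kp = kpg.generateKeyPair();

        // Sign "foo" with the private key.
        Signature signer = Signature.getInstance("SHA256withECDSA");
        signer.initSign(kp.getPrivate());
        signer.update("foo".getBytes());
        byte[] sig = signer.sign();

        // Verify with the matching public key; prints true.
        Signature verifier = Signature.getInstance("SHA256withECDSA");
        verifier.initVerify(kp.getPublic());
        verifier.update("foo".getBytes());
        System.out.println(verifier.verify(sig));
    }
}
If the onClick setup code ends up regenerating kp, so that a different key pair is used for verification, the signature will not verify.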

verify a rsa sign from java in php

For a few days now I've had a problem that I can't solve on my own:
On a JavaCard I generate an RSA key pair (length: 1024) and a signature (mode: ALG_RSA_MD5_PKCS1).
Now I have to verify the signature in php.
From my JavaCard I get the exponent, modulus and the signature in hexadecimal:
$mod = '951ADDA04637190B6202BB52787D3C19160A383C80C2E7242D0A7850FDD80C1CD1CCCF1395F8CA0B20270E3BC6C86F78232D65D148258BEFD0884563C60AB2C327506FB4FA0095CF0B1C527D942155731451F790EC0A227D38613C9EBFB2E04A657B3BA5456B35F71E92E14B7E1CB38DB6572559BFCA3B0AD8AA061D48F68931';
$exp = '010001';
$sign ='75867D42BDE6DF1066D4AF69418FCDD4B0F19173141128DFEBC64AF6C014CB92D38F4824E52BB064A610E07C7783AE57AE993A792F15208FB199CB1F45B64623AACB7FBA07AD89513C8DBA893C9FA6939857AA2CA53AAD99D9A9C1C32DF4E2769FCACB72E2C2C495727D368D953A911D32E79E230751202714DD15C0B6A34782';
$plaintext = '01020304';
Verification in Java is no problem, but now I have to verify the signature in PHP (I am using phpseclib).
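For reference, the Java-side check is roughly the following sketch (MD5 with PKCS#1 v1.5, matching ALG_RSA_MD5_PKCS1); hexToBytes is a hypothetical helper that decodes a hex string to raw bytes, and mod, exp, sign and plaintext are the hex strings listed above:
// Build the public key from modulus and exponent, then verify the PKCS#1 v1.5 MD5 signature.
PublicKey pub = KeyFactory.getInstance("RSA").generatePublic(
        new RSAPublicKeySpec(new BigInteger(mod, 16), new BigInteger(exp, 16)));
Signature verifier = Signature.getInstance("MD5withRSA");
verifier.initVerify(pub);
verifier.update(hexToBytes(plaintext));
boolean ok = verifier.verify(hexToBytes(sign));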
In PHP I generate my public_key with CRYPT_RSA_PUBLIC_FORMAT_RAW:
$rsa = new Crypt_RSA();
$pk = array(
'e' => new Math_BigInteger($exp, 16),
'n' => new Math_BigInteger($mod, 16)
);
$rsa->loadKey($pk, CRYPT_RSA_PUBLIC_FORMAT_RAW);
$rsa->setSignatureMode(CRYPT_RSA_SIGNATURE_PKCS1);
echo $rsa->verify($plaintext, $sign) ? 'verified' : 'unverified';
The problem now is to pass the correct values to the verify function.
If I just pass my signature as a hexadecimal string I get the notice:
Invalid signature: length = 256, k = 128 in C:\xampp\php\PEAR\Crypt\RSA.php on line 2175
So I have to adjust the length of my signature:
$sign_bigInteger = new Math_BigInteger($sign, 16);
$sign_bytes = $sign_bigInteger->toBytes();
echo $rsa->verify($plaintext, $sign_bytes) ? 'verified' : 'unverified';
But the verification fails.
I dumped the output of the verification function in RSA.php (_rsassa_pkcs1_v1_5_verify), where the plaintext is compared with the signature:
//sign
"ÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿ0 0*†H†÷ ÖÀZ!Q*y¡ßë*&/"
//plaintext
"ÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿ0!0 +•q£îê“O•äQ».åüÓSœÝ["
I don't really understand what's happening in the class RSA.php.
Can anyone help me and tell me what I'm doing wrong?
EDIT:
Now I have tried converting my hex strings:
$plaintext_bin = pack("H*", $plaintext);
$sign_bin = pack("H*", $sign);
I think my public key is generated correctly, so I just changed the input to my verify call:
$rsa->verify($plaintext_bin, $sign_bin) ? 'verified' : 'unverified';
Output:
em: string(128) "ÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿ0 0*†H†÷ ÖÀZ!Q*y¡ßë*&/"
em2: string(128) "ÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿ0!0 +ÚÚÿMG‡­ã31G ,;D>7o"
It's still not the same.
EDIT:
I fixed my problem. I had forgotten to set the hash:
$rsa->setHash('md5');
Now it works!
Thank you GregS.
All your values are hex strings. Just convert them using hex2bin() or pack("H*", $hex_string);

Getting output parameter value set by VBScript (WMI) method in java via JACOB

I am trying to convert a VBScript to Java using JACOB, the Java COM bridge library.
The 'Create' method in the VBScript accepts an [out] parameter and sets it upon method execution, and I couldn't figure out how to retrieve it via JACOB.
VBScript in question:
Function CreateProcess(strComputer, strCommand)
Dim objWMIService, objProcess
Set objWMIService = GetObject("winmgmts:" & "{impersonationLevel=impersonate}!\\" & strComputer & "\root\cimv2")
Set objProcess = objWMIService.Get("Win32_Process")
errReturn = objProcess.Create (strCommand, Null, Null, intProcessID)
Set objWMIService = Nothing
Set objProcess = Nothing
CreateProcess = intProcessID
End Function
intProcessID is the [out] parameter set after method execution (see the Create API contract).
Converted Java code (incomplete and modified slightly for demonstration):
public static void createProcess() {
String host = "localhost";
String connectStr = String
.format("winmgmts:{impersonationLevel=impersonate}!\\\\%s\\root\\CIMV2",
host);
ActiveXComponent axWMI = new ActiveXComponent(connectStr);
Variant vCollection = axWMI.invoke("get", new Variant("Win32_Process"));
Dispatch d = vCollection.toDispatch();
Integer processId = null;
int result = Dispatch.call(d, "Create", "notepad.exe", null, null, processId)
.toInt();
System.out.println("Result:" + result);
// WORKS FINE until here, i.e. notepad launches properly; however processId still seems to be null. The following commented-out code is wrong - it doesn't work
//Variant v = Dispatch.get(d, "processId"); // even ProcessId doesn't work
//int pId = v.getInt();
//System.out.println("process id:"
// + pId);
// what is the right way to get the process ID set by 'Create' method?
}
It would be great if you could provide some pointers or relevant code. Ask me for more details if needed. Thanks in advance.
Replacing
Integer processId = null;
with
Variant processId = new Variant(0, true);
should solve the problem. You should then have the process ID of the notepad.exe process in the processId Variant, and it can be fetched with
processId.getIntRef()
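Putting it together against the question's code (same Dispatch d as above), the corrected call would look roughly like this:
// Pass a by-reference Variant so JACOB can write the [out] parameter back into it.
Variant processId = new Variant(0, true);
int result = Dispatch.call(d, "Create", "notepad.exe", null, null, processId).toInt();
System.out.println("Result: " + result);
System.out.println("process id: " + processId.getIntRef());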
