Creating a JVM crashes outside of Visual Studio - java

I have the following problem. I wrote a small application that creates a Java Virtual Machine. If I start this program inside Visual Studio, it works fine. But if I start it outside of Visual Studio, the program does not work and I get a crash in ntdll.dll.
Here is my code:
int result = 0;
LoadRuntimeLibrary(libPath);

// Load the JVM library
g_jniLibrary = LoadLibrary(libPath);
if (g_jniLibrary == NULL) {
    info->Error("Could not load library: ");
    return -1;
}

// Grab the create VM function address
JNI_createJavaVM createJavaVM = (JNI_createJavaVM)GetProcAddress(g_jniLibrary, "JNI_CreateJavaVM");
if (createJavaVM == NULL) {
    info->Error("ERROR: Could not find JNI_CreateJavaVM function");
    return -1;
}

// Count the vm args
int numVMArgs = -1;
while (vmArgs[++numVMArgs] != NULL) {}

// Add the options for the exit and abort hooks
int numHooks = 0;
JavaVMOption* options = (JavaVMOption*)malloc((numVMArgs + numHooks) * sizeof(JavaVMOption));
for (int i = 0; i < numVMArgs; i++) {
    options[i].optionString = vmArgs[i];
    options[i].extraInfo = 0;
}

// Setup hook pointers
options[numVMArgs].optionString = "abort";
options[numVMArgs].extraInfo = (void*)&AbortHook;
options[numVMArgs + 1].optionString = "exit";
options[numVMArgs + 1].extraInfo = (void*)&ExitHook;

JavaVMInitArgs init_args;
memset(&init_args, 0, sizeof(init_args));
init_args.version = JNI_VERSION_1_8;
init_args.options = options;
init_args.nOptions = numVMArgs + numHooks;
init_args.ignoreUnrecognized = JNI_FALSE;

result = createJavaVM(&jvm, &env, &init_args); // here is the crash

env = GetJNIEnv(false);
Init(env);
result = RunMainClass(env, mainCls, argc, javaargs);
jvm->DestroyJavaVM();
FreeLibrary(g_jniLibrary);
return result;
I hope you have an idea of what could be wrong.

You are accessing the options array out of bounds. It only contains numVMArgs elements, as numHooks is zero.
This of course leads to undefined behavior when you do
options[numVMArgs].optionString = "abort";
options[numVMArgs].extraInfo = (void*)&AbortHook;
options[numVMArgs + 1].optionString = "exit";
options[numVMArgs + 1].extraInfo = (void*)&ExitHook;
as the indices numVMArgs and numVMArgs + 1 are out of bounds. This also explains the symptom: when launched from Visual Studio the process gets the Windows debug heap, whose padding can mask a small heap overrun, while outside the debugger the corrupted heap eventually brings the process down inside ntdll.dll.
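A minimal sketch of the fix (untested, reusing the names from your snippet): reserve room for the two hook options by setting numHooks to 2, so the writes land inside the allocation:

// Reserve two extra slots for the "abort" and "exit" hook options.
int numHooks = 2;
JavaVMOption* options = (JavaVMOption*)malloc((numVMArgs + numHooks) * sizeof(JavaVMOption));
if (options == NULL)
    return -1;

for (int i = 0; i < numVMArgs; i++) {
    options[i].optionString = vmArgs[i];
    options[i].extraInfo = 0;
}

// These writes are now in bounds, and nOptions = numVMArgs + numHooks
// already matches the size of the allocation.
options[numVMArgs].optionString = "abort";
options[numVMArgs].extraInfo = (void*)&AbortHook;
options[numVMArgs + 1].optionString = "exit";
options[numVMArgs + 1].extraInfo = (void*)&ExitHook;

Alternatively, drop the four hook-option lines entirely and keep numHooks at 0.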


I need help making Prim's algorithm work the way my instructor wants

So I have checked through the previously posted Prim's algorithm posts, and I can't find one that satisfies my instructor's requirements. I worked on this code with him and have it mostly working. However, for some reason, when it gets to a certain point, it breaks and goes to the wrong edge.
public int prims(T startVertex) {
    int tempWeight = 0;
    int championWeight = 0;
    int totalWeight = 0;
    int i = 0;
    boolean firstOne = false;
    T championVertex = null;
    T currentVertex = null;
    T checkVertex = null;
    T championMarked = null;
    UnboundedQueueInterface<T> vertexQueue = new LinkedUnbndQueue<T>();
    clearMarks();
    markVertex(startVertex);
    currentVertex = startVertex;
    do {
        for (int y = 0; y < numVertices; y++) {
            currentVertex = vertices[y];
            if (isMarked(currentVertex)) {
                championWeight = 0;
                championVertex = null;
                checkVertex = null;
                firstOne = true;
                vertexQueue = getToVertices(currentVertex);
                while (!vertexQueue.isEmpty()) {
                    checkVertex = vertexQueue.dequeue();
                    if (!isMarked(checkVertex)) {
                        tempWeight = weightIs(currentVertex, checkVertex);
                        if (championWeight > tempWeight || firstOne == true) {
                            championWeight = tempWeight;
                            championVertex = checkVertex;
                            championMarked = currentVertex;
                            firstOne = false;
                        }
                    }
                }
            }
        }
        System.out.println((String) championMarked + (String) championVertex + championWeight);
        markVertex(championVertex);
        totalWeight += championWeight;
    } while (!(getUnmarked() == null));
    System.out.println("Total cost is " + totalWeight);
    return totalWeight;
}
When I run it, I get the following output:
Graph 1
AD1
DF4
FC3
FE12
FZ17
Enull0
The output is correct for the graph until the line FE12; it should be CE4. When I run the debugger, I watch the code find the answer, but then it jumps up to the for loop and loses the right answer. I know there is an error in my logic, but I can't quite figure it out. Your input is appreciated. Thanks.
So I have figured out my issue. I needed to put the resets after the code outputs the solution; otherwise, where they were, if there were any vertices left to check that had not been used already, the code would lose the current values.
They needed to go here:
    System.out.println((String) championMarked + (String) championVertex + championWeight);
    markVertex(championVertex);
    totalWeight += championWeight;
    championWeight = 0;
    championVertex = null;
    checkVertex = null;
    firstOne = true;
} while (!(getUnmarked() == null));

CPLEX warm start error when using OPL model in Java API

I am trying to do a warm start using the Java API and am having some issues when passing the initial solution to the model.
In my model file (.mod) I have a 2D decision variable defined as:
range nodes = 1..5;
range vehicles = 1..2;
dvar int service_time[nodes][vehicles];
In my Java file I am building the model as below and trying to pass an initial solution to the above decision variable using the addMIPStart() function (as described here):
static public void main(String[] args) throws Exception {
    int status = 127;
    try {
        IloOplFactory.setDebugMode(true);
        IloOplFactory oplF = new IloOplFactory();
        IloOplErrorHandler errHandler = oplF.createOplErrorHandler(System.out);
        IloOplModelSource modelSource = oplF.createOplModelSource(DATADIR + "/myModFile.mod");
        IloOplSettings settings = oplF.createOplSettings(errHandler);
        IloOplModelDefinition def = oplF.createOplModelDefinition(modelSource, settings);
        IloCplex cplex = oplF.createCplex();
        IloOplModel opl = oplF.createOplModel(def, cplex);

        // adding the custom data source
        IloOplDataSource dataSource = new VRPDataSource(oplF);
        opl.addDataSource(dataSource);

        // generating the model
        opl.generate();

        // creating the initial solution
        int i = 5;
        int j = 2;
        IloIntVar[][] var2D = new IloIntVar[i][];
        double[][] var2D_startingVals = new double[i][];
        for (int index1 = 0; index1 < i; index1++) {
            var2D[index1] = new IloIntVar[j];
            var2D_startingVals[index1] = new double[j];
            for (int index2 = 0; index2 < j; index2++) {
                String varName = "service_time(" + (index1+1) + ")(" + (index2+1) + ")";
                var2D[index1][index2] = cplex.intVar(0, 50, varName);
                // lets assume a unit matrix as the starting solution
                var2D_startingVals[index1][index2] = 1;
            }
        }

        // flatten the multi-dimensional IloNumVar and double arrays
        IloNumVar[] flat_var2D = new IloNumVar[i*j];
        double[] flat_var2D_startingVals = new double[i*j];
        for (int index1 = 0; index1 < i; index1++) {
            for (int index2 = 0; index2 < j; index2++) {
                flat_var2D[index1*j + index2] = var2D[index1][index2];
                flat_var2D_startingVals[index1*j + index2] = var2D_startingVals[index1][index2];
            }
        }

        // adding the MIPStart
        cplex.addMIPStart(flat_var2D, flat_var2D_startingVals, IloCplex.MIPStartEffort.Auto, "addMIPStart start");

        if (cplex.solve()) {
            // more code
        } else {
            // more code
        }
        // more code
    } catch (Exception ex) {
        // more code
    }
}
Unfortunately, I get an exception on the line that calls the cplex.addMIPStart() function:
[java] ### CONCERT exception: The referenced IloExtractable has not been extracted by the IloAlgorithm
[java] ilog.concert.IloException: The referenced IloExtractable has not been extracted by the IloAlgorithm
[java] at ilog.cplex.cppimpl.cplex_wrapJNI.IloCplex_addMIPStart__SWIG_0(Native Method)
[java] at ilog.cplex.cppimpl.IloCplex.addMIPStart(IloCplex.java:866)
[java] at ilog.cplex.IloCplex.addMIPStart(IloCplex.java:13219)
[java] at ilog.cplex.IloCplex.addMIPStart(IloCplex.java:13228)
[java] at myJavaClass.myJavaClass.main(myJavaClass.java:412)
I am thinking the error is due to the way I prepare the initial solution. Can somebody please help me sort this out?
Thank you very much.
The problem is that you're creating new variables, not referencing the existing variables in the model. These new variables do not exist in the objective, constraints, etc., so you get the IloException (see this technote).
You should be able to access the existing variables by doing something like the following (note that this code has not been tested):
IloIntRange nodes = opl.getElement("nodes").asIntRange();
IloIntRange vehicles = opl.getElement("vehicles").asIntRange();
IloIntVarMap serviceTime = opl.getElement("service_time").asIntVarMap();

final int nbNodes = nodes.getSize();
final int nbVehicles = vehicles.getSize();

IloNumVar[] startX = new IloNumVar[nbNodes * nbVehicles];
double[] startVals = new double[nbNodes * nbVehicles];

for (int i = 0; i < nbNodes; i++) {
    IloIntVarMap inner = serviceTime.getSub(nodes.getValue(i));
    for (int j = 0; j < nbVehicles; j++) {
        int idx = i * nbVehicles + j;
        startX[idx] = inner.get(vehicles.getValue(j));
        startVals[idx] = 1.0;
    }
}

cplex.addMIPStart(startX, startVals);
Take a look at the Iterators.java example and the documentation for getElement.

Converting an iterative function to recursive

I am trying to convert an iterative function to recursion, but once I do, it runs continuously, like an infinite loop.
This is my iterative code:
private static Node buildModelTree(String[] args) {
    // TODO Auto-generated method stub
    String clsIndex = args[3];
    splitted.add(currentsplit);
    double entropy = 0;
    int total_attributes = (Integer.parseInt(clsIndex)); // class index
    int split_size = splitted.size();
    GainRatio gainObj = new GainRatio();
    while (split_size > current_index) { // iterate through all distinct pairs for building children
        currentsplit = (SplitInfo) splitted.get(current_index);
        System.out.println("After currentsplit --->" + currentsplit);
        gainObj = new GainRatio();
        int res = 0;
        res = ToolRunner.run(new Configuration(), new CopyOfFunID3Driver(), args);
        gainObj.getcount(current_index);
        entropy = gainObj.currNodeEntophy();
        clsIndex = gainObj.majorityLabel();
        currentsplit.classIndex = clsIndex;
        if (entropy != 0.0 && currentsplit.attr_index.size() != total_attributes) { // calculate gain ratio
            bestGain(total_attributes, entropy, gainObj);
        } else {
            // when entropy is zero, build the tree
            Node branch = new Node();
            String rule = "";
            Gson gson = new Gson();
            int temp_size = currentsplit.attr_index.size();
            for (int val = 0; val < temp_size; val++) {
                int g = 0;
                g = (Integer) currentsplit.attr_index.get(val);
                if (val == 0) {
                    rule = g + " " + currentsplit.attr_value.get(val);
                    // JSON
                    // branch.add(g, currentsplit.attr_value.get(val).toString(), new Node(currentsplit.classIndex, true));
                } else {
                    rule = rule + " " + g + " " + currentsplit.attr_value.get(val);
                    // branch.add(g, currentsplit.attr_value.get(val).toString(), buildModelTree(args));
                }
            }
            rule = rule + " " + currentsplit.classIndex;
        }
        split_size = splitted.size();
        current_index++;
    }
}
Where should I make changes?
I am trying to build a tree, so in order to get the tree structure I am trying to make my ID3 code recursive.
With my current code I am only getting output like this, but I want it as a tree structure.
Please suggest.
A recursive algorithm must have the following:
1. Each time the function invokes itself, the problem size has to be reduced (i.e. if you first call the function with an array of size n, the next call has to be with something smaller than n).
2. A base case: the condition under which the function returns without recursing (for example, if the array size is 0, then return).
In your code, both of these are missing: you keep calling the function with the same size of problem, so the recursion never terminates. That's the problem.
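For illustration only (the class, method, and field names below are hypothetical, not taken from your code), a minimal sketch of the required shape in Java:

import java.util.List;

class TreeSketch {
    static class Node {
        Node child; // simplified: a real tree node would hold labels and many children
    }

    static Node buildTree(List<String> splits) {
        // Base case: problem size zero, return without recursing.
        if (splits.isEmpty()) {
            return null;
        }
        Node node = new Node();
        // Recursive call on a strictly smaller problem (size n - 1),
        // so every call makes progress toward the base case.
        node.child = buildTree(splits.subList(1, splits.size()));
        return node;
    }
}

Your buildModelTree needs the same two ingredients: an early return when there is nothing left to split, and a recursive call whose input is smaller than what the current call received.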
Thanks

JNI: Catching Init-Time Exceptions

Okay, I'm all out of ideas on this one. Does anyone have any idea how I can hook into Java's exception pipeline in order to catch (and log to a text file) all exceptions that are occurring?
The situation is this: I have a library in a JAR file (A) which in turn depends on a second JAR file (B). A has no main class, as it's simply a class library, which I'm accessing and invoking through the JNI. The problem I'm having is this: when I attempt to initialise the JVM through JNI with A loaded, I get an unspecified error.
I strongly suspect that this error originates from an instantiation of Log4J's logger unit, which is occurring in static code (outside of a method) in B, which I believe is throwing an IOException as a result of permissions problems on the log file. I'm having issues finding out what's going on, however, as the exception (which I suspect is the cause of the problem) is being thrown during the linking stage (when A imports B) and so cannot be caught by a try-catch block. Also, since there is no main method there is no obvious place to put a try-catch block in order to catch this exception.
I would like some way of catching all exceptions that arise in either JAR and dumping them into a text file. I cannot (easily) modify B (I do not have the decompiled JAR). Any ideas?
Here is the C code which invokes the JNI with the specified libraries and options:
_DLL_EXPORT PyObject *initVM(PyObject *self, PyObject *args, PyObject *kwds)
{
    static char *kwnames[] = {
        "classpath", "initialheap", "maxheap", "maxstack",
        "vmargs", NULL
    };
    char *classpath = NULL;
    char *initialheap = NULL, *maxheap = NULL, *maxstack = NULL;
    char *vmargs = NULL;

    if (!PyArg_ParseTupleAndKeywords(args, kwds, "|zzzzz", kwnames,
                                     &classpath,
                                     &initialheap, &maxheap, &maxstack,
                                     &vmargs))
        return NULL;

    if (env->vm)
    {
        PyObject *module_cp = NULL;

        if (initialheap || maxheap || maxstack || vmargs)
        {
            PyErr_SetString(PyExc_ValueError,
                            "JVM is already running, options are ineffective");
            return NULL;
        }

        if (classpath == NULL && self != NULL)
        {
            module_cp = PyObject_GetAttrString(self, "CLASSPATH");
            if (module_cp != NULL)
                classpath = PyString_AsString(module_cp);
        }

        if (classpath && classpath[0])
            env->setClassPath(classpath);

        Py_XDECREF(module_cp);

        return getVMEnv(self);
    }
    else
    {
        JavaVMInitArgs vm_args;
        JavaVMOption vm_options[32];
        JNIEnv *vm_env;
        JavaVM *vm;
        unsigned int nOptions = 0;
        PyObject *module_cp = NULL;

        vm_args.version = JNI_VERSION_1_4;
        JNI_GetDefaultJavaVMInitArgs(&vm_args);

        if (classpath == NULL && self != NULL)
        {
            module_cp = PyObject_GetAttrString(self, "CLASSPATH");
            if (module_cp != NULL)
                classpath = PyString_AsString(module_cp);
        }

#ifdef _jcc_lib
        PyObject *jcc = PyImport_ImportModule("jcc");
        PyObject *cp = PyObject_GetAttrString(jcc, "CLASSPATH");

        if (classpath)
            add_paths("-Djava.class.path=", PyString_AsString(cp), classpath,
                      &vm_options[nOptions++]);
        else
            add_option("-Djava.class.path=", PyString_AsString(cp),
                       &vm_options[nOptions++]);

        Py_DECREF(cp);
        Py_DECREF(jcc);
#else
        if (classpath)
            add_option("-Djava.class.path=", classpath,
                       &vm_options[nOptions++]);
#endif

        Py_XDECREF(module_cp);

        if (initialheap)
            add_option("-Xms", initialheap, &vm_options[nOptions++]);
        if (maxheap)
            add_option("-Xmx", maxheap, &vm_options[nOptions++]);
        if (maxstack)
            add_option("-Xss", maxstack, &vm_options[nOptions++]);

        if (vmargs)
        {
#ifdef _MSC_VER
            char *buf = _strdup(vmargs);
#else
            char *buf = strdup(vmargs);
#endif
            char *sep = ",";
            char *option;

            for (option = strtok(buf, sep); option; option = strtok(NULL, sep))
            {
                if (nOptions < sizeof(vm_options) / sizeof(JavaVMOption))
                    add_option("", option, &vm_options[nOptions++]);
                else
                {
                    free(buf);
                    for (unsigned int i = 0; i < nOptions; i++)
                        delete vm_options[i].optionString;
                    PyErr_Format(PyExc_ValueError, "Too many options (> %d)",
                                 nOptions);
                    return NULL;
                }
            }
            free(buf);
        }

        //vm_options[nOptions++].optionString = "-verbose:gc";
        //vm_options[nOptions++].optionString = "-Xcheck:jni";

        vm_args.nOptions = nOptions;
        vm_args.ignoreUnrecognized = JNI_FALSE;
        vm_args.options = vm_options;

        if (JNI_CreateJavaVM(&vm, (void **) &vm_env, &vm_args) < 0)
        {
            for (unsigned int i = 0; i < nOptions; i++)
                delete vm_options[i].optionString;
            PyErr_Format(PyExc_ValueError,
                         "An error occurred while creating Java VM");
            return NULL;
        }

        env->set_vm(vm, vm_env);

        for (unsigned int i = 0; i < nOptions; i++)
            delete vm_options[i].optionString;

        t_jccenv *jccenv = (t_jccenv *) PY_TYPE(JCCEnv).tp_alloc(&PY_TYPE(JCCEnv), 0);
        jccenv->env = env;

#ifdef _jcc_lib
        registerNatives(vm_env);
#endif

        return (PyObject *) jccenv;
    }
}
Okay, so I've got the solution I was after: an update to the following segment of the code listed in the question.
if (JNI_CreateJavaVM(&vm, (void **) &vm_env, &vm_args) < 0)
{
    for (unsigned int i = 0; i < nOptions; i++)
        delete vm_options[i].optionString;
    PyErr_Format(PyExc_ValueError,
                 "An error occurred while creating Java VM");
    return NULL;
}
The adaptation builds a more detailed error message, which adds two specific pieces of information:
The error code (if any) returned by the JNI_CreateJavaVM call;
The detailed Java exception which occurs in the event that such an error code arises.
The above snippet from the original code was replaced with the following:
vmInitSuccess = JNI_CreateJavaVM(&vm, (void **) &vm_env, &vm_args);
if (vmInitSuccess < 0)
{
    for (unsigned int i = 0; i < nOptions; i++)
        delete vm_options[i].optionString;

    // Set up the basic error message
    sprintf(strVMInitSuccess, "%d", vmInitSuccess);
    strcpy(strVMError, "An error occurred while creating Java VM (No Exception): ");
    strcat(strVMError, strVMInitSuccess);

    // Get the exception if there is one
    if ((exc = vm_env->ExceptionOccurred()))
    {
        // Clear the exception since we have it now
        vm_env->ExceptionClear();

        // Get the getMessage() method
        if ((java_class = vm_env->FindClass("java/lang/Throwable")))
        {
            if ((method = vm_env->GetMethodID(java_class, "getMessage", "()Ljava/lang/String;")))
            {
                strExc = static_cast<jstring>(vm_env->CallObjectMethod(exc, method));
                charExc = vm_env->GetStringUTFChars(strExc, NULL);

                // Size the buffer from the string lengths (sizeof(charExc) would
                // only measure a pointer), plus one byte for the terminator.
                size_t size = strlen("An error occurred while creating Java VM (Exception): ")
                              + strlen(charExc) + 1;
                char strVMException[size];
                strcpy(strVMException, "An error occurred while creating Java VM (Exception): ");
                strcat(strVMException, charExc);
                vm_env->ReleaseStringUTFChars(strExc, charExc);

                // Pass the message through "%s" so a stray '%' in it
                // cannot be misread as a format specifier.
                PyErr_Format(PyExc_ValueError, "%s", strVMException);
                return NULL;
            }
        }
    }

    PyErr_Format(PyExc_ValueError, "%s", strVMError);
    return NULL;
}
Thanks to @Parsifal for help with this solution.

DDMS java profiler says: java/lang/StringBuilder.<init> - Where is it in my code?

I have the following Java method, written for an Android application.
private String removeWifiFudge(String message, String removedFudge)
{
    int find1 = 255; // FF
    int find2 = 0;   // 00
    int find3 = 204; // CC
    int find4 = 36;  // 24
    char[] charMessage = message.toCharArray();
    boolean find1True = false;

    for (char eachCharacter : charMessage)
    {
        if (find1True)
        {
            if ((int) eachCharacter == find2)
            {
                removedFudge = removedFudge + String.valueOf((char)find1);
            }
            else
            {
                if ((int) eachCharacter == find3)
                {
                    removedFudge = removedFudge + String.valueOf((char)find4);
                }
                else
                {
                    removedFudge = removedFudge + String.valueOf((char)find1);
                    removedFudge = removedFudge + String.valueOf(eachCharacter);
                }
            }
            find1True = false;
        }
        else
        {
            if ((int) eachCharacter == find1)
            {
                find1True = true;
            }
            else
            {
                removedFudge = removedFudge + String.valueOf(eachCharacter);
            }
        }
    }
    return removedFudge;
}
In a nutshell, it takes a string, message, and searches it character by character for instances of 0xFF00 and 0xFFCC. On finding these instances, it replaces them with 0xFF and 0x24 respectively, putting the result in a new string, removedFudge.
This method is taking up a massive percentage of the CPU time, and while using the Java profiler embedded in Eclipse (DDMS), it informed me that 53% of the method time is spent in:
java/lang/StringBuilder.<init> (Ljava/lang/String;)V
This seems to be saying it is taking up the time initialising a string; however, as I am passing it the already initialised string to put the new message in, I can't see where the String initialisation is coming from.
Anyone an expert on DDMS?
String concatenation with + is compiled into StringBuilder operations at runtime: every removedFudge + ... creates a new StringBuilder (via the StringBuilder(String) constructor you see in the profile) and then a new String. That is why StringBuilder.<init> appears in your profiler even though your code never mentions it.
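One common fix (a sketch, untested, keeping your method's signature and logic): append to a single StringBuilder and convert it to a String once at the end, instead of concatenating inside the loop:

private String removeWifiFudge(String message, String removedFudge)
{
    // One builder for the whole result avoids allocating a new
    // StringBuilder (and a new String) for every + in the loop.
    StringBuilder result = new StringBuilder(removedFudge);
    boolean find1True = false;
    for (char eachCharacter : message.toCharArray())
    {
        if (find1True)
        {
            if (eachCharacter == 0x00)
                result.append((char) 0xFF);           // 0xFF00 -> 0xFF
            else if (eachCharacter == 0xCC)
                result.append((char) 0x24);           // 0xFFCC -> 0x24
            else
                result.append((char) 0xFF).append(eachCharacter);
            find1True = false;
        }
        else if (eachCharacter == 0xFF)
        {
            find1True = true;
        }
        else
        {
            result.append(eachCharacter);
        }
    }
    return result.toString();
}

This turns the copy-on-every-concatenation pattern, which is quadratic in the message length, into a single linear pass.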
