How to load HashMap properly from a .yml file? - java

I am trying to load a HashMap from the config file using the standard Bukkit configuration files API.
HashMap:
public static HashMap<String, String> banned = new HashMap<String, String>();
This is the way I am trying to get the data:
public static boolean isBanned(String uuid) {
    if (Dogends.config.getConfigurationSection("Banned").getKeys(true).contains(uuid)) {
        return true;
    }
    return false;
}
If the player is banned, it works fine, but when the player is not banned it throws a NullPointerException.
NullPointerException:
Could not pass event PlayerLoginEvent to Dogends v1.0
org.bukkit.event.EventException
at org.bukkit.plugin.java.JavaPluginLoader$1.execute(JavaPluginLoader.java:302) ~[cb.jar:git-Bukkit-880a532]
at org.bukkit.plugin.RegisteredListener.callEvent(RegisteredListener.java:62) ~[cb.jar:git-Bukkit-880a532]
at org.bukkit.plugin.SimplePluginManager.fireEvent(SimplePluginManager.java:501) [cb.jar:git-Bukkit-880a532]
at org.bukkit.plugin.SimplePluginManager.callEvent(SimplePluginManager.java:486) [cb.jar:git-Bukkit-880a532]
at net.minecraft.server.v1_8_R3.PlayerList.attemptLogin(PlayerList.java:439) [cb.jar:git-Bukkit-880a532]
at net.minecraft.server.v1_8_R3.LoginListener.b(LoginListener.java:89) [cb.jar:git-Bukkit-880a532]
at net.minecraft.server.v1_8_R3.LoginListener.c(LoginListener.java:53) [cb.jar:git-Bukkit-880a532]
at net.minecraft.server.v1_8_R3.NetworkManager.a(NetworkManager.java:222) [cb.jar:git-Bukkit-880a532]
at net.minecraft.server.v1_8_R3.ServerConnection.c(SourceFile:168) [cb.jar:git-Bukkit-880a532]
at net.minecraft.server.v1_8_R3.MinecraftServer.B(MinecraftServer.java:744) [cb.jar:git-Bukkit-880a532]
at net.minecraft.server.v1_8_R3.DedicatedServer.B(DedicatedServer.java:335) [cb.jar:git-Bukkit-880a532]
at net.minecraft.server.v1_8_R3.MinecraftServer.A(MinecraftServer.java:628) [cb.jar:git-Bukkit-880a532]
at net.minecraft.server.v1_8_R3.MinecraftServer.run(MinecraftServer.java:536) [cb.jar:git-Bukkit-880a532]
at java.lang.Thread.run(Unknown Source) [?:1.8.0_91]
Caused by: java.lang.NullPointerException
at me.woulfiee.server.ban.BanCommand.isBanned(BanCommand.java:47) ~[?:?]
at me.woulfiee.server.ban.BanCommand.onPlayerLogin(BanCommand.java:103) ~[?:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_91]
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:1.8.0_91]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:1.8.0_91]
at java.lang.reflect.Method.invoke(Unknown Source) ~[?:1.8.0_91]
at org.bukkit.plugin.java.JavaPluginLoader$1.execute(JavaPluginLoader.java:300) ~[cb.jar:git-Bukkit-880a532]
... 13 more
config.yml:
Ranks:
  Player:
    Players: []
  Mythic:
    Players: []
  Doge:
    Players: []
  Youtuber:
    Players: []
  Builder:
    Players: []
  Mod:
    Players: []
  Admin:
    Players: []
  Owner:
    Players:
    - d166739c-32d3-4b37-a1be-883be57d736c
Broadcast:
  Interval: 120
Banned:
  d166739c-32d3-4b37-a1be-883be57d736c: "CONSOLE \xa7eHELP"

To accomplish what you wish, you should try the following.
Make sure your config is not null and that the section actually exists:
boolean isBanned(String uuid) {
    FileConfiguration yourConfig = yourPlugin.getConfig(); // however you obtain your config
    // Get the Banned section; it is null when nobody has been banned yet
    ConfigurationSection banned = yourConfig.getConfigurationSection("Banned");
    if (banned == null) {
        return false; // no Banned section means nobody is banned
    }
    // All the keys inside the Banned configuration section
    Set<String> keys = banned.getKeys(false); // we don't want it to be deep
    // The UUID is among the keys exactly when the player is banned
    return keys.contains(uuid);
}
I don't believe you actually need the HashMap, unless you're using it for something else.

getConfigurationSection:
If the ConfigurationSection does not exist but a default value has
been specified, this will return the default value. If the
ConfigurationSection does not exist and no default value was
specified, this will return null.
I'm guessing that if there are no banned users, there is no Banned section, so getConfigurationSection returns null, which is why your getKeys() call throws an NPE.
So you should first check if the configuration section exists, and only then try to use it.
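If you do want to keep the banned HashMap from the question in sync with the file, a minimal sketch could look like the following; it assumes the Banned section maps each UUID to its ban reason (as in the config.yml above), and plugin is a placeholder for your JavaPlugin instance:

public static void loadBanned(JavaPlugin plugin) {
    banned.clear();
    ConfigurationSection section = plugin.getConfig().getConfigurationSection("Banned");
    if (section == null) {
        return; // nobody banned yet, nothing to load
    }
    // Copy every "uuid: reason" pair from the section into the HashMap
    for (String uuid : section.getKeys(false)) {
        banned.put(uuid, section.getString(uuid));
    }
}

public static boolean isBanned(String uuid) {
    return banned.containsKey(uuid);
}

Call loadBanned once in onEnable (and again after you modify and save the section) so the in-memory map and the file stay consistent.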

Related

Java Spark GroupByFailure

I'm attempting to use the Java Spark libraries with a cluster running Spark 2.3.0 over Hadoop 3.1.0 (and using those versions of the Java libraries).
I've run into a problem where I simply cannot use groupByKey, and I am at a loss to explain why. Any use of groupByKey, for any reason and in any circumstance, throws a java.lang.IllegalArgumentException.
I've boiled this down to about the simplest test I can think of:
package com.failuretest;

import java.util.ArrayList;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

import scala.Tuple2;

public class TestReport {

    public static void main(String[] args) throws Exception {
        SparkConf conf = new SparkConf().setAppName("TestReport").set("spark.executor.memory", "20G");
        JavaSparkContext sc = new JavaSparkContext(conf);
        JavaRDD<String> test = sc.parallelize(generateTestData());
        test.saveAsTextFile("/TEST/testfile1");
        test.mapToPair(line -> {
            String[] testParts = line.split(" ");
            return new Tuple2<String, String>(testParts[0], testParts[1]);
        }).groupByKey().saveAsTextFile("/TEST/testfile2");
        sc.close();
    }

    private static List<String> generateTestData() {
        List<String> testList = new ArrayList<String>();
        int keyCount = 0;
        int valCount = 0;
        while (valCount++ < 2000000) {
            if (valCount % 10 == 0) {
                keyCount++;
            }
            testList.add("Key" + keyCount + " " + "Val" + valCount);
        }
        return testList;
    }
}
I'm just programmatically creating an RDD that produces 10 values per key, then creating my JavaPairRDD with a simple split, then attempting groupByKey.
When it runs, I receive the following stack:
Exception in thread "main" java.lang.IllegalArgumentException
at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:46)
at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:449)
at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:432)
at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)
at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:103)
at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:432)
at org.apache.xbean.asm5.ClassReader.a(Unknown Source)
at org.apache.xbean.asm5.ClassReader.b(Unknown Source)
at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:262)
at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:261)
at scala.collection.immutable.List.foreach(List.scala:381)
at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:261)
at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:159)
at org.apache.spark.SparkContext.clean(SparkContext.scala:2292)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$combineByKeyWithClassTag$1.apply(PairRDDFunctions.scala:88)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$combineByKeyWithClassTag$1.apply(PairRDDFunctions.scala:77)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
at org.apache.spark.rdd.PairRDDFunctions.combineByKeyWithClassTag(PairRDDFunctions.scala:77)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$groupByKey$1.apply(PairRDDFunctions.scala:505)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$groupByKey$1.apply(PairRDDFunctions.scala:498)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
at org.apache.spark.rdd.PairRDDFunctions.groupByKey(PairRDDFunctions.scala:498)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$groupByKey$3.apply(PairRDDFunctions.scala:641)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$groupByKey$3.apply(PairRDDFunctions.scala:641)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
at org.apache.spark.rdd.PairRDDFunctions.groupByKey(PairRDDFunctions.scala:640)
at org.apache.spark.api.java.JavaPairRDD.groupByKey(JavaPairRDD.scala:559)
at com.failuretest.TestReport.main(TestReport.java:22)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:879)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
It doesn't get any further than the groupByKey (I'm writing a file above with the results, but it really doesn't matter since it never gets there).
I can run it all day long in my local dev instance, but running spark-submit with a jar containing the above fails every time in the cluster.
I'm really not sure where to go from here - what I am trying to do is a bit of a challenge if I cannot group by key.
Am I messing up? Is this a version conflict somewhere?
Dave
I actually figured this out before posting this, but in the interests of helping others...
I discovered that one of my colleagues had decided to have a play around with Java 10 on this particular cluster. Moved it back to Java 8 (sorry - didn't try 9) and the problem went away.
Dave
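For anyone hitting the same trace: Spark 2.3's closure cleaner uses ASM 5 (the org.apache.xbean.asm5.ClassReader frames above), which cannot read class files newer than Java 8, so a cluster running Java 10 plausibly explains the bare IllegalArgumentException. A quick, hedged way to confirm what the driver is actually running on is to log the standard JVM system properties before doing any Spark work:

// Sketch: log which JVM the driver is running on before creating the SparkContext.
System.out.println("java.version = " + System.getProperty("java.version"));
System.out.println("java.home    = " + System.getProperty("java.home"));
System.out.println("java.vm.name = " + System.getProperty("java.vm.name"));

If this prints a 9/10+ version on the cluster while the local dev instance reports 1.8, that matches the behaviour described above.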

Octave not able to evaluate script due to failure in loading scalar constant

I have a project where I need to make several computations using Octave. I communicate with Octave using the javaoctave bridge. Below is a simplified example to show what I am doing.
// One of the fields of my class that evaluates equations. This class is provided by the bridge.
private OctaveEngine scriptEngine;

// The script contains the equation, e.g.: Ampere = PowerConsumption(Lights, Dryer, Dishwasher, Laptop, Smartphone);
private String script;

public synchronized void evaluate() {
    // Variable is a class implementing Cloneable and harbouring a value as well as its unit and a few utility functions.
    LinkedHashSet<Variable> dependentVariables;
    String evaluationErrors;
    Preconditions.checkState(scriptEngine != null);
    try {
        engineErrors.reset();
        scriptEngine.setErrorWriter(engineErrors);
        initParameters();
        scriptEngine.eval(this.script);
    } catch (Exception exception) {
        LOGGER.log(Level.SEVERE, "Error while evaluating script " + this.script, exception); //$NON-NLS-1$
    }
    // The mapping stores the variables and the interface objects changing or reading their values.
    dependentVariables = mapping.getDependentVariables();
    for (Variable variable : dependentVariables) {
        try {
            OctaveDouble result = scriptEngine.get(OctaveDouble.class, variable.getName());
            variable.setValue(result.get(1));
        } catch (Exception exception) {
            LOGGER.log(Level.SEVERE, "Error while retrieving variable " + variable.getName(), exception); //$NON-NLS-1$
        }
    }
    evaluationErrors = engineErrors.toString();
    if (evaluationErrors.length() > 0) {
        LOGGER.log(Level.WARNING, "Error while evaluating equation :" + evaluationErrors); //$NON-NLS-1$
    }
}

private void initParameters() {
    for (Variable variable : lockedMapping.getIndependentVariables()) {
        Double numericValue = Double.valueOf(variable.getValue().toString());
        if (isNegative(numericValue)) {
            numericValue *= -1d;
        }
        scriptEngine.put(variable.getName(), Octave.scalar(numericValue));
    }
}

private static boolean isNegative(double _double) {
    return Double.doubleToRawLongBits(_double) < 0;
}
The error message I was getting was:
WARNING: Error while evaluating equation :error: load: failed to load scalar constant
error: load: trouble reading ascii file '-'
error: load: reading file -
Googling a bit, I found that negative values (or large inputs) can cause this problem. However, that issue was supposedly fixed. I don't have large inputs, and I tried to make sure my values are positive (which they all were; I verified this by printing them out).
I did a more in-depth search for the root cause and stumbled upon it. It seems that in the OctaveExec class, at line 157 in the getFromFuture method where it calls future.get(), an ExecutionException is thrown. Below is the immediate stack trace:
java.util.concurrent.ExecutionException: dk.ange.octave.exception.OctaveIOException: IOException during close
at java.util.concurrent.FutureTask.report(Unknown Source)
at java.util.concurrent.FutureTask.get(Unknown Source)
at dk.ange.octave.exec.OctaveExec.getFromFuture(OctaveExec.java:157)
at dk.ange.octave.exec.OctaveExec.eval(OctaveExec.java:142)
at dk.ange.octave.io.OctaveIO.set(OctaveIO.java:56)
at dk.ange.octave.OctaveEngine.put(OctaveEngine.java:141)
at [redacted].OctaveEquation.initParameters(OctaveEquation.java:109)
at [redacted].OctaveEquation.evaluate(OctaveEquation.java:72)
at [redacted].LinearEquationSystem.inputChanged(LinearEquationSystem.java:105)
at [redacted].variable.NumericalVariable.setValue(NumericalVariable.java:127)
at [redacted].widget.ValueWidget.actionDrop(ValueWidget.java:88)
at [redacted].TangibleObjectManager.dropObject(TangibleObjectManager.java:207)
at [redacted].adapter.TuioAdapter.addTangibleObject(TuioAdapter.java:318)
at [redacted].adapter.TuioAdapter.addTuioObject(TuioAdapter.java:295)
at TUIO.TuioClient.acceptMessage(TuioClient.java:339)
at com.illposed.osc.utility.OSCPacketDispatcher.dispatchMessage(OSCPacketDispatcher.java:73)
at com.illposed.osc.utility.OSCPacketDispatcher.dispatchPacket(OSCPacketDispatcher.java:49)
at com.illposed.osc.utility.OSCPacketDispatcher.dispatchBundle(OSCPacketDispatcher.java:56)
at com.illposed.osc.utility.OSCPacketDispatcher.dispatchPacket(OSCPacketDispatcher.java:40)
at com.illposed.osc.OSCPortIn.run(OSCPortIn.java:65)
at java.lang.Thread.run(Unknown Source)
Caused by: dk.ange.octave.exception.OctaveIOException: IOException during close
at dk.ange.octave.exec.OctaveReaderCallable.call(OctaveReaderCallable.java:65)
at dk.ange.octave.exec.OctaveReaderCallable.call(OctaveReaderCallable.java:1)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
... 1 more
Caused by: java.io.IOException: Pipe to octave-process broken
at dk.ange.octave.exec.OctaveExecuteReader.read(OctaveExecuteReader.java:68)
at java.io.Reader.read(Unknown Source)
at dk.ange.octave.exec.OctaveExecuteReader.close(OctaveExecuteReader.java:96)
at dk.ange.octave.exec.OctaveReaderCallable.call(OctaveReaderCallable.java:61)
... 5 more
java.util.concurrent.ExecutionException: dk.ange.octave.exception.OctaveIOException: IOException during close
at java.util.concurrent.FutureTask.report(Unknown Source)
at java.util.concurrent.FutureTask.get(Unknown Source)
at dk.ange.octave.exec.OctaveExec.getFromFuture(OctaveExec.java:157)
at dk.ange.octave.exec.OctaveExec.eval(OctaveExec.java:142)
at dk.ange.octave.io.OctaveIO.set(OctaveIO.java:56)
at dk.ange.octave.OctaveEngine.put(OctaveEngine.java:141)
at [redacted].OctaveEquation.initParameters(OctaveEquation.java:109)
at [redacted].OctaveEquation.evaluate(OctaveEquation.java:72)
at [redacted].LinearEquationSystem.inputChanged(LinearEquationSystem.java:105)
at [redacted].variable.NumericalVariable.setValue(NumericalVariable.java:127)
at [redacted].widget.ValueWidget.actionDrop(ValueWidget.java:88)
at [redacted].TangibleObjectManager.dropObject(TangibleObjectManager.java:207)
at [redacted].adapter.TuioAdapter.addTangibleObject(TuioAdapter.java:318)
at [redacted].adapter.TuioAdapter.addTuioObject(TuioAdapter.java:295)
at TUIO.TuioClient.acceptMessage(TuioClient.java:339)
at com.illposed.osc.utility.OSCPacketDispatcher.dispatchMessage(OSCPacketDispatcher.java:73)
at com.illposed.osc.utility.OSCPacketDispatcher.dispatchPacket(OSCPacketDispatcher.java:49)
at com.illposed.osc.utility.OSCPacketDispatcher.dispatchBundle(OSCPacketDispatcher.java:56)
at com.illposed.osc.utility.OSCPacketDispatcher.dispatchPacket(OSCPacketDispatcher.java:40)
at com.illposed.osc.OSCPortIn.run(OSCPortIn.java:65)
at java.lang.Thread.run(Unknown Source)
Caused by: dk.ange.octave.exception.OctaveIOException: IOException during close
at dk.ange.octave.exec.OctaveReaderCallable.call(OctaveReaderCallable.java:65)
at dk.ange.octave.exec.OctaveReaderCallable.call(OctaveReaderCallable.java:1)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
... 1 more
Caused by: java.io.IOException: Pipe to octave-process broken
at dk.ange.octave.exec.OctaveExecuteReader.read(OctaveExecuteReader.java:68)
at java.io.Reader.read(Unknown Source)
at dk.ange.octave.exec.OctaveExecuteReader.close(OctaveExecuteReader.java:96)
at dk.ange.octave.exec.OctaveReaderCallable.call(OctaveReaderCallable.java:61)
... 5 more
I am working on a Windows 7 machine and have the path to the Octave bin directory on my classpath, and I am not sure what I am doing wrong. I assume something goes wrong during IO, but I can't put my finger on it.
To make sure all my inputs are known, below are my *.m files holding both scripts. All my inputs are positive and range from 0 to 2200 (as doubles).
function Load = CircuitLoad(Breaker, Lights, Dryer, Dishwasher, Laptop, Smartphone)
    Load = Breaker / ((Lights + Dryer + Dishwasher + Laptop + Smartphone) / 230);
And the second one:
function Ampere = PowerConsumption(Lights, Dryer, Dishwasher, Laptop, Smartphone)
    Ampere = (Lights + Dryer + Dishwasher + Laptop + Smartphone) / 230;
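For what it's worth, a minimal round-trip through javaoctave (assuming the standard dk.ange.octave API used in the code above) can help isolate whether the Octave process itself starts and communicates correctly on the machine, independent of the rest of the application:

import dk.ange.octave.OctaveEngine;
import dk.ange.octave.OctaveEngineFactory;
import dk.ange.octave.type.Octave;
import dk.ange.octave.type.OctaveDouble;

public class OctaveSmokeTest {
    public static void main(String[] args) {
        OctaveEngine octave = new OctaveEngineFactory().getScriptEngine();
        try {
            // Push a scalar, run a trivial script, and read the result back.
            octave.put("x", Octave.scalar(42.0));
            octave.eval("y = x / 230;");
            OctaveDouble y = octave.get(OctaveDouble.class, "y");
            System.out.println("y = " + y.get(1));
        } finally {
            octave.close();
        }
    }
}

If even this small test breaks the pipe, the problem is with the Octave installation or how the bridge launches it, not with the equations themselves.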

How to find where the java ClassCastException is happening

Just wondering if anyone can see why I am getting the exception java.lang.ClassCastException from the code below.
RISService and RisPortType are classes that I generated from a WSDL file using wsimport.
I know what the exception means, but I am just not sure how to track it down.
// Instantiate the wsimport generated SXML API Service client
RISService risportService = new RISService();
RisPortType risportPort = risportService.getRisPort();

// Set the URL, user, and password on the JAX-WS client
String hostUrl = "https://10.1.1.1:8443/realtimeservice2/services/RISService";
((BindingProvider) risportPort).getRequestContext().put(BindingProvider.ENDPOINT_ADDRESS_PROPERTY, hostUrl);
((BindingProvider) risportPort).getRequestContext().put(BindingProvider.USERNAME_PROPERTY, cucmDetails.getAxlUsername());
((BindingProvider) risportPort).getRequestContext().put(BindingProvider.PASSWORD_PROPERTY, cucmDetails.getAxlPassword());

// Create and populate the selectCmDevice request
SelectCmDevice sxmlParams = new SelectCmDevice();
CmSelectionCriteria criteria = new CmSelectionCriteria();
long maxNum = 200;
long modelNum = 255;
ArrayOfSelectItem items = new ArrayOfSelectItem();

// Create a select item criterion to retrieve devices with names matching "SEP123412341234"
SelectItem item = new SelectItem();
item.setItem("SEP123412341234");
items.getItem().add(item);

// Search on all nodes
criteria.setNodeName("Any");
// Get back max 200 phones (9+ can get up to 1000)
criteria.setMaxReturnedDevices(maxNum);
// Get back phones only
criteria.setDeviceClass("Phone");
// 255 means get back ALL phone models
criteria.setModel(modelNum);
// Get back only Registered phones
criteria.setStatus("Registered");
// Return results in order of name
criteria.setSelectBy("Name");
// Array of phones to get results back for
criteria.setSelectItems(items);
sxmlParams.setCmSelectionCriteria(criteria);

// Make the selectCmDevice request -- this is where I get the exception outlined below
SelectCmDeviceReturn selectResponse = risportPort.selectCmDevice("", criteria);
Exception in thread "AWT-EventQueue-0" javax.xml.ws.WebServiceException:
java.lang.ClassCastException: [C cannot be cast to java.lang.String
at com.sun.xml.internal.ws.transport.http.client.HttpTransportPipe.process(Unknown Source)
at com.sun.xml.internal.ws.transport.http.client.HttpTransportPipe.processRequest(Unknown Source)
at com.sun.xml.internal.ws.transport.DeferredTransportPipe.processRequest(Unknown Source)
at com.sun.xml.internal.ws.api.pipe.Fiber.__doRun(Unknown Source)
at com.sun.xml.internal.ws.api.pipe.Fiber._doRun(Unknown Source)
at com.sun.xml.internal.ws.api.pipe.Fiber.doRun(Unknown Source)
at com.sun.xml.internal.ws.api.pipe.Fiber.runSync(Unknown Source)
at com.sun.xml.internal.ws.client.Stub.process(Unknown Source)
at com.sun.xml.internal.ws.client.sei.SEIStub.doProcess(Unknown Source)
at com.sun.xml.internal.ws.client.sei.SyncMethodHandler.invoke(Unknown Source)
at com.sun.xml.internal.ws.client.sei.SyncMethodHandler.invoke(Unknown Source)
at com.sun.xml.internal.ws.client.sei.SEIStub.invoke(Unknown Source)
at com.sun.proxy.$Proxy40.selectCmDevice(Unknown Source)
at utils._9.APIRIS9.getPhoneIPadd(APIRIS9.java:66)
Thanks
Alexis
I bet your password is being returned as a char[] and jaxws is expecting a String.
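If that is the case, converting it before putting it into the request context should be enough. A minimal sketch (getAxlPassword() returning a char[] is an assumption here):

// Hypothetical fix, assuming getAxlPassword() returns a char[]:
// BindingProvider.PASSWORD_PROPERTY must be given a String, not a char[].
char[] rawPassword = cucmDetails.getAxlPassword();
((BindingProvider) risportPort).getRequestContext()
        .put(BindingProvider.PASSWORD_PROPERTY, new String(rawPassword));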
In my case:
Object port = service.getPort(qname, c);
WSBindingProvider bp = (WSBindingProvider) port;

// Manually set connection timeouts as we seem to hit them during IT testing
Map<String, Object> requestContext = bp.getRequestContext();
requestContext.put(BindingProviderProperties.REQUEST_TIMEOUT, env.getProperty("timeout"));
requestContext.put(BindingProviderProperties.CONNECT_TIMEOUT, env.getProperty("timeout"));
As you can see, requestContext.put() takes a String and an Object. You would think putting a String timeout value would work, but no, JAX-WS expects an int. This is a massive gotcha.
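A hedged sketch of the fix in that second case, assuming env.getProperty returns the timeout as a String of milliseconds:

// Parse the String into an int before handing it to JAX-WS;
// REQUEST_TIMEOUT / CONNECT_TIMEOUT expect an integer number of milliseconds.
int timeoutMillis = Integer.parseInt(env.getProperty("timeout"));
requestContext.put(BindingProviderProperties.REQUEST_TIMEOUT, timeoutMillis);
requestContext.put(BindingProviderProperties.CONNECT_TIMEOUT, timeoutMillis);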

Saving List of Objects Throws Hibernate Exception

When I save a List of objects by calling saveListOfPageChooserElement, it throws the exception below.
When I save a single instance by calling saveOrUpdate, it works fine.
But to improve performance I want to save the list as a batch rather than one object at a time.
Can anyone suggest what the problem is with saving the whole list at once?
List<Abc> listabc = widgetCopyDAO.fetchabcByPageId(id);
for (Abc abc : listabc) {
    abc.setLastUpdatedBy(null);
    abc.setLastUpdatedOn(null);
    abc.setCreatedBy(widgetCopyDTO.getUserName());
    abc.setCreatedOn(new Date());
    abc.setPageChooser(new PageChooser(chooser.getId()));
    abc.setId(0l);
    issuePageWidgetDAO.saveOrUpdate(abc);
}
// widgetCopyDAO.saveListOfPageChooserElement(listabc);

public void saveOrUpdate(Abc abc) {
    if (abc.getId() == 0) {
        Long id = (Long) this.getHibernateTemplate().save(abc);
        abc.setId(id);
    } else {
        this.getHibernateTemplate().update(abc);
    }
}

public void saveListOfPageChooserElement(List<Abc> listabc) {
    this.getHibernateTemplate().saveOrUpdateAll(listabc);
}
The exception is
org.springframework.orm.hibernate3.HibernateSystemException: identifier of an instance of com.mct.model.Abc was altered from 138 to 0; nested exception is org.hibernate.HibernateException: identifier of an instance of com.mct.model.Abc was altered from 138 to 0
at org.springframework.orm.hibernate3.SessionFactoryUtils.convertHibernateAccessException(SessionFactoryUtils.java:676)
at org.springframework.orm.hibernate3.HibernateAccessor.convertHibernateAccessException(HibernateAccessor.java:412)
at org.springframework.orm.hibernate3.HibernateTemplate.doExecute(HibernateTemplate.java:424)
at org.springframework.orm.hibernate3.HibernateTemplate.executeWithNativeSession(HibernateTemplate.java:374)
at org.springframework.orm.hibernate3.HibernateTemplate.findByCriteria(HibernateTemplate.java:1055)
at org.springframework.orm.hibernate3.HibernateTemplate.findByCriteria(HibernateTemplate.java:1048)
at com.mct.dao.WidgetCopyDAO.fetchPageChooserWithImagesByChooser(WidgetCopyDAO.java:82)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:307)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:182)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:149)
at org.springframework.aop.framework.adapter.ThrowsAdviceInterceptor.invoke(ThrowsAdviceInterceptor.java:126)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
at org.springframework.aop.framework.adapter.AfterReturningAdviceInterceptor.invoke(AfterReturningAdviceInterceptor.java:50)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
at org.springframework.aop.framework.adapter.MethodBeforeAdviceInterceptor.invoke(MethodBeforeAdviceInterceptor.java:50)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:106)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204)
at $Proxy58.fetchPageChooserWithImagesByChooser(Unknown Source)
at com.mct.service.widgethelper.ChooserWidget.copyWidget(ChooserWidget.java:676)
at com.mct.service.widgethelper.ChooserWidget.copyAllWidgets(ChooserWidget.java:634)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
You set the ids of all objects in the list:
abc.setId(0l);
And that's what causes the error.
You cannot change an auto-generated id on your own.
Remove this line.
In Hibernate you can't set an auto-generated id manually like below.
abc.setId(0l);
Remove the line above and try again.
The problem appears to be this line:
abc.setId(0l);
You are clearing the ids of the entities you've loaded from the database.
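If the intent is to insert copies of the loaded rows (which resetting the id suggests), one approach is to build new, detached Abc instances instead of mutating the entities the session already manages, and then save that new list in one call. A rough sketch, assuming Abc has the setters shown above plus a way to copy its remaining fields:

List<Abc> copies = new ArrayList<Abc>();
for (Abc original : widgetCopyDAO.fetchabcByPageId(id)) {
    Abc copy = new Abc(); // new transient instance, id left unset
    // ... copy the other fields of 'original' into 'copy' here ...
    copy.setLastUpdatedBy(null);
    copy.setLastUpdatedOn(null);
    copy.setCreatedBy(widgetCopyDTO.getUserName());
    copy.setCreatedOn(new Date());
    copy.setPageChooser(new PageChooser(chooser.getId()));
    copies.add(copy);
}
// Save the whole batch at once; the managed originals keep their identifiers.
widgetCopyDAO.saveListOfPageChooserElement(copies);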

Java Stanford NLP: ArrayIndexOutOfBounds after loading second lexicon

I am using the Stanford Natural Language processing toolkit. I've been trying to find spelling errors with Lexicon's isKnown method, but it produces quite a few false positives. So I thought I'd load a second lexicon, and check that too. However, that causes a problem.
private static LexicalizedParser lp = new LexicalizedParser(Constants.stdLexFile);
private static LexicalizedParser wsjLexParse = new LexicalizedParser(Constants.wsjLexFile);

static {
    lp.setOptionFlags(Constants.lexOptionFlags);
    wsjLexParse.setOptionFlags(Constants.lexOptionFlags);
}

public ParseTree(String input) throws IllegalArgumentException, IllegalAccessException, InvocationTargetException {
    initialInput = input;
    DocumentPreprocessor process = new DocumentPreprocessor();
    sentences = process.getSentencesFromText(new StringReader(input));
    for (List<? extends HasWord> sent : sentences) {
        if (lp.parse(sent)) { // line 65
            forest.add(lp.getBestParse()); // non-determinism?
        }
    }
    partsOfSpeech = pos();
    runAnalysis();
}
The following stack trace is produced:
java.lang.ArrayIndexOutOfBoundsException: 45547
at edu.stanford.nlp.parser.lexparser.BaseLexicon.initRulesWithWord(BaseLexicon.java:300)
at edu.stanford.nlp.parser.lexparser.BaseLexicon.isKnown(BaseLexicon.java:160)
at edu.stanford.nlp.parser.lexparser.BaseLexicon.ruleIteratorByWord(BaseLexicon.java:212)
at edu.stanford.nlp.parser.lexparser.ExhaustivePCFGParser.initializeChart(ExhaustivePCFGParser.java:1299)
at edu.stanford.nlp.parser.lexparser.ExhaustivePCFGParser.parse(ExhaustivePCFGParser.java:388)
at edu.stanford.nlp.parser.lexparser.LexicalizedParser.parse(LexicalizedParser.java:234)
at nth.compling.ParseTree.<init>(ParseTree.java:65)
at nth.compling.ParseTreeTest.constructor(ParseTreeTest.java:33)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.junit.internal.runners.BeforeAndAfterRunner.invokeMethod(BeforeAndAfterRunner.java:74)
at org.junit.internal.runners.BeforeAndAfterRunner.runBefores(BeforeAndAfterRunner.java:50)
at org.junit.internal.runners.BeforeAndAfterRunner.runProtected(BeforeAndAfterRunner.java:33)
at org.junit.internal.runners.TestClassRunner.run(TestClassRunner.java:52)
at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:45)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
If I comment out this line (and the other references to wsjLexParse):
private static LexicalizedParser wsjLexParse = new LexicalizedParser(Constants.wsjLexFile);
then everything works fine. What am I doing wrong here?
Looks like a bug in the Stanford library. You should report it to them.
Does the second lexicon work when you load only it (and not the other one)?
Does the same error occur when you load the two lexica in different order?
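A small sketch of that isolation test, reusing only the constructor and flags already shown in the question (purely to narrow down whether the failure follows one particular lexicon file or only the combination):

// Load only the WSJ lexicon and parse the same input that fails above.
LexicalizedParser onlyWsj = new LexicalizedParser(Constants.wsjLexFile);
onlyWsj.setOptionFlags(Constants.lexOptionFlags);
for (List<? extends HasWord> sent : sentences) {
    if (onlyWsj.parse(sent)) {
        System.out.println(onlyWsj.getBestParse());
    }
}
// Then repeat with Constants.stdLexFile alone, and with the two parsers
// constructed in the opposite order, to see which combination triggers the
// ArrayIndexOutOfBoundsException.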
