I am trying to predict test times in a Kaggle competition using H2OGeneralizedLinearEstimator. The model trains normally in line 3 and the metrics are all reasonable. However, when I get to the predict step I receive an error, despite the test data frame matching the train data frame.
Has anyone seen this error before?
h2o_glm = H2OGeneralizedLinearEstimator()
h2o_glm.train(training_frame=train_h2o,y='y')
h2o_glm_predictions = h2o_glm.predict(test_data=test_h2o).as_data_frame()
test_pred = pd.read_csv('test.csv')[['ID']]
test_pred['y'] = h2o_glm_predictions
test_pred.to_csv('h2o_glm_predictions.csv',index=False)
glm Model Build progress: |███████████████████████████████████████████████| 100%
glm prediction progress: | (failed)
OSError                                   Traceback (most recent call last)
<ipython-input-...> in <module>()
      3 h2o_glm.train(training_frame=train_h2o,y='y')
      4
----> 5 h2o_glm_predictions = h2o_glm.predict(test_data=test_h2o).as_data_frame()
      6
      7 test_pred = pd.read_csv('test.csv')[['ID']]

/Applications/anaconda/lib/python3.6/site-packages/h2o/model/model_base.py in predict(self, test_data)
    130         j = H2OJob(h2o.api("POST /4/Predictions/models/%s/frames/%s" % (self.model_id, test_data.frame_id)),
    131                    self._model_json["algo"] + " prediction")
--> 132         j.poll()
    133         return h2o.get_frame(j.dest_key)
    134

/Applications/anaconda/lib/python3.6/site-packages/h2o/job.py in poll(self)
     71         if (isinstance(self.job, dict)) and ("stacktrace" in list(self.job)):
     72             raise EnvironmentError("Job with key {} failed with an exception: {}\nstacktrace: "
---> 73                                    "\n{}".format(self.job_key, self.exception, self.job["stacktrace"]))
     74         else:
     75             raise EnvironmentError("Job with key %s failed with an exception: %s" % (self.job_key, self.exception))

OSError: Job with key $03017f00000132d4ffffffff$_868312f4c32f683871930a1145c1476a failed with an exception: DistributedException from /127.0.0.1:54321: 'null', caused by java.lang.ArrayIndexOutOfBoundsException
stacktrace:
DistributedException from /127.0.0.1:54321: 'null', caused by java.lang.ArrayIndexOutOfBoundsException
    at water.MRTask.getResult(MRTask.java:478)
    at water.MRTask.getResult(MRTask.java:486)
    at water.MRTask.doAll(MRTask.java:390)
    at water.MRTask.doAll(MRTask.java:396)
    at hex.glm.GLMModel.predictScoreImpl(GLMModel.java:1215)
    at hex.Model.score(Model.java:1077)
    at water.api.ModelMetricsHandler$1.compute2(ModelMetricsHandler.java:351)
    at water.H2O$H2OCountedCompleter.compute(H2O.java:1349)
    at jsr166y.CountedCompleter.exec(CountedCompleter.java:468)
    at jsr166y.ForkJoinTask.doExec(ForkJoinTask.java:263)
    at jsr166y.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:974)
    at jsr166y.ForkJoinPool.runWorker(ForkJoinPool.java:1477)
    at jsr166y.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:104)
Caused by: java.lang.ArrayIndexOutOfBoundsException
To summarize the comments above, the current workaround is to add a response column (filled with fake data if it doesn't exist) to the test_data frame. However, this is a bug that should be fixed. The JIRA is here.
Note to community: Please do not close this as a duplicate, because the particular issue I am researching has manifested as a null pointer exception. As you can see from the stack trace, the NPE is buried 4 layers deep in the Tika library. That means that, of all the great advice given in the existing StackExchange post on NPEs, none of the Tika developers saw fit to apply that advice (checking for null pointers) in four modules. Rather than learn Tika and retrofit their code with a patch to do that work, I think it would be more efficient to ask whether anyone has achieved the common use case of using the SourceCodeParser.
I am looking for help with a published example for the Tika library here. I did not author the example code. I have seen many similar questions relating to the Tika library, which has 20 contributors and thousands of lines of code. Please do not close this question, as I believe it can be quickly and easily answered by anyone who has used this Parser before. I have already read the post on NullPointerException, and am following this advice from that question:
I still can't find the problem
If you tried to debug the problem and still don't have a solution, you
can post a question for more help, but make sure to include what
you've tried so far. At a minimum, include the stacktrace in the
question, and mark the important line numbers in the code.
As I spent much time authoring this post and retrieving and including the relevant stack trace and source code, I would really appreciate it if you would allow this to spend a little time in an unclosed state, so that someone who is familiar with Tika might take a look at what appears to be a fairly common issue. As you will know as Java experts, many null pointer exception issues can be non-trivial, particularly when working with a large, unfamiliar framework. I really appreciate your help.
I wrote a simple program to test the Tika SourceCodeParser by substituting it for the AutoDetectParser in the XHTML parsing example from the Tika Examples page. When the parse command on line 137 executes, there is a NullPointerException. It appears that the delegate may be missing from the in field used on line 180 of the ProxyInputStream source shown below.
The AutoDetectParser works but does not identify the source code as Java.
When I use the Tika desktop app, it works fine and recognizes the code as Java.
How do I initialize the SourceCodeParser to avoid the NullPointerException when operating it?
Example using Tika "Example" Package
LocalFile.toTikaXhtmlString()
123 /** Parses as Tika using source code parser.
124 *
125 * @param filePathParam path to file to parse
126 */
127 public static String toTikaXhtmlString(final String filePathParam)
128 throws IOException, SAXException, TikaException
129 {
130 SourceCodeParser parser = new SourceCodeParser();
131 ContentHandler handler = new ToXMLContentHandler();
132 Metadata metadata = new Metadata();
133 File file = new File(filePathParam);
134 try (InputStream stream
135 = ContentHandlerExample.class
136 .getResourceAsStream(filePathParam)) {
137 parser.parse(stream, handler, metadata);
138 return handler.toString();
139 } catch (Exception e) {
140 System.out.println("Caught exception.");
141 System.out.println(e.toString());
142 e.printStackTrace();
143 throw e;
144 }
145
146 }
I also tried avoiding the Tika ContentHandlerExample class by opening the file directly with a FileInputStream, with the same result:
public static String toTikaXhtmlString(final String filePathParam)
throws IOException, SAXException, TikaException
{
SourceCodeParser parser = new SourceCodeParser();
ContentHandler handler = new ToXMLContentHandler();
Metadata metadata = new Metadata();
File file = new File(filePathParam);
try (InputStream stream = new FileInputStream(file)) {
parser.parse(stream, handler, metadata);
return handler.toString();
} catch (Exception e) {
throw new RuntimeException(e.getMessage());
}
}
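One thing that may be worth ruling out here (only a guess based on the stack trace, not a confirmed diagnosis): Class.getResourceAsStream returns null when the path cannot be found on the classpath, which is usually the case for an absolute filesystem path like the one used in the JUnit test below, and Tika wraps whatever stream it is handed, so a null stream could surface as exactly this kind of NullPointerException deep inside the parser. A hypothetical guard that would report the problem before parse is called:
// Hypothetical diagnostic sketch, not a confirmed fix: fail fast if the classpath
// lookup returned null, so a missing resource is reported before parser.parse(...)
// can trip over a null stream.
try (InputStream stream = ContentHandlerExample.class.getResourceAsStream(filePathParam)) {
    if (stream == null) {
        throw new IOException("Resource not found on classpath: " + filePathParam);
    }
    parser.parse(stream, handler, metadata);
    return handler.toString();
}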
JUnit Test
108 @Test
109 public void parseFile() {
110 String fileName, verifyInput, resultContent;
111
112 //arrange
113 fileName = "/Users/johnmeyer/Projects/code-proc/FileParseTest-run.txt";
114
115 String fileContent = "/** Test */ public MyTestClass {"
116 + "public static void main(String[] args) {"
117 + "System.out.println(\"This is a test.\"); }";
118
119
120 LocalFile.putText(fileName, fileContent);
121
122 verifyInput = LocalFile.getContent(fileName);
123
124 assertEquals(fileContent, verifyInput);
125 //act (and clean up)
126
127 try {
128
129 resultContent = LocalFile.toTikaXhtmlString(fileName);
130 } catch (Exception e) {
131 throw new RuntimeException(e.getMessage());
132 }
133
134 LocalFile.delete(fileName);
135
136 //assert
137 assertEquals(fileContent, resultContent);
138 }
Stack Trace
[INFO] Running us.johnmeyer.test.tools.FileParseTest
Caught exception.
java.lang.NullPointerException
java.lang.NullPointerException
    at org.apache.commons.io.input.ProxyInputStream.markSupported(ProxyInputStream.java:181)
    at org.apache.tika.detect.AutoDetectReader.getBuffered(AutoDetectReader.java:137)
    at org.apache.tika.detect.AutoDetectReader.<init>(AutoDetectReader.java:114)
    at org.apache.tika.parser.code.SourceCodeParser.parse(SourceCodeParser.java:93)
    at org.apache.tika.parser.AbstractParser.parse(AbstractParser.java:53)
    at us.johnmeyer.utilities.LocalFile.toTikaXhtmlString(LocalFile.java:137)
    at us.johnmeyer.test.tools.FileParseTest.parseFile(FileParseTest.java:129)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:369)
    at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:275)
    at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:239)
    at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:160)
    at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:373)
    at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:334)
    at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:119)
    at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:407)
Tika Source Code
17 package org.apache.tika.io;
18
19 import java.io.FilterInputStream;
20 import java.io.IOException;
21 import java.io.InputStream;
22
23 /**
24 * A Proxy stream which acts as expected, that is it passes the method
25 * calls on to the proxied stream and doesn't change which methods are
26 * being called.
27 * <p>
28 * It is an alternative base class to FilterInputStream
29 * to increase reusability, because FilterInputStream changes the
30 * methods being called, such as read(byte[]) to read(byte[], int, int).
31 * <p>
32 * See the protected methods for ways in which a subclass can easily decorate
33 * a stream with custom pre-, post- or error processing functionality.
34 *
35 * @author Stephen Colebourne
36 * @version $Id$
37 */
38 public abstract class ProxyInputStream extends FilterInputStream {
40 /**
41 * Constructs a new ProxyInputStream.
42 *
43 * @param proxy the InputStream to delegate to
44 */
45 public ProxyInputStream(InputStream proxy) {
46 super(proxy);
47 // the proxy is stored in a protected superclass variable named 'in'
48 }
...
174 /**
175 * Invokes the delegate's <code>markSupported()</code> method.
176 * @return true if mark is supported, otherwise false
177 */
178 @Override
179 public boolean markSupported() {
180 return in.markSupported();
181 }
I am trying to read an EDI message and convert it to a Java object, but I end up with the exception below.
Exception in thread "main" org.milyn.SmooksException: Failed to filter source.
    at org.milyn.delivery.sax.SmooksSAXFilter.doFilter(SmooksSAXFilter.java:97)
    at org.milyn.delivery.sax.SmooksSAXFilter.doFilter(SmooksSAXFilter.java:64)
    at org.milyn.Smooks._filter(Smooks.java:526)
    at org.milyn.Smooks.filterSource(Smooks.java:482)
    at org.milyn.Smooks.filterSource(Smooks.java:456)
    at org.milyn.edi.unedifact.d97a.D97AInterchangeFactory.fromUNEdifact(D97AInterchangeFactory.java:58)
    at org.milyn.edi.unedifact.d97a.D97AInterchangeFactory.fromUNEdifact(D97AInterchangeFactory.java:40)
    at com.ibm.gpohub.edi.common.SmooksSample.main(SmooksSample.java:18)
Caused by: org.milyn.edisax.EDIParseException: EDI message processing failed [ORDRSP][D:97A:UN]. Segment [FTX], field 4 (TEXT_LITERAL), component 1 (Free_text_-_-1) expected to contain a value. Currently at segment number 6.
    at org.milyn.edisax.EDIParser.mapComponent(EDIParser.java:687)
    at org.milyn.edisax.EDIParser.mapField(EDIParser.java:636)
    at org.milyn.edisax.EDIParser.mapFields(EDIParser.java:606)
    at org.milyn.edisax.EDIParser.mapSegment(EDIParser.java:564)
    at org.milyn.edisax.EDIParser.mapSegments(EDIParser.java:535)
    at org.milyn.edisax.EDIParser.mapSegments(EDIParser.java:453)
    at org.milyn.edisax.EDIParser.parse(EDIParser.java:428)
    at org.milyn.edisax.EDIParser.parse(EDIParser.java:410)
    at org.milyn.edisax.unedifact.handlers.UNHHandler.process(UNHHandler.java:97)
    at org.milyn.edisax.unedifact.handlers.UNGHandler.process(UNGHandler.java:58)
    at org.milyn.edisax.unedifact.handlers.UNBHandler.process(UNBHandler.java:75)
    at org.milyn.edisax.unedifact.UNEdifactInterchangeParser.parse(UNEdifactInterchangeParser.java:113)
    at org.milyn.smooks.edi.unedifact.UNEdifactReader.parse(UNEdifactReader.java:75)
    at org.milyn.delivery.sax.SAXParser.parse(SAXParser.java:76)
    at org.milyn.delivery.sax.SmooksSAXFilter.doFilter(SmooksSAXFilter.java:86)
    ... 7 more
Here is the code snippet:
D97AInterchangeFactory d97InterChangeFactory = (D97AInterchangeFactory) SmooksFactoryImpl.D97A_FACTORY.getInstance();
InputStream ediSource = new FileInputStream("C:\\EDIFACT_MSG.txt");
UNEdifactInterchange interchange = d97InterChangeFactory.fromUNEdifact(ediSource);
if (interchange instanceof UNEdifactInterchange41) {
    List<UNEdifactMessage41> messages = ((UNEdifactInterchange41) interchange).getMessages();
    for (UNEdifactMessage41 msg : messages) {
        System.out.println(msg.toString());
    }
}
EDI message:
UNA:+.?
UNB+UNOC:3+662424795TEST:16+IBMEDIID:ZZ+160330:1416+IG-62779496
UNG+ORDRSP+662424795TEST:16+IBMEDIID:ZZ+160330:1420+FG-34160863+UN+D:97A
UNH+80534414+ORDRSP:D:97A:UN BGM+231+20160330+4
DTM+69:20150501150000UTC?+12:304 FTX+SSR+++:Blank FTX+AAR++ST
FTX+COI+++CLW FTX+PRI++8 FTX+DEL++06 FTX+CUR+++Pack all item into one
box FTX+DIN+++make a call to customer before delivery
FTX+PRD+++1:1:PC01 FTX+AAP+++900:accept RFF+PC:20AMS67000
RFF+SE:PC01K33E RFF+SZ:ND RFF+ABO:Y RFF+CO:IBM1234501
DTM+4:20150501010101UTC?+12:304 RFF+ACW:CASE_12345 RFF+ADG:Y RFF+ACH:Y
RFF+ZOD:order_desk01 RFF+ZSD:IBM RFF+ZPD:30006672 RFF+ZCS:Blank
RFF+ZZZ NAD+SE+30001234++IBM NAD+BY+US00000001++Coca Cola:CA+9/F:841
WEBSTER ST:stress 3:Blank+SAN FRANCISCO++94117+US CTA+PD+:Jordan
Surzyn COM+Minako@DHL.com:EM COM+6508624654:TE NAD+OY+US00000001++IBM
Field Service:CA+9/F:900 WEBSTER ST:stress 3:Blank+SAN
FRANCISCO++94117+US CTA+CR+:Will Smith COM+Will@ibm.com:EM
COM+6508624654:TE LIN+10 PIA+5+04X6076 IMD+F++:::KEYBOARD NetVista
Keyboard (USB) QTY+21:1:EA DTM+69:20160610120000UTC?+12:304
FTX+OSI+++INW FTX+LIN+++ZSP1 FTX+AAP+++900:Accept FTX+ZCT+++STO from
DC to FSL RFF+ZSB:01 RFF+ZRO:Y RFF+ZOR:KEYBOARD in good condition
RFF+ZST:SOFT UNS+S UNT+50+80534414 UNE+1+FG-34160863 UNZ+1+IG-62779496
Can anyone guide me on where I am going wrong?
Thanks in advance.
It was because of an improper EDIFACT message format. It was resolved once I got a properly formatted EDIFACT message, as shown below. I hope this helps anyone who faces a similar issue. --thanks
UNA:+.? '
UNB+UNOC:3+IBM:ZZZ+662424795TEST:16+160330:1416+00000016086706++++1'
UNG+ORDRSP+IBM:ZZZ+662424795TEST:16+160330:1420+00000000160867+UN+D:97A'
UNH+1+ORDRSP:D:97A:UN' BGM+231+20160330+4'
DTM+69:20160501150000UTC?+12:304' FTX+AAR++ER' FTX+SSR+++N:AM'
FTX+COI+++CLW' FTX+PRI++8' FTX+DEL++04' FTX+CUR+++Pack all item into
one box' FTX+DIN+++make a call to customer before delivery'
FTX+PRD+++IBMDECK001::PC01' FTX+AAP+++900:accept' RFF+PC:20AMS67000'
RFF+SE:PC01K33E' RFF+SZ:ND' RFF+ABO:N' RFF+CO:IBM1234501'
RFF+ACW:IBMCASE12301' DTM+4:20150501000000UTC?+12:304'
NAD+SE+30006672++3100001' NAD+BY+US00000001++CA:NEC Personal
Computers, Ltd.+9/F:841 WEBSTER ST:stress 3+SAN
FRANCISCO++941171717+US' CTA+PD+:Jordan Surzyn' COM+Minako@DHL.com:EM'
COM+6508624654:TE' NAD+OY+US00000001++CA:NEC Personal Computers,
Ltd.+9/F:841 WEBSTER ST:stress 3+SAN FRANCISCO++941171717+US'
CTA+CR+:Jordan Surzyn' COM+Minako@DHL.com:EM' COM+6508624654:TE'
LIN+20+++1:10' PIA+5+04X6076' IMD+F++:::KEYBOARD NetVista Keyboard
(USB)' QTY+21:1:EA' DTM+69:20160610120000UTC?+12:304' FTX+LIN+++ZSP1'
FTX+AAP+++900:Accpet' FTX+OSI+++INW' FTX+BSC+++KEYBOARD in good
condition' RFF+SE:Y' NAD+OY+01+SOFT' UNS+S' UNT+41+1'
UNE+1+00000000160867' UNZ+1+00000016086706'
This question already has answers here:
what is java.io.EOFException, Message: Can not read response from server. Expected to read 4 bytes, read 0 bytes
(10 answers)
Closed 6 years ago.
There is a Java-based cron job which does some processing and at the end inserts some rows into an InnoDB table.
This cron runs 24 times a day, once each hour, but for the past several days it has been failing at the same particular time every day.
Exception:
com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet successfully received from the server was 465,902 milliseconds ago. The last packet sent successfully to the server was 8 milliseconds ago.
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:400)
    at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:1038)
    at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:3434)
    at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:3334)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3774)
    at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2447)
    at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2594)
    at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2541)
    at com.mysql.jdbc.StatementImpl.executeUpdate(StatementImpl.java:1604)
    at com.mysql.jdbc.StatementImpl.executeUpdate(StatementImpl.java:1535)
    at DBConnection.QueryUpdate(DBConnection.java:41)
    at Mailer.insertDetailstoDB(Mailer.java:203)
    at Mailer.run(Mailer.java:64)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.EOFException: Can not read response from server. Expected to read 4 bytes, read 0 bytes before connection was unexpectedly lost.
    at com.mysql.jdbc.MysqlIO.readFully(MysqlIO.java:2926)
    at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:3344)
    ... 11 more
Experiments done:
The issue happens only during insertion. Update/select queries on other tables work fine, and even a select query on the same (problem) table works fine.
JDBC URL : DB_URL=URL+DB+"?zeroDateTimeBehavior=convertToNull&autoReconnect=true&characterEncoding=UTF-8&characterSetResults=UTF-8&failOverReadOnly=true";
I took a dump of the SQL variables, but the dumps are the same whether or not the issue is occurring.
Variable dump:
auto_increment_increment.........2
auto_increment_offset.........1
autocommit.........ON
automatic_sp_privileges.........ON
back_log.........150
basedir........./usr/local/mysql6
big_tables.........OFF
bind_address.........*
binlog_cache_size.........32768
binlog_checksum.........CRC32
binlog_direct_non_transactional_updates.........OFF
binlog_error_action.........IGNORE_ERROR
binlog_format.........MIXED
binlog_gtid_simple_recovery.........OFF
binlog_max_flush_queue_time.........0
binlog_order_commits.........ON
binlog_row_image.........FULL
binlog_rows_query_log_events.........OFF
binlog_stmt_cache_size.........32768
binlogging_impossible_mode.........IGNORE_ERROR
block_encryption_mode.........aes-128-ecb
bulk_insert_buffer_size.........8388608
character_set_client.........latin1
character_set_connection.........latin1
character_set_database.........latin1
character_set_filesystem.........binary
character_set_results.........latin1
character_set_server.........latin1
character_set_system.........utf8
character_sets_dir........./usr/local/mysql-5.6.23-linux-glibc2.5-x86_64/share/charsets/
collation_connection.........latin1_swedish_ci
collation_database.........latin1_swedish_ci
collation_server.........latin1_swedish_ci
completion_type.........NO_CHAIN
concurrent_insert.........AUTO
connect_timeout.........10
core_file.........OFF
datadir........./usr/local/mysql6/data_99acres06/
date_format.........%Y-%m-%d
datetime_format.........%Y-%m-%d %H:%i:%s
default_storage_engine.........MyISAM
default_tmp_storage_engine.........InnoDB
default_week_format.........0
delay_key_write.........ON
delayed_insert_limit.........100
delayed_insert_timeout.........300
delayed_queue_size.........1000
disconnect_on_expired_password.........ON
div_precision_increment.........4
end_markers_in_json.........OFF
enforce_gtid_consistency.........OFF
eq_range_index_dive_limit.........10
event_scheduler.........OFF
expire_logs_days.........5
explicit_defaults_for_timestamp.........OFF
flush.........OFF
flush_time.........0
foreign_key_checks.........ON
ft_boolean_syntax.........+ -><()~*:""&|
ft_max_word_len.........84
ft_min_word_len.........4
ft_query_expansion_limit.........20
ft_stopword_file.........(built-in)
general_log.........OFF
general_log_file........./usr/local/mysql6/data/npl9dba08.log
group_concat_max_len.........1073741824
gtid_executed.........
gtid_mode.........OFF
gtid_owned.........
gtid_purged.........
have_compress.........YES
have_crypt.........YES
have_dynamic_loading.........YES
have_geometry.........YES
have_openssl.........DISABLED
have_profiling.........YES
have_query_cache.........YES
have_rtree_keys.........YES
have_ssl.........DISABLED
have_symlink.........YES
host_cache_size.........628
hostname.........npl9dba08.ieil.net
ignore_builtin_innodb.........OFF
ignore_db_dirs.........
init_connect.........
init_file.........
init_slave.........
innodb_adaptive_flushing.........ON
innodb_adaptive_flushing_lwm.........10
innodb_adaptive_hash_index.........ON
innodb_adaptive_max_sleep_delay.........150000
innodb_additional_mem_pool_size.........8388608
innodb_api_bk_commit_interval.........5
innodb_api_disable_rowlock.........OFF
innodb_api_enable_binlog.........OFF
innodb_api_enable_mdl.........OFF
innodb_api_trx_level.........0
innodb_autoextend_increment.........64
innodb_autoinc_lock_mode.........1
innodb_buffer_pool_dump_at_shutdown.........OFF
innodb_buffer_pool_dump_now.........OFF
innodb_buffer_pool_filename.........ib_buffer_pool
innodb_buffer_pool_instances.........40
innodb_buffer_pool_load_abort.........OFF
innodb_buffer_pool_load_at_startup.........OFF
innodb_buffer_pool_load_now.........OFF
innodb_buffer_pool_size.........42949672960
innodb_change_buffer_max_size.........25
innodb_change_buffering.........all
innodb_checksum_algorithm.........innodb
innodb_checksums.........ON
innodb_cmp_per_index_enabled.........OFF
innodb_commit_concurrency.........0
innodb_compression_failure_threshold_pct.........5
innodb_compression_level.........6
innodb_compression_pad_pct_max.........50
innodb_concurrency_tickets.........5000
innodb_data_file_path.........ibdata1:100M:autoextend
innodb_data_home_dir........./usr/local/mysql6/data
innodb_disable_sort_file_cache.........OFF
innodb_doublewrite.........ON
innodb_fast_shutdown.........1
innodb_file_format.........Antelope
innodb_file_format_check.........ON
innodb_file_format_max.........Antelope
innodb_file_per_table.........ON
innodb_flush_log_at_timeout.........1
innodb_flush_log_at_trx_commit.........1
innodb_flush_method.........O_DIRECT
innodb_flush_neighbors.........1
innodb_flushing_avg_loops.........30
innodb_force_load_corrupted.........OFF
innodb_force_recovery.........0
innodb_ft_aux_table.........
innodb_ft_cache_size.........8000000
innodb_ft_enable_diag_print.........OFF
innodb_ft_enable_stopword.........ON
innodb_ft_max_token_size.........84
innodb_ft_min_token_size.........3
innodb_ft_num_word_optimize.........2000
innodb_ft_result_cache_limit.........2000000000
innodb_ft_server_stopword_table.........
innodb_ft_sort_pll_degree.........2
innodb_ft_total_cache_size.........640000000
innodb_ft_user_stopword_table.........
innodb_io_capacity.........200
innodb_io_capacity_max.........2000
innodb_large_prefix.........OFF
innodb_lock_wait_timeout.........100
innodb_locks_unsafe_for_binlog.........OFF
innodb_log_buffer_size.........16777216
innodb_log_compressed_pages.........ON
innodb_log_file_size.........536870912
innodb_log_files_in_group.........2
innodb_log_group_home_dir........../
innodb_lru_scan_depth.........1024
innodb_max_dirty_pages_pct.........75
innodb_max_dirty_pages_pct_lwm.........0
innodb_max_purge_lag.........0
innodb_max_purge_lag_delay.........0
innodb_mirrored_log_groups.........1
innodb_monitor_disable.........
innodb_monitor_enable.........
innodb_monitor_reset.........
innodb_monitor_reset_all.........
innodb_old_blocks_pct.........37
innodb_old_blocks_time.........1000
innodb_online_alter_log_max_size.........134217728
innodb_open_files.........400
innodb_optimize_fulltext_only.........OFF
innodb_page_size.........16384
innodb_print_all_deadlocks.........OFF
innodb_purge_batch_size.........300
innodb_purge_threads.........1
innodb_random_read_ahead.........OFF
innodb_read_ahead_threshold.........56
innodb_read_io_threads.........4
innodb_read_only.........OFF
innodb_replication_delay.........0
innodb_rollback_on_timeout.........OFF
innodb_rollback_segments.........128
innodb_sort_buffer_size.........1048576
innodb_spin_wait_delay.........6
innodb_stats_auto_recalc.........ON
innodb_stats_method.........nulls_equal
innodb_stats_on_metadata.........OFF
innodb_stats_persistent.........ON
innodb_stats_persistent_sample_pages.........20
innodb_stats_sample_pages.........8
innodb_stats_transient_sample_pages.........8
innodb_status_output.........OFF
innodb_status_output_locks.........OFF
innodb_strict_mode.........OFF
innodb_support_xa.........ON
innodb_sync_array_size.........1
innodb_sync_spin_loops.........30
innodb_table_locks.........ON
innodb_thread_concurrency.........0
innodb_thread_sleep_delay.........10000
innodb_undo_directory..........
innodb_undo_logs.........128
innodb_undo_tablespaces.........0
innodb_use_native_aio.........ON
innodb_use_sys_malloc.........ON
innodb_version.........5.6.23
innodb_write_io_threads.........4
interactive_timeout.........28800
join_buffer_size.........8388608
keep_files_on_create.........OFF
key_buffer_size.........12884901888
key_cache_age_threshold.........300
key_cache_block_size.........1024
key_cache_division_limit.........100
large_files_support.........ON
large_page_size.........0
large_pages.........OFF
lc_messages.........en_US
lc_messages_dir........./usr/local/mysql-5.6.23-linux-glibc2.5-x86_64/share/
lc_time_names.........en_US
license.........GPL
local_infile.........ON
lock_wait_timeout.........31536000
locked_in_memory.........OFF
log_bin.........ON
log_bin_basename........./usr/local/mysql6/data/npl9dba08_99acres
log_bin_index........./usr/local/mysql6/data/npl9dba08_99acres.index
log_bin_trust_function_creators.........OFF
log_bin_use_v1_row_events.........OFF
log_error........./usr/local/mysql6/data/npl9dba08.ieil.net.err
log_output.........FILE
log_queries_not_using_indexes.........OFF
log_slave_updates.........ON
log_slow_admin_statements.........OFF
log_slow_slave_statements.........OFF
log_throttle_queries_not_using_indexes.........0
log_warnings.........1
long_query_time.........1.000000
low_priority_updates.........OFF
lower_case_file_system.........OFF
lower_case_table_names.........0
master_info_repository.........FILE
master_verify_checksum.........OFF
max_allowed_packet.........33554432
max_binlog_cache_size.........18446744073709547520
max_binlog_size.........1073741824
max_binlog_stmt_cache_size.........18446744073709547520
max_connect_errors.........100
max_connections.........500
max_delayed_threads.........20
max_error_count.........64
max_heap_table_size.........16777216
max_insert_delayed_threads.........20
max_join_size.........18446744073709551615
max_length_for_sort_data.........1024
max_prepared_stmt_count.........16382
max_relay_log_size.........0
max_seeks_for_key.........18446744073709551615
max_sort_length.........1024
max_sp_recursion_depth.........0
max_tmp_tables.........32
max_user_connections.........0
max_write_lock_count.........18446744073709551615
metadata_locks_cache_size.........1024
metadata_locks_hash_instances.........8
min_examined_row_limit.........0
multi_range_count.........256
myisam_data_pointer_size.........6
myisam_max_sort_file_size.........9223372036853727232
myisam_mmap_size.........18446744073709551615
myisam_recover_options.........OFF
myisam_repair_threads.........1
myisam_sort_buffer_size.........12582912
myisam_stats_method.........nulls_unequal
myisam_use_mmap.........OFF
net_buffer_length.........16384
net_read_timeout.........30
net_retry_count.........10
net_write_timeout.........60
new.........OFF
old.........OFF
old_alter_table.........OFF
old_passwords.........0
open_files_limit.........20000
optimizer_prune_level.........1
optimizer_search_depth.........62
optimizer_switch.........index_merge=on,index_merge_union=on,index_merge_sort_union=on,index_merge_intersection=on,engine_condition_pushdown=on,index_condition_pushdown=on,mrr=on,mrr_cost_based=on,block_nested_loop=on,batched_key_access=off,materialization=on,semijoin=on,loosescan=on,firstmatch=on,subquery_materialization_cost_based=on,use_index_extensions=on
optimizer_trace.........enabled=off,one_line=off
optimizer_trace_features.........greedy_search=on,range_optimizer=on,dynamic_range=on,repeated_subselect=on
optimizer_trace_limit.........1
optimizer_trace_max_mem_size.........16384
optimizer_trace_offset.........-1
performance_schema.........ON
performance_schema_accounts_size.........100
performance_schema_digests_size.........10000
performance_schema_events_stages_history_long_size.........10000
performance_schema_events_stages_history_size.........10
performance_schema_events_statements_history_long_size.........10000
performance_schema_events_statements_history_size.........10
performance_schema_events_waits_history_long_size.........10000
performance_schema_events_waits_history_size.........10
performance_schema_hosts_size.........100
performance_schema_max_cond_classes.........80
performance_schema_max_cond_instances.........3300
performance_schema_max_file_classes.........50
performance_schema_max_file_handles.........32768
performance_schema_max_file_instances.........30770
performance_schema_max_mutex_classes.........200
performance_schema_max_mutex_instances.........10000
performance_schema_max_rwlock_classes.........40
performance_schema_max_rwlock_instances.........5000
performance_schema_max_socket_classes.........10
performance_schema_max_socket_instances.........1020
performance_schema_max_stage_classes.........150
performance_schema_max_statement_classes.........168
performance_schema_max_table_handles.........800
performance_schema_max_table_instances.........12500
performance_schema_max_thread_classes.........50
performance_schema_max_thread_instances.........1100
performance_schema_session_connect_attrs_size.........512
performance_schema_setup_actors_size.........100
performance_schema_setup_objects_size.........100
performance_schema_users_size.........100
pid_file........./usr/local/mysql6/data/npl9dba08.ieil.net.pid
plugin_dir........./usr/local/mysql6/lib/plugin/
port.........3306
preload_buffer_size.........32768
profiling.........OFF
profiling_history_size.........15
protocol_version.........10
query_alloc_block_size.........8192
query_cache_limit.........1048576
query_cache_min_res_unit.........4096
query_cache_size.........268435456
query_cache_type.........ON
query_cache_wlock_invalidate.........OFF
query_prealloc_size.........8192
range_alloc_block_size.........4096
read_buffer_size.........16777216
read_only.........OFF
read_rnd_buffer_size.........262144
relay_log.........
relay_log_basename.........
relay_log_index.........
relay_log_info_file.........relay-log.info
relay_log_info_repository.........FILE
relay_log_purge.........ON
relay_log_recovery.........OFF
relay_log_space_limit.........0
report_host.........npl9dba0806
report_password.........
report_port.........3306
report_user.........
rpl_stop_slave_timeout.........31536000
secure_auth.........ON
secure_file_priv.........
server_id.........178806
server_id_bits.........32
server_uuid.........561d32a3-5544-11e5-b989-44a8423fc648
simplified_binlog_gtid_recovery.........OFF
skip_external_locking.........ON
skip_name_resolve.........ON
skip_networking.........OFF
skip_show_database.........OFF
slave_allow_batching.........OFF
slave_checkpoint_group.........512
slave_checkpoint_period.........300
slave_compressed_protocol.........ON
slave_exec_mode.........STRICT
slave_load_tmpdir........./usr/local/mysql6/data/tmp
slave_max_allowed_packet.........1073741824
slave_net_timeout.........3600
slave_parallel_workers.........0
slave_pending_jobs_size_max.........16777216
slave_rows_search_algorithms.........TABLE_SCAN,INDEX_SCAN
slave_skip_errors.........OFF
slave_sql_verify_checksum.........ON
slave_transaction_retries.........10
slave_type_conversions.........
slow_launch_time.........2
slow_query_log.........ON
slow_query_log_file.........npl9dba08.slow
socket........./tmp/mysql_06.sock
sort_buffer_size.........8388608
sql_auto_is_null.........OFF
sql_big_selects.........ON
sql_buffer_result.........OFF
sql_log_bin.........ON
sql_log_off.........OFF
sql_mode.........NO_ENGINE_SUBSTITUTION
sql_notes.........ON
sql_quote_show_create.........ON
sql_safe_updates.........OFF
sql_select_limit.........18446744073709551615
sql_slave_skip_counter.........0
sql_warnings.........OFF
ssl_ca.........
ssl_capath.........
ssl_cert.........
ssl_cipher.........
ssl_crl.........
ssl_crlpath.........
ssl_key.........
storage_engine.........MyISAM
stored_program_cache.........256
sync_binlog.........0
sync_frm.........ON
sync_master_info.........10000
sync_relay_log.........10000
sync_relay_log_info.........10000
system_time_zone.........IST
table_definition_cache.........600
table_open_cache.........400
table_open_cache_instances.........1
thread_cache_size.........400
thread_concurrency.........10
thread_handling.........one-thread-per-connection
thread_stack.........262144
time_format.........%H:%i:%s
time_zone.........Asia/Calcutta
timed_mutexes.........OFF
tmp_table_size.........3221225472
tmpdir........./usr/local/mysql6/data/tmp
transaction_alloc_block_size.........8192
transaction_prealloc_size.........4096
tx_isolation.........REPEATABLE-READ
tx_read_only.........OFF
unique_checks.........ON
updatable_views_with_limit.........YES
version.........5.6.23-log
version_comment.........MySQL Community Server (GPL)
version_compile_machine.........x86_64
version_compile_os.........linux-glibc2.5
wait_timeout.........200
query_cache_size.........268435456 -- Not more than about 50M, else the prunes will cause performance problems in INSERTs and other writes.
What does the INSERT look like? One row? Many rows? INSERT..SELECT? Show us the INSERT.
Do you have at least 64GB of RAM?
You are using only MyISAM? Ouch!
40GB is wasted in the buffer_pool that is used only by InnoDB.
group_concat_max_len.........1073741824 -- get a few queries going at once, and you could run out of RAM!
Ditto for tmp_table_size.........3221225472.
Is a FULLTEXT index involved with the naughty INSERT?
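One more thing that may be worth checking (a hedged suggestion, not a confirmed diagnosis): wait_timeout is 200 seconds, while the exception says the last packet was received from the server 465,902 milliseconds (about 466 seconds) earlier, so the server may simply be closing the idle connection while the cron is doing its long processing, and the first statement to notice the dead connection is the final INSERT. A common mitigation is to validate the connection, and reopen it if needed, right before the INSERT. A sketch under those assumptions (the class name, method name and credential parameters here are hypothetical, not from the original cron):
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public final class ConnectionGuard {

    private ConnectionGuard() {
    }

    // Returns the existing connection if it is still usable, otherwise opens a fresh one.
    // Connection.isValid(seconds) pings the server and is available since JDBC 4.
    public static Connection ensureOpen(Connection conn, String url, String user, String password)
            throws SQLException {
        if (conn == null || conn.isClosed() || !conn.isValid(5)) {
            return DriverManager.getConnection(url, user, password);
        }
        return conn;
    }
}
Calling ConnectionGuard.ensureOpen(...) just before the INSERT (or simply opening the connection after the long processing step rather than before it) avoids relying on autoReconnect, which reconnects but does not replay the statement that failed.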
I'm facing the problem that Shiro shows some odd behavior when converting a byte array to a salt.
I started to step through all the classes involved in the process in my application, which are:
org.apache.shiro.realm.AuthenticatingRealm
org.apache.shiro.authc.credential.HashedCredentialsMatcher
Upon user creation, the password is hashed with a generated salt and then stored in my database:
import org.apache.shiro.crypto.hash.Sha256Hash;
import org.apache.shiro.crypto.RandomNumberGenerator;
import org.apache.shiro.crypto.SecureRandomNumberGenerator;
RandomNumberGenerator rng = new SecureRandomNumberGenerator();
Object salt = rng.nextBytes();
String hashedPasswordBase64 = new Sha256Hash(password, salt, 1024).toBase64();
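The salt is presumably persisted alongside the hash as Base64 text; the stored salt shown in the debugging section further down (9mf+AWvXFLiRthqltdelOw==) is a Base64 string. A hypothetical sketch of that storage step (the actual persistence code is not shown in this post), which matters later because the column then holds text rather than the raw salt bytes:
import org.apache.shiro.crypto.RandomNumberGenerator;
import org.apache.shiro.crypto.SecureRandomNumberGenerator;
import org.apache.shiro.crypto.hash.Sha256Hash;
import org.apache.shiro.util.ByteSource;

RandomNumberGenerator rng = new SecureRandomNumberGenerator();
ByteSource salt = rng.nextBytes();
String hashedPasswordBase64 = new Sha256Hash(password, salt, 1024).toBase64();
// Hypothetical: the Base64 text of the salt is what ends up in the umgmt_users.salt column,
// e.g. INSERT INTO umgmt_users (user, password, salt) VALUES (?, hashedPasswordBase64, saltBase64)
String saltBase64 = salt.toBase64();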
shiro.ini looks like this:
# SALTED JDBC REALM
saltedJdbcRealm=com.mycompany.ssp.SaltedJdbcRealm
dataSource = org.postgresql.ds.PGSimpleDataSource
dataSource.databaseName = Self-Service-Portal
dataSource.serverName = localhost
dataSource.portNumber = 5432
dataSource.user = postgres
dataSource.password = admin
saltedJdbcRealm.dataSource = $dataSource
saltedJdbcRealm.authenticationQuery = SELECT umgmt_users.password, umgmt_users.salt FROM umgmt_users WHERE umgmt_users.user = ?
sha256Matcher = org.apache.shiro.authc.credential.Sha256CredentialsMatcher
# base64 encoding, not hex in this example:
sha256Matcher.storedCredentialsHexEncoded = false
sha256Matcher.hashIterations = 1024
saltedJdbcRealm.credentialsMatcher = $sha256Matcher
################################################################################
# SECURITY MANAGER #
securityManager.realms = $saltedJdbcRealm
strategy = org.apache.shiro.authc.pam.FirstSuccessfulStrategy
securityManager.authenticator.authenticationStrategy = $strategy
################################################################################
My custom saltedJdbcRealm just overrides doGetAuthenticationInfo. This code is from this blog:
@Override
protected AuthenticationInfo doGetAuthenticationInfo(AuthenticationToken token) throws AuthenticationException {
    // identify account to log to
    UsernamePasswordToken userPassToken = (UsernamePasswordToken) token;
    String username = userPassToken.getUsername();

    if (username == null) {
        log.debug("Username is null.");
        return null;
    }

    // read password hash and salt from db
    PasswdSalt passwdSalt = getPasswordForUser(username);

    if (passwdSalt == null) {
        log.debug("No account found for user [" + username + "]");
        return null;
    }

    // return salted credentials
    SimpleAuthenticationInfo info = new SimpleAuthenticationInfo(username, passwdSalt.password, getName());
    info.setCredentialsSalt(new SimpleByteSource(passwdSalt.salt));
    return info;
}
Debugging after the return of info goes like this:
AuthenticatingRealm.java: Method: assertCredentialsMatch()
HashedCredentialsMatcher.java: Method: doCredentialsMatch()
HashedCredentialsMatcher.java: Method: hashProvidedCredentials()
Looking for the error, I ended up finding it here in
org.apache.shiro.authc.credential.HashedCredentialsMatcher.java:
protected Object hashProvidedCredentials(AuthenticationToken token, AuthenticationInfo info) {
    Object salt = null;
    if (info instanceof SaltedAuthenticationInfo) {
        // STOP HERE AND SEE BELOW PART 1!!!
        salt = ((SaltedAuthenticationInfo) info).getCredentialsSalt();
        // STOP HERE AND SEE BELOW PART 2!!!
    } else {
        //retain 1.0 backwards compatibility:
        if (isHashSalted()) {
            salt = getSalt(token);
        }
    }
    return hashProvidedCredentials(token.getCredentials(), salt, getHashIterations());
}
Part 1:
Let's take a look at the variable info:
The full byte array is the following:
57 109 102 43 65 87 118 88 70 76 105 82 116 104 113 108 116 100 101 108 79 119 61 61
which correctly represents the salt in my database:
9mf+AWvXFLiRthqltdelOw==
The next step in the code is to extract the salt from the info variable and store it in the variable salt, of type Object.
Part 2:
Looking at the variable salt after this line:
salt = ((SaltedAuthenticationInfo) info).getCredentialsSalt();
has executed, I get this result:
OW1mK0FXdlhGTGlSdGhxbHRkZWxPdz09
Edit:
I ran another example to show the two sides of the comparison, 1) the hash of the submitted password and 2) the password taken from the database, and to show that they are not the same:
I start off with 2 variables, token (submitted password) & info (stored password information):
Stored Credentials:
credentials:
d5fHxI7kYQYtyqo6kwvZFDATIIsZThvFQeDVidpDDEQ
storedBytes before decoding:
100 53 102 72 120 73 55 107 89 81 89 116 121 113 111 54 107 119 118 90 70 68 65 84 73 73 115 90 84 104 118 70 81 101 68 86 105 100 112 68 68 69 81 61
storedBytes after decoding:
119 -105 -57 -60 -114 -28 97 6 45 -54 -86 58 -109 11 -39 20 48 19 32 -117 25 78 27 -59 65 -32 -43 -119 -38 67 12 68
hash:
7797c7c48ee461062dcaaa3a930bd9143013208b194e1bc541e0d589da430c44
Submitted Credentials:
char[] credentials:
[0] = 1
[1] = 2
[2] = 3
byte[] bytes:
50 69 81 77 57 55 80 53 53 112 89 52 122 69 78 54 57 98 53 56 82 65 61 61
which is 2EQM97P55pY4zEN69b58RA== and this is whats inside the database
cachedBase64:
MkVRTTk3UDU1cFk0ekVONjliNThSQT09
return value is this hash:
af9a7ef0ea9fa4d93eae1ca5d16c03c516f4822ec3e9017f14f694175848a6ab
As the two hash values are not the same, I understand why my application is telling me the password is wrong, BUT I created this user with the password 123 using the code above (first code block).
Edit End
So, does anyone know why the hash calculation is not giving the same hash value for the same password? Or what else might I have done wrong? (I doubt that the Shiro code is wrong, so it may be something wrong in my code that generates the password hash/salt, or in the shiro.ini configuration.)
After a little more playing around with those functions, I found out why the submitted password is hashed with the wrong salt value.
I added 3 lines to the method hashProvidedCredentials inside
org.apache.shiro.authc.credential.HashedCredentialsMatcher.java
protected Object hashProvidedCredentials(AuthenticationToken token, AuthenticationInfo info) {
    Object salt = null;
    if (info instanceof SaltedAuthenticationInfo) {
        salt = ((SaltedAuthenticationInfo) info).getCredentialsSalt();

        // Get base64 Decoder
        java.util.Base64.Decoder decoder = java.util.Base64.getDecoder();
        // decode salt from database
        byte[] encodedJava8 = decoder.decode(((SaltedAuthenticationInfo) info).getCredentialsSalt().getBytes());
        // save the decoded salt value in the previous salt Object
        salt = ByteSource.Util.bytes(encodedJava8);

        // The steps above are necessary because the Object salt is of type
        // SimpleByteSource and:
        // - it holds a byte[] which holds the salt in its correct form
        // - it also holds a cachedBase64-encoded version of this byte[]
        //   (which is of course not the actual salt)
        // The problem is that the next method call below, which hashes the
        // submitted password, uses the cachedBase64 value to hash the
        // password and not the byte[] which represents the actual salt.
        // Therefore it is necessary to:
        // - create a SimpleByteSource salt with the value from the database
        // - decode the byte[] so that the cachedBase64 represents the actual salt
        // - store the decoded version of the byte[] in the SimpleByteSource variable salt
    } else {
        //retain 1.0 backwards compatibility:
        if (isHashSalted()) {
            salt = getSalt(token);
        }
    }
    return hashProvidedCredentials(token.getCredentials(), salt, getHashIterations());
}
Now the password a user submits on login is hashed the same way as it was when it was generated with this code:
RandomNumberGenerator rng = new SecureRandomNumberGenerator();
Object salt = rng.nextBytes();
//Now hash the plain-text password with the random salt and multiple
//iterations and then Base64-encode the value (requires less space than Hex):
String hashedPasswordBase64 = new Sha256Hash(password, salt, 1024).toBase64();
Note: This is not the final version of the password hashing. The salt is going to be at least 256 bits and the iteration count around 200k-300k.
Having solved the problem, I narrowed it down to 4 possible causes:
1)
There is a major error in the Shiro code (HashedCredentialsMatcher.java), at least from my point of view, because password verification using a salt will always fail this way (see my description inside the code block).
2)
I am using the wrong CredentialsMatcher for hashed and salted passwords, and I have no clue which one to use instead.
3)
My implementation of the doGetAuthenticationInfo method in my custom Realm has an error. For my custom Realm I used this tutorial:
Apache Shiro Part 2 - Realms, Database and PGP Certificates
4)
I made a mistake in the creation of the password hash (although that code is from the official Apache Shiro website: Link).
From my point of view, options 1 and 4 are not the problem, so it is either 2 or 3 that causes this problem and makes it necessary to add some code to the HashedCredentialsMatcher.java method hashProvidedCredentials().
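An alternative that would avoid patching Shiro itself, sketched here under the assumption that the umgmt_users.salt column holds the Base64 text produced at user creation: decode the salt in the custom realm and wrap the raw bytes, so that the stock HashedCredentialsMatcher hashes the submitted password with the actual salt rather than with its Base64 representation.
import org.apache.shiro.codec.Base64;
import org.apache.shiro.util.ByteSource;

// Inside doGetAuthenticationInfo, replacing the new SimpleByteSource(passwdSalt.salt) call shown earlier:
SimpleAuthenticationInfo info = new SimpleAuthenticationInfo(username, passwdSalt.password, getName());
byte[] rawSalt = Base64.decode(passwdSalt.salt);          // assumes passwdSalt.salt is the Base64 text from the DB
info.setCredentialsSalt(ByteSource.Util.bytes(rawSalt));  // wrap the decoded bytes, not the Base64 string
return info;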
So, to conclude: does anyone have any idea about this issue, just to get some clarification?
I'm trying to query a file stored in the eXist database.
Displaying the contents of the file through a simple function works with no problem:
XMLResource res = (XMLResource) col.getResource(resourceName);
System.out.println(res.getContent());
But when I try to run a query against it, it fails:
String xQuery = "for $x in doc(\"" + resourceName + "\")." + "return data($x).";
ResourceSet result = service.query(xQuery);
ResourceIterator i = result.getIterator();
I have the following errors:
Exception in thread "main" org.xmldb.api.base.XMLDBException: Failed to invoke method queryP in class org.exist.xmlrpc.RpcConnection: org.exist.xquery.StaticXQueryException: exerr:ERROR org.exist.xquery.XPathException: exerr:ERROR err:XPST0003 in line 1, column 58: unexpected token: .
at org.exist.xmldb.RemoteXPathQueryService.query(RemoteXPathQueryService.java:114)
at org.exist.xmldb.RemoteXPathQueryService.query(RemoteXPathQueryService.java:71)
at ExistAccess.main(ExistAccess.java:45)
Caused by: org.apache.xmlrpc.XmlRpcException: Failed to invoke method queryP in class org.exist.xmlrpc.RpcConnection: org.exist.xquery.StaticXQueryException: exerr:ERROR org.exist.xquery.XPathException: exerr:ERROR err:XPST0003 in line 1, column 58: unexpected token: .
at org.apache.xmlrpc.client.XmlRpcStreamTransport.readResponse(XmlRpcStreamTransport.java:197)
at org.apache.xmlrpc.client.XmlRpcStreamTransport.sendRequest(XmlRpcStreamTransport.java:156)
at org.apache.xmlrpc.client.XmlRpcHttpTransport.sendRequest(XmlRpcHttpTransport.java:143)
at org.apache.xmlrpc.client.XmlRpcSunHttpTransport.sendRequest(XmlRpcSunHttpTransport.java:69)
at org.apache.xmlrpc.client.XmlRpcClientWorker.execute(XmlRpcClientWorker.java:56)
at org.apache.xmlrpc.client.XmlRpcClient.execute(XmlRpcClient.java:167)
at org.apache.xmlrpc.client.XmlRpcClient.execute(XmlRpcClient.java:158)
at org.apache.xmlrpc.client.XmlRpcClient.execute(XmlRpcClient.java:147)
at org.exist.xmldb.RemoteXPathQueryService.query(RemoteXPathQueryService.java:99)
... 2 more
[B@105081ca
org.apache.xmlrpc.XmlRpcException: Failed to invoke method queryP in class org.exist.xmlrpc.RpcConnection: org.exist.xquery.StaticXQueryException: exerr:ERROR org.exist.xquery.XPathException: exerr:ERROR err:XPST0003 in line 1, column 58: unexpected token: .
I checked all my .jar files, and all of them are present... I really need help! Thanks in advance!
Your query:
String xQuery = "for $x in doc(\"" + resourceName + "\")." + "return data($x).";
The core of the error:
err:XPST0003 in line 1, column 58: unexpected token: .
As the error message states, eXist-db reports an error at the "."; this period/dot is invalid XQuery. Remove the dots (one after the doc() call and one at the end of the query), and you should be fine. The query text itself should look like this:
for $x in doc("/db/mycollection/mydocument.xml") return data($x)
Also, it appears your FLWOR loop is iterating over a single item - the resource. Therefore, the FLWOR is extraneous. You could refactor this as:
data(doc("/db/mycollection/mydocument.xml"))
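Applied to the Java snippet from the question, the query string would then be built without the trailing dots (and with a space before return so the concatenated XQuery stays well formed):
// Corrected construction of the query string from the original code
String xQuery = "for $x in doc(\"" + resourceName + "\") " + "return data($x)";
ResourceSet result = service.query(xQuery);
ResourceIterator i = result.getIterator();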
I think your string concatenation makes this issue; why not try to add a space after the "."? Change your code like this:
String xQuery = "for $x in doc(\"" + resourceName + "\"). " + "return data($x).";