jaxb too slow when parsing large file on servicemix - java

I'm trying to parse an XML file containing about 100,000 lines, with a schema like:
<RplyColl ...>
<Rply>
....
</Rply>
</RplyColl>
<EnvColl>
<Env>
...
</Env>
</EnvColl>
<FpdColl rowID="73">
<Fpd>
...
</Fpd>
</FpdColl>
I parse the file like this:
final Unmarshaller unMarshaller = JAXBContext.newInstance("my.context", ObjectFactory.class.getClassLoader()).createUnmarshaller();
object = unMarshaller.unmarshal(new StreamSource(new StringReader(new String(message.getBytes(), "UTF-8"))));
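Independently of the ServiceMix issue, a common precaution with large payloads is to build the JAXBContext once and reuse it, since JAXBContext.newInstance is the expensive call. A minimal sketch of that pattern, assuming the same "my.context" context path and that the payload is already available as a UTF-8 String (the ReplyParser class name is made up):

import java.io.StringReader;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.JAXBException;
import javax.xml.bind.Unmarshaller;
import javax.xml.transform.stream.StreamSource;

public final class ReplyParser {
    // Building the JAXBContext is the expensive step; build it once and reuse it.
    private static final JAXBContext CONTEXT;
    static {
        try {
            CONTEXT = JAXBContext.newInstance("my.context", ObjectFactory.class.getClassLoader());
        } catch (JAXBException e) {
            throw new ExceptionInInitializerError(e);
        }
    }

    public static Object parse(String xml) throws JAXBException {
        // Unmarshaller instances are cheap but not thread-safe, so create one per call.
        Unmarshaller unmarshaller = CONTEXT.createUnmarshaller();
        return unmarshaller.unmarshal(new StreamSource(new StringReader(xml)));
    }
}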
I'm using ServiceMix with jaxb-impl:
[ 90] [Active ] [ ] [ ] [ 50] JAXB2 Basics - Runtime (0.6.4)
[ 91] [Active ] [ ] [ ] [ 50] Apache ServiceMix :: Bundles :: jaxb-impl (2.2.11.1)
When I go into debug mode I can see that my JAXBContext is:
bundle://91.0:1/com/sun/xml/bind/v2/runtime/JAXBContextImpl.class Build-Id: 2.2.11
Classes known to this context:
...
...
After this, the unmarshal call takes 3 minutes 30 seconds :(
When I run the same code in a unit test it takes 10 seconds.
Here is the JAXBContext class in the unit test, for comparison:
jar:file:/D:/maven/repository/com/sun/xml/bind/jaxb-impl/2.2.11/jaxb-impl-2.2.11.jar!/com/sun/xml/bind/v2/runtime/JAXBContextImpl.class Build-Id: 2.2.11
Classes known to this context:
[B
...
...
So why does the unmarshal operation take 3 minutes 30 seconds on ServiceMix but only 10 seconds in my unit test?
Am I missing something?
Thank you very much.
VERSIONS
jaxb-impl: 2.2.11
servicemix: 5.5.2

Finally I found my answer:
in ServiceMix someone had changed the value of
org.apache.servicemix.specs.timeout=100
I changed it to
org.apache.servicemix.specs.timeout=0
and now it rocks!
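For reference, in a Karaf-based ServiceMix a Java system property like this is usually defined in etc/system.properties; the exact file is an assumption here, so use whichever file your installation reads system properties from:

# etc/system.properties (assumed location)
org.apache.servicemix.specs.timeout=0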

Related

Spring Config Server - resolvePlaceholders + environment variables

I've got a question about resolving environment variables in the shared files of a config server.
My current setup is pretty minimal:
src/main/resources/shared/application.yml :
application:
  version: 0.0.1-early
  test: ${JAVA_HOME}
src/main/resources/application.properties :
spring.profiles.active=native
spring.cloud.config.server.native.searchLocations=classpath:/shared
Using Gradle with:
spring-boot-gradle-plugin:2.0.0.RELEASE
spring-cloud-dependencies:Camden.SR7
And then of course compile 'org.springframework.cloud:spring-cloud-config-server' in deps
Problem:
GET http://localhost:8888/apptest/application gives me:
{
  "name": "apptest",
  "profiles": [
    "application"
  ],
  "label": null,
  "version": null,
  "state": null,
  "propertySources": [
    {
      "name": "classpath:/shared/application.yml",
      "source": {
        "application.version": "0.0.1-early",
        "application.test": "${JAVA_HOME}"
      }
    }
  ]
}
So the env variable is not resolved. The same happens with:
http://localhost:8888/apptest/application?resolvePlaceholders=true
http://localhost:8888/lab/apptest-application.properties?resolvePlaceholders=true
http://localhost:8888/lab/apptest-application.properties?resolvePlaceholders=false
http://localhost:8888/apptest-application.properties?resolvePlaceholders=true
I've looked at "Spring cloud config server. Environment variables in properties", but the solution didn't help me, and there have been a few new versions since then, so I'm opening a new question.
Actually it's not a bug and everything is fine. I just hadn't understood how Config Server works.
http://localhost:8888/apptest/application returns the still-unresolved value of ${JAVA_HOME}.
When we go, e.g., into container "C" that pings the Config Service for configuration and do curl http://config:8888/apptest/application, we get the same - an unresolved ${JAVA_HOME}.
But when we look into the Spring application, e.g. in container "C", and try to inject @Value("${application.test}") somewhere, we get the proper value, or an error saying the env variable was not set.
It means that environment variables are resolved on the client side.
Thanks to that I've understood how NOT production-ready the env-variables approach is.
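To make the client-side resolution concrete, here is a minimal sketch of a client bean; the property names come from the YAML above, and the AppInfo class name is made up:

import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

@Component
public class AppInfo {

    // Resolved by the client's own Environment at injection time, so ${JAVA_HOME}
    // is looked up in the client container, not on the config server.
    @Value("${application.test}")
    private String javaHome;

    @Value("${application.version}")
    private String version;
}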
Well, the change happened here: https://github.com/spring-cloud/spring-cloud-config/commit/f8fc4e19375d3b4c0c2562a71bc49ba288197100 - it removed the support for replacing environment variables.
You can always add a new controller and override the behaviour of EnvironmentPropertySource#prepareEnvironment.

azure function in java configure using function.json

I'm trying to create an Azure Function written in Java and configured using a function.json file (rather than using annotations).
To get started I followed the MS tutorial (which works ok).
Next, I tried to modify the class to remove the Function annotations and add a function.json as indicated here in the section "The same function written without annotations".
The class source code is now:
public class Function {
    public static String hello(String req, ExecutionContext context) {
        return String.format("Hi, %s!", req);
    }
}
My function.json file is
{
  "scriptFile": "ServiceBusQueueMsgToLogWriter-1.0-SNAPSHOT.jar",
  "entryPoint": "com.oneadvanced.adv365.mgdsvc.azure.func.test1.Function.hello",
  "bindings": [
    {
      "type": "httpTrigger",
      "name": "req",
      "direction": "in",
      "authLevel": "anonymous",
      "methods": [ "post" ]
    },
    {
      "type": "http",
      "name": "$return",
      "direction": "out"
    }
  ]
}
Running the command mvn clean package produces output that includes:
AI: INFO 12-10-2017 21:50, 1: Configuration file has been successfully found as resource
AI: INFO 12-10-2017 21:51, 1: Configuration file has been successfully found as resource
[INFO]
[INFO] Step 1 of 6: Searching for Azure Function entry points
[INFO] Reflections took 33 ms to scan 1 urls, producing 0 keys and 0 values
[INFO] 0 Azure Function entry point(s) found.
[INFO]
[INFO] Step 2 of 6: Generating Azure Function configurations
[INFO] No Azure Functions found. Skip configuration generation.
This makes me wonder if the function.json file isn't in the right place...
Does anyone know the correct place to put the function.json file for an Azure Function written in Java?
I've tried:
in the root of the project
in src/main/resources (which I think would be the standard place for this kind of thing in a typical Java/Maven project)
in the same folder as the Java source file
Same outcome in every case :(
I'd be grateful for any pointers on what I should be doing.
Thanks, Andy
The code for all of the functions in a given function app lives in a
root folder that contains a host configuration file and one or more
subfolders, each of which contain the code for a separate function.
Example Structure:
wwwroot
| - host.json
| - mynodefunction
| | - function.json
| | - index.js
| | - node_modules
| | | - ... packages ...
| | - package.json
| - mycsharpfunction
| | - function.json
| | - run.csx
You can find the description above in the official docs.
However, I notice that your function.json file doesn't set the disabled property.
Set the disabled property to false to enable the function, as mentioned here.
{
  "bindings": [
    {
      "name": "myBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "jaygong/test",
      "connection": "AzureWebJobsStorage"
    }
  ],
  "disabled": false
}
Updated answer:
I tried to create my own Azure Function following the steps you provided, configuring it via function.json, and I reproduced your issue!
My function code:
package cn.gjj;

public class Function {
    public static String hello(String in) {
        return in;
    }
}
My function test code:
package cn.gjj;

import org.junit.Test;
import static org.junit.Assert.assertEquals;

/**
 * Unit test for Function class.
 */
public class FunctionTest {
    /**
     * Unit test for hello method.
     */
    @Test
    public void testHello() throws Exception {
        final Function function = new Function();
        final String ret = function.hello("function");
        assertEquals("function", ret);
    }
}
My function.json file:
{
  "scriptFile": "JayGongTestAzureFunction-1.0-SNAPSHOT.jar",
  "entryPoint": "cn.gjj.Function.echo",
  "bindings": [
    {
      "type": "httpTrigger",
      "name": "req",
      "direction": "in",
      "authLevel": "anonymous",
      "methods": [ "post" ]
    },
    {
      "type": "http",
      "name": "$return",
      "direction": "out"
    }
  ]
}
When I run the command mvn clean package, the log output is the same as yours.
AI: INFO 17-10-2017 14:07, 1: Configuration file has been successfully found as resource
AI: INFO 17-10-2017 14:07, 1: Configuration file has been successfully found as resource
[INFO]
[INFO] Step 1 of 6: Searching for Azure Function entry points
[INFO] Reflections took 31 ms to scan 1 urls, producing 0 keys and 0 values
[INFO] 0 Azure Function entry point(s) found.
[INFO]
[INFO] Step 2 of 6: Generating Azure Function configurations
[INFO] No Azure Functions found. Skip configuration generation.
[INFO]
[INFO] Step 3 of 6: Validating generated configurations
[INFO] No configurations found. Skip validation.
[INFO]
[INFO] Step 4 of 6: Saving empty host.json
[INFO] Successfully saved to E:\TestAzureFunction\JayGongTestAzureFunction\target\azure-functions\jaygongtestazurefunction-20171017132623892\host.json
[INFO]
[INFO] Step 5 of 6: Saving configurations to function.json
[INFO] No configurations found. Skip save.
[INFO]
[INFO] Step 6 of 6: Copying JARs to staging directory E:\TestAzureFunction\JayGongTestAzureFunction\target\azure-functions\jaygongtestazurefunction-20171017132623892
After carefully comparing the function.json configuration approach with the annotations approach, I found that the contents of the Project/target/azure-functions/functionName/ folder are different.
Annotations approach:
function.json configuration approach:
Note that if the function.json file is not found after mvn compile, please add the resource configuration below to your pom.xml file.
<resources>
    <resource>
        <directory>${project.basedir}</directory>
        <includes>
            <include>host.json</include>
            <include>local.settings.json</include>
            <include>function.json</include>
        </includes>
    </resource>
</resources>
Solution:
Create a folder for the function and move your function.json file into it. Then it works for me (a rough sketch of the resulting layout is below).
Please give it a try.
Hope it helps you.
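For reference, the staging layout I would expect after that move looks roughly like this; the function folder name hello and the jar placement next to host.json are assumptions, and scriptFile may need to point at the jar relative to the function.json:

target/azure-functions/<function-app-name>
| - host.json
| - JayGongTestAzureFunction-1.0-SNAPSHOT.jar
| - hello
| | - function.json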

How to print custom-fields into a log file from java application with the help of log4j2 JSON technique?

I am looking to print out logs in a custom manner.
For example, currently we have the following log structure in the form of JSON:
{
  "timeMillis" : 1488791217953,
  "thread" : "restartedMain",
  "level" : "DEBUG",
  "loggerName" : "org.springframework.jdbc.datasource.DriverManagerDataSource",
  "message" : "hello world",
  "endOfBatch" : false,
  "loggerFqcn" : "org.apache.commons.logging.impl.SLF4JLocationAwareLog",
  "threadId" : 17,
  "threadPriority" : 5
}
Now I found that a couple of fields which are important for me to have in the log are still missing.
The expected JSON would be like this:
{
  "timeMillis" : 1488791217953,
  "thread" : "restartedMain",
  "level" : "DEBUG",
  ..................
  "file" : "p1.pck.HelloWorld.java",
  "line" : "190",
  "application-id" : "101",
  "logged in user id " : "199",
  "etc" : "etc"
  ..................
  "threadPriority" : 5
}
NOTE: the log configuration file log4j2.yml has the following configuration:
JsonLayout:
  propertiesAsList: true
You'll want to include the log4j-web artifact to get context data in your log -
<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-web</artifactId>
    <version>2.8.2</version>
</dependency>
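For the custom fields (application-id, the logged-in user id, and so on), one common approach is to put them into the ThreadContext so a properties-aware layout can emit them. A minimal sketch, with the field values and the HelloWorld class taken from the question and any web-filter wiring omitted:

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.ThreadContext;

public class HelloWorld {
    private static final Logger LOG = LogManager.getLogger(HelloWorld.class);

    public static void main(String[] args) {
        // These keys show up in the JSON output once the layout is configured
        // to include the thread context map.
        ThreadContext.put("application-id", "101");
        ThreadContext.put("logged-in-user-id", "199");
        try {
            LOG.debug("hello world");
        } finally {
            ThreadContext.clearAll();
        }
    }
}

The layout also has to emit that data: as far as I know, JsonLayout's properties attribute includes the thread context map and locationInfo adds the source file and line, so a sketch of the layout section under that assumption would be:

JsonLayout:
  propertiesAsList: true
  properties: true       # include ThreadContext entries such as application-id
  locationInfo: true     # adds source class/method/file/line (extra runtime cost)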
Also, you may find an artifact I created useful: extended-jsonlayout. It allows you to add additional information to the JSON log message by implementing a simple interface and including the jar on your classpath. You can check it out here -
https://github.com/savantly-net/log4j2-extended-jsonlayout

com.thoughtworks.xstream.converters.ConversionException

[EDITED]
The project I'm working on is a three-module Java EE project with servlets and Hibernate for persistence. The structure is as follows:
- Admin -> the main program with the beans and HTML/CSS
- Jar -> with the jars, Hibernate tools and classes
- War -> with the servlets
Between them, I use XStream to share the classes and important info.
I'm using Eclipse and Tomcat 7.
I hope this gives you the general idea.
This is what the XStream debugger said:
Caused by: com.thoughtworks.xstream.converters.ConversionException: satdata.musicoterapia.hibernate.Terapeuta0 : satdata.musicoterapia.hibernate.Terapeuta0
---- Debugging information ----
message : satdata.musicoterapia.hibernate.Terapeuta0
cause-exception : com.thoughtworks.xstream.mapper.CannotResolveClassException
cause-message : satdata.musicoterapia.hibernate.Terapeuta0
class : satdata.musicoterapia.hibernate.Usuario
required-type : satdata.musicoterapia.hibernate.Usuario
converter-type : com.thoughtworks.xstream.converters.reflection.ReflectionConverter
path : /list/Usuario[2]/terapeuta
class[1] : java.util.ArrayList
converter-type[1] : com.thoughtworks.xstream.converters.collections.CollectionConverter
version : null
Links (I don't have enough reputation to post more than 2 links):
Complete StackTrace: http://pastebin.com/6vXyD6hC
XML: http://pastebin.com/YM9q3uvq
Servlet: below, in the comment
Where the problem occurs: below, in the comment
Java classes: below, in the comment
If something is missing, ask and I'll put it here. Thanks for all!!!
In your servlet code you are mentioning:
xstream.alias("Terapeuta", Terapeuta.class);
In the XML file it is given as:
<terapeuta class="satdata.musicoterapia.hibernate.Terapeuta0" resolves-to="Terapeuta">
So in the logs you are getting the error:
com.thoughtworks.xstream.mapper.CannotResolveClassException:
satdata.musicoterapia.hibernate.Terapeuta0
It seems the class name in the XML should be satdata.musicoterapia.hibernate.Terapeuta instead of satdata.musicoterapia.hibernate.Terapeuta0.
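To make that concrete, a minimal sketch of the deserialization side under that assumption; only the Terapeuta alias appears in the servlet, so the Usuario alias and the UsuarioDeserializer class name are guesses:

import com.thoughtworks.xstream.XStream;
import satdata.musicoterapia.hibernate.Terapeuta;
import satdata.musicoterapia.hibernate.Usuario;

public class UsuarioDeserializer {

    public static Object fromXml(String xml) {
        XStream xstream = new XStream();
        // Element-name aliases, as registered in the servlet:
        xstream.alias("Terapeuta", Terapeuta.class);
        xstream.alias("Usuario", Usuario.class);
        // Any class="..." attribute in the XML must name a class XStream can resolve;
        // "satdata.musicoterapia.hibernate.Terapeuta0" does not exist, hence the exception.
        return xstream.fromXML(xml);
    }
}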

Build a Model From Stream (Exception: Reader not found on classpath)

I’m just trying to add my ontology to the DB repository ……
My code is:
// Obtain a JDBC connection to H2 and wrap it for Jena SDB
oConnection = H2Db.getM_oConnection();
m_oSDBConnection = new SDBConnection(oConnection);
// Describe the store layout, then connect the store and its default model
StoreDesc oStoreDesc = new StoreDesc(LayoutType.LayoutTripleNodesHash, DatabaseType.H2);
m_oStore = SDBFactory.connectStore(m_oSDBConnection, oStoreDesc);
m_oModel = SDBFactory.connectDefaultModel(m_oStore);
// Read the ontology from the bundle classpath into the model
InputStream oInputStream = this.getClass().getResourceAsStream("/META-INF/betaas_context.owl");
m_oModel.read(oInputStream, null);
At the beginning I had the following dependencies:
ID State Blueprint Level Name
[ 994] [Active ] [ ] [ 80] H2 Database Engine (1.3.170)
[1114] [Active ] [ ] [ 80] wrap_mvn_org.apache.jena_jena-iri_1.0.0 (0)
[1223] [Active ] [ ] [ 80] wrap_mvn_org.apache.jena_jena-arq_2.11.0 (0) -> contains package org.apache.jena.riot.adapters
[1279] [Active ] [ ] [ 80] wrap_mvn_org.apache.jena_jena-core_2.11.0 (0)
[1311] [Active ] [ ] [ 80] wrap_mvn_xerces_xercesImpl_2.7.1 (0)
[1314] [Active ] [ ] [ 80] wrap_mvn_com.ibm.icu_icu4j_3.4.4 (0)
And I get this exception:
java.lang.Exception: com.hp.hpl.jena.shared.ConfigException: Reader not found on classpath
Caused by: com.hp.hpl.jena.shared.ConfigException: Reader not found on classpath
Caused by: java.lang.ClassNotFoundException:
org.apache.jena.riot.adapters.JenaReadersWriters$RDFReaderRIOT_RDFXML
And the problem was with the following line: m_oModel.read(oInputStream, null);
EDIT
Then, following the suggestion mentioned by AndyS (see below), I updated my Jena libraries and used the 2.11.1-SNAPSHOT:
[1511] [Active ] [ ] [ 80] wrap_mvn_org.apache.jena_jena-core_2.11.1-SNAPSHOT (0)
[1512] [Active ] [ ] [ 80] wrap_mvn_org.apache.jena_jena-arq_2.11.1-SNAPSHOT (0)
[1515] [Active ] [ ] [ 80] wrap_mvn_org.apache.jena_jena-sdb_1.4.1-SNAPSHOT (0)
[1516] [Resolved ] [ ] [ 80] wrap_mvn_xerces_xercesImpl_2.11.0 (0)
[1521] [Active ] [ ] [ 80] wrap_mvn_org.apache.jena_jena-iri_1.0.1-SNAPSHOT (0)
But the exception is the same:
ClassNotFoundException: org.apache.jena.riot.adapters.JenaReadersWriters$RDFReaderRIOT_RDFXML
There is a fixed bug to do with handling .owl files. This may be the issue for some of the exceptions you are seeing. The current development snapshots have this fixed.
Unrelated:
You have the wrong version of xerces for jena. You do not need icu4j anymore.
The root error is a ClassNotFoundException for the following class: org.apache.jena.riot.adapters.JenaReadersWriters$RDFReaderRIOT_RDFXML. It looks like this library is using dynamic reflection to load the class, e.g. using Class.forName(), which is a really bad thing to do.
Anyway you should be able to fix it by adding the package org.apache.jena.riot.adapters to your Import-Package list.
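If the consuming bundle is built with Maven, one way to add that import is through the maven-bundle-plugin instructions. A rough sketch under that assumption; only the Import-Package line is the point here:

<plugin>
    <groupId>org.apache.felix</groupId>
    <artifactId>maven-bundle-plugin</artifactId>
    <extensions>true</extensions>
    <configuration>
        <instructions>
            <!-- Make the reflectively loaded RIOT adapters resolvable from this bundle -->
            <Import-Package>org.apache.jena.riot.adapters, *</Import-Package>
        </instructions>
    </configuration>
</plugin>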
In the end, it was indeed a problem of Jena library versions.
I solved it with the following versions:
xerces/xercesImpl/2.9.1
com.ibm.icu/icu4j/3.4.4
org.slf4j/slf4j-api/1.6.1
com.hp.hpl.jena/arq/2.8.7
com.hp.hpl.jena/jena/2.6.4
com.hp.hpl.jena/iri/0.8
com.hp.hpl.jena/sdb/1.3.4
As Andy S. suggested to me: "the problem is in the bundling. jena-core makes a reflection call to set up RIOT, and RIOT installs JenaReadersWriters$RDFReaderRIOT_RDFXML. This is instantiated with a call to Class.newInstance(). It looks like the latter is failing.
This may be because of classloaders as set up by the OSGi bundling. You probably want one bundle with all of the Jena jars in it. As I understand your reported setup, you have a separate bundle, hence a different classloader, hence not found."
I have not checked it, but the solution is probably to make a single bundle jar with at least the 2.11.0 versions:
org.apache.jena/jena-core/2.11.0
org.apache.jena/jena-arq/2.11.0
org.apache.jena/jena-sdb/1.4.0
org.apache.jena/jena-iri/1.0.0
xerces/xercesImpl/2.11.0
Thanks for the post!!
I had the same issue and fixed it by just adding the org.apache.jena.riot.adapters package to the imports of jena-core.jar.
E.g. the BND configuration file for jena-core-2.11.1.jar:
version=2.11.1
Bundle-Version: ${version}
Bundle-Name: Jena CORE
Export-Package: !etc, !jena, !jena-log4j.properties, !jena.cmdline, !ont-policy.rdf, *; version=${version}
Import-Package: org.apache.jena.riot.adapters, *
