Exception in EDI Stream to Java Object conversion - java

I am trying to read an EDI message and convert it to a Java object, but I end up with the exception below.
Exception in thread "main" org.milyn.SmooksException: Failed to filter source.
	at org.milyn.delivery.sax.SmooksSAXFilter.doFilter(SmooksSAXFilter.java:97)
	at org.milyn.delivery.sax.SmooksSAXFilter.doFilter(SmooksSAXFilter.java:64)
	at org.milyn.Smooks._filter(Smooks.java:526)
	at org.milyn.Smooks.filterSource(Smooks.java:482)
	at org.milyn.Smooks.filterSource(Smooks.java:456)
	at org.milyn.edi.unedifact.d97a.D97AInterchangeFactory.fromUNEdifact(D97AInterchangeFactory.java:58)
	at org.milyn.edi.unedifact.d97a.D97AInterchangeFactory.fromUNEdifact(D97AInterchangeFactory.java:40)
	at com.ibm.gpohub.edi.common.SmooksSample.main(SmooksSample.java:18)
Caused by: org.milyn.edisax.EDIParseException: EDI message processing failed [ORDRSP][D:97A:UN]. Segment [FTX], field 4 (TEXT_LITERAL), component 1 (Free_text_-_-1) expected to contain a value. Currently at segment number 6.
	at org.milyn.edisax.EDIParser.mapComponent(EDIParser.java:687)
	at org.milyn.edisax.EDIParser.mapField(EDIParser.java:636)
	at org.milyn.edisax.EDIParser.mapFields(EDIParser.java:606)
	at org.milyn.edisax.EDIParser.mapSegment(EDIParser.java:564)
	at org.milyn.edisax.EDIParser.mapSegments(EDIParser.java:535)
	at org.milyn.edisax.EDIParser.mapSegments(EDIParser.java:453)
	at org.milyn.edisax.EDIParser.parse(EDIParser.java:428)
	at org.milyn.edisax.EDIParser.parse(EDIParser.java:410)
	at org.milyn.edisax.unedifact.handlers.UNHHandler.process(UNHHandler.java:97)
	at org.milyn.edisax.unedifact.handlers.UNGHandler.process(UNGHandler.java:58)
	at org.milyn.edisax.unedifact.handlers.UNBHandler.process(UNBHandler.java:75)
	at org.milyn.edisax.unedifact.UNEdifactInterchangeParser.parse(UNEdifactInterchangeParser.java:113)
	at org.milyn.smooks.edi.unedifact.UNEdifactReader.parse(UNEdifactReader.java:75)
	at org.milyn.delivery.sax.SAXParser.parse(SAXParser.java:76)
	at org.milyn.delivery.sax.SmooksSAXFilter.doFilter(SmooksSAXFilter.java:86)
	... 7 more
Here is the code snippet:
D97AInterchangeFactory d97InterChangeFactory = (D97AInterchangeFactory) SmooksFactoryImpl.D97A_FACTORY.getInstance();
InputStream ediSource = new FileInputStream("C:\\EDIFACT_MSG.txt");
UNEdifactInterchange interchange = d97InterChangeFactory.fromUNEdifact(ediSource);
if (interchange instanceof UNEdifactInterchange41) {
    List<UNEdifactMessage41> messages = ((UNEdifactInterchange41) interchange).getMessages();
    for (UNEdifactMessage41 msg : messages) {
        System.out.println(msg.toString());
    }
}
EDI message:
UNA:+.?
UNB+UNOC:3+662424795TEST:16+IBMEDIID:ZZ+160330:1416+IG-62779496
UNG+ORDRSP+662424795TEST:16+IBMEDIID:ZZ+160330:1420+FG-34160863+UN+D:97A
UNH+80534414+ORDRSP:D:97A:UN BGM+231+20160330+4
DTM+69:20150501150000UTC?+12:304 FTX+SSR+++:Blank FTX+AAR++ST
FTX+COI+++CLW FTX+PRI++8 FTX+DEL++06 FTX+CUR+++Pack all item into one
box FTX+DIN+++make a call to customer before delivery
FTX+PRD+++1:1:PC01 FTX+AAP+++900:accept RFF+PC:20AMS67000
RFF+SE:PC01K33E RFF+SZ:ND RFF+ABO:Y RFF+CO:IBM1234501
DTM+4:20150501010101UTC?+12:304 RFF+ACW:CASE_12345 RFF+ADG:Y RFF+ACH:Y
RFF+ZOD:order_desk01 RFF+ZSD:IBM RFF+ZPD:30006672 RFF+ZCS:Blank
RFF+ZZZ NAD+SE+30001234++IBM NAD+BY+US00000001++Coca Cola:CA+9/F:841
WEBSTER ST:stress 3:Blank+SAN FRANCISCO++94117+US CTA+PD+:Jordan
Surzyn COM+Minako#DHL.com:EM COM+6508624654:TE NAD+OY+US00000001++IBM
Field Service:CA+9/F:900 WEBSTER ST:stress 3:Blank+SAN
FRANCISCO++94117+US CTA+CR+:Will Smith COM+Will#ibm.com:EM
COM+6508624654:TE LIN+10 PIA+5+04X6076 IMD+F++:::KEYBOARD NetVista
Keyboard (USB) QTY+21:1:EA DTM+69:20160610120000UTC?+12:304
FTX+OSI+++INW FTX+LIN+++ZSP1 FTX+AAP+++900:Accept FTX+ZCT+++STO from
DC to FSL RFF+ZSB:01 RFF+ZRO:Y RFF+ZOR:KEYBOARD in good condition
RFF+ZST:SOFT UNS+S UNT+50+80534414 UNE+1+FG-34160863 UNZ+1+IG-62779496
Can anyone guide me as to where I am going wrong?
Thanks in advance.

It was caused by an improperly formatted EDIFACT message. The issue was resolved once I obtained a properly formatted message, shown below (note that every segment now ends with the ' terminator). Hope this helps anyone who faces a similar issue. --thanks
UNA:+.? '
UNB+UNOC:3+IBM:ZZZ+662424795TEST:16+160330:1416+00000016086706++++1'
UNG+ORDRSP+IBM:ZZZ+662424795TEST:16+160330:1420+00000000160867+UN+D:97A'
UNH+1+ORDRSP:D:97A:UN' BGM+231+20160330+4'
DTM+69:20160501150000UTC?+12:304' FTX+AAR++ER' FTX+SSR+++N:AM'
FTX+COI+++CLW' FTX+PRI++8' FTX+DEL++04' FTX+CUR+++Pack all item into
one box' FTX+DIN+++make a call to customer before delivery'
FTX+PRD+++IBMDECK001::PC01' FTX+AAP+++900:accept' RFF+PC:20AMS67000'
RFF+SE:PC01K33E' RFF+SZ:ND' RFF+ABO:N' RFF+CO:IBM1234501'
RFF+ACW:IBMCASE12301' DTM+4:20150501000000UTC?+12:304'
NAD+SE+30006672++3100001' NAD+BY+US00000001++CA:NEC Personal
Computers, Ltd.+9/F:841 WEBSTER ST:stress 3+SAN
FRANCISCO++941171717+US' CTA+PD+:Jordan Surzyn' COM+Minako#DHL.com:EM'
COM+6508624654:TE' NAD+OY+US00000001++CA:NEC Personal Computers,
Ltd.+9/F:841 WEBSTER ST:stress 3+SAN FRANCISCO++941171717+US'
CTA+CR+:Jordan Surzyn' COM+Minako#DHL.com:EM' COM+6508624654:TE'
LIN+20+++1:10' PIA+5+04X6076' IMD+F++:::KEYBOARD NetVista Keyboard
(USB)' QTY+21:1:EA' DTM+69:20160610120000UTC?+12:304' FTX+LIN+++ZSP1'
FTX+AAP+++900:Accpet' FTX+OSI+++INW' FTX+BSC+++KEYBOARD in good
condition' RFF+SE:Y' NAD+OY+01+SOFT' UNS+S' UNT+41+1'
UNE+1+00000000160867' UNZ+1+00000016086706'
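In hindsight the visible difference between the two messages is the ' segment terminator. As a hedged sketch (plain Java; the class and method names are mine, and this is only a heuristic, not a full EDIFACT validator), one can sanity-check a payload for terminators before handing it to Smooks:

```java
import java.util.Arrays;
import java.util.List;

public class EdiSanityCheck {
    // Heuristic: the raw interchange should end with the ' terminator, and
    // every chunk between terminators should start with a 3-letter segment
    // tag such as UNB, UNH or FTX. Release characters and embedded newlines
    // are not handled here.
    static boolean looksTerminated(String raw) {
        String trimmed = raw.trim();
        if (!trimmed.endsWith("'")) {
            return false;
        }
        List<String> segments = Arrays.asList(trimmed.split("'"));
        return segments.stream().allMatch(s -> s.trim().matches("[A-Z]{3}.*"));
    }

    public static void main(String[] args) {
        String good = "UNB+UNOC:3+A+B'UNH+1+ORDRSP:D:97A:UN'UNZ+1+X'";
        String bad = "UNB+UNOC:3+A+B UNH+1+ORDRSP:D:97A:UN UNZ+1+X"; // no terminators
        System.out.println(looksTerminated(good)); // true
        System.out.println(looksTerminated(bad));  // false
    }
}
```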

Related

Firebase JAVA admin sdk FirebaseAuthException

Since about 7 AM ET this morning we started getting FirebaseAuthException for many of the requests we made, until about 8 AM ET. It stopped for a while but then continued for another hour or so. The error message says "Error while verifying signature." However, I am certain of one thing: not everybody had a bad token. It has stopped for an hour and a half now, but I wanted to know what caused this.
Here is a top level stack trace if that helps:
Error while verifying signature.
	at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:184)
	at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:172)
	at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
	at java.net.Socket.connect(Socket.java:589)
	at sun.security.ssl.SSLSocketImpl.connect(SSLSocketImpl.java:673)
	at sun.net.NetworkClient.doConnect(NetworkClient.java:175)
	at sun.net.www.http.HttpClient.openServer(HttpClient.java:463)
	at sun.net.www.http.HttpClient.openServer(HttpClient.java:558)
	at sun.net.www.protocol.https.HttpsClient.<init>(HttpsClient.java:264)
	at sun.net.www.protocol.https.HttpsClient.New(HttpsClient.java:367)
	at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.getNewHttpClient(AbstractDelegateHttpsURLConnection.java:191)
	at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1156)
	at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:1050)
	at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.connect(AbstractDelegateHttpsURLConnection.java:177)
	at sun.net.www.protocol.https.HttpsURLConnectionImpl.connect(HttpsURLConnectionImpl.java:162)
	at com.google.api.client.http.javanet.NetHttpRequest.execute(NetHttpRequest.java:104)
	at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:981)
	at com.google.api.client.googleapis.auth.oauth2.GooglePublicKeysManager.refresh(GooglePublicKeysManager.java:172)
	at com.google.api.client.googleapis.auth.oauth2.GooglePublicKeysManager.getPublicKeys(GooglePublicKeysManager.java:140)
	at com.google.firebase.auth.internal.FirebaseTokenVerifier.verifySignature(FirebaseTokenVerifier.java:193)
	at com.google.firebase.auth.internal.FirebaseTokenVerifier.verifyTokenAndSignature(FirebaseTokenVerifier.java:161)
	at com.google.firebase.auth.FirebaseAuth$4.execute(FirebaseAuth.java:489)
	at com.google.firebase.auth.FirebaseAuth$4.execute(FirebaseAuth.java:476)
	at com.google.firebase.internal.CallableOperation.call(CallableOperation.java:36)
	at com.google.firebase.auth.FirebaseAuth.verifyIdToken(FirebaseAuth.java:440)
	at com.google.firebase.auth.FirebaseAuth.verifyIdToken(FirebaseAuth.java:414)
	at sun.reflect.GeneratedMethodAccessor43.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at railo.runtime.reflection.pairs.MethodInstance.invoke(MethodInstance.java:37)
	at railo.runtime.reflection.Reflector.callMethod(Reflector.java:699)
	at railo.runtime.util.VariableUtilImpl.callFunctionWithoutNamedValues(VariableUtilImpl.java:742)
	at railo.runtime.PageContextImpl.getFunction(PageContextImpl.java:1480)
	at cfcs.vendor.firebase.firebaseadminutil_cfc$cf.udfCall(C:\websites\tools\cfcs\vendor\Firebase\FirebaseAdminUtil.cfc:66)
	at railo.runtime.type.UDFImpl.implementation(UDFImpl.java:215)
	at railo.runtime.type.UDFImpl._call(UDFImpl.java:434)
	at railo.runtime.type.UDFImpl.callWithNamedValues(UDFImpl.java:377)
	at railo.runtime.ComponentImpl._call(ComponentImpl.java:616)
	at railo.runtime.ComponentImpl._call(ComponentImpl.java:502)
	at railo.runtime.ComponentImpl.callWithNamedValues(ComponentImpl.java:1834)
	at railo.runtime.util.VariableUtilImpl.callFunctionWithNamedValues(VariableUtilImpl.java:769)
	at railo.runtime.PageContextImpl.getFunctionWithNamedValues(PageContextImpl.java:1495)
	at cfcs.axs.borrowersecurity_cfc$cf.udfCall(C:\websites\tools\cfcs\axs\BorrowerSecurity.cfc:441)
	at railo.runtime.type.UDFImpl.implementation(UDFImpl.java:215)
	at railo.runtime.type.UDFImpl._call(UDFImpl.java:434)
	at railo.runtime.type.UDFImpl.call(UDFImpl.java:384)
	at railo.runtime.ComponentImpl._call(ComponentImpl.java:615)
	at railo.runtime.ComponentImpl._call(ComponentImpl.java:502)
	at railo.runtime.ComponentImpl.call(ComponentImpl.java:1815)
	at railo.runtime.util.VariableUtilImpl.callFunctionWithoutNamedValues(VariableUtilImpl.java:733)
	at railo.runtime.PageContextImpl.getFunction(PageContextImpl.java:1480)
	at components.application.atlasapplicationhelper_cfc$cf._1(C:\websites\shared\components\application\AtlasApplicationHelper.cfc:84)
	at components.application.atlasapplicationhelper_cfc$cf.udfCall(C:\websites\shared\components\application\AtlasApplicationHelper.cfc)
	at railo.runtime.type.UDFImpl.implementation(UDFImpl.java:215)
	at railo.runtime.type.UDFImpl._call(UDFImpl.java:434)
	at railo.runtime.type.UDFImpl.callWithNamedValues(UDFImpl.java:377)
	at railo.runtime.ComponentImpl._call(ComponentImpl.java:616)
	at railo.runtime.ComponentImpl._call(ComponentImpl.java:502)
	at railo.runtime.ComponentImpl.callWithNamedValues(ComponentImpl.java:1834)
	at railo.runtime.util.VariableUtilImpl.callFunctionWithNamedValues(VariableUtilImpl.java:769)
	at railo.runtime.PageContextImpl.getFunctionWithNamedValues(PageContextImpl.java:1495)
	at application_cfc$cf.udfCall(C:\websites\amcwidgets\application.cfc:25)
	at railo.runtime.type.UDFImpl.implementation(UDFImpl.java:215)
	at railo.runtime.type.UDFImpl._call(UDFImpl.java:434)
	at railo.runtime.type.UDFImpl.call(UDFImpl.java:384)
	at railo.runtime.ComponentImpl._call(ComponentImpl.java:615)
	at railo.runtime.ComponentImpl._call(ComponentImpl.java:502)
	at railo.runtime.ComponentImpl.call(ComponentImpl.java:1815)
	at railo.runtime.listener.ModernAppListener.call(ModernAppListener.java:349)
	at railo.runtime.listener.ModernAppListener._onRequest(ModernAppListener.java:106)
	at railo.runtime.listener.MixedAppListener.onRequest(MixedAppListener.java:23)
	at railo.runtime.PageContextImpl.execute(PageContextImpl.java:2035)
	at railo.runtime.PageContextImpl.execute(PageContextImpl.java:2002)
	at railo.runtime.engine.CFMLEngineImpl.serviceCFML(CFMLEngineImpl.java:297)
	at railo.loader.servlet.CFMLServlet.service(CFMLServlet.java:32)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
	at org.mortbay.jetty.servlet.ServletHolder.handle(ServletHolder.java:511)
	at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:390)
	at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
	at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
	at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:765)
	at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:440)
	at org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:199)
	at org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
	at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
	at org.mortbay.jetty.Server.handle(Server.java:326)
	at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
	at org.mortbay.jetty.HttpConnection$RequestHandler.headerComplete(HttpConnection.java:926)
	at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:549)
	at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:212)
	at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
	at org.mortbay.io.nio.SelectChannelEndPoint.run(SelectChannelEndPoint.java:410)
	at org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
Below is the code that caused this exception. Since I am explicitly catching the invalid-credential error, I have a feeling it has to be something else.
<cffunction name="getDecodedTokenDataFromFirebase" returntype="struct" access="public">
    <cfargument name="authToken" type="string" required="true">
    <cfscript>
        local.return = {};
        local.return.bSuccessfulLookup = false;
        local.firebaseJavaLoader = request.cfcCache.get('shared.components.util.JavaloaderUtil')
            .getFirebaseAdmin660Javaloader();
        try {
            local.decodedToken = local.firebaseJavaLoader.create('com.google.firebase.auth.FirebaseAuth')
                .getInstance()
                .verifyIdToken(arguments.authToken);
            local.return.uid = local.decodedToken.getUid();
            local.return.email = local.decodedToken.getEmail();
            local.return.isEmailVerified = local.decodedToken.isEmailVerified().toString();
            local.return.additionalData = {};
            local.return.additionalData.putAll(local.decodedToken.getClaims());
            local.return.bSuccessfulLookup = true;
        }
        catch(any e) {
            if(structKeyExists(e, 'ErrorCode') and e.ErrorCode eq 'ERROR_INVALID_CREDENTIAL'){
                local.return.bSuccessfulLookup = false; // just setting it to be doubly sure
            }
            else {
                request.applicationCfc.sendErrorEmail(exception=e);
            }
        }
        return local.return;
    </cfscript>
</cffunction>
It's a network error: the SDK failed to establish a connection with Google's public key servers:
	at sun.net.www.protocol.https.HttpsClient.<init>(HttpsClient.java:264)
	at sun.net.www.protocol.https.HttpsClient.New(HttpsClient.java:367)
	at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.getNewHttpClient(AbstractDelegateHttpsURLConnection.java:191)
	at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1156)
	at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:1050)
	at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.connect(AbstractDelegateHttpsURLConnection.java:177)
	at sun.net.www.protocol.https.HttpsURLConnectionImpl.connect(HttpsURLConnectionImpl.java:162)
	at com.google.api.client.http.javanet.NetHttpRequest.execute(NetHttpRequest.java:104)
	at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:981)
	at com.google.api.client.googleapis.auth.oauth2.GooglePublicKeysManager.refresh(GooglePublicKeysManager.java:172)
	at com.google.api.client.googleapis.auth.oauth2.GooglePublicKeysManager.getPublicKeys(GooglePublicKeysManager.java:140)
	at com.google.firebase.auth.internal.FirebaseTokenVerifier.verifySignature(FirebaseTokenVerifier.java:193)
	at com.google.firebase.auth.internal.FirebaseTokenVerifier.verifyTokenAndSignature(FirebaseTokenVerifier.java:161)
Notice how the GooglePublicKeysManager.refresh() method encountered a socket connection error.
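Since the failure is transient network trouble reaching Google's public key servers, one mitigation is to retry the verification a couple of times with a short backoff. A hedged sketch (the retry helper and its names are mine, not part of the Firebase SDK):

```java
import java.util.concurrent.Callable;

public class RetryVerify {
    // Runs the given action, retrying up to maxAttempts times on any exception,
    // sleeping backoffMillis between attempts. Rethrows the last failure.
    static <T> T withRetry(Callable<T> action, int maxAttempts, long backoffMillis) throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return action.call();
            } catch (Exception e) {
                last = e;
                if (attempt < maxAttempts) {
                    Thread.sleep(backoffMillis);
                }
            }
        }
        throw last;
    }

    public static void main(String[] args) throws Exception {
        // Simulate a flaky verification that fails twice before succeeding.
        int[] calls = {0};
        String result = withRetry(() -> {
            if (++calls[0] < 3) {
                throw new RuntimeException("Error while verifying signature");
            }
            return "uid-123";
        }, 5, 10);
        System.out.println(result + " after " + calls[0] + " attempts");
    }
}
```

In the CFML above, the verifyIdToken call would be the action wrapped by such a helper; a genuine ERROR_INVALID_CREDENTIAL should not be retried, since that failure is not transient.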

How to process MultiLine input log file in Spark using Java

I am new to Spark and it seems very confusing to me. I have gone through the Spark documentation for the Java API but couldn't figure out a way to solve my problem.
I have to process a log file in Spark using Java, and I have very little time left to do so. Below is the log file, which contains device records (device id, description, ip address, status) spanning multiple lines.
It also contains some other log information that I am not concerned with.
How can I extract the device information from this huge log file?
Any help is much appreciated.
Input log data:
!
!
!
device AGHK75
description "Optical Line Terminal"
ip address 1.11.111.12/10
status "FAILED"
!
device AGHK78
description "Optical Line Terminal"
ip address 1.11.111.12/10
status "ACTIVE"
!
!
context local
!
no ip domain-lookup
!
interface IPA1_A2P_1_OAM
description To_A2P_1_OAM
ip address 1.11.111.12/10
propagate qos from ip class-map ip-to-pd
!
interface IPA1_OAM_loopback loopback
description SE1200_IPA-1_OAM_loopback
ip address 1.11.111.12/10
ip source-address telnet snmp ssh radius tacacs+ syslog dhcp-server tftp ftp icmp-dest-unreachable icmp-time-exceed netop flow-ip
What I have done so far is:
Java code
JavaRDD<String> logData = sc.textFile("logFile").cache();
List<String> deviceRDD = logData.filter(new Function<String, Boolean>() {
    Boolean check = false;
    public Boolean call(String s) {
        if (s.contains("device") || (check == true && (s.contains("description") || s.contains("ip address"))))
            check = true;
        else if (check == true && s.contains("status")) {
            check = false;
            return true;
        }
        else
            check = false;
        return check;
    }
}).collect();
Current Output :
device AGHK75
description "Optical Line Terminal"
ip address 1.11.111.12/10
status "FAILED"
device AGHK78
description "Optical Line Terminal"
ip address 1.11.111.12/10
status "ACTIVE"
Expected Output is:
AGHK75,"Optical Line Terminal",1.11.111.12/10,"FAILED"
AGHK78,"Optical Line Terminal",1.11.111.12/10,"ACTIVE"
You can use sc.wholeTextFiles("logFile") to get the data as key/value pairs, where the key is the file name and the value is the file's contents.
Then you can use string operations to split the data on "!", the start/end delimiter of a single log record, filter for blocks whose first word is "device", and flatMap the result into an RDD with one element per log record.
Then extract the fields from each record using map.
Please try it and let me know whether this logic works for you.
Added code in Spark Scala:
val ipData = sc.wholeTextFiles("abc.log")
val ipSingleLog = ipData.flatMap(x => x._2.split("!")).filter(x => x.trim.startsWith("device"))
val logData = ipSingleLog.map(x => {
  val rowData = x.split("\n")
  var device = ""
  var description = ""
  var ipAddress = ""
  var status = ""
  for (data <- rowData) {
    if (data.trim().startsWith("device")) {
      device = data.split("device")(1)
    } else if (data.trim().startsWith("description")) {
      description = data.split("description")(1)
    } else if (data.trim().startsWith("ip address")) {
      ipAddress = data.split("ip address")(1)
    } else if (data.trim().startsWith("status")) {
      status = data.split("status")(1)
    }
  }
  (device, description, ipAddress, status)
})
logData.foreach(println)
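For comparison, the same split-on-"!", filter-for-"device", then field-extraction logic can be sketched in plain Java outside Spark (the class and helper names here are mine), which also makes the expected CSV output easy to verify:

```java
import java.util.ArrayList;
import java.util.List;

public class DeviceLogParser {
    // Splits the raw log on '!' record delimiters, keeps blocks that start
    // with "device", and flattens each block to "id,description,ip,status".
    static List<String> toCsv(String log) {
        List<String> rows = new ArrayList<>();
        for (String block : log.split("!")) {
            String trimmed = block.trim();
            if (!trimmed.startsWith("device")) continue;
            String device = "", description = "", ip = "", status = "";
            for (String line : trimmed.split("\n")) {
                String l = line.trim();
                if (l.startsWith("device")) device = l.substring("device".length()).trim();
                else if (l.startsWith("description")) description = l.substring("description".length()).trim();
                else if (l.startsWith("ip address")) ip = l.substring("ip address".length()).trim();
                else if (l.startsWith("status")) status = l.substring("status".length()).trim();
            }
            rows.add(device + "," + description + "," + ip + "," + status);
        }
        return rows;
    }

    public static void main(String[] args) {
        String log = "!\ndevice AGHK75\n description \"Optical Line Terminal\"\n ip address 1.11.111.12/10\n status \"FAILED\"\n!\n";
        toCsv(log).forEach(System.out::println);
        // AGHK75,"Optical Line Terminal",1.11.111.12/10,"FAILED"
    }
}
```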
Spark will treat each line as a separate item with sc.textFile. You can make it split on a different delimiter using sc.hadoopConfiguration().set("textinputformat.record.delimiter", "!").
@Test
public void test() throws ParseException, IOException {
    hadoop.write("/test.txt", "line 1\nline 2\n!\nline 3\nline 4");
    JavaSparkContext sc = spark.getContext();
    sc.hadoopConfiguration().set("textinputformat.record.delimiter", "!");
    System.out.println(sc.textFile(hadoop.getMfs().getUri() + "/test.txt").collect());
    assertThat(sc.textFile(hadoop.getMfs().getUri() + "/test.txt").count(), is(2L));
}
I believe the only correct way that works everywhere is
Configuration hadoopConf = new Configuration();
hadoopConf.set("textinputformat.record.delimiter", "delimiter");
JavaPairRDD<LongWritable, Text> input = jsc.newAPIHadoopFile(path,
TextInputFormat.class, LongWritable.class, Text.class, hadoopConf);
There are issues in the related Hadoop code: depending on the size of the input file, it can produce additional records (MAPREDUCE-6549, MAPREDUCE-5948). It certainly works starting with 2.7.2.
Even though setting the delimiter on the Spark context, as mlk suggests, works perfectly well, it will fail if you try to read another file with a different delimiter using the same Spark context. The default delimiter is the newline character, and it changes as soon as this option is applied.
The reason is that the Spark context shares its hadoopConfiguration object, and it is hard to reason about where exactly this value will be needed. As a workaround, one might materialize an RDD and cache it, but the same RDD might still get recomputed.
The approach above works everywhere, because it uses a new Configuration every time.

How to provide correct arguments to setAsciiStream method?

This is my FULL test code with the main method:
public class TestSetAscii {
    public static void main(String[] args) throws SQLException, FileNotFoundException {
        String dataFile = "FastLoad1.csv";
        String insertTable = "INSERT INTO " + "myTableName" + " VALUES(?,?,?)";
        Connection conStd = DriverManager.getConnection("jdbc:xxxxx", "xxxxxx", "xxxxx");
        InputStream dataStream = new FileInputStream(new File(dataFile));
        PreparedStatement pstmtFld = conStd.prepareStatement(insertTable);
        // Until this line everything is awesome
        pstmtFld.setAsciiStream(1, dataStream, -1); // This line fails
        System.out.println("works");
    }
}
I get the "cbColDef value out of range" error
Exception in thread "main" java.sql.SQLException: [Teradata][ODBC Teradata Driver] Invalid precision: cbColDef value out of range
at sun.jdbc.odbc.JdbcOdbc.createSQLException(Unknown Source)
at sun.jdbc.odbc.JdbcOdbc.standardError(Unknown Source)
at sun.jdbc.odbc.JdbcOdbc.SQLBindInParameterAtExec(Unknown Source)
at sun.jdbc.odbc.JdbcOdbcPreparedStatement.setStream(Unknown Source)
at sun.jdbc.odbc.JdbcOdbcPreparedStatement.setAsciiStream(Unknown Source)
at file.TestSetAscii.main(TestSetAscii.java:21)
Here is the link to my FastLoad1.csv file. I guess that setAsciiStream fails because of the FastLoad1.csv file, but I am not sure.
(In my previous question I was not able to narrow down the problem that I had. Now I have shortened the code.)
It depends on the table schema, but the third parameter of setAsciiStream is the length.
So
pstmtFld.setAsciiStream(1, dataStream, 4);
would work for a field 4 bytes long.
But I don't think it would work as you expect in this code: each bind should have its own separate stream.
setAsciiStream() is designed for large data values such as BLOBs or long VARCHARs. It is not designed to read a CSV file line by line and split the lines into separate values.
Basically, it just binds one of the question marks to the InputStream.
After looking into the provided example it looks like teradata could handle csv but you have to explicitly tell that with:
String urlFld = "jdbc:teradata://whomooz/TMODE=ANSI,CHARSET=UTF8,TYPE=FASTLOADCSV";
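If you do stream a single large value rather than loading a CSV, the third argument should be the actual byte count of the stream, not -1 (which is what triggered the "cbColDef value out of range" error above). A hedged sketch of computing it from the file (the helper name is mine; the real setAsciiStream call is shown only as a comment since it needs a live Connection):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;

public class StreamLength {
    // setAsciiStream(parameterIndex, stream, length) expects a non-negative
    // int length; this guards against files too large to express as an int.
    static int asciiLength(File f) {
        long len = f.length();
        if (len < 0 || len > Integer.MAX_VALUE) {
            throw new IllegalArgumentException("file too large for setAsciiStream: " + len);
        }
        return (int) len;
    }

    public static void main(String[] args) throws Exception {
        File tmp = File.createTempFile("FastLoad1", ".csv");
        java.nio.file.Files.write(tmp.toPath(), "a,b,c\n".getBytes("US-ASCII"));
        InputStream dataStream = new FileInputStream(tmp);
        int length = asciiLength(tmp);
        System.out.println(length); // 6
        // pstmtFld.setAsciiStream(1, dataStream, length);  // with a real Connection
        dataStream.close();
        tmp.deleteOnExit();
    }
}
```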
I don't have enough reputation to comment, but I feel this info can be valuable to those navigating fast load via JDBC for the first time.
This code will print the full stack trace and is very helpful for diagnosing problems with fast load:
catch (SQLException ex) {
    for ( ; ex != null; ex = ex.getNextException())
        ex.printStackTrace();
}
In the case of the code above, it works if you specify TYPE=FASTLOADCSV in the connection string, but when run multiple times it will fail due to the creation of the error tables _ERR_1 and _ERR_2. Drop these tables and clear out the destination tables before running again.
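That cleanup step can be sketched as follows; this is purely illustrative, and the statement text is an assumption based only on the _ERR_1/_ERR_2 suffixes mentioned above:

```java
import java.util.Arrays;
import java.util.List;

public class FastLoadCleanup {
    // Builds the DROP statements for the two fast-load error tables that
    // must be removed before the job can be re-run.
    static List<String> dropStatements(String table) {
        return Arrays.asList(
            "DROP TABLE " + table + "_ERR_1",
            "DROP TABLE " + table + "_ERR_2");
    }

    public static void main(String[] args) {
        // With a live Connection these would be executed via stmt.executeUpdate(sql).
        dropStatements("myTableName").forEach(System.out::println);
    }
}
```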

Java ProgramCall.run hangs

I am busy trying to call an RPG function from Java and got this example from JamesA, but now I am having trouble. Here is my code:
AS400 system = new AS400("MachineName");
ProgramCall program = new ProgramCall(system);
try
{
    // Initialise the name of the program to run.
    String programName = "/QSYS.LIB/LIBNAME.LIB/FUNNAME.PGM";
    // Set up the parameters.
    ProgramParameter[] parameterList = new ProgramParameter[2];
    // First parameter is to input a name.
    AS400Text OperationsItemId = new AS400Text(20);
    parameterList[0] = new ProgramParameter(OperationsItemId.toBytes("TestID"));
    AS400Text CaseMarkingValue = new AS400Text(20);
    parameterList[1] = new ProgramParameter(CaseMarkingValue.toBytes("TestData"));
    // Set the program name and parameter list.
    program.setProgram(programName, parameterList);
    // Run the program.
    if (program.run() != true)
    {
        // Report failure.
        System.out.println("Program failed!");
        // Show the messages.
        AS400Message[] messagelist = program.getMessageList();
        for (int i = 0; i < messagelist.length; ++i)
        {
            // Show each message.
            System.out.println(messagelist[i]);
        }
    }
    // Else no error, get output data.
    else
    {
        AS400Text text = new AS400Text(50);
        System.out.println(text.toObject(parameterList[1].getOutputData()));
        System.out.println(text.toObject(parameterList[2].getOutputData()));
    }
}
catch (Exception e)
{
    //System.out.println("Program " + program.getProgram() + " issued an exception!");
    e.printStackTrace();
}
// Done with the system.
system.disconnectAllServices();
The application hangs at the line if (program.run() != true); I wait for about 10 minutes and then terminate the application.
Any idea what I am doing wrong?
Edit
Here is the message on the job log:
Client request - run program QSYS/QWCRTVCA.
Client request - run program LIBNAME/FUNNAME.
File P6CASEL2 in library *LIBL not found or inline data file missing.
Error message CPF4101 appeared during OPEN.
Cannot resolve to object YOBPSSR. Type and Subtype X'0201' Authority
FUNNAME inserts a row into table P6CASEPF through a view called P6CASEL2. P6CASEL2 is in a different library, let's say LIBNAME2. Is there a way to maybe set the job description?
Are you sure FUNNAME.PGM is terminating and not hung with a MSGW? Check QSYSOPR for any messages.
Class ProgramCall:
NOTE: When the program runs within the host server job, the library list will be the initial library list specified in the job description in the user profile.
So I saw that my problem is that my library list is not set up, and for some reason the user we are using does not have a job description. To overcome this, I added the following code before calling program.run():
CommandCall command = new CommandCall(system);
command.run("ADDLIBLE LIB(LIBNAME)");
command.run("ADDLIBLE LIB(LIBNAME2)");
This simply adds LIBNAME and LIBNAME2 to the user's library list.
Oh yes, the problem is that the library list is not set. Take a look at this discussion on Midrange.com; there are different workarounds:
http://archive.midrange.com/java400-l/200909/msg00032.html

Android / Java : GC_CONCURENT when creating many String's

I've been stuck on a problem for a few days. I've found some similar posts but still don't understand what's wrong with my code.
I'm reading a file (18.4 KB) that contains SQL queries. All I want to do is read the file and execute the queries.
I have no problem reading the file; the problem occurs after all the queries have been executed (if I don't execute them it works, but that's not the point!).
So here's my code (inside a try/catch for IOException):
InputStream in = ctx.getAssets().open("file.sql");
ByteArrayBuffer queryBuff = new ByteArrayBuffer(in.available());
String query = null;
int current;
while (-1 != (current = in.read())) {
    queryBuff.append((char) current);
    if (((char) current) == ';') {
        query = new String(queryBuff.toByteArray());
        db.execSQL(query);
        queryBuff.clear();
        query = null;
    }
}
in.close();
queryBuff.clear();
And my GC_CONCURRENT messages occur when "new String" is in the loop, after the loop has finished.
Thanks!
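For what it's worth, GC_CONCURRENT log lines just mean the concurrent collector ran; they are not errors by themselves. Still, the per-character append and per-statement String churn in the loop above can be avoided by reading the asset once and splitting on ';'. A hedged sketch (plain Java so it runs off-device; the Android db.execSQL usage is shown only in a comment):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;

public class SqlScriptSplitter {
    // Reads the whole stream once in 4 KB chunks, then splits the text into
    // individual statements on ';', re-appending the terminator to each.
    static List<String> readStatements(InputStream in) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        byte[] chunk = new byte[4096];
        int n;
        while ((n = in.read(chunk)) != -1) {
            buf.write(chunk, 0, n);
        }
        List<String> statements = new ArrayList<>();
        for (String s : buf.toString("UTF-8").split(";")) {
            if (!s.trim().isEmpty()) {
                statements.add(s.trim() + ";");
            }
        }
        return statements;
    }

    public static void main(String[] args) throws IOException {
        InputStream in = new ByteArrayInputStream("CREATE TABLE a(x);INSERT INTO a VALUES(1);".getBytes());
        // On Android: for (String q : readStatements(ctx.getAssets().open("file.sql"))) db.execSQL(q);
        System.out.println(readStatements(in));
    }
}
```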
EDIT :
I'm a little annoyed, because my memory leak didn't occur in this part of the code but in code executed later (I don't yet know where). So my problem wasn't actually a problem; the app was working properly in fact.
Sorry!
