I need to generate a large file in Java: a method is hit, the service goes off to the repository, and the repository returns a list of type X as Java objects. I then need to place this list in a file and send it off to an FTP server.
I know I can put files on FTP servers using Camel, but I wanted to know whether Camel can also generate the file from the Java objects and then place it on the FTP server.
My code would look like this:
List<ObjectX> xList = someRepo.getListOfx();
So I need to write xList to a file and place it on the FTP server.
Generally speaking, to convert your POJO messages to/from a text (or binary) representation, you can use a Camel DataFormat. In your route, you use the marshal and unmarshal keywords to perform the conversion.
There are several Camel dataformats available to marshal/unmarshal CSV, including the CSV dataformat and the Bindy dataformat (there are a few others listed on the DataFormat page, under the "Flat data structure marshalling" header). One advantage of Bindy is that it also supports other unstructured formats (such as fixed-width records).
Also note:
With Bindy, you will have to add annotations to your model class (ObjectX).
With CSV, you will have to convert your model objects (of type ObjectX) to Maps (or register an appropriate TypeConverter with Camel to do this conversion automatically).
If you check the other available dataformats, they may have different requirements too.
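For the CSV dataformat, that Map conversion can be done by hand before sending the message into the route; a minimal sketch (ObjectX's fields here are invented for illustration):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ObjectXMapper {
    // Hypothetical model class standing in for the question's ObjectX.
    static class ObjectX {
        final int id;
        final String name;
        ObjectX(int id, String name) { this.id = id; this.name = name; }
    }

    // Turn a model object into the Map the CSV dataformat expects;
    // a LinkedHashMap keeps the column order stable.
    static Map<String, Object> toMap(ObjectX x) {
        Map<String, Object> row = new LinkedHashMap<>();
        row.put("id", x.id);
        row.put("name", x.name);
        return row;
    }

    public static void main(String[] args) {
        System.out.println(toMap(new ObjectX(1, "foo"))); // {id=1, name=foo}
    }
}
```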
Here is a very simple example with Bindy:
package mypackage;

@CsvRecord(separator = ",")
public class MyPojo {

    @DataField(pos = 1) // first column in the CSV file
    private int foo;

    @DataField(pos = 2) // second column in the CSV file
    private String bar;

    // Plus constructors, accessors, etc.
}
// In the RouteBuilder:
DataFormat bindy = new BindyCsvDataFormat("mypackage");

from("...")          // message received as a List<MyPojo>
    .marshal(bindy)  // convert the list to CSV text
    .to("ftp:...");  // upload the result to the FTP server
EDIT: I changed my mind. I would find a way to generate the Java class and load the JSON as an object of that class.
I just discovered that a variant of JSON called JSON-LD exists.
It seems to me a more structured way of defining JSON; it reminds me of XML with an associated schema, like XSD.
Can I create a Java class from JSON-LD, load it at runtime, and use it to convert JSON-LD into an instance of that class?
I read the documentation of both implementations but found nothing about it. Maybe I missed it?
Doing a Google search brought me to a library that will decode the JSON-LD into an "undefined" Object.
import java.io.FileInputStream;
import java.io.InputStream;
import java.util.HashMap;
import java.util.Map;
import com.github.jsonldjava.core.JsonLdOptions;
import com.github.jsonldjava.core.JsonLdProcessor;
import com.github.jsonldjava.utils.JsonUtils;

// Open a valid JSON(-LD) input file
InputStream inputStream = new FileInputStream("input.json");
// Read the file into an Object (the type of this object will be a List, Map, String,
// Boolean, Number or null depending on the root object in the file)
Object jsonObject = JsonUtils.fromInputStream(inputStream);
// Create a context JSON map containing prefixes and definitions
Map<String, Object> context = new HashMap<>();
// Customise context...
// Create an instance of JsonLdOptions with the standard JSON-LD options
JsonLdOptions options = new JsonLdOptions();
// Customise options...
// Call whichever JSON-LD function you want (e.g. compact)
Object compact = JsonLdProcessor.compact(jsonObject, context, options);
// Print out the result (or don't, it's your call!)
System.out.println(JsonUtils.toPrettyString(compact));
https://github.com/jsonld-java/jsonld-java
Apparently, it can also take the input from a plain string, as well as from a file or some other source. How you access the contents of the object, I can't tell. The documentation seems moderately decent, though.
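Since the processor hands back a plain Object, accessing its contents means checking the runtime type and casting; a minimal stdlib sketch, assuming the usual Map/List tree (the document contents below are made up for illustration):

```java
import java.util.HashMap;
import java.util.Map;

public class JsonLdWalk {
    // Pull a string value out of the generic Object tree the parser returns.
    // The node is a Map, List, String, Number, Boolean or null.
    static String getString(Object node, String key) {
        if (node instanceof Map) {
            Object value = ((Map<?, ?>) node).get(key);
            return value instanceof String ? (String) value : null;
        }
        return null;
    }

    public static void main(String[] args) {
        // Simulate what JsonUtils.fromInputStream might return for a tiny document
        Map<String, Object> doc = new HashMap<>();
        doc.put("@id", "http://example.org/thing/1");
        doc.put("name", "example");
        System.out.println(getString(doc, "name")); // prints "example"
    }
}
```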
It seems to be an active project: the last commit was only 4 days ago and it has 30 contributors. The license is BSD 3-Clause, if that makes any difference to you.
I'm not in any way associated with this project; I'm not an author, nor have I made any pull requests. It's just something I found.
Good luck and I hope this helped!
see this page: JSON-LD Module for Jackson
I'm relatively new to Protobufs and I was wondering how I would use them to process a list of structures.
Let's say I have a configuration file that looks like this:
Bucket{
name = "A";
path = "~/Document/...";
traffic = 5;
}
Bucket{
name = "B";
path = "~/Document/...";
traffic = 6;
}
Bucket{
name = "C";
path = "~/Document/...";
traffic = 7;
}
etc etc.
So I am using a protobuf to structure this for ease of use later in Java. I'm essentially trying to make a map in a map (one map to find the correct bucket, and another map inside it to obtain the bucket's member attributes).
option java_outer_classname = "Buckets";

message Bucket {
    required string name = 1;
    required string path = 2;
    optional int32 traffic = 3;
}

message BucketList {
    repeated Bucket bucket = 1;
}
I'm confused about how I would link the two: how would I pass the configuration file into the protobuf methods (after it compiles into a Java class) and use it to access the bucket members, to do things like get the path from the bucket with the name A?
Thank you!
It's perfectly acceptable to use Protobuf as a mechanism to declare and parse a text configuration file. However, one must keep in mind that Protobuf's purpose is to declare the format of the file, be it plain text or its binary wire format. Higher-level semantic constraints need to be enforced in custom application code once the configuration has been read.
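For what it's worth, protobuf does ship a human-readable text format (parseable in Java via com.google.protobuf.TextFormat.merge), but it differs from the question's syntax: colons instead of equals signs, and no semicolons. A sketch of what the config could look like in that format, assuming bucket is declared as a repeated field of BucketList:

```
bucket {
  name: "A"
  path: "~/Document/..."
  traffic: 5
}
bucket {
  name: "B"
  path: "~/Document/..."
  traffic: 6
}
```

Once parsed into a BucketList, building the name-to-bucket lookup is a plain Java loop over getBucketList().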
You've got the wrong idea of what protobuf is for: it is a data-interchange library, which means it's used to encode and exchange data between programs. It was never meant to be used for configuration, and its primary wire format is binary, not a text file you edit by hand.
Looking at your config format you have two options:
The format you've chosen looks a lot like HOCON, so https://github.com/typesafehub/config should be able to read it and provide a readable config object with a small amount of editing.
If you want a type-safe config (defining the structure of the config as an actual Java object), you will need to use some other format which supports deserialization to objects. A JSON configuration can be read into objects using libraries like https://github.com/google/gson or https://github.com/FasterXML/jackson
Hopefully the title says it all. I understand that the newer 'SS' model supports both XLS and XLSX format files, but it doesn't seem to have an event-driven implementation, as far as I can see :-(
I only want to read the files, not write them, and I only really need the cell contents and their data type.
Yes, and No!
Apache POI does provide ways to read the two Excel file formats in a streaming, low-memory, event-driven way.
However... because the two file formats are stored in very different ways (one is bits of XML held within a zip file, the other is binary records), it isn't possible to read both formats in an event-driven way with the same code.
Your options are therefore to buy some more memory and use the UserModel approach (which hides the differences behind common interfaces), or to write two lots of event code to handle the two different formats:
For XLS files / HSSF, follow the Apache POI docs on the HSSF Event API.
For XLSX files / XSSF, follow the Apache POI docs on the XSSF SAX Event API.
Various examples of both of those in use can be found in the Apache POI source and examples.
Yes, event-driven reading is available in Apache POI.
Please see this link: how-to-event-reading
As stated there: org.apache.poi.poifs.eventfilesystem.POIFSReaderListener is an interface used to register for documents. When a matching document is read by the org.apache.poi.poifs.eventfilesystem.POIFSReader, the POIFSReaderListener instance receives an org.apache.poi.poifs.eventfilesystem.POIFSReaderEvent instance, which contains an open DocumentInputStream and information about the document.
To get you started:
import java.io.FileInputStream;
import java.io.IOException;
import org.apache.poi.poifs.eventfilesystem.*;
import org.apache.poi.poifs.filesystem.DocumentInputStream;
import org.apache.poi.poifs.filesystem.POIFSDocumentPath;

public static void main(final String[] args) throws IOException
{
    final String filename = args[0];
    POIFSReader r = new POIFSReader();
    r.registerListener(new MyPOIFSReaderListener());
    r.read(new FileInputStream(filename));
}

static class MyPOIFSReaderListener implements POIFSReaderListener
{
    public void processPOIFSReaderEvent(final POIFSReaderEvent event)
    {
        final POIFSDocumentPath path = event.getPath();
        final String name = event.getName();
        final DocumentInputStream stream = event.getStream();
        // Read the document's contents from the stream here
    }
}
Is there any framework/library to help write fixed-length flat files in Java?
I want to write a collection of beans/entities to a flat file without worrying about conversions, padding, alignment, fillers, etc.
For example, I'd like to take a bean like:
public class Entity {
    String name = "name";       // length = 10; align left; fill with spaces
    Integer id = 123;           // length = 5; align left; fill with spaces
    Integer serial = 321;       // length = 5; align right; fill with '0'
    Date register = new Date(); // length = 8; convert to yyyyMMdd
}
and turn it into a flat file like:
Jaya 123 0032120110505
Prathiksha5000 0122120110504
Prabha 1 0000120101231
...
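Those padding rules map directly onto java.util.Formatter flags, so before reaching for a framework, it's worth seeing how little code the conversion takes by hand; a minimal stdlib sketch (class and method names are mine):

```java
import java.text.SimpleDateFormat;
import java.util.Date;

public class FixedWidthWriter {
    // name:     width 10, left-aligned, space-padded  -> %-10s
    // id:       width 5,  left-aligned, space-padded  -> %-5d
    // serial:   width 5,  right-aligned, zero-filled  -> %05d
    // register: formatted as yyyyMMdd (8 chars)
    static String toRecord(String name, int id, int serial, Date register) {
        return String.format("%-10s%-5d%05d%s",
                name, id, serial, new SimpleDateFormat("yyyyMMdd").format(register));
    }

    public static void main(String[] args) throws Exception {
        Date d = new SimpleDateFormat("yyyy-MM-dd").parse("2011-05-05");
        // Reproduces the first sample line above
        System.out.println(toRecord("Jaya", 123, 321, d));
    }
}
```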
Smooks Fixed Length: using this I am able to read the file as a POJO, List or Map.
The issue is that I am unable to find any way to write a POJO to a fixed-length file.
Also, kindly suggest whether there is any other way to process fixed-length flat files using a Camel/Spring project.
Currently in our project we are using Smooks, Camel and Spring for EDI to POJO and POJO to EDI. Now we have to add features for fixed-length flat file processing.
I found the answer, with reference to this link:
Use Bindy or BeanIO, which can work with flat files:
Camel Bindy
Camel Beanio
Smooks doesn't work with Camel 2.10 onwards (AFAIR). There is a ticket reported to the Smooks team to fix that, but they haven't done it yet, nor released a new version of Smooks in a fairly long time. I would try to avoid using Smooks until the project comes back to life (if it does).
I have the following feed from my vendor:
http://scores.cricandcric.com/cricket/getFeed?key=4333433434343&format=xml&tagsformat=long&type=schedule
I want to get the data from that XML feed as Java objects, so that I can insert it into my database regularly.
The above data is nothing but regular updates from the vendor, which I then show on my website.
Can you please suggest what options are available to get this working?
Should I use any web services or just XStream to get my final output? Please advise, as I am a newcomer to this concept.
The vendor has told me he can give me the data in three formats: RSS, XML or JSON. I am not sure which is easiest and cheapest to consume.
I would suggest just write a program that parses the XML and inserts the data directly into your database.
Example
This Groovy script inserts data into an H2 database.
//
// Dependencies
// ============
@Grapes([
    @Grab(group='com.h2database', module='h2', version='1.3.163'),
    @GrabConfig(systemClassLoader=true)
])
import groovy.sql.Sql
//
// Main program
// ============
def sql = Sql.newInstance("jdbc:h2:db/cricket", "user", "pass", "org.h2.Driver")
def dataUrl = new URL("http://scores.cricandcric.com/cricket/getFeed?key=4333433434343&format=xml&tagsformat=long&type=schedule")

dataUrl.withReader { reader ->
    def feeds = new XmlSlurper().parse(reader)
    feeds.matches.match.each {
        def data = [
            it.id,
            it.name,
            it.type,
            it.tournamentId,
            it.location,
            it.date,
            it.GMTTime,
            it.localTime,
            it.description,
            it.team1,
            it.team2,
            it.teamId1,
            it.teamId2,
            it.tournamentName,
            it.logo
        ].collect {
            it.text()
        }
        sql.execute("INSERT INTO matches (id,name,type,tournamentId,location,date,GMTTime,localTime,description,team1,team2,teamId1,teamId2,tournamentName,logo) VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)", data)
    }
}
Well... you could use an XML parser (stream or DOM), or a JSON parser (again, stream or DOM), and build the objects on the fly. But with this data, which seems to consist of records of cricket matches, why not go with a CSV format?
This seems to be your basic 'datum':
<id>1263</id>
<name>Australia v India 3rd Test at Perth - Jan 13-17, 2012</name>
<type>TestMatch</type>
<tournamentId>137</tournamentId>
<location>Perth</location>
<date>2012-01-14</date>
<GMTTime>02:30:00</GMTTime>
<localTime>10:30:00</localTime>
<description>3rd Test day 2</description>
<team1>Australia</team1>
<team2>India</team2>
<teamId1>7</teamId1>
<teamId2>1</teamId2>
<tournamentName>India tour of Australia 2011-12</tournamentName>
<logo>/cricket/137/tournament.png</logo>
Of course you would still have to parse the CSV and deal with character delimiting (such as when you have a ' or a " in a string), but it would reduce your network traffic quite substantially, and it would likely parse much faster on the client. Of course, this depends on what your client is.
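That delimiting is usually handled with RFC 4180-style quoting; a minimal stdlib sketch of escaping one field (class and method names are mine):

```java
public class CsvEscape {
    // Quote a field per RFC 4180: wrap it in double quotes when it contains
    // a comma, a quote, or a newline, and double any embedded quotes.
    static String escape(String field) {
        if (field.contains(",") || field.contains("\"") || field.contains("\n")) {
            return "\"" + field.replace("\"", "\"\"") + "\"";
        }
        return field;
    }

    public static void main(String[] args) {
        // The sample <name> field contains commas, so it gets quoted
        System.out.println(escape("Australia v India 3rd Test at Perth - Jan 13-17, 2012"));
    }
}
```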
Actually, you have a RESTful store that can return data in several formats, and you only need to read from this source; no further interaction is needed.
So, you can use any XML parser to parse the XML data and put the extracted data into whatever data structure you want or already have.
I had not heard about XStream, but you can find more information about selecting the best parser for your situation in this StackOverflow question.
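For a feed this simple, the JDK's built-in DOM parser needs no extra libraries at all; a minimal sketch of pulling fields out of one match record (the helper method name is mine, and the inline XML stands in for the feed):

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class FeedParse {
    // Extract the text content of the first element with the given tag name.
    static String field(Document doc, String tag) {
        return doc.getElementsByTagName(tag).item(0).getTextContent();
    }

    public static void main(String[] args) throws Exception {
        // A trimmed-down stand-in for one <match> record from the feed
        String xml = "<match><id>1263</id><location>Perth</location>"
                   + "<team1>Australia</team1><team2>India</team2></match>";
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        System.out.println(field(doc, "location")); // prints "Perth"
    }
}
```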