I have a problem sending an ArrayList from my server to my client using JAX-RS. I have four classes:
Demo - starts the REST server
FileDetails - holds three data fields
ConfigFiles - has a few file-related methods and holds a list of FileDetails objects
RestServer - exposes the GET method
Here is the code:
@XmlRootElement(name="FileDetails")
@Path("/easy")
public class RestSerwer {
@GET
@Path("/temporary")
@Produces("text/temporary")
public String methodGet() {
ConfigFiles cf = ConfigFiles.getInstance();
List<FileDetails> files = cf.getList();
try {
JAXBContext ctx = JAXBContext.newInstance(ArrayList.class, FileDetails.class);
Marshaller m = ctx.createMarshaller();
StringWriter sw = new StringWriter();
m.marshal(files, sw);
return sw.toString();
} catch (JAXBException e) {
e.printStackTrace();
}
return null;
}
}
On the client side I have GetRest:
public class GetRest{
HttpClient client = null;
GetMethod method = null;
private String url = null;
public GetRest(String url) {
this.url = url;
}
public String getBody(String urlDetail){
client = new HttpClient();
method = new GetMethod(url + urlDetail);
method.getParams().setParameter(HttpMethodParams.RETRY_HANDLER,
new DefaultHttpMethodRetryHandler(3, false));
try {
client.executeMethod(method);
} catch (HttpException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
byte[] responseBody = null;
try {
responseBody = method.getResponseBody();
} catch (IOException e) {
e.printStackTrace();
}finally{
method.releaseConnection();
}
String str = new String(responseBody);
return str;
}
public String getFiles(){
return getBody("/easy/temporary");
}
}
When I try:
GetRest getRestClass = new GetRest("http://localhost:8185");
//List<FileDetails> cf = new ArrayList<FileDetails>();
String xxxx = getRestClass.getFiles(); // TODO
It throws the following on the server side:
Caused by: com.sun.istack.internal.SAXException2: unable to marshal type "java.util.ArrayList" as an element because it is missing an @XmlRootElement annotation
Can anybody help me?
Thanks
You basically have two options: create your own class that wraps the List, or write your own provider and register it with Jersey so that it knows how to marshal/unmarshal the ArrayList. Here is a simple example of the first option:
@XmlRootElement
public class MyListWrapper {
@XmlElement(name = "List")
private List<String> list;
public MyListWrapper() { /* JAXB requires it */
}
public MyListWrapper(List<String> stringList) {
list = stringList;
}
public List<String> getStringList() {
return list;
}
}
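With the wrapper in place, the resource method can marshal it directly. Here is a minimal sketch (it assumes the wrapper is adapted to hold a List<FileDetails> instead of List<String>, and that application/xml is an acceptable media type):
@GET
@Path("/temporary")
@Produces("application/xml")
public String methodGet() {
    ConfigFiles cf = ConfigFiles.getInstance();
    List<FileDetails> files = cf.getList();
    try {
        // the wrapper carries @XmlRootElement, so JAXB can marshal it as a document root
        JAXBContext ctx = JAXBContext.newInstance(MyListWrapper.class, FileDetails.class);
        Marshaller m = ctx.createMarshaller();
        StringWriter sw = new StringWriter();
        m.marshal(new MyListWrapper(files), sw);
        return sw.toString();
    } catch (JAXBException e) {
        e.printStackTrace();
        return null;
    }
}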
I am using the Bulk API with version 5.6.16 of both Elasticsearch and Kibana.
The following code works fine and uploads the data to an ES index.
These are the dependencies I currently have:
implementation 'org.springframework.boot:spring-boot-starter-data-elasticsearch'
compile group: 'org.elasticsearch.client', name: 'transport'
compile group: 'org.elasticsearch.plugin', name: 'transport-netty4-client'
@Service
public class BulkApiService {
private final Logger log = LoggerFactory.getLogger(BulkApiService.class);
private String FOLDER_PATH = "src/main/resources/allRecipesJson";
private String index = "test";
@Autowired
ElasticSearchConfig elasticSearchConfig;
public String loadBulkData() throws UnknownHostException {
Client client = elasticSearchConfig.client();
AtomicReference<BulkRequestBuilder> request = new AtomicReference<>(client.prepareBulk());
AtomicInteger counter = new AtomicInteger();
try (Stream<Path> filePathStream = Files.walk(Paths.get(FOLDER_PATH))) {
filePathStream.forEach(filePath -> {
if (Files.isRegularFile(filePath)) {
counter.getAndIncrement();
try {
String content = Files.readString(filePath);
JSONObject jsonObject1 = new JSONObject(content);
HashMap yourHashMap1 = new Gson().fromJson(jsonObject1.toString(), HashMap.class);
request.get().add(client.prepareIndex(index, "default").setSource(yourHashMap1));
} catch (IOException ignore) {
log.error(ignore.toString());
}
}
});
BulkResponse bulkResponse = request.get().execute().actionGet();
} catch (Exception e) {
log.error(e.toString());
}
return "Bulk data loaded to index " + index + "";
}
Following is the configuration
@Configuration
public class ElasticSearchConfig {
private static final Logger log = LoggerFactory.getLogger(ElasticSearchConfig.class);
@Bean
public Client client() {
TransportClient client = null;
try {
client = new PreBuiltTransportClient(Settings.EMPTY)
.addTransportAddress(new TransportAddress(InetAddress.getByName("localhost"), 9300));
} catch (UnknownHostException e) {
log.error(e.toString());
}
return client;
}
}
Now I want to move to Elasticsearch and Kibana version 7.3.1.
I am unable to load the data into the Elasticsearch index. The following error is shown in the IDE log:
service.BulkApiService - NoNodeAvailableException[None of the configured nodes are available: [{#transport#-1}{XY6cYmf7Sn-DRZgzeq3PBA}{localhost}{127.0.0.1:9300}]]
(The error reported by Elasticsearch itself was attached as an image.)
Kindly help me to correct it.
Following is the corrected code:
@Service
public class BulkApiService {
private final Logger log = LoggerFactory.getLogger(BulkApiService.class);
private String FOLDER_PATH = "src/main/resources/allRecipesJson";
private String index = "test1";
private static final String TYPE = "test_type";
@Autowired
private RestHighLevelClient restHighLevelClient;
public String loadBulkData() throws IOException {
BulkRequest bulkRequest = new BulkRequest();
AtomicInteger counter = new AtomicInteger();
try (Stream<Path> filePathStream = Files.walk(Paths.get(FOLDER_PATH))) {
filePathStream.forEach(filePath -> {
if (Files.isRegularFile(filePath)) {
counter.getAndIncrement();
System.out.println(filePath.toFile().getAbsoluteFile().toString());
try {
String content = Files.readString(filePath);
JSONObject jsonObject1 = new JSONObject(content);
HashMap yourHashMap1 = new Gson().fromJson(jsonObject1.toString(), HashMap.class);
IndexRequest indexRequest = new IndexRequest(index, TYPE).source(yourHashMap1);
bulkRequest.add(indexRequest);
} catch (IOException e) {
e.printStackTrace();
}
}
});
}
try {
restHighLevelClient.bulk(bulkRequest, RequestOptions.DEFAULT);
} catch (IOException e) {
e.printStackTrace();
}
return "Bulk data loaded to index " + index + "";
}
}
Add the following dependency and remove the existing ones:
compile 'org.elasticsearch.client:elasticsearch-rest-high-level-client:6.4.2'
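The corrected service autowires a RestHighLevelClient; if you don't already have one configured, a minimal configuration bean might look like this (the host and port are assumptions, adjust them to your cluster):
@Configuration
public class RestClientConfig {

    @Bean(destroyMethod = "close")
    public RestHighLevelClient restHighLevelClient() {
        // uses org.apache.http.HttpHost and org.elasticsearch.client.RestClient;
        // assumes Elasticsearch is reachable over HTTP on localhost:9200
        return new RestHighLevelClient(
                RestClient.builder(new HttpHost("localhost", 9200, "http")));
    }
}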
Hi, I'm trying to build a PACS server in Java. dcm4che appears to be quite popular, but I'm unable to find any good examples for it.
As a starting point I inspected dcmqrscp, and it successfully stores a DICOM image. However, I cannot manage to handle a C-MOVE call. Here's my C-MOVE handler: it finds the requested DICOM file, adds a URL and other details, and doesn't throw any exception, yet the client doesn't receive any files.
private final class CMoveSCPImpl extends BasicCMoveSCP {
private final String[] qrLevels;
private final QueryRetrieveLevel rootLevel;
public CMoveSCPImpl(String sopClass, String... qrLevels) {
super(sopClass);
this.qrLevels = qrLevels;
this.rootLevel = QueryRetrieveLevel.valueOf(qrLevels[0]);
}
@Override
protected RetrieveTask calculateMatches(Association as, PresentationContext pc, final Attributes rq, Attributes keys) throws DicomServiceException {
QueryRetrieveLevel level = QueryRetrieveLevel.valueOf(keys, qrLevels);
try {
level.validateRetrieveKeys(keys, rootLevel, relational(as, rq));
} catch (Exception e) {
e.printStackTrace();
}
String moveDest = rq.getString(Tag.MoveDestination);
final Connection remote = new Connection("reciverAE",as.getSocket().getInetAddress().getHostAddress(), 11113);
if (remote == null)
throw new DicomServiceException(Status.MoveDestinationUnknown, "Move Destination: " + moveDest + " unknown");
List<T> matches = DcmQRSCP.this.calculateMatches(keys);
if (matches.isEmpty())
return null;
AAssociateRQ aarq;
Association storeas = null;
try {
aarq = makeAAssociateRQ(as.getLocalAET(), moveDest, matches);
storeas = openStoreAssociation(as, remote, aarq);
} catch (Exception e) {
e.printStackTrace();
}
BasicRetrieveTask<T> retrieveTask = null;
retrieveTask = new BasicRetrieveTask<T>(Dimse.C_MOVE_RQ, as, pc, rq, matches, storeas, new BasicCStoreSCU<T>());
retrieveTask.setSendPendingRSPInterval(getSendPendingCMoveInterval());
return retrieveTask;
}
private Association openStoreAssociation(Association as, Connection remote, AAssociateRQ aarq)
throws DicomServiceException {
try {
return as.getApplicationEntity().connect(as.getConnection(),
remote, aarq);
} catch (Exception e) {
throw new DicomServiceException(
Status.UnableToPerformSubOperations, e);
}
}
private AAssociateRQ makeAAssociateRQ(String callingAET,
String calledAET, List<T> matches) {
AAssociateRQ aarq = new AAssociateRQ();
aarq.setCalledAET(calledAET);
aarq.setCallingAET(callingAET);
for (InstanceLocator match : matches) {
if (aarq.addPresentationContextFor(match.cuid, match.tsuid)) {
if (!UID.ExplicitVRLittleEndian.equals(match.tsuid))
aarq.addPresentationContextFor(match.cuid,
UID.ExplicitVRLittleEndian);
if (!UID.ImplicitVRLittleEndian.equals(match.tsuid))
aarq.addPresentationContextFor(match.cuid,
UID.ImplicitVRLittleEndian);
}
}
return aarq;
}
private boolean relational(Association as, Attributes rq) {
String cuid = rq.getString(Tag.AffectedSOPClassUID);
ExtendedNegotiation extNeg = as.getAAssociateAC().getExtNegotiationFor(cuid);
return QueryOption.toOptions(extNeg).contains(
QueryOption.RELATIONAL);
}
}
I added the code below to send a DICOM file as a response:
String cuid = rq.getString(Tag.AffectedSOPClassUID);
String iuid = rq.getString(Tag.AffectedSOPInstanceUID);
String tsuid = pc.getTransferSyntax();
try {
DcmQRSCP.this.as=as;
File f = new File("D:\\dcmqrscpTestDCMDir\\1.2.840.113619.2.30.1.1762295590.1623.978668949.886\\1.2.840.113619.2.30.1.1762295590.1623.978668949.887\\1.2.840.113619.2.30.1.1762295590.1623.978668949.888");
FileInputStream in = new FileInputStream(f);
InputStreamDataWriter data = new InputStreamDataWriter(in);
// !1! as.cmove(cuid,1,keys,tsuid,"STORESCU");
as.cstore(cuid,iuid,1,data,tsuid,rspHandlerFactory.createDimseRSPHandler(f));
} catch (Exception e) {
e.printStackTrace();
}
It throws this exception:
org.dcm4che3.net.NoRoleSelectionException: No Role Selection for SOP Class 1.2.840.10008.5.1.4.1.2.2.2 - Study Root Query/Retrieve Information Model - MOVE as SCU negotiated
You should add a role to the application entity, for example:
applicationEntity.addTransferCapability(
new TransferCapability(null, "*", TransferCapability.Role.SCP, "*"));
Good day, everyone! I have a little question about testing and about generating a stub for a dependency through reflection. Let's assume I have a class named UnderTest:
class UnderTest{
/* field */
SomeLogic someLogic;
/* method that I'm testing */
List<MyObject> getCalculatedObjects(params) { /* logic based on the result of someLogic.getSomeStuff() */ }
}
class SomeLogic {
List<String> getSomeStuff(String param) { /* some complex and slow code; I don't actually want to test this code and would like to use a reflection invocation handler instead */ }
}
It's important for me not to change the legacy code, which wasn't designed for testing, and I have no reason other than testing to turn SomeLogic into an interface and so on.
I can't remember how to intercept method invocations on someLogic using reflection, and a Google search isn't helping me.
The class MainAPI is the main API of my module. NetworkProvider performs a long open-stream operation, which is why I want to stub it with my local files, but without referencing NetworkProvider directly. Again, sorry for my English.
public class MainAPI {
private final XPath xPath;
private final ItemParser itemParser;
private final ListItemsParser listItemsParser;
private final DateParser dateParser;
private final HtmlCleanUp htmlCleanUp;
private final NetworkProvider networkProvider;
public MainAPI(XPath xPath, ItemParser itemParser, ListItemsParser listItemsParser, DateParser dateParser, HtmlCleanUp htmlCleanUp, NetworkProvider networkProvider) {
this.xPath = xPath;
this.itemParser = itemParser;
this.listItemsParser = listItemsParser;
this.dateParser = dateParser;
this.htmlCleanUp = htmlCleanUp;
this.networkProvider = networkProvider;
}
public MainAPI() throws XPathExpressionException, IOException {
dateParser = new DateParser();
xPath = XPathFactory.newInstance().newXPath();
networkProvider = new NetworkProvider();
listItemsParser = new ListItemsParser(xPath, dateParser, item -> true);
itemParser = new ItemParser(xPath, dateParser, networkProvider);
htmlCleanUp = new HtmlCleanUpByCleaner();
}
public List<Item> getItemsFromSessionParsing(SessionParsing sessionParsing) {
listItemsParser.setCondition(sessionParsing.getFilter());
List<Item> result = new ArrayList<>();
Document cleanDocument;
InputStream inputStream;
for (int currentPage = sessionParsing.getStartPage(); currentPage <= sessionParsing.getLastPage(); currentPage++) {
try {
inputStream = networkProvider.openStream(sessionParsing.getUrlAddressByPageNumber(currentPage));
} catch (MalformedURLException e) {
e.printStackTrace();
break;
}
cleanDocument = htmlCleanUp.getCleanDocument(inputStream);
List<Item> list = null;
try {
list = listItemsParser.getList(cleanDocument);
} catch (XPathExpressionException e) {
e.printStackTrace();
break;
}
for (Item item : list) {
inputStream = null;
try {
inputStream = networkProvider.openStream("http://www.avito.ru" + item.getUrl());
} catch (MalformedURLException e) {
e.printStackTrace();
break;
}
cleanDocument = htmlCleanUp.getCleanDocument(inputStream);
try {
item.setDescription(itemParser.getDescription(cleanDocument));
} catch (XPathExpressionException e) {
e.printStackTrace();
}
}
result.addAll(list);
}
return result;
}
}
public class NetworkProvider {
private final ListCycleWrapper<Proxy> proxyList;
public NetworkProvider(List<Proxy> proxyList) {
this.proxyList = new ListCycleWrapper<>(proxyList);
}
public NetworkProvider() throws XPathExpressionException, IOException {
this(new ProxySiteParser().getProxyList(new HtmlCleanUpByCleaner().getCleanDocument(new URL("http://www.google-proxy.net").openStream())));
}
public int getSizeOfProxy() {
return proxyList.size();
}
public InputStream openStream(String urlAddress) throws MalformedURLException {
URL url = new URL(urlAddress);
while (!proxyList.isEmpty()) {
URLConnection con = null;
try {
con = url.openConnection(proxyList.getNext());
con.setConnectTimeout(6000);
con.setReadTimeout(6000);
return con.getInputStream();
} catch (IOException e) {
proxyList.remove();
}
}
return null;
}
}
All the dependencies of the class you want to test are injectable through its constructor, so there shouldn't be any problem stubbing these dependencies and injecting the stubs. You don't even need reflection. For example, using Mockito:
NetworkProvider stubbedNetworkProvider = mock(NetworkProvider.class);
MainAPI mainApi = new MainAPI(..., stubbedNetworkProvider);
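If the stub should also return something useful, you can define its behaviour, for example serving a local file instead of opening a real connection (a minimal sketch; the file path is an assumption, mock/when/anyString are the usual static imports from Mockito, and the surrounding test method needs to declare throws Exception because openStream declares a checked exception):
NetworkProvider stubbedNetworkProvider = mock(NetworkProvider.class);
// every call to openStream(...) now returns the content of a local test file
when(stubbedNetworkProvider.openStream(anyString()))
        .thenReturn(new FileInputStream("src/test/resources/sample-page.html"));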
You can also write a stub by yourself if you want:
NetworkProvider stubbedNetworkProvider = new NetworkProvider(Collections.emptyList()) {
// TODO override the methods to stub
};
MainAPI mainApi = new MainAPI(..., stubbedNetworkProvider);
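Concretely, overriding openStream in that anonymous subclass might look like this (the local file path is an assumption):
NetworkProvider stubbedNetworkProvider = new NetworkProvider(Collections.emptyList()) {
    @Override
    public InputStream openStream(String urlAddress) throws MalformedURLException {
        try {
            // read from a local test file instead of going through a proxy
            return new FileInputStream("src/test/resources/sample-page.html");
        } catch (FileNotFoundException e) {
            throw new RuntimeException(e);
        }
    }
};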
I can't change the mapping. Can anybody help me find the bug in my code?
I found this standard way of changing the mapping in several tutorials, but when I try to retrieve the mapping structure, only a blank mapping appears after the manual mapping creation.
After inserting some data the mapping specification does appear, because ES is of course using the default one. To be more specific, see the code below.
public class ElasticTest {
private String dbname = "ElasticSearch";
private String index = "indextest";
private String type = "table";
private Client client = null;
private Node node = null;
public ElasticTest(){
this.node = nodeBuilder().local(true).node();
this.client = node.client();
if(isIndexExist(index)){
deleteIndex(this.client, index);
createIndex(index);
}
else{
createIndex(index);
}
System.out.println("mapping structure before data insertion");
getMappings();
System.out.println("----------------------------------------");
createData();
System.out.println("mapping structure after data insertion");
getMappings();
}
public void getMappings() {
ClusterState clusterState = client.admin().cluster().prepareState()
.setFilterIndices(index).execute().actionGet().getState();
IndexMetaData inMetaData = clusterState.getMetaData().index(index);
MappingMetaData metad = inMetaData.mapping(type);
if (metad != null) {
try {
String structure = metad.getSourceAsMap().toString();
System.out.println(structure);
} catch (IOException e) {
e.printStackTrace();
}
}
}
private void createIndex(String index) {
XContentBuilder typemapping = buildJsonMappings();
String mappingstring = null;
try {
mappingstring = buildJsonMappings().string();
} catch (IOException e1) {
e1.printStackTrace();
}
client.admin().indices().create(new CreateIndexRequest(index)
.mapping(type, typemapping)).actionGet();
//try put mapping after index creation
/*
* PutMappingResponse response = null; try { response =
* client.admin().indices() .preparePutMapping(index) .setType(type)
* .setSource(typemapping.string()) .execute().actionGet(); } catch
* (ElasticSearchException e) { e.printStackTrace(); } catch
* (IOException e) { e.printStackTrace(); }
*/
}
private void deleteIndex(Client client, String index) {
try {
DeleteIndexResponse delete = client.admin().indices()
.delete(new DeleteIndexRequest(index)).actionGet();
if (!delete.isAcknowledged()) {
} else {
}
} catch (Exception e) {
}
}
private XContentBuilder buildJsonMappings(){
XContentBuilder builder = null;
try {
builder = XContentFactory.jsonBuilder();
builder.startObject()
.startObject("properties")
.startObject("ATTR1")
.field("type", "string")
.field("store", "yes")
.field("index", "analyzed")
.endObject()
.endObject()
.endObject();
} catch (IOException e) {
e.printStackTrace();
}
return builder;
}
private boolean isIndexExist(String index) {
ActionFuture<IndicesExistsResponse> exists = client.admin().indices()
.exists(new IndicesExistsRequest(index));
IndicesExistsResponse actionGet = exists.actionGet();
return actionGet.isExists();
}
private void createData(){
System.out.println("Data creation");
IndexResponse response=null;
for (int i=0;i<10;i++){
Map<String, Object> json = new HashMap<String, Object>();
json.put("ATTR1", "new value" + i);
response = this.client.prepareIndex(index, type)
.setSource(json)
.setOperationThreaded(false)
.execute()
.actionGet();
}
String _index = response.getIndex();
String _type = response.getType();
long _version = response.getVersion();
System.out.println("Index : "+_index+" Type : "+_type+" Version : "+_version);
System.out.println("----------------------------------");
}
public static void main(String[] args)
{
new ElasticTest();
}
}
I just want to set the ATTR1 field to analyzed to ensure fast queries.
What am I doing wrong? I also tried to create the mapping after index creation, but it leads to the same effect.
OK, I found the answer on my own. At the type level I had to wrap "properties" with the type name, e.g.:
"type1" : {
"properties" : {
.....
}
}
See the following code:
private XContentBuilder getMappingsByJson(){
XContentBuilder builder = null;
try {
builder = XContentFactory.jsonBuilder().startObject().startObject(type).startObject("properties");
for(int i = 1; i<5; i++){
builder.startObject("ATTR" + i)
.field("type", "integer")
.field("store", "yes")
.field("index", "analyzed")
.endObject();
}
builder.endObject().endObject().endObject();
}
catch (IOException e) {
e.printStackTrace();
}
return builder;
}
It creates mappings for the attributes ATTR1 to ATTR4. Now it is possible, for example, to define mappings for a list of different attributes dynamically. Hope it helps someone else.
I have a log file with the following output:
2226:org.powertac.common.TariffSpecification::6::new::1::CONSUMPTION
2231:org.powertac.common.Rate::7::new
2231:org.powertac.common.Rate::7::withValue::-0.5
2232:org.powertac.common.Rate::7::setTariffId::6
2232:org.powertac.common.TariffSpecification::6::addRate::7
2233:org.powertac.common.Tariff::6::new::6
2234:org.powertac.common.TariffSpecification::8::new::1::INTERRUPTIBLE_CONSUMPTION
2234:org.powertac.common.Rate::9::new
2234:org.powertac.common.Rate::9::withValue::-0.5
2235:org.powertac.common.Rate::9::setTariffId::8
After parsing the file, each line follows this pattern:
<id>:<full_classname>::<order_of_execution>::<new_or_method>::<params>
The parser works nicely and does what I expect. Now my goal is to marshal those same instructions into an XML file. I'm totally unfamiliar with this kind of task.
So the XML would have to contain both the new objects and the method calls.
I know that, using the Reflection API, I would use the <full_classname> to create an object of that class:
Class<?> cl = Class.forName( className );
How could I generate such an XML file from that Class object? Do I need a data structure, or a way to take all the methods and fields of the object and write them to the XML file? I know the Reflection API has such methods, but I need a more general idea / sample of how I could accomplish this task.
I started to write this method, but I'm not sure how it would work:
// would send in the object to be marshalled.
public void toXML(Object obj){
try {
JAXBContext context = JAXBContext.newInstance(Object.class);
Marshaller m = context.createMarshaller();
m.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, true);
} catch (JAXBException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
Here is a sample of the parsed file:
171269 org.powertac.common.Order 171417 new 4
171270 org.powertac.common.Order 171418 new 4
171271 org.powertac.common.Order 171419 new 4
The parse method looks like:
public void parse() throws ClassNotFoundException{
try{
//
// assure file exists before parsing
//
FileReader fr = new FileReader( this.filename );
BufferedReader textReader = new BufferedReader( fr );
String line;
File input = new File("test.xml");
//Integer id = 1;
while(( line = textReader.readLine()) != null ){
Pattern p = Pattern.compile("([^:]+):([^:]+)::([\\d]+)::([^:]+)::(.+)");
Matcher m = p.matcher( line );
if (m.find()) {
int id = Integer.parseInt(m.group(1));
String className = m.group(2);
int orderOfExecution = Integer.valueOf( m.group( 3 ));
String methodNameOrNew = m.group(4);
String[] arguments = m.group(5).split("::");
//
// there is the need to create a new object
//
if( methodNameOrNew.compareTo( "new" ) == 0 ){
//
// inner class
//
if( className.contains("$") == true){
continue;
}
else if( className.contains("genco")){
continue;
}
System.out.println("Loading class: " + className);
LogEntry le = new LogEntry(id, className, orderOfExecution, methodNameOrNew, arguments.toString());
Serializer ser = new Persister();
ser.write(le, input);
id++;
System.out.printf("%s %s %d %s %d\n", id, className, orderOfExecution, methodNameOrNew, arguments.length);
}
}
}
textReader.close();
}
catch( IOException ex ){
ex.printStackTrace();
}
catch( ArrayIndexOutOfBoundsException ex){
ex.printStackTrace();
} catch (SecurityException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IllegalArgumentException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (InstantiationException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (IllegalAccessException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
public void write() throws Exception{
File file = new File("test.xml");
Serializer ser = new Persister();
for(LogEntry entry : entries){
ser.write(entry, file);
}
}
Here is a first try using the Simple XML library:
@Default()
public class LogEntry
{
private int id;
private Object classInstance;
private String orderOfExecution;
private String newOrMethod;
private String params;
// throws 'Exception' only for testing
public LogEntry(int id, String className, String orderOfExecution, String newOrMethod, String params) throws Exception
{
this.id = id;
this.classInstance = Class.forName(className).newInstance();
this.orderOfExecution = orderOfExecution;
this.newOrMethod = newOrMethod;
this.params = params;
}
// getter / setter
}
And here is how to make XML out of the LogEntry class:
// Here is an example of an entry
LogEntry le = new LogEntry(3, "com.example.MyClass", "abc", "def", "ghi");
Serializer ser = new Persister();
ser.write(le, new File("test.xml"));
Simple XML is very easy to use; see here for tutorials and examples.
You can customize the whole XML via annotations in the LogEntry class; however, you can also let @Default() do everything for you :-)
LogEntry:
@Default()
public class LogEntry
{
private int id;
private Object classInstance;
private int orderOfExecution;
private String newOrMethod;
private List<Object> args;
public LogEntry(int id, Object classInstance, int orderOfExecution, String newOrMethod, List<Object> args)
{
this.id = id;
this.classInstance = classInstance;
this.orderOfExecution = orderOfExecution;
this.newOrMethod = newOrMethod;
this.args = args;
}
public LogEntry() { }
// getter / setter / toString / ... here
}
parse Method:
// Here all entries are saved
private List<LogEntry> entries = new ArrayList<>();
// ...
public void parse() throws Exception
{
// Don't compile this in a loop!
Pattern p = Pattern.compile("([^:]+):([^:]+)::([\\d]+)::([^:]+)::(.+)");
FileReader fr = new FileReader(this.filename);
BufferedReader textReader = new BufferedReader(fr);
String line;
while( (line = textReader.readLine()) != null )
{
Matcher m = p.matcher(line);
if( m.find() )
{
LogEntry entry = new LogEntry();
entry.setId(Integer.valueOf(m.group(1)));
String className = m.group(2);
entry.setOrderOfExecution(Integer.valueOf(m.group(3)));
String methodNameOrNew = m.group(4);
entry.setNewOrMethod(methodNameOrNew); // required in LogEntry?
Object[] arguments = m.group(5).split("::");
entry.setArgs(Arrays.asList(arguments));
if( methodNameOrNew.equals("new") )
{
if( className.contains("$") == true || className.contains("genco") )
continue;
createInstance(className, arguments);
}
else
{
callMethod(className, methodNameOrNew, arguments);
}
// XXX: for testing - set the instance 'not null'
entry.setClassInstance("only for testing");
entries.add(entry);
}
}
textReader.close();
}
Edit:
Let's say your parse() method, the list, etc. are in the class Example:
@Root
public class Example
{
private File filename = new File("test.txt");
@ElementList
private List<LogEntry> entries = new ArrayList<>();
// ...
// Only 'entries' is annotated as entry -> only it will get serialized
public void storeToXml(File f) throws Exception
{
Serializer ser = new Persister();
ser.write(this, f);
}
public void parse() throws Exception
{
// ...
}
}
Note: For this example I've added entry.setClassInstance("only for testing"); above entries.add(...); otherwise the instance would be null.
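Usage could then look roughly like this (a minimal sketch; the output file name is an assumption):
Example example = new Example();
example.parse();                              // fills the 'entries' list from test.txt
example.storeToXml(new File("entries.xml"));  // serializes all entries in one document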
Edit #2: Helper methods for parse()
private Object createInstance(String className, Object args[])
{
// TODO
return null;
}
private void callMethod(String className, String methodName, Object args[])
{
// TODO
}
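Here is a possible reflection-based sketch of these helpers, under the assumption that every constructor and method you need accepts only String parameters (that matches the log format, but it is an assumption about the target classes). Note that the second helper takes the already created instance instead of the class name, because invoking an instance method needs a target object:
private Object createInstance(String className, Object[] args)
{
    try {
        // assume a constructor with one String parameter per logged argument
        Class<?>[] paramTypes = new Class<?>[args.length];
        Arrays.fill(paramTypes, String.class);
        return Class.forName(className).getConstructor(paramTypes).newInstance(args);
    } catch (Exception e) {
        e.printStackTrace();
        return null;
    }
}

private void callMethod(Object target, String methodName, Object[] args)
{
    try {
        // assume a method with one String parameter per logged argument
        Class<?>[] paramTypes = new Class<?>[args.length];
        Arrays.fill(paramTypes, String.class);
        target.getClass().getMethod(methodName, paramTypes).invoke(target, args);
    } catch (Exception e) {
        e.printStackTrace();
    }
}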