I need to load a big .csv file into my database. I have a Price entity that has a reference to a Distributor:
public class Price {
@Id
private String id;
private Date dtComu;
private Double price;
@ManyToOne
@JoinColumn(name = "idDistributor")
private Distributor distributor;
// constructor, getters & setters
}
The CSV with Price data that I am trying to upload has a reference to the Distributor id for each Price. In my ItemReader I have:
public FlatFileItemReader<Price> priceReader() {
FlatFileItemReader<Price> reader = new FlatFileItemReader<>();
reader.setResource(new ClassPathResource("file.csv"));
reader.setLinesToSkip(2);
reader.setRecordSeparatorPolicy(recordSeparatorPolicy);
reader.setLineMapper(new DefaultLineMapper<Price>() {
{
setLineTokenizer(new DelimitedLineTokenizer() {
{
setStrict(false);
setDelimiter(";");
setNames(new String[] { "idDistributor", "price", "dtComu" });
}
});
setFieldSetMapper(customMapper());
}
});
return reader;
}
And my customMapper is:
public class CustomMapper implements FieldSetMapper<Price> {
@Autowired
DistributorRepository repository;
SimpleDateFormat formatter = new SimpleDateFormat("dd/MM/yyyy HH:mm:ss");
@Override
public Price mapFieldSet(FieldSet fieldSet) throws BindException {
Distributor distributor = repository.findById(fieldSet.readInt("idDistributor")).orElse(null);
if (distributor == null) {
return null;
}
Price p = new Price();
p.setDistributor(distributor);
p.setPrice(fieldSet.readDouble("price"));
try {
p.setDtComu(formatter.parse(fieldSet.readString("dtComu")));
} catch (ParseException e) {
// TODO Auto-generated catch block
p.setDtComu(new Date());
}
// here I will create an ID for the Price that needs to always be unique
p.setId(distributor.getIdImpianto()+ fieldSet.readString("dtComu"));
return p;
}
}
My Writer is as follows:
public JdbcBatchItemWriter<Price> prezzoWriter() {
JdbcBatchItemWriter<Price> writer = new JdbcBatchItemWriter<Price>();
writer.setItemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<Price>());
writer.setSql("INSERT INTO price (id_distributor, price, dt_comu) "
+ "VALUES (:idDistributor, :price, :dtComu)");
writer.setDataSource(dataSource);
return writer;
}
I run the program and it completes without anything being saved. Could I be doing something very wrong? I just started with Spring Batch and have to build something a bit complex.
The JdbcBatchItemWriter knows nothing about the JPA context. You need to use a DataSourceTransactionManager for this to work.
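For example, a minimal configuration sketch (assuming the step is assembled with a StepBuilderFactory; the bean names, the loadPricesStep method and the chunk size of 100 below are made up, while priceReader() and prezzoWriter() are the beans from your question):
import javax.sql.DataSource;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.jdbc.datasource.DataSourceTransactionManager;
// Bind the step to a DataSourceTransactionManager on the same DataSource as the JdbcBatchItemWriter,
// instead of the JPA transaction manager, so the JDBC inserts are actually committed.
@Bean
public DataSourceTransactionManager batchTransactionManager(DataSource dataSource) {
return new DataSourceTransactionManager(dataSource);
}
@Bean
public Step loadPricesStep(StepBuilderFactory steps, DataSourceTransactionManager batchTransactionManager) {
return steps.get("loadPricesStep")
.<Price, Price>chunk(100)
.reader(priceReader())
.writer(prezzoWriter())
.transactionManager(batchTransactionManager)
.build();
}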
I started to learn Kafka, and now
I'm working on sending/receiving a serialized/deserialized Java class.
My question is: what have I missed in my config, so that I can't deserialize the object from Kafka?
here is my class:
public class Foo {
private String item;
private int quantity;
private Double price;
public Foo(String item, int quantity, final double price) {
this.item = item;
this.quantity = quantity;
this.price = price;
}
public String getItem() { return item; }
public int getQuantity() { return quantity; }
public Double getPrice() { return price; }
public void setQuantity(int quantity) { this.quantity = quantity; }
public void setPrice(double price) { this.price = price; }
@Override
public String toString() {
return "item=" + item + ", quantity=" + quantity + ", price=" + price;
}
}
my Properties in main class:
producerPropsObject.put(ProducerConfig.CLIENT_ID_CONFIG,
AppConfigs.applicationProducerSerializedObject);
producerPropsObject.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG,
AppConfigs.bootstrapServers);
producerPropsObject.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
StringSerializer.class.getName());
producerPropsObject.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
FooSerializer.class.getName());
producerPropsObject.put("topic", AppConfigs.topicNameForSerializedObject);
consumerPropsObject.put(ConsumerConfig.GROUP_ID_CONFIG, AppConfigs.applicationProducerSerializedObject);
consumerPropsObject.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, AppConfigs.bootstrapServers);
consumerPropsObject.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
consumerPropsObject.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,FooDeserializer.class.getName());
consumerPropsObject.put(ConsumerConfig.MAX_POLL_INTERVAL_MS_CONFIG, 300000);
consumerPropsObject.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, true);
consumerPropsObject.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
consumerPropsObject.put("topic", AppConfigs.topicNameForSerializedObject);
following are serializer/deserializer implementations:
public class FooSerializer implements org.apache.kafka.common.serialization.Serializer {
public void configure(Map map, boolean b) { }
public byte[] serialize(String s, Object o) {
try {
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ObjectOutputStream oos = new ObjectOutputStream(baos);
oos.writeObject(o);
oos.close();
byte[] b = baos.toByteArray();
return b;
} catch (IOException e) { return new byte[0]; }
}
public void close() { }
}
public class FooDeserializer implements org.apache.kafka.common.serialization.Deserializer {
@Override
public void close() { }
@Override
public Foo deserialize(String arg0, byte[] arg1) {
//Option #1:
//ObjectMapper mapper = new ObjectMapper();
//Option #2:
JsonFactory factory = new JsonFactory();
factory.enable(JsonParser.Feature.ALLOW_SINGLE_QUOTES);
ObjectMapper mapper = new ObjectMapper(factory);
Foo fooObj = null;
try {
//Option #1:
//fooObj = mapper.readValue(arg1, Foo.class); // BREAKS HERE!!!
//Option #2:
fooObj = mapper.reader().forType(Foo.class).readValue(arg1); // BREAKS HERE!!!
}
catch (Exception e) { e.printStackTrace(); }
return fooObj;
}
}
And finally, the way I'm trying to produce and consume my Foo from main.
The producing seems to work fine, because I later see my key and value in the Kafka topic:
public void produceObjectToKafka(final Properties producerProps) {
final String[] ar = new String[]{"Matrix", "Naked Gun", "5th Element", "Die Hard", "Gone with a wind"};
KafkaProducer<String, byte[]> producer = new KafkaProducer<>(producerProps);
final Foo j = new Foo(ar[getAnInt(4)], getAnInt(10), getAnDouble());
producer.send(new ProducerRecord<>(producerProps.getProperty("topic"), j.getItem(), j.toString().getBytes()));
producer.flush();
producer.close();
}
However, here is how my consumer catches the output:
public void consumeFooFromKafka(final Properties consumerProps) {
final Consumer<String, Foo> myConsumer = new KafkaConsumer<>(consumerProps);
final Thread separateThread = new Thread(() -> {
try {
myConsumer.subscribe(Collections.singletonList(consumerProps.getProperty("topic")));
while (continueToRunFlag) {
final StringBuilder sb = new StringBuilder();
final ConsumerRecords<String, Foo> consumerRecords = myConsumer.poll(Duration.ofMillis(10));
if (consumerRecords.count() > 0) {
for (ConsumerRecord<String, Foo> cRec : consumerRecords) {
sb.append( cRec.key() ).append("<<").append(cRec.value().getItem() + ",").append(cRec.value().getQuantity() + ",").append(cRec.value().getPrice()).append("|");
}
}
if (sb.length() > 0) { System.out.println(sb.toString()); }
}
}
finally {
myConsumer.close();
}
});
separateThread.start();
}
=======================================
So, by running "consumeFooFromKafka", when it triggers "FooDeserializer", I always get the same error (regardless of Option #1 or Option #2):
exception:
Method threw 'com.fasterxml.jackson.core.JsonParseException' exception.
detailedMessage:
Unexpected character ('¬' (code 172)): expected a valid value (JSON String, Number, Array, Object or token 'null', 'true' or
'false')
I will be very appreciative of any help.
Thank you in advance,
Steve
If you want to deserialize from JSON, then you need to serialize to JSON as well: use Jackson in your serializer too, and everything should be fine.
public class FooSerializer implements org.apache.kafka.common.serialization.Serializer {
public void configure(Map map, boolean b) { }
public byte[] serialize(String s, Object o) {
try {
ObjectMapper om = new ObjectMapper();//objectmapper from jackson
byte[] b = om.writeValueAsString(o).getBytes();
return b;
} catch (IOException e) { return new byte[0]; }
}
public void close() { }
}
I don't know why you're writing with a ByteArrayOutputStream (Java serialization) but trying to read JSON in the deserializer, but that mismatch explains the error. You could even test that without using Kafka at all by invoking the serialize/deserialize methods directly.
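For example, a quick round-trip check outside Kafka could look like this sketch (it assumes the Jackson-based FooSerializer shown in the previous answer; also note that Foo as posted has no no-arg constructor, which Jackson generally needs for deserialization unless you add one or a @JsonCreator):
// Hypothetical stand-alone check of the (de)serializers, no broker involved.
FooSerializer serializer = new FooSerializer();
FooDeserializer deserializer = new FooDeserializer();
byte[] bytes = serializer.serialize("any-topic", new Foo("Matrix", 2, 9.99));
Foo roundTripped = deserializer.deserialize("any-topic", bytes);
System.out.println(roundTripped); // expected: item=Matrix, quantity=2, price=9.99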
In the link provided, the serializer uses objectMapper.writeValueAsString, which returns JSON text, and not the Java specific outputstream. If you wanted to consume and produce data between different programming languages (as is often the case in most companies), you'd want to avoid such specific serialization formats
Note: Confluent provides Avro, Protobuf, and JSON serializers for Kafka, so you shouldn't need to write your own if you want to use one of those formats
I have written a controller which by default uses MotorUploadService (for Motor upload), but I need to make a factory design so that,
based on parentPkId, it calls HealUploadService, TempUploadService, PersonalUploadService, etc., each of which will have separate file processing stages.
The controller is below.
@RequestMapping(value = "/csvUpload", method = RequestMethod.POST)
public List<String> csvUpload(@RequestParam String parentPkId, @RequestParam List<MultipartFile> files)
throws IOException, InterruptedException, ExecutionException, TimeoutException {
log.info("Entered method csvUpload() of DaoController.class");
List<String> response = new ArrayList<String>();
ExecutorService executor = Executors.newFixedThreadPool(10);
CompletionService<String> compService = new ExecutorCompletionService<String>(executor);
List< Future<String> > futureList = new ArrayList<Future<String>>();
for (MultipartFile f : files) {
compService.submit(new ProcessMutlipartFile(f ,parentPkId,uploadService));
futureList.add(compService.take());
}
for (Future<String> f : futureList) {
long timeout = 0;
System.out.println(f.get(timeout, TimeUnit.SECONDS));
response.add(f.get());
}
executor.shutdown();
return response;
}
Here is the ProcessMutlipartFile class, which implements the Callable interface. CompletionService's compService.submit() invokes this class, which in turn executes the call() method, which processes a file.
public class ProcessMutlipartFile implements Callable<String>
{
private MultipartFile file;
private String temp;
private MotorUploadService motUploadService;
public ProcessMutlipartFile(MultipartFile file,String temp, MotorUploadService motUploadService )
{
this.file=file;
this.temp=temp;
this.motUploadService=motUploadService;
}
public String call() throws Exception
{
return motUploadService.csvUpload(temp, file);
}
}
Below is the MotorUploadService class, where I'm processing the uploaded CSV file line by line and then calling the validateCsvData() method to validate the data,
which returns an error object holding the line number and the errors associated with it.
If csvErrorRecords is null, the line is error-free and I proceed with saving to the DB;
else I save the errorList to the DB and return an upload failure.
@Component
public class MotorUploadService {
@Value("${external.resource.folder}")
String resourceFolder;
public String csvUpload(String parentPkId, MultipartFile file) {
String OUT_PATH = resourceFolder;
try {
DateFormat df = new SimpleDateFormat("yyyyMMddhhmmss");
String fileName = file.getOriginalFilename().split("\\.")[0] + df.format(new Date()) + file.getOriginalFilename().split("\\.")[1];
Path path = Paths.get(OUT_PATH, fileName);
Files.copy(file.getInputStream(), path, StandardCopyOption.REPLACE_EXISTING);
}
catch(IOException e){
e.printStackTrace();
return "Failed to Upload File...try Again";
}
List<TxnMpMotSlaveRaw> txnMpMotSlvRawlist = new ArrayList<TxnMpMotSlaveRaw>();
try {
BufferedReader br = new BufferedReader(new InputStreamReader(file.getInputStream()));
String line = "";
int header = 0;
int lineNum = 1;
TxnMpSlaveErrorNew txnMpSlaveErrorNew = new TxnMpSlaveErrorNew();
List<CSVErrorRecords> errList = new ArrayList<CSVErrorRecords>();
while ((line = br.readLine()) != null) {
// TO SKIP HEADER
if (header == 0) {
header++;
continue;
}
lineNum++;
header++;
// Use Comma As Separator
String[] csvDataSet = line.split(",");
CSVErrorRecords csvErrorRecords = validateCsvData(lineNum, csvDataSet);
System.out.println("Errors from csvErrorRecords is " + csvErrorRecords);
if (csvErrorRecords == null || csvErrorRecords.getRecordNo() == 0) {
//Function to Save to Db
} else {
// add to errList
continue;
}
}
if (txnMpSlaveErrorNew.getErrRecord().size() == 0) {
//save all
return "Successfully Uploaded " + file.getOriginalFilename();
}
else {
// save the error in db;
return "Failure as it contains Faulty Information" + file.getOriginalFilename();
}
} catch (IOException ex) {
ex.printStackTrace();
return "Failure Uploaded " + file.getOriginalFilename();
}
}
private TxnMpMotSlaveRaw saveCsvData(String[] csvDataSet, String parentPkId) {
/*
Mapping csvDataSet to PoJo
returning Mapped Pojo;
*/
}
private CSVErrorRecords validateCsvData(int lineNum, String[] csvDataSet) {
/*
Logic for Validation goes here
*/
}
}
How can I make this a factory design pattern from the controller,
so that if
parentPkId='Motor' it calls MotorUploadService, and if
parentPkId='Heal' it calls HealUploadService?
I'm not very familiar with the factory design pattern, please help me out.
Thanks in advance.
If I understood the question, in essence you would create an interface, and then return a specific implementation based upon the desired type.
So
public interface UploadService {
String csvUpload(String temp, MultipartFile file) throws IOException;
}
The particular implementations
public class MotorUploadService implements UploadService
{
public String csvUpload(String temp, MultipartFile file) {
...
}
}
public class HealUploadService implements UploadService
{
public String csvUpload(String temp, MultipartFile file) {
...
}
}
Then a factory
public class UploadServiceFactory {
public UploadService getService(String type) {
if ("Motor".equals(type)) {
return new MotorUploadService();
}
else if ("Heal".equals(type)) {
return new HealUploadService();
}
throw new IllegalArgumentException("Unknown upload type: " + type);
}
}
The factory might cache the particular implementations. One can also use an abstract class rather than an interface if appropriate.
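As a rough sketch of that caching idea (the map-based cache below is an assumption, not part of the original answer), the factory could hold one instance per type:
import java.util.HashMap;
import java.util.Map;
public class UploadServiceFactory {
// one shared instance per upload type
private final Map<String, UploadService> services = new HashMap<>();
public UploadServiceFactory() {
services.put("Motor", new MotorUploadService());
services.put("Heal", new HealUploadService());
}
public UploadService getService(String type) {
UploadService service = services.get(type);
if (service == null) {
throw new IllegalArgumentException("Unknown upload type: " + type);
}
return service;
}
}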
I think you currently have a class UploadService but that is really the MotorUploadService if I followed your code, so I would rename it to be specific.
Then in the controller, presumably having used injection for the UploadServiceFactory
...
for (MultipartFile f : files) {
UploadService uploadSrvc = uploadServiceFactory.getService(parentPkId);
compService.submit(new ProcessMutlipartFile(f, parentPkId, uploadSrvc));
futureList.add(compService.take());
}
And with some additional changes in your classes:
public class ProcessMutlipartFile implements Callable<String>
{
private MultipartFile file;
private String temp;
private UploadService uploadService;
// change to take the interface UploadService
public ProcessMutlipartFile(MultipartFile file,String temp, UploadService uploadService )
{
this.file=file;
this.temp=temp;
this.uploadService=uploadService;
}
public String call() throws Exception
{
return uploadService.csvUpload(temp, file);
}
}
I am trying to write some integration tests for methods that need to extract data from MongoDB. In detail, I am using the embedded Mongo provided by the Spring Data project, which under the hood is supplied by Flapdoodle.
I need to import some JSON files into the embedded Mongo. I have looked at the tests provided with Flapdoodle, but I am not able to understand how they integrate with the magic given by Spring Data + Spring Boot.
Can anyone post some clarifying snippets?
You can create a junit rule (ExternalResource) which runs before and after each test. Check the MongoEmbeddedRule class to get some idea on the implementation details.
Integration test:
@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = RANDOM_PORT)
public abstract class TestRunner {
@Autowired
protected MongoTemplate mongoTemplate;
@Rule
public MongoEmbeddedRule mongoEmbeddedRule = new MongoEmbeddedRule(this);
}
ExternalResource Rule:
public class MongoEmbeddedRule extends ExternalResource {
private final Object testClassInstance;
private final Map<String, Path> mongoCollectionDataPaths;
private final String fieldName;
private final String getterName;
public MongoEmbeddedRule(final Object testClassInstance) {
this(testClassInstance, "mongoTemplate", "getMongoTemplate");
}
protected MongoEmbeddedRule(final Object testClassInstance, final String fieldName, final String getterName) {
this.fieldName = fieldName;
this.getterName = getterName;
this.testClassInstance = testClassInstance;
this.mongoCollectionDataPaths = mongoExtendedJsonFilesLookup();
}
@Override
protected void before() {
dropCollections();
createAndPopulateCollections();
}
@Override
protected void after() {
}
protected Set<String> getMongoCollectionNames() {
return mongoCollectionDataPaths.keySet();
}
public void dropCollections() {
getMongoCollectionNames().forEach(collectionName -> getMongoTemplate().dropCollection(collectionName));
}
protected void createAndPopulateCollections() {
mongoCollectionDataPaths.forEach((key, value) -> insertDocumentsFromMongoExtendedJsonFile(value, key));
}
protected MongoTemplate getMongoTemplate() {
try {
Object value = ReflectionTestUtils.getField(testClassInstance, fieldName);
if (value instanceof MongoTemplate) {
return (MongoTemplate) value;
}
value = ReflectionTestUtils.invokeGetterMethod(testClassInstance, getterName);
if (value instanceof MongoTemplate) {
return (MongoTemplate) value;
}
} catch (final IllegalArgumentException e) {
// throw exception with dedicated message at the end
}
throw new IllegalArgumentException(
String.format(
"%s expects either field '%s' or method '%s' in order to access the required MongoTemmplate",
this.getClass().getSimpleName(), fieldName, getterName));
}
private Map<String, Path> mongoExtendedJsonFilesLookup() {
Map<String, Path> collections = new HashMap<>();
try {
Files.walk(Paths.get("src","test","resources","mongo"))
.filter(Files::isRegularFile)
.forEach(filePath -> collections.put(
filePath.getFileName().toString().replace(".json", ""),
filePath));
} catch (IOException e) {
e.printStackTrace();
}
return collections;
}
private void insertDocumentsFromMongoExtendedJsonFile(Path path, String collectionName) {
try {
List<Document> documents = new ArrayList<>();
Files.readAllLines(path).forEach(l -> documents.add(Document.parse(l)));
getMongoTemplate().getCollection(collectionName).insertMany(documents);
System.out.println(documents.size() + " documents loaded for " + collectionName + " collection.");
} catch (IOException e) {
e.printStackTrace();
}
}
}
JSON file (names.json) with MongoDB Extended JSON, where every document is on one line and the collection name is the filename without the extension:
{ "_id" : ObjectId("594d324d5b49b78da8ce2f28"), "someId" : NumberLong(1), "name" : "Some Name 1", "lastModified" : ISODate("1970-01-01T00:00:00Z")}
{ "_id" : ObjectId("594d324d5b49b78da8ce2f29"), "someId" : NumberLong(2), "name" : "Some Name 2", "lastModified" : ISODate("1970-01-01T00:00:00Z")}
You can have a look at the following test class, provided by "flapdoodle". The test shows how to import a JSON file containing the collection dataset:
MongoImportExecutableTest.java
You could theoretically also import a whole dump of a database (using MongoDB restore):
MongoRestoreExecutableTest.java
You can create an abstract class and have setup logic to start mongod and mongoimport process.
AbstractMongoDBTest.java
public abstract class AbstractMongoDBTest {
private MongodProcess mongodProcess;
private MongoImportProcess mongoImportProcess;
private MongoTemplate mongoTemplate;
void setup(String dbName, String collection, String jsonFile) throws Exception {
String ip = "localhost";
int port = 12345;
IMongodConfig mongodConfig = new MongodConfigBuilder().version(Version.Main.PRODUCTION)
.net(new Net(ip, port, Network.localhostIsIPv6()))
.build();
MongodStarter starter = MongodStarter.getDefaultInstance();
MongodExecutable mongodExecutable = starter.prepare(mongodConfig);
File dataFile = new File(Thread.currentThread().getContextClassLoader().getResource(jsonFile).getFile());
MongoImportExecutable mongoImportExecutable = mongoImportExecutable(port, dbName,
collection, dataFile.getAbsolutePath()
, true, true, true);
mongodProcess = mongodExecutable.start();
mongoImportProcess = mongoImportExecutable.start();
mongoTemplate = new MongoTemplate(new MongoClient(ip, port), dbName);
}
private MongoImportExecutable mongoImportExecutable(int port, String dbName, String collection, String jsonFile,
Boolean jsonArray, Boolean upsert, Boolean drop) throws
IOException {
IMongoImportConfig mongoImportConfig = new MongoImportConfigBuilder()
.version(Version.Main.PRODUCTION)
.net(new Net(port, Network.localhostIsIPv6()))
.db(dbName)
.collection(collection)
.upsert(upsert)
.dropCollection(drop)
.jsonArray(jsonArray)
.importFile(jsonFile)
.build();
return MongoImportStarter.getDefaultInstance().prepare(mongoImportConfig);
}
@AfterEach
void clean() {
mongoImportProcess.stop();
mongodProcess.stop();
}
public MongoTemplate getMongoTemplate(){
return mongoTemplate;
}
}
YourTestClass.java
public class YourTestClass extends AbstractMongoDBTest{
@BeforeEach
void setup() throws Exception {
super.setup("db", "collection", "jsonfile");
}
@Test
void test() throws Exception {
}
}
I am doing a Java project with Spring, so I am using the Jackson library to produce the JSON format.
My Java class is:
public class ChatInteraction extends Interaction{
private int ticketId;
private String name;
private String interactionType ;
private LinkedList<InteractionInfo> interactions;
public ChatInteraction(Message response) {
super(response);
interactions = new LinkedList<InteractionInfo>();
}
public int getTicketId() {
return ticketId;
}
public void setTicketId(int ticketId) {
this.ticketId = ticketId;
System.out.println("Ticket Id for Interaction : "+this.ticketId);
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
System.out.println("Name for Interaction : "+this.name);
}
public LinkedList<InteractionInfo> getInteractions() {
return interactions;
}
public String getInteractionType() {
return interactionType;
}
public void setInteractionType(String interactionType) {
this.interactionType = interactionType;
}
public void addInteraction(InteractionInfo interaction) {
this.interactions.add(interaction);
}
public void accept(int proxyId,String intxnId,int ticketId){
RequestAccept reqAccept = RequestAccept.create();
reqAccept.setProxyClientId(proxyId);
reqAccept.setInteractionId(intxnId);
reqAccept.setTicketId(ticketId);
System.out.println("New Chat RequestAccept Request Object ::: "+reqAccept.toString());
try{
if(intxnProtocol.getState() == ChannelState.Opened){
Message response = intxnProtocol.request(reqAccept);
System.out.println("New Chat RequestAccept Response ::: "+response.toString());
if(response != null ){
if( response.messageId() == EventAck.ID){
System.out.println("Accept new chat success !");
//EventAccepted accept = (EventAccepted)response;
//return "New chat Interaction accepted";
}else if(response.messageId() == EventError.ID){
System.out.println("Accept new chat Failed !");
//return "New chat Interaction rejected";
}
}
}else{
System.out.println("RequestAccept failure due to Interaction protocol error !");
}
}catch(Exception acceptExcpetion){
acceptExcpetion.printStackTrace();
}
}
public void join(String sessionId, String subject) {
RequestJoin join = RequestJoin.create();
join.setMessageText(MessageText.create(""));
join.setQueueKey("Resources:"); //Add the chat-inbound-key in multimedia of the optional tab values of the softphone application in CME
join.setSessionId(sessionId);
join.setVisibility(Visibility.All);
join.setSubject(subject);
KeyValueCollection kvc = new KeyValueCollection();
join.setUserData(kvc);
System.out.println("Join Request Object ::: "+join.toString());
try {
if(basicProtocol != null && basicProtocol.getState() == ChannelState.Opened){
Message response = basicProtocol.request(join);
if(response != null){
System.out.println("RequestJoin response ::: "+response);
if (response.messageId() == EventSessionInfo.ID) {
System.out.println("Join Request success !");
}else{
System.out.println("Join Request Failed !");
}
}
}else{
System.out.println("BasicChat protocol Error !");
//return "BasicChat protocol Error !";
}
} catch (ProtocolException e) {
e.printStackTrace();
} catch (IllegalStateException e) {
e.printStackTrace();
}
}
}
I need to get only the interactionType and interactions properties of this class in JSON format, like:
{"interactionType":"invite","interactions" : [{"xx":"XX","yy":"YY"},{"xx":"XX","yy":"YY"}]}
Note:
I don't need the other properties of this class.
Also, there is no setter for the interactions property. Instead of that, I have the addInteraction() method. Does this affect any behaviour of the JSON conversion?
Also, I have some other methods like accept(...) and join(...).
I am using jackson-all-1.9.0.jar.
You can annotate the unneeded fields with @JsonIgnore - see Jackson's manual on annotations. That's what it will look like, using your code:
public class ChatInteraction extends Interaction{
@JsonIgnore
private int ticketId;
@JsonIgnore
private String name;
private String interactionType ;
private LinkedList<InteractionInfo> interactions;
You can achieve this by using the @JsonIgnoreProperties annotation, which can be used at the class level.
From JavaDoc:
Annotation that can be used to either suppress serialization of
properties (during serialization), or ignore processing of JSON
properties read (during deserialization).
Example:
// to prevent specified fields from being serialized or deserialized
// (i.e. not include in JSON output; or being set even if they were included)
@JsonIgnoreProperties({ "internalId", "secretKey" })
Example, In your case:
@JsonIgnoreProperties({ "ticketId", "name" })
public class ChatInteraction extends Interaction{
....
}
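For instance, a quick check of the output could look like this (just a sketch; the chatInteraction variable is hypothetical, and with jackson-all-1.9.0 the ObjectMapper comes from org.codehaus.jackson.map):
ObjectMapper mapper = new ObjectMapper();
try {
// only the non-ignored properties end up in the JSON output
String json = mapper.writeValueAsString(chatInteraction);
System.out.println(json);
// e.g. {"interactionType":"invite","interactions":[{"xx":"XX","yy":"YY"}]}
} catch (Exception e) {
e.printStackTrace();
}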
Finally I got the solution from the other answers in this thread and similar answers on Stack Overflow:
I marked @JsonIgnore on the unwanted fields in the subclass and superclass, as suggested by fvu.
I used myObjectMapper.setVisibility(JsonMethod.FIELD, Visibility.ANY); on my ObjectMapper, as suggested in another thread, like:
ObjectMapper mapp = new ObjectMapper();
mapp.setVisibility(JsonMethod.FIELD, Visibility.ANY);
try {
json = mapp.writeValueAsString(info);
info.clear();
System.out.println("Chat Info in JSON String is :::> "+json);
} catch (Exception e) {
e.printStackTrace();
}
Is there any module in Java equivalent to Python's shelve module? I need this to achieve dictionary-like taxonomic data access. Dictionary-like taxonomic data access is a powerful way to save Python objects in a persistent, easily accessible database format. I need something for the same purpose but in Java.
I also needed this, so I wrote one. A bit late, but maybe it'll help.
It doesn't implement a close() method; just use sync(), since it only holds the file open when actually writing it.
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.util.HashMap;
public class Shelf extends HashMap<String, Object> {
private static final long serialVersionUID = 7127639025670585367L;
private final File file;
public static Shelf open(File file) {
Shelf shelf = null;
try {
if (file.exists()) {
final FileInputStream fis = new FileInputStream(file);
ObjectInputStream ois = new ObjectInputStream(fis);
shelf = (Shelf) ois.readObject();
ois.close();
fis.close();
} else {
shelf = new Shelf(file);
shelf.sync();
}
} catch (Exception e) {
// TODO: handle errors
}
return shelf;
}
// Shelf objects can only be created or opened by the Shelf.open method
private Shelf(File file) {
this.file = file;
sync();
}
public void sync() {
try {
final FileOutputStream fos = new FileOutputStream(file);
ObjectOutputStream oos = new ObjectOutputStream(fos);
oos.writeObject(this);
oos.close();
fos.close();
} catch (Exception e) {
// TODO: handle errors
}
}
// Simple Test Case
public static void main(String[] args) {
Shelf shelf = Shelf.open(new File("test.obj"));
if (shelf.containsKey("test")) {
System.out.println(shelf.get("test"));
} else {
System.out.println("Creating test string. Run the program again.");
shelf.put("test", "Hello Shelf!");
shelf.sync();
}
}
}
You could use a serialisation library like Jackson which serialises POJOs to JSON.
An example from the tutorial:
Jackson's org.codehaus.jackson.map.ObjectMapper "just works" for
mapping JSON data into plain old Java objects ("POJOs"). For example,
given JSON data
{
"name" : { "first" : "Joe", "last" : "Sixpack" },
"gender" : "MALE",
"verified" : false,
"userImage" : "Rm9vYmFyIQ=="
}
It takes two lines of Java to turn it into a User instance:
ObjectMapper mapper = new ObjectMapper(); // can reuse, share globally
User user = mapper.readValue(new File("user.json"), User.class);
Where the User class looks something like this (from an entry on Tatu's blog):
public class User {
public enum Gender { MALE, FEMALE };
public static class Name {
private String _first, _last;
public String getFirst() { return _first; }
public String getLast() { return _last; }
public void setFirst(String s) { _first = s; }
public void setLast(String s) { _last = s; }
}
private Gender _gender;
private Name _name;
private boolean _isVerified;
private byte[] _userImage;
public Name getName() { return _name; }
public boolean isVerified() { return _isVerified; }
public Gender getGender() { return _gender; }
public byte[] getUserImage() { return _userImage; }
public void setName(Name n) { _name = n; }
public void setVerified(boolean b) { _isVerified = b; }
public void setGender(Gender g) { _gender = g; }
public void setUserImage(byte[] b) { _userImage = b; }
}
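Going the other way (persisting the object, which is what a shelve-like workflow needs) is symmetric; a minimal sketch, assuming the same User instance and ignoring exception handling:
ObjectMapper mapper = new ObjectMapper(); // or reuse a shared instance
mapper.writeValue(new File("user.json"), user); // writes the POJO back out as JSON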