I am using Dozer version 5.5.1, and I want to configure my custom converter, so I have this:
import org.dozer.DozerConverter;
import com.example.movies.api.models.response.ClientResponseDTO;
public class MyCustomConverter
extends DozerConverter<ClientResponseDTO, String> {
public MyCustomConverter() {
super(ClientResponseDTO.class, String.class);
}
@Override
public String convertTo(ClientResponseDTO source, String destination) {
return "ClientResponseDTO Converted to string!";
}
@Override
public ClientResponseDTO convertFrom(String source, ClientResponseDTO destination) {
return new ClientResponseDTO();
}
}
I am loading it with Spring like this:
@Bean
public Mapper dozerBeanMapper() {
DozerBeanMapper mapper = new DozerBeanMapper();
List<CustomConverter> converters = new ArrayList<>();
converters.add(new MyCustomConverter());
mapper.setCustomConverters(converters);
return mapper;
}
Then, I have this usage:
@Autowired Mapper mapper;
...
ClientResponseDTO clientResponseDTO = clientService.getClient(id);
String conversion = this.mapper.map(clientResponseDTO, String.class);
And the custom converter is never called. Do you know why that is? Regards!
According to the Dozer documentation, you should add the mapping to the bean definition, e.g.:
<bean id="org.dozer.Mapper" class="org.dozer.DozerBeanMapper">
<property name="mappingFiles">
<list>
<value>systempropertymapping1.xml</value>
<value>dozerBeanMapping.xml</value>
<value>injectedCustomConverter.xml</value>
</list>
</property><property name="customConvertersWithId">
<map>
<entry key="CustomConverterWithId" ref="configurableConverterBeanInstance1" />
<entry key="CustomConverterWithId2" ref="configurableConverterBeanInstance2" />
</map>
</property>
</bean>
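As a sketch only (the file name, the converter's package, and the schema header are illustrative), the corresponding entry in a mapping file such as dozerBeanMapping.xml could register the converter at class level, so Dozer knows to apply it when mapping ClientResponseDTO to String:
<mappings xmlns="http://dozer.sourceforge.net"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://dozer.sourceforge.net http://dozer.sourceforge.net/schema/beanmapping.xsd">
    <configuration>
        <custom-converters>
            <!-- the converter's package is illustrative -->
            <converter type="com.example.movies.MyCustomConverter">
                <class-a>com.example.movies.api.models.response.ClientResponseDTO</class-a>
                <class-b>java.lang.String</class-b>
            </converter>
        </custom-converters>
    </configuration>
</mappings>
The file then has to be listed in the mappingFiles property shown above (or passed to setMappingFiles when the DozerBeanMapper is built in Java config).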
I am trying to parse XML files and create Java POJOs from them. My sample XML looks like this:
students.xml
<?xml version="1.0" encoding="ISO-8859-1" standalone="no"?>
<group>
<college>
<name>Hogwards</name>
<city>Unknown</city>
</college>
<student>
<name>Tony Tester</name>
<rollNo>1</rollNo>
<enrollmentDate>2016-10-31</enrollmentDate>
<sampleTimeStamp>2016-11-07T05:50:45</sampleTimeStamp>
<salary>16.57</salary>
</student>
<student>
<name>Nick Newbie</name>
<rollNo>2</rollNo>
<enrollmentDate>2017-10-31</enrollmentDate>
<sampleTimeStamp>2016-11-07T05:50:45</sampleTimeStamp>
<salary>29.68</salary>
</student>
<student>
<name>Ian Intermediate</name>
<rollNo>3</rollNo>
<enrollmentDate>2018-10-31</enrollmentDate>
<sampleTimeStamp>2016-11-07T05:50:45</sampleTimeStamp>
<salary>789.62</salary>
</student>
</group>
Here, my goal is to parse the file and load the student information into a database using Spring Batch. For my purpose the college information is a kind of header, which is totally useless to me, so in my batch reader I would like to ignore it and parse only the student information, in chunks. As of now my code uses the GroupDTO class to parse the whole record at once and creates all objects at a single time, as a result of which I am unable to leverage the chunking functionality of Spring Batch. My requirement says that the student information should be parsed in chunks, say with a chunk size of 300 or so, but as of now my code parses the whole XML file at one time and populates Java objects from it. Please help me to ignore the college section and parse just the student section in chunks using Spring Batch, or suggest an appropriate link which may help me find a solution for my issue. Thanks in advance...
XmlConfiguration.java
@Configuration
public class XmlConfiguration
{
@Autowired
JobBuilderFactory jobBuilderFactory;
@Autowired
StepBuilderFactory stepBuilderFactory;
@StepScope
@Bean(name="xmlReader")
public SynchronizedItemStreamReader<GroupDTO> reader()
{
StaxEventItemReader<GroupDTO> xmlFileReader = new StaxEventItemReader<>();
xmlFileReader.setResource(new ClassPathResource("students.xml"));
xmlFileReader.setFragmentRootElementName("group");
Map<String, Class<?>> aliases = new HashMap<>();
aliases.put("group", GroupDTO.class);
aliases.put("college", CollegeDTO.class);
aliases.put("student", StudentDTO.class);
XStreamMarshaller xStreamMarshaller = new XStreamMarshaller();
xStreamMarshaller.setAliases(aliases);
String dateFormat = "yyyy-MM-dd";
String timeFormat = "HHmmss";
String[] acceptableFormats = {timeFormat};
xStreamMarshaller.getXStream().autodetectAnnotations(true);
xStreamMarshaller.getXStream().registerConverter(new DateConverter(dateFormat, acceptableFormats));
xStreamMarshaller.getXStream().addPermission(NoTypePermission.NONE);
xStreamMarshaller.getXStream().addPermission(NullPermission.NULL);
xStreamMarshaller.getXStream().addPermission(PrimitiveTypePermission.PRIMITIVES);
xStreamMarshaller.getXStream().allowTypeHierarchy(Collection.class);
xStreamMarshaller.getXStream().allowTypesByWildcard(new String[] {"com.example.demo.**"});
xStreamMarshaller.getXStream().addImplicitCollection(GroupDTO.class, "list");
xmlFileReader.setUnmarshaller(xStreamMarshaller);
SynchronizedItemStreamReader<GroupDTO> synchronizedItemStreamReader = new SynchronizedItemStreamReader<>();
synchronizedItemStreamReader.setDelegate(xmlFileReader);
return synchronizedItemStreamReader;
}
#Bean(name="xmlProcessor")
public ItemProcessor<GroupDTO, GroupDTO> processor()
{
return new Processor();
}
#Bean(name="xmlWriter")
public ItemWriter<GroupDTO> writer()
{
return new Writer();
}
#Bean(name="xmljobListener")
public JobExecutionListenerSupport jobListener()
{
return new JobListener();
}
@JobScope
@Bean(name="xmltaskExecutor")
public ThreadPoolTaskExecutor taskExecutor()
{
ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
executor.setCorePoolSize(50);
executor.setMaxPoolSize(100);
return executor;
}
#Bean(name="xmlStep")
public Step xmlFileToDatabaseStep()
{
return stepBuilderFactory.get("xmlStep")
.<GroupDTO, GroupDTO>chunk(2)
.reader(this.reader())
.processor(this.processor())
.writer(this.writer())
.taskExecutor(this.taskExecutor())
.build();
}
#Bean(name="xmlJob")
public Job xmlFileToDatabaseJob(#Autowired #Qualifier("xmlStep") Step step)
{
return jobBuilderFactory
.get("xmlJob"+new Date())
.incrementer(new RunIdIncrementer())
.listener(this.jobListener())
.flow(step)
.end()
.build();
}
}
GroupDTO.java
#XStreamAlias("group")
public class GroupDTO
{
#XStreamAlias("college")
private CollegeDTO college;
#XStreamAlias("student")
private List<StudentDTO> list;
...... getter,setter, constructors
}
CollegeDTO.java
public class CollegeDTO
{
private String name;
private String city;
...... getter,setter and constructor
}
StudentDTO.java
public class StudentDTO
{
private String name;
private Integer rollNo;
private Date enrollmentDate;
private Date sampleTimeStamp;
private BigDecimal salary;
... getter, setter and constructor
}
Inside the job you have the tasklet, which can use the chunk tag. It has reader and writer properties, and it can also have a processor property; the processor is optional.
<batch:job id="helloWorldJob">
<batch:step id="step1">
<batch:tasklet>
<batch:chunk reader="itemReader" writer="itemWriter"
processor="itemProcessor" commit-interval="10">
</batch:chunk>
</batch:tasklet>
</batch:step>
</batch:job>
Then, when you declare the reader tag you will define the mapper.
<!-- READER -->
<bean id = "itemReader"
class = "org.springframework.batch.item.file.FlatFileItemReader">
...
<property name = "lineMapper">
<bean class = "org.springframework.batch.item.file.mapping.DefaultLineMapper">
...
<property name = "fieldSetMapper">
<bean class = "tudy.batch.Mapper" />
</property>
</bean>
</property>
</bean>
This Mapper class is a great option to do what you want. This mapper will read the input file. What I imagine you need to do is just ignore the college tag.
public class Mapper implements FieldSetMapper<Student> {
    public Student mapFieldSet(FieldSet fieldSet) throws BindException {
        // Instantiating the student object
        Student student = new Student();
        // Setting the fields (the readers must match the field types)
        student.setName(fieldSet.readString(0));
        student.setRollNo(fieldSet.readInt(1));
        student.setEnrollmentDate(fieldSet.readDate(2));
        student.setSampleTimeStamp(fieldSet.readDate(3, "yyyy-MM-dd'T'HH:mm:ss"));
        student.setSalary(fieldSet.readBigDecimal(4));
        return student;
    }
}
You can use the index or the name. You should debug your code and confirm the position or the name under which the college data arrives, so that you can ignore it.
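Since the question already uses a StaxEventItemReader, another option (just a sketch; the bean name is illustrative) is to point the fragment root at student instead of group, so that each <student> element becomes one item and the <college> header is never turned into an object at all:
@StepScope
@Bean(name = "studentXmlReader")
public StaxEventItemReader<StudentDTO> studentReader() {
    StaxEventItemReader<StudentDTO> reader = new StaxEventItemReader<>();
    reader.setResource(new ClassPathResource("students.xml"));
    // Each <student> element is one fragment; <college> sits outside every
    // fragment, so the reader simply skips it.
    reader.setFragmentRootElementName("student");
    Map<String, Class<?>> aliases = new HashMap<>();
    aliases.put("student", StudentDTO.class);
    XStreamMarshaller marshaller = new XStreamMarshaller();
    marshaller.setAliases(aliases);
    // the XStream security/permission setup from the question still applies here
    reader.setUnmarshaller(marshaller);
    return reader;
}
The step would then be typed as <StudentDTO, StudentDTO>chunk(300) (or whatever chunk size is needed), and GroupDTO is no longer required for reading.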
I have tried this:
<util:list id="list1">
<value>foo@bar.com</value>
<value>foo1@bar.com</value>
</util:list>
<util:list id="list2">
<value>foo2@bar.com</value>
<value>foo3@bar.com</value>
</util:list>
<util:map id="emailMap" value-type="java.util.List">
<!-- Map between String key and List -->
<entry key="entry1" value-ref="list1" />
<entry key="entry2" value-ref="list2" />
...
</util:map>
I am referencing this map like this:
<bean id="myBean" class="com.sample.beans">
<property name="mapArray" ref="emailMap" />
</bean>
I have written a test case to see if this is being populated, but where it is used I have added a Map.isEmpty() check and it always returns true, meaning the map is not getting populated. Could you please guide me?
public class beans
{
    private Map<String, List<String>> mapArray = new HashMap<String, List<String>>();

    public void setMapArray(Map<String, List<String>> map)
    {
        this.mapArray = map;
    }

    public Map<String, List<String>> getMapArray()
    {
        return mapArray;
    }

    public void makeObject(String key)
            throws Exception {
        System.out.println(mapArray.isEmpty());
    }
}
The test case calls the function makeObject. Here the value returned is always true.
TEST:
public class testing {
    @Test(enabled = true)
    public void call1() throws Exception {
        ApplicationContext context = new ClassPathXmlApplicationContext("call.the.xml");
        BeanFactory factory = context;
        beans test = (beans) factory.getBean("myBean");
        // CALLED makeObject here. So makeObject is definitely being called, since it prints true for isEmpty();
    }
}
I am wondering how to implement batch operations with my insert statements using MyBatis 3 & Spring 3?
For example, here is what is currently being done:
spring.xml:
<bean id="jndiTemplateDatasource" class="org.springframework.jndi.JndiTemplate">
<property name="environment">
<props>
<prop key="java.naming.factory.initial">${context.factory}</prop>
</props>
</property>
</bean>
<bean id="dataSource" class="org.springframework.jndi.JndiObjectFactoryBean">
<property name="jndiTemplate" ref="jndiTemplateDatasource"/>
<property name="jndiName" value="${connectionpool.jndi}"/>
</bean>
<bean id="transactionManager" class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
<property name="dataSource" ref="dataSource"/>
</bean>
<tx:annotation-driven transaction-manager="transactionManager"/>
<bean id="sqlSessionFactory" class="org.mybatis.spring.SqlSessionFactoryBean">
<property name="dataSource" ref="dataSource" />
<property name="configLocation" value="classpath:mybatis-config.xml"/>
</bean>
<bean class="org.mybatis.spring.mapper.MapperScannerConfigurer">
<property name="basePackage" value="com.test" />
</bean>
MyService.xml:
<insert id="insertMyRecord" parameterType="com.test.MyRecord" >
insert into ... // code removed
</insert>
MyService.java:
public interface MyService {
public void insertMyRecord(MyRecord record);
}
MyController.java:
@Controller
public class MyController {
@Autowired
private MyService myService;
@Transactional
@RequestMapping( .... )
public void bulkUpload (@RequestBody List<MyRecord> myRecords) {
for (MyRecord record : myRecords) {
myService.insertMyRecord(record);
}
}
}
Disclaimer: That is just pseudo code for demonstration purposes
So what can I do to turn that into a batch process?
Ideally I want to be able to do it with least "intrusion" into code, i.e. use annotations more preferred, but if not possible what is the next best thing?
Also, this needs to be configured just for this one service, not for everything in the project.
The accepted answer above doesn't actually get you batch mode for MyBatis. You need to choose the proper Executor via ExecutorType.BATCH. That is either passed as a parameter to SqlSession.openSession in standard MyBatis API or, if using MyBatis-Spring, as an option to the SqlSessionTemplate. That is done via:
<bean id="sqlSession" class="org.mybatis.spring.SqlSessionTemplate">
<constructor-arg index="0" ref="sqlSessionFactory" />
<constructor-arg index="1" value="BATCH" />
</bean>
There is nothing else that needs to be done.
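For illustration only (reusing the mapper interface and record type from the question), usage with that batch-mode template might look roughly like this; the inserts are queued as one JDBC batch and sent when the session flushes:
@Autowired
private SqlSession sqlSession;   // the BATCH SqlSessionTemplate defined above

@Transactional
public void bulkUpload(List<MyRecord> myRecords) {
    MyService mapper = sqlSession.getMapper(MyService.class);
    for (MyRecord record : myRecords) {
        mapper.insertMyRecord(record);   // queued, not executed yet
    }
    sqlSession.flushStatements();        // sends the accumulated batch to the database
}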
This is a running and tested example...
Update multiple rows using a batch (iBatis + Java).
In this example I am updating the attending count in the table with respect to the party id.
public static void updateBatch(List<MyModel> attendingUsrList) {
SqlSession session = ConnectionBuilderAction.getSqlSession();
PartyDao partyDao = session.getMapper(PartyDao.class);
try {
if (attendingUsrList.size() > 0) {
partyDao.updateAttendingCountForParties(attendingUsrList);
}
session.commit();
} catch (Throwable t) {
session.rollback();
logger.error("Exception occurred during updateBatch : ", t);
throw new PersistenceException(t);
} finally {
session.close();
}
}
Model class where the variables are defined:
public class MyModel {
private long attending_count;
private String eid;
public String getEid() {
return eid;
}
public void setEid(String eid) {
this.eid = eid;
}
public long getAttending_count() {
return attending_count;
}
public void setAttending_count(long attending_count) {
this.attending_count = attending_count;
}
}
party.xml code
The actual query that the batch executes (the foreach lives inside the mapped update statement):
<foreach collection="attendingUsrList" item="model" separator=";">
UPDATE parties SET attending_user_count = #{model.attending_count}
WHERE fb_party_id = #{model.eid}
</foreach>
Interface code here
public interface PartyDao {
int updateAttendingCountForParties(@Param("attendingUsrList") List<MyModel> attendingUsrList);
}
Here is my batch session code
public static synchronized SqlSession getSqlBatchSession() {
ConnectionBuilderAction connection = new ConnectionBuilderAction();
sf = connection.getConnection();
SqlSession session = sf.openSession(ExecutorType.BATCH);
return session;
}
Then, inside updateBatch, obtain the session from this method instead:
SqlSession session = ConnectionBuilderAction.getSqlBatchSession();
I'm not sure I understand the question fully correctly, but I will try to give you my thoughts.
To make a single service I would recommend genericizing the service interface:
public void bulkUpload (@RequestBody List<T> myRecords)
Then you can check the type of the object and call the proper mapper repository.
Then you can generify it more by creating a common interface:
public interface Creator<T> {
void create(T object);
}
and extend it by your mapper interface:
public interface MyService extends Creator<MyRecord>{}
Now the most complicated step: you take the object of a particular type, see which mapper implements the Creator interface for that class (using the Java reflection API), and invoke the particular method.
Now I give you the code I use in one of my projects:
package com.mydomain.repository;
//imports ...
import org.reflections.Reflections;
#Repository(value = "dao")
public class MyBatisDao {
private static final Reflections REFLECTIONS = new Reflections("com.mydomain");
@Autowired
public SqlSessionManager sqlSessionManager;
public void create(Object o) {
Creator creator = getSpecialMapper(Creator.class, o);
creator.create(o);
}
// other CRUD methods
#SuppressWarnings("unchecked")
private <T> T getSpecialMapper(Class<T> specialClass, Object parameterObject) {
Class parameterClass = parameterObject.getClass();
Class<T> mapperClass = getSubInterfaceParametrizedWith(specialClass, parameterClass);
return sqlSessionManager.getMapper(mapperClass);
}
private static <T, P> Class<? extends T> getSubInterfaceParametrizedWith(Class<T> superInterface, Class<P> parameterType) {
Set<Class<? extends T>> subInterfaces = REFLECTIONS.getSubTypesOf(superInterface);
for (Class<? extends T> subInterface: subInterfaces) {
for (Type genericInterface : subInterface.getGenericInterfaces()) {
if (!(genericInterface instanceof ParameterizedType)) continue;
ParameterizedType parameterizedType = (ParameterizedType) genericInterface;
Type rawType = parameterizedType.getRawType();
if (rawType instanceof Class<?> && ((Class<?>) rawType).isAssignableFrom(superInterface)) {
for (Type type: parameterizedType.getActualTypeArguments()) {
if (type instanceof Class<?> && ((Class<?>) type).isAssignableFrom(parameterType)) {
return subInterface;
}
}
}
}
}
throw new IllegalStateException(String.format("No extension of %s found for parametrized type %s ", superInterface, parameterType));
}
}
Warning! This approach can have a bad performance impact, so use it only in non-performance-critical actions.
If you want bulk inserts, I would recommend using the MyBatis foreach for the bulk insert, as described here.
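A rough sketch of such a statement (the statement id, table, and column names are made up purely for illustration):
<insert id="insertMyRecords" parameterType="java.util.List">
    INSERT INTO my_record (col_a, col_b)
    VALUES
    <foreach collection="list" item="item" separator=",">
        (#{item.colA}, #{item.colB})
    </foreach>
</insert>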
If you don't want to write SQL for every type of object, you are better off using Hibernate or another advanced ORM; MyBatis is just an SQL mapping interface.
I'm running into a problem where my JSON response can be an object or an array of objects.
Foobar example with a single value:
{
"foo": {"msg": "Hello World" }
}
Foobar example with an array:
{
"foo": [
{ "msg": "Hello World" },
{ "msg": "Goodbye World" }
]
}
I want to force the single value into an array, but so far the only way I found converts all single values into arrays:
ACCEPT_SINGLE_VALUE_AS_ARRAY
http://wiki.fasterxml.com/JacksonFeaturesDeserialization
I've been looking around for an annotation that does the same thing for a single property, but so far Google hasn't turned up any examples.
Has anyone run into this problem before? I really don't want to rewrite everything as arrays to make RestTemplate work with a buggy service.
I want to force the single value into an array but so far, the only
way I found converts all single values into arrays.
This simply shouldn't be the case. The ACCEPT_SINGLE_VALUE_AS_ARRAY property is on/off for a given ObjectMapper, but its behavior is entirely governed by the target property the JSON value is being mapped to.
When ACCEPT_SINGLE_VALUE_AS_ARRAY is on, mapping a JSON value to a Java collection property will not result in an error.
When ACCEPT_SINGLE_VALUE_AS_ARRAY is on, mapping a JSON value to a Java basic property will (also) not result in an error.
Illustrated by the following code:
class Foo {
private String msg;
// Constructor, setters, getters
}
class Holder {
private List<Foo> foo;
private Foo other;
// Constructors, setters, getters
}
public class FooTest {
@org.junit.Test
public void testCollectionFromJSONValue() throws Exception {
final InputStream stream = Thread.currentThread()
.getContextClassLoader().getResourceAsStream("foo.json");
final String json = IOUtils.toString(stream);
final ObjectMapper mapper = new ObjectMapper();
mapper.configure(
DeserializationConfig.Feature.ACCEPT_SINGLE_VALUE_AS_ARRAY,
true);
final Holder holder = mapper.readValue(json, Holder.class);
System.out.println(holder);
}
}
Which relies on the following JSON:
{
"foo": {
"msg": "Hello World"
},
"other": {
"msg": "Goodbye"
}
}
Running the code will show that the "foo" property is successfully deserialized into a list, whereas the "other" property gets deserialized into a (basic) Foo type.
I had the same issue and struggled to find a way to configure my RestTemplate this way in general, because you don't always want to instantiate and alter an ObjectMapper yourself. So here's my solution:
<bean id="myRestTemplate" class="org.springframework.web.client.RestTemplate">
<property name="messageConverters">
<list>
<bean class="org.springframework.http.converter.json.MappingJackson2HttpMessageConverter">
<property name="objectMapper" ref="jacksonObjectMapper" />
</bean>
</list>
</property>
</bean>
<bean id="jacksonObjectMapper" class="com.fasterxml.jackson.databind.ObjectMapper" />
<bean class="org.springframework.beans.factory.config.MethodInvokingFactoryBean">
<property name="targetObject" ref="jacksonObjectMapper" />
<property name="targetMethod" value="configure" />
<property name="arguments">
<list>
<value type="com.fasterxml.jackson.databind.DeserializationFeature">ACCEPT_SINGLE_VALUE_AS_ARRAY</value>
<value>true</value>
</list>
</property>
</bean>
You can then use this pre-configured RestTemplate by injecting it into your code:
@Autowired
private RestTemplate myRestTemplate;
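A usage sketch (the URL and the Holder target type are placeholders, with Holder borrowed from the earlier answer):
Holder holder = myRestTemplate.getForObject("http://example.org/foobar", Holder.class);
// "foo" ends up in a List<Foo> whether the service sent a single object or an array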
The best way to resolve this when using RestTemplate:
import org.springframework.http.converter.json.MappingJackson2HttpMessageConverter;
ObjectMapper objectMapper = new ObjectMapper()
.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false)
.configure(DeserializationFeature.ACCEPT_SINGLE_VALUE_AS_ARRAY, true);
MappingJackson2HttpMessageConverter jacksonMappingConverter
= new MappingJackson2HttpMessageConverter(objectMapper);
restTemplate.getMessageConverters().add(0, jacksonMappingConverter);
For the element to be parsed, which can be an object or an array, use the annotation as defined below:
import com.fasterxml.jackson.annotation.JsonFormat;
import com.fasterxml.jackson.annotation.JsonProperty;
public class ParentObject{
@JsonFormat(with = JsonFormat.Feature.ACCEPT_SINGLE_VALUE_AS_ARRAY)
@JsonProperty("InnerObject")
private List<InnerObject> innerObject;
}
If you don't want to add a new converter to the restTemplate, change the existing one to support the use case:
List<HttpMessageConverter<?>> messageConverters = restTemplate.getMessageConverters();
for (HttpMessageConverter<?> httpMessageConverter : messageConverters) {
if (httpMessageConverter instanceof MappingJackson2HttpMessageConverter) {
MappingJackson2HttpMessageConverter mappingJackson2HttpMessageConverter = (MappingJackson2HttpMessageConverter) httpMessageConverter;
mappingJackson2HttpMessageConverter.getObjectMapper()
.configure(DeserializationFeature.ACCEPT_SINGLE_VALUE_AS_ARRAY, true);
}
}
How can I conditionally initialize a class via Spring?
If some condition is true then I want one argument to be passed, otherwise some other argument:
<bean id="myFactory" class="Factory">
if something then
<constructor-arg>
<util:map>
<!-- configure your map here, or reference it as a separate bean -->
<entry key="java.lang.String" value="key">....</entry>
</util:map>
</constructor-arg>
else
<constructor-arg>
<util:map>
<!-- configure your map here, or reference it as a separate bean -->
<entry key="java.lang.String" value="key">....</entry>
</util:map>
</constructor-arg>
</bean>
How?
Spring Expression Language might do the trick for you.
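A rough, untested sketch (the two map ids and the system property name are placeholders): define both candidate maps as separate beans and pick one with a SpEL ternary in the constructor-arg:
<util:map id="mapA">
    <entry key="java.lang.String" value="keyA"/>
</util:map>
<util:map id="mapB">
    <entry key="java.lang.String" value="keyB"/>
</util:map>

<bean id="myFactory" class="Factory">
    <constructor-arg value="#{ systemProperties['use.map.a'] == 'true' ? @mapA : @mapB }"/>
</bean>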
You can do it exactly the way that you have specified. Define a FactoryBean this way, for example for generating a Customer bean:
public class CustomFactoryBean implements FactoryBean<Customer>{
private int customProperty;
public int getCustomProperty() {
return customProperty;
}
public void setCustomProperty(int customProperty) {
this.customProperty = customProperty;
}
@Override
public Customer getObject() throws Exception {
if (customProperty==1)
return new Customer("1", "One");
return new Customer("999", "Generic");
}
@Override
public Class<?> getObjectType() {
return Customer.class;
}
@Override
public boolean isSingleton() {
return true;
}
}
That is basically it. Now, based on how you inject the properties of the factory bean, the actual bean instantiation can be controlled in the getObject method above.
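For example, wiring it up might look like this (the bean id, package, and property value are illustrative); the bean you get from the context is the Customer returned by getObject(), not the factory itself:
<bean id="customer" class="com.example.CustomFactoryBean">
    <property name="customProperty" value="1"/>
</bean>
With this configuration, context.getBean("customer", Customer.class) would return new Customer("1", "One") from the getObject method above.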