I would like to map a nested object that needs a value from the parent object. I could use the solution mentioned here: mapstruct - Propagate parent field value to collection of nested objects - either setting the value on the child object directly after mapping, or using a context. But in my case I work with immutable objects.
example:
data class Worker(
val name: String,
val businessCard: BusinessCard? = null,
)
data class BusinessCard(
val companyName: String,
)
data class WorkerDto(
val name: String,
val businessCard: BusinessCardDto? = null,
)
data class BusinessCardDto(
val text: String, // "worker name | company name"
)
Is there a way to map the value directly, without @AfterMapping modifications?
Something like this?
@Mapper(config = CustomMappingConfig::class, uses = [ComputerMapper::class])
abstract class WorkerMapper {
@Mapping(target = "businessCard.text", expression = "java(mapBcText(worker))")
abstract fun mapWorker(worker: Worker): WorkerDto
protected fun mapBcText(worker: Worker) = "${worker.name} | ${worker.businessCard?.companyName}"
}
But sadly the code above generates:
@Override
public WorkerDto mapWorker(Worker worker) {
if ( worker == null ) {
return null;
}
String name = null;
BusinessCardDto businessCard = null;
name = worker.getName();
businessCard = businessCardToBusinessCardDto( worker.getBusinessCard() );
WorkerDto workerDto = new WorkerDto( name, businessCard );
return workerDto;
}
protected BusinessCardDto businessCardToBusinessCardDto(BusinessCard businessCard) {
if ( businessCard == null ) {
return null;
}
BusinessCardDto businessCardDto = new BusinessCardDto();
businessCardDto.setText( mapBcText(worker) ); // WORKER IS NOT ACCESSIBLE HERE
return businessCardDto;
}
Does anybody have an idea how to achieve this mapping?
...I also tried to create a custom BusinessCard mapper, but then I cannot access the parent data (Worker) in it...
You need to use var instead of val in your data class.
MapStruct doesn't seem to handle immutable Kotlin classes at the moment.
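For illustration, here is a rough, untested sketch of what that could look like once the DTO properties are mutable (names are taken from the question, the mapper config is simplified, and MapStruct will warn that text has no source property, since the @AfterMapping callback fills it in):

import org.mapstruct.AfterMapping
import org.mapstruct.Mapper
import org.mapstruct.MappingTarget

class WorkerDto {
    var name: String = ""
    var businessCard: BusinessCardDto? = null
}

class BusinessCardDto {
    var text: String = "" // "worker name | company name"
}

@Mapper
abstract class WorkerMapper {

    abstract fun mapWorker(worker: Worker): WorkerDto

    // Called by the generated mapper once the target is populated; the parent
    // Worker is available here, so the nested text can be derived from it.
    @AfterMapping
    protected fun fillBusinessCardText(worker: Worker, @MappingTarget dto: WorkerDto) {
        dto.businessCard?.text = "${worker.name} | ${worker.businessCard?.companyName}"
    }
}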
I have a problem with the inclusion and exclusion of nested fields in an aggregation. I have a collection that contains nested objects, and when I build a query that includes only some fields of the nested objects it works, but the same projection doesn't work with an aggregation.
This is a simple preview of my collection
{
"id": "1234",
"name": "place name",
"address": {
"city": "city name",
"gov": "gov name",
"country": "country name",
"location": [0.0, 0.0],
//some other data
},
//some other data
}
When I configure my fields with a query, it works:
query.fields().include("name").include("address.city").include("address.gov")
It also works when I do the aggregation in the shell:
db.getCollection("places").aggregate([
{ $project: {
"name": 1,
"address.city": 1,
"address.gov": 1
} },
])
But it doesn't work with the aggregation in Spring:
val aggregation = Aggregation.newAggregation(
Aggregation.match(criteria),
Aggregation.project().andInclude("name").andInclude("address.city").andInclude("address.gov")
)
This aggregation in Spring always returns the address field as null. But when I include the "address" field without its nested fields, the result contains the full address object, not just the nested fields I want to include.
Can someone tell me how to fix that?
I found a solution: use the nested() function.
val aggregation = Aggregation.newAggregation(
Aggregation.match(criteria),
Aggregation.project().andInclude("name")
.and("address").nested(Fields.fields("address.city", "address.gov"))
)
But it only works with hardcoded fields. So if you want a function to which you pass the list of fields to include or exclude, you can use this solution:
fun fieldsConfig(fields : List<String>) : ProjectionOperation
{
val mainFields = fields.filter { !it.contains(".") }
var projectOperation = Aggregation.project().andInclude(*mainFields.toTypedArray())
val nestedFields = fields.filter { it.contains(".") }.map { it.substringBefore(".") }.distinct()
nestedFields.forEach { mainField ->
val subFields = fields.filter { it.startsWith("${mainField}.") }
projectOperation = projectOperation.and(mainField).nested(Fields.fields(*subFields.toTypedArray()))
}
return projectOperation
}
The problem with this solution is that it involves a lot of code, object allocation, and configuration just to include fields. In addition, it only works with inclusion; if you use it to exclude fields, it throws an exception. Also, it does not work with the deepest fields of your document.
So I implemented a simpler and more elegant solution, which covers most cases of inclusion and exclusion of fields.
This builder class allows you to create an object containing the fields you want to include or exclude.
class DbFields private constructor(private val list : List<String>, val include : Boolean) : List<String>
{
override val size : Int get() = list.size
//overridden functions of List class.
/**
* the builder of the fields.
*/
class Builder
{
private val list = ArrayList<String>()
private var include : Boolean = true
/**
* add a new field.
*/
fun withField(field : String) : Builder
{
list.add(field)
return this
}
/**
* add new fields.
*/
fun withFields(fields : Array<String>) : Builder
{
fields.forEach {
list.add(it)
}
return this
}
fun include() : Builder
{
include = true
return this
}
fun exclude() : Builder
{
include = false
return this
}
fun build() : DbFields
{
if (include && !list.contains("id"))
{
list.add("id")
}
else if (!include && list.contains("id"))
{
list.remove("id")
}
return DbFields(list.distinct(), include)
}
}
}
To build your fields configuration
val fields = DbFields.Builder()
.withField("fieldName")
.withField("fieldName")
.withField("fieldName")
.include()
.build()
You can pass this object to your repository to configure the inclusion or exclusion.
I also created this class to configure the inclusion and exclusion using raw documents, transformed into a custom aggregation operation.
class CustomProjectionOperation(private val fields : DbFields) : ProjectionOperation()
{
override fun toDocument(context : AggregationOperationContext) : Document
{
val fieldsDocument = BasicDBObject()
fields.forEach {
fieldsDocument.append(it, fields.include)
}
val operation = Document()
operation.append("\$project", fieldsDocument)
return operation
}
}
Now you just have to use this class in your aggregation:
class RepoCustomImpl : RepoCustom
{
@Autowired
private lateinit var mongodb : MongoTemplate
override fun getList(fields : DbFields) : List<Result>
{
val aggregation = Aggregation.newAggregation(
CustomProjectionOperation(fields)
)
return mongodb.aggregate(aggregation, Result::class.java, Result::class.java).mappedResults
}
}
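For example, a hypothetical usage of the pieces above (the field names are only examples, and repo is assumed to be an instance of the repository implementation):

// Exclude a couple of nested fields and run the aggregation through the custom repo.
val exclusion = DbFields.Builder()
    .withFields(arrayOf("address.location", "address.country"))
    .exclude()
    .build()

val results = repo.getList(exclusion)
// CustomProjectionOperation renders this configuration as:
// { "$project": { "address.location": false, "address.country": false } }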
I have a class Packet.java (which I can't modify) in a package:
public class Packet implements java.io.Serializable, Cloneable {
private static final AtomicLong ID_ATOMICLONG = new AtomicLong();
private Long id = ID_ATOMICLONG.incrementAndGet();
}
I use my own class LoginPacket.kt (which I can modify):
class LoginPacket : Packet () {
var id = "" ( this name must be id )
fun parsePacket(input: String): Boolean {
val map = HashMap<String,Any>()
map["id"] = "5d6ff3433354b4d43076419"
var wrapper: BeanWrapper = PropertyAccessorFactory.forBeanPropertyAccess(this)
wrapper.isAutoGrowNestedPaths = true
// The question is here: I cannot set id as a String using BeanWrapper, only as a Long.
// I can also try replacing id's getter and setter methods, as below:
val pd = wrapper.getPropertyDescriptor("id")
pd.readMethod = LoginPacket::id.getter.javaMethod
pd.writeMethod = LoginPacket::id.setter.javaMethod
wrapper.setPropertyValues(map)
}
}
So what can I do next?
Thanks very much for sharing!
Beanwrapper link
It is not possible to override the type of a field.
What you can do instead depends on what you are trying to do, and which libraries you are using.
I can think of one way that may work, assuming your library does not need an instance or subclass of Packet.
And that is creating your own class that only implements the interfaces:
class LoginPacket(): java.io.Serializable, Cloneable {
// You may or may not need this.
// Since the original version uses it to generate the ID,
// I think you can skip this part.
companion object {
@JvmStatic
private val ID_ATOMICLONG = AtomicLong()
}
var id : String = ""
fun parsePacket(input: String): Boolean {
val map = HashMap<String,Any>()
map["id"] = "5d6ff3433354b4d43076419"
var wrapper: BeanWrapper = PropertyAccessorFactory.forBeanPropertyAccess(this)
wrapper.isAutoGrowNestedPaths = true
val pd = wrapper.getPropertyDescriptor("id")
pd.readMethod = LoginPacket::id.getter.javaMethod
pd.writeMethod = LoginPacket::id.setter.javaMethod
wrapper.setPropertyValues(map)
return true
}
}
It is hard to provide better answers without more context.
I'm new to Kotlin and I'm trying to rework a small Java project into this new language. I use MongoDB in my project and I have a class, for example:
class PlayerEntity {
constructor() {} //for mongodb to create an instance
constructor(id: ObjectId, name: String) { //used in code
this.id = id
this.name = name
}
@org.mongodb.morphia.annotations.Id
var id: ObjectId? = null
var name: String? = null
}
I have to mark the id field as nullable (var id: ObjectId?) because of the empty constructor. When I try to access this field from another class I have to use a non-null check: thePlayer.id!!. But the logic of my application is that the id field is never null (Mongo creates an instance of Player and immediately sets the id field), and I don't want to use non-null checks everywhere.
I tried to make a non-null getter, but it does not compile:
var id: ObjectId? = null
get(): ObjectId = id!!
I can also make some stub for id and use it in the constructor, but this looks like a dirty hack:
val DUMMY_ID = ObjectId("000000000000000000000000")
So is there a workaround to solve the issue?
I personally use a private var prefixed with _ plus a public val in similar situations.
class Example<out T> {
private var _id: T? = null
val id: T
get() = _id!!
}
For your situation, it would look like this:
@org.mongodb.morphia.annotations.Id
private var _id: ObjectId? = null
val id: ObjectId
get() = _id!!
Alternatively, declare your variable as lateinit like this (but note that this exposes the setter publicly):
@org.mongodb.morphia.annotations.Id
lateinit var id: ObjectId
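If the public setter of lateinit bothers you, you should be able to restrict it (assuming Morphia populates the backing field via reflection, which it normally does):

import org.bson.types.ObjectId

class PlayerEntity {
    @org.mongodb.morphia.annotations.Id
    lateinit var id: ObjectId
        private set // cannot be reassigned from outside the class

    var name: String? = null
}

Note that reading id before Morphia has set it still throws an UninitializedPropertyAccessException.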
I am very new to Gson and JSON. I have simple Events that I want to serialize to JSON with the help of Gson.
Note: Code in Kotlin.
public abstract class Event() {
}
public class Move : Event() {
var from: Point? = null
var to: Point? = null
}
public class Fire : Event() {
var damage: Int = 0
var area: ArrayList<Point> = ArrayList(0)
}
public class Build : Event() {
var to: Point? = null
var type: String = ""
var owner: String = ""
}
I am persisting a bunch of these this way:
val events: ArrayList<Event> = ArrayList()
events.add(move)
events.add(fire)
val str = gson.toJson(events)
And unpersisting:
val type = object : TypeToken<ArrayList<Event>>(){}.getType()
val eventStr = obj.getString("events")
val events: ArrayList<Event> = gson.fromJson(eventStr, type)
I have tried both creating a serializer & deserializer for the Event class and registering them via registerTypeAdapter, and I have also tried the RuntimeTypeAdapterFactory, but neither persists the information required to unpersist the correct type.
For example, the RuntimeTypeAdapterFactory says:
"cannot deserialize Event because it does not define a field named type"
EDIT: Here's the code for the "Adapter", which was.. well, adapted from another StackOverflow post:
public class Adapter :
JsonSerializer<Event>,
JsonDeserializer<Event> {
final val CLASSNAME = "CLASSNAME"
final val INSTANCE = "INSTANCE"
override fun serialize(src: Event?, typeOfSrc: Type?, context: JsonSerializationContext?): JsonElement? {
val obj = JsonObject()
val className = (src as Event).javaClass.getCanonicalName()
obj.addProperty(CLASSNAME, className)
val elem = context!!.serialize(src)
obj.add(INSTANCE, elem)
return obj
}
override fun deserialize(json: JsonElement?, typeOfT: Type?, context: JsonDeserializationContext?): Event? {
val jsonObject = json!!.getAsJsonObject()
val prim = jsonObject.get(CLASSNAME)
val className = prim.getAsString()
val klass = Class.forName(className)
return context!!.deserialize(jsonObject.get(INSTANCE), klass)
}
}
This code fails with NullPointerException on line:
val className = prim.getAsString()
You can't do it this way.
The example you are referring to is not targeted at your case. It works in only one case: if you register the base type (not the type hierarchy) and serialize using gson.toJson(obj, javaClass<Event>()). It will never work for an array unless you also write a custom serializer for your events container object.
Generally you need another approach: use a TypeAdapterFactory and delegating adapters; see GSON: serialize/deserialize object of class, that have registered type hierarchy adapter, using ReflectiveTypeAdapterFactory.Adapter and https://code.google.com/p/google-gson/issues/detail?id=43#c15
I believe this approach is overcomplicated, so if you have only a few types the easiest solution is to serialize these types by hand, field by field, with a custom serializer, and forget about attempts to delegate to the default.
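As a rough, untested sketch of that field-by-field approach (it assumes the Event subclasses and the Point type from the question, and uses a "kind" discriminator picked arbitrarily here), register the adapter for the whole hierarchy so it also applies to elements inside the list:

import com.google.gson.*
import com.google.gson.reflect.TypeToken
import java.lang.reflect.Type

class EventAdapter : JsonSerializer<Event>, JsonDeserializer<Event> {

    override fun serialize(src: Event, typeOfSrc: Type, context: JsonSerializationContext): JsonElement {
        val obj = JsonObject()
        when (src) {
            is Move -> {
                obj.addProperty("kind", "move")
                obj.add("from", context.serialize(src.from))
                obj.add("to", context.serialize(src.to))
            }
            is Fire -> {
                obj.addProperty("kind", "fire")
                obj.addProperty("damage", src.damage)
                obj.add("area", context.serialize(src.area))
            }
            is Build -> {
                obj.addProperty("kind", "build")
                obj.add("to", context.serialize(src.to))
                obj.addProperty("type", src.type)
                obj.addProperty("owner", src.owner)
            }
        }
        return obj
    }

    override fun deserialize(json: JsonElement, typeOfT: Type, context: JsonDeserializationContext): Event {
        val obj = json.asJsonObject
        return when (obj.get("kind").asString) {
            "move" -> Move().apply {
                from = context.deserialize(obj.get("from"), Point::class.java)
                to = context.deserialize(obj.get("to"), Point::class.java)
            }
            "fire" -> Fire().apply {
                damage = obj.get("damage").asInt
                area = context.deserialize(obj.get("area"), object : TypeToken<ArrayList<Point>>() {}.type)
            }
            "build" -> Build().apply {
                to = context.deserialize(obj.get("to"), Point::class.java)
                type = obj.get("type").asString
                owner = obj.get("owner").asString
            }
            else -> throw JsonParseException("Unknown event kind")
        }
    }
}

// registerTypeHierarchyAdapter makes the adapter apply to Move/Fire/Build as well,
// so it is used for the elements of ArrayList<Event> too.
val gson = GsonBuilder()
    .registerTypeHierarchyAdapter(Event::class.java, EventAdapter())
    .create()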
I have a class with variables inside it. Sometimes I want to ignore some fields and sometimes not when deserializing (and maybe when serializing too). How can I do it with Jackson?
For serialization, "filtering properties" blog entry should help. Deserialization side has less support, since it is more common to want to filter out stuff that is written.
One possible approach is to sub-class JacksonAnnotationIntrospector and override the method(s) that introspect ignorability of methods (and/or fields) to use whatever logic you want.
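For illustration, a minimal sketch of that idea (the property name "ssn" is just an example, and matching purely by the member name is a simplification):

import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.databind.introspect.AnnotatedMember
import com.fasterxml.jackson.databind.introspect.JacksonAnnotationIntrospector

// Treats the listed member names as if they carried an ignore marker (like @JsonIgnore).
class IgnoringIntrospector(private val ignored: Set<String>) : JacksonAnnotationIntrospector() {
    override fun hasIgnoreMarker(m: AnnotatedMember): Boolean =
        m.name in ignored || super.hasIgnoreMarker(m)
}

// Use a dedicated mapper for reading so serialization stays unaffected.
val readMapper = ObjectMapper().setAnnotationIntrospector(IgnoringIntrospector(setOf("ssn")))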
It might also help if you gave an example of the practical application, i.e. what you are trying to prevent from being deserialized, and why.
You might want to use JsonViews (taken originally from http://wiki.fasterxml.com/JacksonJsonViews - broken now - web archive link: https://web.archive.org/web/20170831135842/http://wiki.fasterxml.com/JacksonJsonViews )
Quoting it:
First, defining views means declaring classes; you can reuse existing ones, or just create bogus classes -- they are just view identifiers with relationship information (child inherits view membership from parents):
// View definitions:
class Views {
static class Public { }
static class ExtendedPublic extends Public { }
static class Internal extends ExtendedPublic { }
}
public class Bean {
// Name is public
@JsonView(Views.Public.class) String name;
// Address semi-public
@JsonView(Views.ExtendedPublic.class) Address address;
// SSN only for internal usage
@JsonView(Views.Internal.class) SocialSecNumber ssn;
}
With such view definitions, serialization would be done like so:
// short-cut:
objectMapper.writeValueUsingView(out, beanInstance, Views.Public.class);
// or fully exploded:
objectMapper.getSerializationConfig().setSerializationView(Views.Public.class);
// (note: can also pre-construct config object with 'mapper.copySerializationConfig'; reuse)
objectMapper.writeValue(out, beanInstance); // will use active view set via Config
// or, starting with 1.5, more convenient (ObjectWriter is reusable too)
objectMapper.viewWriter(Views.Public.class).writeValue(out, beanInstance);
and the result would only contain 'name', not 'address' or 'ssn'.
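For newer (2.x) Jackson versions, the rough equivalent should be writerWithView:

// Jackson 2.x: an ObjectWriter configured with a view (reusable, like the 1.5 viewWriter above).
val json = ObjectMapper()
    .writerWithView(Views.Public::class.java)
    .writeValueAsString(beanInstance)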
You should probably look at the modules feature of recent Jackson versions.
One possible mechanism would be to use a BeanDeserializerModifier.
I've been looking for a useful online tutorial or example, but nothing immediately appears. It might be possible to work something up if more is known of your context. Are you managing your ObjectMappers manually, or using them in a JAX-RS setting, injected in Spring, or what?
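As a starting point, here is a small sketch of the BeanDeserializerModifier idea (the bean class and property name are placeholders):

import com.fasterxml.jackson.databind.BeanDescription
import com.fasterxml.jackson.databind.DeserializationConfig
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.databind.deser.BeanDeserializerBuilder
import com.fasterxml.jackson.databind.deser.BeanDeserializerModifier
import com.fasterxml.jackson.databind.module.SimpleModule

// Placeholder bean, just for the example.
class SomeBean {
    var name: String? = null
    var secret: String? = null
}

// Marks the given properties of one target class as ignorable during deserialization.
class IgnorePropertiesModifier(
    private val target: Class<*>,
    private val ignored: Set<String>
) : BeanDeserializerModifier() {
    override fun updateBuilder(
        config: DeserializationConfig,
        beanDesc: BeanDescription,
        builder: BeanDeserializerBuilder
    ): BeanDeserializerBuilder {
        if (beanDesc.beanClass == target) {
            ignored.forEach { builder.addIgnorable(it) }
        }
        return builder
    }
}

// Register via a module; "secret" in the input JSON is then skipped when reading SomeBean.
val mapper = ObjectMapper().registerModule(
    SimpleModule().setDeserializerModifier(IgnorePropertiesModifier(SomeBean::class.java, setOf("secret")))
)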
I searched the entire web (yes, I did) to find the answer, then I wrote something of my own.
I'm working with Jackson Ion deserialisation and wrote a custom reader that ignores fields dynamically.
You can do the same thing for JSON deserialisation.
Let's assume an entity like this:
User {
id
name
address {
city
}
}
Create a tree structure to represent field selection.
public class IonField {
private final String name;
private final IonField parent;
private final Set<IonField> fields = new HashSet<>();
// add constructors and stuff
}
A custom Ion reader, extending from Amazon ion-java https://github.com/amzn/ion-java:
public class IonReaderBinaryUserXSelective extends IonReaderBinaryUserX {
private IonField _current;
private int hierarchy = 0;
public IonReaderBinaryUserXSelective(byte[] data, int offset, int length,
IonSystem system, IonField _current) {
super(system, system.getCatalog(), UnifiedInputStreamX.makeStream(data, offset, length));
this._current = _current;
}
@Override
public IonType next() {
IonType type = super.next();
if (type == null) {
return null;
}
String file_name = getFieldName();
if (file_name == null || SystemSymbols.SYMBOLS.equals(file_name)) {
return type;
}
if (type == IonType.STRUCT || type == IonType.LIST) {
IonField field = _current.getField(getFieldName());
if (field != null) {
this._current = field;
return type;
} else {
super.stepIn();
super.stepOut();
}
return next();
} else {
if (this._current.contains(file_name)) {
return type;
} else {
return next();
}
}
}
@Override
public void stepIn() {
hierarchy = (hierarchy << 1);
if (getFieldName() != null && !SystemSymbols.SYMBOLS.equals(getFieldName())) {
hierarchy = hierarchy + 1;
}
super.stepIn();
}
@Override
public void stepOut() {
if ((hierarchy & 1) == 1) {
this._current = this._current.getParent();
}
hierarchy = hierarchy >> 1;
super.stepOut();
}
}
Construct the dynamic view. This tree is created dynamically and passed to the reader to deserialise.
Let's say we only need city inside the address.
IonField root = new IonField("user", null);
IonField address = new IonField("address", root);
IonField city = new IonField("city", address);
address.addChild(city);
root.addChild(address);
//now usual stuff.
IonFactory ionFactory = new IonFactory();
IonObjectMapper mapper = new IonObjectMapper(ionFactory);
File file = new File("file.bin"); // ion bytes
byte[] ionData = Files.readAllBytes(file.toPath());
IonSystem ionSystem = IonSystemBuilder.standard().build();
IonReader ionReader = new IonReaderBinaryUserXSelective(ionData, 0, ionData.length, ionSystem, root);
User user = mapper.readValue(ionReader, User.class);