C++: Representing objects in a YAML file (as in Java/Spring Boot)

I used to work with the Java framework Spring Boot, which has this neat functionality where
you can have a YAML file like this:
config:
  example_int: 17
And a Java class like this:
public class Config {
    int example_int;

    public Config(int example_int) {
        this.example_int = example_int;
    }
}
The framework would then (if I remember correctly) inject into the runtime
an instance of class Config with the example_int data member initialized to 17.
I'm looking to implement similar functionality in C++,
i.e. parse a YAML file and then construct a C++ object based on the file's contents.
While (I think) Spring uses runtime injection, I think I could do this via metaprogramming to
reduce complexity.
TLDR:
Parse YAML with C++.
Based on the YAML config, inject an object into the runtime (or generate code via
metaprogramming) that has the characteristics defined in the YAML file.
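For reference, the Spring Boot mechanism described above is usually @ConfigurationProperties binding rather than per-field injection; a minimal sketch of the Java side, assuming standard Boot conventions and the config block above living in application.yml:

import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;

// Spring Boot reads application.yml and binds config.example_int to exampleInt
// via relaxed binding; the populated instance is then available for injection.
@Component
@ConfigurationProperties(prefix = "config")
public class Config {

    private int exampleInt;

    public int getExampleInt() {
        return exampleInt;
    }

    public void setExampleInt(int exampleInt) {
        this.exampleInt = exampleInt;
    }
}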

Related

Dynamically decide which Java class to create an instance of using YAML/JSON

I have an interesting problem with one of my projects.
I have a ZipInputStream from an uploaded zip in a Spring Boot application with several YAML files in it, which can have any name and dynamic content that we know and have Java POJO classes for.
For example, the YAML below corresponds to BaseModel.java:
---
baseModel:
  other contents
class BaseModel {}
and the YAML below corresponds to TopModel.java:
---
topModel:
  other contents
class TopModel {}
Now the question is: how do I decide which Java class to use for object creation at runtime, based on the YAML string?
I am using Jackson and tried converting the YAML string to JSON, but that requires a type field in the JSON, which we don't have in the file and can't get added either.
Any help on this would really be appreciated.
Thanks.
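One possible approach (a sketch, not the only Jackson mechanism): read the YAML into a tree, look at the single root field name, and resolve the target class from a registry you maintain yourself. It assumes the jackson-dataformat-yaml module is on the classpath and that BaseModel/TopModel map the content under the root key.

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.dataformat.yaml.YAMLFactory;

import java.util.Map;

public class YamlModelResolver {

    private static final ObjectMapper YAML = new ObjectMapper(new YAMLFactory());

    // Root key -> POJO class; extend this map as new models are added.
    private static final Map<String, Class<?>> MODELS = Map.of(
            "baseModel", BaseModel.class,
            "topModel", TopModel.class);

    public static Object read(String yaml) throws Exception {
        JsonNode root = YAML.readTree(yaml);
        String rootKey = root.fieldNames().next();   // e.g. "baseModel"
        Class<?> target = MODELS.get(rootKey);
        if (target == null) {
            throw new IllegalArgumentException("Unknown model: " + rootKey);
        }
        // The actual payload sits under the root key, so deserialize that subtree.
        return YAML.treeToValue(root.get(rootKey), target);
    }
}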

Configuring global list of allowed classes for serialization

I am using Infinispan v12.1 with Spring Boot v2.5.2 via org.infinispan:infinispan-spring-boot-starter-embedded. In our application we are using custom classes which we would like to cache (a very common case); however, it turned out that starting from v10 these classes need to be listed in an "allow list".
We are using an infinispan.xml configuration passed via the infinispan.embedded.config-xml property, as advised by the sample project.
Question: How can the allow list be configured globally for all caches by means of the XML configuration file?
I have considered the following options:
The system property infinispan.deserialization.allowlist.regexps (from ClassAllowList) – not a good choice, as the configuration would be spread between the XML file and some other place. Moreover, if the property were renamed in a future Infinispan version, one would only notice when the application is run.
Defining <cache-container><serialization><allow-list> as per the documentation – not a good option either, because it would result in several identical per-cache XML configuration blocks.
The corresponding Java Config for Spring Boot application would be:
@org.springframework.context.annotation.Configuration
public class InfinispanConfiguration {

    @Bean
    public InfinispanGlobalConfigurationCustomizer globalCustomizer() {
        return builder -> builder.allowList().addRegexp("^org\\.mycompany\\.");
    }
}
P.S. The Javadoc in GlobalConfiguration assumes there is a <default> XML section the configuration can be read from, but in fact the XML does not support it anymore.
P.P.S. Arguably, the dots in the package names should be escaped in SpringEmbeddedModule and the patterns anchored with ^, because ClassAllowList uses Matcher#find() (boolean regexMatch = compiled.stream().anyMatch(p -> p.matcher(className).find());):
serializationAllowList.addRegexps("^java\\.util\\..*", "^org\\.springframework\\..*");

Creating a network of objects from custom XML config with Spring

I have a custom XML config defining a kind of network like this
S1 ---- O1 ---- O2 ---- O3 ---- T1
  \
   +--- O4 ---- O5 ------------ T2
    \
S2---+- O6 --+- O7 ------------ T4
    /       /
S3-+       /
          /
S4 ------+
Where
S is some kind of data source, like a web socket
O is an operator processing the data
T is the target or data sink
These elements are represented with XML blocks like this:
<source name="S1" address="ws://example/1" type="websocket" dataType="double" />
<operator name="O6" type="threshold">
    <input name="S1"/>
    <input name="S2"/>
    <input name="S3"/>
    <property name="threshold" value="10.34" />
    <property name="window" value="10.0" />
</operator>
<sink name="T1" type="database">
    <input name="O3"/>
</sink>
The dependencies are constructor parameters. My example operator O6 would have a constructor like this:
class ThresholdOperator extends Operator<Boolean> {
    public ThresholdOperator(
            String name,              // "O6"
            List<DataSource> sources, // [S1, S2, S3]
            double threshold,         // 10.34
            double window) {          // 10.0
        ...
There could be multiple instances of this class with different constructor parameters. It is possible that a class has more than one constructor. The type parameter of the base class is the output type.
The type attribute determines what concrete class has to be instantiated. The dataType attribute of the source decides which kind of converter (here String to Double) should be injected.
To create the instances I need to figure out a dependency graph and start by instantiating the objects that have no dependencies on other objects in my graph (the sources in this case), then create the objects which depend only on objects created in the first step, and so on.
So I would basically reinvent something like Spring for my special use case. Is there a way to leverage Spring to create and wire objects in my case? A somewhat crude hack would be to transform my xml config to a beans.xml. But maybe there is a better way using BeanFactory or the like. Or would it be possible to create the Spring meta-model directly?
I'm using Spring 4.3 but the RC of Spring 5 could be an option, if it would help.
Another alternative not yet mentioned here is using XSLT.
The idea is to define an XSL stylesheet that maps your domain-specific XML to Spring beans XML (XSLT + XPath should be more than enough to cover your case).
You can then read the domain-specific XML, transform it with that stylesheet, and feed the result to Spring.
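For illustration, a rough sketch of that pipeline (the stylesheet name network-to-beans.xsl is made up; you would author it to emit a standard <beans> document):

import org.springframework.context.support.GenericXmlApplicationContext;
import org.springframework.core.io.ByteArrayResource;

import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import java.io.ByteArrayOutputStream;
import java.io.File;

public class XsltBootstrap {

    public static GenericXmlApplicationContext load(File networkXml) throws Exception {
        // Transform the domain-specific XML into Spring beans XML in memory.
        Transformer transformer = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new File("network-to-beans.xsl")));
        ByteArrayOutputStream beansXml = new ByteArrayOutputStream();
        transformer.transform(new StreamSource(networkXml), new StreamResult(beansXml));

        // Feed the generated <beans> document straight to Spring.
        GenericXmlApplicationContext context = new GenericXmlApplicationContext();
        context.load(new ByteArrayResource(beansXml.toByteArray()));
        context.refresh();
        return context;
    }
}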
Have a look at StaticApplicationContext. It is stated in the docs that it is:
Mainly useful for testing.
... but it is a full-fledged application context that has support for programmatic bean registration.
You can read your domain-specific xml and define beans based on it inside StaticApplicationContext.
This blog post can give you an idea on how to use StaticApplicationContext to define beans with references and constructor args.
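For illustration, a minimal sketch of that kind of programmatic registration, mirroring the O6 example above (WebSocketSource is a made-up class name; in practice you would loop over your parsed XML instead of hard-coding the beans):

import org.springframework.beans.factory.config.ConstructorArgumentValues;
import org.springframework.beans.factory.config.RuntimeBeanReference;
import org.springframework.beans.factory.support.GenericBeanDefinition;
import org.springframework.beans.factory.support.ManagedList;
import org.springframework.context.support.StaticApplicationContext;

public class NetworkContextBuilder {

    public static StaticApplicationContext build() {
        StaticApplicationContext context = new StaticApplicationContext();

        // Sources first; they have no dependencies of their own.
        context.registerSingleton("S1", WebSocketSource.class);
        context.registerSingleton("S2", WebSocketSource.class);
        context.registerSingleton("S3", WebSocketSource.class);

        // O6 depends on S1, S2, S3 plus two plain values.
        ManagedList<RuntimeBeanReference> sources = new ManagedList<>();
        sources.add(new RuntimeBeanReference("S1"));
        sources.add(new RuntimeBeanReference("S2"));
        sources.add(new RuntimeBeanReference("S3"));

        ConstructorArgumentValues args = new ConstructorArgumentValues();
        args.addIndexedArgumentValue(0, "O6");
        args.addIndexedArgumentValue(1, sources);
        args.addIndexedArgumentValue(2, 10.34);
        args.addIndexedArgumentValue(3, 10.0);

        GenericBeanDefinition o6 = new GenericBeanDefinition();
        o6.setBeanClass(ThresholdOperator.class);
        o6.setConstructorArgumentValues(args);
        context.registerBeanDefinition("O6", o6);

        context.refresh();
        return context;
    }
}

Because the bean references are resolved by the container at refresh time, the dependency ordering does not have to be handled manually; the definitions can be registered in any order.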
A simpler approach to instantiating your objects from the document would be to either
create an XML Schema describing your data format and use JAXB to generate your Java classes, or
annotate your existing Java classes with JAXB annotations.
The "crude" hack approach may be a better approach, but instead of converting your config XML to a beans XML file manually, I suggest you look at the extensible XML authoring approach.
The configuration parser, a.k.a. bean definition parser, lets you build the bean definitions which will eventually be used by your application's Spring context to instantiate the beans.
This should also eliminate the need to figure out the dependency hierarchy manually and instantiate the objects yourself.
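A rough sketch of such a bean definition parser for the <operator> element shown earlier (the spring.handlers/spring.schemas plumbing and the XSD are omitted, and OperatorTypes.resolve mapping type="threshold" to a concrete class is a made-up helper):

import org.springframework.beans.factory.config.RuntimeBeanReference;
import org.springframework.beans.factory.support.AbstractBeanDefinition;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.ManagedList;
import org.springframework.beans.factory.xml.AbstractSingleBeanDefinitionParser;
import org.springframework.beans.factory.xml.ParserContext;
import org.springframework.util.xml.DomUtils;
import org.w3c.dom.Element;

public class OperatorBeanDefinitionParser extends AbstractSingleBeanDefinitionParser {

    @Override
    protected Class<?> getBeanClass(Element element) {
        // Map type="threshold" etc. to the concrete operator class.
        return OperatorTypes.resolve(element.getAttribute("type"));
    }

    @Override
    protected String resolveId(Element element, AbstractBeanDefinition definition, ParserContext parserContext) {
        // The sample XML uses name="O6" rather than id, so use it as the bean id.
        return element.getAttribute("name");
    }

    @Override
    protected void doParse(Element element, BeanDefinitionBuilder builder) {
        // name attribute -> first constructor argument.
        builder.addConstructorArgValue(element.getAttribute("name"));

        // <input name="..."/> children become references to other beans in the network.
        ManagedList<RuntimeBeanReference> inputs = new ManagedList<>();
        for (Element input : DomUtils.getChildElementsByTagName(element, "input")) {
            inputs.add(new RuntimeBeanReference(input.getAttribute("name")));
        }
        builder.addConstructorArgValue(inputs);

        // <property name="..." value="..."/> children become plain constructor values;
        // Spring converts the strings to the constructor's parameter types.
        for (Element property : DomUtils.getChildElementsByTagName(element, "property")) {
            builder.addConstructorArgValue(property.getAttribute("value"));
        }
    }
}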
Hope it answers your question.

Preserve Generics when generating JSON schema

I'm using jackson-module-jsonSchema and jsonschema2pojo API.
Brief explanation: I'm trying to json-schemify my server's Spring controller contract objects (objects that the controllers return and objects that they accept as parameters) and package them up to use with a packaged retrofit client in order to break the binary dependency between the client and server. The overall solution uses an annotation processor to read the Spring annotations on the controller and generate a retrofit client.
I've got it mostly working, but realized today I've got a problem where generic objects are part of the contract, e.g.
public class SomeContractObject<T> {
...
}
Of course, when I generate the schema for said object, the generic types aren't directly supported. So when I send it through the jsonschema2pojo api I end up with a class like so:
public class SomeContractObject {
}
So my question is simple but may have a non-trivial answer: Is there any way to pass that information through via the json schema to jsonschema2pojo?

Jackson - Deserializing to a Different Type

Using Jackson, how can I have JSON serialized/deserialized by one application using one set of classes, but have another application deserialize the same JSON and load different implementations of those classes?
I have a (Spring MVC) web application that allows users to define steps in a script, which in turn will be executed in a client application. Steps might be things like ShowDialogStep, with properties like dialogText, or WaitStep with a property of duration.
The client application will load collections of steps from the server. However, the classes instantiated by the client need to have execution-specific functionality like execute(), which in the case of WaitStep will keep track of how far through waiting it is. Clearly the server-side application never needs to know about this, and in less trivial examples the execute/update logic of a step involves all manner of client-specific dependencies.
So, to recap I need:
The server application to map the 'prototype' classes to JSON;
The client application to read the same JSON but instantiate execution-specific classes instead of the 'prototype' ones.
Would this be something that could be configured on the client-side mapper, perhaps if the JSON was serialized using relative class names (rather than fully-qualified) then the deserializer could be configured to look in a different package for the implementations with execution logic in them?
You can use this approach:
On the server side:
@JsonTypeInfo(use=JsonTypeInfo.Id.NAME,
              include=JsonTypeInfo.As.PROPERTY, property="#type")
class Prototype {
    ...
}
objectMapper.registerSubtypes(
    new NamedType(Prototype.class, "Execution"),
    ...
);
Jackson will then serialize a Prototype instance and add the bean's type:
{
"#type" : "Execution",
...
}
On the client side:
@JsonTypeInfo(use=JsonTypeInfo.Id.NAME,
              include=JsonTypeInfo.As.PROPERTY, property="#type")
class Execution {
    ...
}
objectMapper.registerSubtypes(
    new NamedType(Execution.class, "Execution"), // the same name
    ....
);
objectMapper.readValue(....); // will be deserialized to an Execution instance
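Putting the two sides together, a small usage sketch (it assumes both Prototype and Execution carry the @JsonTypeInfo annotation shown above and have no-arg constructors):

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.jsontype.NamedType;

public class StepTransferDemo {

    public static void main(String[] args) throws Exception {
        // Server side: serializes its prototype classes under the shared logical name.
        ObjectMapper serverMapper = new ObjectMapper();
        serverMapper.registerSubtypes(new NamedType(Prototype.class, "Execution"));
        String json = serverMapper.writeValueAsString(new Prototype());

        // Client side: the same logical name resolves to the execution class instead.
        ObjectMapper clientMapper = new ObjectMapper();
        clientMapper.registerSubtypes(new NamedType(Execution.class, "Execution"));
        Execution step = clientMapper.readValue(json, Execution.class);
        step.execute(); // execution-specific behaviour, as described in the question
    }
}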
