MongoDB Java driver @Discriminator / @BsonDiscriminator annotations - java

The Mongo docs at http://mongodb.github.io/mongo-java-driver/3.11/bson/pojos/ state:
The easiest way to enable a discriminator is to annotate the abstract
class with the Discriminator annotation.
But the problem I see is that there is no @Discriminator annotation in the 3.11 driver. I'm not sure whether it is the same as @BsonDiscriminator, which I can find at https://www.javadoc.io/static/org.mongodb/mongo-java-driver/3.11.0/org/bson/codecs/pojo/annotations/package-summary.html
How should I use @Discriminator, and where is it?

This seems to be a mistake in the Mongo docs. I used @BsonDiscriminator and it works.
I was not able to find a good example, so I am posting here what I implemented to check this. Please note that I also used Jackson, but that part is not shown in this answer, so some of the annotations on Parent, Pojo1 and Pojo2 can be removed:
@JsonTypeInfo(include = JsonTypeInfo.As.WRAPPER_OBJECT, use = JsonTypeInfo.Id.NAME)
@JsonSubTypes({
        @JsonSubTypes.Type(value = Pojo1.class),
        @JsonSubTypes.Type(value = Pojo2.class)})
@BsonDiscriminator
public abstract class Parent {
}
Child classes:
@Data
@NoArgsConstructor
public class Pojo1 extends Parent {
    String string;
    Integer number;
    Boolean flag;
}

@Data
@NoArgsConstructor
public class Pojo2 extends Parent {
    String string;
    Integer number;
    Boolean flag;
}
Container class:
@Data
@NoArgsConstructor
public class Container {
    private String name;
    private List<Parent> pojos;
}
ContainerDAO persists and reads a Container object, which contains a collection of Parent objects:
public class ContainerDAO {
    private static final MongoDatabase DATABASE = MongoDB.getDatabase();
    private static final MongoCollection<Container> CONTAINER_COLLECTION =
            DATABASE.getCollection("containers", Container.class);

    static {
        CONTAINER_COLLECTION.createIndex(
                Indexes.ascending("name"), new IndexOptions().unique(true));
    }

    public static void create(Container container) {
        Bson filter = eq("name", container.getName());
        ReplaceOptions options = new ReplaceOptions().upsert(true);
        CONTAINER_COLLECTION.replaceOne(filter, container, options);
    }

    public static Container getByName(String name) {
        return CONTAINER_COLLECTION.find(eq("name", name)).first();
    }

    public static void deleteOne(String name) {
        CONTAINER_COLLECTION.deleteOne(eq("name", name));
    }
}
And the database connection object:
/**
 * MongoDB database and connection settings class.
 */
public class MongoDB {
    private static final MongoDatabase DATABASE;

    static {
        ...
        PojoCodecProvider pojoCodecProvider = PojoCodecProvider.builder()
                .register(com.researchforgood.survey.jackson.Parent.class,
                        com.researchforgood.survey.jackson.Pojo1.class,
                        com.researchforgood.survey.jackson.Pojo2.class)
                .build();
        CodecRegistry pojoCodecRegistry = fromRegistries(MongoClientSettings.getDefaultCodecRegistry(),
                fromProviders(pojoCodecProvider, PojoCodecProvider.builder().automatic(true).build()));
        MongoClientSettings settings = MongoClientSettings.builder()
                .codecRegistry(pojoCodecRegistry)
                .applyConnectionString(new ConnectionString(settingsMap.get("url")))
                .build();
        MongoClient mongoClient = MongoClients.create(settings);
        DATABASE = mongoClient.getDatabase(settingsMap.get("database"));
    }

    public static MongoDatabase getDatabase() {
        return DATABASE;
    }
}
And here is the test, an example of usage:
@Test
public void saveAndRestorePojo() throws JsonProcessingException {
    ContainerDAO.create(container);
    Container containerFromDB = ContainerDAO.getByName(container.getName());
    LOG.info(containerFromDB.toString());
    assertEquals(containerFromDB.getPojos().get(0).getClass(), Pojo1.class);
    assertEquals(((Pojo1) containerFromDB.getPojos().get(0)).getString(), "Hello1!");
}

Related

How to save java object in postgres jsonb column

I need to save a Java object value as jsonb in the database (R2DBC). Here is my code:
@Getter
@Setter
@ToString
@NoArgsConstructor
@AllArgsConstructor
@Table("scoring")
public class ScoringModel extends BaseModel {
    @Column("client_id")
    @SerializedName(value = "clientId", alternate = {"client_id"})
    private String clientId;

    // others
    @Column("languages")
    @SerializedName(value = "languages", alternate = {"languages"})
    private String languages;

    @SerializedName(value = "response", alternate = {"response"})
    // Need to save as jsonb
    private Object response;
}
Please help me resolve the issue.
You need to implement a ReadingConverter and a WritingConverter and then register them in R2dbcCustomConversions in your configuration.
@Bean
public R2dbcCustomConversions myConverters(ConnectionFactory connectionFactory) {
    var dialect = DialectResolver.getDialect(connectionFactory);
    var converters = List.of(…);
    return R2dbcCustomConversions.of(dialect, converters);
}
The converters themselves should produce JSON.
If you are using Postgres, there are two approaches you can use to map to Postgres JSON/JSONB fields:
use the Postgres R2DBC Json type directly in the Java entity class;
use any Java type and convert it to/from Json by registering custom converters.
The first approach is simple and straightforward.
Declare a json db type field, e.g.
metadata JSON default '{}'
Declare the Json type field in your entity class.
class Post {
    @Column("metadata")
    private Json metadata;
}
For the second solution, similarly:
Declare the json/jsonb db type in the schema.sql.
Declare the field type as your custom type, e.g.
class Post {
    @Column("statistics")
    private Statistics statistics;

    record Statistics(
            Integer viewed,
            Integer bookmarked
    ) {}
}
Declare an R2dbcCustomConversions bean.
@Configuration
class DataR2dbcConfig {

    @Bean
    R2dbcCustomConversions r2dbcCustomConversions(ConnectionFactory factory, ObjectMapper objectMapper) {
        R2dbcDialect dialect = DialectResolver.getDialect(factory);
        return R2dbcCustomConversions.of(
                dialect,
                List.of(
                        new JsonToStatisticsConverter(objectMapper),
                        new StatisticsToJsonConverter(objectMapper)
                )
        );
    }
}
@ReadingConverter
@RequiredArgsConstructor
class JsonToStatisticsConverter implements Converter<Json, Post.Statistics> {
    private final ObjectMapper objectMapper;

    @SneakyThrows
    @Override
    public Post.Statistics convert(Json source) {
        return objectMapper.readValue(source.asString(), Post.Statistics.class);
    }
}

@WritingConverter
@RequiredArgsConstructor
class StatisticsToJsonConverter implements Converter<Post.Statistics, Json> {
    private final ObjectMapper objectMapper;

    @SneakyThrows
    @Override
    public Json convert(Post.Statistics source) {
        return Json.of(objectMapper.writeValueAsString(source));
    }
}
The example code is here.
Finally, verify it with a @DataR2dbcTest test.
@Test
public void testInsertAndQuery() {
    var data = Post.builder()
            .title("test title")
            .content("content of test")
            .metadata(Json.of("{\"tags\":[\"spring\",\"r2dbc\"]}"))
            .statistics(new Post.Statistics(1000, 200))
            .build();
    this.template.insert(data)
            .thenMany(
                    this.posts.findByTitleContains("test%")
            )
            .log()
            .as(StepVerifier::create)
            .consumeNextWith(p -> {
                log.info("saved post: {}", p);
                assertThat(p.getTitle()).isEqualTo("test title");
            })
            .verifyComplete();
}

Why ReactiveMongoTemplate save does not work with Testcontainers

I have a Spring Boot application with the following entity:
@Data
@Builder
@NoArgsConstructor
@AllArgsConstructor
@Document(COLLECTION_NAME)
public class PersonEntity {
    public static final String COLLECTION_NAME = "person_info";
    private static final String PERSON_NAME = "person_name";

    @Id
    private PersonId id;

    @Field(name = PERSON_NAME)
    private String personName;

    @Indexed(name = "ttl_index", expireAfterSeconds = 20)
    private LocalDateTime date;
}
I have a repository interface:
public interface PersonRepository {
    void saveWithTtl(PersonEntity entity);
}
The repository implementation:
@Slf4j
@Repository
public class PersonRepositoryImpl implements PersonRepository {
    private final int expireAfterSeconds;
    private final ReactiveMongoTemplate mongoTemplate;

    public PersonRepositoryImpl(@Value("${ttl.index}") int expireAfterSeconds,
                                ReactiveMongoTemplate mongoTemplate) {
        this.expireAfterSeconds = expireAfterSeconds;
        this.mongoTemplate = mongoTemplate;
    }

    @Override
    public void saveWithTtl(PersonEntity entity) {
        mongoTemplate.indexOps(PersonEntity.class)
                .ensureIndex(new Index().on(PersonEntity.CREATED_AT, ASC)
                        .expire(expireAfterSeconds))
                .subscribe(result -> log.info("Ttl index has been created: {}", result));
        mongoTemplate.save(entity)
                .subscribe(result -> log.info("Entity has been saved: {}", result));
    }
}
And, finally, I have a test that does not work:
@DataMongoTest
@Testcontainers
public class PersonRepositoryIT {
    @Autowired
    private ReactiveMongoTemplate mongoTemplate;
    @Autowired
    private PersonRepository repository;

    @Container
    private static MongoDbContainer mongoDbContainer = new MongoDbContainer();

    @AfterEach
    void cleanUp() {
        repository.deleteAll();
    }

    @DynamicPropertySource
    static void registerMongoProperties(DynamicPropertyRegistry registry) {
        registry.add("spring.data.mongodb.uri", mongoDbContainer::getReplicaSetUrl);
    }

    @Test
    public void shouldCreateAndDeleteRecordsAfterDelay_whenSaveWithTtl_givenDefinedTll() {
        //given
        PersonEntity givenEntity = PersonEntity.builder()
                .createdAt(LocalDateTime.now())
                .personName("Joe")
                .id(PERSON_ID)
                .build();
        //when
        repository.saveWithTtl(givenEntity);
        //then
        StepVerifier.create(mongoTemplate.estimatedCount(PersonEntity.COLLECTION_NAME))
                .expectNext(1L)
                .verifyComplete();
    }
}
It fails on expectNext because mongoTemplate.estimatedCount returns 0 and not 1.
When I test the repository from Postman (the repo is called inside a service), it creates the document in MongoDB with the TTL index, as expected.
In the test config I have set ${ttl.index} to 20.
What am I doing wrong?
I don't know if it is too late, but I had the same problem today; I found your question while looking for an answer to my own problem, hahaha.
This snippet worked for me:
@Container
public static MongoDBContainer container = new MongoDBContainer(DockerImageName.parse("mongo:6"));

@DynamicPropertySource
static void mongoDbProperties(DynamicPropertyRegistry registry) {
    registry.add("spring.data.mongodb.uri", container::getReplicaSetUrl);
}

@Bean
public ReactiveMongoTemplate reactiveMongoTemplate() throws Exception {
    container.start();
    ConnectionString connectionString = new ConnectionString(container.getReplicaSetUrl());
    MongoClientSettings mongoClientSettings = MongoClientSettings.builder()
            .applyConnectionString(connectionString)
            .build();
    MongoClient mongoClient = MongoClients.create(mongoClientSettings);
    return new ReactiveMongoTemplate(mongoClient, "test");
}
Apparently ReactiveMongoTemplate is not being injected by default, so I created my own bean and it worked.
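For what it's worth, there is also a timing aspect worth checking: saveWithTtl only calls subscribe(), which returns before the save completes, so estimatedCount can run against a still-empty collection. A minimal, pure-Java sketch of that race (CompletableFuture standing in for the reactive pipeline; in the real test you would block on the save, e.g. via StepVerifier, before counting):

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.atomic.AtomicInteger;

// Illustration (no Spring/Mongo dependencies) of why the test can observe 0:
// subscribe(), like runAsync() here, starts the work and returns immediately,
// so a count taken right away races the asynchronous write.
public class FireAndForgetDemo {

    static int[] saveAndCount() throws Exception {
        AtomicInteger documents = new AtomicInteger();
        // Analogous to mongoTemplate.save(entity).subscribe(...): fire and forget.
        CompletableFuture<Void> save = CompletableFuture.runAsync(() -> {
            try { Thread.sleep(100); } catch (InterruptedException ignored) { }
            documents.incrementAndGet(); // the simulated insert
        });
        int countedImmediately = documents.get(); // what the failing test sees (likely 0)
        save.join();                              // wait for the write to land first
        int countedAfterCompletion = documents.get();
        return new int[] { countedImmediately, countedAfterCompletion };
    }

    public static void main(String[] args) throws Exception {
        int[] counts = saveAndCount();
        System.out.println("immediately=" + counts[0] + ", afterCompletion=" + counts[1]);
    }
}
```

Making saveWithTtl return the Mono from mongoTemplate.save (instead of subscribing internally) would let the test chain the count after the save deterministically.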

Spring Boot findById not working for MongoDB

I'm trying to do a simple get query on Spring Boot using MongoDB as the database engine.
I have tried several things (sending the data as an ObjectId and even changing the repository):
public ResponseEntity<Track> get(String trackId) {
    Track find = mongoTemplate.findById(new ObjectId(trackId), Track.class);
    Optional<Track> track = tracksRepository.findById(trackId);
    if (track.isPresent()) {
        return new ResponseEntity<>(track.get(), HttpStatus.OK);
    }
    return new ResponseEntity<>(HttpStatus.NOT_FOUND);
}
with this Mongo config:
@Configuration
@EnableMongoRepositories(basePackages = "data.store.repositories")
public class MongoConfig extends AbstractMongoClientConfiguration {
    private final Logger LOGGER = Logger.getLogger(this.getClass().getSimpleName());

    @Primary
    @Bean
    @Override
    public MongoClient mongoClient() {
        return MongoClients.create(MongoClientSettings.builder()
                .applyToClusterSettings(builder -> builder.hosts(Arrays.asList(new ServerAddress(host, port))))
                .build());
    }

    private MongoCredential mongoCredentials() {
        return MongoCredential.createCredential(username, database, password.toCharArray());
    }

    @Bean
    public MongoTemplate mongoTemplate() {
        MongoTemplate mongoTemplate = new MongoTemplate(mongoClient(), getDatabaseName());
        mongoTemplate.setReadPreference(ReadPreference.secondaryPreferred());
        return mongoTemplate;
    }

    protected String getDatabaseName() {
        return database;
    }

    @Override
    public boolean autoIndexCreation() {
        return false;
    }
}
EDIT: Adding the class for context:
@Document("track")
public class Track {
    public static final String ATTR_ID = "id";

    @Id
    @Field(ATTR_ID)
    @JsonProperty(ATTR_ID)
    public String id;
}
I always get null, even with existing keys in my database. Could you help me find the issue?
Thanks in advance.
I tried this with a similar configuration class and found the following worked fine for creating/accessing data using MongoTemplate.
The POJO class:
public class Test {
    @MongoId(FieldType.OBJECT_ID)
    private String id;
    private String name;

    public Test() {
    }

    public Test(String s) {
        super();
        this.name = s;
    }

    // get, set methods

    public String toString() {
        return id + " - " + name;
    }
}
From Spring's CommandLineRunner.run():
// Insert a document into the database
Test t1 = new Test("alpha");
t1 = mt.insert(t1);
System.out.println(t1); // 61e7de9f5aadc2077d9f4a58 - alpha

// Query from the database using the _id
ObjectId id = new ObjectId("61e7de9f5aadc2077d9f4a58");
Test t2 = mt.findById(id, Test.class);
System.out.println(t2);
Note that you need this in the class where you are running the code:
@Autowired private MongoTemplate mt;
You can use the @MongoId or @Id annotation in your POJO class to represent the MongoDB _id field. The type of the field can be a String or an ObjectId; it depends upon how you define it.
See the Spring Data MongoDB documentation on how the _id field is handled in the mapping layer using:
@MongoId
@Id
The solution is to add the ObjectId field type to the @MongoId annotation:
@MongoId(FieldType.OBJECT_ID)
private String id;
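A likely root cause here is a type mismatch on `_id`: findById only matches when the query value has the same BSON type as the stored `_id`. Schematically (hex value shortened):

```
{ "_id": "61e7de9f...", ... }              <- stored as a string; matched by findById("61e7de9f...")
{ "_id": ObjectId("61e7de9f..."), ... }    <- stored as an ObjectId; matched by findById(new ObjectId("61e7de9f..."))
```

Check in the mongo shell which shape your collection actually contains, then make the annotation (@MongoId(FieldType.OBJECT_ID) vs a plain string id) and the query value agree with it.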

org.springframework.core.convert.converter.Converter: convert several classes to one (or class + parameter to one class)

I have an org.springframework.core.convert.converter.Converter:
@Component
public class CatalogConverter implements Converter<ServiceCatalogType, Catalog> {
    @Override
    public Catalog convert(ServiceCatalogType source) {
        Catalog catalog = new Catalog();
        // convert
        return catalog;
    }
}
And I register this converter:
@Configuration
public class ConvertersConfig {
    private final CatalogConverter catalogConverter;

    @Autowired
    public ConvertersConfig(CatalogConverter catalogConverter) {
        this.catalogConverter = catalogConverter;
    }

    @Bean(name = "conversionService")
    ConversionService conversionService() {
        ConversionServiceFactoryBean factoryBean = new ConversionServiceFactoryBean();
        HashSet<Converter> converters = new HashSet<>();
        converters.add(catalogConverter);
        factoryBean.setConverters(converters);
        factoryBean.afterPropertiesSet();
        return factoryBean.getObject();
    }
}
But I need to pass an extra parameter to my custom converter. I see a few options:
Pass it in the constructor - but then how can I register the converter?
Use a wrapper:
class Wrapper {
    private ServiceCatalogType catalog;
    private String uuid;
}
and change the converter like this:
implements Converter<ServiceCatalogType, Wrapper>
Or maybe Spring has another way?
EDIT
I need the following in a service:
public void send() {
    ServiceCatalogType cs = getServiceCatalogType(); // get from net
    User user = getUser(); // get from db
    // convert cs + user to Catalog (all cs fields + some user fields to catalog)
    Catalog catalog = conversionService.convert(cs, user, Catalog.class);
}
EDIT2
Wrapper implementation:
@Data
@AllArgsConstructor
@NoArgsConstructor
public class CatalogWrapper {
    private ServiceCatalogType serviceCatalogType;
    private User user;
}
CatalogWrapper wrapper = new CatalogWrapper(getServiceCatalog(), getUser());
catalog = conversionService.convert(wrapper, Catalog.class);
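The converter for the wrapper can then read both objects from the single source. A minimal, self-contained sketch (the field names on ServiceCatalogType, User and Catalog are hypothetical; in the real code the static convert method would instead be the convert override of a @Component implementing org.springframework.core.convert.converter.Converter<CatalogWrapper, Catalog>, registered in ConvertersConfig like the existing CatalogConverter):

```java
// Self-contained sketch of the wrapper approach with stand-in types.
public class CatalogWrapperConverterSketch {

    static class ServiceCatalogType { String title; }    // stand-in for the real type
    static class User { String uuid; }                   // stand-in for the real type
    static class Catalog { String title; String ownerUuid; }

    static class CatalogWrapper {
        final ServiceCatalogType serviceCatalogType;
        final User user;
        CatalogWrapper(ServiceCatalogType s, User u) { serviceCatalogType = s; user = u; }
    }

    // In Spring this would be: public Catalog convert(CatalogWrapper source)
    // on a class implementing Converter<CatalogWrapper, Catalog>.
    static Catalog convert(CatalogWrapper source) {
        Catalog catalog = new Catalog();
        catalog.title = source.serviceCatalogType.title; // all cs fields...
        catalog.ownerUuid = source.user.uuid;            // ...plus some user fields
        return catalog;
    }

    public static void main(String[] args) {
        ServiceCatalogType cs = new ServiceCatalogType();
        cs.title = "catalog-1";
        User user = new User();
        user.uuid = "u-42";
        Catalog c = convert(new CatalogWrapper(cs, user));
        System.out.println(c.title + " / " + c.ownerUuid); // catalog-1 / u-42
    }
}
```

This keeps the ConversionService API unchanged (one source object in, one target out) while still giving the converter access to both inputs.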

Using #JsonIdentityInfo without annotations

I use Jackson 2.2.3 to serialize POJOs to JSON. I had the problem that I couldn't serialize recursive structures... I solved this by using @JsonIdentityInfo => works great.
But I don't want this annotation on top of my POJO.
So my question is: is there any other way to set the default behavior of my ObjectMapper to use this feature for every POJO?
So I want to transform this annotation
@JsonIdentityInfo(generator = ObjectIdGenerators.IntSequenceGenerator.class, property = "@id")
into something like
ObjectMapper om = new ObjectMapper();
om.setDefaultIdentityInfo(ObjectIdGenerators.IntSequenceGenerator.class, "@id");
Any ideas?
You can achieve that using Jackson mix-in annotations or a Jackson annotation introspector.
Here is an example showing both methods:
public class JacksonJsonIdentityInfo {
    @JsonIdentityInfo(
            generator = ObjectIdGenerators.IntSequenceGenerator.class, property = "@id")
    static class Bean {
        public final String field;
        public Bean(final String field) { this.field = field; }
    }

    static class Bean2 {
        public final String field2;
        public Bean2(final String field2) { this.field2 = field2; }
    }

    @JsonIdentityInfo(
            generator = ObjectIdGenerators.IntSequenceGenerator.class, property = "@id2")
    static interface Bean2MixIn {
    }

    static class Bean3 {
        public final String field3;
        public Bean3(final String field3) { this.field3 = field3; }
    }

    static class MyJacksonAnnotationIntrospector extends JacksonAnnotationIntrospector {
        @Override
        public ObjectIdInfo findObjectIdInfo(final Annotated ann) {
            if (ann.getRawType() == Bean3.class) {
                return new ObjectIdInfo(
                        PropertyName.construct("@id3", null),
                        null,
                        ObjectIdGenerators.IntSequenceGenerator.class,
                        null);
            }
            return super.findObjectIdInfo(ann);
        }
    }

    public static void main(String[] args) throws JsonProcessingException {
        final Bean bean = new Bean("value");
        final Bean2 bean2 = new Bean2("value2");
        final Bean3 bean3 = new Bean3("value3");
        final ObjectMapper mapper = new ObjectMapper();
        mapper.addMixInAnnotations(Bean2.class, Bean2MixIn.class);
        mapper.setAnnotationIntrospector(new MyJacksonAnnotationIntrospector());
        System.out.println(mapper.writeValueAsString(bean));
        System.out.println(mapper.writeValueAsString(bean2));
        System.out.println(mapper.writeValueAsString(bean3));
    }
}
Output:
{"@id":1,"field":"value"}
{"@id2":1,"field2":"value2"}
{"@id3":1,"field3":"value3"}
After several months and a lot of research, I implemented my own solution to keep my domain clear of Jackson dependencies.
public class Parent {
    private Child child;
    public Child getChild() { return child; }
    public void setChild(Child child) { this.child = child; }
}

public class Child {
    private Parent parent;
    public Parent getParent() { return parent; }
    public void setParent(Parent parent) { this.parent = parent; }
}
First, you have to declare each of the entities of the bidirectional relationship:
public interface BidirectionalDefinition {
    @JsonIdentityInfo(generator = ObjectIdGenerators.PropertyGenerator.class, property = "id", scope = Parent.class)
    public interface ParentDef {};

    @JsonIdentityInfo(generator = ObjectIdGenerators.PropertyGenerator.class, property = "id", scope = Child.class)
    public interface ChildDef {};
}
After that, the object mapper can be automatically configured:
ObjectMapper om = new ObjectMapper();
Class<?>[] definitions = BidirectionalDefinition.class.getDeclaredClasses();
for (Class<?> definition : definitions) {
    om.addMixInAnnotations(definition.getAnnotation(JsonIdentityInfo.class).scope(), definition);
}
