I'm trying to do a simple GET query in Spring Boot using MongoDB as the database engine.
I have tried several things (sending the data as an ObjectId and even changing the repository):
public ResponseEntity<Track> get(String trackId) {
Track find = mongoTemplate.findById(new ObjectId(trackId), Track.class);
Optional<Track> track = tracksRepository.findById(trackId);
if (track.isPresent()) {
return new ResponseEntity<>(track.get(), HttpStatus.OK);
}
return new ResponseEntity<>(HttpStatus.NOT_FOUND);
}
with this Mongo config:
@Configuration
@EnableMongoRepositories(basePackages = "data.store.repositories")
public class MongoConfig extends AbstractMongoClientConfiguration {
private final Logger LOGGER = Logger.getLogger(this.getClass().getSimpleName());
@Primary
@Bean
@Override
public MongoClient mongoClient() {
return MongoClients.create(MongoClientSettings.builder()
.applyToClusterSettings(builder -> builder.hosts(Arrays.asList(new ServerAddress(host, port))))
.build());
}
private MongoCredential mongoCredentials() {
return MongoCredential.createCredential(username, database, password.toCharArray());
}
@Bean
public MongoTemplate mongoTemplate() {
MongoTemplate mongoTemplate = new MongoTemplate(mongoClient(), getDatabaseName());
mongoTemplate.setReadPreference(ReadPreference.secondaryPreferred());
return mongoTemplate;
}
protected String getDatabaseName() {
return database;
}
@Override
public boolean autoIndexCreation() {
return false;
}
}
EDIT: Adding the class for context:
@Document("track")
public class Track {
@Id
@Field(ATTR_ID)
@JsonProperty(ATTR_ID)
public String id;
public static final String ATTR_ID = "id";
}
I always get null, even for keys that exist in my database. Could you help me find the issue?
Thanks in advance.
I tried this with a similar configuration class and found that the following worked fine for creating/accessing data using MongoTemplate.
The POJO class:
public class Test {
@MongoId(FieldType.OBJECT_ID)
private String id;
private String name;
public Test() {
}
public Test(String s) {
super();
this.name = s;
}
// get, set methods
public String toString() {
return id + " - " + name;
}
}
From Spring's CommandLineRunner.run():
// Insert a document into the database
Test t1 = new Test("alpha");
t1 = mt.insert(t1);
System.out.println(t1); // 61e7de9f5aadc2077d9f4a58 - alpha
// Query from the database using the _id
ObjectId id = new ObjectId("61e7de9f5aadc2077d9f4a58");
Test t2 = mt.findById(id, Test.class);
System.out.println(t2);
Note that you need to do this from the class where you are running the code:
@Autowired private MongoTemplate mt;
You can use the @MongoId or @Id annotation in your POJO class to represent the MongoDB _id field. The type of the field can be a String or an ObjectId, depending on how you define it.
See the Spring Data MongoDB documentation on how the _id field is handled in the mapping layer, using:
@MongoId
@Id
The solution is to annotate the id field with @MongoId and the ObjectId field type:
@MongoId(FieldType.OBJECT_ID)
private String id;
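Applied to the Track class from the question, a minimal sketch of this might look as follows (an illustration of the idea, not the poster's confirmed fix; the @Field/@JsonProperty mapping is omitted for brevity):
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.FieldType;
import org.springframework.data.mongodb.core.mapping.MongoId;

@Document("track")
public class Track {
    // Stored as an ObjectId in MongoDB, exposed as a String in Java.
    @MongoId(FieldType.OBJECT_ID)
    private String id;

    public String getId() {
        return id;
    }
}
With this mapping, tracksRepository.findById(trackId) should be able to match the 24-character hex string against the stored _id.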
I am writing a PUT request API with Spring and MongoDB, but save() inserts a new object instead of updating the current one.
@Document("Test")
public class Expense {
@Field(name = "name")
private String expenseName;
@Field(name = "category")
private ExpenseCategory expenseCategory;
@Field(name = "amount")
private BigDecimal expenseAmount;
public Expense( String expenseName, ExpenseCategory expenseCategory, BigDecimal expenseAmount) {
this.expenseName = expenseName;
this.expenseCategory = expenseCategory;
this.expenseAmount = expenseAmount;
}
public String getExpenseName() {
return expenseName;
}
public void setExpenseName(String expenseName) {
this.expenseName = expenseName;
}
public ExpenseCategory getExpenseCategory() {
return expenseCategory;
}
public void setExpenseCategory(ExpenseCategory expenseCategory) {
this.expenseCategory = expenseCategory;
}
public BigDecimal getExpenseAmount() {
return expenseAmount;
}
public void setExpenseAmount(BigDecimal expenseAmount) {
this.expenseAmount = expenseAmount;
}
}
This is my repository class:
public interface ExpenseRepository extends MongoRepository<Expense, String> {
}
This is my service class, which shows how I update the object:
@Service
public class ExpenseService {
private final ExpenseRepository expenseRepository;
public ExpenseService(ExpenseRepository expenseRepository) {
this.expenseRepository = expenseRepository;
}
public void updateExpense(String id, Expense expense){
Expense savedExpense = expenseRepository.findById(id)
.orElseThrow(() -> new RuntimeException(
String.format("Cannot Find Expense by ID %s", id)));
savedExpense.setExpenseName(expense.getExpenseName());
savedExpense.setExpenseAmount(expense.getExpenseAmount());
savedExpense.setExpenseCategory(expense.getExpenseCategory());
expenseRepository.save(savedExpense);
}
}
This is my controller:
@RestController
@RequestMapping("/api/expense")
public class ExpenseController {
private final ExpenseService expenseService;
public ExpenseController(ExpenseService expenseService) {
this.expenseService = expenseService;
}
@PutMapping("/{id}")
public ResponseEntity<Object> updateExpense(@PathVariable String id, @RequestBody Expense expense){
expenseService.updateExpense(id, expense);
return ResponseEntity.ok().build();
}
}
As shown in MongoDB Compass, MongoDB auto-generates an _id field for every object, so I do not define an id field or use the @Id annotation to define a primary key for the collection. However, in the service class, expenseRepository.findById(id) retrieves the desired object and updates it. Why does save() do an insert instead of an update? Many thanks.
JPA can't find the existing entry because no id field is set. You need to add an id field and set the generation type to auto:
@Id
@GeneratedValue(strategy = GenerationType.AUTO)
private int id;
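For the Expense document from the question, a minimal sketch of the same idea using Spring Data MongoDB's @Id (my adaptation, not part of the original answer):
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

@Document("Test")
public class Expense {
    // Carries the MongoDB-generated _id; when this is populated on the object
    // returned by findById(), save() performs an update rather than an insert.
    @Id
    private String id;

    // existing fields, constructor and accessors unchanged
}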
Hey, I just began playing around with ModelMapper to map jOOQ records to POJOs.
This is the schema for the table whose records I am attempting to convert (PostgreSQL):
CREATE TABLE IF NOT EXISTS actor(
actor_id UUID DEFAULT uuid_generate_v4(),
first_name VARCHAR(256) NOT NULL,
last_name VARCHAR(256) NOT NULL,
PRIMARY KEY(actor_id)
);
Here is what the POJO looks like:
@JsonDeserialize(builder = Actor.Builder.class)
public class Actor {
private final UUID actorId;
private final String firstName;
private final String lastName;
private Actor(final Builder builder) {
actorId = builder.actorId;
firstName = builder.firstName;
lastName = builder.lastName;
}
public static Builder newBuilder() {
return new Builder();
}
public UUID getActorId() {
return actorId;
}
public String getFirstName() {
return firstName;
}
public String getLastName() {
return lastName;
}
@JsonIgnoreProperties(ignoreUnknown = true)
public static final class Builder {
private UUID actorId;
private String firstName;
private String lastName;
private Builder() {
}
public Builder withActorId(final UUID val) {
actorId = val;
return this;
}
public Builder withFirstName(final String val) {
firstName = val;
return this;
}
public Builder withLastName(final String val) {
lastName = val;
return this;
}
public Actor build() {
return new Actor(this);
}
}
}
I am creating a ModelMapper bean in my application and registering a UUID converter with it.
@Bean
public ModelMapper modelMapper() {
final ModelMapper mapper = new ModelMapper();
Provider<UUID> uuidProvider = new AbstractProvider<UUID>() {
@Override
public UUID get() {
return UUID.randomUUID();
}
};
final Converter<String, UUID> uuidConverter = new AbstractConverter<>() {
@Override
protected UUID convert(final String source) {
return UUID.fromString(source);
}
};
mapper.createTypeMap(String.class, UUID.class);
mapper.addConverter(uuidConverter);
mapper.getTypeMap(String.class, UUID.class).setProvider(uuidProvider);
mapper.getConfiguration()
.setSourceNameTokenizer(NameTokenizers.UNDERSCORE)
.addValueReader(new RecordValueReader())
.setDestinationNameTransformer(NameTransformers.builder("with"))
.setDestinationNamingConvention(NamingConventions.builder("with"));
mapper.validate();
return mapper;
}
I then use the ModelMapper to map the ActorRecord from the jOOQ autogenerated code to the POJO:
public Optional<Actor> getActor(final UUID actorId) {
return Optional.ofNullable(dsl.selectFrom(ACTOR)
.where(ACTOR.ACTOR_ID.eq(actorId))
.fetchOne())
.map(e -> modelMapper.map(e, Actor.Builder.class).build());
}
This works except the UUID is always null. For example:
{"actor_id":null,"first_name":"John","last_name":"Doe"}
However when I change the following in the Builder:
public Builder withActorId(final String val) {
actorId = UUID.fromString(val);
return this;
}
It works! Unfortunately this does not work with an overloaded method:
public Builder withActorId(final String val) {
actorId = UUID.fromString(val);
return this;
}
public Builder withActorId(final UUID val) {
actorId = val;
return this;
}
This also returns null.
You can see from the autogenerated jOOQ code it should be handling a UUID:
/**
* The column <code>public.actor.actor_id</code>.
*/
public final TableField<ActorRecord, UUID> ACTOR_ID = createField(DSL.name("actor_id"), org.jooq.impl.SQLDataType.UUID.nullable(false).defaultValue(org.jooq.impl.DSL.field("uuid_generate_v4()", org.jooq.impl.SQLDataType.UUID)), this, "");
I am not sure exactly what I am missing. I do not want to create a custom converter for each of my entities, as I have a lot of them and they all contain (at least one) UUID. Ideally I want to configure the ModelMapper to know about UUID so that whenever it sees one it can handle it. Thanks!
NOTE: I also tried this with a Lombok @Data object and it does not work either.
@JsonDeserialize(builder = Actor.ActorBuilder.class)
@Data
public class Actor {
private UUID actorId;
private String firstName;
private String lastName;
@JsonPOJOBuilder(withPrefix = "with")
public static class ActorBuilder {
}
}
UUID.fromString(val) is not allowed. I had the same problem yesterday. Try adding a converter to the ModelMapper configuration that converts UUID to String.
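A minimal sketch of how such a converter could be registered on the modelMapper() bean from the question (my reading of the suggestion, not a verified fix; Converter and AbstractConverter are the same ModelMapper types used in the question's configuration):
// Convert UUID values to String, as suggested above.
final Converter<UUID, String> uuidToString = new AbstractConverter<>() {
    @Override
    protected String convert(final UUID source) {
        return source == null ? null : source.toString();
    }
};
mapper.addConverter(uuidToString);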
The Mongo docs (http://mongodb.github.io/mongo-java-driver/3.11/bson/pojos/) state:
The easiest way to enable a discriminator is to annotate the abstract
class with the Discriminator annotation.
But the problem I see now is that there is no @Discriminator annotation in the 3.11 driver. I'm not sure whether it is the same as @BsonDiscriminator, which I found at https://www.javadoc.io/static/org.mongodb/mongo-java-driver/3.11.0/org/bson/codecs/pojo/annotations/package-summary.html
How should I use @Discriminator, and where is it?
It seems this is a mistake in the Mongo docs. I used @BsonDiscriminator and it works.
I was not able to find a good example, so I am posting what I implemented to check this. Please note that I used Jackson, although that is not shown in this answer, so some of the annotations on Parent, Pojo1, and Pojo2 could be removed:
@JsonTypeInfo(include=JsonTypeInfo.As.WRAPPER_OBJECT, use=JsonTypeInfo.Id.NAME)
@JsonSubTypes({
@JsonSubTypes.Type(value = Pojo1.class),
@JsonSubTypes.Type(value = Pojo2.class)})
@BsonDiscriminator
public abstract class Parent {
}
Child classes:
@Data
@NoArgsConstructor
public class Pojo1 extends Parent {
String string;
Integer number;
Boolean flag;
}
@Data
@NoArgsConstructor
public class Pojo2 extends Parent {
String string;
Integer number;
Boolean flag;
}
Container class:
@Data
@NoArgsConstructor
public class Container {
private String name;
private List<Parent> pojos;
}
ContainerDAO persists and reads a Container object, which contains a collection of Parent objects:
public class ContainerDAO {
private static final MongoDatabase DATABASE = MongoDB.getDatabase();
private static final MongoCollection<Container> CONTAINER_COLLECTION =
DATABASE.getCollection("containers", Container.class);
static {
CONTAINER_COLLECTION.createIndex(
Indexes.ascending("name"), new IndexOptions().unique(true));
}
public static void create(Container container){
Bson filter =
eq("name", container.getName());
ReplaceOptions options = new ReplaceOptions().upsert(true);
CONTAINER_COLLECTION.replaceOne(filter, container, options);
}
public static Container getByName(String name) {
Container container = CONTAINER_COLLECTION.find(
eq("name", name))
.first();
return container;
}
public static void deleteOne(String name) {
Bson filter =
eq("name", name);
CONTAINER_COLLECTION.deleteOne(filter);
}
}
And the database connection object:
/**
* MongoDb database and connection settings class
*/
public class MongoDB {
private static final MongoDatabase DATABASE;
static {
...
PojoCodecProvider pojoCodecProvider = PojoCodecProvider
.builder().register(com.researchforgood.survey.jackson.Parent.class, com.researchforgood.survey.jackson.Pojo1.class, com.researchforgood.survey.jackson.Pojo2.class).build();
CodecRegistry pojoCodecRegistry = fromRegistries(MongoClientSettings.getDefaultCodecRegistry(),
fromProviders(pojoCodecProvider, PojoCodecProvider.builder().automatic(true).build()));
MongoClientSettings settings = MongoClientSettings.builder()
.codecRegistry(pojoCodecRegistry)
.applyConnectionString(new ConnectionString(settingsMap.get("url")))
.build();
MongoClient mongoClient = MongoClients.create(settings);
DATABASE = mongoClient.getDatabase(settingsMap.get("database"));
}
public static MongoDatabase getDatabase() {
return DATABASE;
}
}
And here is the test, as an example of usage:
@Test
public void saveAndRestorePojo() throws JsonProcessingException {
ContainerDAO.create(container);
Container containerFromDB = ContainerDAO.getByName(container.getName());
LOG.info(containerFromDB.toString());
assertEquals(containerFromDB.getPojos().get(0).getClass(), Pojo1.class);
assertEquals(((Pojo1)containerFromDB.getPojos().get(0)).getString(), "Hello1!");
}
I would like to store a method in a helper class and call that method from another class. The method also fetches data from a JPA repository.
For some reason, when I call the method from the helper class, I get an error: Cannot resolve method 'getDocumentListByProduit' in 'DocumentHelper'. The method name doesn't show up in the IDE's autocompletion either. It's as if the method isn't mapped for some reason.
Any hints why? Thanks in advance.
The class from which I wish to call the method:
@Entity
@Table(name = "document", schema = "table_name")
public class Document {
private int id;
private String url;
private String type;
private String titre;
private String description;
@Autowired
private DocumentHelper dh;
...
public Map<String, List<Document>> getDocumentListByProduit(int id){
Map<String, List<Document>> ret = dh.getDocumentListByProduit(id);
return ret;
}
The helper class:
@Component
public class DocumentHelper {
@Autowired
private DocumentRepository dr;
public DocumentHelper() {
}
public Map<String, List<Document>> getDocumentListByProduit(int id) {
Map<String, List<Document>> ret = new HashMap<>();
List<Document> listImg = new ArrayList<>();
List<Document> listOther = new ArrayList<>();
List<Document> dList = new ArrayList<>();
try {
dList = dr.getDocumentListByProduit(id);
for (Document tDoc : dList) {
if (tDoc.getType().equals("image")) {
listImg.add(tDoc);
} else {
listOther.add(tDoc);
}
}
ret.put("imageCollection", listImg);
ret.put("otherCollection", listOther);
} catch (Exception e) {
throw new DAOException("Une erreur est survenue : " + e.getMessage());
}
return ret;
}
}
Then the repository:
public interface DocumentRepository extends JpaRepository<Document, Integer> {
// in method getDocumentListByProduit in DocumentHelper
@Query(value = "SELECT * FROM DOCUMENT D, DOCUMENT_PRODUIT DP WHERE D.id = DP.id_document AND DP.id_produit = :id_produit;", nativeQuery = true)
List<Document> getDocumentListByProduit(@Param("id_produit") int id_produit);
}
JPA doesn't use the Spring container to instantiate its entities, so Spring does not inject dependencies into entities by default.
You can inject dependencies into objects not managed by the Spring container using @Configurable, as described here. This approach requires configuring AspectJ in the project.
Another way would be to inject dependencies manually after JPA has constructed an entity, using AutowireCapableBeanFactory#autowireBean. This approach may be considered bad practice because of its repetitiveness if you need it in more than one place.
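A minimal sketch of the second approach, assuming a hypothetical Spring-managed helper (the class and method names here are illustrative, not from the question):
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.config.AutowireCapableBeanFactory;
import org.springframework.stereotype.Component;

@Component
public class EntityWiring {
    @Autowired
    private AutowireCapableBeanFactory beanFactory;

    // Call this on an entity after JPA has constructed it so that its
    // @Autowired fields (such as DocumentHelper) get populated.
    public void wire(Document document) {
        beanFactory.autowireBean(document);
    }
}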
Elasticsearch indexes new records created through the UI, but records created by a Liquibase file are not indexed, so they don't appear in search results. Elasticsearch should index all records, whether created through the UI or by Liquibase files. Is there any process for indexing the records from Liquibase files?
Liquibase only makes changes to your database. Unless you have some process which listens to the database changes and then updates Elasticsearch, you will not see the changes.
There might be multiple ways to get your database records into Elasticsearch:
Your UI probably calls some back-end code to index a create or an update into Elasticsearch already
Have a batch process which knows which records have changed (e.g. using an updated flag column or an updated_timestamp column) and then index those into Elasticsearch.
The second option can be done in code, either as a scripted or back-end scheduled job (a sketch follows below), or you might be able to use Logstash with the jdbc-input plugin.
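A minimal sketch of such a scheduled job, assuming a hypothetical JPA repository with an updated flag column and a matching Spring Data Elasticsearch repository (all names here are illustrative, not taken from the question; scheduling also requires @EnableScheduling on a configuration class):
import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class RecordIndexingJob {
    @Autowired
    private RecordRepository recordRepository;              // hypothetical JPA repository
    @Autowired
    private RecordSearchRepository recordSearchRepository;  // hypothetical Elasticsearch repository

    // Periodically pick up records flagged as updated and push them to Elasticsearch.
    @Scheduled(fixedDelay = 60000)
    public void indexUpdatedRecords() {
        List<Record> changed = recordRepository.findByUpdatedTrue(); // hypothetical derived query
        if (!changed.isEmpty()) {
            recordSearchRepository.saveAll(changed);
            changed.forEach(r -> r.setUpdated(false)); // clear the flag after indexing
            recordRepository.saveAll(changed);
        }
    }
}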
As Sarwar Bhuiyan and Mogsdad said,
Unless you have some process which listens to the database changes and
then updates Elasticsearch
You can use Liquibase to populate Elasticsearch (this task will be executed once, just like a normal migration). To do this you need to create a customChange:
<customChange class="org.test.ElasticMigrationByEntityName">
<param name="entityName" value="org.test.TestEntity" />
</customChange>
In that Java-based migration you can call the services you need. Here is an example of what you can do (please do not use the code from this example in production).
public class ElasticMigrationByEntityName implements CustomTaskChange {
private String entityName;
public String getEntityName() {
return entityName;
}
public void setEntityName(String entityName) {
this.entityName = entityName;
}
@Override
public void execute(Database database) {
//We schedule the task for the next execution. We are waiting for the context to start and we get access to the beans
DelayedTaskExecutor.add(new DelayedTask(entityName));
}
@Override
public String getConfirmationMessage() {
return "OK";
}
@Override
public void setUp() throws SetupException {
}
@Override
public void setFileOpener(ResourceAccessor resourceAccessor) {
}
@Override
public ValidationErrors validate(Database database) {
return new ValidationErrors();
}
/* ===================== */
public static class DelayedTask implements Consumer<ApplicationContext> {
private final String entityName;
public DelayedTask(String entityName) {
this.entityName = entityName;
}
@Override
public void accept(ApplicationContext applicationContext) {
try {
checkedAccept(applicationContext);
} catch (Exception e) {
throw new RuntimeException(e);
}
}
//We're going to find beans by name (the most controversial point)
private void checkedAccept(ApplicationContext context) throws ClassNotFoundException {
Class entityClass = Class.forName(entityName);
String name = entityClass.getSimpleName();
//Please do not use this code in production
String repositoryName = org.apache.commons.lang3.StringUtils.uncapitalize(name + "Repository");
String repositorySearchName = org.apache.commons.lang3.StringUtils.uncapitalize(name + "SearchRepository");
JpaRepository repository = (JpaRepository) context.getBean(repositoryName);
ElasticsearchRepository searchRepository = (ElasticsearchRepository) context.getBean(repositorySearchName);
//Doing our work
updateData(repository, searchRepository);
}
//Write your logic here
private void updateData(JpaRepository repository, ElasticsearchRepository searchRepository) {
searchRepository.saveAll(repository.findAll());
}
}
}
Because the beans have not yet been created, we will have to wait for them:
@Component
public class DelayedTaskExecutor {
@Autowired
private ApplicationContext context;
@EventListener
//We are waiting for the app to launch
public void onAppReady(ApplicationReadyEvent event) {
Queue<Consumer<ApplicationContext>> localQueue = getQueue();
if(localQueue.size() > 0) {
for (Consumer<ApplicationContext> consumer = localQueue.poll(); consumer != null; consumer = localQueue.poll()) {
consumer.accept(context);
}
}
}
public static void add(Consumer<ApplicationContext> consumer) {
getQueue().add(consumer);
}
public static Queue<Consumer<ApplicationContext>> getQueue() {
return Holder.QUEUE;
}
private static class Holder {
private static final Queue<Consumer<ApplicationContext>> QUEUE = new ConcurrentLinkedQueue();
}
}
An entity example:
@Entity
@Table(name = "test_entity")
@Document(indexName = "testentity")
public class TestEntity implements Serializable {
private static final long serialVersionUID = 1L;
@Id
@Field(type = FieldType.Keyword)
@GeneratedValue(generator = "uuid")
@GenericGenerator(name = "uuid", strategy = "uuid2")
private String id;
@NotNull
@Column(name = "code", nullable = false, unique = true)
private String code;
...
}