I use Java Spring with a MongoDB repository in my project.
Here is the repository definition:
@Repository
public interface InfoRepository extends MongoRepository<Info, String> {
    List<InfoDto> findAll();
}
Here is the Info definition:
@Document("info")
@Data
public class Info {
    @Id
    private String id = null;
    private String name;
    private String companyName;
    private String email;
    private Address address;
    private String website;
}
Here is the InfoDto class definition:
@Data
public class InfoDto {
    private String name;
    private String companyName;
    private Address address;
}
When I run the project, I get this error:
'findAll()' in '...repository.InfoRepository' clashes with 'findAll()'
in 'org.springframework.data.mongodb.repository.MongoRepository'; attempting to use incompatible return type
To prevent the clash, I changed the repository method's name from this:
List<InfoDto> findAll();
to this:
List<InfoDto> findAllMakeProjection();
But after making the change above and calling the method, I get this error:
org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'infoServiceImpl' defined in file
[...\InfoServiceImpl.class]:
Unsatisfied dependency expressed through constructor parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException:
Error creating bean with name 'infoRepository' defined in ".../repository.InfoRepository" defined in #EnableMongoRepositories
declared on MongoRepositoriesRegistrar.EnableMongoRepositoriesConfiguration:
Invocation of init method failed; nested exception is org.springframework.data.mapping.PropertyReferenceException:
No property findAllMakeProjection found for type Info!
Any idea why I get the error and how to fix it?
List<T> findAll() is a method provided by the MongoRepository interface, so it's not possible to change its return type in sub-interfaces. At most, you can narrow the return type to a List implementation such as ArrayList<T> or LinkedList<T>.
If you change the method name to List<InfoDto> findAllMakeProjection(), Spring Data MongoDB will try to derive a query from the method name, but there is no property named findAllMakeProjection, so it throws a PropertyReferenceException.
However, you are allowed to add anything before the By keyword in a method name, e.g. findAllBy, findEverythingBy, findDataBy. Anything after By acts as a filter (a where condition), and if you don't add anything after By, it behaves like findAll (no filters).
So, change the method name accordingly and you will be able to run your query.
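As a sketch, the renamed method could look like this (this assumes Spring Data's class-based DTO projection picks up the matching InfoDto fields; findAllBy is the only change from the original code):

```java
@Repository
public interface InfoRepository extends MongoRepository<Info, String> {
    // "By" with nothing after it means no filter, so this behaves like
    // findAll() while letting Spring Data project each Info into an InfoDto
    List<InfoDto> findAllBy();
}
```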
What happens here is that findAll() is a built-in repository method provided by Spring Data, so if you introduce your own findAll() inside your custom repository (no matter whether it is a JpaRepository or a MongoRepository), it will clash with the inherited findAll().
Changing the method name to List<InfoDto> findAllMakeProjection(); makes Spring Data try to derive the query from the method name, so it will attempt to extract entity properties from it unless you define the query with the @Query annotation.
So if you want to do this, the method should be named something like findAllBySomeCondition or findBySomeCondition,
e.g. findByNameAndCompanyName(), findByEmail(), findAllByCompanyName()
The best way is to remove List<InfoDto> findAll(); from InfoRepository. You can still call
@Autowired
private InfoRepository infoRepository;
.......
infoRepository.findAll();
and this will return a List<Info>.
And you cannot simply return a list of DTO objects directly through MongoRepository the way you did; it will return a list of model objects (List<Info>). To return a list of DTOs, use a projection, for example via the fields attribute of @Query:
@Repository
public interface InfoRepository extends MongoRepository<Info, String> {
    @Query(value = "{}", fields = "{ 'name' : 1, 'companyName' : 1, 'address' : 1 }")
    List<InfoDto> findAllInfos();
}
You might need to get a bit tricky and do additional work mapping address from the entity to the DTO.
Related
In the application I am working on, I have several Entity classes. For example:
@Entity
public class Person implements Serializable {
    @Column
    private String columnA;

    public String getColumnA() {
        return columnA;
    }

    public void setColumnA(String columnA) {
        this.columnA = columnA;
    }
}
I have accompanying Repositories:
public interface PersonRepository extends CrudRepository<Person, Long> {
    Person findByPersonID(Long id);
}
Then I utilize the repositories like so:
@Autowired
private PersonRepository personRepository;
I have several Entities (e.g., Person, Status, Training, Benefits, etc.)
I need to make a very specialized report query that uses multiple joins across multiple tables. I have the query working in MySQL Workbench.
So, I created a new Repository:
@Repository
public interface ReportRepository extends CrudRepository<Report, Long> {
    @Query("SELECT ...")
    List<Report> queryReportData(String columnA, String columnB, String columnC, ...);
}
Where Report is just a POJO with the fields I need:
public class Report {
    private String columnA;
    private String columnB;
    // etc ...
    // Getters and Setters here
}
My issue is when I try to use the repository like so:
@Autowired
private ReportRepository reportRepository;
I get run-time errors:
creating bean with name 'genController': Unsatisfied dependency expressed through field 'reportRepository'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'reportRepository': Invocation of init method failed; nested exception is java.lang.IllegalArgumentException: Not a managed type: class com.me.Report
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'reportRepository': Invocation of init method failed; nested exception is java.lang.IllegalArgumentException: Not a managed type: class com.me.Report
Caused by: java.lang.IllegalArgumentException: Not a managed type: class com.me.Report
Where genController is:
@RestController
public class GenController {
}
Granted, Report.java is not an actual @Entity as it is not "really" in the database.
So, am I going about this completely wrong, or am I kind of on the right path?
What do I need to do in order to get the data I need using my custom, cross table, query?
If Report is not an entity, you cannot do that. If you are already using a native query, you can just put this method into an entity repository, such as PersonRepository, and autowire the personRepository instead of the report repository.
Your Report object needs to be annotated with @Entity so it will be picked up by JPA.
Without that annotation, Spring, as you can see in your stack trace, throws an exception when it tries to construct an instance of the CrudRepository, because Report is not a managed type.
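A minimal sketch of that change (a JPA entity also needs an identifier, so an @Id field with an assumed auto-generated Long key is added here):

```java
@Entity
public class Report {
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id; // required: every JPA entity must have an identifier

    private String columnA;
    private String columnB;
    // etc ...
}
```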
I am using AWS ECS to host my application and DynamoDB for all database operations. So I'll have the same database with different table names for different environments, such as "dev_users" (for the Dev env), "test_users" (for the Test env), etc. (This is how our company uses the same DynamoDB account for different environments.)
So I would like to change the "tableName" of the model class using an environment variable passed through the "AWS ECS task definition" environment parameters.
For Example.
My Model Class is:
@DynamoDBTable(tableName = "dev_users")
public class User {
Now I need to replace the "dev" with "test" when I deploy my container in the test environment. I know I can use
@Value("${DOCKER_ENV:dev}")
to access environment variables, but I'm not sure how to use variables outside a class. Is there any way I can use the Docker env variable to select my table prefix?
My intent is to use it like this:
I know this is not possible as-is. But is there any other way or workaround for this?
Edit 1:
I am working on Rahul's answer and facing some issues. Before describing the issues, I'll explain the process I followed.
Process:
I have created the beans in my config class (com.myapp.users.config).
As I don't have repositories, I have given my model class's package name as the "basePackage" path. (Please check the image.)
For 1) I have replaced the "table name over-rider bean injection" to avoid the error.
For 2) I printed the name that is passed to this method, but it is null, so I am checking all the possible ways to pass the value here.
Check the image for the error:
I haven't changed anything in my User model class, as the beans should replace the name of the DynamoDBTable when they are executed. But the table name overriding is not happening; data is still pulled from the table name given at the model class level.
What am I missing here?
The table names can be overridden via a custom DynamoDBMapperConfig bean.
For your case, where you have to prefix each table with a literal, you can add the bean as follows. Here the prefix can be the environment name.
@Bean
public TableNameOverride tableNameOverrider() {
    String prefix = ... // Use @Value to inject values via Spring or use any logic to define the table prefix
    return TableNameOverride.withTableNamePrefix(prefix);
}
For the complete details, check out:
https://github.com/derjust/spring-data-dynamodb/wiki/Alter-table-name-during-runtime
I was able to get table names prefixed with the active profile name.
First, I added a TableNameResolver class as below:
@Component
public class TableNameResolver extends DynamoDBMapperConfig.DefaultTableNameResolver {
    private String envProfile;

    public TableNameResolver() {}

    public TableNameResolver(String envProfile) {
        this.envProfile = envProfile;
    }

    @Override
    public String getTableName(Class<?> clazz, DynamoDBMapperConfig config) {
        String stageName = envProfile.concat("_");
        String rawTableName = super.getTableName(clazz, config);
        return stageName.concat(rawTableName);
    }
}
Then I set up the DynamoDBMapper bean as below:
@Bean
@Primary
public DynamoDBMapper dynamoDBMapper(AmazonDynamoDB amazonDynamoDB) {
    DynamoDBMapper mapper = new DynamoDBMapper(amazonDynamoDB,
            new DynamoDBMapperConfig.Builder().withTableNameResolver(new TableNameResolver(envProfile)).build());
    return mapper;
}
The envProfile variable holds the active profile value, read from the application.properties file:
@Value("${spring.profiles.active}")
private String envProfile;
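Stripped of the AWS SDK types, the resolver's naming logic boils down to string concatenation. This plain-Java stand-in (the class and method names here are illustrative, not SDK types) shows the behavior:

```java
// Minimal stand-in for TableNameResolver's naming logic, with no AWS SDK dependency.
class TableNamePrefixer {
    private final String envProfile;

    TableNamePrefixer(String envProfile) {
        this.envProfile = envProfile;
    }

    // Mirrors getTableName(): prepend "<profile>_" to the raw table name.
    String resolve(String rawTableName) {
        return envProfile + "_" + rawTableName;
    }
}
```

For example, new TableNamePrefixer("test").resolve("users") yields "test_users", matching the table expected in the Test environment.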
We had the same issue with needing to change table names at runtime. We are using spring-data-dynamodb 5.0.2, and the following configuration provides the solution we need.
First, I annotated my bean accessor:
@EnableDynamoDBRepositories(dynamoDBMapperConfigRef = "getDynamoDBMapperConfig", basePackages = "my.company.base.package")
I also set up an environment variable called ENV_PREFIX, which Spring wires via SpEL:
@Value("#{systemProperties['ENV_PREFIX']}")
private String envPrefix;
Then I set up a TableNameOverride bean:
@Bean
public DynamoDBMapperConfig.TableNameOverride getTableNameOverride() {
    return DynamoDBMapperConfig.TableNameOverride.withTableNamePrefix(envPrefix);
}
Finally, I set up the DynamoDBMapperConfig bean using TableNameOverride injection. In 5.0.2, we had to set a standard DynamoDBTypeConverterFactory in the DynamoDBMapperConfig builder to avoid an NPE:
@Bean
public DynamoDBMapperConfig getDynamoDBMapperConfig(DynamoDBMapperConfig.TableNameOverride tableNameOverride) {
    DynamoDBMapperConfig.Builder builder = new DynamoDBMapperConfig.Builder();
    builder.setTableNameOverride(tableNameOverride);
    builder.setTypeConverterFactory(DynamoDBTypeConverterFactory.standard());
    return builder.build();
}
In hindsight, I could have set up a DynamoDBTypeConverterFactory bean that returns a standard DynamoDBTypeConverterFactory and injected that into the getDynamoDBMapperConfig() method via the builder, but this also does the job.
I upvoted the other answer, but here is an idea:
Create a base class with all your user details:
@MappedSuperclass
public abstract class AbstractUser {
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;
    private String firstName;
    private String lastName;
}
Create two implementations with different table names and Spring profiles:
@Profile(value = {"dev", "default"})
@Entity(name = "dev_user")
public class DevUser extends AbstractUser {
}

@Profile(value = {"prod"})
@Entity(name = "prod_user")
public class ProdUser extends AbstractUser {
}
Create a single JPA repository that uses the mapped superclass:
public interface UserRepository extends CrudRepository<AbstractUser, Long> {
}
Then switch the implementation with the Spring profile:
@RunWith(SpringJUnit4ClassRunner.class)
@DataJpaTest
@Transactional
public class UserRepositoryTest {
    @Autowired
    protected DataSource dataSource;

    @BeforeClass
    public static void setUp() {
        System.setProperty("spring.profiles.active", "prod");
    }

    @Test
    public void test1() throws Exception {
        DatabaseMetaData metaData = dataSource.getConnection().getMetaData();
        ResultSet tables = metaData.getTables(null, null, "PROD_USER", new String[] { "TABLE" });
        tables.next();
        assertEquals("PROD_USER", tables.getString("TABLE_NAME"));
    }
}
I have a Spring Boot application with MongoDB as the persistence layer, and the following structure:
public class Resource {
    @Id
    public String Id;
    ...
}
I also have a ResourceRepository:
@RepositoryRestResource(collectionResourceRel = "resources", path = "resources")
public interface ResourceRepository extends MongoRepository<Resource, String> {
    Resource findById(@Param("Id") String Id);
}
I found online that one way to have the id property returned in the JSON of a GET request like http://localhost:8080/resources/ is to change the id property to Id (uppercase I). Indeed, if the property is lowercase I don't get back an id field, but if I change it to uppercase I do. Since I need to get the id property back, I used the uppercase I. So far, so good.
However, when I tried to execute the query findById included in my repository I get an exception:
org.springframework.data.mapping.context.InvalidPersistentPropertyPath: No property id found on app.model.Resource!
If I change the Id property to id (lowercase i) I can execute successfully the /resources/search/findById?id=... GET request.
I tried creating a custom controller with a query that finds and returns a Resource based on the given id:
@Controller
@RequestMapping("/resource")
public class ResourceController {
    @Autowired
    MongoOperations mongoOperations;

    @RequestMapping(value = "/findById/{resourceId}/", method = RequestMethod.GET)
    @ResponseBody
    public Resource findByResourceId(@PathVariable("resourceId") String resourceId) {
        Resource resource = mongoOperations.findOne(query(Criteria.where("Id").is(resourceId)), Resource.class, "DOJ");
        return resource;
    }
}
but I receive the same error:
org.springframework.data.mapping.context.InvalidPersistentPropertyPath: No property id found on app.model.Resource!
Any idea how to both have the id property displayed in the JSON and be able to findById?
Well, I found the answer myself. Switch back to lowercase id so findById works, and add the following class to the project:
@Configuration
public class SpringDataRestConfiguration extends RepositoryRestConfigurerAdapter {
    @Override
    public void configureRepositoryRestConfiguration(RepositoryRestConfiguration config) {
        config.exposeIdsFor(Resource.class);
    }
}
As the method name suggests, this configuration makes Resource class objects expose their ids in the JSON.
UPDATE: If you are using the latest (or a relatively recent) version of spring-boot, the RepositoryRestConfigurerAdapter class has been deprecated, and the Javadoc suggests using the RepositoryRestConfigurer interface directly.
So your code should look like this:
@Configuration
public class SpringDataRestConfiguration implements RepositoryRestConfigurer
...
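A sketch of the updated class under that interface (note: in recent spring-data-rest versions the callback may also take a CorsRegistry parameter, so check the RepositoryRestConfigurer Javadoc for the exact signature in your version):

```java
@Configuration
public class SpringDataRestConfiguration implements RepositoryRestConfigurer {
    @Override
    public void configureRepositoryRestConfiguration(RepositoryRestConfiguration config, CorsRegistry cors) {
        // Same effect as before: expose the id field of Resource in JSON responses
        config.exposeIdsFor(Resource.class);
    }
}
```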
I am trying to query spring data elasticsearch repositories for nested properties. My Repository looks like this:
public interface PersonRepository extends ElasticsearchRepository<Person, Long> {
    List<Person> findByAddressZipCode(String zipCode);
}
The domain objects Person and Address (without getters/setters) are defined as follows:
@Document(indexName = "person")
public class Person {
    @Id
    private Long id;
    private String name;

    @Field(type = FieldType.Nested, store = true, index = FieldIndex.analyzed)
    private Address address;
}
public class Address {
    private String zipCode;
}
My test saves one Person document and tries to read it back with the repository method, but no results are returned. Here is the test method:
@Test
public void testPersonRepo() throws Exception {
    Person person = new Person();
    person.setName("Rene");
    Address address = new Address();
    address.setZipCode("30880");
    person.setAddress(address);
    personRepository.save(person);
    elasticsearchTemplate.refresh(Person.class, true);
    assertThat(personRepository.findByAddressZipCodeContaining("30880"), hasSize(1));
}
Does Spring Data Elasticsearch support the default Spring Data query generation?
Elasticsearch indexes new documents asynchronously, in near real-time; the default refresh interval is typically 1s. So you must explicitly request a refresh (to force a flush and make the document available for search) if you want the document to be immediately searchable, as in a unit test. Your unit test therefore needs the ElasticsearchTemplate bean so that you can explicitly call refresh. Make sure you set waitForOperation to true to force a synchronous refresh. See this related answer. Kinda like this:
elasticsearchTemplate.refresh("myindex",true);
My project uses Spring Data MongoDB. I was not getting the error below until I edited one of the documents that has a field holding an array of documents. It was working fine before, but now I keep getting the error below.
The field I updated was impapps in the Project POJO class. I am not sure how to clear this error; I tried different things but nothing worked.
SEVERE: Servlet.service() for servlet [appServlet] in context with path [/mongodproject] threw exception [Request processing failed; nested exception is org.springframework.data.mapping.model.MappingInstantiationException: Could not instantiate bean class [java.util.List]: Specified class is an interface] with root cause
org.springframework.beans.BeanInstantiationException: Could not instantiate bean class [java.util.List]: Specified class is an interface
at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:101)
at org.springframework.data.convert.ReflectionEntityInstantiator.createInstance(ReflectionEntityInstantiator.java:60)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:232)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:212)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.readValue(MappingMongoConverter.java:1008)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.access$100(MappingMongoConverter.java:75)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter$MongoDbPropertyValueProvider.getPropertyValue(MappingMongoConverter.java:957)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.getValueInternal(MappingMongoConverter.java:713)
Here are my POJO and Spring Repository class.
Project POJO Class
@Document(collection = "releases")
public class Project {
    @Id
    private String id;
    ......
    @Field("impapps")
    private List<ImpactedApplications> impapps = new ArrayList<ImpactedApplications>();
    .....getters/setters
}
ImpactedApplications POJO Class:
public class ImpactedApplications {
    @Field("appid")
    private String appId;
    .....
    @Field("repository")
    private List<ScriptsRepo> rep = new ArrayList<ScriptsRepo>();
    @Field("artifacts")
    private List<Artifacts> artifacts = new ArrayList<Artifacts>();
    //getters and setters
}
Artifacts POJO Class
public class Artifacts {
    @Field("artifacttype")
    private String artifactType;
    @Field("documentlink")
    private String documentLink;
    @Field("arttestphase")
    private String artTestPhase;
    @Field("artifactname")
    private ArtifactsEnums artifactsNames;
    @Field("startdate")
    private String startDate;
    @Field("enddate")
    private String endDate;
    @Field("peerrev")
    private boolean peerReview;
    @Field("busrev")
    private boolean busReview;
    @Field("na")
    private boolean na;
}
Spring repository class:
public interface ProjectRepository extends Repository<Project, String> {
    Project findById(String id);
    List<Project> findByYearAndReleaseMonthNoOrderByProjectNameDesc(String year, String month, Sort sort);
    Project findByYearAndReleaseMonthNoAndId(String year, String month, String id);
}
Whenever I call the above methods, I get the exception.
Below is how my document is looking currently.
The impapps field in your document is not an array but a nested document. So if you change your List<ImpactedApplications> to a plain ImpactedApplications, it should read fine.
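A minimal sketch of that change in the Project class (assuming the rest of the class stays the same):

```java
@Field("impapps")
private ImpactedApplications impapps; // a single nested document, not a List
```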
I got the same exception. Try declaring your field like this:
@Field("impapps")
ArrayList<ImpactedApplications> impapps = new ArrayList<ImpactedApplications>();
I don't like doing that, but it works for me.
Edit:
My issue was due to an unwind operation during aggregation. It transforms an array (declared as a List<> in my class) into an object, and then reflection fails because Spring expects a list.
Use ArrayList someObjects instead of List someObjects. It worked for me.