How to pass a parameter value into a JDBC repository @Query - java

I'm trying to use the query lookup strategy of a JDBC repository:
@Query("select * from USERS where USERNAME = :id")
User findById(@Param("id") UserId id);
As you can see, the id parameter has the custom type UserId. How can I convert it to a simple String? Should I register some converter? Where and how?
I also tried the queries ... where USERNAME = :id.value and ... where USERNAME = :#{id.value}, but they don't work.
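For reference, a minimal UserId value type assumed in the snippets below could look like this (a sketch; the actual class is not shown in the question):
// Sketch of an assumed UserId value type (not part of the original question).
public final class UserId {
    private final String value;

    public UserId(String value) {
        this.value = value;
    }

    // Returns the raw username string wrapped by this id.
    public String value() {
        return value;
    }
}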
UPD
This is my Spring Boot configuration:
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jdbc</artifactId>
</dependency>
@Configuration
@EnableJdbcRepositories
public class PersistenceConfig extends AbstractJdbcConfiguration {
}

I analyzed the Spring source code and found that for such cases they use a @WritingConverter (before, I was thinking of a @ReadingConverter, which is why I didn't have any success). It can be registered in the usual way through AbstractJdbcConfiguration:
@Configuration
@EnableJdbcRepositories
public class PersistenceConfig extends AbstractJdbcConfiguration {
    @Override
    @Bean
    public JdbcCustomConversions jdbcCustomConversions() {
        return new JdbcCustomConversions(Arrays.asList(
                new UserIdToStringConverter()));
    }
}
where UserIdToStringConverter has the following implementation:
@WritingConverter
public class UserIdToStringConverter implements Converter<UserId, String> {
    @Override
    public String convert(UserId source) {
        return (String) source.value();
    }
}
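For completeness, reading the value back from the database typically needs the matching @ReadingConverter registered in the same JdbcCustomConversions list. A minimal sketch, assuming UserId has a single-argument String constructor:
@ReadingConverter
public class StringToUserIdConverter implements Converter<String, UserId> {
    @Override
    public UserId convert(String source) {
        // Wraps the raw USERNAME column value back into the UserId type.
        return new UserId(source);
    }
}
It would then be added alongside UserIdToStringConverter in the list passed to JdbcCustomConversions.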
UPD
I also found that ConditionalGenericConverter doesn't work. For conditional converters the getConvertibleTypes() method must return null (see here), but when I do so the matches method is never called. Looks like a Spring bug (spring-data-jdbc 2.0.4.RELEASE).

This should work:
#Query("select * from USERS where USERNAME = :#{#id.value}")
User findById(#Param("id") UserId id);
Read more at https://spring.io/blog/2014/07/15/spel-support-in-spring-data-jpa-query-definitions

Related

Multi-tenancy with a separate database per customer, using Spring Data ArangoDB

So far, the only way I know to set the database name used by Spring Data ArangoDB is by hardcoding it in the database() method while extending AbstractArangoConfiguration, like so:
@Configuration
@EnableArangoRepositories(basePackages = { "com.company.mypackage" })
public class MyConfiguration extends AbstractArangoConfiguration {
    @Override
    public ArangoDB.Builder arango() {
        return new ArangoDB.Builder();
    }
    @Override
    public String database() {
        // Name of the database to be used
        return "example-database";
    }
}
What if I'd like to implement multi-tenancy, where each tenant has data in a separate database and use e.g. a subdomain to determine which database name should be used?
Can the database used by Spring Data ArangoDB be determined at runtime, dynamically?
This question is related to the discussion here: Manage multi-tenancy ArangoDB connection - but is Spring Data ArangoDB specific.
Turns out this is delightfully simple: just change the ArangoConfiguration database() method @Override to return a Spring Expression (SpEL):
@Override
public String database() {
    return "#{tenantProvider.getDatabaseName()}";
}
which in this example references a TenantProvider @Component, which can be implemented like so:
@Component
public class TenantProvider {
    private final ThreadLocal<String> databaseName;

    public TenantProvider() {
        super();
        databaseName = new ThreadLocal<>();
    }

    public String getDatabaseName() {
        return databaseName.get();
    }

    public void setDatabaseName(final String databaseName) {
        this.databaseName.set(databaseName);
    }
}
This component can then be @Autowired anywhere in your code to set the database name, such as in a servlet filter, or in my case in an Apache Camel route Processor and in database service methods.
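As an illustration of the servlet filter approach, here is a minimal sketch that derives the database name from the request's subdomain and stores it in the TenantProvider before the request is handled (the filter and its naming scheme are assumptions, not part of the original answer; imports come from org.springframework.web.filter and the servlet API, javax.* or jakarta.* depending on your Spring version):
@Component
public class TenantFilter extends OncePerRequestFilter {

    @Autowired
    private TenantProvider tenantProvider;

    @Override
    protected void doFilterInternal(HttpServletRequest request, HttpServletResponse response,
            FilterChain filterChain) throws ServletException, IOException {
        // e.g. "acme.example.com" -> tenant "acme" -> database "acme-database" (assumed naming scheme)
        String tenant = request.getServerName().split("\\.")[0];
        tenantProvider.setDatabaseName(tenant + "-database");
        try {
            filterChain.doFilter(request, response);
        } finally {
            // Clear the ThreadLocal so the tenant does not leak into the next request handled by this thread.
            tenantProvider.setDatabaseName(null);
        }
    }
}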
P.S. I became aware of this possibility by reading the ArangoTemplate code and a Spring Expression support documentation section (via), and one merged pull request.

How to use JPQL without #Query annotation inside the method?

I want to use the dependency inversion principle in my book rental project. Before, I used an AccountRepository that extends CrudRepository, so my method looked like this:
#Query("SELECT CASE WHEN COUNT(account) > 0 THEN true ELSE false END FROM
Account account WHERE account.id =:accountID")
boolean doesAccountExistsWithGivenID(#Param("accountID") int accountID);
I've created an AccountRepository interface and a class that implements it, called PostgreSQLAccountRepository. Inside doesAccountExistsWithGivenID I want to run a query that gets the same result.
It looks like this:
package bookrental.account;

import bookrental.bookrentals.BookRentals;
import org.springframework.data.repository.CrudRepository;
import org.springframework.stereotype.Repository;

import java.util.List;

@Repository
public class PostgreSQLAccountRepository implements AccountRepository {
    private CrudRepository<Account, Integer> repository;

    public PostgreSQLAccountRepository(CrudRepository<Account, Integer> repository) {
        this.repository = repository;
    }

    @Override
    public List<BookRentals> getAccountRentalsByGivenID(int accountID) {
        //TODO
    }

    @Override
    public void deleteById(Integer id) {
        this.repository.deleteById(id);
    }

    @Override
    public List<Account> findAll() {
        return (List<Account>) this.repository.findAll();
    }

    @Override
    public boolean doesAccountExistsWithGivenID(int accountID) {
        //HERE I WANT TO USE JPQL
    }
}
I do not want to use existsById, because I have a lot of methods that use JPQL, so I need to know how to implement it inside the method.
The documentation is clear on how to customize methods from a Data repository:
https://docs.spring.io/spring-data/jpa/docs/current/reference/html/#repositories.custom-implementations
Basically, define the fragment interface you want to customize (CustomizedRepository). Extend this interface in your data repository:
interface SomeRepository extends CrudRepository<...>, CustomizedRepository
Create an implementation of CustomizedRepository called CustomizedRepositoryImpl. The Impl postfix is critical here. See the docs for more customizations. (A complete sketch combining these pieces follows after the SessionFactory example below.)
You will need to get hold of the Hibernate SessionFactory and use it manually, for example by unwrapping it from the EntityManagerFactory:
@Autowired
public void setSessionFactory(EntityManagerFactory factory) {
    if (factory.unwrap(SessionFactory.class) == null) {
        throw new NullPointerException("factory is not a hibernate factory");
    }
    this.hibernateFactory = factory.unwrap(SessionFactory.class);
}
Once you have access to it, you can use it directly:
Session session = hibernateFactory.openSession();
Query query = session.createQuery("SELECT CASE WHEN COUNT(account) > 0 THEN true ELSE false END FROM Account account WHERE account.id = :accountID");
query.setParameter("accountID", 7277);
List list = query.list();
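For reference, here is a complete sketch of the fragment approach that uses the plain JPA EntityManager instead of unwrapping the Hibernate SessionFactory (the fragment names are illustrative; the query and method name come from the question, and the imports are javax.persistence or jakarta.persistence depending on your Spring version):
// Fragment interface declaring only the custom method.
public interface CustomizedRepository {
    boolean doesAccountExistsWithGivenID(int accountID);
}

// Implementation; the Impl postfix is what Spring Data looks for.
public class CustomizedRepositoryImpl implements CustomizedRepository {

    @PersistenceContext
    private EntityManager entityManager;

    @Override
    public boolean doesAccountExistsWithGivenID(int accountID) {
        return entityManager.createQuery(
                "SELECT CASE WHEN COUNT(account) > 0 THEN true ELSE false END"
                        + " FROM Account account WHERE account.id = :accountID", Boolean.class)
                .setParameter("accountID", accountID)
                .getSingleResult();
    }
}

// The Spring Data repository extends both CrudRepository and the fragment.
public interface AccountRepository extends CrudRepository<Account, Integer>, CustomizedRepository {
}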

Spring Data query method with IN keyword using a List of enums as parameter

I'm using a Spring Data JPA (4.3.5) repository and a query method with the IN keyword and a List<Enum> parameter. The problem is that it's not working as I expect.
Given an entity like:
@Entity
@Table(name = "R_REPRESENTACIO")
public class Representacio {
    @Enumerated(EnumType.STRING)
    private Estat estat;
    ...
    //getters and setters
    ...
}
With this SQL declaration:
CREATE TABLE R_REPRESENTACIO (
UUID NUMBER(19) NOT NULL,
...
ESTAT VARCHAR2(255) NULL,
...
);
Estat is an Enum class like:
public enum Estat {
VALIDA,
PENDENT_VALIDACIO,
PENDENT_DOCUMENTACIO,
...
}
And a JPA repository like:
public interface RepresentacioRepository extends JpaRepository<Representacio, Long> {
List<Representacio> findAllByEstatIn(List<Estat> estats);
}
When I run (integration test class):
List<Estat> estats =
Arrays.asList(Estat.VALIDA,Estat.PENDENT_DOCUMENTACIO,Estat.PENDENT_VALIDACIO);
List<cat.aoc.representa.domain.entity.representacio.Representacio> allByEstatIn = representacioRepository.findAllByEstatIn(estats);
SQL generated is (in an in memory H2 DB):
2018-08-01 12:30:48.534--ServerSession(1175154004)--Connection(384887832)--Thread(Thread[main,5,main])--SELECT UUID, .... FROM R_REPRESENTACIO WHERE (ESTAT IN ((?,?,?)))
bind => [VALIDA, PENDENT_DOCUMENTACIO, PENDENT_VALIDACIO]
No SQL exception is thrown and zero results are returned.
But this SHOULD return 1 result as this (equivalent) SQL returns:
SELECT count(*) FROM R_REPRESENTACIO WHERE ESTAT IN ('VALIDA','PENDENT_DOCUMENTACIO','PENDENT_VALIDACIO');
COUNT(*)
----------
1
The only difference I can see is that in the plain SQL I wrap the IN clause arguments in quotes (that column is a VARCHAR).
I don't know why the SQL generated from the JPA repository is not returning results.
(I've also tried findAllByEstatIsIn(List<Estat> estats) with the same zero results returned.)
Any suggestion/explanation?
PS: I worked around it (not happy with this) using
List<Representacio> findAllByEstatOrEstatOrEstat(Estat estat, Estat estat2, Estat estat3);
but that is ugly and wrong in many ways...
I suggest using a parameter of type List<String> and adding a String -> Enum converter so Spring will be able to convert it. So, basically:
1.
List<Representacio> findByEstatIn(List<String> estats);
2.
@Configuration
public class ConverterConfiguration extends RepositoryRestConfigurerAdapter {
    @Autowired
    private EstatsConverter estatsConverter;

    @Override
    public void configureConversionService(ConfigurableConversionService conversionService) {
        conversionService.addConverter(estatsConverter);
        super.configureConversionService(conversionService);
    }
}
3.
@Component
public class EstatsConverter implements Converter<String, Estat> {
    @Override
    public Estat convert(String source) {
        // Assumes a fromString helper on the enum; Estat.valueOf(source) works
        // when the incoming strings match the enum constant names exactly.
        return Estat.fromString(source);
    }
}
I have no idea if that's gonna work, but I remember doing something similar, only in MongoDB. Let me know if you try that.
It is very simple with Spring Data JPA and JPQL:
@Repository
public interface RepresentacioRepository extends JpaRepository<Representacio, Long> {
    @Query("select r from Representacio r where r.estat in :estat2")
    List<Representacio> findByEnumEstat(@Param("estat2") List<Estat> estatList);
}
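Calling it then looks like this (a short usage sketch reusing the list from the question):
List<Estat> estats = Arrays.asList(Estat.VALIDA, Estat.PENDENT_DOCUMENTACIO, Estat.PENDENT_VALIDACIO);
List<Representacio> result = representacioRepository.findByEnumEstat(estats);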

How to set tableName dynamically using environment variable in spring boot?

I am using AWS ECS to host my application and DynamoDB for all database operations. So I'll have the same database with different table names for different environments, such as "dev_users" (for the Dev env), "test_users" (for the Test env), etc. (This is how our company uses the same Dynamo account for different environments.)
So I would like to change the "tableName" of the model class using the environment variable passed through "AWS ECS task definition" environment parameters.
For Example.
My Model Class is:
@DynamoDBTable(tableName = "dev_users")
public class User {
Now I need to replace the "dev" with "test" when I deploy my container in the test environment. I know I can use
@Value("${DOCKER_ENV:dev}")
to access environment variables, but I'm not sure how to use such a variable outside the class. Is there any way I can use the Docker env variable to select my table prefix?
My intent was to use the environment variable directly in the @DynamoDBTable annotation. I know this is not possible as such, but is there any other way or workaround for this?
Edit 1:
I am working on Rahul's answer and facing some issues. Before writing down the issues, I'll explain the process I followed.
Process:
I have created the beans in my config class (com.myapp.users.config).
As I don't have repositories, I have given my model class package name as the "basePackage" path. (Please check the image.)
For 1) I have replaced the "table name over-rider bean injection" to avoid the error.
For 2) I printed the name that is passed on to this method, but it is null. So I'm checking all the possible ways to pass the value here.
Check the image for the error:
I haven't changed anything in my user model class, as the beans should replace the DynamoDBTable name when they are executed. But the table name overriding is not happening; data is being pulled only from the table name given at the model class level.
What am I missing here?
The table names can be altered via a customized DynamoDBMapperConfig bean.
For your case, where you have to prefix each table with a literal, you can add a bean such as the one below; here the prefix can be the environment name.
@Bean
public TableNameOverride tableNameOverrider() {
    String prefix = ... // Use @Value to inject values via Spring or use any logic to define the table prefix
    return TableNameOverride.withTableNamePrefix(prefix);
}
For more details check out the complete details here:
https://github.com/derjust/spring-data-dynamodb/wiki/Alter-table-name-during-runtime
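To actually take effect, the override has to end up in the DynamoDBMapperConfig used by the mapper. A minimal sketch of that wiring, assuming the AWS SDK v1 DynamoDBMapper (spring-data-dynamodb users can instead point @EnableDynamoDBRepositories at a DynamoDBMapperConfig bean, as a later answer shows):
@Bean
public DynamoDBMapper dynamoDBMapper(AmazonDynamoDB amazonDynamoDB, TableNameOverride tableNameOverrider) {
    // Build a mapper config that applies the prefix override to every table name.
    DynamoDBMapperConfig config = new DynamoDBMapperConfig.Builder()
            .withTableNameOverride(tableNameOverrider)
            .build();
    return new DynamoDBMapper(amazonDynamoDB, config);
}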
I was able to achieve table names prefixed with the active profile name.
First I added a TableNameResolver class as below:
@Component
public class TableNameResolver extends DynamoDBMapperConfig.DefaultTableNameResolver {
    private String envProfile;

    public TableNameResolver() {}

    public TableNameResolver(String envProfile) {
        this.envProfile = envProfile;
    }

    @Override
    public String getTableName(Class<?> clazz, DynamoDBMapperConfig config) {
        String stageName = envProfile.concat("_");
        String rawTableName = super.getTableName(clazz, config);
        return stageName.concat(rawTableName);
    }
}
Then I set up the DynamoDBMapper bean as below:
@Bean
@Primary
public DynamoDBMapper dynamoDBMapper(AmazonDynamoDB amazonDynamoDB) {
    DynamoDBMapper mapper = new DynamoDBMapper(amazonDynamoDB,
            new DynamoDBMapperConfig.Builder().withTableNameResolver(new TableNameResolver(envProfile)).build());
    return mapper;
}
The envProfile variable holds the active profile value, read from the application.properties file:
@Value("${spring.profiles.active}")
private String envProfile;
We have the same issue with regard to the need to change table names at runtime. We are using spring-data-dynamodb 5.0.2 and the following configuration seems to provide the solution that we need.
First I annotated my bean accessor:
@EnableDynamoDBRepositories(dynamoDBMapperConfigRef = "getDynamoDBMapperConfig", basePackages = "my.company.base.package")
I also set up an environment variable called ENV_PREFIX, which is wired in by Spring via SpEL:
@Value("#{systemProperties['ENV_PREFIX']}")
private String envPrefix;
Then I set up a TableNameOverride bean:
@Bean
public DynamoDBMapperConfig.TableNameOverride getTableNameOverride() {
    return DynamoDBMapperConfig.TableNameOverride.withTableNamePrefix(envPrefix);
}
Finally, I set up the DynamoDBMapperConfig bean using TableNameOverride injection. In 5.0.2, we had to set a standard DynamoDBTypeConverterFactory in the DynamoDBMapperConfig builder to avoid an NPE:
@Bean
public DynamoDBMapperConfig getDynamoDBMapperConfig(DynamoDBMapperConfig.TableNameOverride tableNameOverride) {
    DynamoDBMapperConfig.Builder builder = new DynamoDBMapperConfig.Builder();
    builder.setTableNameOverride(tableNameOverride);
    builder.setTypeConverterFactory(DynamoDBTypeConverterFactory.standard());
    return builder.build();
}
In hindsight, I could have set up a DynamoDBTypeConverterFactory bean that returns a standard DynamoDBTypeConverterFactory and injected that into the getDynamoDBMapperConfig() method via the DynamoDBMapperConfig builder. But this also does the job.
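That alternative wiring might look roughly like this (a sketch of the idea, not the configuration the answer actually uses):
@Bean
public DynamoDBTypeConverterFactory dynamoDBTypeConverterFactory() {
    return DynamoDBTypeConverterFactory.standard();
}

@Bean
public DynamoDBMapperConfig getDynamoDBMapperConfig(DynamoDBMapperConfig.TableNameOverride tableNameOverride,
        DynamoDBTypeConverterFactory typeConverterFactory) {
    // Same override as before, but the converter factory is now injected as its own bean.
    return new DynamoDBMapperConfig.Builder()
            .withTableNameOverride(tableNameOverride)
            .withTypeConverterFactory(typeConverterFactory)
            .build();
}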
I upvoted the other answer, but here is an idea:
Create a base class with all your user details:
@MappedSuperclass
public abstract class AbstractUser {
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;
    private String firstName;
    private String lastName;
}
Create two implementations with different table names and Spring profiles:
@Profile(value = {"dev", "default"})
@Entity(name = "dev_user")
public class DevUser extends AbstractUser {
}

@Profile(value = {"prod"})
@Entity(name = "prod_user")
public class ProdUser extends AbstractUser {
}
Create a single JPA repository that uses the mapped superclass:
public interface UserRepository extends CrudRepository<AbstractUser, Long> {
}
Then switch the implementation with the Spring profile:
@RunWith(SpringJUnit4ClassRunner.class)
@DataJpaTest
@Transactional
public class UserRepositoryTest {
    @Autowired
    protected DataSource dataSource;

    @BeforeClass
    public static void setUp() {
        System.setProperty("spring.profiles.active", "prod");
    }

    @Test
    public void test1() throws Exception {
        DatabaseMetaData metaData = dataSource.getConnection().getMetaData();
        ResultSet tables = metaData.getTables(null, null, "PROD_USER", new String[] { "TABLE" });
        tables.next();
        assertEquals("PROD_USER", tables.getString("TABLE_NAME"));
    }
}

Spring Boot + MongoDB Id query

I have a Spring Boot application combined with MongoDB as the persistence layer. I have the following structure:
public class Resource {
    @Id
    public String Id;
    ...
}
I also have a ResourceRepository:
@RepositoryRestResource(collectionResourceRel = "resources", path = "resources")
public interface ResourceRepository extends MongoRepository<Resource, String> {
    Resource findById(@Param("Id") String Id);
}
I found online that a way to have the id property returned in the JSON when you perform a GET request like http://localhost:8080/resources/ is to change the id property to Id (uppercase I). Indeed, if the property is lowercase, I don't get back an id field, but if I change it to uppercase then I get it. For a certain reason, I need to get back the id property, so I used the uppercase I. So far, so good.
However, when I tried to execute the query findById included in my repository I get an exception:
org.springframework.data.mapping.context.InvalidPersistentPropertyPath: No property id found on app.model.Resource!
If I change the Id property to id (lowercase i) I can execute successfully the /resources/search/findById?id=... GET request.
I tried creating a custom controller with a query that finds and returns a Resource based on the id that is given:
#Controller
#RequestMapping("/resource")
public class ResourceController {
#Autowired
MongoOperations mongoOperations;
#RequestMapping(value="/findById/{resourceId}/", method= RequestMethod.GET)
#ResponseBody
public Resource findByResourceId(#PathVariable("resourceId") String resourceId) {
Resource resource = mongoOperations.findOne(query(Criteria.where("Id").is(resourceId)), Resource.class,"DOJ");
}
}
but I receive the same error:
org.springframework.data.mapping.context.InvalidPersistentPropertyPath: No property id found on app.model.Resource!
Any idea on how to both have the id property displayed in the JSON and be able to findById?
Well, I found the answer myself. Switch back to lowercase id so findById works and add the following class to the project:
@Configuration
public class SpringDataRestConfiguration extends RepositoryRestConfigurerAdapter {
    @Override
    public void configureRepositoryRestConfiguration(RepositoryRestConfiguration config) {
        config.exposeIdsFor(Resource.class);
    }
}
As the name of the method suggests, this configuration makes Resource class objects expose their ids in JSON.
UPDATE: If you are using the latest or a relatively recent version of Spring Boot, the RepositoryRestConfigurerAdapter class has been deprecated, and the javadoc suggests using the RepositoryRestConfigurer interface directly.
So your code should look like this:
@Configuration
public class SpringDataRestConfiguration implements RepositoryRestConfigurer
...
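A complete version of that updated configuration might look like this (a sketch; note that newer Spring Data REST versions pass a CorsRegistry as an additional parameter to this callback):
@Configuration
public class SpringDataRestConfiguration implements RepositoryRestConfigurer {
    @Override
    public void configureRepositoryRestConfiguration(RepositoryRestConfiguration config) {
        // Expose ids for Resource objects, exactly as in the adapter-based version above.
        config.exposeIdsFor(Resource.class);
    }
}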
