I have a question. I'm using DbSetup for Spring Boot tests with a PostgreSQL database. I use DbSetup to insert a user, but when I try to save another user through Spring Data I get the following exception:
Details: Key (id)=(1) already exists.
org.springframework.dao.DataIntegrityViolationException: could not execute statement; SQL [n/a]; constraint [users_pkey];
This is my test class:
@RunWith(SpringRunner.class)
@SpringBootTest
@TestPropertySource("/application-test.properties")
public class UserRepositoryTest {

    @Autowired
    private ApplicationContext applicationContext;

    @Autowired
    private UserRepository userRepository;

    @Autowired
    private DataSource dataSource;

    @Before
    public void insertData() throws SQLException {
        Operation operation = sequenceOf(CommonOperations.DELETE_ALL, CommonOperations.INSERT_USER);
        DbSetup dbSetup = new DbSetup(new DataSourceDestination(dataSource), operation);
        dbSetup.launch();
    }

    @After
    public void cleanPK() throws SQLException {
        DBUtil.resetAutoIncrementColumns(applicationContext, "user");
    }

    @Test
    public void registerUser() {
        val user = new User(null, "Glass", "123123", "glass999@mail.ru");
        assertEquals(user, userRepository.saveAndFlush(user));
    }
}
Operations for DbSetup:
public class CommonOperations {

    public static final Operation DELETE_ALL = deleteAllFrom("article_tag", "article", "tag", "users");

    public static final Operation INSERT_USER =
            insertInto("users")
                    .columns("id", "email", "password", "username")
                    .values(1, "krikkk998@mail.ru", "123123", "Daimon")
                    .build();
}
Class to reset sequence:
@NoArgsConstructor
public final class DBUtil {

    public static void resetAutoIncrementColumns(ApplicationContext applicationContext,
                                                 String... tableNames) throws SQLException {
        DataSource dataSource = applicationContext.getBean(DataSource.class);
        String resetSqlTemplate = "ALTER SEQUENCE %s RESTART WITH 1;";
        try (Connection dbConnection = dataSource.getConnection()) {
            for (String resetSqlArgument : tableNames) {
                try (Statement statement = dbConnection.createStatement()) {
                    String resetSql = String.format(resetSqlTemplate, resetSqlArgument + "_id_seq");
                    statement.execute(resetSql);
                }
            }
        }
    }
}
Does anyone know how to solve this problem?
One thing to look at:
public static final Operation INSERT_USER =
        insertInto("users")
                .columns("id", "email", "password", "username")
                .values(1, "krikkk998@mail.ru", "123123", "Daimon")
                .build();
Here you are using a hard-coded id, i.e. 1.
Now, in the test case you try to create another user and pass the id as null, assuming it will be picked from the sequence. The sequence also starts from 1, hence the conflict.
The issue is a constraint violation. One thing you can do is make the id column in your table auto-incrementing (in PostgreSQL, a serial or identity column backed by a sequence); the database will then assign the value automatically.
If you ever want to reset the id value, you can call resetAutoIncrementColumns(). In your INSERT you then don't specify the id column at all, and a unique value is generated whenever a new user is saved.
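For example, a minimal sketch of the adjusted setup, assuming the users.id column is a PostgreSQL identity/serial column backed by the existing users_id_seq sequence, would simply omit the id from the insert:

public class CommonOperations {

    public static final Operation DELETE_ALL =
            deleteAllFrom("article_tag", "article", "tag", "users");

    // no "id" column here: PostgreSQL assigns it from its own sequence,
    // so it cannot collide with ids generated later through Spring Data
    public static final Operation INSERT_USER =
            insertInto("users")
                    .columns("email", "password", "username")
                    .values("krikkk998@mail.ru", "123123", "Daimon")
                    .build();
}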
Hope this will help you.
Related
I am using Testcontainers to load a Dockerized database for my Spring Boot application's integration tests. I am currently using an initialization script to load all of the data:
CREATE TABLE public.table1 (
...
...
);
CREATE TABLE public.table2 (
...
...
);
This is all working fine. I also have my own manual test data that I insert to test different scenarios:
-- Data for pre-existing quiz
INSERT INTO public.table1 (id, prop1, prop2) values (1, 'a', 'b');
INSERT INTO public.table2 (id, prop1, prop2) values (1, 'c', 'd');
INSERT INTO public.table2 (id, prop1, prop2) values (2, 'e', 'f');
Again, this is all working fine, and I am using a YAML file to mock these objects for my tests:
table1s:
  first:
    id: 1
    prop1: a
    prop2: b
table2s:
  first:
    id: 1
    prop1: c
    prop2: d
  second:
    id: 2
    prop1: e
    prop2: f
I then load these YAML properties into a class so they can be used by my test classes:
public class Table1TestData {

    @Autowired
    private Environment env;

    private UUID id;
    private String prop1;
    private String prop2;

    public UUID getId() {
        return id;
    }

    public void setId(UUID id) {
        this.id = id;
    }

    public String getProp1() {
        return prop1;
    }

    public void setProp1(String prop1) {
        this.prop1 = prop1;
    }

    ....

    public Table1TestData getFirstRowData() {
        Table1TestData ret = new Table1TestData();
        ret.setId(UUID.fromString(env.getProperty("table1s.first.id")));
        ret.setProp1(env.getProperty("table1s.first.prop1"));
        ....
        return ret;
    }
    ....
}
And I use this helper as an autowired entity in my tests (especially for my Service classes):
public class Table1ServiceTest {

    @ClassRule
    public static PostgresContainer postgresContainer = PostgresContainer.getInstance();

    @Autowired
    Table1Service table1Service;

    @Autowired
    Table1TestData table1TestData;

    @Autowired
    MockMvc mockMvc;

    @Autowired
    ObjectMapper objectMapper;

    @BeforeAll
    static void startup() {
        postgresContainer.start();
    }

    @Test
    @DisplayName("Table 1 Service Test")
    @Transactional
    public void findTable1ById() throws Exception {
        Table1TestData testData = table1TestData.getFirstRowData();
        Table1 table1 = table1Service.findTable1ById(testData.getId());
        assertNotNull(table1);
        assertEquals(table1.getId(), testData.getId());
        assertEquals(table1.getProp1(), testData.getProp1());
        ....
    }
}
However, let's say I have to add a new column to Table1 (or any table, really) and put the new schema into the init script. I then have to go through each of these insert statements manually and add the new column with a value (assuming there's no default), and the same applies when a column is removed (even if it doesn't affect the classes). This ends up being cumbersome.
So my question really is, for someone who is using an init script to populate test data for a containerized DB, what is the best way to go about maintaining this data efficiently without much manual curating?
I think you could take advantage of the initialization script feature of PostgreSQL: just place your SQL scripts under /docker-entrypoint-initdb.d (creating the directory if necessary) and they will be executed directly, without any programmatic work.
You can check out an example here:
https://github.com/gmunozfe/clustered-ejb-timers-kie-server/blob/master/src/test/java/org/kie/samples/integration/ClusteredEJBTimerSystemTest.java
Define your PostgreSQL container pointing to that directory:
.withFileSystemBind("etc/postgresql", "/docker-entrypoint-initdb.d",
BindMode.READ_ONLY)
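In the setup from the question, that could look roughly like the sketch below (the singleton-container pattern, the image tag, and the host path are assumptions, not part of the original code):

import org.testcontainers.containers.BindMode;
import org.testcontainers.containers.PostgreSQLContainer;

public class PostgresContainer extends PostgreSQLContainer<PostgresContainer> {

    private static PostgresContainer container;

    private PostgresContainer() {
        super("postgres:13");
    }

    public static PostgresContainer getInstance() {
        if (container == null) {
            container = new PostgresContainer()
                    // every *.sql file in etc/postgresql is executed once at container startup
                    .withFileSystemBind("etc/postgresql", "/docker-entrypoint-initdb.d",
                            BindMode.READ_ONLY);
        }
        return container;
    }
}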
If you want different scripts per test, have a look at this article:
https://www.baeldung.com/spring-boot-data-sql-and-schema-sql
I've followed an open course on Spring Web and written some code to list all orders from a database and return them through a REST API. This works perfectly. Now I'm writing code that takes the ID of an order from the request, finds 0 or 1 orders, and returns the result. However, when no order is found with the given ID, a NullPointerException is thrown. I can't find out what is causing this; I'm assuming it's the .orElse(null) statement. Please advise.
Controller:
@RequestMapping("api/V1/order")
@RestController
public class OrderController {

    private final OrderService orderService;

    @Autowired
    public OrderController(OrderService orderService) {
        this.orderService = orderService;
    }

    @GetMapping(path = "{id}")
    public Order getOrderById(@PathVariable("id") int id) {
        return orderService.getOrderById(id)
                .orElse(null);
    }
}
Service:
@Service
public class OrderService {

    private final OrderDao orderDao;

    @Autowired
    public OrderService(@Qualifier("oracle") OrderDao orderDao) {
        this.orderDao = orderDao;
    }

    public Optional<Order> getOrderById(int orderNumber) {
        return orderDao.selectOrderById(orderNumber);
    }
}
Dao:
@Override
public Optional<Order> selectOrderById(int searchedOrderNumber) {
    final String sql = "SELECT \"order\", sender, receiver, patient, orderdate, duedate, paymentref, status, netprice from \"ORDER\" where \"order\" = ?";
    Order order = jdbcTemplate.queryForObject(sql, new Object[] {searchedOrderNumber}, (resultSet, i) -> {
        int orderNumber = resultSet.getInt("\"order\"");
        String sender = resultSet.getString("sender");
        String receiver = resultSet.getString("receiver");
        String patient = resultSet.getString("patient");
        String orderDate = resultSet.getString("orderdate");
        String dueDate = resultSet.getString("duedate");
        String paymentRef = resultSet.getString("paymentref");
        String status = resultSet.getString("status");
        int netPrice = resultSet.getInt("netprice");
        return new Order(orderNumber, sender, receiver, patient, orderDate, dueDate, paymentRef, status, netPrice);
    });
    return Optional.ofNullable(order);
}
For the JDBC exception, use query instead of queryForObject, or use try/catch to convert the JDBC-related exception; otherwise Spring itself will handle these internally using ExceptionTranslator, ExceptionHandler, etc.
To handle the Optional case in controllers, just throw an exception there, for example PostController.java#L63
And handle it in the PostExceptionHandler.
Edit, based on the comment about the stack trace:
For your error, please check: Jdbctemplate query for string: EmptyResultDataAccessException: Incorrect result size: expected 1, actual 0
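If you prefer to keep queryForObject, a sketch of the DAO that turns the empty-result case into Optional.empty() could look like this (mapOrder is a hypothetical helper standing in for the same row mapping as in the question):

@Override
public Optional<Order> selectOrderById(int searchedOrderNumber) {
    final String sql = "SELECT \"order\", sender, receiver, patient, orderdate, duedate, paymentref, status, netprice from \"ORDER\" where \"order\" = ?";
    try {
        Order order = jdbcTemplate.queryForObject(sql, new Object[] {searchedOrderNumber},
                (resultSet, i) -> mapOrder(resultSet)); // mapOrder: same mapping as in the question
        return Optional.ofNullable(order);
    } catch (EmptyResultDataAccessException e) {
        // queryForObject throws when no row matches; translate that into an empty Optional
        return Optional.empty();
    }
}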
To solve the problem of orderService.getOrderById(id) returning null, you can return a ResponseEntity. ResponseEntity gives you more flexibility in terms of status code and headers. If you change your code to return ResponseEntity, then you can do something like:
@GetMapping(path = "{id}")
public ResponseEntity<?> getOrderById(@PathVariable("id") int id) {
    return orderService
            .getOrderById(id)
            .map(order -> new ResponseEntity<>(order.getId(), HttpStatus.OK))
            .orElse(new ResponseEntity<>(HttpStatus.NOT_FOUND));
}
You can even write a generic exception handler using @ControllerAdvice and throw an OrderNotFoundException via .orElseThrow(OrderNotFoundException::new). Check more information here.
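A rough sketch of that approach (OrderNotFoundException and the handler class are illustrative names, not part of the original code):

// hypothetical exception thrown by the controller when the Optional is empty
class OrderNotFoundException extends RuntimeException {
    OrderNotFoundException(int id) {
        super("Order " + id + " not found");
    }
}

@ControllerAdvice
class OrderExceptionHandler {

    @ExceptionHandler(OrderNotFoundException.class)
    ResponseEntity<String> handleOrderNotFound(OrderNotFoundException ex) {
        return ResponseEntity.status(HttpStatus.NOT_FOUND).body(ex.getMessage());
    }
}

// and in the controller
@GetMapping(path = "{id}")
public Order getOrderById(@PathVariable("id") int id) {
    return orderService.getOrderById(id)
            .orElseThrow(() -> new OrderNotFoundException(id));
}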
I am using r2dbc, r2dbc-h2 and the experimental spring-boot-starter-data-r2dbc:
implementation 'org.springframework.boot.experimental:spring-boot-starter-data-r2dbc:0.1.0.M1'
implementation 'org.springframework.data:spring-data-r2dbc:1.0.0.RELEASE' // starter-data provides old version
implementation 'io.r2dbc:r2dbc-h2:0.8.0.RELEASE'
implementation 'io.r2dbc:r2dbc-pool:0.8.0.RELEASE'
I have created reactive repositories
public interface IJsonComparisonRepository extends ReactiveCrudRepository<JsonComparisonResult, String> {}
Also added a custom script that creates a table in H2 on startup
@SpringBootApplication
public class JsonComparisonApplication {

    public static void main(String[] args) {
        SpringApplication.run(JsonComparisonApplication.class, args);
    }

    @Bean
    public CommandLineRunner startup(DatabaseClient client) {
        return (args) -> client
                .execute(() -> {
                    var resource = new ClassPathResource("ddl/script.sql");
                    try (var is = new InputStreamReader(resource.getInputStream())) {
                        return FileCopyUtils.copyToString(is);
                    } catch (IOException e) {
                        throw new RuntimeException(e);
                    }
                })
                .then()
                .block();
    }
}
My r2dbc configuration looks like this
@Configuration
@EnableR2dbcRepositories
public class R2dbcConfiguration extends AbstractR2dbcConfiguration {

    @Override
    public ConnectionFactory connectionFactory() {
        return new H2ConnectionFactory(
                H2ConnectionConfiguration.builder()
                        .url("mem:testdb;DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=FALSE")
                        .username("sa")
                        .build());
    }
}
My service where I perform the logic looks like this
@Override
public Mono<JsonComparisonResult> updateOrCreateRightSide(String comparisonId, String json) {
    return updateComparisonSide(comparisonId, storedComparisonResult -> {
        storedComparisonResult.setRightSide(json);
        return storedComparisonResult;
    });
}

private Mono<JsonComparisonResult> updateComparisonSide(String comparisonId,
                                                        Function<JsonComparisonResult, JsonComparisonResult> updateSide) {
    return repository.findById(comparisonId)
            .defaultIfEmpty(createResult(comparisonId))
            .filter(result -> ComparisonDecision.NONE == result.getDecision()) // if not NONE - it means it was found and completed
            .switchIfEmpty(Mono.error(new NotUpdatableCompleteComparisonException(comparisonId)))
            .map(updateSide)
            .flatMap(repository::save);
}

private JsonComparisonResult createResult(String comparisonId) {
    LOGGER.info("Creating new comparison result: {}.", comparisonId);
    var newResult = new JsonComparisonResult();
    newResult.setDecision(ComparisonDecision.NONE);
    newResult.setComparisonId(comparisonId);
    return newResult;
}
The domain looks like this
@Table("json_comparison")
public class JsonComparisonResult {

    @Column("comparison_id")
    @Id
    private String comparisonId;

    @Column("left")
    private String leftSide;

    @Column("right")
    private String rightSide;

    // @Enumerated(EnumType.STRING) - no support for now
    @Column("decision")
    private ComparisonDecision decision;

    private String differences;
The problem is that when I try to add any object to the database it fails with the exception
org.springframework.dao.TransientDataAccessResourceException: Failed to update table [json_comparison]. Row with Id [4] does not exist.
at org.springframework.data.r2dbc.repository.support.SimpleR2dbcRepository.lambda$save$0(SimpleR2dbcRepository.java:91) ~[spring-data-r2dbc-1.0.0.RELEASE.jar:1.0.0.RELEASE]
at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:96) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onNext(FluxOnErrorResume.java:73) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.MonoUsingWhen$MonoUsingWhenSubscriber.deferredComplete(MonoUsingWhen.java:276) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.FluxUsingWhen$CommitInner.onComplete(FluxUsingWhen.java:536) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.Operators$MultiSubscriptionSubscriber.onComplete(Operators.java:1858) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.Operators.complete(Operators.java:132) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.MonoEmpty.subscribe(MonoEmpty.java:45) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
at reactor.core.publisher.MonoDefer.subscribe(MonoDefer.java:52) ~[reactor-core-3.3.1.RELEASE.jar:3.3.1.RELEASE]
For some reason, during save the SimpleR2dbcRepository library class doesn't consider objectToSave as new, and the subsequent UPDATE then fails because the row doesn't actually exist.
// SimpleR2dbcRepository#save
@Override
@Transactional
public <S extends T> Mono<S> save(S objectToSave) {
    Assert.notNull(objectToSave, "Object to save must not be null!");
    if (this.entity.isNew(objectToSave)) { // not new
        ....
    }
}
Why is this happening, and what is the problem?
TL;DR: How should Spring Data know if your object is new or whether it should exist?
Relational Spring Data repositories (both JDBC and R2DBC) must decide on [Reactive]CrudRepository.save(…) whether the given object is new or whether it already exists in your database. Performing a save(…) operation results in either an INSERT or an UPDATE statement. Issuing the wrong statement causes either a primary key violation or a no-op, as standard SQL has no way to express an upsert.
Spring Data JDBC and R2DBC use, by default, the presence or absence of the @Id value. Generated primary keys are a widely used mechanism: if the primary key is provided, the entity is considered existing; if the id value is null, the entity is considered new.
Read more in the reference documentation about Entity State Detection Strategies.
You have to implement Persistable because you've provided the @Id. The library needs to figure out whether the row is new or whether it should already exist. If your entity implements Persistable, then save(…) will use the outcome of isNew() to determine whether to issue an INSERT or UPDATE.
For example:
public class Product implements Persistable<Integer> {

    @Id
    private Integer id;
    private String description;
    private Double price;

    @Transient
    private boolean newProduct;

    @Override
    @Transient
    public boolean isNew() {
        return this.newProduct || id == null;
    }

    public Product setAsNew() {
        this.newProduct = true;
        return this;
    }
}
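Applied to the entity from the question, a sketch could look like this (the transient flag and the asNew() helper are assumptions used only to illustrate the mechanism):

@Table("json_comparison")
public class JsonComparisonResult implements Persistable<String> {

    @Id
    @Column("comparison_id")
    private String comparisonId;

    // not persisted; only tells Spring Data whether save() should INSERT or UPDATE
    @Transient
    private boolean newResult;

    @Override
    public String getId() {
        return comparisonId;
    }

    @Override
    @Transient
    public boolean isNew() {
        return newResult;
    }

    public JsonComparisonResult asNew() {
        this.newResult = true;
        return this;
    }

    // ... remaining columns and accessors as in the question
}

createResult(comparisonId) would then return the new instance marked with asNew(), so the first save(…) issues an INSERT instead of an UPDATE.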
Maybe you should consider this:
Choose the data type of your id/primary key as INT/LONG and set it to AUTO_INCREMENT (something like below):
CREATE TABLE PRODUCT(id INT PRIMARY KEY AUTO_INCREMENT NOT NULL, modelname VARCHAR(30), year VARCHAR(4), owner VARCHAR(50));
In your POST request body, do not include the id field.
Removing the @Id value made it issue an INSERT statement.
I created a Spring Boot test:
@RunWith(SpringRunner.class)
@SpringBootTest
@TestPropertySource(locations = "classpath:application-dev.properties")
@Transactional
public class ContactTests2 {

    private Logger log = LogManager.getLogger();

    @PersistenceContext
    private EntityManager entityManager;

    @Autowired
    private ContactRepository customerRepository;

    @Autowired
    private StoreRepository storeRepository;

    @Autowired
    private NoteService noteService;

    @Autowired
    private Validator validator;

    private Store store;

    @Before
    @WithMockUser(roles = "ADMIN")
    public void setup() {
        log.debug("Stores {}", storeRepository.count());
        store = createStore();
        storeRepository.save(store);
    }

    @Test
    @WithMockUser(roles = "ADMIN")
    public void saveWithNote() {
        Contact customer = new Contact();
        customer.setPersonType(PersonType.NATURAL_PERSON);
        customer.setFirstName("Daniele");
        customer.setLastName("Rossi");
        customer.setGender(Gender.MALE);
        customer.setBillingCountry(Locale.ITALY.getCountry());
        customer.setShippingCountry(Locale.ITALY.getCountry());
        customer.setStore(store);

        Note note = new Note();
        note.setGenre(NoteGenre.GENERIC);
        note.setOperationType(AuditType.NOTE);
        note.setText("note");

        customer = customerRepository.save(customer);
        noteService.addNote(note, customer);
    }

    @Test
    @WithMockUser(roles = "ADMIN")
    public void save() {
        Contact customer = new Contact();
        customer.setPersonType(PersonType.NATURAL_PERSON);
        customer.setFirstName("Daniele");
        customer.setLastName("Rossi");
        customer.setGender(Gender.MALE);
        customer.setBillingCountry(Locale.ITALY.getCountry());
        customer.setShippingCountry(Locale.ITALY.getCountry());
        customer.setStore(store);

        customerRepository.save(customer);
        assertEquals(customer, customerRepository.findById(customer.getId()).get());
    }

    // ====================================================
    //
    // UTILITY METHODS
    //
    // ====================================================

    private Store createStore() {
        Store store = new Store();
        store.setName("Padova");
        store.setCode("PD");
        store.setCountry("IT");
        return store;
    }
}
This is the note service:
@Service
@Transactional
@PreAuthorize("isAuthenticated()")
public class NoteService {

    @PersistenceContext
    private EntityManager entityManager;

    @Autowired
    private NoteRepository noteRepository;

    /**
     * Add a note to a specific object (parent).
     *
     * @param note
     * @param parent
     * @return the added note
     */
    public Note addNote(Note note, Persistable<Long> parent) {
        // ****************************************************
        // VALIDATION CHECKS
        // ****************************************************
        Assert.notNull(note, InternalException.class, ExceptionCode.INTERNAL_ERROR);
        Assert.notNull(parent, InternalException.class, ExceptionCode.INTERNAL_ERROR);
        // ****************************************************
        // END VALIDATION CHECKS
        // ****************************************************

        note.setParentId(parent.getId());
        note.setParentType(parent.getClass().getSimpleName());
        note.setRemoteAddress(NetworkUtils.getRemoteIpFromCurrentContext());

        note = noteRepository.save(note);
        return note;
    }
}
I'm using Hibernate and MySQL 5.7. The problem is with the test called saveWithNote(): when I run it, the following tests fail because the setup() method throws a duplicate-key exception. It seems the previous test is not rolled back.
This is what happens:
If I remove the line noteService.addNote(note, customer);, everything works like a charm.
What am I doing wrong? Why is test isolation not preserved?
This is because you are using a real data store as the dependency.
When running saveWithNote(), the customer entry is persisted in the database. It is not removed in your test setup, so when you run save(), you bump into a duplicate database entry.
Solution 1:
Use a teardown() method to remove the database entries you created during the test.
Example:
@After
@WithMockUser(roles = "ADMIN")
public void teardown() {
    // delete the customer entry here
}
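With the repositories from the question, such a teardown could look roughly like this (it assumes a NoteRepository is also autowired into the test, and deletes children before parents to respect foreign keys):

@After
@WithMockUser(roles = "ADMIN")
public void teardown() {
    noteRepository.deleteAll();     // assumption: injected into the test like the other repositories
    customerRepository.deleteAll();
    storeRepository.deleteAll();
}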
Reference: https://examples.javacodegeeks.com/core-java/junit/junit-setup-teardown-example/
Solution 2: Every time you run setup(), wipe the database tables clean.
Example:
@Before
@WithMockUser(roles = "ADMIN")
public void setup() {
    // wipe your database tables to make them empty
}
Both solutions 1 and 2 should be used with a test database only. You DON'T want to clean up the production DB.
Solution 3 (recommended):
Use mocked repositories and mock injection (instead of autowiring repositories with real implementation).
Sample/ Reference: https://stackoverflow.com/a/36004293/5849844
Most likely your table is using the MyISAM storage engine, which does not support transactions (see Table 15.2, MyISAM Storage Engine Features, in the docs).
Redefine the table using the InnoDB storage engine. Take a look at the 14.8.1.1 Creating InnoDB Tables docs; it should be the default, but you can check with:
SELECT ##default_storage_engine;
I have an application that uses MyBatis for object persistence, but there are cases where I need to run arbitrary SQL (from the user). Can I do that with MyBatis?
Update:
I chose to use DbUtils (JDBC) to run the user-defined SQL, but I need an instance of DataSource to create a QueryRunner. Is there any way to get the DataSource from MyBatis?
I use this utility class:
import java.util.List;
import org.apache.ibatis.annotations.SelectProvider;

public interface SqlMapper {

    static class PureSqlProvider {
        public String sql(String sql) {
            return sql;
        }

        public String count(String from) {
            return "SELECT count(*) FROM " + from;
        }
    }

    @SelectProvider(type = PureSqlProvider.class, method = "sql")
    public List<?> select(String sql);

    @SelectProvider(type = PureSqlProvider.class, method = "count")
    public Integer count(String from);

    @SelectProvider(type = PureSqlProvider.class, method = "sql")
    public Integer execute(String query);
}
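A hypothetical usage, assuming SqlMapper is registered like any other mapper in your MyBatis configuration:

try (SqlSession session = sqlSessionFactory.openSession()) {
    SqlMapper mapper = session.getMapper(SqlMapper.class);
    List<?> rows = mapper.select("SELECT * FROM users WHERE active = 1");
    Integer userCount = mapper.count("users");
}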
Your question is similar to How to exequte query directly from java code using mybatis?
I have already given the answer to that question. But I hope this solution will help you.
MyBatis already has this capability, but you must use an adapter, as follows.
Create an adapter class:
public class SQLAdapter {

    String sql;

    public SQLAdapter(String sql) {
        this.sql = sql;
    }

    public String getSql() {
        return sql;
    }

    public void setSql(String sql) {
        this.sql = sql;
    }
}
Create a typeAlias for the SQLAdapter class:
<typeAlias alias="sqladapter" type="com.zj.xxx.xxx.SQLAdapter" />
Put a select tag in each object XML where you need to execute SQL directly:
<select id="findRecords" parameterType="SQLAdapter" resultMap="xxxxxResultMap">
${sql}
</select>
Call this select method like:
String _sql = "select * from table where... order by... limit...";
xxxxx.findRecords(new SQLAdapter(_sql));
That's all. You no longer have to write complex SQL in the XML file. Good luck.
The answers provided are both good, but both of them require an adapter class.
Using MyBatis version 3, I succeeded by using a HashMap<String, String> to hold and pass the SQL.
See the code below.
In the mapper class:
final String sql = "${sql}";

@Select(sql)
void execute(HashMap<String, String> m);
When invoking the method:
String sql = "SELECT * FROM record limit 1";
HashMap<String, String> map = new HashMap<String, String>();
map.put("sql", sql);
mapper.execute(map);
A HashMap provides a way to avoid defining class properties or fields in code; you can put whatever entries you need into the map.
Thanks.
A reusable SQL fragment can be used to build the select part of a query dynamically. In your mapper, pass the query as a normal parameter:
@Param("sql") String sql
In your query just access the parameter using ${sql} instead of #{sql}.
Value in parameter sql can be a fully valid sql query or a fragment of sql query.
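For example, a mapper along these lines (the interface and method names are illustrative):

import java.util.List;
import java.util.Map;
import org.apache.ibatis.annotations.Param;
import org.apache.ibatis.annotations.Select;

public interface DynamicSelectMapper {

    // ${sql} splices the raw text into the statement, so only trusted SQL should be passed here
    @Select("${sql}")
    List<Map<String, Object>> runSelect(@Param("sql") String sql);
}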
For testing I use:
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;

import org.apache.ibatis.io.Resources;
import org.apache.ibatis.jdbc.ScriptRunner;
import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;
import org.apache.ibatis.session.SqlSessionFactoryBuilder;
import org.junit.Before;
import org.junit.Test;

public class test {

    private static final String conf = "mybatis.conf.xml";

    private SqlSessionFactoryBuilder builder;
    private SqlSessionFactory sessionFactory;
    Reader reader;
    private SqlSession session;
    private ScriptRunner runner;

    @Before
    public void before() {
        builder = new SqlSessionFactoryBuilder();
        try {
            reader = Resources.getResourceAsReader(conf);
        } catch (IOException e) {
            e.printStackTrace();
        }
        sessionFactory = builder.build(reader);
        session = sessionFactory.openSession();
        runner = new ScriptRunner(session.getConnection());
        runner.setAutoCommit(true);
        runner.setStopOnError(true);
    }

    @Test
    public void testSelectChapelStatus() {
        Reader populate = new StringReader("insert into person values (7553,0,'201002496','Wish','Jill','Rain',1,0,NULL,'xxx@LCU.EDU');\r\n"
                + "");
        runner.runScript(populate);
    }
}