How do I map tables to Java objects? - java

Take an example: I have three tables like this.
(database diagram image)
How could I map the third table to a Java object?
class StudentCourse {
    Student student;
    Course course;
    Double score;
}
or
class StudentCourse {
    Long studentId;
    Long courseId;
    Double score;
}
If I use the first one: after I update some data in the database, such as student information, will the cache cause the next query for StudentCourse (I use MyBatis) to return incorrect data?
If I use the second one: to list a student's course scores I have to first query the list of StudentCourse rows and then fetch each course's information through courseId, so every result needs additional queries. I think that will reduce the efficiency of the program.
Is there another way to solve this problem?
Regarding the first one:
The second time MyBatis runs the query, if the cached entry hasn't been invalidated, it returns the result from the local cache.
private <E> List<E> queryFromDatabase(MappedStatement ms, Object parameter, RowBounds rowBounds, ResultHandler resultHandler, CacheKey key, BoundSql boundSql) throws SQLException {
    this.localCache.putObject(key, ExecutionPlaceholder.EXECUTION_PLACEHOLDER);
    List<E> list;
    try {
        list = this.doQuery(ms, parameter, rowBounds, resultHandler, boundSql);
    } finally {
        this.localCache.removeObject(key);
    }
    this.localCache.putObject(key, list);
    if (ms.getStatementType() == StatementType.CALLABLE) {
        this.localOutputParameterCache.putObject(key, parameter);
    }
    return list;
}
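(For reference, MyBatis can narrow the scope of this first-level cache so a statement never returns a stale cached object; a minimal sketch using the standard Configuration API, assuming a sqlSessionFactory variable is available:)

import org.apache.ibatis.session.Configuration;
import org.apache.ibatis.session.LocalCacheScope;

// Scope the local (first-level) cache to a single statement so repeated
// queries within one SqlSession re-read the database instead of returning
// possibly stale cached objects.
Configuration configuration = sqlSessionFactory.getConfiguration();
configuration.setLocalCacheScope(LocalCacheScope.STATEMENT);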
If I have a resultMap like this:
<resultMap id="studentcourse" type="StudentCourse">
    <association property="student" resultMap="Student"/>
    <association property="course" resultMap="Course"/>
    <result property="score" column="score"/>
</resultMap>
At first I get a StudentCourse object from the database, and the localCache caches it. Then I update the Course inside that StudentCourse (changing the database record). The second time I fetch the same StudentCourse, the result comes from the localCache, so the course information in it is dirty data. How do I deal with this if I choose the first option?

Ideally you would use a class design that best models your domain and worry about mapping to a data store in a separate persistence layer. If you need to substantially change your model just to make the persistence layer function, you need a new ORM! While I'm not familiar with MyBatis, I would hope it doesn't create a new object each time the underlying data changes.
The keys in the course and student tables act as foreign keys in the student_course table. Foreign keys are best represented as references in Java. Using raw keys at the Java level forces an extra level of indirection and opens you up to integrity issues (e.g. if the foreign key changes).
So I would suggest:
class StudentCourse {
    private final Student student;
    private final Course course;
    private double score;
}
You could also consider having it inside one of the other classes - that might be more convenient depending on how the classes are used:
class Student {
    private final int id;
    private String name;
    private final List<CourseScore> scores = new ArrayList<>();

    public void addCourseScore(Course course, double score) {
        scores.add(new CourseScore(course, score));
    }

    private record CourseScore(Course course, double score) { }
}
If your ORM doesn't resolve the keys for you (i.e. look up the referred object automatically when it retrieves data) then you'll need to do that yourself. It's a pretty simple object however:
class College {
    private Map<Integer, Student> students;
    private Map<Integer, Course> courses;

    Student getStudent(int id) { return students.get(id); }
    Course getCourse(int id) { return courses.get(id); }
}
So the code to convert student_course data into the model above might look like:
ResultSet data; // the executed student_course query
while (data.next()) {
    Student student = college.getStudent(data.getInt("student"));
    Course course = college.getCourse(data.getInt("course"));
    double score = data.getDouble("score");
    student.addCourseScore(course, score);
}

Related

How to ensure backwards compatibility when changing data type of attribute DynamoDB

I am attempting to change the data type of an attribute in one of my DDB tables, but because this data is read from and written to, altering the data type of the attribute causes subsequent read failures when reading old records, which look like this:
could not unconvert attribute
DynamoDBMappingException: expected M in value {N: 1000,}
My question is about how I can change the data type of an attribute in my table, and architect the change such that I can still read the Double value that exists in previous records. Here is the class in question:
@DynamoDBTable(tableName = "Sections")
@Data
@EqualsAndHashCode(callSuper = false)
@NoArgsConstructor
@AllArgsConstructor
@Builder
public class SectionRecord {
    @DynamoDBHashKey
    private String id;

    private Map<String, Double> sectionTarget; // previous definition: private Double sectionTarget;

    public void setSectionTarget(Double sectionTarget, String key) {
        if (this.sectionTarget == null) {
            this.sectionTarget = new HashMap<String, Double>();
        }
        this.sectionTarget.put(key, sectionTarget);
    }

    public Double getSectionTarget(String key) {
        return this.sectionTarget.get(key);
    }
}
And eventually, I try to read a record like this:
mapper.load(SectionRecord.class, id);
This is presumably where the issue comes from: I'm trying to read a Double (which currently exists in DDB) as a Map (per the changes I've made to the attribute).
I'd love to hear some guidance on how best to architect such a change such that these backwards compatibility issues could be mitigated.
You have to:
1. Create a new attribute with the new type, both in Dynamo and in SectionRecord.
2. Make your code able to read and work with both.
3. Deploy it to production and wait for the old data to disappear (or write custom migration logic).
4. Delete the old field; the logic can now rely only on the new field.
A rough sketch of step 2 follows below. Welcome to Dynamo, where you don't have DB migrations :(
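For step 2, a minimal dual-read sketch (the sectionTargets attribute name, the fallback logic, and the omitted Lombok/setter boilerplate are assumptions, not from the original):

@DynamoDBTable(tableName = "Sections")
public class SectionRecord {
    @DynamoDBHashKey
    private String id;

    // Old N-typed attribute, kept so existing records still unmarshal.
    @DynamoDBAttribute(attributeName = "sectionTarget")
    private Double legacySectionTarget;

    // New M-typed attribute written going forward (hypothetical name).
    @DynamoDBAttribute(attributeName = "sectionTargets")
    private Map<String, Double> sectionTargets;

    public Double getSectionTarget(String key) {
        if (sectionTargets != null && sectionTargets.containsKey(key)) {
            return sectionTargets.get(key);
        }
        return legacySectionTarget; // fall back to the old single value
    }

    // getters/setters for the mapped fields omitted for brevity
}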

Apache Ignite - SQL Support for HashMaps?

Hi there, I'd like to use a POJO with a HashMap attribute in Apache Ignite, so I can work with dynamic models at runtime. Storing and loading such objects works fine.
However, I'm wondering whether there is a way to access the keys/values of such a HashMap through a SQL query. If this is not supported, are there other ways I can work with dynamic objects in Apache Ignite?
POJO Class with dynamic attributes
@Data
public class Item {
    private static final AtomicLong ID_GEN = new AtomicLong();

    @QuerySqlField(index = true)
    private Long id;

    @QuerySqlField
    public Map<String, Serializable> attributes = new HashMap<String, Serializable>();

    public Item(Long id) {
        this.id = id;
    }

    public Item() {
        this(ID_GEN.incrementAndGet());
    }

    public void setAttribute(String name, Serializable value) {
        attributes.put(name, value);
    }

    public Serializable getAttribute(String name) {
        return attributes.get(name);
    }
}
Example query feature, illustrated:
SqlFieldsQuery query = new SqlFieldsQuery("SELECT * FROM Item WHERE attributes('Price') > 100");
SQL in Ignite is not just syntactic sugar: it requires a schema for your models to be defined before you can run SQL queries, and this won't work for a collection. Therefore you need to normalize the data, just like with a regular DB, or rework the model's structure somehow to avoid the JOIN. A sketch of the normalized form is shown below.
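A hedged sketch of that normalization (all names are illustrative, not from the original): store each attribute as its own cache entry so its fields become plain SQL columns that can be indexed.

import org.apache.ignite.cache.query.annotations.QuerySqlField;

// One entry per (itemId, attribute name) pair. Typed value columns would be
// needed for range queries; a String column only supports exact matches well.
public class ItemAttribute {
    @QuerySqlField(index = true)
    private Long itemId;

    @QuerySqlField(index = true)
    private String name;

    @QuerySqlField
    private String value;

    public ItemAttribute(Long itemId, String name, String value) {
        this.itemId = itemId;
        this.name = name;
        this.value = value;
    }
}
// e.g. SELECT itemId FROM ItemAttribute WHERE name = 'Price' AND value = '100'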
Apache Ignite has no support for destructuring/collections in its SQL, so you can't peek inside a HashMap via SQL.
However, you may define your own SQL functions, so you could implement e.g. SELECT hashmap_get(ATTRIBUTES, 'subkey') FROM ITEM WHERE ID = ? (a sketch follows below).
But you can't index a function application, so the usefulness is limited.
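A rough sketch of such a function (hypothetical names; whether a serialized Map column can be passed through to the function this way should be verified against your Ignite version):

import java.util.Map;
import org.apache.ignite.cache.query.annotations.QuerySqlFunction;

public class MapFunctions {
    // Exposed to SQL as HASHMAP_GET once registered on the cache configuration.
    @QuerySqlFunction
    public static Object hashmapGet(Map<String, ?> map, String key) {
        return map == null ? null : map.get(key);
    }
}
// Registration (on the CacheConfiguration for the Item cache):
// cacheCfg.setSqlFunctionClasses(MapFunctions.class);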

Axon: Create and Save another Aggregate in Saga after creation of an Aggregate

Update: The issue seems to be the id that I'm using twice, or in other words, the id from the product entity that I want to use for the productinventory entity. As soon as I generate a new id for the productinventory entity, it seems to work fine. But I want to have the same id for both, since they're the same product.
I have 2 Services:
ProductManagementService (saves a Product entity with product details)
1.) For saving the Product Entity, I implemented an EventHandler that listens to ProductCreatedEvent and saves the product to a mysql database.
ProductInventoryService (saves a ProductInventory entity with stock quantities for a certain productId defined in ProductManagementService)
2.) For saving the ProductInventory entity, I also implemented an EventHandler, which listens to ProductInventoryCreatedEvent and saves the ProductInventory to a mysql database.
What I want to do:
When a new Product is created in ProductManagementService, I want to create a ProductInventory entity in ProductInventoryService directly afterwards and save it to my mysql table. The new ProductInventory entity shall have the same id as the Product entity.
To accomplish that, I created a Saga, which listens to ProductCreatedEvent and sends a new CreateProductInventoryCommand. As soon as the CreateProductInventoryCommand triggers a ProductInventoryCreatedEvent, the EventHandler described in 2.) should catch it. Except it doesn't.
The only thing that gets saved is the Product entity, so in summary:
1.) works, 2.) doesn't. A ProductInventory aggregate does get created, but it doesn't get saved, since the saving process connected to the EventHandler isn't triggered.
I also get an Exception, the application doesn't crash though: Command 'com.myApplication.apicore.command.CreateProductInventoryCommand' resulted in org.axonframework.commandhandling.CommandExecutionException(OUT_OF_RANGE: [AXONIQ-2000] Invalid sequence number 0 for aggregate 3cd71e21-3720-403b-9182-130d61760117, expected 1)
My Saga:
@Saga
@ProcessingGroup("ProductCreationSaga")
public class ProductCreationSaga {

    @Autowired
    private transient CommandGateway commandGateway;

    @StartSaga
    @SagaEventHandler(associationProperty = "productId")
    public void handle(ProductCreatedEvent event) {
        System.out.println("ProductCreationSaga, SagaEventHandler, ProductCreatedEvent");
        String productInventoryId = event.productId;
        SagaLifecycle.associateWith("productInventoryId", productInventoryId);
        // takes the ID from the product entity and sets all 3 stock attributes to zero
        commandGateway.send(new CreateProductInventoryCommand(productInventoryId, 0, 0, 0));
    }

    @SagaEventHandler(associationProperty = "productInventoryId")
    public void handle(ProductInventoryCreatedEvent event) {
        System.out.println("ProductCreationSaga, SagaEventHandler, ProductInventoryCreatedEvent");
        SagaLifecycle.end();
    }
}
The EventHandler that works as intended and saves a Product Entity:
@Component
public class ProductPersistenceService {

    @Autowired
    private ProductEntityRepository productRepository;

    // works as intended
    @EventHandler
    void on(ProductCreatedEvent event) {
        System.out.println("ProductPersistenceService, EventHandler, ProductCreatedEvent");
        ProductEntity entity = new ProductEntity(event.productId, event.productName, event.productDescription, event.productPrice);
        productRepository.save(entity);
    }

    @EventHandler
    void on(ProductNameChangedEvent event) {
        System.out.println("ProductPersistenceService, EventHandler, ProductNameChangedEvent");
        ProductEntity existingEntity = productRepository.findById(event.productId).get();
        ProductEntity entity = new ProductEntity(event.productId, event.productName, existingEntity.getProductDescription(), existingEntity.getProductPrice());
        productRepository.save(entity);
    }
}
The EventHandler that should save a ProductInventory Entity, but doesn't:
@Component
public class ProductInventoryPersistenceService {

    @Autowired
    private ProductInventoryEntityRepository productInventoryRepository;

    // doesn't work
    @EventHandler
    void on(ProductInventoryCreatedEvent event) {
        System.out.println("ProductInventoryPersistenceService, EventHandler, ProductInventoryCreatedEvent");
        ProductInventoryEntity entity = new ProductInventoryEntity(event.productInventoryId, event.physicalStock, event.reservedStock, event.availableStock);
        System.out.println(entity.toString());
        productInventoryRepository.save(entity);
    }
}
Product-Aggregate:
@Aggregate
public class Product {

    @AggregateIdentifier
    private String productId;
    private String productName;
    private String productDescription;
    private double productPrice;

    public Product() {
    }

    @CommandHandler
    public Product(CreateProductCommand command) {
        System.out.println("Product, CommandHandler, CreateProductCommand");
        AggregateLifecycle.apply(new ProductCreatedEvent(command.productId, command.productName, command.productDescription, command.productPrice));
    }

    @EventSourcingHandler
    protected void on(ProductCreatedEvent event) {
        System.out.println("Product, EventSourcingHandler, ProductCreatedEvent");
        this.productId = event.productId;
        this.productName = event.productName;
        this.productDescription = event.productDescription;
        this.productPrice = event.productPrice;
    }
}
ProductInventory-Aggregate:
@Aggregate
public class ProductInventory {

    @AggregateIdentifier
    private String productInventoryId;
    private int physicalStock;
    private int reservedStock;
    private int availableStock;

    public ProductInventory() {
    }

    @CommandHandler
    public ProductInventory(CreateProductInventoryCommand command) {
        System.out.println("ProductInventory, CommandHandler, CreateProductInventoryCommand");
        AggregateLifecycle.apply(new ProductInventoryCreatedEvent(command.productInventoryId, command.physicalStock, command.reservedStock, command.availableStock));
    }

    @EventSourcingHandler
    protected void on(ProductInventoryCreatedEvent event) {
        System.out.println("ProductInventory, EventSourcingHandler, ProductInventoryCreatedEvent");
        this.productInventoryId = event.productInventoryId;
        this.physicalStock = event.physicalStock;
        this.reservedStock = event.reservedStock;
        this.availableStock = event.availableStock;
    }
}
What you are noticing right now is the uniqueness requirement of the [aggregate identifier, sequence number] pair within a given event store. This requirement is in place to safeguard you from potential concurrent access to the same aggregate instance, as several events for the same aggregate all need a unique overall sequence number. This number is furthermore used to identify the order in which events need to be handled, to guarantee the aggregate is recreated consistently.
So, you might think this ends in a "sorry, there is no solution in place", but that is luckily not the case. There are roughly three things you can do in this setup:
1. Live with the fact that both aggregates will have unique identifiers.
2. Use distinct bounded contexts between both applications.
3. Change the way aggregate identifiers are written.
Option 1 is arguably the most pragmatic and the one used by the majority. You have however noted that reusing the identifier is necessary, so I am assuming you have already disregarded this option entirely. Regardless, I would try to revisit this approach, as defaulting to UUIDs for each new entity you create can save you from trouble in the future.
Option 2 reflects the Bounded Context notion pulled in by DDD. Letting the Product aggregate and ProductInventory aggregate reside in distinct contexts means you will have distinct event stores for both. Thus the uniqueness constraint is kept, as no single store contains both aggregates' event streams. Whether this approach is feasible depends on whether both aggregates actually belong to the same context. If they don't, you could for example use Axon Server's multi-context support to create two distinct applications.
Option 3 requires a little insight into what Axon does. When it stores an event, it invokes the toString() method on the @AggregateIdentifier annotated field within the aggregate. As your @AggregateIdentifier annotated field is a String, you are given the identifier as is. What you could do is use typed identifiers, whose toString() method returns not just the identifier but appends the aggregate type to it. Doing so makes the stored aggregateIdentifier unique, whereas from the usage perspective it still looks like you are reusing the identifier. A minimal sketch of such a typed identifier follows below.
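(A sketch of option 3; the class and prefix names are illustrative, not from the original:)

// The toString() result is what gets stored, so appending the aggregate type
// keeps the [identifier, sequence number] pair unique per aggregate even
// though the underlying id value is shared between Product and ProductInventory.
public class ProductInventoryId {
    private final String identifier;

    public ProductInventoryId(String identifier) {
        this.identifier = identifier;
    }

    @Override
    public String toString() {
        return "ProductInventory-" + identifier;
    }
}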
Which of the three options suits your solution best is hard to deduce from my perspective. What I did do is order them by what seems most reasonable to me.
Hoping this will help you further, @Jan!

Mapping ResultSet to Pojo Objects

Well, that's really embarrassing: I have made a standard POJO class and its DAO class for data retrieval purposes, and I am having difficulty understanding the basic procedure for mapping customized query data to a POJO class.
Let's say my classes are:
public class User {
    private int userId;
    private String username;
    private int addressId;
}

public class Address {
    private int addressId;
    private String zip;
}

public class UserDAO {
    public void getUserDetails() {
        String getSql = "select u.userId, u.username, a.zipcode from user u, address a where u.addressId = a.addressId";
        // no POJO class is now specific to the resultset returned, so we can't map the result to a POJO object
    }
}
Now how should I model this with my POJO classes? If I use a String to manage this, the concept of object orientation vanishes, and complexity would also increase in the future. Kindly guide!
Update for Further Explanation
We know that we can map rows of the same table with the same POJO class, but when the query is customized and returns data that doesn't map to any specific class, what would be the procedure? i.e. should we make another class, or should we throw that data into a String variable? Kindly give some example as well.
For this purpose you can use one of the implementations of JPA. But as you want to do it manually, I will give you a small example.
UPD:
public class User {
    private int userId;
    private String username;
    private Address address; // USE the POJO, not the ID
}

public class Address {
    private int addressId;
    private String zip;
    private List<User> users;
}
public User getUserById(Connection con, long userId) {
    String query = "select u.user_id, u.user_name, a.address_id, a.zip "
                 + "from user u, address a where a.address_id = u.address_id and u.user_id = ?";
    User user = new User();
    Address address = new Address();
    try (PreparedStatement stmt = con.prepareStatement(query)) {
        stmt.setLong(1, userId);
        try (ResultSet rs = stmt.executeQuery()) {
            if (rs.next()) {
                address.setId(rs.getInt("address_id"));
                address.setZip(rs.getString("zip"));
                user.setId(rs.getInt("user_id"));
                user.setUsername(rs.getString("user_name"));
                user.setAddress(address); // look here
            }
        }
    } catch (SQLException e) {
        if (con != null) {
            try {
                System.err.print("Transaction is being rolled back");
                con.rollback();
            } catch (SQLException excep) {
                // ignored
            }
        }
    }
    return user;
}
You shouldn't create a new POJO for that query; you should just write a normal query. And remember: your object model is primary, and tables in the DB are just a way to save your application's data.
We know that we can map rows of the same table with the same POJO class, but when the query is customized and returns data that doesn't map to any specific class, what would be the procedure? i.e. should we make another class?
JPA dynamic instantiation allows you to define a query with a POJO whose constructor specifies only the fields and types you want from the database.
This performs a JPA selection which returns a List.
If you need to change the query later and the selected columns are unchanged, your POJO will still work.
If you change the columns, then also change the POJO's constructor accordingly.
NOTE:
You must specify the fully qualified package name and matching constructor arguments.
The types in the FROM clause (User and Address here) must be JPA-mapped or JPA-annotated entity classes.
The entityManager comes from a JPA EntityManagerFactory.
String jpql = "select new com.stuff.User(u.userId, u.username, a.zipcode) "
            + "from User u, Address a where u.addressId = a.addressId";
List<User> list = entityManager.createQuery(jpql, User.class).getResultList();
for (User u : list) {
    doStuff(u);
}
Dynamic instantiation is also handy when you want to select specific columns while avoiding columns with large data, such as BLOB types.
For example, maybe you want a list of proxy POJOs which represent the fully populated thing but are themselves not fully populated.
You present the proxy list, and when the user selects one, you run another query to get the fully populated object.
Your mileage may vary.
There are many ORM frameworks that can do this, including Hibernate, MyBatis, JPA and spring-jdbc.
spring-jdbc and MyBatis give you granular control over the SQL, whereas with JPA and Hibernate you are usually abstracted away from the SQL.
I suggest you do some reading and figure out which one you like before rolling your own solution; a spring-jdbc sketch follows below for illustration.
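A minimal spring-jdbc sketch against the question's hypothetical schema (the JdbcTemplate/DataSource wiring is assumed, not shown in the original):

import java.util.List;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;

public class UserJdbcDao {
    private final JdbcTemplate jdbcTemplate;

    public UserJdbcDao(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    // A RowMapper gives granular control over how each joined row becomes a POJO.
    private static final RowMapper<User> USER_MAPPER = (rs, rowNum) -> {
        Address address = new Address();
        address.setId(rs.getInt("address_id"));
        address.setZip(rs.getString("zip"));
        User user = new User();
        user.setId(rs.getInt("user_id"));
        user.setUsername(rs.getString("user_name"));
        user.setAddress(address);
        return user;
    };

    public List<User> getUserDetails() {
        String sql = "select u.user_id, u.user_name, a.address_id, a.zip "
                   + "from user u join address a on u.address_id = a.address_id";
        return jdbcTemplate.query(sql, USER_MAPPER);
    }
}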
Your question:
We know that we can map rows of the same table with the same POJO class,
but when the query is customized and returns data
which doesn't map to any specific class, what would be the procedure?
If you have 100 kinds of SQL which return different combinations of columns, should you create 100 different POJOs? The answer is "NO, stop using POJOs".
The library qood is designed to solve this problem; you can try it.

Return two values from a java method

Let's say I have a method in java, which looks up a user in a database and returns their address and the team they are on.
I want to return both values from the method, and I don't want to split the method in two because it involves a database call, and splitting would double the number of calls.
Given typical concerns in a moderate to large software project, what's the best option?
whatGoesHere getUserInfo(String name) {
    // query the DB
}
I know the question smells of duplication with existing ones, but each other question had some element that made it different enough from this example that I thought it was worth asking again.
You have some options.
The most OOP approach would be to create a class that encapsulates those two properties, something like this:
class UserInfo {
    private final Address address;
    private final Team team;
    UserInfo(Address address, Team team) {
        this.address = address;
        this.team = team;
    }
}
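Used as the return type, the method from the question might look like this (the DB lookup stays elided, as in the question):

UserInfo getUserInfo(String name) {
    // query the DB for address and team
    return new UserInfo(address, team);
}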
Or if you want a simple solution you can return an array of objects:
Object[] getUserInfo(String name) {
    // query the DB
    return new Object[] { address, team };
}
Or if you want to expose this method through some library, you can have an interface that consumes those properties, something like this:
class APIClass {
    interface UserInfo {
        Address getAddress();
        Team getTeam();
    }

    UserInfo getUserInfo(String name) {
        // query the DB
        return new UserInfo() {
            public Address getAddress() { return address; }
            public Team getTeam() { return team; }
        };
    }
}
Can't a map help? A MultiValueMap, where the key is the user name and the two values are the address and the team name. I am assuming both your address and team are String variables. A short sketch follows below; you can learn more about MultiValueMap here:
http://commons.apache.org/collections/apidocs/org/apache/commons/collections/map/MultiValueMap.html
http://apachecommonstipsandtricks.blogspot.in/2009/01/multi-value-map-values-are-list.html
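(A sketch of that idea with commons-collections 3.x, treating both values as Strings as assumed above; the sample values are illustrative:)

import java.util.Collection;
import org.apache.commons.collections.map.MultiValueMap;

// One key, several values: retrieval returns the whole collection,
// so the caller must know the insertion order or inspect the values.
MultiValueMap userInfo = new MultiValueMap();
userInfo.put("alice", "221B Baker Street"); // address
userInfo.put("alice", "Blue Team");         // team name

Collection values = (Collection) userInfo.get("alice"); // [address, team]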
First model your abstractions, relationships and multiplicity well (see the example below). Then you can model tables accordingly. Once these two steps are done you can either leverage JPA, which can be configured to load your object graph, or write JDBC code and build the graph yourself by running a SQL query with proper SQL JOINs.
A User has an Address.
A Team can have 1 or more Users (and can a User play for more than one team?).
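A sketch of that model in code (names and multiplicities are illustrative):

import java.util.List;

class Address { /* street, zip, ... */ }

class User {
    private Address address;   // a User has an Address
    private List<Team> teams;  // if a User can play for more than one team
}

class Team {
    private List<User> users;  // a Team has 1..n Users
}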
You can return a String array with the user name and group name in it. The method looks like:
public String[] getUserInfo(String name) {
    String[] result = new String[2];
    // query the DB
    // ...
    result[0] = userName;
    result[1] = groupName;
    return result;
}
A common solution to this kind of issue is to create a custom object with as many attributes as the values you want to return.
If you can't create a new class for this, you can use a Map<String, Object>, but this approach is not type-safe.
I thought Guava had a generic Pair class already, but I cannot find it. You can build your own using generics if you're on Java 1.5+.
public class Pair<X, Y> {
    public final X first;
    public final Y second;

    public Pair(X first, Y second) {
        this.first = first;
        this.second = second;
    }
}
Feel free to make the fields private and add getters. :) Using it is easy:
return new Pair<Address,Team>(address, team);
Update
Apache Commons Lang has Pair; see this SO question for more options. A quick usage sketch follows.
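(A sketch assuming commons-lang3 is on the classpath:)

import org.apache.commons.lang3.tuple.Pair;

// Pair.of creates an immutable pair; getLeft()/getRight() read it back.
Pair<Address, Team> info = Pair.of(address, team);
Address userAddress = info.getLeft();
Team userTeam = info.getRight();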
