Objectify Transactions using VoidWork - java

I am using Objectify version 4 and want to use transactions in my project. The project is based on GWT, Java and Objectify. According to the Objectify tutorials, the ofy().transact() method should be used, so I tried the following:
ofy().transact(new VoidWork() {
    public void vrun() {
        // here I save data to the entity
    }
});
When I execute the project on the development server (local), I get error messages stating:
No source code is available for type com.googlecode.objectify.VoidWork; did you forget to inherit a required module?
The method createBillingDocs() is undefined for the type new VoidWork(){}
createBillingDocs is the method I want to execute in the transaction.
Any help? Thanks in advance.

You can't run transactions or use Objectify client-side; it is a server-side framework for accessing the datastore. You need to separate out your client-side logic from your server-side logic and define your GWT modules carefully.
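For illustration, a minimal sketch of that separation, assuming a standard GWT RPC servlet (the BillingService and BillingDoc names are hypothetical, not from your project):

// Server side only - this class must NOT be on the GWT client source path
import com.google.gwt.user.server.rpc.RemoteServiceServlet;
import com.googlecode.objectify.ObjectifyService;
import com.googlecode.objectify.VoidWork;

public class BillingServiceImpl extends RemoteServiceServlet implements BillingService {

    @Override
    public void createBillingDocs(final Long customerId) {
        // Transactions run on the server; the client never touches Objectify
        ObjectifyService.ofy().transact(new VoidWork() {
            @Override
            public void vrun() {
                BillingDoc doc = new BillingDoc(customerId);
                ObjectifyService.ofy().save().entity(doc).now();
            }
        });
    }
}

The client then only sees the BillingService/BillingServiceAsync RPC interfaces; nothing from com.googlecode.objectify should appear in the GWT module's source path.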

Related

How to dynamic generate Enum type from mysql in Java spring boot application?

In my project we want to manage all REST API error codes in an enum type, like the following code:
package com.example.util;

public enum ErrorType {
    SUCCESS("0000", "success"),
    PARAMS_EMPTY("3001", "params cannot be empty");

    private final String code;
    private final String message;
    ErrorType(String code, String message) { this.code = code; this.message = message; }
}
The problem we encounter is that if we put this class into the util package, every time we add a new error type in a business Spring Boot app we need to modify, recompile and publish both the app and the util project. That makes the util package hard to maintain; basically, we would prefer a relatively stable utility package.
So we are considering whether there is a way to generate the enum type dynamically: we would configure the error information in MySQL in advance and then load it into the enum during the application boot procedure.
I'm not sure whether it is a good idea to dynamically generate an enum type in Java, or whether there is a better solution for this problem.
You can't add or remove values from an enum. Enums are complete static enumerations.
If you need to handle variable values you need to work with a standard class.
For example you can have something like the following:
public class Error {
    public static Error getByName(String name) {
        ....
    }
    public static Error getByCode(int code) {
        ....
    }
}
and use it as follows:
Error myError404 = Error.getByCode(404);
Obviously this approach gives you a lot of flexibility, but you cannot know in advance whether a particular error code exists, and you can't use the IDE facilities related to enums.
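If you still want the codes to come from MySQL as in the original question, here is a rough sketch of such a registry populated once at startup. The error_type table, the column names and the JdbcTemplate wiring are assumptions (and javax.annotation.PostConstruct assumes Spring Boot 2.x):

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import javax.annotation.PostConstruct;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Component;

@Component
public class ErrorCatalog {

    // Simple value object standing in for the enum constants
    public static final class ErrorInfo {
        public final String code;
        public final String message;
        public ErrorInfo(String code, String message) {
            this.code = code;
            this.message = message;
        }
    }

    private static final Map<String, ErrorInfo> BY_CODE = new ConcurrentHashMap<>();

    private final JdbcTemplate jdbcTemplate;

    public ErrorCatalog(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    @PostConstruct
    void load() {
        // Hypothetical table: error_type(code, message)
        for (Map<String, Object> row : jdbcTemplate.queryForList("SELECT code, message FROM error_type")) {
            BY_CODE.put((String) row.get("code"),
                    new ErrorInfo((String) row.get("code"), (String) row.get("message")));
        }
    }

    public static ErrorInfo getByCode(String code) {
        return BY_CODE.get(code);
    }
}

As noted above, you lose the compile-time safety of an enum; callers have to handle a missing code at runtime.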
Generating an enum would not be very useful, I think, since part of the power of enums is that you can use them statically in your code, so you would have to update your code anyway.
How about adding an exception handler to your util library that can be populated with error code/description mappings from the database, and that then translates errors/exceptions into response codes and error messages for your API? (Just guessing that you have an API here.)
Thanks for your reply. We finally decided to give up on this idea. Dynamically generating an enum would not help; in fact it would add more complexity to our project, and it is not a common use of enums.
Instead, we predefine our main error types, such as user_error, system_error and db_error, etc. The specific error information is processed in the business service.

How do I load an EDMX schema file in Java with OLingo 4?

I have a large edmx schema file that would be very inconvenient to manually re-create, one EntityType at a time, in Java using OLingo. While I'm not opposed to writing a loader of some kind, I wanted to make sure that OLingo 4 doesn't already provide this functionality.
I found an article that shows how OLingo 2 can load this kind of information:
@Override
public Edm readMetadata(final InputStream inputStream, final boolean validate)
        throws EntityProviderException {
    EdmProvider provider = new EdmxProvider().parse(inputStream, validate);
    return new EdmImplProv(provider);
}
But I need to use version 4. I haven't found the same interfaces in the documentation for version 4, so I'm at a bit of a loss. Any pointers much appreciated.
After more investigation, I found that I needed the odata-server-core-ext package, which let me import org.apache.olingo.server.core.MetadataParser. Among other things, this class has a method buildEdmProvider(Reader) which does the work of building a SchemaBasedEdmProvider for you.
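A minimal sketch of that approach, based on the classes named above (the file path is just an example, and the exact package names may differ slightly between OLingo 4.x versions):

import java.io.FileReader;
import java.io.Reader;
import org.apache.olingo.server.core.MetadataParser;
import org.apache.olingo.server.core.SchemaBasedEdmProvider;

public class EdmxLoader {

    public static SchemaBasedEdmProvider load(String edmxPath) throws Exception {
        try (Reader reader = new FileReader(edmxPath)) {
            MetadataParser parser = new MetadataParser();
            // Parses the EDMX document and builds the provider from its schemas
            return parser.buildEdmProvider(reader);
        }
    }
}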
If you're not bound to OLingo, you could also try odata-client: https://github.com/davidmoten/odata-client
I've not had a good chance to use it myself as unfortunately the web service I'm trying to connect to is OData 2, and odata-client only supports 4. However, it looked to have some neat features (including type safety and automatic/transparent paging).

DataTableRepository in Spring data Elasticsearch

Currently we are using Spring Data JPA with a MySQL database and the DataTablesRepository, which works well with JPA. Now we are moving our data to Spring Data Elasticsearch, but DataTablesRepository does not work with it. Is there an alternative, or how can I implement a custom repository for this?
spring-data-jpa-datatables does not implement support for ElasticsearchRepository, as you say, and it uses the Specification API, which is not implemented by Spring Data Elasticsearch, so extending it will take some work.
What you need to do is create your own ElasticsearchRepositoryFactoryBean (ie. ElasticsearchDataTablesRepositoryFactoryBean) and your own implementation of AbstractElasticsearchRepository that implements the specifics of spring-data-jpa-datatables just like DataTablesRepositoryImpl. You should also define your own DataTablesRepository (ElasticsearchDataTablesRepository that extends ElasticsearchRepository) with the same methods.
The org.springframework.data.jpa.datatables.mapping classes can be reused, but you'll have to recreate the logic found in SpecificationFactory for elasticsearch using QueryBuilders, which will be the most time consuming part I imagine.
When you're done, you can use @EnableElasticsearchRepositories just as described by spring-data-jpa-datatables, i.e.:
@EnableElasticsearchRepositories(repositoryFactoryBeanClass = ElasticsearchDataTablesRepositoryFactoryBean.class)
And extend your repositories with your ElasticsearchDataTablesRepository interface and you're good to go.
For reference you should look at SpecificationFactory and AbstractElasticsearchRepository (the search method) and get familiar with Elasticsearch QueryBuilders.
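For orientation, the repository-facing piece might look roughly like this sketch (the interface name is the hypothetical one suggested above; the factory bean and the QueryBuilders-based query translation still have to be written):

import org.springframework.data.elasticsearch.repository.ElasticsearchRepository;
import org.springframework.data.jpa.datatables.mapping.DataTablesInput;
import org.springframework.data.jpa.datatables.mapping.DataTablesOutput;
import org.springframework.data.repository.NoRepositoryBean;

@NoRepositoryBean
public interface ElasticsearchDataTablesRepository<T, ID> extends ElasticsearchRepository<T, ID> {

    // Mirrors the findAll(DataTablesInput) contract of spring-data-jpa-datatables
    DataTablesOutput<T> findAll(DataTablesInput input);
}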

Restrict execution of a Method with Java Annotations

Do you know if there is a way to check who is calling a method and to restrict whether they are allowed to execute it, using Java annotations?
For example, suppose you have a client and a server. There are several users who have different roles and who log into the client. Then the same client, with different users, wants to call a getMethod on the server.
Can I restrict who is allowed to call this method with Java annotations?
Like:
@Role(role = "AllowedRole")
public ReturnType getMethod() {
    ...
}
Well, I used to achieve this with Seam/DeltaSpike on a JBoss server. It's pretty straightforward.
Basically, you have a method which you annotate with your annotation. For example, mine is @User:
public class MyClass {
    @User
    public Object getMethod() {
        // implementation
    }
}
Next, you need a class where you define how you check your annotations:
public class Restrictions {
    @Secures @User
    public boolean isOk(Identity identity) {
        if ("Peter".equals(identity.getUsername())) {
            return true;
        }
        return false;
    }
}
That's it! Of course, you need some libraries and you have to define the interceptor configuration in certain XML files (like beans.xml), but that can easily be done with a little googling.
Start from these links:
Seam framework
Questions I asked on JBoss community when I was starting with this
This seems to be a good case for Method Security of Spring Security.
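For example, with Spring Security's method security enabled (e.g. @EnableGlobalMethodSecurity(prePostEnabled = true) on a configuration class), a sketch of the server-side method could be (ReportService is just an illustrative name):

import org.springframework.security.access.prepost.PreAuthorize;
import org.springframework.stereotype.Service;

@Service
public class ReportService {

    // Callers without the ROLE_AllowedRole authority get an AccessDeniedException
    @PreAuthorize("hasRole('AllowedRole')")
    public String getMethod() {
        return "only visible to AllowedRole";
    }
}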
Annotations do not include code and are not processed magically. They just define metadata, so you need some kind of engine that processes the annotations and performs the access validation.
There are a lot of frameworks and tools that do this. For example you can implement it using AspectJ; the Spring Framework and Java EE support similar annotations.
You can also implement this logic yourself using a dynamic proxy, bytecode engineering or another technique.
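As a rough illustration of the do-it-yourself option, a JDK dynamic proxy could enforce a hypothetical @Role annotation like this (the AccountService interface and the way the caller's roles are obtained are assumptions for the sketch):

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Proxy;
import java.util.Set;

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@interface Role {
    String role();
}

interface AccountService {
    @Role(role = "AllowedRole")
    String getMethod();
}

public class RoleEnforcer {

    // Wraps the target so every call is checked against the caller's roles
    public static <T> T secure(Class<T> iface, T target, Set<String> callerRoles) {
        return iface.cast(Proxy.newProxyInstance(
                iface.getClassLoader(),
                new Class<?>[] { iface },
                (proxy, method, args) -> {
                    Role required = method.getAnnotation(Role.class);
                    if (required != null && !callerRoles.contains(required.role())) {
                        throw new SecurityException("Caller lacks role " + required.role());
                    }
                    return method.invoke(target, args);
                }));
    }
}

Frameworks like Spring Security or CDI interceptors do essentially this for you, with proper integration into their security context.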
So please explain in more detail what kind of application you are implementing, and we can probably give you better advice.

Unable to use multiple ebean databases in Play 2

We are setting up a slightly complicated project using Play Framework 2.0.3.
We need to access several pre-existing databases and would like to do it using the framework's built-in facilities (i.e. Ebean).
We tried to create all model classes within the "models" package, and then map each class, by its FQN, to the corresponding ebean property in application.conf:
ebean.firstDB="models.ClassA,models.ClassB,models.ClassC"
ebean.secondDB="models.ClassD"
ebean.thirdDB="models.ClassE,models.ClassF"
This doesn't seem to work:
PersistenceException: Error with [models.SomeClass] It has not been enhanced but it's superClass [class play.db.ebean.Model] is? (You are not allowed to mix enhancement in a single inheritance hierarchy) marker[play.db.ebean.Model] className[models.SomeClass]
We checked and re-checked and the configuration is OK!
We then tried to use a different Java package for each database model classes and map them accordingly in the application.conf:
ebean.firstDB = "packageA.*"
ebean.secondDB = "packageB.*"
ebean.thirdDB = "packageC.*"
This works fine when reading information from the database, but when we try to save/update objects we get:
PersistenceException: The default EbeanServer has not been defined? This is normally set via the ebean.datasource.default property. Otherwise it should be registered programatically via registerServer()
Any ideas?
Thanks!
Ricardo
You have to specify in your query which database you want to access.
For example, if you want to retrieve all users from your secondDB :
// Get access to your secondDB
EbeanServer secondDB = Ebean.getServer("secondDB");
// Get all users in secondDB
List<User> userList = secondDB.find(User.class).findList();
When using save(), delete(), update() or refresh(), you have to specify the Ebean server, for instance for the save() method:
classA.save("firstDB");
I ran into the same problem and wasted a whole day investigating it; finally I got it working.
1. Define named Ebean servers
db.default.driver=com.mysql.jdbc.Driver
db.default.url="jdbc:mysql://localhost:3306/db1"
db.default.user=root
db.default.password=123456
db.aux.driver=com.mysql.jdbc.Driver
db.aux.url="jdbc:mysql://localhost:3306/db2"
db.aux.user=root
db.aux.password=123456
Now you have two Ebean servers, [default] and [aux], at runtime.
2. Map entity packages to servers in application.conf
ebean.default="models.*"
ebean.aux= "secondary.*"
Now entities under the models.* package are configured for the [default] server and entities under secondary.* are configured for the [aux] server. I think this is related to Java class enhancement or something similar. You don't have to separate entities into different packages, but if entities of different Ebean servers live in the same package it may cause weird trouble and exceptions.
3. When using your models, the save/delete/update-related methods should take the server name as a parameter:
Student s = new Student();
s.save("aux");
4. When using a finder, you should define it as:
public static Finder<Long, Student> find = new Finder<Long, Student>("aux", Long.class, Student.class);
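Putting these pieces together, a model bound to the [aux] server could look like this sketch (package and field names are just examples):

package secondary;

import javax.persistence.Entity;
import javax.persistence.Id;
import play.db.ebean.Model;

@Entity
public class Student extends Model {

    @Id
    public Long id;

    public String name;

    // Finder explicitly bound to the "aux" Ebean server
    public static Finder<Long, Student> find = new Finder<Long, Student>("aux", Long.class, Student.class);
}

Reads then go through find (which already targets "aux"), while writes use save("aux") as shown above.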
It might not be the same case, but I ran into this "SomeClass not enhanced" PersistenceException with Play 2.1.0, and the only thing missing was a public declaration in the SomeClass model class that I had forgotten.
In Play 2.1.0 the error message was a little different:
PersistenceException: java.lang.IllegalStateException: Class [class play.db.ebean.Model] is enhanced and [class models.Address] is not - (you can not mix!!)
This solved my issue with saving to my DB table and got rid of the error:
"javax.persistence.PersistenceException: The default EbeanServer has not been defined ? This is normally set via the ebean.datasource.default property. Otherwise it should be registered programatically via registerServer()"
