Spring Session stores serialized objects in my database. The problem is, sometimes my code changes. Sometimes my objects change. This is normal. However, I get errors like this:
org.springframework.core.convert.ConversionFailedException: Failed to convert from type [byte[]] to type [java.lang.Object] for value '{-84, ..., 112}'; nested exception is org.springframework.core.serializer.support.SerializationFailedException: Failed to deserialize payload. Is the byte array a result of corresponding serialization for DefaultDeserializer?; nested exception is java.io.InvalidClassException: com.mysite.MyClass; local class incompatible: stream classdesc serialVersionUID = 1432849980928799324, local class serialVersionUID = 8454085305026634675
I get this error by invoking a Spring Boot endpoint with HttpSession as an argument, such as this one:
@GetMapping("/stuff")
public @ResponseBody MyClass getStuff(HttpSession session) {
    try {
        Object myObject = session.getAttribute("MyClass");
        if (myObject != null && myObject instanceof MyClass) {
            return (MyClass) myObject;
        } else {
            return null;
        }
    } catch (Exception e) {
        logger.warn("Invalid session data", e);
        return null;
    }
}
However, because the exception is thrown before the method gets invoked, I am not able to recover from this normal, expected error.
As a workaround, I am forced to delete the entire session table each deployment, even though most of the objects are still compatible!
To be clear, the solution is NOT to add a serialVersionUID, because the objects really do change in incompatible ways from one deployment to the next. This is not a serialization question; it is a Spring Session error recovery question.
My question is: How can I gracefully recover from these issues?
You did not provide details, but I assume you are using Spring's JDBC session implementation enabled by @EnableJdbcHttpSession?
In this case you can take a look at JdbcHttpSessionConfiguration, and particularly at setSpringSessionConversionService and setConversionService. I believe that if you provide your own implementation (see createConversionServiceWithBeanClassLoader for an example) you should be able to catch the deserialization error and return an empty session.
I think all you need is to derive MyNotFailingSessionDeserializer from DeserializingConverter, override the convert method, catch SerializationFailedException and return null or an empty session (not sure which of the two works).
Then you create your conversion service the way createConversionServiceWithBeanClassLoader does, but use your MyNotFailingSessionDeserializer instead of DeserializingConverter.
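Something along these lines (untested; the two classes are shown together for brevity, and the bean name has to match the @Qualifier used by setSpringSessionConversionService, so verify it against your Spring Session version):

import org.springframework.beans.factory.BeanClassLoaderAware;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.convert.support.GenericConversionService;
import org.springframework.core.serializer.support.DeserializingConverter;
import org.springframework.core.serializer.support.SerializationFailedException;
import org.springframework.core.serializer.support.SerializingConverter;

public class MyNotFailingSessionDeserializer extends DeserializingConverter {

    public MyNotFailingSessionDeserializer(ClassLoader classLoader) {
        super(classLoader);
    }

    @Override
    public Object convert(byte[] source) {
        try {
            return super.convert(source);
        } catch (SerializationFailedException e) {
            // Stale or incompatible session payload: treat the attribute as absent
            // instead of failing the whole request.
            return null;
        }
    }
}

@Configuration
public class SessionConversionConfig implements BeanClassLoaderAware {

    private ClassLoader classLoader;

    @Override
    public void setBeanClassLoader(ClassLoader classLoader) {
        this.classLoader = classLoader;
    }

    // The bean name must match the qualifier expected by setSpringSessionConversionService.
    @Bean("springSessionConversionService")
    public GenericConversionService springSessionConversionService() {
        GenericConversionService conversionService = new GenericConversionService();
        conversionService.addConverter(Object.class, byte[].class, new SerializingConverter());
        conversionService.addConverter(byte[].class, Object.class,
                new MyNotFailingSessionDeserializer(this.classLoader));
        return conversionService;
    }
}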
Related
I'd like some feedback on a situation where:
A method constructs an object, but some of the work done while constructing it might fail, which leads to an object that is missing some data. I want to give the caller of this method the ability to handle the object when it is complete, handle it when it is incomplete, and also handle the exception that was thrown.
Use Case:
I'm reading a file from disk into a POJO, and some of the file attributes, like the creation date, can throw an exception while being read from the operating system. In that case I throw a custom exception, but I also want the user to be able to handle that incomplete file representation (POJO).
My solution:
I used a custom exception that wraps the thrown exception and the incomplete object.
My code:
public FileDto getFromFile(File f) throws IncompleteFileDtoException {
    FileDto dto = new FileDto();
    dto.setName(f.getName());
    dto.setPath(f.getAbsolutePath());
    dto.setDirectory(f.isDirectory());
    dto.setSize(f.length());
    dto.setModifiedAt(f.lastModified());
    try {
        BasicFileAttributes attr = Files.readAttributes(f.toPath(), BasicFileAttributes.class);
        dto.setCreatedAt(attr.creationTime().toMillis());
    } catch (Exception e) {
        throw new IncompleteFileDtoException("Unable to transform " + f.getAbsolutePath() + " to DTO.", e, dto);
    }
    return dto;
}
public static class IncompleteFileDtoException extends Exception {

    private final FileDto fileDto;

    public IncompleteFileDtoException(String message, Exception e, FileDto fileDto) {
        super(message, e);
        this.fileDto = fileDto;
    }

    public FileDto getFileDto() {
        return fileDto;
    }
}
What negative effects could this code have?
Your example only contained one value that might lead to a problem, but as soon as you have multiple values you end up with quite complicated code, because you have to keep track of whether such an exception should be thrown.
Personally, I think a better approach might be to set fitting default values (or just null) when the processing of a particular value fails but it is acceptable for that value to be missing during initialization. If a null value is acceptable, you can skip the exception-throwing entirely. If you need to know whether something went wrong during setup, add a flag to the object that records the failure and can be checked later. That would also allow you to pass the object around without losing that information in subsequent classes, etc.
In short: exceptions should only indicate exceptional situations, i.e. that an object can't be used, not expected situations.
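For example, a minimal version of that flag on the DTO from the question (names are illustrative):

public class FileDto {
    // ... existing fields, getters and setters ...

    // Set to false when one of the attributes could not be read during construction.
    private boolean complete = true;

    public boolean isComplete() {
        return complete;
    }

    public void markIncomplete() {
        this.complete = false;
    }
}

The catch block in getFromFile would then call dto.markIncomplete() and carry on instead of throwing.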
I suggest you use the Builder pattern: create a FileDtoBuilder and put that into the exception. When the file is read successfully, create the FileDto instance from the existing FileDtoBuilder.
Gang Of Four Design Patterns
https://en.wikipedia.org/wiki/Builder_pattern
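A rough sketch of that idea, assuming a FileDtoBuilder whose methods mirror the FileDto setters and an exception variant that carries the builder instead of the finished DTO (these names are made up for illustration):

public FileDto getFromFile(File f) throws IncompleteFileDtoException {
    FileDtoBuilder builder = new FileDtoBuilder()
            .name(f.getName())
            .path(f.getAbsolutePath())
            .directory(f.isDirectory())
            .size(f.length())
            .modifiedAt(f.lastModified());
    try {
        BasicFileAttributes attr = Files.readAttributes(f.toPath(), BasicFileAttributes.class);
        builder.createdAt(attr.creationTime().toMillis());
    } catch (Exception e) {
        // The caller can still obtain the partial data via the builder carried by the exception.
        throw new IncompleteFileDtoException("Unable to transform " + f.getAbsolutePath() + " to DTO.", e, builder);
    }
    return builder.build();
}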
We have an application with three databases. Two of them are only very seldom updated. We tried using JPA to create transactions around them, and it worked for the databases, but Grails then broke in various places (GSP-related, I am told). This was tried quite a while ago (and not by me).
Due to delivery pressure we needed a solution that at least works for us, so I created a new aspect for the methods changing data in multiple databases. I got this to work, it is a fairly simple approach.
In the aspect we request to start a transaction for each data source, by calling getTransaction(TransactionDefinition def) with the propagation set to REQUIRES_NEW. We then proceed and finally rollback or commit depending on the outcome of the call.
However, one test flow failed. This is the scenario where the code requests a rollback by calling TransactionAspectSupport.currentTransactionStatus().setRollbackOnly(). Of the three TransactionStatus objects obtained initially, none actually returns true from isRollbackOnly(). However, calling TransactionAspectSupport.currentTransactionStatus().isRollbackOnly() does return true. So this seems to point to a different transaction status.
I have not been able to figure out how to make this work, other than checking this additional status. I could not find a way to change the currentTransactionStatus to one of the TransactionStatus objects I created. Looking at the TransactionTemplate implementation, I seem to do things correctly (it also just calls getTransaction() on the transaction manager).
The code calling the decorated method has specified @Transactional(propagation=Propagation.NOT_SUPPORTED), so I expected no currentTransactionStatus, but one is there.
However, if it is not there the proxied code will not be able to request a rollback the standard way, which I want to be able to fix.
So the question is, how to start a transaction correctly from an Aspect so that the currentTransactionStatus is set correctly or how to set the currentTransactionStatus to what I think is the correct one.
Regards,
Wim Veldhuis.
I finally figured it out.
@Transactional leads to a different code path, where eventually TransactionAspectSupport.invokeWithinTransaction is invoked. This method sets up the current transaction correctly.
So in order to make my approach work, I needed to derive from TransactionAspectSupport, do a number of cast operations to get to the correct values for the invokeWithinTransaction call, and within the guarded function block use getTransaction(def) to obtain transactions for the OTHER databases. I chose the most important database to be the one used for invoke...
To make it work I also had to provide a TransactionAttributeSource that returns my default transaction attributes. That one is stored in the TransactionAspectSupport base class during initialization.
@Around("@annotation(framework.db.MultiDbTransactional)")
public Object multiDbTransaction(ProceedingJoinPoint proceedingJoinPoint) throws Throwable {
    // Get class and method, needed for parent invocation. We need to cast to the actual
    // implementation.
    MethodInvocationProceedingJoinPoint mipJoinPoint = (MethodInvocationProceedingJoinPoint) proceedingJoinPoint;
    MethodSignature signature = (MethodSignature) mipJoinPoint.getSignature();
    Class<?> clazz = mipJoinPoint.getTarget().getClass();
    Method method = signature.getMethod();

    return invokeWithinTransaction(method, clazz, new InvocationCallback() {
        @Override
        public Object proceedWithInvocation() throws Throwable {
            // This class will create the other transactions, not of interest here.
            MultiDbTxnContext ctx = new MultiDbTxnContext();
            ctx.startTransactions();
            /*
             * We have started the transactions, so do the job. We mimic DEFAULT Spring behavior
             * regarding exceptions, so runtime exceptions roll back, the rest commits.
             */
            try {
                Object result = proceedingJoinPoint.proceed();
                ctx.finishTransactions();
                return result;
            } catch (Error | RuntimeException re) {
                ctx.rollbackTransactions();
                throw re;
            } catch (Throwable t) {
                ctx.commitTransactions();
                throw t;
            }
        }
    });
}
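For completeness, this is roughly how the pieces around the aspect could be wired up; the annotation definition and the @PostConstruct hook below are my assumptions, not something taken from the original code:

// Hypothetical marker annotation matched by the pointcut above.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface MultiDbTransactional {
}

// Inside the aspect, which extends TransactionAspectSupport:
@PostConstruct
public void initTransactionSupport() {
    // Default attributes handed to invokeWithinTransaction for the primary database.
    MatchAlwaysTransactionAttributeSource attributeSource = new MatchAlwaysTransactionAttributeSource();
    attributeSource.setTransactionAttribute(
            new DefaultTransactionAttribute(TransactionDefinition.PROPAGATION_REQUIRES_NEW));
    setTransactionAttributeSource(attributeSource);
    setTransactionManager(primaryTransactionManager); // the most important database's manager
}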
I'm working on a project revamp for a company, where they want to split their system between front-end/client and back-end/server (more like a middleman between the front-end and the database server), and I'm supposed to use JAX-WS RPC and maintain the current functionality.
By maintaining functionality they mean that some methods are supposed to return null, which is forbidden by WS-I.
Searching for possible solutions, I stumbled upon this article: http://victor-ichim.blogspot.com.br/2011/03/rpcliteral-and-null-object-pattern.html which basically solves a similar problem by using EJB Interceptors to intercept and replace null results with empty objects.
Working from that concept, I thought of intercepting the results in the same way, replacing null with something like a string template, intercepting it again on the client and replacing that template back with null.
My questions are:
1. They don't use EJB by default, so no Interceptors per se. Is there some implementation that could work for both Tomcat and JBoss?
2. Even if I'm able to intercept the return server-side, how could I do it client-side?
3. If I can use SOAPHandlers, how can I avoid raising the SOAP Fault for trying to return null?
Since I also had problems with JAXB not handling interfaces, what I ended up doing was using the @XmlJavaTypeAdapter annotation to convert values to and from null, in sort of a hackjob manner; it has to be applied selectively, since every return value and parameter that could practically be null needs to be annotated. I created a generic-ish adapter for Serializable objects, and followed the same sort of approach for other kinds of objects:
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.Base64;

import javax.xml.bind.annotation.adapters.XmlAdapter;

public class SerializableAdapter extends XmlAdapter<String, Serializable> {

    private static final String NULL = "'NULL'"; // Will hopefully never collide

    @Override
    public Serializable unmarshal(String e) throws Exception {
        if (NULL.equals(e)) {
            return null;
        }
        byte[] eB = e.getBytes("ISO-8859-1");
        InputStream iS = new ByteArrayInputStream(Base64.getDecoder().decode(eB));
        ObjectInputStream oIS = new ObjectInputStream(iS);
        return (Serializable) oIS.readObject();
    }

    @Override
    public String marshal(Serializable o) throws Exception {
        if (o == null) {
            return NULL;
        }
        ByteArrayOutputStream bAOS = new ByteArrayOutputStream();
        ObjectOutputStream oOS = new ObjectOutputStream(bAOS);
        oOS.writeObject(o);
        return Base64.getEncoder().encodeToString(bAOS.toByteArray());
    }
}
And then I annotated every Serializable instance with @XmlJavaTypeAdapter(SerializableAdapter.class), since using package-level @XmlJavaTypeAdapters didn't work for some reason, and so forth for other cases. JAXB seems to eagerly cast the types encoded to and from when calling the adapters, so it will compile just fine even if the object to be marshalled isn't an instance of the expected class/interface, and throw exceptions only at runtime.
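For illustration, the annotations end up on each affected return type and parameter roughly like this (the service interface and method names here are invented):

@WebService
public interface LegacyService {

    @WebMethod
    @XmlJavaTypeAdapter(SerializableAdapter.class)
    Serializable findRecord(
            @WebParam(name = "key") @XmlJavaTypeAdapter(SerializableAdapter.class) Serializable key);
}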
I don't recommend doing it this way, since it will require annotating every single method/parameter or package, and will break at the first one that didn't get annotated and yet received null. This very adapter still serves a purpose for cases where I need to work with interfaces, and the implementing classes also implement Serializable, although there are cases that still need specific adapters, but that's usually badly thought-out code.
Partly because of the hackiness of this and the hassle of annotating everything, I managed to convince the company to move away from SOAP RPC bindings, so I was able to have null parameters and return values without any of this.
I'm using the java.util.concurrent framework for the first time. Here's a very simplified version of what I'm doing. For those not intimately familiar with the framework, future.get() blocks until the Callable defined in the future has completed and returns its result. future.getOriginatingRequest() returns an object I set on the future for use by the Callable, and I'm just trying to log which originating request object failed (it's enough to know its class name).
try {
    future.get();
} catch (ExecutionException e) {
    logger.error("Failed to execute future with id '"
            + future.getOriginatingRequest().getClass().getName() + "'");
}
The problem I'm having is that the logging framework is outputting this:
Failed to execute future with id '$Proxy22'
Thus instead of the real class name I am getting $Proxy22 or some other number. Is there a way to get hold of the real class name rather than the proxy name? Bonus points if someone can clearly explain why I'm getting the proxy string in the first place!
I can answer the bonus question: the string is the name of a dynamic Proxy class, generated in runtime.
As for how you can get to the masked class, there's not even a guarantee that one exists at all. The only thing you can do is to call Proxy.getInvocationHandler() on your proxy object and hope that the invocation handler will reveal more information (unlikely but may be worth a shot).
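If you want to try that, something along these lines will at least tell you which interfaces the proxy implements and which handler sits behind it (using java.lang.reflect.Proxy, java.lang.reflect.InvocationHandler, java.util.Arrays and the logger from the question):

Object request = future.getOriginatingRequest();
if (Proxy.isProxyClass(request.getClass())) {
    InvocationHandler handler = Proxy.getInvocationHandler(request);
    logger.error("Request is a proxy implementing "
            + Arrays.toString(request.getClass().getInterfaces())
            + ", backed by " + handler.getClass().getName());
}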
I found a solution that works for me at http://www.techper.net/2009/06/05/how-to-acess-target-object-behind-a-spring-proxy/
@SuppressWarnings({"unchecked"})
protected <T> T getTargetObject(Object proxy, Class<T> targetClass) throws Exception {
    if (AopUtils.isJdkDynamicProxy(proxy)) {
        return (T) ((Advised) proxy).getTargetSource().getTarget();
    } else {
        return (T) proxy; // expected to be a CGLIB proxy then, which is simply a specialized class
    }
}
Usage
@Override
protected void onSetUp() throws Exception {
    getTargetObject(fooBean, FooBeanImpl.class).setBarRepository(new MyStubBarRepository());
}
I'm guessing that the proxy class is a subclass of the class you're looking for, especially if the class you're looking for is a class you wrote.
Can you access the inheritance tree of the object you found, maybe through reflection?
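A quick reflective walk shows both the superclass chain and the interfaces; keep in mind that for a JDK dynamic proxy the superclass is just java.lang.reflect.Proxy, so the interfaces are usually the telling part (obj stands for the object you got back):

Class<?> c = obj.getClass();
while (c != null) {
    System.out.println(c.getName() + " implements " + Arrays.toString(c.getInterfaces()));
    c = c.getSuperclass();
}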
I have a remote EJB with a method that validates an object (returning true or false). I want to be able to pass it an ArrayList object and have the EJB load it with the errors encountered during validation, while still receiving true/false as result.
How can I do this? So far, I can send it the list, and it's affected on server side, but the original list is not modified on client side.
That's because when the list is sent over the wire to the bean, a copy must necessarily be made, because the list is moved from one JVM to another. Unlike with a normal method call, it's not the same list. I don't know how it would work with local beans, but there's no other way with remote beans.
I suggest you have the bean return the list and if that's empty, the object is valid.
For example:
public List<String> methodWithValidation(Object input) {
    List<String> errors = new java.util.ArrayList<String>();
    // various validation tests, each adding a message on fail
    return errors;
}
And the calling method would do this:
List<String> errors = bean.methodWithValidation(object);
if (!errors.isEmpty()) {
    // error logic
} else {
    // continue
}