I have created a WebSocket client using Tyrus.
The problem happens in the @OnMessage-annotated instance method.
Within the enclosing class I have three things that appear to be operating in conflicting instance contexts:
A Logger instance in the parent class; the getter getLogger() returns the valid logger instance.
A LinkedList for enqueuing messages; adding to it throws a NullPointerException, which is the core problem.
The enclosing instance object itself.
private LinkedList<String> messageQueue = new LinkedList<>();
private Logger logger = LoggerFactory.fromClass(WebSocketClient.class);
public Logger getLogger() {
return logger;
}
public WebSocketClient() {
super();
this.getLogger().info(this.toString());
}
public void start() {
try {
WebSocketContainer webSocketContainer = ContainerProvider.getWebSocketContainer();
this.connectionSession = webSocketContainer.connectToServer(WebSocketClient.class, URI.create("WebSocket URL here"));
} catch (Exception exception) {
this.getLogger().error("Exchange Client Start Error", exception);
}
}
@OnMessage
public void processMessage(String message) {
// This returns the correct Logger instance
this.getLogger().info("Received Message: " + message);
// This returns a different hashCode()
this.getLogger().info(this.toString());
// This throws a NullPointerException
this.messageQueue.add(message);
}
While debugging: in the constructor, this.getLogger().info(this.toString()) logs the expected package and class name along with the hashCode().
But when I log the same thing in the processMessage() method, it prints a completely different hashCode().
Based on my research, Object.hashCode() should return the same value for the same instance every time it is called.
Which leads me to believe that processMessage() is being called on a duplicate (copy) of my object, or something along those lines.
Any help would be great.
Instead of using WebSocketContainer's connectToServer, I used:
ClientManager client = ClientManager.createClient();
client.connectToServer(this, URI.create("WebSocket URL here"));
It seems WebSocketContainer follows the convention of locating a service provider, and that might have been the issue, as the service provider could have used a different implementation (I'm speculating here).
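For reference, the reworked start() looks roughly like this (a sketch; the URL is still a placeholder and connectionSession is the same field as before):
public void start() {
    try {
        ClientManager client = ClientManager.createClient();
        // Passing "this" registers this already-constructed instance as the endpoint,
        // so @OnMessage runs on the same object whose fields were initialized.
        this.connectionSession = client.connectToServer(this, URI.create("WebSocket URL here"));
    } catch (Exception exception) {
        this.getLogger().error("Exchange Client Start Error", exception);
    }
}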
I have been using google-guice for a couple of days now, and I am getting more and more impressed.
I created a MembersInjector to easily integrate the logging framework SLF4J, just with an additional annotation. That means instead of always writing the long form:
private static final Logger LOG = LoggerFactory.getLogger(MyClass.class);
I am now just using:
@Log
Logger LOG;
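For context, the wiring behind @Log is a custom TypeListener plus MembersInjector, roughly like this sketch (class names and details here are illustrative, not necessarily my exact code; each class in its own file, imports shown once):
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Field;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.google.inject.MembersInjector;
import com.google.inject.TypeLiteral;
import com.google.inject.spi.TypeEncounter;
import com.google.inject.spi.TypeListener;

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
public @interface Log {}

public class Slf4jTypeListener implements TypeListener {
    @Override
    public <I> void hear(TypeLiteral<I> typeLiteral, TypeEncounter<I> typeEncounter) {
        // Walk the class hierarchy and register an injector for every @Log Logger field.
        for (Class<?> clazz = typeLiteral.getRawType(); clazz != null; clazz = clazz.getSuperclass()) {
            for (Field field : clazz.getDeclaredFields()) {
                if (field.getType() == Logger.class && field.isAnnotationPresent(Log.class)) {
                    typeEncounter.register(new Slf4jMembersInjector<I>(field));
                }
            }
        }
    }
}

public class Slf4jMembersInjector<T> implements MembersInjector<T> {
    private final Field field;
    private final Logger logger;

    Slf4jMembersInjector(Field field) {
        this.field = field;
        this.logger = LoggerFactory.getLogger(field.getDeclaringClass());
        field.setAccessible(true);
    }

    @Override
    public void injectMembers(T instance) {
        try {
            // Set the logger for the declaring class into the annotated field.
            field.set(instance, logger);
        } catch (IllegalAccessException e) {
            throw new RuntimeException(e);
        }
    }
}
The listener is registered in the module with bindListener(Matchers.any(), new Slf4jTypeListener()).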
This is really great, BUT:
I am also using method injection as a kind of starter function in the same class, and if I access the LOG instance there, it causes an NPE, because it has not been injected yet.
@Inject
public void start() {
//And here is the problem...
LOG.info("in start"); // causes NPE, because LOG is not injected yet
}
If I use the LOG instance in another (non-injected) method, it works perfectly.
Is there a way to change the injection order, or is it possible to tell Guice to run the MembersInjector earlier? Because I really would like to use logging in the method-injection part as well.
Thanks for any hint.
One solution I found was to create an additional listener that looks for a defined method (like "init" or "start") and calls it after object creation and member injection.
See in module configuration:
@Override
protected void configure() {
    bindListener(Matchers.any(), new InitMethodTypeListener());
    //...
}
InitMethodTypeListener:
public class InitMethodTypeListener implements TypeListener {
private static final Logger log = LoggerFactory.getLogger(InitMethodTypeListener.class);
@SuppressWarnings("rawtypes")
static class InitInvoker implements InjectionListener {
@Override
public void afterInjection(final Object injectee) {
try {
log.info("Invoke init() from Class: {}", injectee.getClass().getName());
injectee.getClass().getMethod("init").invoke(injectee);
} catch (final Exception e) {
log.error(e.getMessage(), e);
}
}
public static final InitInvoker INSTANCE = new InitInvoker();
}
@SuppressWarnings("unchecked")
@Override
public <I> void hear(final TypeLiteral<I> type, final TypeEncounter<I> encounter) {
try {
if (type.getRawType().getMethod("init") != null) {
encounter.register(InitInvoker.INSTANCE);
}
} catch (final NoSuchMethodException | SecurityException e) {
// do nothing here; if no init method is found, nothing is called
}
}
}
Maybe it's not the most straightforward way, but it works, and this way I am sure all members are injected by the time the init method is called.
With this implementation, every object under Guice's control has its "init" method called automatically after object creation and injection.
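A usage sketch (the class name is made up): any Guice-managed bean with a public init() method now gets it invoked right after member injection, so the injected @Log field is safe to use there.
public class StarterService {
    @Log
    Logger log;

    public void init() {
        // Called by InitInvoker after all members (including the @Log field) are injected.
        log.info("StarterService initialized");
    }
}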
I'm trying to implement a database authentication with Eclipse Scout.
For that I created a class DataSourceCredentialVerifier in the client module, which implements the ICredentialVerifier interface. Then I adapted the init method of the UiServletFilter class to use my verifier.
public class DataSourceCredentialVerifier implements ICredentialVerifier {
private static final Logger LOG = LoggerFactory.getLogger(DataSourceCredentialVerifier.class);
@Override
public int verify(String username, char[] password) throws IOException {
Object queryResult[][] = BEANS.get(IMySqlAuthService.class).load();
return AUTH_OK;
}
I haven't implemented any authentication logic yet. My task now is to establish a clean database connection.
For that I created the following interface in the shared module:
public interface IMySqlAuthService extends IService {
Object[][] load();
}
The implementation is in the server module:
public class MySqlAuthService implements IMySqlAuthService {
@Override
public Object[][] load() {
String sql = "select username, password from users ";
Object[][] queryResult = SQL.select(sql, null, null);
return queryResult;
}
}
First I want to see if the query returns anything at all, but I get an AssertionException here:
Object queryResult[][] = BEANS.get(IMySqlAuthService.class).load();
org.eclipse.scout.rt.platform.util.Assertions$AssertionException: Assertion error: no instance found for query: interface org.eclipse.scout.app.shared.services.IMySqlAuthService
at org.eclipse.scout.rt.platform.util.Assertions.fail(Assertions.java:580)
at org.eclipse.scout.rt.platform.util.Assertions.assertNotNull(Assertions.java:87)
at org.eclipse.scout.rt.platform.BEANS.get(BEANS.java:41)
I don't get an instance of my MySqlAuthService implementation. I assume that the BeanManager should have created an instance for me. MySqlAuthService should be registered as a bean, since my IMySqlAuthService interface extends IService, which carries the @ApplicationScoped annotation.
Adding the @Bean annotation to MySqlAuthService results in the same exception.
Here some information about the BeanManager and annotations:
https://eclipsescout.github.io/6.0/technical-guide.html#sec-bean.manager
Here is a different approach someone else tried, but it doesn't feel correct:
https://www.eclipse.org/forums/index.php/t/1079741/
How can I get my example to work with my service?
Here is the working solution with important explanations of Eclipse Scout principles.
The source is summarized information of the Eclipse-Scout-Technical-Guide.
In Scout there is a built-in annotation: @TunnelToServer. Interfaces marked with this annotation are called on the server. The server itself ignores this annotation.
This annotation is required so that the bean gets registered on the client side. The platform cannot (!) directly create an instance for these beans; instead, a specific producer is registered which creates a proxy that delegates the call to the server.
My first clear mistake was that I hadn't annotated IMySqlAuthService with @TunnelToServer.
After adding it, the "no instance found" AssertionException went away.
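With the annotation in place, the shared interface simply looks like this:
@TunnelToServer
public interface IMySqlAuthService extends IService {
    Object[][] load();
}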
After that, my code ran into HTTP status code 403 (access forbidden).
This occurred because my code wasn't running in the correct RunContext. I had to use these lines of code in the verify method of the DataSourceCredentialVerifier:
Subject subject = new Subject();
subject.getPrincipals().add(new SimplePrincipal("system"));
subject.setReadOnly();
RunContext runContext = RunContexts.copyCurrent().withSubject(subject);
Now one can use the runContext's call() or run() method, depending on whether the code returns a result. The action runs in the current thread, meaning the caller is blocked until it completes.
Concrete example solution:
Object[][] result = runContext.call(new Callable<Object[][]>() {
@Override
public Object[][] call() throws Exception {
return BEANS.get(IMySqlAuthService.class).load();
}
});
//TODO implement authentication logic.
For more information about the RunContext see here:
https://eclipsescout.github.io/6.0/technical-guide.html#runcontext
I'm attempting to figure out how to test a method written in my LoggingRestService class. I'd like to verify that my (external) error method produces the correct log entry in this logger.
private static final Logger LOGGER = LoggerFactory.getLogger(LoggingRestService.class);
@POST
@Path("debug")
@ApiOperation(value = "Writes js error message log. (not secured)")
@ApiResponse(code = 200, message = "Message logged")
public void getUserList(@ApiParam(value = "Js debug message.")
String message) {
ConverterUtils.error(LOGGER, message, null);
}
}
Here is ConverterUtils' error method:
public static void error(Logger logger, String mess, Exception ex) {
Class<?> aClass = logger.getClass();
try {
Method method = aClass.getMethod("error", String.class, Throwable.class);
method.invoke(logger, mess, ex);
} catch (NoSuchMethodException | InvocationTargetException | IllegalAccessException e) {
LOGGER.error("get logger error", e);
}
}
I've never used Mockito, but I am somewhat familiar with the concept of mocking classes. Still, I'm a little boggled: I was not given much direction for testing, and attempting to be self-sufficient is slowly eroding my confidence. Thanks a bunch.
Let me check if I get what you are saying: you want to verify that your ConverterUtils class calls the logger's error method with the appropriate message and exception, right?
To be able to do that you would need to make sure that your ConverterUtils class is called with a mock version of a Logger. You can do that with mock(Logger.class). That will work even if Logger is a class and not an interface because Mockito can mock classes that are not final. It will actually create a subclass for you.
Your test code will look like this:
Logger mockedLogger = mock(Logger.class);
String expectedMessage = "Whatever";
Exception expectedException = new Exception();
// Business method
ConverterUtils.error(mockedLogger, expectedMessage, expectedException);
// Asserts
verify(mockedLogger).error(expectedMessage, expectedException);
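For completeness, here is a minimal sketch of how that might look as a full test class (assuming JUnit 4, Mockito, and SLF4J on the classpath; the class and method names are illustrative):
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;

import org.junit.Test;
import org.slf4j.Logger;

public class ConverterUtilsTest {

    @Test
    public void errorDelegatesToLogger() {
        Logger mockedLogger = mock(Logger.class);
        String expectedMessage = "Whatever";
        Exception expectedException = new Exception();

        // Business method
        ConverterUtils.error(mockedLogger, expectedMessage, expectedException);

        // Assert the message and exception reached the logger
        verify(mockedLogger).error(expectedMessage, expectedException);
    }
}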
On a side note: it puzzles me why the error method is not simply:
public static void error(Logger logger, String mess, Exception ex) {
logger.error(mess, ex);
}
I really don't see why you need the reflection layer here. You know exactly which method you want, so why not simply invoke it directly?
On the server-side I have a ListenerManager which fires callbacks to its Listeners. The manager is exported using a Spring RmiServiceExporter
On the client-side I have a proxy to the manager created by an RmiProxyFactoryBean, and a Listener implementation registered through this proxy with the manager on the server side.
So far so good: the ListenerManager is given a Listener and it invokes its callbacks, however since the listener is just a deserialized copy of the client-side object, the callback runs on the server side, not the client side.
How can I get Spring to generate a proxy on the server-side to the client-side listener so that the callback invoked by the server is executed remotely on the client-side? Surely I don't need another (exporter, proxy factory) pair in the opposite direction?
A pure RMI solution: the client-side listener object needs to extend java.rmi.server.UnicastRemoteObject. If it does, and each of its methods throws RemoteException, then when it is passed to the server through the manager proxy everything is wired up automatically, and method invocations on the server-side proxy to this listener are remote invocations of methods on the real client-side object.
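As a rough sketch (the interface and method names here are made up), such a client-side listener could look like this:
// Remote callback interface shared between client and server.
public interface MessageListener extends java.rmi.Remote {
    void onMessage(String message) throws java.rmi.RemoteException;
}

// Client-side implementation; the UnicastRemoteObject constructor exports it,
// so the server receives a stub and its method calls execute back on the client.
public class ClientMessageListener extends java.rmi.server.UnicastRemoteObject
        implements MessageListener {

    public ClientMessageListener() throws java.rmi.RemoteException {
        super();
    }

    @Override
    public void onMessage(String message) throws java.rmi.RemoteException {
        System.out.println("Callback executed on the client: " + message);
    }
}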
This will do, but it's even better to be able to wrap the object for export without requiring a particular superclass. We can use a CGLIB Enhancer to "proxy" the listener as a subclass of UnicastRemoteObject that also implements the service interfaces. This still requires that the target object implement java.rmi.Remote and declare throws RemoteException.
Next step is a solution that can export arbitrary objects for remote invocation of their methods, without requiring that they implement Remote or declare throws RemoteException. We must integrate this proxying with the existing Spring infrastructure, which we can do with a new implementation of RmiBasedExporter modelled on the non-registry bits of RmiServiceExporter#prepare() to export the RMI stub of our proxy and on the invocation part of RmiClientInterceptor.doInvoke(MethodInvocation, RmiInvocationHandler). We need to be able to get hold of an exported proxy instance of our service interfaces. We can model this on the means used by Spring to apparently "export" non-RMI interfaces. Spring proxies the interface to generate a RmiInvocationWrapper for invocation of a non-RMI method, serialises the method details and arguments, then invokes this on the far side of the RMI connection.
Use a ProxyFactory and an RmiInvocationHandler implementation to proxy the target object.
Use a new implementation of RmiBasedExporter to getObjectToExport(), and export it using UnicastRemoteObject#export(obj, 0).
For the invocation handler, rmiInvocationHandler.invoke(invocationFactory.createRemoteInvocation(invocation)), with a DefaultRemoteInvocationFactory.
Handle exceptions and wrap appropriately to avoid seeing UndeclaredThrowableExceptions.
So, we can use RMI to export arbitrary objects. This means we can use one of these objects on the client-side as a parameter to an RMI method call on an RMI server-side object, and when the deserialised stub on the server-side has methods invoked, those methods will execute on the client-side. Magic.
Following Joe Kearney's explanation, I have created my RMIUtil.java. I hope nothing is missing.
BTW, please refer to this
for "java.rmi.NoSuchObjectException: no such object in table".
Just adding some code to Joe's answer.
Extend RmiServiceExporter and get access to the exported object:
public class RmiServiceExporter extends org.springframework.remoting.rmi.RmiServiceExporter {
private Object remoteService;
private String remoteServiceName;
@Override
public Remote getObjectToExport() {
Remote exportedObject = super.getObjectToExport();
if (getService() instanceof Remote && (
getServiceInterface() == null || exportedObject.getClass().isAssignableFrom(getServiceInterface()))) {
this.remoteService = exportedObject;
}
else {
// RMI Invokers.
ProxyFactory factory = new ProxyFactory(getServiceInterface(),
new RmiServiceInterceptor((RmiInvocationHandler) exportedObject, remoteServiceName));
this.remoteService = factory.getProxy();
}
return exportedObject;
}
public Object getRemoteService() {
return remoteService;
}
/**
* Override to get access to the serviceName
*/
@Override
public void setServiceName(String serviceName) {
this.remoteServiceName = serviceName;
super.setServiceName(serviceName);
}
}
The interceptor used in the proxy (the remote service callback):
public class RmiServiceInterceptor extends RemoteInvocationBasedAccessor
implements MethodInterceptor, Serializable {
private RmiInvocationHandler invocationHandler;
private String serviceName;
public RmiServiceInterceptor(RmiInvocationHandler invocationHandler) {
this(invocationHandler, null);
}
public RmiServiceInterceptor(RmiInvocationHandler invocationHandler, String serviceName) {
this.invocationHandler = invocationHandler;
this.serviceName = serviceName;
}
/**
 * {@inheritDoc}
 */
public Object invoke(MethodInvocation invocation) throws Throwable {
try {
return invocationHandler.invoke(createRemoteInvocation(invocation));
}
catch (RemoteException ex) {
throw RmiClientInterceptorUtils.convertRmiAccessException(
invocation.getMethod(), ex, RmiClientInterceptorUtils.isConnectFailure(ex),
extractServiceUrl());
}
}
/**
 * Try to extract the service URL from invocationHandler.toString() for exception info
 * @return service URL
 */
private String extractServiceUrl() {
String toParse = invocationHandler.toString();
String url = "rmi://" + StringUtils.substringBefore(
StringUtils.substringAfter(toParse, "endpoint:["), "]");
if (serviceName != null)
url = StringUtils.substringBefore(url, ":") + "/" + serviceName;
return url;
}
}
When exporting the service with this RmiServiceExporter, we can send an RMI callback with:
someRemoteService.someRemoteMethod(rmiServiceExporter.getRemoteService());
What is a use case for using a dynamic proxy?
How do they relate to bytecode generation and reflection?
Any recommended reading?
I highly recommend this resource.
First of all, you must understand the proxy pattern's use case. Remember that the main intent of a proxy is to control access to the target object, rather than to enhance the target object's functionality. Access control covers synchronization, authentication, remote access (RPC), lazy instantiation (Hibernate, MyBatis), and AOP (transactions).
In contrast with a static proxy, a dynamic proxy has its class generated at runtime using Java reflection. With the dynamic approach you don't need to write the proxy class yourself, which is more convenient.
A dynamic proxy class is a class that implements a list of
interfaces specified at runtime such that a method invocation through
one of the interfaces on an instance of the class will be encoded and
dispatched to another object through a uniform interface. It can be
used to create a type-safe proxy object for a list of interfaces
without requiring pre-generation of the proxy class. Dynamic proxy
classes are useful to an application or library that needs to provide
type-safe reflective dispatch of invocations on objects that present
interface APIs.
Dynamic Proxy Classes
I just came up with an interesting use for a dynamic proxy.
We were having some trouble with a non-critical service that is coupled to another dependent service, and we wanted to explore ways of being fault-tolerant when that dependent service becomes unavailable.
So I wrote a LoadSheddingProxy that takes two delegates: one is the remote impl for the 'normal' service (after the JNDI lookup), the other is a 'dummy' load-shedding impl. Simple logic surrounding each method invocation catches timeouts and diverts to the dummy for a certain length of time before retrying. Here's how I use it:
// This is part of your ServiceLocator class
public static MyServiceInterface getMyService() throws Exception
{
MyServiceInterface loadShedder = new MyServiceInterface() {
public Thingy[] getThingys(Stuff[] whatever) throws Exception {
return new Thingy[0];
}
//... etc - basically a dummy version of your service goes here
};
Context ctx = JndiUtil.getJNDIContext(MY_CLUSTER);
try {
MyServiceInterface impl = ((MyServiceHome) PortableRemoteObject.narrow(
ctx.lookup(MyServiceHome.JNDI_NAME),
MyServiceHome.class)).create();
// Here's where the proxy comes in
return (MyServiceInterface) Proxy.newProxyInstance(
MyServiceHome.class.getClassLoader(),
new Class[] { MyServiceInterface.class },
new LoadSheddingProxy(MyServiceHome.JNDI_NAME, impl, loadShedder, 60000)); // retry after 60 seconds
} catch (RemoteException e) { // If we can't even look up the service we can fail by shedding load too
logger.warn("Shedding load");
return loadShedder;
} finally {
if (ctx != null) {
ctx.close();
}
}
}
And here's the proxy:
public class LoadSheddingProxy implements InvocationHandler {
static final Logger logger = ApplicationLogger.getLogger(LoadSheddingProxy.class);
Object primaryImpl, loadDumpingImpl;
long retry;
String serviceName;
// map is static because we may have many instances of a proxy around repeatedly looked-up remote objects
static final Map<String, Long> servicesLastTimedOut = new HashMap<String, Long>();
public LoadSheddingProxy(String serviceName, Object primaryImpl, Object loadDumpingImpl, long retry)
{
this.serviceName = serviceName;
this.primaryImpl = primaryImpl;
this.loadDumpingImpl = loadDumpingImpl;
this.retry = retry;
}
public Object invoke(Object obj, Method m, Object[] args) throws Throwable
{
try
{
if (!servicesLastTimedOut.containsKey(serviceName) || timeToRetry()) {
Object ret = m.invoke(primaryImpl, args);
servicesLastTimedOut.remove(serviceName);
return ret;
}
return m.invoke(loadDumpingImpl, args);
}
catch (InvocationTargetException e)
{
Throwable targetException = e.getTargetException();
// DETECT TIMEOUT HERE SOMEHOW - not sure this is the way to do it???
if (targetException instanceof RemoteException) {
servicesLastTimedOut.put(serviceName, Long.valueOf(System.currentTimeMillis()));
}
throw targetException;
}
}
private boolean timeToRetry() {
long lastFailedAt = servicesLastTimedOut.get(serviceName).longValue();
return (System.currentTimeMillis() - lastFailedAt) > retry;
}
}
The class java.lang.reflect.Proxy allows you to implement interfaces dynamically by handling method calls in an InvocationHandler. It is considered part of Java's reflection facility, but has nothing to do with bytecode generation.
Sun has a tutorial about the use of the Proxy class. Google helps, too.
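A minimal sketch (the interface and handler here are made up for illustration): every call through the proxied interface is routed to the InvocationHandler.
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

interface Greeter {
    String greet(String name);
}

public class ProxyDemo {
    public static void main(String[] args) {
        InvocationHandler handler = new InvocationHandler() {
            @Override
            public Object invoke(Object proxy, Method method, Object[] methodArgs) {
                // Every call through the interface lands here.
                System.out.println("Invoking " + method.getName());
                return "Hello, " + methodArgs[0];
            }
        };
        Greeter greeter = (Greeter) Proxy.newProxyInstance(
                Greeter.class.getClassLoader(),
                new Class<?>[] { Greeter.class },
                handler);
        System.out.println(greeter.greet("world"));
    }
}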
One use case is Hibernate: it gives you objects that implement your model classes' interfaces, but beneath the getters and setters sits database-related code. That is, you use them as if they were plain POJOs, while a lot is actually going on under the cover.
For example, you just call the getter of a lazily loaded property, and behind the scenes the property (possibly a whole large object graph) gets fetched from the database.
You should check out the cglib library for more info.
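A hedged sketch of the cglib approach (assuming cglib is on the classpath; the entity class is made up), roughly the shape of such a subclass proxy:
import java.lang.reflect.Method;

import net.sf.cglib.proxy.Enhancer;
import net.sf.cglib.proxy.MethodInterceptor;
import net.sf.cglib.proxy.MethodProxy;

public class LazyDemo {
    public static class Customer {
        public String getName() {
            return "from memory";
        }
    }

    public static void main(String[] args) {
        Enhancer enhancer = new Enhancer();
        enhancer.setSuperclass(Customer.class);
        enhancer.setCallback(new MethodInterceptor() {
            @Override
            public Object intercept(Object obj, Method method, Object[] methodArgs,
                                    MethodProxy proxy) throws Throwable {
                // This is where an ORM could fetch state from the database
                // before delegating to the real implementation.
                System.out.println("Intercepted " + method.getName());
                return proxy.invokeSuper(obj, methodArgs);
            }
        });
        Customer customer = (Customer) enhancer.create();
        System.out.println(customer.getName());
    }
}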