A method from an object implementing an interface is not getting executed without explicit casting.
The broader context is that I'm trying to decorate RestAssured DSL assertions so that they write annotations into a test report produced by an ad-hoc testing framework, so that code like:
then().assertThat().statusCode(200).body(matcher)
adds the corresponding annotations to the test report.
I had little success with the decorator pattern (I blame my limited development skills), so I'm trying a simpler approach: passing the TestReport object into the then call, which means touching the RestAssured codebase, so the syntax becomes:
then(report).assertThat().statusCode(200).body(matcher)
Down to the specific point where I'm puzzled, which is apparently broken polymorphism: I set a breakpoint in the concrete class (RestAssuredResponseImpl), confirmed that it is the actual type at runtime, and verified that the call does not work without casting, throwing a java.lang.AbstractMethodError from the very method that exists, io.restassured.internal.RestAssuredResponseImpl.then(TestCaseReport), which, of course, is not abstract.
The involved snippets are:
For the test definition:
@Test
public void test1() {
    Validatable validatable = given().
            header("Authorization", "Basic ASDFqwertyssW8=").
        when().
            get("https://some-site.org/login");
    validatable.then(report).
            assertThat().statusCode(200);
}
The report object is provided, and if I cast the validatable object, changing the line to:
((RestAssuredResponseImpl) validatable).then(report).
everything works fine; otherwise I get the aforementioned exception:
java.lang.AbstractMethodError: io.restassured.internal.RestAssuredResponseImpl.then(Report;)Lio/restassured/response/ValidatableResponseOptions;
Changes in RestAssured:
The interface Validatable<T extends ValidatableResponseOptions> adds:
T then(Report report);
class ReportingValidatableResponseImpl extends ValidatableResponseImpl:
    private TestCaseReport testCaseReport;

    ReportingValidatableResponseImpl(String contentType, ResponseParserRegistrar rpr, RestAssuredConfig config, Response response, ExtractableResponse<Response> extractableResponse, LogRepository logRepository, TestCaseReport testCaseReport) {
        super(contentType, rpr, config, response, extractableResponse, logRepository);
        this.testCaseReport = testCaseReport;
    }

    @Override
    ValidatableResponse statusCode(int expectedStatusCode) {
        testCaseReport.logMessage(LogStatus.INFO, "Asserting status code is " + expectedStatusCode)
        super.statusCode(expectedStatusCode)
    }
Implementation on RestAssuredResponseImpl adds:
ValidatableResponse then(TestCaseReport testCaseReport) {
    return new ReportingValidatableResponseImpl(contentType, rpr, config, this, this, logRepository, testCaseReport);
}
If this is not a blunder on my side and I'm not missing something obvious, I would say this breaks polymorphism and something very strange is happening (I don't know where or how, since JUnit and Groovy are involved, which implies heavy use of reflection).
That is the whole point of interfaces: if an object satisfies the interface at runtime, it should not matter which concrete class it actually is.
I'm really puzzled about this, hope someone can bring some insight.
Thank you in advance.
Related
I have a little knot in my brain about structuring our code. We have a REST backend based on Spring Boot. To handle security checks on requests we use HandlerInterceptors. In some specific cases we need a specific interceptor instead of our default one. The default one is registered by a 3rd-party lib so that no one can forget it. But I want all coders to have to think about this specific interceptor.
Right now I achieve this simply by telling them about it.
Here's my question: is there a way to create required (or mandatory) interfaces which must be implemented? This would be a way to ship our security code as a lib and still be sure that every coder has implemented our specific interface (even if the implementation does nothing).
pseudo code:
public interface ThinkForIt {
    SecBean specificSecBean;
    void methodToThinkOn();
}
public class SecImpl implements ThinkForIt {

    @Override
    public void methodToThinkOn() {
        // I thought about it, but I do not need to do anything!
    }
}
If the interface ThinkForIt had some annotation like @Required, coders would get a warning or error if they did not implement it...
Looking for a solution and thanks for your comments in advance!
Your overall design is questionable; you are reinventing security code, which is always a red flag. Use Spring Security instead.
However, there's a simple way to ensure that "some bean of type Foo" has been registered with the context:
@Component
@RequiredArgsConstructor
public class ContextConfigurationVerifier {
    final Foo required;
}
I want to pass a suitable object into the verify method, not just any().
Is there a way to do it?
I cannot simply copy the lambda and pass its result into verify; that doesn't work because lambdas cannot be tested directly.
My unit test, which is obviously not even close to testing anything:
@Test
public void testRunTrigger() {
    campaignTrigger.updateCampaignStatus();
    verify(jdbcTemplate).update(any(PreparedStatementCreator.class));
    assertEquals("UPDATE campaign SET state = 'FINISHED' WHERE state IN ('PAUSED','CREATED','RUNNING') AND campaign_end < ? ", campaignTrigger.UPDATE_CAMPAIGN_SQL);
}
And this is the class I'm testing:
@Component
@Slf4j
public class CampaignTrigger {

    final String UPDATE_CAMPAIGN_SQL = String.format("UPDATE campaign SET state = '%s' " +
            " WHERE state IN (%s) AND campaign_end < ? ", FINISHED,
            Stream.of(PAUSED, CREATED, RUNNING)
                    .map(CampaignState::name)
                    .collect(Collectors.joining("','", "'", "'")));

    @Autowired
    private JdbcTemplate jdbcTemplate;

    @Scheduled(cron = "${lotto.triggers.campaign}")
    @Timed
    void updateCampaignStatus() {
        jdbcTemplate.update(con -> {
            PreparedStatement callableStatement = con.prepareStatement(UPDATE_CAMPAIGN_SQL);
            callableStatement.setTimestamp(1, Timestamp.valueOf(LocalDateTime.now()));
            log.debug("Updating campaigns statuses.");
            return callableStatement;
        });
    }
}
Any advice, or theoretical grounding for why this is not the way to do it, would be highly appreciated.
You shouldn't mock code you don't control. Mock only code you have tests for, because when you mock a class you are assuming that you know (i.e. you define) how it works.
Here, you have no idea how jdbcTemplate works and whether calling it with some lambda actually does what you think it does.
Testing your code with code that you don't control is the point of integration tests. I.e. you should test your CampaignTrigger together with a real database (or in-memory one) and without mocking jdbcTemplate.
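A minimal sketch of that idea using Spring's embedded-database support; the H2 dependency and the schema.sql script that creates the campaign table are assumptions here:
EmbeddedDatabase db = new EmbeddedDatabaseBuilder()
        .setType(EmbeddedDatabaseType.H2)
        .addScript("schema.sql")   // assumed script that creates the campaign table
        .build();
JdbcTemplate jdbcTemplate = new JdbcTemplate(db);
// wire this jdbcTemplate into CampaignTrigger, call updateCampaignStatus(),
// then query the campaign table and assert on the resulting states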
You could try your luck with capturing the object that is used for that call, see here. That allows you to write code like this:
ArgumentCaptor<Person> argument = ArgumentCaptor.forClass(Person.class);
verify(mock).doSomething(argument.capture());
assertEquals("John", argument.getValue().getName());
This gives you full access to the object that was passed to your method call! And note that Mockito recently introduced a @Captor annotation that makes things even easier to use.
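Applied to the test above, a rough sketch could capture the PreparedStatementCreator and then actually execute it against a mocked Connection, so the statement-building code runs too; the connection/statement mocks are assumptions of this sketch:
@Test
public void testRunTrigger() throws Exception {
    Connection connection = mock(Connection.class);
    PreparedStatement statement = mock(PreparedStatement.class);
    when(connection.prepareStatement(anyString())).thenReturn(statement);

    campaignTrigger.updateCampaignStatus();

    ArgumentCaptor<PreparedStatementCreator> captor =
            ArgumentCaptor.forClass(PreparedStatementCreator.class);
    verify(jdbcTemplate).update(captor.capture());

    // executing the captured lambda exercises the SQL text and the parameter binding
    captor.getValue().createPreparedStatement(connection);
    verify(connection).prepareStatement(campaignTrigger.UPDATE_CAMPAIGN_SQL);
    verify(statement).setTimestamp(eq(1), any(Timestamp.class));
}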
Edit, given the comments by @Morfic: what he states is absolutely reasonable.
This answer gives the "immediate" hint for how you could solve that specific problem.
Beyond that: the reasonable approach is always, always, always to slice the "unit under test" ... to be as small as possible!
Your class/method(s) should serve exactly one responsibility; and then you make sure that the implementation can be tested with the simplest means possible.
So: if the question is: "should I use argument captors or should I better rework my production code" - then rework your production code.
I have a class which takes enum values like Male, Female in a @POST. When I send a wrong value like 'male' instead of 'Male' it shows me a 400 Bad Request with this message in the REST client: Can not construct instance of constants.Constants$GenderEnum from String value 'male': value not one of declared Enum instance names
at [Source: org.apache.catalina.connector.CoyoteInputStream@718a453d; line: 7, column: 23] (through reference chain: valueobjects.ConsumerValueObject["gender"])
My REST endpoint looks like this:
@Consumes("application/json")
@Produces("application/json")
@POST
public Response addConsumer(ConsumerValueObject consumerVO)
Here ConsumerValueObject holds the enum.
How can I suppress that error message in the REST client? I tried an ExceptionMapper but it did not help! I need to suppress the message due to security concerns.
This is the Jackson response from either JsonParseExceptionMapper or JsonMappingExceptionMapper. These classes come with the dependency
<dependency>
<groupId>com.fasterxml.jackson.jaxrs</groupId>
<artifactId>jackson-jaxrs-json-provider</artifactId>
<version>${2.x.version}</version>
</dependency>
Whether you have this explicit dependency or you have resteasy-jackson2-provider (which uses the above under the hood), most likely the mappers are registered implicitly through classpath scanning, for instance if you have an empty Application class:
@ApplicationPath("/")
public class ResteasyApplication extends Application {}
This will cause discovery/registration through classpath scanning. If you don't have either of those dependencies, and you are in Wildfly, I am not exactly sure how they are registered, but that is what's happening.
You could write/register your own ExceptionMappers for the JsonParseException and JsonMappingException
@Provider
public class JsonMappingExceptionMapper
        implements ExceptionMapper<JsonMappingException> {

    @Override
    public Response toResponse(JsonMappingException e) {
        return Response.status(Response.Status.BAD_REQUEST).build();
    }
}
but from what I have tested, it's a tossup as to which one will be registered, yours or Jackson's. The mappers are put into a Set (so unordered), then pushed into a Map, so only one gets pushed in. The order in which they are pushed in, like I said, is a tossup.
I guess this is really only a partial answer, as I have not been able to find a solution that is guaranteed to use your mapper, aside from registering all your classes explicitly (ultimately disabling the classpath scanning), but that is a hassle.
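For completeness, a sketch of that explicit-registration option (resource class names are assumptions); once getClasses() returns a non-empty set, the implicit classpath scanning no longer decides which mapper wins:
@ApplicationPath("/")
public class ResteasyApplication extends Application {

    @Override
    public Set<Class<?>> getClasses() {
        Set<Class<?>> classes = new HashSet<>();
        classes.add(ConsumerResource.class);           // your resource classes (assumed name)
        classes.add(JsonMappingExceptionMapper.class); // your own mapper
        return classes;
    }
}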
But now the fight has been narrowed down; I will try again some more if I get a chance later.
UPDATE
So this is not a solution, just a semi-proof-of-concept to show how we can get it to use our ExceptionMapper.
import org.jboss.resteasy.spi.ResteasyProviderFactory;
import com.fasterxml.jackson.databind.JsonMappingException;
import com.my.pkg.JsonMappingExceptionMapper;
#Path("/init")
public class InitResource {
#GET
public Response init() {
ResteasyProviderFactory factory = ResteasyProviderFactory.getInstance();
factory.getExceptionMappers().put(JsonMappingException.class,
new JsonMappingExceptionMapper());
return Response.ok("Done!").build();
}
}
Once we hit the init endpoint for the first time, our JsonMappingExceptionMapper will be registered and will override the existing one, whether that is Jackson's or ours.
Of course we would not want to do this for real, it's just showing how to override the mapper. The thing I can't figure out is where to put this code. I've tried a ServletContextListener, in the Application constructor, in a Feature with a low priority. I can't figure it out. None of the above occur before RESTeasy does its final registration.
Do you really want to suppress the error message, or do you want to fix the actual problem?
You can actually catch all thrown exceptions with a custom exception mapper like:
@Provider
public class CustomExceptionMapper implements ExceptionMapper<Throwable> {

    @Override
    public Response toResponse(Throwable t) {
        return Response.ok().build();
    }
}
though, this will handle all caught exceptions and return a 200 OK, which tricks clients into thinking that the request actually succeeded - which was not the case! Instead of Throwable you should be able to catch the concrete exception (even if it is a RuntimeException) as well - maybe you have not declared it as a provider or did not specify the correct exception class?
Though, as already mentioned, returning a success status code for an exception is generally bad practice and should be avoided. Fixing the actual problem is probably more suitable in this case.
JAX-RS provides the MessageBodyReader and MessageBodyWriter interfaces, which you can implement to unmarshal an input stream into an object, or to marshal an object back to the output stream. The official documentation on MessageBodyReader has more detailed information in that regard.
One implementation therefore could be the following steps:
Read the input stream into e.g. a String
Replace all "male" or "female" tokens with their upper-case version
Parse the string into a JSON representation (using e.g. org.json.JSONObject)
Use ObjectMapper to convert the JSON representation to a Java object
Return the mapped object
This works if the input failure is just a simple upper/lower-case issue. If there are typos or semantic alternatives that are not yet in your enum, you need to put in a bit more effort.
If you, however, fail to create a proper object representation, you should return a user-failure (something in the 400 range) to the client to inform the client that something went wrong.
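As a rough sketch of those steps (simplified to skip the intermediate org.json step, imports omitted like the other snippets here; the reader class name is made up and the case fix-up is deliberately naive):
@Provider
@Consumes("application/json")
public class GenderNormalizingReader implements MessageBodyReader<ConsumerValueObject> {

    private final ObjectMapper mapper = new ObjectMapper();

    @Override
    public boolean isReadable(Class<?> type, Type genericType,
                              Annotation[] annotations, MediaType mediaType) {
        return ConsumerValueObject.class.isAssignableFrom(type);
    }

    @Override
    public ConsumerValueObject readFrom(Class<ConsumerValueObject> type, Type genericType,
                                        Annotation[] annotations, MediaType mediaType,
                                        MultivaluedMap<String, String> httpHeaders,
                                        InputStream entityStream) throws IOException {
        // 1. read the input stream into a String
        String json = new BufferedReader(new InputStreamReader(entityStream, StandardCharsets.UTF_8))
                .lines().collect(Collectors.joining("\n"));
        // 2. normalize the enum tokens before Jackson sees them
        json = json.replace("\"male\"", "\"Male\"").replace("\"female\"", "\"Female\"");
        // 3. let ObjectMapper build the value object from the corrected JSON
        return mapper.readValue(json, ConsumerValueObject.class);
    }
}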
I have a class which takes a message with payload String.
The payload is then split and used to create an Entity which is passed to DAOInterface to persist.
How can you test that the call daoInterface.insert(entity) has been made?
Mocking the DAOInterface and then verifying the call to the DAO requires the entity in the test class, i.e.
verify(daoInterface).insert(entity);
Is this bad design, i.e. creating the entity at this stage? Should the String[] split be passed to the DAO implementation and the entity be created there? Example problem:
public class ServiceClass {

    @Autowired
    private DAOInterface daoInterface;

    public void serviceMessage(Message<String> message) {
        MessageHeaders mh = new MessageHeaders(message.getHeaders());
        String[] split = ((String) mh.get("payload")).split("_");
        // code omitted
        // ...
        String type = (String) mh.get("WhatType");
        Entity entity = new Entity(split[0], split[1], split[2]);
        if (type.equals("one")) {
            daoInterface.insert(entity); // How to test?
        } else {
            if (type.equals("two")) {
                daoInterface.modify(entity); // How to test?
            }
        }
    }
}
You can verify with Mockito Matchers.
If you only care that the method is called with some Entity, you can verify that with
verify(daoInterface).insert(any(Entity.class));
If you care about which Entity, and the Entity class has an equals method, you can make an entity that should be equal to the one created and verify with
verify(daoInterface).insert(eq(expectedEntity));
If it's more complex than either of these cases, you can also write your own argument matchers.
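For example, a sketch assuming Mockito 2+ (where ArgumentMatcher is a functional interface) and a hypothetical getName() accessor on Entity:
verify(daoInterface).insert(argThat(entity -> "expectedName".equals(entity.getName())));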
The easiest thing you can do is to inject another collaborator into the service which transforms the payload into an Entity. This way you keep control over object creation (Inversion of Control). Something like the example below, injected into the ServiceClass, should do the job:
interface PayloadTransformer {
    public Entity transform(String payload);
}
This way your code will be easy to test and you split responsibilities, which is usually a good thing. Have a look at the Single Responsibility Principle.
Pushing transformation logic down to dao is almost never a good idea.
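A rough sketch of how the test could then look; the two-argument ServiceClass constructor and the messageWithPayload(...) helper are assumptions made up for this example:
@Test
public void insertsTransformedEntityForTypeOne() {
    PayloadTransformer transformer = mock(PayloadTransformer.class);
    DAOInterface dao = mock(DAOInterface.class);
    Entity expected = new Entity("a", "b", "c");
    when(transformer.transform("a_b_c")).thenReturn(expected);

    ServiceClass service = new ServiceClass(dao, transformer);
    service.serviceMessage(messageWithPayload("a_b_c", "one"));

    // the mocked transformer returns the same instance, so no matcher gymnastics are needed
    verify(dao).insert(expected);
}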
BTW, you can write else if without additional braces and indentation. It's more readable, like:
if (a) {
    // do something
} else if (b) {
    // do something
} else {
    // do something
}
One last piece of advice: ServiceClass is a really poor name for a class; the word Class is redundant there. Just name it Service, EntityService, MessageService or something that fits your case well.
I also wouldn't name the field with an *Interface suffix. Underneath, some implementation is injected, I assume. Better to name it entityDao or just dao. It's up to you though :)
If you use a test framework like PowerMock, you can invoke private constructors and private methods in your test. This makes it easy to inject mock objects like a mock DAOInterface so you can retrieve it later and test it's been called.
For example, in PowerMock, to call a private constructor:
public class ServiceClass {

    @Autowired
    private DAOInterface dao;

    public ServiceClass() {
    }

    private ServiceClass(DAOInterface dao) {
        this.dao = dao;
    }
}
You simply do:
ServiceClass instance = Whitebox.invokeConstructor(
        ServiceClass.class,
        new MockDAOInterface());
So if you're using a dependency injection framework like the above, this dovetails nicely. You don't normally have dependency injection working during a test, since it usually requires booting a large chunk of code with a lot of configuration.
By adding a single private constructor, you avoid breaking encapsulation, but you can still inject your mock object into the code during test with a test framework like PowerMock. This is considered best practice.
You could break encapsulation and add publicly accessible methods or constructors to the ServiceClass, but if you don't need them for your design it's not good practice to add them only for tests. That's why people put such effort into bypassing encapsulation in frameworks like Mockito and PowerMock. It's not just a dodge around private code; it's because you want to keep the encapsulation while still being able to test.
EDIT:
If you're not familiar with making mock objects, you should do some Google searches on the subject. It's very common and a good skill to have. With the above code, you could make your own mock object. But making mocks is so common that most test frameworks will do this for you.
For example, in PowerMock, I just looked at their page on making mocks here. You can make a mock like this
DAOInterface myMock = createMock(DAOInterface.class);
You can then ask the mock to verify that methods are called:
expect(myMock.someMethod());
Now the mock 'expects' that method to be called, and if it isn't, it'll generate an error for your test. Pretty sweet actually.
You can also return values from a call:
expect(myMock.insert()).andReturn("Test succeeded");
so your code would then see the value "Test succeeded" when it called that method. I don't think 'insert' actually returns a value; that's just an example.
How can I change what a method does in Java?
I mean, I am trying to use annotations to make the following code
@Anno1(Argument = "Option1")
public class TestClass
{
    @Anno2
    public void test()
    {
    }
}
Into
public class TestClass
{
    private static StaticReference z;

    public void test()
    {
        z.invokeToAll();
    }
}
This is a very simplified example of what I am trying to do. Anno1 will have many possible combinations, but this is not my problem so far. My problem is how to add code to method test()
I am looking for a more generic solution if possible, e.g. a way to add any kind of code to the method (not just a way to call .invokeToAll()).
So far I am using import javax.annotation.processing.*; and I have the following code, but I don't know how to go on from there:
private void processMethodAnnotations(RoundEnvironment env)
{
    for (Element e : env.getElementsAnnotatedWith(Anno2.class))
    {
        // If it is a valid annotation over a method
        if (e.getKind() == ElementKind.METHOD)
        {
            // What to do here :S
        }
        else
        {
            processingEnv.getMessager().printMessage(Diagnostic.Kind.WARNING, "Not a method!", e);
        }
    }
}
I have found something about Java Reflection but I have not found any source to help me with what I am doing.
Obviously I extend AbstractProcessor in my code.
I have found this tutorial (http://www.zdnetasia.com/writing-and-processing-custom-annotations-part-3-39362483.htm), but it concerns creating a new class, not changing an existing method, and the javax.lang.model elements do not provide any way of editing an element (which in my case represents a method).
I hope my question is clear and inline with the rules. If not please comment and I will clarify. Thanks.
Annotation processing is the wrong way to go for you, from Wikipedia:
When Java source code is compiled, annotations can be processed by compiler plug-ins called annotation processors. Processors can produce informational messages or create additional Java source files or resources, which in turn may be compiled and processed, but annotation processors cannot modify the annotated code itself.
People have suggested the right way to you: AOP. Specifically, you can use AspectJ. The "quick result" way is (if you use Eclipse):
Install AJDT (AspectJ Development Tools)
Create an AspectJ project and add your classes and annotations there
Create an aspect:
public aspect Processor {
    private StaticReference z;

    pointcut generic()
        // intercept execution of the method named test, annotated with @Anno2,
        // from any class type annotated with @Anno1
        : execution(@Anno2 * (@Anno1 *).test())
        // method takes no arguments
        && args ();

    // here you have written what you want the method to actually do
    void around () : generic() {
        z.invokeToAll();
    }
}
Now you can execute a test and you will see that it works ;) AJDT compiles the code for you automatically, so there is no manual work to do; I hope that's what you called "magic" ;)
UPDATE:
If the code in your test() method depends on the Anno1 annotation value, then inside the aspect you can get the class annotation for the instance it is executed on, like this:
void around () : generic() {
    Annotation[] classAnnotations = thisJoinPoint.getThis().getClass().getAnnotations();
    String argumentValue = null;
    for (Annotation annotation : classAnnotations) {
        if (annotation instanceof Anno1) {
            argumentValue = ((Anno1) annotation).Argument();
            break;
        }
    }
    if (argumentValue != null && argumentValue.equals("Option1")) {
        z.invokeToAll();
    }
}
where thisJoinPoint is a special reference variable.
UPDATE2:
If you want to add System.out.println(this) in your aspect, you need to write System.out.println(thisJoinPoint.getThis()) there; just tested it, and it works. thisJoinPoint.getThis() returns you "this", but not exactly: in fact it is an Object variable, and if you want to get any property you need either to cast or to use reflection. And thisJoinPoint.getThis() does not provide access to private properties.
Well, it now seems that your question is answered, but if I missed anything, or you get additional questions/problems with this approach, feel free to ask ;)
It's perfectly possible to do what you ask, although there is a caveat: relying on private compiler APIs. Sounds scary, but it isn't really (compiler implementations tend to be stable).
There's a paper that explains the procedure: The Hacker's Guide to Javac.
Notably, this is used by Project Lombok to provide automatic getter/setter generation (amongst other things). The following article explains how it does it, basically reiterating what is said in the aforementioned paper.
Well, you might see if the following boilerplate code will be useful:
public void magic(Object bean, String[] args) throws Exception {
    for (Method method : bean.getClass().getDeclaredMethods()) {
        if (method.isAnnotationPresent(Anno2.class)) {
            // Invoke the original method
            method.invoke(bean, args);
            // Invoke your 'z' method
            StaticReference.invokeAll();
        }
    }
}
As an alternative you might employ aspect-oriented programming, for instance with the AspectJ project.
I'm not sure at all whether it is even possible to change the source or byte code via annotations. From what you're describing, it looks as if aspect-oriented programming could provide a solution to your problem.
Your annotations are pretty similar to the pointcut concept (they mark a location where code needs to be inserted) and the inserted code is close to the advice concept.
Another approach would be to parse the Java source file into an abstract syntax tree, modify this AST, and serialize it back as input for the Java compiler.
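A small sketch of that idea using the third-party JavaParser library (just one possible choice; the file path is made up):
CompilationUnit cu = StaticJavaParser.parse(new File("src/main/java/TestClass.java"));
cu.findAll(MethodDeclaration.class).stream()
        .filter(m -> m.isAnnotationPresent("Anno2"))
        .forEach(m -> m.getBody().ifPresent(
                body -> body.addStatement(StaticJavaParser.parseStatement("z.invokeToAll();"))));
// write cu.toString() back out and feed that to the compiler instead of the original source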
If your class implements a suitable interface, you could wrap it in a dynamic proxy which delegates all calls to the original methods, except the call to test.
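If that fits, a minimal sketch with java.lang.reflect.Proxy, assuming a hypothetical Testable interface that declares test() and some accessible replacement logic z:
Testable original = new TestClass();
Testable proxied = (Testable) Proxy.newProxyInstance(
        Testable.class.getClassLoader(),
        new Class<?>[] { Testable.class },
        (proxy, method, args) -> {
            if ("test".equals(method.getName())) {
                z.invokeToAll();                  // replacement behaviour
                return null;
            }
            return method.invoke(original, args); // delegate everything else
        });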