Javadoc is great for scanning all of your source files and creating HTML pages to view them. I was wondering if there is a similar tool that would go through all of your Spring controllers, collect all of the methods annotated with @RequestMapping, and produce a single HTML page listing them. Sort of like a pseudo site map for developers, to ensure uniqueness and standardization across controllers.
I apologize if this question has been asked elsewhere already. I could not come up with an appropriate set of search terms that would provide a useful result.
This is a very good question; I often miss (and implement) functionality like this.
Use a Build Tool
What I'd do is run Maven (or Ant) and execute a task after compilation (see the sketch after this list) that
reads all classes (perhaps with a configurable list of packages)
iterates over all methods of these classes
reads the annotations
and writes the output to HTML
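Here's a minimal sketch of such a post-compile task, written as a plain main() that you could hook into the build with Maven's exec plugin or an Ant <java> task. The controller class names passed as program arguments and the output file name are assumptions for illustration only; a real task would scan the configured packages instead.

import java.io.IOException;
import java.lang.reflect.Method;
import java.nio.file.Files;
import java.nio.file.Paths;

import org.springframework.web.bind.annotation.RequestMapping;

public class RequestMappingReport {

    public static void main(String[] args) throws ClassNotFoundException, IOException {
        StringBuilder html = new StringBuilder("<html><body><ul>\n");
        for (String className : args) {
            Class<?> controllerClass = Class.forName(className);
            for (Method method : controllerClass.getDeclaredMethods()) {
                RequestMapping mapping = method.getAnnotation(RequestMapping.class);
                if (mapping == null) {
                    continue;
                }
                // one list entry per mapped path
                for (String path : mapping.value()) {
                    html.append("<li>").append(path)
                        .append(" -> ").append(className)
                        .append("#").append(method.getName())
                        .append("</li>\n");
                }
            }
        }
        html.append("</ul></body></html>\n");
        Files.write(Paths.get("requestMappings.html"), html.toString().getBytes());
    }
}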
Use Annotation Processing
But I guess this is a scenario where annotation processing might also be a way to do it. Usually you have to use some internal APIs to get things done in annotation processing, but using Filer.createResource(...) it should actually be possible to do it out of the box.
Here's a rudimentary implementation:
import java.io.IOException;
import java.io.Writer;
import java.util.Collections;
import java.util.Map;
import java.util.Map.Entry;
import java.util.Set;
import java.util.TreeMap;

import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.Filer;
import javax.annotation.processing.ProcessingEnvironment;
import javax.annotation.processing.RoundEnvironment;
import javax.lang.model.element.Element;
import javax.lang.model.element.TypeElement;
import javax.tools.FileObject;
import javax.tools.StandardLocation;

import org.springframework.web.bind.annotation.RequestMapping;

public class RequestMappingProcessor extends AbstractProcessor {

    private final Map<String, String> map = new TreeMap<String, String>();
    private Filer filer;

    @Override
    public Set<String> getSupportedAnnotationTypes() {
        return Collections.singleton(RequestMapping.class.getName());
    }

    @Override
    public synchronized void init(final ProcessingEnvironment processingEnv) {
        super.init(processingEnv);
        filer = processingEnv.getFiler();
    }

    @Override
    public boolean process(final Set<? extends TypeElement> annotations,
            final RoundEnvironment roundEnv) {
        for (final TypeElement annotation : annotations) {
            // iterate over the classes and methods that carry @RequestMapping
            for (final Element annotatedElement
                    : roundEnv.getElementsAnnotatedWith(annotation)) {
                final RequestMapping mapping =
                        annotatedElement.getAnnotation(RequestMapping.class);
                if (mapping != null) {
                    addMapping(mapping, annotatedElement);
                }
            }
        }
        assembleSiteMap();
        return false;
    }

    private void assembleSiteMap() {
        Writer writer = null;
        boolean threw = false;
        try {
            final FileObject fileObject = filer.createResource(
                    StandardLocation.CLASS_OUTPUT, "html", "siteMap.html");
            writer = fileObject.openWriter();
            writer.append("<body>\n");
            for (final Entry<String, String> entry : map.entrySet()) {
                writer.append("<a href=\"")
                      .append(entry.getKey())
                      .append("\">")
                      .append("Path: ")
                      .append(entry.getKey())
                      .append(", method: ")
                      .append(entry.getValue())
                      .append("</a>\n");
            }
            writer.append("</body>\n");
        } catch (final IOException e) {
            threw = true;
            throw new IllegalStateException(e);
        } finally {
            // with commons-io: IOUtils.closeQuietly(writer)
            // with Guava: Closeables.close(writer, threw)
            // with plain Java this monstrosity:
            try {
                if (writer != null) {
                    writer.close();
                }
            } catch (final IOException e) {
                if (!threw) {
                    throw new IllegalStateException(e);
                }
            }
        }
    }

    private void addMapping(final RequestMapping mapping,
            final Element annotatedElement) {
        for (final String value : mapping.value()) {
            // key: the mapped path, value: the annotated class or method
            map.put(value, annotatedElement.toString());
        }
    }
}
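For the compiler to actually pick the processor up, it still has to be registered as an annotation processor. The standard mechanism is a service file (the package name below is just an example):

# src/main/resources/META-INF/services/javax.annotation.processing.Processor
com.example.RequestMappingProcessor

Alternatively, pass it explicitly to javac with -processor com.example.RequestMappingProcessor.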
There's nothing that I know of that would do that. I've retrieved controllers and mappings via the app context before to create navigation, but it was a lot of work for little gain IMO:
@Component
public class SiteMap implements ApplicationContextAware, InitializingBean {

    private ApplicationContext context;
    private List<Page> pages = new ArrayList<Page>();

    public List<Page> getPages() {
        return pages;
    }

    public void setApplicationContext(ApplicationContext applicationContext) {
        this.context = applicationContext;
    }

    public void afterPropertiesSet() throws Exception {
        Assert.notNull(context, "applicationContext not set");
        Map<String, Object> controllers = context.getBeansWithAnnotation(Controller.class);
        for (Map.Entry<String, Object> entry : controllers.entrySet()) {
            Page page = new Page();
            Class<?> controllerClass = entry.getValue().getClass();
            String controllerRoot = null;
            RequestMapping classMapping = controllerClass.getAnnotation(RequestMapping.class);
            if (classMapping != null && classMapping.value().length > 0)
                controllerRoot = classMapping.value()[0];
            if (controllerRoot == null)
                controllerRoot = controllerClass.getSimpleName(); // or parse the controller name
            page.setPath(controllerRoot);
            for (Method m : controllerClass.getDeclaredMethods()) {
                RequestMapping rm = m.getAnnotation(RequestMapping.class);
                if (rm == null)
                    continue;
                Page child = new Page();
                if (rm.value().length > 0)
                    child.setPath(rm.value()[0]);
                page.getChildren().add(child);
            }
            pages.add(page);
        }
    }

    public static class Page {
        private String path;
        private List<Page> children = new ArrayList<Page>();
        // getters, setters, other junk
    }
}
Then access ${pages} in your site map JSP. You might need to play with that code some if you do something similar; I freehanded it in this editor.
Problem statement:
I have to process a request like a pipeline.
For example:
When a request comes in, it has to undergo a sequence of operations, like (step1, step2, step3, ...).
So, in order to achieve that, I am using the Template Method design pattern.
Please review and suggest whether I am implementing this correctly, or whether there is a better solution.
I suspect my approach introduces code smells, as I am changing the values of objects very frequently.
Also, please suggest whether and how I can use Java 8 to accomplish this.
Thanks.
Code:
package com.example.demo.design;

import java.util.List;

import org.springframework.beans.factory.annotation.Autowired;

public abstract class Template {

    @Autowired
    private Step1 step1;
    @Autowired
    private Step2 step2;
    @Autowired
    private Save save;

    List<String> stepOutput = null;
    List<String> stepOutputTwo = null;
    List<String> stepOutputThree = null;

    public void step1(String action1) {
        stepOutput = step1.method(action1);
    }

    public void step2(String action2) {
        stepOutputTwo = step2.method(stepOutput, action2);
    }

    abstract public void step3();

    public void save() {
        save.persist(stepOutputThree);
    }

    final public void run(String action1, String action2) {
        step1(action1);
        step2(action2);
        step3();
    }
}
In the Java 8 streams model, that could look like the following:
final public void run(String action1, String action2) {
    Stream.of(action1)                        // Stream<String>
          .map(s -> step1.method(s))          // Stream<List<String>>
          .map(l -> step2.method(l, action2)) // Stream<List<String>>
          .map(l -> step3.method(l))          // Stream<List<String>>
          .forEach(l -> save.persist(l));
}
I had the same issue! You can do something like this; the uncheckCall method is for handling checked exceptions.
final public void run(String action1, String action2) {
    // other stuffs
    Stream.of(step1.method(action1))
          .map(stepOutput -> uncheckCall(() -> step2.method(stepOutput, action2)))
          .forEach(stepOutputThree -> uncheckCall(() -> save.persist(stepOutputThree)));
    // .....
}
And the uncheckCall method:
public static <T> T uncheckCall(Callable<T> callable) {
    try {
        return callable.call();
    } catch (RuntimeException e) {
        throw e;                       // or: throw BusinessException.wrap(e);
    } catch (Exception e) {
        throw new RuntimeException(e); // or: throw BusinessException.wrap(e);
    }
}
Well, when there are "pipelines", "sequences of operations", etc., the first design pattern that comes to mind is Chain of Responsibility. It provides you with these benefits:
allows you to add new handlers when necessary (e.g. at runtime) without modifying other handlers and processing logic (Open/Closed Principle of SOLID)
allows a handler to stop processing a request if necessary
allows you to decouple processing logic of the handlers from each other (Single Responsibility Principle of SOLID)
allows you to define the order of the handlers to process a request outside of the handlers themselves
One example of real-world usage is servlet filters, where you call doFilter(ServletRequest, ServletResponse, FilterChain) to invoke the next handler:
protected void doFilter(HttpServletRequest req, HttpServletResponse resp, FilterChain chain) {
    if (haveToInvokeNextHandler) {
        chain.doFilter(req, resp);
    }
}
In case of using the classical Chain of Responsibility pattern, your processing pipeline may look like the following:
API
public class StepContext {
    private Map<String, Object> attributes = new HashMap<>();

    public <T> T getAttribute(String name) {
        return (T) attributes.get(name);
    }

    public void setAttribute(String name, Object value) {
        attributes.put(name, value);
    }
}
public interface Step {
    void handle(StepContext ctx);
}

public abstract class AbstractStep implements Step {
    private Step next;

    public AbstractStep() {
    }

    public AbstractStep(Step next) {
        this.next = next;
    }

    protected void next(StepContext ctx) {
        if (next != null) {
            next.handle(ctx);
        }
    }
}
Implementation
public class Step1 extends AbstractStep {
    public Step1(Step next) {
        super(next);
    }

    public void handle(StepContext ctx) {
        String action1 = ctx.getAttribute("action1");
        List<String> output1 = doSomething(action1);
        ctx.setAttribute("output1", output1);
        next(ctx); // invoke next step
    }
}

public class Step2 extends AbstractStep {
    public Step2(Step next) {
        super(next);
    }

    public void handle(StepContext ctx) {
        String action2 = ctx.getAttribute("action2");
        List<String> output1 = ctx.getAttribute("output1");
        List<String> output2 = doSomething(output1, action2);
        ctx.setAttribute("output2", output2);
        next(ctx); // invoke next step
    }
}

public class Step3 extends AbstractStep {
    public Step3(Step next) {
        super(next);
    }

    public void handle(StepContext ctx) {
        String action2 = ctx.getAttribute("action2");
        List<String> output2 = ctx.getAttribute("output2");
        persist(output2);
        next(ctx); // invoke next step
    }
}
Client code
Step step3 = new Step3(null);
Step step2 = new Step2(step3);
Step step1 = new Step1(step2);
StepContext ctx = new StepContext();
ctx.setAttribute("action1", action1);
ctx.setAttribute("action2", action2);
step1.handle(ctx);
Also, all of this can be simplified into a chain of handlers decoupled from each other by removing the next references, in case your processing pipeline always has to invoke all the available steps without letting the previous step control whether the next one runs:
API
public class StepContext {
    private Map<String, Object> attributes = new HashMap<>();

    public <T> T getAttribute(String name) {
        return (T) attributes.get(name);
    }

    public void setAttribute(String name, Object value) {
        attributes.put(name, value);
    }
}

public interface Step {
    void handle(StepContext ctx);
}
Implementation
public class Step1 implements Step {
    public void handle(StepContext ctx) {
        String action1 = ctx.getAttribute("action1");
        List<String> output1 = doSomething(action1);
        ctx.setAttribute("output1", output1);
    }
}

public class Step2 implements Step {
    public void handle(StepContext ctx) {
        String action2 = ctx.getAttribute("action2");
        List<String> output1 = ctx.getAttribute("output1");
        List<String> output2 = doSomething(output1, action2);
        ctx.setAttribute("output2", output2);
    }
}

public class Step3 implements Step {
    public void handle(StepContext ctx) {
        List<String> output2 = ctx.getAttribute("output2");
        persist(output2);
    }
}
Client code
Note that in case of the Spring framework (I just noticed the @Autowired annotation), the client code may be simplified even more, as the @Autowired annotation can be used to inject all the beans of the corresponding type into a collection.
Here is what the documentation states:
Autowiring Arrays, Collections, and Maps
In case of an array, Collection, or Map dependency type, the container autowires all beans matching the declared value type. For such purposes, the map keys must be declared as type String which will be resolved to the corresponding bean names. Such a container-provided collection will be ordered, taking into account Ordered and @Order values of the target components, otherwise following their registration order in the container. Alternatively, a single matching target bean may also be a generally typed Collection or Map itself, getting injected as such.
public class StepsInvoker {

    // Spring will put all the steps into this collection in the order they were declared
    // within the Spring context (or by means of the @Order annotation)
    @Autowired
    private List<Step> steps;

    public void invoke(String action1, String action2) {
        StepContext ctx = new StepContext();
        ctx.setAttribute("action1", action1);
        ctx.setAttribute("action2", action2);
        steps.forEach(step -> step.handle(ctx));
    }
}
I'm trying to map an entity to a DTO using ModelMapper. The problem comes when a @JoinColumn is not loaded (lazy load). ModelMapper tries to access the lazily loaded entity's properties and then a LazyInitializationException is thrown.
I already have a strategy to solve that, but I could not find a single ModelMapper feature which does what I need.
Here is what I need to do:
For each entity that is not loaded, I'll create a new target object using my factory. If the object is loaded, then the default mapping must be applied.
The following example is a ModelMapper feature that would fit my needs exactly if it weren't for the fact that it does not provide the source (it provides only the source type):
public static class MyConverter implements ConditionalConverter<Object, Object> {
    private EntityManager em;

    public MyConverter(EntityManager em) {
        this.em = em;
    }

    @Override
    public MatchResult match(Class<?> sourceType, Class<?> destinationType) {
        Object source = null; // I need the source instead of its type.
        PersistenceUnitUtil persistenceUnitUtil = em.getEntityManagerFactory().getPersistenceUnitUtil();
        return persistenceUnitUtil.isLoaded(source) ? MatchResult.NONE : MatchResult.FULL;
    }

    @Override
    public Object convert(MappingContext<Object, Object> context) {
        return LazyEntityProxyFactory.factory(context.getSource(), context.getDestinationType()); // Creates the target object
    }
}
Do you guys know of any ModelMapper feature which provides what I need? Or maybe a hack?
Note: I've looked into ModelMapper's code and noticed that when ConditionalConverter.match is called, the context already exists and therefore possesses the source. What if ModelMapper also had a ConditionalContextConverter interface which passes the context to the match method? Just an idea.
I just found what I needed! The secret is to check the properties from the parent entity. After that I was able to leverage the default mapping and also use my own factory when needed.
Here's my ConditionalConverter:
public static class MyConverter implements ConditionalConverter<Object, Object> {
    private EntityManager em;

    public MyConverter(EntityManager em) {
        this.em = em;
    }

    @Override
    public MatchResult match(Class<?> sourceType, Class<?> destinationType) {
        return MatchResult.FULL;
    }

    @Override
    public Object convert(MappingContext<Object, Object> context) {
        Object source = context.getSource();
        Object destination = context.getMappingEngine().createDestination(context);
        try {
            Field[] sourceFields = context.getSourceType().getDeclaredFields();
            Field[] destinationFields = context.getDestinationType().getDeclaredFields();
            for (Field sourceField : sourceFields) {
                sourceField.setAccessible(true);
                for (Field destinationField : destinationFields) {
                    destinationField.setAccessible(true);
                    if (sourceField.getName().equals(destinationField.getName())) {
                        Object sourceFieldValue = sourceField.get(source);
                        PersistenceUnitUtil persistenceUnitUtil = em.getEntityManagerFactory().getPersistenceUnitUtil();
                        if (persistenceUnitUtil.isLoaded(sourceFieldValue)) {
                            MappingContext<?, ?> myContext = context.create(sourceFieldValue, destinationField.getType());
                            Object destinationValue = context.getMappingEngine().map(myContext);
                            destinationField.set(destination, destinationValue);
                        } else {
                            // Here is your factory call;
                            destinationField.set(destination, SomeFactory.factory(sourceFieldValue, destinationField.getType()));
                        }
                        break;
                    }
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        return destination;
    }
}
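To actually use it, the converter presumably still has to be registered with your ModelMapper instance; a minimal sketch, assuming a plain ModelMapper and an available EntityManager:

ModelMapper modelMapper = new ModelMapper();
// convention-based converters are consulted in order, so put this one first
modelMapper.getConfiguration().getConverters().add(0, new MyConverter(entityManager));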
Context
I develop, for my company, software that classifies phishing and malware-hosting websites using multiple feature extraction algorithms.
Once features are extracted, we use a pool of empirical and machine learning classifiers. We choose among them using an election function of our own.
The code
Basically, we have classifier classes that implement the AnalysisFunction contract.
public abstract class AnalysisFunction {
    abstract public StatusType analyze(List<TokenEntity> tokens);
    abstract public double getPhishingProbability(List<TokenEntity> tokens);
}
Our pool of classifiers is held in a "pool" that itself extends AnalysisFunction.
public class PoolAnalysisFunction extends AnalysisFunction {

    private final List<AnalysisFunction> candidates;
    private final ChoiceFunction choice;
    private static final Logger LOG = LogManager.getLogger(PoolAnalysisFunction.class);

    public PoolAnalysisFunction(List<AnalysisFunction> candidates, ChoiceFunction choice) {
        this.candidates = candidates;
        this.choice = choice;
    }

    @Override
    public StatusType analyze(List<TokenEntity> tokens) {
        try {
            return choice.chooseAmong(candidates, tokens).analyze(tokens);
        } catch (ImpossibleChoiceException e) {
            LOG.fatal("Not enough analysis function.", e);
            return StatusType.CLEAN;
        }
    }

    @Override
    public double getPhishingProbability(List<TokenEntity> tokens) {
        try {
            return choice.chooseAmong(candidates, tokens).getPhishingProbability(tokens);
        } catch (ImpossibleChoiceException e) {
            LOG.fatal("Not enough analysis function.", e);
            return 0;
        }
    }
}
To ease the deployment and testing of new functions, we want to make our pool fully customizable and instantiate every function by its name. To achieve this, we have a key in our property file like analysis.pool.functions=com.vadesecure.analysis.empirical.Function1,com.vadesecure.analysis.machine.AutomaticClassifier1.
I want to instantiate my functions from that.
My problem is that those classifiers depend on different things, such as custom configuration objects and machine learning models.
I would like to inject those dependencies, which are already bound in my HK2 injector.
import org.glassfish.hk2.api.Factory;

public class PoolFunctionFactory implements Factory<AnalysisFunction> {

    private final PoolAnalysisParameters parameters;
    private static final Logger LOG = LogManager.getLogger(PoolAnalysisFunction.class);

    @Inject
    public PoolFunctionFactory(PoolAnalysisParameters parameters) {
        this.parameters = parameters;
    }

    @Override
    public AnalysisFunction provide() {
        try {
            Class<?> choice = Class.forName(parameters.getChoiceFunctionFQDN());
            ChoiceFunction choiceFunction = new PhishingPriorityChoiceFunction(); // default choice
            if (ChoiceFunction.class.isAssignableFrom(choice)) {
                choiceFunction = (ChoiceFunction) choice.newInstance();
            }

            List<AnalysisFunction> analysisFunctions = new LinkedList<>();
            // I want to instantiate here

            return new PoolAnalysisFunction(analysisFunctions, choiceFunction);
        } catch (ClassNotFoundException | IllegalAccessException | InstantiationException e) {
            LOG.fatal(e, e);
        }
        return null;
    }

    @Override
    public void dispose(AnalysisFunction analysisFunction) {
        LOG.trace(String.format("%s end of life", analysisFunction));
    }
}
One example of a model-dependent classifier is:
public class SVMF2AnalysisFunction extends AnalysisFunction {

    private final SVMContainer modelContainer;
    private double probability = 0.0;
    private double threshold = 0.9;

    @Inject // I build this model in a parallel thread
    public SVMF2AnalysisFunction(SVMContainer modelContainer) {
        this.modelContainer = modelContainer;
    }

    @Override
    public StatusType analyze(List<TokenEntity> tokens) {
        if (modelContainer.getModel() == null) {
            return null;
        }
        probability = modelContainer.getModel().analyse(tokens.stream());
        return probability >= threshold ? StatusType.PHISHING : StatusType.CLEAN;
    }

    @Override
    public double getPhishingProbability(List<TokenEntity> tokens) {
        return probability;
    }
}
How can I achieve those instantiations?
My first approach was to inject the ServiceLocator, but I found no documentation for doing this, and a colleague told me it was not a good idea.
He told me to read up on proxies, but that doesn't seem like the right fit for me, or perhaps I missed something.
You could just configure all of this in your binder. That way you don't need to worry about trying to instantiate everything yourself; just let HK2 do all the work.
@Override
protected void configure() {
    bindAsContract(PoolAnalysisFunction.class).in(Singleton.class);
    bind(choiceFnClass).to(ChoiceFunction.class);
    for (Class<AnalysisFunction> analysisFnClass : analysisFnClasses) {
        bind(analysisFnClass).to(AnalysisFunction.class).in(Singleton.class);
    }
}
Then you can just inject everything into the PoolAnalysisFunction class, without the need to use a factory.
@Inject
public PoolAnalysisFunction(IterableProvider<AnalysisFunction> candidates,
                            ChoiceFunction choice) {
    this.choice = choice;
    this.candidates = new ArrayList<>();
    candidates.forEach(this.candidates::add);
}
Notice the IterableProvider class. This is an HK2 class for injecting multiple services bound to the same contract.
Or if you want to use the factory, you could, and just inject the functions into the factory. That way you can keep the PoolAnalysisFunction class independent of any HK2 classes (i.e. the IterableProvider).
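A rough sketch of that factory variant, assuming the same bindings as above (so HK2 still creates the steps and the choice function) and that the factory itself is registered via bindFactory(...) against a separate contract (e.g. directly to PoolAnalysisFunction) so it doesn't try to inject its own product:

import java.util.ArrayList;
import java.util.List;

import javax.inject.Inject;

import org.glassfish.hk2.api.Factory;
import org.glassfish.hk2.api.IterableProvider;

public class PoolFunctionFactory implements Factory<AnalysisFunction> {

    private final List<AnalysisFunction> candidates = new ArrayList<>();
    private final ChoiceFunction choice;

    @Inject
    public PoolFunctionFactory(IterableProvider<AnalysisFunction> candidates,
                               ChoiceFunction choice) {
        candidates.forEach(this.candidates::add);
        this.choice = choice;
    }

    @Override
    public AnalysisFunction provide() {
        // PoolAnalysisFunction itself stays free of HK2 types
        return new PoolAnalysisFunction(candidates, choice);
    }

    @Override
    public void dispose(AnalysisFunction analysisFunction) {
    }
}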
Mapping URL request parameters to an object with Spring MVC is fairly straightforward if you're using camelCase parameters in your request, but when presented with hyphen-delimited values, how do you map these to an object?
Example for reference:
Controller:
@RestController
public class MyController {

    @RequestMapping(value = "/search", method = RequestMethod.GET)
    public ResponseEntity<String> search(RequestParams requestParams) {
        return new ResponseEntity<>("my-val-1: " + requestParams.getMyVal1() + " my-val-2: " + requestParams.getMyVal2(), HttpStatus.OK);
    }
}
Object to hold parameters:
public class RequestParams {

    private String myVal1;
    private String myVal2;

    public RequestParams() {}

    public String getMyVal1() {
        return myVal1;
    }

    public void setMyVal1(String myVal1) {
        this.myVal1 = myVal1;
    }

    public String getMyVal2() {
        return myVal2;
    }

    public void setMyVal2(String myVal2) {
        this.myVal2 = myVal2;
    }
}
A request made like this works fine:
GET http://localhost:8080/search?myVal1=foo&myVal2=bar
But, what I want is for a request with hyphens to map to the object, like so:
GET http://localhost:8080/search?my-val-1=foo&my-val-2=bar
What do I need to configure in Spring to map URL request parameters with hyphens to fields in an object? Bear in mind that we may have many parameters, so using a @RequestParam annotation for each field is not ideal.
I extended ServletRequestDataBinder and ServletModelAttributeMethodProcessor to solve the problem.
Consider that your domain object may already be annotated with @JsonProperty or @XmlElement for serialization. This example assumes this is the case. But you could also create your own custom annotation for this purpose, e.g. @MyParamMapping.
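Such a custom annotation could be as simple as this (the name @MyParamMapping is just the placeholder mentioned above):

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
public @interface MyParamMapping {
    // the request parameter name, e.g. "my-val-1"
    String value();
}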
An example of your annotated domain class is:
public class RequestParams {

    @XmlElement(name = "my-val-1")
    @JsonProperty(value = "my-val-1")
    private String myVal1;

    @XmlElement(name = "my-val-2")
    @JsonProperty(value = "my-val-2")
    private String myVal2;

    public RequestParams() {
    }

    public String getMyVal1() {
        return myVal1;
    }

    public void setMyVal1(String myVal1) {
        this.myVal1 = myVal1;
    }

    public String getMyVal2() {
        return myVal2;
    }

    public void setMyVal2(String myVal2) {
        this.myVal2 = myVal2;
    }
}
You will need a ServletModelAttributeMethodProcessor to analyze the target class, generate a mapping, and invoke your ServletRequestDataBinder.
public class KebabCaseProcessor extends ServletModelAttributeMethodProcessor {

    public KebabCaseProcessor(boolean annotationNotRequired) {
        super(annotationNotRequired);
    }

    @Autowired
    private RequestMappingHandlerAdapter requestMappingHandlerAdapter;

    private final Map<Class<?>, Map<String, String>> replaceMap = new ConcurrentHashMap<Class<?>, Map<String, String>>();

    @Override
    protected void bindRequestParameters(WebDataBinder binder, NativeWebRequest nativeWebRequest) {
        Object target = binder.getTarget();
        Class<?> targetClass = target.getClass();
        if (!replaceMap.containsKey(targetClass)) {
            Map<String, String> mapping = analyzeClass(targetClass);
            replaceMap.put(targetClass, mapping);
        }
        Map<String, String> mapping = replaceMap.get(targetClass);
        ServletRequestDataBinder kebabCaseDataBinder = new KebabCaseRequestDataBinder(target, binder.getObjectName(), mapping);
        requestMappingHandlerAdapter.getWebBindingInitializer().initBinder(kebabCaseDataBinder, nativeWebRequest);
        super.bindRequestParameters(kebabCaseDataBinder, nativeWebRequest);
    }

    private static Map<String, String> analyzeClass(Class<?> targetClass) {
        Field[] fields = targetClass.getDeclaredFields();
        Map<String, String> renameMap = new HashMap<String, String>();
        for (Field field : fields) {
            XmlElement xmlElementAnnotation = field.getAnnotation(XmlElement.class);
            JsonProperty jsonPropertyAnnotation = field.getAnnotation(JsonProperty.class);
            if (xmlElementAnnotation != null && !xmlElementAnnotation.name().isEmpty()) {
                renameMap.put(xmlElementAnnotation.name(), field.getName());
            } else if (jsonPropertyAnnotation != null && !jsonPropertyAnnotation.value().isEmpty()) {
                renameMap.put(jsonPropertyAnnotation.value(), field.getName());
            }
        }
        if (renameMap.isEmpty()) {
            return Collections.emptyMap();
        }
        return renameMap;
    }
}
This KebabCaseProcessor will use reflection to get a list of mappings for your request object. It will then invoke the KebabCaseDataBinder - passing in the mappings.
@Configuration
public class KebabCaseRequestDataBinder extends ExtendedServletRequestDataBinder {

    private final Map<String, String> renameMapping;

    public KebabCaseRequestDataBinder(Object target, String objectName, Map<String, String> mapping) {
        super(target, objectName);
        this.renameMapping = mapping;
    }

    @Override
    protected void addBindValues(MutablePropertyValues mpvs, ServletRequest request) {
        super.addBindValues(mpvs, request);
        for (Map.Entry<String, String> entry : renameMapping.entrySet()) {
            String from = entry.getKey();
            String to = entry.getValue();
            if (mpvs.contains(from)) {
                mpvs.add(to, mpvs.getPropertyValue(from).getValue());
            }
        }
    }
}
All that remains now is to add this behavior to your configuration. The following configuration overrides the defaults that @EnableWebMvc delivers and adds this behavior to your request processing.
@Configuration
public static class WebContextConfiguration extends WebMvcConfigurationSupport {

    @Override
    protected void addArgumentResolvers(List<HandlerMethodArgumentResolver> argumentResolvers) {
        argumentResolvers.add(kebabCaseProcessor());
    }

    @Bean
    protected KebabCaseProcessor kebabCaseProcessor() {
        return new KebabCaseProcessor(true);
    }
}
Credit should be given to @Jkee. This solution is derived from an example he posted here: How to customize parameter names when binding spring mvc command objects.
One way I can think of getting around the hyphens is to use the HttpServletRequestWrapper class to wrap the original request.
Parse all the request parameters in this class and convert all hyphenated parameter names into camel case. After this, Spring will be able to automatically map those parameters to your POJO classes.
public class CustomRequestWrapper extends HttpServletRequestWrapper {

    private Map<String, String> camelCasedParams = new HashMap<>();

    public CustomRequestWrapper(HttpServletRequest req) {
        super(req);
        // Get all params from the request.
        // Transform each param name from hyphenated to camel case.
        // Put them in camelCasedParams.
    }

    @Override
    public String getParameter(String name) {
        return camelCasedParams.get(name);
    }

    // Similarly, override other methods related to request parameters
}
Inject this request wrapper from a J2EE filter. You can refer to the link below for a tutorial on injecting request wrappers using a filter.
http://www.programcreek.com/java-api-examples/javax.servlet.http.HttpServletRequestWrapper
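A minimal sketch of such a filter (the class name is illustrative):

import java.io.IOException;
import javax.servlet.*;
import javax.servlet.http.HttpServletRequest;

public class ParamNameTranslationFilter implements Filter {

    @Override
    public void doFilter(ServletRequest req, ServletResponse resp, FilterChain chain)
            throws IOException, ServletException {
        // wrap the incoming request so Spring only ever sees camelCase parameter names
        chain.doFilter(new CustomRequestWrapper((HttpServletRequest) req), resp);
    }

    @Override
    public void init(FilterConfig filterConfig) {
    }

    @Override
    public void destroy() {
    }
}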
Update your web.xml to include the filter and its filter mapping.
I am starting to program a Code Generator for NetBeans 8, and I am having trouble figuring out the best way to test its invoke() method.
The code generator I want to test is basically like this:
(imports here)

public class MyCodeGenerator implements CodeGenerator {

    private final JTextComponent textComponent;
    private final CompilationController controller;

    MyCodeGenerator(final Lookup context) {
        textComponent = context.lookup(JTextComponent.class);
        controller = context.lookup(CompilationController.class);
    }

    @Override
    public String getDisplayName() {
        return "Generate Some Code...";
    }

    /**
     * This will be invoked when the user chooses this generator from the
     * Insert Code dialog.
     */
    @Override
    public void invoke() {
        if (textComponent != null && controller != null) {
            try {
                controller.toPhase(Phase.RESOLVED);
                // do more things with the source code
            } catch (IOException ioe) {
                throw new RuntimeException(ioe);
            }
        }
    }
}
I want to use a mocked (Mockito) object for Lookup, to pass to the MyCodeGenerator's constructor. The mock should return the JTextComponent and the CompilationController.
I know I can provide a JTextComponent with the test code, but I hit a wall when I need to provide a CompilationController.
I can create a temporary java source file with the same content as the JTextComponent, but I could not find a way to create a CompilationController (or WorkingCopy) from it.
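Mocking the Lookup itself is the easy part; a minimal sketch, assuming Mockito is on the test classpath (realController is a placeholder for a real CompilationController obtained from a JavaSource, which is the hard part shown in the test below):

Lookup context = Mockito.mock(Lookup.class);
JTextComponent textComponent = new JTextArea("public class Clazz { private int a = 2; }");
Mockito.when(context.lookup(JTextComponent.class)).thenReturn(textComponent);
Mockito.when(context.lookup(CompilationController.class)).thenReturn(realController);
new MyCodeGenerator(context).invoke();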
This is what I tried so far (my test method):
@Test
public void testInvoke() throws ParseException, IOException {
    System.out.println("invoke");
    final ExtractControllerTask extractTask = new ExtractControllerTask(Phase.RESOLVED);
    final StringBuilder builder = new StringBuilder(100);
    final JTextComponent textComponent;
    final Document document;
    final FileObject javaTestFile;
    final OutputStream outputStream;
    final JavaSource source;

    builder.append("public class Clazz {");
    builder.append("private int a = 2;");
    builder.append("}");

    textComponent = new JTextArea(builder.toString());
    document = textComponent.getDocument();
    document.putProperty(BaseDocument.MIME_TYPE_PROP, "text/x-java");

    javaTestFile = FileUtil.createData(new File("/tmp/javaTestSourceFile.java"));
    outputStream = javaTestFile.getOutputStream();
    outputStream.write(builder.toString().getBytes());
    outputStream.flush();

    source = JavaSource.forFileObject(javaTestFile);
    assertNotNull(source);

    source.runUserActionTask(extractTask, true);
    assertNotNull(extractTask.controller); // FAILS HERE
}
This is the code for ExtractControllerTask:
private static class ExtractControllerTask implements Task<CompilationController> {

    private final Phase targetPhase;
    private CompilationController controller;

    private ExtractControllerTask(final Phase phase) {
        this.targetPhase = phase;
    }

    public void run(final CompilationController compController) {
        try {
            compController.toPhase(this.targetPhase);
            this.controller = compController;
        } catch (IOException ioe) {
            throw new RuntimeException(ioe);
        }
    }
}
What surprises me is that the run method in ExtractControllerTask is never called.
I really need to test my code but I can't find the proper way. Maybe the approach is completely wrong.
Can anyone suggest how to achieve this?