Instantiating an interface via deferred binding in GWT? - java

I am reading the code of a GWT project.
In this project, constant values such as button text are read from a properties file.
They have an interface LocalizableResource and obtain an instance of it like this:
public interface LocalizableResource extends Constants {
    public static class Util {
        public static LocalizableResource getInstance() {
            return GWT.create(LocalizableResource.class);
        }
    }

    String lblName_text_1();
}
and they use this instance to get a button's text:
String buttonText = LocalizableResource.Util.getInstance().lblName_text_1();
Button b = new Button(buttonText);
In Java we cannot instantiate an interface, so how is GWT doing this? I don't know much about deferred binding and GWT.

That's the beauty of GWT and one of its ways of managing multiple clients, which is a core advantage of the GWT framework.
http://www.gwtproject.org/doc/latest/DevGuideCodingBasicsDeferred.html
Deferred binding is a feature of the GWT compiler that works by generating many versions of code at compile time, only one of which needs to be loaded by a particular client during bootstrapping at runtime. Each version is generated on a per browser basis, along with any other axis that your application defines or uses. For example, if you were to internationalize your application using GWT’s Internationalization module, the GWT compiler would generate various versions of your application per browser environment, such as “Firefox in English”, “Firefox in French”, “Internet Explorer in English”, etc… As a result, the deployed JavaScript code is compact and quicker to download than hand coded JavaScript, containing only the code and resources it needs for a particular browser environment.

A tag interface that facilitates locale-sensitive, compile-time binding of constant values supplied from properties files. Using GWT.create(class) to "instantiate" an interface that extends Constants returns an instance of an automatically generated subclass that is implemented using values from a property file selected based on locale.
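As a small illustration (the interface name, key, and file names below are hypothetical, not from the project in the question), the two pieces the compiler binds together are a Constants sub-interface and a locale-specific properties file whose keys match the method names:
// Illustrative Constants interface; GWT binds it at compile time to a properties
// file named MyLabels.properties (plus MyLabels_fr.properties, etc.) in the same
// package, e.g. containing the line:  lblName_text_1 = Save
import com.google.gwt.i18n.client.Constants;

public interface MyLabels extends Constants {
    // The method name doubles as the lookup key in the properties file
    String lblName_text_1();
}

// Usage in client code: the compiler generates a locale-specific implementation
//   MyLabels labels = GWT.create(MyLabels.class);
//   Button b = new Button(labels.lblName_text_1());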

Related

Use ArchUnit As Adapter to Run Architecture Test Based on External AnalyzeClasses

I am trying to build an example with ArchUnit where the packages passed to @AnalyzeClasses can be dynamic, based on which adapter application the test needs to run against.
For example:
@AnalyzeClasses(packages = "${archtest.scan.package}", importOptions = { ImportOption.DoNotIncludeTests.class, ImportOption.DoNotIncludeJars.class })
public class ArchitectureTests {
}
The application.properties file should allow the packages to analyze to be passed in dynamically, so that any application using this one as a JAR library can provide the packages to scan in its own properties file, as below:
archtest.scan.package=com.example.pkgname
I am not sure of the right way to pick up the dynamic value from the property and pass it into the @AnalyzeClasses annotation. I am looking for some help or an example in this regard.
I don't think that ArchUnit's JUnit 4 & 5 support – in the current version 0.23.1 – allows for dynamic packages configured via an application.properties.
But instead of using @AnalyzeClasses, you can always just invoke new ClassFileImporter().import… and pass any dynamic runtime values you like.
(Note that ArchUnit's JUnit support also introduces a clever cache to reuse imported JavaClasses across multiple @ArchTests, but storing JavaClasses in a static field may also be good enough.)
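For instance, a minimal sketch of that approach, assuming a JUnit 5 test and an application.properties file on the test classpath (the rule and the property-reading helper below are illustrative):
// Sketch only: reads the package to scan at runtime and imports it directly,
// bypassing @AnalyzeClasses. The property-reading helper is hypothetical.
import com.tngtech.archunit.core.domain.JavaClasses;
import com.tngtech.archunit.core.importer.ClassFileImporter;
import com.tngtech.archunit.core.importer.ImportOption;
import org.junit.jupiter.api.Test;

import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

import static com.tngtech.archunit.lang.syntax.ArchRuleDefinition.noClasses;

class DynamicPackageArchitectureTest {

    @Test
    void noClassesShouldDependOnInternalImpl() throws IOException {
        JavaClasses classes = new ClassFileImporter()
                .withImportOption(new ImportOption.DoNotIncludeTests())
                .importPackages(readPackageToScan());

        // Example rule; replace with the real architecture rules
        noClasses().should().dependOnClassesThat()
                .resideInAPackage("..internal.impl..")
                .check(classes);
    }

    // Hypothetical helper: reads archtest.scan.package from application.properties
    private static String readPackageToScan() throws IOException {
        Properties props = new Properties();
        try (InputStream in = DynamicPackageArchitectureTest.class
                .getClassLoader().getResourceAsStream("application.properties")) {
            props.load(in);
        }
        return props.getProperty("archtest.scan.package");
    }
}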
This actually should be possible using a custom LocationProvider within @AnalyzeClasses. E.g.
@AnalyzeClasses(locations = ApplicationPropertiesLocationProvider.class)
public class ExampleTest {
    // ...
}

class ApplicationPropertiesLocationProvider implements LocationProvider {
    @Override
    public Set<Location> get(Class<?> testClass) {
        String packageToScan = readFromApplicationProperties();
        return Locations.ofPackage(packageToScan);
    }
}
But be aware of caching limitations! The caching mechanism assumes that your LocationProvider is "idempotent", i.e. it always returns the same locations, and it only takes the type of the LocationProvider into consideration as the cache key. This should not be a problem with a static application.properties as the source, though.

What replaces FileResolver in the JasperReports API?

One of our software utilities uses a class that implements net.sf.jasperreports.engine.util.FileResolver to load report elements (like images for example) that reside at a path relative to the report or that are to be loaded via a proprietary file server protocol. As of the latest version, 6.6.0, I see that the plan is to remove the FileResolver class entirely. However, in the Javadocs, it only notes that the class will be removed. No details about a replacement are specified.
I am not expecting to be able to trade out the FileResolver with another class with a 1:1 substitution, but would really like to know what the report filler is now using to locate external report elements.
FileResolver was deprecated in favor of net.sf.jasperreports.repo.RepositoryService implementations.
There's a builtin implementation named net.sf.jasperreports.repo.FileRepositoryService which is roughly the equivalent of the deprecated net.sf.jasperreports.engine.util.SimpleFileResolver.
Repository services are registered as JasperReportsContext extensions.
That can be done either in a jasperreports_extension.properties file like this:
net.sf.jasperreports.extension.registry.factory.file.repository=net.sf.jasperreports.repo.FileRepositoryServiceExtensionsRegistryFactory
net.sf.jasperreports.extension.file.repository.root=/path/to/repository
net.sf.jasperreports.extension.registry.factory.persistence=net.sf.jasperreports.repo.FileRepositoryExtensionsRegistryFactory
Registering the extensions can also be done by programmatically creating a JasperReportsContext object and then using it to fill the reports:
// Create a local context and register a file repository rooted at the given path
SimpleJasperReportsContext context = new SimpleJasperReportsContext();
FileRepositoryService fileRepository = new FileRepositoryService(context, "/path/to/repository", false);
context.setExtensions(RepositoryService.class, Collections.singletonList(fileRepository));
context.setExtensions(PersistenceServiceFactory.class, Collections.singletonList(FileRepositoryPersistenceServiceFactory.getInstance()));
// Fill the report through this context so resources are resolved via the repository
JasperPrint jasperPrint = JasperFillManager.getInstance(context).fill(jasperReport, params);
If you need to implement a custom repository service, you can take FileRepositoryService as a reference. You'll probably want to implement StreamRepositoryService and register PersistenceServices (as in FileRepositoryPersistenceServiceFactory).
If what you need to do is about resource paths relative to the report, you can also take a look at the JRFiller methods that take a JasperReportSource argument. Passing such an object is meant to automatically resolve report resource references as relative to the report (provided that the repository service implements resource lookup based on RepositoryContext).

Description on JMX fields and methods JBoss

How can we add descriptions to the fields and operations exposed via JMX?
JBoss version : JBoss EAP 5.1.2
We have a service bean like this:
@Service
@Management(MyConfigMgnt.class)
public class MyConfigService implements MyConfigLocal, MyConfigMgnt {
    private String myValue;

    public void setMyValue(String myValue) { this.myValue = myValue; }

    public String getMyValue() { return myValue; }
}
These methods are declared in the MyConfigMgnt interface.
This is visible in the JBoss JMX console, both for the MBean and for its field (screenshots omitted).
How do we add relevant and proper descriptions to the fields and the MBean?
Thanks
There are two ways of doing this.
Re-implement your service as a DynamicMBean, which is slightly more complicated but allows for the definition of attribute and operation metadata (i.e. MyConfigMgnt extends DynamicMBean); a sketch of this approach follows below.
An easier way (but possibly not future-proof) is to use an XMBean descriptor. XMBeans are a proprietary JBoss JMX extension where metadata is defined in an external XML resource. It would require no actual changes to the source code except the addition of the XMBean resource location, which looks something like this:
@Service(objectName = XMBeanService.OBJECT_NAME, xmbean = "resource:META-INF/service-xmbean.xml")
If you have a very large number of attributes and operations, the XMBean XML descriptor can be arduous to write, but twiddle has a helper command which will generate a template specific to your existing simple MBean, so you can save the output, fill in the details and go from there.
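For the first option, here is a rough, illustrative sketch (not a drop-in replacement for the service above) of a DynamicMBean whose MBeanInfo carries the attribute and operation descriptions that the JMX console displays:
import javax.management.Attribute;
import javax.management.AttributeList;
import javax.management.AttributeNotFoundException;
import javax.management.DynamicMBean;
import javax.management.MBeanAttributeInfo;
import javax.management.MBeanInfo;
import javax.management.MBeanOperationInfo;
import javax.management.MBeanParameterInfo;

public class MyConfigDynamicMBean implements DynamicMBean {

    private String myValue;

    // The metadata returned here is what the JMX console shows as descriptions
    public MBeanInfo getMBeanInfo() {
        MBeanAttributeInfo valueAttr = new MBeanAttributeInfo(
                "MyValue", String.class.getName(),
                "Human-readable description of the MyValue attribute",
                true, true, false);
        MBeanOperationInfo setOp = new MBeanOperationInfo(
                "setMyValue", "Sets the configured value",
                new MBeanParameterInfo[] {
                        new MBeanParameterInfo("myValue", String.class.getName(), "the new value") },
                Void.TYPE.getName(), MBeanOperationInfo.ACTION);
        return new MBeanInfo(getClass().getName(),
                "Description of the MyConfig service MBean",
                new MBeanAttributeInfo[] { valueAttr },
                null, // constructors
                new MBeanOperationInfo[] { setOp },
                null); // notifications
    }

    public Object getAttribute(String name) throws AttributeNotFoundException {
        if ("MyValue".equals(name)) {
            return myValue;
        }
        throw new AttributeNotFoundException(name);
    }

    public void setAttribute(Attribute attribute) throws AttributeNotFoundException {
        if ("MyValue".equals(attribute.getName())) {
            myValue = (String) attribute.getValue();
        } else {
            throw new AttributeNotFoundException(attribute.getName());
        }
    }

    public AttributeList getAttributes(String[] names) {
        AttributeList list = new AttributeList();
        for (String name : names) {
            try {
                list.add(new Attribute(name, getAttribute(name)));
            } catch (AttributeNotFoundException ignored) {
                // skip unknown attributes
            }
        }
        return list;
    }

    public AttributeList setAttributes(AttributeList attributes) {
        AttributeList applied = new AttributeList();
        for (Object entry : attributes) {
            Attribute attribute = (Attribute) entry;
            try {
                setAttribute(attribute);
                applied.add(attribute);
            } catch (AttributeNotFoundException ignored) {
                // skip unknown attributes
            }
        }
        return applied;
    }

    public Object invoke(String actionName, Object[] params, String[] signature) {
        if ("setMyValue".equals(actionName)) {
            myValue = (String) params[0];
            return null;
        }
        throw new IllegalArgumentException("Unknown operation: " + actionName);
    }
}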

Use conditional logging via the Domino API

I want to implement the logging-to-OpenLog.nsf functionality from the Domino API in an application.
However, in the current setup of the application, logging (to the Domino console) only occurs when it is enabled for the whole application via a configuration property, e.g.
public static void writeToConsole(String msg) {
    if (getDebugMode()) {
        System.out.println(msg);
    }
}
usage:
writeToConsole("hello world");
I am wondering how I could rewrite the writeToConsole method to utilize the XspOpenLogUtil class.
XspOpenLogUtil.logErrorEx(Throwable, String, Level, Document) will allow you to pass a custom message.
Also XspOpenLogUtil.getOpenLogItem() gets a handle on the OpenLogItem object. From there you can use any of the inner methods.
See this page https://wiki.openntf.org/pages/viewpage.action?pageId=6586418 (all the method names are the same; the class in ODA is just XspOpenLogUtil instead of OpenLogUtil).
In my apps I usually have a wrapper method handleException(Throwable t) which calls XspOpenLogUtil.logError(), and that's what I would recommend. It gives greater flexibility for handling e.g. different logging levels, or changing the logging framework should you wish to in the future.
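For example, a hedged sketch of such a wrapper, reusing the existing debug-mode flag and the logErrorEx/logError methods named above (the class name is illustrative, and the import assumes ODA's org.openntf.domino.xsp package; adjust to your ODA version):
// Sketch: routes messages to OpenLog via ODA's XspOpenLogUtil instead of the console.
import java.util.logging.Level;
import org.openntf.domino.xsp.XspOpenLogUtil;

public class AppLogger {

    public static void writeToConsole(String msg) {
        if (getDebugMode()) {
            // Pass null for the Throwable and Document when only a message is logged
            XspOpenLogUtil.logErrorEx(null, msg, Level.INFO, null);
        }
    }

    public static void handleException(Throwable t) {
        // Always log exceptions, regardless of the debug flag
        XspOpenLogUtil.logError(t);
    }

    private static boolean getDebugMode() {
        // Existing configuration lookup from the application goes here
        return true;
    }
}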

Generic way to divide the process into sub processes

I am working on a Spring-based standalone application. Once an XML message enters the system, we do some technical validations in Java code using exceptions and regular expressions. Now I am trying to plug in one more feature: based on a flag in the database we need a few more validations on the incoming XML message, but the existing clients don't require them. I know there is a fork/join concept in Java 7, but I am limited to Java 1.6. How can I implement a similar feature using Java 1.6?
Below is the approach:
main class -> handler thread -> calls action class -> every action class extends an abstract action class -> performs technical validations using an XML file.
public abstract class AbstractAction {
    public abstract void processMsg(String msg);
}

public class GenericAction extends AbstractAction {
    public void processMsg(String str) {
        // Existing code already does the validations.
        // Now I have to check a flag in a table to see whether this client requires the new
        // validations; these validations are defined in an XML file in the form of Spring beans.
        // Java code reads those beans and validates using some helper classes. There is no
        // third-party code here. This is where I want to put the new code, so I have to break
        // the existing logic down into smaller pieces.
    }
}
Regards,
chaitu hara
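One possible Java 1.6-compatible way to break the processing into pluggable sub-validations is an ExecutorService (available since Java 5) in place of fork/join; everything below is an illustrative sketch, and all helper method names are assumptions:
// Each extra validation step is a Callable, and an ExecutorService runs the optional
// validations only when the database flag for the client is set.
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class GenericAction extends AbstractAction {

    private final ExecutorService executor = Executors.newFixedThreadPool(4);

    @Override
    public void processMsg(final String msg) {
        runExistingValidations(msg);

        // Optional validations, enabled per client by a flag in the database
        if (clientRequiresExtraValidations(msg)) {
            List<Callable<Void>> tasks = new ArrayList<Callable<Void>>();
            tasks.add(new Callable<Void>() {
                public Void call() {
                    validateAgainstXmlRules(msg); // e.g. rules defined as Spring beans
                    return null;
                }
            });
            tasks.add(new Callable<Void>() {
                public Void call() {
                    validateWithHelperClasses(msg);
                    return null;
                }
            });
            try {
                // invokeAll blocks until every sub-validation has finished
                for (Future<Void> result : executor.invokeAll(tasks)) {
                    result.get(); // re-throws any validation failure wrapped in ExecutionException
                }
            } catch (Exception e) {
                throw new RuntimeException("Validation failed for message", e);
            }
        }
    }

    private void runExistingValidations(String msg) { /* existing logic */ }

    private boolean clientRequiresExtraValidations(String msg) { return true; /* DB flag lookup */ }

    private void validateAgainstXmlRules(String msg) { /* new validation logic */ }

    private void validateWithHelperClasses(String msg) { /* new validation logic */ }
}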
