How do I restrict Java function usage in IntelliJ?

I have a project with more than 100 Java classes. My case is:
- I have Button.java, DropDown.java, Checkbox.java, ListBox.java, ...
- I also have business classes that each represent a page of my web app, with custom functions and mappings for Button.java and DropDown.java
- I now want to be able to use only functions from Button and DropDown
An example:
public class FieldsProperties extends RHRegion {
    public static void setProperties(String strMap) {
        HashMap<String, String> mapProperties = getMapProperties();
        if (strMap.equals("Properties")) {
            mapProperties = UISummary.getProperties("UISummary");
        }
        setMapProperties(mapProperties);
        RHRegion.setProperties(strMap);
    }
}
So in my scenario I want to be allowed to use only functions from UISummary.java, or at least to be notified when another class is used.

You can use the IllegalType rule from Checkstyle:
Checks that particular classes are never used as types in variable declarations, return values or parameters.
Rationale: Helps reduce coupling on concrete classes.
Simply add the classes to the list of illegalClassNames. You can configure Checkstyle to treat a rule violation either as a warning or as an error.
Note that this does not prevent instantiation - assigning an instance to a variable of a parent type, for example, is still possible. A better (or additional) rule might be IllegalInstantiation:
Checks for illegal instantiations where a factory method is preferred.
Assuming that you do not have a factory method for your restricted classes, this would flag every instantiation of the classes defined in the classes property of that rule.
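A rough configuration sketch for both rules might look like this (the class names are just examples from the question; IllegalInstantiation expects fully qualified names, so the com.example.ui package is a placeholder):

<module name="TreeWalker">
    <module name="IllegalType">
        <!-- flags these classes when used as variable, parameter or return types -->
        <property name="illegalClassNames" value="Button, DropDown, Checkbox, ListBox"/>
    </module>
    <module name="IllegalInstantiation">
        <!-- flags direct "new" expressions; fully qualified names are required here -->
        <property name="classes" value="com.example.ui.Button, com.example.ui.DropDown"/>
    </module>
</module>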
There is also an IntelliJ plugin for Checkstyle available.

Without having deeper knowledge of your application: wouldn't it be much better to rethink your package / application / class structure to prevent the use of unwanted functionality?

Related

Exposing static methods through proxy classes

We have a Shared Utilities project, two independent SDK projects (each referring to the Utilities) and some plugin projects, each using only one of those SDKs. Shared Utilities contains some all-static classes, which need to be made visible to the plugins mentioned, but we'd like to hide the rest of its classes from them.
How can we go about resolving the issue? We'd like to keep the build process as simple as possible (we're using Ant for the build) with the fewest possible dependencies.
Here are the options we've considered so far and why we've discarded each approach:
Second Shared Utilities project, which will be made available to the plugins - will make deployment harder.
Build 2 separate .jar-s from the Shared Utils project, one containing only the all-static utilities and the other - whatever needs to be hidden. This would make the build more complex, i.e. additional dependencies to the plugins' build scripts.
Proxy all-static classes in each of the SDKs - duplicate method definitions, but the implementation simply calls the corresponding static method from the Shared project - seems most painless, downside is we'd need to copy the Javadoc by hand. Is there a simple Javadoc tag, which would do this automatically upon generation?
Convert all-static classes to "normal" and simply create subclasses in each SDK - unnecessary (to my mind) performance overhead.
"Exposing static methods through proxy classes"
I've read your complete question and I'm not sure what your issue is exactly.
Exposing a static (non-instance related) method through a proxy (instance)?
What do you want to hide from what exactly? What do you want to expose to what exactly?
public class A {
    private A() {} // prevent instantiation
    public static void doSomething() {} // want to expose to some classes, hide from others
}
To limit the exposure of doSomething you can set its visibility: Controlling Access to Members of a Class.
You can also remove the static nature and use a static factory method to return the object, like:
public class A {
    private A() {}
    public void doSomething() {} // want to expose to some classes, hide from others

    // the new A can also be kept in a member variable so that getInstance() always
    // returns the same instance; if it is immutable it will be thread safe.
    public static A getInstance() { return new A(); }
}
This can look like the "same thing", but you can now control the visibility of every method in A by controlling only the visibility of getInstance(), and the Javadoc stays on A. As for controlling exactly how getInstance() can be accessed... I would have to understand exactly what you want to do.

sharing an object application wide

It's common to have an object used application wide.
What are the different patterns / models / ways to share an object through an application?
Is defining a "main class", then setting a member variable and extending all other classes from this "main class" a good way? Is creating a static class probably the better and cleaner way? What's your prefered pattern?
It's common to have an object used application wide. What are the different patterns / models / ways to share an object through an application?
One common way is to use the singleton pattern. I would avoid that though.
Is defining a "main class", then setting a member variable and extending all other classes from this "main class" a good way
Absolutely not. Aside from anything else, if it's an instance variable then it wouldn't be "shared" with instances of your other classes anyway. It's also a complete abuse of inheritance which would certainly bite you hard in any application of significant size - your other classes wouldn't logically have an inheritance relationship with your "main" class, would they? As a general rule, inheritance should only be used when it's really appropriate - not to achieve a quick fix.
What's your preferred pattern?
Dependency injection. When your application starts up, create all the appropriate objects which need to know about each other, and tell them (usually in the constructor) about their dependencies. Several different objects can all depend on the same object if that's appropriate. You can use one of the many dependency injection frameworks available to achieve this easily.
Dependency injection generally works better than using singletons because:
The class itself doesn't know whether or not the dependency is actually shared; why should it care?
Global state makes unit testing harder
Each class makes its dependencies clearer when they're declared - it's then easier to navigate around the application and see how the classes relate to each other.
Singletons and global factories are more appropriate when they're for things like logging - but even then, it means it's relatively hard to test the logging aspects of a class. It's a lot simpler to create a dependency which does what you need it to, and pass that to the object under test, than it is to add ways of messing around with a singleton (which usually remains "fixed" after initialization).
If you use a framework like Spring which has dependency injection, you can get all the benefits of "global" objects for free without needing to explicitly define them. You just create a reference to them in your application context and you can inject them into any object you'd like without worrying about issues with synchronizing.
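For illustration, here is a minimal hand-rolled constructor-injection sketch (all class names below are made up); a framework like Spring essentially automates this wiring:

interface AuditLog {
    void record(String message);
}

class ConsoleAuditLog implements AuditLog {
    public void record(String message) {
        System.out.println(message);
    }
}

class OrderService {
    private final AuditLog audit; // the dependency is declared, not looked up globally

    OrderService(AuditLog audit) {
        this.audit = audit;
    }

    void placeOrder(String id) {
        audit.record("order placed: " + id);
    }
}

public class Application {
    public static void main(String[] args) {
        AuditLog audit = new ConsoleAuditLog();        // created once at start-up
        OrderService orders = new OrderService(audit); // shared simply by passing it in
        orders.placeOrder("42");
    }
}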
The Singleton pattern is, AFAIK, the preferable way in software engineering.
I believe what you are looking for is the Singleton Pattern. With this pattern you are ensured that only one instance of a class can be created in memory.
Example:
public class MySingletonClass {
    private static MySingletonClass singleObject;

    // Note that the constructor is private to prevent more than one
    // instance of the class
    private MySingletonClass() {
        // Optional Code
    }

    public static MySingletonClass getSingletonObject() {
        if (singleObject == null) {
            singleObject = new MySingletonClass();
        }
        return singleObject;
    }
}
That said, you should try to avoid using it; but there are some acceptable cases, one of which is here.

Can a Java class add a method to itself at runtime?

Can a class add a method to itself at runtime (like from a static block), so that if someone is performing reflection on this class, they'll see the new method, even though it wasn't defined at compile time?
Background:
A framework I'm using expects Action classes to be defined that have a doAction(...) method, by convention. The framework inspects these classes at runtime to see what type of parameters are available in their doAction() method. For example: doAction(String a, Integer b)
I'd like each class to be able to programmatically generate its doAction() method with various parameters, just in time when it is inspected. The body of the method can be empty.
It's not simple. Once a class is loaded by a classloader, there is no way to change the methods of loaded classes. When a class is requested, a classloader will load it and link it. And there is no way (with Java) to change the linked code or to add/remove methods.
The only trick that comes to my mind is playing with classloaders. If we discard a custom classloader, the classes loaded by that classloader should become inaccessible (and collectible) too. The idea that comes to my mind is to:
implement one custom classloader
load the dynamic class with that custom classloader
if we have an updated version of this class,
remove the custom classloader and
load the new version of this class with a new instance of the custom classloader
I leave that as food for thought; I can't prove whether this leads to a solution or whether there are pitfalls.
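A rough sketch of that idea (the directory and class names are made up; note that the class to reload must not also be on the parent classpath, otherwise the parent loader will keep serving the old version):

import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

public class Reloader {
    public static Object loadFresh(String className) throws Exception {
        URL[] classpath = { new File("plugins/").toURI().toURL() };
        // a new loader instance re-reads the (possibly recompiled) .class file;
        // once all references to the old loader and its objects are dropped,
        // the old class version becomes eligible for garbage collection
        URLClassLoader loader = new URLClassLoader(classpath, Reloader.class.getClassLoader());
        Class<?> cls = Class.forName(className, true, loader);
        return cls.getDeclaredConstructor().newInstance();
    }
}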
As a simple answer to the question: no, we can't change a loaded class in the way we can change the content of fields with reflection (we can't add or remove fields either).
Andres_D is right, we can very well do so using custom class loading, here is a detailed guide on how to do this: http://www.javaworld.com/javaworld/jw-06-2006/jw-0612-dynamic.html?page=1
The article explains how to write dynamic Java code. It discusses runtime source code compilation, class reloading, and the use of the Proxy design pattern to make modifications to a dynamic class transparent to its caller.
In fact, researchers in Austria have written a JVM that even allows reloading classes with different type hierarchies. They achieved this by using the existing thread safepoints to generate a complete 'side universe' of an object and all its related references and referenced content; once everything is fully reshuffled with all required changes, they simply swap in all changed classes. [1] Here is a link to their project: http://ssw.jku.at/dcevm/ - the Oracle sponsorship certainly makes for interesting speculation on future plans.
Less intrusive changes to method bodies and fields are already possible in the standard Java VM using the Hot Swap capabilities of the JPDA as introduced in Java 1.4:
docs.oracle.com/javase/1.4.2/docs/guide/jpda/enhancements.html#hotswap
I'm not sure whether it was the first one, but this Sun employee's paper from 2001 appears to be one of the early proposals mentioning HotSpot's hot-swap capabilities. [2]
REFERENCE
[1] T. Würthinger, C. Wimmer, and L. Stadler, “Dynamic Code Evolution for Java,” presented at the 8th International Conference on the Principles and Practice of Programming in Java, Vienna, 2010.
[2] M. Dmitriev, “Towards flexible and safe technology for runtime evolution of java language applications,” in OOPSLA Workshop on Engineering Complex Object-Oriented Systems for Evolution, 2001.
I've never tried anything quite like that myself, but you should have a look at ASM, cglib, and Javassist.
No, that is not (easily) possible in Java.
It sounds like you are trying to use Java as if it is a dynamic programming language. For example, Ruby has open classes: you can add and remove methods from Ruby classes at runtime. In Ruby, you can also have a "method missing" method in your class, that will be called when you try to call a method that doesn't exist in the class. Such a thing also doesn't exist in Java.
There is a version of Ruby that runs on the JVM, JRuby, and it has to do very difficult tricks to make open classes work on the JVM.
You can have a doAction method which does whatever you would like the generated method to do. Is there a reason it needs to be generated or can it be dynamic?
It looks like there is no way to add a method dynamically. But you can prepare a class that keeps a map of Methods, like:
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
import java.lang.reflect.Modifier;
import java.util.HashMap;

public class GenericClass {
    private HashMap<String, Method> methodMap = new HashMap<String, Method>();

    public Object call(String methodName, Object... args)
            throws IllegalAccessException, IllegalArgumentException, InvocationTargetException {
        Method method = methodMap.get(methodName);
        return method.invoke(null, args);
    }

    public void add(String name, Method method) {
        if (Modifier.isStatic(method.getModifiers()))
            methodMap.put(name, method);
    }

    public static void main(String[] args) {
        try {
            GenericClass task = new GenericClass();
            // register an existing static method under a name of our choice
            task.add("abs", Math.class.getMethod("abs", int.class));
        } catch (NoSuchMethodException | SecurityException e) {
            e.printStackTrace();
        }
    }
}
Then, using reflection, you can register or remove methods in that map.
I believe you need some byte code altering tool/framework, such as asm, cglib or javassist.
You can achieve this via aspects/weaving, like it's done in Spring, but I believe you still need to have the method defined first.
A dynamic Proxy may help, but you would have to instantiate a new Proxy every time you want to add or remove a method.
What I suggest should work for your situation:
1. You have an existing class MyClass with n methods.
2. You want to include an (n+1)th method that is not in the class at compile time, coming from another .java source file.
My way to solve it is inheritance. Create a new .java source file for a class MyClassPlusOne extending the first class MyClass. Compile this class and use the object. How can I compile and deploy a java class at runtime?
class MyClassPlusOne extends MyClass {
    void doAction(String a, Integer b) {
        int myNPlus1 = Integer.parseInt(a) + b;
        // add whatever you want before compiling this code
    }
}
I'm not sure that is possible. However, you could use AspectJ, ASM, etc. and weave these methods into the appropriate classes.
The other alternative is to use composition to wrap the target class and provide the doAction method. You would end up delegating to the target class in this case.
This is a rather old question, but I still found myself looking at it today so, just in case, I'll add my two cents.
If you are using Java 8+, you can define "default" implementations of an interface method, so you can just define the interface with all the extra methods with empty default implementations, and add the implements clause in the desired classes. This approach, in some cases, may be the easiest one.
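A minimal sketch of that default-method approach (the interface and method names are illustrative, not taken from the framework in the question):

public interface ExtraActions {
    default void doAction(String a, Integer b) {
        // empty default body; implementing classes may override it when needed
    }
}

// an existing class only needs to opt in:
class MyAction implements ExtraActions {
}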
If you don't have control over the definition of the classes, or you need compatibility with older Java versions, you can still define an interface containing all the required extra methods; but in this case, implement a "Decorator" class with a method that receives the object to "decorate" as a parameter and returns a dynamic proxy (java.lang.reflect.Proxy) instance, wrapping the passed object with this interface.
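A rough sketch of such a decorator (the interface and class names are made up; it forwards calls that the wrapped object can handle and treats everything else as an empty no-op, which only works cleanly for void or reference return types):

import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

public class ActionDecorator {
    @SuppressWarnings("unchecked")
    public static <T> T decorate(Object target, Class<T> extendedInterface) {
        return (T) Proxy.newProxyInstance(
                extendedInterface.getClassLoader(),
                new Class<?>[] { extendedInterface },
                (proxy, method, args) -> {
                    try {
                        // delegate when the wrapped object has a matching method
                        Method real = target.getClass()
                                .getMethod(method.getName(), method.getParameterTypes());
                        return real.invoke(target, args);
                    } catch (NoSuchMethodException e) {
                        return null; // the "extra" method: empty body
                    }
                });
    }
}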
If you are using Spring, the decorator can be added to the context as a @Component, so you can inject it wherever you need to use it. If any of the objects you need to inject are Spring Beans, you could implement a FactoryBean that uses the decorator to return the instances, so you can just forget about calling the decorator explicitly for them.

A Way Of Saying "For All Subclasses In My Project That Extend This Superclass"?

Here's something that's got me a bit stumped but intrigued all the same. In my Android game I have various Levels that extend the superclass Level. What I am trying to do is build a levelDirectory (based on the Singleton DP) that essentially is an object that has a HashMap object within it that stores all the Level subclasses. Here is my question:
We're all familiar with the enhanced for loop, but how can I write something that would be the equivalent of
for (Level l : An Array Of Every Level Subclass In My Project that is an Extension of the Level Superclass) {
    HashMap.put(l.name, l);
}
I am trying to build a system that can dynamically update itself as I add more and more level subclasses. I know that having a method in Level that submits the level to the static directory, called from Level's constructor, is an option, but I'm just wondering whether there is a way of doing what I said above in that enhanced for loop?
Many thanks
The question itself is wrong. You cannot loop over a list of "Every Level Subclass In My Project" and get instances of Level - l would have to be a Class, not a Level.
From the context, I think you meant "every instance of every Level subclass". No, it is not possible - a virtual machine is not and should not be a database. You cannot just query for objects, you have to manage references in your code (but you already knew that - your constructor solution will work).
Option 1:
Lately I had to solve a similar problem within JavaSE. I'm using the Google Reflections Library for that:
http://code.google.com/p/reflections/
However I'm not sure if it can run on Android. I think it's worth giving it a try, since it's quite easy to use. In your case you would do something like:
Reflections reflections = new Reflections("my.project.prefix");
Set<Class<? extends Level>> subTypes = reflections.getSubTypesOf(Level.class);
That would give you a Set (subTypes) to iterate on and put it in the HashMap.
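A sketch of that loop (it assumes each concrete Level subclass has a public no-arg constructor and that Level exposes its name via a name field, as in the question's pseudo-code):

HashMap<String, Level> levelDirectory = new HashMap<>();
for (Class<? extends Level> levelClass : subTypes) {
    try {
        Level level = levelClass.getDeclaredConstructor().newInstance();
        levelDirectory.put(level.name, level);
    } catch (ReflectiveOperationException e) {
        e.printStackTrace(); // skip subclasses that cannot be instantiated this way
    }
}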
Option 2:
You could maybe use custom annotations to annotate your Level classes, for example:
@Level public class MyCustomLevel {}
Then use a custom annotation processor which extends AbstractProcessor to process the annotation at compile time. Implement the process method to find all classes annotated with your @Level annotation. Now you can write the fully qualified names of the found classes to a property file in your META-INF dir. From your application you can read this property file and instantiate the classes using reflection.
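For completeness, a sketch of what the annotation itself might look like (it is named @Level here only to match the example above; in a real project you would probably pick a name that does not clash with the Level superclass):

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

@Target(ElementType.TYPE)            // only allowed on classes
@Retention(RetentionPolicy.SOURCE)   // compile-time only; the processor writes the names to META-INF
public @interface Level {
}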
If you're trying to dynamically fetch the list of all classes that extend Level at runtime, that's not really possible, I'm afraid. Have a look at this thread: How do you find all subclasses of a given class in Java?
I think you might want to make the level an interface and then check if it's an interface.
In its most common form, an interface is a group of related methods with empty bodies. A bicycle's behavior, if specified as an interface, might appear as follows:
interface Bicycle {
    void changeCadence(int newValue); // wheel revolutions per minute
    void changeGear(int newValue);
    void speedUp(int increment);
    void applyBrakes(int decrement);
}
To implement this interface, the name of your class would change (to a particular brand of bicycle, for example, such as ACMEBicycle), and you'd use the implements keyword in the class declaration:
class ACMEBicycle implements Bicycle {
    // remainder of this class implemented as before
}
Implementing an interface allows a class to become more formal about the behavior it promises to provide. Interfaces form a contract between the class and the outside world, and this contract is enforced at build time by the compiler. If your class claims to implement an interface, all methods defined by that interface must appear in its source code before the class will successfully compile.
I think the standard way, in the "spirit" of Java, is the service provider pattern.
Add a declaration file in the META-INF/services of the "plugin" jar and use java.util.ServiceLoader (http://developer.android.com/reference/java/util/ServiceLoader.html) to enumerate your providers.
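A sketch of the ServiceLoader approach (it assumes each level jar ships a text file META-INF/services/<fully.qualified.Level> listing its concrete subclasses one per line, that those subclasses have public no-arg constructors, and that Level has some accessor for its name - getName() below is an assumption, not from the question):

import java.util.HashMap;
import java.util.Map;
import java.util.ServiceLoader;

public class LevelDirectory {
    private final Map<String, Level> levels = new HashMap<>();

    public void discover() {
        // enumerate every registered Level implementation on the classpath
        for (Level level : ServiceLoader.load(Level.class)) {
            levels.put(level.getName(), level);
        }
    }
}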
Don't know much about Android but sounds like Reflection might help here, so what do you know about reflection in Java?
EDIT
Didn't know you had to limit yourself to loaded levels. That being the case, you would want to do your tracking on every instance as it is created, pretty much like you proposed in your question.
My idea involved parsing all the directories of a project looking for subclasses - it could be done once at the start of program execution but it would list levels that may never get instantiated...

Preferred method for initializing a child class parameters in Java?

I have an application that takes some input and generates configuration files as output. Since the exact input or output format could change over time, I defined two interfaces: Importer and Exporter.
Each concrete importer or exporter could have different parameters that need to be initialized to work. For example, if the import data is coming from a CSV file you only need the path of the file, but if the data is coming from a database then you need a connection string, username, password, etc. Same thing for exporters.
My implementation currently is:
public interface Importer {
    public void setup(Map<String, String> params);
    public List<ConfigEntry> getList();
}

public interface Exporter {
    public void setup(Map<String, String> params);
    public void writeDocument(List<ConfigEntry> entries) throws IOException;
}
The setup method needs to be called before getList() or writeDocument() can be called. I use a Map to keep parameters because each child class can have different parameters.
Is using JavaBean-style parameter initialization a preferred way? That means adding setConnectionString(), setCSVFilePath(), setX() to each child class.
What are the advantages, disadvantages of these approaches?
There are two obvious downsides to the map-based approach:
Absence of well-defined parameter names. Yes, you could define them as constants somewhere, but you'd still need to check that the parameter name passed is valid.
Absence of well-defined parameter types. Even worse than the above - if I need to pass an integer I'd have to convert it to a String and you'll have to parse it (and deal with possible errors). This can be somewhat mitigated by using Map<String,Object> and auto-boxing, but then you'd still need to validate the appropriate types.
The setter-based approach has only one downside - it can't be done. That is, it can't be reliably done using setters ALONE - you need to supplement them with some kind of init() or afterPropertiesSet() method that will be called after all the setters and will allow you to perform additional (co-dependent) validation and initialization steps (see the sketch below).
Also, something like this practically begs for some kind of Dependency Injection framework. Like Spring, for example.
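A hypothetical sketch of that setter-plus-init() style for one concrete importer (the class, field and method names are illustrative, not from the question):

public class DatabaseImporter {
    private String connectionString;
    private String username;
    private String password;

    public void setConnectionString(String connectionString) { this.connectionString = connectionString; }
    public void setUsername(String username) { this.username = username; }
    public void setPassword(String password) { this.password = password; }

    // called once, after all setters have run, to validate co-dependent parameters in one place
    public void init() {
        if (connectionString == null || username == null || password == null) {
            throw new IllegalStateException("connectionString, username and password must all be set");
        }
    }
}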
I wouldn't say that passing a Map (or Properties) object in the constructor is necessarily preferred over child class specific setter, or vice versa. Which approach is best depends on how you are going to instantiate the classes.
If you are going to instantiate the classes directly from Java then the Map approach tends to be neater, especially if you have a good way to assemble the maps. (For example, loading a Properties object from a property file.) The 'setters' approach forces you to write code against each of the child class APIs.
On the other hand, if you are going to instantiate the classes using some container framework that supports "wiring", "inversion of control" or the like (e.g. Spring, PicoContainer, JavaBeans, etc), then setters are generally better. The framework typically takes care of when and how to instantiate the classes and call the setters, using reflection under the hood to do the work.
So the answer is ... it depends ...
