Write code that works on Desktop and Android - java

I'm working on a project in Java that will most likely support Android in the future. But from what I know, Android has different classes/APIs than standard Java (for example, I don't think Android has all of the AWT stuff). So what I'm wondering is how I can write my code so that if it is running on Android, it will use the Android APIs, and if it is on a desktop, it will use the standard Java APIs. I have looked into conditional imports, but unlike C++, that doesn't exist in Java. So how is this kind of thing solved in Java?
Here is an example of what I would like to be able to do:
int[] i;
if (onAndroid) {
    i = androidFoo.bar();
} else {
    i = javaFoo.bar();
}
EDIT:
One thing I had thought of was using a common class so I don't directly call the APIs. But what I was trying to figure out is how to call those classes if they aren't necessarily present, without the compiler complaining that the classes don't exist.

You can create architectural layers which are agnostic of any particular user-interface toolkit. If these need to interact with the user interface, they can do so through interface types.
Atop those layers, you can create multiple presentation layers with different toolkits.
Porting software to new user interfaces is common. Separating architectural layers from the beginning can cost little. De-tangling a monolith later can be expensive, sometimes to the point that it is economically infeasible for a company. Whether a requirement to port is certain, possible or unknown, it can be a good practice to separate architectural layers early.
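As a rough sketch of what such a layer boundary might look like in Java (ImageLoader, loadPixels and DesktopImageLoader are invented names for illustration, not anything from the question):

// Hypothetical, platform-agnostic contract used by the core logic.
public interface ImageLoader {
    int[] loadPixels(String path);
}

// Desktop presentation layer: one possible implementation built on AWT/ImageIO.
public class DesktopImageLoader implements ImageLoader {
    @Override
    public int[] loadPixels(String path) {
        try {
            java.awt.image.BufferedImage img =
                    javax.imageio.ImageIO.read(new java.io.File(path));
            return img.getRGB(0, 0, img.getWidth(), img.getHeight(), null, 0, img.getWidth());
        } catch (java.io.IOException e) {
            throw new RuntimeException("Could not load " + path, e);
        }
    }
}

// An Android module would ship its own AndroidImageLoader built on android.graphics.Bitmap;
// the core logic only ever sees the ImageLoader interface.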

You will have to separate your code, but not like this. Your classes need to be designed to separate the code that is platform dependent from the code that isn't. Focus on keeping the core logic of the application in one layer and rendering the objects on the screen in another layer.

Regarding classes that don't necessarily exist, you can load them by name like this:
try {
    MyObject o = (MyObject) Class.forName("org.me.MyObject")
            .getDeclaredConstructor()
            .newInstance();
} catch (ClassNotFoundException x) {
    // Here you know the class does not exist
} catch (ReflectiveOperationException x) {
    // The class exists but could not be instantiated
}
Strictly speaking you should handle the other reflective exceptions as well, as above, but the idea is the same.
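Building on that, here is a hedged sketch of how the desktop/Android split from the question could be handled with reflection. PlatformFoo, PlatformFooFactory, AndroidFoo and DesktopFoo are hypothetical names; android.os.Build is used only as a class known to exist solely on Android:

// Hypothetical shared abstraction, compiled into the common code.
interface PlatformFoo {
    int[] bar();
}

class PlatformFooFactory {
    static PlatformFoo create() {
        String implName;
        try {
            Class.forName("android.os.Build");   // present only on Android
            implName = "org.me.AndroidFoo";
        } catch (ClassNotFoundException e) {
            implName = "org.me.DesktopFoo";
        }
        try {
            // Neither implementation is a compile-time dependency of the shared code.
            return (PlatformFoo) Class.forName(implName)
                    .getDeclaredConstructor()
                    .newInstance();
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException("Could not load " + implName, e);
        }
    }
}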

Dependency Injection can probably serve your needs. There are several frameworks out there: http://en.wikipedia.org/wiki/Dependency_injection#Frameworks_that_support_dependency_injection
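Even without a framework, plain constructor injection covers the case in the question. A minimal sketch with invented names (PlatformFoo, ReportService, DesktopFoo, AndroidFoo):

// Hypothetical abstraction implemented separately for desktop and Android.
interface PlatformFoo {
    int[] bar();
}

// Core logic depends only on the abstraction; the concrete implementation is injected.
class ReportService {
    private final PlatformFoo foo;

    ReportService(PlatformFoo foo) {
        this.foo = foo;
    }

    int firstValue() {
        return foo.bar()[0];
    }
}

// Desktop entry point:  new ReportService(new DesktopFoo());
// Android entry point:  new ReportService(new AndroidFoo());

A DI framework just automates this wiring; the dependency direction stays the same.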


Java refactoring using reflection

I am using a 3rd party API in a few Java applications. They have updated a few things in the latest version. We will have to update to the latest version, and that requires corresponding changes in our code.
The changes are:
1) The interface and abstract class names which we used to implement/extend have been changed. The method names have also been changed. These are all just name changes.
2) The classes which implement these interfaces need to be annotated with @Service.
3) Then we need to add some new Java files and a property file.
4) We also have an abstract class which extends the 3rd party abstract class, and then there are many concrete classes. A few methods from the 3rd party abstract class are overridden in our base abstract class, and a few more are overridden in the concrete classes.
I could do the refactoring through the Eclipse IDE, but we would prefer not to.
I would like this to be completely automated, like running a script.
I tried using Java reflection to find all the concrete subclasses of an abstract class and rename the methods, but it still looks risky.
Is there any other better approach?
It depends on how much code you need to change, how long each step takes, and how many times you repeat the same refactoring.
If it is only a few hundred classes, and/or simpler refactorings like renaming a class or interface can do most of the work, then do it by hand.
Otherwise if you really want to, you can try to write rules in a tool like AutoRefactor: https://github.com/JnRouvignac/AutoRefactor
Disclaimer: I am the author of AutoRefactor.
I remember reading somewhere that a programmer is someone who would rather spend 12 hours writing a script to automate a manual task than to spend 20 minutes actually doing that task.
I understand why you want to automate this - the API you're using is making life hard for its clients by renaming things. It's unusual for APIs to break compatibility with naming only - are you sure it's as simple as that?
My strong recommendation is to just bite the bullet and manually refactor. It will almost certainly take less time than automating the process, you'll identify further opportunities to improve your own application's design, and it's unlikely you will ever need to use the refactoring script again.
Unfortunately, I do not know the exact details of your situation, but I can point out some principles which, in my experience, can simplify life in the future.
In short, if you are using any 3rd party API, try to minimize its propagation into your code. Hide the 3rd party code behind your own abstractions (interfaces) using patterns like Adapter, Facade, etc.
Then, when the 3rd party code changes, you will only need to make changes in one place. This approach also gives you extra freedom: if you decide to use another 3rd party API, switching will be simple, because the major piece of your code will not be touched. It is also useful while testing: you can mock the actual 3rd party functionality.
For example, suppose your project needs persistent storage. You can start by declaring an interface like this:
interface IStorage {
    void save(Model m);
    Model load(int id);
}
This will allow you to:
- Defer the decision about the storage provider (maybe it will be MySQL, MongoDB, or simply an XML file on disk) until later.
- Easily substitute one 3rd party API for another (for example, switch from file storage to a database).
- Test your business logic easily by mocking this interface instead of using real storage.
- Speed up development in case some modules (which other developers have to write) require working storage: they will just use the IStorage interface as if it were already implemented.
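For illustration only, here is a bare-bones file-backed implementation of that interface; it assumes Model implements Serializable and exposes a getId() method, neither of which is given in the original answer:

import java.io.*;

// Hypothetical file-backed IStorage; assumes Model implements Serializable
// and has a getId() accessor. A real implementation might use a database or
// a 3rd party library instead; callers would never know the difference.
class FileStorage implements IStorage {
    private final File dir;

    FileStorage(File dir) {
        this.dir = dir;
    }

    public void save(Model m) {
        try (ObjectOutputStream out = new ObjectOutputStream(
                new FileOutputStream(new File(dir, m.getId() + ".bin")))) {
            out.writeObject(m);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public Model load(int id) {
        try (ObjectInputStream in = new ObjectInputStream(
                new FileInputStream(new File(dir, id + ".bin")))) {
            return (Model) in.readObject();
        } catch (IOException | ClassNotFoundException e) {
            throw new IllegalStateException("Could not load model " + id, e);
        }
    }
}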

Does the use of ObservableList in JavaFX go against Model-View-Controller separation?

I am attempting a study of JavaFX because I want to use it as the GUI of my program. My question is essentially a conceptual one:
To date my program is mostly the "Model" part of the MVC pattern; that is, almost all of my code is the OO-representation of abstractions in the sense of classes, and all of that code is logical code.
Since I do not want to be the only user of my program, I want to add the "View" part of MVC so that people can easily use and manipulate the "Model" part of my program. For this, I want to use JavaFX.
In my "Model" classes I obviously use various Lists, Maps, and other classes from the Java Collections API. In order to let the users of my program manipulate these underlying Lists and Maps I want to use the Observable(List/Map) interfaces in JavaFX.
A concrete example to bring clarity to the situation:
Let's say that I have a MachineMonitor class that every 3 minutes checks certain properties of a Machine, such as if the connection is still good, the speed that the gears are turning, etc. If certain inequalities are met (say that the speed of the gears has fallen to a rate of 1 turn/sec) the MachineMonitor fires a RestartMachineEvent.
Currently I use an ArrayList<MachineMonitor> to keep track of all of the individual MachineMonitors. Now, extending to the "View" part of MVC, I want the user to be able to manipulate a TableView that displays the list of MachineMonitors so that they can, for instance, create and remove MachineMonitors to monitor various Machines.
So that I can keep track of what the user of my program wants to do (say, create a MachineMonitor for Machine #5 that checks to see if the turn/sec of the gears falls below 0.5) I use an ObservableList<MachineMonitor> as the underlying List for the TableView.
The easiest way to link the "Model" and "View" of my program would simply be to change the "Model" class to have an ObservableList<MachineMonitor> and not an ArrayList<MachineMonitor> but (getting to the topic of the question) I feel that this is very messy because it mixes "Model" and "View" code.
A naïve approach would be to use an ObservableList<MachineMonitor> for the TableView and retain the use of my ArrayList<MachineMonitor>. However, changes made to the ObservableList<MachineMonitor> do not affect the underlying List as per the JavaFX specifications.
Given this, is the best way to solve this conundrum to make a ChangeListener for the ObservableList<MachineMonitor> that "propagates" the changes made to the ObservableList<MachineMonitor> to the underlying "Model" ArrayList<MachineMonitor>? Perhaps put this in a class called MachineMonitorController?
This ad-hoc solution seems very messy and non-ideal.
My question is: What is the best way to retain nearly complete separation between the "Model" and "View" in this scenario?
Briefly, I don't think use of ObservableList breaks the MVC contract.
The rest, you may read or not as you wish, as it is quite annoyingly long.
Architectural Pattern Background
Observables are useful in MVC style architectures because they provide a way of feeding data back and forth between the MVC components through loose couplings where the model and view classes don't need to refer directly to each other, but can instead work with some shared data model which communicates data flow. It's not a coincidence that the Observable pattern and the MVC style architecture concept both originated around the same time at Xerox PARC - the things are linked.
As noted in Martin Fowler's GUI architectures, there are numerous different approaches to building GUIs. MVC is just one of these, kind of the granddaddy of them all. It is nice to understand MVC well (it is often misunderstood) and MVC concepts are applicable in many places. For your application you should use the system which feels best for you rather than rigidly following a given pattern (unless you are using a particular framework which enforces a given pattern) and also be open to adopting different patterns within an application rather than trying to shoehorn everything into a single conceptual framework.
Java Beans are a fundamental part of almost all Java programs. Though traditionally often only used in client apps, the observer pattern, through PropertyChangeListeners, has been, for good reason, a part of the Java Bean specification since it was created. The observable and binding elements of JavaFX are a rework of that earlier work, learning from it to build something that is both more convenient to work with and easier to understand. Perhaps, if the JavaFX observable and binding elements had existed ten or twelve years ago as part of the JDK, such concepts would be more generally used in a wider variety of libraries and frameworks than a couple of pure GUI frameworks.
Advice
I suggest considering the MVVM model and other GUI architectures.
If you want a dead-easy framework which follows a model, view, presenter style, definitely give afterburner.fx a spin.
I think the correct choice of architecture depends on your application, your experience and the size and complexity of the problems you are trying to solve. For instance, if you have a distributed system, then you could follow REST principles rather than (or in addition to) MVC. Whichever you choose, the architecture should aid you in solving the problem at hand (and possibly future problems) and not the converse. Over-architecting a solution is a common trap and is very easy to do, so try to avoid it.
Caveat
One caveat to consider is that observables necessarily work via side-effects which can be difficult to reason about and can be antithetical to the concept of isolation. JavaFX features some good tools, such as ReadOnlyObjectWrapper and ReadOnlyListWrapper, to help limit the impact (damage control if you like) on observables so they don't run amok in your system. Use such tools (and immutable objects) with reckless abandon.
Learn from Examples
For a simple JavaFX application which is built using observables, refer to tic-tac-toe.
For a good way to structure a large and complex JavaFX application with FXML based components, refer to the source code for SceneBuilder and SceneBuilderKit. The source code is available in the JavaFX mercurial source tree, just check it out and start learning.
Read up on the JavaFX UI controls architecture. Examine the JavaFX controls source code (e.g. Button and ButtonSkin or ListView and ListViewSkin) to see how concepts such as MVC can be applied using JavaFX structures. Based on that learning, try creating some of your own custom controls using the architecture that the JavaFX controls framework provides. Often, when you are building your own application you don't need to create your own controls (at least ones which derive from JavaFX Control). The JavaFX Controls architecture is specially crafted to support building libraries of reusable controls, so it is not necessarily generally suitable for all purposes; instead it provides a concrete demonstration of one proven way to get certain things done. Adopting and adapting proven solutions goes a long way to ensuring you don't reinvent stuff needlessly and allows you to build on a solid base and learn from the trials of others.
Regarding your Concrete Example
I advise you to go with:
The easiest way to link the "Model" and "View" of my program would simply be to change the "Model" class to have an ObservableList and not an ArrayList
Maybe use a ReadOnlyListWrapper to expose the ObservableList from the MachineMonitor to the outside world, so that nothing can modify it unduly.
Setup some other structure which encapsulates the view (for example a ControlPanel and ControlPanelSkin) and provide it a reference to the read only observable list of MachineMonitors. The ControlPanelSkin can encapsulate a TableView, a graph or whatever visual knobs and widgets you want to use for the user to monitor the machines.
Using such a structure effectively isolates your view from the model. The model really doesn't know anything about the UI at all and ControlPanelSkin implementation could be changed out to a completely different visual representation or technology without changing the core MachineMonitor system at all.
The above just outlines a general approach; you'll need to tweak it for your specific example.
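A minimal sketch of that shape; MachineMonitorRegistry is an invented name, and only the JavaFX types are real:

import javafx.beans.property.ReadOnlyListProperty;
import javafx.beans.property.ReadOnlyListWrapper;
import javafx.collections.FXCollections;

// Model-side registry. Only the model mutates the list; the view receives a
// read-only (but still observable) handle it can feed to a TableView.
public class MachineMonitorRegistry {
    private final ReadOnlyListWrapper<MachineMonitor> monitors =
            new ReadOnlyListWrapper<>(FXCollections.observableArrayList());

    public void add(MachineMonitor m) {
        monitors.add(m);
    }

    public void remove(MachineMonitor m) {
        monitors.remove(m);
    }

    // e.g. tableView.setItems(registry.monitors()) inside ControlPanelSkin
    public ReadOnlyListProperty<MachineMonitor> monitors() {
        return monitors.getReadOnlyProperty();
    }
}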
I disagree that using an ObservableList in your "model" class violates MVC separation. An ObservableList is purely a data representation; it is part of the model and not part of the view. I (and others) use JavaFX properties and collections in model representations in all tiers of my applications. Among other things, I have pointed out elsewhere how I use JavaFX properties that are (or at least can be) bound to JSF. (I should mention that not everyone agrees with the approach of using FX properties on the server side; however, I don't really see any way to argue that they are somehow part of the view.)
Also, if you do
List<MachineMonitor> myNonObservableList = ... ;
ObservableList<MachineMonitor> myObservableList = FXCollections.observableList(myNonObservableList);
myObservableList.add(new MachineMonitor());
the observable list is backed by the non-observable list, so the change occurs in myNonObservableList too. You can use this approach if you prefer.

The missing "framework level" access modifier

Here's the scenario. As a creator of publicly licensed, open source APIs, my group has created a Java-based web user interface framework (so what else is new?). To keep things nice and organized as one should in Java, we have used packages with naming convention
org.mygroup.myframework.x, with the x being things like components, validators, converters, utilities, and so on (again, what else is new?).
Now, somewhere in class org.mygroup.myframework.foo.Bar is a method void doStuff() that I need to perform logic specific to my framework, and I need to be able to call it from a few other places in my framework, for example org.mygroup.myframework.far.Boo. Given that Boo is neither a subclass of Bar nor in the exact same package, the method doStuff() must be declared public to be callable by Boo.
However, my framework exists as a tool to allow other developers to create simpler, more elegant RIAs for their clients. But if com.yourcompany.yourapplication.YourComponent calls doStuff(), it could have unexpected and undesirable consequences. I would prefer that this never be allowed to happen. Note that Bar contains other methods that are genuinely public.
In an ivory tower world, we would re-write the Java language and insert a tokenized analogue to default access, that would allow any class in a package structure of our choice to access my method, maybe looking similar to:
[org.mygroup.myframework.*] void doStuff() { .... }
where the wildcard would mean any class whose package begins with org.mygroup.myframework can call, but no one else.
Given that this world does not exist, what other good options might we have?
Note that this is motivated by a real-life scenario; names have been changed to protect the guilty. There exists a real framework where peppered throughout its Javadoc one will find public methods commented as "THIS METHOD IS INTERNAL TO MYFRAMEWORK AND NOT
PART OF ITS PUBLIC API. DO NOT CALL!!!!!!" A little research shows these methods are called from elsewhere within the framework.
In truth, I am a developer using the framework in question. Although our application is deployed and is a success, my team experienced so many challenges that we want to convince our bosses to never use this framework again. We want to do this in a well thought out presentation of the poor design decisions made by the framework's developers, and not just as a rant. This issue would be one (of several) of our points, but we just can't put a finger on how we might have done it differently. There has already been some lively discussion here at my workplace, so I wondered what the rest of the world would think.
Update: No offense to the two answerers so far, but I think you've missed the mark, or I didn't express it well. Either way, allow me to try to illuminate things. Put as simply as I can: how should the framework's developers have refactored the following? Note this is a really rough example.
package org.mygroup.myframework.foo;

public class Bar {

    /** Adds a Bar component to the application UI. */
    public boolean addComponentHTML() {
        // Code that adds the HTML for a Bar component to a UI screen;
        // returns true if successful.
        // I need users of my framework to be able to call this method, so
        // they can actually add a Bar component to their application's UI.
        return true;
    }

    /** Not really public, do not call! */
    public void doStuff() {
        // Code that performs logic internal to my framework.
        // If other users call it, Really Bad Things could happen!
        // But it needs to be public so org.mygroup.myframework.far.Boo can call it.
    }
}
Another update: I have since learned that C# has the "internal" access modifier, so perhaps a better way to have phrased this question would have been, "How do you simulate/emulate internal access in Java?" Nevertheless, I am not in search of new answers; our boss ultimately agreed with the concerns mentioned above.
You get closest to the answer when you mention the documentation problem. The real issue isn't that you can't "protect" your internal methods; rather, it is that the internal methods pollute your documentation and introduce the risk that a client module may call an internal method by mistake.
Of course, even if you did have fine-grained permissions, you still aren't going to be able to prevent a client module from calling internal methods; the JVM doesn't protect against reflection-based calls to private methods anyway.
The approach I use is to define an interface for each problematic class, and have the class implement it. The interface can be documented solely in terms of client modules, while the implementing class can provide what internal documentation you desire. You don't even have to include the implementation javadoc in your distribution bundle if you don't want to, but either way the boundary is clearly demarcated.
As long as you ensure that at runtime only one implementation is loaded per documentation interface, a modern JVM will guarantee you don't suffer any performance penalty for using it; and you can load harness/stub versions during testing for an added bonus.
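Applied to the Bar example from the question, the split might look roughly like this; BarComponent is an invented name and the whole layout is a sketch, not a prescription:

// BarComponent.java: the documented, client-facing contract.
public interface BarComponent {
    /** Adds a Bar component to the application UI; returns true if successful. */
    boolean addComponentHTML();
}

// Bar.java: the implementing class; its javadoc (including doStuff) can stay
// internal and need not ship in the public documentation bundle at all.
public class Bar implements BarComponent {
    @Override
    public boolean addComponentHTML() {
        // framework rendering code
        return true;
    }

    // Still public so org.mygroup.myframework.far.Boo can call it, but absent
    // from the documented BarComponent contract that clients code against.
    public void doStuff() {
        // internal framework logic
    }
}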
The only idea that I can think of to supply this missing "framework-level access modifier" is CDI and a better design.
If you have to call a method from very different classes and packages in various (but few) situations, there will almost certainly be a way to redesign those classes so that those methods can be made private and inaccessible.
There is no support in the Java language for that kind of access level (you would like something like "internal" scoped to a namespace). You can only restrict access to the package level (or use the familiar public-protected-private model).
From my experience, you can use the Eclipse convention:
create a package called "internal"; the whole class hierarchy under this package (including sub-packages) is considered non-API code and may be changed at any time with no guarantees for your users. In that non-API code, use public methods whenever you like. Since it is only a convention and is not enforced by the JVM or the Java compiler, you cannot prevent users from using the code, but at least you let them know that these classes were not meant to be used by 3rd parties.
By the way, in the Eclipse platform source code there is a complex plugin model that keeps you from using the internal code of other plugins, by implementing a custom class loader for each plugin that prevents loading classes that should be "internal" to those plugins.
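For the Bar example, that convention might be applied roughly as follows; BarSupport and renderBar are invented names:

// org/mygroup/myframework/foo/Bar.java (the supported API surface)
package org.mygroup.myframework.foo;

import org.mygroup.myframework.internal.BarSupport;

public class Bar {
    public boolean addComponentHTML() {
        return BarSupport.renderBar(this);   // delegates to non-API code
    }
}

// org/mygroup/myframework/internal/BarSupport.java (public by necessity, but the
// "internal" package name signals it is not part of the supported API)
package org.mygroup.myframework.internal;

public final class BarSupport {
    private BarSupport() {
    }

    public static boolean renderBar(Object component) {
        // framework-private logic that used to live in Bar.doStuff()
        return true;
    }
}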
Interfaces and dynamic proxies are sometimes used to make sure you only expose methods that you do want to expose.
However, that comes at a fairly hefty performance cost if your methods are called very often.
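A minimal java.lang.reflect.Proxy sketch of that idea; it assumes a public BarComponent interface like the one sketched above, so anything not declared on that interface is simply unreachable through the handle clients receive:

import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

public final class BarComponents {
    // Clients get a proxy typed as the public interface; the concrete Bar
    // instance, with its framework-internal methods, never escapes.
    public static BarComponent newBarComponent() {
        Bar target = new Bar();
        InvocationHandler handler = (proxy, method, args) -> method.invoke(target, args);
        return (BarComponent) Proxy.newProxyInstance(
                BarComponent.class.getClassLoader(),
                new Class<?>[] { BarComponent.class },
                handler);
    }

    private BarComponents() {
    }
}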
Using the @Deprecated annotation might also be an option: although it won't stop external users from invoking your "framework private" methods, at least they can't say they hadn't been warned.
In general I don't think you should worry about your users deliberately shooting themselves in the foot too much, so long as you made it clear to them that they shouldn't use something.

Need help improving a tightly coupled design

I have an in-house enterprise application (EJB2) that works with a certain BPM vendor. The current implementation of the in-house application involves pulling in an object that is only exposed by the vendor's API and making changes to it through the exposed methods in the API.
I'm thinking that I need to somehow map an internal object to this external one, but that seems too simple and I'm not quite sure of the best strategy to go about doing this. Can anyone shed some light on how they have handled such a situation in the past?
I want to "black box" this vendor's software so I can replace it easily if needed. What would be the best approach from a design point of view to somehow map an internal object to this exposed API object? Keep in mind that my in-house app needs to talk to the API still, so there is going to be some dependency between the two, but I want to reduce it so I can also test in isolation from this software using junit.
Thanks,
Jason
Create an interface for the service layer; internally, all your code can work with that. Then make a class that implements that interface, calls the third-party API methods, and acts as the API facade.
For example:
interface IAPIEndpoint {
    MyDomainDataEntity getData();
}

class MyAPIEndpoint implements IAPIEndpoint {
    public MyDomainDataEntity getData() {
        MyDomainDataEntity dataEntity = new MyDomainDataEntity();
        // Call the third party api and fill it
        return dataEntity;
    }
}
It is always a good idea to put an interface in front of third-party APIs so you don't get their funk invading your app domain, and so you can swap them out as needed. You could make another class implementation that uses a different service entirely.
To use it in code you just call:
IAPIEndpoint endpoint = new MyAPIEndpoint(); // or obtain it in whatever way fits your setup
Basing your code on interfaces when it spans multiple implementations is the way to go. It works great for TDD as well: you can swap in a local test implementation of the interface and exercise your domain code entirely separately from the third-party API.
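For the JUnit-in-isolation part of the question, a hand-written stub of the same interface is enough; the names follow the sketch above, and MyService is likewise hypothetical:

// Test double that never touches the vendor API.
class FakeAPIEndpoint implements IAPIEndpoint {
    public MyDomainDataEntity getData() {
        MyDomainDataEntity canned = new MyDomainDataEntity();
        // fill in whatever fixture data the test needs
        return canned;
    }
}

// In a JUnit test, inject the fake wherever the real endpoint would be used:
//   MyService service = new MyService(new FakeAPIEndpoint());
//   assertEquals(expected, service.process());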
Abstraction: implement a DAL (data access layer) which will provide the translation from internal to external and back.
Then, if you switched vendors, your internals would remain valuable and you would only have to change out the vendor-specific code, assuming the vendors provide the same functionality and their data types can be mapped to each other.
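A hedged sketch of such a translation layer; Task, VendorTask and their accessors are placeholders standing in for your internal type and whatever the vendor's API actually exposes:

// Internal representation owned by the in-house application.
class Task {
    final String id;
    final String assignee;

    Task(String id, String assignee) {
        this.id = id;
        this.assignee = assignee;
    }
}

// The only class that knows about the vendor's exposed object.
class VendorTaskMapper {
    Task toInternal(VendorTask v) {              // vendor -> internal
        return new Task(v.getTaskId(), v.getOwnerName());
    }

    void applyChanges(Task t, VendorTask v) {    // internal -> vendor
        v.setOwnerName(t.assignee);
    }
}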
I will be the black sheep here and advocate for the YAGNI principle. The problem is that if you do an abstraction layer now, it will look so close to the third party API that it will just be a redundant layer. Since you don't know now what a hypothetical future second vendor's API will look like, you don't know what differences you need to account for, and any future port is likely to require a rework for those unforeseen differences anyway.
If you need a test framework, my recommendation is to make your own test implementation using the same API as the BPM vendor. Even better, almost all reputable API providers provide some sort of sandbox mode for testing. If they don't, you should ask for one.

OK to put my public interfaces into their own package

Would it be OK to put my public interfaces into their own package (for my organisation only)?
For example:
com.example.myprogram - contains all normal code
com.example.myprogram.public - contains public accessible interfaces
com.example.myprogram.abstract - contains abstract classes
Is this a good or a bad thing to do? Are there any disadvantages?
I wouldn't like this practice at all. You should group classes, both abstract and concrete, and interfaces according to functionality.
Look at the Java API as an example. Did Sun separate the Collections interfaces from implementations? No. Sun's practices aren't always the best guide, but in this case I agree.
Don't do it.
I can suggest two common approaches:
If you really think that your interfaces can have more implementations in the future (i.e. you're working on an API), then move them to a separate module and create a special package there named 'core', for example (com.example.myprogram.core). Implementations should live in corresponding packages (like com.example.myprogram.firstimpl).
If you have only one implementation, then leave all your interfaces in the com.example.myprogram package and put all concrete classes in the com.example.myprogram.impl package.
I can't see that being bad practice; however, you might want to consider, as an alternative, organizing your code per logical functionality rather than syntactic definition, so that all the code for a given unit of functionality (interfaces, abstract classes, normal code) goes in the same package. This is one of the principles of modular programming.
That said, putting all the interfaces (and only those) in a separate package might be necessary depending on the size of the project, and might even become almost unavoidable if you have a purely component-based plugin architecture (so that other modules know only about the interfaces and the actual implementation is somehow dynamically injected).
Public interfaces are a formal contract between system modules or systems. Because of that, it makes sense to isolate them from the remainder of the code, to make them stand out.
For example, in a system I've worked on, all public interfaces between the server and client components of the system have been placed in a special system module (called, no surprise, "api"). This has a number of desirable effects, among which these:
- semantically, you know where to look if you need any kind of information on how communication should take place
- you can version the api module separately, which is especially useful when you don't want a moving target, i.e. you sign a contract to deliver an application which will support "the api v1.1" rather than constantly playing catch-up while someone else changes the interface and requires you to adapt your side
That doesn't mean you shouldn't organize them further in sub-packages to distinguish what they are for. :)
In summary, you are doing the right thing by separating the interfaces from the rest of the code base, although depending on your specific needs, you might do well to take it a step further and isolate the interfaces in a separate system module.
