My Question: How can I improve these methods, which check that a string is valid JSON using the official JSON for Java library, so that they don't have to catch and ignore JSONExceptions?
public boolean isValidJSON(String possibleJson) {
return isJSONObject(possibleJson) || isJSONArray(possibleJson);
}
private boolean isJSONObject(String possibleJson) {
try {
new JSONObject(possibleJson);
return true;
} catch (JSONException ex) {
return false;
}
}
private boolean isJSONArray(String possibleJson) {
try {
new JSONArray(possibleJson);
return true;
} catch (JSONException ex) {
return false;
}
}
I'm pretty sure it's not best practice to depend on exceptions thrown as part of the logic in a method. Is there another way to do this?
Note: Remember, I would prefer to not use other libraries to do this. It is a small part of a big project and I don't want to introduce another dependency if I can help it.
If you don't want to use another library, I think your approach is the best way to do it; there is no validation function in that library. If you want to use something else, you can try JSON Tools with its JSONValidator object.
From the official JSON Java library Javadocs, it doesn't look like there's any other way. As you don't want to use any other library to do this and you don't want to incur the overhead added by exception handling, a workaround is to check their source code and rewrite a method like isValidJSON yourself (without adding exception handling). A downside of this is you won't get the future modifications to that class/method automatically.
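If the main concern is simply avoiding two failed parses for every invalid string, a small pre-check on the first non-whitespace character ensures that only one of JSONObject or JSONArray is ever attempted. This is only a sketch built on the helpers from the question; parsing itself still relies on catching JSONException, since org.json has no validate-only API.
public boolean isValidJSON(String possibleJson) {
    if (possibleJson == null) {
        return false;
    }
    String trimmed = possibleJson.trim();
    if (trimmed.startsWith("{")) {
        return isJSONObject(trimmed);   // only the object parser is attempted
    }
    if (trimmed.startsWith("[")) {
        return isJSONArray(trimmed);    // only the array parser is attempted
    }
    return false;   // bare values such as "42" are not treated as valid here, matching the original behavior
}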
If you're willing to use other libraries, you can check out JSONUtils which has a method like mayBeJSON that returns a boolean.
As per my understanding of the Javadocs, there is no specific way to check this other than handling the exception and validating inside a try/catch block. You can use another library to validate it: for example, JSONUtils from json-lib (http://json-lib.sourceforge.net/apidocs/net/sf/json/util/JSONUtils.html) has a method named mayBeJSON(String) that returns a boolean and validates the JSON.
Related
I am using Java 8/Spring 5 to call a third-party API. I am using HttpClient which works well, and I have a POJO that works 99% of the time.
When I call this one service, the JSON looks like:
{"field1":"a",
"field2":"b",
"field3:"c"}
Based on different parameters it could come back as:
{"field1":"a",
"field2":{
"subfield1:"x",
"subfield2:"y",
"subfield3":"z"},
"field3:"c"}
I am using the latest FastJacksonMapper to convert from JSON string to a Java POJO, and it works in the first instance, but not in the second instance.
I know it may be common for JSON to change based on requests, but I expect JSON to be a little more consistent.
Any thoughts on how I could tweak my POJO? Any JSON annotations I can use to fix this? Or, maybe create a separate POJO so that in case one fails, the other picks up?
Thanks!
So for the same URL and HTTP method you can get different payloads? The same key "field2" might have a string value in option A and an object in option B? IMHO that's bad API design.
Does that third-party API have any description, such as Swagger? Such a description will help you generate a correct POJO with proper annotations.
If such documentation is not available try to use generators like: http://www.jsonschema2pojo.org/
private static Either<Type1Obj, Type2Obj> getPojoFromJson(String json) {
    // assuming Jackson's ObjectMapper, since the question maps JSON with Jackson
    ObjectMapper mapper = new ObjectMapper();
    try {
        // attempt to deserialize as Type1Obj
        return Either.left(mapper.readValue(json, Type1Obj.class));
    } catch (IOException e) {
        try {
            // attempt to deserialize as Type2Obj instead
            return Either.right(mapper.readValue(json, Type2Obj.class));
        } catch (IOException e1) {
            // neither shape matched; log and fall through to the fallback below
            e1.printStackTrace();
        }
    }
    return null;
}
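As a hedged alternative not covered in the answer above: since the question is already using Jackson, the ambiguous field can be declared as a JsonNode so that both payload shapes deserialize into the same POJO. The class and field names below are assumptions based on the question's JSON.
import com.fasterxml.jackson.databind.JsonNode;

public class ResponsePojo {
    public String field1;
    public JsonNode field2;   // either a text node ("b") or an object node with subfields
    public String field3;

    public boolean field2IsObject() {
        return field2 != null && field2.isObject();
    }
}
When field2 is an object you can then read the nested values with, for example, field2.path("subfield1").asText().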
I'm trying to determine the best way to create a new instance of a class based on which classes are available on the classpath at runtime.
For example, I have a library that requires a JSON response to be parsed in multiple classes. The library has the following interface:
JsonParser.java:
public interface JsonParser {
<T> T fromJson(String json, Class<T> type);
<T> String toJson(T object);
}
This class has multiple implementations, i.e. GsonJsonParser, JacksonJsonParser, Jackson2JsonParser, and currently, the user of the library is required to "pick" their implementation to be used based on which library they've included in their project. For example:
JsonParser parser = new GsonJsonParser();
SomeService service = new SomeService(parser);
What I'd like to do, is dynamically pick up which library is on the classpath, and create the proper instance, so that the user of the library doesn't have to think about it (or even have to know the internal implementation of another class parses JSON).
I'm considering something similar to the following:
try {
Class.forName("com.google.gson.Gson");
return new GsonJsonParser();
} catch (ClassNotFoundException e) {
// Gson isn't on classpath, try next implementation
}
try {
Class.forName("com.fasterxml.jackson.databind.ObjectMapper");
return new Jackson2JsonParser();
} catch (ClassNotFoundException e) {
// Jackson 2 was not found, try next implementation
}
// repeated for all implementations
throw new IllegalStateException("You must include either Gson or Jackson on your classpath to utilize this library");
Would this be an appropriate solution? It seems like kind of a hack, and it uses exceptions to control flow.
Is there a better way to do this?
Essentially you want to create your own JsonParserFactory. We can see how it's implemented in the Spring Boot framework:
public static JsonParser getJsonParser() {
if (ClassUtils.isPresent("com.fasterxml.jackson.databind.ObjectMapper", null)) {
return new JacksonJsonParser();
}
if (ClassUtils.isPresent("com.google.gson.Gson", null)) {
return new GsonJsonParser();
}
if (ClassUtils.isPresent("org.yaml.snakeyaml.Yaml", null)) {
return new YamlJsonParser();
}
return new BasicJsonParser();
}
So your approach is nearly the same as this, except for the use of the ClassUtils.isPresent method.
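If you would rather not pull in Spring just for ClassUtils, a rough stand-in is easy to write. This is only a sketch; the choice of class loader is an assumption, and JsonParserFactory is the factory class suggested above.
private static boolean isPresent(String className) {
    try {
        // initialize = false: just probe for the class, don't run its static initializers
        Class.forName(className, false, JsonParserFactory.class.getClassLoader());
        return true;
    } catch (ClassNotFoundException | LinkageError e) {
        return false;
    }
}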
This sounds like a perfect case for the Service Provider Interface (SPI) pattern. Check out the java.util.ServiceLoader documentation for an example of how to implement it.
If only one of the implementations (GsonJsonParser, JacksonJsonParser, Jackson2JsonParser) would be present at runtime and there is no other option, then you'd have to use Class.forName().
Although you can handle it a bit more smartly.
For example, you can put all the candidate class names into a Set<String> and loop over them. If one of them throws an exception, just continue to the next; for the one that doesn't, perform your operations (see the sketch below).
Yes, it is a hack, and your code becomes library-dependent. If there is any chance that you can include all three implementations of your JsonParser on the classpath and use some logic to decide which implementation to use, that would be a much better approach.
If this is not possible, you can continue with the above.
Also, instead of using the plain Class.forName(String name), you can use the better overload Class.forName(String name, boolean initialize, ClassLoader loader), which will NOT run any static initializers present in the class when initialize = false. Pass a class loader such as getClass().getClassLoader().
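Putting those two suggestions together, a sketch of such a loop could look like the following; the candidate class names and their ordering are just illustrative.
// requires java.util.LinkedHashMap, java.util.Map, java.util.function.Supplier
private static final Map<String, Supplier<JsonParser>> CANDIDATES = new LinkedHashMap<>();
static {
    CANDIDATES.put("com.fasterxml.jackson.databind.ObjectMapper", Jackson2JsonParser::new);
    CANDIDATES.put("com.google.gson.Gson", GsonJsonParser::new);
}

static JsonParser pickParser() {
    for (Map.Entry<String, Supplier<JsonParser>> entry : CANDIDATES.entrySet()) {
        try {
            // probe only; initialize = false avoids running static initializers
            Class.forName(entry.getKey(), false, JsonParser.class.getClassLoader());
            return entry.getValue().get();
        } catch (ClassNotFoundException e) {
            // not on the classpath, try the next candidate
        }
    }
    throw new IllegalStateException("No supported JSON library found on the classpath");
}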
The simple approach is the one SLF4J uses: create a separate wrapper library per underlying JSON library (GSON, Jackson, etc.) with a com.mypackage.JsonParserImpl class that delegates to the underlying library. Put the appropriate wrapper in the classpath alongside the underlying library. Then you can get the current implementation like:
public JsonParser getJsonParser() {
    // you probably want to cache the returned instance
    try {
        return (JsonParser) Class.forName("com.mypackage.JsonParserImpl").newInstance();
    } catch (ReflectiveOperationException e) {
        throw new IllegalStateException("No JsonParserImpl wrapper found on the classpath", e);
    }
}
This approach uses the class loader to locate the JSON parser. It is the simplest and requires no 3rd party dependencies or frameworks. I see no drawbacks to it relative to Spring, Service Provider, or any other method of locating resources.
Alternatively, use the Service Provider API, as Daniel Pryden suggests. To do this, you still create a separate wrapper library per underlying JSON library. Each library includes a text file at location "META-INF/services/com.mypackage.JsonParser" whose content is the fully qualified name of the JsonParser implementation in that library. Then your getJsonParser method would look like:
public JsonParser getJsonParser() {
return ServiceLoader.load(JsonParser.class).iterator().next();
}
IMO this approach is unnecessarily more complex than the first.
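That said, if you do go the ServiceLoader route, a slightly more defensive lookup (again just a sketch) avoids a bare NoSuchElementException when no provider jar is on the classpath:
public JsonParser getJsonParser() {
    Iterator<JsonParser> it = ServiceLoader.load(JsonParser.class).iterator();
    if (!it.hasNext()) {
        throw new IllegalStateException("No JsonParser provider found on the classpath");
    }
    return it.next();
}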
It sounds like a stupid question. However, in the API that I am using, version 1.7.2 has the method Bukkit.getServer().getOnlinePlayers() returning a Player[], and version 1.7.10 has Bukkit.getServer().getOnlinePlayers() returning a Collection<Player>. I need to make my plugin compatible with both.
I have the APIs for both, but aside from creating separate plugins, I currently have no idea how to do this.
I currently just convert the collection into an array anyway. So, is there any way (I can already get the version) to skip the .toArray() call when the version is less than 1.7.9, since getOnlinePlayers() already returns an array there?
You could do it with reflection. Something like this:
List<Player> getPlayers() {
    try {
        Method method = Server.class.getMethod("getOnlinePlayers");
        Object result = method.invoke(Bukkit.getServer());
        if (result instanceof Player[]) {
            // 1.7.2 and earlier: an array is returned
            return Arrays.asList((Player[]) result);
        } else {
            // 1.7.10+: a Collection<Player> is returned (not necessarily a List)
            return new ArrayList<>((Collection<Player>) result);
        }
    } catch (ReflectiveOperationException e) {
        // something went wrong! If you have a better way to handle problems, do that instead
        throw new RuntimeException(e);
    }
}
There are actually two ways to do this. #immibis's answer gives the first way. The second way involves creating a "version adapter" API with multiple plugin implementations for different versions of (in this case) Bukkit; e.g.
public interface BukkitVersionAdapter {
Collection<Player> getOnlinePlayers();
...
}
public class BukkitVersionAdapterV1dot7 implements BukkitVersionAdapter {
public Collection<Player> getOnlinePlayers() {
return Arrays.asList(Bukkit.getServer().getOnlinePlayers());
}
...
}
public class BukkitVersionAdapterV1dot8 implements BukkitVersionAdapter {
public Collection<Player> getOnlinePlayers() {
return Bukkit.getServer().getOnlinePlayers();
}
...
}
The version-specific adapter classes then need to be compiled against the Bukkit API jars for the respective Bukkit versions.
Then when you start your main application (or Bukkit plugin I guess), you do something like this:
String version = // get the Bukkit version
String className = // map the version to a class name; e.g.
// "pkg.BukkitVersionAdapterV1dot7" or
// "pkg.BukkitVersionAdapterV1dot8"
Class clazz = Class.forName(className);
BukkitVersionAdapter adapter =
(BukkitVersionAdapter) clazz.newInstance();
// Then ...
Collection<Player> players = adapter.getOnlinePlayers();
This is all rather cumbersome, but it has two benefits compared with the approach of using reflection to make the calls:
The logic that is specific to different Bukkit versions is now isolated to one part of the codebase ... rather than being scattered (potentially) all over the place.
The overheads of reflection are only incurred at startup. After that, all of the method calls to Bukkit made via the adapter API are regular Java method calls.
Depending on the context, these may make the adapter approach more appropriate.
If the class org.bukkit.Server just changed the return type of the method getOnlinePlayers() rather than deprecating it and introducing a new method (yuck! who dares to change an already published interface in such a way?), you have no decent way to call the method from a single codebase. You simply cannot write something like:
Server server = Bukkit.getServer();
// One of the branches will not be compilable
if (version <= xxxx) {
Player[] players = server.getOnlinePlayers();
}
else {
Collection<Player> players = server.getOnlinePlayers();
}
I'm afraid the only way is to use reflection, as others have recommended in the meantime.
You are doing well. Unfortunately there is no way to create a strongly typed method that can deal with both a collection and an array.
(You can create a method that accepts Object, examine it using instanceof, cast, and deal with both collections and arrays, but that solution is ugly.)
So I have a try/finally block. I need to execute a number of methods in the finally block. However, each one of those methods can throw an exception. Is there a way to ensure that all these methods are called (or attempted) without nested finally blocks?
This is what I do right now, which is pretty ugly :
protected void verifyTable() throws IOException {
Configuration configuration = HBaseConfiguration.create();
HTable hTable = null;
try {
hTable = new HTable(configuration, segmentMatchTableName);
//...
//various business logic here
//...
} finally {
try {
try {
if(hTable!=null) {
hTable.close(); //This can throw an IOException
}
} finally {
try {
generalTableHelper.deleteTable(configuration, segmentMatchTableName); //This can throw an IOException
} finally {
try {
generalTableHelper.deleteTable(configuration, wordMatchTableName); //This can throw an IOException
} finally {
generalTableHelper.deleteTable(configuration, haplotypeTableName); //This can throw an IOException
}
}
}
} finally {
HConnectionManager.deleteConnection(configuration, true); //This can throw an IOException
}
}
}
Is there a more-elegant way to do this?
The standard (working) way to write resource management in Java (the principle applies to other languages as well) is:
Resource resource = acquire(resource);
try {
use(resource);
} finally {
resource.release();
}
Or using the shortcut (with an extra bit of cleverness) in the current version of Java SE:
try (Resource resource = acquire(resource)) {
use(resource);
}
(As Joe K points out, you may need to wrap the resource to make it conform to the specific interface that the Java language depends upon.)
Two resources, and you just apply the idiom twice:
Resource resource = acquire(resource);
try {
SubResource sub = resource.acquire();
try {
use(sub);
} finally {
sub.release();
}
} finally {
resource.release();
}
And in Java SE 7:
try (
Resource resource = acquire(resource);
SubResource sub = resource.acquire()
) {
use(resource, sub);
}
The really great advantage of the new language feature is that, when written out by hand, resource handling was more often than not broken.
You might have more complicated exception handling. For instance, you don't want to throw low-level exceptions such as IOException through to the application proper - you probably want to wrap them in some subtype of RuntimeException. This can, with Java's typical verbosity, be factored out using the Execute Around idiom (see this excellent question). From Java SE 8, there will also be shorter syntax with randomly different semantics.
with(new ResourceSubAction() { public void use(Resource resource, SubResource sub) {
... use resource, sub ...
}});
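For completeness, here is a sketch of the with(...) helper that the snippet above assumes; Resource and SubResource are the same placeholder types used earlier in this answer.
interface ResourceSubAction {
    void use(Resource resource, SubResource sub);
}

static void with(ResourceSubAction action) {
    Resource resource = acquire();
    try {
        SubResource sub = resource.acquire();
        try {
            action.use(resource, sub);   // the caller only supplies the body
        } finally {
            sub.release();
        }
    } finally {
        resource.release();
    }
}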
If this is Java 7, you could consider using the new try-with-resources construct. You may need to create some basic AutoCloseable wrappers for deleting the tables.
In general, there's no way out of this. You need multiple finally blocks.
However, I don't want to comment on your specific code, whether or not that's an appropriate design. It certainly looks pretty odd.
There is no way, I'm afraid. There was a similar pattern when closing I/O resources: e.g., what do you do when closing a file throws an IOException? Usually you just had to ignore it. As this was a bit of an anti-pattern, they introduced the try-with-resources syntax in Java 7. For your example, though, I think there is no other option. Perhaps put each finally into its own method to make it clearer.
To call multiple methods from a finally block, you have to ensure that none of them throw -- which is a good idea anyway, because any exception thrown from a finally block will override the exception or return value thrown from the try/catch.
The most common use case is a file or database connection, in which case you write a "close quietly" method (or use one from an existing library, such as Jakarta Commons IO). If the things you need to clean up don't let you use a pre-existing method, you write your own (in your case, deleteTableQuietly()).
If you're using JDK 7, you can also use the try-with-resources construct.
You could create an abstract class Action with an execute method, and derive from that class one class for each exception-throwing method you want to call, invoking that method from execute. Then you can create a list of Actions and loop over the elements of the list, calling their execute methods in a try/finally block and ignoring exceptions (see the sketch below).
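A sketch of that idea, written here with a functional interface and lambdas instead of an abstract class for brevity; the helper and table names come from the question.
interface Cleanup {
    void execute() throws Exception;
}

List<Cleanup> cleanups = Arrays.<Cleanup>asList(
    () -> hTable.close(),
    () -> generalTableHelper.deleteTable(configuration, segmentMatchTableName),
    () -> generalTableHelper.deleteTable(configuration, wordMatchTableName),
    () -> generalTableHelper.deleteTable(configuration, haplotypeTableName),
    () -> HConnectionManager.deleteConnection(configuration, true)
);

for (Cleanup cleanup : cleanups) {
    try {
        cleanup.execute();
    } catch (Exception e) {
        // log and continue, so the remaining cleanups still run
    }
}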
deleteTableSilently(table1);
deleteTableSilently(table2);
deleteTableSilently(table3);

private void deleteTableSilently(String table) {
    try {
        generalTableHelper.deleteTable(configuration, table);
    } catch (IOException e) {
        log.error("Could not delete table " + table, e);
    }
}
Consider using the java.util.concurrent framework -- if you code each call as an individual Callable (named or anonymous), you could use ExecutorService.invokeAll.
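A sketch of that approach: a single-threaded executor keeps the cleanups sequential, and invokeAll waits for every task, capturing each task's exception in its Future rather than throwing it.
ExecutorService executor = Executors.newSingleThreadExecutor();
try {
    List<Callable<Void>> cleanups = Arrays.<Callable<Void>>asList(
        () -> { generalTableHelper.deleteTable(configuration, segmentMatchTableName); return null; },
        () -> { generalTableHelper.deleteTable(configuration, wordMatchTableName); return null; },
        () -> { HConnectionManager.deleteConnection(configuration, true); return null; }
    );
    executor.invokeAll(cleanups);   // blocks until all cleanups have been attempted
} catch (InterruptedException e) {
    Thread.currentThread().interrupt();
} finally {
    executor.shutdown();
}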
I have a method which opens a web service session. The method structure looks something like this:
public Soap getServicePort()
{
//TODO: Open a connect and return the SOAP object
return soap;
}
I have a requirement to add a monitor straight after the return. The monitor's job is to wait for 2 hours, invalidate the session, and rebuild a new one - the reason being that the current session will be invalid by that time, and therefore we need to rebuild and return a new session.
Can anyone suggest a reasonable way of doing this?
Thanks.
public Soap getServicePort()
{
try {
return soap;
} finally {
// add monitor here.
}
}
But be careful: the monitor should not throw exceptions. Put its initiation into a try/catch.
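A hedged sketch of what that monitor could look like, using a ScheduledExecutorService to rebuild the session every two hours; createNewSession() is a hypothetical helper, not something from the question.
private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
private volatile Soap soap;

public Soap getServicePort() {
    if (soap == null) {
        soap = createNewSession();   // hypothetical helper that opens the connection
        // rebuild the session every 2 hours so callers never receive an expired one
        scheduler.scheduleAtFixedRate(() -> soap = createNewSession(), 2, 2, TimeUnit.HOURS);
    }
    return soap;
}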
Probably a better solution is the wrapper pattern. For example, you can define an interface with the method getServicePort() and two implementations: one your real implementation, and another that wraps the real one and adds the monitor. This solution is more flexible; for example, you may later have to create your monitor after other methods, or even after methods implemented in other classes.
In that case you can use AOP. There are several ways to use it: one is Java's dynamic proxies, another is special tools like AspectJ.
So, choose your solution. Your choice should depend on the complexity of your task and the number of methods/classes involved. If it is only one method, use try/finally; if it is several methods in the same class, use the wrapper pattern; if it is required for several methods in several classes, use a dynamic proxy or AspectJ.
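And a rough sketch of the wrapper approach described above; SoapService and the monitor helper are assumed names, not part of the question.
public interface SoapService {
    Soap getServicePort();
}

public class MonitoredSoapService implements SoapService {
    private final SoapService delegate;

    public MonitoredSoapService(SoapService delegate) {
        this.delegate = delegate;
    }

    @Override
    public Soap getServicePort() {
        Soap soap = delegate.getServicePort();
        startSessionMonitor(soap);   // hypothetical helper that schedules the 2-hour refresh
        return soap;
    }

    private void startSessionMonitor(Soap soap) {
        // e.g. the ScheduledExecutorService idea sketched above
    }
}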
You can try logic like this... no need to have a monitor for this:
private Soap soap = null;

public Soap getServicePort() {
    try {
        if (soap != null && soap.isValid()) {
            // not sure about the method isValid(); use whatever condition checks the session
            return soap;
        } else {
            // create a new soap session here & return it
            return soap;
        }
    } catch (Exception e) {
        // the session check or creation failed
        return null;
    }
}
Call the method as many times as you want...