I am thinking about the best way to manage package versioning in our project when we add the @Deprecated annotation to some methods. We are using the maven-bundle-plugin, which takes a package version either from a packageinfo file or, if there is no such file, from the Bundle-Version.
We follow this convention for package versions ("X.Y.Z"): Z grows when there is a bug fix, Y grows when there is a new feature, and X grows when compatibility is broken.
So let's say we have package P with version "1.0.0". In P there are two classes, Foo and Bar. Foo has two methods, A() and B(), and we have deprecated method A(); these are all the changes in the whole package. How does the package version change?
I'll refer to these version segments by their proper OSGi names: Major (your X), Minor (Y) and Micro (Z):
Major should be incremented when there is a breaking change that affects all users of the API;
Minor should be incremented when there is a new feature, which does not break consumers but it may break providers since they cannot automatically provide the new feature;
Micro should be incremented for any other change that does not cause any backwards compatibility issues.
Clearly, adding @Deprecated to a method does not change compatibility for either consumers or providers. Therefore this is a Micro change (in your example you should increment "Z"). It is at most a documentation change to the API.
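For completeness, under the bnd/maven-bundle-plugin convention mentioned in the question, the bump can be recorded in the package's packageinfo file, which sits next to the package's source files (a sketch; 1.0.0 is the pre-change version from the example above, so the deprecation takes it to 1.0.1):

```
version 1.0.1
```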
I have an old application I am upgrading from Saxon-HE 9.2 to 9.5 (and hopefully to 9.8). The application implements the Saxon Debugger interface. After upgrading to 9.5, the debugger no longer fires any events. I thought this might be due to bytecode optimization, and set GENERATE_BYTE_CODE to false. However, I still receive no debugging events. I believe I have found the issue, but don't know the fix. In 9.2, InstructionInfo had a number of subclasses, including StyleElement. My code expects some InstructionInfo instances to be StyleElements, which they no longer are.
public class Saxon2TraceListener implements TraceListener {
    // implement interface
    public void enter(InstructionInfo instruction, XPathContext context) {
        if (!(instruction instanceof StyleElement)) {
            return;
        }
        // do logic with StyleElement
    }
}
How can I get a StyleElement from InstructionInfo?
I'm afraid when you're working with low-level interfaces like this, there's no substitute for reading and understanding the source code. 9.2 and 9.5 are both unsupported releases, and in any case this is the kind of support that we only really offer to paying customers - we have to draw the line somewhere.
I think the actual Debugger interface has been obsolete for some time. Its original idea was to allow you to annotate a stack frame with the names of the variables occupying each slot, but that's now a standard product feature and doesn't require a custom debugger.
You seem to be talking instead about the TraceListener interface, which has certainly undergone changes over successive releases, inevitably since it gives you access to the internal representation of a compiled stylesheet which is something we are always tweaking.
I'm not sure what the situation was in 9.5 or 9.8, but in 10.x the argument to TraceListener.enter() and TraceListener.leave() has changed from an InstructionInfo to a Traceable, and every expression and instruction is a Traceable.
A StyleElement is a node in the tree representation of the source stylesheet, and the source stylesheet no longer exists at runtime, whether you're tracing/debugging or not.
In Lucene 4.3.1 there was an interface StandardTokenizerInterface, and a number of classes implemented it, such as StandardTokenizerImpl and others. This interface doesn't exist in Solr 5.3.1. What is its replacement in Solr 5.3.1?
The interface was not replaced; it was removed entirely, as it was deemed to no longer serve a useful purpose due to changes in how backwards compatibility is handled (instead of passing in a version argument, you would just use StandardTokenizer40, for instance). Ticket here: LUCENE-6000
The calls specified in the interface are still used in pretty much the same way by the current StandardAnalyzerImpl though, as far as I can tell.
The inspection reports any uses of java.util.Vector or java.util.Hashtable. While still supported, these classes were made obsolete by the JDK 1.2 collection classes and should probably not be used in new development...
I have a project in Java which uses Vector everywhere, and I'm using JDK 8, which is the latest one. I want to know if I can run that application on the latest Java.
Also, please tell me whether I can use some other class, like ArrayList, in place of Vector in new Java.
First of all, although Vector is mostly obsoleted by ArrayList, it is still perfectly legal to use, and your project should run just fine.
As noted, however, it is not recommended to use. The main reason for this is that all of its methods are synchronized, which is usually useless and could considerably slow down your application. Any local variable that's not shared outside the scope of the method can safely be replaced with an ArrayList. Method arguments, return values and data members should be inspected closely before being replaced with ArrayList, lest you unwittingly change the synchronization semantics and introduce a hard-to-discover bug.
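As a sketch of that advice (class and variable names here are illustrative): a local, unshared Vector can be swapped for ArrayList directly, while a list that is genuinely shared between threads can keep its synchronized semantics via Collections.synchronizedList:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class VectorMigration {
    public static void main(String[] args) {
        // Local, unshared: plain ArrayList is a drop-in replacement for
        // Vector and avoids per-call synchronization overhead.
        List<String> names = new ArrayList<>();
        names.add("a");
        names.add("b");

        // Shared across threads: keep the synchronization explicit so the
        // original Vector semantics are preserved.
        List<String> shared = Collections.synchronizedList(new ArrayList<>());
        shared.add("x");

        System.out.println(names.size() + "," + shared.size());
    }
}
```

Note that even a synchronized list only makes individual calls atomic; compound operations (iterate-then-modify) still need external locking, with Vector as much as with the wrapper.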
I am looking for a tool that can take a large set of classes and search them for unused methods/variables when given a set of seed classes. My goal is to refactor the large set of classes so I can extract only the parts that are actually used by the seed classes.
When I say seed classes, I mean a set of classes to be used as entry points for figuring out what is unused. For instance, if class A calls class B and class C calls class D, but the only seed class is class A, then class C and class D should both be considered unused. The tool I am looking for should be able to report the unused classes/methods/variables based on the set of seed classes. Does such a tool exist?
Eclipse's Java error/warning settings will help you find unused variables, through the Unused local or private member setting shown below. Unused method notifications should be controlled by a similar setting.
The Unused import setting, right above the highlighted line in the screenshot, should help somewhat with finding unused classes, but not on the scale you want. To use your example, I don't think Eclipse will recognize classes C or D as unused, as I don't think it can differentiate between the "seed group" and the "large group."
You should take a look at CodePro Analytix.
Semantic Designs (my company) has such a tool for Java 1.4. You designate a set of Java source files and, essentially, the seed classes. It returns two results:
A list of declarations (and their precise source code locations) that are useless with respect to the seeds (including the transitive closure of dead code).
A revised version of the supplied code, with the dead declarations removed.
If you like what you see in the first set, you can use the modified code. If the first set lists something you think should have been in the used list, add it to the seed set and run it again.
It assumes you aren't using arbitrary reflection (if you are, simply list those classes that might be inspected by reflection; no, there is no easy way out of this).
We're working on one for Java 1.5/6/7 and hope to complete it this summer. If someone were interested, we could make the 1.4 version available for experimentation.
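The core of what such a tool computes can be sketched as a reachability walk over a call graph, starting from the seed classes (a toy model using the A/B/C/D example from the question; a real tool would of course build the graph from bytecode or source rather than a hand-written map):

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.TreeSet;

public class SeedReachability {
    public static void main(String[] args) {
        // Toy call graph matching the example: A -> B, C -> D.
        Map<String, List<String>> calls = new HashMap<>();
        calls.put("A", List.of("B"));
        calls.put("C", List.of("D"));

        Set<String> used = reachableFrom(Set.of("A"), calls);

        // Everything not reachable from the seeds is dead.
        Set<String> unused = new TreeSet<>(Set.of("A", "B", "C", "D"));
        unused.removeAll(used);
        System.out.println(unused);
    }

    // Breadth-first walk: a class is "used" if a seed (transitively)
    // reaches it through the call graph.
    static Set<String> reachableFrom(Set<String> seeds,
                                     Map<String, List<String>> calls) {
        Set<String> seen = new HashSet<>(seeds);
        Deque<String> work = new ArrayDeque<>(seeds);
        while (!work.isEmpty()) {
            String current = work.poll();
            for (String callee : calls.getOrDefault(current, List.of())) {
                if (seen.add(callee)) {
                    work.add(callee);
                }
            }
        }
        return seen;
    }
}
```

With seed A, only A and B are reachable, so C and D are reported unused, exactly as the question expects.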
I have a java application that uses some third party API. The third party jar files change fairly frequently because of all kinds of patches, but the API itself does not change that often.
Do I have to recompile my application every time the third party jar changes?
If the API changes, you should recompile even if you don't need to make any changes in your source code. If the API hasn't changed, you don't need to recompile.
The reason for the "even if you don't need to make any changes" is that some source-compatible changes may not be binary compatible. For instance, suppose you are currently calling:
public void foo(String x)
and in a later version this is changed to:
public void foo(Object x)
Obviously your code will still compile, but the method it resolves the call to will change.
This is a bit of an edge case, of course. Basically, so long as you know when the API changes, you should be okay.
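A small self-contained illustration of why resolution matters (both overloads coexist here so the effect is visible in one program; in the real scenario the String overload would simply have been replaced between library versions):

```java
public class OverloadResolution {
    static String foo(String x) { return "String overload"; }
    static String foo(Object x) { return "Object overload"; }

    public static void main(String[] args) {
        // The compiler picks the most specific applicable overload at
        // compile time, based on the argument's static type. This choice
        // is baked into the caller's bytecode, which is why a changed
        // signature can require recompilation even when the source
        // still compiles.
        System.out.println(foo("hello"));
        System.out.println(foo((Object) "hello"));
    }
}
```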
Another case where a recompilation is theoretically necessary is constants. The value of a constant is literally compiled into the bytecode of classes that use it. If the value is changed in a new version of the API, anything compiled against the old version will continue using the old value, which may lead to different behaviour and bugs that are very difficult to diagnose.
Of course, no API designer with an ounce of brain will change the values of public constants if they can avoid it, but there may be cases where it is necessary.
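A minimal sketch of the constant-inlining effect (hypothetical names; both classes live in one file here, so they are always compiled together, and the comments describe what happens when they are compiled separately):

```java
class ApiConstants {
    // A compile-time constant: its value is copied ("inlined") into the
    // bytecode of every class that reads it, at that class's compile time.
    public static final int LIMIT = 10;
}

public class ConstantClient {
    public static void main(String[] args) {
        // If a new library version changed LIMIT to 20 and only the
        // library jar were swapped in, this would still print 10 until
        // ConstantClient itself was recompiled, because the literal 10
        // is baked into ConstantClient's bytecode.
        System.out.println(ApiConstants.LIMIT);
    }
}
```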
No, you only need to recompile code you change.
Generally no, if the third party is well designed.
However, in the (horrible) case where the API changes a method signature, or removes a method or class, you will need to modify and recompile your own code.
For example, if you have doSomething(String value); and it becomes doSomething(int value); (or String doSomething() becomes int doSomething()), then the code in your application that calls doSomething(...) will no longer work. So a modification of your code is required (and then a recompilation).
However, you should know that this case is extremely rare (unless you are using a pre-alpha dependency, for example). Generally, to ensure backwards compatibility, APIs never remove classes or methods. The @Deprecated annotation is used instead, to encourage developers to move to another method or class...
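A sketch of that pattern (hypothetical API): the old signature stays in place, marked @Deprecated and delegating to the new one, so existing callers keep compiling and running:

```java
public class Api {
    /** @deprecated use {@link #doSomething(int)} instead. */
    @Deprecated
    public static String doSomething(String value) {
        // Kept only for backwards compatibility; delegates to the
        // replacement method so behaviour stays consistent.
        return doSomething(Integer.parseInt(value));
    }

    public static String doSomething(int value) {
        return "result:" + value;
    }

    public static void main(String[] args) {
        // Old callers still work; the compiler merely emits a
        // deprecation warning for the first call.
        System.out.println(Api.doSomething("7"));
        System.out.println(Api.doSomething(7));
    }
}
```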
Java is not C++: when you use a library, you don't compile a single line of its code into your own. You just load the library at runtime and use it.
For this reason you don't need to recompile your code :)