I have 5 packages in my workspace. One is the "core" package that holds the critical Java files for my application, and its name is also the name I want for the working set that contains it. The other 4 packages in the workspace are on its build path and provide convenience methods and the like. I want to create a working set based on this core package and have all the other packages on its build path be added to that working set automatically.
How do I achieve this without manually adding these extra packages to the working set, and then manually updating the working set whenever the dependencies change?
In my real-life setup I have numerous working sets to manage, so this quickly becomes tedious.
I don't think you can, unless you write an external script. The Eclipse help file states:
Newly created resources are not automatically included in the active working set. They are implicitly included in a working set if they are children of an existing working set element. If you want to include other resources after you have created them you have to explicitly add them to the working set.
This question already has answers here:
Spring Boot: Is it possible to use external application.properties files in arbitrary directories with a fat jar?
(12 answers)
I want to override a few configuration properties in my Spring Boot application during a restart, via an external configuration file.
What I am using:
java -jar -Dspring.profiles.active=${ENV} my-application.jar
This loads my profile-specific application properties at startup. Now suppose there is an issue and I need to change a configuration value in my application. I don't want to rebuild the application with the changed property; what I want instead is to provide an external property file containing the new value and simply restart the application.
I have tried the suggestion mentioned here: https://docs.spring.io/spring-boot/docs/current/reference/htmlsingle/#boot-features-external-config-application-property-files
Let's say I copy my jar to a bin folder on my server, create a /config folder inside the bin folder containing the override.properties file, and then run the same command as above to restart my application.
However, the property set in override.properties does not override the packaged value.
I also tried passing spring.config.location as a command-line argument, but then I would have to put all my properties in that file, which is not what I want.
If you look at the top of Section 24 in the link you cite, you'll see a long list of places that Spring looks for property sources. Have you looked down that list? There are a number of options for providing external properties that override internal ones. Basically, anything higher on the list overrides anything lower on the list.
One option is to put JSON in the single environment variable SPRING_APPLICATION_JSON. This is what we do for unexpected overrides. We always define this variable in a separate file included by our main startup script, but it is usually empty. At any time, though, we can add properties to it, and they will take priority over any existing property values. We chose this option because it has a very high priority: apart from mostly test-specific settings, the only thing that overrides it is properties put on the command line, and those, of course, can also be changed without building a new binary.
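As a rough sketch (the property keys below are just placeholders for whatever you need to override), the startup script can do something like this:

    # normally empty; fill in overrides only when needed
    export SPRING_APPLICATION_JSON='{"server.port":8081,"some.feature.flag":"false"}'
    java -jar -Dspring.profiles.active=${ENV} my-application.jar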
There are other promising choices on that list, like #14. I believe there are ways of having external properties files that don't replace existing ones, but rather just override them, so that you don't have to redefine all of your existing properties there. I'd be surprised if there were no way to do that: have an external properties file that overrides just a few properties.
UPDATE: The "duplicate" cited in the question comments backs up what I'm saying here. It says very clearly that multiple properties files will override each other. No one file need provide all the properties. So it seems you're on the right track, and just have something wrong with your properties file configuration. Just keep in mind what I'm saying. It may be easier to use some other source than a properties file, like either the single environment variable SPRING_APPLICATION_JSON, or individual environment variables.
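To illustrate the layout (assuming Spring Boot's defaults, where the external file has to be named application.properties, or application-<profile>.properties, unless spring.config.name or spring.config.location says otherwise; the property key below is just an example):

    bin/my-application.jar
    bin/config/application.properties    (contains only the keys you want to override, e.g. some.timeout=30)

    cd bin
    java -jar -Dspring.profiles.active=${ENV} my-application.jar

With the defaults, ./config/application.properties takes precedence over the same-named file packaged inside the jar, so it only needs to contain the handful of overridden keys.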
When you distribute a Java application to others, it can be deployed as a JAR file for easy execution.
But is there a way to change a Java class / part of the code after deployment without having to rebundle the whole application again?
Say you have an app with 10 classes, where 9 are finalized but one needs to be adjusted for each individual case. What would be the easiest way to change just that one class in the app?
You probably want to use Java Web Start. If your users start the application via Java Web Start, it is automatically updated whenever updates are available.
EDIT
It does not provide class-level granularity, but I believe that is not the real issue. It does, however, provide jar-level granularity, i.e. a newer version of a jar is downloaded only if it has changed.
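As a rough sketch of what that jar-level granularity looks like in a JNLP descriptor (all names and URLs here are invented), the application is split into separate jars so that only the changed jar gets re-downloaded:

    <?xml version="1.0" encoding="UTF-8"?>
    <jnlp spec="1.0+" codebase="http://example.com/myapp" href="myapp.jnlp">
      <information>
        <title>My App</title>
        <vendor>Example Inc.</vendor>
      </information>
      <resources>
        <j2se version="1.6+"/>
        <!-- the stable classes; rarely change, so rarely re-downloaded -->
        <jar href="myapp-stable.jar" main="true"/>
        <!-- the class that differs per case; only this small jar changes -->
        <jar href="myapp-custom.jar"/>
      </resources>
      <application-desc main-class="com.example.Main"/>
    </jnlp>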
No, there's not.
You should either repackage, or design the class that needs adjusting to be configurable at runtime. Making it configurable through something like a configuration database and a factory is really the only way to do it without repackaging.
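A minimal sketch of that configurable-factory idea (the file name, property key, and class names are all hypothetical): the implementation class is named in an external properties file and loaded reflectively, so swapping behaviour only requires editing that file and restarting.

    import java.io.FileInputStream;
    import java.io.IOException;
    import java.util.Properties;

    public class HandlerFactory {
        // Reads the implementation class name from an external file and instantiates it,
        // so the one "adjustable" class can be swapped without repackaging the app.
        public static Object createHandler() throws IOException, ReflectiveOperationException {
            Properties props = new Properties();
            try (FileInputStream in = new FileInputStream("app-config.properties")) {
                props.load(in);
            }
            String className = props.getProperty("handler.impl", "com.example.DefaultHandler");
            return Class.forName(className).getDeclaredConstructor().newInstance();
        }
    }

The replacement class itself still has to be on the classpath somewhere, for example in a small extra jar, which is what the next answers touch on.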
In theory you could create another jar for the customized classes and put it on the classpath before the old jar, and the JVM will load the customized classes from it. But this is simply looking for trouble...
It is better to build two jars, one with the non-changing classes and another with the customized classes, and rebuild the latter when you need to.
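Either way, the application is launched with both jars on the classpath; a sketch (the jar and class names are made up; use ; instead of : on Windows):

    java -cp customized-classes.jar:stable-classes.jar com.example.Main

If the two jars do contain overlapping classes (the shadowing trick above), the jar listed first wins, since classpath entries are searched left to right.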
I'm adding some interception routines to Dalvik libcore methods (e.g. the file open method in libcore/luni/src/main/java/org/apache/harmony/luni/platform/OSFileSystem.java), which I think only changes the basic shared libraries. But to my surprise, every time I run make after a modification, it rebuilds nearly everything in the framework, such as the Calculator application, the W3C DOM parser, etc. It really takes time to build the framework after a small modification. I'm wondering if it is possible to reduce the number of rebuilt components after modifying the Dalvik libcore? Thanks.
It actually isn't too surprising that changing core.jar causes many things to be rebuilt. core.jar contains many/all of the core Java classes, like Object, String, etc., so every other jar/apk that gets built depends on core.jar.
From a makefile perspective, it has no clue what you changed in core.jar, or whether it is safe not to rebuild all the other things that depend on core.jar. It simply sees that the last-modified time on core.jar is newer than on all of the other jars/apks that depend on it, so it rebuilds them all.
The trick, however, is to tell make specifically what you want to build, instead of telling it to build everything.
Assuming that you have already done a full build previously, you can simply do
make core snod
The core target will specifically build a new core.jar with your changes, without rebuilding anything that depends on core.jar.
And the snod target (short for systemimage-nodeps) will cause it to repackage everything from out/target/product/<product>/system into a new system.img. This is a "special" target that is declared in build/core/Makefile.
In general, the target for a particular jar/apk is simply the name of that jar/apk, without the extension. Alternatively, you can look at the Android.mk file for that module, and find the module name, which is typically something like LOCAL_PACKAGE_NAME or LOCAL_MODULE, depending on the type of module.
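So, for example, after a change to the Calculator application mentioned in the question, you could rebuild just that app and repackage the image with something like this (assuming the module name matches the apk name):

    make Calculator snod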
For core.jar (in Gingerbread at least), the module name is in libcore/JavaLibrary.mk (which is actually included by libcore/Android.mk). This file contains definitions for a number of different modules, but the first one, with LOCAL_MODULE := core, is the one responsible for building core.jar. The rest seem to be mostly test-related modules.
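For reference, a module definition in that file looks roughly like this (a heavily simplified sketch; the real core module sets quite a few more variables, and the source path here is invented):

    LOCAL_PATH := $(call my-dir)

    include $(CLEAR_VARS)
    LOCAL_SRC_FILES := $(call all-java-files-under,luni/src/main/java)
    # LOCAL_MODULE is the name you pass to make; the output is core.jar
    LOCAL_MODULE := core
    include $(BUILD_JAVA_LIBRARY)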
This has been bugging me for years now, and I thought one of you fine people would know - in Eclipse's .classpath files, what is the combineaccessrules attribute of the classpathentry element actually used for?
I can see in the Java Build Path config dialog that it can be manipulated, but I can't think of a good use case for it. If I muck about with the settings, or modify the .classpath file manually, it doesn't seem to have any effect.
I'm hoping someone else has put it to good use, and I can steal their ideas. Basically, it's an itch I'm trying to scratch.
With proper use of access rules you can prevent the use of "internal" and/or "non-api" classes and methods. When you add a class or package as Forbidden or Discouraged, the compiler shows an error or warning when you use that class or a class from the specified package. For a longer introduction to access rules you should read this short article.
To see where combined access rules are useful, imagine the following situation:
You have 2 projects, A and B.
On the classpath of project A there is a jar file that is exported. The jar contains some "stable api", "unstable api" and "non-api" public classes.
Project B depends on project A.
You do not allow using "non-api" classes in project A so you set some Forbidden access rules on those classes / packages.
In project B you do not allow using the "non-api" classes either, but you do want to get a warning when using the "unstable api". In this case, in project B you only have to set the additional Discouraged access rules, provided you check "Combine rules with the access rules of the exported project entries".
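In .classpath terms, a sketch of project B's entry for that scenario might look like this (the package name is invented for the example):

    <!-- project B's .classpath: depend on project A, inherit A's Forbidden rules, add a Discouraged rule -->
    <classpathentry kind="src" path="/ProjectA" combineaccessrules="true">
      <accessrules>
        <accessrule kind="discouraged" pattern="com/example/unstableapi/**"/>
      </accessrules>
    </classpathentry>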
Access rules are handy little things, but dangerous. They exclude a source file from the project compiler but leave the file intact in the filesystem.
The project I work on has a bootstrap class in one of our source folders, but if we include the entire folder on the project classpath it won't compile (it's a long story, and the build process handles this).
So we use an eclipse access rule to exclude it and it never bothers us during development. This means we can't easily change the code, but it's one of those classes that literally hasn't been touched in years.
Combine Access Rules, judging by the JavaDoc, is a real edge use case. To use it you would have to have:
an access rule in an exported source entry of one project
a link to that project from a parent project
a need to combine the access rules of the sub project with the parent
I really can't say how it would be useful, but I hope that at least answers your "what is it" question :)
Although I have never used it myself, a little bit of info can be found here.
whether the access rules of the project's exported entries should be combined with this entry's access rules
the access rules would be something like including "com/tests/**"
We use Eclipse for our Java development, and we've got Maven compiling our JSPs into servlets to be used in our embedded Jetty instance. This means that to run the project from Eclipse, I have to include ./target/jsp-source as a source folder, which works great. The warnings that show up for that generated code are everywhere though, and I want to filter them out.
mainMenu_jsp.java has a warning about a local variable not being used. It's generated code, so I don't care about it, but I can't figure out how to filter out any warnings by filename pattern.
I know I can define a working set, but because I'm always opening, closing, and sometimes adding and deleting projects, I don't want another point of manual bookkeeping that I have to keep up to date. If I add a new project and forget to add it to the working set, I won't get any warnings for it, which, with all the other projects, I might not notice. Working sets would only really work if there were a way to make them dynamically expand to include all projects (not just all current ones), and have their filtering automatically apply to each new project as it's added.
Just use a Working Set. Details about how to do it here:
Excluding Unfixables from Eclipse Problem View
And here:
Eclipse Problems view
[Using a Working Set] doesn't work for a lot of situations.
For example, the Android Eclipse plug-in generates code into the same project.
A working set consists of one or more projects, in their entirety, so a working set can't filter out generated warnings that are intra-project.
Edit:
I learned something super-useful: working sets can filter out parts of projects. It's simple to do: click on the project in the left pane of the working-set editor and expand it, then add only what you want.
A working set works well with multiple projects. I suggest you try using one.