How can I find the Public Suffix List version Guava is using? Is there a command to see the version when I have the package installed? Or is there some note in the release notes I am missing?
I see that they use the Mozilla Foundation's Public Suffix List, but this list seems to update much more frequently than Guava releases. Does Guava pull the latest version every time they do a release? I hope to answer that question once I know where to find the version.
We don't document which version we're using. But you can look at PublicSuffixPatterns at some release tag and follow the link to the commit that updated it most recently. That should tell you approximately how old the data is.
Traditionally, we updated about every 6 weeks, and the file history suggests that that's still approximately correct.
Ideally someday we'll provide a way for users to regenerate the patterns file themselves, but we haven't yet.
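In the meantime, if you just want a rough sanity check of how fresh the bundled data is, one option (a sketch, not an official mechanism) is to probe a domain under a suffix you know was only recently added to Mozilla's list and see whether Guava's InternetDomainName recognizes it:

import com.google.common.net.InternetDomainName;

public class SuffixCheck {
    public static void main(String[] args) {
        // "example.newsuffix" is a placeholder; substitute a domain under a
        // suffix that was recently added to the Public Suffix List.
        InternetDomainName name = InternetDomainName.from("example.newsuffix");
        System.out.println("has public suffix: " + name.hasPublicSuffix());
        if (name.hasPublicSuffix()) {
            System.out.println("public suffix: " + name.publicSuffix());
        }
    }
}

If the suffix is recognized, the bundled data is at least as new as that entry; if not, it predates it.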
You can see that Guava uses its own sort of engine to recognize these values, but that engine is well documented here.
Is there a way to put the cursor on a Java keyword and get help for that keyword? I know it works for APIs, but what about the language itself?
I found a feature request for this at https://bugs.eclipse.org/bugs/show_bug.cgi?id=197903
Not without some plugin (if any even exists). Maybe just check the official documentation (e.g. http://docs.oracle.com/javase/tutorial/java/nutsandbolts/_keywords.html as a starting point). There really aren't that many keywords, and they are fixed (at least per version, unlike classes, methods, etc.). Wikipedia has a page that also includes a brief summary at http://en.wikipedia.org/wiki/List_of_Java_keywords#List.
Is there any way to enable indexing for version2Store in Alfresco?
I'm using Alfresco 4.2.c with the solr search subsystem.
My requirement is as follows:
Users search based on content in Alfresco. If a word is not present in the latest version of a document but is present in a previous version, then the search results should show the older version. But if I run a Lucene query against version2Store, nothing is listed since that store is not indexed. How can I enable indexing for version2Store? Please help me in this regard.
Thanks in advance
As discussed via comments, we are talking about the SOLR search subsystem here.
Please take a look at the Alfresco wiki and post another, more detailed question if you're still struggling: http://wiki.alfresco.com/wiki/Alfresco_And_SOLR#Configuring_query_against_additional_stores
You should also keep in mind that just adding the store to the index does not automatically include it in your queries. Take a look at org.alfresco.service.cmr.search.SearchParameters.addStore(...), as in the sketch below.
AFAIK it's only possible to execute a search against one store at a time, as the stores are handled as separate SOLR cores internally.
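As a rough sketch of what the query side could look like once the store is indexed (assuming a SearchService reference is injected, e.g. via Spring; the query string is only an example):

import org.alfresco.service.cmr.repository.StoreRef;
import org.alfresco.service.cmr.search.ResultSet;
import org.alfresco.service.cmr.search.SearchParameters;
import org.alfresco.service.cmr.search.SearchService;

// searchService is assumed to be available from the service registry or injected.
SearchParameters sp = new SearchParameters();
sp.addStore(new StoreRef("workspace", "version2Store"));   // query the version store only
sp.setLanguage(SearchService.LANGUAGE_LUCENE);
sp.setQuery("TEXT:\"some word\"");                         // example query string
ResultSet results = searchService.query(sp);
try {
    System.out.println("hits in version2Store: " + results.length());
} finally {
    results.close();
}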
I need to query xml data using XQJ in my java application. I wanted to know the options that I have for xml/xquery processors.
I explored and got to know about:
Oracle's XQuery processor that ships with Oracle 11g
Saxon.
Any other suggestions?
You might want to try BaseX, which also offers a full implementation of XQJ:
http://basex.org/products/
http://docs.basex.org/index.php?title=Special%3ASearch&search=xqj
You will most probably get better results when using the native APIs, no matter which processor you are using (but of course there are reasons for using XQJ as well).
Hope this helps,
Hannes
You could certainly try the MarkLogic, eXist, BaseX or Sedna XQJ drivers, which are located at http://xqj.net
It would be daft to use vendor-proprietary APIs, as you will be locked into a particular database vendor with no improvement in performance.
Also you could try checking out the XQJ entry on Wikipedia for more clarity:
http://en.wikipedia.org/wiki/XQuery_API_for_Java
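For reference, the nice thing about XQJ is that the calling code looks essentially the same whichever driver you pick. A minimal sketch (here assuming Saxon's SaxonXQDataSource is on the classpath; any other vendor's XQDataSource implementation slots in the same way):

import javax.xml.xquery.XQConnection;
import javax.xml.xquery.XQDataSource;
import javax.xml.xquery.XQPreparedExpression;
import javax.xml.xquery.XQResultSequence;
import net.sf.saxon.xqj.SaxonXQDataSource;

public class XqjExample {
    public static void main(String[] args) throws Exception {
        // Swap in another vendor's XQDataSource (BaseX, eXist, Sedna, MarkLogic, ...)
        // without touching the rest of the code.
        XQDataSource ds = new SaxonXQDataSource();
        XQConnection conn = ds.getConnection();
        try {
            XQPreparedExpression expr =
                    conn.prepareExpression("for $i in 1 to 3 return <n>{$i}</n>");
            XQResultSequence result = expr.executeQuery();
            while (result.next()) {
                System.out.println(result.getItemAsString(null));
            }
        } finally {
            conn.close();
        }
    }
}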
As well as a processor, you need an API. Charles Foster's XQJ.net might well help for
eXist, BaseX, Sedna and MarkLogic.
http://xqj.net/
Zorba has an XQJ branch. Please consult the Zorba users mailing list for further information on this.
I'm using com.sun.net.httpserver.HttpServer in my project. However, it seems that the server leaks connections when it gets invalid data from the HTTP connection. The bug is this one:
http://bugs.sun.com/view_bug.do;jsessionid=dfe841c3152d878571573bafceb8?bug_id=6946825
Now, this is reported to be fixed in version "7(b94)"; however, we are still using Java 1.6 and it is unlikely that we would want to switch Java versions at this point.
So, I am looking for ways to fix this situation. I don't have a lot of time, so I'd prefer quick solutions that work for now, over reimplementing a lot of things for later.
I have a few ideas on how to go about this:
Update to a more recent Java - this is something I don't want to do.
Find a jar which only contains a more recent version of com.sun.net.httpserver and make sure that jar loads before the system jars.
Find a drop-in replacement for com.sun.net.httpserver - I'm open to pointers here.
Modify code to work with another embedded HTTP server, hopefully one that isn't too different from the current one. I can rewrite the server setup code, somewhat, but most of the interfaces should stay the same.
Decompile the com.sun.net.httpserver.ServerImpl class, fix the offending places, and recompile that single class into a jar of its own
But, I'm open to good suggestions!
Thank you in advance.
The fix is now implemented and works. I will paste the relevant bits here in case anyone else needs them:
import java.lang.reflect.Field;
import java.util.LinkedList;
import java.util.Set;

// Reflectively reach into the JDK's internal server classes to prune connections
// that are already closed but never removed from the connection set (bug 6946825).
// Exception handling is omitted here; Class.forName and getDeclaredField throw checked exceptions.
final Field httpserverimpl_server = Class.forName("sun.net.httpserver.HttpServerImpl").getDeclaredField("server");
final Field httpsserverimpl_server = Class.forName("sun.net.httpserver.HttpsServerImpl").getDeclaredField("server");
final Field serverimpl_allconnections = Class.forName("sun.net.httpserver.ServerImpl").getDeclaredField("allConnections");
final Field httpconnection_closed = Class.forName("sun.net.httpserver.HttpConnection").getDeclaredField("closed");
httpserverimpl_server.setAccessible(true);
httpsserverimpl_server.setAccessible(true);
serverimpl_allconnections.setAccessible(true);
httpconnection_closed.setAccessible(true);

// 'server' is our com.sun.net.httpserver.HttpServer instance; for an HttpsServer
// use httpsserverimpl_server instead.
Object serverimpl = httpserverimpl_server.get(server);
Set allconnections = (Set) serverimpl_allconnections.get(serverimpl);

// Collect the closed connections first, then remove them, so we don't modify
// the set while iterating over it.
LinkedList<Object> toRemove = new LinkedList<Object>();
for (Object conn : allconnections) {
    if (httpconnection_closed.getBoolean(conn)) {
        toRemove.add(conn);
    }
}
for (Object conn : toRemove) {
    allconnections.remove(conn);
}
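The snippet above only sweeps once, so it has to run repeatedly as closed connections accumulate. A minimal scheduling sketch, assuming the reflection code above is wrapped in a hypothetical cleanClosedConnections() method:

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Sweep the leaked connections every 30 seconds (interval chosen arbitrarily).
ScheduledExecutorService cleaner = Executors.newSingleThreadScheduledExecutor();
cleaner.scheduleWithFixedDelay(new Runnable() {
    public void run() {
        try {
            cleanClosedConnections();   // hypothetical wrapper around the reflection code above
        } catch (Exception e) {
            e.printStackTrace();        // the reflection calls throw checked exceptions
        }
    }
}, 30, 30, TimeUnit.SECONDS);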
Could you put a reverse proxy in front of the HTTP server, to make sure you only allow known good requests through? Varnish or Squid or Apache?
Or knock something up in Jetty so that it acts as a reverse proxy?
Another approach would be to grab the source code of the fixed version, rename the class and package so that it fits into your project, make the class public, and then use that implementation instead.
I can understand your reluctance to upgrade to a pre-release build of Java 7.
Here are my suggestions:
Get a Java support contract from Oracle and get them to provide you with a patch for Java 6 that fixes the bug.
Download the Java 6 sources for the release you are currently using, backport the bug fix from the Java 7 sources, and build. Maybe you only need to rebuild certain JAR files.
Look at the code and see if you could develop a workaround. For example, you might be able to use reflection to dig out the "list of HttpConnection instances" that the bug report talks about, and periodically remove entries that look like they are dead. (I'd treat this as a last resort.)
(Updated: 2012-05-15)
And, now that Java 7 is well and truly released (we are now at 1.7u4):
upgrade to Java 7, and
get rid of the nasty reflective hacks that you used as a TEMPORARY workaround.
Do you have access to 7(b94)? Then you can compare the sources and see whether you can fix it by overriding or providing different accessors.
From what I believe and have read online, Sun decided to include Xalan in JDK 1.5. I am trying to take advantage of this and perform an XSLT transform that writes out multiple files. The problem I encounter:
"Unrecognized XSLTC extension 'org.apache.xalan.xslt.extensions.Redirect:write'"
From what I have read on Google, I needed to change:
xmlns:redirect="org.apache.xalan.xslt.extensions.Redirect"
to
xmlns:redirect="http://xml.apache.org/xalan/redirect"
in XSL transforms
When I apply this change to my .xsl file, I still get the same error. I need to get this working ASAP and can't seem to find an answer online. Any help will be greatly appreciated.
I just ignored the JDK's bundled Xalan and added the Xalan jars directly. Better that way: I can use that version rather than depend on a particular JDK.
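For anyone hitting the same wall: once xalan.jar is on the classpath, you can also ask explicitly for the interpretive Xalan TransformerFactory (which understands the http://xml.apache.org/xalan/redirect namespace) instead of whatever XSLTC-based factory the JDK picks by default. A rough sketch; the file names here are just placeholders:

import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class RedirectTransform {
    public static void main(String[] args) throws Exception {
        // Force JAXP to use the interpretive Xalan processor from xalan.jar
        // rather than the JDK's built-in XSLTC factory.
        System.setProperty("javax.xml.transform.TransformerFactory",
                "org.apache.xalan.processor.TransformerFactoryImpl");
        TransformerFactory factory = TransformerFactory.newInstance();
        // split.xsl and input.xml are placeholder names for your own files.
        Transformer transformer = factory.newTransformer(new StreamSource("split.xsl"));
        transformer.transform(new StreamSource("input.xml"), new StreamResult("main-output.xml"));
    }
}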