I'm using SolrJ, but from the API documentation I could not figure out which class to use to read the spell checker response.
I have a search component defined in solrconfig.xml that performs the spell checking.
Maybe you already found the solution, but the SpellingResult class is part of Solr itself, while you're using SolrJ to access a Solr server, if I'm not wrong. So you should use the classes that come with SolrJ: the QueryResponse object contains a SpellCheckResponse object with all the information you're looking for.
SolrServer solr = new CommonsHttpSolrServer("http://localhost:8080/solr");
ModifiableSolrParams params = new ModifiableSolrParams();
params.set("qt", "/spell");          // request handler with the spellcheck component
params.set("q", "whatever");
params.set("spellcheck", "on");
//params.set("spellcheck.build", "true"); // only needed to (re)build the dictionary
QueryResponse response = solr.query(params);
SpellCheckResponse spellCheckResponse = response.getSpellCheckResponse();
if (!spellCheckResponse.isCorrectlySpelled()) {
    for (Suggestion suggestion : spellCheckResponse.getSuggestions()) {
        logger.debug("original token: " + suggestion.getToken() + " - alternatives: " + suggestion.getAlternatives());
    }
}
Hope this helps.
I have to call a DELETE method on one of a client's APIs.
This shouldn't be a problem, but I am struggling with the framework my company currently uses, and I hope I can get some info that will help me solve the problem.
First things first.
1: The API doesn't work if I make the call sending the params in the URL.
2: It works completely fine if I send the params inside the body as x-www-form-urlencoded, but not as form-data or raw.
The documentation of the method inside the API tells us the following (important: look at IdEtapa):
I have to make this call in Java (Java 8).
Currently my company uses HTTP_CLIENT as the main framework for API calls.
My code:
The build of the data (I currently build it both as entity and as params so you can see both; I've tried each of them independently):
Map datosApi = new HashMap<>();
datosApi.put(Constants.URL, anular_cita);

// Headers
Map headers = new HashMap<>();
headers.put(Constants.AUTHORIZATION, params.get("token_autorizacion"));
headers.put("Content-Type", "application/json");
headers.put("entity_charset", "UTF-8");
datosApi.put(Constants.HEADERS, headers);

// Same data built both as a JSON entity (body) and as params
JSONObject entity = new JSONObject();
Map param = new HashMap();
param.put(Constants.ID_CENTRO, consul);
param.put("IdAsistencia", propiedades[0]);
param.put("IdCapitulo", propiedades[1]);
param.put("IdEtapa", Integer.valueOf(propiedades[2]));
entity.put(Constants.ID_CENTRO, consul);
entity.put("IdAsistencia", propiedades[0]);
entity.put("IdCapitulo", propiedades[1]);
entity.put("IdEtapa", Integer.valueOf(propiedades[2]));

// Body (ignored by the framework for DELETE, as shown below)
datosApi.put("entity", entity.toString());
datosApi.put("entity_mime_type", "application/json");
datosApi.put("entity_charset", "UTF-8");

// Query params
datosApi.put("params", param);

String anularCita = APIDao.callAPIDelete(datosApi);
The preparation for my call to the framework:
public static String callAPIDelete(Map in) {
    String contentString = "";
    Map res = new HashMap<>();
    try {
        res = XWM.execute("delete#http_client", in);
        byte[] content = (byte[]) res.get("content");
        contentString = new String(content, StandardCharsets.UTF_8);
And inside our framework we have this:
if (StringUtils.equals(StringUtils.trim(method), "delete"))
{
    StringBuilder requestUrl = new StringBuilder(url);
    if (formparams != null)
    {
        if (requestUrl.indexOf("?") == -1) requestUrl.append("?");
        else requestUrl.append("&");
        requestUrl.append(URLEncodedUtils.format(formparams, charset));
    }
    if (entityRequest != null)
    {
        // "Param 'entity' cannot be used in get (it is ignored)"
        log.error("Param 'entity' no puede usarse en get (se ignora)");
    }
    HttpDelete delete = new HttpDelete(requestUrl.toString());
    delete.setConfig(requestConfig);
    uriRequest = delete;
}
}
}
}
// Headers
if (headers != null)
{
    for (String h : headers.keySet()) uriRequest.addHeader(h, headers.get(h));
}
// Execute the request
log.info("Executing request " + uriRequest.getRequestLine());
CloseableHttpResponse response = null;
if (!preemtiveAuth || credsProvider == null)
{
    response = httpclient.execute(uriRequest);
}
As you can see, in the delete branch we ignore the entity that I built in the first piece of code.
The HttpDelete class is Apache's; the Javadoc for the class is here:
https://www.javadoc.io/doc/org.apache.httpcomponents/httpclient/4.5.2/org/apache/http/client/methods/HttpDelete.html
The question can be divided in two parts:
1: Can we send an entity in a DELETE call? I have found some info about this in the following places:
Is an entity body allowed for an HTTP DELETE request?
https://web.archive.org/web/20090213142728/http://msdn.microsoft.com:80/en-us/library/cc716657.aspx
https://peterdaugaardrasmussen.com/2020/11/14/rest-should-you-use-a-body-for-your-http-delete-requests/
I assume that in order to do it I would need a new HttpDelete that would allow us to use an entity; if possible, could you give me some examples of this?
2: From what I understand of the links posted above, while using an entity in DELETE calls is not forbidden, it is not encouraged. Should I just talk with the people who made the API and ask them to change their configuration to allow us to send the info via params? (It is not personal or sensitive info, just a bunch of IDs.)
Thank you so much for your attention, and my apologies for any typo or formatting mistake; I'm still learning how to make good posts.
EDIT: I have found this answer, setEntity in HttpDelete, which more or less solves the first issue of whether it's possible to send an entity in a DELETE call, but I still don't know whether it would be better to ask them to change their method.
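For reference, the approach in that answer boils down to subclassing Apache HttpClient's HttpEntityEnclosingRequestBase so that the request can carry a body. A minimal sketch (the class name HttpDeleteWithBody is just illustrative):

import java.net.URI;
import org.apache.http.client.methods.HttpEntityEnclosingRequestBase;

// A DELETE request that, unlike the stock HttpDelete, accepts an entity body
// (same pattern HttpPost and HttpPut use).
public class HttpDeleteWithBody extends HttpEntityEnclosingRequestBase {
    public static final String METHOD_NAME = "DELETE";

    public HttpDeleteWithBody(final String uri) {
        super();
        setURI(URI.create(uri));
    }

    @Override
    public String getMethod() {
        return METHOD_NAME;
    }
}

It can then be used like any other request, e.g. delete.setEntity(new StringEntity(entity.toString(), ContentType.APPLICATION_JSON)) before passing it to httpclient.execute(...).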
As told in the comments by Ewramner and VoiceOfUnreason and in the edits:
1: The answer about how to do this was found in an older Stack Overflow post: setEntity in HttpDelete
2: The answer to "Should I just talk with the people who made the API and ask them to change their configuration to allow us to send the info by params?" was given by both of them: while it's not forbidden, it's not recommended.
My course of action will be:
1: Talk with the people responsible for the API to give them info about this situation.
2: Ask our architecture team to create a new method that will allow us to make HttpDelete calls with an entity body, in case we have to make more API calls like this one.
With this I think all my questions are answered.
Again, thank you.
What is the correct way of getting results from solrj using Solr Suggester?
This is my request:
SolrQuery query = new SolrQuery();
query.setRequestHandler("/suggest");
query.setParam("suggest", "true");
query.setParam("suggest.build", "true");
query.setParam("suggest.dictionary", "mySuggester");
query.setParam("suggest.q", "So");
QueryResponse response = server.query(query);
However, I found it extremely difficult to get the response. The way I got it was with this:
NamedList obj = (NamedList)((Map)response.getResponse().get("suggest")).get("mySuggester");
SimpleOrderedMap obj2 = (SimpleOrderedMap) obj.get("So");
List<SimpleOrderedMap> obj3 = (List<SimpleOrderedMap>) obj2.get("suggestions");
This seems to assume a lot about the objects I am getting back from the response and makes it difficult to anticipate errors.
Is there a better and cleaner way than this?
Newer versions have a SuggesterResponse:
https://lucene.apache.org/solr/5_3_1/solr-solrj/org/apache/solr/client/solrj/response/SuggesterResponse.html
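A minimal sketch of what that looks like (assuming SolrJ 5.x or later, where QueryResponse exposes getSuggesterResponse()):

QueryResponse response = server.query(query);
SuggesterResponse suggesterResponse = response.getSuggesterResponse();
// Keyed by dictionary name, e.g. "mySuggester"
Map<String, List<Suggestion>> suggestions = suggesterResponse.getSuggestions();
List<Suggestion> mySuggestions = suggestions.get("mySuggester");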
The best option is to get it as a List; the code below worked for me:
HttpSolrClient solrClient = new HttpSolrClient(solrURL);
SolrQuery query = new SolrQuery();
query.setRequestHandler("/suggest");
query.setParam("suggest.q", "Ins");
query.setParam("wt", "json");
try {
    QueryResponse response = solrClient.query(query);
    System.out.println(response.getSuggesterResponse().getSuggestedTerms());
    List<String> types = response.getSuggesterResponse().getSuggestedTerms().get("infixSuggester");
    System.out.println(types);
} catch (SolrServerException | IOException e) {
    e.printStackTrace();
}
You can get the suggestions via the SpellCheckResponse by doing the following
SpellCheckResponse spellCheckResponse=response.getSpellCheckResponse();
Check this link for more details
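For example, iterating over the returned suggestions would look roughly like this (this assumes the suggest handler publishes its results through the spellcheck section of the response, as the older spellcheck-based suggester does):

SpellCheckResponse spellCheckResponse = response.getSpellCheckResponse();
if (spellCheckResponse != null) {
    for (SpellCheckResponse.Suggestion suggestion : spellCheckResponse.getSuggestions()) {
        System.out.println(suggestion.getToken() + " -> " + suggestion.getAlternatives());
    }
}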
I am trying to perform a Retrieve and Rank query using the Java wrapper. I am following the online Javadocs for the Retrieve and Rank API.
The example there for SearchAndRank is:
https://www.ibm.com/watson/developercloud/retrieve-and-rank/api/v1/#query_ranker
RetrieveAndRank service = new RetrieveAndRank();
service.setUsernameAndPassword("{username}","{password}");
HttpSolrClient solrClient = getSolrClient(service.getSolrUrl("scfaaf8903_02c1_4297_84c6_76b79537d849"), "{username}", "{password}");
SolrQuery query = new SolrQuery("what is the basic mechanism of the transonic aileron buzz");
QueryResponse response = solrClient.query("example_collection", query);
Ranking ranking = service.rank("B2E325-rank-67", response);
System.out.println(ranking);
but the RetrieveAndRank class has no such rank(String rankerId, QueryResponse response) method, just one taking a File or an InputStream as arguments (browsing IBM's source code I see it expects CSV content there, not a Java QueryResponse).
How should I pass QueryResponse to the rank method?
I am using the solr-solrj-5.5.2.jar and java-sdk-3.2.0-jar-with-dependencies.jar libraries.
You need to use the /fcselect query handler and send the ranker_id as a parameter.
The code below assumes you have a Solr collection with documents and you have trained a ranker, otherwise follow this tutorial.
RetrieveAndRank service = new RetrieveAndRank();
service.setUsernameAndPassword(USERNAME, PASSWORD);
// create the solr client
String solrUrl = service.getSolrUrl(SOLR_CLUSTER_ID);
HttpClient client = createHttpClient(solrUrl, USERNAME, PASSWORD);
HttpSolrClient solrClient = new HttpSolrClient(solrUrl, client);
// build the query
SolrQuery query = new SolrQuery("*:*");
query.setRequestHandler("/fcselect");
query.set("ranker_id", RANKER_ID);
// execute the query
QueryResponse response = solrClient.query(SOLR_COLLECTION_NAME, query);
System.out.println("Found " + response.getResults().size() + " documents!");
System.out.println(response);
Make sure you update the service credentials for RetrieveAndRank (USERNAME and PASSWORD), as well as SOLR_CLUSTER_ID, SOLR_COLLECTION_NAME and RANKER_ID.
The code for createHttpClient() can be found here.
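If you don't want to pull in that sample code, a rough equivalent using plain Apache HttpClient 4.x basic authentication looks like this (a sketch, not the Watson sample verbatim):

import org.apache.http.auth.AuthScope;
import org.apache.http.auth.UsernamePasswordCredentials;
import org.apache.http.client.CredentialsProvider;
import org.apache.http.client.HttpClient;
import org.apache.http.impl.client.BasicCredentialsProvider;
import org.apache.http.impl.client.HttpClientBuilder;

// Sketch of a createHttpClient() helper: an HttpClient that sends
// basic-auth credentials with every request to the Solr cluster.
private static HttpClient createHttpClient(String solrUrl, String username, String password) {
    CredentialsProvider credentialsProvider = new BasicCredentialsProvider();
    credentialsProvider.setCredentials(AuthScope.ANY,
            new UsernamePasswordCredentials(username, password));
    return HttpClientBuilder.create()
            .setDefaultCredentialsProvider(credentialsProvider)
            .build();
}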
Using the RallyRestAPI, is there a way to query ScopedAttributeDefinition types? This appears to be how Rally defines custom data types. I am trying to build a data dictionary of the custom types we use in Rally and associate these custom types with the objects they are attributes of. (In case that doesn't make sense, here's an example: we have a custom field called Enabler Lead on Rally PortfolioItems. I'd like to query Rally for all custom fields on PortfolioItem and get the Enabler Lead field, and its metadata, from the Rally REST API.)
I'm using the Java client.
I filed a GitHub issue to add support for ScopedAttributeDefinition: https://github.com/RallyTools/RallyRestToolkitForJava/issues/19
In the meantime, though, you can work around it by using the underlying HTTP client directly. The code below queries for all projects in your workspace, finds the type definition for Portfolio Item, and then, for each project, grabs all the custom attribute definitions via the scopedattributedefinition endpoint and prints out their hidden/required status.
QueryRequest projectQuery = new QueryRequest("project");
projectQuery.setLimit(Integer.MAX_VALUE);
QueryResponse projectResponse = restApi.query(projectQuery);
QueryRequest typeDefQuery = new QueryRequest("typedefinition");
typeDefQuery.setQueryFilter(new QueryFilter("Name", "=", "Portfolio Item"));
QueryResponse typeDefResponse = restApi.query(typeDefQuery);
JsonObject piTypeDef = typeDefResponse.getResults().get(0).getAsJsonObject();
for (JsonElement projectResult : projectResponse.getResults()) {
    JsonObject project = projectResult.getAsJsonObject();
    System.out.println("Project: " + project.get("Name").getAsString());

    // Begin hackery (note we're not handling multiple pages -
    // if you have more than 200 custom attributes you'll have to page)
    String scopedAttributeDefUrl = "/project/" + project.get("ObjectID").getAsLong() +
            "/typedefinition/" + piTypeDef.get("ObjectID").getAsLong() + "/scopedattributedefinition" +
            "?fetch=Hidden,Required,Name&query=" + URLEncoder.encode("(Custom = true)", "utf-8");
    String attributes = restApi.getClient().doGet(scopedAttributeDefUrl);
    QueryResponse attributeResponse = new QueryResponse(attributes);
    // End hackery

    for (JsonElement customAttributeResult : attributeResponse.getResults()) {
        JsonObject customAttribute = customAttributeResult.getAsJsonObject();
        System.out.print("\tAttribute: " + customAttribute.get("Name").getAsString());
        System.out.print(", Hidden: " + customAttribute.get("Hidden").getAsBoolean());
        System.out.println(", Required: " + customAttribute.get("Required").getAsBoolean());
    }
    System.out.println();
}
Any other info you'd like for each of those custom fields should be accessible just by querying the Attributes collection from the piTypeDef.
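For example, something along these lines should work (a sketch; the fetched field names are just examples):

// Query the Attributes collection of the Portfolio Item type definition
// to get additional metadata for each custom field.
QueryRequest attributeRequest = new QueryRequest(piTypeDef.getAsJsonObject("Attributes"));
attributeRequest.setQueryFilter(new QueryFilter("Custom", "=", "true"));
attributeRequest.setFetch(new Fetch("Name", "ElementName", "AttributeType"));
QueryResponse attributeDefs = restApi.query(attributeRequest);
for (JsonElement attributeResult : attributeDefs.getResults()) {
    JsonObject attribute = attributeResult.getAsJsonObject();
    System.out.println(attribute.get("Name").getAsString() + " (" + attribute.get("AttributeType").getAsString() + ")");
}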
I have the following code, which is simply searching the Solr Server.
SolrServer server = new CommonsHttpSolrServer(url);
SolrQuery searchquery = new SolrQuery("company profile");
QueryResponse response = server.query(searchquery);
I want to have the response in json, other than the default which is xml. So I went into the solrconfig.xml file and enabled the following line:
<queryResponseWriter name="json" class="org.apache.solr.request.JSONResponseWriter" />
However, from the console I can see that wt=javabin is still being applied to the search query request.
Also, I've modified the above code like this:
SolrServer server = new CommonsHttpSolrServer(url);
SolrQuery searchquery = new SolrQuery("company profile");
searchquery.setParam("wt", "json");
QueryResponse response = server.query(searchquery);
But I'm still getting wt=javabin, with wt=json also appended, so the query now looks like this:
webapp/solr path=/select params={wt=javabin&wt=json}
Is there anything I'm doing wrong?
Thanks
SolrJ only supports the javabin and xml formats (configurable with CommonsHttpSolrServer.setParser).
But why would you want to use JSON? Javabin is by far the format with the best decoding speed; its drawback is that it is not human-readable, unlike JSON and XML (which is not a problem in this case, since SolrJ is parsing the result for you).
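If you do need XML on the wire instead of javabin, it is configured through the parser rather than the wt parameter; a small sketch for the SolrJ version used here:

CommonsHttpSolrServer server = new CommonsHttpSolrServer(url);
// Ask SolrJ to request and parse XML instead of the default javabin format
server.setParser(new XMLResponseParser());
QueryResponse response = server.query(new SolrQuery("company profile"));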
With SolrJ and json-lib, I was able to get a JSON response like this:
SolrServer server = new CommonsHttpSolrServer(url);
SolrQuery searchquery = new SolrQuery("company profile");
QueryResponse response = server.query(searchquery);
JSONArray jsonObject = JSONArray.fromObject( response.getResults() );
log.log(Level.INFO, "received jsonObject is {0}", jsonObject.toString());
Iterator data = jsonObject.iterator();
SolrResultBean bean = new SolrResultBean();
List output = new ArrayList();
while (data.hasNext()) {
    output.add(data.next());
}
log.log(Level.INFO, "json items in the List are {0}", output);
bean.setObject(output);
// wicket page redirect
setResponsePage(SearchPage.class, new PageParameters());