I ran into a problem, and the service's support team asked me to provide them with the equivalent HTML.
So I need to get the HTML equivalent of the request that is generated by:
HttpClient client = new HttpClient();
PostMethod post = new PostMethod( "https://www.url.to/post-to" );
NameValuePair[] params = {
    new NameValuePair( "NAME", name ),
    new NameValuePair( "EMAIL", email )
};
for( NameValuePair param : params ){
    post.setParameter(param.getName(), param.getValue());
}
Is it possible to get it as HTML, or to get the request that is sent as a string, with all the headers etc.?
If you use a logging framework, Apache HttpClient can be configured to log the requests and responses that go through it; see Wire logging. Judging by the documentation, if the logging framework is properly set up, the following should be enough to enable it:
System.setProperty("org.apache.commons.logging.Log", "org.apache.commons.logging.impl.SimpleLog");
System.setProperty("org.apache.commons.logging.simplelog.showdatetime", "true");
System.setProperty("org.apache.commons.logging.simplelog.log.httpclient.wire", "debug");
System.setProperty("org.apache.commons.logging.simplelog.log.org.apache.commons.httpclient", "debug");
EDIT
These settings are fine if you use Apache Commons Logging as the logging framework. The above-mentioned link also contains examples for setting this up with log4j and java.util.logging.
You could write an equivalent HTML form that sends the same two parameters to the same URL.
However, if you're opening a support ticket for a problem caused by this connection, an HTML form will probably not be very helpful, because a browser may send different headers to the URL even if the request body is the same.
You can get the request headers from the post variable using post.getRequestHeaders(). It returns an array of type Header[], which you'll have to translate into strings. The message body is probably easy to reconstruct from the parameters by URL-encoding the values; it is always in the form NAME=urlencodedname&EMAIL=urlencodedemail.
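For illustration, here is a rough sketch of that reconstruction, assuming the post, name and email variables from the question and the Commons HttpClient 3.x API (headers the client adds on its own at execution time, such as Host or Content-Length, may only appear after the method has actually been executed):
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;
import org.apache.commons.httpclient.Header;
import org.apache.commons.httpclient.methods.PostMethod;

// Sketch: rebuild an approximation of the raw request as text (not the exact bytes on the wire).
static String describeRequest(PostMethod post, String name, String email)
        throws UnsupportedEncodingException {
    StringBuilder raw = new StringBuilder();
    raw.append("POST ").append(post.getPath()).append(" HTTP/1.1\r\n"); // request line
    for (Header header : post.getRequestHeaders()) {
        raw.append(header.toExternalForm());                            // already formatted as "Name: value" plus CRLF
    }
    raw.append("\r\n");                                                  // blank line separating headers from body
    raw.append("NAME=").append(URLEncoder.encode(name, "UTF-8"))
       .append("&EMAIL=").append(URLEncoder.encode(email, "UTF-8"));
    return raw.toString();
}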
Related
When using the java.net.http.HttpClient classes in Java 11 and later, how does one tell the client to follow an HTTP 303 redirect to get to the redirected page?
Here is an example. Wikipedia provides a REST URL for getting the summary of a random page of their content. That URL redirects to the URL of the randomly-chosen page. When running this code, I see the 303 when calling HttpResponse#toString. But I do not know how to tell the client class to follow along to the new URL.
HttpClient client = HttpClient.newHttpClient();
HttpRequest request =
        HttpRequest
                .newBuilder()
                .uri( URI.create( "https://en.wikipedia.org/api/rest_v1/page/random/summary" ) )
                .build();
try
{
    HttpResponse < String > response = client.send( request , HttpResponse.BodyHandlers.ofString() );
    System.out.println( "response = " + response ); // ⬅️ We can see the `303` status code.
    String body = response.body();
    System.out.println( "body = " + body );
}
catch ( IOException e )
{
    e.printStackTrace();
}
catch ( InterruptedException e )
{
    e.printStackTrace();
}
When run:
response = (GET https://en.wikipedia.org/api/rest_v1/page/random/summary) 303
body =
Problem
You're using HttpClient#newHttpClient(). The documentation of that method states:
Returns a new HttpClient with default settings.
Equivalent to newBuilder().build().
The default settings include: the "GET" request method, a preference of HTTP/2, a redirection policy of NEVER [emphasis added], the default proxy selector, and the default SSL context.
As emphasized, you are creating an HttpClient with a redirection policy of NEVER.
Solution
There are at least two solutions to your problem.
Automatically Follow Redirects
If you want to automatically follow redirects then you need to use HttpClient#newBuilder() (instead of #newHttpClient()) which allows you to configure the to-be-built client. Specifically, you need to call HttpClient.Builder#followRedirects(HttpClient.Redirect) with an appropriate redirect policy before building the client. For example:
HttpClient client =
        HttpClient.newBuilder()
                .followRedirects(HttpClient.Redirect.NORMAL) // follow redirects
                .build();
The different redirect policies are specified by the HttpClient.Redirect enum:
Defines the automatic redirection policy.
The automatic redirection policy is checked whenever a 3XX response code is received. If redirection does not happen automatically, then the response, containing the 3XX response code, is returned, where it can be handled manually.
There are three constants: ALWAYS, NEVER, and NORMAL. The meaning of the first two is obvious from their names. The last one, NORMAL, behaves just like ALWAYS except it won't redirect from https URLs to http URLs.
Manually Follow Redirects
As noted in the documentation of HttpClient.Redirect, you could instead manually follow a redirect. I'm not well versed in HTTP and how to properly handle all responses, so treat the sketch after this list as a rough outline only. I believe, at a minimum, this requires you to:
Check the status code of the response.
If the code indicates a redirect, grab the new URI from the response headers.
If the new URI is relative then resolve it against the request URI.
Send a new request.
Repeat 1-4 as needed.
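A minimal, untested sketch of those steps using the java.net.http classes (it only handles GET, follows the Location header, and caps the number of hops; a real implementation would also consider the specific 3xx code and the request method/body):
import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Optional;

// Rough sketch of manually following redirects for a GET request.
static HttpResponse<String> getFollowingRedirects(HttpClient client, URI uri)
        throws IOException, InterruptedException {
    for (int hops = 0; hops < 5; hops++) {                        // cap how many redirects we follow
        HttpRequest request = HttpRequest.newBuilder().uri(uri).build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        int status = response.statusCode();
        if (status < 300 || status >= 400) {
            return response;                                      // not a redirect: done
        }
        Optional<String> location = response.headers().firstValue("Location");
        if (location.isEmpty()) {
            return response;                                      // 3xx without a Location header
        }
        uri = uri.resolve(location.get());                        // resolves relative Locations against the request URI
    }
    throw new IOException("Too many redirects");
}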
Obviously configuring the HttpClient to automatically follow redirects is much easier (and less error-prone), but this approach would give you more control.
Please find below the code where I call another API from my REST API in Java.
Note that I am using Java 17. This resolves the 303 error code.
@GetMapping(value = "url/api/url")
private String methodName() throws IOException, InterruptedException {
    var url = "api/url/"; // remote api url which you want to call
    System.out.println(url);
    var request = HttpRequest.newBuilder().GET().uri(URI.create(url)).setHeader("access-token-key", "accessTokenValue").build();
    System.out.println(request);
    var client = HttpClient.newBuilder().followRedirects(HttpClient.Redirect.NORMAL).build();
    System.out.println(client);
    var response = client.send(request, HttpResponse.BodyHandlers.ofString());
    System.out.println(response);
    System.out.println(response.body());
    return response.body();
}
I'm trying to consume a REST API that requires a body with a GET request. But since a GET usually doesn't have a body, I can't find a way to attach a body to my request. I am also building this REST API, but the professor won't allow us to change the method to POST (he gave us a list of the endpoints we are to create, no more, no less).
I'm trying to do it like this:
Response r = target.request().method(method, Entity.text(body));
Where I set the method to GET and the body to my GET request body. However, using this approach I get an exception:
javax.ws.rs.ProcessingException: RESTEASY004565: A GET request cannot have a body.
Is there any way to do this with JAX-RS? We learned to use JAX-RS so I would prefer a solution using this, as I'm not sure my professor would allow us to use any other REST client. I'm currently using RESTEasy, provided by the WildFly server.
(This is not a duplicate of HTTP GET with request body because I'm asking on how to create a GET request with body in JAX-RS, not if it should be done.)
This depends on which JAX-RS implementation you use. In Jersey 2.25 this check can be disabled using the SUPPRESS_HTTP_COMPLIANCE_VALIDATION property:
ClientConfig config = new ClientConfig();
config.property(ClientProperties.SUPPRESS_HTTP_COMPLIANCE_VALIDATION, true);
JerseyClient client = JerseyClientBuilder.createClient(config);
WebTarget target = client.target(URI.create("https://www.stackoverflow.com"));
Response response = target.request().method("GET", Entity.text("BODY HERE"));
Instead of an exception, you will get an INFO log:
INFO: Detected non-empty entity on a HTTP GET request. The underlying HTTP transport connector may decide to change the request method to POST.
However in RESTEasy 3.5.0.Final there is a hardcoded check in both URLConnectionEngine and ApacheHttpClient4Engine:
if (request.getEntity() != null)
{
    if (request.getMethod().equals("GET")) throw new ProcessingException(Messages.MESSAGES.getRequestCannotHaveBody());
}
You would have to create your own implementation of the ClientHttpEngine to skip this. Then you need to supply it when building the client:
ClientHttpEngine engine = new MyEngine();
ResteasyClient client = new ResteasyClientBuilder().httpEngine(engine).build();
I am loading a model in Apache Jena using the function FileManager.get().loadModel(url). I also know that there may be some URLs in the HTTP response Link header. I want to load models from the links (URLs) in the Link header as well. How do I do that? Is there any built-in functionality to get access to the response headers and process the Link header?
FileManager.get().loadModel(url) packages up reading a URL and parsing the result into a model. It covers a common thing to do; it is not claiming to be comprehensive, and it is quite an old interface.
If you want detailed control over the HTTP handling, see whether the HttpOp (lower-level) mechanism helps; otherwise do the handling in the application and hand the input stream of the response directly to the parser.
You may also find it useful to look at the code in RDFDataMgr.process for help with content negotiation.
I don't think this is supported by Jena, and I don't see any reason why it should be. The HTTP request is done to get the data and maybe also the response type. If you want to get the URLs from some header fields, why not simply use plain old Java:
URL url = new URL("http://your_ontology.owl");
URLConnection conn = url.openConnection();
Map<String, List<String>> map = conn.getHeaderFields();
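If the goal is specifically the Link header, a rough sketch along those lines could look like this (the Link parsing is deliberately naive, the method name is just a placeholder, and the imports assume Jena 3.x with the same FileManager API used in the question):
import java.io.IOException;
import java.net.URL;
import java.net.URLConnection;
import java.util.List;
import java.util.Map;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.util.FileManager;

// Sketch: read the Link header(s) of the response and load each referenced URL as a model.
static void loadLinkedModels(String ontologyUrl) throws IOException {
    URLConnection conn = new URL(ontologyUrl).openConnection();
    Map<String, List<String>> headers = conn.getHeaderFields();
    List<String> linkHeaders = headers.get("Link");
    if (linkHeaders == null) {
        return;                                   // no Link header in the response
    }
    for (String link : linkHeaders) {
        // A Link header value looks like: <http://example.org/other.owl>; rel="describedby"
        // This naive parse only extracts the first <...> target per header value.
        int start = link.indexOf('<');
        int end = link.indexOf('>');
        if (start >= 0 && end > start) {
            String linkedUrl = link.substring(start + 1, end);
            Model linkedModel = FileManager.get().loadModel(linkedUrl);
            // ... merge or otherwise process linkedModel here
        }
    }
}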
I need to be able to use a password protected proxy and be able to read json information returned from a url.
I do not want to declare proxies at the system level; I would like to have multiple proxies being used in the same application.
What is the best way to do this?
I once faced the same problem. Unfortunately, JSoup is not a good choice for this. I ended up using the Apache HTTP client, which works nicely with proxies.
Here is the proxy-relevant part of my http-client configuration:
String ipStr = "the.proxy.ip.string";
int port = 8080;
String proxyLogin = "your login name";
String proxyPassword = "your password";

DefaultHttpClient httpClient = new DefaultHttpClient(); // assumed: the client instance being configured
httpClient.getCredentialsProvider().setCredentials(
        new AuthScope(ipStr, port),
        new UsernamePasswordCredentials(proxyLogin, proxyPassword));
HttpHost httpHost = new HttpHost(ipStr, port, "http");
httpClient.getParams().setParameter(ConnRoutePNames.DEFAULT_PROXY, httpHost);
You can use the HTTP client to get the website or the JSON response from the net. If the content is HTML, you can use JSoup as a parser on the returned input. If you get JSON back, you probably want to use a JSON parser like json-simple (but there are many other very useful JSON libraries out there!).
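For the HTML case, a small sketch of that hand-off, assuming the proxied httpClient configured above, the HttpClient 4.x classes and a placeholder URL:
import org.apache.http.HttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.util.EntityUtils;
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;

// Fetch through the proxied client, then let JSoup parse the returned HTML.
HttpGet get = new HttpGet("http://example.com/some-page");
HttpResponse response = httpClient.execute(get);
String html = EntityUtils.toString(response.getEntity(), "UTF-8");
Document doc = Jsoup.parse(html, "http://example.com/some-page"); // base URI for resolving relative links
System.out.println(doc.title());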
We have a JSF web application that uses Acegi security. We also have a standalone Java Swing application. One function of the Swing app is to load the user's home page in a browser window.
To do this we're currently using Commons HttpClient to authenticate the user with the web app:
String url = "http://someUrl/j_acegi_security_check";
HttpClient client = new HttpClient();
System.setProperty(trustStoreType, "Windows-ROOT"); // trustStoreType is defined elsewhere (presumably "javax.net.ssl.trustStoreType")
PostMethod method = new PostMethod(url);
method.addParameter("j_username", "USERNAME");
method.addParameter("j_password", "PASSWORD");
int statusCode = client.executeMethod(method);
if (statusCode == HttpStatus.SC_MOVED_TEMPORARILY) {
    Header locationHeader = method.getResponseHeader("Location");
    String redirectUrl = locationHeader.getValue();
    BrowserLauncher launcher = new BrowserLauncher();
    launcher.openURLinBrowser(redirectUrl);
}
This returns an HTTP 302 redirect response, from which we take the redirect URL and open it using BrowserLauncher 2. The URL contains the new session ID, something like:
http://someUrl/HomePage.jsf;jsessionid=C4FB2F643CE48AC2DE4A8A4C354033D4
The problem we're seeing is that Acegi processes the redirect but throws an AuthenticationCredentialsNotFoundException. It seems that for some reason the authenticated credentials cannot be found in the security context.
Does anyone have an idea as to why this is happening? If anyone needs more info then I'll be happy to oblige.
Many thanks,
Richard
I have never done Acegi/Spring Security, but the symptoms are clear enough: some important information is missing from the request. At the least, you need to inspect all the response headers to see whether there is something new that needs to be passed back in the headers of the subsequent request, maybe another cookie entry which represents the Acegi credentials.
But another caveat is that you in fact cannot just open the URL in a local browser instance, because there's no way to pass the necessary request headers along with it. You'll need to have your Swing application act as a built-in web browser, e.g. get the HTML response in an InputStream and render/display it somehow in a Swing frame. I would check whether there isn't already an existing API for that, because it would involve much more work than you'd initially think... (understatement).
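As a very rough illustration of the "render it in Swing" idea (this is only a sketch and nowhere near a real embedded browser; JEditorPane's HTML support is quite limited):
import javax.swing.JEditorPane;
import javax.swing.JFrame;
import javax.swing.JScrollPane;
import javax.swing.SwingUtilities;

// Display an HTML string (e.g. the body of the authenticated response) in a Swing window.
static void showHtml(String html) {
    SwingUtilities.invokeLater(() -> {
        JEditorPane pane = new JEditorPane();
        pane.setContentType("text/html");
        pane.setEditable(false);
        pane.setText(html);

        JFrame frame = new JFrame("Home page");
        frame.setDefaultCloseOperation(JFrame.DISPOSE_ON_CLOSE);
        frame.add(new JScrollPane(pane));
        frame.setSize(800, 600);
        frame.setVisible(true);
    });
}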
In this case you can do Basic authentication and set this header in every request instead of sending the jsessionid:
Authorization: Basic VVNFUk5BTUU6UEFTU1dPUkQ=
The token VVNFUk5BTUU6UEFTU1dPUkQ= is the string USERNAME:PASSWORD encoded in Base64.
Example:
scott:tiger
is:
c2NvdHQ6dGlnZXI=
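A small sketch of producing and setting that header in Java, assuming the Commons HttpClient method (PostMethod) from the question:
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Build the Basic auth token and attach it to the request instead of relying on the jsessionid.
String credentials = "scott:tiger";                                     // username:password
String token = Base64.getEncoder()
        .encodeToString(credentials.getBytes(StandardCharsets.UTF_8)); // "c2NvdHQ6dGlnZXI="
method.setRequestHeader("Authorization", "Basic " + token);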
One more thing: use SSL.