Could someone try my code out? It was working a few days ago and now it isn't. I did not modify anything, so I suspect the webmaster of that site has blocked me. Could someone check it out for me? This is part of my school project.
public class Cost extends TimerTask {
    public void run() {
        Calendar rightNow = Calendar.getInstance();
        Integer hour = rightNow.get(Calendar.HOUR_OF_DAY);
        if (hour == 1) {
            try {
                URL tariff = new URL("http://www.emcsg.com/MarketData/PriceInformation?downloadRealtime=true");
                ReadableByteChannel tar = Channels.newChannel(tariff.openStream());
                FileOutputStream fos = new FileOutputStream("test.csv");
                fos.getChannel().transferFrom(tar, 0, 1 << 24);
            } catch (IOException ex) {
                Logger.getLogger(Cost.class.getName()).log(Level.SEVERE, null, ex);
            }
        }
        else {
        }
    }
}
First of all, clean up your IOException handling, as that might be obscuring the problem, and check that you can write to D:.
If you are being blocked by the site because of your user-agent header:
This page will show you your user-agent header: http://pgl.yoyo.org/http/browser-headers.php. Then the answer to "Setting user agent of a java URLConnection" tells you how to set your header.
You will either need to add a step between instantiating the URL and opening the stream:
URL tariff = new URL("http://www.emcsg.com/MarketData/PriceInformation?downloadRealtime=true");
java.net.URLConnection c = tariff.openConnection();
c.setRequestProperty("User-Agent", " USER AGENT STRING HERE ");
ReadableByteChannel tar = Channels.newChannel(c.getInputStream());
or you could try just doing this:
System.setProperty("http.agent", " USER AGENT STRING HERE ");
sometime before you call openStream().
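For example, a minimal sketch of that second approach in the context of your download code (the user-agent string is just a placeholder; use whatever value the header-check page above reports for your browser):
// Set this once, early on (before the first HTTP connection is opened in the JVM);
// later calls may be ignored.
System.setProperty("http.agent", "Mozilla/5.0 (placeholder user agent)");

URL tariff = new URL("http://www.emcsg.com/MarketData/PriceInformation?downloadRealtime=true");
ReadableByteChannel tar = Channels.newChannel(tariff.openStream());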
Edit: This works for me. Can you try running it and let us know the output:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLConnection;

public class TestURL {
    public static void main(String[] args) {
        try {
            URL tariff = new URL("http://www.emcsg.com/MarketData/PriceInformation?downloadRealtime=true");
            URLConnection c = tariff.openConnection();
            BufferedReader br = new BufferedReader(new InputStreamReader(c.getInputStream()));
            System.out.println(br.readLine());
        } catch (IOException ex) {
            ex.printStackTrace();
        }
    }
}
I checked your code and, running it, I had no problem; everything works fine.
Are you working behind a proxy?
In that case you have to configure it:
System.setProperty("http.proxyHost", "my.proxy.name");
System.setProperty("http.proxyPort", "8080");
Let me summarize my problem. I am trying to download a file using Java NIO, and I have also written code to resume the download when the program is run again. My problem is this: when there is no internet connection, the download process does not stop. When the connection drops, the code does not move on to the next line and no exception is thrown; it simply waits for the internet to come back.
package com.jcg.java.nio;

import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.channels.Channels;
import java.nio.channels.FileChannel;
import java.nio.channels.ReadableByteChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class DownloadFileFromUrl {

    // File Location
    private static String filePath = "D:\\path\\app.zip";

    // Sample Url Location
    private static String sampleUrl = "server_url";

    // private static int downloaded;

    // This Method Is Used To Download A Sample File From The Url
    private static void downloadFileFromUrlUsingNio() {
        URL urlObj = null;
        ReadableByteChannel rbcObj = null;
        FileOutputStream fOutStream = null;
        long downloaded = 0l;
        try {
            long startTime = System.currentTimeMillis();
            urlObj = new URL(sampleUrl);
            HttpURLConnection httpUrlConnection = (HttpURLConnection) urlObj.openConnection();
            File file = new File("D:\\path\\app.zip");
            if (file.exists()) {
                System.out.println("if condition");
                downloaded = file.length();
                System.out.println(downloaded);
                httpUrlConnection.setRequestProperty("Range", "bytes=" + (file.length()) + "-");
            }
            else {
                httpUrlConnection.setRequestProperty("Range", "bytes=" + downloaded + "-");
            }
            httpUrlConnection.setDoInput(true);
            httpUrlConnection.setDoOutput(true);
            rbcObj = Channels.newChannel(urlObj.openStream());
            fOutStream = new FileOutputStream(filePath, true);
            fOutStream.getChannel().transferFrom(rbcObj, 0, Long.MAX_VALUE);
            System.out.println("! File Successfully Downloaded From The Url !");
            long endTime = System.currentTimeMillis();
            System.out.println(endTime);
            System.out.println(endTime - startTime);
            // System.out.println(System);
        } catch (IOException ioExObj) {
            System.out.println("Problem Occured While Downloading The File= " + ioExObj.getMessage());
        } finally {
            try {
                if (fOutStream != null) {
                    fOutStream.close();
                }
                if (rbcObj != null) {
                    rbcObj.close();
                }
            } catch (IOException ioExObj) {
                System.out.println("Problem Occured While Closing The Object= " + ioExObj.getMessage());
            }
        }
        // } else {
        //     System.out.println("File Not Present! Please Check!");
        // }
    }

    public static void main(String[] args) {
        downloadFileFromUrlUsingNio();
        // usingJavaNIO();
    }
}
In the above code, notice the following line:
fOutStream.getChannel().transferFrom(rbcObj, 0, Long.MAX_VALUE);
This is the line that downloads the file. When I disable the internet connection (no internet), control never reaches the next line
System.out.println("! File Successfully Downloaded From The Url !");
and it does not end up in the catch block either:
System.out.println("Problem Occured While Closing The Object= " + ioExObj.getMessage());
What I am trying to accomplish is that when there is no internet, the download (or the channel) should close and the rest of the code should execute normally. What actually happens is that it will not stop until I reconnect the internet (and even after reconnecting it still takes time).
In simple terms, my application will wait even for hours for the download to stop when there is no internet, and there are no errors.
Please, could someone help me overcome this scenario? I just want the download to stop when there is no internet.
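One observation, in case it helps (a sketch only, not tested): the code above sets the Range header on httpUrlConnection but then reads from urlObj.openStream(), which opens a second, unconfigured connection. Reading from the configured connection instead, and giving it connect and read timeouts, should make the transfer fail with a SocketTimeoutException (a subclass of IOException, so it lands in the existing catch block) instead of blocking forever when the network drops:
HttpURLConnection httpUrlConnection = (HttpURLConnection) urlObj.openConnection();
httpUrlConnection.setConnectTimeout(10000); // give up if the connection cannot be established
httpUrlConnection.setReadTimeout(10000);    // give up if no data arrives for 10 seconds
httpUrlConnection.setRequestProperty("Range", "bytes=" + downloaded + "-");

// Read from the connection the Range header and timeouts were set on,
// not from a fresh urlObj.openStream().
rbcObj = Channels.newChannel(httpUrlConnection.getInputStream());
fOutStream = new FileOutputStream(filePath, true);
fOutStream.getChannel().transferFrom(rbcObj, downloaded, Long.MAX_VALUE); // continue at the current end of the file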
I am trying to use java.net.HttpURLConnection to make a simple HTTP GET call and am running into something I can't explain:
public String makeGETCall(HttpURLConnection con) {
    try {
        System.out.println("About to make the request...");
        if (con == null)
            System.out.println("con is NULL");
        else {
            System.out.println("con is NOT null");
            if (con.getInputStream() == null)
                System.out.println("con's input stream is NULL");
            else
                System.out.println("con's input stream is NOT null");
        }
    } catch (Throwable t) {
        System.out.println("Error: " + t.getMessage());
    }
    System.out.println("Returning...")
    return "DUMMY DATA";
}
When I run this, I get the following console output:
About to make the request...
con is NOT null
And then the program terminates, without error. No exceptions get thrown, it doesn't exit unexpectedly, and it doesn't hang or time out... it just dies.
It seems to be dying when I check whether con.getInputStream() is null or not. But that still doesn't explain why it dies quietly without any indication of error. Any ideas? I'm willing to admit that I could have created the HttpURLConnection incorrectly, but still, there should be more indication of what is killing my program... Thanks in advance!
Your code shouldn't be compiling since this line:
System.out.println("Returning...")
has a missing semicolon. With that said, I would imagine any runs of the application are executing an old build and don't contain the new code you probably wrote.
If that's not the case, then you've pasted your code incorrectly (somehow), and I would venture a guess that there are other aspects we need to see. If you've edited the code in some way for Stack Overflow, would you mind sharing the original?
Additionally, I would recommend against catching Throwable unless you have a good reason to. It's typically bad practice to mask application errors that way.
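For instance, something along these lines (just a sketch of that suggestion, not the original poster's code):
try {
    System.out.println("About to make the request...");
    // getInputStream() declares IOException, so that is the exception worth catching here.
    System.out.println(con.getInputStream() == null
            ? "con's input stream is NULL" : "con's input stream is NOT null");
} catch (IOException e) {
    System.out.println("Error: " + e.getMessage());
}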
It seems to be dying when I check con.getInputStream() for being null or not.
Well I find that hard to believe.
On the face of it:
testing a reference to see if it is null cannot terminate your program
con.getInputStream() can either throw an exception or return
if it does return, it shouldn't return null, because the API doesn't allow it, and you would see a message
if an exception was thrown, you would see the Error: ... message
My conclusion is that either a different thread is causing the application to exit, or the application is not exiting at all.
The only other explanation I can think of is that your edit / compile / deploy / run procedure is not working, and the code that is actually being run doesn't match the code you are showing us.
I suggest that you attempt to create a SSCCE for this so that other people can reproduce it and figure out what is actually happening.
Same problem here. I traced my code and it gets stuck at con.getInputStream() forever.
To reproduce the problem, run the code example below (put in your correct URL).
A) Start any HTTPS Server on another Host
B) Start the Client Code
C) Shutdown HTTPS Server
D) Start HTTPS Server again
-> Stuck at con.getInputStream()
While the HTTPS server is restarting, it seems like some deadlock occurs in the client.
FYI, I am using the bundle org.apache.felix.http.jetty as the HTTP(S) server, with a Restlet servlet attached.
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.MalformedURLException;
import java.net.URL;
import java.security.KeyManagementException;
import java.security.NoSuchAlgorithmException;
import javax.net.ssl.HostnameVerifier;
import javax.net.ssl.HttpsURLConnection;
import javax.net.ssl.SSLContext;
import javax.net.ssl.SSLSession;
import javax.net.ssl.TrustManager;
import javax.net.ssl.X509TrustManager;

public class TestHTTPS {

    public static void main(String[] args) throws InterruptedException {
        new TestHTTPS().activate();
    }

    private void activate() throws InterruptedException {
        TrustManager[] insecureTrustManager = new TrustManager[] {
            new X509TrustManager() {
                public java.security.cert.X509Certificate[] getAcceptedIssuers() {
                    return null;
                }
                public void checkClientTrusted(
                        java.security.cert.X509Certificate[] certs, String authType) {
                }
                public void checkServerTrusted(
                        java.security.cert.X509Certificate[] certs, String authType) {
                }
            }
        };
        try {
            SSLContext sc = SSLContext.getInstance("SSL");
            sc.init(null, insecureTrustManager, new java.security.SecureRandom());
            HttpsURLConnection.setDefaultSSLSocketFactory(sc.getSocketFactory());
        } catch (KeyManagementException e1) {
            // TODO Auto-generated catch block
            e1.printStackTrace();
        } catch (NoSuchAlgorithmException e) {
            e.printStackTrace();
        }
        HostnameVerifier allHostsValid = new HostnameVerifier() {
            public boolean verify(String hostname, SSLSession session) {
                return true;
            }
        };
        HttpsURLConnection.setDefaultHostnameVerifier(allHostsValid);
        String https_url = "https://192.168.xx.xx:8443";
        URL url;
        try {
            url = new URL(https_url);
            while (true) {
                HttpsURLConnection con = (HttpsURLConnection) url.openConnection();
                con.setConnectTimeout(1000);
                print_content(con);
                Thread.sleep(100);
            }
        } catch (MalformedURLException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    private void print_content(HttpsURLConnection con) {
        if (con != null) {
            try {
                System.out.println("****** Content of the URL ********");
                BufferedReader br =
                        new BufferedReader(
                                new InputStreamReader(con.getInputStream()));
                String input;
                while ((input = br.readLine()) != null) {
                    System.out.println(input);
                }
                br.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}
Any recommendations welcome.
Fixed it.
con.setReadTimeout(1000)
Obviously the HTTP server accepts a connection but is unable to fulfil the request when you connect at the wrong moment while the server is starting. setReadTimeout causes the thread to throw a SocketTimeoutException.
Hope this helps someone else solve the problem...
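In the polling loop from the example above, the fix is one extra call per connection (a sketch; the 1000 ms values are the ones already used in the example and can be tuned):
while (true) {
    HttpsURLConnection con = (HttpsURLConnection) url.openConnection();
    con.setConnectTimeout(1000); // give up if the connection cannot be established in time
    con.setReadTimeout(1000);    // give up if the server accepts but never sends a response
    print_content(con);          // getInputStream() now fails with SocketTimeoutException instead of hanging
    Thread.sleep(100);
}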
As this problem may also occur while using Restlet with the internal connector, here is the solution for Restlet. You have to take care of a hostnameVerifier and an SSLContextFactory yourself:
context = new Context();
context.getParameters().add("readTimeout", Integer.toString(1000));
context.getAttributes().put("sslContextFactory", new YourSslContextFactory());
context.getAttributes().put("hostnameVerifier", new YourHostnameVerifier());
client = new Client(context, Protocol.HTTPS);
Make sure org.restlet.ext.ssl, org.restlet.ext.net, and org.restlet.ext.httpclient are in your classpath.
Best Regards
I want to send a command to a server, and find out if I get a response.
Right now I am using BufferedReader's readLine() method, which blocks until there's a response from the server, but all I want to do is verify that there is a response from the server in the first place.
I tried using ready() or reset() to avoid this block, but they don't help.
This is causing my program to get stuck waiting for the server to respond, which never happens. InputStreamReader seems to do the same thing, by my understanding.
Other questions I found here on the subject didn't answer mine, so if you can answer it, that would be great.
If you want to read responses asynchronously, I suggest starting a thread which reads from a BufferedReader. This is much simpler to code and easier to control.
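A minimal sketch of that idea, assuming you already have a connected Socket (socket and handleLine are placeholder names, and the usual java.io imports are assumed):
// The reader thread blocks on readLine() so the rest of the program doesn't have to.
Thread reader = new Thread(new Runnable() {
    public void run() {
        try {
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(socket.getInputStream()));
            String line;
            while ((line = in.readLine()) != null) {
                handleLine(line); // hand each response line to your own code
            }
        } catch (IOException e) {
            // connection closed or failed; decide how your application should react
        }
    }
});
reader.setDaemon(true);
reader.start();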
Maybe all you need is the InputStream, without wrapping it in a BufferedReader:
byte[] tmp = new byte[1024];                 // read buffer
StringBuilder strBuff = new StringBuilder(); // accumulated response
while (inputStream.available() > 0) {
    int i = inputStream.read(tmp, 0, 1024);
    if (i < 0)
        break;
    strBuff.append(new String(tmp, 0, i));
}
I hope this helps.
It's a tricky task to avoid blocking if you use standard Java IO. The common answer is to migrate to NIO or Netty; Netty is the preferable choice. However, sometimes you don't have a choice, so I suggest you try my workaround:
public String readResponse(InputStream inStreamFromServer, int timeout) throws Exception {
    BufferedReader reader = new BufferedReader(new InputStreamReader(inStreamFromServer, Charsets.UTF_8));
    char[] buffer = new char[8092];
    boolean timeoutNotExceeded;
    StringBuilder result = new StringBuilder();
    final long startTime = System.nanoTime();
    while ((timeoutNotExceeded = (TimeUnit.NANOSECONDS.toMillis(System.nanoTime() - startTime) < timeout))) {
        if (reader.ready()) {
            int charsRead = reader.read(buffer);
            if (charsRead == -1) {
                break;
            }
            result.append(buffer, 0, charsRead);
        } else {
            try {
                Thread.sleep(timeout / 200);
            } catch (InterruptedException ex) {
                LOG.error("InterruptedException ex=", ex);
            }
        }
    }
    if (!timeoutNotExceeded) throw new SocketTimeoutException("Command timeout limit was exceeded: " + timeout);
    return result.toString();
}
This workaround isn't a silver bullet, but it has some important features:
It doesn't use readLine(). That method is dangerous for network communication because some servers don't return LF/CR symbols and your code will get stuck. When you read from a file it isn't critical; you will reach the end of the file anyway.
It doesn't read one character at a time with char symbol = (char) fr.read();. That approach is slower than reading into a char[].
It has timeout functionality, so you can interrupt communication on slow connections.
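Usage could look something like this (a sketch; the host, port, and command are placeholders, and readResponse is the method above):
try (Socket socket = new Socket("example.com", 12345)) {
    socket.getOutputStream().write("MY COMMAND\r\n".getBytes(StandardCharsets.UTF_8));
    socket.getOutputStream().flush();

    // Wait at most two seconds for the server to say something back.
    String response = readResponse(socket.getInputStream(), 2000);
    System.out.println("Server replied: " + response);
} catch (SocketTimeoutException e) {
    System.out.println("No response from the server within the timeout");
} catch (Exception e) {
    e.printStackTrace(); // readResponse declares throws Exception
}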
I did something similar recently using a CountDownLatch. There may be better ways, but this is pretty easy and seems to work reasonably well. You can adjust the wait time of the CountDownLatch to suit your needs.
package com.whatever;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLConnection;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class TestRead {

    private static final Logger log = LoggerFactory.getLogger(TestRead.class);

    private CountDownLatch latch = new CountDownLatch(1);

    public void read() {
        URLReader urlReader = new URLReader();
        Thread listener = new Thread(urlReader);
        listener.setDaemon(true);
        listener.start();
        boolean success = false;
        try {
            success = latch.await(20000, TimeUnit.MILLISECONDS);
        } catch (InterruptedException e) {
            log.error("error", e);
        }
        log.info("success: {}", success);
    }

    class URLReader implements Runnable {
        public void run() {
            log.info("run...");
            try {
                URL oracle = new URL("http://www.oracle.com/");
                URLConnection yc = oracle.openConnection();
                BufferedReader in = new BufferedReader(new InputStreamReader(yc.getInputStream()));
                String inputLine;
                while ((inputLine = in.readLine()) != null)
                    System.out.println(inputLine);
                in.close();
                latch.countDown();
            } catch (Exception ex) {
                log.error("error", ex);
            }
            log.info("consumer is done");
        }
    }

    public static void main(String[] args) {
        TestRead testRead = new TestRead();
        testRead.read();
    }
}
Simple stuff: I am learning URLs/networking in my class and I am trying to display something on a webpage. Later I am going to connect it to a MySQL DB... anyway, here is my program:
import java.net.*;
import java.io.*;

public class asp {

    public static URLConnection connection;

    public static void main(String[] args) {
        try {
            System.out.println("Hello World!"); // Display the string.
            try {
                URLConnection connection = new URL("post.php?players").openConnection();
            } catch (MalformedURLException rex) {}
            InputStream response = connection.getInputStream();
            System.out.println(response);
        } catch (IOException ex) {}
    }
}
It compiles fine... but when I run it I get:
Hello World!
Exception in thread "main" java.lang.NullPointerException
at asp.main(asp.java:17)
Line 17: InputStream response = connection.getInputStream();
Thanks,
Dan
You have a malformed URL, but you wouldn't know because you swallowed its exception!
URL("post.php?players")
This URL is not complete: it is missing the host (maybe localhost for you?) and the protocol part, say http. To avoid the MalformedURLException you have to provide the full URL, including the protocol:
new URL("http://www.somewhere-dan.com/post.php?players")
Use the Sun tutorial on URLConnection first. That snippet is at least known to work; if you substitute the URL in that example with a valid URL, you should have a working piece of code.
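For reference, the tutorial's approach boils down to something like this (a sketch from memory, with the URL swapped for the one in question; don't treat it as an exact copy of the tutorial):
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLConnection;

public class ReadFromURL {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://localhost/post.php?players"); // full URL, with protocol and host
        URLConnection connection = url.openConnection();
        BufferedReader in = new BufferedReader(new InputStreamReader(connection.getInputStream()));
        String line;
        while ((line = in.readLine()) != null) {
            System.out.println(line); // print the response body line by line
        }
        in.close();
    }
}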
It's because your URL is not valid. You need to put the full address of the page you are trying to open a connection to. You are catching the MalformedURLException, but that means there is no "connection" object at that point. It appears you have an extra closing bracket after the first catch block as well. You should put the line that you are getting the NullPointerException for, and the System.out.println, above the catch blocks:
import java.net.*;
import java.io.*;

public class asp {

    public static URLConnection connection;

    public static void main(String[] args) {
        try {
            System.out.println("Hello World!"); // Display the string.
            try {
                URLConnection connection = new URL("http://localhost/post.php?players").openConnection();
                InputStream response = connection.getInputStream();
                System.out.println(response);
            } catch (MalformedURLException rex) {
                System.out.println("Oops my url isn't right");
            }
        } catch (IOException ex) {}
    }
}
I'm trying to create a simple Flash chat application for educational purposes, but I'm stuck trying to send a policy file from my Java server to the Flash app (after several hours of googling with little luck).
The policy file request reaches the server, which sends a hardcoded policy XML back to the app, but the Flash app doesn't seem to react to it at all, and eventually it gives me a security sandbox error.
I'm loading the policy file using the following code in the client:
Security.loadPolicyFile("xmlsocket://myhostname:" + PORT);
The server recognizes the request as "<policy-file-request/>" and responds by sending the following xml string to the client:
public static final String POLICY_XML =
        "<?xml version=\"1.0\"?>"
        + "<cross-domain-policy>"
        + "<allow-access-from domain=\"*\" to-ports=\"*\" />"
        + "</cross-domain-policy>";
The code used to send it looks like this:
try {
    _dataOut.write(PolicyServer.POLICY_XML + (char) 0x00);
    _dataOut.flush();
    System.out.println("Policy sent to client: " + PolicyServer.POLICY_XML);
} catch (Exception e) {
    trace(e);
}
Did I mess something up with the xml or is there something else I might have overlooked?
I've seen your approach, and after some time of trying I wrote a working class that listens on any port you want:
package Server;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.ServerSocket;
import java.net.Socket;

public class PolicyServer {

    public static final String POLICY_XML =
            "<?xml version=\"1.0\"?>"
            + "<cross-domain-policy>"
            + "<allow-access-from domain=\"*\" to-ports=\"*\" />"
            + "</cross-domain-policy>";

    public PolicyServer() {
        ServerSocket ss = null;
        try {
            ss = new ServerSocket(843);
        } catch (IOException e) { e.printStackTrace(); }
        while (true) {
            try {
                final Socket client = ss.accept();
                new Thread(new Runnable() {
                    @Override
                    public void run() {
                        try {
                            client.setSoTimeout(10000); // clean up failed connections
                            client.getOutputStream().write(PolicyServer.POLICY_XML.getBytes());
                            client.getOutputStream().write(0x00); // write required end byte
                            client.getOutputStream().flush();
                            BufferedReader in = new BufferedReader(new InputStreamReader(client.getInputStream()));
                            // reading two lines empties Flash's buffer and magically it works!
                            in.readLine();
                            in.readLine();
                        } catch (IOException e) {
                        }
                    }
                }).start();
            } catch (Exception e) {}
        }
    }
}
Try adding \n at the end of the policy XML.
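In the sending code from the question, one reading of that suggestion would be the following (untested; whether the newline belongs before or after the terminating null byte isn't specified):
_dataOut.write(PolicyServer.POLICY_XML + "\n" + (char) 0x00);
_dataOut.flush();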