I have a program that downloads stock data from the internet and then creates Stock objects, each containing the stock's metadata and an array of historical prices.
At first I iterated over more than 20,000 stocks, created them, and added them all to an ArrayList so I could write them to the DB in one transaction. This wasn't a good idea: before I finished downloading all the stocks I wanted, the program died with an OutOfMemoryError.
Then I decided that after every 500 stocks added to the ArrayList, I would write them to the DB and clear the list (arraylist.clear(), so the GC could work its magic), then fill the list with the next 500 stocks and repeat the whole process.
That didn't work either, and my program died again with an OutOfMemoryError.
I thought the problem might be somewhere else in my code, so I ran an experiment with the same code but one small difference: after creating each Stock object I did not put it into the ArrayList; I just kept creating Stock objects without adding them to the list.
The result was that my program consumed almost no memory at all, which left me really confused and frustrated.
Please help me find out what is wrong with my program.
Here are some lines of the code:
The first version:
ArrayList<Stock> stocksData = new ArrayList<Stock>();
Stock stock;
BufferedReader br = null;
String line = "";
try {
    br = new BufferedReader(new FileReader(YAHOO_STOCKS_SYMBOLS));
    while ((line = br.readLine()) != null) {
        String[] stockMetaData = line.split(CSV_SPLIT);
        stock = new Stock();
        if (stockMetaData.length >= 4) {
            stock.setSymbol(stockMetaData[0]);
            stock.setName(stockMetaData[1]);
            stock.setExchange(stockMetaData[2]);
            stock.setCategory(stockMetaData[3]);
            DATA_UPDATER.updateStockData(stock, fromDate, toDate);
            if (stock.getHistoricalData() != null
                    && stock.getHistoricalData().size() > 0) {
                stocksData.add(stock);
            }
        }
    }
}
The second version:
ArrayList<Stock> stocksData = new ArrayList<Stock>();
Stock stock;
BufferedReader br = null;
String line = "";
try {
    br = new BufferedReader(new FileReader(YAHOO_STOCKS_SYMBOLS));
    while ((line = br.readLine()) != null) {
        String[] stockMetaData = line.split(CSV_SPLIT);
        stock = new Stock();
        if (stockMetaData.length >= 4) {
            stock.setSymbol(stockMetaData[0]);
            stock.setName(stockMetaData[1]);
            stock.setExchange(stockMetaData[2]);
            stock.setCategory(stockMetaData[3]);
            DATA_UPDATER.updateStockData(stock, fromDate, toDate);
            if (stock.getHistoricalData() != null
                    && stock.getHistoricalData().size() > 0) {
                stocksData.add(stock);
            }
        }
        if (stocksData.size() == 500) {
            WritingUtils.getInstance().updateStocks(stocksData);
            stocksData.clear();
            // I also tried
            // System.gc();
        }
    }
}
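One subtle issue with this version: the final partial batch (the fewer-than-500 stocks left when the file ends) is never written. The batch-and-clear pattern can be factored out so that is harder to get wrong. Below is a minimal sketch; BatchWriter is a hypothetical name, and the question's WritingUtils.getInstance().updateStocks(...) would be the sink:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Hypothetical helper: collects items and hands them to a sink in fixed-size batches.
public class BatchWriter<T> {
    private final int batchSize;
    private final Consumer<List<T>> sink;
    private final List<T> buffer = new ArrayList<>();

    public BatchWriter(int batchSize, Consumer<List<T>> sink) {
        this.batchSize = batchSize;
        this.sink = sink;
    }

    public void add(T item) {
        buffer.add(item);
        if (buffer.size() >= batchSize) {
            flush();
        }
    }

    // Must also be called once after the read loop, or the last partial batch is lost.
    public void flush() {
        if (!buffer.isEmpty()) {
            sink.accept(new ArrayList<>(buffer));
            buffer.clear();
        }
    }
}
```

With this in place the read loop only calls add(stock), followed by a single flush() after the while loop.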
I'm adding some more information:
Stock fields:
protected String _symbol;
protected String _name;
protected String _exchange;
protected Date _upToDate;
protected ArrayList<DailyData> _historicalData;
DailyData fields:
private Date _date;
private double _open;
private double _close;
private double _adjClose;
private double _high;
private double _low;
private double _volume;
First, I would start by profiling the application with a tool like JVisualVM (which comes with the JDK) and pinpointing exactly which objects are being retained in the heap.
Second, there is no clean() method on ArrayList, so it's not clear what you actually did to prevent your list from growing indefinitely.
Lastly, consider setting a conditional breakpoint in your program to 'catch' when the List grows beyond your expectation.
EDIT: If what you're asking now is:
'Can my OutOfMemoryError be due to the fact that:
protected ArrayList<DailyData> _historicalData;
is growing and contributing to the memory pressure?'
The answer is yes. Have you also tried increasing the memory available to your process? Try running:
java -Xms1g -Xmx1g *yourProgram*
to see if it still fails. You may not have a leak; it may just be that your application needs more memory for what you are doing.
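A rough back-of-the-envelope estimate supports that. Using the DailyData fields from the question, each entry costs on the order of 100 bytes; the per-object overheads below and the figure of ~2,500 trading days (about 10 years) per stock are assumptions, and real JVM overhead varies:

```java
// Hypothetical sizing sketch; actual sizes depend on the JVM and its settings.
public class HeapEstimate {
    // ~16 B object header + 8 B Date reference + 6 doubles + ~24 B for the Date object.
    static long bytesPerDailyData() {
        return 16 + 8 + 6 * 8 + 24;
    }

    static long bytesPerStock(int tradingDays) {
        return bytesPerDailyData() * tradingDays;
    }

    public static void main(String[] args) {
        long perStock = bytesPerStock(2_500); // ~10 years of daily data (assumed)
        System.out.println("per stock  ~" + perStock / 1024 + " KiB");
        System.out.println("500 stocks ~" + 500 * perStock / (1024 * 1024) + " MiB");
    }
}
```

That works out to roughly 230 KiB per fully populated stock and over 100 MiB for a batch of 500, so a default-sized heap can be exhausted even with batching if references to already-written stocks remain reachable.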
So I'm currently trying to test how long it takes for an AVLTree to be populated with different amounts of data. When adding small amounts of data to the tree like 16 objects or 2824 objects, my program is able to run within seconds (I'm using my phone stopwatch to record the time). When I add ~11000 objects to the tree however, it takes around 28 seconds and when I try to add 250000 objects to the tree my program seems to never finish running.
Within the program, I have added a variable that stores the current time at the beginning of the code and another variable that also stores the current time at the end so that I can roughly calculate how long it took the program to run. What is interesting though is that when I populated the AVLTree with 2824 objects it says it took ~1.56 seconds to run and when I populated the AVLTree with ~11000 objects it apparently took the same time to run.
Seeing ~28 seconds on my stopwatch compared with the ~1.56 seconds my program reports makes it really hard to tell what is actually causing the problem. The IDE I'm currently using is Eclipse Java Oxygen, by the way.
public class AVLTreeTester {
    public static void main(String[] args) {
        long start = System.currentTimeMillis();
        AVLTree<IPAddress, URL> avlTree = new AVLTree<IPAddress, URL>();
        BufferedReader br = null;
        String line;
        try {
            br = new BufferedReader(new FileReader("src/data/top-250k.txt"));
            while ((line = br.readLine()) != null) {
                String[] values = line.split("\t");
                IPAddress newIP = new IPAddress(values[1]);
                URL newURL = new URL(values[0]);
                avlTree.add(newIP, newURL);
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
        avlTree = null;
        long stop = System.currentTimeMillis();
        double time = stop - start / 1000;
        System.out.println("Took: " + time + " second(s)");
    }
}
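One thing worth noting about the snippet above, independent of the AVLTree itself: in 'double time = stop - start / 1000;' operator precedence divides only start by 1000, and does so with integer division, which would explain nonsensical timing numbers. A minimal sketch of the parenthesized calculation (the 100 ms sleep is just a stand-in for the work being timed):

```java
public class Timing {
    // (stop - start) must be parenthesized, and dividing by 1000.0
    // (not 1000) keeps the fractional part of the seconds.
    static double elapsedSeconds(long startMillis, long stopMillis) {
        return (stopMillis - startMillis) / 1000.0;
    }

    public static void main(String[] args) throws InterruptedException {
        long start = System.currentTimeMillis();
        Thread.sleep(100); // stand-in for populating the tree
        long stop = System.currentTimeMillis();
        System.out.println("Took: " + elapsedSeconds(start, stop) + " second(s)");
    }
}
```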
I'm trying to write a curl-like program in Java that uses only Java socket programming (and not Apache HttpClient or any other APIs).
I want to give the user the option of seeing either the whole response or only the body of the response to their GET request. I currently have the following code:
BufferedReader br = new BufferedReader(new InputStreamReader(s.getInputStream()));
String t;
while ((t = br.readLine()) != null) {
    if (t.isEmpty() && !parameters.isVerbose()) {
        StringBuilder responseData = new StringBuilder();
        while ((t = br.readLine()) != null) {
            responseData.append(t).append("\r\n");
        }
        System.out.println(responseData.toString());
        parameters.verbose = false;
        break;
    } else if (parameters.isVerbose()) { // handle output
        System.out.println(t);
    }
}
br.close();
When the verbose option is on, it works quickly and shows the whole response in less than a second, but when I only want the body of the message it takes much too long (approx. 10 seconds) to produce it.
Does anyone know how this can be processed faster?
Thank you.
I'm going to assume that by slow you mean it starts displaying something almost immediately but keeps printing lines for a long time. Writing to the console takes time, and you're printing each line individually, while in the other code path you first store the entire response in memory and then flush it to the console.
If the verbose response is small enough to fit in memory, you should do the same; otherwise you can decide on an arbitrary number of lines to print in batches (i.e. accumulate n lines in memory, flush them to the console, clear the StringBuilder, and repeat).
The most elegant way to implement my suggestion is to use a PrintStream wrapping a BufferedOutputStream, itself wrapping System.out. All my comments and advice are condensed in the following snippet:
private static final int BUFFER_SIZE = 4096;

public static void printResponse(Socket socket, Parameters parameters) throws IOException {
    try (BufferedReader br = new BufferedReader(new InputStreamReader(socket.getInputStream()));
         PrintStream printStream = new PrintStream(new BufferedOutputStream(System.out, BUFFER_SIZE))) {
        // there is no functional difference in your code between the verbose and non-verbose code paths
        // (they have the same output). That's a bug, but I'm not fixing it in my snippet as I don't know
        // what you intended to do.
        br.lines().forEach(line -> printStream.append(line).append("\r\n"));
    }
}
If it uses any language construct you don't know about, feel free to ask further questions.
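If you'd rather batch explicitly, as described earlier (accumulate n lines, flush, clear the StringBuilder, repeat), a sketch could look like this. BatchedPrinter is a hypothetical name, and the sink is a Consumer<String> so that System.out::print, or anything else, can be plugged in:

```java
import java.util.function.Consumer;

// Hypothetical helper: buffers lines and emits them to the sink in batches.
public class BatchedPrinter {
    private final StringBuilder buf = new StringBuilder();
    private final int batchLines;
    private final Consumer<String> out; // e.g. System.out::print
    private int pending = 0;

    public BatchedPrinter(int batchLines, Consumer<String> out) {
        this.batchLines = batchLines;
        this.out = out;
    }

    public void println(String line) {
        buf.append(line).append("\r\n");
        if (++pending >= batchLines) {
            flush();
        }
    }

    public void flush() {
        if (pending > 0) {
            out.accept(buf.toString());
            buf.setLength(0); // clear and reuse the StringBuilder
            pending = 0;
        }
    }
}
```

Calling flush() once at the end pushes out whatever is left of the last partial batch.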
Note: I understand that the console is for debugging and games should use a GUI. This is for testing/experience.
I'm writing a game that runs at 60fps. Every update, I check whether the user has entered a String command. If so, it gets passed through; if not, null is passed through, and the null is ignored.
Scanner is out of the question, since hasNext(), the method used to check whether there is data to read, can potentially block and cause problems.
I've tried using BufferedReader.ready(). Not only did I have problems (it never returned true), but I've also read that it's not recommended for a few reasons.
available() always returned 0, and the documentation states that InputStream.available() will always return 0 unless overridden. Here is my attempt:
class Game {
    public static void main(String[] args) {
        InputReader reader = new InputReader(System.in);
        int timePerLoop = 1000 / 30;
        Game game = new Game();
        while (true) {
            long start = System.nanoTime();
            game.update(reader.next());
            long end = System.nanoTime();
            long sleepTime = timePerLoop + ((start - end) / 10000000);
            if (sleepTime > 0) {
                try {
                    Thread.sleep(sleepTime);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            } else {
                Thread.yield();
            }
        }
    }

    public void update(String command) {
        if (command != null) {
            // handle command
        }
        // update game
    }
}
InputReader.java
public class InputReader {
    private InputStream in;

    public InputReader(InputStream stream) {
        in = stream;
    }

    public String next() {
        String input = null;
        try {
            while (in.available() > 0) {
                if (input == null)
                    input = "";
                input += (char) in.read();
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
        return input;
    }
}
InputStream by itself has the same problem as described above. I'm not completely sure what type of object is stored in System.in, but using available() yields the same result.
I've tried using reader() from System.console(), but console() returns null. I've read into the subject, and I'm not confused about why; this is simply not the way to do it.
The goal is to check the stream for available data, so that I can read it knowing the read won't block.
I do not want to use a separate thread to handle user input, so please don't recommend that or ask why.
The input has to come from the console, and no new sockets are to be created in the process. I have read a few topics about this, but none of them clearly states a solution. Is this possible?
As you have said yourself, a custom GUI or an additional thread is the correct way to do this. However, in the absence of that, have you tried using readLine()? For example: String inputR = System.console().readLine();
Some alterations to main():
Replace: InputReader reader = new InputReader(System.in); with:
Console c = System.console();
Replace: game.update(reader.next());
with: game.update(c.readLine());
Edit: This thread could also be helpful: Java: How to get input from System.console()
I've got two problems with the Android app I'm writing.
I'm reading the local ARP table from /proc/net/arp and saving each IP and its corresponding MAC address in a HashMap. See my function below; it works properly.
/**
 * Extract and save the IP and corresponding MAC address from the ARP table into a HashMap.
 */
public Map<String, String> createArpMap() throws IOException {
    checkMapARP.clear();
    BufferedReader localBufferedReader = new BufferedReader(new FileReader(new File("/proc/net/arp")));
    String line = "";
    while ((line = localBufferedReader.readLine()) != null) {
        String[] ipmac = line.split("[ ]+");
        if (!ipmac[0].matches("IP")) {
            String ip = ipmac[0];
            String mac = ipmac[3];
            if (!checkMapARP.containsKey(ip)) {
                checkMapARP.put(ip, mac);
            }
        }
    }
    return Collections.unmodifiableMap(checkMapARP);
}
First problem:
I'm also using a broadcast receiver. When my app receives WifiManager.NETWORK_STATE_CHANGED_ACTION, I check whether the connection to the gateway is established. If it is, I call my function to read the ARP table. But at this stage the system has not yet built up the ARP table, so sometimes when I receive the connection state the table is still empty.
Does anyone have an idea how to solve this?
Second problem:
I want to save the IP and MAC address of the gateway persistently. Right now I'm using SharedPreferences for this. Would it be better to write to internal storage?
Any tips?
For the first problem, you could start a new thread that runs that method after sleeping for a set amount of time, or until the table has some entries (make a Runnable with a mailbox to deliver the Map), unless you need to use the map directly; in that case I think the only way is to wait for the entries. For example (if you need to use the map directly):
public Map<String, String> createArpMap() throws IOException, InterruptedException {
    checkMapARP.clear();
    BufferedReader localBufferedReader = new BufferedReader(new FileReader(new File("/proc/net/arp")));
    String line;
    // Re-open and retry until the table has at least one line.
    while ((line = localBufferedReader.readLine()) == null) {
        localBufferedReader.close();
        Thread.sleep(1000);
        localBufferedReader = new BufferedReader(new FileReader(new File("/proc/net/arp")));
    }
    do {
        String[] ipmac = line.split("[ ]+");
        if (!ipmac[0].matches("IP")) {
            String ip = ipmac[0];
            String mac = ipmac[3];
            if (!checkMapARP.containsKey(ip)) {
                checkMapARP.put(ip, mac);
            }
        }
    } while ((line = localBufferedReader.readLine()) != null);
    localBufferedReader.close();
    return Collections.unmodifiableMap(checkMapARP);
}
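The 'Runnable with a mailbox' variant could be sketched roughly like this; the names, the poll interval, and the attempt limit are all assumptions, and the Callable passed in would be a wrapper around createArpMap():

```java
import java.util.Map;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ArpPoller {
    // Polls until the reader returns a non-empty map, then completes the future.
    public static Future<Map<String, String>> pollArpTable(
            Callable<Map<String, String>> readArp, long intervalMillis, int maxAttempts) {
        ExecutorService executor = Executors.newSingleThreadExecutor();
        Future<Map<String, String>> result = executor.submit(() -> {
            for (int attempt = 0; attempt < maxAttempts; attempt++) {
                Map<String, String> table = readArp.call();
                if (!table.isEmpty()) {
                    return table;
                }
                Thread.sleep(intervalMillis);
            }
            return Map.of(); // gave up: the table stayed empty
        });
        executor.shutdown(); // the submitted task still runs to completion
        return result;
    }
}
```

The Future acts as the mailbox: the broadcast receiver kicks off the poll, and whoever needs the map calls get() (or checks isDone()) later.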
I'm writing a multithreaded application in Java that I want to be able to pause and resume.
The thread reads a file line by line, looking for lines that match a pattern. It has to continue from the place where I paused it. To read the file I use a BufferedReader in combination with an InputStreamReader and a FileInputStream.
fip = new FileInputStream(new File(*file*));
fileBuffer = new BufferedReader(new InputStreamReader(fip));
I use the FileInputStream because I need the file pointer for the position in the file.
When processing the lines, it writes the matching lines to a MySQL database. To share the MySQL connection between threads I use a connection pool, which makes sure only one thread at a time uses a connection.
The problem is that when I pause the threads and resume them, a few matching lines just disappear. I also tried subtracting the buffer size from the offset, but that still has the same problem.
What is a decent way to solve this, or what am I doing wrong?
Some more details:
The loop
// Regex engine
RunAutomaton ra = new RunAutomaton(this.conf.getAuto(), true);
lw = new LogWriter();
while ((line = fileBuffer.readLine()) != null) {
    if (line.length() > 0) {
        if (ra.run(line)) {
            // Write to LogWriter
            lw.write(line, this.file.getName());
            lw.execute();
        }
    }
}
// Loop when paused.
while (pause) { }
}
Calculating the position in the file
// Get the position in the file
public long getFilePosition() throws IOException {
    long position = fip.getChannel().position() - bufferSize + fileBuffer.getNextChar();
    return position;
}
Putting it into the database
// Get the connector
ConnectionPoolManager cpl = ConnectionPoolManager.getManager();
Connector con = null;
while (con == null) {
    con = cpl.getConnectionFromPool();
}
// Insert the query
con.executeUpdate(this.sql.toString());
cpl.returnConnectionToPool(con);
Here's an example of what I believe you're looking for. You didn't show much of your implementation, so it's hard to debug what might be causing the gaps for you. Note that the position of the FileInputStream is going to be a multiple of 8192, because BufferedReader uses a buffer of that size by default. If you want to use multiple threads to read the same file, you might find this answer helpful.
public class ReaderThread extends Thread {
    private final FileInputStream fip;
    private final BufferedReader fileBuffer;
    private volatile boolean paused;

    public ReaderThread(File file) throws FileNotFoundException {
        fip = new FileInputStream(file);
        fileBuffer = new BufferedReader(new InputStreamReader(fip));
    }

    public void setPaused(boolean paused) {
        this.paused = paused;
    }

    public long getFilePos() throws IOException {
        return fip.getChannel().position();
    }

    public void run() {
        try {
            String line;
            while ((line = fileBuffer.readLine()) != null) {
                // process your line here
                System.out.println(line);
                while (paused) {
                    sleep(10);
                }
            }
        } catch (IOException e) {
            // handle I/O errors
        } catch (InterruptedException e) {
            // handle interrupt
        }
    }
}
I think the root of the problem is that you shouldn't be subtracting bufferSize. Rather you should be subtracting the number of unread characters in the buffer. And I don't think there's a way to get this.
The easiest solution I can think of is to create a custom subclass of FilterReader that keeps track of the number of characters read. Then stack the streams as follows:
FileReader
< BufferedReader
< custom filter reader
< BufferedReader(sz == 1)
The final BufferedReader is there so that you can use readLine ... but you need to set the buffer size to 1 so that the character count from your filter matches the position that the application has reached.
Alternatively, you could implement your own readLine() method in the custom filter reader.
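A minimal sketch of that custom filter reader follows; it counts characters consumed from the stream, so mapping the count back to a byte offset additionally assumes a single-byte-per-character encoding:

```java
import java.io.FilterReader;
import java.io.IOException;
import java.io.Reader;

// Counts every character that passes through, so the application can
// track how far it has actually consumed the underlying stream.
public class CountingReader extends FilterReader {
    private long charsRead = 0;

    public CountingReader(Reader in) {
        super(in);
    }

    @Override
    public int read() throws IOException {
        int c = super.read();
        if (c != -1) charsRead++;
        return c;
    }

    @Override
    public int read(char[] cbuf, int off, int len) throws IOException {
        int n = super.read(cbuf, off, len);
        if (n > 0) charsRead += n;
        return n;
    }

    public long getCharsRead() {
        return charsRead;
    }
}
```

Stacked as described above, with the outer BufferedReader's buffer size set to 1, getCharsRead() matches the position the application has reached.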
After a few days of searching I found out that subtracting the buffer size and adding the position in the buffer was indeed not the right way to do it. The position was never right, and I was always missing some lines.
While looking for a new approach I didn't count characters, because there are just too many to count and doing so would hurt performance a lot. But I found something else: software engineer Mark S. Kolich created a class, JumpToLine, which uses the Apache commons-io library to jump to a given line. It can also report the last line it has read, so this is exactly what I need.
There are some examples on his homepage for those interested.