This question already has answers here:
How many characters can a Java StringBuilder hold?
(3 answers)
Stringbuilder maximum length
(2 answers)
Android - Set max length of logcat messages
(14 answers)
Closed 5 years ago.
I'm creating a StringBuilder with its initial capacity set to 8192 and appending lines to fill it with a little under 6000 characters in this case.
When I write the StringBuilder.toString() value out to the log, it cuts off roughly the last quarter of the whole String. This isn't the first time I've noticed Android doing this with similarly large strings written out to the log.
When I run the same code in Java on a Linux desktop machine, there is no such problem/behaviour: everything is written out.
Is there some sort of limit I don't know about? Do I have to write out everything line by line in separate calls?
Thank you.
Is the Android Java StringBuilder limit 4096 characters?
No.
Is there some sort of limit I don't know about?
Yes. LogCat will not log arbitrarily-long messages.
Do I have to write out everything line by line in separate calls?
Well, I would log something smaller, or perform other sorts of diagnostics. But, yes, you could split the string into chunks and log those chunks individually.
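If you do go the chunking route, here is a minimal sketch; the LogUtil class, the logLong helper, and the 4000-character chunk size are illustrative assumptions rather than anything mandated by Android (logcat's per-message limit is roughly 4 KB).

import android.util.Log;

public final class LogUtil {
    private static final int CHUNK = 4000; // stay safely under logcat's ~4 KB per-message limit

    // Splits a long message into chunks and logs each one separately,
    // so logcat does not silently truncate the output.
    public static void logLong(String tag, String message) {
        for (int start = 0; start < message.length(); start += CHUNK) {
            int end = Math.min(message.length(), start + CHUNK);
            Log.d(tag, message.substring(start, end));
        }
    }
}

Calling LogUtil.logLong("MyTag", sb.toString()) would then emit the whole 6000-character string as two consecutive log messages.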
This is a display issue in the debugger/logcat viewer: the full content should still be there, but not everything is displayed.
See: https://stackoverflow.com/a/43537128/2890156
This question already has answers here:
Closing Streams in Java
(6 answers)
Closed 6 years ago.
With huge advancements in CPUs able to process massive amounts of information in fractions of a second, why is it important that I close a file stream?
Remember that not all devices are the same; platforms like mobile (smartphones and tablets) need to be as efficient as possible. Or consider an application with a big user base: maybe when 400 people are logged in there won't be that many problems, but what happens when it grows to ~40k? You have to make your code as versatile as possible and always think about scalability.
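In code terms, the cheapest way to stay efficient is to release the handle deterministically rather than waiting for the garbage collector. A minimal try-with-resources sketch (the file name is a placeholder):

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class ReadAndClose {
    public static void main(String[] args) throws IOException {
        // try-with-resources closes the reader (and the underlying file handle)
        // automatically, even if an exception is thrown while reading.
        try (BufferedReader reader = new BufferedReader(new FileReader("data.txt"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}

Operating systems also cap the number of file descriptors a process may hold open, so a server with tens of thousands of users exhausts that cap long before raw CPU speed becomes the bottleneck.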
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 7 years ago.
I need to read a very large file (1.11 GB) into memory and process it in bytes. The only way for me to do this is to use an ArrayList (I can't use a byte[] because then it will exceed the limit).
There is no way to make the file smaller (I'm using it as a test to test how long my program processes data).
I then need to write the ArrayList back onto the hard drive as a file (still 1.11 GB).
I'm not as worried about writing as I am reading.
Also, speed is of the essence, so sub-segmenting is to be avoided unless anyone out there has a quick way of doing it.
You are trying to solve this problem the wrong way (and it won't work¹).
The possible ways to solve this are:
Redesign the algorithm so that it doesn't need to read the entire file into memory ... in one go.
Read the data into multiple byte[] objects to get around the 2^31 array size limit (see the sketch after the footnotes below).
Map the file using multiple ByteBuffer objects²; see Java MemoryMapping big files.
1 - It won't work because ArrayList has an Object[] inside, and is therefore subject to the same size limitation you have with byte arrays. In addition, an ArrayList<Byte> will take 4 to 8 times as much memory as a byte[] representing the same number of bytes. Or more, if you populate the ArrayList<Byte> with Byte objects instantiated the wrong way (i.e. with new Byte(...) instead of autoboxing or Byte.valueOf(...)).
2 - The Buffer APIs all use int sizes and offsets, and (AFAIK) do not support mapping of files >= 2^31 bytes into a single Buffer.
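As a concrete illustration of the second option above (multiple byte[] objects), here is a minimal sketch. The ChunkedReader class name, the 64 MB chunk size, and the file name are assumptions for illustration, not part of the answer itself.

import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public final class ChunkedReader {
    // Reads the whole file into a list of fixed-size byte[] chunks,
    // avoiding both the single-array 2^31 limit and the overhead of
    // boxing every byte into a Byte object.
    static List<byte[]> readInChunks(Path file, int chunkSize) throws IOException {
        List<byte[]> chunks = new ArrayList<>();
        try (InputStream in = Files.newInputStream(file)) {
            byte[] buf = new byte[chunkSize];
            int filled = 0;
            int n;
            while ((n = in.read(buf, filled, chunkSize - filled)) != -1) {
                filled += n;
                if (filled == chunkSize) {          // chunk is full, start a new one
                    chunks.add(buf);
                    buf = new byte[chunkSize];
                    filled = 0;
                }
            }
            if (filled > 0) {
                chunks.add(Arrays.copyOf(buf, filled)); // trailing partial chunk
            }
        }
        return chunks;
    }

    public static void main(String[] args) throws IOException {
        List<byte[]> chunks = readInChunks(Paths.get("big-input.bin"), 64 * 1024 * 1024);
        System.out.println("Read " + chunks.size() + " chunks");
    }
}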
This question already has answers here:
Turn Image into Text - Java [duplicate]
(4 answers)
Closed 7 years ago.
I want to search for a word in an image (a scanned copy), retrieve values from the image, and highlight their location. Is there any API or library available for processing images? I am using Swing for displaying the images.
You need something to convert the pixels into characters. That something is a program that provides OCR.
Keep in mind that any program you use will provide its best approximation of what it thinks the character is. While technology has improved a lot, there are many fonts, sufficient noise, and various other confounding factors that could result in false input (where the character is not what you would have deemed it to be). There are also scenarios where the input cannot be mapped to a character. Write your software defensively to handle both cases, as this should be considered "non validated input".
Check out "tesseract". It isn't Java, put available for most platforms open-source, and you can call the command-line program from java via System.exec()
https://code.google.com/p/tesseract-ocr/
Given the images in the correct format, its recognition rate is even better than that of many commercial OCR software products.
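One way to drive it from Java is a plain ProcessBuilder call. This is a sketch under the assumption that the tesseract binary is on the PATH and that "scan.png" stands in for your image; the "stdout" argument tells tesseract to print the recognized text instead of writing an output file.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public class OcrRunner {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Launch the command-line OCR engine and merge stderr into stdout.
        Process p = new ProcessBuilder("tesseract", "scan.png", "stdout")
                .redirectErrorStream(true)
                .start();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line);   // recognized text, line by line
            }
        }
        System.out.println("tesseract exited with code " + p.waitFor());
    }
}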
This question already has answers here:
Replace string in file
(2 answers)
Closed 8 years ago.
I have one file, which contains some strings that need to be updated.
MY REPORT
REPORT RUN DATE : 27/08/2012 12:35:11 PAGE 1 of #TOTAL#
SUCCESSFUL AND UNSUCCESSFUL DAILY TRANSACTIONS REPORT
---record of data here----
MY REPORT
REPORT RUN DATE : 27/08/2012 12:35:11 PAGE 2 of #TOTAL#
SUCCESSFUL AND UNSUCCESSFUL DAILY TRANSACTIONS REPORT
---record of data here----
In case I just want to update all occurrences of #TOTAL# to some number, is there a quick and efficient way to do this?
I understand that I can also use BufferedReader + BufferedWriter to print to another file and apply String.replace() along the way, but I wonder if there is a better and more elegant way to solve this...
The file won't exceed 10 MB, so there is no need to be concerned about the file being too big (exceeding 1 GB, etc.).
If you don't care about the file being too large, and think calling replace() on every line is inelegant, I guess you can just read the entire file into a single String, call replace() once, then write it to the file.
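A sketch of that whole-file approach, assuming a file called report.txt and a computed page count of 42 (both placeholders):

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class ReplaceTotal {
    public static void main(String[] args) throws IOException {
        Path report = Paths.get("report.txt");
        String content = new String(Files.readAllBytes(report), StandardCharsets.UTF_8);
        // Replace every occurrence of the placeholder with the real page count,
        // then overwrite the original file.
        Files.write(report, content.replace("#TOTAL#", "42")
                .getBytes(StandardCharsets.UTF_8));
    }
}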
... I wonder if there is a better and more elegant way to solve this
It depends on what you mean by "better and elegant", but IMO the answer is no.
The file won't exceed 10 MB, so there is no need to be concerned about the file being too big (exceeding 1 GB, etc.)
You are unlikely to exceed 1 GB. However:
You probably cannot be 100% sure that the file won't be bigger than 10 MB. For any program that has a significant lifetime, you can rarely know that the requirements and usage patterns won't change over time.
In fact, a 10 MB text file may occupy up to 60 MB of memory if you load the entire lot into a StringBuilder. Firstly, the bytes are inflated into two-byte characters, doubling the footprint. Secondly, the algorithm StringBuilder uses to manage its backing array involves allocating a new array of double the size of the original one, so during that final resize the old 20 MB array and the new 40 MB array exist side by side. That is how peak memory usage can reach up to 6 times the number of bytes in the file you are reading.
Note that 60 MB is greater than the default maximum heap size for some JVMs on some platforms.
This question already has answers here:
Closed 11 years ago.
Possible Duplicate:
exception while Read very large file > 300 MB
I want to search for a string in a big file (>= 300 MB). Because the file is big, I can't load it all into memory.
What approaches can I use to handle this problem?
Thanks
There are a few options:
Depending on your target OS, you might be able to hand off this task to a system utility such as grep (which is already optimized for this sort of work) and simply parse the output.
Even if the file were small enough to be contained in memory, you'd have to read it from disk either way. So you can simply read it in one line at a time and compare your string to the contents as they are read (see the sketch after this list). If your app only needs to find the first occurrence of the string, this has the benefit that, when the target string appears early in the file, you avoid reading the rest of the file just to find something that sits in its first half.
Unless you have an upper limit on your app's memory usage (i.e. it must absolutely fit within 128 MB of RAM, etc.), you can also increase the amount of RAM that the JVM will take up when you launch your app. But because of the inefficiency of this (in terms of time and disk I/O, as pointed out in #2), it is unlikely to be the course you'll want to take, regardless of file size.
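A minimal sketch of the second option, reading line by line and stopping at the first hit; the file name and search term are placeholders, and note that a match spanning a line break would not be found this way.

import java.io.BufferedReader;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

public class BigFileSearch {
    public static void main(String[] args) throws IOException {
        String needle = "TARGET";
        try (BufferedReader r = Files.newBufferedReader(
                Paths.get("big-file.txt"), StandardCharsets.UTF_8)) {
            String line;
            long lineNo = 0;
            while ((line = r.readLine()) != null) {
                lineNo++;
                if (line.contains(needle)) {
                    System.out.println("Found on line " + lineNo);
                    return;   // only the first occurrence is needed
                }
            }
        }
        System.out.println("Not found");
    }
}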
I would memory map the file. This doesn't use much heap (< 1 KB), regardless of the file size (up to 2 GB) and takes about 10 ms on most systems.
// Map the whole file read-only; the mapping lives outside the Java heap.
FileChannel ch = new FileInputStream(fileName).getChannel();
MappedByteBuffer mbb = ch.map(FileChannel.MapMode.READ_ONLY, 0L, ch.size());
This works provided you have a minimum of 4 KB free (and your file is less than 2 GB long).
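To actually locate a string with this approach, you would scan the mapped buffer byte by byte. A rough sketch, assuming the mbb from the snippet above and an ASCII search term (the "TARGET" literal is a placeholder):

byte[] needle = "TARGET".getBytes(java.nio.charset.StandardCharsets.US_ASCII);
outer:
for (int i = 0; i <= mbb.limit() - needle.length; i++) {
    for (int j = 0; j < needle.length; j++) {
        if (mbb.get(i + j) != needle[j]) {
            continue outer;   // mismatch, slide the window forward by one byte
        }
    }
    System.out.println("Match at byte offset " + i);
    break;
}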