This question already has answers here:
ClassFormatError in Java
(3 answers)
Closed 9 years ago.
I have a generated class that gets this error. Inside this class there is one huge static block (5000+ lines). I broke the block into several smaller static blocks, but I still get the error. Why is that?
Edit
Code looks like:
private static final Map<Object, Object> nameMap = Maps.newHashMap();

static {
    nameMap.put(xxx);
    // ... 5000 similar lines
    nameMap.put(xxx);
}
If it is just data, you will need to read it in from a resource. Splitting the initializer into several static blocks does not help, because the compiler concatenates all static blocks of a class into a single <clinit> method, and the bytecode of any one method is limited to 64KB, hence the ClassFormatError.
Arrange for your data file to be in the same location as the class file and use something like this:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.util.ArrayList;

class Primes {
    private static final ArrayList<Integer> NUMBERS = new ArrayList<>();
    private static final String NUMBER_RESOURCE_NAME = "numbers.txt";

    static {
        try (InputStream in = Primes.class.getResourceAsStream(NUMBER_RESOURCE_NAME);
                InputStreamReader isr = new InputStreamReader(in);
                BufferedReader br = new BufferedReader(isr)) {
            for (String line; (line = br.readLine()) != null;) {
                String[] numberStrings = line.split(",");
                for (String numberString : numberStrings) {
                    if (numberString.trim().length() > 0) {
                        NUMBERS.add(Integer.valueOf(numberString));
                    }
                }
            }
        } catch (NumberFormatException | IOException e) {
            throw new IllegalStateException("Loading of static numbers failed", e);
        }
    }
}
I use this to read a comma-separated list of 1000 prime numbers.
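Applied to the original nameMap, the same pattern might look like this (a sketch, assuming a hypothetical names.txt resource with one key=value pair per line; the resource name and the String key/value types are illustrative, not from the original code):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.util.HashMap;
import java.util.Map;

class Names {
    private static final Map<String, String> nameMap = new HashMap<>();
    private static final String NAME_RESOURCE_NAME = "names.txt"; // hypothetical resource

    static {
        try (InputStream in = Names.class.getResourceAsStream(NAME_RESOURCE_NAME);
                BufferedReader br = new BufferedReader(new InputStreamReader(in))) {
            for (String line; (line = br.readLine()) != null;) {
                // Each line is assumed to hold one "key=value" pair.
                int eq = line.indexOf('=');
                if (eq > 0) {
                    nameMap.put(line.substring(0, eq), line.substring(eq + 1));
                }
            }
        } catch (IOException e) {
            throw new IllegalStateException("Loading of name map failed", e);
        }
    }
}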
Tab-Separated File:
2019-06-06 10:00:00 1.0
2019-06-06 11:00:00 2.0
I'd like to iterate over the file once and add the value of each column to a list.
My working approach would be:
import java.util.*;
import java.io.*;

public class Program {
    public static void main(String[] args)
    {
        ArrayList<Double> List_1 = new ArrayList<Double>();
        ArrayList<Double> List_2 = new ArrayList<Double>();
        String[] values = null;
        String fileName = "File.txt";
        File file = new File(fileName);
        try
        {
            Scanner inputStream = new Scanner(file);
            while (inputStream.hasNextLine()) {
                try {
                    String data = inputStream.nextLine();
                    values = data.split("\\t");
                    if (values[1] != null && !values[1].isEmpty() == true) {
                        double val_1 = Double.parseDouble(values[1]);
                        List_1.add(val_1);
                    }
                    if (values[2] != null && !values[2].isEmpty() == true) {
                        double val_2 = Double.parseDouble(values[2]);
                        List_2.add(val_2);
                    }
                }
                catch (ArrayIndexOutOfBoundsException exception) {
                }
            }
            inputStream.close();
        }
        catch (FileNotFoundException e) {
            e.printStackTrace();
        }
        System.out.println(List_1);
        System.out.println(List_2);
    }
}
I get:
[1.0]
[2.0]
It doesn't work without the checks for null, isEmpty and the ArrayIndexOutOfBoundsException.
I would appreciate any hints on how to save a few lines while keeping the scanner approach.
One option is to create a Map of Lists, using the column number as the key. This approach gives you an "unlimited" number of columns and exactly the same output as the one in the question.
import java.io.File;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.Scanner;
import java.util.TreeMap;

public class Program {
    public static void main(String[] args) throws Exception
    {
        Map<Integer, List<Double>> listMap = new TreeMap<Integer, List<Double>>();
        String[] values = null;
        String fileName = "File.csv";
        File file = new File(fileName);
        Scanner inputStream = new Scanner(file);
        while (inputStream.hasNextLine()) {
            String data = inputStream.nextLine();
            values = data.split("\\t");
            for (int column = 1; column < values.length; column++) {
                List<Double> list = listMap.get(column);
                if (list == null) {
                    listMap.put(column, list = new ArrayList<Double>());
                }
                if (!values[column].isEmpty()) {
                    list.add(Double.parseDouble(values[column]));
                }
            }
        }
        inputStream.close();
        for (List<Double> list : listMap.values()) {
            System.out.println(list);
        }
    }
}
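On Java 8 and later, the get-then-put-if-absent sequence can be condensed with Map.computeIfAbsent; the body of the column loop above then becomes:

if (!values[column].isEmpty()) {
    // note: unlike the original, this only creates a list once a column has a value
    listMap.computeIfAbsent(column, k -> new ArrayList<>())
           .add(Double.parseDouble(values[column]));
}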
You can clean up your code a bit by using try-with-resources to open and close the Scanner for you:
try (Scanner inputStream = new Scanner(file))
{
    //your code...
}
This is useful because the inputStream will be closed automatically once the try block is exited, so you will not need to close it manually with inputStream.close();.
Additionally, if you really want to "save lines", you can also combine these steps:

double val_2 = Double.parseDouble(values[2]);
List_2.add(val_2);

into a single step each, since you do not actually use val_2 anywhere else:

List_2.add(Double.parseDouble(values[2]));
Finally, you are also writing !values[1].isEmpty() == true, which compares a boolean value to true. This is generally considered bad practice, and you can reduce it to !values[1].isEmpty(), which behaves the same. Try not to use == with booleans, as there is no need.
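Putting these suggestions together, the reading loop from the question might shrink to something like this (a sketch reusing the question's file, List_1 and List_2 variables):

try (Scanner inputStream = new Scanner(file)) {
    while (inputStream.hasNextLine()) {
        String[] values = inputStream.nextLine().split("\\t");
        // Length checks replace the ArrayIndexOutOfBoundsException handler.
        if (values.length > 1 && !values[1].isEmpty()) {
            List_1.add(Double.parseDouble(values[1]));
        }
        if (values.length > 2 && !values[2].isEmpty()) {
            List_2.add(Double.parseDouble(values[2]));
        }
    }
} catch (FileNotFoundException e) {
    e.printStackTrace();
}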
You can do it like below:
List<List<String>> listOfLists = new ArrayList<>(100);
try (BufferedReader bfr = Files.newBufferedReader(Paths.get("inputFileDir.tsv"))) {
    String line = null;
    while ((line = bfr.readLine()) != null) {
        String[] cols = line.split("\\t");
        // new ArrayList<>(cols) does not compile; wrap the array with Arrays.asList first
        List<String> outputList = new ArrayList<>(Arrays.asList(cols));
        // at this point the expected list of the line's columns is ready to use
        listOfLists.add(outputList);
    }
}
As a matter of fact, this is simple code in Java. But since it seems you are new to Java and coding like a Python programmer, I decided to write a sample to give you a good starting point. Good luck!
I'm trying to solve a homework exercise that wants me to count the term frequency in a given InputStream, remove duplicates, etc. I have an idea of how to implement the rest of the methods, but I'm not sure whether I initialize the InputStream correctly, or how I can count the words in it. The exercise also hints that we can use sets/maps. Do you have any idea why and where? I only need a few hints regarding the InputStream, since I'm pretty new to it.
package com.example.mfromtheleaf;

import java.io.IOException;
import java.io.InputStream;
import java.util.Set;

public class Main {

    class TermFrequency {
        private String[] stopWords;
        private InputStream is;
        private Set<String> words;

        public TermFrequency(String[] stopWords, InputStream is) {
            this.stopWords = stopWords;
            this.is = is;
        }

        public int countTotal() {
            int count = 0;
            return count; // TODO: actually count the terms read from is
        }
    }

    public static void main(String[] args) {
    }
}
1. Make a Map to store each word and its frequency. The keys of the Map are your input words, and every value is initially set to 0.
2. Use a BufferedReader to read your InputStream line by line, then split each line into a String[], traverse the array, and look each word up in the map to see whether it exists. If it exists, add one to its frequency.

Code like this:
String[] terms = new String[2];
Map<String, Integer> map = Arrays.stream(terms)
        .collect(Collectors.toMap(e -> e, e -> 0));
InputStream is = new FileInputStream(new File(""));
BufferedReader reader = new BufferedReader(new InputStreamReader(is));
String lineContent = null;
while (null != (lineContent = reader.readLine())) {
    String[] words = lineContent.split(" ");
    Arrays.stream(words)
            .forEach(s -> {
                Integer counts = map.get(s);
                if (null != counts) {
                    // counts++ on its own would be lost: Integer is immutable,
                    // so the new value has to be written back into the map
                    map.put(s, counts + 1);
                }
            });
}
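On Java 8 and later, the lookup-and-increment can also be written with Map.merge, which does the write-back in one call:

if (map.containsKey(s)) {
    map.merge(s, 1, Integer::sum); // existing count + 1
}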
This question already has answers here:
Java: Reading a file into an array
(5 answers)
Closed 4 years ago.
I would like to build a text data cleaner in Java that strips the text of smileys and other special characters. I wrote a text reader, but it stops after three quarters of line 97 and I just don't know why. Normally it should read the complete text file (ca. 110,000 lines) and then stop. It would be really nice if you could show me where my mistake is.
public class FileReader {

    public static void main(String[] args) {
        String[] data = null;
        int i = 0;
        try {
            Scanner input = new Scanner("C://Users//Alex//workspace//Cleaner//src//Basis.txt");
            File file = new File(input.nextLine());
            input = new Scanner(file);
            while (input.hasNextLine()) {
                String line = input.nextLine();
                System.out.println(line);
                data[i] = line;
                i++;
            }
            input.close();
        }
        catch (Exception ex) {
            ex.printStackTrace();
        }
        System.out.println(data[97]);
    }
}
Your mistake is here:
String[] data = null;
I would expect this code to throw a NullPointerException as soon as data[i] = line; executes, because the array is never actually created.
You can use an ArrayList instead of a plain array if you want dynamic resizing.
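A minimal sketch of that change (keeping the question's file path, and reading the file directly instead of going through the extra Scanner over the path string):

List<String> data = new ArrayList<>();
try {
    Scanner input = new Scanner(new File("C://Users//Alex//workspace//Cleaner//src//Basis.txt"));
    while (input.hasNextLine()) {
        String line = input.nextLine();
        System.out.println(line);
        data.add(line); // grows as needed; no fixed size, no NullPointerException
    }
    input.close();
} catch (Exception ex) {
    ex.printStackTrace();
}
System.out.println(data.get(97)); // the 98th line, if the file has that many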
Given there are some files Customer-1.txt, Customer-2.txt and Customer-3.txt and these files have the following content:
Customer-1.txt
1|1|MARY|SMITH
2|1|PATRICIA|JOHNSON
4|2|BARBARA|JONES
Customer-2.txt
1|1|MARY|SMITH
2|1|PATRICIA|JOHNSON
3|1|LINDA|WILLIAMS
4|2|BARBARA|JONES
Customer-3.txt
2|1|PATRICIA|JOHNSON
3|1|LINDA|WILLIAMS
5|2|ALEXANDER|ANDERSON
These files have a lot of duplicate data, but it is possible that each file contains some data that is unique.
And given that the actual files are sorted, big (a few GB each), and that there are many files...
Then what is the:
a) memory cheapest
b) cpu cheapest
c) fastest
way in Java to create one file out of these three files that will contain all the unique data of each file sorted and concatenated like such:
Customer-final.txt
1|1|MARY|SMITH
2|1|PATRICIA|JOHNSON
3|1|LINDA|WILLIAMS
4|2|BARBARA|JONES
5|2|ALEXANDER|ANDERSON
I looked into the following solution, https://github.com/upcrob/spring-batch-sort-merge, but I would like to know if it is possible to do it with FileInputStream and/or a non-Spring-Batch solution.
A solution that uses an in-memory or real database to join them is not viable for my use case, due to the size of the files and the absence of an actual database.
Since the input files are already sorted, a simple parallel iteration over the files, merging their content, is the memory-cheapest, CPU-cheapest, and fastest way to do it.
This is a multi-way merge join, i.e. a sort-merge join without the "sort", with elimination of duplicates, similar to SQL DISTINCT.
Here is a version that can handle an unlimited number of input files (well, as many files as you can have open at once, anyway). It uses a helper class to stage the next line from each input file, so the leading ID value only has to be parsed once per line.
private static void merge(StringWriter out, BufferedReader... in) throws IOException {
    CustomerReader[] customerReader = new CustomerReader[in.length];
    for (int i = 0; i < in.length; i++)
        customerReader[i] = new CustomerReader(in[i]);
    merge(out, customerReader);
}

private static void merge(StringWriter out, CustomerReader... in) throws IOException {
    List<CustomerReader> min = new ArrayList<>(in.length);
    for (;;) {
        min.clear();
        for (CustomerReader reader : in)
            if (reader.hasData()) {
                int cmp = (min.isEmpty() ? 0 : reader.compareTo(min.get(0)));
                if (cmp < 0)
                    min.clear();
                if (cmp <= 0)
                    min.add(reader);
            }
        if (min.isEmpty())
            break; // all done
        // optional: Verify that lines that compared equal by ID are entirely equal
        out.write(min.get(0).getCustomerLine());
        out.write(System.lineSeparator());
        for (CustomerReader reader : min)
            reader.readNext();
    }
}

private static final class CustomerReader implements Comparable<CustomerReader> {
    private BufferedReader in;
    private String customerLine;
    private int customerId;

    CustomerReader(BufferedReader in) throws IOException {
        this.in = in;
        readNext();
    }

    void readNext() throws IOException {
        if ((this.customerLine = this.in.readLine()) == null)
            this.customerId = Integer.MAX_VALUE;
        else
            this.customerId = Integer.parseInt(this.customerLine.substring(0, this.customerLine.indexOf('|')));
    }

    boolean hasData() {
        return (this.customerLine != null);
    }

    String getCustomerLine() {
        return this.customerLine;
    }

    @Override
    public int compareTo(CustomerReader that) {
        // Order by customerId only. Inconsistent with equals()
        return Integer.compare(this.customerId, that.customerId);
    }
}
TEST
String file1data = "1|1|MARY|SMITH\n" +
                   "2|1|PATRICIA|JOHNSON\n" +
                   "4|2|BARBARA|JONES\n";
String file2data = "1|1|MARY|SMITH\n" +
                   "2|1|PATRICIA|JOHNSON\n" +
                   "3|1|LINDA|WILLIAMS\n" +
                   "4|2|BARBARA|JONES\n";
String file3data = "2|1|PATRICIA|JOHNSON\n" +
                   "3|1|LINDA|WILLIAMS\n" +
                   "5|2|ALEXANDER|ANDERSON\n";
try (
    BufferedReader in1 = new BufferedReader(new StringReader(file1data));
    BufferedReader in2 = new BufferedReader(new StringReader(file2data));
    BufferedReader in3 = new BufferedReader(new StringReader(file3data));
    StringWriter out = new StringWriter();
) {
    merge(out, in1, in2, in3);
    System.out.print(out);
}
OUTPUT
1|1|MARY|SMITH
2|1|PATRICIA|JOHNSON
3|1|LINDA|WILLIAMS
4|2|BARBARA|JONES
5|2|ALEXANDER|ANDERSON
The code merges purely by ID value and doesn't verify that the rest of the line is actually equal. Insert code at the optional comment to check for that, if needed.
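Such a verification might look like this (a sketch to drop in at the optional comment; it fails fast when two lines share an ID but differ elsewhere):

// All readers in min compared equal by ID, so their lines should match entirely.
String expected = min.get(0).getCustomerLine();
for (CustomerReader reader : min) {
    if (!expected.equals(reader.getCustomerLine())) {
        throw new IllegalStateException("Same ID but different content: \""
                + expected + "\" vs. \"" + reader.getCustomerLine() + "\"");
    }
}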
This might help, although note that it keeps every unique row in memory, which may not suit the multi-GB file sizes described in the question:
public static void main(String[] args) {
    String files[] = {"Customer-1.txt", "Customer-2.txt", "Customer-3.txt"};
    // TreeMap keeps the entries sorted by ID, as required for the output file
    TreeMap<Integer, String> customers = new TreeMap<Integer, String>();
    try {
        String line;
        for (int i = 0; i < files.length; i++) {
            BufferedReader reader = new BufferedReader(new FileReader("data/" + files[i]));
            while ((line = reader.readLine()) != null) {
                // '|' is a regex metacharacter, so it must be escaped for split()
                Integer uuid = Integer.valueOf(line.split("\\|")[0]);
                customers.put(uuid, line);
            }
            reader.close();
        }
        BufferedWriter writer = new BufferedWriter(new FileWriter("data/Customer-final.txt"));
        Iterator<String> it = customers.values().iterator();
        while (it.hasNext()) writer.write(it.next() + "\n");
        writer.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
If you have any questions, ask me.
This question already has answers here:
Closed 10 years ago.
Possible Duplicate:
How do I populate JComboBox from a text file?
I am new to programming in Java, with only 2 months of experience. Can anyone help me populate a JComboBox from a text file consisting of 5 lines? I have looked at code on Google, but I keep getting errors.
private void populate() {
    String[] lines;
    lines = readFile();
    jComboBox1.removeAllItems();
    for (String str : lines) {
        jComboBox1.addItem(str);
    }
}
Here is readFile(), from this site:
private String[] readFile() {
    ArrayList<String> arr = new ArrayList<>();
    try {
        FileInputStream fstream = new FileInputStream("textfile.txt");
        BufferedReader br = new BufferedReader(new InputStreamReader(fstream));
        String strLine;
        while ((strLine = br.readLine()) != null) {
            arr.add(strLine);
        }
        br.close(); // the original called in.close(), but the reader here is named br
    } catch (Exception e) {
    }
    return arr.toArray(new String[arr.size()]);
}