How can I read and search from a file in a JTable? - java

What I want to do in this code: when the search button is clicked, it should read a file, match the search values against the data inside the file, and show the search results in the JTable.
Problems I am facing: if the GPA "A+" is selected, it shows both A+ and A-, and when I press the search button again after giving another search value, the table just adds more data in it.
Solutions needed: I want to read the file and show only the matching results in the JTable, not add the results again and again. The search button should search in the GPA and Class columns only, and when a GPA like "A/B/C" with "+" or "-" is selected, the search result should contain only the rows with that particular GPA.
NOTE: I don't want to change the search options.
I am a total newbie in Java, so any kind of help would be appreciated! :)
Screenshot of the UI
private void srchBtnActionPerformed(java.awt.event.ActionEvent evt) {
    // file read
    String filepath = "E:\\Netbeans workspace\\modified\\Project\\Info.txt";
    File file = new File(filepath);
    try {
        BufferedReader br = new BufferedReader(new FileReader(file));
        model = (DefaultTableModel) jTable1.getModel();
        Object[] tableLines = br.lines().toArray();
        for (int i = 0; i < tableLines.length; i++) {
            String line = tableLines[i].toString().trim();
            String[] dataRow = line.split("/");
            model.addRow(dataRow);
        }
    } catch (Exception ex) {
        Logger.getLogger(ReceiverF.class.getName()).log(Level.SEVERE, null, ex);
    }
    // search from file
    String bGroupSrch = (String) jComboBoxBGroup.getSelectedItem();
    if (positiveRBtn.isSelected())
        bGroupSrch = bGroupSrch + "+";
    else if (negativeRBtn.isSelected())
        bGroupSrch = bGroupSrch + "-";
    String areaSrch = (String) jComboBoxArea.getSelectedItem();
    if (bgGroup.getSelection() != null) {
        filter(bGroupSrch);
        filter(areaSrch);
    } else {
        SrchEMsg sem = new SrchEMsg(this);
        sem.setVisible(true);
        sem.setDefaultCloseOperation(JDialog.DISPOSE_ON_CLOSE);
    }
}

// Filter Method
private void filter(String query) {
    TableRowSorter<DefaultTableModel> tr = new TableRowSorter<DefaultTableModel>(model);
    jTable1.setRowSorter(tr);
    tr.setRowFilter(RowFilter.regexFilter(query));
}

"the table just adds more data in it."
When you start the search, first do:
model.setRowCount(0);
to clear the data in the table model of the table.
The easier solution is to NOT reload the data every time. Instead, just change the filter that is used by the table.
Read the section from the Swing tutorial on Sorting and Filtering. The code there replaces the filter every time a character is typed; your code would replace the filter whenever the search options change.
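For concreteness, here is a minimal sketch of a column-restricted filter. The method name applySearchFilter and the column indices (GPA in column 1, Class/Area in column 2) are assumptions; adjust them to your table. It also assumes java.util.regex.Pattern and the java.util collections are imported.
// A minimal sketch: load the table data once elsewhere, and let the search
// button only swap the filter (column indices 1 and 2 are assumptions).
private void applySearchFilter(String gpaQuery, String areaQuery) {
    TableRowSorter<DefaultTableModel> sorter = new TableRowSorter<>(model);
    jTable1.setRowSorter(sorter);
    List<RowFilter<Object, Object>> filters = new ArrayList<>();
    // Pattern.quote makes "+" and "-" literal; ^...$ forces a full-cell match,
    // so "A+" no longer matches "A-".
    filters.add(RowFilter.regexFilter("^" + Pattern.quote(gpaQuery) + "$", 1));
    filters.add(RowFilter.regexFilter("^" + Pattern.quote(areaQuery) + "$", 2));
    sorter.setRowFilter(RowFilter.andFilter(filters));
}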

Related

How do I read data from an Excel sheet and use it as input for a webpage using Selenium? I can only read one row at a time. Code displayed:

I have just started learning Selenium, and the code only reads one row at a time from Excel. I need to make the code read from the Excel file automatically instead of changing the row count manually in the line "for (int i = 1; i <= 6; i++)".
How can I make it read automatically in the code below?
public static void main(String[] args) throws IOException, InterruptedException {
    // Path to the geckodriver executable (placeholder path kept from the question)
    System.setProperty("webdriver.gecko.driver", "driver location");
    WebDriver driver = new FirefoxDriver();
    driver.get("link");
    FileInputStream file = new FileInputStream("xcel file location");
    XSSFWorkbook workbook = new XSSFWorkbook(file);
    XSSFSheet sheet = workbook.getSheet("SO Reg");
    int noOfRows = sheet.getLastRowNum(); // returns the row count
    System.out.println("No. of Records in the Excel Sheet: " + noOfRows);
    int cols = sheet.getRow(1).getLastCellNum();
    System.out.println("No. of Columns in the Excel Sheet: " + cols);
    for (int i = 1; i <= 6; i++) {
        XSSFRow row = sheet.getRow(i); // read row i of the sheet
        String SO_Name = row.getCell(0).getStringCellValue();
        String Contact_Person = row.getCell(1).getStringCellValue();
        String Address_1 = row.getCell(2).getStringCellValue();
        String Address_2 = row.getCell(3).getStringCellValue();
        String City = row.getCell(4).getStringCellValue();
        String State = row.getCell(5).getStringCellValue();
        String ZipCode = row.getCell(6).getStringCellValue();
        String Phone_Number = row.getCell(8).getStringCellValue();
        String Username = row.getCell(9).getStringCellValue();
        String Email = row.getCell(10).getStringCellValue();
        String Re_Type_Email = row.getCell(11).getStringCellValue();
        // Registration Process
        driver.findElement(By.cssSelector("p.text-white:nth-child(4) > a:nth-child(1)")).click(); // create an account
        Thread.sleep(5000);
        // Enter data information
        driver.findElement(By.id("SOName")).sendKeys(SO_Name);
        driver.findElement(By.xpath("//*[@id=\"ContactPerson\"]")).sendKeys(Contact_Person);
        driver.findElement(By.xpath("//*[@id=\"AddressLine1\"]")).sendKeys(Address_1);
        driver.findElement(By.xpath("//*[@id=\"AddressLine2\"]")).sendKeys(Address_2);
        driver.findElement(By.id("City")).sendKeys(City);
        driver.findElement(By.id("State")).sendKeys(State);
        driver.findElement(By.id("ZipCode")).sendKeys(ZipCode);
        driver.findElement(By.id("Phone")).sendKeys(Phone_Number);
        driver.findElement(By.xpath("//*[@id=\"UserName\"]")).sendKeys(Username);
        driver.findElement(By.xpath("//*[@id=\"Email\"]")).sendKeys(Email);
        driver.findElement(By.xpath("//*[@id=\"RandText\"]")).sendKeys(Re_Type_Email);
        driver.findElement(By.id("ConfirmBox")).click();
        driver.findElement(By.xpath("/html/body/app-root/app-soregistration/div[2]/div/div/div/div/form[2]/div/div[12]/div/button[1]")).click();
        driver.findElement(By.cssSelector(".btn-green-text-black")).click(); // finish button
        driver.findElement(By.cssSelector("p.text-white:nth-child(4) > a:nth-child(1)")).click(); // create an account
        Thread.sleep(5000);
    }
}
Do you only want to process newly added rows in Excel? If so, you should also save the position of the last row you processed.
First of all, you can simply keep the whole thing in an infinite loop, like
while (true) { ... }
and start the inner loop from the last row you read from Excel, kept in a static variable.
For example:
for (int i = previousLastSavedRowNum; i <= sheet.getLastRowNum(); i++) { ... }
If there is no new record, you can wait for a while inside the while loop.
Of course, for a better solution, you can create a Spring Boot project and set up a structure that listens for changes in the Excel file. When a change is detected, you can call the Selenium code with a trigger.
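A minimal sketch of that idea, assuming the workbook and sheet are opened as in the question; lastProcessedRow is a hypothetical static field that remembers where the previous pass stopped:
private static int lastProcessedRow = 0; // row 0 holds the headers

private static void processNewRows(XSSFSheet sheet) {
    int lastRowNum = sheet.getLastRowNum(); // index of the last row that contains data
    for (int i = lastProcessedRow + 1; i <= lastRowNum; i++) {
        XSSFRow row = sheet.getRow(i);
        if (row == null) {
            continue; // skip physically empty rows
        }
        String soName = row.getCell(0).getStringCellValue();
        // ... read the remaining cells and drive Selenium exactly as in the question ...
    }
    lastProcessedRow = lastRowNum;
}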
A better way to handle it is to export the Excel sheet as a CSV file and read that.
You can then read all the data into one String with:
String excelToString = new String(Files.readAllBytes(Paths.get(path_to_file)));
If you want to keep it as a table, you can parse this String into a String[][] table.
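A rough sketch of that parsing step, assuming the sheet really has been exported as plain CSV (reading the binary .xlsx this way would not work), the fields contain no embedded commas, and "registrations.csv" stands in for your file name:
String csv = new String(Files.readAllBytes(Paths.get("registrations.csv")), StandardCharsets.UTF_8);
String[] lines = csv.split("\\R");              // split on any line break
String[][] table = new String[lines.length][];
for (int i = 0; i < lines.length; i++) {
    table[i] = lines[i].split(",", -1);         // -1 keeps trailing empty cells
}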

Change order of columns in a txt file

I have a txt file where some columns do not appear in every row; in the rows where they do appear, this messes up the order of my columns:
35=d|5799=00000000|980=A|779=20190721173046000465|1180=310|1300=64|462=5|207=XCME|1151=ES|6937=ES|55=ESM0|48=163235|22=8|167=FUT|461=FFIXSX|200=202006|15=USD|1142=F|562=1|1140=3000|969=25.000000000|9787=0.010000000|996=IPNT|1147=50.000000000|1150=302775.000000000|731=00000110|5796=20190724|1149=315600.000000000|1148=285500.000000000|1143=600.000000000|1146=12.500000000|9779=N|864=2|865=5|1145=20190315133000000000|865=7|1145=20200619133000000000|1141=1|1022=GBX|264=10|870=1|871=24|872=00000000000001000010000000001111|1234=0|5791=279|5792=10121|
35=d|5799=00000000|980=A|779=20190721173046000465|1180=310|1300=64|462=5|207=XCME|1151=ES|6937=ES|55=ESU9|48=191262|22=8|167=FUT|461=FFIXSX|200=201909|15=USD|1142=F|562=1|1140=3000|969=25.000000000|9787=0.010000000|996=IPNT|1147=50.000000000|1150=302150.000000000|731=00000110|5796=20190724|1149=315700.000000000|1148=285600.000000000|1143=600.000000000|1146=12.500000000|9779=N|864=2|865=5|1145=20180615133000000000|865=7|1145=20190920133000000000|1141=1|1022=GBX|264=10|870=1|871=24|872=00000000000001000010000000001111|1234=0|5791=250519|5792=452402|
35=d|5799=00000000|980=A|779=20190721173046000465|1180=310|1300=64|462=5|207=XCME|1151=$E|6937=0ES|55=0ESQ9|48=229588|22=8|167=FUT|461=FFIXSX|200=201908|15=USD|1142=F|562=1|1140=3000|969=25.000000000|9787=0.010000000|996=IPNT|1147=50.000000000|1150=25.000000000|731=00000011|5796=20190607|1143=0.000000000|1146=12.500000000|9779=N|864=2|865=5|1145=20190621133000000000|865=7|1145=20190816133000000000|1141=1|1022=GBX|264=10|870=1|871=24|872=00000000000001000010000000001111|1234=0|
35=d|5799=00000000|980=A|779=20190721173114000729|1180=441|1300=56|462=16|207=DUMX|1151=1O|6937=OQE|55=OQEH4 C6100|48=1546|22=8|167=OOF|461=OCEFPS|201=1|200=202403|15=USD|202=6100.000000000|947=USD|9850=0.100000000|1142=F|562=1|1140=999|969=1.000000000|1146=10.000000000|9787=0.010000000|996=BBL|1147=1000.000000000|731=00000001|1148=0.100000000|9779=N|5796=20190718|864=2|865=5|1145=20181031213000000000|865=7|1145=20240126193000000000|1141=1|1022=GBX|264=3|870=1|871=24|872=00000000000001000000000100000101|1234=1|1093=4|1231=1.0000|711=1|309=211120|305=8|311=OQDH4|1647=0|
35=d|5799=00000000|980=A|779=20190721173115000229|1180=441|1300=56|462=16|207=DUMX|1151=1O|6937=OQE|55=OQEM4 C5700|48=2053|22=8|167=OOF|461=OCEFPS|201=1|200=202406|15=USD|202=5700.000000000|947=USD|9850=0.100000000|1142=F|562=1|1140=999|969=1.000000000|1146=10.000000000|9787=0.010000000|996=BBL|1147=1000.000000000|731=00000001|1148=0.100000000|9779=N|5796=20190718|864=2|865=5|1145=20181031213000000000|865=7|1145=20240425183000000000|1141=1|1022=GBX|264=3|870=1|871=24|872=00000000000001000000000100000101|1234=1|1093=4|1231=1.0000|711=1|309=329748|305=8|311=OQDM4|1647=0|
For example, in the first three rows 461=… is always followed directly by 200=…, while from the 4th row onwards there is 201=… between 461=… and 200=…
Now I thought of moving every column that appears later but was not there in the first row to the end of the row, so that it becomes the last column, but I do not know how to do exactly this operation. Here is what I have tried:
private static List<String> numbers = new ArrayList<>(); // tags seen so far, in order of first appearance

private static void ladeDatei(String datName) {
    File file = new File(datName);
    if (!file.canRead() || !file.isFile())
        System.exit(0);
    BufferedReader in = null;
    try {
        in = new BufferedReader(new FileReader(datName));
        String row = null;
        String row2 = null;
        while ((row = in.readLine()) != null) {
            System.out.println("Read line: " + row);
            while (row.contains("|")) {
                row2 = row.substring(row.indexOf("|") + 1);
                row = row2;
                row2 = row.substring(0, row.indexOf("=") + 1);
                row2 = row2.replace("=", "");
                if (!numbers.contains(row2)) {
                    numbers.add(row2);
                }
                System.out.println(row);
                //System.out.println(row2);
            }
        }
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        if (in != null)
            try {
                in.close();
            } catch (IOException e) {
            }
    }
}
I thought about splitting every row by | and saving the parts in the textArr list, but then I wouldn't know which rows belong together. My main problem is that I don't know a good way to check whether a column already exists in an earlier row, and how to move it to the end of the row.
EDIT: I now save every new tag in the numbers ArrayList (see my edit in the code above), but I am stuck because I don't know how to shift those columns, and all the ones that come after them, to the end of each row.
That's a hell of a job. What I would do is:
(1) split the lines at |
(2) make a List to which you append the numbers (tags) found between | and = (each new number is appended at the end)
(3) make a Map where the line parts are mapped to the numbers from (2) as keys
(4) make a second Map where the maximum column widths of the line parts are mapped to the numbers from (2)
(5) read through the List from (2), joining the associated line parts with |, padded to the maximum column widths
(if there is no line part for a specific number, you must do the padding as well; see the sketch below)
Whenever possible, I would prefer to structure the line parts in an HTML table.
The change of the column order will not solve the problem of wider or narrower columns.
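A minimal sketch of steps (1)-(3) and (5), assuming the lines are already read into a List<String> and that each tag appears at most once per line (the repeated 865/1145 groups in the sample would need extra handling); padding to fixed column widths is omitted to keep it short:
static List<String> reorderColumns(List<String> lines) {
    List<String> tagOrder = new ArrayList<>();               // step (2): tags in order of first appearance
    List<Map<String, String>> parsedLines = new ArrayList<>();

    for (String line : lines) {                               // step (1): split each line at |
        Map<String, String> parts = new LinkedHashMap<>();    // step (3): tag -> "tag=value"
        for (String field : line.split("\\|")) {
            if (field.isEmpty() || field.indexOf('=') < 0) {
                continue;
            }
            String tag = field.substring(0, field.indexOf('='));
            parts.put(tag, field);
            if (!tagOrder.contains(tag)) {
                tagOrder.add(tag);                             // new tags go to the end
            }
        }
        parsedLines.add(parts);
    }

    List<String> result = new ArrayList<>();                   // step (5): re-emit in the global tag order
    for (Map<String, String> parts : parsedLines) {
        StringBuilder sb = new StringBuilder();
        for (String tag : tagOrder) {
            String field = parts.get(tag);
            if (field != null) {
                sb.append(field).append('|');
            }
        }
        result.add(sb.toString());
    }
    return result;
}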

android - convert ArrayList to one String in a custom format

I have data in an SQLite database that I use to store categories and items, and I get it in an ArrayList like this:
public ArrayList<Items> showDataItems(String id_cate) {
    ArrayList<Items> arrayListItems = new ArrayList<>();
    SQLiteDatabase db = this.getReadableDatabase();
    Cursor cr = db.rawQuery("select * from items where id_cate = " + id_cate, null);
    cr.moveToFirst();
    while (!cr.isAfterLast()) {
        String item_id = cr.getString(0);
        String ItemName = cr.getString(1);
        String Item_quantity = cr.getString(2);
        String icon = cr.getString(3);
        int isDone = Integer.parseInt(cr.getString(5));
        arrayListItems.add(new Items(item_id, ItemName, Item_quantity, R.drawable.shopicon, icon, isDone));
        cr.moveToNext();
    }
    cr.close();
    return arrayListItems;
}
I need to take this data, convert it to a single String, and share it with another application such as WhatsApp in a custom format, for example:
1- first one *
2- second one *
3- ...
This is the code I use to send the data:
Intent intent = new Intent(Intent.ACTION_SEND);
intent.putExtra(Intent.EXTRA_TEXT, "hello world");
intent.setType("text/plain");
startActivity(Intent.createChooser(intent, "send items, I need all the data here"));
Can we use a StringBuilder or something similar to get the data into one String? Please help me!
As you said, there is a StringBuilder class that can help with formatting strings.
Here are the Java docs.
From StringBuilder, see the append(..) method. For your example:
StringBuilder stringBuilder = new StringBuilder();
int size = list.size();
for (int i = 0; i < size - 1; i++) {
    stringBuilder.append(i + 1).append("- ").append(list.get(i)).append('\n');
}
stringBuilder.append(size).append("- ").append(list.get(size - 1));
The last call to append differs from the ones in the loop by not appending the line break.
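Putting it together with the share intent from the question, a quick sketch, assuming Items exposes a getItemName() getter (a hypothetical name; adjust to your actual class) and that db and idCate are the database helper and category id already in scope:
ArrayList<Items> items = db.showDataItems(idCate);
StringBuilder sb = new StringBuilder();
for (int i = 0; i < items.size(); i++) {
    sb.append(i + 1).append("- ").append(items.get(i).getItemName()).append(" *");
    if (i < items.size() - 1) {
        sb.append('\n');          // no trailing line break after the last item
    }
}

Intent intent = new Intent(Intent.ACTION_SEND);
intent.putExtra(Intent.EXTRA_TEXT, sb.toString());
intent.setType("text/plain");
startActivity(Intent.createChooser(intent, "Share items"));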

Java - Produce a list of products which have already expired, based on the current date, from text files

I'm making a GUI application in NetBeans for my college semester project (a market manager). We can add any product to the app; the data is stored as a .txt file whose name is based on the product code.
sample data in 1234.txt :
Product code : 1234
Name : Noodle
Price : $1000
Description : Instant noodle is not good for healthy
Expiry date : 12-01-2050
My question is: how do I read all the .txt files that have already been added, read the date in each file, show the list of expired products (as file names, based on the current date) in a JTextArea, and add a button to remove all expired files?
private void okBtnActionPerformed(java.awt.event.ActionEvent evt) {
    String code = txtCode.getText();
    String name = txtName.getText();
    String price = txtPrice.getText();
    String expiry = txtExpiry.getText();
    String quantity = txtQuantity.getText();
    String description = txtDescription.getText();
    int quant = Integer.parseInt(quantity);
    try {
        for (int i = 0; i < quant; i++) {
            File file = new File("Product/" + code + i + ".txt");
            if (!file.exists()) {
                file.createNewFile();
                String content = "Code: " + code + i + "\r\nName: " + name + "\r\nPrice: RM." + price + "\r\nDescription: " + description + "\r\nExpiry Date: " + expiry;
                FileWriter data = new FileWriter(file.getAbsoluteFile());
                BufferedWriter bw = new BufferedWriter(data);
                bw.write(content);
                bw.close();
                JOptionPane.showMessageDialog(this, "Product Added");
                txtCode.setText("");
                txtName.setText("");
                txtPrice.setText("");
                txtExpiry.setText("");
                txtQuantity.setText("");
                txtDescription.setText("");
            } else {
                JOptionPane.showMessageDialog(this, "The Product Code Already Added");
                break;
            }
        }
    } catch (IOException e) {
    }
}
This is the code for adding a product.
You start by separating responsibilities. First create a class that represents a Product. In your current approach, you try to "model" a Product as a "set" of variables that somehow belong together.
Instead, create a class that has the corresponding fields, and a nice equals method, for example.
Then create a method that takes a String representing a file name. That method opens the file, reads the textual data, and creates one Product object from it. Let's call it readSingleProduct().
Next, create a method that takes a String representing a directory, for example. That method finds all the text files in that directory and calls readSingleProduct() to create Product objects; in the end it returns a List<Product>.
And then, finally, you build your UI code that receives such a List of Product objects and uses it as the model for the actual UI components.
And a hint: never use empty catch blocks. You should at least print the exception there; ignoring errors is always a super-bad idea!
Hope that gets you going!
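A minimal sketch of that structure, assuming the key names and the date pattern (dd-MM-yyyy, as in "12-01-2050") follow the sample file; adjust field names and parsing to your real format:
class Product {
    String code;
    String name;
    LocalDate expiry;
    File sourceFile;    // kept so the expired file can be deleted later
}

static Product readSingleProduct(File file) throws IOException {
    Product p = new Product();
    p.sourceFile = file;
    DateTimeFormatter fmt = DateTimeFormatter.ofPattern("dd-MM-yyyy");
    for (String line : Files.readAllLines(file.toPath())) {
        String[] kv = line.split(":", 2);
        if (kv.length < 2) continue;
        String key = kv[0].trim().toLowerCase();
        String value = kv[1].trim();
        if (key.contains("code")) p.code = value;
        else if (key.startsWith("name")) p.name = value;
        else if (key.contains("expiry")) p.expiry = LocalDate.parse(value, fmt);
    }
    return p;
}

static List<Product> readAllProducts(File directory) throws IOException {
    List<Product> products = new ArrayList<>();
    for (File f : directory.listFiles((dir, name) -> name.endsWith(".txt"))) {
        products.add(readSingleProduct(f));
    }
    return products;
}

// Products whose expiry date lies before today: show their file names in the
// JTextArea, and delete p.sourceFile behind the "remove expired" button.
static List<Product> findExpired(List<Product> products) {
    LocalDate today = LocalDate.now();
    List<Product> expired = new ArrayList<>();
    for (Product p : products) {
        if (p.expiry != null && p.expiry.isBefore(today)) expired.add(p);
    }
    return expired;
}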

Compare Two CSV Files and Fetch Data

I have two CSV files: one master CSV file with around 500,000 records and a daily CSV file with 50,000 records.
The daily CSV file is missing a few columns, which have to be fetched from the master CSV file.
For example
DailyCSV File
id,name,city,zip,occupation
1,Jhon,Florida,50069,Accountant
MasterCSV File
id,name,city,zip,occupation,company,exp,salary
1, Jhon, Florida, 50069, Accountant, AuditFirm, 3, $5000
What I have to do is read both files, match the records by ID, and if the ID is present in the master file, fetch company, exp, and salary and write them to a new CSV file.
How can I achieve this?
What I have done currently:
while (true) {
    line = bstream.readLine();
    lineMaster = bstreamMaster.readLine();
    if (line == null || lineMaster == null) {
        break;
    } else {
        while (lineMaster != null)
            readlineSplit = line.split(",(?=([^\"]*\"[^\"]*\")*[^\"]*$)", -1);
        String splitId = readlineSplit[4];
        String[] readLineSplitMaster = lineMaster.split(",(?=([^\"]*\"[^\"]*\")*[^\"]*$)", -1);
        String SplitIDMaster = readLineSplitMaster[13];
        System.out.println(splitId + "|" + SplitIDMaster);
        //System.out.println(splitId.equalsIgnoreCase(SplitIDMaster));
        if (splitId.equalsIgnoreCase(SplitIDMaster)) {
            String writeLine = readlineSplit[0] + "," + readlineSplit[1] + "," + readlineSplit[2] + "," + readlineSplit[3] + "," + readlineSplit[4] + "," + readlineSplit[5] + "," + readLineSplitMaster[15] + "," + readLineSplitMaster[16] + "," + readLineSplitMaster[17];
            System.out.println(writeLine);
            pstream.print(writeLine + "\r\n");
        }
    }
}
pstream.close();
fout.flush();
bstream.close();
bstreamMaster.close();
First of all, your current parsing approach will be painfully slow. Use a dedicated CSV parsing library to speed things up. With uniVocity-parsers you can process your 500K records in less than a second. This is how you can use it to solve your problem:
First, let's define a few utility methods to read/write your files:
// opens the file for reading (using UTF-8 encoding)
private static Reader newReader(String pathToFile) {
    try {
        return new InputStreamReader(new FileInputStream(new File(pathToFile)), "UTF-8");
    } catch (Exception e) {
        throw new IllegalArgumentException("Unable to open file for reading at " + pathToFile, e);
    }
}

// creates a file for writing (using UTF-8 encoding)
private static Writer newWriter(String pathToFile) {
    try {
        return new OutputStreamWriter(new FileOutputStream(new File(pathToFile)), "UTF-8");
    } catch (Exception e) {
        throw new IllegalArgumentException("Unable to open file for writing at " + pathToFile, e);
    }
}
Then, we can start reading your daily CSV file, and generate a Map:
public static void main(String... args) {
    // First we parse the daily update file.
    CsvParserSettings settings = new CsvParserSettings();
    // here we tell the parser to read the CSV headers
    settings.setHeaderExtractionEnabled(true);
    // and to select ONLY the following columns.
    // This ensures rows with a fixed size will be returned in case some records come with less or more columns than anticipated.
    settings.selectFields("id", "name", "city", "zip", "occupation");
    CsvParser parser = new CsvParser(settings);
    // Here we parse all data into a list.
    List<String[]> dailyRecords = parser.parseAll(newReader("/path/to/daily.csv"));
    // And convert them to a map. IDs are the keys.
    Map<String, String[]> mapOfDailyRecords = toMap(dailyRecords);
    ... // we'll get back here in a second.
This is the code to generate a Map from the list of daily records:
/* Converts a list of records to a map. Uses element at index 0 as the key */
private static Map<String, String[]> toMap(List<String[]> records) {
    HashMap<String, String[]> map = new HashMap<String, String[]>();
    for (String[] row : records) {
        // column 0 will always have an ID.
        map.put(row[0], row);
    }
    return map;
}
With the map of records, we can process your master file and generate the list of updates:
private static List<Object[]> processMasterFile(final Map<String, String[]> mapOfDailyRecords) {
    // we'll put the updated data here
    final List<Object[]> output = new ArrayList<Object[]>();
    // configures the parser to process only the columns you are interested in.
    CsvParserSettings settings = new CsvParserSettings();
    settings.setHeaderExtractionEnabled(true);
    settings.selectFields("id", "company", "exp", "salary");
    // All parsed rows will be submitted to the following RowProcessor. This way the bigger Master file won't
    // have all its rows stored in memory.
    settings.setRowProcessor(new AbstractRowProcessor() {
        @Override
        public void rowProcessed(String[] row, ParsingContext context) {
            // Incoming rows from MASTER will have the ID at index 0.
            // If the daily update map contains the ID, we'll get the daily row
            String[] dailyData = mapOfDailyRecords.get(row[0]);
            if (dailyData != null) {
                // We got a match. Let's join the data from the daily row with the master row.
                Object[] mergedRow = new Object[8];
                for (int i = 0; i < dailyData.length; i++) {
                    mergedRow[i] = dailyData[i];
                }
                for (int i = 1; i < row.length; i++) { // starts from 1 to skip the ID at index 0
                    mergedRow[i + dailyData.length - 1] = row[i];
                }
                output.add(mergedRow);
            }
        }
    });
    CsvParser parser = new CsvParser(settings);
    // the parse() method will submit all rows to the RowProcessor defined above.
    parser.parse(newReader("/path/to/master.csv"));
    return output;
}
Finally, we can get the merged data and write everything to another file:
    ... // getting back to the main method here

    // Now we process the master data and get a list of updates
    List<Object[]> updatedData = processMasterFile(mapOfDailyRecords);

    // And write the updated data to another file
    CsvWriterSettings writerSettings = new CsvWriterSettings();
    writerSettings.setHeaders("id", "name", "city", "zip", "occupation", "company", "exp", "salary");
    writerSettings.setHeaderWritingEnabled(true);
    CsvWriter writer = new CsvWriter(newWriter("/path/to/updates.csv"), writerSettings);

    // Here we write everything, and get the job done.
    writer.writeRowsAndClose(updatedData);
}
This should work like a charm. Hope it helps.
Disclosure: I am the author of this library. It's open-source and free (Apache V2.0 license).
I would approach the problem step by step.
First, parse/read the master CSV file and keep its content in a HashMap, where the key is each record's unique 'id'. As for the value, you can store the remaining fields in a nested map or simply create a Java class to hold the information.
Example of hash:
{
    '1' : { 'name': 'Jhon',
            'City': 'Florida',
            'zip' : 50069,
            ....
          }
}
Next, read the daily CSV file you want to compare. For each row, read the 'id' and check whether the key exists in the HashMap you created earlier.
If it exists, access the information you need from the HashMap and write it to a new CSV file.
Also, you might want to consider using a third-party CSV parser to make this task easier.
If you use Maven, you can follow this example I found on the net; otherwise, just Google for an Apache 'csv parser' example:
http://examples.javacodegeeks.com/core-java/apache/commons/csv-commons/writeread-csv-files-with-apache-commons-csv-example/
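A plain-Java sketch of that HashMap join, assuming simple CSV fields with no embedded commas or quotes, the column layout from the example above, and hypothetical file names (use a real CSV parser, such as Commons CSV from the link, if your data can contain quoted fields):
static void mergeFiles() throws IOException {
    // Index the master file by id (first column).
    Map<String, String[]> masterById = new HashMap<>();
    try (BufferedReader master = new BufferedReader(new FileReader("master.csv"))) {
        master.readLine();                          // skip the header line
        String line;
        while ((line = master.readLine()) != null) {
            String[] cols = line.split(",", -1);
            masterById.put(cols[0].trim(), cols);
        }
    }

    // Stream the daily file and append company, exp, salary where the id matches.
    try (BufferedReader daily = new BufferedReader(new FileReader("daily.csv"));
         PrintWriter out = new PrintWriter(new FileWriter("merged.csv"))) {
        out.println("id,name,city,zip,occupation,company,exp,salary");
        daily.readLine();                           // skip the header line
        String line;
        while ((line = daily.readLine()) != null) {
            String[] cols = line.split(",", -1);
            String[] m = masterById.get(cols[0].trim());
            if (m != null) {                        // id found in the master file
                out.println(line + "," + m[5].trim() + "," + m[6].trim() + "," + m[7].trim());
            }
        }
    }
}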
