Cannot read documents as Java classes - java

I am using ArangoDB 3.1.23 and the ArangoDB Java driver 4.2.2, with Eclipse and Maven. I am having trouble reading documents as Java classes, as explained here. I followed the tutorial and wrote the test code below.
As you can see, reading documents as BaseDocument or as VelocyPack works, but reading them as a Java class returns null.
public static void main(String[] args) {
class MyObject {
private String key;
private String name;
private int age;
public MyObject(String name, int age) {
this();
this.name = name;
this.age = age;
}
public MyObject() {
super();
}
}
final String dbName = "testdb";
final String collName = "testCollection";
ArangoDB arangoDB = new ArangoDB.Builder().user("root").password("").build();
// Delete existing database
try{
System.out.println("Deleted existing " + dbName + " database: " + arangoDB.db(dbName).drop());
} catch (Exception e) {
System.err.println("Error while deleting database " + dbName);
}
// Test database creation
try {
arangoDB.createDatabase(dbName);
System.out.println("Created database " + dbName);
} catch (Exception e) {
System.err.println("Did not create database " + dbName);
}
// Test collection creation
try {
arangoDB.db(dbName).createCollection(collName);
System.out.println("Created collection " + collName);
} catch (Exception e) {
System.err.println("Did not create collection " + collName);
}
// Test custom class document insertion
String key1 = null;
try {
MyObject myObject = new MyObject("Homer", 38);
key1 = arangoDB.db(dbName).collection(collName).insertDocument(myObject).getKey();
System.out.println("Inserted new document as MyObject. key: " + myObject.key + ", " + key1);
} catch (Exception e) {
System.err.println("Did not insert new document");
}
// Test BaseDocument class document insertion
String key2 = null;
try {
BaseDocument myBaseDocument = new BaseDocument();
myBaseDocument.addAttribute("name", "Paul");
myBaseDocument.addAttribute("age", 23);
key2 = arangoDB.db(dbName).collection(collName).insertDocument(myBaseDocument).getKey();
System.out.println("Inserted new document as BaseDocument. key: " + myBaseDocument.getKey() + ", " + key2);
} catch (Exception e) {
System.err.println("Did not insert new document");
}
// Test read as VPackSlice
String keyToRead1 = key1;
VPackSlice doc1 = arangoDB.db(dbName).collection(collName).getDocument(keyToRead1, VPackSlice.class);
if (doc1 != null)
System.out.println("Open document " + keyToRead1 + " VPackSlice: " + doc1.get("name").getAsString() + " " + doc1.get("age").getAsInt());
else
System.err.println("Could not open the document " + keyToRead1 + " using VPackSlice");
// Test read as BaseDocument
String keyToRead2 = key1;
BaseDocument doc2 = arangoDB.db(dbName).collection(collName).getDocument(keyToRead2, BaseDocument.class);
if (doc2 != null)
System.out.println("Open document " + keyToRead2 + " as BaseDocument: " + doc2.getAttribute("name") + " " + doc2.getAttribute("age"));
else
System.err.println("Could not open the document " + keyToRead2 + " as BaseDocument");
// Test read as MyObject
String keyToRead3 = key1;
MyObject doc3 = arangoDB.db(dbName).collection(collName).getDocument(keyToRead3, MyObject.class);
if (doc3 != null)
System.out.println("Open document " + keyToRead3 + " as MyObject: " + doc3.name + " " + doc3.age);
else
System.err.println("Could not open the document " + keyToRead3 + " as MyObject");
}
Result:
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Deleted existing testdb database: true
Created database testdb
Created collection testCollection
Inserted new document as MyObject. key: null, 3510088
Inserted new document as BaseDocument. key: 3510092, 3510092
Open document 3510088 VPackSlice: Homer 38
Open document 3510088 as BaseDocument: Homer 38
Could not open the document 3510088 as MyObject

I was able to get your example to work by moving MyObject to its own file. I think the problem is the method-local class: I tried adding the annotation and getters/setters inline and that didn't work either, so the driver apparently cannot instantiate a class declared inside a method. Like so:
import com.arangodb.entity.DocumentField;
import com.arangodb.entity.DocumentField.Type;
public class MyObject {
@DocumentField(Type.KEY)
private String key;
public String getKey() {
return key;
}
public void setKey(String key) {
this.key = key;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public int getAge() {
return age;
}
public void setAge(int age) {
this.age = age;
}
private String name;
private int age;
public MyObject(String name, int age) {
this();
this.name = name;
this.age = age;
}
public MyObject() {
super();
}
}
And
// Test read as MyObject
String keyToRead3 = key1;
MyObject doc3 = arangoDB.db(dbName).collection(collName).getDocument(keyToRead3, MyObject.class);
if (doc3 != null)
System.out.println("Open document " + keyToRead3 + " as MyObject: " + doc3.getName() + " " + doc3.getAge());
else
System.err.println("Could not open the document " + keyToRead3 + " as MyObject");
Which produces
Inserted new document as MyObject. key: 7498620, 7498620
Inserted new document as BaseDocument. key: 7498624, 7498624
Open document 7498620 VPackSlice: Homer 38
Open document 7498620 as BaseDocument: Homer 38
Open document 7498620 as MyObject: Homer 38


How to read values from properties file and use it in final class

Here is my final class "Constants"
@Component
public final class Constants {
@Value("${db2.schema}")
private static String Schema;
public static final String STUDENT_TABLE = Schema + ".Student";
}
I have db2.schema in my properties file :
db2.schema = ${DB2_SCHEMA}
DB2_SCHEMA = D5677ESB
@Value cannot be used on static fields.
You can do it like this instead:
...
public class Constants {
public static final String NewOrder = "neworder";
public static final String POST = "POST";
public static final String CONTENT_TYPE = "Content-Type";
public static final String APPLICATION_TYPE = "application/json";
public static final String ACCEPT = "Accept";
public static final String CART_URL = PropsUtil.get("order.inquiry.search.insertCartDataURL");
}
...
Or check this code:
...
public String getPropValues() throws IOException {
try {
Properties prop = new Properties();
String propFileName = "config.properties";
inputStream = getClass().getClassLoader().getResourceAsStream(propFileName);
if (inputStream != null) {
prop.load(inputStream);
} else {
throw new FileNotFoundException("property file '" + propFileName + "' not found in the classpath");
}
Date time = new Date(System.currentTimeMillis());
// get the property value and print it out
String user = prop.getProperty("user");
String company1 = prop.getProperty("company1");
String company2 = prop.getProperty("company2");
String company3 = prop.getProperty("company3");
result = "Company List = " + company1 + ", " + company2 + ", " + company3;
System.out.println(result + "\nProgram Ran on " + time + " by user=" + user);
} catch (Exception e) {
System.out.println("Exception: " + e);
} finally {
inputStream.close();
}
return result;
} ...
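If Spring is not required at all, a plain-Java alternative is to resolve the constant once from a loaded Properties object instead of trying to inject @Value into a static field. This is a sketch of the idea; the studentTable helper and the "DEFAULT" fallback are my own assumptions, not from the question:

```java
import java.util.Properties;

public final class Constants {
    // Resolve the schema-qualified table name once from already-loaded
    // properties, rather than injecting @Value into a static field.
    public static String studentTable(Properties props) {
        String schema = props.getProperty("db2.schema", "DEFAULT");
        return schema + ".Student";
    }

    private Constants() {
        // utility class, not instantiable
    }
}
```

Load the Properties from the classpath (as the second snippet above does) and pass them to Constants.studentTable wherever the table name is needed.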

Query a JSON file with Java-Large file

I am trying to parse the JSON file below using Java.
I need to be able to
search the file by id, name, or any other field in the object;
search for empty values in fields as well.
The search should return the entire object.
The file will be huge, and the search should still be time-efficient.
[
{
"id": 1,
"name": "Mark Robb",
"last_login": "2013-01-21T05:13:41 -11:30",
"email": "markrobb#gmail.com",
"phone": "12345",
"locations": [
"Germany",
"Austria"
]
},
{
"id": 2,
"name": "Matt Nish",
"last_login": "2014-02-21T07:10:41 -11:30",
"email": "mattnish#gmail.com",
"phone": "456123",
"locations": [
"France",
"Italy"
]
}
]
This is what I have tried so far using Jackson library.
public void findById(int id) throws IOException {
List<Customer> customers = objectMapper.readValue(new File("src/main/resources/customers.json"), new TypeReference<List<Customer>>(){});
for(Customer customer: customers) {
if(customer.getId() == id) {
System.out.println(customer.getName());
}
}
}
I just don't think this is an efficient method for a huge JSON file (about 20,000 customers per file), and there could be multiple files. Search time should not increase linearly.
How can I make this time efficient? Should I use any other library?
The most efficient way (for both CPU and memory) to parse is stream-oriented parsing instead of object mapping. It usually takes a bit more code, but it is usually a good deal :) Both Gson and Jackson support this lightweight technique. Also, you should avoid memory allocation on the main/hot path to prevent GC pauses. To illustrate the idea, I use a small GC-free library, https://github.com/anatolygudkov/green-jelly:
import org.green.jelly.*;
import java.io.CharArrayReader;
import java.io.Reader;
import java.util.ArrayList;
import java.util.List;
public class SelectById {
public static class Customer {
private long id;
private String name;
private String email;
public void clear() {
id = 0;
name = null;
email = null;
}
public Customer makeCopy() {
Customer result = new Customer();
result.id = id;
result.name = name;
result.email = email;
return result;
}
@Override
public String toString() {
return "Customer{" +
"id=" + id +
", name='" + name + '\'' +
", email='" + email + '\'' +
'}';
}
}
public static void main(String[] args) throws Exception {
final String file = "\n" +
"[\n" +
" {\n" +
" \"id\": 1,\n" +
" \"name\": \"Mark Robb\",\n" +
" \"last_login\": \"2013-01-21T05:13:41 -11:30\",\n" +
" \"email\": \"markrobb#gmail.com\",\n" +
" \"phone\": \"12345\",\n" +
" \"locations\": [\n" +
" \"Germany\",\n" +
" \"Austria\"\n" +
" ]\n" +
"},\n" +
" {\n" +
" \"id\": 2,\n" +
" \"name\": \"Matt Nish\",\n" +
" \"last_login\": \"2014-02-21T07:10:41 -11:30\",\n" +
" \"email\": \"mattnish#gmail.com\",\n" +
" \"phone\": \"456123\",\n" +
" \"locations\": [\n" +
" \"France\",\n" +
" \"Italy\"\n" +
" ]\n" +
" }\n" +
"]\n";
final List<Customer> selection = new ArrayList<>();
final long selectionId = 2;
final JsonParser parser = new JsonParser().setListener(
new JsonParserListenerAdaptor() {
private final Customer customer = new Customer();
private String currentField;
@Override
public boolean onObjectStarted() {
customer.clear();
return true;
}
@Override
public boolean onObjectMember(final CharSequence name) {
currentField = name.toString();
return true;
}
@Override
public boolean onStringValue(final CharSequence data) {
switch (currentField) {
case "name":
customer.name = data.toString();
break;
case "email":
customer.email = data.toString();
break;
}
return true;
}
@Override
public boolean onNumberValue(final JsonNumber number) {
if ("id".equals(currentField)) {
customer.id = number.mantissa();
}
return true;
}
@Override
public boolean onObjectEnded() {
if (customer.id == selectionId) {
selection.add(customer.makeCopy());
return false; // we don't need to continue
}
return true;
}
}
);
// now let's read and parse the data with a buffer
final CharArrayCharSequence buffer = new CharArrayCharSequence(1024);
try (final Reader reader = new CharArrayReader(file.toCharArray())) { // replace by FileReader, for example
int len;
while((len = reader.read(buffer.getChars())) != -1) {
buffer.setLength(len);
parser.parse(buffer);
}
}
parser.eoj();
System.out.println(selection);
}
}
It should work almost as fast as possible in Java (short of using SIMD instructions directly). To get rid of memory allocation (and GC pauses) on the main path entirely, you have to replace .toString() (which creates a new String instance) with something reusable like a StringBuilder.
The last thing that may affect overall performance is the method of reading the file, and RandomAccessFile is one of the best options we have in Java. Since your encoding seems to be ASCII, just cast each byte to char to pass it to the JsonParser.
It should be possible to do this with Jackson. The trick is to use JsonParser to stream/parse the top-level array and then parse each record using ObjectMapper.readValue().
ObjectMapper objectMapper = new ObjectMapper();
File file = new File("customers.json");
try (JsonParser parser = objectMapper.getFactory().createParser(file))
{
//Assuming top-level array
if (parser.nextToken() != JsonToken.START_ARRAY)
throw new RuntimeException("Expected top-level array in JSON.");
//Now inside the array, parse each record
while (parser.nextToken() != JsonToken.END_ARRAY)
{
Customer customer = objectMapper.readValue(parser, Customer.class);
//Do something with each customer as it is parsed
System.out.println(customer.id + ": " + customer.name);
}
}
@JsonIgnoreProperties(ignoreUnknown = true)
public static class Customer
{
public String id;
public String name;
public String email;
}
In terms of time efficiency it will need to still scan the entire file - not much you can do about that without an index or something fancier like parallel parsing. But it will be more memory efficient than reading the entire JSON into memory - this code only loads one Customer object at a time.
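If the same files are queried repeatedly, one pass can build an in-memory index so later lookups stop being linear. A minimal sketch of the idea, independent of the JSON library; the CustomerIndex class and its nested Customer record are my own stand-ins, not from the question:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class CustomerIndex {
    // Minimal stand-in for the mapped Customer class.
    record Customer(int id, String name) {}

    // One O(n) pass over the parsed customers; every later findById is O(1).
    static Map<Integer, Customer> buildIndex(List<Customer> customers) {
        Map<Integer, Customer> byId = new HashMap<>();
        for (Customer c : customers) {
            byId.put(c.id(), c);
        }
        return byId;
    }
}
```

The same map-building step can run inside a streaming parse loop, indexing each Customer as it is parsed instead of collecting them into a list first.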
Also:
if(customer.getId() == id) {
Since getId() returns a String in the Customer class above, compare with .equals(), not ==:
if (customer.getId().equals(id)) {
You can try the Gson library. It provides the TypeAdapter abstract class, which converts Java objects to and from JSON via streaming serialization and deserialization. The API is efficient and flexible, especially for huge files. Here is an example:
public class GsonStream {
    public static void main(String[] args) {
        Gson gson = new Gson();
        int id = 2; // the id being searched for
        try (Reader reader = new FileReader("src/main/resources/customers.json")) {
            Type listType = new TypeToken<List<Customer>>(){}.getType();
            // Convert the JSON file to Java objects
            List<Customer> customers = gson.fromJson(reader, listType);
            List<String> names = customers
                    .stream()
                    .filter(c -> c.getId() == id)
                    .map(Customer::getName)
                    .collect(Collectors.toList());
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
If you want to understand how to override the TypeAdapter abstract class, here is an example:
public class GsonTypeAdapter {
    public static void main(String[] args) {
        GsonBuilder builder = new GsonBuilder();
        builder.registerTypeAdapter(Customer.class, new CustomerAdapter());
        builder.setPrettyPrinting();
        Gson gson = builder.create();
        try {
            JsonReader reader = new JsonReader(new FileReader("src/main/resources/customers.json"));
            Customer customer = gson.fromJson(reader, Customer.class);
            System.out.println(customer);
            String jsonString = gson.toJson(customer);
            System.out.println(jsonString);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
class CustomerAdapter extends TypeAdapter<Customer> {
    @Override
    public Customer read(JsonReader reader) throws IOException {
        Customer customer = new Customer();
        reader.beginObject();
        while (reader.hasNext()) {
            String fieldName = reader.nextName();
            if ("name".equals(fieldName)) {
                customer.setName(reader.nextString());
            } else if ("id".equals(fieldName)) {
                customer.setId(reader.nextInt());
            } else {
                reader.skipValue(); // ignore fields this adapter does not map
            }
        }
        reader.endObject();
        return customer;
    }
    @Override
    public void write(JsonWriter writer, Customer customer) throws IOException {
        writer.beginObject();
        writer.name("name");
        writer.value(customer.getName());
        writer.name("id");
        writer.value(customer.getId());
        writer.endObject();
    }
}
class Customer {
private int id;
private String name;
public int getId() {
return id;
}
public void setId(int id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String toString() {
return "Customer[ name = " + name + ", id: " + id + "]";
}
}

Repeated Writing and Loading ArrayList of objects from a file

I have a 'Person' class where I store data like name, surname, etc. I create 5 Person objects, add them to an ArrayList, and save this ArrayList to a file. Then I load the ArrayList from the file and get the 5 persons back. The problem comes when I save again, for example 10 Person objects: when I load the ArrayList from the file, I still get only the 5 persons from the first write. However often I repeat this, I keep loading the data from the first write. How can I fix this?
public class Data {
static List<Person> persons = new ArrayList<Person>();
public static void main(String[] args) throws IOException {
Data.savePersons(5);
Data.loadPersons();
/** Clean 'persons' array for TEST of load data */
persons.removeAll(persons);
System.out.println("\n-----------\nNext Round\n-----------\n");
Data.savePersons(10);
Data.loadPersons();
}
/** Save a couple of Person Object to file C:/data.ser */
public static void savePersons(int noOfPersonToSave) throws IOException {
FileOutputStream fout = null;
ObjectOutputStream oos = null;
/** Make 'noOfPersonToSave' Person objects and add them to the 'persons' ArrayList */
for (int i = 0; i < noOfPersonToSave; i++) {
Person personTest = new Person("name" + i, "surname" + i, "email" +i, "1234567890" +i);
persons.add(personTest);
}
try {
fout = new FileOutputStream("C:\\data.ser", true);
oos = new ObjectOutputStream(fout);
oos.writeObject(persons);
System.out.println("Saving '" + persons.size() + "' Object to Array");
System.out.println("persons.size() = " + persons.size());
System.out.println("savePersons() = OK");
} catch (Exception ex) {
System.out.println("Saving ERROR: " + ex.getMessage());
} finally {
if (oos != null) {
oos.close();
}
}
}
/** Load previously saved a couple of Person Object in file C:/data.ser */
public static void loadPersons() throws IOException {
FileInputStream fis = null;
ObjectInputStream ois = null;
try {
fis = new FileInputStream("C:\\data.ser");
ois = new ObjectInputStream(fis);
persons = (List<Person>) ois.readObject();
//persons.add(result);
System.out.println("-------------------------");
System.out.println("Loading '" + persons.size() + "' Object from Array");
System.out.println("persons.size() = " + persons.size());
System.out.println("loadPersons() = OK");
} catch (Exception e) {
System.out.println("-------------------------");
System.out.println("Loading ERROR: " + e.getMessage());
} finally {
if (ois != null) {
ois.close();
}
}
}}
class Person implements Serializable {
private static final long serialVersionUID = 1L;
private String name;
private String surname;
private String mail;
private String telephone;
public Person(String n, String s, String m, String t) {
name = n;
surname = s;
mail = m;
telephone = t;
}
public String getName() {
return name;
}
public String getSurname() {
return surname;
}
public String getMail() {
return mail;
}
public String getTelephone() {
return telephone;
}}
new FileOutputStream("C:\\data.ser", true)
You're passing true for the append parameter, so the list of 10 persons is appended to the file after the already existing list of 5. And since you only read one list, you get the first one you wrote, which contains 5 persons.
Pass false instead of true.
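A self-contained sketch of what goes wrong (it uses a temp file rather than the question's C:\data.ser): each ObjectOutputStream opened in append mode writes a fresh stream header, so a single readObject() only ever sees the first list.

```java
import java.io.*;
import java.nio.file.Files;
import java.util.ArrayList;
import java.util.List;

public class AppendDemo {
    // Serialize a list of n strings to the file, optionally appending.
    static void writeList(File file, int n, boolean append) throws IOException {
        List<String> list = new ArrayList<>();
        for (int i = 0; i < n; i++) {
            list.add("person" + i);
        }
        try (ObjectOutputStream oos = new ObjectOutputStream(new FileOutputStream(file, append))) {
            oos.writeObject(list);
        }
    }

    // Read back only the first serialized object in the file.
    static List<?> readFirstList(File file) throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois = new ObjectInputStream(new FileInputStream(file))) {
            return (List<?>) ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        File file = Files.createTempFile("data", ".ser").toFile();
        writeList(file, 5, true);   // file was empty: first stream
        writeList(file, 10, true);  // appended as a second, never-read stream
        System.out.println(readFirstList(file).size()); // prints 5, not 10
        file.delete();
    }
}
```

Opening the FileOutputStream with append = false overwrites the file, so the single list read back always matches the last write.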

Tomcat not logging to file

Hi, I wrote my own little Logger class to write log output into files. The class looks like this:
public class Logger {
private String prefix;
private String pathToLogFiles = "/tmp/sos/logs/";
private String infoLog;
private String errorLog;
private boolean doLogtoFile;
public Logger(String prefix, Class classname, boolean logToFile) {
this.prefix = "[" + prefix + "]";
SimpleDateFormat sdf = new SimpleDateFormat("yyyyMMdd");
Date now = new Date();
this.infoLog = this.pathToLogFiles + "info_" + sdf.format(now) + ".log";
this.errorLog = this.pathToLogFiles + "error_" + sdf.format(now) + ".log";
this.doLogtoFile = logToFile;
System.out.println("Will log to: " + this.infoLog + " and " + this.errorLog);
}
public void info(String message) {
String logmessage = prefix + " INFO " + message;
if( this.doLogtoFile ) {
try {
Files.write(Paths.get(infoLog), logmessage.getBytes(), StandardOpenOption.APPEND);
} catch (IOException e) {
System.out.println(prefix + message);
}
} else {
System.out.println(prefix + message);
}
}
public void error(String message) {
String logmessage = prefix + " ERROR " + message;
if( this.doLogtoFile ) {
try {
Files.write(Paths.get(errorLog), logmessage.getBytes(), StandardOpenOption.APPEND);
} catch (IOException e) {
System.out.println(prefix + message);
}
} else {
System.out.println(prefix + message);
}
}
}
Permissions of the directory and subdirectories are all set up correctly, and they are owned by the tomcat user. However, I don't get any log output, nor are the log files even created.
Can anyone see what could be going wrong?
Strangely enough, I am not getting any exception.
Many thanks for any help
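One thing worth checking (an assumption, since no stack trace is shown): Files.write with StandardOpenOption.APPEND alone throws NoSuchFileException when the target file does not exist yet, and the catch block above swallows it by falling back to System.out, which Tomcat may redirect elsewhere. Adding CREATE makes the first write create the file:

```java
import java.io.IOException;
import java.nio.file.*;

public class LogWriteDemo {
    // APPEND alone fails if the file is missing; CREATE + APPEND creates it
    // on the first write and appends on every write after that.
    static void appendLine(Path logFile, String message) throws IOException {
        Files.write(logFile, (message + System.lineSeparator()).getBytes(),
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);
    }
}
```

With CREATE in place, the info and error files should appear on the first info() or error() call, provided the parent directories already exist.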

Testng to print result pass/ fail

Here, I am trying to print the results of my test cases, but whatever I do, the test result prints only as 16.
I want it to print success, or pass/fail.
It would be a great help if you could post an example.
public class EdiitAttendancePunchesByDepartment {
public EdiitAttendancePunchesByDepartment() {
}
@BeforeClass
public static void setUpClass() {
}
@AfterClass
public static void tearDownClass() {
}
@BeforeMethod
public void setUp() {
}
@AfterMethod
public void tearDown() {
}
Boolean t = true;
@Test(priority = 0, dataProvider = "globalcfg")
public void global(String propid, String propvalue) throws IOException {
ConfigurationBean cfgbean = new ConfigurationBean();
for (ConfigurationModel cc : cfgbean.getConfigurationList()) {
if (cc.getPropertyId().equals(propid)) {
cc.setPropertyId(propid);
cc.setPropertyValue(propvalue);
System.out.println("Propid : " + cc.getPropertyId() + " value : " + cc.getPropertyValue());
}
}
File output = new File("D:/data/kishore/Edit_punches_output.xls");
FileWriter writes = new FileWriter(output.getAbsoluteFile());
BufferedWriter bw = new BufferedWriter(writes);
bw.write("EmpID-Date-Punch_Times-PayDay-Total_IN_Hours-OT-TEST_ID-Leave_Type");
bw.close();
cfgbean.setCreateButtonFlag(Boolean.FALSE);
cfgbean.setUpdateButtonFlag(Boolean.TRUE);
cfgbean.updateConfiguration();
cfgbean.retrieveConfiguration();
for (ConfigurationModel cc : cfgbean.getConfigurationList()) {
if (cc.getPropertyId().equals(propid)) {
System.out.println("Propid Out Put : " + cc.getPropertyId() + " value Out Put: " + cc.getPropertyValue());
}
}
System.out.println("Global Config running>>>>>>>>>>>>>>>");
boolean tr=Reporter.getCurrentTestResult().isSuccess();
System.out.println("Check<><><><><><><><><><><><><><><><><><><><>"+tr);
if(tr==true){
System.out.println("Result /////////////////////////////////////"+tr);
}
System.out.println("Test result>>>>>>>>>>>>>> "+Reporter.getCurrentTestResult().getStatus());
}
@Test(priority = 1, dataProvider = "viewpunches")
public void testviewpunches(String cmpcode, String orgcode, String Empid, String Empname, String deptname, Integer compgrpid,
String date, String Time1, String Time2, String type, String typeid) {
EditEmpTimeSheetBean bean = new EditEmpTimeSheetBean();
bean.setCmpCode(cmpcode);
bean.setOrgCode(orgcode);
try {
SimpleDateFormat format = new SimpleDateFormat("dd/MM/yyyy");
SimpleDateFormat format1 = new SimpleDateFormat("dd/MM/yyyy HH:mm:ss");
Date date1 = format.parse(date);
Date time1 = format1.parse(Time1);
Date time3 = format1.parse(Time2);
SimpleDateFormat op = new SimpleDateFormat("dd/MM/yyyy");
bean.setTimeSheetDate(date1);
bean.setEmpCompGroupId(compgrpid);
bean.setDepartmentName(deptname);
bean.setEmployeeCode(Empid);
bean.setEmployeeName(Empname);
bean.setDialogFlag(Boolean.TRUE);
bean.viewTimeSheetDetailByDepartment();
EmployeeDTO dto = new EmployeeDTO();
EmployeeService service = new EmployeeServiceImpl();
dto = service.retrieveEmployee(bean.getCmpCode(), bean.getOrgCode(), bean.getEmployeeCode());
if (dto == null) {
File output = new File("D:/data/kishore/Edit_punches_output.xls");
FileWriter write_new = new FileWriter(output, true);
BufferedWriter bw_new = new BufferedWriter(write_new);
bw_new.write("\n" + bean.getEmployeeCode() + "- NO record found - - - - -" + typeid);
bw_new.close();
System.out.println("Invalid Employee Code");
} else {
System.out.println("Valid code");
}
for (EmpWorkLogModel cc : bean.getEmpWorkLogArray()) {
if (cc.getEmployeeCode().equals(Empid)) {
cc.onChange();
List<LogTimeModel> kl = new ArrayList<LogTimeModel>();
LogTimeModel m = new LogTimeModel();
LogTimeModel mn = new LogTimeModel();
m.setPunchTimes(time1);
if (type.equalsIgnoreCase("insert")) {
m.setOpFlag(Constant.OP_FLAG_INSERT);
}
if (type.equalsIgnoreCase("update")) {
m.setOpFlag(Constant.OP_FLAG_UPDATE);
}
if (type.equalsIgnoreCase("delete")) {
m.setOpFlag(Constant.OP_FLAG_DELETE);
}
mn.setPunchTimes(time3);
if (type.equalsIgnoreCase("insert")) {
mn.setOpFlag(Constant.OP_FLAG_INSERT);
}
if (type.equalsIgnoreCase("update")) {
mn.setOpFlag(Constant.OP_FLAG_UPDATE);
}
if (type.equalsIgnoreCase("delete")) {
mn.setOpFlag(Constant.OP_FLAG_DELETE);
}
if (type.equalsIgnoreCase("CHANGE")) {
}
kl.add(m);
kl.add(mn);
cc.setPunchList(kl);
System.out.println("punch time>>>>>>>>>>>>>>>>>XXXXXXXXX>>>>>>>>>>>>XXXXXXXXXX " + mn.getPunchTimes() + " " + cc.getPunchTime() + " " + cc.getFromDate());
System.out.println("Emp ID : " + cc.getEmployeeCode() + " \nPunch time : " + cc.getPunchTimes() + " \nPay Day : " + cc.getPayDay() + " \nIN hours: " + cc.getWorkedTime());
} else {
System.out.println("\n\nNo Records found for : " + cc.getEmployeeCode());
}
}System.out.println("Test result>>>>>>>>>>>>>> "+Reporter.getCurrentTestResult().getStatus());
bean.updateLogTime();
testview(bean.getEmployeeCode(), bean.getEmployeeName(), bean.getShiftId(), date, typeid);
} catch (Exception e) {
System.out.println(" Error :" + e);
e.printStackTrace();
}
}
This is the output i receive
sessionFactory1 org.hibernate.internal.SessionFactoryImpl@196a6ac
sessionFactory1 SessionStatistics[entity count=0collection count=0]
Propid : com.rgs.payroll.enableDynamicWeekOff value : N
Propid Out Put : com.rgs.payroll.enableDynamicWeekOff value Out Put: N
Global Config running>
Check<><><><><><><><><><>false
Test result>>>>>>>>>>>>>> 16
Remove the pass/fail decision-making code from your test case and use the code below:
@AfterMethod
public void tearDown(ITestResult result) {
if (result.getStatus() == ITestResult.FAILURE) {
System.out.println(result.getMethod().getMethodName()+ " is failed");
//do something here.
}
}
16 means "STARTED" (ITestResult.STARTED).
It's the only value you can see inside the test method itself:
TestNG only determines the final status after the method has ended.
You can print the result from the @AfterMethod method instead.
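For reference, these are the status codes behind the printed 16, mirrored here in plain Java (the constant values match org.testng.ITestResult; verify against your TestNG version):

```java
public class TestStatus {
    // Mirrors org.testng.ITestResult's status constants.
    static final int SUCCESS = 1;
    static final int FAILURE = 2;
    static final int SKIP = 3;
    static final int STARTED = 16;

    // Translate a status code into a readable label.
    static String describe(int status) {
        switch (status) {
            case SUCCESS: return "PASS";
            case FAILURE: return "FAIL";
            case SKIP:    return "SKIP";
            case STARTED: return "STARTED";
            default:      return "UNKNOWN(" + status + ")";
        }
    }
}
```

Inside the test method the status is still STARTED (16); in the @AfterMethod hook shown above, result.getStatus() will be one of the final values.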
