IntelliJ keeps saying "Undefined Step" when running my feature file. However, I have copied the steps into another package and added that package name to the "glue" parameter in Edit Configurations. It is still not working.
I've added the feature file and it doesn't show any missing step references. I am adding a screenshot. I have added all the steps in another package.
Please see below
The code for CountriesSteps is as follows:
package Steps;
import Fixtures.Countries;
import cucumber.api.java.en.Given;
import cucumber.api.java.en.Then;
import cucumber.api.java.en.When;
import org.junit.Assert;
public class CountriesSteps {
Countries countriesclass = new Countries();
@Given("^I generate a restful request for countries$")
public void iGenerateARestfulRequestForCountries() throws Throwable {
countriesclass.GetCountries();
}
@When("^I receive a successful country response (.*)$")
public void iReceiveASuccessfulCountryResponseResponse(int code) throws Throwable {
Assert.assertEquals(code, countriesclass.getsCode());
}
@When("^I receive a successful country response$")
public void iReceiveASuccessfulCountryResponse() throws Throwable {
Assert.assertEquals(200, countriesclass.getsCode());
}
@Then("^the api country response returns (.*)$")
public void theApiCountryResponseReturnsCountries(int countries) throws Throwable {
Assert.assertEquals(countries, countriesclass.getCount());
}
@Then("^the api country response returns (.*),(.*),(.*),(.*),(.*)$")
public void theApiCountryResponseReturnsCountriesNameCapitalPopulation(int countries, int index, String name, String capital, int population) throws Throwable {
Assert.assertEquals(countries, countriesclass.getCount());
}
@Then("^the api country response for Index (.*) returns (.*),(.*),(.*)$")
public void theApiCountryResponseForIndexIndexReturnsNameCapitalPopulation(int index, String name, String capital, int population) throws Throwable {
//Validate a few values from response
Assert.assertEquals(name, countriesclass.getcList().get(index).name);
Assert.assertEquals(capital, countriesclass.getcList().get(index).capital);
Assert.assertEquals(population, countriesclass.getcList().get(index).population);
}
}
And the code for Countries is:
package Fixtures;
import Models.CountriesData;
import com.jayway.restassured.response.Response;
import gherkin.deps.com.google.gson.Gson;
import gherkin.deps.com.google.gson.reflect.TypeToken;
import org.json.JSONArray;
import java.lang.reflect.Type;
import java.util.ArrayList;
import java.util.List;
import static com.jayway.restassured.RestAssured.get;
public class Countries {
private static String url;
private static int count;
private static int sCode;
private static List<CountriesData> cList;
public void GetCountries() throws Exception
{
try {
url = "http://restcountries.eu/rest/v1/all";
// make get request to fetch json response from restcountries
Response resp = get(url);
//Fetching response in JSON as a string then convert to JSON Array
JSONArray jsonResponse = new JSONArray(resp.asString());
count = jsonResponse.length(); // how many items in the array
sCode = resp.statusCode(); // status code of 200
//create new arraylist to match CountriesData
List<CountriesData> cDataList = new ArrayList<CountriesData>();
Gson gson = new Gson();
Type listType = new TypeToken<List<CountriesData>>() {}.getType();
cDataList = gson.fromJson(jsonResponse.toString(), listType);
cList = cDataList;
}
catch (Exception e)
{
System.out.println("There is an error connecting to the API: " + e);
e.getStackTrace();
}
}
// getters to return the values
public int getsCode() {
return sCode;
}
public int getCount() {
return count;
}
public List<CountriesData> getcList() {
return cList;
}
}
When creating a new step definition file, IntelliJ by default proposes the file path \IdeaProjects\RestAPI\src\main\resources\stepmethods.
Your step definition folder is \IdeaProjects\RestAPI\src\test\java\Steps. Make sure Cucumber isn't looking in the wrong place.
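If you run the feature through a JUnit runner class, the glue has to point at the Steps package there as well. A minimal sketch of such a runner, assuming the Cucumber 2.x API that matches your cucumber.api.java.en imports; the features path is a placeholder for wherever your .feature files actually live:
package Steps;
import cucumber.api.CucumberOptions;
import cucumber.api.junit.Cucumber;
import org.junit.runner.RunWith;
@RunWith(Cucumber.class)
@CucumberOptions(
        features = "src/test/resources/features", // placeholder: point this at your .feature files
        glue = {"Steps"}) // must match the package that contains CountriesSteps
public class RunCucumberTest {
}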
I am trying to return a list of files from a directory. Here's my code:
package com.demo.web.api.file;
import java.io.File;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import com.demo.core.Logger;
import io.swagger.v3.oas.annotations.Operation;
@RestController
@RequestMapping(value = "/files")
public class FileService {
private static final Logger logger = Logger.factory(FileService.class);
#Value("${file-upload-path}")
public String DIRECTORY;
#Value("${file-upload-check-subfolders}")
public boolean CHECK_SUBFOLDERS;
#GetMapping(value = "/list")
#Operation(summary = "Get list of Uploaded files")
public ResponseEntity<List<File>> list() {
List<File> files = new ArrayList<>();
if (CHECK_SUBFOLDERS) {
// Recursive check
try (Stream<Path> walk = Files.walk(Paths.get(DIRECTORY))) {
List<Path> result = walk.filter(Files::isRegularFile).collect(Collectors.toList());
for (Path p : result) {
files.add(p.toFile().getAbsoluteFile());
}
} catch (Exception e) {
logger.error(e.getMessage());
}
} else {
// Checks the root directory only.
try (Stream<Path> walk = Files.walk(Paths.get(DIRECTORY), 1)) {
List<Path> result = walk.filter(Files::isRegularFile).collect(Collectors.toList());
for (Path p : result) {
files.add(p.toFile().getAbsoluteFile());
}
} catch (Exception e) {
logger.error(e.getMessage());
}
}
return ResponseEntity.ok().body(files);
}
}
As seen in the code, I am trying to return a list of files.
However, when I test in Postman, I get a list of strings instead.
How can I make it return the File objects instead of the file path strings? I need the file attributes (size, date, etc.) to display in my view.
I would recommend that you change your ResponseEntity<> to return not a List of File but instead, a List of Resource, which you can then use to obtain the file metadata that you need.
public ResponseEntity<List<Resource>> list() {}
You can also try specifying a produces = MediaType... parameter in your @GetMapping annotation to tell the HTTP message converter which kind of content to produce.
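For illustration, that attribute could look something like the sketch below (the concrete MediaType value is an assumption, not something from the question):
@GetMapping(value = "/list", produces = MediaType.APPLICATION_JSON_VALUE)
public ResponseEntity<List<Resource>> list() {
    // ...
}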
You'd have to create a separate payload class with the details you want to respond with.
public class FilePayload{
private String id;
private String name;
private String size;
public static FilePayload fromFile(File file) {
// create FilePayload from File object here
}
}
And convert it using a mapper from your internal DTO objects to payload ones.
final List<FilePayload> payload = files.stream().map(FilePayload::fromFile).collect(Collectors.toList());
return new ResponseEntity<>(payload, HttpStatus.OK);
I think you should not return a body in this case as you may be unaware of the size.
Better to have another endpoint to GET /files/{id}
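A rough sketch of what such a per-file endpoint could look like, assuming lookup by file name rather than an id, reusing the DIRECTORY field from the question, and using org.springframework.core.io.Resource/FileSystemResource plus MediaType and PathVariable; the details are illustrative, not the poster's code:
@GetMapping(value = "/{name}")
public ResponseEntity<Resource> getFile(@PathVariable String name) throws IOException {
    // Resolve the requested name against the upload directory
    Path path = Paths.get(DIRECTORY).resolve(name).normalize();
    if (!Files.isRegularFile(path)) {
        return ResponseEntity.notFound().build();
    }
    // FileSystemResource exposes length, last-modified, etc. through the Resource API
    Resource resource = new FileSystemResource(path.toFile());
    return ResponseEntity.ok()
            .contentLength(Files.size(path))
            .contentType(MediaType.APPLICATION_OCTET_STREAM)
            .body(resource);
}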
I did give this another thought. What I just needed was the filename, size and date of the file. From there, I can get the file extension and make my list display look good already.
Here's the refactored method:
#GetMapping(value = "/list")
#Operation(summary = "Get list of Uploaded files")
public ResponseEntity<String> list() {
JSONObject responseObj = new JSONObject();
List<JSONObject> files = new ArrayList<>();
// If CHECK_SUBFOLDERS is true, pass MAX_VALUE to make it recursive on all
// sub-folders. Otherwise, pass 1 to use the root directory only.
try (Stream<Path> walk = Files.walk(Paths.get(DIRECTORY), CHECK_SUBFOLDERS ? MAX_VALUE : 1)) {
List<Path> result = walk.filter(Files::isRegularFile).collect(Collectors.toList());
for (Path p : result) {
JSONObject file = new JSONObject();
file.put("name", p.toFile().getName());
file.put("size", p.toFile().length());
file.put("lastModified", p.toFile().lastModified());
files.add(file);
}
responseObj.put("data", files);
} catch (Exception e) {
String errMsg = CoreUtils.formatString("%s: Error reading files from the directory: \"%s\"",
e.getClass().getName(), DIRECTORY);
logger.error(e, errMsg);
responseObj.put("errors", errMsg);
}
return ResponseEntity.ok().body(responseObj.toString());
}
The above was what I ended up doing. I created a JSONObject of the props I need, and then returned the error if it did not succeed. This made it a lot better for me.
I'm learning to create a REST Assured and Cucumber framework from scratch, following a tutorial video on YouTube.
Below is the step definition and the method it calls in the RestAssuredExtension class.
@Given("^I perform GET operation for \"([^\"]*)\"$")
public void i_Perform_GET_Operation_For(String url) throws Throwable {
RestAssuredExtension.GetOps(url);
}
package utilities;
import io.restassured.RestAssured;
import io.restassured.builder.RequestSpecBuilder;
import io.restassured.http.ContentType;
import io.restassured.response.Response;
import io.restassured.response.ResponseOptions;
import io.restassured.specification.RequestSpecification;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.Map;
public class RestAssuredExtension {
public static RequestSpecification Request;
public RestAssuredExtension() {
//Arrange
RequestSpecBuilder builder = new RequestSpecBuilder();
builder.setBaseUri("http://localhost:3000/");
builder.setContentType(ContentType.JSON);
var requestSpec = builder.build();
Request = RestAssured.given().spec(requestSpec);
}
public static ResponseOptions<Response> GetOps(String url) {
//Act
try {
return Request.get(new URI(url));
} catch (URISyntaxException e) {
e.printStackTrace();
}
return null;
}
}
In the video tutorial, the test passes successfully. But when I run the test myself, it results in the following error:
Step failed
java.lang.NullPointerException: Cannot invoke "io.restassured.specification.RequestSpecification.get(java.net.URI)" because "utilities.RestAssuredExtension.Request" is null
at utilities.RestAssuredExtension.GetOps(RestAssuredExtension.java:42)
at steps.GETPostSteps.i_Perform_GET_Operation_For(GETPostSteps.java:21)
Any takers please?
From the example you have given, I think you have not initialized the RestAssuredExtension.Request field.
In the video (I quickly skimmed it), they provide a hook that creates a new instance of RestAssuredExtension before any tests are executed. This ensures that the public static field Request has been initialized to a non-null value.
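A minimal sketch of such a hook, assuming cucumber-java's Before annotation (the exact package depends on your Cucumber version, so treat the import as an assumption):
package steps;
import cucumber.api.java.Before;
import utilities.RestAssuredExtension;
public class Hooks {
    @Before
    public void setUp() {
        // Running the constructor populates the static Request field before any step executes
        new RestAssuredExtension();
    }
}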
My recommendation, if you want to reduce the setup dependency on the test framework and keep using static methods:
public final class RequestExtension {
private static RequestSpecification request;
// Ensure that no one is able to create an instance and thereby bypass proper initialization
private RequestExtension() {
}
// Ensures the initialization responsibility is within the class itself and not a hidden dependency for other users.
private static RequestSpecification getInstance() {
if (request == null) {
RequestSpecBuilder builder = new RequestSpecBuilder();
builder.setBaseUri("http://localhost:3000/");
builder.setContentType(ContentType.JSON);
var requestSpec = builder.build();
request = RestAssured.given().spec(requestSpec);
}
return request;
}
public static ResponseOptions<Response> GetOps(String url) {
// Initialize
getInstance();
// Act
try {
return request.get(new URI(url));
} catch (URISyntaxException e) {
e.printStackTrace();
}
return null;
}
}
Otherwise, mixing static methods with dependencies on instance state will keep tripping people up. I would either go with the above or remove static from the class altogether:
public class RequestExtension {
private RequestSpecification request;
public RequestExtension() {
//Arrange
RequestSpecBuilder builder = new RequestSpecBuilder();
builder.setBaseUri("http://localhost:3000/");
builder.setContentType(ContentType.JSON);
var requestSpec = builder.build();
request = RestAssured.given().spec(requestSpec);
}
public ResponseOptions<Response> GetOps(String url) {
//Act
try {
return request.get(new URI(url));
} catch (URISyntaxException e) {
e.printStackTrace();
}
return null;
}
}
One thing that helps with debugging is to follow Java naming conventions. The capitalisation of your field Request makes it read like a class name rather than a field name (Request vs request). It was the same in the video, so it's a source issue. :)
UPDATE: I have adapted my code according to the suggestions in the first reply, but an error is still produced.
I have written the following code to parse a very large JSON file:
public static void main(String[] args) throws Exception {
String jsonFile="/home/zz/Work/data/wdc/WDC_ProdMatch/idclusters.json";
WDCProdMatchDatasetIndexer_2 indexer = new WDCProdMatchDatasetIndexer_2();
indexer.readClusterMetadata(jsonFile);
}
public void readClusterMetadata(String jsonFile){
try(JsonReader jsonReader = new JsonReader(
new InputStreamReader(
new FileInputStream(jsonFile), StandardCharsets.UTF_8))) {
Gson gson = new GsonBuilder().create();
jsonReader.beginObject(); //start of json array
int numberOfRecords = 0;
while (jsonReader.hasNext()){ //next json array element
Cluster c = gson.fromJson(jsonReader, Cluster.class);
long[] sizeInfo=new long[]{c.clusterSizeInOffers, c.size};
//clusterSize.put(String.valueOf(c.id), sizeInfo);
numberOfRecords++;
if (numberOfRecords%1000==0)
System.out.println(String.format("processed %d clusters", numberOfRecords));
}
jsonReader.endArray();
System.out.println("Total Records Found : "+numberOfRecords);
}
catch (Exception e) {
e.printStackTrace();
}
}
class ArrayAsStringJsonDeserializer implements JsonDeserializer<List<String>> {
@Override
public List<String> deserialize(JsonElement json, Type typeOfT, JsonDeserializationContext context) throws JsonParseException {
String value = json.getAsString().trim();
value = value.substring(1, value.length() - 1);
return Arrays.stream(value.split(",")).map(String::trim).collect(Collectors.toList());
}
}
class Cluster {
protected long id;
protected long size;
#SerializedName("cluster_size_in_offers")
protected long clusterSizeInOffers;
@JsonAdapter(ArrayAsStringJsonDeserializer.class)
@SerializedName("id_values")
protected List<String> idValues;
#SerializedName("categoryDensity")
protected double catDensity;
#SerializedName("category")
protected String cat;
}
And the data file looks like this (first 10 lines)
{"size":4,"cluster_size_in_offers":1,"id_values":"[814914023129, w2190254, pfl60gs25ssdr, pfl60gs25ssdr]","id":2,"categoryDensity":1,"category":"Computers_and_Accessories"}
{"size":2,"cluster_size_in_offers":1,"id_values":"[hst322440ss, g1042641]","id":3,"categoryDensity":1,"category":"Office_Products"}
{"size":4,"cluster_size_in_offers":1,"id_values":"[4051329063869, t24datr01765, t24datr01763, datr01763]","id":4,"categoryDensity":1,"category":"Automotive"}
{"size":2,"cluster_size_in_offers":1,"id_values":"[5057195062301, sppct335a2bl]","id":7,"categoryDensity":1,"category":"Office_Products"}
{"size":3,"cluster_size_in_offers":1,"id_values":"[ 845173001269, mpnlkbusmokeam89us, lkbusmokeam89]","id":8,"categoryDensity":1,"category":"Computers_and_Accessories"}
{"size":2,"cluster_size_in_offers":1,"id_values":"[ksw26r0100, g1104817]","id":9,"categoryDensity":1,"category":"Other_Electronics"}
{"size":2,"cluster_size_in_offers":1,"id_values":"[5054328719897, ltr12x31r685c15]","id":11,"categoryDensity":1,"category":"Office_Products"}
{"size":2,"cluster_size_in_offers":1,"id_values":"[model82226, sirsir822261]","id":15,"categoryDensity":1,"category":"Tools_and_Home_Improvement"}
{"size":2,"cluster_size_in_offers":1,"id_values":"[5054328970724, sscl3816114a2bl]","id":17,"categoryDensity":1,"category":"Office_Products"}
{"size":2,"cluster_size_in_offers":1,"id_values":"[814882011647, 203932664]","id":20,"categoryDensity":1,"category":"Tools_and_Home_Improvement"}
But when the code is run on this data, an error is generated as follows:
com.google.gson.JsonSyntaxException: java.lang.IllegalStateException: Expected BEGIN_OBJECT but was NAME at line 1 column 3 path $.
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.read(ReflectiveTypeAdapterFactory.java:226)
at com.google.gson.Gson.fromJson(Gson.java:927)
at uk.ac.shef.inf.wop.indexing.WDCProdMatchDatasetIndexer_2.readClusterMetadata(WDCProdMatchDatasetIndexer_2.java:38)
at uk.ac.shef.inf.wop.indexing.WDCProdMatchDatasetIndexer_2.main(WDCProdMatchDatasetIndexer_2.java:25)
Caused by: java.lang.IllegalStateException: Expected BEGIN_OBJECT but was NAME at line 1 column 3 path $.
at com.google.gson.stream.JsonReader.beginObject(JsonReader.java:385)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.read(ReflectiveTypeAdapterFactory.java:215)
... 3 more
Any suggestions please?
In this file each line is a separate JSON object. One additional problem is that the JSON array in id_values is wrapped in quotes, which makes it a String primitive. You need to provide a custom deserialiser for it that unwraps the array from the quotes and manually splits the items on the comma (,). An example solution could look like the one below:
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.google.gson.JsonDeserializationContext;
import com.google.gson.JsonDeserializer;
import com.google.gson.JsonElement;
import com.google.gson.JsonParseException;
import com.google.gson.annotations.JsonAdapter;
import com.google.gson.annotations.SerializedName;
import lombok.Data;
import lombok.ToString;
import java.io.File;
import java.io.IOException;
import java.lang.reflect.Type;
import java.nio.file.Files;
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;
public class GsonApp {
public static void main(String[] args) throws Exception {
File jsonFile = new File("./resource/test.json").getAbsoluteFile();
List<Cluster> clusters = readClusters(jsonFile);
clusters.forEach(System.out::println);
}
private static List<Cluster> readClusters(File jsonFile) throws IOException {
Gson gson = new GsonBuilder().create();
try (Stream<String> lines = Files.lines(jsonFile.toPath())) {
return lines
.map(line -> gson.fromJson(line, Cluster.class))
.collect(Collectors.toList());
}
}
}
class ArrayAsStringJsonDeserializer implements JsonDeserializer<List<String>> {
@Override
public List<String> deserialize(JsonElement json, Type typeOfT, JsonDeserializationContext context) throws JsonParseException {
String value = json.getAsString().trim();
value = value.substring(1, value.length() - 1);
return Arrays.stream(value.split(",")).map(String::trim).collect(Collectors.toList());
}
}
@Data
@ToString
class Cluster {
protected long id;
protected long size;
#SerializedName("cluster_size_in_offers")
protected long clusterSizeInOffers;
@JsonAdapter(ArrayAsStringJsonDeserializer.class)
@SerializedName("id_values")
protected List<String> idValues;
#SerializedName("categoryDensity")
protected int catDensity;
#SerializedName("category")
protected String cat;
}
Above code prints:
Cluster(id=2, size=4, clusterSizeInOffers=1, idValues=[814914023129, w2190254, pfl60gs25ssdr, pfl60gs25ssdr], catDensity=1, cat=Computers_and_Accessories)
Cluster(id=3, size=2, clusterSizeInOffers=1, idValues=[hst322440ss, g1042641], catDensity=1, cat=Office_Products)
Cluster(id=4, size=4, clusterSizeInOffers=1, idValues=[4051329063869, t24datr01765, t24datr01763, datr01763], catDensity=1, cat=Automotive)
...
I have a Spring Boot app with an HTTP POST request handler. It accepts a payload that I parse and returns JSON. I have set it up so that the payload must contain a certain number of parameters (18).
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.validation.annotation.Validated;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RestController;
import com.google.gson.Gson;
@Validated
@RestController
public class MockController {
@Autowired
MockConfig mockconfig;
private static final Logger LOGGER = LoggerFactory.getLogger(MockController.class);
@RequestMapping(value = "/", method = RequestMethod.GET)
public String index() {
return "hello!";
}
String[] parse;
@PostMapping(value = "/")
public String payloader(@RequestBody String params) throws IOException {
LOGGER.debug("code is hitting");
parse = params.split("\\|");
String key;
String value;
String dyn;
Map<String, String> predictionFeatureMap = mockconfig.getPredictionFeatureMap();
if(parse.length!=18) {
key = "Not_enough_parameters";
value = predictionFeatureMap.get(key);
Map<?, ?> resultJsonObj = new Gson().fromJson(value, Map.class);
}
else {
key = params;
value = predictionFeatureMap.get(key);
}
return value;
}
}
My config class is where I get the input and output from a file and put them into a hashmap.
import java.io.File;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import java.util.Scanner;
import org.springframework.context.annotation.Configuration;
#Configuration
public class MockConfig {
private Map<String, String> predictionFeatureMap = new HashMap<String, String>();
public Map<String,String> getPredictionFeatureMap() throws IOException {
return predictionFeatureMap;
}
public MockConfig() throws IOException {
init();
}
private Map<String, String> init() throws IOException {
Scanner sc = new Scanner (new File("src/test/resources/Payload1.txt"));
int counter = 1;
String key = "";
while (sc.hasNextLine()) {
if(counter % 2 != 0) {
key = sc.nextLine();
}else {
predictionFeatureMap.put(key, sc.nextLine());
}
counter++;
}
sc.close();
return predictionFeatureMap;
}
}
This is the key and value in the file that I'm trying to work with specifically.
Not_enough_parameters
{"status": false, "errorMessage": "Payload has incorrect amount of parts: expecting: 18, actual:8", "version": "0.97", "coreName": "Patient_Responsibility"}
(The JSON string is the response to too many or too few parameters... the parameter count should be 18.)
Example input:
ncs|56-2629193|1972-03-28|20190218|77067|6208|3209440|self|-123|-123|-123|0.0|0.0|0.0|0.0|0.0|0.0|0.0
This input would pass because it has 18 parameters...
What I want is that if a user curls, for example, only 5 parameters
ncs|56-2629193|1972-03-28|20190218|77067
I want the value (the JSON error message) to have the 'actual' field updated, like:
{"status": false, "errorMessage": "Payload has incorrect amount of parts: expecting: 18, actual:5", "version": "0.97", "coreName": "Patient_Responsibility"}
without hardcoding it into the txt file or hashmap...
I have tried getting the index of the string and replacing the '8' character with parse.length cast to a char, but it just gives me this output:
{"status": false, "errorMessage": "Payload has incorrect amount of parts: expecting:1 actual:", "version": "0.97", "coreName": "Nulogix_Patient_Responsibility"}
How do I parse or index the JSON to update this value? Or is there a hashmap method to deal with this?
When working with a framework, you usually handle errors using the framework's way of handling errors.
To handle errors in Spring Boot you typically use a controller advice, which assists in handling errors. It is created by annotating a class with @ControllerAdvice.
There you can catch thrown exceptions and build responses that will be returned to the calling client.
@PostMapping(value = "/")
public String payloader(@RequestBody String params) throws IOException {
LOGGER.debug("code is hitting");
parse = params.split("\\|");
String key;
String value;
String dyn;
Map<String, String> predictionFeatureMap = mockconfig.getPredictionFeatureMap();
if(parse.length!=18) {
// Here you throw a custom runtime exception and pass what
// parameters you want that will help you build your error
// message you are passing to your client.
final String errorMessage = String.format(
"Payload has incorrect amount of parts: expecting:%d actual:%d",
18, // the payload is expected to have 18 parts
parse.length);
throw new MyException(errorMessage);
}
else {
key = params;
value = predictionFeatureMap.get(key);
}
return value;
}
Then in a controller advice class
@ControllerAdvice
public class Foo extends ResponseEntityExceptionHandler {
@ExceptionHandler(MyException.class)
public ResponseEntity<MyCustomErrorBody> handleControllerException(HttpServletRequest request, MyException ex) {
// Build your error response here and return it.
// Create a class that will represent the json error body
// and pass it as body and jackson will deserialize it for
// you into json automatically.
final MyCustomErrorBody body = new MyCustomErrorBody(false, ex.getMessage(), "0.97", "Patient_Responsibility");
return ResponseEntity.unprocessableEntity().body(body);
}
}
public class MyCustomErrorBody {
private boolean status;
private String errorMessage;
private String version;
private String coreName;
// constructor, getters and setters omitted
}
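MyException itself is not shown above; a minimal sketch would be a plain runtime exception (the class name simply mirrors the one used in the handler):
public class MyException extends RuntimeException {
    public MyException(String message) {
        super(message);
    }
}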
Links about spring boot error handling:
Official spring boot documentation
Baeldung exception-handling-for-rest-with-spring
I am working on a requirement where I need to parse CSV record fields against multiple validations. I am using Super CSV, which has support for field-level processors to validate data.
My requirement is to validate each record/row field against multiple validations and save them to the database with a success/failure status. For failed records I have to display all the failed validations using some codes.
Super CSV is working fine, but it checks only the first validation for a field; if that fails, it ignores the second validation for the same field. Please look at the code below and help me with this.
package com.demo.supercsv;
import java.io.FileReader;
import java.io.IOException;
import java.io.StringWriter;
import java.util.ArrayList;
import java.util.List;
import org.supercsv.cellprocessor.Optional;
import org.supercsv.cellprocessor.constraint.NotNull;
import org.supercsv.cellprocessor.constraint.StrMinMax;
import org.supercsv.cellprocessor.constraint.StrRegEx;
import org.supercsv.cellprocessor.constraint.UniqueHashCode;
import org.supercsv.cellprocessor.ift.CellProcessor;
import org.supercsv.exception.SuperCsvCellProcessorException;
import org.supercsv.io.CsvBeanReader;
import org.supercsv.io.CsvBeanWriter;
import org.supercsv.io.ICsvBeanReader;
import org.supercsv.io.ICsvBeanWriter;
import org.supercsv.prefs.CsvPreference;
public class ParserDemo {
public static void main(String[] args) throws IOException {
List<Employee> emps = readCSVToBean();
System.out.println(emps);
System.out.println("******");
writeCSVData(emps);
}
private static void writeCSVData(List<Employee> emps) throws IOException {
ICsvBeanWriter beanWriter = null;
StringWriter writer = new StringWriter();
try{
beanWriter = new CsvBeanWriter(writer, CsvPreference.STANDARD_PREFERENCE);
final String[] header = new String[]{"id","name","role","salary"};
final CellProcessor[] processors = getProcessors();
// write the header
beanWriter.writeHeader(header);
//write the beans data
for(Employee emp : emps){
beanWriter.write(emp, header, processors);
}
}finally{
if( beanWriter != null ) {
beanWriter.close();
}
}
System.out.println("CSV Data\n"+writer.toString());
}
private static List<Employee> readCSVToBean() throws IOException {
ICsvBeanReader beanReader = null;
List<Employee> emps = new ArrayList<Employee>();
try {
beanReader = new CsvBeanReader(new FileReader("src/employees.csv"),
CsvPreference.STANDARD_PREFERENCE);
// the name mapping provide the basis for bean setters
final String[] nameMapping = new String[]{"id","name","role","salary"};
//just read the header, so that it don't get mapped to Employee object
final String[] header = beanReader.getHeader(true);
final CellProcessor[] processors = getProcessors();
Employee emp;
while ((emp = beanReader.read(Employee.class, nameMapping,
processors)) != null) {
emps.add(emp);
if (!CaptureExceptions.SUPPRESSED_EXCEPTIONS.isEmpty()) {
System.out.println("Suppressed exceptions for row "
+ beanReader.getRowNumber() + ":");
for (SuperCsvCellProcessorException e :
CaptureExceptions.SUPPRESSED_EXCEPTIONS) {
System.out.println(e);
}
// for processing next row clearing validation list
CaptureExceptions.SUPPRESSED_EXCEPTIONS.clear();
}
}
} finally {
if (beanReader != null) {
beanReader.close();
}
}
return emps;
}
private static CellProcessor[] getProcessors() {
final CellProcessor[] processors = new CellProcessor[] {
new CaptureExceptions(new NotNull(new StrRegEx("\\d+", new StrMinMax(0, 2)))), // id must be digits and no more than two characters
new CaptureExceptions(new Optional()),
new CaptureExceptions(new Optional()),
new CaptureExceptions(new NotNull()),
// Salary
};
return processors;
}
}
Exception Handler:
package com.demo.supercsv;
import java.util.ArrayList;
import java.util.List;
import org.supercsv.cellprocessor.CellProcessorAdaptor;
import org.supercsv.cellprocessor.ift.CellProcessor;
import org.supercsv.exception.SuperCsvCellProcessorException;
import org.supercsv.util.CsvContext;
public class CaptureExceptions extends CellProcessorAdaptor {
public static List<SuperCsvCellProcessorException> SUPPRESSED_EXCEPTIONS =
new ArrayList<SuperCsvCellProcessorException>();
public CaptureExceptions(CellProcessor next) {
super(next);
}
public Object execute(Object value, CsvContext context) {
try {
return next.execute(value, context);
} catch (SuperCsvCellProcessorException e) {
// save the exception
SUPPRESSED_EXCEPTIONS.add(e);
if(value!=null)
return value.toString();
else
return "";
}
}
}
sample csv file
ID,Name,Role,Salary
a123,kiran,CEO,"5000USD"
2,Kumar,Manager,2000USD
3,David,developer,1000USD
When I run my program, the Super CSV exception handler displays this message for the ID value in the first row:
Suppressed exceptions for row 2:
org.supercsv.exception.SuperCsvConstraintViolationException: 'a123' does not match the regular expression '\d+'
processor=org.supercsv.cellprocessor.constraint.StrRegEx
context={lineNo=2, rowNo=2, columnNo=1, rowSource=[a123, kiran, CEO, 5000USD]}
[com.demo.supercsv.Employee#23bf011e, com.demo.supercsv.Employee#50e26ae7, com.demo.supercsv.Employee#40d88d2d]
For the Id field, the value should not be null, should be numeric, and should not be more than two characters long. I have defined the field processor like this:
new CaptureExceptions(new NotNull(new StrRegEx("\\d+",new StrMinMax(0, 2))))
But Super CSV ignores the second validation (max length 2) if the given input is not numeric. If my input is 100, then it does validate the max length. How do I get both validations reported for a wrong input? Please help me with this.
Super CSV cell processors work as a chain: a value is passed to the next processor only if it satisfies the previous constraint.
To achieve your goal, you need to write a custom CellProcessor which checks both whether the input is a number (digits only) and whether its length is between 0 and 2, so that both of those checks are done in a single step.
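A minimal sketch of such a processor, assuming it is wrapped in the same CaptureExceptions adaptor shown above; the class name IdDigitsMaxTwo is made up for illustration:
package com.demo.supercsv;
import org.supercsv.cellprocessor.CellProcessorAdaptor;
import org.supercsv.exception.SuperCsvConstraintViolationException;
import org.supercsv.util.CsvContext;
public class IdDigitsMaxTwo extends CellProcessorAdaptor {
    public Object execute(Object value, CsvContext context) {
        validateInputNotNull(value, context); // behaves like NotNull
        String s = value.toString();
        StringBuilder problems = new StringBuilder();
        // Collect every violation instead of stopping at the first one
        if (!s.matches("\\d+")) {
            problems.append("'").append(s).append("' is not numeric. ");
        }
        if (s.length() > 2) {
            problems.append("'").append(s).append("' is longer than 2 characters.");
        }
        if (problems.length() > 0) {
            throw new SuperCsvConstraintViolationException(problems.toString().trim(), context, this);
        }
        return next.execute(s, context);
    }
}
The id column's processor would then become new CaptureExceptions(new IdDigitsMaxTwo()), and both problems for a value like "a123" end up in the same captured exception message.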