Refactor for supplier classes - java

I'm looking to refactor two supplier classes as they both have very similar code. One provides an ArrayList and the other a Map. They are stored in the configuration folder, but I'm not sure that's the correct place. They both load data from a text file stored in the project folder, which doesn't feel right to me.
The two supplier classes are:
@Component
public class ModulusWeightTableSupplier implements Supplier<List> {
    private static final Logger LOGGER = LogManager.getLogger(CDLBankDetailsValidator.class);
    private static final String MODULUS_WEIGHT_TABLE = "AccountModulus_Weight_Table.txt";

    @Override
    public List<ModulusWeightTableEntry> get() {
        LOGGER.debug("Attempting to load modulus weight table " + MODULUS_WEIGHT_TABLE);
        final List<ModulusWeightTableEntry> modulusWeightTable = new ArrayList<>();
        try {
            final InputStream in = new FileInputStream(MODULUS_WEIGHT_TABLE);
            final BufferedReader br = new BufferedReader(new InputStreamReader(in));
            String line;
            while ((line = br.readLine()) != null) {
                final String[] fields = line.split("\\s+");
                modulusWeightTable.add(new ModulusWeightTableEntry(fields));
            }
            LOGGER.debug("Modulus weight table loaded");
            br.close();
        }
        catch (final IOException e) {
            throw new BankDetailsValidationRuntimeException("An error occurred loading the modulus weight table or sort code substitution table", e);
        }
        return modulusWeightTable;
    }
}
and
@Component
public class SortCodeSubstitutionTableSupplier implements Supplier<Map> {
    private static final Logger LOGGER = LogManager.getLogger(CDLBankDetailsValidator.class);
    private static final String SORT_CODE_SUBSTITUTION_TABLE = "SCSUBTAB.txt";

    @Override
    public Map<String, String> get() {
        LOGGER.debug("Attempting to load sort code substitution table " + SORT_CODE_SUBSTITUTION_TABLE);
        final Map<String, String> sortCodeSubstitutionTable = new HashMap<>();
        try {
            final InputStream in = new FileInputStream(SORT_CODE_SUBSTITUTION_TABLE);
            final BufferedReader br = new BufferedReader(new InputStreamReader(in));
            String line;
            while ((line = br.readLine()) != null) {
                final String[] fields = line.split("\\s+");
                sortCodeSubstitutionTable.put(fields[0], fields[1]);
            }
            LOGGER.debug("Sort code substitution table loaded");
            br.close();
        }
        catch (final IOException e) {
            throw new BankDetailsValidationRuntimeException("An error occurred loading the sort code substitution table", e);
        }
        return sortCodeSubstitutionTable;
    }
}
Both classes have a lot of duplicate code. I'm trying to work out the best way to refactor them.

So your current code loads some configuration from text files. The best solution for this would probably be to go with properties or YAML files, which is the most common approach for loading configuration data from external files.
A good starting point would be the Spring Boot documentation on externalized configuration, which explains how to use both properties and YAML files for loading configuration data in your application.
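Alternatively, if you want to keep the flat files for now, the duplication itself can be factored out into a shared line-by-line loader that accepts a per-line parser. A minimal sketch of the idea; the class and method names here are illustrative, not from the original code:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

// Hypothetical shared loader: each supplier keeps only its file name and
// a small lambda describing how one whitespace-separated line is parsed.
public final class TableLoader {

    private TableLoader() {
    }

    // Reads every line, splits on whitespace, and lets the caller turn the
    // fields into an entry that is accumulated into the result list.
    public static <T> List<T> loadLines(Reader source, Function<String[], T> parser) {
        final List<T> entries = new ArrayList<>();
        try (BufferedReader br = new BufferedReader(source)) {
            String line;
            while ((line = br.readLine()) != null) {
                entries.add(parser.apply(line.split("\\s+")));
            }
        } catch (IOException e) {
            throw new RuntimeException("Failed to load table", e);
        }
        return entries;
    }

    // The Map case reuses the same loader and collects the field pairs afterwards.
    public static Map<String, String> loadPairs(Reader source) {
        final Map<String, String> result = new HashMap<>();
        for (String[] fields : loadLines(source, f -> f)) {
            result.put(fields[0], fields[1]);
        }
        return result;
    }
}
```

Each @Component then becomes a thin wrapper that opens its own file and delegates to the loader; the try-with-resources block also fixes the reader leak that the original catch path had.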

Related

Should I implement CSV reader methods as static for better Testing in Java?

I am trying to implement a Helper class using Apache Commons CSV and I am thinking of defining the related methods as static, like this:
public class CsvHelper {
    private enum Headers {
        ID,
        NAME,
        EMAIL,
        COUNTRY
    }

    public static List<EmployeeRequest> csvToEmployees(InputStream is) throws IllegalAccessException {
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(is, StandardCharsets.UTF_8));
             CSVParser parser = new CSVParser(reader,
                 CSVFormat.DEFAULT.withFirstRecordAsHeader().withIgnoreHeaderCase().withTrim())) {
            List<EmployeeRequest> employees = new ArrayList<>();
            Iterable<CSVRecord> records = parser.getRecords();
            for (CSVRecord rec : records) {
                EmployeeRequest employee = new EmployeeRequest(
                    Long.parseLong(rec.get(Headers.ID)),
                    rec.get(Headers.NAME),
                    rec.get(Headers.EMAIL),
                    rec.get(Headers.COUNTRY)
                );
                employees.add(employee);
            }
            return employees;
        } catch (IOException e) {
            throw new IllegalAccessException("Failed: " + e.getMessage());
        }
    }
}
However, I am not sure whether I should define the csvToEmployees method without static in a service class for proper testing. So:
1. Should I define the csvToEmployees method in this Helper class as static? Or should I define it in my service like the other service methods (without static)?
2. Can I write unit and integration tests for reading data from a CSV file? Or should I write only an integration test for that?
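One point worth illustrating: a static method that takes an InputStream is straightforward to unit-test, because the test can supply an in-memory stream instead of a real file. A plain-Java sketch of the idea; the parser here uses simple splitting rather than a CSV library, and all names are illustrative:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

// Simplified stand-in for the CsvHelper: a static parser that depends only
// on the InputStream it is given, so tests need no file system access.
public final class SimpleCsvParser {

    public static List<String[]> parse(InputStream is) throws IOException {
        List<String[]> rows = new ArrayList<>();
        try (BufferedReader reader =
                 new BufferedReader(new InputStreamReader(is, StandardCharsets.UTF_8))) {
            String line = reader.readLine(); // skip the header row
            while ((line = reader.readLine()) != null) {
                rows.add(line.split(","));
            }
        }
        return rows;
    }
}
```

A unit test then builds its input with `new ByteArrayInputStream(csvText.getBytes(StandardCharsets.UTF_8))`; only a test that reads an actual file from disk needs to be an integration test.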

How to test a class which reads from a file

I have a couple of classes, defined by myself, which are in charge of reading different shapes from different types of file (JSON, CSV, XML), and possibly later from a web service.
I have come up with a common interface called ShapeReader
public interface ShapeReader {
    List<Shape> readShapes() throws Exception;
}
I have a couple of classes implementing this interface.
FlatFileReader
public class FlatFileReader implements ShapeReader {
    private static final String filename = "shapes.txt";

    @Override
    public List<Shape> readShapes() throws Exception {
        List<Shape> shapes = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(new ClassPathResource(filename).getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                // create Shape objects from the line here
                shapes.add(shape);
            }
            log.info("Items read from flat file: " + shapes.size());
        }
        catch (Exception e) {
            // error handling
        }
        return shapes;
    }
}
XmlReader
public class XmlReader implements ShapeReader {
    private static final String filename = "shapes.xml";

    @Override
    public List<Shape> readShapes() throws Exception {
        List<Shape> shapes = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(new ClassPathResource(filename).getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                // create Shape objects from the XML here
                shapes.add(shape);
            }
            log.info("Items read from xml file: " + shapes.size());
        }
        catch (Exception e) {
            // error handling
        }
        return shapes;
    }
}
I am not sure how I can even test this. I was thinking of reading from a mock test.txt file, but the filename is hardcoded as a static constant. What is the best way to solve this issue, which I assume stems from poor design?
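One common fix for the hard-coded filename is to inject the data source through the constructor, so production code passes a classpath resource while tests pass an in-memory reader. A sketch under that assumption; the class name and the simplified return type are illustrative:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.util.ArrayList;
import java.util.List;

// Instead of hard-coding "shapes.txt", the reader receives its input source,
// which makes the parsing logic testable without touching the file system.
public class FlatFileShapeReader {

    private final Reader source;

    public FlatFileShapeReader(Reader source) {
        this.source = source;
    }

    public List<String> readShapes() throws IOException {
        List<String> shapes = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(source)) {
            String line;
            while ((line = reader.readLine()) != null) {
                shapes.add(line); // real code would build Shape objects here
            }
        }
        return shapes;
    }
}
```

Production wiring can still do `new FlatFileShapeReader(new InputStreamReader(new ClassPathResource("shapes.txt").getInputStream()))`, while a unit test simply passes `new StringReader("circle\nsquare")`.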

Java concatenation in properties file

I have created a properties file called myproperties.properties as:
test.value1=one
test.value2=two
My java code to read this file is the following:
String test = Utility.getInstance().getProperty("test.value1");
where class Utility is so defined:
public class Utility {
    private static Utility _instance = null;
    private static Properties properties = new Properties();

    static public Utility getInstance() {
        if (_instance == null) {
            _instance = new Utility();
        }
        return _instance;
    }

    private Utility() {
        loadUtility();
    }

    public String getProperty(String tgtPropertyName) {
        Object prop = properties.get(tgtPropertyName);
        if (prop != null) {
            return prop.toString();
        } else {
            return null;
        }
    }

    private void loadUtility() {
        String filename = null;
        try {
            filename = getClass().getClassLoader().getResource("myproperties").getFile();
            InputStream file = new FileInputStream(new File(filename));
            properties.load(file);
            Iterator iter = properties.keySet().iterator();
            while (iter.hasNext()) {
                iter.next();
                System.out.println("FILE LOADED");
            }
        } catch (Exception e) {
        }
    }
}
This code works correctly. Now I must add a concatenation in my properties file:
test.value3=${test.value1}${test.value2}
and this does not work because my Java code cannot interpret ${}.
The exception is:
Caused by: java.lang.IllegalStateException: Stream handler unavailable due to: For input string: "${test.value1}"
Why?
Use the code below to concatenate test.value3 in the properties file:
Properties prop = null;

public FileReader FileLoader() throws FileNotFoundException {
    File file = new File("myproperties.properties");
    FileReader fileReader = new FileReader(file);
    return fileReader;
}

public String propertyLoader(String key) throws IOException {
    FileReader fileReader = FileLoader();
    prop = new Properties();
    prop.load(fileReader);
    String value = prop.getProperty(key);
    return value;
}

public void resultWriter() throws IOException {
    String value1 = propertyLoader("test.value1");
    String value2 = propertyLoader("test.value2");
    String res = value1 + value2;
    System.out.println(res);
    FileWriter fw = new FileWriter("myproperties.properties");
    prop = new Properties();
    prop.setProperty("test.value3", res);
    prop.store(fw, null);
}

public static void main(String[] args) throws IOException {
    UtilityNew util = new UtilityNew();
    util.resultWriter();
    System.out.println("Success");
}
Nested properties are not supported in core Java. The only thing you can do is create a class that resolves the ${XXX} values once you have loaded the file.properties into a Properties object.
Alternatively, the Typesafe Config library might be useful to you: https://github.com/lightbend/config. It has a lot of functionality, and one of its features is substitutions:
substitutions ("foo" : ${bar}, "foo" : Hello ${who})
But you won't have a key-value properties file anymore; it will look more like a JSON file.
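For illustration, a Typesafe Config (HOCON) file mirroring the keys from the question might look like this; the substitutions are resolved by the library at load time (the key names are just the ones from the question, and the file name is the library's default):

```
# application.conf — Typesafe Config resolves ${...} references when the file is loaded
test.value1 = one
test.value2 = two
test.value3 = ${test.value1}${test.value2}   # resolves to "onetwo"
```

Reading it back would then be something like `ConfigFactory.load().getString("test.value3")`.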
This might be a late answer but someone might find this useful.
You can write a small utility function which reads the property values and then iteratively replaces any nested values that are present
First search for your pattern. Replace it with the actual value by looking-up at the properties. Repeat this until you get the final string.
Properties properties = new Properties();
properties.setProperty("base_url", "http://base");
properties.setProperty("subs_url", "${base_url}/subs");
properties.setProperty("app_download", "apps/download");
properties.setProperty("subs_detail", "${subs_url}/detail/${app_download}");

String input = properties.getProperty("subs_detail");
Pattern pattern = Pattern.compile("\\$\\{.*?\\}"); // change the pattern here to find nested values
while (pattern.matcher(input).find()) {
    Matcher match = pattern.matcher(input);
    while (match.find()) {
        input = input.replace(match.group(),
            properties.getProperty(match.group().substring(2, match.group().length() - 1)));
    }
}
System.out.println("final String : " + input); // this prints http://base/subs/detail/apps/download

How to create fixed format file using FixedFormat4j Java Library?

I am able to load files in Fixed Format but unable to write a fixed format file using FixedFormat4j.
Any idea how to do that?
public class MainFormat {
    public static void main(String[] args) {
        new MainFormat().start();
    }

    private FixedFormatManager manager;

    public void start() {
        System.out.println("here");
        manager = new FixedFormatManagerImpl();
        try {
            BufferedReader br = new BufferedReader(new FileReader("d:/myrecords.txt"));
            System.out.println("here1");
            String text;
            MyRecord mr = null;
            while ((text = br.readLine()) != null) {
                mr = manager.load(MyRecord.class, text);
                System.out.println("" + mr.getAddress() + " - " + mr.getName());
            }
            mr.setName("Rora");
            manager.export(mr);
        } catch (IOException | FixedFormatException ex) {
            System.out.println("" + ex);
        }
    }
}
I have seen the export method but don't understand how to use it; nothing happens in the above code.
The export method returns a string representing the marshalled record.
In your code you would need to write out the result of the export command to a FileWriter. So:
before the while loop:
FileWriter fw = new FileWriter("d:/myrecords_modified.txt", true);
BufferedWriter bw = new BufferedWriter(fw);
after the while loop:
mr.setName("Rora");
String modifiedRecord = manager.export(mr);
bw.write(modifiedRecord);

Synchronization Issue while using Apache Storm

I am trying Apache Storm for processing streams of geohash codes. I am using this library and Apache Storm 0.9.3.
Currently, I am facing a synchronization issue in the execute method of one bolt class. I have tried using a single bolt, which gives me the correct output, but the moment I go from one bolt thread to two or more, the output gets messed up.
The code snippet for one of the bolts (only this one is having issues) is:
public static int PRECISION = 6;
private OutputCollector collector;
BufferedReader br;
String lastGeoHash = "NONE";
HashMap<String, Integer> map;
HashMap<String, String[]> zcd;
TreeMap<Integer, String> counts = new TreeMap<Integer, String>();

public void prepare(Map conf, TopologyContext context, OutputCollector collector) {
    String line = "";
    this.collector = collector;
    map = new HashMap<String, Integer>();
    zcd = new HashMap<String, String[]>();
    try {
        br = new BufferedReader(new FileReader("/tmp/zip_code_database.csv"));
        int i = 0;
        while ((line = br.readLine()) != null) {
            if (i == 0) {
                String columns[] = line.split(",");
                for (int j = 0; j < columns.length; j++) {
                    map.put(columns[j], j);
                }
            } else {
                String[] split = line.split(",");
                zcd.put(split[map.get("\"zip\"")], new String[]{split[map.get("\"state\"")], split[map.get("\"primary_city\"")]});
            }
            i++;
        }
        br.close();
        // System.out.println(zcd);
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
    System.out.println("Initialize");
    initializeTreeMapAsPerOurRequirement(counts);
}
public void execute(Tuple tuple) {
    String completeFile = tuple.getStringByField("string"); // this data is generated by the spout and contains the complete shape file, where each line is separated by a newline character, i.e. "\n"
    String lines[] = completeFile.split("\t");
    String geohash = lines[0];
    int count = Integer.parseInt(lines[1]);
    String zip = lines[2];
    String best = "";
    String city = "";
    String state = "";
    if (!(geohash.equals(lastGeoHash)) && !(lastGeoHash.equals("NONE"))) {
        //if(counts.size()!=0){
        //System.out.println(counts.firstKey());
        best = counts.get(counts.lastKey());
        //System.out.println(geohash);
        if (zcd.containsKey("\"" + best + "\"")) {
            city = zcd.get("\"" + best + "\"")[0];
            state = zcd.get("\"" + best + "\"")[1];
            System.out.println(lastGeoHash + "," + best + "," + state + "," + city + "," + "US");
        } else if (!best.equals("NONE")) {
            System.out.println(lastGeoHash);
            city = "MISSING";
            state = "MISSING";
        }
        // initializeTreeMapAsPerOurRequirement(counts);
        //}else{
        //System.out.println("else"+geohash);
        //}
        //}
    }
    lastGeoHash = geohash;
    counts.put(count, zip);
    collector.ack(tuple);
}

private void initializeTreeMapAsPerOurRequirement(TreeMap<Integer, String> counts) {
    counts.clear();
    counts.put(-1, "NONE");
}

public void declareOutputFields(OutputFieldsDeclarer declarer) {
    System.out.println("here");
    declarer.declare(new Fields("number"));
}
Topology code is:
public static void main(String[] args) {
    TopologyBuilder builder = new TopologyBuilder();
    builder.setSpout("spout", new SendWholeFileDataSpout(), 2);
    builder.setBolt("map", new GeoHashBolt(), 2).shuffleGrouping("spout");
    builder.setBolt("reduce", new GeoHashReduceBolt(), 2).fieldsGrouping("map", new Fields("value"));
    Config conf = new Config();
    LocalCluster cluster = new LocalCluster();
    cluster.submitTopology("test", conf, builder.createTopology());
    Utils.sleep(10000);
    cluster.killTopology("test");
    cluster.shutdown();
}
Can someone look into the code and guide me a bit?
You have set the parallelism_hint to 2 for your spout and both of your bolts. This means 2 executors will run per component, which may mess up your output, because your bolt keeps mutable per-instance state (lastGeoHash, counts) and the tuples get split across the two instances.
By setting the parallelism_hint to 1 you may achieve your desired output.
