I have some theoretical knowledge of Design Patterns, and now I am having trouble putting that knowledge (and related ideas) into practice.
I have the following 2 methods in my ServiceImpl class:
@Override
public MultipartFile exportA() throws IOException {
// repeated lines-I same as exportB method (code omitted for brevity)
// other lines special to exportA method
// repeated lines-II same as exportB method (code omitted for brevity)
}
@Override
public MultipartFile exportB() throws IOException {
// repeated lines-I same as exportA method (code omitted for brevity)
// other lines special to exportB method
// repeated lines-II same as exportA method (code omitted for brevity)
}
As shown, there are repeated parts in both of these methods. So, should I create two methods for repeated lines I and II, and then move those code blocks into the newly created methods? Or is there a better way, using a design pattern?
If I have understood your problem correctly, it sounds like you are looking for the Builder pattern. Your methods are building a MultipartFile, where the build process depends on parameters (here, I guess, the sheet file path) and on distinct code (what you referred to as "other lines special to this method").
For that I would create a MultipartFileBuilder class that does the work and call it in each method, setting the appropriate parameters and "code" each time. The "code" is simply an implementation of the java.util.function.Consumer<T> functional interface, used below at (*2); the other parameters use simple setters (at (*1)).
Note that I passed the Consumer<T> as a lambda expression (the c -> ... construct). Note also that the type parameter <T> of Consumer<T> is a new class I introduced, MultipartFileBuildContext, which allows several pieces of information to be passed to the code you write in each method. I guess the 'sheet' variable would be a starting point; you can add other information if you need to.
To sum things up, this is how the code would look:
@Override
public MultipartFile exportMethodA() throws IOException {
return new MultipartFileBuilder()
.setSheetPath("sheetA/path") (*1)
.setAction(c->{
// (*2) do your stuff here for method A:
// the "other lines special to this method"
XSSFSheet sheet = c.getSheet();
...
}).build();
}
@Override
public MultipartFile exportMethodB() throws IOException {
return new MultipartFileBuilder()
.setSheetPath("sheetB/path") (*1)
.setAction(c->{
// (*2) do your stuff here for method B:
// the "other lines special to this method"
XSSFSheet sheet = c.getSheet();
...
}).build();
}
class MultipartFileBuildContext {
    private XSSFSheet sheet;
    public MultipartFileBuildContext(XSSFSheet sheet) { this.sheet = sheet; }
    public XSSFSheet getSheet() {
        return sheet;
    }
}
class MultipartFileBuilder {
String sheetPath;
Consumer<MultipartFileBuildContext> action;
public String getSheetPath() {
return sheetPath;
}
public MultipartFileBuilder setSheetPath(String sheetPath) {
this.sheetPath = sheetPath;
return this;
}
public Consumer<MultipartFileBuildContext> getAction() {
return action;
}
public MultipartFileBuilder setAction(Consumer<MultipartFileBuildContext> action) {
this.action = action;
return this;
}
public MockMultipartFile build() throws IOException {
// repeated lines-I
XSSFWorkbook workbook = new XSSFWorkbook();
XSSFSheet sheet = workbook.createSheet(sheetPath);
//
if(action!=null){
MultipartFileBuildContext c=new MultipartFileBuildContext(sheet);
action.accept(c);
}
// repeated lines-II ====================================================
File outputFile = File.createTempFile(...);
try (FileOutputStream outputStream = new FileOutputStream(outputFile)) {
workbook.write(outputStream);
} catch (IOException e) {
LoggingUtils.error("Writing failed ", e);
}
final FileInputStream input = new FileInputStream(outputFile);
final String fileName = TextBundleUtil.read(...);
return new MockMultipartFile(fileName,
fileName, CONTENT_TYPE, IOUtils.toByteArray(input));
//
}
}
In the end, this pattern needs to be used with care: factor out all that you can so the builder is really useful, but not so much that it becomes a catch-all for everything. For instance, you could accept either the sheet path or an InputStream as input to make it more useful/generic.
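For example, the builder could be extended with an optional InputStream so it can start from an existing workbook instead of always creating an empty one. This is only a sketch; the field and method names are assumptions and not part of the original answer:
private InputStream workbookSource; // optional source for an existing workbook

public MultipartFileBuilder setWorkbookSource(InputStream workbookSource) {
    this.workbookSource = workbookSource;
    return this;
}

// and in build(), instead of always creating an empty workbook:
XSSFWorkbook workbook = (workbookSource != null)
        ? new XSSFWorkbook(workbookSource) // reuse the supplied workbook
        : new XSSFWorkbook();              // or start from scratch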
I have the following problem to be solved, could you help me?
I have two methods in a class.
The first generates a document (calling another class) and stores it in a string.
In the second one, I want to save this document number for use in other methods and in other classes, so that the document is the same one generated initially. That is, it must not generate a different document!
I have not been able to achieve this.
First, the methods in one class (it generates the document by calling a method of another class):
public class oneClass {
private String cpf;
private String document() {
if (this.cpf == null) {
this.cpf = incluiDocumento.cpf(false);
} else {
}
return this.cpf;
}
public void one() {
System.out.println(document());
System.out.println(document());
System.out.println(document());
}
public void two() {
System.out.println(document());
}
}
Second class:
@Test
public void testDocuments() {
new oneClass().one();
new oneClass().two();
}
Conclusion:
I can generate my document and store it in a string. However, in the following methods and classes, I can never use the first document generated; new documents are always generated.
How can I generate a document and store it for use in tests and validate it?
Tool: Selenium Webdriver, Java.
Thanks in advance!!!
In this case you might use this approach:
public class OneClass{
private String cpf;
//...
public String document() {
if(this.cpf==null){
this.cpf = incluiDocumento.cpf(false);
}
return this.cpf;
}
//... method one() and two()
}
The document is created only once and saved in an instance field; any call after that returns the saved document, so the second method will always get the first document generated.
Note that in your test you created two separate oneClass instances, and each instance generates and caches its own document, so you also need to reuse the same instance, as shown in the edit below.
Edit:
And test it like in the following:
@Test
public void testDocuments() {
OneClass oneClass = new OneClass();
oneClass.one();
oneClass.two();
}
I changed the name of your class from oneClass to OneClass because, in Java, class names start with a capital letter.
I'm trying to remove the method body of test() in the following program so that nothing is printed to the console. I'm using ASM 5.2, but everything I've tried doesn't seem to have any effect.
Can someone explain what I'm doing wrong and also point me to some up-to-date tutorials or documentation on ASM? Almost everything I've found on Stack Overflow and the ASM website seems outdated and/or unhelpful.
public class BytecodeMods {
public static void main(String[] args) throws Exception {
disableMethod(BytecodeMods.class.getMethod("test"));
test();
}
public static void test() {
System.out.println("This is a test");
}
private static void disableMethod(Method method) {
new MethodReplacer()
.visitMethod(Opcodes.ACC_PUBLIC | Opcodes.ACC_STATIC, method.getName(), Type.getMethodDescriptor(method), null, null);
}
public static class MethodReplacer extends ClassVisitor {
public MethodReplacer() {
super(Opcodes.ASM5);
}
@Override
public MethodVisitor visitMethod(int access, String name, String desc, String signature, String[] exceptions) {
return null;
}
}
}
You are not supposed to invoke the methods of a visitor directly.
The correct way to use a ClassVisitor is to create a ClassReader with the class file bytes of the class you're interested in and pass the class visitor to its accept method. Then, all the visit methods will be called by the class reader according to the artifacts found in the class file.
In this regard, you should not consider the documentation outdated, just because it refers to an older version number. E.g. this document describes that process correctly and it speaks for the library that no fundamental change was necessary between the versions 2 and 5.
Still, visiting a class does not change it. It helps you analyze the class and perform actions when a certain artifact is encountered. Note that returning null is not an actual action.
If you want to create a modified class, you need a ClassWriter to produce it. A ClassWriter implements ClassVisitor, and class visitors can be chained, so you can easily create a custom visitor delegating to a writer; it will produce a class file identical to the original one, unless you override a method to intercept the recreation of a feature.
But note that returning null from visitMethod does more than removing the code, it will remove the method entirely. Instead, you have to return a special visitor for the specific method which will reproduce the method but ignore the old code and create a sole return instruction (you are allowed to omit the last return statement in source code, but not the return instruction in the byte code).
private static byte[] disableMethod(Method method) {
Class<?> theClass = method.getDeclaringClass();
ClassReader cr;
try { // use resource lookup to get the class bytes
cr = new ClassReader(
theClass.getResourceAsStream(theClass.getSimpleName()+".class"));
} catch(IOException ex) {
throw new IllegalStateException(ex);
}
// passing the ClassReader to the writer allows internal optimizations
ClassWriter cw = new ClassWriter(cr, 0);
cr.accept(new MethodReplacer(
cw, method.getName(), Type.getMethodDescriptor(method)), 0);
byte[] newCode = cw.toByteArray();
return newCode;
}
static class MethodReplacer extends ClassVisitor {
private final String hotMethodName, hotMethodDesc;
MethodReplacer(ClassWriter cw, String name, String methodDescriptor) {
super(Opcodes.ASM5, cw);
hotMethodName = name;
hotMethodDesc = methodDescriptor;
}
// invoked for every method
@Override
public MethodVisitor visitMethod(
int access, String name, String desc, String signature, String[] exceptions) {
if(!name.equals(hotMethodName) || !desc.equals(hotMethodDesc))
// reproduce the methods we're not interested in, unchanged
return super.visitMethod(access, name, desc, signature, exceptions);
// alter the behavior for the specific method
return new ReplaceWithEmptyBody(
super.visitMethod(access, name, desc, signature, exceptions),
(Type.getArgumentsAndReturnSizes(desc)>>2)-1);
}
}
static class ReplaceWithEmptyBody extends MethodVisitor {
private final MethodVisitor targetWriter;
private final int newMaxLocals;
ReplaceWithEmptyBody(MethodVisitor writer, int newMaxL) {
// now, we're not passing the writer to the superclass for our radical changes
super(Opcodes.ASM5);
targetWriter = writer;
newMaxLocals = newMaxL;
}
// we only override the minimum needed to create a code attribute with a sole RETURN
@Override
public void visitMaxs(int maxStack, int maxLocals) {
targetWriter.visitMaxs(0, newMaxLocals);
}
@Override
public void visitCode() {
targetWriter.visitCode();
targetWriter.visitInsn(Opcodes.RETURN);// our new code
}
@Override
public void visitEnd() {
targetWriter.visitEnd();
}
// the remaining methods just reproduce meta information,
// annotations & parameter names
@Override
public AnnotationVisitor visitAnnotation(String desc, boolean visible) {
return targetWriter.visitAnnotation(desc, visible);
}
@Override
public void visitParameter(String name, int access) {
targetWriter.visitParameter(name, access);
}
}
The custom MethodVisitor does not get chained to the method visitor returned by the class writer. Configured this way, it will not replicate the code automatically. Instead, performing no action will be the default and only our explicit invocations on the targetWriter will produce code.
At the end of the process, you have a byte[] array containing the changed code in the class file format. So the question is, what to do with it.
The easiest, most portable thing you can do is to create a new ClassLoader which creates a new Class from these bytes. It has the same name (as we didn't change the name) but is distinct from the already loaded class, because it has a different defining class loader. We can access such a dynamically generated class only through Reflection:
public class BytecodeMods {
public static void main(String[] args) throws Exception {
byte[] code = disableMethod(BytecodeMods.class.getMethod("test"));
new ClassLoader() {
Class<?> get() { return defineClass(null, code, 0, code.length); }
} .get()
.getMethod("test").invoke(null);
}
public static void test() {
System.out.println("This is a test");
}
…
In order to make this example do something more notable than doing nothing, you could alter the message instead,
using the following MethodVisitor
static class ReplaceStringConstant extends MethodVisitor {
private final String matchString, replaceWith;
ReplaceStringConstant(MethodVisitor writer, String match, String replacement) {
// now passing the writer to the superclass, as most code stays unchanged
super(Opcodes.ASM5, writer);
matchString = match;
replaceWith = replacement;
}
#Override
public void visitLdcInsn(Object cst) {
super.visitLdcInsn(matchString.equals(cst)? replaceWith: cst);
}
}
by changing
return new ReplaceWithEmptyBody(
super.visitMethod(access, name, desc, signature, exceptions),
(Type.getArgumentsAndReturnSizes(desc)>>2)-1);
to
return new ReplaceStringConstant(
super.visitMethod(access, name, desc, signature, exceptions),
"This is a test", "This is a replacement");
If you want to change the code of an already loaded class or intercept it right before being loaded into the JVM, you have to use the Instrumentation API.
The byte code transformation itself doesn’t change, you’ll have to pass the source bytes into the ClassReader and get the modified bytes back from the ClassWriter. Methods like ClassFileTransformer.transform(…) will already receive the bytes representing the current form of the class (there might have been previous transformations) and return the new bytes.
The problem is, this API isn’t generally available to Java applications. It’s available for so-called Java Agents, which must have been either, started together with the JVM via startup options or get loaded dynamically in an implementation-specific way, e.g. via the Attach API.
The package documentation describes the general structure of Java Agents and the related command line options.
At the end of this answer is a program demonstrating how to use the Attach API to attach to your own JVM to load a dummy Java Agent that will give the program access to the Instrumentation API. Considering the complexity, I think, it became apparent, that the actual code transformation and turning the code into a runtime class or using it to replace a class on the fly, are two different tasks that have to collaborate, but whose code you usually want to keep separated.
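To make the shape of that collaboration concrete, here is a minimal sketch of a load-time agent that plugs the ASM transformation from above into a ClassFileTransformer. It is not from the original answer: the agent class name is made up, it assumes the MethodReplacer shown earlier, and it would still need a MANIFEST with a Premain-Class entry (or dynamic attach) to actually run.
import java.lang.instrument.ClassFileTransformer;
import java.lang.instrument.Instrumentation;
import java.security.ProtectionDomain;
import org.objectweb.asm.ClassReader;
import org.objectweb.asm.ClassWriter;

public class DisableTestAgent {
    public static void premain(String agentArgs, Instrumentation inst) {
        inst.addTransformer(new ClassFileTransformer() {
            @Override
            public byte[] transform(ClassLoader loader, String className,
                    Class<?> classBeingRedefined, ProtectionDomain protectionDomain,
                    byte[] classfileBuffer) {
                // className is in internal form, e.g. "BytecodeMods" for the default package
                if (!"BytecodeMods".equals(className)) {
                    return null; // null means "leave this class unchanged"
                }
                ClassReader cr = new ClassReader(classfileBuffer);
                ClassWriter cw = new ClassWriter(cr, 0);
                // reuse the MethodReplacer from above; "()V" is the descriptor of test()
                cr.accept(new MethodReplacer(cw, "test", "()V"), 0);
                return cw.toByteArray();
            }
        });
    }
}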
The easier way is to create a MethodNode instance and replace its body with a new InsnList. First, you need the original class representation; you can get it just as @Holger suggested.
Class<?> originalClass = method.getDeclaringClass();
ClassReader classReader;
try {
classReader = new ClassReader(
originalClass.getResourceAsStream(originalClass.getSimpleName()+".class"));
} catch(IOException e) {
throw new IllegalStateException(e);
}
Then create a ClassNode and replace the method body.
//Create the ClassNode
ClassNode classNode = new ClassNode();
classReader.accept(classNode,0);
//Search for the wanted method
final List<MethodNode> methods = classNode.methods;
for(MethodNode methodNode: methods){
if(methodNode.name.equals("test")){
//Replace the body with a RETURN opcode
InsnList insnList = new InsnList();
insnList.add(new InsnNode(Opcodes.RETURN));
methodNode.instructions = insnList;
}
}
Before generating the new class, you will need a ClassLoader with a public defineClass() method. Just like this.
public class GenericClassLoader extends ClassLoader {
public Class<?> defineClass(String name, byte[] b) {
return defineClass(name, b, 0, b.length);
}
}
Now you can generate the actual class.
//Generate the Class
ClassWriter classWriter = new ClassWriter(ClassWriter.COMPUTE_FRAMES | ClassWriter.COMPUTE_MAXS);
classNode.accept(classWriter);
//Define the representation
GenericClassLoader classLoader = new GenericClassLoader();
// defineClass expects the binary class name, while classNode.name uses '/' as separator
Class<?> modifiedClass = classLoader.defineClass(
        classNode.name.replace('/', '.'), classWriter.toByteArray());
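As in the previous answer, you could then exercise the modified class through reflection; a short usage sketch:
// the rewritten test() now has an empty body, so this call prints nothing
modifiedClass.getMethod("test").invoke(null);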
This is a method in one of my service classes. It's public, so it should be tested. I simply do not know WHAT I should test. I'd mock the Writer and spy on the method call, but with this implementation that's impossible (isn't it?).
I'm using Mockito and JUnit
For now, I can only make the method throw and assert that exception.
Any help?
@Override
public void initIndexFile(File emptyIndexFile) {
try {
Writer writer = new FileWriter(emptyIndexFile);
writer.write("[]");
writer.close();
} catch (IOException e) {
throw new IndexFileInitializationException(
"Error initialization index file " + emptyIndexFile.getPath()
);
}
}
If you feel that adding the special content is the business logic and therefore the responsibility of your class, then creating the FileWriter is not (according to the single responsibility principle).
So you should use a FileWriterFactory that is injected into your class under test. Then you can mock that factory to return a mock implementation of the Writer interface, on which in turn you can check that it received the expected String.
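Such a factory can stay very small; a possible sketch (the interface name and shape are assumptions, matching the constructor below):
public interface WriterFactory {
    Writer create(File target) throws IOException;
}

// production implementation that the real application would inject
public class FileWriterFactory implements WriterFactory {
    @Override
    public Writer create(File target) throws IOException {
        return new FileWriter(target);
    }
}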
Your CuT would change to this:
private final WriterFactory writerFactory;
@Inject
public ClassUnderTest(WriterFactory writerFactory){
this.writerFactory = writerFactory;
}
@Override
public void initIndexFile(File emptyIndexFile) {
try {
Writer writer = writerFactory.create(emptyIndexFile);
writer.write("[]");
writer.close();
} catch (IOException e) {
throw new IndexFileInitializationException(
"Error initialization index file " + emptyIndexFile.getPath()
);
}
}
and your test to this:
class Test{
@Rule public MockitoRule mockitoRule = MockitoJUnit.rule();
@Mock
private FileWriterFactory fileWriterFactory;
private Writer fileWriter = spy(new StringWriter());
File anyValidFile = new File(".");
@Test
public void initIndexFile_validFile_addsEmptyBraces(){
//arrange
doReturn(fileWriter).when(fileWriterFactory).create(any(File.class));
// act
new ClassUnderTest(fileWriterFactory).initIndexFile(anyValidFile);
//assert
verify(fileWriterFactory).create(anyValidFile);
assertEquals("text written to File", "[]", fileWriter.toString());
verify(fileWriter).close();
}
}
in addition you could easily check that your CuT intercepts the IOException:
@Rule
public ExpectedException exception = ExpectedException.none();
@Test
public void initIndexFile_missingFile_IndexFileInitializationException(){
//arrange
doThrow(new IOException("UnitTest")).when(fileWriterFactory).create(any(File.class));
//assert
exception.expect(IndexFileInitializationException.class);
exception.expectMessage("Error initialization index file "+anyValidFile.getPath());
// act
new ClassUnderTest(fileWriterFactory).initIndexFile(anyValidFile);
}
Nice! a factory just to test 3 lines of code! – Nicolas Filotto
This is a good point.
The question is: will there ever be any other method within that class that interacts with the File object directly and needs to create the FileWriter afterwards?
If the answer is "no" (as is most likely), then following the KISS principle you should inject a Writer object directly instead of the factory and have your method without the File parameter.
private final Writer writer;
@Inject
public ClassUnderTest(Writer writer){
this.writer = writer;
}
@Override
public void initIndexFile() {
try {
writer.write("[]");
writer.close();
} catch (IOException e) {
throw new IndexFileInitializationException(
"Error initialization index file " + emptyIndexFile.getPath()
);
}
}
modified test:
class Test{
@Rule public MockitoRule mockitoRule = MockitoJUnit.rule();
@Rule public ExpectedException exception = ExpectedException.none();
@Mock
private Writer failingFileWriter;
private Writer validFileWriter = spy(new StringWriter());
File anyValidFile = new File(".");
@Test
public void initIndexFile_validFile_addsEmptyBraces(){
//arrange
// act
new ClassUnderTest(validFileWriter).initIndexFile();
//assert
assertEquals("text written to File", "[]", validFileWriter.toString());
verify(validFileWriter).close();
}
@Test
public void initIndexFile_missingFile_IndexFileInitializationException(){
//arrange
doThrow(new IOException("UnitTest")).when(failingFileWriter).write(anyString());
//assert
exception.expect(IndexFileInitializationException.class);
exception.expectMessage("Error initialization index file "+anyValidFile.getPath());
// act
new ClassUnderTest(fileWriterFactory).initIndexFile(anyValidFile);
}
}
To test that your method interacts with a writer correctly, by sending the correct commands, your program has to expose some sort of "seam" so that your test can configure a mock FileWriter. I'm not familiar with Mockito, but one way would be to encapsulate the FileWriter instantiation behind a method; your test could then override that method to return a mock Writer.
For example, expose the writer creation like this:
public Writer getFileWriter(File emptyIndexFile) throws IOException {
return new FileWriter(emptyIndexFile);
}
This could allow you to override the above method for a test and return a fake Writer
@Override
public Writer getFileWriter(File emptyIndexFile) {
return mockFileWriterInstance;
}
Then your test could exercise initIndexFile and make assertions on the operations. With a mock file writer it is trivial to throw an IOException so that you can exercise the error-handling logic.
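A sketch of such a test (it assumes the class under test is called ClassUnderTest, that initIndexFile obtains its writer via the getFileWriter seam above, and it uses a StringWriter instead of a Mockito mock to stay self-contained):
@Test
public void initIndexFile_writesEmptyJsonArray() {
    final StringWriter captured = new StringWriter();
    ClassUnderTest cut = new ClassUnderTest() {
        @Override
        public Writer getFileWriter(File emptyIndexFile) {
            return captured; // fake writer instead of a real FileWriter
        }
    };
    cut.initIndexFile(new File("ignored"));
    assertEquals("[]", captured.toString());
}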
You could simply provide a temporary file to your method in your test, check that it contains [] as expected, and delete the file once the test is over.
Something like:
public class FileWritingTest {
// File to provide to the method initIndexFile
private File file;
/* This is executed before the test */
@Before
public void init() throws IOException {
// Create a temporary file
this.file = File.createTempFile("FileWritingTest", "tmp");
// Indicates that it should be removed on exit
file.deleteOnExit();
}
/* This is executed after the test */
@After
public void clean() throws IOException {
// Delete the file once test over
file.delete();
}
@Test
public void testInitIndexFile() throws IOException {
FileWriting fw = new FileWriting();
// Call the method
fw.initIndexFile(this.file);
// Check that the content is [] as expected
Assert.assertEquals("[]", new String(Files.readAllBytes(file.toPath())));
}
}
NB 1: I rely on new String(byte[]), which means I rely on the default character encoding, as you do in your current code. This is not good practice; we should set the character encoding explicitly to avoid platform-dependent behavior.
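For instance, the assertion could name the charset explicitly (UTF-8 is an assumption here; since "[]" is plain ASCII, any common charset yields the same bytes):
Assert.assertEquals("[]",
        new String(Files.readAllBytes(file.toPath()), StandardCharsets.UTF_8));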
NB 2: Assuming that you use java 7 or higher, you should consider using the try-with-resources statement to properly close your writer, your code would then be:
public void initIndexFile(File emptyIndexFile) {
try (Writer writer = new FileWriter(emptyIndexFile)) {
writer.write("[]");
} catch (IOException e) {
throw new IndexFileInitializationException(
"Error initialization index file " + emptyIndexFile.getPath()
);
}
}
Mocking a dependency is possible and natural, but mocking an object declared in the body of a method is unnatural and tricky.
I imagine 3 solutions:
1) Why not, instead of mocking, simply assert that the file was written with the expected characters?
It avoids tricks, but it may be redundant and slow if you perform this task very often and want to unit test each case.
2) Make the local variable an instance field so you can mock it. This really does not seem like a clean solution. If multiple methods in the same class do this kind of processing, you risk reusing the same writer or ending up with multiple writer fields; in both cases you could have side effects.
3) If you perform many write operations and you really want to isolate the calls to the writer, there is a solution: redesign your code to get a testable class.
You could extract the writer processing into a dependency. That class could provide a method taking the required parameters to perform the instructions; we could call it WriteService.
public class WriteService {
...
public void writeAndClose(Writer writer, String message){
try {
writer.write(message);
writer.close();
}
catch (IOException e) {
throw new IndexFileInitializationException("Error writing index file content");
}
}
}
This class is testable because the writer dependency is a parameter.
And you call the new service like that :
public class YourAppClass{
private WriteService writeService;
public YourAppClass(WriteService writeService){
this.writeService=writeService;
}
@Override
public void initIndexFile(File emptyIndexFile) {
    try {
        writeService.writeAndClose(new FileWriter(emptyIndexFile), "[]");
    } catch (IOException e) {
        throw new IndexFileInitializationException(
            "Error initialization index file " + emptyIndexFile.getPath());
    }
}
}
Now initIndexFile() is also testable by mocking WriteService.
You can check that writeAndClose() was called on writeService with the right parameters.
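A minimal sketch of such a test (the class names follow the snippets above; the temporary file is just a convenience so the FileWriter can be created):
@Test
public void initIndexFile_delegatesToWriteService() throws IOException {
    WriteService writeService = Mockito.mock(WriteService.class);
    YourAppClass app = new YourAppClass(writeService);
    File indexFile = File.createTempFile("index", ".tmp");
    indexFile.deleteOnExit();

    app.initIndexFile(indexFile);

    // the class under test only creates the writer and delegates the rest
    Mockito.verify(writeService).writeAndClose(Mockito.any(Writer.class), Mockito.eq("[]"));
}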
Personally, I would use the first solution or the third solution.
I defined a job flow in my Spring Batch project and defined an ItemReader, ItemProcessor, ItemWriter, etc.
My ItemReader is shown below:
@Component
@StepScope
public class MyFileReader extends FlatFileItemReader<FileInfo> {
private String fileName;
public MyFileReader () {
}
#Value("#{jobParameters[fileName]}")
public void setFileName(final String fileName) {
this.fileName = fileName;
}
@Override
public void afterPropertiesSet() throws Exception {
Resource resource = new FileSystemResource(fileName);
setResource(resource);
setEncoding("UTF-8");
super.afterPropertiesSet();
}
}
and my file input format is:
111111,11111,111,111
222222,22222,222,222
I want to read all lines of the file and return the lines plus the file address to the ItemProcessor, but FlatFileItemReader reads line by line. How do I do this correctly? Is overriding the doRead method and handling it manually the right approach?
If I'm understanding the question, you want to read in all lines from a file, store that data in an object, and then pass said object to the processor. One approach would be to read all lines from the file before the job starts, using a job listener. As illustrated below, you could read all the lines in, populate a Java object that represents the content of a single row, collect all of those objects (so if there were two rows you'd populate two beans), and then pass them to the processor one at a time (or potentially all at once, if you wish). It would look something like this:
First you would create a listener.
public class MyJobListenerImpl implements JobExecutionListener {
private MyFileReader reader;
@Override
public void beforeJob(JobExecution jobExecution) {
reader.init();
}
@Override
public void afterJob(JobExecution jobExecution) {
// noop
}
// Injected
public void setReader(MyFileReader reader) {
this.reader = reader;
}
}
Next add an init method to your custom reader.
public void init() {
if(Files.exists(inputFileLocation)) {
List<String> inputData = null;
try {
inputData = Files.readAllLines(inputFileLocation, StandardCharsets.UTF_8);
} catch(IOException e) {
System.out.println("issue reading input file {}. Error message: {}", inputFileLocation, e);
throw new IllegalStateException("could not read the input file.");
}
try {
for(String fileItem : inputData) {
YourFileDataBean fileData = new YourFileDataBean();
fileData.setField1(fileItem.split(",")[0].trim());
fileData.setField2(fileItem.split(",")[1].trim());
fileData.setField3(fileItem.split(",")[2].trim());
fileData.setField4(fileItem.split(",")[3].trim());
myDeque.add(fileData); // really up to you how you want to store your beans, but you could add a Deque instance variable and store them there.
}
} catch(ArrayIndexOutOfBoundsException e) {
LOGGER.warn("ArrayIndexOutOfBoundsException due to data in input file.");
throw new IllegalStateException("Failure caused by init() method. Error reading in input file.");
}
} else {
LOGGER.warn("Input file {} does not exist.", inputFileLocation);
throw new IllegalStateException("Input file does not exist at file location " + inputFileLocation);
}
}
Make the read() method in your custom reader return the object populated from the file lines read in. In this example I am implementing ItemReader rather than extending FlatFileItemReader as you have done, but you get the idea. And if you intend to return a single Java object that represents the entire file, there would be no need to store the objects in a Deque or List.
@Override
public YourFileDataBean read() throws NonTransientResourceException {
return myDeque.poll();
}
Hope this helps.
As for returning the file address to the ItemProcessor: you could make it a field in YourFileDataBean and store inputFileLocation there, or save it to the execution context and access it that way. If you inject this file path into your reader, you could do the same in your processor, assuming your reader plays no role in determining the file path (i.e., it is predetermined).
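For instance, the bean could simply carry the path along with the row data; a sketch (the field names are assumptions):
public class YourFileDataBean {
    private String field1, field2, field3, field4;
    private String sourceFilePath; // set this to inputFileLocation in init()

    public String getSourceFilePath() { return sourceFilePath; }
    public void setSourceFilePath(String sourceFilePath) { this.sourceFilePath = sourceFilePath; }
    // field1..field4 getters/setters omitted for brevity
}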
Is it possible to force Properties not to add the date comment in front? I mean something like the first line here:
#Thu May 26 09:43:52 CEST 2011
main=pkg.ClientMain
args=myargs
I would like to get rid of it altogether. I need my config files to be diff-identical unless there is a meaningful change.
I guess not. This timestamp is printed by a private method of Properties, and there is no property to control that behaviour.
The only idea that comes to my mind: subclass Properties, override store, and copy/paste the content of the store0 method so that the date comment is not printed.
Or provide a custom BufferedWriter that prints all but the first line (which will fail if you add real comments, because custom comments are printed before the timestamp...).
Given the source code of Properties: no, it's not possible. BTW, since Properties is in fact a hash table and its keys are therefore not sorted, you can't rely on the properties always being in the same order anyway.
I would use a custom algorithm to store the properties if I had this requirement. Use the source code of Properties as a starting point.
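A minimal sketch of such a custom store routine, writing sorted key=value pairs with no timestamp comment (unlike Properties.store, this sketch does not escape '=', ':' or non-Latin-1 characters):
static void storeSorted(Properties props, Writer out) throws IOException {
    BufferedWriter writer = new BufferedWriter(out);
    for (String key : new TreeSet<>(props.stringPropertyNames())) {
        writer.write(key + "=" + props.getProperty(key));
        writer.newLine();
    }
    writer.flush();
}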
Based on https://stackoverflow.com/a/6184414/242042 here is the implementation I have written that strips out the first line and sorts the keys.
public class CleanProperties extends Properties {
private static class StripFirstLineStream extends FilterOutputStream {
private boolean firstlineseen = false;
public StripFirstLineStream(final OutputStream out) {
super(out);
}
@Override
public void write(final int b) throws IOException {
if (firstlineseen) {
super.write(b);
} else if (b == '\n') {
firstlineseen = true;
}
}
}
private static final long serialVersionUID = 7567765340218227372L;
@Override
public synchronized Enumeration<Object> keys() {
return Collections.enumeration(new TreeSet<>(super.keySet()));
}
@Override
public void store(final OutputStream out, final String comments) throws IOException {
super.store(new StripFirstLineStream(out), null);
}
}
Cleaning looks like this
final Properties props = new CleanProperties();
try (final Reader inStream = Files.newBufferedReader(file, Charset.forName("ISO-8859-1"))) {
props.load(inStream);
} catch (final MalformedInputException mie) {
throw new IOException("Malformed on " + file, mie);
}
if (props.isEmpty()) {
Files.delete(file);
return;
}
try (final OutputStream os = Files.newOutputStream(file)) {
props.store(os, "");
}
This is useful if you want to modify a given xxx.conf file in place.
The write method skips the first line (the #Thu May 26 09:43:52 CEST 2011 comment) produced by the store method: it discards everything up to the end of the first line and writes normally after that.
public class CleanProperties extends Properties {
private static class StripFirstLineStream extends FilterOutputStream {
private boolean firstlineseen = false;
public StripFirstLineStream(final OutputStream out) {
super(out);
}
@Override
public void write(final int b) throws IOException {
if (firstlineseen) {
super.write(b);
} else if (b == '\n') {
// write the newline itself; otherwise the remaining output
// read from the file would run together on one line
super.write('\n');
firstlineseen = true;
}
}
}
private static final long serialVersionUID = 7567765340218227372L;
@Override
public synchronized Enumeration<java.lang.Object> keys() {
return Collections.enumeration(new TreeSet<>(super.keySet()));
}
@Override
public void store(final OutputStream out, final String comments)
throws IOException {
super.store(new StripFirstLineStream(out), null);
}
}
Can you not just set a flag somewhere in your application when a meaningful configuration change takes place, and only write the file if that flag is set?
You might want to look into Commons Configuration which has a bit more flexibility when it comes to writing and reading things like properties files. In particular, it has methods which attempt to write the exact same properties file (including spacing, comments etc) as the existing properties file.
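A rough sketch of what that can look like with Commons Configuration 1.x (the exact API is version-dependent, so treat this as an assumption and check the docs for the version you use):
// loads, changes and rewrites a properties file without adding a timestamp header
// (PropertiesConfiguration's constructor and save() may throw ConfigurationException)
PropertiesConfiguration config = new PropertiesConfiguration(new File("app.properties"));
config.setProperty("main", "pkg.ClientMain");
config.save(); // preserves the existing layout and comments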
You can handle this by following this Stack Overflow post to retain order:
Write in a standard order: How can I write Java properties in a defined order?
Then write the properties to a string and remove the comments as needed. Finally write the result to a file.
ByteArrayOutputStream baos = new ByteArrayOutputStream();
properties.store(baos, null);
String propertiesData = baos.toString(StandardCharsets.UTF_8.name());
propertiesData = propertiesData.replaceAll("^#.*(\r|\n)+", ""); // remove the leading timestamp comment
FileUtils.writeStringToFile(fileTarget, propertiesData, StandardCharsets.UTF_8);
// you may want to validate the file is readable by reloading it and checking
// that the expected number of keys matches
Properties testResult = new Properties();
try (InputStream is = new FileInputStream(fileTarget)) {
    testResult.load(is);
}