I am currently working on a project that uses the jmeter-maven-plugin. I need a CSV Data Set Config element from which some variables are read during execution. Although the generated JMX file works in the JMeter GUI, the same test doesn't work in non-GUI mode.
From the logs, I gathered that the data file isn't being opened (stored) during execution in non-GUI mode, whereas in GUI mode it is opened and the run succeeds.
I have checked the path of the file (the absolute path, with forward slashes), the file's permissions, and all the parameters set in the CSV Data Set Config element of the generated JMX file (it contains the proper path to the CSV file), and I still can't see why the CSV file is not being used during execution in non-GUI mode.
I have tried placing the CSV file in the bin folder (referencing it both by the complete path to the bin folder and by the file name alone) and placing the CSV in the same directory as the JMX file.
Any ideas on what I might be missing?
(Screenshot: CSV Data Set Config element settings)
I developed a small program that takes as input some configuration parameters from a .cfg file and produces an output .txt file based on the values taken from the .cfg file.
While the program runs perfectly in Eclipse, I get a NullPointerException when I package the program as a JAR file and try to run it. From my understanding, I either have to make the JAR access its internal resources or read the needed information (in this case the .cfg file) externally, e.g. from a resource folder next to the JAR file.
I have searched many related questions asked here, but I became even more confused about whether there is an optimal way to produce a JAR file that can read input files and produce output files. Should I modify my code to achieve this, or is there another way?
For the record, I use FileReader and FileWriter to access and produce the files.
If your .cfg file is outside of the JAR file, it should work just as it does in Eclipse.
If you want to access it from inside the JAR archive, then you should load it with a class loader instead of a FileReader...
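A minimal sketch of the two approaches (the file name app.cfg is illustrative):

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.InputStream;
import java.io.InputStreamReader;

public class ConfigLoader {

    // External file: sits next to the JAR and is resolved against the working directory.
    static BufferedReader openExternal() throws Exception {
        return new BufferedReader(new FileReader("app.cfg"));
    }

    // Internal resource: packaged inside the JAR, loaded via the class loader.
    static BufferedReader openInternal() throws Exception {
        InputStream in = ConfigLoader.class.getResourceAsStream("/app.cfg");
        if (in == null) {
            throw new IllegalStateException("app.cfg not found on the classpath");
        }
        return new BufferedReader(new InputStreamReader(in));
    }
}

Output files have to be written to the filesystem with FileWriter either way; the contents of a running JAR are effectively read-only, so only the input side changes.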
I have a .properties file that is under a source folder I made called res (using Eclipse Mars 2). I have two other folders in there called messages and schemas.
I need help specifying a file path that works both locally and on a server (e.g. JBoss) after packaging the project as a .war file. This is my .properties file:
# Credentials
user=flow
password=flow
# Path to schema for validation
schemaPath=schemas/Schema1.xsd
# Path to where you want to keep incoming and outgoing messages
messagePath=messages/
The above properties file only works if I provide the full path in the two *Path properties (the values above are not full paths). However, I can't do that because it needs to work on the application server and across different operating systems.
In my code, I save the file paths into Strings and use those Strings to specify where to read and write. How can I make this work after deploying to the server as a .war file?
I am using a Dynamic Web Project in Eclipse Mars 2.
EDIT: Since the properties are user-configurable, users might give a full path. It should work whether the path is short, as shown above, or a full path.
You have to make sure that the properties file is on the classpath, which usually means including it in the classes/ directory (WEB-INF/classes inside a .war).
Mark the folder containing the properties file as a Source Folder in Eclipse. If the file is in a package, use that package name in the path when loading the file.
For example, if the file is at config/data.properties, load it with .getResource("config/data.properties");
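As a minimal sketch, assuming the file ends up at config/data.properties on the classpath:

import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class AppConfig {

    static Properties load() throws IOException {
        Properties props = new Properties();
        // The class loader resolves the path against the classpath root, so this
        // works the same in Eclipse and inside a deployed .war.
        try (InputStream in = Thread.currentThread().getContextClassLoader()
                .getResourceAsStream("config/data.properties")) {
            if (in == null) {
                throw new IOException("config/data.properties not found on the classpath");
            }
            props.load(in);
        }
        return props;
    }
}

Since the path values inside the file are user-configurable, you can check new File(path).isAbsolute() on each value and fall back to a classpath lookup only for relative ones.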
I have a Scala project that I want to export as a jar file so I can run it on another machine (I know how to export it as a jar file). My main function reads an existing input.json file and writes an output.json file; both paths are hard-coded in my main. My goal is to export a jar file, pass multiple input.json files to it, and have each output.json written to some directory. Basically, I want a large pipeline that feeds many different input.json files to this jar and writes all the output.json files to some directory. I'm not sure whether this can be done, and if so, how exactly?
Take the path from an environment variable first, and if the environment variable is not present, pick up the path from the configuration.
Change the environment variable according to your needs and there you go!
But if you want a whole directory containing multiple json files to be taken as input, then you have to do a bit more work: read the directory path from the configuration file or environment variable, treat it as a directory, list the file names in it, and process each one in a loop, as in the sketch below.
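A minimal sketch of both ideas against the plain JVM APIs (shown in Java; the equivalent Scala calls are sys.env and java.io.File). The variable name INPUT_DIR and the default path are assumptions:

import java.io.File;

public class PipelineRunner {

    public static void main(String[] args) {
        // Prefer the environment variable; fall back to the configured path.
        String configured = "/data/input";  // would normally come from your configuration file
        String fromEnv = System.getenv("INPUT_DIR");
        String inputDir = (fromEnv != null) ? fromEnv : configured;

        // Treat the path as a directory and process every .json file in it.
        File[] inputs = new File(inputDir).listFiles((dir, name) -> name.endsWith(".json"));
        if (inputs == null) {
            throw new IllegalStateException(inputDir + " is not a readable directory");
        }
        for (File input : inputs) {
            // process(input): transform each input.json into its output.json
        }
    }
}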
I hope I answered your query!
I have a Spring Batch process that writes a .txt file. It works great, except for one thing. When it runs on a scheduler (zena), the output file gets written out to the wrong location because I have the path set up to be relative. Here's how I specify the location of the output file in the FlatFileItemWriter:
FlatFileItemWriter<Something> writer = new FlatFileItemWriter<Something>();
// relative path, resolved against the process's current working directory
writer.setResource(new FileSystemResource(new File("../csv/output.txt")));
When I run without the scheduler, the file is written out to the correct directory:
/BatchJob/csv/output.txt
But when I run on the scheduler, the file gets written out to:
/Scheduler/Location/csv/output.txt
I tried using ClassPathResource:
writer.setResource(new ClassPathResource("output.txt", getClass()));
But then it tries to write the file to a directory location based on the name of the class's package. Instead, I want to write the file to a specific directory path, not one derived from the package name.
Read the directory parameter from a property file and inject it into your resource, as in the sketch below.
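A minimal sketch of that idea using Spring's @Value, assuming a property such as output.dir=/BatchJob/csv defined in a property file:

import java.io.File;

import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.FileSystemResource;

@Configuration
public class WriterConfig {

    // The absolute output directory comes from configuration, so the location
    // no longer depends on the scheduler's working directory.
    @Bean
    public FlatFileItemWriter<Something> writer(@Value("${output.dir}") String outputDir) {
        FlatFileItemWriter<Something> writer = new FlatFileItemWriter<>();
        writer.setResource(new FileSystemResource(new File(outputDir, "output.txt")));
        // plus your existing line aggregator and other writer settings
        return writer;
    }
}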
I am using Talend Studio for Data Integration v5.3.1.
In it I created a Job for fileDelimited. I uploaded a CSV file and it reads the file.
I exported the Job as a zip file, extracted it, and ran the .sh file in a terminal; it read the file and displayed it in the console.
Now I want to read a different file in some other location. Is it possible to read the different file by running the same shell script? If so, what do I have to change?
You can do it using context variables and tContextLoad.
Create a configuration file that holds all the required input location paths and other details.
You have different files in different locations and you just want to load the files without parsing them, right? If so, then:
First, create a configuration file with two parameters:
FilePath|FileName
\\Folderone\|File1.txt
\\Foldertwo\|File2.txt
Create two context variables named FilePath and FileName,
then use tContextLoad to load the above configuration details into those context variables,
and use these variables to provide the file path and name at runtime from the configuration.
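For example, in the input component's file field you would then combine the two context variables in the Java expression (the exact field label may vary by component and version):

// "File name/Stream" field of tFileInputDelimited, as a Java expression:
context.FilePath + context.FileName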
See my answer here: https://stackoverflow.com/a/17565718/921244 for guidance on how to open the Demo project.
There you will find a demo job for loading context variables.
If you want an online example, take a look at the official documentation: http://www.talendforge.org/tutorials/tutorial.php?language=english&idTuto=34