How to capture response data in JMeter and do parameterisation - Java

I'm testing a standalone Java application with JMeter, using an OS Process Sampler to execute the jar file. The Java program builds a JSON string and uses a publish method to post it to a queue. Now I want to capture that data in the JMeter response data, parameterise those fields, and use a CSV Data Set Config to pass values to the fields so that each user takes unique data.
The JSON string looks like the one below, and I want to parameterise these values from the JMeter response data and pass in values from a CSV file.
[{
  "id": 100001,
  "status": "Pending",
  "priorityCode": "1",
  "miniTeamId": "234256",
  "requestName": "General"
}]

Given your CSV file looks like:
100001,Pending,1,234256,General
100002,Done,2,234257,General
etc.
Configure your CSV Data Set Config as follows:
Filename: path to your .csv file (preferably the full path)
Variable Names: id,status,priorityCode,miniTeamId,requestName
Other fields can be left as they are. By default, JMeter reads the next line from the CSV file for each thread on each loop; when the end of the file is reached, JMeter starts over.
JMeter Variables populated by the CSV Data Set Config can be referenced as ${variableName} or ${__V(variableName)}, so the parameterised request body should look like:
[
  {
    "id": ${id},
    "status": "${status}",
    "priorityCode": "${priorityCode}",
    "miniTeamId": "${miniTeamId}",
    "requestName": "${requestName}"
  }
]
See Using CSV DATA SET CONFIG for more detailed information on parameterisation of JMeter tests using CSV files.
Also be aware of the following test elements:
__csvRead() function - if you need to get data from multiple CSV files
JSON Path Extractor - if you need to get values from response JSON
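For example, a minimal __csvRead() usage (documented as __CSVRead; the file path is illustrative): ${__CSVRead(/path/to/other.csv,0)} returns column 0 of the file's current row, while ${__CSVRead(/path/to/other.csv,next)} returns an empty string and advances to the next row, so it belongs after all the column reads for that row.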
UPD: passing the parameterised JSON string via the OS Process Sampler:
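A possible configuration (a sketch only; the jar path, and the assumption that your application accepts the JSON string as a single argument, are illustrative):
Command: java
Command parameters:
-jar
/path/to/your/application.jar
[{"id": ${id},"status": "${status}","priorityCode": "${priorityCode}","miniTeamId": "${miniTeamId}","requestName": "${requestName}"}]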

Related

Consume the JSON in a Kafka topic using tJava in Talend

I am currently trying to create an ingestion job workflow using Kafka in Talend Studio. The job will read the JSON data in the topic "work" and store it into the Hive table.
Snippet of json:
{"Header": {"Vers":"1.0","Message": "318","Owner": {"ID": 102,"FID": 101},"Mode":"8"},"Request": {"Type":"4","ObjType": "S","OrderParam":[{"Code": "OpType","Value": "30"},{"Code": "Time","Value": "202"},{"Code": "AddProperty","ObjParam": [{"Param": [{"Code": "Sync","Value": "Y"}]}]}]}}
{"Header": {"Vers":"2.0","Message": "318","Owner": {"ID": 103,"FID": 102},"Mode":"8"},"Request": {"Type":"5","ObjType": "S","OrderParam":[{"Code": "OpType","Value": "90"},{"Code": "Time","Value": "203"},{"Code": "AddProperty","ObjParam": [{"Param": [{"Code": "Sync","Value": "Y"}]}]}]}}
Talend workflow:
My focus in this question is not the Talend components but the Java code in the tJava component that fetches and reads the JSON.
Java code:
String output=((String)globalMap.get("tLogRow_1_OUTPUT"));
JSONObject jsonObject = new JSONObject(output);
System.out.println(jsonObject);
String sourceDBName=(jsonObject.getString("Vers"));
The code above is able to get the data from tLogRow into the "output" variable. However, it gives an error where it reads a null value for the JSON object. What should I do to correctly get the data from the JSON?
You can use a tExtractJsonFields instead of a tJava. This component extracts data from your input String following a JSON schema that you can define in the metadata. With this you could extract all the fields from your input.
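If you do want to stay in tJava, note that in the posted snippet "Vers" is not a top-level key: it is nested inside "Header", so getString("Vers") on the root object fails. A minimal sketch of the corrected navigation (assuming org.json is available to the job and tLogRow_1_OUTPUT holds one JSON document):
String output = (String) globalMap.get("tLogRow_1_OUTPUT");
if (output != null) { // guard against rows where tLogRow produced no output
    JSONObject jsonObject = new JSONObject(output);
    // descend into the "Header" object before reading "Vers"
    String vers = jsonObject.getJSONObject("Header").getString("Vers");
    System.out.println(vers);
}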

ExecuteStreamCommand won't read foreign characters

We have Apache NiFi set up to write files to a local drive and then run a program that processes these files and writes its response to the "response" attribute. This is a JSON string that we then deliver to an API to update records.
However, while we can successfully write, read, and process the files, NiFi fails to understand non-English characters in the response text. This leads to names being corrupted when we send back the response. This only applies to the JSON string we receive from the program.
NiFi is running in a Windows 10 environment. When we run the program manually using the files output by NiFi, we get the correct output. The issue only happens in NiFi.
To provide example, input json is:
{
  "player" : "mörkö",
  "target" : "goal",
  "didhitin" : ""
}
This is stored in our program's work folder, and we call the program using ExecuteStreamCommand, giving our input JSON file as the parameter. The JSON is processed and our program outputs the following JSON, which is then stored into the "response" attribute of the flowfile:
{
  "player" : "mörkö",
  "target" : "goal",
  "didhitin" : "true"
}
However, when this is read by NiFi into the "response" attribute of the flowfile, it becomes:
{
  "player" : "m¤rk¤",
  "target" : "goal",
  "didhitin" : "true"
}
(Not the actual process, but close enough to demonstrate the issue)
When we feed this into the API, it either fails or corrupts the original name (in this case, the value of player). Neither is a desirable outcome.
So far we have figured out that this is most likely an encoding issue, but we have not found a way to change the encoding NiFi uses so as to fix the incorrectly read characters.
We managed to fix this issue by adding the following line to the start of the program:
Console.OutputEncoding = Encoding.UTF8;
This effectively forces the program to output UTF-8 characters, which is in line with the rest of the flow.
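The fix above is C#. If the external program were written in Java instead, a comparable sketch (not NiFi-specific) would replace the platform-encoded stdout with a UTF-8 one:
import java.io.PrintStream;
import java.io.UnsupportedEncodingException;

public class Utf8Stdout {
    public static void main(String[] args) throws UnsupportedEncodingException {
        // Replace the default stdout (which uses the platform encoding, e.g.
        // a Windows codepage) so whatever reads the stream gets UTF-8 bytes
        System.setOut(new PrintStream(System.out, true, "UTF-8"));
        System.out.println("{ \"player\" : \"mörkö\" }");
    }
}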

Read JSON data from a properties file

I have JSON data in a properties file and am trying to retrieve it in Java. When I try to retrieve the JSON data by its property name, I get only the first string/word of the JSON.
Inside the property file, I have the below content.
profile: {"fname": "ABC","lname": "XYZ","meetings":{"morning":10,"evening":60}}
I am trying to read the content using the property name 'profile' as a string, and I am getting the error message below.
Expected ',' instead of ''
Can someone help me with this issue? I tried to escape and unescape the value but still have the same problem.
It may depend on what you are using to deserialize the JSON, but well-formed JSON is a single element, so what you have needs to be inside a container. That is, your file content should be:
{ profile: {"fname": "ABC","lname": "XYZ","meetings":{"morning":10,"evening":60}}}
You can do it like this:
profile={"fname": "ABC","lname": "XYZ","meetings":{"morning":10,"evening":60}}
Or, if you want to split it across multiple lines:
profile={\
"fname": "ABC",\
"lname": "XYZ",\
"meetings":{\
"morning":10,\
"evening":60\
}\
}
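A minimal sketch of reading the value back (assuming the key=value form above, a file named app.properties, and org.json on the classpath; both names are illustrative):
import java.io.FileReader;
import java.util.Properties;
import org.json.JSONObject;

public class ProfileReader {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Properties.load understands the trailing-backslash line continuations above
        try (FileReader reader = new FileReader("app.properties")) {
            props.load(reader);
        }
        // The whole JSON string is the value of the 'profile' key
        JSONObject profile = new JSONObject(props.getProperty("profile"));
        System.out.println(profile.getString("fname"));                          // ABC
        System.out.println(profile.getJSONObject("meetings").getInt("morning")); // 10
    }
}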

How to use a JSON array so that each item in the array supplies values for a GET request?

I have a pre-prepared JSON array data file. Each item in the array contains data that I want to use to send a request to a server. I know how to send one request after extracting data from the file, but I am struggling to send the data item by item, one request after another.
I want to imitate the following behaviour, which I already wrote in Java:
List<Integer> mtl = Arrays.asList(new Integer[]{1, 9, 257, 265});
for (int i = 0; i < jsonArray.size(); i++) {
    JSONObject item = (JSONObject) jsonArray.get(i);
    int dataFlagType = Integer.parseInt(item.get("DataFlagType").toString());
    if (!(mtl.contains(dataFlagType))) {
        sendPushStream(Long.parseLong(m_ap.sid), pid, subsId, item, domain, dnsName, dataFlagType);
    } else {
        lastMessage = (JSONObject) jsonArray.get(i);
    }
    Thread.sleep(100);
}
Thread.sleep(100);
sendPushStream(Long.parseLong(m_ap.sid), pid, subsId, lastMessage, domain, dnsName, Integer.parseInt(lastMessage.get("DataFlagType").toString()));
where sendPushStream executes the POST request itself.
I would suggest organizing your Test Plan as follows:
HTTP Request (Protocol: file, Path: /path/to/your/file.json)
JSON Extractor (relevant JSONPath query to extract values and store them into JMeter Variables)
ForEach Controller to iterate the Variables coming from the JSON Extractor
HTTP Request - to mimic sendPushStream
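For instance (the values are illustrative, assuming you iterate over the "DataFlagType" field from your file):
JSON Extractor: Names of created variables: dataFlagType; JSON Path expressions: $[*].DataFlagType; Match No.: -1
ForEach Controller: Input variable prefix: dataFlagType; Output variable name: currentFlagType
HTTP Request: reference the current value as ${currentFlagType}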
Another option is running your Java code from a JSR223 Sampler or a JUnit Request sampler; just make sure you package your helper code as a .jar and add it to the JMeter classpath.

Cloud Dataflow: reading an entire JSON array file from Cloud Storage and creating a PCollection of JSON objects

I have a JSON array file with the content below:
[ {
"MemberId" : "1234",
"Date" : "2017-07-03",
"Interactions" : [ {
"Number" : "1327",
"DwellTime" : "00:03:05"
} ]
}, {
"MemberId" : "5678",
"Date" : "2017-07-03",
"Interactions" : [ {
"Number" : "1172",
"DwellTime" : "00:01:26"
} ]
} ]
I want to create a PCollection of Java objects, one mapped to each JSON object present in the JSON array.
JSON formatted like this (records spread over multiple lines instead of one per line) is hard for a data-processing tool like Beam/Dataflow to process in parallel: starting from a random point in the file, you cannot be sure where the next record begins. You can do it by reading from the beginning of the file, but then you're not really reading in parallel.
If it's possible, reformatting it so that it's one record per line would let you use something like TextIO to read in the file.
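For instance, once the file is newline-delimited (one JSON record per line), the read could look like this. This is a minimal sketch: the bucket path is illustrative, MemberRecord is a hypothetical POJO covering the fields above, and Jackson is assumed to be on the classpath.
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.coders.AvroCoder;
import org.apache.beam.sdk.coders.DefaultCoder;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptor;

public class ReadJsonLines {

    @DefaultCoder(AvroCoder.class)
    public static class MemberRecord {
        public String MemberId;
        public String Date;
        // Interactions omitted for brevity
    }

    public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply(TextIO.read().from("gs://your-bucket/members.jsonl")) // one record per line
         .apply(MapElements.into(TypeDescriptor.of(MemberRecord.class))
             .via((String line) -> {
                 try {
                     return new ObjectMapper().readValue(line, MemberRecord.class);
                 } catch (java.io.IOException e) {
                     throw new RuntimeException("Bad record: " + line, e);
                 }
             }));
        p.run();
    }
}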
If not, you'll need to read the file in one go.
I would suggest a couple possible approaches:
Write a ParDo that reads from the file using the GCS API
This is pretty straightforward. You'll do all the reading in one ParDo, and you'll need to implement the connection code inside that ParDo. Inside the ParDo you would write the same code as if you were reading the file in a normal Java program. The ParDo will emit each Java object as a record.
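A sketch of this approach, reusing the hypothetical MemberRecord POJO from the previous sketch; the whole array is parsed on one worker and each element is emitted as a record:
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.InputStream;
import java.nio.channels.Channels;
import java.util.List;
import org.apache.beam.sdk.io.FileSystems;
import org.apache.beam.sdk.transforms.DoFn;

public class ReadWholeJsonArrayFn extends DoFn<String, MemberRecord> {
    @ProcessElement
    public void process(ProcessContext c) throws Exception {
        // c.element() is the "gs://..." path; Beam's FileSystems can open it
        InputStream in = Channels.newInputStream(
                FileSystems.open(FileSystems.matchNewResource(c.element(), false)));
        // Parse the whole JSON array at once, then emit each element
        List<MemberRecord> records =
                new ObjectMapper().readValue(in, new TypeReference<List<MemberRecord>>() {});
        for (MemberRecord record : records) {
            c.output(record);
        }
    }
}
You would feed it the single path, e.g. p.apply(Create.of("gs://your-bucket/members.json")).apply(ParDo.of(new ReadWholeJsonArrayFn())); as noted above, the read itself is not parallel.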
Implement a FileBasedSource
File-based sources will work: when the fileOrPatternSpec is "gs://..." it knows how to read from GCS. You'll need to make sure to set fileMetadata.isReadSeekEfficient to false so that it won't try to split the file. I haven't tried it, but I believe the correct way to do that is to set it inside the single-file constructor of your FileBasedSource (i.e., your class's override of FileBasedSource(MetaData, long, long)).
TextSource/XmlSource (and their accompanying wrappers TextIO/XmlIO) are examples of this, except that they try to implement splitting; yours will be much simpler since it won't.
