My test has 20 requests, and I've excluded 5 of them using the JMeter filter property:
jmeter.reportgenerator.exporter.html.series_filter
However, the dashboard still counts them in the metrics shown in the table headers.
Is it possible to fix this?
If you want to completely remove the "unwanted" results, it's better to remove them from the .jtl file.
The options are:
Add a JSR223 PostProcessor as a child of the request(s) you don't want to see and put the following Groovy code into the "Script" area (a conditional variant is sketched after this list):
prev.setIgnore()
where prev stands for the previous SampleResult.
Use the Filter Results Tool, which has --include-labels and --exclude-labels parameters allowing you to specify what should be included in or excluded from the .jtl results file.
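For the first option, here is a minimal sketch of a conditional variant, placed at Test Plan level so a single PostProcessor covers every sampler; the "setup_" label prefix is only an assumption, adjust it to whatever identifies the samplers you want to hide:
// JSR223 PostProcessor, Groovy
// Ignore every sampler whose label starts with "setup_" (assumed prefix, change as needed)
if (prev.getSampleLabel().startsWith("setup_")) {
    prev.setIgnore();   // this result will not be written to the .jtl file or the dashboard
}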
I know it is possible to delete the entire JMeter result tree through this code:
import org.apache.jmeter.gui.GuiPackage;
import org.apache.jmeter.gui.JMeterGUIComponent;
import org.apache.jmeter.gui.tree.JMeterTreeNode;
import org.apache.jmeter.samplers.Clearable;
log.info("Clearing All ...");
guiPackage = GuiPackage.getInstance();
guiPackage.getMainFrame().clearData();
for (JMeterTreeNode node : guiPackage.getTreeModel().getNodesOfType(Clearable.class)) {
    JMeterGUIComponent guiComp = guiPackage.getGui(node.getTestElement());
    if (guiComp instanceof Clearable) {
        Clearable item = (Clearable) guiComp;
        try {
            item.clearData();
        } catch (Exception ex) {
            log.error("Can't clear: " + node + " " + guiComp, ex);
        }
    }
}
but I don't want to delete the entire result tree, only the samples that returned status == 500. My API returns 500 until the callback is available for consultation; once the callback is found, it returns success. So while the API keeps retrying, these retries show up as errors in the report, even though the callback simply hasn't returned yet; when it does return, the API returns the callback and the sample is successful. I would like to remove these retry requests from the report.
Add a JSR223 PostProcessor with the following code to ignore the test results when the response code is 500:
if (prev.getResponseCode() == "500") {
    prev.setIgnore()
}
prev - (SampleResult) - gives access to the previous SampleResult (if any)
API documentation for prev variable (SampleResult)
I don't think it's possible without modifying JMeter source code or heavily using reflection. In any case the answer will not fit here.
In general:
You should be using the JMeter GUI only for test development and debugging; when it comes to test execution, you should run your test in command-line non-GUI mode
You should not be using listeners during test execution as they don't add any value and just consume valuable resources; all the necessary information is stored in the .jtl results file
There is the Filter Results plugin, which allows removing the "unwanted" data from the .jtl results file
You can also generate the HTML Reporting Dashboard out of the .jtl results file; the Dashboard has its own response filtering facilities, illustrated below
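For example, the dashboard's series filter is a regular expression matched against sample labels; something like the following line in user.properties (the label names are illustrative) keeps only the Login and Search series:
jmeter.reportgenerator.exporter.html.series_filter=^(Login|Search)(-success|-failure)?$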
Guys, first of all, thanks for the answers. I ended up giving up on the idea of removing the result from the tree and instead looked for a way to change how that result is presented.
I made a BeanShell Assertion so that if status == 500, it would show "Success" in the result tree:
BeanShell
I also made it so that if the sample was a retry, the name displayed in the results tree would indicate this, making the API name change depending on the response:
api name = variable
and I have this logic:
import org.apache.jmeter.samplers.SampleResult;

// process main sample: treat a 500 response as a successful "retry" sample
if (${status} == 500) {
    SampleResult.setResponseCodeOK();
    SampleResult.setSuccessful(true);
    vars.put("Api_Fake_Client_name", "API_FAKE_CLIENT_RETRY");
}
I will configure the responses for the other conditions, but I believe that this way I will be able to solve my problem, because the retries no longer appear as errors.
I have a JMeter test that inserts an input via an HTTP call to an asynchronous Java service and then collects an exposed metric from another Java service via a Groovy script.
The script then saves the collected metric as a JMeter variable to be reviewed as a performance metric.
I would like to publish this value inside the JMeter-generated dashboard, but I can't find a way to save this variable as a JMeter output.
Is there a way? It seems JMeter is primarily aimed at testing synchronous HTTP services, but it is capable of this kind of data collection.
You can use variable(s) in custom graph definitions:
You can graph any sample_variable in the CSV over time; you can customize your graphs by setting their properties in the user.properties file.
They must use the id prefix custom_:
jmeter.reportgenerator.graph.custom_<your_graph_name_id>.property.<your_option_name>
To specify that this graph is a customized one:
jmeter.reportgenerator.graph.custom_<your_graph_name_id>.classname=org.apache.jmeter.report.processor.graph.impl.CustomGraphConsumer
Here is an example of a custom graph configuration that graphs the variable ts-hit:
jmeter.reportgenerator.graph.custom_testGraph.classname=org.apache.jmeter.report.processor.graph.impl.CustomGraphConsumer
jmeter.reportgenerator.graph.custom_testGraph.title=Chunk Hit
jmeter.reportgenerator.graph.custom_testGraph.property.set_Y_Axis=Number of Hits
jmeter.reportgenerator.graph.custom_testGraph.set_X_Axis=Over Time
jmeter.reportgenerator.graph.custom_testGraph.property.set_granularity=60000
jmeter.reportgenerator.graph.custom_testGraph.property.set_Sample_Variable_Name=ts-hit
jmeter.reportgenerator.graph.custom_testGraph.property.set_Content_Message=Number of Hits :
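Note that the variable can only be graphed if it actually ends up in the results CSV, which (as the next answer also points out) means declaring it as a sample variable, for example in user.properties:
sample_variables=ts-hit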
Declare the JMeter Variable you're saving in the JSR223 script as a Sample Variable; in order to do this, add the next line to the user.properties file:
sample_variables=foo
Then you can configure your custom chart like:
jmeter.reportgenerator.graph.custom_testGraph.classname=org.apache.jmeter.report.processor.graph.impl.CustomGraphConsumer
jmeter.reportgenerator.graph.custom_testGraph.title=Your custom chart title
jmeter.reportgenerator.graph.custom_testGraph.property.set_Y_Axis=Your Y axis name
jmeter.reportgenerator.graph.custom_testGraph.set_X_Axis=Over Time
jmeter.reportgenerator.graph.custom_testGraph.property.set_granularity=60000
jmeter.reportgenerator.graph.custom_testGraph.property.set_Sample_Variable_Name=foo
jmeter.reportgenerator.graph.custom_testGraph.property.set_Content_Message=Your custom content message
Replace foo with the actual JMeter Variable name of your choice, and next time you generate the HTML Reporting Dashboard you should see your variable values plotted over time.
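For completeness, the JSR223 side is just a matter of putting the value into that variable as a String. A minimal Groovy sketch, where collectedMetric stands for whatever number your script scraped from the second service:
// JSR223 Sampler or PostProcessor, Groovy
double collectedMetric = 42.0;                      // placeholder for the value your script actually fetched
vars.put("foo", String.valueOf(collectedMetric));   // recorded per sample because of sample_variables=foo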
More information:
Reporting configuration
Apache JMeter Properties Customization Guide
I work with JBehave on a daily basis, but have been tasked with working on a project that uses Cucumber. In order to add custom reporting functionality to that project, I need to add two steps, one at the start of the feature (story) and another at the start of the scenario. I merely want to pass a description of the feature/story and of the scenario to the reporting module. I know that Cucumber can access the scenario name through code, but that would only resolve one of the two lines - I would still need another one that passes the description of the feature/story.
What I've tried in the feature file:
Feature: Ecolab BDD Test Automation Demo
Scenario Outline: User can login and logout from the landing page
Given story "EcolabWebDemo_TestCases - Ecolab BDD Test Automation Demo"
Given scenario "User can login and logout from the landing page"
Given I am on the Ecolab landing page
The corresponding code for the two added Given statements at the beginning above:
#Given("^story {string}$") // \"(\\S+)\"
public void givenStory(String storyName) {
test.initStory(storyName); // will show on report in Features column
}
#Given("^scenario {string}$") // \"(\\S+)\"
public void givenScenario(String scenarioName) {
test.initScenario(scenarioName);
}
The commented-out regex patterns after each annotation are the ones I was advised to try, but they do not seem to work either.
The current configuration at least seems to "find" the steps but reports:
cucumber.runtime.CucumberException:
java.util.regex.PatternSyntaxException: Illegal repetition near index 13
^the scenario {string}$
So that's obviously not the solution. The regex suggested instead of {string} simply does not find a match, so the step does not run.
Regex is absolute Greek to me; I'm not sure why it can't just be as simple as the {string} option implied it would be in the Cucumber documentation. I've been searching online for guidance for the better part of two days to no avail; apparently I'm not even sure what to search for.
Based on Grasshopper's suggestion, I updated the version of Cucumber from 1.2.0 to 1.2.5. I was prepared to change the pom.xml to use the 3.x versions, but tried the latest of the already-specified libraries first, and after an attempted run it did report what the correct regex should be for the two steps I added:
#Given("^story \"([^\"]*)\"$")
and
#Given("^scenario \"([^\"]*)\"$")
Now that the project uses a version that recognizes the strings and also reports missing steps, it runs as intended.
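For reference, the two step definitions end up looking like this (test is the same reporting object as in the question; the import shown is the cucumber-java 1.2.x one):
import cucumber.api.java.en.Given;

@Given("^story \"([^\"]*)\"$")
public void givenStory(String storyName) {
    test.initStory(storyName); // will show on report in Features column
}

@Given("^scenario \"([^\"]*)\"$")
public void givenScenario(String scenarioName) {
    test.initScenario(scenarioName);
}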
Thanks for your help, Grasshopper.
So we have tests that look like this:
Scenario: XXX- 9056: Change password to special characters
Meta:
#Regression
#ticket #5732
#skip
Given a customer with the following properties:...
We put the #skip there whenever we are still working on a test or we know it will not work properly.
We want to get Serenity reports, but we don't want it to include skipped stories. How can we exclude them from being reported?
We found that our issue was that in the Scenario line some of our test cases looked like
Scenario: XXX-#9056: Change password to special characters
Instead of
Scenario: XXX- 9056: Change password to special characters
So having the pound symbol (#) on the Scenario line was messing it up; it doesn't matter if it is under the Meta tag. Now none of the skipped tests show up in the report.
I'm trying to create queues using a PCF command in the WebSphere MQ API, as detailed in $MQM_HOME/samp/pcf/samples/PCF_CreateQueue.java. The creation fails when I add a description:
command.addParameter(PCFConstants.MQCA_Q_DESC, "Created using MQMonitor");
I get the error: com.ibm.mq.pcf.PCFException: MQJE001: Completion Code 2, Reason 3015 : MQRCCF_CFST_PARM_ID_ERROR
Is there another way of setting the description? I'm using version 6 of the API.
The Commands page in the PCF manual states that:
The required parameters and the optional parameters are listed. On platforms other than z/OS®, the parameters must occur in the order:
All required parameters, in the order stated, followed by
Optional parameters as required, in any order, unless specifically noted in the PCF definition.
The section Change, Copy and Create Queue lists the required parameters in the following order:
MQCA_Q_NAME
MQIA_Q_TYPE
Optional parameters, including QDesc
The same manual provides the required parameters and their order for all PCF commands, so there is no need to play hide-and-seek trying out parameters and orderings in the future.
It turns out the addParameter calls on the PCFMessage need to be in a certain sequence (I stumbled on it). If I change the order of the addParameter calls, it works. This applies not just to creating queues, but to channels as well.
command.addParameter(PCFConstants.MQCA_Q_NAME, qname);
command.addParameter(PCFConstants.MQIA_Q_TYPE, PCFConstants.MQQT_LOCAL);
command.addParameter(PCFConstants.MQCA_Q_DESC, qdesc);
command.addParameter(PCFConstants.MQIA_DEF_PERSISTENCE, PCFConstants.MQPER_PERSISTENT);
The above will execute without error.
command.addParameter(PCFConstants.MQCA_Q_NAME, qname);
command.addParameter(PCFConstants.MQCA_Q_DESC, qdesc);
command.addParameter(PCFConstants.MQIA_Q_TYPE, PCFConstants.MQQT_LOCAL);
command.addParameter(PCFConstants.MQIA_DEF_PERSISTENCE, PCFConstants.MQPER_PERSISTENT);
The above will fail after moving the description around.
I haven't seen this documented in the Java docs, and if that's the case I look forward to some hide-and-seek.