I have a Gradle script that creates a version in JIRA using the REST API.
However, there is also the jira-rest-java-client (JRJC) library. I want to use this Java library to do the same thing from Gradle. Can someone provide an example of how to do this?
How do I use the jira-rest-java-client library to connect to JIRA, ideally with an example?
In Java I am trying to use the JRJC library, but I get the error below when running from IntelliJ:
import com.atlassian.jira.rest.client.api.JiraRestClient;
import com.atlassian.jira.rest.client.api.domain.*;
import com.atlassian.jira.rest.client.api.domain.input.ComplexIssueInputFieldValue;
import com.atlassian.jira.rest.client.api.domain.input.FieldInput;
import com.atlassian.jira.rest.client.api.domain.input.TransitionInput;
import com.atlassian.jira.rest.client.internal.ServerVersionConstants;
import com.atlassian.jira.rest.client.internal.async.AsynchronousJiraRestClientFactory;
import com.google.common.collect.Lists;
import org.codehaus.jettison.json.JSONException;
import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.Arrays;
import java.util.Collection;
import java.util.List;
/**
* Sample code showing how to use the JRJC library.
*
* @since v0.1
*/
public class Example1 {
private static URI jiraServerUri = URI.create("http://localhost:2990/jira");
private static boolean quiet = false;
public static void main(String[] args) throws URISyntaxException, JSONException, IOException {
parseArgs(args);
final AsynchronousJiraRestClientFactory factory = new AsynchronousJiraRestClientFactory();
final JiraRestClient restClient = factory.createWithBasicHttpAuthentication(jiraServerUri, "admin", "admin");
try {
final int buildNumber = restClient.getMetadataClient().getServerInfo().claim().getBuildNumber();
// first let's get and print all visible projects (only jira4.3+)
if (buildNumber >= ServerVersionConstants.BN_JIRA_4_3) {
final Iterable<BasicProject> allProjects = restClient.getProjectClient().getAllProjects().claim();
for (BasicProject project : allProjects) {
// Compare by project key; "project == TEST" does not compile because TEST is undefined.
if ("TEST".equals(project.getKey())) {
println(project);
} else {
System.out.println("Project TEST not found");
}
}
}
// let's now print all issues matching a JQL string (here: all assigned issues)
if (buildNumber >= ServerVersionConstants.BN_JIRA_4_3) {
final SearchResult searchResult = restClient.getSearchClient().searchJql("assignee is not EMPTY").claim();
for (BasicIssue issue : searchResult.getIssues()) {
println(issue.getKey());
}
}
final Issue issue = restClient.getIssueClient().getIssue("TST-7").claim();
println(issue);
// now let's vote for it
restClient.getIssueClient().vote(issue.getVotesUri()).claim();
// now let's watch it
final BasicWatchers watchers = issue.getWatchers();
if (watchers != null) {
restClient.getIssueClient().watch(watchers.getSelf()).claim();
}
// now let's start progress on this issue
final Iterable<Transition> transitions = restClient.getIssueClient().getTransitions(issue.getTransitionsUri()).claim();
final Transition startProgressTransition = getTransitionByName(transitions, "Start Progress");
restClient.getIssueClient().transition(issue.getTransitionsUri(), new TransitionInput(startProgressTransition.getId()))
.claim();
// and now let's resolve it as Incomplete
final Transition resolveIssueTransition = getTransitionByName(transitions, "Resolve Issue");
final Collection<FieldInput> fieldInputs;
// Starting from JIRA 5, fields are handled in different way -
if (buildNumber > ServerVersionConstants.BN_JIRA_5) {
fieldInputs = Arrays.asList(new FieldInput("resolution", ComplexIssueInputFieldValue.with("name", "Incomplete")));
} else {
fieldInputs = Arrays.asList(new FieldInput("resolution", "Incomplete"));
}
final TransitionInput transitionInput = new TransitionInput(resolveIssueTransition.getId(), fieldInputs, Comment
.valueOf("My comment"));
restClient.getIssueClient().transition(issue.getTransitionsUri(), transitionInput).claim();
}
finally {
restClient.close();
}
}
private static void println(Object o) {
if (!quiet) {
System.out.println(o);
}
}
private static void parseArgs(String[] argsArray) throws URISyntaxException {
final List<String> args = Lists.newArrayList(argsArray);
if (args.contains("-q")) {
quiet = true;
args.remove(args.indexOf("-q"));
}
if (!args.isEmpty()) {
jiraServerUri = new URI(args.get(0));
}
}
private static Transition getTransitionByName(Iterable<Transition> transitions, String transitionName) {
for (Transition transition : transitions) {
if (transition.getName().equals(transitionName)) {
return transition;
}
}
return null;
}
}
Error:
Exception in thread "main" java.lang.NoClassDefFoundError: com/atlassian/sal/api/executor/ThreadLocalContextManager
at com.atlassian.jira.rest.client.internal.async.AsynchronousJiraRestClientFactory.create(AsynchronousJiraRestClientFactory.java:35)
at com.atlassian.jira.rest.client.internal.async.AsynchronousJiraRestClientFactory.createWithBasicHttpAuthentication(AsynchronousJiraRestClientFactory.java:42)
at Example1.main(Example1.java:34)
Caused by: java.lang.ClassNotFoundException: com.atlassian.sal.api.executor.ThreadLocalContextManager
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 3 more
Moreover, I added the JRJC api and core jars to External Libraries, but I am still getting this error.
Could someone tell me what the issue is or what I am missing?
The NoClassDefFoundError means a transitive dependency of JRJC (here the Atlassian SAL API) is not on the classpath. Instead of adding individual jars manually, declare the JRJC dependencies in Gradle so the transitive dependencies are resolved for you:
compile 'com.atlassian.jira:jira-rest-java-client-core:4.0.0'
compile 'com.atlassian.jira:jira-rest-java-client-api:4.0.0'
Simple connection to JIRA:
JiraRestClient restClient = new AsynchronousJiraRestClientFactory().createWithBasicHttpAuthentication(new URI("https://" + jira_domain),
jira_username, jira_password);
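Since the question mentions creating a version in JIRA, here is a minimal sketch using JRJC's VersionRestClient. The server URL, credentials, project key, and version name are placeholders, and it assumes the VersionInput.create(projectKey, name, description, releaseDate, archived, released) factory available in JRJC 4.x (pass null for the release date if there is none):
import java.net.URI;
import com.atlassian.jira.rest.client.api.JiraRestClient;
import com.atlassian.jira.rest.client.api.domain.Version;
import com.atlassian.jira.rest.client.api.domain.input.VersionInput;
import com.atlassian.jira.rest.client.internal.async.AsynchronousJiraRestClientFactory;
public class CreateVersionExample {
    public static void main(String[] args) throws Exception {
        // Placeholder URL and credentials - replace with your own instance.
        JiraRestClient restClient = new AsynchronousJiraRestClientFactory()
                .createWithBasicHttpAuthentication(new URI("https://jira.example.com"), "user", "password");
        try {
            // Create version "1.0.0" in project "PROJ": no release date, not archived, not released.
            VersionInput versionInput = VersionInput.create("PROJ", "1.0.0", "Created via JRJC", null, false, false);
            Version created = restClient.getVersionRestClient().createVersion(versionInput).claim();
            System.out.println("Created version: " + created.getName());
        } finally {
            restClient.close();
        }
    }
}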
Related
I am currently using Cucumber and the masterthought plugin to generate HTML reports on test executions. I want the flexibility to configure the Cucumber options when running my tests from a Gradle script, as well as to run Cucumber from Java code, without relying on plugins in the build script.
I have previously used the Surefire plugin to run Cucumber and the masterthought plugin to generate the Cucumber report.
package com.my.domain;
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;
import org.apache.commons.io.FileUtils;
import org.apache.commons.io.FilenameUtils;
import org.junit.runner.RunWith;
import org.junit.runners.model.InitializationError;
import cucumber.api.CucumberOptions;
import cucumber.api.junit.Cucumber;
import cucumber.runtime.RuntimeOptions;
import cucumber.runtime.RuntimeOptionsFactory;
import net.masterthought.cucumber.Configuration;
import net.masterthought.cucumber.ReportBuilder;
import net.masterthought.cucumber.Reportable;
@RunWith(Cucumber.class)
@CucumberOptions(
plugin = {
"json:build/path/cucumber.json" },
features = {
"src/main/resources/features" },
glue = {
"com.my.domain.stepdefinitions" },
dryRun = false,
tags = {"#tag"}
)
public class CucumberJunitRunner {
// TODO: Make Tags, Features, Glue, plugin runtime, and path configurable
private static final String JSON_PLUGIN_PREFIX = "json:";
private static final String PRECONFIGURED_REPORT_OUTPUT_FOLDER_NAME = "cucumber-html-reports";
private static final Optional<String> REPORT_OUTPUT_FOLDER_LOCATION = Optional.ofNullable(null);
private static final boolean SKIP_REPORT_GENERATION = false;
public static void main(String[] args) throws InitializationError {
RuntimeOptions cucumberRuntimeOptions = null;
Optional<String> jsonPluginOutputLocation = Optional.empty();
if(args.length > 0) {
//TODO: USE ARGUMENTS TO INITIALIZE cucumberRuntimeOptions AND jsonPluginOutputLocation
} else {
RuntimeOptionsFactory cucumberRuntimeOptionsFactory = new RuntimeOptionsFactory(CucumberJunitRunner.class);
cucumberRuntimeOptions = cucumberRuntimeOptionsFactory.create();
jsonPluginOutputLocation = cucumberRuntimeOptions.getPluginFormatterNames().stream()
.filter(s -> s.startsWith(JSON_PLUGIN_PREFIX))
.map(s -> s.substring(JSON_PLUGIN_PREFIX.length())).findFirst();
if( ! jsonPluginOutputLocation.isPresent() ) {
throw new RuntimeException(String.format(
"Could not find jsonPluginOutputLocation in plugins from cucumber options: %s",
cucumberRuntimeOptions.getPluginFormatterNames()));
}
}
deletePreviousData(jsonPluginOutputLocation);
runCucumber(cucumberRuntimeOptions, Thread.currentThread().getContextClassLoader());
if (SKIP_REPORT_GENERATION) {
System.out.println("Report generation skipped. No HTML report was built.");
} else {
if (cucumberRuntimeOptions.isDryRun()) {
System.out.println("Cucumber DryRun executed. No HTML report was built.");
} else {
if (jsonPluginOutputLocation.isPresent()) {
Path jsonPluginOutputPath = Paths.get(jsonPluginOutputLocation.get());
if (Files.exists(jsonPluginOutputPath)) {
generateCucumberReport(
REPORT_OUTPUT_FOLDER_LOCATION.orElse(
FilenameUtils.getFullPathNoEndSeparator(jsonPluginOutputLocation.get())),
jsonPluginOutputLocation.get(), "1", "My Project");
} else {
System.out.println("Cucumber JSON report was missing. No HTML report was built.");
}
} else {
System.out.println("Cucumber JSON plugin was missing. No HTML report was built.");
}
}
}
}
private static void deletePreviousData(Optional<String> jsonPluginOutputLocation) {
Path jsonPluginOutputPath = Paths.get(jsonPluginOutputLocation.get());
if (Files.exists(jsonPluginOutputPath)) {
try {
Files.delete(jsonPluginOutputPath);
System.out.println("Cucumber JSON file was deleted: " +
jsonPluginOutputPath.toAbsolutePath().toString());
} catch (IOException e) {
e.printStackTrace();
}
} else {
System.out.println("Cucumber JSON file from previous execution was not detected: "
+ jsonPluginOutputPath.toAbsolutePath().toString());
}
Path cucumberReportsFolder = jsonPluginOutputPath.resolveSibling(PRECONFIGURED_REPORT_OUTPUT_FOLDER_NAME);
if (Files.exists(cucumberReportsFolder)) {
try {
FileUtils.deleteDirectory(cucumberReportsFolder.toFile());
System.out.println("Cucumber JSON report was deleted: " +
cucumberReportsFolder.toAbsolutePath().toString());
} catch (IOException e) {
e.printStackTrace();
}
} else {
System.out.println("Cucumber JSON report from previous execution was not detected: "
+ cucumberReportsFolder.toAbsolutePath().toString());
}
}
/**
* Launches the Cucumber-JVM command line.
*
* @param cucumberRuntimeOptions runtime options. See details in the
* {@code cucumber.api.cli.Usage.txt} resource.
* @param classLoader classloader used to load the runtime
* @return 0 if execution was successful, 1 if it was not (test failures)
*/
public static byte runCucumber(RuntimeOptions cucumberRuntimeOptions, ClassLoader classLoader) {
final cucumber.runtime.Runtime runtime = cucumber.runtime.Runtime.builder()
.withRuntimeOptions(cucumberRuntimeOptions).withClassLoader(classLoader).build();
runtime.run();
return runtime.exitStatus();
}
private static void generateCucumberReport(String reportOutputDirectoryLocation, String cucumberJsonFile,
String buildNumber, String projectName) {
File reportOutputDirectory = new File(reportOutputDirectoryLocation);
List<String> jsonFiles = new ArrayList<>();
jsonFiles.add(cucumberJsonFile);
// jsonFiles.add("cucumber-report-2.json");
// String buildNumber = "1";
// String projectName = "cucumberProject";
boolean runWithJenkins = false;
Configuration configuration = new Configuration(reportOutputDirectory, projectName);
// optional configuration - check javadoc
configuration.setRunWithJenkins(runWithJenkins);
configuration.setBuildNumber(buildNumber);
// additional metadata presented on the main page
// configuration.addClassifications("Platform", "Windows");
// configuration.addClassifications("Browser", "Firefox");
// configuration.addClassifications("Branch", "release/1.0");
// optionally add metadata presented on main page via properties file
//List<String> classificationFiles = new ArrayList<>();
//classificationFiles.add("properties-1.properties");
//classificationFiles.add("properties-2.properties");
// configuration.addClassificationFiles(classificationFiles);
ReportBuilder reportBuilder = new ReportBuilder(jsonFiles, configuration);
Reportable result = reportBuilder.generateReports();
// and here validate 'result' to decide what to do if report has failed
if (result == null) {
System.out.println("There was an isssue while building the report");
System.exit(1);
}
System.out.println(result);
}
}
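If you want to run the same runner from plain Java (for example via a Gradle JavaExec task) without going through JUnit, the public runCucumber method above can be reused directly. A minimal sketch, assuming the runner class is on the classpath (the class name RunCucumberFromJava is hypothetical):
package com.my.domain;
import cucumber.runtime.RuntimeOptions;
import cucumber.runtime.RuntimeOptionsFactory;
public class RunCucumberFromJava {
    public static void main(String[] args) {
        // Build the options from the annotations on CucumberJunitRunner and delegate to its runner method.
        RuntimeOptions options = new RuntimeOptionsFactory(CucumberJunitRunner.class).create();
        byte exitStatus = CucumberJunitRunner.runCucumber(options, Thread.currentThread().getContextClassLoader());
        System.exit(exitStatus);
    }
}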
I'd like to use my custom JSON transformation that implements the com.bazaarvoice.jolt.Transform interface.
I use "Custom Transformation Class Name" and "Custom Module Directory" like this:
However, I cannot get the JoltTransformJSON processor to use it; I get a ClassNotFoundException:
2019-04-01 14:30:54,196 ERROR [Timer-Driven Process Thread-4] o.a.n.p.standard.JoltTransformJSON JoltTransformJSON[id=b407714f-0169-1000-d9b2-1709069238d7] Unable to transform StandardFlowFileRecord[uuid=72dc471b-c587-4da9-b54c-eb46247b0cf4,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1554129053747-21203, container=default, section=723], offset=607170, length=5363],offset=0,name=72dc471b-c587-4da9-b54c-eb46247b0cf4,size=5363] due to java.util.concurrent.CompletionException: java.lang.ClassNotFoundException: org.sentilo.nifi.elasticsearch.ElasticsearchToOpenTSDB: java.util.concurrent.CompletionException: java.lang.ClassNotFoundException: org.sentilo.nifi.elasticsearch.ElasticsearchToOpenTSDB
java.util.concurrent.CompletionException: java.lang.ClassNotFoundException: org.sentilo.nifi.elasticsearch.ElasticsearchToOpenTSDB
at com.github.benmanes.caffeine.cache.BoundedLocalCache$BoundedLocalLoadingCache.lambda$new$0(BoundedLocalCache.java:3373)
at com.github.benmanes.caffeine.cache.BoundedLocalCache.lambda$doComputeIfAbsent$14(BoundedLocalCache.java:2039)
at java.util.concurrent.ConcurrentHashMap.compute(ConcurrentHashMap.java:1853)
at com.github.benmanes.caffeine.cache.BoundedLocalCache.doComputeIfAbsent(BoundedLocalCache.java:2037)
at com.github.benmanes.caffeine.cache.BoundedLocalCache.computeIfAbsent(BoundedLocalCache.java:2020)
at com.github.benmanes.caffeine.cache.LocalCache.computeIfAbsent(LocalCache.java:112)
at com.github.benmanes.caffeine.cache.LocalLoadingCache.get(LocalLoadingCache.java:67)
at org.apache.nifi.processors.standard.JoltTransformJSON.getTransform(JoltTransformJSON.java:316)
at org.apache.nifi.processors.standard.JoltTransformJSON.onTrigger(JoltTransformJSON.java:277)
at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1162)
at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:205)
at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: org.sentilo.nifi.elasticsearch.ElasticsearchToOpenTSDB
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.nifi.processors.standard.util.jolt.TransformFactory.getCustomTransform(TransformFactory.java:65)
at org.apache.nifi.processors.standard.JoltTransformJSON.createTransform(JoltTransformJSON.java:346)
at org.apache.nifi.processors.standard.JoltTransformJSON.lambda$setup$0(JoltTransformJSON.java:324)
at com.github.benmanes.caffeine.cache.BoundedLocalCache$BoundedLocalLoadingCache.lambda$new$0(BoundedLocalCache.java:3366)
... 19 common frames omitted
I compiled the class together with all its dependencies with the maven-assembly-plugin and placed it in the directory "/data/bin/nifi-1.9.1/jolt_modules".
The directory and the jar are readable.
I have also tried adding the class name to the operation in the spec, as described here, but it seems that the "Custom Module Directory" property has no effect for some reason.
EDIT: I am adding the code of ElasticsearchToOpenTSDB below, in case somebody finds it useful. It just converts Sentilo messages stored in Elasticsearch to OpenTSDB data points, flattening some nested JSON structures on the way.
package org.sentilo.nifi.elasticsearch;
import com.bazaarvoice.jolt.SpecDriven;
import com.bazaarvoice.jolt.Transform;
import com.bazaarvoice.jolt.exception.TransformException;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.commons.beanutils.BeanUtils;
import org.sentilo.agent.historian.domain.OpenTSDBDataPoint;
import org.sentilo.agent.historian.utils.OpenTSDBValueConverter;
import org.sentilo.common.domain.EventMessage;
import org.sentilo.nifi.elasticsearch.model.Hits;
import org.springframework.util.StringUtils;
import javax.inject.Inject;
import java.lang.reflect.InvocationTargetException;
import java.text.ParseException;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import static org.sentilo.agent.historian.utils.OpenTSDBValueConverter.replaceIllegalCharacters;
public class ElasticsearchToOpenTSDB implements SpecDriven, Transform {
private final Object spec;
private final ObjectMapper mapper = new ObjectMapper();
public ElasticsearchToOpenTSDB() {
this.spec = "{}";
}
@Inject
public ElasticsearchToOpenTSDB( Object spec ) {
this.spec = spec;
}
public Object transform( final Object input ) {
try{
Hits hits = mapper.readValue(input.toString(), Hits.class);
List<EventMessage> newEventList = new ArrayList<EventMessage>();
List<OpenTSDBDataPoint> dataPoints = new ArrayList<OpenTSDBDataPoint>();
for(EventMessage event : hits.hits) {
if (OpenTSDBValueConverter.isComplexValue(event.getMessage())) {
addComplexValueToQueue(event,newEventList);
} else {
addSimpleValueToQueue(event, newEventList);
}
}
for(EventMessage event2 : newEventList) {
OpenTSDBDataPoint dp = unmarshal(event2);
dataPoints.add(dp);
}
return dataPoints;
}catch(Exception e) {
throw new TransformException(e.getMessage());
}
}
private void addComplexValueToQueue(final EventMessage event, List<EventMessage> eventList) throws IllegalAccessException, InvocationTargetException {
// Flatten JSON message into N measures
final String metricName = OpenTSDBValueConverter.createMetricName(event);
final Map<String, Object> unfoldValues = OpenTSDBValueConverter.extractMeasuresFromComplexType(metricName, event.getMessage());
for (final Map.Entry<String, Object> e : unfoldValues.entrySet()) {
final EventMessage newEvent = new EventMessage();
BeanUtils.copyProperties(newEvent, event);
newEvent.setTopic(e.getKey());
newEvent.setMessage(e.getValue().toString());
eventList.add(newEvent);
}
}
private void addSimpleValueToQueue(final EventMessage event, List<EventMessage> eventList) {
// The value should be long, float or boolean
try {
final Object numericValue = OpenTSDBValueConverter.getSimpleValue(event.getMessage());
final String metricName = OpenTSDBValueConverter.createMetricName(event);
event.setMessage(numericValue.toString());
event.setTopic(metricName);
eventList.add(event);
} catch (final ParseException e) {
// Probably String or some non-numeric value that we cannot store in OpenTSDB. Pass
return;
}
}
public static OpenTSDBDataPoint unmarshal(final EventMessage event) throws ParseException {
final OpenTSDBDataPoint dataPoint = new OpenTSDBDataPoint();
dataPoint.setMetric(event.getTopic());
dataPoint.setValue(OpenTSDBValueConverter.getSimpleValue(event.getMessage()));
if (event.getPublishedAt() != null) {
dataPoint.setTimestamp(event.getPublishedAt());
} else {
dataPoint.setTimestamp(event.getTime());
}
dataPoint.setTags(createTags(event));
return dataPoint;
}
private static Map<String, String> createTags(final EventMessage event) {
final Map<String, String> tags = new LinkedHashMap<String, String>();
putTag(tags, OpenTSDBDataPoint.Tags.type.name(), replaceIllegalCharacters(event.getType()));
putTag(tags, OpenTSDBDataPoint.Tags.sensor.name(), replaceIllegalCharacters(event.getSensor()));
putTag(tags, OpenTSDBDataPoint.Tags.provider.name(), replaceIllegalCharacters(event.getProvider()));
putTag(tags, OpenTSDBDataPoint.Tags.component.name(), replaceIllegalCharacters(event.getComponent()));
putTag(tags, OpenTSDBDataPoint.Tags.alertType.name(), replaceIllegalCharacters(event.getAlertType()));
putTag(tags, OpenTSDBDataPoint.Tags.sensorType.name(), replaceIllegalCharacters(event.getSensorType()));
putTag(tags, OpenTSDBDataPoint.Tags.publisher.name(), replaceIllegalCharacters(event.getPublisher()));
putTag(tags, OpenTSDBDataPoint.Tags.tenant.name(), replaceIllegalCharacters(event.getTenant()));
putTag(tags, OpenTSDBDataPoint.Tags.publisherTenant.name(), replaceIllegalCharacters(event.getPublisherTenant()));
return tags;
}
private static void putTag(final Map<String, String> tags, final String tagName, final String tagValue) {
if (StringUtils.hasText(tagValue)) {
tags.put(tagName, tagValue);
}
}
}
Update
As indicated in the comments, the issue is not resolved yet and has been filed as a bug report. The latest status can be seen here: https://issues.apache.org/jira/browse/NIFI-6213
I want to connect Java with a JIRA trial account. I tested this code:
public class JiraImpl
{
private static URI JIRA_URL = URI.create("https://sonoratest.atlassian.net");
private static final String JIRA_ADMIN_USERNAME = "sonoratestw@gmail.com";
private static final String JIRA_ADMIN_PASSWORD = "sonpass";
public static void main(String[] args) throws IOException, InterruptedException, ExecutionException
{
try
{
AsynchronousJiraRestClientFactory factory = new AsynchronousJiraRestClientFactory();
JiraRestClient restClient = factory.createWithBasicHttpAuthentication(JIRA_URL, JIRA_ADMIN_USERNAME, JIRA_ADMIN_PASSWORD);
Iterable<BasicProject> allProjects = restClient.getProjectClient().getAllProjects().claim();
}
catch (Exception e)
{
e.printStackTrace();
}
}
}
But when I run it, nothing happens. What is the proper way to get data from JIRA using the REST API?
Update. I also tried this:
private static URI JIRA_URL = URI.create("https://sonoratest.atlassian.net/rest/auth/1/session");
I get
java.util.concurrent.ExecutionException: RestClientException{statusCode=Optional.of(404), errorCollections=[ErrorCollection{status=404, errors={}, errorMessages=[]}]}
at com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:299)
at com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:286)
at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:116)
at com.google.common.util.concurrent.ForwardingFuture.get(ForwardingFuture.java:63)
at com.atlassian.jira.rest.client.internal.async.DelegatingPromise.get(DelegatingPromise.java:102)
at com.jira.impl.JiraImpl.main(JiraImpl.java:23)
Caused by: RestClientException{statusCode=Optional.of(404), errorCollections=[ErrorCollection{status=404, errors={}, errorMessages=[]}]}
Try getting an issue first, since that is so basic. Also pass only the base JIRA URL (for example https://sonoratest.atlassian.net) to the factory; the client appends the REST paths itself, which is why pointing it at /rest/auth/1/session returns a 404.
import java.net.URI;
import java.util.Optional;
import com.atlassian.jira.rest.client.api.JiraRestClient;
import com.atlassian.jira.rest.client.api.domain.Issue;
import com.atlassian.jira.rest.client.internal.async.AsynchronousJiraRestClientFactory;
import com.atlassian.util.concurrent.Promise;
public class JRC
{
public Issue getIssue(String issueKey) throws Exception
{
final URI jiraServerUri = new URI("https://jira-domain");
final JiraRestClient restClient = new AsynchronousJiraRestClientFactory().createWithBasicHttpAuthentication(jiraServerUri, "user@domain.com", "password");
Promise<Issue> issuePromise = restClient.getIssueClient().getIssue(issueKey);
return Optional.ofNullable(issuePromise.claim()).orElseThrow(() -> new Exception("No such issue"));
}
}
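Note that the snippet in the question fetches the projects but never prints them, so "nothing happens" is expected. A small method along these lines (reusing a basic-auth restClient built as above, plus an import of com.atlassian.jira.rest.client.api.domain.BasicProject) would make the result visible:
public void printAllProjects(JiraRestClient restClient) {
    // Iterate the visible projects and print key and name so something actually shows on stdout.
    Iterable<BasicProject> allProjects = restClient.getProjectClient().getAllProjects().claim();
    for (BasicProject project : allProjects) {
        System.out.println(project.getKey() + " - " + project.getName());
    }
}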
You can also take a look at this code to get a fully working sample:
https://github.com/somaiah/jrjc
Is there a way to start elasticsearch within a gradle build before running integration tests and afterwards stop elasticsearch?
My approach so far is the following, but this blocks the further execution of the gradle build.
task runES(type: JavaExec) {
main = 'org.elasticsearch.bootstrap.Elasticsearch'
classpath = sourceSets.main.runtimeClasspath
systemProperties = ["es.path.home":"$buildDir/elastichome",
"es.path.data":"$buildDir/elastichome/data"]
}
For my purposes I have decided to start Elasticsearch within my integration test, in Java code.
I tried ElasticsearchIntegrationTest, but that didn't work with Spring because it doesn't play well with SpringJUnit4ClassRunner.
I found it easier to start Elasticsearch in the before method.
My test class, testing some 'dummy' productive code (indexing a document):
import static org.hamcrest.CoreMatchers.notNullValue;
import static org.junit.Assert.assertThat;
import org.elasticsearch.action.index.IndexResponse;
import org.elasticsearch.client.Client;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.settings.ImmutableSettings.Builder;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.transport.InetSocketTransportAddress;
import org.elasticsearch.indices.IndexAlreadyExistsException;
import org.elasticsearch.node.Node;
import org.elasticsearch.node.NodeBuilder;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
public class MyIntegrationTest {
private Node node;
private Client client;
@Before
public void before() {
createElasticsearchClient();
createIndex();
}
@After
public void after() {
this.client.close();
this.node.close();
}
@Test
public void testSomething() throws Exception {
// do something with elasticsearch
final String json = "{\"mytype\":\"bla\"}";
final String type = "mytype";
final String id = index(json, type);
assertThat(id, notNullValue());
}
/**
* some productive code
*/
private String index(final String json, final String type) {
// create Client
final Settings settings = ImmutableSettings.settingsBuilder().put("cluster.name", "mycluster").build();
final TransportClient tc = new TransportClient(settings).addTransportAddress(new InetSocketTransportAddress(
"localhost", 9300));
// index a document
final IndexResponse response = tc.prepareIndex("myindex", type).setSource(json).execute().actionGet();
return response.getId();
}
private void createElasticsearchClient() {
final NodeBuilder nodeBuilder = NodeBuilder.nodeBuilder();
final Builder settingsBuilder = nodeBuilder.settings();
settingsBuilder.put("network.publish_host", "localhost");
settingsBuilder.put("network.bind_host", "localhost");
final Settings settings = settingsBuilder.build();
this.node = nodeBuilder.clusterName("mycluster").local(false).data(true).settings(settings).node();
this.client = this.node.client();
}
private void createIndex() {
try {
this.client.admin().indices().prepareCreate("myindex").execute().actionGet();
} catch (final IndexAlreadyExistsException e) {
// index already exists => we ignore this exception
}
}
}
It is also very important to use elasticsearch version 1.3.3 or higher. See Issue 5401.
I am running the YouTubeSample given on the Google Developers website. I have no errors in the code and my imports appear to be fine, but when I run the project I get the aforementioned error.
I have done some searching but, to be honest, I have been unable to work out what the problem is. I have already tried importing an external Guava jar, but it didn't help.
Any help is appreciated. Here is the full class:
package com.pengilleys.googlesamples;
import java.io.IOException;
import java.util.List;
import com.google.api.client.googleapis.GoogleHeaders;
import com.google.api.client.googleapis.json.JsonCParser;
import com.google.api.client.http.GenericUrl;
import com.google.api.client.http.HttpRequest;
import com.google.api.client.http.HttpRequestFactory;
import com.google.api.client.http.HttpRequestInitializer;
import com.google.api.client.http.HttpTransport;
import com.google.api.client.http.javanet.NetHttpTransport;
import com.google.api.client.json.JsonFactory;
import com.google.api.client.json.jackson.JacksonFactory;
import com.google.api.client.util.Key;
public class YouTubeSample {
public static class VideoFeed {
@Key List<Video> items;
}
public static class Video {
@Key String title;
@Key String description;
@Key Player player;
}
public static class Player {
@Key("default") String defaultUrl;
}
public static class YouTubeUrl extends GenericUrl {
@Key final String alt = "jsonc";
@Key String author;
@Key("max-results") Integer maxResults;
YouTubeUrl(String url) {
super(url);
}
}
public static void main(String[] args) throws IOException {
// set up the HTTP request factory
HttpTransport transport = new NetHttpTransport();
final JsonFactory jsonFactory = new JacksonFactory();
HttpRequestFactory factory = transport.createRequestFactory(new HttpRequestInitializer() {
@Override
public void initialize(HttpRequest request) {
// set the parser
JsonCParser parser = new JsonCParser();
parser.jsonFactory = jsonFactory;
request.addParser(parser);
// set up the Google headers
GoogleHeaders headers = new GoogleHeaders();
headers.setApplicationName("Google-YouTubeSample/1.0");
headers.gdataVersion = "2";
request.headers = headers;
}
});
// build the YouTube URL
YouTubeUrl url = new YouTubeUrl("https://gdata.youtube.com/feeds/api/videos");
url.author = "searchstories";
url.maxResults = 2;
// build the HTTP GET request
HttpRequest request = factory.buildGetRequest(url);
// execute the request and the parse video feed
VideoFeed feed = request.execute().parseAs(VideoFeed.class);
for (Video video : feed.items) {
System.out.println();
System.out.println("Video title: " + video.title);
System.out.println("Description: " + video.description);
System.out.println("Play URL: " + video.player.defaultUrl);
}
}
}
The setup documentation gives a list of dependencies:
Depending on the application you are building, you may also need these dependencies:
Apache HTTP Client version 4.0.3
Google Guava version r09
Jackson version 1.6.7
Google GSON version 1.6
In this case, it looks like it's Guava which is missing. I don't know what you mean about "exporting" Guava, but if you include the Guava r09 jar file in the classpath when you're running the code, it should be fine.
What is the extra ); for above the // build the YouTube URL comment, and did you mean to close main on that line?