Start elasticsearch within gradle build for integration tests - java

Is there a way to start Elasticsearch within a Gradle build before running integration tests, and stop it afterwards?
My approach so far is the following, but it blocks further execution of the Gradle build.
task runES(type: JavaExec) {
    main = 'org.elasticsearch.bootstrap.Elasticsearch'
    classpath = sourceSets.main.runtimeClasspath
    systemProperties = ["es.path.home": "$buildDir/elastichome",
                        "es.path.data": "$buildDir/elastichome/data"]
}
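One way around the blocking would be to fork the process yourself instead of using JavaExec. The following is a rough sketch, assuming an integration test task named integTest; a real build would also poll the HTTP port until the node is up before letting the tests start:
task startES {
    doLast {
        // spawn Elasticsearch in the background so the build is not blocked
        def command = ['java', '-cp', sourceSets.main.runtimeClasspath.asPath,
                       "-Des.path.home=$buildDir/elastichome",
                       "-Des.path.data=$buildDir/elastichome/data",
                       'org.elasticsearch.bootstrap.Elasticsearch']
        ext.esProcess = new ProcessBuilder(command*.toString()).start()
    }
}

task stopES {
    doLast {
        // kill the forked node once the tests are done
        startES.esProcess?.destroy()
    }
}

integTest.dependsOn startES
integTest.finalizedBy stopES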

For my purposes I have decided to start Elasticsearch within my integration test in Java code.
I tried ElasticsearchIntegrationTest, but it didn't work with Spring because it doesn't harmonize with SpringJUnit4ClassRunner.
I've found it easier to start Elasticsearch in the before method.
My test class, testing some 'dummy' productive code (indexing a document):
import static org.hamcrest.CoreMatchers.notNullValue;
import static org.junit.Assert.assertThat;

import org.elasticsearch.action.index.IndexResponse;
import org.elasticsearch.client.Client;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.settings.ImmutableSettings.Builder;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.transport.InetSocketTransportAddress;
import org.elasticsearch.indices.IndexAlreadyExistsException;
import org.elasticsearch.node.Node;
import org.elasticsearch.node.NodeBuilder;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

public class MyIntegrationTest {

    private Node node;
    private Client client;

    @Before
    public void before() {
        createElasticsearchClient();
        createIndex();
    }

    @After
    public void after() {
        this.client.close();
        this.node.close();
    }

    @Test
    public void testSomething() throws Exception {
        // do something with elasticsearch
        final String json = "{\"mytype\":\"bla\"}";
        final String type = "mytype";
        final String id = index(json, type);
        assertThat(id, notNullValue());
    }

    /**
     * some productive code
     */
    private String index(final String json, final String type) {
        // create client
        final Settings settings = ImmutableSettings.settingsBuilder().put("cluster.name", "mycluster").build();
        final TransportClient tc = new TransportClient(settings).addTransportAddress(new InetSocketTransportAddress(
                "localhost", 9300));
        // index a document
        final IndexResponse response = tc.prepareIndex("myindex", type).setSource(json).execute().actionGet();
        return response.getId();
    }

    private void createElasticsearchClient() {
        final NodeBuilder nodeBuilder = NodeBuilder.nodeBuilder();
        final Builder settingsBuilder = nodeBuilder.settings();
        settingsBuilder.put("network.publish_host", "localhost");
        settingsBuilder.put("network.bind_host", "localhost");
        final Settings settings = settingsBuilder.build();
        this.node = nodeBuilder.clusterName("mycluster").local(false).data(true).settings(settings).node();
        this.client = this.node.client();
    }

    private void createIndex() {
        try {
            this.client.admin().indices().prepareCreate("myindex").execute().actionGet();
        } catch (final IndexAlreadyExistsException e) {
            // index already exists => we ignore this exception
        }
    }
}
It is also very important to use Elasticsearch version 1.3.3 or higher; see Issue 5401.

Related

How to inject mongoclient to my POST service

I have a very simple Quarkus application which accepts input and inserts it into MongoDB using MongoClient.
Controller:
@ApplicationScoped
@Path("/endpoint")
public class A {

    @Inject
    B service;

    @POST
    @Produces(MediaType.APPLICATION_JSON)
    @Consumes(MediaType.APPLICATION_JSON)
    public Document add(List<? extends Document> list) {
        return service.add(list);
    }
}
Service Class:
@ApplicationScoped
public class B {

    @Inject
    MongoClient mongoClient;

    private MongoCollection<Document> getCollection() {
        return mongoClient.getDatabase(DBname).getCollection(coll);
    }

    public Document add(List<? extends Document> list) {
        Document response = new Document();
        getCollection().deleteMany(new BasicDBObject());
        getCollection().insertMany(list);
        response.append("count", list.size());
        return response;
    }
}
As you can see, my service removes the existing data and inserts the new data. For JUnit testing, I am trying to set up embedded MongoDB and want my service call to use the embedded Mongo, but with no success.
I have tried out many approaches discussed on the internet to set up the embedded Mongo, but none worked for me.
I want to invoke my POST service, but the actual MongoDB must not get connected. My JUnit class is as below:
@QuarkusTest
public class test {

    @Test
    public void testAdd() {
        List<Document> request = new ArrayList<Document>();
        Document doc = new Document();
        doc.append("Id", "007")
           .append("name", "Nitin");
        request.add(doc);

        given()
            .body(request)
            .header("Content-Type", MediaType.APPLICATION_JSON)
            .when()
            .post("/endpoint")
            .then()
            .statusCode(200);
    }
}
You need to use a different connection string for your test than for your regular (production) run.
Quarkus can use profiles to do this; the %test profile is automatically selected when running @QuarkusTest tests.
So you can add something like this in your application.properties:
quarkus.mongodb.connection-string=mongodb://host:port
%test.quarkus.mongodb.connection-string=mongodb://localhost:27017
Here mongodb://host:port will be used on the normal run of your application, and mongodb://localhost:27017 will be used from inside your tests.
Then you can use flapdoodle or Testcontainers to launch a MongoDB database on localhost during your test.
More information on configuration profiles: https://quarkus.io/guides/config#configuration-profiles
More information on how to start an external service from a Quarkus test: https://quarkus.io/guides/getting-started-testing#quarkus-test-resource
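For example, a test resource that boots flapdoodle before the tests and points Quarkus at it might look like the sketch below. The class name EmbeddedMongoTestResource is illustrative, and it assumes the flapdoodle artifacts shown in the next answer are on the test classpath:
import java.util.Collections;
import java.util.Map;

import de.flapdoodle.embed.mongo.MongodExecutable;
import de.flapdoodle.embed.mongo.MongodStarter;
import de.flapdoodle.embed.mongo.config.MongodConfigBuilder;
import de.flapdoodle.embed.mongo.config.Net;
import de.flapdoodle.embed.mongo.distribution.Version;
import de.flapdoodle.embed.process.runtime.Network;
import io.quarkus.test.common.QuarkusTestResourceLifecycleManager;

public class EmbeddedMongoTestResource implements QuarkusTestResourceLifecycleManager {

    private MongodExecutable mongodExe;

    @Override
    public Map<String, String> start() {
        try {
            // boot an embedded MongoDB on localhost:27017 before the tests run
            mongodExe = MongodStarter.getDefaultInstance().prepare(new MongodConfigBuilder()
                    .version(Version.Main.PRODUCTION)
                    .net(new Net("localhost", 27017, Network.localhostIsIPv6()))
                    .build());
            mongodExe.start();
        } catch (Exception e) {
            throw new RuntimeException("Could not start embedded MongoDB", e);
        }
        // override the connection string for the duration of the test
        return Collections.singletonMap("quarkus.mongodb.connection-string",
                "mongodb://localhost:27017");
    }

    @Override
    public void stop() {
        if (mongodExe != null) {
            mongodExe.stop();
        }
    }
}
The test class then registers it with @QuarkusTestResource(EmbeddedMongoTestResource.class).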
Have you tried flapdoodle?
package com.example.mongo;

import com.mongodb.BasicDBObject;
import com.mongodb.MongoClient;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.MongoDatabase;
import de.flapdoodle.embed.mongo.MongodExecutable;
import de.flapdoodle.embed.mongo.MongodProcess;
import de.flapdoodle.embed.mongo.MongodStarter;
import de.flapdoodle.embed.mongo.config.IMongodConfig;
import de.flapdoodle.embed.mongo.config.MongodConfigBuilder;
import de.flapdoodle.embed.mongo.config.Net;
import de.flapdoodle.embed.mongo.distribution.Version;
import de.flapdoodle.embed.process.runtime.Network;
import java.util.Date;
import org.junit.After;
import static org.junit.Assert.*;
import org.junit.Before;
import org.junit.Test;

public class EmbeddedMongoTest {

    private static final String DATABASE_NAME = "embedded";

    private MongodExecutable mongodExe;
    private MongodProcess mongod;
    private MongoClient mongo;

    @Before
    public void beforeEach() throws Exception {
        MongodStarter starter = MongodStarter.getDefaultInstance();
        String bindIp = "localhost";
        int port = 12345;
        IMongodConfig mongodConfig = new MongodConfigBuilder()
                .version(Version.Main.PRODUCTION)
                .net(new Net(bindIp, port, Network.localhostIsIPv6()))
                .build();
        this.mongodExe = starter.prepare(mongodConfig);
        this.mongod = mongodExe.start();
        this.mongo = new MongoClient(bindIp, port);
    }

    @After
    public void afterEach() throws Exception {
        if (this.mongod != null) {
            this.mongod.stop();
            this.mongodExe.stop();
        }
    }

    @Test
    public void shouldCreateNewObjectInEmbeddedMongoDb() {
        // given
        MongoDatabase db = mongo.getDatabase(DATABASE_NAME);
        db.createCollection("testCollection");
        MongoCollection<BasicDBObject> col = db.getCollection("testCollection", BasicDBObject.class);

        // when
        col.insertOne(new BasicDBObject("testDoc", new Date()));

        // then
        assertEquals(1L, col.countDocuments());
    }
}
Reference: Embedded MongoDB when running integration tests
Thanks everyone for the suggestions. I declared test collections in the application.properties file. The %test profile is automatically activated when the JUnit tests run, so my services automatically picked up the test collections. I deleted the test collections after my JUnit test cases completed.
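For reference, that approach looks roughly like this in application.properties. The property names are illustrative (only the %test prefix is defined by Quarkus), and the service would read them via @ConfigProperty:
collection.name=products
%test.collection.name=products-test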

Micronaut set up EmbeddedServer for Pact test

I have this SpringBoot and Pact test example from Writing Contract Tests with Pact in Spring Boot
@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.NONE,
        properties = "user-service.base-url:http://localhost:8080",
        classes = UserServiceClient.class)
public class UserServiceContractTest {

    @Rule
    public PactProviderRuleMk2 provider = new PactProviderRuleMk2("user-service", null,
            8080, this);

    @Autowired
    private UserServiceClient userServiceClient;

    @Pact(consumer = "messaging-app")
    public RequestResponsePact pactUserExists(PactDslWithProvider builder) {
        return builder.given("User 1 exists")
                .uponReceiving("A request to /users/1")
                .path("/users/1")
                .method("GET")
                .willRespondWith()
                .status(200)
                .body(LambdaDsl.newJsonBody((o) -> o
                        .stringType("name", "user name for CDC")
                ).build())
                .toPact();
    }

    @PactVerification(fragment = "pactUserExists")
    @Test
    public void userExists() {
        final User user = userServiceClient.getUser("1");
        assertThat(user.getName()).isEqualTo("user name for CDC");
    }
}
In order to generate the PACT file I need to start a mock Provider, which is set up as:
public PactProviderRuleMk2 provider = new PactProviderRuleMk2("user-service", null,
        8080, this);
The @SpringBootTest annotation provides a mock web environment running on http://localhost:8080:
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.NONE,
        properties = "user-service.base-url:http://localhost:8080",
        classes = UserServiceClient.class)
Is it possible to do something similar in Micronaut? Can I use an EmbeddedServer running on a specified port such as http://localhost:8080, so my Pact MockProvider can listen on that port?
I would like to specify the port in the test class, not in an application.yml file.
Any ideas?
You can use Micronaut and Pact with JUnit 5. Here is a simple example based on hello-world-java:
Add pact dependencies to build.gradle:
// pact
compile 'au.com.dius:pact-jvm-consumer-junit5_2.12:3.6.10'
compile 'au.com.dius:pact-jvm-provider-junit5_2.12:3.6.10'
// client for target example
compile 'io.micronaut:micronaut-http-client'
FooService.java:
import io.micronaut.http.client.RxHttpClient;
import io.micronaut.http.client.annotation.Client;
import javax.inject.Inject;
import javax.inject.Singleton;
import static io.micronaut.http.HttpRequest.GET;

@Singleton
public class FooService {

    @Inject
    @Client("http://localhost:8080")
    private RxHttpClient httpClient;

    public String getFoo() {
        return httpClient.retrieve(GET("/foo")).blockingFirst();
    }
}
FooServiceTest.java:
import au.com.dius.pact.consumer.Pact;
import au.com.dius.pact.consumer.dsl.PactDslWithProvider;
import au.com.dius.pact.consumer.junit5.PactConsumerTestExt;
import au.com.dius.pact.consumer.junit5.PactTestFor;
import au.com.dius.pact.model.RequestResponsePact;
import io.micronaut.test.annotation.MicronautTest;
import org.junit.jupiter.api.extension.ExtendWith;
import org.junit.jupiter.api.Test;
import javax.inject.Inject;
import static org.junit.jupiter.api.Assertions.assertEquals;

@MicronautTest
@ExtendWith(PactConsumerTestExt.class)
@PactTestFor(providerName = "foo", hostInterface = "localhost", port = "8080")
public class FooServiceTest {

    @Inject
    FooService fooService;

    @Pact(provider = "foo", consumer = "foo")
    public RequestResponsePact pact(PactDslWithProvider builder) {
        return builder
                .given("test foo")
                .uponReceiving("test foo")
                .path("/foo")
                .method("GET")
                .willRespondWith()
                .status(200)
                .body("{\"foo\":\"bar\"}")
                .toPact();
    }

    @Test
    public void testFoo() {
        assertEquals("{\"foo\":\"bar\"}", fooService.getFoo());
    }
}
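Note that the hostInterface and port in @PactTestFor match the URL hardcoded in FooService's @Client annotation, so the Pact mock provider listens exactly where the client sends its requests. This is the Micronaut equivalent of pinning the port in the test class rather than in application.yml.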

Use jira-rest-java-client java library in Java/gradle/groovy?

I have a Gradle script which creates the version in JIRA using the REST API.
But there is also the jira-rest-java-client library available. I want to use this Java library and do the same thing from Gradle. Can someone provide an example of how I could try this?
How do I use the jira-rest-java-client library to make a connection with JIRA, by example?
In Java I am trying to use the JRJC library, but I am getting the error below in IntelliJ.
import com.atlassian.jira.rest.client.api.JiraRestClient;
import com.atlassian.jira.rest.client.api.domain.*;
import com.atlassian.jira.rest.client.api.domain.input.ComplexIssueInputFieldValue;
import com.atlassian.jira.rest.client.api.domain.input.FieldInput;
import com.atlassian.jira.rest.client.api.domain.input.TransitionInput;
import com.atlassian.jira.rest.client.internal.ServerVersionConstants;
import com.atlassian.jira.rest.client.internal.async.AsynchronousJiraRestClientFactory;
import com.google.common.collect.Lists;
import org.codehaus.jettison.json.JSONException;

import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.Arrays;
import java.util.Collection;
import java.util.List;

/**
 * A sample of how to use the JRJC library
 *
 * @since v0.1
 */
public class Example1 {

    private static URI jiraServerUri = URI.create("http://localhost:2990/jira");
    private static boolean quiet = false;

    public static void main(String[] args) throws URISyntaxException, JSONException, IOException {
        parseArgs(args);
        final AsynchronousJiraRestClientFactory factory = new AsynchronousJiraRestClientFactory();
        final JiraRestClient restClient = factory.createWithBasicHttpAuthentication(jiraServerUri, "admin", "admin");
        try {
            final int buildNumber = restClient.getMetadataClient().getServerInfo().claim().getBuildNumber();

            // first let's get and print all visible projects (only jira4.3+)
            if (buildNumber >= ServerVersionConstants.BN_JIRA_4_3) {
                final Iterable<BasicProject> allProjects = restClient.getProjectClient().getAllProjects().claim();
                for (final BasicProject project : allProjects) {
                    if ("TEST".equals(project.getKey())) {
                        println(project);
                    } else {
                        System.out.println("Project not found");
                    }
                }
            }

            // let's now print all issues matching a JQL string (here: all assigned issues)
            if (buildNumber >= ServerVersionConstants.BN_JIRA_4_3) {
                final SearchResult searchResult = restClient.getSearchClient().searchJql("assignee is not EMPTY").claim();
                for (BasicIssue issue : searchResult.getIssues()) {
                    println(issue.getKey());
                }
            }

            final Issue issue = restClient.getIssueClient().getIssue("TST-7").claim();
            println(issue);

            // now let's vote for it
            restClient.getIssueClient().vote(issue.getVotesUri()).claim();

            // now let's watch it
            final BasicWatchers watchers = issue.getWatchers();
            if (watchers != null) {
                restClient.getIssueClient().watch(watchers.getSelf()).claim();
            }

            // now let's start progress on this issue
            final Iterable<Transition> transitions = restClient.getIssueClient().getTransitions(issue.getTransitionsUri()).claim();
            final Transition startProgressTransition = getTransitionByName(transitions, "Start Progress");
            restClient.getIssueClient().transition(issue.getTransitionsUri(), new TransitionInput(startProgressTransition.getId()))
                    .claim();

            // and now let's resolve it as Incomplete
            final Transition resolveIssueTransition = getTransitionByName(transitions, "Resolve Issue");
            final Collection<FieldInput> fieldInputs;

            // Starting from JIRA 5, fields are handled in a different way
            if (buildNumber > ServerVersionConstants.BN_JIRA_5) {
                fieldInputs = Arrays.asList(new FieldInput("resolution", ComplexIssueInputFieldValue.with("name", "Incomplete")));
            } else {
                fieldInputs = Arrays.asList(new FieldInput("resolution", "Incomplete"));
            }
            final TransitionInput transitionInput = new TransitionInput(resolveIssueTransition.getId(), fieldInputs, Comment
                    .valueOf("My comment"));
            restClient.getIssueClient().transition(issue.getTransitionsUri(), transitionInput).claim();
        } finally {
            restClient.close();
        }
    }

    private static void println(Object o) {
        if (!quiet) {
            System.out.println(o);
        }
    }

    private static void parseArgs(String[] argsArray) throws URISyntaxException {
        final List<String> args = Lists.newArrayList(argsArray);
        if (args.contains("-q")) {
            quiet = true;
            args.remove(args.indexOf("-q"));
        }
        if (!args.isEmpty()) {
            jiraServerUri = new URI(args.get(0));
        }
    }

    private static Transition getTransitionByName(Iterable<Transition> transitions, String transitionName) {
        for (Transition transition : transitions) {
            if (transition.getName().equals(transitionName)) {
                return transition;
            }
        }
        return null;
    }
}
Error:
Exception in thread "main" java.lang.NoClassDefFoundError: com/atlassian/sal/api/executor/ThreadLocalContextManager
at com.atlassian.jira.rest.client.internal.async.AsynchronousJiraRestClientFactory.create(AsynchronousJiraRestClientFactory.java:35)
at com.atlassian.jira.rest.client.internal.async.AsynchronousJiraRestClientFactory.createWithBasicHttpAuthentication(AsynchronousJiraRestClientFactory.java:42)
at Example1.main(Example1.java:34)
Caused by: java.lang.ClassNotFoundException: com.atlassian.sal.api.executor.ThreadLocalContextManager
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 3 more
Moreover, I added the JRJC api and core jars to External Libraries, but I am still getting this error.
Could someone tell me what the issue is or what I am missing?
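Adding only the api and core jars by hand leaves out their transitive dependencies; ThreadLocalContextManager lives in Atlassian's SAL API, which is one of them. Declaring the client in your build tool pulls the transitives in automatically, e.g. in Gradle: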
compile 'com.atlassian.jira:jira-rest-java-client-core:4.0.0'
compile 'com.atlassian.jira:jira-rest-java-client-api:4.0.0'
Simple connection to JIRA:
JiraRestClient restClient = new AsynchronousJiraRestClientFactory()
        .createWithBasicHttpAuthentication(new URI("https://" + jira_domain), jira_username, jira_password);
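A fuller, self-contained sketch of that connection, mirroring the calls used above; the URI, credentials, and issue key are placeholders:
import com.atlassian.jira.rest.client.api.JiraRestClient;
import com.atlassian.jira.rest.client.api.domain.Issue;
import com.atlassian.jira.rest.client.internal.async.AsynchronousJiraRestClientFactory;
import java.net.URI;

public class JiraConnectExample {
    public static void main(String[] args) throws Exception {
        // placeholders: replace with your JIRA URL and credentials
        final URI uri = URI.create("https://jira.example.com");
        final JiraRestClient restClient = new AsynchronousJiraRestClientFactory()
                .createWithBasicHttpAuthentication(uri, "username", "password");
        try {
            // fetch a single issue and print its summary; "TST-1" is illustrative
            final Issue issue = restClient.getIssueClient().getIssue("TST-1").claim();
            System.out.println(issue.getSummary());
        } finally {
            restClient.close();
        }
    }
}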

Auto generate data schema from JPA 2.1 annotated entity classes without a database connection

Two years ago I was working on a project using:
spring 4.0.3.RELEASE
jpa 2.0
hibernate 4.2.7.Final
java 1.6.X
This project had a Maven plugin, hibernate3-maven-plugin, which allowed us to generate a database schema for MySQL without any connection to a database.
Now we are upgrading this project with:
java 1.8
jpa 2.1
spring 4.2.4.RELEASE
hibernate 5.0.6.Final
I understand that hibernate3-maven-plugin does not work on JPA 2.1 and hibernate > 4.3.
All the solutions I have found need a connection to a database.
For instance: Auto generate data schema from JPA annotated entity classes.
Does anyone know how to generate a database schema offline?
All I have is a persistence.xml with all the Entity classes listed.
I followed your idea of using H2 with the MySQL dialect, but went through JPA's Persistence.generateSchema(...).
It does work, except that the statements are not separated by a semicolon...
How can this be done using JPA 2.1?
Otherwise I will switch to your solution.
import java.util.Properties;
import javax.persistence.Persistence;
import javax.persistence.PersistenceException;
import org.hibernate.jpa.AvailableSettings;

/**
 * Generate DDL with hibernate 4+/5:
 * http://stackoverflow.com/questions/27314165/generate-ddl-script-at-maven-build-with-hibernate4-jpa-2-1/27314166#27314166
 * @author dmary
 */
public class Jpa21SchemaExport {

    public Jpa21SchemaExport() {
    }

    /**
     * @param args persistence unit name and destination file
     */
    public static void main(String[] args) {
        execute(args[0], args[1]);
        System.exit(0);
    }

    public static void execute(String persistenceUnitName, String destination) {
        System.out.println("Generating DDL create script to: " + destination);
        final Properties persistenceProperties = new Properties();

        // force persistence properties: remove database target
        persistenceProperties.setProperty(org.hibernate.cfg.AvailableSettings.HBM2DDL_AUTO, "");
        persistenceProperties.setProperty(AvailableSettings.SCHEMA_GEN_DATABASE_ACTION, "none");

        // force persistence properties: generate the create script from metadata to destination
        // persistenceProperties.setProperty(AvailableSettings.SCHEMA_GEN_CREATE_SCHEMAS, "true");
        persistenceProperties.setProperty(AvailableSettings.SCHEMA_GEN_SCRIPTS_ACTION, "create");
        persistenceProperties.setProperty(AvailableSettings.SCHEMA_GEN_CREATE_SOURCE, "metadata");
        persistenceProperties.setProperty(AvailableSettings.SCHEMA_GEN_SCRIPTS_CREATE_TARGET, destination);

        persistenceProperties.setProperty(AvailableSettings.JDBC_DRIVER, "org.h2.Driver");
        persistenceProperties.setProperty(AvailableSettings.JDBC_URL, "jdbc:h2:mem:export");
        persistenceProperties.setProperty(AvailableSettings.JDBC_USER, "sa");
        persistenceProperties.setProperty(AvailableSettings.JDBC_PASSWORD, "");
        persistenceProperties.setProperty(org.hibernate.cfg.AvailableSettings.DIALECT, "com.wiztivi.sdk.persistence.MySQL5InnoDBUTF8Dialect");

        try {
            Persistence.generateSchema(persistenceUnitName, persistenceProperties);
        } catch (PersistenceException pe) {
            System.err.println("DDL generation failed: ");
            pe.printStackTrace(System.err);
        }
    }
}
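Until that is fixed, one workaround for the missing delimiter is to post-process the generated script. This is a minimal sketch, assuming Hibernate writes one statement per line to the script target (the file name is passed as the first argument):
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Collectors;

public class AddDelimiters {
    public static void main(String[] args) throws IOException {
        Path script = Paths.get(args[0]);
        // append ';' to every non-empty line that does not already end with one
        List<String> delimited = Files.readAllLines(script, StandardCharsets.UTF_8).stream()
                .map(line -> line.isEmpty() || line.endsWith(";") ? line : line + ";")
                .collect(Collectors.toList());
        Files.write(script, delimited, StandardCharsets.UTF_8);
    }
}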
As in the other question, you can use hbm2ddl and an embedded database to provide a connection.
For example, using the H2 database (requires h2, scannotation, hibernate and commons-io):
package com.stackoverflow;

import java.io.File;
import java.io.FileWriter;
import java.io.InputStream;
import java.net.URL;
import java.util.Set;
import javax.persistence.Entity;
import org.apache.commons.io.IOUtils;
import org.hibernate.cfg.Configuration;
import org.hibernate.cfg.Environment;
import org.hibernate.connection.DriverManagerConnectionProvider;
import org.hibernate.dialect.PostgreSQLDialect;
import org.hibernate.tool.hbm2ddl.SchemaExport;
import org.scannotation.AnnotationDB;

public class ExportShema {

    private static final String OUTPUT_SQL_FILE = "target/database.sql";
    private static final String INIT_FILE = "init.sql";
    private static final String DB_URL = "jdbc:h2:mem:test;DB_CLOSE_DELAY=-1";
    private static final String DB_USERNAME = "sa";
    private static final String DB_PASSWORD = "";
    private static final File HBM_DIRECTORY = new File("src/main/resources/com/stackoverflow/domain/");

    public static void main(final String[] args) throws Exception {
        final Configuration cfg = new Configuration();
        cfg.setProperty(Environment.CONNECTION_PROVIDER, DriverManagerConnectionProvider.class.getName());
        // for a postgres schema
        cfg.setProperty(Environment.DIALECT, PostgreSQLDialect.class.getName());
        cfg.setProperty(Environment.URL, DB_URL);
        cfg.setProperty(Environment.USER, DB_USERNAME);
        cfg.setProperty(Environment.PASS, DB_PASSWORD);

        // if you have HBM + annotated classes
        cfg.addDirectory(HBM_DIRECTORY);
        final AnnotationDB db = new AnnotationDB();
        db.scanArchives(new URL("file:target/classes/"));
        final Set<String> clazzNames = db.getAnnotationIndex().get(Entity.class.getName());
        for (final String clazzName : clazzNames) {
            cfg.addAnnotatedClass(Class.forName(clazzName));
        }

        final SchemaExport exporter = new SchemaExport(cfg);
        exporter.setOutputFile(OUTPUT_SQL_FILE);
        exporter.setDelimiter(";");
        exporter.setFormat(true);
        exporter.create(false, true);

        try (final InputStream init_file = ExportShema.class.getResourceAsStream(INIT_FILE)) {
            if (init_file != null) {
                final File output = new File(OUTPUT_SQL_FILE);
                try (final FileWriter fw = new FileWriter(output, true)) {
                    final String eol = System.getProperty("line.separator");
                    fw.append(eol + eol);
                    fw.append(IOUtils.toString(init_file));
                }
            }
        }
    }
}
You can do this in a unit test or create an annotation processor.
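As a sketch of the unit-test route, assuming the ExportShema class above is on the test classpath:
import org.junit.Test;

public class ExportSchemaTest {

    @Test
    public void generateSchema() throws Exception {
        // runs the export as part of the test phase; writes target/database.sql
        ExportShema.main(new String[0]);
    }
}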
I was able to mix your Hibernate solution with JPA 2.1:
I am now able to add the entity classes from the persistence.xml.
This way I can generate the SQL file outside the jar where the entities are located.
This is a temporary solution until Hibernate fixes this bug.
Thanks for your help.
package com.stackoverflow.common.util.schema;

import java.io.IOException;
import java.util.Properties;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;
import javax.persistence.metamodel.ManagedType;
import javax.persistence.metamodel.Metamodel;
import org.hibernate.boot.MetadataBuilder;
import org.hibernate.boot.MetadataSources;
import org.hibernate.boot.registry.BootstrapServiceRegistry;
import org.hibernate.boot.registry.BootstrapServiceRegistryBuilder;
import org.hibernate.boot.registry.StandardServiceRegistry;
import org.hibernate.boot.registry.StandardServiceRegistryBuilder;
import org.hibernate.boot.spi.MetadataImplementor;
import org.hibernate.cfg.Environment;
import org.hibernate.engine.jdbc.connections.internal.DriverManagerConnectionProviderImpl;
import org.hibernate.jpa.AvailableSettings;
import org.hibernate.tool.hbm2ddl.SchemaExport;
import org.hibernate.dialect.MySQL5InnoDBDialect;

public class JPA21Hibernate5ExportSchema {

    private static final String JDBC_DRIVER = "org.h2.Driver";
    private static final String JDBC_URL = "jdbc:h2:mem:export;DB_CLOSE_DELAY=-1";
    private static final String JDBC_USERNAME = "sa";
    private static final String JDBC_PASSWORD = "";

    public JPA21Hibernate5ExportSchema() {
    }

    public static void main(String[] args) {
        try {
            JPA21Hibernate5ExportSchema hes = new JPA21Hibernate5ExportSchema();
            hes.export(args[0], args[1]);
            System.exit(0);
        } catch (Exception e) {
            e.printStackTrace();
            System.exit(1);
        }
    }

    public void export(String persistenceUnitName, String sqlFile) throws IOException, ClassNotFoundException {
        final BootstrapServiceRegistry bsr = new BootstrapServiceRegistryBuilder().build();
        final MetadataSources metadataSources = new MetadataSources(bsr);
        final StandardServiceRegistryBuilder srrBuilder = new StandardServiceRegistryBuilder(bsr)
                .applySetting(Environment.CONNECTION_PROVIDER, DriverManagerConnectionProviderImpl.class.getName())
                .applySetting(Environment.DIALECT, MySQL5InnoDBDialect.class.getName())
                .applySetting(Environment.URL, JDBC_URL)
                .applySetting(Environment.USER, JDBC_USERNAME)
                .applySetting(Environment.PASS, JDBC_PASSWORD);

        // Use the persistence metamodel to retrieve the entity classes
        Metamodel metamodel = this.getMetamodel(persistenceUnitName);
        for (final ManagedType<?> managedType : metamodel.getManagedTypes()) {
            metadataSources.addAnnotatedClass(managedType.getJavaType());
        }

        final StandardServiceRegistry ssr = (StandardServiceRegistry) srrBuilder.build();
        final MetadataBuilder metadataBuilder = metadataSources.getMetadataBuilder(ssr);
        final SchemaExport exporter = new SchemaExport((MetadataImplementor) metadataBuilder.build());
        exporter.setOutputFile(sqlFile);
        exporter.setDelimiter(";");
        exporter.setFormat(true);
        exporter.create(false, true);
    }

    /**
     * Retrieve the JPA metamodel from the persistence unit name
     *
     * @param persistenceUnitName the persistence unit to load
     * @return the JPA metamodel
     */
    private Metamodel getMetamodel(String persistenceUnitName) {
        final Properties persistenceProperties = new Properties();
        persistenceProperties.setProperty(AvailableSettings.JDBC_DRIVER, JDBC_DRIVER);
        persistenceProperties.setProperty(AvailableSettings.JDBC_URL, JDBC_URL);
        persistenceProperties.setProperty(AvailableSettings.JDBC_USER, JDBC_USERNAME);
        persistenceProperties.setProperty(AvailableSettings.JDBC_PASSWORD, JDBC_PASSWORD);
        persistenceProperties.setProperty(org.hibernate.cfg.AvailableSettings.DIALECT,
                MySQL5InnoDBDialect.class.getName());
        final EntityManagerFactory emf = Persistence.createEntityManagerFactory(persistenceUnitName,
                persistenceProperties);
        return emf.getMetamodel();
    }
}
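Under the same assumptions as the code above, it can be run with java -cp <classpath> JPA21Hibernate5ExportSchema <persistence-unit-name> <output.sql>, matching the args[0]/args[1] handling in main.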

Google+ API Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions

I am running the YouTubeSample given on the Google Developers website. I have no errors in the code and my imports appear to be fine, but when I run the project I get the aforementioned error.
I have done some searches but, to be honest, I have been unable to work out what the problem is. I have already tried importing an external Guava jar, but it didn't help.
Any help is appreciated. Here is the full class:
package com.pengilleys.googlesamples;

import java.io.IOException;
import java.util.List;

import com.google.api.client.googleapis.GoogleHeaders;
import com.google.api.client.googleapis.json.JsonCParser;
import com.google.api.client.http.GenericUrl;
import com.google.api.client.http.HttpRequest;
import com.google.api.client.http.HttpRequestFactory;
import com.google.api.client.http.HttpRequestInitializer;
import com.google.api.client.http.HttpTransport;
import com.google.api.client.http.javanet.NetHttpTransport;
import com.google.api.client.json.JsonFactory;
import com.google.api.client.json.jackson.JacksonFactory;
import com.google.api.client.util.Key;

public class YouTubeSample {

    public static class VideoFeed {
        @Key List<Video> items;
    }

    public static class Video {
        @Key String title;
        @Key String description;
        @Key Player player;
    }

    public static class Player {
        @Key("default") String defaultUrl;
    }

    public static class YouTubeUrl extends GenericUrl {
        @Key final String alt = "jsonc";
        @Key String author;
        @Key("max-results") Integer maxResults;

        YouTubeUrl(String url) {
            super(url);
        }
    }

    public static void main(String[] args) throws IOException {
        // set up the HTTP request factory
        HttpTransport transport = new NetHttpTransport();
        final JsonFactory jsonFactory = new JacksonFactory();
        HttpRequestFactory factory = transport.createRequestFactory(new HttpRequestInitializer() {
            @Override
            public void initialize(HttpRequest request) {
                // set the parser
                JsonCParser parser = new JsonCParser();
                parser.jsonFactory = jsonFactory;
                request.addParser(parser);
                // set up the Google headers
                GoogleHeaders headers = new GoogleHeaders();
                headers.setApplicationName("Google-YouTubeSample/1.0");
                headers.gdataVersion = "2";
                request.headers = headers;
            }
        });

        // build the YouTube URL
        YouTubeUrl url = new YouTubeUrl("https://gdata.youtube.com/feeds/api/videos");
        url.author = "searchstories";
        url.maxResults = 2;

        // build the HTTP GET request
        HttpRequest request = factory.buildGetRequest(url);

        // execute the request and parse the video feed
        VideoFeed feed = request.execute().parseAs(VideoFeed.class);
        for (Video video : feed.items) {
            System.out.println();
            System.out.println("Video title: " + video.title);
            System.out.println("Description: " + video.description);
            System.out.println("Play URL: " + video.player.defaultUrl);
        }
    }
}
The setup documentation gives a list of dependencies:
Depending on the application you are building, you may also need these dependencies:
Apache HTTP Client version 4.0.3
Google Guava version r09
Jackson version 1.6.7
Google GSON version 1.6
In this case, it looks like it's Guava which is missing. I don't know what you mean about "exporting" Guava, but if you include the Guava r09 jar file in the classpath when you're running the code, it should be fine.
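If the project uses a build tool, declaring the dependency is easier than managing jars by hand. For instance, a Gradle sketch (Guava r09 is the version named in the setup documentation above):
dependencies {
    compile 'com.google.guava:guava:r09'
}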
What is the extra ); for above the // build the YouTube URL line, and did you mean to close main on that line?
