In my Spring Boot 1.5 application I use ClassPathResource to read a static file packaged inside the application JAR:
// ...
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Collections;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

import org.springframework.core.io.ClassPathResource;
import org.springframework.core.io.Resource;

@Slf4j
@Service
public class MyService {

    private Resource resource = new ClassPathResource("a.txt");
    private List<String> myStrings;

    public MyService() {
        myStrings = load(resource);
    }

    private List<String> load(Resource resource) {
        try (Stream<String> stream = Files.lines(Paths.get(resource.getURI()))) {
            return stream.filter(/* my filter */)
                    .collect(Collectors.toList());
        } catch (IOException x) {
            log.error("Failed to read '{}'.", resource.getFilename());
            return Collections.emptyList();
        }
    }
}
but this fails with:
Caused by: java.nio.file.FileSystemNotFoundException: null
at com.sun.nio.zipfs.ZipFileSystemProvider.getFileSystem(ZipFileSystemProvider.java:171) ~[zipfs.jar:1.8.0_121]
at com.sun.nio.zipfs.ZipFileSystemProvider.getPath(ZipFileSystemProvider.java:157) ~[zipfs.jar:1.8.0_121]
at java.nio.file.Paths.get(Paths.java:143) ~[na:1.8.0_121]
at MyService.load(MyService.java:53) ~[classes!/:2.0.0-SNAPSHOT]
//...
How can I read a ClassPathResource embedded in my application JAR?
The JDK's Paths.get cannot resolve a resource inside a JAR file (the jar: filesystem is not mounted automatically, hence the FileSystemNotFoundException), so replace:
Files.lines(Paths.get(resource.getURI()))
with:
new BufferedReader(new InputStreamReader(resource.getInputStream())).lines();
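Resource.getInputStream() works whether the resource lives on the filesystem or inside the JAR. A minimal sketch of the reworked method (untested; the UTF-8 charset and the empty-list fallback are my assumptions, and the filter lambda is a stand-in for the original filter):
private List<String> load(Resource resource) {
    // getInputStream() avoids Paths.get(resource.getURI()), which throws
    // FileSystemNotFoundException for entries inside a JAR.
    try (BufferedReader reader = new BufferedReader(
            new InputStreamReader(resource.getInputStream(), StandardCharsets.UTF_8))) {
        return reader.lines()
                .filter(line -> !line.isEmpty()) // stand-in for the original filter
                .collect(Collectors.toList());
    } catch (IOException x) {
        log.error("Failed to read '{}'.", resource.getFilename(), x);
        return Collections.emptyList();
    }
}
The try-with-resources block also closes the underlying JAR stream, which the bare one-liner above leaves open.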
We have an integration test setup for testing the behavior of missing but required configuration properties. One of these properties is a directory where failed uploads are written for later retries. The expected behavior for this property is that the application fails to start at all when certain constraints are violated.
The properties are managed by Spring via ConfigurationProperties classes; among these we have a simple S3MessageUploadSettings class:
@Getter
@Setter
@ConfigurationProperties(prefix = "s3")
@Validated
public class S3MessageUploadSettings {

    @NotNull
    private String bucketName;

    @NotNull
    private String uploadErrorPath;

    ...
}
In the corresponding Spring configuration we now perform validation checks, such as whether the path exists, is a directory, and is writable, and we throw RuntimeExceptions when these assertions aren't met:
@Slf4j
@Import({ S3Config.class })
@Configuration
@EnableConfigurationProperties(S3MessageUploadSettings.class)
public class S3MessageUploadSpringConfig {

    @Resource
    private S3MessageUploadSettings settings;

    ...

    @PostConstruct
    public void checkConstraints() {
        String sPath = settings.getUploadErrorPath();
        Path path = Paths.get(sPath);
        ...
        log.debug("Probing path '{}' for existence", path);
        if (!Files.exists(path)) {
            throw new RuntimeException("Required error upload directory '" + path + "' does not exist");
        }
        log.debug("Probing path '{}' for being a directory", path);
        if (!Files.isDirectory(path)) {
            throw new RuntimeException("Upload directory '" + path + "' is not a directory");
        }
        log.debug("Probing path '{}' for write permissions", path);
        if (!Files.isWritable(path)) {
            throw new RuntimeException("Error upload path '" + path + "' is not writable");
        }
    }
}
Our test setup now looks like this:
public class StartupTest {

    @ClassRule
    public static TemporaryFolder testFolder = new TemporaryFolder();

    private static File BASE_FOLDER;
    private static File ACCESSIBLE;
    private static File WRITE_PROTECTED;
    private static File NON_DIRECTORY;

    @BeforeClass
    public static void initFolderSetup() throws IOException {
        BASE_FOLDER = testFolder.getRoot();
        ACCESSIBLE = testFolder.newFolder("accessible");
        WRITE_PROTECTED = testFolder.newFolder("writeProtected");
        if (!WRITE_PROTECTED.setReadOnly()) {
            fail("Could not change directory permissions to readonly");
        }
        if (!WRITE_PROTECTED.setWritable(false)) {
            fail("Could not change directory permissions to writable(false)");
        }
        NON_DIRECTORY = testFolder.newFile("nonDirectory");
    }
    @Configuration
    @Import({
        S3MessageUploadSpringConfig.class,
        S3MockConfig.class,
        ...
    })
    static class BaseContextConfig {
        // common bean definitions
        ...
    }

    @Configuration
    @Import(BaseContextConfig.class)
    @PropertySource("classpath:ci.properties")
    static class NotExistingPathContextConfig {

        @Resource
        private S3MessageUploadSettings settings;

        @PostConstruct
        public void updateSettings() {
            settings.setUploadErrorPath(BASE_FOLDER.getPath() + "/foo/bar");
        }
    }

    @Configuration
    @Import(BaseContextConfig.class)
    @PropertySource("classpath:ci.properties")
    static class NotWritablePathContextConfig {

        @Resource
        private S3MessageUploadSettings settings;

        @PostConstruct
        public void updateSettings() {
            settings.setUploadErrorPath(WRITE_PROTECTED.getPath());
        }
    }

    ...

    @Configuration
    @Import(BaseContextConfig.class)
    @PropertySource("classpath:ci.properties")
    static class StartableContextConfig {

        @Resource
        private S3MessageUploadSettings settings;

        @PostConstruct
        public void updateSettings() {
            settings.setUploadErrorPath(ACCESSIBLE.getPath());
        }
    }
    @Test
    public void shouldFailStartupDueToNonExistingErrorPathDirectory() {
        ApplicationContext context = null;
        try {
            context = new AnnotationConfigApplicationContext(StartupTest.NotExistingPathContextConfig.class);
            fail("Should not have started the context");
        } catch (Exception e) {
            e.printStackTrace();
            assertThat(e, instanceOf(BeanCreationException.class));
            assertThat(e.getMessage(), containsString("Required error upload directory '" + BASE_FOLDER + "/foo/bar' does not exist"));
        } finally {
            closeContext(context);
        }
    }

    @Test
    public void shouldFailStartupDueToNonWritablePathDirectory() {
        ApplicationContext context = null;
        try {
            context = new AnnotationConfigApplicationContext(StartupTest.NotWritablePathContextConfig.class);
            fail("Should not have started the context");
        } catch (Exception e) {
            assertThat(e, instanceOf(BeanCreationException.class));
            assertThat(e.getMessage(), containsString("Error upload path '" + WRITE_PROTECTED + "' is not writable"));
        } finally {
            closeContext(context);
        }
    }
    ...

    @Test
    public void shouldStartUpSuccessfully() {
        ApplicationContext context = null;
        try {
            context = new AnnotationConfigApplicationContext(StartableContextConfig.class);
        } catch (Exception e) {
            e.printStackTrace();
            fail("Should not have thrown an exception of type " + e.getClass().getSimpleName() + " with message " + e.getMessage());
        } finally {
            closeContext(context);
        }
    }

    private void closeContext(ApplicationContext context) {
        if (context != null) {
            // check and close any running S3 mock, as it may have a negative impact on the startup of a further context
            closeS3Mock(context);
            // stop a running Spring context manually, as it might interfere with the starting context of another test
            ((ConfigurableApplicationContext) context).stop();
        }
    }

    private void closeS3Mock(ApplicationContext context) {
        S3Mock s3Mock = null;
        try {
            if (context != null) {
                s3Mock = context.getBean("s3Mock", S3Mock.class);
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            if (null != s3Mock) {
                s3Mock.stop();
            }
        }
    }
}
When run locally, everything looks fine and all tests pass. However, our CI runs these tests inside a Docker container, and there changing file permissions appears to be a no-op: the method invocations return true, but the actual file permissions are left unchanged.
Neither File.setReadOnly(), File.setWritable(false), nor Files.setPosixFilePermissions(Path, Set<PosixFilePermission>) seems to have any effect on the actual file permissions in the Docker container.
I've also tried pointing the tests at real write-protected directories such as /root or /dev/pts, but since the CI runs the tests as root, these directories are writable by the application and the test fails again.
I also considered using an in-memory file system (such as JimFS), but I'm not sure how to make the test use the custom filesystem. As far as I know, JimFS does not support the constructor needed to declare it as the default filesystem.
Which other possibilities exist from within Java to change a directory's permissions to read-only/write-protected when running inside a Docker container, or to test successfully for such a directory?
I assume this is due to the permissions and policies of the JVM, and you cannot do anything from your code if the OS has blocked some permissions for your JVM.
You can try to edit the java.policy file and set appropriate file permissions.
For example, you might grant write privileges only to specific files:
grant {
    permission java.io.FilePermission "/dev/pts/*", "read,write,delete";
};
More examples in docs: https://docs.oracle.com/javase/8/docs/technotes/guides/security/spec/security-spec.doc3.html.
I'm trying to upload a file to Amazon S3. Instead of uploading a local file, I want to read the data from a database using Spring Batch and write it directly into S3 storage. Is there any way we can do that?
Spring Cloud AWS adds support for the Amazon S3 service to load and write resources with the resource loader and the s3 protocol. Once you have configured the AWS resource loader, you can write a custom Spring Batch writer like:
import java.io.OutputStream;
import java.util.List;

import org.springframework.batch.item.ItemWriter;
import org.springframework.core.io.ResourceLoader;
import org.springframework.core.io.WritableResource;

public class AwsS3ItemWriter implements ItemWriter<String> {

    private ResourceLoader resourceLoader;
    private WritableResource resource;

    public AwsS3ItemWriter(ResourceLoader resourceLoader, String resource) {
        this.resourceLoader = resourceLoader;
        this.resource = (WritableResource) this.resourceLoader.getResource(resource);
    }

    @Override
    public void write(List<? extends String> items) throws Exception {
        try (OutputStream outputStream = resource.getOutputStream()) {
            for (String item : items) {
                outputStream.write(item.getBytes());
            }
        }
    }
}
Then you should be able to use this writer with an S3 resource like s3://myBucket/myFile.log.
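For example (a hedged sketch, not compiled; the bean method name is my assumption and the bucket path mirrors the example above), the writer could be wired into a batch configuration like this:
@Bean
public AwsS3ItemWriter itemWriter(ResourceLoader resourceLoader) {
    // With Spring Cloud AWS configured, the resource loader resolves the
    // s3:// protocol to a WritableResource backed by the bucket.
    return new AwsS3ItemWriter(resourceLoader, "s3://myBucket/myFile.log");
}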
Is there any way we can do that?
Please note that I did not compile/test the previous code. I just wanted to give you a starting point of how to do it.
Hope this helps.
The problem is that the OutputStream will only contain the last List of items sent by the step...
I think you might need to write a temporary file on the file system and then send the whole file in a separate tasklet.
See this example :
https://github.com/TerrenceMiao/AWS/blob/master/dynamodb-java/src/main/java/org/paradise/microservice/userpreference/service/writer/CSVFileWriter.java
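A minimal sketch of that approach (untested; the class name S3UploadTasklet and the names amazonS3, localFile, bucketName, and key are illustrative assumptions, not taken from the linked example): a previous step writes all items to a local temporary file, and this tasklet then uploads the finished file in one call:
import java.io.File;

import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;

import com.amazonaws.services.s3.AmazonS3;

public class S3UploadTasklet implements Tasklet {

    private final AmazonS3 amazonS3;
    private final File localFile;
    private final String bucketName;
    private final String key;

    public S3UploadTasklet(AmazonS3 amazonS3, File localFile, String bucketName, String key) {
        this.amazonS3 = amazonS3;
        this.localFile = localFile;
        this.bucketName = bucketName;
        this.key = key;
    }

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        // Upload the complete file produced by the previous step in one shot.
        amazonS3.putObject(bucketName, key, localFile);
        return RepeatStatus.FINISHED;
    }
}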
I had the same thing to do. Because Spring has no class to write to a stream alone, I made one myself, similar to the example above.
You need two classes for this. First, a Resource class which implements WritableResource and extends AbstractResource:
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

import org.springframework.core.io.AbstractResource;
import org.springframework.core.io.WritableResource;

public class S3Resource extends AbstractResource implements WritableResource {

    ByteArrayOutputStream resource = new ByteArrayOutputStream();

    @Override
    public String getDescription() {
        return null;
    }

    @Override
    public InputStream getInputStream() throws IOException {
        return new ByteArrayInputStream(resource.toByteArray());
    }

    @Override
    public OutputStream getOutputStream() throws IOException {
        return resource;
    }
}
And second, your writer, which implements ItemWriter:
public class AmazonStreamWriter<T> implements ItemWriter<T> {

    private WritableResource resource;
    private LineAggregator<T> lineAggregator;
    private String lineSeparator;

    public String getLineSeparator() {
        return lineSeparator;
    }

    public void setLineSeparator(String lineSeparator) {
        this.lineSeparator = lineSeparator;
    }

    AmazonStreamWriter(WritableResource resource) {
        this.resource = resource;
    }

    public WritableResource getResource() {
        return resource;
    }

    public void setResource(WritableResource resource) {
        this.resource = resource;
    }

    public LineAggregator<T> getLineAggregator() {
        return lineAggregator;
    }

    public void setLineAggregator(LineAggregator<T> lineAggregator) {
        this.lineAggregator = lineAggregator;
    }

    @Override
    public void write(List<? extends T> items) throws Exception {
        try (OutputStream outputStream = resource.getOutputStream()) {
            StringBuilder lines = new StringBuilder();
            for (T item : items) {
                lines.append(lineAggregator.aggregate(item)).append(lineSeparator);
            }
            outputStream.write(lines.toString().getBytes());
        }
    }
}
With this setup, you write the item information you receive from your database to your custom resource via an OutputStream. The filled resource can then be used in one of your steps to open an InputStream and upload to S3 via the client.
I did it with: amazonS3.putObject(awsBucketName, awsBucketKey, resource.getInputStream(), new ObjectMetadata());
My solution may not be the perfect approach, but from here on you can optimize it.
I'm quite new to Spring Boot. I have created a sample application with a service class.
Below is my SpringBootApplication class
@SpringBootApplication
public class SampleApplication {

    @Autowired
    static AWSService awsService;

    public static void main(String[] args) {
        SpringApplication.run(SampleApplication.class, args);
        awsService.getCertificate(); // Getting an NPE at this point
    }
}
The AWSService class
@Service
public class AWSService {

    public AWSService() {
    }

    private final Log log = new Log(getClass().getSimpleName());

    public void getCertificate() {
        String accessKey = "";
        String secretKey = "";
        try {
            Scanner awsCredentials = new Scanner(new File(Constants.AWS_CREDENTIALS));
            accessKey = awsCredentials.next();
            secretKey = awsCredentials.next();
        } catch (FileNotFoundException e) {
            e.printStackTrace();
            log.error(e.getMessage());
        }
        BasicAWSCredentials basicAWSCredentials = new BasicAWSCredentials(accessKey, secretKey);
        AmazonS3 s3Client = AmazonS3ClientBuilder.standard().withCredentials(
                new AWSStaticCredentialsProvider(basicAWSCredentials)).build();
        S3Object s3object = s3Client.getObject(
                new GetObjectRequest(Constants.S3_BUCKET_NAME, Constants.S3_KEY_NAME));
        String temporaryCertificatePath = storeCertificate(s3object);
        Constants.setKeyStoreFile(temporaryCertificatePath);
    }

    private String storeCertificate(S3Object s3Object) {
        try {
            File certificate = File.createTempFile("signingKey", ".p12");
            // Close the stream so the bytes are flushed to disk.
            try (OutputStream outputStream = new FileOutputStream(certificate)) {
                byte[] buffer = IOUtils.toByteArray(s3Object.getObjectContent());
                outputStream.write(buffer);
            }
            certificate.deleteOnExit();
            return certificate.getCanonicalPath();
        } catch (IOException e) {
            e.printStackTrace();
            log.error(e.getMessage());
        }
        return null;
    }
}
The following is the error I'm getting:
Exception in thread "main" java.lang.NullPointerException
	at in.juspay.SampleApplication.main(SampleApplication.java:18)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
I have autowired the service in my application class, yet I'm getting a NullPointerException. If I understand Spring's @Service properly, @Autowired should take care of the initialization of the object. Why, then, am I getting an NPE at that point?
You cannot autowire static fields. Use the application context to reach the bean instead.
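For example (a minimal sketch using the classes from the question), you can fetch the bean from the context that SpringApplication.run returns:
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.ConfigurableApplicationContext;

@SpringBootApplication
public class SampleApplication {

    public static void main(String[] args) {
        // run(...) returns the fully initialized application context.
        ConfigurableApplicationContext context =
                SpringApplication.run(SampleApplication.class, args);
        // Look up the managed AWSService bean instead of relying on a static field.
        AWSService awsService = context.getBean(AWSService.class);
        awsService.getCertificate();
    }
}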
Can you use @Autowired with static fields?
A very basic thing: static fields are initialized even before any bean is created, so Spring cannot have injected a value by the time the field is used. That's also why the static keyword gives you access without an instance.
See also: Getting java.lang.NullPointerException when calling Method.invoke
I created a JavaFX project using NetBeans; the project itself works just fine.
I'm now trying to implement a custom lightweight plugin system; the plugins are external JARs located inside the plugins/ directory of the main project. I'm using the java.security package to sandbox the plugins.
Here's the main project's structure:
MainProject
|
|---plugins/
| |---MyPlugin.jar
|
|---src/
| |---main.app.plugin
| |---Plugin.java
| |---PluginSecurityPolicy.java
| |---PluginClassLoader.java
| |---PluginContainer.java
....
And the plugin's one:
Plugin
|
|---src/
| |---my.plugin
| | |---MyPlugin.java
| |--settings.xml
|
|---dist/
|---MyPlugin.jar
|---META-INF/
| |---MANIFEST.MF
|---my.plugin
| |---MyPlugin.class
|---settings.xml
To load the plugins into the program, I've made a PluginContainer class that gets all the jar files from the plugins directory, lists all files inside each jar, and looks up the plugin class and the settings file.
I can load and instantiate the plugin class, but when it comes to the XML file, I can't even list it among the jar contents.
Here's the code; maybe someone can see where I went wrong.
PluginSecurityPolicy.java
import java.security.AllPermission;
import java.security.PermissionCollection;
import java.security.Permissions;
import java.security.Policy;
import java.security.ProtectionDomain;

public class PluginSecurityPolicy extends Policy {

    @Override
    public PermissionCollection getPermissions(ProtectionDomain domain) {
        if (isPlugin(domain)) {
            return pluginPermissions();
        } else {
            return applicationPermissions();
        }
    }

    private boolean isPlugin(ProtectionDomain domain) {
        return domain.getClassLoader() instanceof PluginClassLoader;
    }

    private PermissionCollection pluginPermissions() {
        Permissions permissions = new Permissions();
        //
        return permissions;
    }

    private PermissionCollection applicationPermissions() {
        Permissions permissions = new Permissions();
        permissions.add(new AllPermission());
        return permissions;
    }
}
PluginClassLoader.java
import java.net.URL;
import java.net.URLClassLoader;
public class PluginClassLoader extends URLClassLoader {

    public PluginClassLoader(URL jarFileUrl) {
        super(new URL[] { jarFileUrl });
    }
}
PluginContainer.java, whose #load method is the relevant one:
import main.app.plugin.PluginClassLoader;
import main.app.plugin.PluginSecurityPolicy;

import java.io.File;
import java.net.URL;
import java.security.Policy;
import java.util.ArrayList;
import java.util.Enumeration;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;

public class PluginContainer {

    private ArrayList<Plugin> plugins;
    private ManifestParser parser;

    public PluginContainer() {
        Policy.setPolicy(new PluginSecurityPolicy());
        System.setSecurityManager(new SecurityManager());
        plugins = new ArrayList<>();
        parser = new ManifestParser();
    }

    public void init() {
        File[] dir = new File(System.getProperty("user.dir") + "/plugins").listFiles();
        for (File pluginJarFile : dir) {
            try {
                Plugin plugin = load(pluginJarFile.getCanonicalPath());
                plugins.add(plugin);
            } catch (Exception e) {
                throw new RuntimeException(e.getMessage(), e);
            }
        }
    }

    public <T extends Plugin> T getPlugin(Class<T> plugin) {
        for (Plugin p : plugins) {
            if (p.getClass().equals(plugin)) {
                return (T) p;
            }
        }
        return null;
    }

    private Plugin load(String pluginJarFile) throws Exception {
        PluginManifest manifest = null;
        Plugin plugin = null;

        // Load the jar file
        ZipFile jarFile = new ZipFile(pluginJarFile);

        // Get all jar entries
        Enumeration<? extends ZipEntry> allEntries = jarFile.entries();
        String pluginClassName = null;
        while (allEntries.hasMoreElements()) {
            // Get single file
            ZipEntry entry = allEntries.nextElement();
            String file = entry.getName();

            // Look for class files
            if (file.endsWith(".class")) {
                // Set class name
                String classname = file.replace('/', '.').substring(0, file.length() - 6);

                // Look for plugin class
                if (classname.endsWith("Plugin")) {
                    // Set the class name and exit loop
                    pluginClassName = classname;
                    break;
                }
            }
        }

        // Load the class
        ClassLoader pluginLoader = new PluginClassLoader(new URL("file:///" + pluginJarFile));
        Class<?> pluginClass = pluginLoader.loadClass(pluginClassName);

        // Edit as suggested by KDM, still null
        URL settingsUrl = pluginClass.getResource("/settings.xml");
        manifest = parser.load(settingsUrl);

        // Check if manifest has been created
        if (null == manifest) {
            throw new RuntimeException("Manifest file not found in " + pluginJarFile);
        }

        // Create the plugin
        plugin = (Plugin) pluginClass.newInstance();
        plugin.load(manifest);

        return plugin;
    }
}
And the autogenerated MANIFEST.MF
Manifest-Version: 1.0
Ant-Version: Apache Ant 1.9.4
Created-By: 1.8.0_25-b18 (Oracle Corporation)
The Class-Path directive is missing, but even if I force it to . or ./settings.xml or settings.xml (by manually editing the MANIFEST.MF file), it doesn't work either.
This is all I can think of. Thanks in advance for any help.
[EDIT] I've created an images/monitor-16.png in the plugin jar root and added the #load2 method to the PluginContainer.
Since the method is called within a loop, I left the Policy.setPolicy(new PluginSecurityPolicy()); and System.setSecurityManager(new SecurityManager()); inside the constructor.
Here's the new plugin jar structure:
TestPlugin.jar
|
|---META-INF/
| |---MANIFEST.MF
|
|---dev.jimbo
| |---TestPlugin.class
|
|---images
| |---monitor-16.png
|
|---settings.xml
The new method code:
private Plugin load2(String pluginJarFile) throws MalformedURLException, ClassNotFoundException {
    PluginClassLoader urlCL = new PluginClassLoader(new File(pluginJarFile).toURL());
    Class<?> loadClass = urlCL.loadClass("dev.jimbo.TestPlugin");
    System.out.println(loadClass);
    System.out.println("Loading the class using the class loader object. Resource = " + urlCL.getResource("images/monitor-16.png"));
    System.out.println("Loading the class using the class loader object with absolute path. Resource = " + urlCL.getResource("/images/monitor-16.png"));
    System.out.println("Loading the class using the class object. Resource = " + loadClass.getResource("images/monitor-16.png"));
    System.out.println();
    return null;
}
Here's the output
class dev.jimbo.TestPlugin
Loading the class using the class loader object. Resource = null
Loading the class using the class loader object with absolute path. Resource = null
Loading the class using the class object. Resource = null
The following program:
public static void main(String[] args) throws MalformedURLException, ClassNotFoundException {
    Policy.setPolicy(new PluginSecurityPolicy());
    System.setSecurityManager(new SecurityManager());
    PluginClassLoader urlCL = new PluginClassLoader(new File(
            "A Jar containing images/load.gif and SampleApp class").toURL());
    Class<?> loadClass = urlCL.loadClass("net.sourceforge.marathon.examples.SampleApp");
    System.out.println(loadClass);
    System.out.println("Loading the class using the class loader object. Resource = " + urlCL.getResource("images/load.gif"));
    System.out.println("Loading the class using the class loader object with absolute path. Resource = " + urlCL.getResource("/images/load.gif"));
    System.out.println("Loading the class using the class object. Resource = " + loadClass.getResource("images/load.gif"));
}
Produces the following output:
class net.sourceforge.marathon.examples.SampleApp
Loading the class using the class loader object. Resource = jar:file:/Users/dakshinamurthykarra/Projects/install/marathon/sampleapp.jar!/images/load.gif
Loading the class using the class loader object with absolute path. Resource = null
Loading the class using the class object. Resource = null
So I do not think there is any problem with your class loader. Putting this as an answer so that the code can be formatted properly.
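One follow-up observation, based on standard Java resource-lookup semantics rather than anything specific to this jar: ClassLoader.getResource expects a name without a leading slash, while Class.getResource resolves a relative name against the class's package. That explains the two null results above and suggests the class-object lookup should succeed with an absolute name:
// Class.getResource("images/load.gif") is resolved relative to the class's
// package (net/sourceforge/marathon/examples/images/load.gif), hence null.
// An absolute name delegates to the class loader and should resolve:
URL url = loadClass.getResource("/images/load.gif");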
Nailed it! It seems that my previous NetBeans (8.0) was deleting the plugin directory from the added Jar/Folder Libraries references on the Clean and Build action. I downloaded and installed NetBeans 8.0.2 and the problem was solved. I couldn't find any related bug for that version on their tracker, though.
Anyway, thanks for the help :)
I would like to write some integration with Elasticsearch. For testing, I would like to run an in-memory ES instance.
I found some information in the documentation (Elasticsearch Reference [1.6] » Testing » Java Testing Framework » Integration tests), but without an example of how to write that kind of test.
I also found the following article, but it's out of date: Easy JUnit testing with Elastic Search.
I'm looking for an example of how to start and run ES in-memory and access it over the REST API.
Based on the second link you provided, I created this abstract test class:
@RunWith(SpringJUnit4ClassRunner.class)
public abstract class AbstractElasticsearchTest {

    private static final String HTTP_PORT = "9205";
    private static final String HTTP_TRANSPORT_PORT = "9305";
    private static final String ES_WORKING_DIR = "target/es";
    private static final String CLUSTER_NAME = "monkeys.elasticsearch";
    private static Node node;

    @BeforeClass
    public static void startElasticsearch() throws Exception {
        removeOldDataDir(ES_WORKING_DIR + "/" + CLUSTER_NAME);

        Settings settings = Settings.builder()
                .put("path.home", ES_WORKING_DIR)
                .put("path.conf", ES_WORKING_DIR)
                .put("path.data", ES_WORKING_DIR)
                .put("path.work", ES_WORKING_DIR)
                .put("path.logs", ES_WORKING_DIR)
                .put("http.port", HTTP_PORT)
                .put("transport.tcp.port", HTTP_TRANSPORT_PORT)
                .put("index.number_of_shards", "1")
                .put("index.number_of_replicas", "0")
                .put("discovery.zen.ping.multicast.enabled", "false")
                .build();
        node = nodeBuilder().settings(settings).clusterName(CLUSTER_NAME).client(false).node();
        node.start();
    }

    @AfterClass
    public static void stopElasticsearch() {
        node.close();
    }

    private static void removeOldDataDir(String datadir) throws Exception {
        File dataDir = new File(datadir);
        if (dataDir.exists()) {
            FileSystemUtils.deleteRecursively(dataDir);
        }
    }
}
In the production code, I configured an Elasticsearch client as follows. The integration test extends the above abstract class and sets the property elasticsearch.port to 9305 and elasticsearch.host to localhost.
@Configuration
public class ElasticsearchConfiguration {

    @Bean(destroyMethod = "close")
    public Client elasticsearchClient(@Value("${elasticsearch.clusterName}") String clusterName,
                                      @Value("${elasticsearch.host}") String elasticsearchClusterHost,
                                      @Value("${elasticsearch.port}") Integer elasticsearchClusterPort) throws UnknownHostException {
        Settings settings = Settings.settingsBuilder().put("cluster.name", clusterName).build();
        InetSocketTransportAddress transportAddress = new InetSocketTransportAddress(InetAddress.getByName(elasticsearchClusterHost), elasticsearchClusterPort);
        return TransportClient.builder().settings(settings).build().addTransportAddress(transportAddress);
    }
}
That's it. The integration test will run the production code, which is configured to connect to the node started in AbstractElasticsearchTest.startElasticsearch().
In case you want to use the Elasticsearch REST API, use port 9205, e.g. with Apache HttpComponents:
HttpClient httpClient = HttpClients.createDefault();
HttpPut httpPut = new HttpPut("http://localhost:9205/_template/" + templateName);
httpPut.setEntity(new FileEntity(new File("template.json")));
httpClient.execute(httpPut);
Here is my implementation:
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.util.UUID;

import org.elasticsearch.client.Client;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.node.Node;
import org.elasticsearch.node.NodeBuilder;

/**
 * @author Raghu Nair
 */
public final class ElasticSearchInMemory {

    private static Client client = null;
    private static File tempDir = null;
    private static Node elasticSearchNode = null;

    public static Client getClient() {
        return client;
    }

    public static void setUp() throws Exception {
        tempDir = File.createTempFile("elasticsearch-temp", Long.toString(System.nanoTime()));
        tempDir.delete();
        tempDir.mkdir();
        System.out.println("writing to: " + tempDir);

        String clusterName = UUID.randomUUID().toString();
        elasticSearchNode = NodeBuilder
                .nodeBuilder()
                .local(false)
                .clusterName(clusterName)
                .settings(
                        ImmutableSettings.settingsBuilder()
                                .put("script.disable_dynamic", "false")
                                .put("gateway.type", "local")
                                .put("index.number_of_shards", "1")
                                .put("index.number_of_replicas", "0")
                                .put("path.data", new File(tempDir, "data").getAbsolutePath())
                                .put("path.logs", new File(tempDir, "logs").getAbsolutePath())
                                .put("path.work", new File(tempDir, "work").getAbsolutePath())
                ).node();
        elasticSearchNode.start();
        client = elasticSearchNode.client();
    }

    public static void tearDown() throws Exception {
        if (client != null) {
            client.close();
        }
        if (elasticSearchNode != null) {
            elasticSearchNode.stop();
            elasticSearchNode.close();
        }
        if (tempDir != null) {
            removeDirectory(tempDir);
        }
    }

    public static void removeDirectory(File dir) throws IOException {
        if (dir.isDirectory()) {
            File[] files = dir.listFiles();
            if (files != null && files.length > 0) {
                for (File aFile : files) {
                    removeDirectory(aFile);
                }
            }
        }
        Files.delete(dir.toPath());
    }
}
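A minimal sketch of how this helper might be wired into a JUnit 4 test (the test class and its body are illustrative assumptions):
import org.elasticsearch.client.Client;
import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Test;

public class MyElasticsearchIT {

    @BeforeClass
    public static void startEs() throws Exception {
        ElasticSearchInMemory.setUp();
    }

    @AfterClass
    public static void stopEs() throws Exception {
        ElasticSearchInMemory.tearDown();
    }

    @Test
    public void indexesAndFindsADocument() {
        Client client = ElasticSearchInMemory.getClient();
        // index documents and run queries against the embedded node here
    }
}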
You can start ES locally with:
Settings settings = Settings.settingsBuilder()
        .put("path.home", ".")
        .build();
NodeBuilder.nodeBuilder().settings(settings).node();
Once ES has started, access it over REST like:
http://localhost:9200/_cat/health?v
As of 2016, embedded Elasticsearch is no longer supported.
As per a response from one of the developers in 2017, you can use the following approaches:
Use the Gradle tools elasticsearch already has. You can read some information about this here: https://github.com/elastic/elasticsearch/issues/21119
Use the Maven plugin: https://github.com/alexcojocaru/elasticsearch-maven-plugin
Use Ant scripts like http://david.pilato.fr/blog/2016/10/18/elasticsearch-real-integration-tests-updated-for-ga
Using Docker: https://www.testcontainers.org/modules/elasticsearch (see the sketch after this list)
Using Docker from maven: https://github.com/dadoonet/fscrawler/blob/e15dddf72b1ed094dad279d492e4e0314f73683f/pom.xml#L241-L289
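For the Testcontainers route, a minimal sketch (the image tag is an arbitrary assumption) could look like:
import org.testcontainers.elasticsearch.ElasticsearchContainer;

public class ElasticsearchContainerDemo {

    public static void main(String[] args) {
        // Starts a throwaway Elasticsearch in Docker; stopped automatically on close.
        try (ElasticsearchContainer container = new ElasticsearchContainer(
                "docker.elastic.co/elasticsearch/elasticsearch:7.9.2")) {
            container.start();
            String endpoint = "http://" + container.getHttpHostAddress();
            // Point any REST client at `endpoint` here.
            System.out.println("Elasticsearch available at " + endpoint);
        }
    }
}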