I have created a file, helloworld.txt. I am reading from the file and want to load its contents into the cache; whenever the cache is updated, it should write back to the file as well.
Please tell me how to load the cache and then write from the cache back to the file; the instructions in the Apache Ignite documentation are not clear to me.
This is my code so far:
import java.io.BufferedReader;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.IgniteDataStreamer;
import org.apache.ignite.IgniteException;
import org.apache.ignite.Ignition;
import org.apache.ignite.examples.ExampleNodeStartup;
import org.apache.ignite.examples.ExamplesUtils;
public class FileRead {
/** Cache name. */
private static final String CACHE_NAME = "FileCache";
/** Heap size required to run this example. */
public static final int MIN_MEMORY = 512 * 1024 * 1024;
/**
* Executes example.
*
* @param args Command line arguments, none required.
* @throws IgniteException If example execution failed.
*/
public static void main(String[] args) throws IgniteException {
ExamplesUtils.checkMinMemory(MIN_MEMORY);
try (Ignite ignite = Ignition.start("examples/config/example-ignite.xml")) {
System.out.println();
try (IgniteCache<Integer, String> cache = ignite.getOrCreateCache(CACHE_NAME)) {
long start = System.currentTimeMillis();
try (IgniteDataStreamer<Integer, String> stmr = ignite.dataStreamer(CACHE_NAME)) {
// Configure loader.
stmr.perNodeBufferSize(1024);
stmr.perNodeParallelOperations(8);
///FileReads();
try (BufferedReader in = new BufferedReader(
        new FileReader("/Users/akritibahal/Desktop/helloworld.txt"))) {
    String str;
    int i = 0;
    while ((str = in.readLine()) != null) {
        System.out.println(str);
        stmr.addData(i, str);
        i++;
    }
    System.out.println("Loaded " + i + " keys.");
}
catch (IOException e) {
    e.printStackTrace(); // don't silently swallow read failures
}
}
}
}
}
}
For information on how to load the cache from a persistence store, please refer to this page: https://apacheignite.readme.io/docs/data-loading
You have two options:
Start a client node, create an IgniteDataStreamer and use it to load the data. Simply call addData() for each line in the file.
Implement the CacheStore.loadCache() method, provide the implementation in the cache configuration and call IgniteCache.loadCache().
The second approach requires the file to be available on all server nodes, but there will be no communication between nodes, so it will most likely be faster. A minimal sketch of that approach follows.
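For illustration, assuming the file is present at the same path on every server node, the sketch could look like this; the class name, the file path and the write-through stubs are placeholders, not a complete implementation:

import java.io.BufferedReader;
import java.io.FileReader;
import javax.cache.Cache;
import org.apache.ignite.cache.store.CacheStoreAdapter;
import org.apache.ignite.lang.IgniteBiInClosure;

public class FileCacheStore extends CacheStoreAdapter<Integer, String> {
    @Override public void loadCache(IgniteBiInClosure<Integer, String> clo, Object... args) {
        // Runs on every server node; each node reads the file from its own local disk.
        try (BufferedReader in = new BufferedReader(new FileReader("/path/to/helloworld.txt"))) {
            String line;
            int i = 0;
            while ((line = in.readLine()) != null)
                clo.apply(i++, line);
        }
        catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    @Override public String load(Integer key) {
        return null; // read-through not needed for this sketch
    }

    @Override public void write(Cache.Entry<? extends Integer, ? extends String> entry) {
        // Write-through would go here if cache updates should be persisted back to the file.
    }

    @Override public void delete(Object key) {
        // No-op for this sketch.
    }
}

Then configure the store on the cache (this needs an org.apache.ignite.configuration.CacheConfiguration and javax.cache.configuration.FactoryBuilder instead of just the cache name) and trigger the load:

CacheConfiguration<Integer, String> cfg = new CacheConfiguration<>("FileCache");
cfg.setCacheStoreFactory(FactoryBuilder.factoryOf(FileCacheStore.class));
cfg.setReadThrough(true);
cfg.setWriteThrough(true);
IgniteCache<Integer, String> cache = ignite.getOrCreateCache(cfg);
cache.loadCache(null); // invokes FileCacheStore.loadCache() on all server nodes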
I'd like to replace all my System.out.println calls with log.println in my web app, so that all my log output goes not to the Eclipse console but to a dedicated file. I want this because I've deployed my web app in a Tomcat Docker container.
After some research, I found this class:
import java.io.PrintWriter;
import java.io.FileOutputStream;
import java.util.Date;
public class Log{
private SettingsManager settings;
private String logFile;
private PrintWriter writer;
static Log theInstance = null;
/**
* Returns the only available instance of this class, if it exists...
* instantiates and returns it otherwise. Log file name is retrieved
* through the SettingsManager
*
* @return
*/
public static Log getInstance() {
if (Log.theInstance == null) {
Log.theInstance = new Log();
}
return Log.theInstance;
}
private Log() {
this.settings = SettingsManager.getInstance();
this.logFile = settings.getString("settings.log.filename");
try {
this.writer = new PrintWriter(new FileOutputStream(this.logFile, true), true);
writer.println("*** Kerberos Logfile ***");
writer.println(" *** Logging started ***");
}catch(Exception ex) {
ex.printStackTrace();
}
}
public String getLogFile() {
return this.logFile;
}
public void println(String line) {
writer.println("[" + new Date().toString() + "]" + line);
}
}
Well, how can I modify this singleton class (I have no class named "SettingsManager") and replace all my System.out.println calls with log.println? How can I set my log path?
Or can someone show me a simple log class and how to instantiate it?
Use a logging library such as Log4j. Read the tutorials/documentation and implement it in your own code; do not just copy someone else's code verbatim and use it.
You can check the tutorial below, for example:
http://www.vogella.com/tutorials/Logging/article.html
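As a rough sketch of what the switch could look like with Log4j 1.x (the library covered in that tutorial); the class name and the log file path inside the Tomcat container are assumptions, not something from your project:

import java.io.IOException;
import org.apache.log4j.FileAppender;
import org.apache.log4j.Logger;
import org.apache.log4j.PatternLayout;

public class MyServlet {
    private static final Logger log = Logger.getLogger(MyServlet.class);

    static {
        try {
            // Programmatic setup; a log4j.properties file on the classpath works just as well.
            Logger.getRootLogger().addAppender(new FileAppender(
                    new PatternLayout("[%d] %-5p %c - %m%n"),
                    "/usr/local/tomcat/logs/webapp.log", true));
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public void doWork() {
        // Instead of System.out.println("user logged in"):
        log.info("user logged in");
    }
}

Unlike the singleton above, this needs no SettingsManager: the file location lives in the appender (or in log4j.properties), and every class simply asks for its own Logger.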
I have to move files from one directory to another directory.
I am using a property file, so the source and destination paths are stored in the property file.
I also have a property reader class.
My source directory contains lots of files. Each file should be moved to the other directory once the operation on it is complete.
The file size is more than 500 MB.
import java.io.File;
import java.nio.file.Files;
import java.nio.file.StandardCopyOption;
import static java.nio.file.StandardCopyOption.*;
public class Main1
{
public static String primarydir="";
public static String secondarydir="";
public static void main(String[] argv)
throws Exception
{
primarydir=PropertyReader.getProperty("primarydir");
System.out.println(primarydir);
secondarydir=PropertyReader.getProperty("secondarydir");
File dir = new File(primarydir);
secondarydir=PropertyReader.getProperty("secondarydir");
String[] children = dir.list();
if (children == null)
{
System.out.println("does not exist or is not a directory");
}
else
{
for (int i = 0; i < children.length; i++)
{
String filename = children[i];
System.out.println(filename);
try
{
File oldFile = new File(primarydir,children[i]);
System.out.println( "Before Moving"+oldFile.getName());
if (oldFile.renameTo(new File(secondarydir+oldFile.getName())))
{
System.out.println("The file was moved successfully to the new folder");
}
else
{
System.out.println("The File was not moved.");
}
}
catch (Exception e)
{
e.printStackTrace();
}
}
System.out.println("ok");
}
}
}
My code is not moving the file into the correct path.
This is my property file
primarydir=C:/Desktop/A
secondarydir=D:/B
The files should end up in D:/B. How can I do this? Can anyone help me?
Change this:
oldFile.renameTo(new File(secondarydir+oldFile.getName()))
To this:
oldFile.renameTo(new File(secondarydir, oldFile.getName()))
It's best not to use string concatenation to join path segments, as the proper way to do it may be platform-dependent.
Edit: If you can use JDK 1.7 APIs, you can use Files.move() instead of File.renameTo(); it throws an informative IOException when the move fails instead of just returning false. A minimal sketch follows.
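For illustration (the directory values are taken from the question's property file; the example file name is a placeholder):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class Mover {
    // Unlike renameTo(), Files.move() throws an IOException describing why a move
    // failed, and it falls back to copy-and-delete when crossing drives (C: to D:).
    static void moveFile(String sourceDir, String targetDir, String fileName) throws IOException {
        Path source = Paths.get(sourceDir, fileName);
        Path target = Paths.get(targetDir, fileName);
        Files.move(source, target, StandardCopyOption.REPLACE_EXISTING);
    }

    public static void main(String[] args) throws IOException {
        moveFile("C:/Desktop/A", "D:/B", "example.txt");
    }
}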
Code (a Java method):
/**
 * Copy by channel transfer; use this for cross-partition copies.
 * @param sFile source file
 * @param tFile target file
 * @throws IOException if the copy fails
 */
public static void copyByTransfer(File sFile, File tFile) throws IOException {
    try (FileInputStream fInput = new FileInputStream(sFile);
         FileOutputStream fOutput = new FileOutputStream(tFile);
         FileChannel fReadChannel = fInput.getChannel();
         FileChannel fWriteChannel = fOutput.getChannel()) {
        // transferTo() may move fewer bytes than requested, so loop until the whole file is copied.
        long position = 0;
        long size = fReadChannel.size();
        while (position < size) {
            position += fReadChannel.transferTo(position, size - position, fWriteChannel);
        }
    }
}
The method uses NIO; it makes use of underlying OS operations to improve performance.
Here are the imports:
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
If you are in Eclipse, just use Ctrl + Shift + O to organize the imports.
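As a usage sketch of how this could complete the move from the question (an assumption about how you would wire it in; CopyUtil is a hypothetical class holding the copyByTransfer method above): since the method only copies, delete the source afterwards.

import java.io.File;
import java.io.IOException;

public class MoveWithTransfer {
    public static void main(String[] args) throws IOException {
        // Paths come from the question's property file; the file name is a placeholder.
        File oldFile = new File("C:/Desktop/A", "example.txt");
        File newFile = new File("D:/B", oldFile.getName());
        CopyUtil.copyByTransfer(oldFile, newFile); // the method shown above
        if (!oldFile.delete())
            System.out.println("Copied, but could not delete the source file.");
    }
}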
I have some housekeeping tasks within an Elastic Beanstalk Java application running on Tomcat, and I need to run them every so often. I want these tasks run only on the leader node (or, more correctly, on a single node, but the leader seems like an obvious choice).
I was looking at running cron jobs within Elastic Beanstalk, but it feels like this should be more straightforward than what I've come up with. Ideally, I'd like one of these two options within my web app:
Some way of testing within the current JRE whether or not this server is the leader node
Some way to hit a specific URL (wget?) to trigger the task, but also restrict that URL to requests from localhost.
Suggestions?
It is not possible, by design (leaders are only assigned during deployment, and not needed in other contexts). However, you can tweak and use the EC2 metadata for this exact purpose.
Here's a working example of how to achieve this result (original source). Once you call getLeader, it will find (or assign) an instance to be set as the leader:
package br.com.ingenieux.resource;
import java.io.IOException;
import java.net.MalformedURLException;
import java.net.URL;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.List;
import javax.inject.Inject;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import org.apache.commons.io.IOUtils;
import com.amazonaws.services.ec2.AmazonEC2;
import com.amazonaws.services.ec2.model.CreateTagsRequest;
import com.amazonaws.services.ec2.model.DeleteTagsRequest;
import com.amazonaws.services.ec2.model.DescribeInstancesRequest;
import com.amazonaws.services.ec2.model.Filter;
import com.amazonaws.services.ec2.model.Instance;
import com.amazonaws.services.ec2.model.Reservation;
import com.amazonaws.services.ec2.model.Tag;
import com.amazonaws.services.elasticbeanstalk.AWSElasticBeanstalk;
import com.amazonaws.services.elasticbeanstalk.model.DescribeEnvironmentsRequest;
#Path("/admin/leader")
public class LeaderResource extends BaseResource {
#Inject
AmazonEC2 amazonEC2;
#Inject
AWSElasticBeanstalk elasticBeanstalk;
#GET
public String getLeader() throws Exception {
/*
* Avoid running if we're not in AWS after all
*/
try {
IOUtils.toString(new URL(
"http://169.254.169.254/latest/meta-data/instance-id")
.openStream());
} catch (Exception exc) {
return "i-FFFFFFFF/localhost";
}
String environmentName = getMyEnvironmentName();
List<Instance> environmentInstances = getInstances(
"tag:elasticbeanstalk:environment-name", environmentName,
"tag:leader", "true");
if (environmentInstances.isEmpty()) {
environmentInstances = getInstances(
"tag:elasticbeanstalk:environment-name", environmentName);
Collections.shuffle(environmentInstances);
if (environmentInstances.size() > 1)
environmentInstances.removeAll(environmentInstances.subList(1,
environmentInstances.size()));
amazonEC2.createTags(new CreateTagsRequest().withResources(
environmentInstances.get(0).getInstanceId()).withTags(
new Tag("leader", "true")));
} else if (environmentInstances.size() > 1) {
DeleteTagsRequest deleteTagsRequest = new DeleteTagsRequest().withTags(new Tag().withKey("leader").withValue("true"));
for (Instance i : environmentInstances.subList(1,
environmentInstances.size())) {
deleteTagsRequest.getResources().add(i.getInstanceId());
}
amazonEC2.deleteTags(deleteTagsRequest);
}
return environmentInstances.get(0).getInstanceId() + "/" + environmentInstances.get(0).getPublicIpAddress();
}
@GET
@Produces("text/plain")
@Path("am-i-a-leader")
public boolean isLeader() {
/*
* Avoid running if we're not in AWS after all
*/
String myInstanceId = null;
String environmentName = null;
try {
myInstanceId = IOUtils.toString(new URL(
"http://169.254.169.254/latest/meta-data/instance-id")
.openStream());
environmentName = getMyEnvironmentName();
} catch (Exception exc) {
return false;
}
List<Instance> environmentInstances = getInstances(
"tag:elasticbeanstalk:environment-name", environmentName,
"tag:leader", "true", "instance-id", myInstanceId);
return (1 == environmentInstances.size());
}
protected String getMyEnvironmentHost(String environmentName) {
return elasticBeanstalk
.describeEnvironments(
new DescribeEnvironmentsRequest()
.withEnvironmentNames(environmentName))
.getEnvironments().get(0).getCNAME();
}
private String getMyEnvironmentName() throws IOException,
MalformedURLException {
String instanceId = IOUtils.toString(new URL(
"http://169.254.169.254/latest/meta-data/instance-id"));
/*
* Grab the current environment name
*/
DescribeInstancesRequest request = new DescribeInstancesRequest()
.withInstanceIds(instanceId)
.withFilters(
new Filter("instance-state-name").withValues("running"));
for (Reservation r : amazonEC2.describeInstances(request)
.getReservations()) {
for (Instance i : r.getInstances()) {
for (Tag t : i.getTags()) {
if ("elasticbeanstalk:environment-name".equals(t.getKey())) {
return t.getValue();
}
}
}
}
return null;
}
public List<Instance> getInstances(String... args) {
Collection<Filter> filters = new ArrayList<Filter>();
filters.add(new Filter("instance-state-name").withValues("running"));
for (int i = 0; i < args.length; i += 2) {
String key = args[i];
String value = args[1 + i];
filters.add(new Filter(key).withValues(value));
}
DescribeInstancesRequest req = new DescribeInstancesRequest()
.withFilters(filters);
List<Instance> result = new ArrayList<Instance>();
for (Reservation r : amazonEC2.describeInstances(req).getReservations())
result.addAll(r.getInstances());
return result;
}
}
You can keep a secret URL (a long URL is un-guessable, almost as safe as a password) and hit this URL from somewhere; on that hit you can execute the task.
One problem, however, is that if the task takes too long, your server capacity will be limited during that time. Another approach would be for the URL hit to post a message to AWS SQS; another EC2 instance can run code that waits on SQS and executes the task. You can also look into http://aws.amazon.com/swf/
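A rough sketch of such an SQS worker with the AWS SDK for Java v1; the queue URL and the task body are placeholders, not a complete implementation:

import com.amazonaws.services.sqs.AmazonSQS;
import com.amazonaws.services.sqs.AmazonSQSClientBuilder;
import com.amazonaws.services.sqs.model.Message;
import com.amazonaws.services.sqs.model.ReceiveMessageRequest;

public class HousekeepingWorker {
    public static void main(String[] args) {
        AmazonSQS sqs = AmazonSQSClientBuilder.defaultClient();
        String queueUrl = "https://sqs.us-east-1.amazonaws.com/123456789012/housekeeping"; // placeholder

        while (true) {
            // Long-poll for up to 20 seconds so the loop does not spin.
            ReceiveMessageRequest req = new ReceiveMessageRequest(queueUrl).withWaitTimeSeconds(20);
            for (Message m : sqs.receiveMessage(req).getMessages()) {
                runTask(m.getBody());                               // your housekeeping task
                sqs.deleteMessage(queueUrl, m.getReceiptHandle());  // remove it once it is done
            }
        }
    }

    private static void runTask(String payload) {
        // Placeholder for the actual housekeeping work.
    }
}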
Another approach if you're running on the Linux-type EC2 instance:
Write a shell script that does (or triggers) your periodic task
Leveraging the .ebextensions feature to customize your Elastic Beanstalk instance, create a container command that specifies the parameter leader_only: true -- this command will only run on an instance that is designated the leader in your Auto Scaling group (see the sketch after this list)
Have your container command copy your shell script into /etc/cron.hourly (or daily or whatever).
The result will be that your "leader" EC2 instance will have a cron job running hourly (or daily or whatever) to do your periodic task and the other instances in your Auto Scaling group will not.
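For illustration, a .ebextensions config file along these lines could set that up (the config and script file names are placeholders):

# .ebextensions/housekeeping.config -- file name is an assumption
container_commands:
  01_install_cron_job:
    command: "cp .ebextensions/housekeeping.sh /etc/cron.hourly/housekeeping && chmod +x /etc/cron.hourly/housekeeping"
    leader_only: true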
I am creating a sample program using Protocol Buffers and protobuf-java-format. My proto file is:
package com.sample;
option java_package = "com.sample";
option java_outer_classname = "PersonProtos";
message Person {
required string name = 1;
required int32 id = 2;
optional string email = 3;
}
My sample program is:
package com.sample;
import java.io.BufferedReader;
import java.io.FileInputStream;
import java.io.FileReader;
import java.io.IOException;
import com.google.protobuf.Message;
import com.googlecode.protobuf.format.XmlFormat;
import com.sample.PersonProtos.Person;
/**
* This class generates XML output from an Object and vice versa
*
* @author mcapatna
*
*/
public class Demo
{
public static void main(String[] args) throws IOException
{
// get the message type from protocol buffer generated class.set the
// required property
Message personProto = Person.newBuilder().setEmail("a").setId(1).setName("as").build();
// use protobuf-java-format to generate XMl from Object.
String toXml = XmlFormat.printToString(personProto);
System.out.println(toXml);
// Create the Object from XML
Message.Builder builder = Person.newBuilder();
String fileContent = "";
Person person = Person.parseFrom(new FileInputStream("C:\\file3.xml"));
System.out.println(XmlFormat.printToString(person));
System.out.println("-Done-");
}
}
XmlFormat.printToString() is working fine, but creating the object from XML is not working.
I also tried XmlFormat.merge(toXml, builder), but since merge() returns void, how can we get the object of the Person class?
Both of the above methods, merge() and parseFrom(), give the same exception:
com.google.protobuf.InvalidProtocolBufferException: Protocol message end-group tag did not match expected tag.
NOTE: "C:\\file3.xml" have the same content as toXml.
After a lot of effort, I found the solution. Here is the answer:
package com.sample;
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import com.google.protobuf.Message;
import com.googlecode.protobuf.format.XmlFormat;
import com.sample.PersonProtos.Person;
/**
* This class generates XML output from an Object and vice versa
*
* @author mcapatna
*
*/
public class Demo
{
public static void main(String[] args) throws IOException
{
long startDate=System.currentTimeMillis();
// get the message type from protocol buffer generated class.set the
// required property
Message personProto = Person.newBuilder().setEmail("a").setId(1).setName("as").build();
// use protobuf-java-format to generate XMl from Object.
String toXml = XmlFormat.printToString(personProto);
System.out.println("toXMl "+toXml);
// Create the Object from XML
Message.Builder builder = Person.newBuilder();
String fileContent = "";
// file3 contents same XML String as toXml
fileContent = readFileAsString("C:\\file3.xml");
// call protobuf-java-format method to generate Object
XmlFormat.merge(fileContent, builder);
Message msg= builder.build();
System.out.println("From XML"+XmlFormat.printToString(msg));
long endDate=System.currentTimeMillis();
System.out.println("Time Taken: "+(endDate-startDate));
System.out.println("-Done-");
}
private static String readFileAsString(String filePath) throws IOException
{
StringBuffer fileData = new StringBuffer();
BufferedReader reader = new BufferedReader(new FileReader(filePath));
char[] buf = new char[1024];
int numRead = 0;
while ((numRead = reader.read(buf)) != -1)
{
String readData = String.valueOf(buf, 0, numRead);
fileData.append(readData);
}
reader.close();
return fileData.toString();
}
}
Here is the output of the program:
toXMl <Person><name>as</name><id>1</id><email>a</email></Person>
From XML<Person><name>Deepak</name><id>1</id><email>a</email></Person>
Time Taken: 745
-Done-
The original attempt failed because Person.parseFrom() expects the binary protobuf wire format, not XML; to build a message from the XML representation you have to go through XmlFormat.merge() and then call build(), as shown above. Hope it will be useful for other members.
I've requested a trial license for Callback File System and tried to write a simple application using Java. I wrote the following few lines, ran them, and received the exception eldos.cbfs.ECBFSError: Access is denied.
Code
import eldos.cbfs.CallbackFileSystem;
import eldos.cbfs.ECBFSError;
import eldos.cbfs.boolRef;
import java.util.logging.Level;
import java.util.logging.Logger;
/**
* @author Sergii.Zagriichuk
*/
public class Test1 {
private static Logger logger = Logger.getLogger(Test1.class.getName());
public static void main(String[] args) {
CallbackFileSystem callbackFileSystem = new CallbackFileSystem();
callbackFileSystem.setRegistrationKey("My registration key ");
try {
callbackFileSystem.install("<path to cab>\\cbfs.cab", "Test", true, 131072, new boolRef(false));
} catch (ECBFSError ecbfsError) {
logger.log(Level.SEVERE, ecbfsError.getMessage(), ecbfsError);
}
}
}
What should I do to fix this problem?
Thanks