Google Guava - pass parameters to load method in addition to KEY - java

I have written a program to cache objects using Google Guava. My problem is how to pass additional parameters to Guava's load method. Here is the code. As you can see below, I have made fileID and pageNo the key: cache.get(fileID+pageNo). When cache.get is called and that key is not in the cache, Guava calls the load method of the PreviewCacheLoader class, which I have also included below.
public class PreviewCache {
static final LoadingCache<String, CoreObject> cache = CacheBuilder.newBuilder()
.maximumSize(1000)
.expireAfterWrite(5, TimeUnit.MINUTES)
.build(new PreviewCacheLoader());
public CoreObject getPreview(String strTempPath, int pageNo, int requiredHeight, String fileID, String strFileExt, String ssoId) throws IOException
{
CoreObject coreObject = null;
try {
coreObject = cache.get(fileID+pageNo); // HOW TO PASS pageNo and requiredHeight here? I want to keep the key as ONLY fileID+pageNo
} catch (ExecutionException e) {
e.printStackTrace();
}
return coreObject;
}
}
How to pass parameters from above which are int and String to below Load method in addition to key parameter
public class PreviewCacheLoader extends CacheLoader<String, CoreObject> {
@Override
public CoreObject load(String fileIDpageNo) throws Exception { // HOW TO GET pageNo and requiredHeight here?
CoreObject coreObject = new CoreObject();
// MAKE USE OF PARAMETERS pageNO and requiredHeight
// Populate coreObject here
return coreObject;
}
}

For starters, it's extremely bad programming practice to use fileId + pageNo as a String key instead of creating a proper object. (This is called "stringly typed" code.) The best way to solve your problem would probably look like:
class FileIdAndPageNo {
private final String fileId;
private final int pageNo;
...constructor, hashCode, equals...
}
public CoreObject getPreview(final int pageNo, final int requiredHeight, final String fileID) throws IOException
{
CoreObject coreObject = null;
try {
coreObject = cache.get(new FileIdAndPageNo(fileID, pageNo),
new Callable<CoreObject>() {
public CoreObject call() throws Exception {
// you have access to pageNo and requiredHeight here
CoreObject value = new CoreObject();
// populate value using pageNo and requiredHeight
return value;
}
});
} catch (ExecutionException e) {
e.printStackTrace();
}
return coreObject;
}
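For completeness, a minimal sketch of what the composite key class could look like (field names taken from the snippet above; the equals and hashCode shown here are one straightforward way to implement them):

import java.util.Objects;

final class FileIdAndPageNo {
    private final String fileId;
    private final int pageNo;

    FileIdAndPageNo(String fileId, int pageNo) {
        this.fileId = fileId;
        this.pageNo = pageNo;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof FileIdAndPageNo)) return false;
        FileIdAndPageNo other = (FileIdAndPageNo) o;
        return pageNo == other.pageNo && fileId.equals(other.fileId);
    }

    @Override
    public int hashCode() {
        return Objects.hash(fileId, pageNo);
    }
}

The cache would then be declared as LoadingCache<FileIdAndPageNo, CoreObject>, so the CacheLoader's load method receives the whole key and has pageNo available; requiredHeight, which is deliberately not part of the key, is supplied through the Callable overload of cache.get as shown above.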

Related

How to Implement Factory Design Pattern for CsvProcessing based on Key

I have written a controller which currently defaults to MotorUploadService (for Motor upload), but I need a factory design so that,
based on parentPkId, it calls HealUploadService, TempUploadService, PersonalUploadService, etc., each of which has its own file-processing stages.
The controller is below.
@RequestMapping(value = "/csvUpload", method = RequestMethod.POST)
public List<String> csvUpload(@RequestParam String parentPkId, @RequestParam List<MultipartFile> files)
throws IOException, InterruptedException, ExecutionException, TimeoutException {
log.info("Entered method csvUpload() of DaoController.class");
List<String> response = new ArrayList<String>();
ExecutorService executor = Executors.newFixedThreadPool(10);
CompletionService<String> compService = new ExecutorCompletionService<String>(executor);
List< Future<String> > futureList = new ArrayList<Future<String>>();
for (MultipartFile f : files) {
compService.submit(new ProcessMutlipartFile(f ,parentPkId,uploadService));
futureList.add(compService.take());
}
for (Future<String> f : futureList) {
long timeout = 0;
System.out.println(f.get(timeout, TimeUnit.SECONDS));
response.add(f.get());
}
executor.shutdown();
return response;
}
Here is the ProcessMutlipartFile class, which implements the Callable interface. CompletionService's compService.submit() invokes this class, which in turn executes the call() method that processes a file.
public class ProcessMutlipartFile implements Callable<String>
{
private MultipartFile file;
private String temp;
private MotorUploadService motUploadService;
public ProcessMutlipartFile(MultipartFile file,String temp, MotorUploadService motUploadService )
{
this.file=file;
this.temp=temp;
this.motUploadService=motUploadService;
}
public String call() throws Exception
{
return motUploadService.csvUpload(temp, file);
}
}
Below is the MotorUploadService class, where I process the uploaded CSV file line by line and then call the validateCsvData() method to validate the data. It returns an error object holding the line number and the errors associated with it.
If csvErrorRecords is null, the line is error-free and I proceed with saving to the DB; otherwise I save the errorList to the DB and return an upload failure.
@Component
public class MotorUploadService {
@Value("${external.resource.folder}")
String resourceFolder;
public String csvUpload(String parentPkId, MultipartFile file) {
String OUT_PATH = resourceFolder;
try {
DateFormat df = new SimpleDateFormat("yyyyMMddhhmmss");
String fileName = file.getOriginalFilename().split("\\.")[0] + df.format(new Date()) + file.getOriginalFilename().split("\\.")[1];
Path path = Paths.get(OUT_PATH, fileName);
Files.copy(file.getInputStream(), path, StandardCopyOption.REPLACE_EXISTING);
}
catch(IOException e){
e.printStackTrace();
return "Failed to Upload File...try Again";
}
List<TxnMpMotSlaveRaw> txnMpMotSlvRawlist = new ArrayList<TxnMpMotSlaveRaw>();
try {
BufferedReader br = new BufferedReader(new InputStreamReader(file.getInputStream()));
String line = "";
int header = 0;
int lineNum = 1;
TxnMpSlaveErrorNew txnMpSlaveErrorNew = new TxnMpSlaveErrorNew();
List<CSVErrorRecords> errList = new ArrayList<CSVErrorRecords>();
while ((line = br.readLine()) != null) {
// TO SKIP HEADER
if (header == 0) {
header++;
continue;
}
lineNum++;
header++;
// Use Comma As Separator
String[] csvDataSet = line.split(",");
CSVErrorRecords csvErrorRecords = validateCsvData(lineNum, csvDataSet);
System.out.println("Errors from csvErrorRecords is " + csvErrorRecords);
if (csvErrorRecords == null || csvErrorRecords.getRecordNo() == 0) {
//Function to Save to Db
} else {
// add to errList
continue;
}
}
if (txnMpSlaveErrorNew.getErrRecord().size() == 0) {
//save all
return "Successfully Uploaded " + file.getOriginalFilename();
}
else {
// save the error in db;
return "Failure as it contains Faulty Information" + file.getOriginalFilename();
}
} catch (IOException ex) {
ex.printStackTrace();
return "Failure Uploaded " + file.getOriginalFilename();
}
}
private TxnMpMotSlaveRaw saveCsvData(String[] csvDataSet, String parentPkId) {
/*
Mapping csvDataSet to PoJo
returning Mapped Pojo;
*/
}
private CSVErrorRecords validateCsvData(int lineNum, String[] csvDataSet) {
/*
Logic for Validation goes here
*/
}
}
How can I turn this into a factory design pattern from the controller, so that
parentPkId='Motor' calls MotorUploadService and
parentPkId='Heal' calls HealUploadService?
I'm not familiar with the factory design pattern, so please help me out.
Thanks in advance.
If I understood the question, in essence you would create an interface, and then return a specific implementation based upon the desired type.
So
public interface UploadService {
String csvUpload(String temp, MultipartFile file) throws IOException;
}
The particular implementations
public class MotorUploadService implements UploadService
{
public String csvUpload(String temp, MultipartFile file) {
...
}
}
public class HealUploadService implements UploadService
{
public String csvUpload(String temp, MultipartFile file) {
...
}
}
Then a factory
public class UploadServiceFactory {
public UploadService getService(String type) {
if ("Motor".equals(type)) {
return new MotorUploadService();
}
else if ("Heal".equals(type)) {
return new HealUploadService();
}
throw new IllegalArgumentException("Unknown upload type: " + type);
}
}
The factory might cache the particular implementations. One can also use an abstract class rather than an interface if appropriate.
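As a sketch of that caching idea (class names taken from this answer; in a Spring application you could instead inject the implementations and register them in a map keyed by type):

import java.util.HashMap;
import java.util.Map;

public class UploadServiceFactory {

    // One instance per type, created once and reused on every lookup.
    private final Map<String, UploadService> services = new HashMap<>();

    public UploadServiceFactory() {
        services.put("Motor", new MotorUploadService());
        services.put("Heal", new HealUploadService());
    }

    public UploadService getService(String type) {
        UploadService service = services.get(type);
        if (service == null) {
            throw new IllegalArgumentException("No upload service registered for type: " + type);
        }
        return service;
    }
}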
I think you currently have a class UploadService but that is really the MotorUploadService if I followed your code, so I would rename it to be specific.
Then in the controller, presumably having injected the UploadServiceFactory:
...
for (MultipartFile f : files) {
UploadService uploadSrvc = uploadServiceFactory.getService(parentPkId);
compService.submit(new ProcessMutlipartFile(f, parentPkId, uploadSrvc));
futureList.add(compService.take());
}
So with some additional changes in your classes:
public class ProcessMutlipartFile implements Callable<String>
{
private MultipartFile file;
private String temp;
private UploadService uploadService;
// change to take the interface UploadService
public ProcessMutlipartFile(MultipartFile file,String temp, UploadService uploadService )
{
this.file=file;
this.temp=temp;
this.uploadService=uploadService;
}
public String call() throws Exception
{
return uploadService.csvUpload(temp, file);
}
}

Google Cloud Endpoint Bucket Downloader

I am very new to the GC platform and am trying to create an API in Java with two methods: one which returns a list of all of the files in a specific bucket and another which retrieves a specified file from that bucket. The goal is to be able to iterate over the file list in order to download every file from the bucket. Essentially, I want to mirror the contents of the bucket on an Android device, so the API will be called from a generated client library in an Android app.
My getFileList() method returns a ListResult object. How do I extract a list of files from this?
#ApiMethod(name = "getFileList", path = "getFileList", httpMethod = ApiMethod.HttpMethod.GET)
public ListResult getFileList(#Named("bucketName") String bucketName) {
GcsService gcsService = GcsServiceFactory.createGcsService(RetryParams.getDefaultInstance());
ListResult result = null;
try {
result = gcsService.list(bucketName, ListOptions.DEFAULT);
return result;
} catch (IOException e) {
return null; // Handle this properly.
}
}
Also, I am struggling to determine what the return type of my getFile() API method should be. I can’t use a byte array since return types cannot be simple types as I understand it. This is where I am with it:
#ApiMethod(name = "getFile", path = "getFile", httpMethod = ApiMethod.HttpMethod.GET)
public byte[] getFile(#Named("bucketName") String bucketName, ListItem file) {
GcsService gcsService = GcsServiceFactory.createGcsService();
GcsFilename gcsFilename = new GcsFilename(bucketName, file.getName());
ByteBuffer byteBuffer;
try {
int fileSize = (int) gcsService.getMetadata(gcsFilename).getLength();
byteBuffer = ByteBuffer.allocate(fileSize);
GcsInputChannel gcsInputChannel = gcsService.openReadChannel(gcsFilename, 0);
gcsInputChannel.read(byteBuffer);
return byteBuffer.array();
} catch (IOException e) {
return null; // Handle this properly.
}
}
I am lost in the Google documentation for this stuff and am concerned that I am coming at it from completely the wrong direction since all I am trying to do is securely download a bunch of files!
I can't give you a complete solution because this is code I wrote for my company, but I can show you some basics. I use the google-cloud-java API.
First you need to create a service account key and download it in JSON format. More details can be found here.
I have, amongst others, these two fields in my class:
protected final Object storageInitLock = new Object();
protected Storage storage;
First you will need a method to initialize a com.google.cloud.storage.Storage object, something like this (set your project ID and the path to the JSON key):
protected final Storage getStorage() {
synchronized (storageInitLock) {
if (null == storage) {
try {
storage = StorageOptions.newBuilder()
.setProjectId(PROJECTID)
.setCredentials(ServiceAccountCredentials.fromStream(new FileInputStream(pathToJsonKey)))
.build()
.getService();
} catch (IOException e) {
throw new MyCustomException("Error reading auth file " + pathToJsonKey, e);
} catch (StorageException e) {
throw new MyCustomException("Error initializing storage", e);
}
}
return storage;
}
}
to get all entries you could use something like:
protected final Iterator<Blob> getAllEntries() {
try {
return getStorage().list(bucketName).iterateAll();
} catch (StorageException e) {
throw new MyCustomException("error retrieving entries", e);
}
}
list files in a directory:
public final Optional<Page<Blob>> listFilesInDirectory(@NotNull String directory) {
try {
return Optional.ofNullable(getStorage().list(getBucketName(), Storage.BlobListOption.currentDirectory(),
Storage.BlobListOption.prefix(directory)));
} catch (Exception e) {
return Optional.empty();
}
}
get info about a file:
public final Optional<Blob> getFileInfo(@NotNull String bucketFilename) {
try {
return Optional.ofNullable(getStorage().get(BlobId.of(getBucketName(), bucketFilename)));
} catch (Exception e) {
return Optional.empty();
}
}
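Since the goal in the question is downloading files, here is a minimal sketch of reading a file's bytes with the same google-cloud-java Storage client (this method is my addition, not part of the original answer; it loads the whole object into memory, so it is only suitable for reasonably small files; getStorage() and getBucketName() are the helpers shown above):

public final Optional<byte[]> downloadFile(@NotNull String bucketFilename) {
    try {
        // Reads the entire object into memory.
        return Optional.of(getStorage().readAllBytes(BlobId.of(getBucketName(), bucketFilename)));
    } catch (Exception e) {
        return Optional.empty();
    }
}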
adding a file:
public final void addFile(@NotNull String localFilename, @NotNull String bucketFilename,
@Nullable ContentType contentType) {
final BlobInfo.Builder builder = BlobInfo.newBuilder(BlobId.of(bucketName, bucketFilename));
if (null != contentType) {
builder.setContentType(contentType.getsValue());
}
final BlobInfo blobInfo = builder.build();
try (final RandomAccessFile raf = new RandomAccessFile(localFilename, "r");
final FileChannel channel = raf.getChannel();
final WriteChannel writer = getStorage().writer(blobInfo)) {
writer.write(channel.map(FileChannel.MapMode.READ_ONLY, 0, channel.size()));
} catch (Exception e) {
throw new MyCustomException(MessageFormat.format("Error storing {0} to {1}", localFilename,
bucketFilename), e);
}
}
I hope these code snippets and the referenced documentation will get you going; actually, it's not too hard.

Best way to handle huge fields with GSON JsonReader

I'm getting a java.lang.OutOfMemoryError: Java heap space even with GSON Streaming.
{"result":"OK","base64":"JVBERi0xLjQKJeLjz9MKMSAwIG9iago8PC...."}
base64 can be up to 200 MB long. GSON is taking much more memory than that (3 GB). When I try to store the base64 in a variable I get:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:2367)
at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:130)
at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:114)
at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:535)
at java.lang.StringBuilder.append(StringBuilder.java:204)
at com.google.gson.stream.JsonReader.nextQuotedValue(JsonReader.java:1014)
at com.google.gson.stream.JsonReader.nextString(JsonReader.java:815)
What is the best way to handle this kind of field?
The reason you're getting the OutOfMemoryError is that GSON's nextString() returns a string that is aggregated into one huge value using a StringBuilder. When you're facing such an issue, you have to deal with the intermediate data somehow, since there is no other choice. Unfortunately, GSON does not let you process huge literals in a streaming way.
I'm not sure if you can change the response payload, but if you can't, you might want to implement your own JSON reader, or "hack" the existing JsonReader to make it work in a streaming fashion. The example below is based on GSON 2.5 and makes heavy use of reflection, because JsonReader hides its state very carefully.
EnhancedGson25JsonReader.java
final class EnhancedGson25JsonReader
extends JsonReader {
// A listener to accept the internal character buffers.
// Accepting a single string built on such buffers is total memory waste as well.
interface ISlicedStringListener {
void accept(char[] buffer, int start, int length)
throws IOException;
}
// The constants can be just copied
/** @see JsonReader#PEEKED_NONE */
private static final int PEEKED_NONE = 0;
/** @see JsonReader#PEEKED_SINGLE_QUOTED */
private static final int PEEKED_SINGLE_QUOTED = 8;
/** @see JsonReader#PEEKED_DOUBLE_QUOTED */
private static final int PEEKED_DOUBLE_QUOTED = 9;
// Here is a bunch of spies made to "spy" for the parent's class state
private final FieldSpy<Integer> peeked;
private final MethodSpy<Integer> doPeek;
private final MethodSpy<Integer> getLineNumber;
private final MethodSpy<Integer> getColumnNumber;
private final FieldSpy<char[]> buffer;
private final FieldSpy<Integer> pos;
private final FieldSpy<Integer> limit;
private final MethodSpy<Character> readEscapeCharacter;
private final FieldSpy<Integer> lineNumber;
private final FieldSpy<Integer> lineStart;
private final MethodSpy<Boolean> fillBuffer;
private final MethodSpy<IOException> syntaxError;
private final FieldSpy<Integer> stackSize;
private final FieldSpy<int[]> pathIndices;
private EnhancedGson25JsonReader(final Reader reader)
throws NoSuchFieldException, NoSuchMethodException {
super(reader);
peeked = spyField(JsonReader.class, this, "peeked");
doPeek = spyMethod(JsonReader.class, this, "doPeek");
getLineNumber = spyMethod(JsonReader.class, this, "getLineNumber");
getColumnNumber = spyMethod(JsonReader.class, this, "getColumnNumber");
buffer = spyField(JsonReader.class, this, "buffer");
pos = spyField(JsonReader.class, this, "pos");
limit = spyField(JsonReader.class, this, "limit");
readEscapeCharacter = spyMethod(JsonReader.class, this, "readEscapeCharacter");
lineNumber = spyField(JsonReader.class, this, "lineNumber");
lineStart = spyField(JsonReader.class, this, "lineStart");
fillBuffer = spyMethod(JsonReader.class, this, "fillBuffer", int.class);
syntaxError = spyMethod(JsonReader.class, this, "syntaxError", String.class);
stackSize = spyField(JsonReader.class, this, "stackSize");
pathIndices = spyField(JsonReader.class, this, "pathIndices");
}
static EnhancedGson25JsonReader getEnhancedGson25JsonReader(final Reader reader) {
try {
return new EnhancedGson25JsonReader(reader);
} catch ( final NoSuchFieldException | NoSuchMethodException ex ) {
throw new RuntimeException(ex);
}
}
// This method has been copied and reworked from the nextString() implementation
void nextSlicedString(final ISlicedStringListener listener)
throws IOException {
int p = peeked.get();
if ( p == PEEKED_NONE ) {
p = doPeek.get();
}
switch ( p ) {
case PEEKED_SINGLE_QUOTED:
nextQuotedSlicedValue('\'', listener);
break;
case PEEKED_DOUBLE_QUOTED:
nextQuotedSlicedValue('"', listener);
break;
default:
throw new IllegalStateException("Expected a string but was " + peek()
+ " at line " + getLineNumber.get()
+ " column " + getColumnNumber.get()
+ " path " + getPath()
);
}
peeked.accept(PEEKED_NONE);
pathIndices.get()[stackSize.get() - 1]++;
}
// The following method is also a copy-paste that was patched for the "spies".
// It's, in principle, the same as the source one, but it has one more buffer singleCharBuffer
// in order not to add another method to the ISlicedStringListener interface (enjoy lambdas as much as possible).
// Note that the main difference between these two methods is that this one
// does not aggregate a single string value, but just delegates the internal
// buffers to call-sites, so the latter ones might do anything with the buffers.
/**
* @see JsonReader#nextQuotedValue(char)
*/
private void nextQuotedSlicedValue(final char quote, final ISlicedStringListener listener)
throws IOException {
final char[] buffer = this.buffer.get();
final char[] singleCharBuffer = new char[1];
while ( true ) {
int p = pos.get();
int l = limit.get();
int start = p;
while ( p < l ) {
final int c = buffer[p++];
if ( c == quote ) {
pos.accept(p);
listener.accept(buffer, start, p - start - 1);
return;
} else if ( c == '\\' ) {
pos.accept(p);
listener.accept(buffer, start, p - start - 1);
singleCharBuffer[0] = readEscapeCharacter.get();
listener.accept(singleCharBuffer, 0, 1);
p = pos.get();
l = limit.get();
start = p;
} else if ( c == '\n' ) {
lineNumber.accept(lineNumber.get() + 1);
lineStart.accept(p);
}
}
listener.accept(buffer, start, p - start);
pos.accept(p);
if ( !fillBuffer.apply(just1) ) {
throw syntaxError.apply(justUnterminatedString);
}
}
}
// Save some memory
private static final Object[] just1 = { 1 };
private static final Object[] justUnterminatedString = { "Unterminated string" };
}
FieldSpy.java
final class FieldSpy<T>
implements Supplier<T>, Consumer<T> {
private final Object instance;
private final Field field;
private FieldSpy(final Object instance, final Field field) {
this.instance = instance;
this.field = field;
}
static <T> FieldSpy<T> spyField(final Class<?> declaringClass, final Object instance, final String fieldName)
throws NoSuchFieldException {
final Field field = declaringClass.getDeclaredField(fieldName);
field.setAccessible(true);
return new FieldSpy<>(instance, field);
}
@Override
public T get() {
try {
@SuppressWarnings("unchecked")
final T value = (T) field.get(instance);
return value;
} catch ( final IllegalAccessException ex ) {
throw new RuntimeException(ex);
}
}
@Override
public void accept(final T value) {
try {
field.set(instance, value);
} catch ( final IllegalAccessException ex ) {
throw new RuntimeException(ex);
}
}
}
MethodSpy.java
final class MethodSpy<T>
implements Function<Object[], T>, Supplier<T> {
private static final Object[] emptyObjectArray = {};
private final Object instance;
private final Method method;
private MethodSpy(final Object instance, final Method method) {
this.instance = instance;
this.method = method;
}
static <T> MethodSpy<T> spyMethod(final Class<?> declaringClass, final Object instance, final String methodName, final Class<?>... parameterTypes)
throws NoSuchMethodException {
final Method method = declaringClass.getDeclaredMethod(methodName, parameterTypes);
method.setAccessible(true);
return new MethodSpy<>(instance, method);
}
@Override
public T get() {
// my javac generates useless new Object[0] if no args passed
return apply(emptyObjectArray);
}
@Override
public T apply(final Object[] arguments) {
try {
@SuppressWarnings("unchecked")
final T value = (T) method.invoke(instance, arguments);
return value;
} catch ( final IllegalAccessException | InvocationTargetException ex ) {
throw new RuntimeException(ex);
}
}
}
HugeJsonReaderDemo.java
And here is a demo that uses that method to read a huge JSON file and redirect its string values to another file.
public static void main(final String... args)
throws IOException {
try ( final EnhancedGson25JsonReader input = getEnhancedGson25JsonReader(new InputStreamReader(new FileInputStream("./huge.json")));
final Writer output = new OutputStreamWriter(new BufferedOutputStream(new FileOutputStream("./huge.json.STRINGS"))) ) {
while ( input.hasNext() ) {
final JsonToken token = input.peek();
switch ( token ) {
case BEGIN_OBJECT:
input.beginObject();
break;
case NAME:
input.nextName();
break;
case STRING:
input.nextSlicedString(output::write);
break;
default:
throw new AssertionError(token);
}
}
}
}
I successfully extracted the fields above to a file. The input file was 544 MB (570,425,371 bytes) in length and was generated out of the following JSON chunks:
{"result":"OK","base64":"
JVBERi0xLjQKJeLjz9MKMSAwIG9iago8PC × 16777216 (2^24)
"}
And the result is (since I just redirect all strings to the file):
OK
JVBERi0xLjQKJeLjz9MKMSAwIG9iago8PC × 16777216 (2^24)
I think you have run into a very interesting issue. It would be nice to have some feedback from the GSON team on a possible API enhancement.

How to eliminate repeat code in a for-loop?

I have implemented two member functions in the same class:
private static void getRequiredTag(Context context) throws IOException
{
//repeated begin
for (Record record : context.getContext().readCacheTable("subscribe")) {
String traceId = record.get("trace_id").toString();
if (traceSet.contains(traceId) == false)
continue;
String tagId = record.get("tag_id").toString();
try {
Integer.parseInt(tagId);
} catch (NumberFormatException e) {
context.getCounter("Error", "tag_id not a number").increment(1);
continue;
}
//repeated end
tagSet.add(tagId);
}
}
private static void addTagToTraceId(Context context) throws IOException
{
//repeated begin
for (Record record : context.getContext().readCacheTable("subscribe")) {
String traceId = record.get("trace_id").toString();
if (traceSet.contains(traceId) == false)
continue;
String tagId = record.get("tag_id").toString();
try {
Integer.parseInt(tagId);
} catch (NumberFormatException e) {
context.getCounter("Error", "tag_id not a number").increment(1);
continue;
}
//repeated end
Vector<String> ret = traceListMap.get(tagId);
if (ret == null) {
ret = new Vector<String>();
}
ret.add(traceId);
traceListMap.put(tagId, ret);
}
}
I call these two member functions from another two member functions (so I can't merge them into one function):
private static void A()
{
getRequiredTag()
}
private static void B()
{
getRequiredTag()
addTagToTraceId()
}
tagSet is a java.util.Set and traceListMap is a java.util.Map.
I know the DRY principle and I really want to eliminate the repeated code, so I came up with this:
private static void getTraceIdAndTagIdFromRecord(Record record, String traceId, String tagId) throws IOException
{
traceId = record.get("trace_id").toString();
tagId = record.get("tag_id").toString();
}
private static boolean checkTagIdIsNumber(String tagId)
{
try {
Integer.parseInt(tagId);
} catch (NumberFormatException e) {
return false;
}
return true;
}
private static void getRequiredTag(Context context) throws IOException
{
String traceId = null, tagId = null;
for (Record record : context.getContext().readCacheTable("subscribe")) {
getTraceIdAndTagIdFromRecord(record, traceId, tagId);
if (traceSet.contains(traceId) == false)
continue;
if (!checkTagIdIsNumber(tagId))
{
context.getCounter("Error", "tag_id not a number").increment(1);
continue;
}
tagSet.add(tagId);
}
}
private static void addTagToTraceId(Context context) throws IOException
{
String traceId = null, tagId = null;
for (Record record : context.getContext().readCacheTable("subscribe")) {
getTraceIdAndTagIdFromRecord(record, traceId, tagId);
if (traceSet.contains(traceId) == false)
continue;
if (!checkTagIdIsNumber(tagId))
{
context.getCounter("Error", "tag_id not a number").increment(1);
continue;
}
Vector<String> ret = traceListMap.get(tagId);
if (ret == null) {
ret = new Vector<String>();
}
ret.add(traceId);
traceListMap.put(tagId, ret);
}
}
It seems I have just created a new repeat... I have no idea how to eliminate the repetition in this case; could anybody give me some advice?
Update 2015-5-13 21:15:12:
Some answers suggest a boolean argument to eliminate the repetition, but I am aware of Robert C. Martin's Clean Code Tip #12: Eliminate Boolean Arguments (you can google it for more details).
Could you give some comment about that?
The parts that change require the values of String tagId and String traceId, so we will start by extracting an interface that takes those parameters:
public interface PerformingInterface {
void accept(String tagId, String traceId);
}
Then extract the common parts into this method:
private static void doSomething(Context context, PerformingInterface perform) throws IOException
{
String traceId = null, tagId = null;
for (Record record : context.getContext().readCacheTable("subscribe")) {
getTraceIdAndTagIdFromRecord(record, traceId, tagId);
if (traceSet.contains(traceId) == false)
continue;
if (!checkTagIdIsNumber(tagId))
{
context.getCounter("Error", "tag_id not a number").increment(1);
continue;
}
perform.accept(tagId, traceId);
}
}
Then call this method in two different ways:
private static void getRequiredTag(Context context) throws IOException {
doSomething(context, new PerformingInterface() {
@Override public void accept(String tagId, String traceId) {
tagSet.add(tagId);
}
});
}
private static void addTagToTraceId(Context context) throws IOException {
doSomething(context, new PerformingInterface() {
@Override public void accept(String tagId, String traceId) {
Vector<String> ret = traceListMap.get(tagId);
if (ret == null) {
ret = new Vector<String>();
}
ret.add(traceId);
traceListMap.put(tagId, ret);
}
});
}
Note that this is a functional interface, so on Java 8 you could pass lambdas instead of the anonymous classes shown above (BiConsumer, a functional interface defined in Java 8, would even serve as the interface), but it is entirely possible to accomplish the same thing on Java 7 and earlier, as shown; it just requires somewhat more verbose code.
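For illustration, a minimal sketch of the Java 8 form, assuming doSomething is changed to accept a java.util.function.BiConsumer<String, String> instead of the custom interface:

private static void getRequiredTag(Context context) throws IOException {
    // the two lambda parameters are tagId and traceId, in that order
    doSomething(context, (tagId, traceId) -> tagSet.add(tagId));
}

private static void addTagToTraceId(Context context) throws IOException {
    doSomething(context, (tagId, traceId) -> {
        Vector<String> ret = traceListMap.get(tagId);
        if (ret == null) {
            ret = new Vector<String>();
        }
        ret.add(traceId);
        traceListMap.put(tagId, ret);
    });
}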
Some other issues with your code:
Way too many things are static.
The Vector class is old; prefer ArrayList (if you need synchronization, wrap it in Collections.synchronizedList); see the sketch after this list.
Always use braces, even for one-liners.
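For example, the Vector-based accumulation could become something like this (a sketch assuming Java 8 and a Map<String, List<String>> field in place of the Vector-valued map):

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// assumed field, replacing the Vector-valued map
private static final Map<String, List<String>> traceListMap = new HashMap<>();

private static void addTrace(String tagId, String traceId) {
    // creates the list on first use, then appends to it
    traceListMap.computeIfAbsent(tagId, k -> new ArrayList<>()).add(traceId);
}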
You could use a stream (haven't tested):
private static Stream<Record> validRecords(Context context) throws IOException {
return context.getContext().readCacheTable("subscribe").stream()
.filter(r -> {
if (!traceSet.contains(traceId(r))) {
return false;
}
try {
Integer.parseInt(tagId(r));
return true;
} catch (NumberFormatException e) {
context.getCounter("Error", "tag_id not a number").increment(1);
return false;
}
});
}
private static String traceId(Record record) {
return record.get("trace_id").toString();
}
private static String tagId(Record record) {
return record.get("tag_id").toString();
}
Then you could do just:
private static void getRequiredTag(Context context) throws IOException {
validRecords(context).map(r -> tagId(r)).forEach(tagSet::add);
}
private static void addTagToTraceId(Context context) throws IOException {
validRecords(context).forEach(r -> {
String tagId = tagId(r);
Vector<String> ret = traceListMap.get(tagId);
if (ret == null) {
ret = new Vector<String>();
}
ret.add(traceId(r));
traceListMap.put(tagId, ret);
});
}
tagId and traceId will always be null in your second attempt: Java passes references by value, so reassigning the parameters inside getTraceIdAndTagIdFromRecord has no effect on the caller's variables.
Nevertheless, one approach would be to extract the code that collects the tagIds (this seems to be the same in both methods) into its own method. Then, in each of the two methods, just iterate over the collection of returned tagIds and perform the method-specific operations on them.
for (String tagId : getTagIds(context)) {
// do method specific logic
}
EDIT
Now I noticed that you also use traceId in the second method. The principle remains the same, just collect Records in a separate method and iterate over them in the two methods (by taking tagId and traceId from records).
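A sketch of that idea on Java 7 (the helper and variable names here are mine, for illustration, and the methods are meant to live in the same class with its existing fields plus java.util.List/ArrayList imports): collect the valid records once, then let each method loop over them.

private static List<Record> getValidRecords(Context context) throws IOException {
    List<Record> valid = new ArrayList<Record>();
    for (Record record : context.getContext().readCacheTable("subscribe")) {
        String traceId = record.get("trace_id").toString();
        if (!traceSet.contains(traceId)) {
            continue;
        }
        String tagId = record.get("tag_id").toString();
        if (!checkTagIdIsNumber(tagId)) {
            context.getCounter("Error", "tag_id not a number").increment(1);
            continue;
        }
        valid.add(record);
    }
    return valid;
}

private static void getRequiredTag(Context context) throws IOException {
    for (Record record : getValidRecords(context)) {
        tagSet.add(record.get("tag_id").toString());
    }
}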
The solution with lambdas is the most elegant one, but without them it involves creating a separate interface and two anonymous classes, which is too verbose for this use case (honestly, here I would rather go with a boolean argument than with a strategy pattern without lambdas).
Try this approach
private static void imYourNewMethod(Context context, boolean isAddTag) {
String traceId = null, tagId = null;
for (Record record : context.getContext().readCacheTable("subscribe")) {
getTraceIdAndTagIdFromRecord(record, traceId, tagId);
if (traceSet.contains(traceId) == false)
continue;
if (!checkTagIdIsNumber(tagId))
{
context.getCounter("Error", "tag_id not a number").increment(1);
continue;
}
if(isAddTag){
Vector<String> ret = traceListMap.get(tagId);
if (ret == null) {
ret = new Vector<String>();
}
ret.add(traceId);
traceListMap.put(tagId, ret);
}else{
tagSet.add(tagId);
}
}
}
Call this method with one extra boolean parameter: true if you want the addTagToTraceId behaviour, false if you want the getRequiredTag behaviour.

Fetching nodes with a label and index using cypher

I want to fetch a node and its information by label and index using a Cypher query, but I still get a null value in the variable "result" in the CypherQuery() method.
NewCypherQuery.java (bean)
public class NewCypherQuery {
private static final String DB_PATH = "/var/lib/neo4j/data/";
private static String resultString;
private static String columnsString, nodeResult, rows = "", query;
private static ExecutionResult result;
private static ExecutionEngine engine;
private static GraphDatabaseService db;
private static Node amad = null, pari = null, sona = null;
private static Relationship rel;
private static IndexDefinition inxamd, inxpri;
private static Label amd,pri;
public static void callAllMethods() {
clearDbPath();
setUp();
createNodes();
CypherQuery();
}
public static void CypherQuery() {
try (Transaction ignored = db.beginTx();) {
result = engine.execute("MATCH (m:inxamd)-->(n:inxpri) USING INDEX m:inxamd(name) USING INDEX n:inxpri(name) WHERE m.name = 'Amad' AND n.name= 'Pari' RETURN m");
for (Map<String, Object> row : result) {
resultString = engine.execute("MATCH (m:inxamd)-->(n:inxpri) USING INDEX m:inxamd(name) USING INDEX n:inxpri(name) WHERE m.name = 'Amad' AND n.name= 'Pari' RETURN m").dumpToString();
System.out.println(resultString);
}
} catch (Exception e) {
e.printStackTrace();
}
}
private static void setUp() {
try {
db = new GraphDatabaseFactory().newEmbeddedDatabase(DB_PATH);
try (Transaction tx = db.beginTx()) {
engine = new ExecutionEngine(db);
Schema schema = db.schema();
inxamd = schema.indexFor(amd).on("name").create();
inxpri = schema.indexFor(pri).on("name").create();
tx.success();
}
} catch (Exception e) {
e.printStackTrace();
}
}
private static void createNodes() {
try (Transaction tx = db.beginTx();) {
amad = db.createNode();
amad.setProperty("name", "Amad");
amad.setProperty("age", 24);
amad.setProperty("edu", "mscit");
pari = db.createNode();
pari.setProperty("name", "Pari");
pari.setProperty("age", 20);
pari.setProperty("edu", "mscit");
sona = db.createNode();
sona.setProperty("name", "Sona");
sona.setProperty("age", 21);
sona.setProperty("edu", "mscit");
rel = amad.createRelationshipTo(pari, RelTypes.KNOWS);
rel.setProperty("rel", "friend");
rel = pari.createRelationshipTo(sona, RelTypes.KNOWS);
rel.setProperty("rel", "friend");
System.out.println("Nodes created.....");
tx.success();
} catch (Exception e) {
e.printStackTrace();
}
}
private static void clearDbPath() {
try {
deleteRecursively(new File(DB_PATH));
} catch (IOException e) {
throw new RuntimeException(e);
}
}
}
InsertNodes.java (Servlet):
protected void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
// TODO Auto-generated method stub
NewCypherQuery ncq=new NewCypherQuery();
ncq.callAllMethods();
}
index.jsp
<form method="post" action="InsertNodes">
<input type="text" name="txtname" value="Hello World !!!!!"></input>
<input type="submit" value="Neo4j World"></input>
</form>
You haven't actually defined values for the two labels amd (inxamd) and pri (inxpri), nor have you assigned them to any of the nodes you created.
You can either implement the Label interface yourself and give the label a name such as "inxamd", or use DynamicLabel.
Then, assign the label to your node using
amad.addLabel(thelabel);
Unrelated to your issue above, a label name is usually a descriptive string indicating what set the node belongs to, e.g. Person or Dog.
And in most cases you don't need to explicitly provide the index hint (USING INDEX); the query planner will normally use the index automatically.
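A minimal sketch of how setUp() and createNodes() from the question could be adjusted (Neo4j 2.x embedded API, as used in the question; the label names are chosen here to match the ones referenced in the Cypher query):

// give the labels actual names (DynamicLabel is the 2.x API; newer versions use Label.label("..."))
amd = DynamicLabel.label("inxamd");
pri = DynamicLabel.label("inxpri");

// in setUp(): create the schema indexes on the "name" property of each label
try (Transaction tx = db.beginTx()) {
    db.schema().indexFor(amd).on("name").create();
    db.schema().indexFor(pri).on("name").create();
    tx.success();
}

// in createNodes(): attach the labels to the nodes
amad = db.createNode(amd);   // or amad.addLabel(amd); on an existing node
amad.setProperty("name", "Amad");
pari = db.createNode(pri);
pari.setProperty("name", "Pari");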
