I am playing around with JGroups as a distributed system. I want to create objects on a remote JVM and use them as if they were created locally, so I am using a java.lang.reflect.Proxy to wrap the RPC calls. This gives RMI-like behavior, and it works very well.
But now I want to garbage-collect the remote object once the client-side interface/proxy is no longer in use. I thought I could get this working with a WeakReference, but my cleanup cycle never finds any enqueued references. What am I missing?
public class RMILikeWrapper {
private static final ScheduledExecutorService garbageCollector = Executors.newSingleThreadScheduledExecutor();
private final Map<String, Object> remoteObjects = new ConcurrentHashMap<>();
private final Map<WeakReference<?>, IdPointer> garbageTracker = new ConcurrentHashMap<>();
private final ReferenceQueue<Object> rq = new ReferenceQueue<>();
private final RpcDispatcher rpcDispatcher;
private final long callTimeout = 10000L;
private class IdPointer {
public final Address address;
public final String id;
public IdPointer(Address address, String id) {
this.address = address;
this.id = id;
}
@Override
public String toString() {
return "IdPointer{" + "address=" + address + ", id='" + id + '\'' + '}';
}
}
public RMILikeWrapper(Channel channel) {
this.rpcDispatcher = new RpcDispatcher(channel, null, null, this);
// enable garbage collecting
garbageCollector.scheduleWithFixedDelay(new Runnable() {
@Override
public void run() {
System.out.println("my GC ");
Reference<?> ref; //this should be our weak reference
while((ref = rq.poll()) != null) {
// remove weak reference from the map
IdPointer garbage = garbageTracker.remove(ref);
System.out.println("found expired weak references: " + garbage);
// now we need to destroy the remote object too
try {
rpcDispatcher.callRemoteMethod(garbage.address, "purge", new Object[]{garbage.id},
new Class[]{String.class}, new RequestOptions(ResponseMode.GET_FIRST, callTimeout));
} catch (Exception e) {
e.printStackTrace();
}
}
}
},0,10, TimeUnit.SECONDS);
}
public <T>T createRemoteObject(Class<T> proxyInterface, Address targetNode, Class c, Object[] args, Class[] argTypes) {
try {
Object[] remoteArgs = new Object[4];
remoteArgs[0] = UUID.randomUUID().toString();
remoteArgs[1] = c;
remoteArgs[2] = args;
remoteArgs[3] = argTypes;
rpcDispatcher.callRemoteMethod(targetNode, "addObject", remoteArgs,
new Class[]{String.class, Class.class, Object[].class, Class[].class},
new RequestOptions(ResponseMode.GET_FIRST, callTimeout));
// now get an interface stub for this object
return getRemoteObject(targetNode, remoteArgs[0].toString(), proxyInterface);
} catch (Exception e) {
e.printStackTrace();
}
return null;
}
// Operation triggered by RPC
public void addObject(String id, Class c, Object[] args, Class[] parameterTypes) throws Exception {
remoteObjects.put(id, c.getConstructor(parameterTypes).newInstance(args));
}
// Operation triggered by RPC
public Object invoke(String id, String methodName, Object[] args, Class[] argTypes) throws Exception {
Object ro = remoteObjects.get(id);
return ro.getClass().getMethod(methodName, argTypes).invoke(ro, args);
}
// Operation triggered by RPC
public void purge(String id) {
System.out.println("garbage collecting: " + id);
//return remoteObjects.remove(id) != null;
remoteObjects.remove(id);
}
public <T>T getRemoteObject(final Address nodeAddress, final String id, final Class<T> clazz) {
if (!clazz.isInterface()) throw new RuntimeException("Class has to be an interface!");
InvocationHandler handler = new InvocationHandler() {
@Override
public Object invoke (Object proxy, Method method, Object[] args) throws Throwable {
Object[] remoteArgs = new Object[4];
remoteArgs[0] = id;
remoteArgs[1] = method.getName();
remoteArgs[2] = args;
remoteArgs[3] = method.getParameterTypes();
// remote call
return rpcDispatcher.callRemoteMethod(nodeAddress, "invoke",
remoteArgs, new Class[]{String.class, String.class, Object[].class, Class[].class},
new RequestOptions(ResponseMode.GET_FIRST, callTimeout));
}
};
T result = (T) Proxy.newProxyInstance(
clazz.getClassLoader(),
new Class[]{clazz},
handler);
// use weak references to the proxy object here; if one is garbage collected, purge the remote object as well
WeakReference<T> weakReference = new WeakReference<>(result, rq);
garbageTracker.put(weakReference, new IdPointer(nodeAddress, id));
return result;
}
public static void main(String[] args) throws Exception {
Channel channel = new JChannel();
channel.connect("test-cluster");
List<Address> members = channel.getView().getMembers();
RMILikeWrapper w = new RMILikeWrapper(channel);
if (members.size() > 1) {
System.out.println("send to " + members.get(0));
FooInterface remoteObject = w.createRemoteObject(FooInterface.class, members.get(0), FooImpl.class, null, null);
System.out.println(remoteObject.doSomething("Harr harr harr"));
remoteObject = null;
}
System.out.println(channel.getView().getMembers());
}
}
Using the following methods you can see how the GC treats your weak references.
Option 1:
-verbose:gc
This option logs GC activity every time a collection runs, so you can check the log to confirm whether the GC actually kicked in. For interactive GC analysis, load the log into http://www.ibm.com/developerworks/java/jdk/tools/gcmv/
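For example, assuming a Java 8 HotSpot JVM (the exact flag names differ on newer JVM versions, and your-app.jar is just a placeholder), you could start the application with GC logging written to a file and then load that file into GCMV:
java -verbose:gc -Xloggc:gc.log -jar your-app.jar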
Option 2:
Collect a heap dump and load it into https://www.ibm.com/developerworks/java/jdk/tools/memoryanalyzer/
Write an OQL (Object Query Language) query in the OQL section:
select * from package(s).classname
and click the ! button on the toolbar.
It will give a list of objects of that type.
Right-click the objects -> Path to GC Roots -> Exclude soft/weak/phantom references.
If the suspected object does not have any strong references it will show NULL; otherwise
you will get information on who is holding strong references to the suspected object.
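To rule out the JGroups side entirely, it can also help to verify the WeakReference/ReferenceQueue mechanics in isolation. Below is a minimal, self-contained sketch (the class name WeakRefDemo is made up, none of the question's classes are used) that drops the only strong reference, hints the GC, and waits on the queue; if this prints an enqueued reference but the RMILikeWrapper loop never finds one, the proxy is most likely still strongly reachable somewhere.
import java.lang.ref.Reference;
import java.lang.ref.ReferenceQueue;
import java.lang.ref.WeakReference;

public class WeakRefDemo {
    public static void main(String[] args) throws InterruptedException {
        ReferenceQueue<Object> queue = new ReferenceQueue<>();
        Object payload = new Object();                              // the only strong reference
        WeakReference<Object> ref = new WeakReference<>(payload, queue);

        payload = null;    // drop the strong reference
        System.gc();       // only a hint, but usually enough for a demo

        // remove() blocks until the reference is enqueued or the timeout (ms) elapses
        Reference<?> enqueued = queue.remove(5000);
        System.out.println("enqueued: " + enqueued + ", same reference: " + (enqueued == ref));
    }
}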
[ISSUE] The repo always returns null when I call repo methods; while stepping through, it throws a NullPointerException, and then the front end receives:
500: Http failure response for http://localhost:4200/api/aiprollout/updatecsv: 500 Internal Server Error
[HAVE TRIED] Adjusting @Autowired, @Component and @Service annotations.
[QUESTIONS]
1- Does every repo method need its own service and controller method?
2- Is it okay to create a new service that uses an existing controller?
3- If this new service uses SuperCsv and I create custom CsvCellProcessors, can these cell processors also call the repo? Should these cell processors perform logic, or should it be done elsewhere? What class annotations should these cell processor classes have? @Component? (One possible wiring is sketched after the code below.)
Any advice is greatly appreciated; I feel a little lost at this point and am not even sure what to do.
[CODE]
Controller:
@RestController
@EnableConfigurationProperties({SpoofingConfigurationProperties.class})
@RequestMapping(value = "")
public class AipRolloutController {
private final Logger logger = some logger
private final AipRolloutService AipRolloutService;
private final CsvParserService csvParserService;
@Autowired
public AipRolloutController(AipRolloutService aipRolloutService, CsvParserService csvParserService) {
this.AipRolloutService = aipRolloutService;
this.csvParserService = csvParserService;
}
@PostMapping(value = "/updatecsv", produces = MediaType.APPLICATION_JSON_VALUE)
@ResponseBody
public ResponseEntity<?> processCsv(@RequestParam("csvFile") MultipartFile csvFile) throws IOException {
if (csvFile.isEmpty()) return new ResponseEntity(
responceJson("please select a file!"),
HttpStatus.NO_CONTENT
);
csvParserService.parseCsvFile(csvFile);
return new ResponseEntity(
responceJson("Successfully uploaded - " + csvFile.getOriginalFilename()),
new HttpHeaders(),
HttpStatus.CREATED
);
}
Service:
@Service
public class AipRolloutService {
private static final Logger logger = some logger
@Autowired
private AIPRolloutRepository AIPRolloutRepository;
New CSV parser service:
@Service
public class CsvParserService {
@Autowired private AipRolloutService aipRolloutService;
public CsvParserService(AipRolloutService aipRolloutService) {
this.aipRolloutService = aipRolloutService;
}
public void parseCsvFile(MultipartFile csvFile) throws IOException {
CsvMapReader csvMapReader = new CsvMapReader(new InputStreamReader(csvFile.getInputStream()), CsvPreference.STANDARD_PREFERENCE);
parseCsv(csvMapReader);
csvMapReader.close();
}
private void parseCsv(CsvMapReader csvMapReader) throws IOException {
String[] header = csvMapReader.getHeader(true);
List<String> headers = Arrays.asList(header);
verifySourceColumn(headers);
verifyPovColumn(headers);
final CellProcessor[] processors = getProcessors(headers);
Map<String, Object> csvImportMap = null;
while ((csvImportMap = csvMapReader.read(header, processors)) != null) {
CsvImportDTO csvImportDto = new CsvImportDTO(csvImportMap);
if ( activationTypeP(csvImportDto) ){
int mssValue = Integer.parseInt(csvImportDto.getMssValue());
aipRolloutService.updateAipRollout(csvImportDto.getSource(),
csvImportDto.getPov(),
csvImportDto.getActivationType(),
mssValue);
}
}
}
private CellProcessor[] getProcessors(List<String> headers) {
CellProcessor[] processors = new CellProcessor[headers.size()];
int index = 0;
for (String header : headers) {
if (header.contains(SOURCE_ID)) {
processors[index++] = new CsvSourceIdCellParser();
} else if (header.contains(POV)) {
processors[index++] = new CsvPovCellParser();
} else if (header.contains(ACTIVATION_TYPE)) {
processors[index++] = new CsvActivationTypeCellParser();
} else if (header.contains(ACTIVATION_DATE)) {
processors[index++] = new Optional();
} else if (header.contains(DEACTIVATION_DATE)) {
processors[index++] = new Optional();
} else if (header.contains(MSS_VALUE)) {
processors[index++] = new CsvMssValueCellParser();
} else {
processors[index++] = null; // throw exception? wrong header info instead of allowing null?
}
}
return processors;
}
Custom Cell Processor that calls repo and returns null
public class CsvSourceIdCellParser extends CellProcessorAdaptor {
@Autowired AIPRolloutRepository aipRolloutRepository;
public CsvSourceIdCellParser(){ super(); }
// this constructor allows other processors to be chained
public CsvSourceIdCellParser(CellProcessor next){ super(next); }
@Override
public Object execute(Object value, CsvContext csvContext) {
// throws an Exception if the input is null
validateInputNotNull(value, csvContext);
// get rid of description only need first 3 #'s
value = value.toString().substring(0,3);
// check if WH exists
if( aipRolloutRepository.dcExistsInDatabase(value.toString()) )
return value;
else
throw new RuntimeException("Check Warehouse Value, Value Not Found "
+ "Row number: " + csvContext.getRowNumber()
+ " Column number: " + csvContext.getColumnNumber());
}
}
Repository
@Repository
public class AIPRolloutRepository {
private static final Logger logger = LoggerFactory.getLogger(AIPRolloutRepository.class);
@Autowired
JdbcTemplate jdbcTemplate;
public AIPRolloutRepository() {
}
public boolean dcExistsInDatabase(String dc){
// Query for a count saves time and memory, query for distinct saves time and memory on execution
boolean hasRecord =
jdbcTemplate
.query( "select count (distinct '" + dc +"')" +
"from xxcus.XX_AIP_ROLLOUT" +
"where DC = '" + dc + "';",
new Object[] { dc },
(ResultSet rs) -> {
if (rs.next()) {
return true;
}
return false;
}
);
return hasRecord;
}
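A note on question 3 above: Spring only injects @Autowired fields into beans that Spring itself creates, so a processor instantiated with new CsvSourceIdCellParser() will have a null aipRolloutRepository, which would produce exactly the NullPointerException described. Below is a minimal sketch, not a definitive fix, of one way to wire it: the Spring-managed CsvParserService passes its injected repository into the processor through the constructor (only the repository and SuperCSV types are taken from the question; the rest is assumed).
import org.supercsv.cellprocessor.CellProcessorAdaptor;
import org.supercsv.util.CsvContext;

public class CsvSourceIdCellParser extends CellProcessorAdaptor {

    // Supplied by whoever builds the processor array (e.g. the Spring-managed CsvParserService),
    // instead of relying on field injection, which never happens for objects created with "new".
    private final AIPRolloutRepository aipRolloutRepository;

    public CsvSourceIdCellParser(AIPRolloutRepository aipRolloutRepository) {
        this.aipRolloutRepository = aipRolloutRepository;
    }

    @Override
    public Object execute(Object value, CsvContext context) {
        validateInputNotNull(value, context);                   // throws if the cell is null
        String sourceId = value.toString().substring(0, 3);     // keep only the first 3 characters
        if (aipRolloutRepository.dcExistsInDatabase(sourceId)) {
            return sourceId;
        }
        throw new RuntimeException("Check warehouse value, value not found. Row: "
                + context.getRowNumber() + ", column: " + context.getColumnNumber());
    }
}

// In CsvParserService (a Spring bean, so the repository can be @Autowired there):
//     processors[index++] = new CsvSourceIdCellParser(aipRolloutRepository);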
I'm getting a java.lang.OutOfMemoryError: Java heap space even with GSON Streaming.
{"result":"OK","base64":"JVBERi0xLjQKJeLjz9MKMSAwIG9iago8PC...."}
The base64 field can be up to 200 MB long. GSON is taking much more memory than that (3 GB). When I try to store the base64 value in a variable I get:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:2367)
at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:130)
at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:114)
at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:535)
at java.lang.StringBuilder.append(StringBuilder.java:204)
at com.google.gson.stream.JsonReader.nextQuotedValue(JsonReader.java:1014)
at com.google.gson.stream.JsonReader.nextString(JsonReader.java:815)
What is the best way to handle this kind of field?
The reason you're getting the OutOfMemoryError is that GSON's nextString() returns a string that is aggregated by building one very large string with a StringBuilder. When you're facing such an issue, you have to deal with the intermediate data, since there is no other choice. Unfortunately, GSON does not let you process huge literals in any way.
Not sure if you can change the response payload, but if you can't, you might want to implement your own JSON reader or "hack" the existing JsonReader to make it work in a streaming fashion. The example below is based on GSON 2.5 and makes heavy use of reflection, because JsonReader hides its state very carefully.
EnhancedGson25JsonReader.java
final class EnhancedGson25JsonReader
extends JsonReader {
// A listener to accept the internal character buffers.
// Accepting a single string built on such buffers is total memory waste as well.
interface ISlicedStringListener {
void accept(char[] buffer, int start, int length)
throws IOException;
}
// The constants can be just copied
/** @see JsonReader#PEEKED_NONE */
private static final int PEEKED_NONE = 0;
/** @see JsonReader#PEEKED_SINGLE_QUOTED */
private static final int PEEKED_SINGLE_QUOTED = 8;
/** @see JsonReader#PEEKED_DOUBLE_QUOTED */
private static final int PEEKED_DOUBLE_QUOTED = 9;
// Here is a bunch of spies used to "spy" on the parent class's state
private final FieldSpy<Integer> peeked;
private final MethodSpy<Integer> doPeek;
private final MethodSpy<Integer> getLineNumber;
private final MethodSpy<Integer> getColumnNumber;
private final FieldSpy<char[]> buffer;
private final FieldSpy<Integer> pos;
private final FieldSpy<Integer> limit;
private final MethodSpy<Character> readEscapeCharacter;
private final FieldSpy<Integer> lineNumber;
private final FieldSpy<Integer> lineStart;
private final MethodSpy<Boolean> fillBuffer;
private final MethodSpy<IOException> syntaxError;
private final FieldSpy<Integer> stackSize;
private final FieldSpy<int[]> pathIndices;
private EnhancedGson25JsonReader(final Reader reader)
throws NoSuchFieldException, NoSuchMethodException {
super(reader);
peeked = spyField(JsonReader.class, this, "peeked");
doPeek = spyMethod(JsonReader.class, this, "doPeek");
getLineNumber = spyMethod(JsonReader.class, this, "getLineNumber");
getColumnNumber = spyMethod(JsonReader.class, this, "getColumnNumber");
buffer = spyField(JsonReader.class, this, "buffer");
pos = spyField(JsonReader.class, this, "pos");
limit = spyField(JsonReader.class, this, "limit");
readEscapeCharacter = spyMethod(JsonReader.class, this, "readEscapeCharacter");
lineNumber = spyField(JsonReader.class, this, "lineNumber");
lineStart = spyField(JsonReader.class, this, "lineStart");
fillBuffer = spyMethod(JsonReader.class, this, "fillBuffer", int.class);
syntaxError = spyMethod(JsonReader.class, this, "syntaxError", String.class);
stackSize = spyField(JsonReader.class, this, "stackSize");
pathIndices = spyField(JsonReader.class, this, "pathIndices");
}
static EnhancedGson25JsonReader getEnhancedGson25JsonReader(final Reader reader) {
try {
return new EnhancedGson25JsonReader(reader);
} catch ( final NoSuchFieldException | NoSuchMethodException ex ) {
throw new RuntimeException(ex);
}
}
// This method has been copied and reworked from the nextString() implementation
void nextSlicedString(final ISlicedStringListener listener)
throws IOException {
int p = peeked.get();
if ( p == PEEKED_NONE ) {
p = doPeek.get();
}
switch ( p ) {
case PEEKED_SINGLE_QUOTED:
nextQuotedSlicedValue('\'', listener);
break;
case PEEKED_DOUBLE_QUOTED:
nextQuotedSlicedValue('"', listener);
break;
default:
throw new IllegalStateException("Expected a string but was " + peek()
+ " at line " + getLineNumber.get()
+ " column " + getColumnNumber.get()
+ " path " + getPath()
);
}
peeked.accept(PEEKED_NONE);
pathIndices.get()[stackSize.get() - 1]++;
}
// The following method is also a copy-paste that was patched for the "spies".
// It is, in principle, the same as the source one, but it has one more buffer, singleCharBuffer,
// in order not to add another method to the ISlicedStringListener interface (enjoy lambdas as much as possible).
// Note that the main difference between these two methods is that this one
// does not aggregate a single string value, but just delegates the internal
// buffers to call-sites, so the latter ones might do anything with the buffers.
/**
* @see JsonReader#nextQuotedValue(char)
*/
private void nextQuotedSlicedValue(final char quote, final ISlicedStringListener listener)
throws IOException {
final char[] buffer = this.buffer.get();
final char[] singleCharBuffer = new char[1];
while ( true ) {
int p = pos.get();
int l = limit.get();
int start = p;
while ( p < l ) {
final int c = buffer[p++];
if ( c == quote ) {
pos.accept(p);
listener.accept(buffer, start, p - start - 1);
return;
} else if ( c == '\\' ) {
pos.accept(p);
listener.accept(buffer, start, p - start - 1);
singleCharBuffer[0] = readEscapeCharacter.get();
listener.accept(singleCharBuffer, 0, 1);
p = pos.get();
l = limit.get();
start = p;
} else if ( c == '\n' ) {
lineNumber.accept(lineNumber.get() + 1);
lineStart.accept(p);
}
}
listener.accept(buffer, start, p - start);
pos.accept(p);
if ( !fillBuffer.apply(just1) ) {
throw syntaxError.apply(justUnterminatedString);
}
}
}
// Save some memory
private static final Object[] just1 = { 1 };
private static final Object[] justUnterminatedString = { "Unterminated string" };
}
FieldSpy.java
final class FieldSpy<T>
implements Supplier<T>, Consumer<T> {
private final Object instance;
private final Field field;
private FieldSpy(final Object instance, final Field field) {
this.instance = instance;
this.field = field;
}
static <T> FieldSpy<T> spyField(final Class<?> declaringClass, final Object instance, final String fieldName)
throws NoSuchFieldException {
final Field field = declaringClass.getDeclaredField(fieldName);
field.setAccessible(true);
return new FieldSpy<>(instance, field);
}
@Override
public T get() {
try {
@SuppressWarnings("unchecked")
final T value = (T) field.get(instance);
return value;
} catch ( final IllegalAccessException ex ) {
throw new RuntimeException(ex);
}
}
@Override
public void accept(final T value) {
try {
field.set(instance, value);
} catch ( final IllegalAccessException ex ) {
throw new RuntimeException(ex);
}
}
}
MethodSpy.java
final class MethodSpy<T>
implements Function<Object[], T>, Supplier<T> {
private static final Object[] emptyObjectArray = {};
private final Object instance;
private final Method method;
private MethodSpy(final Object instance, final Method method) {
this.instance = instance;
this.method = method;
}
static <T> MethodSpy<T> spyMethod(final Class<?> declaringClass, final Object instance, final String methodName, final Class<?>... parameterTypes)
throws NoSuchMethodException {
final Method method = declaringClass.getDeclaredMethod(methodName, parameterTypes);
method.setAccessible(true);
return new MethodSpy<>(instance, method);
}
@Override
public T get() {
// my javac generates useless new Object[0] if no args passed
return apply(emptyObjectArray);
}
@Override
public T apply(final Object[] arguments) {
try {
@SuppressWarnings("unchecked")
final T value = (T) method.invoke(instance, arguments);
return value;
} catch ( final IllegalAccessException | InvocationTargetException ex ) {
throw new RuntimeException(ex);
}
}
}
HugeJsonReaderDemo.java
And here is a demo that uses that method to read a huge JSON document and redirect its string values to another file.
public static void main(final String... args)
throws IOException {
try ( final EnhancedGson25JsonReader input = getEnhancedGson25JsonReader(new InputStreamReader(new FileInputStream("./huge.json")));
final Writer output = new OutputStreamWriter(new BufferedOutputStream(new FileOutputStream("./huge.json.STRINGS"))) ) {
while ( input.hasNext() ) {
final JsonToken token = input.peek();
switch ( token ) {
case BEGIN_OBJECT:
input.beginObject();
break;
case NAME:
input.nextName();
break;
case STRING:
input.nextSlicedString(output::write);
break;
default:
throw new AssertionError(token);
}
}
}
}
I successfully extracted the fields above to a file. The input file was 544 MB (570 425 371 bytes) long and was generated out of the following JSON chunks:
{"result":"OK","base64":"
JVBERi0xLjQKJeLjz9MKMSAwIG9iago8PC × 16777216 (2^24)
"}
And the result is (since I just redirect all strings to the file):
OK
JVBERi0xLjQKJeLjz9MKMSAwIG9iago8PC × 16777216 (2^24)
I think you have faced a very interesting issue. It would be nice to have some feedback from the GSON team on a possible API enhancement.
I have a multithreaded environment in an Android app. I use a singleton class to store data. This singleton class contains an ArrayList that is accessed through a synchronized method.
The app uses this ArrayList to render images in the app.
Initial problem: A ConcurrentModificationException used to occur, so I made the ArrayList getter synchronized.
Current problem: The ConcurrentModificationException no longer occurs, but occasionally an empty ArrayList is returned (maybe when there is concurrent access).
Objective: I want to detect the concurrent modification so that, instead of an empty ArrayList being returned, I can return the last state of the ArrayList.
public synchronized List<FrameData> getCurrentDataToShow() {
List<FrameData> lisCurrDataToShow = new ArrayList<FrameData>();
//for (FrameData fd : listFrameData) {//concurrent modification exception
//todo iterator test
Iterator<FrameData> iterator = listFrameData.iterator();
while (iterator.hasNext()) {
FrameData fd = iterator.next();
long currentTimeInMillis = java.lang.System.currentTimeMillis();
if ((currentTimeInMillis > fd.getStartDate().getTime() && currentTimeInMillis < fd.getEndDate().getTime()) || (fd.isAllDay() && DateUtils.isToday(fd.getStartDate().getTime()))) {
if (new File(ImageFrameActivity.ROOT_FOLDER_FILES + fd.getFileName()).exists()) {
lisCurrDataToShow.add(fd);
}
}
}
if (lisCurrDataToShow.size() == 0) {
lisCurrDataToShow.add(new FrameData(defaultFileName, null, null, null, String.valueOf(120), false));
}
return lisCurrDataToShow;
}
I have referred to "Detecting concurrent modifications?".
Please help!
EDIT 1:
This problem occurs rarely, not every time.
If one thread is executing getCurrentDataToShow() and another thread tries to call this function, what will the function return? I'm new to multithreading, please guide me.
EDIT 2:
In onCreate, the following methods of the singleton are called periodically:
DataModelManager.getInstance().getCurrentDataToShow();
DataModelManager.getInstance().parseData(responseString);
Complete singleton class
public class DataModelManager {
private static DataModelManager dataModelManager;
private ImageFrameActivity imageFrameAct;
private String defaultFileName;
public List<FrameData> listFrameData = new ArrayList<FrameData>();
// public CopyOnWriteArrayList<FrameData> listFrameData= new CopyOnWriteArrayList<FrameData>();
private String screensaverName;
private boolean isToDownloadDeafultFiles;
private String tickerMsg = null;
private boolean showTicker = false;
private boolean showHotspot = false;
private String hotspotFileName=null;
public String getDefaultFileName() {
return defaultFileName;
}
public boolean isToDownloadDeafultFiles() {
return isToDownloadDeafultFiles;
}
public void setToDownloadDeafultFiles(boolean isToDownloadDeafultFiles) {
this.isToDownloadDeafultFiles = isToDownloadDeafultFiles;
}
private String fileNames;
private DataModelManager() {
}
public static DataModelManager getInstance() {
if (dataModelManager == null) {
synchronized (DataModelManager.class) {
if (dataModelManager == null) {
dataModelManager = new DataModelManager();
}
}
}
return dataModelManager;
}
private synchronized void addImageData(FrameData frameData) {
//Log.d("Frame Data","Start date "+frameData.getStartDate()+ " " +"end date "+frameData.getEndDate());
listFrameData.add(frameData);
}
public synchronized void parseData(String jsonStr) throws JSONException {
listFrameData.clear();
if (jsonStr == null) {
return;
}
List<String> listFileNames = new ArrayList<String>();
JSONArray jsonArr = new JSONArray(jsonStr);
int length = jsonArr.length();
for (int i = 0; i < length; i++) {
JSONObject jsonObj = jsonArr.getJSONObject(i);
dataModelManager.addImageData(new FrameData(jsonObj.optString("filename", ""), jsonObj.optString("start", ""), jsonObj.optString("end", ""), jsonObj.optString("filetype", ""), jsonObj.optString("playTime", ""), jsonObj.optBoolean("allDay", false)));
listFileNames.add(jsonObj.optString("filename", ""));
}
fileNames = listFileNames.toString();
}
public void setDefaultFileData(String jsonStr) throws JSONException {
JSONObject jsonObj = new JSONObject(jsonStr);
defaultFileName = jsonObj.optString("default_image", "");
screensaverName = jsonObj.optString("default_screensaver ", "");
}
@Override
public String toString() {
return fileNames.replace("[", "").replace("]", "") + "," + defaultFileName + "," + screensaverName;
}
public FrameData getFrameData(int index) {
return listFrameData.get(index);
}
public synchronized List<FrameData> getCurrentDataToShow() {
List<FrameData> lisCurrDataToShow = new ArrayList<FrameData>();
// for (FrameData fd : listFrameData) {//concurrent modification exception
//todo iterator test
Iterator<FrameData> iterator = listFrameData.iterator();
while (iterator.hasNext()) {
FrameData fd = iterator.next();
long currentTimeInMillis = java.lang.System.currentTimeMillis();
if ((currentTimeInMillis > fd.getStartDate().getTime() && currentTimeInMillis < fd.getEndDate().getTime()) || (fd.isAllDay() && DateUtils.isToday(fd.getStartDate().getTime()))) {
if (new File(ImageFrameActivity.ROOT_FOLDER_FILES + fd.getFileName()).exists()) {
lisCurrDataToShow.add(fd);
}
}
}
if (lisCurrDataToShow.size() == 0) {
lisCurrDataToShow.add(new FrameData(defaultFileName, null, null, null, String.valueOf(120), false));
}
return lisCurrDataToShow;
}
public String getCurrentFileNames() {
String currFileNames = "";
List<FrameData> currFrameData = getCurrentDataToShow();
for (FrameData data : currFrameData) {
currFileNames += "," + data.getFileName();
}
return currFileNames;
}
public ImageFrameActivity getImageFrameAct() {
return imageFrameAct;
}
public void setImageFrameAct(ImageFrameActivity imageFrameAct) {
this.imageFrameAct = imageFrameAct;
}
}
This is the only part of your question that is currently answerable:
If a threads is accessing getCurrentDataToShow() and another thread tries to access this function what will the function return?
It depends on whether you are calling getCurrentDataToShow() on the same target object; i.e. what this is.
If this is the same for both calls, then the first call will complete before the second call starts.
If this is different, you will be locking on different objects, and the two calls could overlap. Two threads need to lock the same object to achieve mutual exclusion.
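As a minimal illustration of that last point (the class name below is made up, nothing is taken from the question): calls only exclude each other when they synchronize on the same instance.
public class LockDemo {
    public synchronized void slowCall() {                     // locks on "this"
        try { Thread.sleep(1000); } catch (InterruptedException ignored) { }
    }

    public static void main(String[] args) {
        LockDemo shared = new LockDemo();
        // Same target object: the two calls run strictly one after the other.
        new Thread(shared::slowCall).start();
        new Thread(shared::slowCall).start();

        // Different target objects: the two calls can overlap freely.
        new Thread(() -> new LockDemo().slowCall()).start();
        new Thread(() -> new LockDemo().slowCall()).start();
    }
}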
In either case, this method is not changing the listFrameData collection. Hence it doesn't matter whether the calls overlap! However, apparently something else is changing the contents of the collection. If that code is not synchronizing at all, or if it is synchronizing on a different lock, then that could be a source of problems.
Now you say that you are not seeing ConcurrentModificationExceptions at the moment. That suggests (but does not prove) that there isn't a synchronization problem at all. And that suggests (but does not prove) that your current problem is a logic error.
But (as I commented above) there are reasons to doubt that the code you have shown us is a true reflection of your real code. You need to supply an MCVE if you want a more definite diagnosis.
I use MyBatis to perform SQL queries in my project. I need to intercept the SQL query before execution to apply some changes dynamically. I've read about interceptors, like this:
@Intercepts({@Signature(type= Executor.class, method = "query", args = {...})})
public class ExamplePlugin implements Interceptor {
public Object intercept(Invocation invocation) throws Throwable {
return invocation.proceed();
}
public Object plugin(Object target) {
return Plugin.wrap(target, this);
}
public void setProperties(Properties properties) {
}
}
And it really does intercept executions, but there is no way to change the SQL query, since the appropriate field is not writable. Should I manually build a new instance of the whole object just to replace the SQL query? Where is the right place to intercept query execution so I can change it dynamically? Thanks.
I hope it will help you:
@Intercepts( { @Signature(type = Executor.class, method = "query", args = {
MappedStatement.class, Object.class, RowBounds.class,
ResultHandler.class
})
})
public class SelectCountSqlInterceptor2 implements Interceptor
{
public static String COUNT = "_count";
private static int MAPPED_STATEMENT_INDEX = 0;
private static int PARAMETER_INDEX = 1;
@Override
public Object intercept(Invocation invocation) throws Throwable
{
processCountSql(invocation.getArgs());
return invocation.proceed();
}
@SuppressWarnings("rawtypes")
private void processCountSql(final Object[] queryArgs)
{
if (queryArgs[PARAMETER_INDEX] instanceof Map)
{
Map parameter = (Map) queryArgs[PARAMETER_INDEX];
if (parameter.containsKey(COUNT))
{
MappedStatement ms = (MappedStatement) queryArgs[MAPPED_STATEMENT_INDEX];
BoundSql boundSql = ms.getBoundSql(parameter);
String sql = ms.getBoundSql(parameter).getSql().trim();
BoundSql newBoundSql = new BoundSql(ms.getConfiguration(),
getCountSQL(sql), boundSql.getParameterMappings(),
boundSql.getParameterObject());
MappedStatement newMs = copyFromMappedStatement(ms,
new OffsetLimitInterceptor.BoundSqlSqlSource(newBoundSql));
queryArgs[MAPPED_STATEMENT_INDEX] = newMs;
}
}
}
// see: MapperBuilderAssistant
@SuppressWarnings({ "unchecked", "rawtypes" })
private MappedStatement copyFromMappedStatement(MappedStatement ms,
SqlSource newSqlSource)
{
Builder builder = new MappedStatement.Builder(ms.getConfiguration(), ms
.getId(), newSqlSource, ms.getSqlCommandType());
builder.resource(ms.getResource());
builder.fetchSize(ms.getFetchSize());
builder.statementType(ms.getStatementType());
builder.keyGenerator(ms.getKeyGenerator());
// setStatementTimeout()
builder.timeout(ms.getTimeout());
// setParameterMap()
builder.parameterMap(ms.getParameterMap());
// setStatementResultMap()
List<ResultMap> resultMaps = new ArrayList<ResultMap>();
String id = "-inline";
if (ms.getResultMaps() != null)
{
id = ms.getResultMaps().get(0).getId() + "-inline";
}
ResultMap resultMap = new ResultMap.Builder(null, id, Long.class,
new ArrayList()).build();
resultMaps.add(resultMap);
builder.resultMaps(resultMaps);
builder.resultSetType(ms.getResultSetType());
// setStatementCache()
builder.cache(ms.getCache());
builder.flushCacheRequired(ms.isFlushCacheRequired());
builder.useCache(ms.isUseCache());
return builder.build();
}
private String getCountSQL(String sql)
{
String lowerCaseSQL = sql.toLowerCase().replace("\n", " ").replace("\t", " ");
int index = lowerCaseSQL.indexOf(" order ");
if (index != -1)
{
sql = sql.substring(0, index);
}
return "SELECT COUNT(*) from ( select 1 as col_c " + sql.substring(lowerCaseSQL.indexOf(" from ")) + " ) cnt";
}
@Override
public Object plugin(Object target)
{
return Plugin.wrap(target, this);
}
@Override
public void setProperties(Properties properties)
{
}
}
You may consider using a string template library (e.g. Velocity, Handlebars, Mustache) to help you.
To date, there is even MyBatis-Velocity (http://mybatis.github.io/velocity-scripting/) to help you do scripting for the SQL.
Depending on the changes you want to make, you may want to use the dynamic SQL feature of MyBatis 3; for example:
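A small illustration of that last option (the mapper, table and column names are invented for the example; the same <if>/<where> tags can live in an XML mapper instead): with MyBatis 3 dynamic SQL the statement adapts to its parameters without any interceptor.
import java.util.List;
import java.util.Map;
import org.apache.ibatis.annotations.Param;
import org.apache.ibatis.annotations.Select;

public interface UserMapper {
    // <script> enables dynamic SQL inside an annotation; the WHERE clause
    // is only emitted when a name is actually supplied.
    @Select("<script>"
          + "SELECT id, name FROM users"
          + "<where>"
          + "  <if test='name != null'>name = #{name}</if>"
          + "</where>"
          + "</script>")
    List<Map<String, Object>> findUsers(@Param("name") String name);
}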
The following method gets a "Route" (class name and class method):
public Route getRoute(final String method, final String request) {
if (hasRoutes) {
for (Map.Entry<Pattern, HashMap<String, String>> entry : routes) {
Matcher match = entry.getKey().matcher(request);
if (match.find()) {
HashMap<String, String> methods = entry.getValue();
// ISSUE: Returns FALSE after 1st call of Router.getRoute()
if (methods.containsKey(method)) {
return new Route(match.group("interface"), "TRUE (" + method + " - " + match.group("interface") + "): " + methods.get(method));
} else {
return new Route(match.group("interface"), "FALSE (" + method + " - " + match.group("interface") + "): " + methods.values().toString() + ", SIZE: " + entry.getValue().size());
}
//return entry.getValue().containsKey(method) ? new Route(match.group("interface"), entry.getValue().get(method)) : null;
}
}
}
return null;
}
"routes" is defined as:
private Set<Entry<Pattern, HashMap<String, String>>> routes;
It is a cached representation of a JSON configuration file that defines supported routes, e.g.:
{
"^/?(?<interface>threads)/?$": {
"GET": "list",
"POST": "create"
},
"^/?(?<interface>threads)/(?<id>\\d+)/?$": {
"GET": "get",
"POST": "reply",
"PUT": "edit",
"PATCH": "edit",
"DELETE": "delete"
}
}
EDIT, here's how "routes" is filled from the contents of the JSON file:
try {
JsonParser parser = JSONFactory.createJsonParser(in);
JsonNode root = JSONMapper.readTree(parser);
Iterator base = root.getFieldNames();
Iterator node;
String match, method;
HashMap<Pattern, HashMap<String, String>> routesMap = new HashMap();
while (base.hasNext()) {
match = base.next().toString();
if (match != null) {
node = root.get(match).getFieldNames();
HashMap<String, String> methods = new HashMap();
while (node.hasNext()) {
method = node.next().toString();
if (method != null) {
methods.put(method, root.get(match).get(method).getTextValue());
}
}
if (!methods.isEmpty()) {
routesMap.put(Pattern.compile(match), methods);
}
}
}
if (!routesMap.isEmpty()) {
hasRoutes = true;
routes = routesMap.entrySet();
}
// Help garbage collection
parser = null;
root = null;
base = null;
node = null;
match = null;
method = null;
routesMap = null;
} catch (Exception ex) {
}
EDIT 2, properties in question & init() method:
public final static JsonFactory JSONFactory = new JsonFactory();
public final static ObjectMapper JSONMapper = new ObjectMapper();
public static Router router;
private final Class self = getClass();
private final ClassLoader loader = self.getClassLoader();
public void init(ServletConfig config) throws ServletException {
super.init(config);
router = new Router(self.getResourceAsStream("/v1_0/Routes.json"), JSONFactory, JSONMapper);
}
For some reason, when accessing the servlet after the first time, the HashMap is empty of values; calling size() on it returns zero.
This is a rewrite of a PHP application from the ground up, so I apologise in advance if the issue is something mundane.
Full source:
- Router source
- Route source
Your getOptions() method removes every entry from this map as it iterates. So, after calling getOptions() once, the map is empty.
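getOptions() itself isn't shown in the question, but the underlying mechanism is easy to reproduce: entrySet() returns a live view of the map, so removing entries through that set or its iterator removes them from the backing HashMap as well. A stripped-down sketch of the effect:
import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;
import java.util.Set;

public class EntrySetViewDemo {
    public static void main(String[] args) {
        Map<String, String> methods = new HashMap<>();
        methods.put("GET", "list");
        methods.put("POST", "create");

        Set<Map.Entry<String, String>> view = methods.entrySet();     // a view, not a copy
        Iterator<Map.Entry<String, String>> it = view.iterator();
        while (it.hasNext()) {
            it.next();
            it.remove();                 // also removes the entry from the backing HashMap
        }

        System.out.println(methods.size());   // prints 0 - the map is now empty
    }
}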
By the way, assigning null to variables does not "help the garbage collector." The garbage collector knows that when the scope of a variable is exited, that variable no longer references the object. You are actually slowing things down by assigning values (null) that can never be read (as well as cluttering your code with counterproductive noise). A good static analysis tool like FindBugs will warn you that this is bad code.
Iterating through HashSet empties HashMap entry
That doesn't happen. Simply iterating a HashSet has no side-effects on the set's contents.
There is something else going on here that is causing your problem.