LazyDataModel called twice - java

I'm using the PrimeFaces DataTable to display a list of records, but it calls the (overridden) load method twice while navigating between pages, which hurts performance.
Partial server-side code:
public void getInternalList() {
try {
dataList = service.getInternalDetails(userId, 0, 5);
if (null != dataList) {
// Getting the current page data using lazy loading
lazyModelEmployees = new LazyDataModel<Employee>() {
@Override
public List<Employee> load(int first, int pageSize, String arg2, SortOrder arg3, Map<String, String> arg4) {
List<Employee> currentPageEmployeeList = null;
try {
currentPageEmployeeList = service.getInternalDetails(userId, first, pageSize);
} catch (Exception e) {
exceptionHandler(e);
}
if (arg2 != null) {
Collections.sort(currentPageEmployeeList, new LazySorter(arg2, arg3));
}
int dataSize = currentPageEmployeeList.size();
this.setRowCount(dataSize);
if (dataSize > pageSize) {
try {
return currentPageEmployeeList.subList(first, first + pageSize);
} catch (IndexOutOfBoundsException e) {
return currentPageEmployeeList.subList(first, first + (dataSize % pageSize));
}
} else {
return currentPageEmployeeList;
}
}
};
lazyModelEmployees.setRowCount(dataList.size());
lazyModelEmployees.setPageSize(5);
}
} catch (Exception e1) {
e1.printStackTrace();
}
}
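This thread has no answer, but two things stand out. First, JSF can invoke a backing-bean method several times per request, so a LazyDataModel constructed inside such a method may be rebuilt and load() re-run; a common fix is to build the model once and reuse it. Second, getInternalDetails(userId, first, pageSize) already returns one page, so slicing that page again with subList(first, ...) and using its size as rowCount double-applies the offset. A minimal sketch of that shape, with no PrimeFaces dependency (all class and method names here are illustrative):

```java
import java.util.Arrays;
import java.util.List;

public class LazyPageSketch {
    // Stand-in for the real service layer (illustrative, not the PrimeFaces API).
    interface EmployeeService {
        List<String> findPage(int first, int pageSize); // already returns one page
        int countAll();                                  // separate total-count query
    }

    private final EmployeeService service;
    private List<String> cachedPage; // built once, reused by repeated getter calls

    LazyPageSketch(EmployeeService service) {
        this.service = service;
    }

    // JSF may call a getter several times per request; creating a new
    // LazyDataModel inside the getter makes load() run again each time.
    List<String> getPage(int first, int pageSize) {
        if (cachedPage == null) {
            cachedPage = service.findPage(first, pageSize);
        }
        return cachedPage;
    }

    int getRowCount() {
        // rowCount must be the TOTAL number of rows, not the size of one page.
        return service.countAll();
    }

    public static void main(String[] args) {
        final int[] calls = {0};
        EmployeeService svc = new EmployeeService() {
            final List<String> all = Arrays.asList("a", "b", "c", "d", "e", "f", "g");
            public List<String> findPage(int first, int pageSize) {
                calls[0]++;
                return all.subList(first, Math.min(first + pageSize, all.size()));
            }
            public int countAll() {
                return all.size();
            }
        };
        LazyPageSketch bean = new LazyPageSketch(svc);
        System.out.println(bean.getPage(5, 5));  // [f, g] - the service already sliced the page
        bean.getPage(5, 5);                      // second getter call: no extra service hit
        System.out.println(bean.getRowCount());  // 7
        System.out.println(calls[0]);            // 1
    }
}
```

In a real bean the equivalent would be creating the LazyDataModel in an initialization method rather than in a method the view calls repeatedly.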

Reduce cyclomatic complexity flagged by Sonar - java

Can anyone help me reduce the cyclomatic complexity of the method below to 10 or less? Nested if/else is also not allowed, since it causes another Sonar issue.
Any help would be much appreciated.
private void processIntransitFile(String fileName) {
if (StringUtils.isBlank(fileName))
return;
// read Intransit folder and do the processing on these files
try (BufferedReader bufferedReader = new BufferedReader(new FileReader(intransitDir + fileName))) {
TokenRangeDTO tokenRangeDTO = new TokenRangeDTO();
int count = 0;
String header = "";
String next;
String line = bufferedReader.readLine();
LinkedHashSet<String> tokenRanges = new LinkedHashSet<>();
int trCount = 0;
boolean first = true;
boolean last = line == null;
while (!last) {
last = (next = bufferedReader.readLine()) == null;
if (!first && !last) {
tokenRanges.add(line);
}
// read first line of the file
else if (first && line.startsWith(H)) {
header = line;
first = false;
} else if (first && !line.startsWith(H)) {
tokenRangeDTO.setValidationMessage(HEADER_MISSING);
first = false;
}
// read last line of the file
else if (last && line.startsWith(T)) {
trCount = getTrailerCount(tokenRangeDTO, line, trCount);
} else if (last && !line.startsWith(T)) {
tokenRangeDTO.setValidationMessage(TRAILOR_MISSING);
}
line = next;
count++;
}
processInputFile(fileName, tokenRangeDTO, count, header, tokenRanges, trCount);
} catch (IOException e) {
LOGGER.error(IO_EXCEPTION, e);
} catch (Exception e) {
LOGGER.error("Some exception has occurred", e);
} finally {
try {
FileUtils.deleteQuietly(new File(intransitDir + fileName));
} catch (Exception ex) {
LOGGER.error(STREAM_FAILURE, ex);
}
}
}
You could extract parts of your code into methods and/or rework some variables. Also, when you need comments to explain your code, it is a strong indicator that the logic there can be improved:
private void processIntransitFile(String fileName) {
if (StringUtils.isBlank(fileName)) return;
processFromIntransitDirectory(fileName);
}
private void processFromIntransitDirectory(String fileName) {
try (BufferedReader bufferedReader = new BufferedReader(getFileFromIntransitFolder(fileName))) {
TokenRangeDTO tokenRangeDTO = new TokenRangeDTO();
int count = 0;
String header = "";
String next;
String line = bufferedReader.readLine();
LinkedHashSet<String> tokenRanges = new LinkedHashSet<>();
int trCount = 0;
while (!isLastLine(line)) {
next = bufferedReader.readLine();
if (!isFirstLine(count) && !isLastLine(next)) {
tokenRanges.add(line);
}
if (isFirstLine(count)) {
header = readFirstLine(line, count, tokenRangeDTO);
}
trCount = readLastLine(line, next, trCount, tokenRangeDTO);
line = next;
count++;
}
processInputFile(fileName, tokenRangeDTO, count, header, tokenRanges, trCount);
} catch (IOException e) {
LOGGER.error(IO_EXCEPTION, e);
} catch (Exception e) {
LOGGER.error("Some exception has occurred", e);
} finally {
try {
FileUtils.deleteQuietly(new File(intransitDir + fileName));
} catch (Exception ex) {
LOGGER.error(STREAM_FAILURE, ex);
}
}
}
private boolean isLastLine(String line) {
return line == null;
}
private String readFirstLine(String line, int count, TokenRangeDTO tokenRangeDTO) {
if (isFirstLine(count) && isHeader(line)) {
return line;
} else if (isFirstLine(count) && !isHeader(line)) {
tokenRangeDTO.setValidationMessage(HEADER_MISSING);
}
return StringUtils.EMPTY;
}
private int readLastLine(String line, String next, int trCount, TokenRangeDTO tokenRangeDTO) {
if (isLastLine(next) && isTrailor(line)) {
return getTrailerCount(tokenRangeDTO, line, trCount);
} else if (isLastLine(next) && !isTrailor(line)) {
tokenRangeDTO.setValidationMessage(TRAILOR_MISSING);
}
return trCount;
}
private boolean isTrailor(String line) {
return line.startsWith(T);
}
private boolean isHeader(String line) {
return line.startsWith(H);
}
private boolean isFirstLine(int count) {
return count == 0;
}
private FileReader getFileFromIntransitFolder(String fileName) throws FileNotFoundException {
return new FileReader(intransitDir + fileName);
}
Doing this, your code will be more readable, you will avoid unnecessary variables, and your cyclomatic complexity will decrease.
For more tips, I recommend refactoring.guru.
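The header/trailer rules can also be unit-tested in isolation once the file is read into a list: each rule becomes a tiny method with complexity 1. A self-contained sketch of that idea (H and T stand in for the marker constants in the question, and the message strings are illustrative):

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class FileShapeCheck {
    // Hypothetical markers standing in for the H/T constants in the question.
    static final String H = "H";
    static final String T = "T";

    // Returns a validation message, or null if header and trailer are both present.
    static String validate(List<String> lines) {
        if (lines.isEmpty() || !lines.get(0).startsWith(H)) {
            return "HEADER_MISSING";
        }
        if (lines.size() < 2 || !lines.get(lines.size() - 1).startsWith(T)) {
            return "TRAILOR_MISSING";
        }
        return null;
    }

    // Body lines are everything between header and trailer.
    static List<String> body(List<String> lines) {
        return lines.size() > 2
                ? lines.subList(1, lines.size() - 1)
                : Collections.<String>emptyList();
    }

    public static void main(String[] args) {
        List<String> ok = Arrays.asList("H|header", "r1", "r2", "T|2");
        System.out.println(validate(ok));                          // null
        System.out.println(body(ok));                              // [r1, r2]
        System.out.println(validate(Arrays.asList("r1", "T|1")));  // HEADER_MISSING
    }
}
```

Reading the whole file first trades a little memory for much simpler control flow; for very large files the streaming version in the answer above is still the right shape.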

Netflix Zuul Pre Filter for Cache is not working for a small amount of compressed responses

I'd like to use Zuul to cache some requests. The cache is stored in Redis as a POJO and contains plain text (not gzip-compressed data).
In normal tests and integration tests everything works well. Under a JMeter load test, however, some of the requests fail with
java.util.zip.ZipException: Not in GZIP format (from JMeter)
We figured out that at this point Zuul is returning an empty response.
My PreFilter:
public class CachePreFilter extends CacheBaseFilter {
private static DynamicIntProperty INITIAL_STREAM_BUFFER_SIZE = DynamicPropertyFactory.getInstance().getIntProperty(ZuulConstants.ZUUL_INITIAL_STREAM_BUFFER_SIZE, 8192);
@Autowired
CounterService counterService;
public CachePreFilter(RedisCacheManager redisCacheManager, Properties properties) {
super(redisCacheManager, properties);
}
@Override
public Object run() {
RequestContext ctx = RequestContext.getCurrentContext();
CachedResponse data = getFromCache(ctx);
if (null != data) {
counterService.increment("counter.cached");
HttpServletResponse response = ctx.getResponse();
response.addHeader("X-Cache", "HIT");
if (null != data.getContentType()) {
response.setContentType(data.getContentType());
}
if (null != data.getHeaders()) {
for (Entry<String, String> header : data.getHeaders().entrySet()) {
if (!response.containsHeader(header.getKey())) {
response.addHeader(header.getKey(), header.getValue());
}
}
}
OutputStream outStream = null;
try {
outStream = response.getOutputStream();
boolean isGzipRequested = ctx.isGzipRequested();
if (null != data.getBody()) {
final String requestEncoding = ctx.getRequest().getHeader(ZuulHeaders.ACCEPT_ENCODING);
if (requestEncoding != null && HTTPRequestUtils.getInstance().isGzipped(requestEncoding)) {
isGzipRequested = true;
}
ByteArrayOutputStream byteArrayOutputStream = null;
ByteArrayInputStream is = null;
try {
if (isGzipRequested) {
byteArrayOutputStream = new ByteArrayOutputStream();
GZIPOutputStream gzipOutputStream = new GZIPOutputStream(byteArrayOutputStream);
gzipOutputStream.write(data.getBody().getBytes(StandardCharsets.UTF_8));
gzipOutputStream.flush();
gzipOutputStream.close();
ctx.setResponseGZipped(true);
is = new ByteArrayInputStream(byteArrayOutputStream.toByteArray());
logger.debug(String.format("Send gzip content %s", data.getBody()));
response.setHeader(ZuulHeaders.CONTENT_ENCODING, "gzip");
} else {
logger.debug(String.format("Send content %s", data.getBody()));
is = new ByteArrayInputStream(data.getBody().getBytes(StandardCharsets.UTF_8));
}
writeResponse(is, outStream);
} catch (Exception e) {
logger.error("Error at sending response " + e.getMessage(), e);
throw new RuntimeException("Failed to send content", e);
} finally {
if (null != byteArrayOutputStream) {
byteArrayOutputStream.close();
}
if (null != is) {
is.close();
}
}
}
ctx.setSendZuulResponse(false);
} catch (IOException e) {
logger.error("Cannot read from Stream " + e.getMessage(), e);
} finally {
// don't close the outputstream
}
ctx.set(CACHE_HIT, true);
return data;
} else {
counterService.increment("counter.notcached");
}
ctx.set(CACHE_HIT, false);
return null;
}
private ThreadLocal<byte[]> buffers = new ThreadLocal<byte[]>() {
@Override
protected byte[] initialValue() {
return new byte[INITIAL_STREAM_BUFFER_SIZE.get()];
}
};
private void writeResponse(InputStream zin, OutputStream out) throws Exception {
byte[] bytes = buffers.get();
int bytesRead = -1;
while ((bytesRead = zin.read(bytes)) != -1) {
out.write(bytes, 0, bytesRead);
}
}
@Override
public int filterOrder() {
return 99;
}
@Override
public String filterType() {
return "pre";
}
}
My Post Filter
public class CachePostFilter extends CacheBaseFilter {
public CachePostFilter(RedisCacheManager redisCacheManager, Properties properties) {
super(redisCacheManager, properties);
}
@Override
public boolean shouldFilter() {
RequestContext ctx = RequestContext.getCurrentContext();
return super.shouldFilter() && !ctx.getBoolean(CACHE_HIT);
}
@Override
public Object run() {
RequestContext ctx = RequestContext.getCurrentContext();
HttpServletRequest req = ctx.getRequest();
HttpServletResponse res = ctx.getResponse();
if (isSuccess(res, ctx.getOriginResponseHeaders())) {
// Store only successful responses
String cacheKey = cacheKey(req);
if (cacheKey != null) {
String body = null;
if (null != ctx.getResponseBody()) {
body = ctx.getResponseBody();
} else if (null != ctx.getResponseDataStream()) {
InputStream is = null;
try {
is = ctx.getResponseDataStream();
final Long len = ctx.getOriginContentLength();
if (len == null || len > 0) {
if (ctx.getResponseGZipped()) {
is = new GZIPInputStream(is);
}
StringWriter writer = new StringWriter();
IOUtils.copy(is, writer, "UTF-8");
body = writer.toString();
if (null != body && !body.isEmpty()) {
ctx.setResponseDataStream(new ByteArrayInputStream(body.getBytes()));
ctx.setResponseGZipped(false);
ctx.setOriginContentLength(String.valueOf(body.getBytes().length));
} else {
ctx.setResponseBody("{}");
}
}
} catch (IOException e) {
logger.error("Cannot read body " + e.getMessage(), e);
} finally {
if (null != is) {
try {
is.close();
} catch (IOException e) {
}
}
}
saveToCache(ctx, cacheKey, body);
}
}
}
return null;
}
@Override
public int filterOrder() {
return 1;
}
@Override
public String filterType() {
return "post";
}
private boolean isSuccess(HttpServletResponse res, List<Pair<String, String>> originHeaders) {
if (res != null && res.getStatus() < 300) {
if (null != originHeaders) {
for (Pair<String, String> header : originHeaders) {
if (header.first().equals("X-CACHEABLE") && header.second().equals("1")) {
return true;
}
}
}
}
return false;
}
}
We also tested it without Redis (just storing the cache in a local variable) and the behavior is the same. We always logged the response from the cache (before gzipping) and everything looked good.
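The "Not in GZIP format" error generally means the bytes actually sent do not match the Content-Encoding: gzip header, e.g. an empty or plain-text body. A minimal, self-contained round trip with java.util.zip (assuming a UTF-8 text body, as in the filter above); note that the GZIPOutputStream must be closed (or finish() called) before the buffer is read, otherwise the gzip trailer is missing:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipRoundTrip {

    // Compress a body; close() writes the gzip trailer, flush() alone does not.
    static byte[] gzip(byte[] plain) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write(plain);
        }
        return bos.toByteArray();
    }

    // What a client like JMeter does when it sees Content-Encoding: gzip.
    static byte[] gunzip(byte[] compressed) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPInputStream gz = new GZIPInputStream(new ByteArrayInputStream(compressed))) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = gz.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] body = "{\"cached\":true}".getBytes(StandardCharsets.UTF_8);
        byte[] wire = gzip(body);
        // Serving 'body' bytes (or nothing) while claiming gzip is exactly what
        // produces "java.util.zip.ZipException: Not in GZIP format" on the client.
        System.out.println(new String(gunzip(wire), StandardCharsets.UTF_8));
    }
}
```

Under load, a race that lets a response be written before compression finishes (or written twice) produces exactly this symptom, which fits the intermittent failures described above.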
(Posted on behalf of the question author).
Solution
We refactored our PostFilter so that it changes much less of the response for Zuul. After this change, we don't see any problems any more:
Working Post Filter
public class CachePostFilter extends CacheBaseFilter {
public CachePostFilter(RedisCacheManager redisCacheManager, Properties properties) {
super(redisCacheManager, properties);
}
@Override
public boolean shouldFilter() {
RequestContext ctx = RequestContext.getCurrentContext();
return super.shouldFilter() && !ctx.getBoolean(CACHE_HIT);
}
@Override
public Object run() {
RequestContext ctx = RequestContext.getCurrentContext();
HttpServletRequest req = ctx.getRequest();
HttpServletResponse res = ctx.getResponse();
if (isSuccess(res, ctx.getOriginResponseHeaders())) {
// Store only successful responses
String cacheKey = cacheKey(req);
if (cacheKey != null) {
String body = null;
if (null != ctx.getResponseBody()) {
body = ctx.getResponseBody();
} else if (null != ctx.getResponseDataStream()) {
InputStream rawInputStream = null;
InputStream gzipByteArrayInputStream = null;
try {
rawInputStream = ctx.getResponseDataStream();
gzipByteArrayInputStream = null;
// If the origin says the content is gzipped but it is zero
// bytes, don't try to uncompress it
final Long len = ctx.getOriginContentLength();
if (len == null || len > 0) {
byte[] rawData = IOUtils.toByteArray(rawInputStream);
ctx.setResponseDataStream(new ByteArrayInputStream(rawData));
if (ctx.getResponseGZipped()) {
gzipByteArrayInputStream = new GZIPInputStream(new ByteArrayInputStream(rawData));
} else {
gzipByteArrayInputStream = new ByteArrayInputStream(rawData);
}
StringWriter writer = new StringWriter();
IOUtils.copy(gzipByteArrayInputStream, writer, "UTF-8");
body = writer.toString();
}
} catch (IOException e) {
logger.error("Cannot read body " + e.getMessage(), e);
} finally {
if (null != rawInputStream) {
try {
rawInputStream.close();
} catch (IOException e) {
}
}
if (null != gzipByteArrayInputStream) {
try {
gzipByteArrayInputStream.close();
} catch (IOException e) {
}
}
}
// if we consumed the stream here, the other filters could not read it
// and would not deliver any response
// ctx.setResponseBody(body);
// ctx.setResponseGZipped(false);
saveToCache(ctx, cacheKey, body);
}
}
}
return null;
}
@Override
public int filterOrder() {
return 1;
}
@Override
public String filterType() {
return "post";
}
private boolean isSuccess(HttpServletResponse res, List<Pair<String, String>> originHeaders) {
if (res != null && res.getStatus() == 200) {
if (null != originHeaders) {
for (Pair<String, String> header : originHeaders) {
if (header.first().equals("X-CACHEABLE") && header.second().equals("1")) {
return true;
}
}
}
}
return false;
}
}

Java Reflection/BeanUtils issue

I'm trying to populate a bean with some table driven attribute:value pairs. I retrieve them from a MySQL table, and populate a hashmap just fine. I iterate through the hashmap, and if I use PropertyUtils.setProperty() I get a "Class does not have setter for *" error. If I use BeanUtils.setProperty() the bean never gets populated. Here's the sample:
public class DBDrivenPayloadHandler extends GDE{
DbDrivenPayloadHandlerBean bean;
@SuppressWarnings("rawtypes")
public void populateBean() throws Exception {
ITransaction trans = new MySQLTransaction();
IAdapterDataMapDAO adapterDataMap = new MySQLAdapterDataMapDAO();
adapterDataMap.setTransaction(trans);
HashMap<String, String> values = adapterDataMap.getHashMap(super.getCurrentAccountId());
//hashmap gets populated correctly with correct variable names and values != "-1";
DbDrivenPayloadHandlerBean bean = new DbDrivenPayloadHandlerBean();
//We have a bean with all the intialized variable values
Iterator it = values.entrySet().iterator();
while (it.hasNext()) {
Map.Entry entry = (Map.Entry)it.next();
try {
PropertyUtils.setProperty(bean, (String) entry.getKey(), entry.getValue());
//PropertyUtils will give a setter not found error. BeanUtils never sets the values.
} catch (Exception e) {
e.printStackTrace();
}
}
}
public void getInfo(String fileName) {
try {
populateBean();
} catch (Exception e) {
e.printStackTrace();
}
APPTS_FULLNAME_POS = bean.getAPPTS_FULLNAME_POS();
APPTS_DATETIME_POS = bean.getAPPTS_DATETIME_POS();
//Both still -1;
super.getInfo(fileName);
}
}
And here's the Bean (or at least some of it):
public class DbDrivenPayloadHandlerBean {
int APPTS_FULLNAME_POS = -1;
int APPTS_DATETIME_POS = -1;
public DbDrivenPayloadHandlerBean() {
super();
}
public int getAPPTS_FULLNAME_POS() {
return APPTS_FULLNAME_POS;
}
public void setAPPTS_FULLNAME_POS(String APPTS_FULLNAME_POS) {
this.APPTS_FULLNAME_POS = Integer.parseInt(APPTS_FULLNAME_POS);
}
public int getAPPTS_DATETIME_POS() {
return APPTS_DATETIME_POS;
}
public void setAPPTS_DATETIME_POS(String APPTS_DATETIME_POS) {
this.APPTS_DATETIME_POS = Integer.parseInt(APPTS_DATETIME_POS);
}
}
Sorry guys, BeanUtils does the trick. I just didn't want setters that take Strings; it turns out BeanUtils does the type conversion for you. Apologies.
Solution as stated above: don't do the conversion yourself for BeanUtils (note that populateBean now also assigns the field instead of a shadowing local variable).
public class DBDrivenPayloadHandler extends GDE{
DbDrivenPayloadHandlerBean bean;
@SuppressWarnings("rawtypes")
public void populateBean() throws Exception {
ITransaction trans = new MySQLTransaction();
IAdapterDataMapDAO adapterDataMap = new MySQLAdapterDataMapDAO();
adapterDataMap.setTransaction(trans);
HashMap<String, String> values = adapterDataMap.getHashMap(super.getCurrentAccountId());
this.bean = new DbDrivenPayloadHandlerBean();
Iterator it = values.entrySet().iterator();
while (it.hasNext()) {
Map.Entry entry = (Map.Entry)it.next();
try {
BeanUtils.setProperty(bean, (String) entry.getKey(), entry.getValue());
} catch (Exception e) {
e.printStackTrace();
}
}
}
public void getInfo(String fileName) {
try {
populateBean();
} catch (Exception e) {
e.printStackTrace();
}
APPTS_FULLNAME_POS = bean.getAPPTS_FULLNAME_POS();
APPTS_DATETIME_POS = bean.getAPPTS_DATETIME_POS();
super.getInfo(fileName);
}
}
Bean
public class DbDrivenPayloadHandlerBean {
int APPTS_FULLNAME_POS = -1;
int APPTS_DATETIME_POS = -1;
public DbDrivenPayloadHandlerBean() {
super();
}
public int getAPPTS_FULLNAME_POS() {
return APPTS_FULLNAME_POS;
}
public void setAPPTS_FULLNAME_POS(int APPTS_FULLNAME_POS) {
this.APPTS_FULLNAME_POS = APPTS_FULLNAME_POS;
}
public int getAPPTS_DATETIME_POS() {
return APPTS_DATETIME_POS;
}
public void setAPPTS_DATETIME_POS(int APPTS_DATETIME_POS) {
this.APPTS_DATETIME_POS = APPTS_DATETIME_POS;
}
}
}
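The behavior difference is consistent with how the two utilities work: PropertyUtils performs no type conversion, so the value's type must match the setter's parameter type exactly, while BeanUtils runs String values through its converters to the setter's declared type. A plain-reflection sketch of that distinction (an illustration only, not the actual commons-beanutils implementation):

```java
import java.lang.reflect.Method;

public class SetterDemo {
    public static class Bean {
        private int pos = -1;
        public int getPos() { return pos; }
        public void setPos(int pos) { this.pos = pos; }
    }

    // Mimics PropertyUtils: the setter parameter type must match the value's type.
    static void setStrict(Object target, String name, Object value) throws Exception {
        Method m = target.getClass().getMethod(setterName(name), value.getClass());
        m.invoke(target, value); // NoSuchMethodException: there is no setPos(String)
    }

    // Mimics BeanUtils: converts the String to the setter's declared type.
    static void setConverting(Object target, String name, String value) throws Exception {
        for (Method m : target.getClass().getMethods()) {
            if (m.getName().equals(setterName(name)) && m.getParameterCount() == 1) {
                Class<?> p = m.getParameterTypes()[0];
                Object converted = (p == int.class) ? Integer.valueOf(value) : value;
                m.invoke(target, converted);
                return;
            }
        }
        throw new NoSuchMethodException(name);
    }

    static String setterName(String prop) {
        return "set" + Character.toUpperCase(prop.charAt(0)) + prop.substring(1);
    }

    public static void main(String[] args) throws Exception {
        Bean b = new Bean();
        setConverting(b, "pos", "42");   // works: "42" is converted to int
        System.out.println(b.getPos());  // 42
        try {
            setStrict(b, "pos", "7");    // fails: value is a String, setter wants int
        } catch (NoSuchMethodException e) {
            System.out.println("no String setter");
        }
    }
}
```

So with int-typed setters, BeanUtils.setProperty converts the String values from the map, whereas PropertyUtils.setProperty reports that no matching setter exists.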

Elasticsearch: Adding manual mapping using Java

I can't change the mapping. Can anybody help me find the bug in my code?
I have found this standard way of changing the mapping in several tutorials, but when I retrieve the mapping structure right after creating it manually, it is blank.
After inserting some data the mapping specification does appear, because ES then applies the default one. To be more specific, see the code below.
public class ElasticTest {
private String dbname = "ElasticSearch";
private String index = "indextest";
private String type = "table";
private Client client = null;
private Node node = null;
public ElasticTest(){
this.node = nodeBuilder().local(true).node();
this.client = node.client();
if(isIndexExist(index)){
deleteIndex(this.client, index);
createIndex(index);
}
else{
createIndex(index);
}
System.out.println("mapping structure before data insertion");
getMappings();
System.out.println("----------------------------------------");
createData();
System.out.println("mapping structure after data insertion");
getMappings();
}
public void getMappings() {
ClusterState clusterState = client.admin().cluster().prepareState()
.setFilterIndices(index).execute().actionGet().getState();
IndexMetaData inMetaData = clusterState.getMetaData().index(index);
MappingMetaData metad = inMetaData.mapping(type);
if (metad != null) {
try {
String structure = metad.getSourceAsMap().toString();
System.out.println(structure);
} catch (IOException e) {
e.printStackTrace();
}
}
}
private void createIndex(String index) {
XContentBuilder typemapping = buildJsonMappings();
String mappingstring = null;
try {
mappingstring = buildJsonMappings().string();
} catch (IOException e1) {
e1.printStackTrace();
}
client.admin().indices().create(new CreateIndexRequest(index)
.mapping(type, typemapping)).actionGet();
//try put mapping after index creation
/*
* PutMappingResponse response = null; try { response =
* client.admin().indices() .preparePutMapping(index) .setType(type)
* .setSource(typemapping.string()) .execute().actionGet(); } catch
* (ElasticSearchException e) { e.printStackTrace(); } catch
* (IOException e) { e.printStackTrace(); }
*/
}
private void deleteIndex(Client client, String index) {
try {
DeleteIndexResponse delete = client.admin().indices()
.delete(new DeleteIndexRequest(index)).actionGet();
if (!delete.isAcknowledged()) {
} else {
}
} catch (Exception e) {
}
}
private XContentBuilder buildJsonMappings(){
XContentBuilder builder = null;
try {
builder = XContentFactory.jsonBuilder();
builder.startObject()
.startObject("properties")
.startObject("ATTR1")
.field("type", "string")
.field("store", "yes")
.field("index", "analyzed")
.endObject()
.endObject()
.endObject();
} catch (IOException e) {
e.printStackTrace();
}
return builder;
}
private boolean isIndexExist(String index) {
ActionFuture<IndicesExistsResponse> exists = client.admin().indices()
.exists(new IndicesExistsRequest(index));
IndicesExistsResponse actionGet = exists.actionGet();
return actionGet.isExists();
}
private void createData(){
System.out.println("Data creation");
IndexResponse response=null;
for (int i=0;i<10;i++){
Map<String, Object> json = new HashMap<String, Object>();
json.put("ATTR1", "new value" + i);
response = this.client.prepareIndex(index, type)
.setSource(json)
.setOperationThreaded(false)
.execute()
.actionGet();
}
String _index = response.getIndex();
String _type = response.getType();
long _version = response.getVersion();
System.out.println("Index : "+_index+" Type : "+_type+" Version : "+_version);
System.out.println("----------------------------------");
}
public static void main(String[] args)
{
new ElasticTest();
}
}
I just want to change the ATTR1 field's index property to analyzed to ensure fast queries.
What am I doing wrong? I also tried to create the mapping after index creation, but it leads to the same effect.
Ok, I found the answer on my own. On the type level I had to wrap "properties" with the type name, e.g.:
"type1" : {
"properties" : {
.....
}
}
See the following code:
private XContentBuilder getMappingsByJson(){
XContentBuilder builder = null;
try {
builder = XContentFactory.jsonBuilder().startObject().startObject(type).startObject("properties");
for(int i = 1; i<5; i++){
builder.startObject("ATTR" + i)
.field("type", "integer")
.field("store", "yes")
.field("index", "analyzed")
.endObject();
}
builder.endObject().endObject().endObject();
}
catch (IOException e) {
e.printStackTrace();
}
return builder;
}
This creates mappings for the attributes ATTR1 - ATTR4. Now it is possible to define the mapping for a list of different attributes dynamically, for example. Hope it helps someone else.
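For reference, the JSON body that a builder shaped like this emits has the type name as the outer key; for the type field above ("table") and a single attribute it would look roughly like this (a sketch, assuming the legacy type-based mapping syntax used in this code):

```json
{
  "table": {
    "properties": {
      "ATTR1": { "type": "integer", "store": "yes", "index": "analyzed" }
    }
  }
}
```

Without the outer "table" key, ES silently ignores the mapping at index creation, which matches the blank mapping observed before any data was inserted.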

How to cache in a Blackberry BrowserField

I am creating a BlackBerry application to display a full-screen web view of a certain site. I have a working BrowserField that displays properly, but navigation from page to page is slower than in the native browser. The BrowserField does not seem to have a built-in cache, which makes load times slow. When I add the following code to manage the cache, the site no longer displays properly.
BrowserFieldScreen.java:
import net.rim.device.api.browser.field2.*;
import net.rim.device.api.script.ScriptEngine;
import net.rim.device.api.system.*;
import net.rim.device.api.ui.*;
import net.rim.device.api.ui.component.*;
import net.rim.device.api.ui.container.*;
import org.w3c.dom.Document;
class BrowserFieldScreen extends MainScreen
{
BrowserField browserField;
LoadingScreen load = new LoadingScreen();
public BrowserFieldScreen()
{
browserField = new BrowserField();
browserField.getConfig().setProperty(
BrowserFieldConfig.JAVASCRIPT_ENABLED,
Boolean.TRUE);
browserField.getConfig().setProperty(
BrowserFieldConfig.NAVIGATION_MODE,
BrowserFieldConfig.NAVIGATION_MODE_POINTER);
browserField.getConfig().setProperty(
BrowserFieldConfig.CONTROLLER,
new CacheProtocolController(browserField));
browserField.requestContent("http://www.stackoverflow.com");
add(browserField);
}
}
CacheProtocolController.java:
import javax.microedition.io.HttpConnection;
import javax.microedition.io.InputConnection;
import net.rim.device.api.browser.field2.BrowserField;
import net.rim.device.api.browser.field2.BrowserFieldRequest;
import net.rim.device.api.browser.field2.ProtocolController;
public class CacheProtocolController extends ProtocolController{
// The BrowserField instance
private BrowserField browserField;
// CacheManager will take care of cached resources
private CacheManager cacheManager;
public CacheProtocolController(BrowserField browserField) {
super(browserField);
this.browserField = browserField;
}
private CacheManager getCacheManager() {
if ( cacheManager == null ) {
cacheManager = new CacheManagerImpl();
}
return cacheManager;
}
/**
* Handle navigation requests (e.g., link clicks)
*/
public void handleNavigationRequest(BrowserFieldRequest request)
throws Exception
{
InputConnection ic = handleResourceRequest(request);
browserField.displayContent(ic, request.getURL());
}
/**
* Handle resource request
* (e.g., images, external css/javascript resources)
*/
public InputConnection handleResourceRequest(BrowserFieldRequest request)
throws Exception
{
// if requested resource is cacheable (e.g., an "http" resource),
// use the cache
if (getCacheManager() != null
&& getCacheManager().isRequestCacheable(request))
{
InputConnection ic = null;
// if requested resource is cached, retrieve it from cache
if (getCacheManager().hasCache(request.getURL())
&& !getCacheManager().hasCacheExpired(request.getURL()))
{
ic = getCacheManager().getCache(request.getURL());
}
// if requested resource is not cached yet, cache it
else
{
ic = super.handleResourceRequest(request);
if (ic instanceof HttpConnection)
{
HttpConnection response = (HttpConnection) ic;
if (getCacheManager().isResponseCacheable(response))
{
ic = getCacheManager().createCache(request.getURL(),
response);
}
}
}
return ic;
}
// if requested resource is not cacheable, load it as usual
return super.handleResourceRequest(request);
}
}
CacheManager.java:
import javax.microedition.io.HttpConnection;
import javax.microedition.io.InputConnection;
import net.rim.device.api.browser.field2.BrowserFieldRequest;
public interface CacheManager {
public boolean isRequestCacheable(BrowserFieldRequest request);
public boolean isResponseCacheable(HttpConnection response);
public boolean hasCache(String url);
public boolean hasCacheExpired(String url);
public InputConnection getCache(String url);
public InputConnection createCache(String url, HttpConnection response);
public void clearCache(String url);
}
CacheManagerImpl.java:
import java.io.IOException;
import java.io.InputStream;
import java.util.Date;
import java.util.Hashtable;
import javax.microedition.io.HttpConnection;
import javax.microedition.io.InputConnection;
import net.rim.device.api.browser.field2.BrowserFieldRequest;
import net.rim.device.api.browser.field2.BrowserFieldResponse;
import net.rim.device.api.io.http.HttpHeaders;
public class CacheManagerImpl implements CacheManager {
private static final int MAX_STANDARD_CACHE_AGE = 2592000;
private Hashtable cacheTable;
public CacheManagerImpl() {
cacheTable = new Hashtable();
}
public boolean isRequestCacheable(BrowserFieldRequest request) {
// Only HTTP requests are cacheable
if (!request.getProtocol().equals("http")) {
return false;
}
// Don't cache the request whose method is not "GET".
if (request instanceof HttpConnection) {
if (!((HttpConnection) request).getRequestMethod().equals("GET"))
{
return false;
}
}
// Don't cache the request with post data.
if (request.getPostData() != null) {
return false;
}
// Don't cache authentication request.
if (request.getHeaders().getPropertyValue("Authorization") != null) {
return false;
}
return true;
}
public boolean isResponseCacheable(HttpConnection response) {
try {
if (response.getResponseCode() != 200) {
return false;
}
} catch (IOException ioe) {
return false;
}
if (!response.getRequestMethod().equals("GET")) {
return false;
}
if (containsPragmaNoCache(response)) {
return false;
}
if (isExpired(response)) {
return false;
}
if (containsCacheControlNoCache(response)) {
return false;
}
if ( response.getLength() <= 0 ) {
return false;
}
// additional checks can be implemented here to inspect
// the HTTP cache-related headers of the response object
return true;
}
private boolean isExpired(HttpConnection response) {
try
{
// getExpiration() returns 0 if not known
long expires = response.getExpiration();
if (expires > 0 && expires <= (new Date()).getTime()) {
return true;
}
return false;
} catch (IOException ioe) {
return true;
}
}
private boolean containsPragmaNoCache(HttpConnection response) {
try
{
if (response.getHeaderField("pragma") != null
&& response.getHeaderField("pragma")
.toLowerCase()
.indexOf("no-cache") >= 0)
{
return true;
}
return false;
} catch (IOException ioe) {
return true;
}
}
private boolean containsCacheControlNoCache(HttpConnection response) {
try {
String cacheControl = response.getHeaderField("cache-control");
if (cacheControl != null) {
cacheControl = removeSpace(cacheControl.toLowerCase());
if (cacheControl.indexOf("no-cache") >= 0
|| cacheControl.indexOf("no-store") >= 0
|| cacheControl.indexOf("private") >= 0
|| cacheControl.indexOf("max-age=0") >= 0) {
return true;
}
long maxAge = parseMaxAge(cacheControl);
if (maxAge > 0 && response.getDate() > 0) {
long date = response.getDate();
long now = (new Date()).getTime();
if (now > date + maxAge) {
// Already expired
return true;
}
}
}
return false;
} catch (IOException ioe) {
return true;
}
}
public InputConnection createCache(String url, HttpConnection response) {
byte[] data = null;
InputStream is = null;
try {
// Read data
int len = (int) response.getLength();
if (len > 0) {
is = response.openInputStream();
int actual = 0;
int bytesread = 0;
data = new byte[len];
while (bytesread < len) {
actual = is.read(data, bytesread, len - bytesread);
if (actual == -1) {
break; // stream ended before the advertised length
}
bytesread += actual;
}
}
} catch (IOException ioe) {
data = null;
} finally {
if (is != null) {
try {
is.close();
} catch (IOException ioe) {
}
}
if (response != null) {
try {
response.close();
} catch (IOException ioe) {
}
}
}
if (data == null) {
return null;
}
// Calculate expires
long expires = calculateCacheExpires(response);
// Copy headers
HttpHeaders headers = copyResponseHeaders(response);
// add item to cache
cacheTable.put(url, new CacheItem(url, expires, data, headers));
return new BrowserFieldResponse(url, data, headers);
}
private long calculateCacheExpires(HttpConnection response) {
long date = 0;
try {
date = response.getDate();
} catch (IOException ioe) {
}
if (date == 0) {
date = (new Date()).getTime();
}
long expires = getResponseExpires(response);
// If an expire date has not been specified assumes the maximum time
if ( expires == 0 ) {
return date + (MAX_STANDARD_CACHE_AGE * 1000L);
}
return expires;
}
private long getResponseExpires(HttpConnection response) {
try {
// Calculate expires from "expires"
long expires = response.getExpiration();
if (expires > 0) {
return expires;
}
// Calculate expires from "max-age" and "date"
if (response.getHeaderField("cache-control") != null) {
String cacheControl = removeSpace(response
.getHeaderField("cache-control")
.toLowerCase());
long maxAge = parseMaxAge(cacheControl);
long date = response.getDate();
if (maxAge > 0 && date > 0) {
return (date + maxAge);
}
}
} catch (IOException ioe) {
}
return 0;
}
    private long parseMaxAge(String cacheControl) {
        if (cacheControl == null) {
            return 0;
        }
        long maxAge = 0;
        if (cacheControl.indexOf("max-age=") >= 0) {
            int maxAgeStart = cacheControl.indexOf("max-age=") + 8;
            int maxAgeEnd = cacheControl.indexOf(',', maxAgeStart);
            if (maxAgeEnd < 0) {
                maxAgeEnd = cacheControl.length();
            }
            try {
                maxAge = Long.parseLong(cacheControl.substring(maxAgeStart,
                        maxAgeEnd));
            } catch (NumberFormatException nfe) {
            }
        }
        // Multiply maxAge by 1000 to convert seconds to milliseconds
        maxAge *= 1000L;
        return maxAge;
    }
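As a quick sanity check, the max-age parsing can be exercised in isolation. The class below is a standalone copy of `parseMaxAge` for testing outside the BlackBerry environment (the original relies on the caller having already lower-cased the header and stripped spaces via `removeSpace()`); the class name is illustrative:

```java
public class MaxAgeDemo {
    // Standalone copy of parseMaxAge. Input must already be lower-cased
    // and have spaces removed, as the original caller guarantees.
    static long parseMaxAge(String cacheControl) {
        if (cacheControl == null) {
            return 0;
        }
        long maxAge = 0;
        int idx = cacheControl.indexOf("max-age=");
        if (idx >= 0) {
            int start = idx + 8;
            int end = cacheControl.indexOf(',', start);
            if (end < 0) {
                end = cacheControl.length();
            }
            try {
                maxAge = Long.parseLong(cacheControl.substring(start, end));
            } catch (NumberFormatException nfe) {
                // Malformed value: treat as "no max-age"
            }
        }
        return maxAge * 1000L; // seconds -> milliseconds
    }

    public static void main(String[] args) {
        System.out.println(parseMaxAge("public,max-age=3600")); // 3600000
        System.out.println(parseMaxAge("no-cache"));            // 0
    }
}
```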
    private static String removeSpace(String s) {
        StringBuffer result = new StringBuffer();
        int count = s.length();
        for (int i = 0; i < count; i++) {
            char c = s.charAt(i);
            if (c != ' ') {
                result.append(c);
            }
        }
        return result.toString();
    }
    private HttpHeaders copyResponseHeaders(HttpConnection response) {
        HttpHeaders headers = new HttpHeaders();
        try {
            int index = 0;
            while (response.getHeaderFieldKey(index) != null) {
                headers.addProperty(response.getHeaderFieldKey(index),
                        response.getHeaderField(index));
                index++;
            }
        } catch (IOException ioe) {
        }
        return headers;
    }
    public boolean hasCache(String url) {
        return cacheTable.containsKey(url);
    }

    public boolean hasCacheExpired(String url) {
        Object o = cacheTable.get(url);
        if (o instanceof CacheItem) {
            CacheItem ci = (CacheItem) o;
            long date = (new Date()).getTime();
            if (ci.getExpires() > date) {
                return false;
            } else {
                // Remove the expired cache item
                clearCache(url);
            }
        }
        return true;
    }

    public void clearCache(String url) {
        cacheTable.remove(url);
    }

    public InputConnection getCache(String url) {
        Object o = cacheTable.get(url);
        if (o instanceof CacheItem) {
            CacheItem ci = (CacheItem) o;
            return new BrowserFieldResponse(url,
                    ci.getData(),
                    ci.getHttpHeaders());
        }
        return null;
    }
}
CacheItem.java:
import net.rim.device.api.io.http.HttpHeaders;

public class CacheItem {
    private String url;
    private long expires;
    private byte[] data;
    private HttpHeaders httpHeaders;

    public CacheItem(String url,
                     long expires,
                     byte[] data,
                     HttpHeaders httpHeaders) {
        this.url = url;
        this.expires = expires;
        this.data = data;
        this.httpHeaders = httpHeaders;
    }

    public String getUrl() {
        return url;
    }

    public long getExpires() {
        return expires;
    }

    public byte[] getData() {
        return data;
    }

    public HttpHeaders getHttpHeaders() {
        return httpHeaders;
    }
}
Any help that can be given towards this will be greatly appreciated. This really has me stumped. Thanks.
UPDATE: It looks like the caching only works at a certain level of the BlackBerry libraries. I have added logic to check the device's current software level and turn on caching only when that level supports it. This gives me a reasonable workaround, but I would still like to know whether there is a better way to make the caching work on all devices.
UPDATE 2 (based on comments): "The site no longer displaying properly" means the site loses its proper layout, images, and text formatting. It basically renders a white background with links and text displayed as a bulleted list, with all formatting removed.
I've been looking at your code, and the only thing I've found wrong with it is that you completely ignore the possibility of response.getLength() returning less than zero (in CacheManagerImpl.createCache()). Although this didn't happen to me on the stackoverflow.com page, some pages use Transfer-Encoding: chunked, which means Content-Length is not present. Your code does handle that case gracefully, so it should not cause the cache to fail (it would only make the cache less effective).
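On the chunked-transfer point: when getLength() returns a negative value, the body has to be read until end-of-stream rather than into a fixed-size buffer. A minimal sketch of that pattern in plain java.io (ByteArrayOutputStream is also part of CLDC, so it should port to the BlackBerry code; the class and method names here are illustrative):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class ReadFully {
    // Reads the whole stream regardless of whether a Content-Length was
    // available, so chunked responses can be cached too.
    public static byte[] readFully(InputStream is) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        int n;
        while ((n = is.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }
}
```

With this, the negative-length branch in createCache() could fall back to readFully(is) instead of skipping the cache entirely.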
I suggest testing your code on smaller problems, one step at a time. First, create a cacheable page that contains only some text (like "hello") without any HTML tags. That should work pretty well, and if it does not, it shouldn't be hard to determine where the data are getting lost. Or try manually creating a cache item that never expires and contains a page with no external stylesheet or images, and see whether it is even possible to pass it to BrowserField the way you do. Then build on that: add an image, then a stylesheet, so you can corner the problem.
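To make the "cache item that never expires" suggestion concrete, the expiry math can be tried outside the BlackBerry environment with a trimmed-down stand-in for CacheItem (the HttpHeaders field is dropped here, and isExpired() mirrors the comparison in hasCacheExpired()):

```java
import java.util.Date;

// Simplified stand-in for the question's CacheItem, without the
// BlackBerry HttpHeaders dependency, just to exercise the expiry check.
class SimpleCacheItem {
    private final long expires;
    private final byte[] data;

    SimpleCacheItem(long expires, byte[] data) {
        this.expires = expires;
        this.data = data;
    }

    // Same comparison as hasCacheExpired(): expired once the expiry
    // time is no longer in the future.
    boolean isExpired() {
        return expires <= new Date().getTime();
    }

    byte[] getData() {
        return data;
    }
}

public class CacheItemDemo {
    public static void main(String[] args) {
        // Expires one year from now: should never report expired in a test run
        long oneYear = new Date().getTime() + 365L * 24 * 3600 * 1000;
        SimpleCacheItem item =
                new SimpleCacheItem(oneYear, "<html>hello</html>".getBytes());
        System.out.println(item.isExpired()); // false
    }
}
```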
The code is written very nicely, but at this point it is hard to help further: there are no evident flaws in the code, and the problem description is vague. It is not clear how the error manifests itself, or whether it happens every time or at random. If I had a BlackBerry device I could try running the code myself, but I don't.
