A NullPointerException simply means that some value is null. But when using APIs such as GeoTiff, it becomes annoying to track down where the mistake in usage is.
My code is as follows:
System.out.println("vectorization starts");
GridCoverage2D srcCoverage = new GeoTiffReader(new File("E:/output/ll_processed.TIFF")).read(new GeneralParameterValue[]{policy, gridsize, useJaiRead});
SimpleFeatureCollection fc = RasterToVectorProcess.process(srcCoverage, 3, srcCoverage.getEnvelope(), Collections.singletonList(0.0d), true, null);
System.out.println("process ends");
System.out.println("vectorization ends");
//MapContext map = new DefaultMapContext();
//map.setTitle("raster to vector conversion");
Style style = SLD.createPolygonStyle(Color.BLUE, Color.CYAN, 1.0f);
//map.addLayer(fc, style);
//map.getLayerBounds();
//JMapFrame.showMap(map);
MapContent mapContent= new MapContent();
mapContent.setTitle("Illegal Mining");
Layer layer = new FeatureLayer(fc, style,"VectorLayer");
//int boundary = 10;
// ReferencedEnvelope env2 = new ReferencedEnvelope(srcCoverage.getEnvelope().getMinimum(0) - boundary, srcCoverage.getEnvelope().getMaximum(0) + boundary,
//srcCoverage.getEnvelope().getMinimum(1) - boundary, srcCoverage.getEnvelope().getMaximum(1) + boundary, srcCoverage.getCoordinateReferenceSystem());
//mapContent.getViewport().setBounds(fc.getBounds());
if (layer.getBounds() != null) // here the error occurs; also tried if (layer != null && layer.getBounds() != null)
{
mapContent.addLayer(layer);
}else{
System.out.println("Layer bounds are null");
}
mapContent.getViewport().setCoordinateReferenceSystem(
DefaultGeographicCRS.WGS84);
Error
at org.geotools.map.FeatureLayer.getBounds(FeatureLayer.java:199)
I am trying to convert a TIFF to a vector image, which I then want to store on disk.
Most likely your featureSource is null: the GeoTools FeatureLayer method getBounds() uses the featureSource to retrieve the bounds, but the FeatureLayer constructor doesn't check whether the featureSource is null.
The featureSource is the first argument to the FeatureLayer constructor. In your case, that is the variable fc (a SimpleFeatureCollection).
Most likely, RasterToVectorProcess.process returned null, so fc is null.
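As a sketch of the defensive check this implies (the `process` stand-in and the messages below are illustrative only, not GeoTools API), fail fast when the processing step returns null instead of letting the NPE surface later inside FeatureLayer.getBounds():

```java
// Minimal sketch: guard the result of a processing step that may return null,
// so the failure is reported at the call site, not deep inside the map code.
// process(...) is a hypothetical stand-in for RasterToVectorProcess.process(...).
public class NullGuardDemo {
    static Object process(boolean ok) {
        return ok ? new Object() : null; // stand-in for the vectorization step
    }

    public static void main(String[] args) {
        Object fc = process(false);
        if (fc == null) {
            // Fail here with a clear message rather than inside FeatureLayer.getBounds()
            System.out.println("RasterToVectorProcess.process returned null - no layer created");
        } else {
            System.out.println("feature collection ok - safe to build the FeatureLayer");
        }
    }
}
```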
Minor Change in RasterToVectorProcess.java
// features.add(builder.buildFeature(null));
// System.out.println("ignored");
// add instead:
System.out.println("adding...");
SimpleFeature feature = builder.buildFeature(null);
((Collection<SimpleFeature>) features).add(feature);
My code:
QCategory categoryDepth1 = new QCategory("categoryDepth1");
QCategory categoryDepth2 = new QCategory("categoryDepth2");
QCategory categoryDepth3 = new QCategory("categoryDepth3");
QCategory categoryDepth4 = new QCategory("categoryDepth4");
List<Tuple> pagingList = jpaQueryFactory.select(
// Projections.constructor(
// SearchProductListDTO.class,
product.productId,
productImage.mainImageUrl,
product.productName,
priceInfo.originalPrice,
priceInfo.sellingPrice,
product.productCode,
product.regDate,
product.udtDate,
product.statusCodeId,
select(productOption.stock.sum()).from(productOption).where(productOptionIdEqProduct()),
categoryDepth1.categoryName,
categoryDepth2.categoryName,
categoryDepth3.categoryName,
categoryDepth4.categoryName
// )
).
from(brand).
join(product).on(product.brandId.eq(brand.brandId)).
join(priceInfo).on(priceInfo.productId.eq(product.productId)).fetchJoin().
join(productImage).on(productImage.productId.eq(product.productId)).fetchJoin().
join(productCategory).on(product.productId.eq(productCategory.id)).fetchJoin().
join(categoryDepth1).on(categoryIdEqProductCategoryDepth(categoryDepth1.categoryId, productCategory.categoryDepth1)).
join(categoryDepth2).on(categoryIdEqProductCategoryDepth(categoryDepth2.categoryId, productCategory.categoryDepth2)).
join(categoryDepth3).on(categoryIdEqProductCategoryDepth(categoryDepth3.categoryId, productCategory.categoryDepth3)).
rightJoin(categoryDepth4).on(categoryIdEqProductCategoryDepth(categoryDepth4.categoryId, productCategory.categoryDepth4)).
where((brandNameEqBrandKrList(brandIdList))).
offset(pageable.getOffset()).
limit(pageable.getPageSize()).
fetch();
The problem is this line:
rightJoin(categoryDepth4).on(categoryIdEqProductCategoryDepth(categoryDepth4.categoryId, productCategory.categoryDepth4)).
The value productCategory.categoryDepth4 may be null, and when it is null I do not want the join condition to inspect it. What should I do?
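One way to reason about this (a sketch, not a verified QueryDSL answer): in SQL a null join key simply never matches, so the null case can be made explicit. The helper below is a plain-Java stand-in for the ON predicate, not QueryDSL API:

```java
import java.util.Objects;

// Plain-Java model of the ON predicate in
// rightJoin(categoryDepth4).on(... categoryDepth4.categoryId eq productCategory.categoryDepth4 ...):
// a null depth column should mean "no category at this depth", never an NPE.
public class NullSafeJoinKey {
    static boolean joinMatches(Long categoryId, Long depthColumn) {
        // Guard first: when the column is null the row simply does not match;
        // the right join still keeps the category side of the result.
        return depthColumn != null && Objects.equals(categoryId, depthColumn);
    }

    public static void main(String[] args) {
        System.out.println(joinMatches(4L, 4L));   // true
        System.out.println(joinMatches(4L, null)); // false, no NullPointerException
    }
}
```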
The OpenNLPAnalyzer, based on the OpenNLPTokenizer in the opennlp package that ships with Lucene, works as promised in this blog post. I am now trying to use it inside a ComboAnalyzer (part of an ES plugin that combines multiple analyzers; see the link below) in the following way:
ComboAnalyzer analyzer = new ComboAnalyzer(new EnglishAnalyzer(), new OpenNLPAnalyzer());
TokenStream stream = analyzer.tokenStream("fieldname", new StringReader(text));
stream is a ComboTokenStream. On calling stream.incrementToken(), I get the following exception at line 105 here:
Exception in thread "main": State contains AttributeImpl of type org.apache.lucene.analysis.tokenattributes.OffsetAttributeImpl that is not in in this AttributeSource
Here is what the called method restoreState does.
public final void restoreState(State state) {
  if (state == null) return;
  do {
    AttributeImpl targetImpl = attributeImpls.get(state.attribute.getClass());
    if (targetImpl == null) {
      throw new IllegalArgumentException("State contains AttributeImpl of type " +
          state.attribute.getClass().getName() + " that is not in in this AttributeSource");
    }
    state.attribute.copyTo(targetImpl);
    state = state.next;
  } while (state != null);
}
This hints that one of the TokenStreams has an OffsetAttribute but the other does not. Is there a clean way to fix this?
I tried to add the line addAttribute(OffsetAttribute.class) in the same file here. I still get the same exception.
The problem was here:
Tokenizer source = new OpenNLPTokenizer(
AttributeFactory.DEFAULT_ATTRIBUTE_FACTORY, sentenceDetectorOp, tokenizerOp);
The fix is to pass in TokenStream.DEFAULT_TOKEN_ATTRIBUTE_FACTORY instead of AttributeFactory.DEFAULT_ATTRIBUTE_FACTORY. The former uses PackedTokenAttributeImpl for implementing OffsetAttribute (and many other attributes) and the latter picks OffsetAttributeImpl.
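The mismatch described above can be modeled without Lucene. The toy map-based model below only mimics restoreState from the snippet above (string keys stand in for attribute classes); it is not Lucene code:

```java
import java.util.HashMap;
import java.util.Map;

// Toy model of AttributeSource.restoreState(): the captured state carries an
// attribute key that the target source never registered, so restore must throw,
// just like the ComboTokenStream did when the two streams used different
// attribute factories (OffsetAttributeImpl vs PackedTokenAttributeImpl).
public class RestoreStateDemo {
    static void restore(Map<String, String> target, Map<String, String> captured) {
        for (Map.Entry<String, String> e : captured.entrySet()) {
            if (!target.containsKey(e.getKey())) {
                throw new IllegalArgumentException(
                    "State contains " + e.getKey() + " that is not in this AttributeSource");
            }
            target.put(e.getKey(), e.getValue()); // copy the attribute value across
        }
    }

    public static void main(String[] args) {
        Map<String, String> target = new HashMap<>();
        target.put("CharTermAttribute", "");
        Map<String, String> captured = new HashMap<>();
        captured.put("OffsetAttribute", "0..5"); // present in one stream, missing in the other
        try {
            restore(target, captured);
        } catch (IllegalArgumentException ex) {
            System.out.println(ex.getMessage());
        }
    }
}
```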
I am trying to implement OSM using osmdroid, and so far everything works correctly. But I would like to know how I can set two tile sources on top of each other.
Below is how I am trying to implement it:
map = (MapView) findViewById(R.id.map);
map.getTileProvider().getTileCache().getProtectedTileComputers().clear();
map.getTileProvider().getTileCache().setAutoEnsureCapacity(false);
map.setTileSource(TileSourceFactory.MAPNIK);
map.setTileSource(new OnlineTileSourceBase("USGS Topo", 0, 18, 256, ".png?apikey=123",
new String[] { "https://api.xyz.com/maps/satellite/zxy/2020-04-21T16:10:00Z/" }) {
@Override
public String getTileURLString(long pMapTileIndex) {
return getBaseUrl()
+ MapTileIndex.getZoom(pMapTileIndex)
+ "/" + MapTileIndex.getY(pMapTileIndex)
+ "/" + MapTileIndex.getX(pMapTileIndex)
+ mImageFilenameEnding;
}
});
map.setVerticalMapRepetitionEnabled(false);
So far I am getting the output below, but there is no Mapnik source under the satellite tile source.
But I need the output to look like this:
How do I achieve this?
As mentioned in the wiki docs on osmdroid's GitHub page, you can use more than one tile source at a time.
You can only call setTileSource once, so for your code to work you need to set the base map as the tile source and add the satellite tile layer as an overlay on top of it.
map = (MapView) findViewById(R.id.map);
map.getTileProvider().getTileCache().getProtectedTileComputers().clear();
map.getTileProvider().getTileCache().setAutoEnsureCapacity(false);
map.setTileSource(TileSourceFactory.MAPNIK);
MapTileProviderBasic provider = new MapTileProviderBasic(MainActivity.this, new OnlineTileSourceBase("USGS Topo", 0, 18, 256, "png?apikey=123", new String[0]) {
@Override
public String getTileURLString(MapTile aTile) {
// BoundingBox bbox = tile2boundingBox(aTile.getX(), aTile.getY(), aTile.getZoomLevel());
// String baseUrl ="https://api.xyz.com/maps/satellite/zxy/2020-04-21T16:10:00Z/?dpi=96&transparent=true&format=png24&bbox="+bbox.west+","+bbox.south+","+bbox.east+","+bbox.north+"&size=256,256&f=image";
return "https://api.xyz.com/maps/satellite/zxy/2020-04-21T16:10:00Z/" + aTile.getZoomLevel() + "/" + aTile.getY() + "/" + aTile.getX();
// return baseUrl;
}
});
TilesOverlay layer = new TilesOverlay(provider, MainActivity.this);
layer.setLoadingBackgroundColor(Color.TRANSPARENT);
layer.setLoadingLineColor(Color.TRANSPARENT);
map.getOverlays().add(layer);
map.setVerticalMapRepetitionEnabled(false);
If you are using osmdroid-wms, you can likewise use a map layer with an overlay layer on top of it by changing the following line:
MapTileProviderBasic provider = new MapTileProviderBasic(MainActivity.this, mWMSTileSource)
Here, mWMSTileSource is the object you get from WMSTileSource.createFrom(WMSEndpoint endpoint, WMSLayer layer) after you have made a call to the WMS server, as described in the wiki.
I use:
cvFindContours(gray, mem, contours, Loader.sizeof(CvContour.class) , CV_RETR_CCOMP, CV_CHAIN_APPROX_SIMPLE, cvPoint(0,0));
and as the result I have a CvSeq contours to iterate over (as far as I understand it).
So I use it like that:
if(contours!=null) {
for (ptr = contours; ptr != null; ptr = ptr.h_next()) {
//..do sth with ptr
}
}
It works, but from time to time (quite often) I get:
Exception in thread "Thread-3" java.lang.NullPointerException: This pointer address is NULL.
at com.googlecode.javacv.cpp.opencv_core$CvSeq.h_next(Native Method)
at pl..filter(FullFilter.java:69)
at pl..Window$1.run(Window.java:41)
at java.lang.Thread.run(Unknown Source)
The line in which the exception is thrown is the line with ptr.h_next().
I tried to check for nulls but it doesn't work:
System.out.println("PTR="+ptr); // it's not null here!
if(ptr.h_next()==null) //exception in this line!
System.out.println("NULL");
System.out.println(ptr.h_next());
The first line shows:
PTR=com.googlecode.javacv.cpp.opencv_core$CvSeq[address=0x0,position=0,limit=1,capacity=1,deallocator=com.googlecode.javacpp.Pointer$NativeDeallocator#66d53ea4]
I also tried invoking contours.total(), but it always throws the same exception.
So, what is the proper way to use such C-like sequences in Java?
EDIT:
my full method:
public IplImage filter(IplImage image) {
IplConvKernel kernel = cvCreateStructuringElementEx(2,2,1,1,CV_SHAPE_RECT, null);
cvDilate(image, image, kernel, 1);
kernel = cvCreateStructuringElementEx(5,5,2,2,CV_SHAPE_RECT, null);
cvErode(image, image, kernel, 1);
IplImage resultImage = cvCloneImage(image);
IplImage gray = cvCreateImage(cvGetSize(image), IPL_DEPTH_8U, 1);
cvCvtColor(image, gray, CV_BGR2GRAY);
CvMemStorage mem = CvMemStorage.create();
CvSeq contours = new CvSeq();
CvSeq ptr = new CvSeq();
cvThreshold(gray, gray, 20, 255, opencv_imgproc.CV_THRESH_BINARY);
double thresh = 20;
Canny( gray, gray, thresh, thresh*2, 3 ,true);
cvFindContours(gray, mem, contours, Loader.sizeof(CvContour.class) , CV_RETR_CCOMP, CV_CHAIN_APPROX_SIMPLE, cvPoint(0,0));
int i=0;
CvRect boundbox;
if(contours!=null) {
for (ptr = contours; ptr != null; ptr = ptr.h_next()) { //EXCEPTION HERE!
System.out.println((i++)+"\t"+ptr);
cvDrawContours( resultImage, ptr, CvScalar.BLUE, CvScalar.RED, -1, 3, 8, cvPoint(0,0) );
System.out.println("PTR="+ptr);
}
}
return resultImage;
}
It works fine for some time and then suddenly (probably when no contours are found?) it ends with the following exception:
Exception in thread "Thread-3" java.lang.NullPointerException: This pointer address is NULL.
at com.googlecode.javacv.cpp.opencv_core$CvSeq.h_next(Native Method)
at pl.kasprowski.eyetracker.FullFilter2.filter(FullFilter2.java:39)
at pl.kasprowski.eyetracker.Window$1.run(Window.java:42)
at java.lang.Thread.run(Unknown Source)
I'm feeding the method directly with images taken from a camera (once a second).
EDIT:
After some experiments, it turns out that from time to time, when I invoke cvFindContours as above, I get a contours object which IS NOT NULL, but invoking any method on it like contours.h_next() or contours.total() results in an exception like the one above. What can be wrong? Or: how can I check whether the contours object is OK? Of course I could catch the NullPointerException, but I don't think that's the correct way to solve the problem...
Problem solved.
I added additional condition. Instead of:
if(contours!=null) {
I wrote
if(contours!=null && !contours.isNull()) {
and it works. I don't see exactly why it is necessary, but I think it's connected with the Java <-> C semantic gap.
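That gap can be made concrete: a JavaCPP-style wrapper is a real Java object that may wrap native address 0, so the Java != null check passes while the native pointer is still NULL. FakePointer below is a hypothetical stand-in, not the actual javacpp Pointer class:

```java
// Hypothetical minimal model of a JavaCPP-style pointer wrapper: the Java
// reference exists even when the native address it wraps is NULL (0).
public class PointerGapDemo {
    static class FakePointer {
        private final long address;
        FakePointer(long address) { this.address = address; }
        boolean isNull() { return address == 0; } // checks the *native* pointer
    }

    public static void main(String[] args) {
        // What cvFindContours can hand back: an allocated wrapper around NULL.
        FakePointer contours = new FakePointer(0);
        System.out.println(contours != null);  // true  - the Java wrapper was allocated
        System.out.println(contours.isNull()); // true  - the wrapped C pointer is NULL
        // Hence the working guard: contours != null && !contours.isNull()
    }
}
```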
Try using cvGetSeqElem(contours, i).
Example :
for (int i = 0; i < contours.total(); i++) {
    CvRect rect = cvBoundingRect(cvGetSeqElem(contours, i), 0);
    // ... your code ...
}
I am a beginner in Java and the Weka tool. I want to use the LogitBoost algorithm with DecisionStump as the weak learner in my Java code, but I don't know how to do this. I created a vector with six features (without the label feature) and I want to feed it into LogitBoost to get a label and the probability of its assignment. Labels are 1 or -1, and the train/test data is in an ARFF file. This is my code, but the algorithm always returns 0!
Thanks
double candidate_similarity(ha_nodes ha, WeightMatrix[][] wm, LogitBoost lgb, ArrayList<Attribute> atts) {
    double clslable = 0; // declared outside the try block so it can be returned
    lgb.buildClassifier(newdata); // newdata holds labeled data loaded from an ARFF file
    Evaluation eval = new Evaluation(newdata);
    eval.crossValidateModel(lgb, newdata, 10, new Random(1));
    try {
        feature_vector[0] = IP_sim(Main.a_new.dip, ha.candidate.dip_cand);
        feature_vector[1] = IP_sim(Main.a_new.sip, ha.candidate.sip_cand);
        feature_vector[2] = IP_s_d_sim(Main.a_new.sip, ha);
        feature_vector[3] = Dport_sim(Main.a_new.dport, ha);
        freq_weight(Main.a_new.Atype, ha, freq_avg, weight_avg, wm);
        feature_vector[4] = weight_avg;
        feature_vector[5] = freq_avg;
        double[] values = new double[]{feature_vector[0], feature_vector[1], feature_vector[2],
                feature_vector[3], feature_vector[4], feature_vector[5]};
        DenseInstance newInst = new DenseInstance(1.0, values);
        Instances dataUnlabeled = new Instances("TestInstances", atts, 0);
        dataUnlabeled.add(newInst);
        dataUnlabeled.setClassIndex(dataUnlabeled.numAttributes() - 1);
        // classify the instance we just built, not an undefined variable
        clslable = lgb.classifyInstance(dataUnlabeled.firstInstance());
    } catch (Exception ex) {
        // Logger.getLogger(Module2.class.getName()).log(Level.SEVERE, null, ex);
    }
    return clslable;
}
Where did this newdata come from? You need to load the file properly to get a correct classification. Use this class to load the features from the file:
http://weka.sourceforge.net/doc/weka/core/converters/ArffLoader.html
I'm not posting example code because I use Weka with MATLAB, so I don't have examples in Java.