I have a GPS coordinate and I wish to get a list of addresses within a certain radius of it. From these addresses I wish to select the ones that are companies and belong to a specific sector (clothing, finance, metallurgy, etc.). I thought the returned API object might contain the necessary information to make the distinction. Is there a free API that has the necessary functions to bring me closer to my goal? (like Google Maps, Google Places, Yahoo PlaceFinder, and similar applications)
Note: I use the MVEL language. I guess it has access to some Java functions, so you can consider me to be writing Java, but any information regarding the MVEL language is also appreciated.
Note 2: The radius is not an essential requirement; it would be enough if I found the addresses in a certain district. The classification, however, is necessary.
I think you can take a look into:
GoogleMapServicesJava
Particularly into the Places API service:
Places API
where you can run a search filtered by radius, place type, etc.
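For example, with the google-maps-services-java client a nearby search could look roughly like this (a sketch only: the API key, the coordinate, and the use of PlaceType.CLOTHING_STORE to stand in for the "clothing" sector are all placeholder assumptions):

import com.google.maps.GeoApiContext;
import com.google.maps.PlacesApi;
import com.google.maps.model.LatLng;
import com.google.maps.model.PlaceType;
import com.google.maps.model.PlacesSearchResponse;
import com.google.maps.model.PlacesSearchResult;

public class NearbyCompanies {
    public static void main(String[] args) throws Exception {
        // Placeholder key and coordinate - replace with your own values.
        GeoApiContext context = new GeoApiContext.Builder()
                .apiKey("YOUR_API_KEY")
                .build();
        LatLng center = new LatLng(47.4979, 19.0402);

        // Nearby search within a 1500 m radius, restricted to one place type.
        PlacesSearchResponse response = PlacesApi.nearbySearchQuery(context, center)
                .radius(1500)
                .type(PlaceType.CLOTHING_STORE)
                .await();

        for (PlacesSearchResult result : response.results) {
            // Each result carries a name, a short address ("vicinity") and its types.
            System.out.println(result.name + " - " + result.vicinity);
        }
        context.shutdown();
    }
}

The types on each result are what let you distinguish companies by sector, so you could run one query per place type you care about.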
Hope that will help!
I am developing an app which will give you nearby Mosques within 10 km of your current location. Since the Places API only allows a certain number of queries per day, I have used Firebase to store the nearby Mosques for a given location, and I first check whether the data is already in the database before querying. But this still doesn't solve the problem: e.g. if a user is on the go the whole day, the results must change every single minute according to his/her location. How can I achieve the desired results?
As mentioned earlier, I am saving nearby locations in a database together with the reference location around which they were found. But this doesn't quite solve the problem.
Any help will be greatly appreciated.
The Places API is a commercial offering - you are meant to pay for using it if you want to build applications around it.
There's a certain small number of calls that you can do for free, but this is only meant as a testing ground or for private use. I am no lawyer, but I would guess that circumventing the fee by scraping the map (like setting a bot to go around a country to build a database of points of interest) would be illegal and would probably get you a letter from Google saying you should stop.
Use the AutocompleteSessionToken class to generate a token and place it after your key. This token will reduce your usage, because you can request the Places API multiple times and it will still be counted as a single request. I hope this helps, because I didn't understand your question very well. Here is a sample of the link:
https://maps.googleapis.com/maps/api/place/autocomplete/json?input=1600+Amphitheatre&key=&sessiontoken=1234567890.
For more details, see here.
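If you are on Android, the token is generated and attached to an autocomplete request roughly like this (a sketch using the Places SDK for Android; placesClient is assumed to be an already initialised PlacesClient and the query text is a placeholder):

// Generate one session token and reuse it for every keystroke of a single search session.
AutocompleteSessionToken token = AutocompleteSessionToken.newInstance();

FindAutocompletePredictionsRequest request = FindAutocompletePredictionsRequest.builder()
        .setQuery("1600 Amphitheatre")   // the text the user has typed so far
        .setSessionToken(token)          // ties the requests of this session together for billing
        .build();

placesClient.findAutocompletePredictions(request)
        .addOnSuccessListener(response -> {
            for (AutocompletePrediction prediction : response.getAutocompletePredictions()) {
                Log.d("Places", prediction.getFullText(null).toString());
            }
        });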
I want to build a weather app and I have some problems.
My big problem is place names! You know, I want the user to find his location in two ways: with GPS and by searching. But my problem is the place names. How can I find a database of all the places in the world?
Is it a good idea to store them in a database on my server? Or are there services that provide this functionality? If I have to create my own database, how can I create one like that? Is there a database with city names, latitude/longitude and other information? If a new location is created, how can I add it to the system? By hand?
The second problem is that after getting the city name, how can I find that location's latitude/longitude? It seems Google has a Geocoder service, but I don't know how it works. Please help me. I just want to select a city name and find the related coordinates...
The Google Maps API provides you with Geocoding, which allows you to turn strings like 'Santiago, Chile' or 'New York' into proper latitude/longitude coordinates. So in your code, you should perform an HTTP GET request to a URL like this one:
http://maps.googleapis.com/maps/api/geocode/json?address=Santiago,+Chile&sensor=true
And it returns a JSON object with a properly formatted address along with latitude and longitude information of the place you were looking up.
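A minimal sketch of that request in plain Java, using the org.json classes that ship with Android (error handling omitted; newer versions of the API also expect a key parameter):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import org.json.JSONObject;

public class GeocodeExample {
    public static void main(String[] args) throws Exception {
        String address = URLEncoder.encode("Santiago, Chile", "UTF-8");
        // Append "&key=YOUR_API_KEY" if your usage requires a key.
        URL url = new URL("https://maps.googleapis.com/maps/api/geocode/json?address=" + address);

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream(), "UTF-8"));
        StringBuilder json = new StringBuilder();
        String line;
        while ((line = in.readLine()) != null) {
            json.append(line);
        }
        in.close();

        // results[0].geometry.location holds the lat/lng of the best match.
        JSONObject location = new JSONObject(json.toString())
                .getJSONArray("results").getJSONObject(0)
                .getJSONObject("geometry").getJSONObject("location");
        System.out.println(location.getDouble("lat") + ", " + location.getDouble("lng"));
    }
}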
It's explained very thoroughly in the Google Maps API Geocoder documentation, so you should probably take a look at that. I'm no expert in Android development, but there should be some library that allows you to easily access what the Maps API has to offer in a clean way.
Maps API also provides you with a solution for Place Searching, and even input autocompletion, but all the examples I see are on JavaScript/HTML, so I'm not completely sure if there's an alternative in plain Java/Android to what you're trying to do. Nevertheless, you should take a look at basic place searching and place search autocompletion so you get a general sense of how it works.
There's an entire section in the Google Developers website dedicated to the Maps API on Android, so make sure to take a look at that as well and you might find more useful information - sadly I have no experience with Android whatsoever, so I can't really point you in any direction.
Good luck!
You could use the Google Maps Geocoding API. Querying coordinates returns a JSON response containing the current location in multiple levels of detail that you could parse.
E.g. for the position lat:40.714224 long: -73.961452, the query URL would look like the following:
http://maps.googleapis.com/maps/api/geocode/json?latlng=40.714224,-73.961452&sensor=false
The result contains the city and country, along with much more.
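For example, the city and country can be pulled out of the address_components array of the first result, roughly like this (a sketch; json is assumed to hold the response body returned by the URL above):

import org.json.JSONArray;
import org.json.JSONObject;

// "json" holds the body returned by the reverse-geocoding URL above.
static String[] extractCityAndCountry(String json) {
    JSONArray components = new JSONObject(json)
            .getJSONArray("results").getJSONObject(0)
            .getJSONArray("address_components");
    String city = null;
    String country = null;
    for (int i = 0; i < components.length(); i++) {
        JSONObject component = components.getJSONObject(i);
        String types = component.getJSONArray("types").toString();
        if (types.contains("\"locality\"")) {
            city = component.getString("long_name");      // the "locality" component
        } else if (types.contains("\"country\"")) {
            country = component.getString("long_name");   // the "country" component
        }
    }
    return new String[] { city, country };
}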
You can find more information on the API here: https://developers.google.com/maps/documentation/geocoding/
You could also try http://openweathermap.org/
I haven't used this extensively but I tried it enough that I know it works. It lets you query by location name or geographic coordinates, and the city name is included in the responses along with plenty of other data.
http://api.openweathermap.org/data/2.5/weather?lat=xx&lon=xx returns a weather object with city name & related data. API key is even optional so you can try it out very quickly.
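A quick sketch of reading the city name out of that response (same org.json parsing as in the geocoding example above; the coordinates are placeholders, and you can append &appid=YOUR_KEY if the service asks for a key):

URL url = new URL("http://api.openweathermap.org/data/2.5/weather?lat=40.71&lon=-74.00");
BufferedReader in = new BufferedReader(new InputStreamReader(url.openStream(), "UTF-8"));
StringBuilder json = new StringBuilder();
String line;
while ((line = in.readLine()) != null) {
    json.append(line);
}
in.close();

JSONObject weather = new JSONObject(json.toString());
System.out.println("City: " + weather.getString("name"));                        // city name
System.out.println("Temp: " + weather.getJSONObject("main").getDouble("temp"));  // Kelvin by default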
I am developing a financial manager in my free time with Java and a Swing GUI. When the user adds a new entry, he is prompted to fill in: Money amount, Date, Comment and Section (e.g. Car, Salary, Computer, Food, ...)
The sections are created "on the fly". When the user enters a new section, it will be added to the section-jcombobox for further selection. The other point is, that the comments could be in different languages. So the list of hard coded words and synonyms would be enormous.
So, my question is, is it possible to analyse the comment (e.g. "Fuel", "Car service", "Lunch at **") and preselect a fitting Section.
My first thought was, do it with a neural network and learn from the input, if the user selects another section.
But my problem is, I don't know how to start at all. I tried "encog" with Eclipse and did some tutorials (XOR, ...). But all of them only use doubles as input/output.
Anyone could give me a hint how to start or any other possible solution for this?
Here is a runnable JAR (current development state, requires Java 7) and the Sourceforge page
Forget about neural networks. This is a highly technical and specialized field of artificial intelligence, which is probably not suitable for your problem, and requires solid expertise. Besides, there are a lot of simpler and better solutions for your problem.
First obvious solution: build a list of words and synonyms for all your sections and parse the comments for these synonyms. You can then collect comments online for synonym analysis, or parse the comments/sections provided by your users to statistically detect relations between words, etc.
There is an infinite number of possible solutions, ranging from the simplest to the most overkill. Now you need to define whether this feature of your system is critical (prefilling? probably not, then)... and what any development effort will bring you. One hour of work could bring you an 80% satisfying feature, while aiming for 90% could cost a week of work. Is it really worth it?
Go for the simplest solution and tackle the real challenge of any dev project: delivering. Once your app is delivered, then you can always go back and improve as needed.
// Case-insensitive keyword check on the comment text
String myString = paramInput.toUpperCase();
if (myString.contains("FUEL")) {
    // do the fuel functionality, e.g. preselect the "Car" section
}
In a simple app with only a few specific sections, you can take the string from the comment, check whether it contains certain keywords, and set the value of Section accordingly.
If you have a lot of categories, I would use something like Apache Lucene, where you could index all the categories with their names and potential keywords/phrases that might appear in a user's description. Then you could simply run the description through Lucene and use the top matched category as a "best guess".
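A rough sketch of that idea against a recent Lucene release (the section names and keyword lists are made up for illustration, and an in-memory index is used only to keep the example short):

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.document.StringField;
import org.apache.lucene.document.TextField;
import org.apache.lucene.index.DirectoryReader;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.IndexWriterConfig;
import org.apache.lucene.queryparser.classic.QueryParser;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.ScoreDoc;
import org.apache.lucene.store.ByteBuffersDirectory;
import org.apache.lucene.store.Directory;

public class SectionGuesser {
    public static void main(String[] args) throws Exception {
        StandardAnalyzer analyzer = new StandardAnalyzer();
        Directory index = new ByteBuffersDirectory();

        // Index one document per section, with the keywords/phrases you expect in comments.
        IndexWriter writer = new IndexWriter(index, new IndexWriterConfig(analyzer));
        addSection(writer, "Car", "fuel petrol gas station car service tires repair");
        addSection(writer, "Food", "lunch dinner restaurant groceries supermarket");
        addSection(writer, "Salary", "salary wage payroll bonus");
        writer.close();

        // Run the user's comment against the index and take the top hit as the "best guess".
        String comment = "Lunch at Joe's Diner";
        IndexSearcher searcher = new IndexSearcher(DirectoryReader.open(index));
        QueryParser parser = new QueryParser("keywords", analyzer);
        ScoreDoc[] hits = searcher.search(parser.parse(QueryParser.escape(comment)), 1).scoreDocs;
        if (hits.length > 0) {
            System.out.println("Suggested section: " + searcher.doc(hits[0].doc).get("section"));
        }
    }

    private static void addSection(IndexWriter writer, String section, String keywords) throws Exception {
        Document doc = new Document();
        doc.add(new StringField("section", section, Field.Store.YES));
        doc.add(new TextField("keywords", keywords, Field.Store.NO));
        writer.addDocument(doc);
    }
}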
P.S. Neural Network inputs and outputs will always be doubles or floats with a value between 0 and 1. As for how to implement String matching I wouldn't even know where to start.
It seems to me that the following will do:
hard word statistics
maybe a stemming class (English/Spanish) which reduces a word like "lunches" to "lunch"
a list of the most frequent non-words (the, at, a, for, ...)
The best fit is a linear problem, so in theory a fit for a neural net, but why not just compute the numerical best fit directly?
A machine learning algorithm such as an Artificial Neural Network doesn't seem like the best solution here. ANNs can be used for multi-class classification (i.e. 'which of the provided pre-trained classes does the input belong to?' rather than just 'does the input represent an X?'), which fits your use case. The problem is that they are supervised learning methods, and as such you need to provide a list of pairs of keywords and classes (Sections) that spans every possible input that your users will provide. This is impossible, and in practice ANNs are re-trained when more data is available to produce better results and create a more accurate decision boundary / representation of the function that maps the inputs to outputs. This also assumes that you know all possible classes before you start and that each of those classes has training input values that you provide.
The issue is that the input to your ANN (a list of characters or a numerical hash of the string) provides no context by which to classify. There's no higher level information provided that describes the word's meaning. This means that a different word that hashes to a numerically close value can be misclassified if there was insufficient training data.
(As maclema said, the output from an ANN will always be floats with each value representing proximity to a class - or a class with a level of uncertainty.)
A better solution would be to employ some kind of word-relation or synonym graph. A Bag of words model might be useful here.
Edit: In light of your comment that you don't know the Sections beforehand,
an easy solution to program would be to provide a list of keywords in a file that gets updated as people use the program. Simply storing a mapping of provided comments -> Sections, which you will already have in your database, would allow you to filter out non-keywords (and, or, the, ...). One option is then to find, for each typed keyword, the Sections it has belonged to, suggest several Sections and let the user pick one; the feedback you get from those selections lets you improve the suggestions in the future. Another would be to calculate a Bayesian probability - the probability that a word belongs to Section X given the previously stored mappings - for all keywords and Sections, and either take the modal Section or normalise over each unique keyword and take the mean. The probability calculations will of course need to be updated as you gather more information; perhaps this could be done with every new addition, in a background thread.
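A small sketch of the counting/probability idea under those assumptions (the stop-word list is a placeholder, and in the real app the stored comment -> Section mappings would come from your database):

import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class SectionSuggester {

    private static final Set<String> STOP_WORDS =
            new HashSet<String>(Arrays.asList("and", "or", "the", "at", "a", "for"));

    // keyword -> (section -> how often that keyword was saved under that section)
    private final Map<String, Map<String, Integer>> counts = new HashMap<String, Map<String, Integer>>();

    // Call this whenever the user saves an entry, so the statistics improve over time.
    public void learn(String comment, String section) {
        for (String word : comment.toLowerCase().split("\\W+")) {
            if (word.isEmpty() || STOP_WORDS.contains(word)) {
                continue;
            }
            Map<String, Integer> perSection = counts.get(word);
            if (perSection == null) {
                perSection = new HashMap<String, Integer>();
                counts.put(word, perSection);
            }
            Integer old = perSection.get(section);
            perSection.put(section, old == null ? 1 : old + 1);
        }
    }

    // Suggest the section with the highest summed P(section | word) over the comment's keywords.
    public String suggest(String comment) {
        Map<String, Double> scores = new HashMap<String, Double>();
        for (String word : comment.toLowerCase().split("\\W+")) {
            Map<String, Integer> perSection = counts.get(word);
            if (perSection == null) {
                continue;
            }
            int total = 0;
            for (int c : perSection.values()) {
                total += c;
            }
            for (Map.Entry<String, Integer> e : perSection.entrySet()) {
                double p = e.getValue() / (double) total;  // P(section | word) from stored mappings
                Double old = scores.get(e.getKey());
                scores.put(e.getKey(), old == null ? p : old + p);
            }
        }
        String best = null;
        double bestScore = 0;
        for (Map.Entry<String, Double> e : scores.entrySet()) {
            if (e.getValue() > bestScore) {
                bestScore = e.getValue();
                best = e.getKey();
            }
        }
        return best;
    }
}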
I am using the 'Twitter4j' library and I am just wondering if it is at all possible to return tweets that are within a location AND contain a certain keyword. I notice that the official Twitter documentation mentions this:
Bounding boxes are logical ORs. A locations parameter may be combined with track parameters, but note that all terms are logically ORd, so the query string track=twitter&locations=-122.75,36.8,-121.75,37.8 would match any tweets containing the term Twitter (even non-geo tweets) OR coming from the San Francisco area.
Which is unfortunate, as it is not what I need; it's returning way too many tweets. Any idea how I could get around this, or is there something I'm missing in the library that could allow me to do it?
Library javadoc: http://twitter4j.org/en/javadoc/twitter4j/FilterQuery.html#locations
At the moment I have my filter code like this
twitter.filter(new FilterQuery().locations(sydney).track(keywords));
and have also tried each on its own line:
twitter.filter(new FilterQuery().locations(sydney));
twitter.filter(new FilterQuery().track(keywords));
Unfortunately, you are reading the documents correctly. I don't know enough about twitter4j to say if there's a method contained somewhere that will handle this for you more easily, but you can always just use a simple string comparison to see if your search terms are included in the tweet.
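For example, something along these lines (a sketch only; the Sydney bounding box and the keyword list are placeholder values):

import twitter4j.FilterQuery;
import twitter4j.Status;
import twitter4j.StatusAdapter;
import twitter4j.TwitterStream;
import twitter4j.TwitterStreamFactory;

public class GeoKeywordStream {
    public static void main(String[] args) {
        // Bounding box for Sydney (south-west corner, then north-east corner) - approximate values.
        final double[][] sydney = {{150.52, -34.12}, {151.34, -33.57}};
        final String[] keywords = {"coffee"};

        TwitterStream stream = new TwitterStreamFactory().getInstance();
        stream.addListener(new StatusAdapter() {
            @Override
            public void onStatus(Status status) {
                String text = status.getText().toLowerCase();
                for (String keyword : keywords) {
                    if (text.contains(keyword.toLowerCase())) {
                        System.out.println(status.getUser().getScreenName() + ": " + status.getText());
                        break;
                    }
                }
            }
        });

        // Filter the stream by location only; the keyword check above narrows it down client-side.
        stream.filter(new FilterQuery().locations(sydney));
    }
}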
Not an answer, but a simple workaround:
I know most people don't have GPS enabled when they tweet, and others would not like to share their location!
But they are still sharing their location!! Guess how? On their profiles! Their hometown and country are usually visible, which can give you an approximate location of where the tweet came from! You can query the user's profile, and thus his/her location, using the REST API
twitter.showUser(userScreenName).getLocation();
Search for all keywords, and if the location you want doesn't match, simply discard the tweet. This way you can at least get a larger number of tweets.
I am planning to design an address validation for users registering in my app. Possibly validating by zipcode and state.
Any idea how to handle addresses from around the globe?
Do I need to insert all the zip codes into the database and then validate the address? Any suggestions for the implementation?
Thanks and Welcome :)
Krisp
Since there is no international standard for zip codes and a list of all zip codes in the world would be out of date before you were finished putting it together, I suggest a smaller approach:
Identify the countries that you will have to handle most and develop separate validation rules for each of them. Make certain that with this you handle the vast majority of your users (e.g. 95% or 98%). For all the other countries, just accept what they enter without further validation.
There are so many different address formats in the world that it is just not worth the effort (if at all possible) to handle them all.
There is MASSIVE variance among address and postal code formats, such that there is not any "standard" way of doing this. See "Frank's Compulsive Guide to Postal Addresses"...
How much/what kind of validation do you really need? If the user is entering their shipping address, for example, they're more likely than you to know what particular format their local postal/shipping provider needs. Just give them a multiline textarea to enter it. If you need parts of it to calculate shipping costs, request just the information you need (City/Country, for example)
Postal Codes can actually be a headache because in some places they can represent very tiny areas as opposed to the US where they often represent relatively large areas (except in a big city where they may represent a few blocks).
Look at Canada: its postal codes can represent very, very tiny areas. Two stores on opposite sides of the street often have different Canadian postal codes. Also, in a list of Canadian businesses, it is not uncommon when merging the list to see the same address with a slightly different postal code, which just indicates that a lot of people get it wrong. On a per-customer basis, I don't know how realistic it is that they actually get their exact postal code right.
http://www.columbia.edu/kermit/postal-ca.html
Basically it seems that each apartment or business dwelling may get its own postal code, which would make sense based upon what I have seen with Canadian business addresses.
The other point is that this is just Canada. Each European country will have its own address/postal code, so will Australia, Russia, etc... If you really want to do address verification, this is a major project.
To actually verify the address you need to verify the postal code, city, and street. In the US, the Census Bureau releases the TIGER database files, which include lists of streets. But for other countries I don't know how you can get a list of streets. It may be best to look into a commercial package (maybe one of the GIS packages, although a lot of them only offer detailed addresses for the US/Canada and sometimes a few European countries).
Perfect address validation can't simply be dropped into an already developed application, but validation of the zip code / postal code can be done per country.
Please check the regexes in the 'supplementalData.xml' file from the CLDR source XML files.
By parsing the XML you can find the corresponding postal-code regular expression for the country code passed at run time, and check whether the entered code matches it for that country.
I have found another answer on this:
Please refer to the Wikipedia link: http://en.wikipedia.org/wiki/List_of_postal_codes
Here you can find the zip-code patterns of most countries, from which you can write regexes and maintain them in a database; this will help you validate zip codes easily and is also an optimized approach!
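A small sketch of that approach (the patterns below are samples for only a handful of countries, written from the kind of list on that Wikipedia page; in practice you would load them from your database):

import java.util.HashMap;
import java.util.Map;
import java.util.regex.Pattern;

public class PostalCodeValidator {

    private static final Map<String, Pattern> PATTERNS = new HashMap<String, Pattern>();
    static {
        // Sample patterns only - extend/correct per country from the Wikipedia list or CLDR data.
        PATTERNS.put("US", Pattern.compile("\\d{5}(-\\d{4})?"));
        PATTERNS.put("CA", Pattern.compile("[A-Za-z]\\d[A-Za-z] ?\\d[A-Za-z]\\d"));
        PATTERNS.put("GB", Pattern.compile("[A-Za-z]{1,2}\\d[A-Za-z\\d]? ?\\d[A-Za-z]{2}"));
        PATTERNS.put("DE", Pattern.compile("\\d{5}"));
    }

    // Returns true if the code matches the known pattern, or if we have no pattern for that country.
    public static boolean isValid(String countryCode, String postalCode) {
        Pattern pattern = PATTERNS.get(countryCode);
        if (pattern == null) {
            return true; // unknown country: accept without further validation
        }
        return pattern.matcher(postalCode.trim()).matches();
    }

    public static void main(String[] args) {
        System.out.println(isValid("US", "90210"));    // true
        System.out.println(isValid("CA", "K1A 0B1"));  // true
        System.out.println(isValid("DE", "1234"));     // false
    }
}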
As many users have mentioned previously, verifying international addresses is basically impossible because there are no standards across countries and many countries don't have the resources for their postal system. Technically speaking, even in the United States, the USPS is struggling.
At a minimum you can offer address verification on a per-country basis. One of the easiest countries, where you get a lot of coverage, is the USA. To do this you need to connect to some kind of address verification web service. There are several companies which have web services for this. One thing to be careful of is ensuring that each provider has geo-distribution of their API, so that any outages on their part don't flow back to you and kill your application. Beyond that, just make sure the results are CASS certified.
In the interest of full disclosure, I'm the founder of SmartyStreets. We have an address verification web service API called LiveAddress. You're more than welcome to contact me personally if you have any questions.