Allow app to communicate with web domain - java

I have a domain that hosts user posts. I plan to create a user-posts-based app like 9gag, and I need the app to be able to communicate with and fetch data hosted on my domain.
Things I need the app to do:
1) Allow users to post pictures through the app.
2) Allow users to leave comments through the app.
3) Allow users to leave 'likes' through the app.
I want the data to be stored on my domain; when a user opens the app, it should fetch this data from the domain and display it to the user. How can I make my app communicate with the domain?
Thanks!

The best way to do this would be to implement an API on your domain that your app can send requests to. I cannot explain all of this in detail here because it would require a lot of space and a full-blown tutorial, but I can tell you what to research and what to implement to make this happen.
First off, you need to create an API for your app to send requests to. I suggest a "RESTful" API, as they are pretty straightforward for the average programmer. Here is a good video that explains what an API is and a little bit of how they are typically implemented: https://youtu.be/7YcW25PHnAA
After you have an API set up, you have to "encode" the information so that it is easy to parse once your app has hold of it. To do this we use a "data-interchange format". One of the big ones in use today is JSON; see its website to learn more: http://www.json.org/ JSON is pretty straightforward and easy to understand if you have a grasp of programming concepts such as objects, strings, and arrays.
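For example, here is a minimal sketch of parsing such a response with the org.json classes that ship with Android; the field names (id, imageUrl, likes) are made up and depend entirely on how you design your API:

import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;

public class PostParser {
    public static void main(String[] args) {
        // Hypothetical response body from your API; the field names are assumptions.
        String body = "[{\"id\":1,\"imageUrl\":\"http://example.com/1.jpg\",\"likes\":42}]";
        try {
            JSONArray posts = new JSONArray(body);
            for (int i = 0; i < posts.length(); i++) {
                JSONObject post = posts.getJSONObject(i);   // one post object
                int id = post.getInt("id");
                String imageUrl = post.getString("imageUrl");
                int likes = post.getInt("likes");
                System.out.println(id + " " + imageUrl + " " + likes);
            }
        } catch (JSONException e) {
            e.printStackTrace();                            // malformed JSON
        }
    }
}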
OK, so you have gotten your information from the server, parsed the JSON, and displayed all your content... now, what do you do when a user gives a thumbs up or comments on something? This is also implemented via the API, and this part should be the easiest for you. It involves wrapping up the required data (content ID, user ID, and what they did, i.e. liked the content) and sending it via an HTTP request, just like how you got your information in the first place. But instead of reading the response data, now we are just sending the HTTP request from the app, and we don't care what happens next (at the app level); it's up to the server to record the data from that HTTP request.
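To make that concrete, here is a rough sketch of sending a 'like' with HttpURLConnection; the endpoint URL and parameter names are invented and would be whatever your API defines:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class LikeSender {
    public static void sendLike(int contentId, int userId) throws Exception {
        // Hypothetical endpoint on your domain; adjust to whatever your API defines.
        URL url = new URL("http://www.example.com/api/like");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");

        String body = "content_id=" + contentId + "&user_id=" + userId;
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes(StandardCharsets.UTF_8));
        }

        // We only check the status; the server is responsible for recording the like.
        int status = conn.getResponseCode();
        System.out.println("Server responded with HTTP " + status);
        conn.disconnect();
    }
}

On Android you would run something like this off the main thread (for example from an AsyncTask or a background thread).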
I would highly suggest looking up how to create an API and working through some tutorials... a lot of tutorials out there want you to modify the .htaccess file on the server, and this is not really necessary (boy, I hope I don't get crucified for saying that; fellow Stack Overflow citizens, if you disagree, please explain your reasoning). Obviously for a large mainstream website the whole .htaccess setup might be a good idea, but for a beginner, I don't think it is really needed.

How to automatically get new mails from gmail account using java?

I know my question is more likely to be discussed rather than really answered (because it's very broad), but I need some pointers and advice to really get started, sorry for that.
So I've got a Java program which has to analyse some URLs; I've finished this part. For the moment, the user has to enter the link into my program manually, and then the link is analysed.
Now I have to retrieve those links automatically; they will be sent to me via e-mail (to a special Gmail inbox created for that purpose).
So I need to:
Let my Java program "listen" to my inbox
Extract the link from any new mail in order to analyse it
There are many issues depending on how I choose to access my inbox (POP3, Gmail API...), how frequently my application should check the inbox, possible authentication problems... even how to let my application run as a "daemon"...
And I really don't know how to get started, which choices to make, etc.
Any help is welcome of course, if you have any documentation or else. Thank you in advance.
You have several ways of doing this.
I suggest using JavaMail, which has a simple and useful API.
You have some documentation and examples in this URL:
https://java.net/projects/javamail/pages/Home#Samples
Look at the class monitor (a very bad name!!!) inside the examples. This class monitors a mailbox for new emails.
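As a rough sketch (assuming IMAP access is enabled on the Gmail account; imap.gmail.com is Gmail's standard IMAP host, and the credentials below are placeholders), checking the inbox for unread messages with JavaMail looks roughly like this:

import java.util.Properties;
import javax.mail.Flags;
import javax.mail.Folder;
import javax.mail.Message;
import javax.mail.Session;
import javax.mail.Store;
import javax.mail.search.FlagTerm;

public class InboxChecker {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("mail.store.protocol", "imaps");           // IMAP over SSL

        Session session = Session.getInstance(props);
        Store store = session.getStore("imaps");
        store.connect("imap.gmail.com", "you@gmail.com", "app-password"); // placeholders

        Folder inbox = store.getFolder("INBOX");
        inbox.open(Folder.READ_ONLY);

        // Fetch every message that does not have the SEEN flag set yet.
        Message[] unread = inbox.search(new FlagTerm(new Flags(Flags.Flag.SEEN), false));
        for (Message msg : unread) {
            System.out.println(msg.getSubject());            // extract links from the body here
        }

        inbox.close(false);
        store.close();
    }
}

To get the "daemon" behaviour, you could run this check on a schedule (for example with a ScheduledExecutorService) and extract the links from each new message's body.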

Is there a way to send clients or users Logcat or exceptions to me?

Hi,
I'm working on my second app, and this one is a little more complex; it's a backup tool.
On some devices it works perfectly, on others it doesn't.
I would like to know if there is a way to build my own exception catcher that sends exceptions or logcats to my server, so that I can receive them and know whether my app needs a fix, etc.
Greetings and thanks
David
As pointed out, you can use ACRA. The site states:
ACRA catches exceptions, retrieves lots of context data and sends them
to the backend of your choice.
and that is what you seem to need. The Quick Setup Guide is there on the homepage itself.
You can also look into BugSense, which you can use as a backend for ACRA:
If you are using ACRA, you can use BugSense as your backend. The only change you need to make is to specify BugSense's URL and your API key in formUri:
@ReportsCrashes(formUri =
"http://www.bugsense.com/api/acra?api_key=YOUR_API_KEY", formKey = "")

Interacting with an AJAX site from Java

I am trying to download the contents of a site. The site is a Magento site where one can filter results by selecting properties on the sidebar. See zennioptical.com for a good example.
Using zennioptical.com as an example, I need to download all the rectangular glasses, or all the plastic ones, etc.
So how do I send a request to the server to display only the rectangular frames, etc.?
Thanks so much
The basic answer is that you need to do an HTTP GET request with the correct query params. I'm not totally sure how you are trying to do this based on your question, so here are two options.
If you are trying to do this from JavaScript you can look at this question. It has a bunch of answers that show how to perform AJAX GETs with the built-in XMLHttpRequest or with jQuery.
If you are trying to download the page from a Java application, this really doesn't involve AJAX at all. You'll still need to do a GET request, but now you can look at this other question for some ideas.
Whether you are using JavaScript or Java, the hard part is going to be figuring out the right URLs to query. If you are trying to scrape someone else's site, you will have to see what URLs your browser is requesting when you filter the results. One of the easiest ways to see that info is in Firefox with the Web Console, found at Tools->Web Developer->Web Console. You could also download something like Wireshark, which is a good tool to have around, but probably overkill for what you need.
EDIT
For example, when I clicked the "rectangle frames" option at zenni optical, this is the query that fired off in the Web Console:
[16:34:06.976] GET http://www.zennioptical.com/?prescription_type=single&frm_shape%5B%5D=724&nav_cat_id=2&isAjax=true&makeAjaxSearch=true [HTTP/1.1 200 OK 2328ms]
You'll have to do a sufficient number of these to figure out how to generate the URLs to get the results you want.
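Once you have a URL like the one above, fetching it from a Java application is an ordinary GET; here is a small sketch with HttpURLConnection that simply prints whatever the server returns:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class FrameFetcher {
    public static void main(String[] args) throws Exception {
        // The query string captured from the Web Console above.
        URL url = new URL("http://www.zennioptical.com/?prescription_type=single"
                + "&frm_shape%5B%5D=724&nav_cat_id=2&isAjax=true&makeAjaxSearch=true");

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");

        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);   // raw HTML/JSON fragment returned by the site
            }
        }
        conn.disconnect();
    }
}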
DISCLAIMER
If you are downloading someone else's data, it would be best to check with them first. The owner of the server may not appreciate what they might consider stealing their data/work. And depending on how you use the data you pull down, you could be venturing into all sorts of ethical issues... Then again, if you are downloading from your own site, go for it.

Android app that accesses database on server machine

I'm developing an Android app that accesses databases stored on a server machine. I've done a bit of googling and had a look through some of my programming books, but can't find much information.
SQLite seems like the right way to go, but I can't find anything about databases stored on a server. I'd appreciate it if somebody could point me in the right direction.
Thanks
What kind of server are you talking about?
The simplest case is that you write a server program to expose some information in your database. Then, you can do a GET request from your Android app to fetch that data.
If the data is more complex and structured, your server can return a JSON or XML object, which you then parse on the client.
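As one possible sketch of the server side (assuming a Java servlet container and a JDBC-accessible database; the table, columns, and connection details are all made up), you could expose a table as JSON like this:

import java.io.IOException;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.json.JSONArray;
import org.json.JSONObject;

// Hypothetical servlet that exposes a "notes" table as JSON; all names are assumptions.
public class NotesServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        JSONArray notes = new JSONArray();
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:mysql://localhost/mydb", "user", "password");
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT id, text FROM notes")) {
            while (rs.next()) {
                JSONObject note = new JSONObject();
                note.put("id", rs.getInt("id"));
                note.put("text", rs.getString("text"));
                notes.put(note);
            }
        } catch (Exception e) {
            resp.sendError(500, e.getMessage());   // database or JSON failure
            return;
        }
        resp.setContentType("application/json");
        resp.getWriter().write(notes.toString());  // the Android app GETs this URL and parses it
    }
}

Your Android app then just does a GET against the servlet's URL and parses the JSON array it gets back.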

Creating a knowledge base on top of provided webpages as a feed

I have some issues with my part of our final year project. We are implementing a plagiarism detection framework, and I'm working on the internet-sources detection part. My internet search algorithm is currently complete, but I need to enhance it so that the internet search delay is reduced.
My idea is like this:
First, the user is prompted to insert some web links as the initial knowledge feed for the system.
Then it crawls through the internet and expands its knowledge.
Once the knowledge is fetched, the system doesn't need to query the internet again. Can someone provide me some guidance on implementing this? We are using Java, but any abstract detail will surely help me.
If the server-side programming is in your hands, you can keep a table in the database with a boolean flag that records whether the details have been read before. Every time your client connects to the server, it checks the boolean first; if the boolean is false, updates need to be sent to the client, otherwise no updates are sent.
The boolean becomes true every time the client downloads any data from the server, and becomes false whenever the database is updated.
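A tiny sketch of that server-side check (assuming JDBC and a made-up feed_state table with an is_read flag; the connection details are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class UpdateCheck {
    // Returns true if the stored flag is false, i.e. the client still needs updates.
    public static boolean clientNeedsUpdate(String clientId) throws SQLException {
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:mysql://localhost/kb", "user", "password");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT is_read FROM feed_state WHERE client_id = ?")) {
            ps.setString(1, clientId);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next() && !rs.getBoolean("is_read");
            }
        }
    }
}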
I'm not quite sure that I understand what you're asking. Anyway:
if you're looking for a Java web crawler, then I recommend that you read this question
if you're looking for Java libraries to build a knowledge base (KB), then it really depends on (1) what kind of properties your KB should have, and (2) what kind of reasoning capabilities you expect from your KB. One option is to use the Jena framework, but this requires that you're comfortable with Semantic Web formalisms.
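If you do go the Jena route, here is a minimal sketch of recording one fact about a crawled page (package names assume a recent Apache Jena release; older versions used com.hp.hpl.jena.*, and the resource URI is made up):

import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.rdf.model.Resource;
import org.apache.jena.vocabulary.DC;

public class KbExample {
    public static void main(String[] args) {
        Model model = ModelFactory.createDefaultModel();           // in-memory RDF model

        // Record one crawled page and its title as a Dublin Core property.
        Resource page = model.createResource("http://example.org/some-crawled-page");
        page.addProperty(DC.title, "Some crawled page");

        model.write(System.out, "TURTLE");                         // dump the knowledge base
    }
}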
Good luck!
