I am trying to make a simple application on Facebook.
There will only be one user using this: me.
Say I have 10 photos to upload every day. I thought I'd make an application that simply takes every image in a directory and uploads it to a page, a fan page, call it "X".
I started out by making an app in developer.facebook.com, say "X app".
Then I got the facebook4j library and tried posting a status as the page, using manage_pages permission.
I am stuck at many points in this attempt and just wanted to make sure this is the right way to do it, or am I doing something entirely wrong perhaps?
There is already a complete answer to this question here (even with the needed source code).
However, if you need to do this automatically, create a PHP script that will take N new photos from a directory, upload them to your Facebook fan page, and move or delete them once they are successfully online.
Then just use crontab on your server to call this script every day at a chosen hour.
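If you would rather stay in Java, since the question above mentions facebook4j, a rough sketch of the upload loop could look like the following. The app credentials, page access token, and directory path are placeholders, and the token must be a page token with permission to publish to the page.

```java
import facebook4j.Facebook;
import facebook4j.FacebookFactory;
import facebook4j.Media;
import facebook4j.auth.AccessToken;

import java.io.File;

public class PagePhotoUploader {
    public static void main(String[] args) throws Exception {
        // Placeholder credentials -- replace with your app's values and a page access token
        String appId = "YOUR_APP_ID";
        String appSecret = "YOUR_APP_SECRET";
        String pageAccessToken = "YOUR_PAGE_ACCESS_TOKEN"; // a token for the page, not the user

        Facebook facebook = new FacebookFactory().getInstance();
        facebook.setOAuthAppId(appId, appSecret);
        facebook.setOAuthAccessToken(new AccessToken(pageAccessToken));

        File dir = new File("/path/to/photos"); // placeholder directory
        File[] photos = dir.listFiles((d, name) -> name.toLowerCase().endsWith(".jpg"));
        if (photos == null) return;

        for (File photo : photos) {
            // postPhoto publishes to whatever account the access token belongs to,
            // so a page token posts the photo to the page
            String photoId = facebook.postPhoto(new Media(photo));
            System.out.println("Uploaded " + photo.getName() + " as " + photoId);
            // move or delete the file here once the upload has succeeded
        }
    }
}
```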
Greetings of the day.
I know you have a busy schedule, but could you spare 10 to 20 minutes for a discussion? I have a doubt and did some Google searching for it; there is some information out there, but it is not very clear.
I am sure that if this doubt is resolved and we find a solution, it will help us on each and every project. It may be a time-consuming solution, but it only needs to be worked out once.
Doubt: Can we make an application that hits a URL, takes the response, and, according to that response, performs commands to read, write, or modify its own files? In Java or another language.
In short: can code changes take effect without building a new binary, the way it happens on the web?
Example: in web development we write and save a code file at the host location. The browser hits the URL and shows the output of the newly written code. If we add a new button to our website, it is reflected as soon as the file is saved.
I want to do the same thing in Java and Android, or Dart.
Issue to solve: the review time taken by the Play Store, and the fact that we need to build a new APK for each update.
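As a minimal illustration of the "hit a URL and act on the response" part in plain Java (the URL and file name below are hypothetical), the sketch fetches a small configuration file and stores it locally so the app can change texts, flags, or layouts at runtime. Note that this only covers data and configuration; compiled code still has to ship inside the APK.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

public class RemoteConfigFetcher {
    public static void main(String[] args) throws Exception {
        // Hypothetical URL serving a small config file that you control
        URL url = new URL("https://example.com/app-config.json");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");

        // Read the whole response body
        StringBuilder body = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                body.append(line).append('\n');
            }
        }

        // Store the response locally; the app reads this file at startup and
        // adjusts its behaviour without a new build being shipped
        Files.write(Paths.get("config.json"), body.toString().getBytes(StandardCharsets.UTF_8));
        System.out.println("Fetched " + body.length() + " characters of configuration");
    }
}
```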
I have a problem that doesn't seem to be answered clearly on StackOverflow. I want to download a page using Java and retrieve some data from it in order to feed some values to an application that I am developing. The page is a betting site, so it contains JavaScript methods that change the betting values.
In order to do some tests, I downloaded the page manually using Ctrl-S and then wrote a program (with FileReader, BufferedReader, etc.) which retrieves the data. This worked perfectly. So I thought I would make a bash script that downloads the page every time the user opens my application.
After that I searched for methods that download the page programmatically (I used Jsoup, URL, ...). What I noticed is that the JavaScript variable values couldn't be printed because the JavaScript code wasn't executed.
What I want to know is whether there is some way to programmatically download the executed website (i.e., with the JavaScript values already evaluated) without having to run a bash script every time before someone opens my app.
Try HtmlUnit. It is used for automatic testing but should fit your purpose well too!
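For example, a minimal HtmlUnit sketch (the URL below is a placeholder) that lets the page's JavaScript run before grabbing the resulting HTML:

```java
import com.gargoylesoftware.htmlunit.WebClient;
import com.gargoylesoftware.htmlunit.html.HtmlPage;

public class RenderedPageDownloader {
    public static void main(String[] args) throws Exception {
        try (WebClient webClient = new WebClient()) {
            // Let the page's JavaScript run, and don't abort on third-party script errors
            webClient.getOptions().setJavaScriptEnabled(true);
            webClient.getOptions().setThrowExceptionOnScriptError(false);

            // Placeholder URL -- replace with the betting page you need
            HtmlPage page = webClient.getPage("https://example.com/odds");

            // Give background JavaScript (AJAX calls etc.) a few seconds to finish
            webClient.waitForBackgroundJavaScript(5000);

            // asXml() returns the DOM *after* the scripts have modified it
            System.out.println(page.asXml());
        }
    }
}
```

From there you can parse the returned markup just as you did with the manually saved file.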
I am trying to download the contents of a site. The site is a Magento site where one can filter results by selecting properties in the sidebar. See zennioptical.com for a good example.
So, using zennioptical.com as an example, I need to download all the rectangular glasses, or all the plastic ones, etc.
So how do I send a request to the server to display only the rectangular frames, etc.?
Thanks so much
Your basic answer is that you need to do an HTTP GET request with the correct query params. I'm not totally sure how you are trying to do this based on your question, so here are two options.
If you are trying to do this from JavaScript you can look at this question. It has a bunch of answers that show how to perform AJAX GETs with the built-in XMLHttpRequest or with jQuery.
If you are trying to download the page from a java application, this really doesn't involve AJAX at all. You'll still need to do a GET request but now you can look at this other question for some ideas.
Whether you are using javascript or java, the hard part is going to be figuring out the right URLs to query. If you are trying to scrape someone else's site you will have to see what URLs your browser is requesting when you filter the results. One of the easiest ways to see that info is in Firefox with the Web Console found at Tools->Web Developer->Web Console. You could also download something like Wireshark which is a good tool to have around, but probably overkill for what you need.
EDIT
For example, when I clicked the "rectangle frames" option at zenni optical, this is the query that fired off in the Web Console:
[16:34:06.976] GET http://www.zennioptical.com/?prescription_type=single&frm_shape%5B%5D=724&nav_cat_id=2&isAjax=true&makeAjaxSearch=true [HTTP/1.1 200 OK 2328ms]
You'll have to do a sufficient number of these to figure out how to generate the URLs to get the results you want.
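If you end up doing this from Java, a minimal sketch that fires the same kind of GET from code could look like the following. The query string is copied from the Web Console capture above and may well have changed since, so treat the parameter names and values as examples only.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class FilteredResultsDownloader {
    public static void main(String[] args) throws Exception {
        // Query string taken from the Web Console capture above (may be out of date)
        String query = "prescription_type=single&frm_shape%5B%5D=724"
                + "&nav_cat_id=2&isAjax=true&makeAjaxSearch=true";
        URL url = new URL("http://www.zennioptical.com/?" + query);

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");

        // Read the response body returned for the filtered result set
        StringBuilder response = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                response.append(line).append('\n');
            }
        }
        System.out.println(response);
    }
}
```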
DISCLAIMER
If you are downloading someone else's data, it would be best to check with them first. The owner of the server may not appreciate what they might consider stealing their data/work. And then, depending on how you use the data you pull down, you could be venturing into all sorts of ethical issues... Then again, if you are downloading from your own site, go for it.
I want to build an Android application that downloads an XML file from a web server and displays its contents in a readable format.
My problem is generating that XML file. Basically I want to run a program, say, every 30 minutes that downloads a web page (as that data is not easily accessible), parses it, generates said XML file and puts it somewhere for the Android application to download.
Now, I was writing a Java application to do this, but it came to me: where am I going to run this? I thought of having a laptop permanently running at home, but there must be a better alternative.
I have online hosting, but it is very simple. It does not even include SSH.
Any ideas?
Edit: as per your suggestions, I checked and yes, my cPanel does have a "Cron Jobs" section. I will now investigate it. Thank you so much for your help.
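For the download-parse-generate step itself, a rough Java sketch using Jsoup might look like the one below; the source URL, the CSS selector, and the output path are placeholders, and in real use the extracted text should be XML-escaped. Whatever ends up scheduling it (cron or a service like the one suggested below) just needs to run it every 30 minutes.

```java
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;

import java.io.PrintWriter;
import java.nio.charset.StandardCharsets;

public class FeedGenerator {
    public static void main(String[] args) throws Exception {
        // Placeholder source page -- the page whose data is not easily accessible
        Document page = Jsoup.connect("https://example.com/data").get();

        // Write a simple XML file into a web-served directory (placeholder path)
        try (PrintWriter out = new PrintWriter("public_html/feed.xml", StandardCharsets.UTF_8.name())) {
            out.println("<?xml version=\"1.0\" encoding=\"UTF-8\"?>");
            out.println("<items>");
            for (Element row : page.select("div.item")) { // placeholder selector
                out.println("  <item>" + row.text() + "</item>");
            }
            out.println("</items>");
        }
    }
}
```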
http://www.setcronjob.com/ allows you to trigger a web page request once every hour, which might be good enough.
I have not actually tried it, but it sounds like a good solution.
You need to rent a server that will generate your XML/HTML and also serve the content to your app. It's not expensive if you get a VPS or cloud server.
Although I've been programming for a few years, I've only really dabbled in the web side of things; it's been more application-based work for computers up until now. I was wondering, in Java for example, what library-defined function or self-defined function I would use to have a program launch a web browser to a certain site? Also, as an extension to this, how could I have it find a certain field on the website, like a search box for instance (if it isn't the current target of the cursor), and then populate it with a string and submit it to the server? (Maybe this is a kind of find-by-ID scenario?)
Also, is there a way to control whether this is visible to the user or not? What I mean is, if I want to do something as a background task whilst the user carries on using the program, I want the program to submit data to a webpage without the whole visual side of things that would interrupt the user.
This may be basic, but like I say, I've never tried my hand at it, so if someone could just provide some rough code outlines I'd really appreciate it.
Many thanks
I think Selenium might be what you are looking for.
Selenium allows you to start a Web browser, point it at a certain website and interact with it. There is a Java API (and APIs for a lot of other languages, by the way) allowing you to control the launched browser from a Java application.
There is some tweaking to do, but you can also run Selenium in the background, using a headless Web browser.
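A rough sketch of what that could look like with Selenium's Firefox driver; the site URL and the element id are placeholders, and the "-headless" flag is what keeps the browser invisible to the user:

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.firefox.FirefoxOptions;

public class SearchSubmitter {
    public static void main(String[] args) {
        // Run Firefox headless so nothing pops up in front of the user
        FirefoxOptions options = new FirefoxOptions();
        options.addArguments("-headless");

        WebDriver driver = new FirefoxDriver(options);
        try {
            // Placeholder site and element id -- inspect the real page to find them
            driver.get("https://example.com");
            WebElement searchBox = driver.findElement(By.id("search"));
            searchBox.sendKeys("my query");
            searchBox.submit(); // submits the form the field belongs to
        } finally {
            driver.quit();
        }
    }
}
```

You would run this from a separate thread so the rest of your application stays responsive.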
As I understand it, you want to submit data to a server via the existing web interface?
In that case you need to find out how the URL for the request is built and then make an HTTP call using the corresponding URL.
I advise reading this if it involves a POST submit.
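As a rough illustration, a plain-Java POST could look like the sketch below; the endpoint and the field names are hypothetical, so copy the real ones from the form's action attribute and its input names:

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class FormPoster {
    public static void main(String[] args) throws Exception {
        // Hypothetical form fields -- use the input names from the real page
        String form = "field1=" + URLEncoder.encode("some value", "UTF-8")
                + "&field2=" + URLEncoder.encode("another value", "UTF-8");

        // Hypothetical endpoint -- use the form's action URL
        HttpURLConnection conn = (HttpURLConnection)
                new URL("https://example.com/submit").openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");

        // Send the urlencoded body, exactly as a browser form submit would
        try (OutputStream out = conn.getOutputStream()) {
            out.write(form.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("Server responded with HTTP " + conn.getResponseCode());
    }
}
```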