How to make a File Sharing website [closed] - java

What computer languages would I need to create a file sharing website? I think I would use HTML, CSS, Javascript, and PHP? Also how would I go about doing so?

You would most likely need a server-side language; PHP fits that role, together with a database you query through SQL. HTML and CSS will be required, and JavaScript is a great plus.
Mainly, what you would do is call is_uploaded_file and move_uploaded_file after a form POST. See here for more information. You'd store the file's name, its location on the server, its uploader, etc. in your database.
You'll also need an authentication system: the most primitive kind makes do with session variables, while a more advanced one uses your SQL database, a hash function and the like.
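Since this page is tagged Java, here is a rough sketch of that upload flow using the Servlet API rather than PHP's move_uploaded_file; the form field name "file" and the /var/uploads directory are assumptions, not anything prescribed above.

```java
import java.io.IOException;
import java.nio.file.Paths;

import javax.servlet.ServletException;
import javax.servlet.annotation.MultipartConfig;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.Part;

// Handles the multipart form POST described above (field name "file" is assumed).
@WebServlet("/upload")
@MultipartConfig
public class UploadServlet extends HttpServlet {
    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        Part filePart = req.getPart("file");               // the uploaded file
        String fileName = Paths.get(filePart.getSubmittedFileName())
                               .getFileName().toString();  // strip any client-side path
        // Store the file on the server; "/var/uploads" is a placeholder location.
        filePart.write("/var/uploads/" + fileName);
        // A real site would also record fileName, its location and the uploader's id
        // in the database here.
        resp.getWriter().println("Uploaded " + fileName);
    }
}
```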
Good luck!

Related

How to take screenshots of a web page and make a video of them using code (any language)? [closed]

I have a table on my web page with thousands of entries, and it shows 20 entries at a time. I want to take screenshots of all the entries, page by page, and then create a video of them using code. What would be the best language and method for this task? I know nothing about this, so I am open to any language like Python, Java, Go, etc.
You can use the Selenium library in either Java or Python; it has tools for these operations.
Here is an example.
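For instance, a minimal Java sketch with selenium-java that pages through the table and saves one screenshot per page; the URL, the ".next" button selector and the page count are assumptions, and stitching the images into a video is left to an external tool such as ffmpeg.

```java
import java.io.File;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

import org.openqa.selenium.By;
import org.openqa.selenium.OutputType;
import org.openqa.selenium.TakesScreenshot;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class TableScreenshots {
    public static void main(String[] args) throws Exception {
        // Assumes chromedriver is available on the PATH.
        WebDriver driver = new ChromeDriver();
        driver.get("https://example.com/table");               // placeholder URL

        for (int page = 0; page < 50; page++) {                 // page count is an example
            File shot = ((TakesScreenshot) driver).getScreenshotAs(OutputType.FILE);
            Files.copy(shot.toPath(),
                       Paths.get(String.format("page-%03d.png", page)),
                       StandardCopyOption.REPLACE_EXISTING);
            driver.findElement(By.cssSelector(".next")).click(); // assumed "next" button
        }
        driver.quit();
        // The numbered PNGs can then be combined into a video, e.g.:
        // ffmpeg -framerate 2 -i page-%03d.png table.mp4
    }
}
```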

How to write a web browser in java [closed]

I am an amateur Java developer and was wondering if anybody knows how to write a web browser in Java, if that is possible.
If you need a component to render HTML, JEditorPane in the javax.swing library is capable of doing it for basic HTML. You could mock browser behavior by adding an input for entering a URL, getting markup, and setting it to the editor pane. Tracking URL history and that sort of thing would be up to you, and it could be a decent project for learning a lot about Swing, event handling, concurrency, etc...
If you're looking at building your own rendering engine for HTML, CSS, and JavaScript... that's a much larger problem.
http://docs.oracle.com/javase/7/docs/api/javax/swing/JEditorPane.html
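A minimal sketch of the JEditorPane approach described above: an address field plus a non-editable editor pane that loads whatever URL you enter. The start page is just a placeholder.

```java
import java.awt.BorderLayout;

import javax.swing.JEditorPane;
import javax.swing.JFrame;
import javax.swing.JScrollPane;
import javax.swing.JTextField;
import javax.swing.SwingUtilities;

public class MiniBrowser {
    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JEditorPane pane = new JEditorPane();
            pane.setEditable(false);                  // render only, don't allow editing

            JTextField address = new JTextField("https://example.com"); // placeholder URL
            address.addActionListener(e -> {          // load the URL when Enter is pressed
                try {
                    pane.setPage(address.getText());
                } catch (Exception ex) {
                    pane.setText("Could not load page: " + ex.getMessage());
                }
            });

            JFrame frame = new JFrame("Mini browser");
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.add(address, BorderLayout.NORTH);
            frame.add(new JScrollPane(pane), BorderLayout.CENTER);
            frame.setSize(800, 600);
            frame.setVisible(true);
        });
    }
}
```

Keeping a history of visited URLs, back/forward buttons and so on would sit on top of this, which is where the Swing and event-handling practice comes in.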

Access website without browser [closed]

I am just learning Java. I want to make a simple application to access a website.
There is a website I want to log in to through Java and then interact with through my own interface: basically, after logging in, I would write into some text boxes and submit the text.
I have tried in many places and studied the HTTP protocol, but I still can't make it work.
Can someone help me out?
Accessing a website, logging in and interacting with its forms is somewhat complex work, so it might not be the best choice for a first Java project.
But if you want to do it, you should probably use Apache HttpComponents/HttpClient.
There are useful examples at the above link as well, which may help you get started.
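As a rough sketch of what a login with HttpClient (HttpComponents 4.x) looks like: the URL and the form field names below are assumptions and depend entirely on the site's actual login form.

```java
import java.util.Arrays;

import org.apache.http.client.entity.UrlEncodedFormEntity;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.message.BasicNameValuePair;
import org.apache.http.util.EntityUtils;

public class LoginExample {
    public static void main(String[] args) throws Exception {
        // The client keeps cookies between requests, so the session survives the login.
        try (CloseableHttpClient client = HttpClients.createDefault()) {
            HttpPost login = new HttpPost("https://example.com/login"); // placeholder URL
            login.setEntity(new UrlEncodedFormEntity(Arrays.asList(
                    new BasicNameValuePair("username", "me"),      // assumed field names
                    new BasicNameValuePair("password", "secret"))));

            try (CloseableHttpResponse response = client.execute(login)) {
                System.out.println(response.getStatusLine());
                System.out.println(EntityUtils.toString(response.getEntity()));
            }
            // Further POSTs to the site's forms (the "text boxes") reuse the same client.
        }
    }
}
```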

Web Crawler's Functionality [closed]

Does a web crawler return only the text extracted from web pages? Say there are some PDF/DOC files stored on the web server as well; can a web crawler crawl through them and return their content too? Also, what are some suggestions for a good open-source Java web crawler?
Thank You!
A web crawler doesn't extract the text. It simply returns the HTML, with some transformations applied (UTF-8 conversion, for example).
Looked at that way, the file type doesn't matter to the crawler on the first hop. To follow further hops it would need to look inside those documents, and typical crawlers don't follow links inside PDF/DOC files.
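A small sketch of that first hop: fetch a URL and only treat the response as HTML (and collect further links) when the Content-Type says so. Link extraction here uses jsoup, which is one choice among several; the start URL is a placeholder.

```java
import java.net.HttpURLConnection;
import java.net.URL;

import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;

public class SingleHopFetch {
    public static void main(String[] args) throws Exception {
        URL url = new URL("https://example.com/");             // placeholder start page
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        String contentType = conn.getContentType();            // e.g. "text/html; charset=UTF-8"

        if (contentType != null && contentType.startsWith("text/html")) {
            // HTML: parse it and list the outgoing links for the next hop.
            Document doc = Jsoup.parse(conn.getInputStream(), "UTF-8", url.toString());
            for (Element link : doc.select("a[href]")) {
                System.out.println(link.absUrl("href"));
            }
        } else {
            // PDF/DOC/...: the raw bytes come back, but nothing looks inside them here.
            System.out.println("Fetched non-HTML content: " + contentType);
        }
        conn.disconnect();
    }
}
```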

How to clean private data from browsers(temp files, caches, cookies) using Java on Windows [closed]

I am thinking of writing something like a file cleaner for Windows in Java, just for fun. I want it to be able to clean the temp files, caches, etc. for the major browsers: Opera, IE, Firefox.
I'm basically looking for some guidance on where to start my research, what some good reads are, and what kind of libraries I will have to use (if any).
I think you should take a look at BleachBit. It's written in Python, but the actions should be easy to translate to Java. But then again, why not just use BleachBit itself?
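If you do roll your own, the core of it is just walking each browser's cache directory and deleting what you find; a minimal Java sketch is below, where the Firefox cache path is a placeholder that varies per machine and profile.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Comparator;
import java.util.stream.Stream;

public class CacheCleaner {
    public static void main(String[] args) throws IOException {
        // Placeholder path: real profile directory names differ on every machine.
        Path cache = Paths.get(System.getenv("LOCALAPPDATA"),
                "Mozilla", "Firefox", "Profiles", "example.default", "cache2");

        if (!Files.exists(cache)) {
            System.out.println("Nothing to clean at " + cache);
            return;
        }
        // Delete children before parents (reverse order), skipping files in use.
        try (Stream<Path> files = Files.walk(cache)) {
            files.sorted(Comparator.reverseOrder())
                 .filter(p -> !p.equals(cache))      // keep the cache directory itself
                 .forEach(p -> {
                     try {
                         Files.delete(p);
                     } catch (IOException e) {
                         System.out.println("Skipped (in use?): " + p);
                     }
                 });
        }
    }
}
```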
