I have stored some data in MongoDB (file name, URL, keywords) and implemented a search for particular files. I can display the results in a JSP page line by line, but my requirement is to display all the results in a folder structure; for example, a Java folder should contain the Java-related files.
How can I do this?
For display purposes you can make use of jsTree:
jsTree is a JavaScript-based, cross-browser tree component. It is packaged as a jQuery plugin. jsTree is absolutely free (licensed the same as jQuery, under the terms of the MIT License).
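jsTree only handles the rendering; the server still has to group the search results into the nested node format jsTree expects. Below is a minimal sketch of that server side, assuming the MongoDB Java driver and Jackson are available; the database name, the collection name `files`, and the fields `fileName`, `url`, and `keyword` are assumptions based on the description in the question.

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import org.bson.Document;

import java.util.*;

public class FileTreeBuilder {

    // Builds the JSON that jsTree expects: one node per keyword ("folder"),
    // with the matching files as its children.
    public static String buildTreeJson() throws Exception {
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            MongoCollection<Document> files = client.getDatabase("search")    // assumed DB name
                                                    .getCollection("files");  // assumed collection

            // Group file entries by their keyword field.
            Map<String, List<Map<String, Object>>> byKeyword = new TreeMap<>();
            for (Document doc : files.find()) {
                String keyword = doc.getString("keyword");   // e.g. "Java"
                Map<String, Object> leaf = new LinkedHashMap<>();
                leaf.put("text", doc.getString("fileName"));
                leaf.put("a_attr", Collections.singletonMap("href", doc.getString("url")));
                byKeyword.computeIfAbsent(keyword, k -> new ArrayList<>()).add(leaf);
            }

            // Convert the grouped map into jsTree's nested node format.
            List<Map<String, Object>> tree = new ArrayList<>();
            for (Map.Entry<String, List<Map<String, Object>>> e : byKeyword.entrySet()) {
                Map<String, Object> folder = new LinkedHashMap<>();
                folder.put("text", e.getKey());       // folder node, e.g. "Java"
                folder.put("children", e.getValue()); // file nodes inside that folder
                tree.add(folder);
            }
            return new ObjectMapper().writeValueAsString(tree);
        }
    }
}
```

The resulting JSON string can be written into the JSP and passed to jsTree through its core.data option, which accepts exactly this kind of nested node array.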
I have a table on my web page with thousands of entries, and it shows 20 entries at a time. I want to take screenshots of all the entries page by page and then create a video of them using code. What would be the best language and method for this task? I know nothing about this, so I am open to any language, such as Python, Java, Go, etc.
You can use the Selenium library in either Java or Python; it has tools for doing these operations.
Here is an example.
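A rough sketch of how this could look in Java with Selenium, assuming ChromeDriver is installed and the table has a clickable "next page" control; the URL, the `a.next` selector, and the page count are assumptions. It saves one screenshot per page and then stitches them into a video by calling ffmpeg, which has to be installed separately.

```java
import org.openqa.selenium.By;
import org.openqa.selenium.OutputType;
import org.openqa.selenium.TakesScreenshot;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

import java.io.File;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class TableScreenshots {
    public static void main(String[] args) throws Exception {
        WebDriver driver = new ChromeDriver();
        driver.get("https://example.com/table");          // assumed URL

        int totalPages = 50;                               // assumed number of pages
        for (int page = 1; page <= totalPages; page++) {
            // Capture the current page of the table as a PNG file.
            File shot = ((TakesScreenshot) driver).getScreenshotAs(OutputType.FILE);
            Path target = Paths.get(String.format("frames/page_%04d.png", page));
            Files.createDirectories(target.getParent());
            Files.copy(shot.toPath(), target, StandardCopyOption.REPLACE_EXISTING);

            // Move to the next page of entries; "a.next" is an assumed selector.
            driver.findElement(By.cssSelector("a.next")).click();
            Thread.sleep(1000);                            // crude wait for the next page to render
        }
        driver.quit();

        // Stitch the frames into a video with ffmpeg (must be on the PATH).
        new ProcessBuilder("ffmpeg", "-framerate", "2",
                "-i", "frames/page_%04d.png", "table.mp4")
                .inheritIO().start().waitFor();
    }
}
```

In a real run you would replace the fixed sleep with an explicit Selenium wait and stop when the "next" control disappears, but the overall flow stays the same.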
I am developing a financial website where, on clicking a PDF or Excel option, I need to convert the existing data into a PDF that then gets downloaded. Can you help me with how to start on this?
Can it be done through scripting, or does it require Java?
Thanks
To convert it into a PDF, refer to the Mozilla plugin called Page Saver; a command like "firefox --display localhost:2.0 -saveimage" will convert the HTML to an image file, which can then be converted into a PDF easily.
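If you would rather generate the PDF on the server side (which does require Java, unlike the browser-plugin route above), a minimal sketch with Apache PDFBox 2.x could look like the following. The servlet mapping, the placeholder text, and the choice of PDFBox are assumptions, not part of the answer above.

```java
import org.apache.pdfbox.pdmodel.PDDocument;
import org.apache.pdfbox.pdmodel.PDPage;
import org.apache.pdfbox.pdmodel.PDPageContentStream;
import org.apache.pdfbox.pdmodel.font.PDType1Font;

import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.io.IOException;

@WebServlet("/download-pdf")                       // assumed URL mapping
public class PdfDownloadServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        // Tell the browser to download the response as a PDF file.
        resp.setContentType("application/pdf");
        resp.setHeader("Content-Disposition", "attachment; filename=report.pdf");

        try (PDDocument doc = new PDDocument()) {
            PDPage page = new PDPage();
            doc.addPage(page);

            // Write some placeholder text; in practice this would be the financial data.
            try (PDPageContentStream cs = new PDPageContentStream(doc, page)) {
                cs.beginText();
                cs.setFont(PDType1Font.HELVETICA, 12);
                cs.newLineAtOffset(50, 750);
                cs.showText("Financial report generated on the server");
                cs.endText();
            }
            doc.save(resp.getOutputStream());      // stream the PDF straight to the browser
        }
    }
}
```

The "PDF" link or button on the page then simply points at this servlet URL, so the download starts on click without any client-side scripting.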
I have an Excel file downloaded from SAP. I want to read particular columns from that file, create a JSON object from it, and then render it on a JSP. What would be the fastest way to do this?
I'm fond of JExcel for its simplicity, but it loads the entire spreadsheet into memory. If you have enough memory for this, I'd use it.
If you don't have sufficient memory to read the entire file into memory, then you'll probably want to use the streaming capabilities of POI-XSSF.
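A small sketch of the in-memory POI route, assuming the workbook fits in memory; the sheet index, the header row, and the two column positions are assumptions about the SAP export. It reads the chosen columns and serializes them to a JSON array (with Jackson) that the JSP can render.

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.poi.ss.usermodel.DataFormatter;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.ss.usermodel.WorkbookFactory;

import java.io.File;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class ExcelToJson {
    public static String convert(File excelFile) throws Exception {
        DataFormatter formatter = new DataFormatter();   // renders cells as displayed text
        List<Map<String, String>> rows = new ArrayList<>();

        try (Workbook wb = WorkbookFactory.create(excelFile)) {
            Sheet sheet = wb.getSheetAt(0);              // assumed: data is on the first sheet
            for (Row row : sheet) {
                if (row.getRowNum() == 0) continue;      // assumed: first row is a header
                Map<String, String> entry = new LinkedHashMap<>();
                entry.put("material", formatter.formatCellValue(row.getCell(0)));  // assumed column
                entry.put("amount",   formatter.formatCellValue(row.getCell(2)));  // assumed column
                rows.add(entry);
            }
        }
        return new ObjectMapper().writeValueAsString(rows);
    }
}
```

For very large exports the same loop can be fed by the streaming XSSF reader instead of WorkbookFactory, at the cost of more setup code.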
I often use Talend (http://www.talend.com/products/talend-open-studio). As far as I have tried it, you can generate a web service that meets your needs, converting the Excel data you want into JSON.
Does a web crawler return only the extracted text from web pages? Say there are also some PDF/DOC files stored on the web server. Can a web crawler crawl through them and return their content as well? Also, what are the suggestions for a good open-source Java web crawler?
Thank you!
A web crawler doesn't extract the text. It simply returns the HTML, with some transformations applied (UTF-8 conversion, for example).
If you think of it that way, the file type doesn't matter to the crawler at the first hop. Of course, to follow further hops it would need to look inside these documents, and typical crawlers don't follow links inside PDFs, DOCs, etc.
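To illustrate that point: the crawler hands you the raw response, and turning non-HTML documents into text is a separate step. A hedged sketch, using jsoup for HTML pages and Apache Tika for PDF/DOC content; neither library is mandated by the answer, they are just common choices (crawler4j and Apache Nutch are the usual open-source Java crawler suggestions).

```java
import org.apache.tika.Tika;
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;

import java.io.InputStream;
import java.net.URL;

public class PageTextExtractor {
    private static final Tika tika = new Tika();   // detects the content type and extracts text

    // Returns plain text for either an HTML page or a binary document (PDF, DOC, ...).
    public static String extractText(String url) throws Exception {
        String lower = url.toLowerCase();
        if (lower.endsWith(".pdf") || lower.endsWith(".doc")) {
            // Non-HTML resources: let Tika parse the downloaded bytes.
            try (InputStream in = new URL(url).openStream()) {
                return tika.parseToString(in);
            }
        }
        // HTML pages: jsoup fetches the page and strips the markup.
        Document doc = Jsoup.connect(url).get();
        return doc.text();
    }
}
```

A crawler would call something like this on each fetched URL if it wanted searchable text rather than raw responses.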
I need to save files from websites using HtmlUnit. I am currently navigating to pages that have several anchors which use a JavaScript onClick()="DownloadAttachment('attachmentId')" handler to get the files. The files can be of pretty much any type (XLS, DOC, TXT, PDF, JPG, etc.). So far, though, I've been unable to find resources or examples that show how to save files using HtmlUnit. I've mainly been trying to get AttachmentHandler to work for this, as it seems the most likely to succeed, but have been unsuccessful. I was wondering if anyone else has managed to download files using HtmlUnit and could assist?
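A sketch of one approach that is sometimes used with HtmlUnit, under the assumption that clicking the anchor makes the server return the file directly: click the anchor and, when the response is not HTML (HtmlUnit then returns an UnexpectedPage), write the WebResponse bytes to disk. The URL, the anchor filtering, and the file naming are assumptions.

```java
import com.gargoylesoftware.htmlunit.Page;
import com.gargoylesoftware.htmlunit.UnexpectedPage;
import com.gargoylesoftware.htmlunit.WebClient;
import com.gargoylesoftware.htmlunit.html.HtmlAnchor;
import com.gargoylesoftware.htmlunit.html.HtmlPage;

import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class HtmlUnitDownloader {
    public static void main(String[] args) throws Exception {
        try (WebClient webClient = new WebClient()) {
            webClient.getOptions().setJavaScriptEnabled(true);   // the onClick handlers need JS

            HtmlPage page = webClient.getPage("https://example.com/attachments"); // assumed URL
            int count = 0;
            for (HtmlAnchor anchor : page.getAnchors()) {
                // Only follow the anchors that trigger the download handler.
                if (!anchor.getAttribute("onclick").contains("DownloadAttachment")) continue;

                Page result = anchor.click();
                if (result instanceof UnexpectedPage) {          // non-HTML content came back
                    try (InputStream in = result.getWebResponse().getContentAsStream()) {
                        Files.copy(in, Paths.get("attachment_" + (++count)),
                                   StandardCopyOption.REPLACE_EXISTING);
                    }
                }
            }
        }
    }
}
```

If the server instead replies with a Content-Disposition: attachment header, the AttachmentHandler route mentioned in the question (for example webClient.setAttachmentHandler(new CollectingAttachmentHandler()) and then reading the collected attachments) is the one to pursue; the UnexpectedPage check above only covers plain binary responses.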