I'm trying to paste an image from the clipboard into my website (copy and paste). I'd appreciate any advice on this. Can I achieve it with HTML5, an applet, or some other way? Any advice or reference link is highly appreciated.
I managed to do it with JavaScript.
JavaScript
if (!window.Clipboard) {
    // Create an invisible, contenteditable element that catches the paste
    var pasteCatcher = document.createElement("div");
    pasteCatcher.setAttribute("contenteditable", "");
    pasteCatcher.style.opacity = 0;
    document.body.appendChild(pasteCatcher);
    pasteCatcher.focus();
    document.addEventListener("click", function () { pasteCatcher.focus(); });
}

window.addEventListener("paste", onPasteHandler);

var pastedImage;

function onPasteHandler(e) {
    if (e.clipboardData) {
        var items = e.clipboardData.items;
        if (!items) {
            alert("Image not found");
            return;
        }
        for (var i = 0; i < items.length; ++i) {
            if (items[i].kind === 'file' && items[i].type === 'image/png') {
                var blob = items[i].getAsFile(),
                    source = (window.URL || window.webkitURL).createObjectURL(blob);
                pastedImage = new Image();
                // Draw only once the image has actually loaded
                pastedImage.onload = pasteData;
                pastedImage.src = source;
            }
        }
    }
}

function pasteData() {
    var drawCanvas = document.getElementById('drawCanvas1');
    var ctx = drawCanvas.getContext('2d');
    ctx.clearRect(0, 0, 640, 480);
    ctx.drawImage(pastedImage, 0, 0);
}
DIV
<div id="apDiv1" contenteditable='true'>Paste Test</div>
Even if the applet is not signed, the JNLP API is available:
ClipboardService cs = (ClipboardService) ServiceManager.lookup("javax.jnlp.ClipboardService");
Image image = (Image) cs.getContents().getTransferData(DataFlavor.imageFlavor);
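A slightly fuller sketch of the same approach (assuming the JNLP services are available at runtime and that the clipboard actually holds an image flavor; the method name is illustrative):

import java.awt.Image;
import java.awt.datatransfer.DataFlavor;
import java.awt.datatransfer.Transferable;
import javax.jnlp.ClipboardService;
import javax.jnlp.ServiceManager;

private Image readImageFromClipboard() throws Exception {
    ClipboardService cs =
            (ClipboardService) ServiceManager.lookup("javax.jnlp.ClipboardService");
    Transferable contents = cs.getContents();
    if (contents != null && contents.isDataFlavorSupported(DataFlavor.imageFlavor)) {
        return (Image) contents.getTransferData(DataFlavor.imageFlavor);
    }
    return null; // nothing image-like on the clipboard
}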
First, set up a file (image) server.
Then use JavaScript to listen for the paste event (key words: addEventListener, 'paste', clipboard, image).
Then upload the image to the file server via AJAX; the AJAX response returns the URL.
Finally, build an img tag from that URL.
Applets are out of date, so ignore that route.
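For the "file (image) server" step, here is a minimal server-side sketch in Java (the servlet name, URL mapping, upload directory, and form-field name are illustrative, not part of the original answer): it accepts the AJAX-uploaded image and responds with the URL the client then drops into the img tag.

import java.io.File;
import java.io.IOException;
import java.util.UUID;
import javax.servlet.ServletException;
import javax.servlet.annotation.MultipartConfig;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.Part;

@WebServlet("/upload")
@MultipartConfig
public class ImageUploadServlet extends HttpServlet {
    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        Part imagePart = req.getPart("image");      // form-field name used by the AJAX call
        String name = UUID.randomUUID() + ".png";   // unique name to avoid collisions
        File dir = new File(getServletContext().getRealPath("/uploads"));
        dir.mkdirs();
        imagePart.write(new File(dir, name).getAbsolutePath());
        resp.setContentType("text/plain");
        // The client uses this response as the src of the new <img> tag
        resp.getWriter().write(req.getContextPath() + "/uploads/" + name);
    }
}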
I'm developing a Java-based web crawler with a Swing GUI (JFrame). The crawler runs successfully and visits the links it finds, but I want to append each crawled link to a JTextArea dynamically, and that doesn't work: when I try it, my program freezes. I can, however, print the visited URLs to the console.
My GUI looks like this (screenshot omitted).
My code looks like this:
Document html = null;
try {
    html = Jsoup.connect(url).get();
    Elements links = html.select("a");
    for (Element link : links) {
        String tmp = link.attr("abs:href");
        jTextArea2.append(tmp + "\n");
        if (!this.visitedUrl.contains(tmp)) {
            this.foundedUrl.add(tmp);
            System.out.println(tmp);
        }
    }
    while (this.foundedUrl.size() > 0) {
        String tmp = this.foundedUrl.get(this.foundedUrl.size() - 1);
        this.foundedUrl.remove(this.foundedUrl.size() - 1);
        if (!this.visitedUrl.contains(tmp)) {
            this.linkTracker(tmp);
        }
    }
} catch (IOException e) {
    // closing catch, omitted in the original excerpt
    e.printStackTrace();
}
How can I append the visited URLs to the JTextArea dynamically?
Try this:
new Thread(() -> {
    try {
        Document html = Jsoup.connect(url).get();
        Elements links = html.select("a");
        for (Element link : links) {
            String tmp = link.attr("abs:href");
            // Swing components must only be touched on the Event Dispatch Thread
            EventQueue.invokeLater(() -> jTextArea2.append(tmp + "\n"));
            if (!this.visitedUrl.contains(tmp)) {
                this.foundedUrl.add(tmp);
                System.out.println(tmp);
            }
        }
        while (this.foundedUrl.size() > 0) {
            String tmp = this.foundedUrl.get(this.foundedUrl.size() - 1);
            this.foundedUrl.remove(this.foundedUrl.size() - 1);
            if (!this.visitedUrl.contains(tmp)) {
                this.linkTracker(tmp);
            }
        }
    } catch (Exception e) {
        e.printStackTrace(); // don't swallow failures silently
    }
}).start();
The reason your GUI is freezing is that you are blocking the GUI thread (the Event Dispatch Thread). So start the crawling work on a different thread by creating a new Thread and running it there.
To push updates back to the GUI, call EventQueue.invokeLater; it tells the GUI thread to append the text to the JTextArea.
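If you prefer, SwingWorker gives you the same pattern with less plumbing. A minimal sketch, assuming the field and method names from the question (url, visitedUrl, foundedUrl, jTextArea2): publish() hands each crawled link to process(), which runs on the Event Dispatch Thread.

// Requires javax.swing.SwingWorker (plus the Jsoup imports already used above)
new SwingWorker<Void, String>() {
    @Override
    protected Void doInBackground() throws Exception {
        Document html = Jsoup.connect(url).get();
        for (Element link : html.select("a")) {
            String tmp = link.attr("abs:href");
            publish(tmp); // delivered to process() on the Event Dispatch Thread
            if (!visitedUrl.contains(tmp)) {
                foundedUrl.add(tmp);
            }
        }
        return null;
    }

    @Override
    protected void process(java.util.List<String> chunks) {
        for (String s : chunks) {
            jTextArea2.append(s + "\n");
        }
    }
}.execute();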
How can I download a webpage that uses a JavaScript-based loading mechanism?
The code below returns a nearly empty document because of the site's loading mechanism: when viewed in a browser you first see "loading..." and the content appears after a while.
I also want to avoid using the WebBrowser control.
HtmlDocument doc = new HtmlDocument();
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(url);
req.AutomaticDecompression = DecompressionMethods.Deflate | DecompressionMethods.GZip;
if (!string.IsNullOrWhiteSpace(userAgent))
    req.UserAgent = userAgent;
if (cookies != null)
{
    req.CookieContainer = new CookieContainer();
    foreach (Cookie c in cookies)
        req.CookieContainer.Add(c);
}

var resp = req.GetResponse();
var resp_str = resp.GetResponseStream();
using (StreamReader sr = new StreamReader(resp_str, Encoding.GetEncoding("windows-1251")))
{
    string r = sr.ReadToEnd();
    doc.LoadHtml(r);
}
return doc;
Well, you basically need a web browser to do the JavaScript running; your WebRequest only fetches the data, as-is, from the server.
You could use System.Windows.Forms.WebBrowser, but it's not pretty. This answer https://stackoverflow.com/a/11394830/2940949 might give you some idea of the basic issue.
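If running on the JVM were an option (the question is C#, so treat this purely as an illustration of the "you need a browser engine" point), a headless browser such as HtmlUnit, which also appears in a later question here, can execute the page's JavaScript before you read the DOM. A rough sketch:

WebClient webClient = new WebClient();
webClient.setJavaScriptEnabled(true);          // let the page's loader scripts run
HtmlPage page = webClient.getPage(url);
webClient.waitForBackgroundJavaScript(10000);  // give async loading a chance to finish
String renderedHtml = page.asXml();            // DOM after the JavaScript has run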
I have a servlet responsible for letting a user update a reports table and upload a report at the same time. I have written code that lets a user upload a document and also update the table with other details, e.g. date submitted.
However, a user will not always have to upload a document; in that case it should still be possible to edit a report's details and come back later to upload the file, i.e. the user can submit the form without selecting a file and it still updates the table.
This part is what is not working. If a user selects a file and makes some changes, the code works. If a user doesn't select a file and tries to submit the form, it redirects to my servlet but the page is blank; there is no stack trace and no error is thrown.
Below is part of the code I have in my servlet:
if(param.equals("updateschedule"))
{
String[] allowedextensions = {"pdf","xlsx","xls","doc","docx","jpeg","jpg","msg"};
final String path = request.getParameter("uploadlocation_hidden");
final Part filepart=request.getPart("uploadreport_file");
int repid = Integer.parseInt(request.getParameter("repid_hidden"));
int reptype = Integer.parseInt(request.getParameter("reporttype_select"));
String webdocpath = request.getParameter("doclocation_hidden");
String subperiod = request.getParameter("submitperiod_select");
String duedate = request.getParameter("reportduedate_textfield");
String repname = request.getParameter("reportname_textfield");
String repdesc = request.getParameter("reportdesc_textarea");
String repinstr = request.getParameter("reportinst_textarea");
int repsubmitted = Integer.parseInt(request.getParameter("repsubmitted_select"));
String datesubmitted = request.getParameter("reportsubmitdate_textfield");
final String filename = getFileName(filepart);
OutputStream out = null;
InputStream filecontent=null;
String extension = filename.substring(filename.lastIndexOf(".") + 1, filename.length());
if(Arrays.asList(allowedextensions).contains(extension))
{
try
{
out=new FileOutputStream(new File(path+File.separator+filename));
filecontent = filepart.getInputStream();
int read=0;
final byte[] bytes = new byte[1024];
while((read=filecontent.read(bytes))!=-1)
{
out.write(bytes,0,read);
}
String fulldocpath = webdocpath+"/"+filename;
boolean succ = icreditdao.updatereportschedule(repid, reptype, subperiod, repname, repsubmitted,datesubmitted, duedate,fulldocpath, repdesc, repinstr);
if(succ==true)
{
response.sendRedirect("/webapp/Pages/Secured/ReportingSchedule.jsp?msg=Report Schedule updated successfully");
}
}
catch(Exception ex)
{
throw new ServletException(ex);
}
}
I'm still teaching myself Java EE. Any help will be appreciated, and I'm open to other alternatives. I have thought of using jQuery to detect whether a file has been selected and then running a different branch of code, e.g.
if (param.equals("updatewithnofileselected"))
{ /* update code here */ }
but I think there must be a better solution. I'm using JDK 6 and Servlet 3.0.
Try this one:
// MultipartParser and Part here are from com.oreilly.servlet (cos.jar), not javax.servlet.http
MultipartParser parser = new MultipartParser(request, 500000000, false, false, "UTF-8");
Part part;
while ((part = parser.readNextPart()) != null) {
    if (part.isFile()) {
        // a file part was submitted: handle the upload here
    } else if (part.isParam()) {
        if (part.getName().equals("updatewithnofileselected")) {
            // update code here
        } else if (part.getName().equals("updateschedule")) {
            // updateschedule
        }
    }
}
I used this when working with multipart forms and it works fine.
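Alternatively, staying with the standard Servlet 3.0 API you already use, you can branch on whether a file was actually chosen before touching the file part. A minimal sketch, reusing the field names and the getFileName helper from the question:

Part filepart = request.getPart("uploadreport_file");
String filename = getFileName(filepart); // helper from the question
boolean fileSelected = filepart != null && filepart.getSize() > 0
        && filename != null && !filename.isEmpty();

if (fileSelected) {
    // validate the extension, write the file, then update the table with the new document path
} else {
    // no file chosen: update the table only and keep the existing document path
}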
I am working with Java, JasperReports, and JSF. I want to generate a report in PDF format and display it in an iframe, so that filters for the report can sit on the same page. The report is generated correctly as a temp file, and when I paste its path into the browser it displays fine, but when I put that path into the iframe I get the following error:
Not allowed to load local resource: file:///C:/Users/Juanes~1/AppData/Local/Temp/reportePedidosTemporal1922509630584311367.pdf
This is my code:
public void prueba(AjaxBehaviorEvent evento)
{
    try
    {
        Map<String, Object> parametros = new HashMap<String, Object>();
        parametros.put("autor", "Juan Esteban");
        parametros.put("titulo", "Reporte de Pedidos");

        List<PedidosVO> listaPedidos = new ArrayList<PedidosVO>();
        for (int i = 1; i <= 10; i++)
        {
            PedidosVO pedido = new PedidosVO("" + i, "Cliente:" + i, (i + 1));
            pedido.setPuntos(i);
            listaPedidos.add(pedido);
        }

        JasperDesign design = JRXmlLoader.load("C:\\Reportes\\Reporte2Pedidos.jrxml");
        JasperReport reporte = JasperCompileManager.compileReport(design);
        JasperPrint jasperPrint = JasperFillManager.fillReport(reporte, parametros,
                new JRBeanCollectionDataSource(listaPedidos));
        byte[] flujo = JasperExportManager.exportReportToPdf(jasperPrint);

        File tempFile = File.createTempFile("reportePedidosTemporal", ".pdf");
        System.out.println(tempFile.setReadable(true));
        escribirByte(tempFile, flujo);
        tempFile.deleteOnExit();
        if (tempFile.exists())
        {
            System.out.println("RUTA : " + tempFile.getPath());
            url = tempFile.getPath();
        }
    }
    catch (Exception e)
    {
        e.printStackTrace();
    }
}
Any ideas or suggestions would be greatly appreciated.
You need to have a URL that points to the PDF that you load into the iframe. As it stands, you're attempting to load a file directly from the filesystem. This means your site will attempt to load a file from the end user's filesystem directly, which is not allowed for obvious reasons.
Generate the PDF on your server, then generate a URL that points to it. Then set that as the iframe source.
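A minimal sketch of that idea (the servlet name, URL mapping, and the ReportBuilder.buildPedidosPdf helper are illustrative; the helper stands in for the compile/fill/export code from the question): serve the exported bytes over HTTP and point the iframe at the servlet's URL instead of a file path.

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

@WebServlet("/reportes/pedidos.pdf")
public class PedidosReportServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        try {
            // Hypothetical helper wrapping the JasperCompileManager / JasperFillManager /
            // JasperExportManager.exportReportToPdf(...) calls from the question
            byte[] pdf = ReportBuilder.buildPedidosPdf();
            response.setContentType("application/pdf");
            response.setContentLength(pdf.length);
            response.getOutputStream().write(pdf);
        } catch (Exception e) {
            throw new ServletException(e);
        }
    }
}

The iframe then points at that URL, e.g. <iframe src="reportes/pedidos.pdf"></iframe> relative to the application's context path, instead of a file:// path.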
I am trying to find all the image tags on a specific page; an example page would be www.chapitre.com.
I am using the following code to search for all images on the page:
HtmlPage page = HTMLParser.parseHtml(webResponse, webClient.openWindow(null,"testwindow"));
List<?> imageList = page.getByXPath("//img");
ListIterator li = imageList.listIterator();
while (li.hasNext()) {
    HtmlImage image = (HtmlImage) li.next();
    URL url = new URL(image.getSrcAttribute());
    // For now, only load 1x1 pixels
    if (image.getHeightAttribute().equals("1") && image.getWidthAttribute().equals("1")) {
        System.out.println("This is an image: " + url + " from page " + webRequest.getUrl());
    }
}
This doesn't return all the image tags on the page. For example, an image tag with the attributes src="http://ace-lb.advertising.com/site=703223/mnum=1516/bins=1/rich=0/logs=0/betr=A2099=[+]LP2" width="1" height="1" should be captured, but it's not. Am I doing something wrong here?
Any help is really appreciated.
Cheers!
That's because
URL url = new URL(image.getSrcAttribute());
is throwing an exception :) As soon as one img has a relative src (no protocol), new URL(...) fails and the loop aborts before it ever reaches the remaining images.
Try this code:
public Main() throws Exception {
    WebClient webClient = new WebClient();
    webClient.setJavaScriptEnabled(false);
    HtmlPage page = webClient.getPage("http://www.chapitre.com");
    List<HtmlImage> imageList = (List<HtmlImage>) page.getByXPath("//img");
    for (HtmlImage image : imageList) {
        try {
            new URL(image.getSrcAttribute());
            if (image.getHeightAttribute().equals("1") && image.getWidthAttribute().equals("1")) {
                System.out.println(image.getSrcAttribute());
            }
        } catch (Exception e) {
            // Relative src values (no protocol) end up here
            System.out.println("You didn't see this coming :)");
        }
    }
}
You can even get those 1x1 pixel images by xpath.
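Concretely, a small sketch reusing the page object from the code above:

// Select only the 1x1 tracking pixels in a single XPath query
List<HtmlImage> pixels = (List<HtmlImage>) page.getByXPath("//img[@width='1' and @height='1']");
for (HtmlImage pixel : pixels) {
    System.out.println(pixel.getSrcAttribute());
}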
Hope this helps.