JavaFX ClassCastException using iText PDF - java

I am trying to download a JavaFX TableView as a PDF document. I am using iText 7.2.5. When I click the button to download the TableView, it asks me to select a folder, then asks for a name for the PDF document; when I then press OK, it downloads a PDF file that shows no content, not even a blank page. When I open the PDF file with Chrome it shows the error below, and the situation is the same when I open it with Adobe Acrobat.
Error Failed to load PDF document.
At first it threw an exception for SLF4J, so I added these jar files to the classpath:
slf4j-simple-2.0.6.jar, slf4j-api-2.0.6-javadoc.jar, slf4j-api-2.0.6.jar.
It then gave a warning about a provider not being found, which was resolved after adding slf4j-simple-2.0.6.jar.
I saw a similar question here:
java.lang.NoClassDefFoundError: org/slf4j/LoggerFactory problem.
The answer to that question is:
It turns out that it was missing the following 3 jars: slf4j.api, slf4j-log4j12, and log4j.
It works now after adding the above 3 jars.
These are all the directories at org/slf4j:
integration/
jcl-over-slf4j/
jcl104-over-slf4j/
jul-to-slf4j/
log4j-over-slf4j/
nlog4j/
osgi-over-slf4j/
slf4j-android/
slf4j-api/
slf4j-archetype/
slf4j-converter/
slf4j-ext/
slf4j-jcl/
slf4j-jdk-platform-logging/
slf4j-jdk14/
slf4j-log4j12/
slf4j-log4j13/
slf4j-migrator/
slf4j-nop/
slf4j-parent/
slf4j-reload4j/
slf4j-simple/
slf4j-site/
slf4j-skin/
taglib/
Is log4j-over-slf4j/ the log4j artifact mentioned in the referenced answer?
I am unable to get slf4j-log4j12, because org/slf4j/slf4j-log4j12/2.0.6 contains no jar file, only the following:
slf4j-log4j12-2.0.6.pom 2022-12-12 19:14 873
slf4j-log4j12-2.0.6.pom.asc 2022-12-12 19:14 317
slf4j-log4j12-2.0.6.pom.md5 2022-12-12 19:14 32
slf4j-log4j12-2.0.6.pom.sha1
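As far as I know, that is expected: in SLF4J 2.x the slf4j-log4j12 artifact no longer ships a jar and effectively points to slf4j-reload4j instead. For getting log output at all, it is usually enough to have slf4j-api plus exactly one provider such as slf4j-simple on the classpath. A minimal sketch of the corresponding Maven dependencies (assuming a Maven build, which the question does not state):

```
<!-- one API jar plus exactly one provider; versions assumed from the question -->
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-api</artifactId>
  <version>2.0.6</version>
</dependency>
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-simple</artifactId>
  <version>2.0.6</version>
</dependency>
```

The javadoc jar is not needed at runtime.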
Here is the code behind the download button:
@FXML
public void ButtonDownload(ActionEvent event) {
    DirectoryChooser directoryChooser = new DirectoryChooser();
    File selectedDirectory = directoryChooser.showDialog(TableName.getScene().getWindow());
    if (selectedDirectory == null) {
        return;
    }
    TextInputDialog dialog = new TextInputDialog("table");
    dialog.setTitle("Enter File Name");
    dialog.setHeaderText("Enter the name for the PDF file:");
    dialog.setContentText("File name:");
    Optional<String> result = dialog.showAndWait();
    if (result.isPresent()) {
        try {
            String fileName = result.get();
            if (!fileName.endsWith(".pdf")) {
                fileName = fileName + ".pdf";
            }
            String filePath = selectedDirectory.getAbsolutePath() + "/" + fileName;
            OutputStream file = new FileOutputStream(filePath);
            PdfDocument pdfDoc = new PdfDocument(new PdfWriter(filePath));
            Document document = new Document(pdfDoc);
            Table pdfTable = new Table(TableName.getColumns().size());
            for (TableColumn<?, ?> column : TableName.getColumns()) {
                pdfTable.addCell(column.getText());
            }
            for (Object item : TableName.getItems()) {
                for (TableColumn<?, ?> column : TableName.getColumns()) {
                    /* this is line 254 */
                    pdfTable.addCell(column.getCellData((int) item).toString());
                }
            }
            document.add(pdfTable);
            document.close();
        } catch (Exception e) {
            System.out.println("Exception while downloading pdf tableView: " + e);
            e.printStackTrace();
        }
    }
}
This is the StackTrace:
Exception while downloading pdf tableView: java.lang.ClassCastException: class application.(className --> that is the controller of the fxml file where the download button is placed) cannot be cast to class java.lang.Integer (application.className(Controller) is in unnamed module of loader 'app'; java.lang.Integer is in module java.base of loader 'bootstrap')
java.lang.ClassCastException: class application.className(Controller) cannot be cast to class java.lang.Integer (application.className(Controller) is in unnamed module of loader 'app'; java.lang.Integer is in module java.base of loader 'bootstrap')
at application.className(Controller).DownloadMethod(className(Controller).java:254)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:568)
at com.sun.javafx.reflect.Trampoline.invoke(MethodUtil.java:77)
at jdk.internal.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:568)
at javafx.base#19.0.2.1/com.sun.javafx.reflect.MethodUtil.invoke(MethodUtil.java:275)
at javafx.fxml#19.0.2.1/com.sun.javafx.fxml.MethodHelper.invoke(MethodHelper.java:84)
at javafx.fxml#19.0.2.1/javafx.fxml.FXMLLoader$MethodHandler.invoke(FXMLLoader.java:1852)
at javafx.fxml#19.0.2.1/javafx.fxml.FXMLLoader$ControllerMethodEventHandler.handle(FXMLLoader.java:1724)
at javafx.base#19.0.2.1/com.sun.javafx.event.CompositeEventHandler.dispatchBubblingEvent(CompositeEventHandler.java:86)
at javafx.base#19.0.2.1/com.sun.javafx.event.EventHandlerManager.dispatchBubblingEvent(EventHandlerManager.java:234)
at javafx.base#19.0.2.1/com.sun.javafx.event.EventHandlerManager.dispatchBubblingEvent(EventHandlerManager.java:191)
at javafx.base#19.0.2.1/com.sun.javafx.event.CompositeEventDispatcher.dispatchBubblingEvent(CompositeEventDispatcher.java:59)
at javafx.base#19.0.2.1/com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:58)
at javafx.base#19.0.2.1/com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:114)
at javafx.base#19.0.2.1/com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:56)
at javafx.base#19.0.2.1/com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:114)
at javafx.base#19.0.2.1/com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:56)
at javafx.base#19.0.2.1/com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:114)
at javafx.base#19.0.2.1/com.sun.javafx.event.EventUtil.fireEventImpl(EventUtil.java:74)
at javafx.base#19.0.2.1/com.sun.javafx.event.EventUtil.fireEvent(EventUtil.java:49)
at javafx.base#19.0.2.1/javafx.event.Event.fireEvent(Event.java:198)
at javafx.graphics#19.0.2.1/javafx.scene.Node.fireEvent(Node.java:8923)
at javafx.controls#19.0.2.1/javafx.scene.control.Button.fire(Button.java:203)
at javafx.controls#19.0.2.1/com.sun.javafx.scene.control.behavior.ButtonBehavior.mouseReleased(ButtonBehavior.java:207)
at javafx.controls#19.0.2.1/com.sun.javafx.scene.control.inputmap.InputMap.handle(InputMap.java:274)
at javafx.base#19.0.2.1/com.sun.javafx.event.CompositeEventHandler$NormalEventHandlerRecord.handleBubblingEvent(CompositeEventHandler.java:247)
at javafx.base#19.0.2.1/com.sun.javafx.event.CompositeEventHandler.dispatchBubblingEvent(CompositeEventHandler.java:80)
at javafx.base#19.0.2.1/com.sun.javafx.event.EventHandlerManager.dispatchBubblingEvent(EventHandlerManager.java:234)
at javafx.base#19.0.2.1/com.sun.javafx.event.EventHandlerManager.dispatchBubblingEvent(EventHandlerManager.java:191)
at javafx.base#19.0.2.1/com.sun.javafx.event.CompositeEventDispatcher.dispatchBubblingEvent(CompositeEventDispatcher.java:59)
at javafx.base#19.0.2.1/com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:58)
at javafx.base#19.0.2.1/com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:114)
at javafx.base#19.0.2.1/com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:56)
at javafx.base#19.0.2.1/com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:114)
at javafx.base#19.0.2.1/com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:56)
at javafx.base#19.0.2.1/com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:114)
at javafx.base#19.0.2.1/com.sun.javafx.event.EventUtil.fireEventImpl(EventUtil.java:74)
at javafx.base#19.0.2.1/com.sun.javafx.event.EventUtil.fireEvent(EventUtil.java:54)
at javafx.base#19.0.2.1/javafx.event.Event.fireEvent(Event.java:198)
at javafx.graphics#19.0.2.1/javafx.scene.Scene$MouseHandler.process(Scene.java:3894)
at javafx.graphics#19.0.2.1/javafx.scene.Scene.processMouseEvent(Scene.java:1887)
at javafx.graphics#19.0.2.1/javafx.scene.Scene$ScenePeerListener.mouseEvent(Scene.java:2620)
at javafx.graphics#19.0.2.1/com.sun.javafx.tk.quantum.GlassViewEventHandler$MouseEventNotification.run(GlassViewEventHandler.java:411)
at javafx.graphics#19.0.2.1/com.sun.javafx.tk.quantum.GlassViewEventHandler$MouseEventNotification.run(GlassViewEventHandler.java:301)
at java.base/java.security.AccessController.doPrivileged(AccessController.java:399)
at javafx.graphics#19.0.2.1/com.sun.javafx.tk.quantum.GlassViewEventHandler.lambda$handleMouseEvent$2(GlassViewEventHandler.java:450)
at javafx.graphics#19.0.2.1/com.sun.javafx.tk.quantum.QuantumToolkit.runWithoutRenderLock(QuantumToolkit.java:424)
at javafx.graphics#19.0.2.1/com.sun.javafx.tk.quantum.GlassViewEventHandler.handleMouseEvent(GlassViewEventHandler.java:449)
at javafx.graphics#19.0.2.1/com.sun.glass.ui.View.handleMouseEvent(View.java:551)
at javafx.graphics#19.0.2.1/com.sun.glass.ui.View.notifyMouse(View.java:937)
at javafx.graphics#19.0.2.1/com.sun.glass.ui.win.WinApplication._runLoop(Native Method)
at javafx.graphics#19.0.2.1/com.sun.glass.ui.win.WinApplication.lambda$runLoop$3(WinApplication.java:184)
at java.base/java.lang.Thread.run(Thread.java:833)
EDIT: the original code at line 254 was
pdfTable.addCell(column.getCellData(item).toString());
but the compiler then asked me to cast the argument item to int, like this:
pdfTable.addCell(column.getCellData((int) item).toString());
Kindly help me solve this.

Exception while downloading pdf tableView: java.lang.ClassCastException: class application.(className --> that is the controller of the fxml file where the download button is placed) cannot be cast to class java.lang.Integer (application.className(Controller) is in unnamed module of loader 'app'; java.lang.Integer is in module java.base of loader 'bootstrap')
java.lang.ClassCastException: class application.className(Controller) cannot be cast to class java.lang.Integer (application.className(Controller) is in unnamed module of loader 'app'; java.lang.Integer is in module java.base of loader 'bootstrap')
at application.className(Controller).DownloadMethod(className(Controller).java:254)
As this stack trace suggests, the error occurs because you are trying to cast the table item, which is not an integer, to an integer at line 254. Note that TableView#getItems returns an ObservableList<S> of the table's row items, not row indices.
You need to fix the following line at the minimum:
pdfTable.addCell(column.getCellData((int) item).toString());
Since TableColumn#getCellData(int) expects a row index, it looks like you intend to pass the index of the item in the table view rather than the item itself. Try this instead:
pdfTable.addCell(column.getCellData(TableName.getItems().indexOf(item)).toString());

Thanks to all!
Special thanks to @James_D.
This is the code that is printing the pdf correctly.
@FXML
public void DownloadButtonMethod(ActionEvent event) {
    DirectoryChooser directoryChooser = new DirectoryChooser();
    File selectedDirectory = directoryChooser.showDialog(TableName.getScene().getWindow());
    if (selectedDirectory == null) {
        return;
    }
    TextInputDialog dialog = new TextInputDialog("table");
    dialog.setTitle("Enter File Name");
    dialog.setHeaderText("Enter the name for the PDF file:");
    dialog.setContentText("File name:");
    Optional<String> result = dialog.showAndWait();
    if (result.isPresent()) {
        try {
            String fileName = result.get();
            if (!fileName.endsWith(".pdf")) {
                fileName = fileName + ".pdf";
            }
            String filePath = selectedDirectory.getAbsolutePath() + "/" + fileName;
            OutputStream file = new FileOutputStream(filePath);
            PdfDocument pdfDoc = new PdfDocument(new PdfWriter(filePath));
            Document document = new Document(pdfDoc);
            Table pdfTable = new Table(TableName.getColumns().size());
            try {
                // ClassName is the class that holds the records
                for (TableColumn<ClassName, ?> column : TableName.getColumns()) {
                    pdfTable.addCell(column.getText());
                }
            } catch (Exception e) {
                System.out.println("E1: " + e);
                e.printStackTrace();
            }
            try {
                for (int i = 0; i < TableName.getItems().size(); i++) {
                    for (TableColumn<ClassName, ?> column : TableName.getColumns()) {
                        try {
                            pdfTable.addCell(column.getCellData(i).toString());
                        } catch (NullPointerException e) {
                            pdfTable.addCell("");
                        }
                    }
                }
            } catch (Exception e) {
                System.out.println("E2: " + e);
                e.printStackTrace();
            }
            document.add(pdfTable);
            document.close();
        } catch (Exception e) {
            System.out.println("Exception while downloading pdf tableView: " + e);
            e.printStackTrace();
        }
    }
}
We handled the NullPointerException by catching it and adding an empty string for null cells, so that a null value cannot break the export.
The ClassCastException was caused by casting an object to a type it is not compatible with.
I then used the class that holds the records (i.e. its fields, constructor, and getters/setters) as the type parameter of TableColumn instead of wildcards, as suggested by @James_D.
Thanks Again :)

Related

How to manage Apache-Beam TextIO exceptions into failures?

How to convert TextIO exceptions into failures?
Sometimes when I use TextIO.read() I get:
org.apache.beam.sdk.Pipeline$PipelineExecutionException:
java.io.FileNotFoundException: No files matched spec:
src/test/resources/config/qqqqqqq
How can I separate the exceptions into an independent list of failures?
For example, in this code I have a file with a list of other files, and I need to read all lines from all files as one list:
PipelineOptions options = PipelineOptionsFactory.create();
Pipeline pipeline = Pipeline.create(options);
PCollection<String> lines = pipeline
        .apply(TextIO.read().from("src/test/resources/config/W-PSFV-LOG-FILE-2022-05-16_23-59-59.txt"))
        .apply(MapElements.into(TypeDescriptors.strings()).via(line -> "src/test/resources/config/" + line))
        .apply(TextIO.readAll());
lines.apply(Log.ofElements());
pipeline.run();
But if one of the files is broken, it throws a FileNotFoundException and stops. I do not want it to stop; I want to get a list of all existing files and a list of the broken files.
I think you can use a dead letter queue to solve your problem.
Beam natively proposes error handling with TupleTags, or with the exceptionsInto and exceptionsVia methods of MapElements.
These return a Result structure with a PCollection of good outputs and a PCollection of failures.
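The dead-letter idea itself is independent of Beam. As a minimal plain-Java sketch (the class and method names below are made up for illustration, not part of Beam or Asgarde), each element either maps to an output or is captured as a failure, instead of the first bad element aborting the whole run:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

// Minimal sketch of the dead-letter-queue pattern: apply a function to every
// element, collecting successes and failures into two separate lists.
public class DeadLetter {
    public record Result<O>(List<O> output, List<String> failures) {}

    public static <I, O> Result<O> mapWithFailures(List<I> input, Function<I, O> fn) {
        List<O> output = new ArrayList<>();
        List<String> failures = new ArrayList<>();
        for (I element : input) {
            try {
                output.add(fn.apply(element));
            } catch (Exception e) {
                // Keep the failing input together with the exception, much like
                // the Failure object described below.
                failures.add(element + ": " + e);
            }
        }
        return new Result<>(output, failures);
    }
}
```

In Beam the same split is expressed as two PCollections; the failures collection can then be logged or written to a sink instead of crashing the pipeline.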
You can also use a library called Asgarde :
https://github.com/tosun-si/asgarde
PipelineOptions options = PipelineOptionsFactory.create();
Pipeline pipeline = Pipeline.create(options);
PCollection<String> lines = pipeline
        .apply(TextIO.read().from("src/test/resources/config/W-PSFV-LOG-FILE-2022-05-16_23-59-59.txt"));
WithFailures.Result<PCollection<String>, Failure> result = CollectionComposer.of(lines)
        .apply(MapElements.into(TypeDescriptors.strings()).via(line -> "src/test/resources/config/" + line));
// Gets outputs and Failure PCollections.
PCollection<String> output = result.output();
PCollection<Failure> failures = result.failures();
// Then you can sink your Failures in database, GCS file or topic if needed...
......
pipeline.run();
The Failure object is provided by the Asgarde library and carries the current input element as a String along with the exception:
public class Failure implements Serializable {
    private final String pipelineStep;
    private final String inputElement;
    private final Throwable exception;
If you want to use this code, you have to import the Asgarde library, for example with Maven in your pom.xml file:
<dependency>
    <groupId>fr.groupbees</groupId>
    <artifactId>asgarde</artifactId>
    <version>0.19.0</version>
</dependency>
or with Gradle :
implementation group: 'fr.groupbees', name: 'asgarde', version: '0.19.0'
PS: I am the creator of the Asgarde library. The README of the project shows many examples of applying a dead letter queue with native Beam and with Asgarde; don't hesitate to read it: https://github.com/tosun-si/asgarde
You can use a FileIO first to split the files into readable-existing files and non-existing files.
PCollection<KV<String, String>> categoryAndFiles = p
        .apply(FileIO.match().filepattern("hdfs://path/to/*.gz"))
        // withCompression can be omitted - by default compression is detected from the filename.
        .apply(FileIO.readMatches().withCompression(GZIP))
        .apply(MapElements
                // uses imports from TypeDescriptors
                .into(kvs(strings(), strings()))
                .via((ReadableFile f) -> {
                    try {
                        f.open();
                        return KV.of(
                                "readable-existing",
                                f.getMetadata().resourceId().toString());
                    } catch (IOException ex) {
                        return KV.of(
                                "non-existing",
                                f.getMetadata().resourceId().toString());
                    }
                }));
Adapted from an example.
@Rule
public transient TestPipeline pipeline = TestPipeline.create();

@Test
public void testTransformWithIOException3() throws FileNotFoundException {
    PCollection<String> digits = pipeline.apply(Create.of("1", "2", "3"));
    WithFailures.Result<PCollection<String>, Failure> result = pipeline
            .apply("Read ", Create.of("1", "2", "3")) // PCollection<String>
            .apply("Map", new PTransform<PCollection<String>, WithFailures.Result<PCollection<String>, Failure>>() {
                @Override
                public WithFailures.Result<PCollection<String>, Failure> expand(PCollection<String> input) {
                    return CollectionComposer.of(input)
                            .apply("//", MapElements.into(TypeDescriptors.strings()).via(s -> {
                                try {
                                    if (s.equals("2")) throw new FileNotFoundException();
                                    else return s.toString();
                                } catch (Exception e) {
                                    throw new RuntimeException("error ");
                                }
                            }))
                            .getResult();
                }
            });
    result.output().apply("out",
            MapElements.into(TypeDescriptors.strings()).via(x -> {
                System.out.println(x);
                return x.toString();
            }));
    result.failures().apply("failures",
            MapElements.into(TypeDescriptors.strings()).via(x -> {
                System.out.println(x);
                return x.toString();
            }));
    pipeline.run().waitUntilFinish();
}

Caused by: java.lang.IllegalAccessError: class com.spire.xls.packages.sprRFA (in module spire.xls.free) cannot access class

I am getting an exception and I can't find the reason for it. The exception I get is:
Caused by: java.lang.IllegalAccessError: class com.spire.xls.packages.sprRFA (in module spire.xls.free) cannot access class sun.security.action.GetPropertyAction (in module java.base) because module java.base does not export sun.security.action to module spire.xls.free
@FXML
public void OnCreateFile(ActionEvent actionEvent) {
    String po1 = Word.getText();
    String[] ss1 = po1.split(" ");
    String po2 = Output.getText();
    String[] ss2 = po2.split(" ");
    Workbook workbook = new Workbook();
    Worksheet sheet = workbook.getWorksheets().get(0);
    sheet.insertArray(ss1, 0, 0, true);
    sheet.insertArray(ss2, 0, 1, true);
    workbook.saveToFile("dictionary.xlsx", ExcelVersion.Version2016);
}
Any help? I would also like to know whether there is any other way to export an array to an Excel document using Java.
This thread might be of help: https://www.e-iceblue.com/forum/illegal-access-error-when-instantiating-workbook-t10921.html
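For errors of the form "module java.base does not export sun.security.action to module X", a commonly suggested workaround (an assumption here, not verified against Spire.XLS specifically) is to export the internal package to the offending module at launch time, the module name being taken from the error message:

```
java --add-exports java.base/sun.security.action=spire.xls.free -jar app.jar
```

Alternatively, putting the library on the classpath instead of the module path can avoid the named-module restriction entirely.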

Jenkins pipelines set defaultValue on parameter dynamically

I have jenkinsfile and I want to set defaultValue on the imageTag dynamically, which I'm fetching from the pom file.
Here is the file:
def gr
pipeline {
    agent any
    image = readMavenPom().getParent().getVersion()
    parameters {
        string(name: 'imageTag', defaultValue: image, description: 'Docker image tag')
    }
    stages {
        stage('Environment') {
            steps {
                script {
                    gr = load 'src/ci/script.groovy'
                    echo("Using imageTag: ${params.imageTag}")
                }
            }
        }
    }
}
I'm getting an error:
startup failed: WorkflowScript: 10: Not a valid section definition:
"image = readMavenPom().getParent().getVersion()". Some extra
configuration is required. @ line 10, column 5.
    image = readMavenPom().getParent().getVersion()
    ^
1 error
at org.codehaus.groovy.control.ErrorCollector.failIfErrors(ErrorCollector.java:310)
at org.codehaus.groovy.control.CompilationUnit.applyToPrimaryClassNodes(CompilationUnit.java:1085)
at org.codehaus.groovy.control.CompilationUnit.doPhaseOperation(CompilationUnit.java:603)
at org.codehaus.groovy.control.CompilationUnit.processPhaseOperations(CompilationUnit.java:581)
at org.codehaus.groovy.control.CompilationUnit.compile(CompilationUnit.java:558)
at groovy.lang.GroovyClassLoader.doParseClass(GroovyClassLoader.java:298)
at groovy.lang.GroovyClassLoader.parseClass(GroovyClassLoader.java:268)
at groovy.lang.GroovyShell.parseClass(GroovyShell.java:688)
at groovy.lang.GroovyShell.parse(GroovyShell.java:700)
at org.jenkinsci.plugins.workflow.cps.CpsGroovyShell.doParse(CpsGroovyShell.java:142)
at org.jenkinsci.plugins.workflow.cps.CpsGroovyShell.reparse(CpsGroovyShell.java:127)
at org.jenkinsci.plugins.workflow.cps.CpsFlowExecution.parseScript(CpsFlowExecution.java:571)
at org.jenkinsci.plugins.workflow.cps.CpsFlowExecution.start(CpsFlowExecution.java:523)
at org.jenkinsci.plugins.workflow.job.WorkflowRun.run(WorkflowRun.java:337)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Finished: FAILURE
I found this solution and it works, but the main idea was to set defaultValue dynamically directly on the parameter:
stage('Environment') {
    steps {
        script {
            gr = load 'src/ci/script.groovy'
            version = readMavenPom().getParent().getVersion()
            if (!params.imageTag) {
                imageTag = version
            }
            echo("Using imageTag: ${imageTag}")
        }
    }
}

Oracle Java: Error casting java.sql.Blob datatype to Oracle BLOB datatype

I am getting the following error upon executing an Oracle Java procedure that accepts and returns BLOB data:
Error report -
ORA-00932: inconsistent datatypes: expected a return value that is an instance of a user-defined Java class convertible to an Oracle type got an object that could not be converted
ORA-06512: at "", line 86
ORA-06512: at line 7
00932. 00000 - "inconsistent datatypes: expected %s got %s"
*Cause:
*Action:
Java Code
public static java.sql.Blob Convert_Image(java.sql.Blob srcBlob) {
    java.sql.Blob desBlob = null;
    try {
        Document document = new Document();
        ByteArrayOutputStream pdfDocumentOutputStream = new ByteArrayOutputStream();
        PdfWriter pdfDocumentWriter = PdfWriter.getInstance(document, pdfDocumentOutputStream);
        document.open();
        if (document.newPage()) {
            int indentation = 0;
            Image img = Image.getInstance(srcBlob.getBytes(1, (int) srcBlob.length()));
            float scaler = document.getPageSize().getWidth() - document.leftMargin() - document.rightMargin() - indentation;
            img.scalePercent((scaler / img.getWidth()) * 100);
            document.newPage();
            document.add(Image.getInstance(img));
            document.close();
            desBlob = new SerialBlob(pdfDocumentOutputStream.toByteArray());
            pdfDocumentWriter.close();
            pdfDocumentOutputStream.close();
        }
    } catch (Exception e) {
        Show_Message(e);
    }
    return desBlob;
}
Oracle Code
FUNCTION CONVERT_IMAGE(
P_BLOB IN DOCUMENTS.BLOB_CONTENT%TYPE)
RETURN BLOB
AS
LANGUAGE JAVA NAME 'egift.Util.Convert_Image (java.sql.Blob) return java.sql.Blob';
Trigger Implementation
...
DECLARE
v_blob_content DOCUMENTS.BLOB_CONTENT%TYPE;
BEGIN
IF :NEW.BLOB_CONTENT IS NOT NULL AND
(
NVL(:NEW.MIME_TYPE,'#') = 'image/png' OR
NVL(:NEW.MIME_TYPE,'#') = 'image/jpeg' OR
NVL(:NEW.MIME_TYPE,'#') = 'image/gif' OR
NVL(:NEW.MIME_TYPE,'#') = 'image/tiff' OR
NVL(:NEW.MIME_TYPE,'#') = 'image/bmp'
) THEN
v_blob_content := EGIFT_UTIL.CONVERT_IMAGE(:NEW.BLOB_CONTENT);
IF v_blob_content is not null then
:NEW.BLOB_CONTENT := v_blob_content;
:NEW.MIME_TYPE := 'application/pdf';
:NEW.NAME := substr(:NEW.NAME,0,instr(:NEW.NAME,'.',-1)) || 'pdf';
END IF;
END IF;
...
To create a trigger that calls the Java procedure and returns a BLOB, you need to return an instance of oracle.sql.BLOB or oracle.jdbc2.Blob from your Java procedure. Oracle actually has a table documenting the legal data type mappings between its datatypes and the Java classes they accept; Oracle Database converts between the SQL types and Java classes automatically.
Update 1:
I actually tested passing java.sql.Blob and returning the same type in the function and it worked as expected:
CREATE OR REPLACE AND RESOLVE JAVA SOURCE NAMED "Util" as
public class Util {
    public static java.sql.Blob Convert_Image(java.sql.Blob srcBlob) {
        return srcBlob;
    }
}
/
CREATE OR REPLACE FUNCTION CONVERT_IMAGE(
P_BLOB BLOB)
RETURN BLOB
AS
LANGUAGE JAVA NAME 'Util.Convert_Image (java.sql.Blob) return java.sql.Blob';
select utl_raw.cast_to_varchar2(convert_image(utl_raw.cast_to_raw('test'))) from dual;
-- test
Can you try to run the above code and see if you get the same error?
This is a temporary solution that I implemented since I was cutting it close to my deadline. I am still in search of a solution in which I don't have to use deprecated classes; I would like to avoid referencing oracle.sql.BLOB and use java.sql.Blob instead.
The workaround involved creating an oracle.sql.BLOB object instead of a SerialBlob and then populating it with the byte array from the output stream, as follows:
conn = new OracleDriver().defaultConnection();
desBlob = BLOB.createTemporary(conn, false, BLOB.DURATION_SESSION);
desBlob.setBytes(1, pdfDocumentOutputStream.toByteArray());
And I suppressed the deprecation warnings using:
@SuppressWarnings("deprecation")
Final Java Code
@SuppressWarnings("deprecation")
public static java.sql.Blob Convert_Image(java.sql.Blob srcBlob) {
    java.sql.Blob desBlob = null;
    try {
        Document document = new Document();
        ByteArrayOutputStream pdfDocumentOutputStream = new ByteArrayOutputStream();
        PdfWriter pdfDocumentWriter = PdfWriter.getInstance(document, pdfDocumentOutputStream);
        document.open();
        if (document.newPage()) {
            int indentation = 0;
            Image img = Image.getInstance(srcBlob.getBytes(1, (int) srcBlob.length()));
            float scaler = document.getPageSize().getWidth() - document.leftMargin() - document.rightMargin() - indentation;
            img.scalePercent((scaler / img.getWidth()) * 100);
            document.newPage();
            document.add(Image.getInstance(img));
            document.close();
            //desBlob = new SerialBlob(pdfDocumentOutputStream.toByteArray());
            conn = new OracleDriver().defaultConnection();
            desBlob = BLOB.createTemporary(conn, false, BLOB.DURATION_SESSION);
            desBlob.setBytes(1, pdfDocumentOutputStream.toByteArray());
            pdfDocumentWriter.close();
            pdfDocumentOutputStream.close();
        }
    } catch (Exception e) {
        Show_Message(e);
    }
    return desBlob;
}
I had set up a bounty to get a solution that resolves this issue without using a deprecated class, and I was unable to find one, although the question received attention. This remains an open issue for me until I find the right solution. Thanks to all who made an effort.
Regards!
I found this solution without deprecated code:
public static Blob bytes2Blob(byte[] b) throws Exception {
    if (System.getProperty("oracle.server.version") != null) {
        Connection con = DriverManager.getConnection("jdbc:default:connection");
        CallableStatement cStmt = con.prepareCall("{ call DBMS_LOB.createtemporary(?,true,DBMS_LOB.SESSION) }");
        cStmt.registerOutParameter(1, OracleTypes.BLOB);
        cStmt.execute();
        Blob blob = ((OracleCallableStatement) cStmt).getBLOB(1);
        cStmt.close();
        OutputStream out = blob.setBinaryStream(1L);
        out.write(b);
        out.flush();
        return blob;
    } else {
        return new javax.sql.rowset.serial.SerialBlob(b);
    }
}

Getting Spring-XD and the hdfs sink to work for maprfs

This is a question about spring-xd release 1.0.1 working together with maprfs, which is officially not yet supported. Still, I would like to get it to work.
So this is what we did:
1) adjusted the xd-shell, xd-worker, and xd-singlenode shell scripts to accept the parameter --hadoopDistro mapr
2) added libraries to the new directory $XD_HOME/lib/mapr
avro-1.7.4.jar
hadoop-annotations-2.2.0.jar
hadoop-core-1.0.3-mapr-3.0.2.jar
hadoop-distcp-2.2.0.jar
hadoop-hdfs-2.2.0.jar
hadoop-mapreduce-client-core-2.2.0.jar
hadoop-streaming-2.2.0.jar
hadoop-yarn-api-2.2.0.jar
hadoop-yarn-common-2.2.0.jar
jersey-core-1.9.jar
jersey-server-1.9.jar
jetty-util-6.1.26.jar
maprfs-1.0.3-mapr-3.0.2.jar
protobuf-java-2.5.0.jar
spring-data-hadoop-2.0.2.RELEASE-hadoop24.jar
spring-data-hadoop-batch-2.0.2.RELEASE-hadoop24.jar
spring-data-hadoop-core-2.0.2.RELEASE-hadoop24.jar
spring-data-hadoop-store-2.0.2.RELEASE-hadoop24.jar
3) ran bin/xd-singlenode --hadoopDistro mapr and shell/bin/xd-shell --hadoopDistro mapr.
When creating and deploying a stream via stream create foo --definition "time | hdfs" --deploy, data is written to a file /xd/foo/foo-1.txt.tmp on maprfs. Yet when undeploying the stream, the following exception appears:
org.springframework.data.hadoop.store.StoreException: Failed renaming from /xd/foo/foo-1.txt.tmp to /xd/foo/foo-1.txt; nested exception is java.io.FileNotFoundException: Requested file /xd/foo/foo-1.txt does not exist.
at org.springframework.data.hadoop.store.support.OutputStoreObjectSupport.renameFile(OutputStoreObjectSupport.java:261)
at org.springframework.data.hadoop.store.output.TextFileWriter.close(TextFileWriter.java:92)
at org.springframework.xd.integration.hadoop.outbound.HdfsDataStoreMessageHandler.doStop(HdfsDataStoreMessageHandler.java:58)
at org.springframework.xd.integration.hadoop.outbound.HdfsStoreMessageHandler.stop(HdfsStoreMessageHandler.java:94)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:317)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:201)
at com.sun.proxy.$Proxy120.stop(Unknown Source)
at org.springframework.integration.endpoint.EventDrivenConsumer.doStop(EventDrivenConsumer.java:64)
at org.springframework.integration.endpoint.AbstractEndpoint.stop(AbstractEndpoint.java:100)
at org.springframework.integration.endpoint.AbstractEndpoint.stop(AbstractEndpoint.java:115)
at org.springframework.integration.config.ConsumerEndpointFactoryBean.stop(ConsumerEndpointFactoryBean.java:303)
at org.springframework.context.support.DefaultLifecycleProcessor.doStop(DefaultLifecycleProcessor.java:229)
at org.springframework.context.support.DefaultLifecycleProcessor.access$300(DefaultLifecycleProcessor.java:51)
at org.springframework.context.support.DefaultLifecycleProcessor$LifecycleGroup.stop(DefaultLifecycleProcessor.java:363)
at org.springframework.context.support.DefaultLifecycleProcessor.stopBeans(DefaultLifecycleProcessor.java:202)
at org.springframework.context.support.DefaultLifecycleProcessor.stop(DefaultLifecycleProcessor.java:106)
at org.springframework.context.support.AbstractApplicationContext.stop(AbstractApplicationContext.java:1186)
at org.springframework.xd.module.core.SimpleModule.stop(SimpleModule.java:234)
at org.springframework.xd.dirt.module.ModuleDeployer.destroyModule(ModuleDeployer.java:132)
at org.springframework.xd.dirt.module.ModuleDeployer.handleUndeploy(ModuleDeployer.java:111)
at org.springframework.xd.dirt.module.ModuleDeployer.undeploy(ModuleDeployer.java:83)
at org.springframework.xd.dirt.server.ContainerRegistrar.undeployModule(ContainerRegistrar.java:261)
at org.springframework.xd.dirt.server.ContainerRegistrar$StreamModuleWatcher.process(ContainerRegistrar.java:884)
at org.apache.curator.framework.imps.NamespaceWatcher.process(NamespaceWatcher.java:67)
at org.apache.zookeeper.ClientCnxn$EventThread.processEvent(ClientCnxn.java:522)
at org.apache.zookeeper.ClientCnxn$EventThread.run(ClientCnxn.java:498)
Caused by: java.io.FileNotFoundException: Requested file /xd/foo/foo-1.txt does not exist.
at com.mapr.fs.MapRFileSystem.getMapRFileStatus(MapRFileSystem.java:805)
at com.mapr.fs.MapRFileSystem.delete(MapRFileSystem.java:629)
at org.springframework.data.hadoop.store.support.OutputStoreObjectSupport.renameFile(OutputStoreObjectSupport.java:258)
... 29 more
I had a look at the OutputStoreObjectSupport.renameFile() function. When a file on hdfs is finished, this method tries to rename the file /xd/foo/foo-1.txt.tmp to /xd/foo/foo-1.txt. This is the relevant code:
try {
    FileSystem fs = path.getFileSystem(getConfiguration());
    boolean succeed;
    try {
        fs.delete(toPath, false);
        log.info("Renaming path=[" + path + "] toPath=[" + toPath + "]");
        succeed = fs.rename(path, toPath);
    } catch (Exception e) {
        throw new StoreException("Failed renaming from " + path + " to " + toPath, e);
    }
    if (!succeed) {
        throw new StoreException("Failed renaming from " + path + " to " + toPath + " because hdfs returned false");
    }
}
When the target file does not exist on hdfs, maprfs seems to throw an exception when fs.delete(toPath, false) is called. Yet throwing an exception in this case does not make sense. I assume that other FileSystem implementations behave differently, but this is a point I still need to verify. Unfortunately I cannot find the sources for MapRFileSystem.java. Is this closed source? That would help me better understand the issue. Does anybody have experience with writing from spring-xd to maprfs, or renaming files on maprfs with spring-data-hadoop?
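The behavior the rename code needs can be illustrated with plain java.nio (a sketch of the desired semantics, not the spring-data-hadoop code): deleting the destination must tolerate the destination being absent, and only then is the source moved into place.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Sketch of a rename that tolerates a missing target file: delete the
// destination only if it exists, then move the source into place. A
// FileSystem implementation that throws on deleting a non-existent target
// (as maprfs appears to) breaks this pattern.
public class TolerantRename {
    public static void renameReplacing(Path src, Path dst) throws IOException {
        Files.deleteIfExists(dst); // does not throw when dst is absent
        Files.move(src, dst);
    }
}
```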
Edit: I managed to reproduce the issue outside of spring XD with a simple test case (see below). Note that this exception is only thrown if an inWritingSuffix or inWritingPrefix is set; otherwise spring-hadoop does not attempt to rename the file. So this is the still somewhat unsatisfactory workaround for me: refrain from using inWritingPrefixes and inWritingSuffixes.
@ContextConfiguration("context.xml")
@RunWith(SpringJUnit4ClassRunner.class)
public class MaprfsSinkTest {

    @Autowired
    Configuration configuration;

    @Autowired
    FileSystem filesystem;

    @Autowired
    DataStoreWriter<String> storeWriter;

    @Test
    public void testRenameOnMaprfs() throws IOException, InterruptedException {
        Path testPath = new Path("/tmp/foo.txt");
        filesystem.delete(testPath, true);
        TextFileWriter writer = new TextFileWriter(configuration, testPath, null);
        writer.setInWritingSuffix("tmp");
        writer.write("some entity");
        writer.close();
    }

    @Test
    public void testStoreWriter() throws IOException {
        this.storeWriter.write("something");
    }
}
I created a new branch for spring-hadoop which supports maprfs:
https://github.com/blinse/spring-hadoop/tree/origin/2.0.2.RELEASE-mapr
Building this release and using the resulting jar works fine with the hdfs sink.
