I am trying to read (and then store to a third-party local DB) certain DICOM object tags "during" an incoming association request.
To accept association requests and store my DICOM files locally, I have used a modified version of the dcmrcv() tool. More specifically, I have overridden the onCStoreRQ method like:
@Override
protected void onCStoreRQ(Association association, int pcid, DicomObject dcmReqObj,
        PDVInputStream dataStream, String transferSyntaxUID,
        DicomObject dcmRspObj)
        throws DicomServiceException, IOException {
    final String classUID = dcmReqObj.getString(Tag.AffectedSOPClassUID);
    final String instanceUID = dcmReqObj.getString(Tag.AffectedSOPInstanceUID);
    config = new GlobalConfig();
    final File associationDir = config.getAssocDirFile();
    final String prefixedFileName = instanceUID;
    final String dicomFileBaseName = prefixedFileName + DICOM_FILE_EXTENSION;
    File dicomFile = new File(associationDir, dicomFileBaseName);
    assert !dicomFile.exists();
    final BasicDicomObject fileMetaDcmObj = new BasicDicomObject();
    fileMetaDcmObj.initFileMetaInformation(classUID, instanceUID, transferSyntaxUID);
    final DicomOutputStream outStream = new DicomOutputStream(
            new BufferedOutputStream(new FileOutputStream(dicomFile), 600000));
    // I would like, somewhere here, to extract some tags from the incoming DICOM object.
    // When I try to do it via dataStream, my DICOM files get corrupted!
    //System.out.println("StudyInstanceUID: " + dataStream.readDataset().getString(Tag.StudyInstanceUID));
    try {
        outStream.writeFileMetaInformation(fileMetaDcmObj);
        dataStream.copyTo(outStream);
    } finally {
        outStream.close();
    }
    dicomFile.renameTo(new File(associationDir, dicomFileBaseName));
    System.out.println("DICOM file name: " + dicomFile.getName());
}
@Override
public void associationAccepted(final AssociationAcceptEvent associationAcceptEvent) {
    ....
}

@Override
public void associationClosed(final AssociationCloseEvent associationCloseEvent) {
    ...
}
Somewhere in this code I would like to hook in a method which will read dataStream, parse specific tags, and store them to a local database.
However, wherever I try to put a piece of code that manipulates (even just reads, for a start) dataStream, my DICOM files get corrupted!
PDVInputStream implements java.io.InputStream...
Even if I just put a
System.out.println("StudyInstanceUID: " + dataStream.readDataset().getString(Tag.StudyInstanceUID));
before copying dataStream to outStream, my DICOM files get corrupted (1 KB in size)...
How am I supposed to use dataStream during a C-STORE request to extract some information?
I hope my question is clear...
The PDVInputStream is probably backed by the PDUDecoder class. You would have to reset the stream position if you want to consume the input stream more than once.
A better solution is probably to read the DICOM object into memory once and use it for both purposes. Something akin to:
DicomObject dcmobj = dataStream.readDataset();
String whatYouWant = dcmobj.getString(Tag.whatever);
dcmobj.initFileMetaInformation(transferSyntaxUID);
outStream.writeDicomFile(dcmobj);
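For completeness, here is a minimal sketch of how that could look inside onCStoreRQ, assuming the dcm4che2 API and the variables from the question (dicomFile, dataStream, transferSyntaxUID); the key point is that the PDV stream is consumed exactly once:

// Minimal sketch (dcm4che2, variables as in the question): read the dataset once,
// extract what you need, then write the complete file from the in-memory object.
DicomObject dcmobj = dataStream.readDataset();             // consumes the PDV stream once
String studyUID = dcmobj.getString(Tag.StudyInstanceUID);  // any tags you want to persist
// ... store studyUID and friends to the local database here ...

dcmobj.initFileMetaInformation(transferSyntaxUID);         // builds the group 0002 header
DicomOutputStream outStream = new DicomOutputStream(
        new BufferedOutputStream(new FileOutputStream(dicomFile)));
try {
    outStream.writeDicomFile(dcmobj);                      // meta header + dataset in one call
} finally {
    outStream.close();
}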
Is there a way to read a serialized object from a .ser file and update or delete one of the objects that have been serialized?
The following is my code, which reads in objects of type Driver:
public boolean checkPassword(String userName, String password, String depot) throws IOException {
    FileInputStream fileIn = new FileInputStream("Drivers.ser");
    ObjectInputStream in = new ObjectInputStream(fileIn);
    try {
        while (true) {
            Driver d = (Driver) in.readObject();
            if (d.userName.equals(userName) && d.password.equals(password) && d.depot.equals(depot)) {
                this.isManager = d.isManager;
                validAccount = true;
            }
        }
    } catch (Exception e) {}
    return validAccount;
}
You will need to read all objects from the original file and then write a new file containing only the objects you want to retain or update.
The Java serialized object stream format is not an archive file format like ZIP, JAR, TAR and so on. It is just a sequence of serialized objects. There is no "index" that would facilitate updating or deleting objects.
This is one reason why serialized objects are not a good way to implement data persistence. This is what databases are designed for.
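A minimal sketch of that rewrite approach, assuming Driver is Serializable and identified by its userName field (the helper name and filter are hypothetical):

public static void deleteDriver(String userName) throws IOException, ClassNotFoundException {
    List<Driver> drivers = new ArrayList<>();
    try (ObjectInputStream in = new ObjectInputStream(new FileInputStream("Drivers.ser"))) {
        while (true) {                        // readObject() throws EOFException at end of stream
            drivers.add((Driver) in.readObject());
        }
    } catch (EOFException endOfStream) {
        // reached the end: every object has been read
    }
    drivers.removeIf(d -> d.userName.equals(userName));  // delete (or mutate entries to update)
    try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream("Drivers.ser"))) {
        for (Driver d : drivers) {
            out.writeObject(d);               // rewrite only the retained objects
        }
    }
}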
I have the following method, with the simple aim of storing the contents of a given MultipartFile instance under a specified directory:
private void saveOnDisk(final String clientProductId, final MultipartFile image,
        final String parentDirectoryPath, final String fileSeparator) throws IOException {
    final File imageFile = new File(parentDirectoryPath + fileSeparator + clientProductId + image.getOriginalFilename());
    image.transferTo(imageFile);
    OutputStream out = new FileOutputStream(imageFile);
    out. //... ? How do we proceed? OutputStream::write() requires a byte array or int as parameter
}
For what it might be worth, the MultipartFile instance is going to contain an image file which I receive on a REST API I'm building.
I've checked some SO posts such as this and this, but this problem is not quite touched on: I'm effectively looking to create an entirely new image file and store it at a specified location on disk. The write() method of OutputStream, given that it requires byte[] or int params, doesn't seem to fit my use case. Any ideas?
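A minimal sketch, under the assumption that this is Spring's MultipartFile: transferTo() already writes the upload to disk, so the extra FileOutputStream is unnecessary. Alternatively, copy the stream yourself:

private void saveOnDisk(final String clientProductId, final MultipartFile image,
        final String parentDirectoryPath, final String fileSeparator) throws IOException {
    final File imageFile = new File(parentDirectoryPath + fileSeparator
            + clientProductId + image.getOriginalFilename());
    // Option 1: let the framework write the file for you
    image.transferTo(imageFile);

    // Option 2 (equivalent): copy the upload's bytes explicitly
    // try (InputStream in = image.getInputStream()) {
    //     Files.copy(in, imageFile.toPath(), StandardCopyOption.REPLACE_EXISTING);
    // }
}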
I was given an assignment where we are not allowed to use a DB or libraries, only text files for data storage.
But it has rather complex requirements, e.g. many validations; because of those, we need to "access the db" (i.e. read the text file) many times.
My question is: should I create a class like this?
class SomeRepository {
    static ArrayList<Users> users = new ArrayList<>();

    public SomeRepository() {
        // instantiate this class on program load;
        // in the constructor, read the text file, then instantiate and store everything inside the ArrayList
    }

    // public Users getOneUser() { /* for get methods, we don't read from the text file at all */ }

    // public void save() { /* text-file saving code over here */ }
}
Is this a good approach to the above problem? Currently, we read and write the text file every time we want to retrieve some data or write something new.
Wouldn't the in-memory approach be too expensive in terms of heap memory? Or should I just read/write the text file in every method?
public class IOManager {
    public static void writeObjToTxtFile(String fileName, Object object) {
        File file = new File(fileName + ".txt"); // file is created in the directory the program runs from
        try (FileOutputStream fos = new FileOutputStream(file);
             ObjectOutputStream oos = new ObjectOutputStream(fos)) {
            oos.writeObject(object);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static Object readObjFromTxtFile(String fileName) {
        Object obj = null;
        File file = new File(fileName + ".txt");
        // try-with-resources, so the streams are closed even if reading fails
        try (FileInputStream fis = new FileInputStream(file);
             ObjectInputStream ois = new ObjectInputStream(fis)) {
            obj = ois.readObject();
        } catch (ClassNotFoundException | IOException e) {
            e.printStackTrace();
        }
        return obj;
    }
}
Add this class to your project. Since it works on plain Objects, you can pass and receive things like an ArrayList<Users> as well. Tinker with it to fit whatever your specific purpose is. Hint: you can write other custom methods that call these methods, e.g.:
public static void writeUsersToFile(ArrayList<Users> usersArrayList) {
    writeObjToTxtFile("users", usersArrayList);
}
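And, by the same pattern, a matching read helper (hypothetical, mirroring writeUsersToFile):

@SuppressWarnings("unchecked")
public static ArrayList<Users> readUsersFromFile() {
    // cast back from Object; returns null if the file doesn't exist yet
    return (ArrayList<Users>) readObjFromTxtFile("users");
}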
P.S. Make sure your objects implement Serializable, e.g.:
public class Users implements Serializable {
}
I would suggest reading the contents of your file into a dynamic list such as an ArrayList at the start of your program. Make the required queries/changes to your ArrayList, and then write that ArrayList to your file when the program is about to close. This will save significant time over repeated file reads/writes.
This isn't without its drawbacks, though. You don't want to hog memory in the case of very large files, but considering this is an assignment, that may not be an issue. Additionally, should your program terminate before the final write, all changes made to your database during the current execution will be lost.
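A minimal sketch of that load-once/save-on-exit pattern, reusing the IOManager helpers from the other answer (the UserRepository class and the shutdown hook are hypothetical):

public class UserRepository {
    private final ArrayList<Users> users;

    @SuppressWarnings("unchecked")
    public UserRepository() {
        Object stored = IOManager.readObjFromTxtFile("users");
        users = (stored != null) ? (ArrayList<Users>) stored : new ArrayList<Users>();
        // flush the in-memory list back to disk on normal JVM shutdown
        Runtime.getRuntime().addShutdownHook(
                new Thread(() -> IOManager.writeObjToTxtFile("users", users)));
    }

    public ArrayList<Users> getUsers() {
        return users; // queries hit memory, not the disk
    }
}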
I have a servlet which is responsible for enabling a user to update a reports table and upload a report at the same time. I have written code that lets a user upload a document and also update the table with other details, e.g. date submitted.
However, the user will not always have a document to upload. In that case it should still be possible to edit a report's details and come back later to upload the file, i.e. the user can submit the form without selecting a file and it still updates the table.
This part is what is not working. If a user selects a file and makes some changes, the code works. If a user doesn't select a file and tries to submit the form, it redirects to my servlet but the response is blank: no stack trace, no error thrown.
Below is part of the code I have in my servlet:
if(param.equals("updateschedule"))
{
String[] allowedextensions = {"pdf","xlsx","xls","doc","docx","jpeg","jpg","msg"};
final String path = request.getParameter("uploadlocation_hidden");
final Part filepart=request.getPart("uploadreport_file");
int repid = Integer.parseInt(request.getParameter("repid_hidden"));
int reptype = Integer.parseInt(request.getParameter("reporttype_select"));
String webdocpath = request.getParameter("doclocation_hidden");
String subperiod = request.getParameter("submitperiod_select");
String duedate = request.getParameter("reportduedate_textfield");
String repname = request.getParameter("reportname_textfield");
String repdesc = request.getParameter("reportdesc_textarea");
String repinstr = request.getParameter("reportinst_textarea");
int repsubmitted = Integer.parseInt(request.getParameter("repsubmitted_select"));
String datesubmitted = request.getParameter("reportsubmitdate_textfield");
final String filename = getFileName(filepart);
OutputStream out = null;
InputStream filecontent=null;
String extension = filename.substring(filename.lastIndexOf(".") + 1, filename.length());
if(Arrays.asList(allowedextensions).contains(extension))
{
try
{
out=new FileOutputStream(new File(path+File.separator+filename));
filecontent = filepart.getInputStream();
int read=0;
final byte[] bytes = new byte[1024];
while((read=filecontent.read(bytes))!=-1)
{
out.write(bytes,0,read);
}
String fulldocpath = webdocpath+"/"+filename;
boolean succ = icreditdao.updatereportschedule(repid, reptype, subperiod, repname, repsubmitted,datesubmitted, duedate,fulldocpath, repdesc, repinstr);
if(succ==true)
{
response.sendRedirect("/webapp/Pages/Secured/ReportingSchedule.jsp?msg=Report Schedule updated successfully");
}
}
catch(Exception ex)
{
throw new ServletException(ex);
}
}
I'm still teaching myself Java EE, so any help will be appreciated; I'm also open to other alternatives. I have thought of using jQuery to detect whether a file has been selected and then run a different branch of code, e.g.
if (param.equals("updatewithnofileselected")) {
    // update code here
}
but I think there must be a better solution. Using JDK 6 and Servlet 3.0.
Try this one:
MultipartParser parser = new MultipartParser(request, 500000000, false, false, "UTF-8");
Part part;
while ((part = parser.readNextPart()) != null) {
    if (part.isParam()) {
        // regular form fields arrive here
    } else if (part.isFile()) {
        if (part.getName().equals("updatewithnofileselected")) {
            // update code here
        } else if (part.getName().equals("updateschedule")) {
            // update schedule
        }
    }
}
I used this when working with a multipart form and it works fine. (Note that this Part comes from the multipart-parser library, not javax.servlet.http.Part.)
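Alternatively, staying with the Servlet 3.0 API the question already uses, a hedged sketch: treat the file part as optional and only run the upload branch when a filename is actually present (hasUpload is a hypothetical flag; the other names are from the question):

final Part filepart = request.getPart("uploadreport_file");
final String filename = (filepart != null) ? getFileName(filepart) : null;
boolean hasUpload = (filename != null && !filename.isEmpty());

String fulldocpath = webdocpath; // keep the previous document path when nothing was uploaded
if (hasUpload) {
    // ... existing extension check and stream-copy code goes here ...
    fulldocpath = webdocpath + "/" + filename;
}
boolean succ = icreditdao.updatereportschedule(repid, reptype, subperiod, repname,
        repsubmitted, datesubmitted, duedate, fulldocpath, repdesc, repinstr);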
In MapReduce I would extract the input file name as follows:
public void map(WritableComparable<Text> key, Text value, OutputCollector<Text, Text> output, Reporter reporter)
        throws IOException {
    FileSplit fileSplit = (FileSplit) reporter.getInputSplit();
    String filename = fileSplit.getPath().getName();
    System.out.println("File name " + filename);
    System.out.println("Directory and file name " + fileSplit.getPath().toString());
    process(key, value);
}
How can I do the same with Cascading?
Pipe assembly = new Pipe(SomeFlowFactory.class.getSimpleName());
Function<Object> parseFunc = new SomeParseFunction();
assembly = new Each(assembly, new Fields(LINE), parseFunc);
...
public class SomeParseFunction extends BaseOperation<Object> implements Function<Object> {
    ...
    @Override
    public void operate(FlowProcess flowProcess, FunctionCall<Object> functionCall) {
        // how can I get the input file name here ???
    }
}
Thanks,
I don't use Cascading, but I think it should be sufficient to access the context instance, using functionCall.getContext(); to obtain the filename you can use:
String filename = ((FileSplit) context.getInputSplit()).getPath().getName();
However, it seems that Cascading uses the old API; if the above doesn't work, you must try:
Object name = flowProcess.getProperty("map.input.file");
Thanks to Engineiro for sharing the answer. However, when invoking the hfp.getReporter().getInputSplit() method, I got a MultiInputSplit type which can't be cast to FileSplit directly in Cascading 2.5.3. After diving into the related Cascading APIs, I found a way to retrieve the input file names successfully. I would therefore like to share this to supplement Engineiro's answer. Please see the following code.
HadoopFlowProcess hfp = (HadoopFlowProcess) flowProcess;
MultiInputSplit mis = (MultiInputSplit) hfp.getReporter().getInputSplit();
FileSplit fs = (FileSplit) mis.getWrappedInputSplit();
String fileName = fs.getPath().getName();
You would do this by getting the reporter within the buffer class, from the flowProcess argument provided in the buffer's operate call.
HadoopFlowProcess hfp = (HadoopFlowProcess) flowProcess;
FileSplit fileSplit = (FileSplit) hfp.getReporter().getInputSplit();
// ... the rest of your code ...
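Putting the pieces together inside the question's operate() method, a minimal sketch (assuming Cascading 2.5.x running on Hadoop, as in the answers above):

@Override
public void operate(FlowProcess flowProcess, FunctionCall<Object> functionCall) {
    HadoopFlowProcess hfp = (HadoopFlowProcess) flowProcess;
    MultiInputSplit mis = (MultiInputSplit) hfp.getReporter().getInputSplit();
    FileSplit fs = (FileSplit) mis.getWrappedInputSplit();
    String fileName = fs.getPath().getName();   // the source file for the current tuple
    // ... parse functionCall.getArguments() and emit results, now that fileName is known ...
}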