REST API to represent a file system - Java

I am designing REST APIs that represent a file system.
The file system supports 3 functions:
mkdir(path)
createFile(path, content) -> create if it does not exist, replace if it does.
readFile(path)
Here is the REST API I am thinking of designing. What do you think of it?
1. mkdir
POST v1/file-system/directories
BODY {
"path" : "???"
}
RESPONSE
{
"id" : "",
"path" "",
"files": [...] // this will contain info on files or directories under this directory
}
2. createFile
PUT v1/file-system/files
BODY {
"path" : "???"
"content": ""
}
RESPONSE
{
"id" : "",
"content": ""
"path" ""
}
3. readFile
GET v1/file-system/files/{file-path} or
GET v1/file-system/files?file-path={file-path}
RESPONSE
{
"id" : "",
"content": ""
"path" ""
}
Can you tell me whether these APIs are a correct representation of these functions?
A few questions:
For the GET API, should I specify the path as a path variable or a query param? If a path variable, how will the backend differentiate between the URL path and the file path?
e.g. v1/file-system/files/a/b/c.txt
Since createFile can either create a file or replace the content of an existing file, is it safe to use PUT?
For POST and PUT, do we specify the path as a path variable?

You have to understand that each request method (GET, POST, PUT...) has its own convention, but technically they do not differ much from each other.
For example, you could use POST to update something instead of PATCH, and so on.
At the end of the day, these methods receive data in the body of the request and do something with it (or not).
Regarding your questions:
I would avoid sending the path as a query param. Send your data through the request body. That way you have JSON and you don't have to care about specific encoding and character escaping for the URL.
Again, it is safe, since the methods differ only by convention. We mostly use POST to create new data and PUT to create data, or to replace it if it already exists. Check this for more info.
Again, avoid putting paths as query params. Put them into the body as JSON.
Read this article to learn more about HTTP methods.
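To make the body-based design concrete, here is a minimal sketch of the createFile endpoint, assuming Spring Boot (the controller and DTO names here are hypothetical). The point is that the file path travels in the JSON body, so the URL needs no escaping and PUT stays idempotent:

import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;

@RestController
@RequestMapping("/v1/file-system")
public class FileSystemController {

    // DTO matching the JSON body from the question.
    public static class FileRequest {
        public String path;
        public String content;
    }

    // PUT is idempotent: repeating the same request leaves the same state,
    // which matches "create if not exist and replace if exist".
    @PutMapping("/files")
    public ResponseEntity<FileRequest> createFile(@RequestBody FileRequest req) {
        // ... write req.content to req.path in the backing store ...
        return ResponseEntity.ok(req);
    }
}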


ExecuteStreamCommand won't read foreign characters

We have Apache Nifi set up to write files to a local drive, then run a program that processes these files and outputs its response to a "response" attribute. This is a JSON string that we then deliver to an API to update records.
However, the issue is that while we can successfully write, read and process the files, Nifi fails to understand non-English characters in the response text. This leads to names being corrupted when we send back the response. This only applies to the JSON string we receive from the program.
Nifi is running in a Windows 10 environment. When we run the program manually using the files output by Nifi, we get correct output. The issue only happens in Nifi.
To give an example, the input JSON is:
{
"player" : "mörkö",
"target" : "goal",
"didhitin" : ""
}
This is stored in our program's work folder and we call the program using ExecuteStreamCommand, giving our input JSON file as the parameter. The JSON is processed and our program outputs the following JSON, which is then stored into the response attribute of the flowfile:
{
"player" : "mörkö",
"target" : "goal",
"didhitin" : "true"
}
However, the issue is that when this is read by Nifi into the response attribute of the flowfile, it becomes:
{
"player" : "m¤rk¤",
"target" : "goal",
"didhitin" : "true"
}
(Not the actual process, but close enough to demonstrate the issue)
Which, when we feed it into the API, would either fail or corrupt the name in the original (in this case, the value of player). Neither of which is a desirable output.
So far, we have figured out that this is most likely an encoding issue, but we have not found a way to change the encoding of Nifi to fix the incorrectly read characters.
Managed to fix this issue by adding the following line to the start of the program:
Console.OutputEncoding = Encoding.UTF8;
This effectively forces the program to output UTF-8, which is in line with the rest of the flow.
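That fix is C# (the external program is apparently a .NET console application). If the program had been written in Java instead, a comparable fix, offered here only as an analogue under that assumption, would be to force standard output to UTF-8 before printing (Java 10+ for this PrintStream constructor):

import java.io.PrintStream;
import java.nio.charset.StandardCharsets;

public class Utf8Out {
    public static void main(String[] args) {
        // Force stdout to UTF-8 regardless of the platform default;
        // on Windows the default is often a legacy codepage such as windows-1252.
        System.setOut(new PrintStream(System.out, true, StandardCharsets.UTF_8));
        System.out.println("{ \"player\" : \"mörkö\" }");
    }
}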

How to fetch sub JSON from a request parameter

I just decided to send the JSON data below from client to server. Then I found that all my previous requests were not of type JSON, and I am unable to send JSON. Below is the JSON I want to send in the data of a jQuery ajax call.
data:{
id:"10",
sampleArr:[
{ id:"hello","sample":"hello"},
{ id:"hello1","sample":"hello1"}
]
}
and at the server I get the parameters below:
id=10
group[0][id]=hello
group[0][sample]=hello
group[1][id]=hello1
group[1][sample]=hello1
so I am confused about how to fetch all the groups.
One problem is that what you are sending is not valid JSON.
{ "data" : {
"id" : "10",
"sampleArr": [
{ "id" : "hello", "sample" : "hello"},
{ "id" : "hello1", "sample" : "hello1"}
]
}
}
Notice that all attribute names must be quoted, and the top-level JSON object must have curly brackets around it.
If that doesn't help, you need to explain how your servlet is receiving and parsing the JSON.
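In case it helps, here is a minimal sketch of the servlet side, assuming the Gson library (the servlet name is hypothetical). It reads the raw request body and walks the nested array:

import com.google.gson.Gson;
import com.google.gson.JsonArray;
import com.google.gson.JsonElement;
import com.google.gson.JsonObject;
import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class SampleServlet extends HttpServlet {
    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        // Parse the whole request body as JSON. This only works if the client
        // actually sends JSON (contentType: "application/json" plus
        // JSON.stringify(...) in the ajax call) instead of form-encoded data,
        // which is what produces the group[0][id]=hello style parameters.
        JsonObject root = new Gson().fromJson(req.getReader(), JsonObject.class);
        JsonArray sampleArr = root.getAsJsonObject("data").getAsJsonArray("sampleArr");
        for (JsonElement element : sampleArr) {
            JsonObject item = element.getAsJsonObject();
            String id = item.get("id").getAsString();
            String sample = item.get("sample").getAsString();
            // ... use id and sample ...
        }
    }
}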
#BigMike, thanks, I am able to fetch the complete JSON and play around with it. I was unable to send JSON even after setting the content type to application/json. Still checking why, but this works as a temporary fix (it might be that I am not using a REST API).

Cloud Dataflow: Reading an entire JSON array file from Cloud Storage and creating a PCollection of JSON objects

I have a JSON array file with the content below:
[ {
"MemberId" : "1234",
"Date" : "2017-07-03",
"Interactions" : [ {
"Number" : "1327",
"DwellTime" : "00:03:05"
} ]
}, {
"MemberId" : "5678",
"Date" : "2017-07-03",
"Interactions" : [ {
"Number" : "1172",
"DwellTime" : "00:01:26"
} ]
} ]
I wanted to create a PCollection of Java objects, one mapped to each JSON object present in the JSON array.
JSON formatted like this (records spread over multiple lines instead of one per line) is hard for a data processing tool like Beam/Dataflow to process in parallel: from a random point in the file, you cannot be sure where the next record begins. You can do it by reading from the beginning of the file, but then you're not really reading in parallel.
If it's possible, reformatting the file so that it's one record per line would let you use something like TextIO to read it in.
If not, you'll need to read the file in one go.
I would suggest a couple of possible approaches:
Write a ParDo that reads from the file using the GCS API
This is pretty straightforward. You'll do all the reading in one ParDo, and you'll need to implement the connection code inside that ParDo. Inside the ParDo you would write the same code you would write if you were reading the file in a normal Java program. The ParDo will emit each Java object as a record; a sketch follows below.
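A hedged sketch of that DoFn, assuming the Beam Java SDK and Gson (ReadJsonArrayFn is a hypothetical name, and the pipeline is assumed to feed it the file path, e.g. via Create.of("gs://bucket/file.json")):

import com.google.gson.Gson;
import com.google.gson.JsonArray;
import com.google.gson.JsonElement;
import java.io.Reader;
import java.nio.channels.Channels;
import java.nio.channels.ReadableByteChannel;
import java.nio.charset.StandardCharsets;
import org.apache.beam.sdk.io.FileSystems;
import org.apache.beam.sdk.io.fs.MatchResult;
import org.apache.beam.sdk.transforms.DoFn;

public class ReadJsonArrayFn extends DoFn<String, String> {
    @ProcessElement
    public void processElement(ProcessContext c) throws Exception {
        // The input element is the file path; FileSystems understands "gs://..." specs.
        MatchResult.Metadata metadata = FileSystems.matchSingleFileSpec(c.element());
        try (ReadableByteChannel channel = FileSystems.open(metadata.resourceId());
             Reader reader = Channels.newReader(channel, StandardCharsets.UTF_8.name())) {
            // Parse the entire array in one go, then emit one record per element.
            JsonArray array = new Gson().fromJson(reader, JsonArray.class);
            for (JsonElement record : array) {
                c.output(record.toString());
            }
        }
    }
}

You would apply it with ParDo.of(new ReadJsonArrayFn()) and map each emitted JSON string to your POJO in a following step.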
Implement a FileBasedSource
File-based sources will work: when the fileOrPatternSpec is "gs://..." it knows how to read from GCS. You'll need to make sure to set fileMetadata.isReadSeekEfficient to false so that it won't try to split the file. I haven't tried it, but I believe the correct way to do that is to set it inside the single-file constructor of your FileBasedSource (i.e., your class's override of FileBasedSource(Metadata, long, long)).
TextSource/XmlSource (and their accompanying wrappers TextIO/XmlIO) are examples of this, except that they try to implement splitting; yours will be much simpler since it won't.

Autodesk Forge "Failed to trigger translation for this file"

I am trying to use the Autodesk Forge viewer tutorial:
https://developer.autodesk.com/en/docs/model-derivative/v2/tutorials/prepare-file-for-viewer/
I have successfully uploaded and downloaded a dwg file.
On the step where I convert it to SVF, it never seems to process, and it fails with:
{"input":{"urn":"Safe Base64 encoded value of the output of the upload result"},"output":{"formats":[{"type":"svf","views":["2d","3d"]}]}}
HTTP/1.1 400 Bad Request
Result{"diagnostic":"Failed to trigger translation for this file."}
First question: do I need to remove the "urn:" prefix before Base64 encoding?
Second: is there any more verbose error result that I can see?
Note I have also tried with a rvt file and tried with "type":"thumbnail"; nothing seems to work.
I feel my encoded URN is incorrect, but I am not sure why it would be.
On the tutorial page they seem to have a much longer, raw URN; I am not sure if I should be appending something else to it before encoding. They have a version and some other number.
from tutorial
raw
"urn:adsk.a360betadev:fs.file:business.lmvtest.DS5a730QTbf1122d07 51814909a776d191611?version=12"
mine
raw
"urn:adsk.objects:os.object:gregbimbucket/XXX"
EDIT:
This is what I get back from the upload of a dwg file:
HTTP/1.1 200 OK
Result{
"bucketKey" : "gregbimbucket",
"objectId" : "urn:adsk.objects:os.object:gregbimbucket/XXX",
"objectKey" : "XXX",
"sha1" : "xxxx",
"size" : 57544,
"contentType" : "application/octet-stream",
"location" : "https://developer.api.autodesk.com/oss/v2/buckets/gregbimbucket/objects/XXX"
}
This is what I send to convert the file:
{"input":{"urn":"dXJuOmFkc2sub2JqZWN0czpvcy5vYmplY3Q6Z3JlZ2JpbWJ1Y2tldC9YWFg"},"output":{"formats":[{"type":"svf","views":["2d","3d"]}]}}
This is the error I get back:
HTTP/1.1 400 Bad Request
Result{"diagnostic":"Failed to trigger translation for this file."}
EDIT 2: SOLUTION
It looks like the objectId returned when uploading a file has to include the file extension, rather than ending in a GUID or random set of characters, so that Forge knows what file type it is and can convert it.
"objectId" : "urn:adsk.objects:os.object:gregbimbucket/Floor_sm.dwg",
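On the Base64 question: judging by the encoded value in the question, the full objectId, including the "urn:" prefix, is what gets encoded, using URL-safe Base64 without padding. A minimal sketch in plain Java:

import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class UrnEncoder {
    public static void main(String[] args) {
        String objectId = "urn:adsk.objects:os.object:gregbimbucket/Floor_sm.dwg";
        // URL-safe Base64 without padding, keeping the "urn:" prefix;
        // this reproduces the style of encoded URN shown in the question.
        String safeUrn = Base64.getUrlEncoder().withoutPadding()
                .encodeToString(objectId.getBytes(StandardCharsets.UTF_8));
        System.out.println(safeUrn);
    }
}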

How to capture response data in JMeter and do parameterisation

I'm testing a standalone Java application using JMeter, and I'm using an OS Process Sampler to execute the jar file. I've given a JSON string in the Java program and I've used a publish method to post the JSON string to a queue. Now I want to capture the data in the JMeter response data, parameterise those fields, and use CSV Data Set Config to pass the values to the fields so that each user takes unique data.
The JSON string looks like the one below, and I want to parameterise these values from the JMeter response data and pass in values from a CSV file.
[{
"id": 100001,
"status": "Pending",
"priorityCode": "1",
"miniTeamId": "234256",
"requestName": "General"
}]
Given your CSV file looks like:
100001,Pending,1,234256,General
100002,Done,2,234257,General
etc.
Configure your CSV Data Set config as follows:
Filename: path to your .csv file (a full path is better)
Variable Names: id,status,priorityCode,miniTeamId,requestName
other fields can be left as they are. By default JMeter reads the next line from the CSV file for each thread on each loop; when the end of the file is reached, JMeter will start over.
JMeter Variables populated by the CSV Data Set Config can be referred to as ${variableName} or ${__V(variableName)}, so the parametrised request body should look like:
[
{
"id": ${id},
"status": "${status}",
"priorityCode": "${priorityCode}",
"miniTeamId": "${miniTeamId}",
"requestName": "${requestName}"
}
]
See Using CSV DATA SET CONFIG for more detailed information on parametrisation of JMeter tests using CSV files.
Also be aware of the following test elements:
__CSVRead() function - if you need to get data from multiple CSV files
JSON Path Extractor - if you need to get values from response JSON (see the example below)
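For instance, assuming the response body is the JSON array shown in the question, a JSON Path expression for the first record's id would be:

$[0].id

which would extract 100001 into a JMeter variable that you can then assert on or reuse.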
UPD: passing parametrized JSON string via OS Process Sampler
