I'm creating security rules for a couple of tables which reflect an n:n (many-to-many) relation. Below is the code I use to update the database:
// Prepare the updates for the DB
HashMap<String, Object> dadosUpdate = new HashMap<>();
String idCurrUser = userManager.getCurrentUser().getId();
// Get a new key for new records
if (isNew) {
String newCondominioId = FirebaseDBManager.getNewId("condominio");
condominioSel.setId(newCondominioId);
}
// Usuarios_Condominios
dadosUpdate.put("/usuarios_condominios/" + idCurrUser + "/" + condominioSel.getId() + "/ehAdmin/", true);
dadosUpdate.put("/usuarios_condominios/" + idCurrUser + "/" + condominioSel.getId() + "/ehSindico/", true);
dadosUpdate.put("/usuarios_condominios/" + idCurrUser + "/" + condominioSel.getId() + "/ehColaborador/", true);
// Condominios_Usuarios
dadosUpdate.put("/condominios_usuarios/" + condominioSel.getId() + "/" + idCurrUser + "/ehAdmin/", true);
dadosUpdate.put("/condominios_usuarios/" + condominioSel.getId() + "/" + idCurrUser + "/ehSindico/", true);
dadosUpdate.put("/condominios_usuarios/" + condominioSel.getId() + "/" + idCurrUser + "/ehColaborador/", true);
return fbDB.updateChildren(dadosUpdate);
And here are the rules I've setup at Firebase:
{
"rules": {
"usuarios_condominios": {
".read" : "auth.uid != null",
"$idUsuario": {
".write": "(auth.uid === $idUsuario) || (root.child('usuarios').child(auth.uid).child('ehAdmin').val() == true)"
}
},
"condominios_usuarios": {
".read" : "auth.uid != null",
"$idCondominio": {
".write": "(auth.uid != null && newData.child('condominios_usuarios').child($idCondominio).child(auth.uid).child('ehAdmin').exists())"
}
}
}
}
For some reason I could not find, the last rule is preventing me from saving the data:
"newData.child('condominios_usuarios').child($idCondominio).child(auth.uid).child('ehAdmin').exists()"
Please help me understand what I am doing wrong.
The newData variable contains the value of the node on which the rule is defined, as it will exist after the write operation, if that write succeeds/is permitted.
So you don't need to build a path from the root yourself; newData already points to the current node. I think that means you'll need:
newData.child(auth.uid).child('ehAdmin').exists()
If in the future you want to inspect a node in a different branch, you will have to call parent() to get from the current node up to the root of the tree, and then use child() calls from there.
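For example, here is the failing branch rewritten so the check is relative to the node the rule sits on (a sketch of just that branch, assuming the rest of your rules stay as they are):

```json
{
  "rules": {
    "condominios_usuarios": {
      ".read": "auth.uid != null",
      "$idCondominio": {
        ".write": "auth.uid != null && newData.child(auth.uid).child('ehAdmin').exists()"
      }
    }
  }
}
```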
I have a JSON String:
{
"productName": "Gold",
"offerStartDate": "01152023",
"offerEndDate": "01152024",
"offerAttributes": [
{
"id": "TGH-DAD3KVF3",
"storeid": "STG-67925",
"availability": true
}
],
"offerSpecifications": {
"price": 23.25
}
}
The validation logic for it is written as:
Map<String, Object> map = mapper.readValue(json, Map.class);
String productNameValue = (String)map.get("productName");
if (productNameValue == null || productNameValue.isEmpty()) {
throw new Exception();
}
String offerStartDateValue = (String)map.get("offerStartDate");
if (offerStartDateValue == null || offerStartDateValue.isEmpty()) {
throw new Exception();
}
List<Object> offerAttributesValue = (List)map.get("offerAttributes");
if (offerAttributesValue == null || offerAttributesValue.isEmpty()) {
throw new Exception();
}
Map<String, Object> offerSpecificationsValue = (Map)map.get("offerSpecifications");
if (offerSpecificationsValue == null || offerSpecificationsValue.isEmpty() || offerSpecificationsValue.get("price") == null) {
throw new Exception();
}
This is a portion of the JSON response; the actual response has more than 48 fields, and the existing code has validation implemented for all 48 fields as above. Note that the real responses are far more complex than this.
I feel the validation code is very verbose and repetitive. How do I fix this? What design pattern should I use for the validation logic? I have seen the builder pattern, but I'm not sure how to use it for this scenario.
Build a JSON template for comparison.
ObjectMapper mapper = new ObjectMapper();
JsonNode template = mapper.readTree(
"{" +
" \"productName\": \"\"," +
" \"offerStartDate\": \"\"," +
" \"offerEndDate\": \"\"," +
" \"offerAttributes\": []," +
" \"offerSpecifications\": {" +
" \"price\": 0" +
" }" +
"}");
JsonNode data = mapper.readTree(
"{" +
" \"productName\": \"Gold\"," +
" \"offerStartDate\": \"01152023\"," +
" \"offerEndDate\": \"01152024\"," +
" \"offerAttributes\": [" +
" {" +
" \"id\": \"TGH-DAD3KVF3\"," +
" \"storeid\": \"STG-67925\"," +
" \"availability\": true" +
" }" +
" ]," +
" \"offerSpecifications\": {" +
" \"price\": 23.25" +
" }" +
"}");
validate(template, data);
Here is the recursive function to compare the template and the data.
public void validate(JsonNode template, JsonNode data) throws Exception {
final Iterator<Map.Entry<String, JsonNode>> iterator = template.fields();
while (iterator.hasNext()) {
final Map.Entry<String, JsonNode> entry = iterator.next();
JsonNode dataValue = data.get(entry.getKey());
if (dataValue == null || dataValue.isNull()) {
throw new Exception("Missing " + entry.getKey());
}
if (entry.getValue().getNodeType() != dataValue.getNodeType()) {
throw new Exception("Mismatch data type: " + entry.getKey());
}
switch (entry.getValue().getNodeType()) {
case STRING:
if (dataValue.asText().isEmpty()) {
throw new Exception("Missing " + entry.getKey());
}
break;
case OBJECT:
validate(entry.getValue(), dataValue);
break;
case ARRAY:
if (dataValue.isEmpty()) {
throw new Exception("Array " + entry.getKey() + " must not be empty");
}
break;
}
}
}
Option 1
If you are able to deserialize into a class instead of a map, you can use Bean Validation to do something like this:
class Product {
@NotEmpty String productName;
@NotEmpty String offerStartDate;
@NotEmpty String offerEndDate;
}
You can then write your own @MyCustomValidation for any custom validation.
Option 2
If you must keep the object as a Map, you can extract each of your validations into validator objects to make things extensible/composable/cleaner, roughly like this:
@FunctionalInterface
interface Constraint {
void validate(Object value);
default Constraint and(Constraint next) {
return (value) -> {
this.validate(value);
next.validate(value);
};
}
}
var constraints = new HashMap<String, Constraint>();
constraints.put("productName", new NotEmpty());
constraints.put("offerStartDate", new NotEmpty());
constraints.put("someNumber", new LessThan(10).and(new GreaterThan(5)));
// Then
map.forEach((key, value) -> {
var constraint = constraints.get(key);
if (constraint != null) {
constraint.validate(value);
}
});
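Here is a self-contained sketch of Option 2 that compiles with the JDK alone. NotEmpty and LessThan from above are modeled as small factory methods (illustrative names, not an existing library), and this variant returns error messages instead of throwing, so all failures can be reported at once:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class ConstraintDemo {

    @FunctionalInterface
    interface Constraint {
        // Returns an error message, or null when the value is valid.
        String check(Object value);

        // Chains two constraints; the first failure wins.
        default Constraint and(Constraint next) {
            return value -> {
                String err = this.check(value);
                return err != null ? err : next.check(value);
            };
        }
    }

    static Constraint notEmpty() {
        return v -> (v == null || v.toString().isEmpty()) ? "must not be empty" : null;
    }

    static Constraint lessThan(double max) {
        return v -> ((Number) v).doubleValue() < max ? null : "must be < " + max;
    }

    // Runs every registered constraint against the map and collects the failures.
    static List<String> validate(Map<String, Object> data, Map<String, Constraint> constraints) {
        List<String> errors = new ArrayList<>();
        constraints.forEach((key, constraint) -> {
            String err = constraint.check(data.get(key));
            if (err != null) {
                errors.add(key + " " + err);
            }
        });
        return errors;
    }

    public static void main(String[] args) {
        Map<String, Constraint> constraints = new HashMap<>();
        constraints.put("productName", notEmpty());
        constraints.put("price", notEmpty().and(lessThan(100)));

        Map<String, Object> data = Map.of("productName", "Gold", "price", 23.25);
        System.out.println(validate(data, constraints)); // prints [] -- no violations
    }
}
```

With this shape, adding a 49th field becomes one constraints.put line rather than another if block.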
Hello, you can validate JSON with the following approach.
I created a mini, scaled-down version of yours, but it should hopefully point you in the right direction.
I used the Jackson library, specifically its ObjectMapper.
Here is some documentation to the object mapper - https://www.baeldung.com/jackson-object-mapper-tutorial.
So I created a simple Product class.
The Product class was composed of:
An empty constructor
An all-arguments constructor
Three private properties: name, price, inStock
Getters & setters
Here is my Main.java code.
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.Map;
public class Main {
public static void main(String[] args) throws Exception {
// Mock JSON
String json = """
{
"name" : "",
"price" : 5.99,
"inStock": true
}""";
// Declare & Initialise a new object mapper
ObjectMapper mapper = new ObjectMapper();
// Declare & Initialise a new product object
Product product = null;
try {
// Try reading the values from the json into product object
product = mapper.readValue(json, Product.class);
}
catch (Exception exception) {
System.out.println("Exception");
System.out.println(exception.getMessage());
return;
}
// Map out the object's field names to the field values
Map<String, Object> propertiesOfProductObject = mapper.convertValue(product, Map.class);
// Loop through all the entries in the map. i.e. the fields and their values
for(Map.Entry<String, Object> entry : propertiesOfProductObject.entrySet()){
// If a value is empty throw an exception
// Else print it to the console
if (entry.getValue().toString().isEmpty()) {
throw new Exception("Missing Attribute : " + entry.getKey());
} else {
System.out.println(entry.getKey() + "-->" + entry.getValue());
}
}
}
}
It throws an exception saying the name field is empty.
I have a Swagger 1.2 doc.json and the following Java code, which uses Swagger Parser to extract all the paths from this document. The problem is that the parser does not get all the paths (out of 50 it shows me only 27).
public class Temps {
public static void main (String[]args ) {
int totale=0;
Swagger swagger = new SwaggerParser().read("C:\\Users\\eya\\Desktop\\nodes.json");
Map<String, Path> paths = swagger.getPaths();
for (Map.Entry<String, Path> p : paths.entrySet()) {
Path path = p.getValue();
totale ++;
Map<HttpMethod, Operation> operations = path.getOperationMap();
for (java.util.Map.Entry<HttpMethod, Operation> o : operations.entrySet()) {
System.out.println("===");
System.out.println("PATH:" + p.getKey());
System.out.println("Http method:" + o.getKey());
System.out.println("Summary:" + o.getValue().getSummary());
System.out.println("Parameters number: " + o.getValue().getParameters().size());
for (Parameter parameter : o.getValue().getParameters()) {
System.out.println(" - " + parameter.getName());
}
System.out.println("Responses:");
for (Map.Entry<String, Response> r : o.getValue().getResponses().entrySet()) {
System.out.println(" - " + r.getKey() + ": " + r.getValue().getDescription());
}
}
}
System.out.println(totale);
}
}
Does anyone know what causes this problem?
There are duplicate paths in your API definition, for example:
"path": "api/v2/nodes/{id}",
"description": "Get a node",
...
"path": "api/v2/nodes/{id}",
"description": "Get a virtual folder",
"path": "api/v2/nodes/actions",
"description": "Get actions for the selected node IDs",
...
"path": "api/v2/nodes/actions",
"description": "Get actions for the selected node IDs",
Duplicate paths are not allowed by the Swagger 1.2 Specification:
In the apis array, there MUST be only one API Object per path.
The parser simply ignores the duplicates.
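Since getPaths() returns a Map keyed by path, each duplicate silently overwrites the previous entry, which is exactly why 50 declared paths collapse to 27. The effect can be demonstrated with plain java.util collections (a standalone sketch; the path strings are the ones from the definition above):

```java
import java.util.HashSet;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class DuplicatePaths {

    // Returns the paths that appear more than once in the declared list --
    // exactly the entries a Map keyed by path would silently collapse.
    static Set<String> duplicates(List<String> declaredPaths) {
        Set<String> seen = new HashSet<>();
        Set<String> dupes = new LinkedHashSet<>();
        for (String path : declaredPaths) {
            if (!seen.add(path)) {
                dupes.add(path);
            }
        }
        return dupes;
    }

    public static void main(String[] args) {
        List<String> declared = List.of(
                "api/v2/nodes/{id}",
                "api/v2/nodes/{id}",
                "api/v2/nodes/actions",
                "api/v2/nodes/actions");

        // Four declared entries collapse to two distinct map keys.
        System.out.println("declared=" + declared.size()
                + " distinct=" + new HashSet<>(declared).size()
                + " duplicates=" + duplicates(declared));
    }
}
```

Running the same kind of check over the raw apis array of your doc.json would show which paths the parser dropped.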
I succeeded in deploying my verticles, but I still got this error: Result is already complete: succeeded. I don't understand why; I need some explanation. This is my Deploy class:
public class Deploy {
private static Logger logger = LogManager.getLogger(Deploy.class);
private static void deployAsynchronousVerticalByIndex(Vertx vertx, int indexCurrentDeploy, JsonArray verticalArray, Future<Void> startFuture, JsonObject jsonObjectConfig) {
JsonObject currentVertical = verticalArray.getJsonObject(indexCurrentDeploy);
currentVertical.forEach(entry -> {
logger.debug("Starting deploy of class: " + entry.getKey() + ", With the config: " + entry.getValue() + ".");
DeploymentOptions optionsDeploy = new DeploymentOptions().setConfig(jsonObjectConfig);
ObservableFuture<String> observable = RxHelper.observableFuture();
vertx.deployVerticle(entry.getKey(), optionsDeploy, observable.toHandler());
observable.subscribe(id -> {
logger.info("Class " + entry.getKey() + " deployed.");
if (indexCurrentDeploy + 1 < verticalArray.size()) {
deployAsynchronousVerticalByIndex(vertx, indexCurrentDeploy + 1, verticalArray, startFuture, jsonObjectConfig);
} else {
logger.info("ALL classes are deployed.");
startFuture.complete();
}
}, err -> {
logger.error(err, err);
startFuture.fail(err.getMessage());
});
});
}
public static void deployAsynchronousVertical(Vertx vertx, JsonArray verticalArray, Future<Void> startFuture, JsonObject jsonObjectConfig) {
deployAsynchronousVerticalByIndex(vertx, 0, verticalArray, startFuture, jsonObjectConfig);
}
}
That's because you reuse your future between verticles, and have a race condition there.
Simplest way to fix that would be :
if (!startFuture.isComplete()) {
startFuture.complete();
}
But that actually would only obscure the problem.
Extract your observable out of the loop, so that it listens only once for each verticle.
JsonObject currentVertical = verticalArray.getJsonObject(indexCurrentDeploy);
ObservableFuture<String> observable = RxHelper.observableFuture();
currentVertical.forEach(entry -> {
logger.debug("Starting deploy of class: " + entry.getKey() + ", With the config: " + entry.getValue() + ".");
DeploymentOptions optionsDeploy = new DeploymentOptions().setConfig(jsonObjectConfig);
vertx.deployVerticle(entry.getKey(), optionsDeploy, observable.toHandler());
});
observable.subscribe(id -> {
logger.info("Class " + id + " deployed.");
if (indexCurrentDeploy + 1 < verticalArray.size()) {
deployAsynchronousVerticalByIndex(vertx, indexCurrentDeploy + 1, verticalArray, startFuture, jsonObjectConfig);
} else {
logger.info("ALL classes are deployed.");
if (!startFuture.isComplete()) {
startFuture.complete();
}
}
}, err -> {
logger.error(err, err);
startFuture.fail(err.getMessage());
});
In addition to extracting your observable out of the loop,
it is a good idea to use io.vertx.core.Promise instead of io.vertx.core.Future,
and to use the promise.try* methods.
According to the io.vertx.core.Promise documentation:
The write side of Future is deprecated; use Promise instead.
tryFail is like fail, but returns false when the promise is already
completed instead of throwing an IllegalStateException, and returns
true otherwise.
So, instead of guards like
if (!startFuture.isComplete()) {
startFuture.complete();
}
if (!startFuture.isComplete()) {
startFuture.fail(err.getMessage());
}
use
promise.tryComplete();
promise.tryFail(err.getMessage());
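The try* semantics can be illustrated with the JDK's own CompletableFuture, whose complete() likewise returns false instead of throwing once a value is set (an analogy only, not the Vert.x API):

```java
import java.util.concurrent.CompletableFuture;

public class TryCompleteDemo {
    public static void main(String[] args) {
        CompletableFuture<String> future = new CompletableFuture<>();

        // The first completion wins and returns true.
        boolean first = future.complete("deployed");

        // A second completion is simply ignored and returns false --
        // no IllegalStateException, unlike completing a Vert.x Future twice.
        boolean second = future.complete("duplicate");

        System.out.println(first + " " + second + " " + future.join()); // true false deployed
    }
}
```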
I use the OpenCMIS InMemory repository for testing, but when I create a document I am not allowed to set the versioningState to anything other than VersioningState.NONE.
The created document is somehow not versionable... I used the code from http://chemistry.apache.org/java/examples/example-create-update.html
The test method:
public void test() {
String filename = "test123";
Folder folder = this.session.getRootFolder();
// Create a doc
Map<String, Object> properties = new HashMap<String, Object>();
properties.put(PropertyIds.OBJECT_TYPE_ID, "cmis:document");
properties.put(PropertyIds.NAME, filename);
String docText = "This is a sample document";
byte[] content = docText.getBytes();
InputStream stream = new ByteArrayInputStream(content);
ContentStream contentStream = this.session.getObjectFactory().createContentStream(filename, Long.valueOf(content.length), "text/plain", stream);
Document doc = folder.createDocument(
properties,
contentStream,
VersioningState.MAJOR);
}
The exception I get:
org.apache.chemistry.opencmis.commons.exceptions.CmisConstraintException: The versioning state flag is imcompatible to the type definition.
What am I missing?
I found the reason...
By executing the following code I discovered that the OBJECT_TYPE_ID 'cmis:document' doesn't allow versioning.
Code (Groovy) to view all available OBJECT_TYPE_IDs:
boolean includePropertyDefintions = true;
for (t in session.getTypeDescendants(
null, // start at the top of the tree
-1, // infinite depth recursion
includePropertyDefintions // include prop defs
)) {
printTypes(t, "");
}
static void printTypes(Tree tree, String tab) {
ObjectType objType = tree.getItem();
println(tab + "TYPE:" + objType.getDisplayName() +
" (" + objType.getDescription() + ")");
// Print some of the common attributes for this type
print(tab + " Id:" + objType.getId());
print(" Fileable:" + objType.isFileable());
print(" Queryable:" + objType.isQueryable());
if (objType instanceof DocumentType) {
print(" [DOC Attrs->] Versionable:" +
((DocumentType)objType).isVersionable());
print(" Content:" +
((DocumentType)objType).getContentStreamAllowed());
}
println(""); // end the line
for (t in tree.getChildren()) {
// there are more - call self for next level
printTypes(t, tab + " ");
}
}
This resulted in a list like this:
TYPE:CMIS Folder (Description of CMIS Folder Type) Id:cmis:folder
Fileable:true Queryable:true
TYPE:CMIS Document (Description of CMIS Document Type)
Id:cmis:document Fileable:true Queryable:true [DOC Attrs->]
Versionable:false Content:ALLOWED
TYPE:My Type 1 Level 1 (Description of My Type 1 Level 1 Type)
Id:MyDocType1 Fileable:true Queryable:true [DOC Attrs->]
Versionable:false Content:ALLOWED
TYPE:VersionedType (Description of VersionedType Type)
Id:VersionableType Fileable:true Queryable:true [DOC Attrs->]
Versionable:true Content:ALLOWED
As you can see, the last type (Id: VersionableType) has Versionable:true, and when I use that OBJECT_TYPE_ID instead of 'cmis:document' it does work.
I am trying to fetch page content with PhantomJS. In many examples on the official site (e.g. https://github.com/ariya/phantomjs/blob/master/examples/imagebin.js) the function page.open() is used.
In my script though it does not seem to work. I used reflection to look at all defined methods of the page object:
for ( var prop in page) {
if (typeof page[prop] == 'function') {
log("method in page: " + prop);
}
}
and the open() method did not show up. (close(), render(), etc... did show up)
Also, when I try to execute this script:
// include plugins
var system = require('system');
var fileSystem = require('fs');
var page = require('webpage').create();
// global errorhandler
phantom.onError = function(msg, trace) {
console.log("ERROR!!!!! \n" + msg);
phantom.exit(1);
};
// read json input and remove single outer quotes if set
var jsonin = system.args[1];
if (jsonin.charAt(0) == "'") {
jsonin = jsonin.substr(1, jsonin.length - 2);
}
// make object of json
var data = eval('(' + jsonin + ')');
// optional url
var url = system.args[2];
// transfer file
var dest = system.args[3];
console.log("systemargs[1]: data -> " + data);
console.log("systemargs[2]: url -> " + url);
console.log("systemargs[3]: dest -> " + dest);
openRoot();
/*
* open site
*/
function openRoot() {
page.onConsoleMessage = function(msg) {
console.log('INNER ' + msg);
};
page.open(url, function(status) {
if (status === "success") {
if (loadCount == 0) { // only initial open
console.log("opened successfully.");
page.injectJs("./jquery-1.8.3.min.js");
} else {
console.log("page open error.");
console.log('skip refresh ' + loadCount);
}
} else {
console.log("error opening: " + status);
}
});
}
phantom.exit(0);
it does not execute the open() function; the log does not show any messages from inside the open() callback.
Any advice on what I might do wrong would be greatly appreciated. If there is additional information required, please let me know.
Regards,
Alex
Edit:
The line
console.log(typeof (page.open));
outputs function, which is not what I expected given the previous list of methods I wrote to the log, where open did not show up. Hmm.
After hours of fruitless searching I found the mistake. Stupid me: at the end of the script I call phantom.exit() where I should not.
The working code uses an interval that checks a flag object, in my case content with a member content.isFinished. Once I set that to true, phantom.exit() gets called.
My bad, absolutely my fault.
Working code:
var url = system.args[2];
// transfer file
var dest = system.args[3];
content = new Object();
content.isFinished = false;
console.log("systemargs[1]: data -> " + data);
console.log("systemargs[2]: url -> " + url);
console.log("systemargs[3]: dest -> " + dest);
openRoot();
/*
* open site
*/
function openRoot() {
page.onConsoleMessage = function(msg) {
console.log('INNER ' + msg);
};
page.open(url, function(status) {
if (status === "success") {
if (loadCount == 0) { // only initial open
console.log("opened successfully.");
page.injectJs("./jquery-1.8.3.min.js");
// do stuff
content.isFinished = true;
} else {
console.log("page open error.");
console.log('skip refresh ' + loadCount);
content.isFinished = true;
}
} else {
console.log("error opening: " + status);
}
});
}
/*
* wait for completion
*/
var interval = setInterval(function() {
if (content.isFinished) {
page.close();
var f = fileSystem.open(dest, "w");
f.writeLine(out); // 'out' is assumed to be built in the omitted "do stuff" part above
f.close();
// exit phantom
phantom.exit();
} else {
console.log('not finished - wait.');
}
}, 5000);
Regards,
Alex