how to create a complex object from a Cucumber table? - java

I do not know how to read a table from a .feature file and correctly populate the nested columns
| payInstruction.custodian | and | payInstruction.acctnum |
of the internal class.
I have a table:
| confirmationId | totalNominal | payInstruction.custodian | payInstruction.acctnum |
| 1 | 100.1321 | yyy | yyy |
| 2 | 100.1351 | zzz | zzz |
and I have a class with the following structure:
class Confirmation {
    String confirmationId;
    double totalNominal;
    PayInstruction payInstruction;
}

class PayInstruction {
    String custodian;
    String acctnum;
}
Automatically converting the table to List<Confirmation> fails because Cucumber cannot recognize payInstruction.acctnum and payInstruction.custodian.
Any help?

I know the question is a bit old, but Google drove me here and could do the same for others in the future.
The .feature file, adapted from the question:
Given some confirmations:
| confirmationId | totalNominal |
| 1 | 100.1321 |
| 2 | 100.1351 |
And some pay instructions:
| confirmationId | custodian | acctnum |
| 1 | yyy | yyy |
| 2 | zzz | zzz |
Step implementations:
Map<String, Confirmation> confirmations;

@Given("^some confirmations:$")
public void someConfirmations(List<Confirmation> confirmations) {
    this.confirmations = confirmations.stream()
            .collect(Collectors.toMap(Confirmation::getConfirmationId, Function.identity()));
}

@And("^some pay instructions:$")
public void somePayInstructions(List<PayInstructionTestObject> payInstructions) {
    payInstructions.forEach(pi ->
            this.confirmations.get(pi.getConfirmationId()).addPayInstruction(pi.toPayInstruction())
    );
}
The trick is to create a subclass of PayInstruction in the test folder which holds a confirmation id as a correlation identifier, used to retrieve the correct confirmation. The toPayInstruction method serves as a converter to get rid of the test object.
I hope the Java and feature code almost compiles; I wrote this without actually running it, so slight adaptations might be necessary.
The original business model is untouched by this solution, so it is not broken or tweaked for testing.
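A minimal sketch of the test-only subclass described above; the class name, field access, and converter body are assumptions, not code from the answer:

```java
// The untouched business class from the question.
class PayInstruction {
    String custodian;
    String acctnum;
}

// Test-folder-only subclass: adds confirmationId purely as a correlation
// key so the step can find the matching Confirmation for each table row.
class PayInstructionTestObject extends PayInstruction {
    String confirmationId;

    String getConfirmationId() {
        return confirmationId;
    }

    // Converter that drops the correlation key and yields the plain business object.
    PayInstruction toPayInstruction() {
        PayInstruction pi = new PayInstruction();
        pi.custodian = this.custodian;
        pi.acctnum = this.acctnum;
        return pi;
    }
}
```

Cucumber can populate the flat columns confirmationId, custodian, and acctnum of this test object directly, and the step then folds the result back into the business model.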

My approach would be to supply the Confirmation constructor with four primitives and then create the PayInstruction inside Confirmation's constructor.
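A sketch of this alternative, assuming the table keeps the four flat columns (confirmationId, totalNominal, custodian, acctnum) and Cucumber can feed them to a single constructor:

```java
class PayInstruction {
    String custodian;
    String acctnum;

    PayInstruction(String custodian, String acctnum) {
        this.custodian = custodian;
        this.acctnum = acctnum;
    }
}

class Confirmation {
    String confirmationId;
    double totalNominal;
    PayInstruction payInstruction;

    // The constructor receives the four flat columns and assembles the
    // nested PayInstruction itself.
    Confirmation(String confirmationId, double totalNominal,
                 String custodian, String acctnum) {
        this.confirmationId = confirmationId;
        this.totalNominal = totalNominal;
        this.payInstruction = new PayInstruction(custodian, acctnum);
    }
}
```

The trade-off is that the business class gains a test-oriented constructor, whereas the two-table approach above leaves the model untouched.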


Is there a way of checking for multiple string sequences with String.matches in Java?

I want to look for any of the following sequences: one of * - / + followed by one of * + /.
For example:
4+*2 is something I am looking for.
4+5/2 is not.
if (combinedArray.matches("[-*+/][*+/]"))
{
...code
}
I'd be happy to know what I did wrong.
I basically want it to have the same logic as this:
if (combinedArray.contains("*/") || combinedArray.contains("*+") || combinedArray.contains("**")
        || combinedArray.contains("//") || combinedArray.contains("/+") || combinedArray.contains("/*")
        || combinedArray.contains("+/") || combinedArray.contains("++") || combinedArray.contains("+*")
        || combinedArray.contains("-/") || combinedArray.contains("-+") || combinedArray.contains("-*"))
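The likely culprit is that String.matches requires the regex to match the entire string, not just a substring, so the two-character pattern only succeeds on two-character inputs. A small sketch (class name assumed) wrapping the pattern in .* shows the intended behavior:

```java
public class OperatorPairCheck {
    public static void main(String[] args) {
        // matches() anchors the regex at both ends, so "[-*+/][*+/]" alone
        // only matches strings that ARE exactly such a pair. Wrapping it in
        // .* checks for the pair anywhere in the input; Matcher.find would
        // work as well.
        String pattern = ".*[-*+/][*+/].*";
        System.out.println("4+*2".matches(pattern));  // true  ("+*" is a pair)
        System.out.println("4+5/2".matches(pattern)); // false (no such pair)
    }
}
```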

Filter a Dataset where a column is not a number using Spark Java API 2.2?

I'm new to the Spark Java API. I want to filter my Dataset where a column is not a number. My Dataset ds1 is something like this:
+---------+------------+
|  account|      amount|
+---------+------------+
|   aaaaaa|            |
|   aaaaaa|            |
|   bbbbbb|            |
|   123333|            |
|   555555|            |
|   666666|            |
+---------+------------+
I want to return a Dataset ds2 like this:
+---------+------------+
|  account|      amount|
+---------+------------+
|   123333|            |
|   555555|            |
|   666666|            |
+---------+------------+
I tried this but it doesn't work for me:
ds2 = ds1.select("account").where(ds1.col("account").isNaN());
Can someone please guide me to a sample Spark expression to resolve this?
You can define a udf function to check whether the string in the account column is numeric or not, as:
// uses org.apache.commons.lang3.StringUtils
UDF1<String, Boolean> checkNumeric = new UDF1<String, Boolean>() {
    @Override
    public Boolean call(final String account) throws Exception {
        return StringUtils.isNumeric(account);
    }
};
sqlContext.udf().register("numeric", checkNumeric, DataTypes.BooleanType);
and then use the callUDF function to call it:
df.filter(callUDF("numeric", col("account"))).show();
which should give you
+-------+------+
|account|amount|
+-------+------+
| 123333| |
| 555555| |
| 666666| |
+-------+------+
Just cast and check whether the result is null:
ds1.select("account").where(ds1.col("account").cast("bigint").isNotNull());
One way to do this:
Scala Equivalent:
import scala.util.Try
df.filter(r => Try(r.getString(0).toInt).isSuccess).show()
+-------+------+
|account|amount|
+-------+------+
| 123333| |
| 555555| |
| 666666| |
+-------+------+
Or the same idea with try/catch (note this snippet is still Scala, even though it uses Java-style try/catch):
df.map(r => (r.getString(0), r.getString(1), {
  try { r.getString(0).toInt; true }
  catch { case runtime: RuntimeException => false }
})).filter(_._3 == true).drop("_3").show()
+------+---+
| _1| _2|
+------+---+
|123333| |
|555555| |
|666666| |
+------+---+
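For an actual Java version of the parse-and-catch idea, a minimal sketch might look like this; the class and method names are assumptions, and the same predicate could back a registered UDF or a typed filter:

```java
public class NumericCheck {
    // Parse-and-catch numeric test, mirroring the Scala Try above:
    // the string is numeric exactly when Long.parseLong accepts it.
    static boolean parsesAsLong(String s) {
        try {
            Long.parseLong(s);
            return true;
        } catch (NumberFormatException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(parsesAsLong("123333")); // true
        System.out.println(parsesAsLong("aaaaaa")); // false
    }
}
```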

How to convert a double in scientific format to String using the Spark Java API?

I'm new to the Spark Java API. I want to transform a double in scientific format, for example: 1.7E7 ----> 17000000,00.
My Dataset is:
+---------+------------+
|  account|      amount|
+---------+------------+
|       c1|       1.7E7|
|       c2|       1.5E8|
|       c3|       142.0|
+---------+------------+
I want to transform my Dataset into something like this:
+---------+----------------------+
|  account|                amount|
+---------+----------------------+
|       c1|           17000000,00|
|       c2|          150000000,00|
|       c3|                142,00|
+---------+----------------------+
Can someone guide me to an expression in Spark Java to resolve this?
Thanks in advance.
I think you can do it like this. Don't forget to import org.apache.spark.sql.functions.
Dataset<Row> myDataset = ....
myDataset = myDataset.withColumn("newAmount", col("amount").cast(DataTypes.DoubleType))
.drop(col("amount"))
.withColumnRenamed("newAmount","amount")
.toDF();
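Note that casting to DoubleType alone will not render 1.7E7 as 17000000,00; a formatting step is still needed. One possible approach (an assumption, not part of the answer above) is to format the value in plain Java, e.g. inside a UDF; the core formatting might look like:

```java
import java.util.Locale;

public class AmountFormat {
    // Two decimal places with a comma as the decimal separator, matching
    // the desired output (17000000,00). Locale.GERMANY is used only for
    // the comma; wiring this into a Spark UDF is left out of the sketch.
    static String format(double d) {
        return String.format(Locale.GERMANY, "%.2f", d);
    }

    public static void main(String[] args) {
        System.out.println(format(1.7E7)); // 17000000,00
        System.out.println(format(142.0)); // 142,00
    }
}
```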

Java | Return List<Model> on Oracle DB

I've been reading these Oracle Tutorials but I couldn't find any example of returning a List (as rows) in the Oracle DB.
I've developed a Java function which returns a List<JsonPair>; this is my class:
public class JsonPair {
    public String path;
    public String value;
}
The idea is to obtain something like this (see below) after including my Java function in the Oracle DB:
| Path | Values |
|:------------------|----------:|
| root.obj1.attr1 | Val1 |
| root.obj1.attr2 | Val2 |
| root.obj2.attr1 | Val3 |
| root.obj1.arr1[0] | Val4 |
My function receives a JSON String and returns a List like the one above. The idea is to have something like:
SELECT
    Path,
    Value
FROM
    MyPLSQLFunction_ThatCallsMyJavaFunction(JsonString)

Liferay PermissionChecker returns guest's permissions instead of site member's

I am developing a JSF portlet in Liferay-6.2 and I'm setting up permissions. I have created my default.xml file and I think it is working, since it saves the correct permissions in the database. I also think resourceLocalService.addResources(...) is working, since it saves the correct rows in the database as well; however, I am doing it as Administrator, and I don't know if that has something to do with my problem.
My problem is that when I try to check the permission for a site member, it denies the permission as if the user were a guest.
Here is the significant part of my default.xml
<model-resource>
<model-name>org.lrc.liferay.toolbuilder.model</model-name>
<portlet-ref>
<portlet-name>tool-builder</portlet-name>
</portlet-ref>
<permissions>
<supports>
<action-key>ADD_TOOL_DEF</action-key>
<action-key>ADD_TOOL_INSTANCE</action-key>
</supports>
<site-member-defaults>
<action-key>ADD_TOOL_INSTANCE</action-key>
</site-member-defaults>
<guest-defaults />
<guest-unsupported>
<action-key>ADD_TOOL_DEF</action-key>
<action-key>ADD_TOOL_INSTANCE</action-key>
</guest-unsupported>
</permissions>
</model-resource>
As a result, I suppose it saves these rows in the ResourceAction table:
| resourceActionId | name | actionId | bitwiseValue |
| 2705 | org.lrc.liferay.toolbuilder.model | ADD_TOOL_DEF | 2 |
| 2706 | org.lrc.liferay.toolbuilder.model | ADD_TOOL_INSTANCE | 4 |
| 2707 | org.lrc.liferay.toolbuilder.model | PERMISSIONS | 4 |
When I save my resource I do
User user = userPersistence.findByPrimaryKey(liferayFacesContext.getUserId());
resourceLocalService.addResources(user.getCompanyId(),
toolDefDBE.getGroupId(),
liferayFacesContext.getScopeGroupId(),
"org.lrc.liferay.toolbuilder.model",
toolDef.getToolDefId(), false, true, true);
Which, if I'm not wrong, saves this in the database:
| resourcePermissionId | companyId | name | primKey | roleId | ownerId | actionIds |
| 6101 | 10154 | org.lrc.liferay.toolbuilder.model | 5201 | 10163 | 10158 | 14 |
| 6102 | 10154 | org.lrc.liferay.toolbuilder.model | 5201 | 10170 | 0 | 4 |
According to Role_ table, 10163 is the Owner's roleId and 10170 is the Site member's roleId
Finally, when I want to check the permission, I have this in my requestBean:
public Boolean getHasAddPermission() {
    if (this.hasAddPermission == null) {
        LiferayFacesContext liferayFacesContext = LiferayFacesContext.getInstance();
        long scopeGroupId = liferayFacesContext.getScopeGroupId();
        System.out.println("The scopeGroupId is " + scopeGroupId);
        this.hasAddPermission = liferayFacesContext.getThemeDisplay().getPermissionChecker().hasPermission(
                scopeGroupId, ToolSession.MODEL, toolSession.getToolDef().getToolDefDBEId(), "ADD_TOOL_INSTANCE");
    }
    return this.hasAddPermission;
}
But for a site member it returns false instead of true. Does anyone know what I'm doing wrong?
Thanks a lot!
Solved!! The problem was that the user was not a site member. Liferay distinguishes between logged-in users and site members. To make users site members, you must either assign new users a role of that site by default, or assign it later as the site owner.
