I'm building checkboxes using Scala. I found a nice example, but it's in Java, and I couldn't convert it to Scala.
This is Java code:
Form<StudentFormData> formData = Form.form(StudentFormData.class).fill(studentData);
Scala's play.api.data.Form class doesn't have "form" and "fill" factory methods like Java's play.data.Form. How can I create a Form in Scala?
Here is a function that I use to get data from the Form and create a Location object.
def add = DBAction { implicit rs =>
  val data = LocationForm.form.bindFromRequest.get
  Locations.create(Some(data.venueName), data.lat, data.lon)
  Redirect(routes.LocationController.all)
}
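For reference, here is a minimal sketch of what the form definition and pre-filling might look like in Scala Play. The LocationData case class, field names, and formatter import are assumptions for illustration, not from the original post; note that a play.api.data.Form built from a mapping does offer fill:

```scala
// Sketch: Scala Play form definition and pre-filling, assuming a simple
// LocationData case class (illustrative names, not from the original post).
import play.api.data.Form
import play.api.data.Forms._
import play.api.data.format.Formats._

case class LocationData(venueName: String, lat: Double, lon: Double)

object LocationForm {
  val form: Form[LocationData] = Form(
    mapping(
      "venueName" -> nonEmptyText,
      "lat"       -> of[Double],
      "lon"       -> of[Double]
    )(LocationData.apply)(LocationData.unapply)
  )
}

// The Scala analogue of Java's Form.form(StudentFormData.class).fill(studentData):
val prefilled = LocationForm.form.fill(LocationData("Some Venue", 59.33, 18.06))
```

Calling fill on the mapped form instance plays the role of the static Form.form(...).fill(...) chain in the Java API.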
I have a lot of custom DataFrame transformations in my code.
The first group is simple casting:
dframe = dframe.withColumn("account_number", col("account").cast("decimal(38,0)"));
The second group is UDF transformations:
sparkSession.udf().register("monthExtractor",
    (UDF1<Timestamp, Integer>) s -> s.toLocalDateTime().getMonthValue(),
    DataTypes.IntegerType);
dframe = dframe.withColumn("month", callUDF("monthExtractor", dframe.col("trans_date_t")));
They all work and the code is tested. But my final goal is to build an ML Pipeline out of this code so that I can reuse it. So, is there a way to convert the code above into Transformers?
You can create your own feature transformation (with a UDF or another method) by extending Spark's Transformer and overriding its transform method with your own operation.
The Spark source on GitHub gives some insight into extending Transformer this way, provided you create the necessary wrapper objects.
override def transform(dataset: Dataset[_]): DataFrame = {
  transformSchema(dataset.schema, logging = true)
  val xModel = new feature.XModel()
  val xOp = udf { xModel.transform _ }
  dataset.withColumn($(outputCol), xOp(col($(inputCol))))
}
where xModel and xOp are placeholder abstractions. The model will transform your dataset according to the defined operation.
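As a concrete sketch, the month-extraction UDF from the question could be wrapped as a reusable pipeline stage via Spark ML's UnaryTransformer. The class name and column wiring here are illustrative assumptions, not from the original post:

```scala
import java.sql.Timestamp

import org.apache.spark.ml.UnaryTransformer
import org.apache.spark.ml.util.Identifiable
import org.apache.spark.sql.types.{DataType, IntegerType, TimestampType}

// Illustrative: a Transformer that extracts the month from a timestamp column.
class MonthExtractor(override val uid: String)
    extends UnaryTransformer[Timestamp, Int, MonthExtractor] {

  def this() = this(Identifiable.randomUID("monthExtractor"))

  // The per-row function, equivalent to the original UDF1.
  override protected def createTransformFunc: Timestamp => Int =
    ts => ts.toLocalDateTime.getMonthValue

  override protected def validateInputType(inputType: DataType): Unit =
    require(inputType == TimestampType, s"Input must be TimestampType but got $inputType")

  override protected def outputDataType: DataType = IntegerType
}
```

A stage like new MonthExtractor().setInputCol("trans_date_t").setOutputCol("month") can then sit inside an org.apache.spark.ml.Pipeline next to other stages, which is exactly the reuse the question asks about.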
For my Java application I am trying to use ScalaCheck to write some property-based unit tests. For that purpose I need generators, but all the tutorials I can find use a constructor with parameters to generate objects.
The object I need to generate does not have constructor parameters, and I cannot add such a constructor since it is from an external library.
I now have the following (JwtClaims is from the package org.jose4j.jwt):
def genClaims: Gen[JwtClaims] = {
  val url = arbString
  val username = arbString
  val claims = new JwtClaims()
  claims.setNotBeforeMinutesInThePast(0)
  claims.setExpirationTimeMinutesInTheFuture(60)
  claims.setSubject(username) // error about Gen[String] not matching type String
  claims
}
Any suggestions on how to write my generator? I have zero knowledge of Scala, so please be patient if I've made an 'obvious' mistake :) My expertise is in Java, and testing using ScalaCheck is my first venture into Scala.
You need to return a generator of a claims object, not a claims object itself. The generator is effectively a function that can produce a claims object. I normally do this with a for comprehension (other people prefer flatMap, but I think this reads more clearly).
import org.jose4j.jwt.JwtClaims
import org.scalacheck.Gen
import org.scalacheck.Arbitrary.arbitrary

def genClaims: Gen[JwtClaims] =
  for {
    url      <- arbitrary[String]
    username <- arbitrary[String]
  } yield {
    val claims = new JwtClaims()
    claims.setNotBeforeMinutesInThePast(0)
    claims.setExpirationTimeMinutesInTheFuture(60)
    claims.setSubject(username)
    claims
  }
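As a usage sketch (assuming ScalaCheck and jose4j are on the classpath), the generator composes with forAll like any other generator, and can be sampled directly while exploring:

```scala
import org.scalacheck.Prop.forAll

// Property: every generated claims object has a subject set.
val subjectIsSet = forAll(genClaims) { claims =>
  claims.getSubject != null
}
subjectIsSet.check()

// Pull a single example (None if generation happens to fail):
val example: Option[JwtClaims] = genClaims.sample
```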
I'm trying to read a CSV from HDFS, parse it with Cascading, and then use the resulting tuple stream to form the basis of regular expressions in another tuple stream using RegexParser. As far as I can tell, the only way to do this would be to write a custom Function of my own, and I was wondering whether anybody knew how to use the Java API to do this instead.
Pointers on how to write my own function to do this inside the cascading framework would be welcome, too.
I'm running Cascading 2.5.1
The best resource for this question is the Palo Alto Cascading example tutorial. It's in Java and provides examples of a lot of use cases, including writing custom functions.
https://github.com/Cascading/CoPA/wiki
And yes, writing a function that allows an input regex that references other argument inputs is your best option.
public class SampleFunction extends BaseOperation implements Function
{
    public void operate( FlowProcess flowProcess, FunctionCall functionCall )
    {
        TupleEntry arguments = functionCall.getArguments();
        String regex = arguments.getString( 0 );
        String input = arguments.getString( 1 );
        String parsed = someRegexOperation( regex, input ); // your regex logic here

        Tuple result = new Tuple();
        result.add( parsed );
        functionCall.getOutputCollector().add( result );
    }
}
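To make the class usable in a flow, one would typically also declare its output field in a constructor and wire it in with an Each pipe. This is a sketch; the field and pipe names are illustrative, not from the original post:

```java
// Inside SampleFunction: declare two expected arguments and one result field.
public SampleFunction()
{
    super( 2, new Fields( "parsed" ) );
}

// In the flow assembly: apply the function to the two argument fields,
// keeping all fields in the output.
Fields arguments = new Fields( "regex", "text" );
Pipe parse = new Pipe( "parse" );
parse = new Each( parse, arguments, new SampleFunction(), Fields.ALL );
```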
I'm relying on an old Java API that kinda sucks and loves to throw null pointer exceptions when data is missing. I want to create a subclass that has option type accessors but preserves the old accessors until I decide I need to create safe accessors for them. Is there a good way to create a subclass from a copy of the original object? I'd like to achieve something like the following:
class SafeIssue extends Issue {
def safeMethod: Option[Value] = { //... }
}
val issue = oldapi.getIssue()
val safeIssue = SafeIssue(issue)
//Preserves issue's methods and data if I need them
val unsafeVal = safeIssue.unsafeMethod
val maybeVal = safeIssue.safeMethod
Why not try an implicit conversion instead? This works better with Java APIs that like to create their own objects. So you would
class SafeIssue(issue: Issue) {
def original = issue
def safeFoo = Option(issue.foo)
// ... You must write any of these you need
}
implicit def make_issues_safe(issue: Issue) = new SafeIssue(issue)
Then, as long as you've supplied the method, you can write things like
val yay = Issue.myStaticFactoryMethodThing.safeFoo.map(x => pleaseNoNull(x))
(You can then decide whether you want to carry SafeIssue or Issue around in your code; you can always get the Issue back from a SafeIssue via the exposed original method, or you could make the issue parameter a val.)
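On Scala 2.10+, the same idea can be packaged as an implicit value class, which bundles the wrapper and the conversion into one definition. A sketch, assuming an Issue type with a foo accessor as in the code above:

```scala
// Sketch: an implicit value class gives the same extension-method syntax
// and can avoid allocating a wrapper object in many cases.
implicit class SafeIssueOps(val original: Issue) extends AnyVal {
  def safeFoo = Option(original.foo)
}

// usage: oldapi.getIssue().safeFoo.map(x => pleaseNoNull(x))
```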
I am using the Google Visualization API on the client side and I create a DataTable object. Then I want to pass it to my server and upload it via the Spreadsheet API to a spreadsheet. Probably the best way is to use JSON, so I converted it with the method toJSON() and sent it over POST to my server. I tried to use these 2 classes:
DataTable (JavaScript)
DataTable (Java)
Now I've noticed that these two classes aren't compatible, at least not over JSON. The JavaScript class serializes, for example, to this:
{"cols":[
  {"id":"Col1","label":"","type":"string"},
  {"id":"Col2","label":"","type":"date"}
],
"rows":[
  {"c":[{"v":"a"},{"v":"Date(2010,10,6)"}]},
  {"c":[{"v":"b"},{"v":"Date(2010,10,7)"}]}
]}
But the Java-side DataTable uses different parameter names (I'm serializing it with Gson) and different type values:
cols -> columns
c -> cells
v -> value
type:"string" -> type:"TEXT"
type:"number" -> type:"NUMBER"
And I am afraid that there are even more incompatibilities.
So.. how can I convert the JavaScript DataTable to the Java object DataTable?
I ran into the same problem in reverse. It appears that the DataTable object in the Java Datasource Library is not parallel to the JavaScript DataTable object in the Google Visualization API.
Returning a Java Datasource Library DataTable object requires using the JsonRenderer rather than the default serialization, and it appears to work only when passing from the server to the client. I'm not sure whether it can be done in the other direction.
@WebService
@Path("/tables")
public class DataManager extends GenericManager<db, Long> {

    @Path("/hello/")
    @GET
    @Produces(MediaType.APPLICATION_JSON)
    public DataTable getDataTable() {
        DataTable data = new DataTable();
        ... populate object ...
        return data;
    }
}
However, the Java DataTable object returned by default serialization is not the same thing as the Google Visualization API JavaScript DataTable; you can't pass it to a GVis chart.
Instead, from Java, you use the JsonRenderer class (see this Google Groups email) to convert it to a JSON-like string that omits quotes around attribute names for modest compression.
@Path("/hello/")
@GET
@Produces(MediaType.TEXT_PLAIN)
public String getDataTable() {
    DataTable data = new DataTable();
    CharSequence charSequence = JsonRenderer.renderDataTable(data, true, true);
    return charSequence.toString();
}
That string can be parsed in JavaScript by surrounding it with parentheses, which the object-literal notation in the examples doesn't show (see this Google Groups thread):
jQuery.ajax({
  context: this,
  type: 'GET',
  url: url,
  success: function(data) {
    var args = eval('(' + data + ')'); // add parens around the returned string
    var dataTable = new google.visualization.DataTable(args);
    // ...
  }
});
I don't see a method for going the reverse way to the Java Datasource Library DataTable object.
So, not quite an answer, but you're not alone.
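Given the field-name mapping listed in the question, one workaround sketch (an assumption on my part, not an official API) is to rename the keys client-side before POSTing, leaving harder cases such as the Date(...) cell values aside:

```javascript
// Sketch: rename JavaScript DataTable JSON fields to the Java-side names
// (cols -> columns, c -> cells, v -> value, and upper-cased type names).
// The target names follow the mapping in the question; Date cells and any
// further incompatibilities would still need separate handling.
function toJavaDataTable(jsJson) {
  var js = JSON.parse(jsJson);
  var typeMap = { string: 'TEXT', number: 'NUMBER', date: 'DATE', boolean: 'BOOLEAN' };
  return {
    columns: js.cols.map(function (c) {
      return { id: c.id, label: c.label, type: typeMap[c.type] || c.type };
    }),
    rows: js.rows.map(function (r) {
      return { cells: r.c.map(function (cell) { return { value: cell.v }; }) };
    })
  };
}
```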
Well, I am using Python on the backend and GWT on the frontend, and passing a DataTable from the backend to the frontend works without any problems.
I am using the google-visualization-python api on the backend to create the DataTable.
Parsing is done with following code:
DataTable dataTable = DataTable.create(JSONParser.parseLenient(data).isObject().getJavaScriptObject());
I also convert the parsed DataTable back to JSON to store the JSON string in localStorage, and parsing the stored string works fine as well.
The GWT DataTable is just a thin wrapper that ultimately calls the underlying JavaScript DataTable's functions via JSNI, so I don't see any reason why they should be incompatible.
Also, make sure you use the latest gwt-visualization API (1.1.2).