I have the following two classes, where the Document extends an Abstract class that provides helper functions, one of which is a "find" method that builds queries to find records based on some simple logic.
public abstract class AbstractTable<T extends AbstractTable<T>> extends Model {
    ...
    public T find(String[] columns) {
        String whereClause = "";
        List<Object> whereClauseData = new ArrayList<Object>();
        for (String column : columns) {
            Object data = this.get(column);
            if (data == null) {
                whereClause += column + " is null AND ";
            } else {
                whereClause += column + " = ? AND ";
                whereClauseData.add(data);
            }
        }
        return findFirst(whereClause.substring(0, whereClause.length() - 5), whereClauseData.toArray());
    }
}
public class Document extends AbstractTable<Document> {
    ...
    public Document findExistingObject(Document document) {
        String[] columns = new String[] {"court_case_id", "number", "name", "file_date"};
        return super.find(columns);
    }
}
When I run this code, and the "findExistingObject" method is called on a Document, I receive this exception:
Exception in thread "main" org.javalite.activejdbc.InitException:
failed to determine Model class name, are you sure models have been
instrumented?
I've made completely sure that I've instrumented the classes. When I move the code from AbstractTable into Document, everything works perfectly. I'm hoping someone can lend some advice, or help, that might show me what I'm doing wrong.
Thanks in advance.
The exact reason for your issue is not generics, but instrumentation. Instrumentation skips abstract models, which means that the method findFirst is called on the class Model, and not on Document. You need to invoke the findFirst method on the Document model itself. Here is a version of the code that will work for you:
public T find(String[] columns) throws NoSuchMethodException, InvocationTargetException, IllegalAccessException {
    String whereClause = "";
    List<Object> whereClauseData = new ArrayList<>();
    for (String column : columns) {
        Object data = get(column);
        if (data == null) {
            whereClause += column + " is null AND ";
        } else {
            whereClause += column + " = ? AND ";
            whereClauseData.add(data);
        }
    }
    Method findFirst = getClass().getDeclaredMethod("findFirst", String.class, Object[].class);
    return (T) findFirst.invoke(null, whereClause.substring(0, whereClause.length() - 5), whereClauseData.toArray());
}
There is a bit of ugliness there, but at least you can apply this across all your models (if this is what you want).
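For example (a usage sketch based on the question's code, with the same column names; the unused parameter is dropped), the Document subclass can keep delegating to the shared helper. Note that find now declares the reflection-related checked exceptions:
public class Document extends AbstractTable<Document> {

    public Document findExistingObject() throws NoSuchMethodException,
            InvocationTargetException, IllegalAccessException {
        // Same columns as in the original question.
        String[] columns = new String[] {"court_case_id", "number", "name", "file_date"};
        return find(columns);
    }
}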
I have an unbounded stream of complex objects that I want to load into BigQuery. The structure of these objects represents the schema of my destination table in BigQuery.
The problem is that since there are a lot of nested fields in the POJO, it's an extremely tedious task to convert it to a TableSchema object, and I'm looking for a quick/automated way to convert my POJO to a TableSchema object while writing to BigQuery.
I'm not very familiar with Apache Beam API, and any help will be appreciated.
In my pipeline, I load a list of schemas from GCS. I keep them in string format because TableSchema is not serializable; however, I do parse them into TableSchema objects to validate them.
Then I add them, in string format, to a map in the Options object.
String schema = new String(blob.getContent());
// Decorate the list of fields to allow correct parsing
String targetSchema = "{\"fields\":" + schema + "}";
try {
    // Preload the schema to ensure validity, but then use the string version
    Transport.getJsonFactory().fromString(targetSchema, TableSchema.class);
    String tableName = blob.getName().replace(SCHEMA_FILE_PREFIX, "").replace(SCHEMA_FILE_SUFFIX, "");
    tableSchemaStringMap.put(tableName, targetSchema);
} catch (IOException e) {
    logger.warn("impossible to read schema " + blob.getName() + " in bucket gs://" + options.getSchemaBucket());
}
I didn't find another solution when I developed this.
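Later in the pipeline, the stored JSON schema string can be handed to BigQueryIO directly. Here is a minimal sketch, assuming rows is a PCollection<TableRow> and that the dataset name comes from your own options (options.getDataset() is an assumed accessor, not part of the code above):
String jsonSchema = tableSchemaStringMap.get(tableName);

rows.apply("WriteToBigQuery", BigQueryIO.writeTableRows()
        // "dataset.table" spec; options.getDataset() is assumed for illustration
        .to(options.getDataset() + "." + tableName)
        .withJsonSchema(jsonSchema)
        .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
        .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));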
In my company I created a kind of ORM (we call it OBQM) to do this. We are expecting to release it to the public. The code is quite big (especially because I created annotations and so on), but I can share some snippets with you for quick schema generation:
public TableSchema generateTableSchema(@Nonnull final Class cls) {
    final TableSchema tableSchema = new TableSchema();
    tableSchema.setFields(generateFieldsSchema(cls));
    return tableSchema;
}
public List<TableFieldSchema> generateFieldsSchema(@Nonnull final Class cls) {
    final List<TableFieldSchema> schemaFields = new ArrayList<>();
    final Field[] clsFields = cls.getFields();
    for (final Field field : clsFields) {
        schemaFields.add(fromFieldToSchemaField(field));
    }
    return schemaFields;
}
This code takes all the fields from the POJO class and creates a TableSchema object (the one that BigQueryIO uses in Apache Beam). You can see a method that I created called fromFieldToSchemaField. This method identifies each field's type and sets up the field name, mode, description and type. In this case, to keep it simple, I'm going to focus on the type and name:
public static TableFieldSchema fromFieldToSchemaField(@Nonnull final Field field) {
    return fromFieldToSchemaField(field, 0);
}

public static TableFieldSchema fromFieldToSchemaField(
        @Nonnull final Field field,
        final int iteration) {
    final TableFieldSchema schemaField = new TableFieldSchema();
    schemaField.setName(field.getName());
    schemaField.setMode("NULLABLE"); // You can add better logic here, we use annotations to override this value
    schemaField.setType(getFieldTypeString(field));
    schemaField.setDescription("Optional"); // Optional
    if (iteration < MAX_RECURSION
            && (isStruct(schemaField.getType())
            || isRecord(schemaField.getType()))) {
        final List<TableFieldSchema> schemaFields = new ArrayList<>();
        final Field[] fields = getFieldsFromComplexObjectField(field);
        for (final Field subField : fields) {
            schemaFields.add(
                    fromFieldToSchemaField(
                            subField, iteration + 1));
        }
        schemaField.setFields(schemaFields.isEmpty() ? null : schemaFields);
    }
    return schemaField;
}
And now the method that returns the BigQuery field type.
public static String getFieldTypeString(@Nonnull final Field field) {
    // On my side this code is much more complex, but this is a short version of it
    final Class<?> cls = (Class<?>) field.getGenericType();
    if (cls.isAssignableFrom(String.class)) {
        return "STRING";
    } else if (cls.isAssignableFrom(Integer.class) || cls.isAssignableFrom(Short.class)) {
        return "INT64";
    } else if (cls.isAssignableFrom(Double.class)) {
        return "NUMERIC";
    } else if (cls.isAssignableFrom(Float.class)) {
        return "FLOAT64";
    } else if (cls.isAssignableFrom(Boolean.class)) {
        return "BOOLEAN";
    } else if (cls.isAssignableFrom(byte[].class)) { // byte arrays map to BYTES
        return "BYTES";
    } else if (cls.isAssignableFrom(Date.class)
            || cls.isAssignableFrom(DateTime.class)) {
        return "TIMESTAMP";
    } else {
        return "STRUCT";
    }
}
Keep in mind that I'm not showing how to identify primitive types or arrays. But this is a good start for your code :). Please let me know if you need any help.
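For arrays and collections, one hedged way to extend this (not part of the OBQM snippets above; the helper name is illustrative) is to mark the field as REPEATED and derive the BigQuery type from the element class:
// Sketch only: detect repeated fields so they can be flagged accordingly.
public static boolean isRepeated(@Nonnull final Field field) {
    final Class<?> cls = field.getType();
    return cls.isArray() || Collection.class.isAssignableFrom(cls);
}

// Inside fromFieldToSchemaField you could then do something like:
// if (isRepeated(field)) {
//     schemaField.setMode("REPEATED");
// }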
If you're using JSON for the message serialization in Pub/Sub, you can make use of one of the provided templates:
PubSub To BigQuery Template
The code for that template is here:
PubSubToBigQuery.java
public class Table {
    private Long id = 1L;
    private String name;
    private List<Terms> terms;
    private Map<String, Address> addressMap;
    //getters and setters
}
What I need to do is link my Table class with a database table, where each element in the class above is a concept in the database table. I have a whole structure of Java classes, built from my XML, and the related database tables already exist in the DB. What would be the best way to do this?
As per my understanding, the options I can think of so far are:
use reflection to get the field names and apply my business logic
use XPath on my XML and directly link each concept using XPath
each time, get the value from the DB and the XML and link them using some mediator logic
Please suggest an approach, and provide some dummy code if possible.
You can try something like the example below:
// Assumes this runs inside a method that declares throws IllegalAccessException.
Map<String, String> foundConceptMap = new HashMap<>();
Iterator<Table> iterator = tableList.iterator();
boolean foundConcept = false;
while (iterator.hasNext()) {
    foundConcept = false;
    Table table = iterator.next();
    String conceptName = table.getConceptDetails().getName();
    Field[] fieldArr = table.getClass().getDeclaredFields();
    List<Field> fields = Arrays.asList(fieldArr);
    Iterator<Field> iterator1 = fields.iterator();
    while (iterator1.hasNext()) {
        Field field = iterator1.next();
        field.setAccessible(true);
        System.out.println(field.getName() + " # " + field.getType());
        if (field.getName().equalsIgnoreCase(conceptName)
                && String.class.isAssignableFrom(field.getType())) {
            foundConceptMap.put(conceptName, field.get(table).toString());
            foundConcept = true;
            break;
        } else {
            Type type = field.getGenericType();
            if (type instanceof ParameterizedType) {
                ParameterizedType pType = (ParameterizedType) type;
                System.out.print("Raw type: " + pType.getRawType() + " - ");
                System.out.println("Type args: " + pType.getActualTypeArguments()[0]);
                if ("java.util.List".equalsIgnoreCase(pType.getRawType().getTypeName())) {
                    String classWithPackage = pType.getActualTypeArguments()[0].getTypeName();
                    String className = "";
                    if (classWithPackage.contains(".")) {
                        className = classWithPackage.substring(classWithPackage.lastIndexOf(".") + 1);
                    } else {
                        className = classWithPackage;
                    }
                    System.out.println(className);
                    if ("Terms".equalsIgnoreCase(className)) {
                        List<Terms> list = table.getTerms();
                        setTerms(list, foundConceptMap, conceptName);
                    }
                }
            }
        }
    }
}
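The setTerms helper referenced above is not shown; here is a minimal sketch of what it might look like, assuming it applies the same reflective lookup to each Terms element (the implementation is illustrative, not taken from the original answer):
private static void setTerms(List<Terms> terms,
                             Map<String, String> foundConceptMap,
                             String conceptName) throws IllegalAccessException {
    // Scan each Terms object for a String field whose name matches the concept,
    // mirroring the lookup done for Table above.
    for (Terms term : terms) {
        for (Field f : term.getClass().getDeclaredFields()) {
            f.setAccessible(true);
            if (f.getName().equalsIgnoreCase(conceptName)
                    && String.class.isAssignableFrom(f.getType())) {
                foundConceptMap.put(conceptName, String.valueOf(f.get(term)));
                return;
            }
        }
    }
}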
I have a Pig UDF which ingests some data and then attempts to transform that data in a minimal manner.
my_data = LOAD 'path/to/data' USING SomeCustomLoader();
my_other_data = FOREACH my_data GENERATE MyUDF(COL_1, COL_2, $param1, $param2) as output;
my_final_data = FOREACH my_other_data GENERATE output.NEW_COL1, output.NEW_COL2, output.NEW_COL3;
However, I keep getting the following error:
ERROR 0: Exception while executing [POUserFunc (Name: POUserFUnc(udf.MyUDF)[tuple] - scope-38 Operator Key: scope-38) children: null at []]: java.lang.NullPointerException
My UDF takes the data and transforms it:
public class MyUDF extends EvalFunc<Tuple> {
    public Tuple exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0)
            return null;
        TupleFactory _factory;
        Long fieldOne;
        String fieldTwo;
        String fieldThree;
        _factory.getInstance();
        try {
            fieldOne = Long.valueOf(input.get(0).toString());
            fieldTwo = input.get(1).toString();
            fieldThree = input.get(2).toString();
            fieldOne = doSomething(fieldOne);
            fieldTwo = doSomething(fieldTwo);
            fieldThree = doSomething(fieldThree);
            return _factory.newTuple(Arrays.asList(fieldOne, fieldTwo, fieldThree));
        } catch (Exception ex) {
            return _factory.newTuple(Arrays.asList("ParseException", "", "", ""));
        }
    }
}
I have debugged and confirmed that fieldOne, fieldTwo, and fieldThree do exist prior to calling the tuple factory. It's also clear that the exception is being thrown because the code reaches the catch block and then throws this NullPointerException error.
What is not clear is why on earth this is happening.
According to the Pig docs (Pig 0.14.0 API), I should be able to call newTuple(java.util.List c) with the relevant items.
I have also defined my own Schema to ensure the types are correct when going back to the pig script.
The code in question never actually assigns your TupleFactory instance, so you are calling a method on an object that does not exist.
public class ... {
    TupleFactory _factory;

    public Tuple exec(Tuple input) {
        _factory = TupleFactory.getInstance();
        ...
    }
}
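Equivalently, since TupleFactory.getInstance() is static, you can assign the factory once at the field declaration; a small sketch (the transformation logic is elided):
public class MyUDF extends EvalFunc<Tuple> {
    // Assign the singleton once, so exec() never touches an uninitialized reference.
    private final TupleFactory _factory = TupleFactory.getInstance();

    public Tuple exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0)
            return null;
        // ... transform the input fields as before, then:
        return _factory.newTuple(Arrays.asList(input.get(0), input.get(1), input.get(2)));
    }
}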
I'm using Tapestry5 and Hibernate. I'm trying to build a criteria query that uses dynamic restrictions generated from the URL. My URL context is designed like a key/value pair.
Example
www.mywebsite.com/make/ford/model/focus/year/2009
I decode the parameters as follows
private Map<String, String> queryParameters;
private List<Vehicle> vehicles;
void onActivate(EventContext context) {
    //Count is 6 - make/ford/model/focus/year/2009
    int count = context.getCount();
    if (count > 0) {
        int i;
        for (i = 0; (i + 1) < count; i += 2) {
            String name = context.get(String.class, i);
            String value = context.get(String.class, i + 1);
            // example "make"
            System.out.println("name " + name);
            // example "ford"
            System.out.println("value " + value);
            this.queryParameters.put(name, value);
        }
    }
    this.vehicles = this.session.createCriteria(Vehicle.class)
    // ...add dynamic restrictions.
}
I was hoping someone could help me to figure out how to dynamically add the list of restrictions to my query. I'm sure this has been done, so if anybody knows of a post, that would be helpful too. Thanks
Exactly as the other answer said, but spelled out in more detail here. I think the crux of your question is really 'show me how to add a restriction'. That is my interpretation, anyhow.
You need to decode each restriction into its own field.
You need to know the Java entity property name for each field.
Then build a Map of these two things: the key is the known static Java entity property name and the value is the URL-decoded data (possibly with type conversion).
private Map<String, Object> queryParameters;
private List<Vehicle> vehicles;

void onActivate(EventContext context) {
    //Count is 6 - make/ford/model/focus/year/2009
    int count = context.getCount();
    queryParameters = new HashMap<String, Object>();
    if (count > 0) {
        int i;
        for (i = 0; (i + 1) < count; i += 2) {
            String name = context.get(String.class, i);
            String value = context.get(String.class, i + 1);

            Object sqlValue = value;
            if ("foobar".equals(name)) {
                // sometimes you don't want a String type for the SQL comparison,
                // so convert it
                sqlValue = UtilityClass.doTypeConversionForFoobar(value);
            } else if ("search".equals(name) ||
                    "model".equals(name) ||
                    "year".equals(name)) {
                // no-op, this is a valid 'name'
            } else if ("make".equals(name)) {
                // this is a suggestion, depends on your project conf
                name = "vehicleMake.name";
            } else {
                continue; // ignore values we did not expect
            }

            // FIXME: You should validate all 'name' values
            // to be valid and/or convert to Java property names here
            System.out.println("name " + name);
            System.out.println("value " + value);
            this.queryParameters.put(name, sqlValue);
        }
    }

    Criteria crit = this.session.createCriteria(Vehicle.class);
    for (Map.Entry<String, Object> e : this.queryParameters.entrySet()) {
        String n = e.getKey();
        Object v = e.getValue();
        // Sometimes you don't want a direct compare 'Restrictions.eq()'
        if ("search".equals(n))
            crit.add(Restrictions.like(n, "%" + v + "%"));
        else // Most of the time you do
            crit.add(Restrictions.eq(n, v));
    }
    this.vehicles = crit.list(); // run query
}
See also https://docs.jboss.org/hibernate/orm/3.5/reference/en/html/querycriteria.html
With the above there should be no risk of SQL injection, since the "name" and "n" parts should be 100% validated against a known good list. The "value" and "v" are correctly escaped, just like using a SQL positional placeholder '?'.
E&OE
I would assume you would just loop over the parameters Map and add a Restriction for each pair.
Be aware that this will open you up to SQL injection attacks if you are not careful. The easiest way to protect against this would be to check the keys against the known Vehicle properties before adding them to the Criteria.
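A minimal sketch of that whitelist check (the allowed property names and the criteria variable are assumed for illustration):
// Only allow keys that correspond to known Vehicle properties.
private static final Set<String> ALLOWED_PROPERTIES =
        new HashSet<String>(Arrays.asList("make", "model", "year"));

for (Map.Entry<String, String> entry : queryParameters.entrySet()) {
    if (!ALLOWED_PROPERTIES.contains(entry.getKey())) {
        continue; // ignore anything not in the whitelist
    }
    criteria.add(Restrictions.eq(entry.getKey(), entry.getValue()));
}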
Another option would be to create an example query by building an object from the name/value pairs:
Vehicle vehicle = new Vehicle();
int count = context.getCount();
int i;
for (i = 0; (i + 1) < count; i += 2) {
    String name = context.get(String.class, i);
    String value = context.get(String.class, i + 1);
    // This will call the setter for the name, passing the value.
    // So if name is 'make' and value is 'ford', it will call vehicle.setMake('ford')
    BeanUtils.setProperty(vehicle, name, value);
}
// This is using a Hibernate example query:
vehicles = session.createCriteria(Vehicle.class).add(Example.create(vehicle)).list();
See BeanUtils.setProperty and Example Queries for more info.
That assumes you are allowing only one value per property and that the query parameters map to the property names correctly. There may also be conversion issues to think about, but I think setProperty handles the common ones.
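If a non-String property does trip you up (for example, a numeric year), commons-beanutils lets you register a converter before calling setProperty; a small sketch:
// Register a converter so "2009" is coerced to an Integer property.
// The no-arg IntegerConverter makes conversion failures throw rather than default.
ConvertUtils.register(new IntegerConverter(), Integer.class);
BeanUtils.setProperty(vehicle, "year", "2009");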
If they are query parameters you should treat them as query parameters instead of path parameters. Your URL should look something like:
www.mywebsite.com/vehicles?make=ford&model=focus&year=2009
and your code should look something like this:
public class Vehicles {
    @ActivationRequestParameter
    private String make;

    @ActivationRequestParameter
    private String model;

    @ActivationRequestParameter
    private String year;

    @Inject
    private Session session;

    private List<Vehicle> vehicles;

    @OnEvent(EventConstants.ACTIVATE)
    void activate() {
        Criteria criteria = session.createCriteria(Vehicle.class);
        if (make != null) criteria.add(Restrictions.eq("make", make));
        if (model != null) criteria.add(Restrictions.eq("model", model));
        if (year != null) criteria.add(Restrictions.eq("year", year));
        vehicles = criteria.list();
    }
}
Assuming you are using the Grid component to display the vehicles I'd highly recommend using the HibernateGridDataSource instead of making the query in the "activate" event handler.
public class Vehicles {
    @ActivationRequestParameter
    private String make;

    @ActivationRequestParameter
    private String model;

    @ActivationRequestParameter
    private String year;

    @Inject
    private Session session;

    @OnEvent(EventConstants.ACTIVATE)
    void activate() {
    }

    public GridDataSource getVehicles() {
        return new HibernateGridDataSource(session, Vehicle.class) {
            @Override
            protected void applyAdditionalConstraints(Criteria criteria) {
                if (make != null) criteria.add(Restrictions.eq("make", make));
                if (model != null) criteria.add(Restrictions.eq("model", model));
                if (year != null) criteria.add(Restrictions.eq("year", year));
            }
        };
    }
}
I was writing a toString() for a class in Java the other day by manually writing out each element of the class to a String, and it occurred to me that using reflection it might be possible to create a generic toString() method that could work on ALL classes, i.e. it would figure out the field names and values and send them out to a String.
Getting the field names is fairly simple; here is what a co-worker came up with:
public static List initFieldArray(String className) throws ClassNotFoundException {
    Class c = Class.forName(className);
    Field field[] = c.getFields();
    List<String> classFields = new ArrayList(field.length);
    for (int i = 0; i < field.length; i++) {
        String cf = field[i].toString();
        classFields.add(cf.substring(cf.lastIndexOf(".") + 1));
    }
    return classFields;
}
Using a factory, I could reduce the performance overhead by storing the fields once, the first time toString() is called. However, finding the values could be a lot more expensive.
Due to the performance cost of reflection, this may be more hypothetical than practical. But I am interested in the idea of reflection and how I can use it to improve my everyday programming.
Apache commons-lang ReflectionToStringBuilder does this for you.
import org.apache.commons.lang3.builder.ReflectionToStringBuilder;

// your code goes here

public String toString() {
    return ReflectionToStringBuilder.toString(this);
}
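If the default output is too verbose, the same builder also accepts a ToStringStyle, for example:
public String toString() {
    // SHORT_PREFIX_STYLE gives a more compact, single-line representation.
    return ReflectionToStringBuilder.toString(this, ToStringStyle.SHORT_PREFIX_STYLE);
}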
Another option, if you are ok with JSON, is Google's GSON library.
public String toString() {
    return new GsonBuilder().setPrettyPrinting().create().toJson(this);
}
It's going to do the reflection for you. This produces nice, easy-to-read JSON output. Easy-to-read being relative; non-tech folks might find the JSON intimidating.
You could make the Gson instance a member variable too, if you don't want to new it up every time.
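For example, a small sketch of that member-variable approach:
// Reuse one Gson instance instead of building it on every toString() call.
private static final Gson PRETTY_GSON = new GsonBuilder()
        .setPrettyPrinting()
        .create();

@Override
public String toString() {
    return PRETTY_GSON.toJson(this);
}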
If you have data that can't be printed (like a stream), or data you just don't want to print, you can add @Expose annotations to the attributes you want to print and then use the following:
return new GsonBuilder()
        .setPrettyPrinting()
        .excludeFieldsWithoutExposeAnnotation()
        .create()
        .toJson(this);
With reflection, as I hadn't been aware of the Apache library:
(be aware that if you do this you'll probably need to deal with subobjects and make sure they print properly - in particular, arrays won't show you anything useful)
@Override
public String toString()
{
    StringBuilder b = new StringBuilder("[");
    for (Field f : getClass().getFields())
    {
        if (!isStaticField(f))
        {
            try
            {
                b.append(f.getName() + "=" + f.get(this) + " ");
            } catch (IllegalAccessException e)
            {
                // pass, don't print
            }
        }
    }
    b.append(']');
    return b.toString();
}

private boolean isStaticField(Field f)
{
    return Modifier.isStatic(f.getModifiers());
}
If you're using Eclipse, you may also have a look at JUtils toString generator, which does it statically (generating the method in your source code).
You can use already implemented libraries, such as ReflectionToStringBuilder from Apache commons-lang, as was already mentioned.
Or you can write something similar yourself with the reflection API.
Here is an example:
class UniversalAnalyzer {

    private ArrayList<Object> visited = new ArrayList<Object>();

    /**
     * Converts an object to a string representation that lists all fields.
     * @param obj an object
     * @return a string with the object's class name and all field names and
     * values
     */
    public String toString(Object obj) {
        if (obj == null) return "null";
        if (visited.contains(obj)) return "...";
        visited.add(obj);
        Class cl = obj.getClass();
        if (cl == String.class) return (String) obj;
        if (cl.isArray()) {
            String r = cl.getComponentType() + "[]{";
            for (int i = 0; i < Array.getLength(obj); i++) {
                if (i > 0) r += ",";
                Object val = Array.get(obj, i);
                if (cl.getComponentType().isPrimitive()) r += val;
                else r += toString(val);
            }
            return r + "}";
        }

        String r = cl.getName();
        // inspect the fields of this class and all superclasses
        do {
            r += "[";
            Field[] fields = cl.getDeclaredFields();
            AccessibleObject.setAccessible(fields, true);
            // get the names and values of all fields
            for (Field f : fields) {
                if (!Modifier.isStatic(f.getModifiers())) {
                    if (!r.endsWith("[")) r += ",";
                    r += f.getName() + "=";
                    try {
                        Class t = f.getType();
                        Object val = f.get(obj);
                        if (t.isPrimitive()) r += val;
                        else r += toString(val);
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            }
            r += "]";
            cl = cl.getSuperclass();
        } while (cl != null);

        return r;
    }
}
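A class can then delegate its own toString() to the analyzer, for example:
@Override
public String toString() {
    // UniversalAnalyzer keeps a 'visited' list to break cycles, so use a
    // fresh instance per call rather than sharing one across threads.
    return new UniversalAnalyzer().toString(this);
}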
Not reflection, but I had a look at generating the toString method (along with equals/hashCode) as a post-compilation step using bytecode manipulation. Results were mixed.
Here is the NetBeans equivalent of Olivier's answer: the smart-codegen plugin for NetBeans.