queryRunner insert adds a parameter - java

I'm trying to run an insert and get the auto-generated ID back from an Oracle database. For that I'm using org.apache.commons.dbutils.QueryRunner's insert method, which returns the first column of the generated keys.
The problem is that, for a reason I can't work out, an extra parameter is being counted in the query, and that stops it from working.
I have this code:
import javax.annotation.Nonnull;
import javax.inject.Inject;
import javax.inject.Named;
import javax.inject.Singleton;
import javax.sql.DataSource;
import java.sql.SQLException;
import java.util.Optional;
@Singleton
public class ConceptToBonusRepository extends GenericSQLRepository{
private static final String QUERY_SAVE_CONCEPT_TO_BONUS = "INSERT INTO concept_to_bonus (ID, CONCEPT_ID, CONCEPT_TYPE)" +
" VALUES (SEQ_CONCEPT_TO_BONUS_ID.NEXTVAL,?,?)";
@Inject
public ConceptToBonusRepository(@Named("OracleDataSource") @Nonnull DataSource dataSource) {
super(dataSource);
}
public Optional<Long> saveConceptToBonus(ConceptToBonus conceptToBonus) {
try {
return Optional.ofNullable(
runInsert(
QUERY_SAVE_CONCEPT_TO_BONUS, conceptToBonus.getConceptId(), conceptToBonus.getConceptType()
)
);
} catch (SQLException e) {
throw new RuntimeException(e);
}
}
}
and my GenericSQLRepository:
import org.apache.commons.dbutils.QueryRunner;
import org.apache.commons.dbutils.handlers.ScalarHandler;
import javax.annotation.Nonnull;
import javax.sql.DataSource;
import java.sql.SQLException;
public abstract class GenericSQLRepository {
private QueryRunner queryRunner;
private ScalarHandler<Long> autogeneratedIdHandler = new ScalarHandler<>();
protected GenericSQLRepository(@Nonnull final DataSource dataSource) {
this.queryRunner = new QueryRunner(dataSource);
}
protected Long runInsert(@Nonnull final String sql,
Object... args) throws SQLException {
return queryRunner.insert(sql, this.autogeneratedIdHandler, args);
}
}
When I try to run this I get this error:
"java.sql.SQLException: Wrong number of parameters: expected 3, was given 2 Query: INSERT INTO concept_to_bonus (ID, CONCEPT_ID, CONCEPT_TYPE) VALUES (SEQ_CONCEPT_TO_BONUS_ID.NEXTVAL,?,?) Parameters: [1731472066, ORDER]"
I really don't understand why it is adding an extra parameter to the parameter count. When I run this insert with a simple execute, it works just fine.

Spring JPA CrudRepository Autowired object is null when used with Apache Commons Tailer

I'm working on a small app that I can point to an Apache HTTP server log, follow the log (a la 'tail -f' in Linux), and write entries to an Oracle Database table.
I set up a Spring Boot / Spring Data JPA application and created classes for my Entity, the CrudRepository interface, a service for the interface (though I believe this is technically unnecessary for this implementation), and a runner to kick the process off. I also set up the TailerListenerAdapter to do the heavy lifting for the log-file parsing. I will post all of this code below.
The problem is that I can write a test record to the database successfully prior to starting the Tailer listener. However, when the listener is running, the Autowired service in the TailerListenerAdapter is null and throws an exception.
java.lang.NullPointerException
at sbx.demo.logauditor.util.AccessListener.handle(AccessListener.java:49)
at org.apache.commons.io.input.Tailer.readLines(Tailer.java:525)
at org.apache.commons.io.input.Tailer.run(Tailer.java:457)
at sbx.demo.logauditor.LogAuditRunner.run(LogAuditRunner.java:40)
{... more stack trace ...}
Here are the classes used (I probably have some unnecessary annotations in there left from experimentation) -
LogAuditRunner.java
package sbx.demo.logauditor;
import java.io.File;
import java.sql.Timestamp;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import javax.transaction.Transactional;
import org.apache.commons.io.input.Tailer;
import org.apache.commons.io.input.TailerListener;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.CommandLineRunner;
import org.springframework.stereotype.Component;
import lombok.var;
import sbx.demo.logauditor.model.AccessRecord;
import sbx.demo.logauditor.service.AccessService;
import sbx.demo.logauditor.util.AccessListener;
@Component
public class LogAuditRunner implements CommandLineRunner {
@Autowired
AccessService accServ;
final String datePattern = "dd/MMM/yyyy:HH:mm:ss Z";
final DateTimeFormatter formatter = DateTimeFormatter.ofPattern(datePattern);
@Override
@Transactional
public void run(String... args) throws Exception {
// This test code works if uncommented
//LocalDateTime TS = LocalDateTime.from(formatter.parse("31/Jan/2020:14:28:32 -0500"));
//var logTest = new AccessRecord("10.154.103.2",Timestamp.valueOf(TS),"/cs/resources/layouts/Top%20Menus/Oracle/tree_T_collection_closed.gif","304");
//System.out.println("Testing repository with " + logTest.toString());
//accServ.save(logTest);
TailerListener listener = new AccessListener();
Tailer tailer = new Tailer(new File("D:\\access_log"), listener);
tailer.run();
}
}
AccessService.java
package sbx.demo.logauditor.service;
import java.util.ArrayList;
import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Configurable;
import org.springframework.stereotype.Service;
import sbx.demo.logauditor.model.AccessRecord;
import sbx.demo.logauditor.repository.AccessRepository;
@Service
@Configurable
public class AccessService {
@Autowired(required = true)
AccessRepository accessRepo;
public void save(AccessRecord ar) {
try {
System.out.println("Writing record to database: " + ar.toString());
accessRepo.save(ar);
} catch (Exception e) {
e.printStackTrace();
}
}
public List<AccessRecord> findAll() {
List<AccessRecord> recList = new ArrayList<AccessRecord>();
try {
System.out.println("Searching database for all access records");
for(AccessRecord ar : accessRepo.findAll()) {
recList.add(ar);
}
} catch (Exception e) {
e.printStackTrace();
}
return recList;
}
}
AccessRepository.java
package sbx.demo.logauditor.repository;
import org.springframework.data.repository.CrudRepository;
import org.springframework.stereotype.Repository;
import sbx.demo.logauditor.model.AccessRecord;
@Repository
public interface AccessRepository extends CrudRepository<AccessRecord, Long>{
}
AccessRecord.java
package sbx.demo.logauditor.model;
import java.net.InetAddress;
import java.net.UnknownHostException;
import java.sql.Timestamp;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.Table;
import lombok.Data;
@Entity
@Table(name="ACCESS_LOG")
@Data
public class AccessRecord {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
@Column(name = "id", updatable = false, nullable = false)
private Long id;
@Column(name="PROXY_AGENT")
private String agent;
@Column(name="SOURCE_IP")
private String sourceip;
@Column(name="ACCESS_TS")
private Timestamp reqts;
@Column(name="URI")
private String requri;
@Column(name="HTTP_STATUS")
private String respcode;
public AccessRecord() {}
public AccessRecord(String source, Timestamp ts, String uri, String status) {
this.sourceip = source;
this.reqts = ts;
this.requri = uri;
this.respcode = status;
try {
InetAddress ip = InetAddress.getLocalHost();
String hostname = ip.getHostName();
this.agent = hostname;
} catch (UnknownHostException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
@Override
public String toString() {
String record = "Record: [" + agent + "] [" + sourceip + "] [" + reqts + "] [" + requri + "] [" + respcode + "]";
return record;
}
}
AccessListener.java
package sbx.demo.logauditor.util;
import java.sql.Timestamp;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import javax.transaction.Transactional;
import org.apache.commons.io.input.TailerListenerAdapter;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
import lombok.var;
import sbx.demo.logauditor.model.AccessRecord;
import sbx.demo.logauditor.repository.AccessRepository;
import sbx.demo.logauditor.service.AccessService;
@Component
public class AccessListener extends TailerListenerAdapter {
final String regex = "^(\\S+) (\\S+) (\\S+) " +
"\\[([\\w:/]+\\s[+\\-]\\d{4})\\] \"(\\S+)" +
" (\\S+)\\s*(\\S+)?\\s*\" (\\d{3}) (\\S+)";
final Pattern pattern = Pattern.compile(regex, Pattern.MULTILINE);
final String datePattern = "dd/MMM/yyyy:HH:mm:ss Z";
final DateTimeFormatter formatter = DateTimeFormatter.ofPattern(datePattern);
@Autowired
AccessService accServ;
@Override
@Transactional
public void handle(String line) {
LogRecorder lr = new LogRecorder();
try {
final Matcher matcher = pattern.matcher(line);
if (matcher.find()) {
String IP = matcher.group(1);
//String TS = matcher.group(4);
String URL = matcher.group(6);
String STATUS = matcher.group(8);
LocalDateTime TS = LocalDateTime.from(formatter.parse(matcher.group(4)));
var ar = new AccessRecord(IP,Timestamp.valueOf(TS),URL,STATUS);
accServ.save(ar);
}
} catch (Exception e) {
e.printStackTrace();
}
}
}
And finally LogauditorApplication.java
package sbx.demo.logauditor;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
@SpringBootApplication
public class LogauditorApplication {
public static void main(String[] args) {
SpringApplication.run(LogauditorApplication.class, args);
}
}
Side note: I noticed that if I manually instantiate the AccessService (instead of relying on autowiring), I can invoke it, but then the NullPointerException happens on the Autowired AccessRepository interface. It's clear to me that it has to do with the autowiring; I just don't understand why.
I know that there are ways to follow and send logs via the command line (this is going to run in a Linux environment), but I want to ensure that it is robust enough to, say, restart if it dies, handle log rollovers, etc. I'm also planning to write in some extra validation to ensure entries don't overlap (i.e., in the event the application restarts and re-reads an entire file). But I wanted to get it working first. I thought it would be straightforward since Tailer requires so little code, and I'm already comfortable with Spring.
I was able to get this working by passing the AccessService as a parameter to the AccessListener constructor. I then had to add the @Transactional annotation to the save method in AccessService so that transactions would commit after each line was processed inside the thread.
New AccessListener.java
package sbx.demo.logauditor.util;
import java.sql.Timestamp;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.apache.commons.io.input.TailerListenerAdapter;
import sbx.demo.logauditor.model.AccessRecord;
import sbx.demo.logauditor.service.AccessService;
public class AccessListener extends TailerListenerAdapter {
final String regex = "^(\\S+) (\\S+) (\\S+) " +
"\\[([\\w:/]+\\s[+\\-]\\d{4})\\] \"(\\S+)" +
" (\\S+)\\s*(\\S+)?\\s*\" (\\d{3}) (\\S+)";
final Pattern pattern = Pattern.compile(regex, Pattern.MULTILINE);
final String datePattern = "dd/MMM/yyyy:HH:mm:ss Z";
final DateTimeFormatter formatter = DateTimeFormatter.ofPattern(datePattern);
private AccessService accServ;
public AccessListener(AccessService as) {
this.accServ = as;
}
// #Autowired
// AccessService accServ;
@Override
public void handle(String line) {
try {
final Matcher matcher = pattern.matcher(line);
if (matcher.find()) {
String IP = matcher.group(1);
String URL = matcher.group(6);
String STATUS = matcher.group(8);
LocalDateTime TS = LocalDateTime.from(formatter.parse(matcher.group(4)));
AccessRecord ar = new AccessRecord(IP,Timestamp.valueOf(TS),URL,STATUS);
accServ.save(ar);
}
} catch (Exception e) {
e.printStackTrace();
}
}
}
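The trickiest parts of the listener, the log-line regex and the timestamp parsing, can be exercised in isolation with plain JDK classes. A minimal self-contained sketch (the class name and sample log line here are made up for illustration; Locale.ENGLISH is pinned so "Jan" parses regardless of the default locale):

```java
import java.sql.Timestamp;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.Locale;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class AccessLineParser {
    // Same pattern as in the listener above
    static final Pattern PATTERN = Pattern.compile(
        "^(\\S+) (\\S+) (\\S+) \\[([\\w:/]+\\s[+\\-]\\d{4})\\] \"(\\S+) (\\S+)\\s*(\\S+)?\\s*\" (\\d{3}) (\\S+)");
    static final DateTimeFormatter FORMATTER =
        DateTimeFormatter.ofPattern("dd/MMM/yyyy:HH:mm:ss Z", Locale.ENGLISH);

    /** Returns {ip, timestamp, uri, status} or null when the line does not match. */
    static String[] parse(String line) {
        Matcher m = PATTERN.matcher(line);
        if (!m.find()) return null;
        // LocalDateTime.from(...) works because the pattern supplies both date and time fields
        LocalDateTime ts = LocalDateTime.from(FORMATTER.parse(m.group(4)));
        return new String[] { m.group(1), Timestamp.valueOf(ts).toString(), m.group(6), m.group(8) };
    }

    public static void main(String[] args) {
        String line = "10.154.103.2 - - [31/Jan/2020:14:28:32 -0500] \"GET /index.html HTTP/1.1\" 304 512";
        System.out.println(String.join(" | ", parse(line)));
    }
}
```

Note that the time-zone offset captured by group 4 is dropped when converting to LocalDateTime, just as in the original code.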

How can I query a remote Apache Tinkerpop Graph Database with Gremlin and Java?

I haven't been able to find a comprehensive example of connecting to, and then querying, a remote Apache TinkerPop graph database with Gremlin and Java, and I can't quite get it to work. Can anyone who's done something like this before offer any advice?
I've set up an Azure Cosmos database in Graph-DB mode, which expects Gremlin queries in order to modify and access its data. I have the database host name, port, username, and password, and I'm able to execute queries, but only if I pass in a big ugly query string. I would like to be able to leverage the org.apache.tinkerpop.gremlin.structure.Graph traversal methods, but I can't quite get them working.
import java.util.List;
import java.util.concurrent.CompletableFuture;
import org.apache.tinkerpop.gremlin.driver.Result;
import org.apache.tinkerpop.gremlin.driver.ResultSet;
import org.apache.tinkerpop.gremlin.structure.Graph;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
//More imports...
@Service
public class SearchService {
private final static Logger log = LoggerFactory.getLogger(SearchService.class);
@Autowired
private GraphDbConnection graphDbConnection;
@Autowired
private Graph graph;
public Object workingQuery() {
try {
String query = "g.V('1234').outE('related').inV().both().as('v').project('vertex').by(select('v')).by(bothE().fold())";
log.info("Submitting this Gremlin query: {}", query);
ResultSet results = graphDbConnection.executeQuery(query);
CompletableFuture<List<Result>> completableFutureResults = results.all();
List<Result> resultList = completableFutureResults.get();
Result result = resultList.get(0);
log.info("Query result: {}", result.toString());
return result.toString();
} catch (Exception e) {
log.error("Error fetching data.", e);
}
return null;
}
public Object failingQuery() {
return graph.traversal().V(1234).outE("related").inV()
.both().as("v").project("vertex").by("v").bothE().fold()
.next();
/* I get an Exception:
"org.apache.tinkerpop.gremlin.process.remote.RemoteConnectionException:
java.lang.RuntimeException: java.lang.RuntimeException:
java.util.concurrent.TimeoutException: Timed out while waiting for an
available host - check the client configuration and connectivity to the
server if this message persists" */
}
}
This is my configuration class:
import java.util.HashMap;
import java.util.Map;
import org.apache.tinkerpop.gremlin.driver.Cluster;
import org.apache.tinkerpop.gremlin.driver.MessageSerializer;
import org.apache.tinkerpop.gremlin.driver.remote.DriverRemoteConnection;
import org.apache.tinkerpop.gremlin.driver.ser.GraphSONMessageSerializerGremlinV2d0;
import org.apache.tinkerpop.gremlin.structure.Graph;
import org.apache.tinkerpop.gremlin.structure.util.GraphFactory;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Configuration
public class GraphDbConfig {
private final static Logger log = LoggerFactory.getLogger(GraphDbConfig.class);
@Value("${item.graph.hostName}")
private String hostName;
@Value("${item.graph.port}")
private int port;
@Value("${item.graph.username}")
private String username;
@Value("${item.graph.password}")
private String password;
@Value("${item.graph.enableSsl}")
private boolean enableSsl;
@Bean
public Graph graph() {
Map<String, String> graphConfig = new HashMap<>();
graphConfig.put("gremlin.graph",
"org.apache.tinkerpop.gremlin.process.remote.RemoteGraph");
graphConfig.put("gremlin.remoteGraph.remoteConnectionClass",
"org.apache.tinkerpop.gremlin.driver.remote.DriverRemoteConnection");
Graph g = GraphFactory.open(graphConfig);
g.traversal().withRemote(DriverRemoteConnection.using(cluster()));
return g;
}
@Bean
public Cluster cluster() {
Cluster cluster = null;
try {
MessageSerializer serializer = new GraphSONMessageSerializerGremlinV2d0();
Cluster.Builder clusterBuilder = Cluster.build().addContactPoint(hostName)
.serializer(serializer)
.port(port).enableSsl(enableSsl)
.credentials(username, password);
cluster = clusterBuilder.create();
} catch (Exception e) {
log.error("Error in connecting to host address.", e);
}
return cluster;
}
}
And I have to define this connection component currently in order to send queries to the database:
import org.apache.tinkerpop.gremlin.driver.Client;
import org.apache.tinkerpop.gremlin.driver.Cluster;
import org.apache.tinkerpop.gremlin.driver.ResultSet;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
@Component
public class GraphDbConnection {
private final static Logger log = LoggerFactory.getLogger(GraphDbConnection.class);
@Autowired
private Cluster cluster;
public ResultSet executeQuery(String query) {
Client client = connect();
ResultSet results = client.submit(query);
closeConnection(client);
return results;
}
private Client connect() {
Client client = null;
try {
client = cluster.connect();
} catch (Exception e) {
log.error("Error in connecting to host address.", e);
}
return client;
}
private void closeConnection(Client client) {
client.close();
}
}
You cannot leverage the remote API with CosmosDB yet, as it does not support Gremlin bytecode.
https://github.com/Azure/azure-documentdb-dotnet/issues/439
https://feedback.azure.com/forums/263030-azure-cosmos-db/suggestions/33632779-support-gremlin-bytecode-to-enable-the-fluent-api
You will have to continue with strings until then. That said, since you are using Java, you could try a somewhat unadvertised feature: GroovyTranslator.
gremlin> g = EmptyGraph.instance().traversal()
==>graphtraversalsource[emptygraph[empty], standard]
gremlin> translator = GroovyTranslator.of('g')
==>translator[g:gremlin-groovy]
gremlin> translator.translate(g.V().out('knows').has('person','name','marko').asAdmin().getBytecode())
==>g.V().out("knows").has("person","name","marko")
As you can see, it takes Gremlin bytecode and converts it into a string of Gremlin that you can submit to CosmosDB. Later, when CosmosDB supports bytecode, you can drop the GroovyTranslator, change from EmptyGraph construction of your GraphTraversalSource, and everything should start working. To make this really seamless, you could go the extra step and write a TraversalStrategy that does something similar to TinkerPop's RemoteStrategy. Instead of submitting bytecode as that strategy does, you would just use GroovyTranslator and submit the string of Gremlin. That approach would make it even easier to switch over when CosmosDB supports bytecode, because then all you would have to do is remove your custom TraversalStrategy and reconfigure your remote GraphTraversalSource in the standard way.
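The console session above can be mirrored in Java. A rough sketch, assuming TinkerPop 3.2.x-era package locations (GroovyTranslator lived in the gremlin-groovy module at the time; it may sit elsewhere in other versions):

```java
import org.apache.tinkerpop.gremlin.groovy.jsr223.GroovyTranslator;
import org.apache.tinkerpop.gremlin.process.traversal.dsl.graph.GraphTraversal;
import org.apache.tinkerpop.gremlin.process.traversal.dsl.graph.GraphTraversalSource;
import org.apache.tinkerpop.gremlin.structure.util.empty.EmptyGraph;

public class TranslatorExample {
    public static void main(String[] args) {
        // A traversal source over an empty graph: used only to build bytecode, never executed locally
        GraphTraversalSource g = EmptyGraph.instance().traversal();
        GraphTraversal<?, ?> t = g.V("1234").outE("related").inV().both();
        // Convert the traversal's bytecode to a Gremlin-Groovy string
        String gremlin = GroovyTranslator.of("g").translate(t.asAdmin().getBytecode());
        System.out.println(gremlin);
    }
}
```

The resulting string could then be handed to the existing GraphDbConnection.executeQuery(...) / client.submit(...) path from the question.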

PLS-00103: Encountered the symbol ")"

I am creating a web app which requires connecting to a database and getting the categories of various types of reports. I am getting a weird error when executing my SQL via a stored procedure in Java. SELECT * FROM RPT_CTGR; is the SQL I submit, but when I look at the stack trace it comes back as this:
org.springframework.jdbc.BadSqlGrammarException: CallableStatementCallback; bad SQL grammar [{call SELECT * FROM RPT_CTGR;(?)}]; nested exception is java.sql.SQLException: ORA-06550: line 1, column 38:
PLS-00103: Encountered the symbol ")" when expecting one of the following:
. ( * # % & = - + < / > at in is mod remainder not rem
<an exponent (**)> <> or != or ~= >= <= <> and or like like2
like4 likec as between || indicator multiset member
submultiset
org.springframework.jdbc.support.SQLStateSQLExceptionTranslator.doTranslate(SQLStateSQLExceptionTranslator.java:98)
org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:72)
org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:80)
org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:969)
org.springframework.jdbc.core.JdbcTemplate.call(JdbcTemplate.java:1003)
org.springframework.jdbc.object.StoredProcedure.execute(StoredProcedure.java:144)
org.ifmc.qies.reportaudit.impl.CataImpl.search(CataImpl.java:59)
org.ifmc.qies.reportaudit.web.CataAction.execute(CataAction.java:31)
org.apache.struts.action.RequestProcessor.processActionPerform(RequestProcessor.java:431)
org.apache.struts.action.RequestProcessor.process(RequestProcessor.java:236)
org.apache.struts.action.ActionServlet.process(ActionServlet.java:1196)
org.apache.struts.action.ActionServlet.doPost(ActionServlet.java:432)
javax.servlet.http.HttpServlet.service(HttpServlet.java:648)
javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
For some reason it is appending an extra "(?)" to the end of my statement, which causes the error. Any ideas?
Code
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Types;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import javax.sql.DataSource;
import org.ifmc.qies.reportaudit.dao.CataDao;
import org.ifmc.qies.reportaudit.model.Catagory;
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.FileSystemXmlApplicationContext;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.jdbc.core.SqlOutParameter;
import org.springframework.jdbc.core.SqlParameter;
import org.springframework.jdbc.core.support.JdbcDaoSupport;
import org.springframework.jdbc.datasource.DriverManagerDataSource;
import oracle.jdbc.OracleTypes;
public class CataImpl extends JdbcDaoSupport implements CataDao {
protected static ApplicationContext ctx = null;
//BaseStoredProcedure extends StoredProcedure
public class CataProc extends BaseStoredProcedure{
public CataProc(DataSource ds, String name) {
super(ds,name);
System.out.println(name);
System.out.println(getSql());
declareParameter(new SqlOutParameter("catagories", OracleTypes.CURSOR,
new RowMapper() {
public Object mapRow(ResultSet rs, int rowNum)
throws SQLException {
Catagory t = new Catagory();
t.setCatId(rs.getString(1));
t.setCat_name(rs.getString(2));
System.out.println(t.getCat_name());
return t;
}
}));
}
}
@Override
public List search() {
String[] paths = {"V:path/to/applicationcontext.xml"};
Map params=new HashMap();
ctx = new FileSystemXmlApplicationContext(paths);
DataSource ds = (DriverManagerDataSource)ctx.getBean("dataSource");
CataProc proc = new CataProc(ds,"SELECT * FROM RPT_CTGR;");
Map results = proc.execute(params);
List catagory = (List)results.get("catagories");
//Test
System.out.println(catagory.get(1).toString());
return catagory;
}
}
I get all my print statements, except for the catagory.get(1).toString() one; the getSql() call properly returns the SQL I submitted.
You should create a stored procedure in your Oracle database which executes your SELECT statement.
You can see this question as a reference on how to do it.
E.g.
CREATE OR REPLACE PROCEDURE getCatagories(categories in out sys_refcursor) is
...
SELECT * FROM RPT_CTGR;
...
After that, you need to call the stored procedure by its name.
public List search() {
...
DataSource ds = (DriverManagerDataSource) ctx.getBean("dataSource");
CataProc proc = new CataProc(ds,"getCatagories");
Map results = proc.execute(params);
List catagory = (List)results.get("catagories");
...
}

MapReduce with phoenix : org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.io.NullWritable

I am trying to insert values into a table ("mea_interval") from data collected in another table ("mea_data"). The id is not unique; it identifies a datatype. I use the MeasureWritable class to read and write to the database; it implements DBWritable and Writable. When I run my jar I get this error:
15/12/15 10:13:38 WARN mapred.LocalJobRunner: job_local957174264_0001
java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.io.NullWritable
at org.apache.phoenix.mapreduce.PhoenixRecordWriter.write(PhoenixRecordWriter.java:39)
at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:551)
at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:85)
at org.apache.hadoop.mapreduce.lib.reduce.WrappedReducer$Context.write(WrappedReducer.java:99)
at org.apache.hadoop.mapreduce.Reducer.reduce(Reducer.java:144)
at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:164)
at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:610)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:444)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:449))
I can read the values in the mea_data table; if I display them in the console, they appear fine. I think the error occurs during the execution of context.write in the map, but I don't understand why.
I have attached the code of the job configuration and my map class. If you want to see another part of my code, do not hesitate to ask.
Thank you in advance. :)
The job configuration :
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.phoenix.mapreduce.PhoenixInputFormat;
import org.apache.phoenix.mapreduce.PhoenixOutputFormat;
import org.apache.phoenix.mapreduce.util.PhoenixConfigurationUtil;
import org.apache.phoenix.mapreduce.util.PhoenixMapReduceUtil;
public class Application {
public static void main(String[] args) {
final Configuration configuration = HBaseConfiguration.create();
final Job job;
try {
job = Job.getInstance(configuration, "phoenix-mr-job");
final String selectQuery = "SELECT * FROM \"mea_data\" where \"timestamp\" > 1450168200";
PhoenixMapReduceUtil.setInput(job, MeasureWritable.class, "mea_data", selectQuery);
// Set the target Phoenix table and the columns
PhoenixMapReduceUtil.setOutput(job, "\"mea_interval\"", "id_collection,startDate,endDate,value");
job.setMapperClass(MeasureMapper.class);
job.setReducerClass(MeasureReducer.class);
job.setOutputFormatClass(PhoenixOutputFormat.class);
// job.setInputFormatClass(PhoenixInputFormat.class);
job.setNumReduceTasks(10);
job.setMapOutputKeyClass(LongWritable.class);
job.setMapOutputValueClass(Text.class);
job.setOutputKeyClass(NullWritable.class);
job.setOutputValueClass(MeasureWritable.class);
// TableMapReduceUtil.addDependencyJars(job);
job.waitForCompletion(true);
} catch (Exception e) {
e.printStackTrace();
}
}
}
The mapper class :
import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
public class MeasureMapper extends Mapper<NullWritable , MeasureWritable, LongWritable, Text> {
@Override
protected void map(NullWritable key, MeasureWritable measureWritable, Context context) throws IOException, InterruptedException {
final long timestamp = measureWritable.getTimestamp();
double val = measureWritable.getValue();
final long id = measureWritable.getId();
System.out.print("id : "+ new LongWritable(id));
System.out.print(" timestamp : "+ timestamp);
System.out.println(" val : "+ val);
try{
context.write(new LongWritable(id), new Text(timestamp + ";" + val));
} catch (Exception e) {
e.printStackTrace();
}
}
}
The reducer class :
import java.io.IOException;
import java.text.NumberFormat;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;
public class MeasureReducer extends Reducer<LongWritable, Iterable<Text>, NullWritable, MeasureWritable> {
protected void reduce(LongWritable key, Iterable<Text> valeurs, Context context) throws IOException, InterruptedException {
MeasureWritable interval = new MeasureWritable();
interval.setId(Long.valueOf(key.toString()).longValue());
NumberFormat nf = NumberFormat.getInstance();
for(Text valeur : valeurs) {
String[] array = valeur.toString().split(";", -1);
interval.setStartingDate(Long.valueOf(array[0]).longValue());
interval.setEndingDate(Long.valueOf(array[0]).longValue());
try {
interval.setValue(nf.parse(array[1]).doubleValue() );
} catch (Exception e) {
e.printStackTrace();
}
}
context.write(NullWritable.get(), interval);
}
}
Use LongWritable as the Mapper's input key, and as the map method's first parameter, instead of NullWritable.
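Applied to the code above, the suggested fix is just a signature change on the mapper; a sketch (MeasureWritable is the asker's class and is not shown here):

```java
import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Input key type is now LongWritable instead of NullWritable, per the answer
public class MeasureMapper extends Mapper<LongWritable, MeasureWritable, LongWritable, Text> {
    @Override
    protected void map(LongWritable key, MeasureWritable measureWritable, Context context)
            throws IOException, InterruptedException {
        // Same output as before: key on the measure id, value "timestamp;value"
        context.write(new LongWritable(measureWritable.getId()),
                new Text(measureWritable.getTimestamp() + ";" + measureWritable.getValue()));
    }
}
```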

Testing Java code and previewing the SQL statement

The following is a Java (Hibernate) method. How can I write a test for it? I want the test method to return the SQL statement. The code references a lot of other classes and packages which already exist; ignore those and just show me how to integrate them in my test program.
@Override
public AppTacticalSubUnit returnByCode(String code) throws MyOwnDAOException {
Session session = getSession();
UnitOfWork unitOfWork = new UnitOfWork();
try {
unitOfWork.beginTransaction(session);
Criteria criteria = session.createCriteria(AppTacticalSubUnit.class);
criteria.add(Restrictions.eq("code", code));
criteria.setResultTransformer(CriteriaSpecification.DISTINCT_ROOT_ENTITY);
setJoinFetches(criteria);
AppTacticalSubUnit ret = (AppTacticalSubUnit) criteria.uniqueResult();
unitOfWork.commit();
return ret;
} catch (HibernateException e) {
unitOfWork.rollback();
throw new MyOwnDAOException(e.getMessage(), e);
}
}
private void setJoinFetches(Criteria criteria) {
criteria.setFetchMode("appTacticalUnit.spaceOpsAreaServiceType", FetchMode.JOIN);
criteria.setFetchMode("appTacticalSubUnit.spaceOpsAreaServiceType.assExternalServiceType", FetchMode.JOIN);
criteria.setFetchMode("appTacticalSubUnit.spaceOpsAreaServiceType.assExternalServiceType.lookupServiceType", FetchMode.JOIN);
criteria.setFetchMode("appTacticalSubUnit.spaceOpsAreaServiceType.assExternalServiceType.lookupExternalServiceType", FetchMode.JOIN);
criteria.setResultTransformer(CriteriaSpecification.DISTINCT_ROOT_ENTITY);
criteria.addOrder(Order.asc("name"));
}
I have started something like this:
package na.co.sab.vitalix.db.dao;
import org.hibernate.Criteria;
import org.hibernate.FetchMode;
import org.hibernate.HibernateException;
import org.hibernate.criterion.CriteriaSpecification;
import org.hibernate.criterion.Order;
import org.hibernate.criterion.Restrictions;
import org.hibernate.Session;
import na.co.sab.datashare.util.UnitOfWork;
import na.co.sab.vitalix.db.exception.MyOwnDAOException;
import na.co.sab.vitalix.db.util.HibernateOltpSessionUtil;
public class AppTacticalSubUnitTest {
//protected static final org.apache.log4j.Logger LOG = org.apache.log4j.Logger.getLogger(AppOTacticalSubUnitTest.class);
static protected HibernateOltpSessionUtil dataShareInstance;
public static void initializeVitalixOnHsql() throws Exception {
initializeVitalixOnHsql(true);
}
}
Have a look at this article: http://java.dzone.com/articles/how-get-jpqlsql-string. It seems to describe what you want.
The idea is to use the org.hibernate.Query#getQueryString() method.
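As a rough sketch of that suggestion (the HQL string here is hypothetical; note that getQueryString() returns the HQL/JPQL string rather than the final generated SQL, so for the actual SQL you would still enable statement logging, e.g. hibernate.show_sql):

```java
// Sketch only: getQueryString() echoes the query string back, it does not render SQL
org.hibernate.Session session = getSession();
org.hibernate.Query query = session.createQuery(
        "from AppTacticalSubUnit u where u.code = :code"); // hypothetical HQL
query.setParameter("code", "ABC");
String queryString = query.getQueryString();
System.out.println(queryString);
```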
