How to connect to PostgreSQL from Play 2.8 using Java

I am trying to connect to PostgreSQL from a Play 2.8 project using Java (not Scala), but I am unable to. I am using JDK 11 and Play 2.8.x.
SBT dependencies:
libraryDependencies ++= Seq(
  "org.postgresql" % "postgresql" % "42.3.1",
  jdbc,
  "org.playframework.anorm" %% "anorm" % "2.7.0",
  javaJpa,
  "org.hibernate" % "hibernate-core" % "5.4.32.Final",
  "com.typesafe.play" %% "play-java-jpa" % "2.8.8",
  "net.jodah" % "failsafe" % "2.3.1",
  guice,
  "io.dropwizard.metrics" % "metrics-core" % "4.1.1",
  "com.palominolabs.http" % "url-builder" % "1.1.0"
)
Config file:
db.default {
  driver = org.postgresql.Driver
  url = "jdbc:postgresql://localhost/mydatabase"
  user = "postgres"
  password = "root#123"
}
MyController:
import java.sql.Connection;
import java.sql.SQLException;
import javax.inject.Inject;
import javax.sql.DataSource;
import play.mvc.Controller;
import play.mvc.Result;

public class MyController extends Controller {

    private final DataSource dataSource;

    @Inject
    public MyController(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    public Result index() {
        try (Connection conn = dataSource.getConnection()) {
            return ok("connected to database");
        } catch (SQLException e) {
            return internalServerError("Database error: " + e.getMessage());
        }
    }
}
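A note on the injection above: Play's jdbc module binds play.db.Database (and play.db.DBApi) through Guice, but as far as I know it does not bind a raw javax.sql.DataSource, so injecting DataSource as shown will typically fail with a missing-binding error rather than a connection error. Recent Play versions also prefer db.default.username over the older db.default.user key. A minimal sketch of the same controller against the Database binding (the response text is a placeholder):

import java.sql.Connection;
import java.sql.SQLException;
import javax.inject.Inject;
import play.db.Database;
import play.mvc.Controller;
import play.mvc.Result;

public class MyController extends Controller {

    private final Database db;

    @Inject
    public MyController(Database db) {
        // Guice injects the "default" database configured under db.default.*
        this.db = db;
    }

    public Result index() {
        try (Connection conn = db.getConnection()) {
            return ok("connected to database");
        } catch (SQLException e) {
            return internalServerError("Database error: " + e.getMessage());
        }
    }
}

If a DataSource is really needed, db.getDataSource() exposes the underlying one.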

Related

Is there any way to disable "retryable writes" in Spring Boot 2.2.1?

First time
I am trying to develop a controller that saves data to DocumentDB on AWS.
The first save works. But the second time, I look up the record saved in the database, change some data, and save again, and then...
I am getting this error:
Caused by: com.mongodb.MongoCommandException: Command failed with error 301: 'Retryable writes are not supported' on server aws:27017. The full response is {"ok": 0.0, "code": 301, "errmsg": "Retryable writes are not supported", "operationTime": {"$timestamp": {"t": 1641469879, "i": 1}}}
This is my Java code:
@Service
public class SaveStateHandler extends Handler<SaveStateCommand> {

    @Autowired
    private MongoRepository repository;

    @Autowired
    private MongoTemplate mongoTemplate;

    @Override
    public String handle(Command command) {
        SaveStateCommand cmd = (SaveStateCommand) command;
        State state = buildState(cmd);
        repository.save(state);
        return state.getId();
    }

    private State buildState(SaveStateCommand cmd) {
        State state = State
                .builder()
                .activityId(cmd.getActivityId())
                .agent(cmd.getAgent())
                .stateId(cmd.getStateId())
                .data(cmd.getData())
                .dataAlteracao(LocalDateTime.now())
                .build();
        State stateFound = findState(cmd);
        if (stateFound != null) {
            state.setId(stateFound.getId());
        }
        return state;
    }

    private State findState(SaveStateCommand request) {
        Query query = new Query();
        selectField(query);
        where(request, query);
        return mongoTemplate.findOne(query, State.class);
    }

    private void selectField(Query query) {
        query.fields().include("id");
    }

    private void where(SaveStateCommand request, Query query) {
        query.addCriteria(new Criteria().andOperator(
                Criteria.where("activityId").is(request.getActivityId()),
                Criteria.where("agent").is(request.getAgent())));
    }
}
AWS suggests using retryWrites=false, but I don't know how to set it in Spring Boot.
I use Spring Boot 2.2.1.
I tried this:
@Bean
public MongoClientSettings mongoSettings() {
    return MongoClientSettings
            .builder()
            .retryWrites(Boolean.FALSE)
            .build();
}
But it didn't work.
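Two observations, both hedged guesses rather than anything from the original post. First, the retryWrites=false flag can simply be carried in the spring.data.mongodb.uri property, since the URI is handed to the driver as-is. Second, Spring Boot 2.2's imperative Mongo auto-configuration consumes a MongoClientOptions bean, not MongoClientSettings (visible in the ObjectProvider<MongoClientOptions> constructor parameter of the MongoAutoConfiguration copied below), which would explain why the MongoClientSettings bean is ignored. A sketch of the options-bean route:

import com.mongodb.MongoClientOptions;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class MongoOptionsConfig {

    // Boot 2.2 builds com.mongodb.MongoClient from a MongoClientOptions
    // bean (the pre-4.x driver API), so retryWrites set here reaches the
    // client, unlike a MongoClientSettings bean.
    @Bean
    public MongoClientOptions mongoClientOptions() {
        return MongoClientOptions.builder()
                .retryWrites(false)
                .build();
    }
}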
=================================================================================
Second time
I connected to AWS DocumentDB through an SSH tunnel.
I started my application with this database configuration:
@Configuration
@EnableConfigurationProperties({MongoProperties.class})
public class MongoAutoConfiguration {

    private final MongoClientFactory factory;
    private final MongoClientOptions options;
    private MongoClient mongo;

    public MongoAutoConfiguration(MongoProperties properties, ObjectProvider<MongoClientOptions> options, Environment environment) {
        this.options = options.getIfAvailable();
        if (StringUtils.isEmpty(properties.getUsername()) || StringUtils.isEmpty(properties.getPassword())) {
            properties.setUsername(null);
            properties.setPassword(null);
        }
        properties.setUri(createUri(properties));
        this.factory = new MongoClientFactory(properties, environment);
    }

    private String createUri(MongoProperties properties) {
        String uri = "mongodb://";
        if (StringUtils.hasText(properties.getUsername()) && !StringUtils.isEmpty(properties.getPassword())) {
            uri = uri + properties.getUsername() + ":" + new String(properties.getPassword()) + "@";
        }
        return uri + properties.getHost() + ":" + properties.getPort() + "/" + properties.getDatabase() + "?retryWrites=false";
    }

    @PreDestroy
    public void close() {
        if (this.mongo != null) {
            this.mongo.close();
        }
    }

    @Bean
    public MongoClient mongo() {
        this.mongo = this.factory.createMongoClient(this.options);
        return this.mongo;
    }
}
Locally it saves the data without error.
But if I deploy the updated API to AWS ECS and try to save, I get the same error.
=================================================================================
Dependencies
<dependency>
    <groupId>org.springframework.data</groupId>
    <artifactId>spring-data-mongodb</artifactId>
    <version>2.2.1.RELEASE</version>
</dependency>
<dependency>
    <groupId>com.querydsl</groupId>
    <artifactId>querydsl-mongodb</artifactId>
    <version>4.1.4</version>
</dependency>
When you construct your connection string, you can include the parameter that disables retryable writes by adding this to your connection URI:
?replicaSet=rs0&readPreference=primaryPreferred&retryWrites=false&maxIdleTimeMS=30000
Then use this when creating the database factory and Mongo template (this example uses the reactive database factory, but the principle is the same for the SimpleMongoClientDatabaseFactory):
@Bean
fun reactiveMongoDatabaseFactory(
    @Value("\${spring.data.mongodb.uri}") uri: String,
    @Value("\${mongodb.database-name}") database: String
): ReactiveMongoDatabaseFactory {
    return SimpleReactiveMongoDatabaseFactory(MongoClients.create(uri), database)
}
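For the imperative stack the wiring is analogous. A sketch in Java, assuming Spring Data MongoDB 3.x (where SimpleMongoClientDatabaseFactory lives; on the 2.2.x line the equivalent class is SimpleMongoClientDbFactory) and the same two properties as above:

import com.mongodb.client.MongoClients;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoClientDatabaseFactory;

@Configuration
public class MongoFactoryConfig {

    // The URI already carries retryWrites=false (and the other parameters),
    // so every client created from it inherits those settings.
    @Bean
    public MongoTemplate mongoTemplate(
            @Value("${spring.data.mongodb.uri}") String uri,
            @Value("${mongodb.database-name}") String database) {
        return new MongoTemplate(
                new SimpleMongoClientDatabaseFactory(MongoClients.create(uri), database));
    }
}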

Gradle - throw exception if project still has SNAPSHOT dependencies

I want to fail the Gradle build if the current project still has SNAPSHOT dependencies.
My code so far only looks at Java dependencies and misses the .NET ones, so it only works for Java projects. I want to make it work for all projects.
def addSnapshotCheckingTask(Project project) {
    project.tasks.withType(JavaCompile) { compileJava ->
        project.tasks.create(compileJava.name + 'SnapshotChecking', {
            onlyIf {
                project.ext.isRelease || project.ext.commitVersion != null
            }
            compileJava.dependsOn it
            doLast {
                def snapshots = compileJava.classpath
                    .filter { project.ext.isRelease || !(it.path ==~ /(?i)${project.rootProject.projectDir.toString().replace('\\', '\\\\')}.*build.libs.*/) }
                    .filter { it.path =~ /(?i)-SNAPSHOT/ }
                    .collect { it.name }
                    .unique()
                if (!snapshots.isEmpty()) {
                    throw new GradleException("Please get rid of snapshots for following dependencies before releasing $snapshots")
                }
            }
        })
    }
}
I need some help generalizing this snippet so it applies to all types of dependencies (not just Java).
Thanks!
Later edit: Could something like this work?
https://discuss.gradle.org/t/how-can-i-check-for-snapshot-dependencies-and-throw-an-exception-if-some-where-found/4064
So I got it working by tweaking @lance-java's answer a bit; it looks something like this:
Task snapshotCheckingTask = project.tasks.create('snapshotCheckingTask', {
    doLast {
        def snapshots = new ArrayList()
        def projectConfigurations = project.configurations.findAll { true }
        projectConfigurations.each {
            if (it.isCanBeResolved()) {
                it.resolvedConfiguration.resolvedArtifacts.each {
                    if (it.moduleVersion.id.version.endsWith('-SNAPSHOT')) {
                        snapshots.add(it)
                    }
                }
            }
        }
        if (!snapshots.isEmpty()) {
            throw new GradleException("Please get rid of snapshots for following dependencies before releasing $snapshots")
        } else {
            throw new GradleException("Hah, no snapshots!")
        }
    }
})
project.tasks.release.dependsOn snapshotCheckingTask
cc @Eugene
P.S. However, this does not take .NET dependencies into account.
Something like
Collection<ResolvedArtifact> snapshotArtifacts = project.configurations*.resolvedConfiguration*.resolvedArtifacts.flatten().findAll { it.moduleVersion.id.version.endsWith('-SNAPSHOT') }
if (!snapshotArtifacts.empty) {
    // throw exception
}
See
https://docs.gradle.org/current/javadoc/org/gradle/api/artifacts/Configuration.html#getResolvedConfiguration--
https://docs.gradle.org/current/javadoc/org/gradle/api/artifacts/ResolvedConfiguration.html#getResolvedArtifacts--
Here's what I came up with using Gradle 7.2 and the modern Kotlin DSL:
tasks.register("releaseEnforcement") {
    group = "verification"
    description = "Check whether there are any SNAPSHOT dependencies."
    doLast {
        val violations = project.configurations
            .filter { it.name == "compileClasspath" || it.name == "runtimeClasspath" }
            .flatMap { configuration ->
                configuration.resolvedConfiguration.resolvedArtifacts
                    .map { it.moduleVersion.id }
                    .filter { it.version.endsWith("-SNAPSHOT") }
            }
            .toSet()
        if (violations.isNotEmpty()) {
            error("Snapshot dependencies found:\n\n${violations.joinToString(separator = "\n")}")
        }
    }
}

Cassandra failure during read query at consistency QUORUM - ReadFailureException

I have a simple Scala/Java program to demo the Cassandra Java API.
I have a simple UDT class Address, which is used in the class User. For some reason userMapper.get(userId) fails with no clear error message.
The code is part of a Scala project.
Runner code (java):
void exp02() {
    log.debug("JAVA -- exp02");
    Cluster cluster = null;
    try {
        CodecRegistry codecRegistry = new CodecRegistry();
        cluster = Cluster.builder() // (1)
                .withCodecRegistry(codecRegistry)
                .addContactPoint("127.0.0.1")
                .build();
        log.debug("connect...exp02");
        Session session = cluster.connect(); // (2)
        MappingManager manager = new MappingManager(session);
        Mapper<User> userMapper = manager.mapper(User.class);
        // For some reason this will break
        {
            log.debug("create user *********************** isClosed: " + cluster.isClosed());
            log.debug("get users");
            ResultSet results = session.execute("SELECT * FROM cTest.user;");
            Result<User> user = userMapper.map(results);
            for (User u : user) {
                log.debug("User : " + u);
            }
            log.debug("Users printed");
            UUID userId = UUID.fromString("567378a9-8533-4d1c-80a8-71bf4b77189e");
            User u2 = userMapper.get(userId); // <<<--- This line throws the exception (JRunner.java:67)
            log.debug("Select user = " + u2);
        }
    } catch (RuntimeException e) {
        log.error("Exception: " + e);
        e.printStackTrace();
    } finally {
        log.debug("close...exp02");
        if (cluster != null) cluster.close(); // (5)
    }
}
Main (scala):
package com.example.crunner

import org.slf4j.{Logger, LoggerFactory}

object MainRunner {
  val log: Logger = LoggerFactory.getLogger(getClass())

  def main(args: Array[String]): Unit = {
    val jrunner = new JRunner()
    jrunner.exp02()
  }
}
User class (java):
package com.example.crunner;

import java.util.UUID;

import com.datastax.driver.mapping.annotations.Column;
import com.datastax.driver.mapping.annotations.PartitionKey;
import com.datastax.driver.mapping.annotations.Table;

@Table(keyspace = "cTest", name = "user",
        readConsistency = "QUORUM",
        writeConsistency = "QUORUM"
        // caseSensitiveKeyspace = false,
        // caseSensitiveTable = false
)
public class User {

    @PartitionKey
    @Column(name = "user_id")
    private UUID userId;

    private String name;

    private Address address;

    public User(UUID userId, String name, Address address) {
        this.userId = userId;
        this.name = name;
        this.address = address;
    }

    public User() { address = new Address(); }

    public UUID getUserId() {
        return userId;
    }

    public void setUserId(UUID userId) {
        this.userId = userId;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public Address getAddress() {
        return address;
    }

    public void setAddress(Address address) {
        this.address = address;
    }

    @Override
    public String toString() {
        return "User{" +
                "userId=" + userId +
                ", name='" + name + '\'' +
                ", address=" + address +
                '}';
    }
}
UDT Address class (java):
package com.example.crunner;

import com.datastax.driver.mapping.annotations.Field;
import com.datastax.driver.mapping.annotations.UDT;

@UDT(keyspace = "cTest", name = "addressT") //, caseSensitiveType = true)
public class Address {

    private String street;
    private int zipCode;

    public Address(String street, int zipCode) {
        this.street = street;
        this.zipCode = zipCode;
    }

    public Address() {
    }

    public String getStreet() {
        return street;
    }

    public void setStreet(String street) {
        this.street = street;
    }

    public int getZipCode() {
        return zipCode;
    }

    public void setZipCode(int zipCode) {
        this.zipCode = zipCode;
    }

    @Override
    public String toString() {
        return "Address{" +
                "street='" + street + '\'' +
                ", zipCode=" + zipCode +
                '}';
    }
}
CQL (other tables not included here):
CREATE TYPE ctest.addresst (
street text,
zipcode int
);
CREATE TABLE ctest.user (
user_id uuid PRIMARY KEY,
address addresst,
name text
) WITH bloom_filter_fp_chance = 0.01
AND caching = {'keys': 'ALL', 'rows_per_partition': 'NONE'}
AND comment = ''
AND compaction = {'class': 'org.apache.cassandra.db.compaction.SizeTieredCompactionStrategy', 'max_threshold': '32', 'min_threshold': '4'}
AND compression = {'chunk_length_in_kb': '64', 'class': 'org.apache.cassandra.io.compress.LZ4Compressor'}
AND crc_check_chance = 1.0
AND dclocal_read_repair_chance = 0.1
AND default_time_to_live = 0
AND gc_grace_seconds = 864000
AND max_index_interval = 2048
AND memtable_flush_period_in_ms = 0
AND min_index_interval = 128
AND read_repair_chance = 0.0
AND speculative_retry = '99PERCENTILE';
build.sbt
name := "CassJExp2"
version := "0.1-SNAPSHOT"
scalaVersion := "2.11.9"
resolvers += "Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/"
val cassandraVersion = "3.2.0"
val logbackVersion = "1.2.3"
libraryDependencies ++= Seq(
"ch.qos.logback" % "logback-classic" % logbackVersion withSources() withJavadoc(), //
"ch.qos.logback" % "logback-core" % logbackVersion withSources() withJavadoc(), //
"ch.qos.logback" % "logback-access" % logbackVersion withSources() withJavadoc(), //
"org.slf4j" % "slf4j-api" % "1.7.25" withSources() withJavadoc(), //
"joda-time" % "joda-time" % "2.9.9" withSources() withJavadoc(), //
"com.datastax.cassandra" % "cassandra-driver-core" % cassandraVersion withSources() withJavadoc(), //
"com.datastax.cassandra" % "cassandra-driver-mapping" % cassandraVersion withSources() withJavadoc(), //
"com.datastax.cassandra" % "cassandra-driver-extras" % cassandraVersion withSources() withJavadoc() //
)
scalacOptions += "-deprecation"
When I run this code from the sbt console, I get the following output:
18:08:41.447 [run-main-f] DEBUG com.example.crunner.JRunner - JAVA -- exp02
18:08:41.497 [run-main-f] INFO c.d.driver.core.GuavaCompatibility - Detected Guava >= 19 in the classpath, using modern compatibility layer
18:08:41.634 [run-main-f] INFO c.datastax.driver.core.ClockFactory - Using native clock to generate timestamps.
18:08:41.644 [run-main-f] DEBUG com.example.crunner.JRunner - connect...exp02
18:08:41.674 [run-main-f] INFO com.datastax.driver.core.NettyUtil - Did not find Netty's native epoll transport in the classpath, defaulting to NIO.
18:08:42.049 [run-main-f] INFO c.d.d.c.p.DCAwareRoundRobinPolicy - Using data-center name 'datacenter1' for DCAwareRoundRobinPolicy (if this is incorrect, please provide the correct datacenter name with DCAwareRoundRobinPolicy constructor)
18:08:42.051 [run-main-f] INFO com.datastax.driver.core.Cluster - New Cassandra host /127.0.0.1:9042 added
18:08:42.107 [run-main-f] DEBUG com.example.crunner.JRunner - create user *********************** isClosed: false
18:08:42.108 [run-main-f] DEBUG com.example.crunner.JRunner - get users
18:08:42.139 [run-main-f] DEBUG com.example.crunner.JRunner - User : User{userId=54cbad6e-3f27-4b7e-bce0-8a4a4fbffbdf, name='John Doe', address=Address{street='street', zipCode=512}}
18:08:42.139 [run-main-f] DEBUG com.example.crunner.JRunner - User : User{userId=6122b896-8b28-448d-ac5c-4bc9b5c7c7ab, name='John Doe', address=Address{street='street', zipCode=512}}
... output truncated here, table contains about 150 rows ...
18:08:42.175 [run-main-f] DEBUG com.example.crunner.JRunner - User : User{userId=44f69277-ff97-4ba2-9216-bdf65eccd7c3, name='John Doe', address=Address{street='street', zipCode=512}}
18:08:42.175 [run-main-f] DEBUG com.example.crunner.JRunner - Users printed
18:08:42.203 [run-main-f] ERROR com.example.crunner.JRunner - Exception: com.datastax.driver.core.exceptions.ReadFailureException: Cassandra failure during read query at consistency QUORUM (1 responses were required but only 0 replica responded, 1 failed)
com.datastax.driver.core.exceptions.ReadFailureException: Cassandra failure during read query at consistency QUORUM (1 responses were required but only 0 replica responded, 1 failed)
at com.datastax.driver.core.exceptions.ReadFailureException.copy(ReadFailureException.java:130)
at com.datastax.driver.core.exceptions.ReadFailureException.copy(ReadFailureException.java:30)
at com.datastax.driver.mapping.DriverThrowables.propagateCause(DriverThrowables.java:41)
at com.datastax.driver.mapping.Mapper.get(Mapper.java:435)
at com.example.crunner.JRunner.exp02(JRunner.java:67)
at com.example.crunner.MainRunner$.main(MainRunner.scala:18)
at com.example.crunner.MainRunner.main(MainRunner.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at sbt.Run.invokeMain(Run.scala:67)
at sbt.Run.run0(Run.scala:61)
at sbt.Run.sbt$Run$$execute$1(Run.scala:51)
at sbt.Run$$anonfun$run$1.apply$mcV$sp(Run.scala:55)
at sbt.Run$$anonfun$run$1.apply(Run.scala:55)
at sbt.Run$$anonfun$run$1.apply(Run.scala:55)
at sbt.Logger$$anon$4.apply(Logger.scala:84)
at sbt.TrapExit$App.run(TrapExit.scala:248)
at java.lang.Thread.run(Thread.java:745)
Caused by: com.datastax.driver.core.exceptions.ReadFailureException: Cassandra failure during read query at consistency QUORUM (1 responses were required but only 0 replica responded, 1 failed)
at com.datastax.driver.core.exceptions.ReadFailureException.copy(ReadFailureException.java:142)
at com.datastax.driver.core.Responses$Error.asException(Responses.java:140)
at com.datastax.driver.core.DefaultResultSetFuture.onSet(DefaultResultSetFuture.java:179)
at com.datastax.driver.core.RequestHandler.setFinalResult(RequestHandler.java:179)
at com.datastax.driver.core.RequestHandler.access$2400(RequestHandler.java:49)
at com.datastax.driver.core.RequestHandler$SpeculativeExecution.setFinalResult(RequestHandler.java:799)
at com.datastax.driver.core.RequestHandler$SpeculativeExecution.onSet(RequestHandler.java:633)
at com.datastax.driver.core.Connection$Dispatcher.channelRead0(Connection.java:1075)
at com.datastax.driver.core.Connection$Dispatcher.channelRead0(Connection.java:998)
at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
... 1 more
Caused by: com.datastax.driver.core.exceptions.ReadFailureException: Cassandra failure during read query at consistency QUORUM (1 responses were required but only 0 replica responded, 1 failed)
at com.datastax.driver.core.Responses$Error$1.decode(Responses.java:88)
at com.datastax.driver.core.Responses$Error$1.decode(Responses.java:38)
at com.datastax.driver.core.Message$ProtocolDecoder.decode(Message.java:289)
at com.datastax.driver.core.Message$ProtocolDecoder.decode(Message.java:269)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:88)
... 20 more
18:08:42.205 [run-main-f] DEBUG com.example.crunner.JRunner - close...exp02
[success] Total time: 4 s, completed Apr 18, 2017 6:08:45 PM
At the same time, the following error message appears in /var/log/cassandra/system.log:
WARN [ReadStage-2] 2017-04-18 18:08:42,202 AbstractLocalAwareExecutorService.java:169 - Uncaught exception on thread Thread[ReadStage-2,10,main]: {}
java.lang.AssertionError: null
at org.apache.cassandra.db.rows.BTreeRow.getCell(BTreeRow.java:212) ~[apache-cassandra-3.9.jar:3.9]
at org.apache.cassandra.db.SinglePartitionReadCommand.canRemoveRow(SinglePartitionReadCommand.java:895) ~[apache-cassandra-3.9.jar:3.9]
at org.apache.cassandra.db.SinglePartitionReadCommand.reduceFilter(SinglePartitionReadCommand.java:859) ~[apache-cassandra-3.9.jar:3.9]
at org.apache.cassandra.db.SinglePartitionReadCommand.queryMemtableAndSSTablesInTimestampOrder(SinglePartitionReadCommand.java:744) ~[apache-cassandra-3.9.jar:3.9]
at org.apache.cassandra.db.SinglePartitionReadCommand.queryMemtableAndDiskInternal(SinglePartitionReadCommand.java:515) ~[apache-cassandra-3.9.jar:3.9]
at org.apache.cassandra.db.SinglePartitionReadCommand.queryMemtableAndDisk(SinglePartitionReadCommand.java:492) ~[apache-cassandra-3.9.jar:3.9]
at org.apache.cassandra.db.SinglePartitionReadCommand.queryStorage(SinglePartitionReadCommand.java:358) ~[apache-cassandra-3.9.jar:3.9]
at org.apache.cassandra.db.ReadCommand.executeLocally(ReadCommand.java:397) ~[apache-cassandra-3.9.jar:3.9]
at org.apache.cassandra.service.StorageProxy$LocalReadRunnable.runMayThrow(StorageProxy.java:1801) ~[apache-cassandra-3.9.jar:3.9]
at org.apache.cassandra.service.StorageProxy$DroppableRunnable.run(StorageProxy.java:2486) ~[apache-cassandra-3.9.jar:3.9]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[na:1.8.0_121]
at org.apache.cassandra.concurrent.AbstractLocalAwareExecutorService$FutureTask.run(AbstractLocalAwareExecutorService.java:164) ~[apache-cassandra-3.9.jar:3.9]
at org.apache.cassandra.concurrent.AbstractLocalAwareExecutorService$LocalSessionFutureTask.run(AbstractLocalAwareExecutorService.java:136) [apache-cassandra-3.9.jar:3.9]
at org.apache.cassandra.concurrent.SEPWorker.run(SEPWorker.java:109) [apache-cassandra-3.9.jar:3.9]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_121]
Cassandra version is [cqlsh 5.0.1 | Cassandra 3.9 | CQL spec 3.4.2 | Native protocol v4]
So the userMapper can map a ResultSet of users, but getting a single user fails. The userId I try to fetch exists in the user table. It is also possible to save a new user into the db using the userMapper without failure.
I don't know whether this is somehow related to having the UDT Address in the User class. Tables and mappers without UDT classes work fine.
EDIT:
As Marko Švaljek suggested, I tried the query on the command line:
cqlsh> SELECT * FROM cTest.user where user_id=567378a9-8533-4d1c-80a8-71bf4b77189e;
ReadFailure: Error from server: code=1300 [Replica(s) failed to execute read] message="Operation failed - received 0 responses and 1 failures" info={'failures': 1, 'received_responses': 0, 'required_responses': 1, 'consistency': 'ONE'}
Looks like the same error as with the Java client.
SELECT * FROM cTest.user works fine.
EDIT 2:
This is a single-instance environment.
nodetool status
Datacenter: datacenter1
=======================
Status=Up/Down
|/ State=Normal/Leaving/Joining/Moving
--  Address    Load       Tokens  Owns  Host ID                               Rack
UN  127.0.0.1  354.4 KiB  256     ?     33490146-da36-4359-bb24-42854bdb3c26  rack1
Note: Non-system keyspaces don't have the same replication settings, effective ownership information is meaningless
What is the reason for this error and how can I fix it? Thank you for your support.

ElasticSearch 2.2.0 - ESIntegTestCase - ClassNotFoundException when executing groovy script in search

I've been using Elastic 1.4.4, but we're now upgrading to 2.2.0. I am having trouble getting my integration tests to run. My integration test extends org.elasticsearch.test.ESIntegTestCase:
@ESIntegTestCase.ClusterScope(scope = ESIntegTestCase.Scope.SUITE, numDataNodes = 1)
public abstract class AbstractApplicationTest extends ESIntegTestCase {
    ...
}
I can index documents without problems, but when I try searching with a script field, I get an error. I'm running my tests using sbt (I'm using the Play framework).
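For context, the failing search builds a Groovy script field roughly like the following. This is a hypothetical reconstruction from the error response below: the field name and the start/end parameter values are placeholders, while the index name and script body are taken from the response.

import java.util.HashMap;
import java.util.Map;

import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.script.Script;
import org.elasticsearch.script.ScriptService;

// ...inside a test method of the ESIntegTestCase subclass:
Map<String, Object> params = new HashMap<>();
params.put("start", "2016-03-01"); // placeholder
params.put("end", "2016-03-31");   // placeholder

SearchResponse response = client().prepareSearch("bokun")
        .addScriptField("next_available_day", new Script(
                "if(_source.accumulated_availability != null){"
                        + " for(item in _source.accumulated_availability){"
                        + " if(start.compareTo(item.day) < 0 && (end == null || end.compareTo(item.day) >= 0)){"
                        + " return item.day } }} else return null;",
                ScriptService.ScriptType.INLINE, "groovy", params))
        .get();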
The error I'm getting is the following:
{
  "error": {
    "root_cause": [{
      "type": "script_exception",
      "reason": "failed to compile groovy script"
    }],
    "type": "search_phase_execution_exception",
    "reason": "all shards failed",
    "phase": "query",
    "grouped": true,
    "failed_shards": [{
      "shard": 0,
      "index": "bokun",
      "node": "BNyjts9hTOicRgCAWGdKgQ",
      "reason": {
        "type": "script_exception",
        "reason": "Failed to compile inline script [if(_source.accumulated_availability != null){ for(item in _source.accumulated_availability){ if(start.compareTo(item.day) < 0 && (end == null || end.compareTo(item.day) >= 0)){ return item.day } }} else return null;] using lang [groovy]",
        "caused_by": {
          "type": "script_exception",
          "reason": "failed to compile groovy script",
          "caused_by": {
            "type": "multiple_compilation_errors_exception",
            "reason": "startup failed:\nCould not instantiate global transform class groovy.grape.GrabAnnotationTransformation specified at jar:file:/Users/ogg/.ivy2/cache/org.codehaus.groovy/groovy-all/jars/groovy-all-2.4.4-indy.jar!/META-INF/services/org.codehaus.groovy.transform.ASTTransformation because of exception java.lang.ClassNotFoundException: groovy.grape.GrabAnnotationTransformation\n\nCould not instantiate global transform class org.codehaus.groovy.ast.builder.AstBuilderTransformation specified at jar:file:/Users/ogg/.ivy2/cache/org.codehaus.groovy/groovy-all/jars/groovy-all-2.4.4-indy.jar!/META-INF/services/org.codehaus.groovy.transform.ASTTransformation because of exception java.lang.ClassNotFoundException: org.codehaus.groovy.ast.builder.AstBuilderTransformation\n\n2 errors\n"
          }
        }
      }
    }]
  },
  "status": 500
}
I'll reformat the "reason" message for readability:
startup failed:
Could not instantiate global transform class
groovy.grape.GrabAnnotationTransformation
specified at jar:file:/Users/ogg/.ivy2/cache/org.codehaus.groovy/groovy-all/jars/groovy-all-2.4.4-indy.jar!/META-INF/services/org.codehaus.groovy.transform.ASTTransformation
because of exception
java.lang.ClassNotFoundException: groovy.grape.GrabAnnotationTransformation
Could not instantiate global transform class
org.codehaus.groovy.ast.builder.AstBuilderTransformation
specified at jar:file:/Users/ogg/.ivy2/cache/org.codehaus.groovy/groovy-all/jars/groovy-all-2.4.4-indy.jar!/META-INF/services/org.codehaus.groovy.transform.ASTTransformation
because of exception java.lang.ClassNotFoundException: org.codehaus.groovy.ast.builder.AstBuilderTransformation
What can cause this? As far as I can tell, I have this class in my classpath: org.codehaus.groovy.ast.builder.AstBuilderTransformation.
I have the following dependencies in my build.sbt:
"org.codehaus.groovy" % "groovy-all" % "2.4.4",
"com.carrotsearch.randomizedtesting" % "randomizedtesting-runner" % "2.3.0" % "test",
"org.apache.lucene" % "lucene-test-framework" % "5.4.1",
"org.elasticsearch" % "elasticsearch" % "2.2.0" % "test" classifier "tests" withSources(),
"org.elasticsearch" % "elasticsearch" % "2.2.0" withSources(),
"org.elasticsearch.plugin" % "analysis-icu" % "2.2.0" % "test",
"org.elasticsearch.module" % "lang-groovy" % "2.2.0" % "test"
...and I have the following in my ESIntegTestCase extension class:
@Override
protected Settings nodeSettings(int nodeOrdinal) {
    return Settings.settingsBuilder()
            .put(super.nodeSettings(nodeOrdinal))
            .put(IndexMetaData.SETTING_NUMBER_OF_SHARDS, 1)
            .put(IndexMetaData.SETTING_NUMBER_OF_REPLICAS, 1)
            .put(Node.HTTP_ENABLED, true)
            .put("script.groovy.sandbox.enabled", true)
            .put("script.engine.groovy.inline.search", true)
            .put("script.engine.groovy.inline.update", true)
            .put("script.inline", true)
            .put("script.update", true)
            .put("script.indexed", true)
            .put("script.default_lang", "groovy")
            .build();
}
@Override
protected Collection<Class<? extends Plugin>> nodePlugins() {
    return pluginList(GroovyPlugin.class, AnalysisICUPlugin.class);
}
I'm completely stuck, and Google is unwilling to help! :)
Any help or pointers would be greatly appreciated.
Many thanks,
OGG
This is now solved.
The problem was actually a SecurityException rethrown as a ClassNotFoundException.
Using the instructions at https://www.elastic.co/guide/en/elasticsearch/reference/current/modules-scripting-security.html I created a security policy file and added permission for the following classes:
grant {
    permission org.elasticsearch.script.ClassPermission "java.lang.Class";
    permission org.elasticsearch.script.ClassPermission "org.codehaus.groovy.*";
    permission org.elasticsearch.script.ClassPermission "groovy.*";
    permission org.elasticsearch.script.ClassPermission "java.lang.*";
    permission org.elasticsearch.script.ClassPermission "java.util.*";
    permission org.elasticsearch.script.ClassPermission "java.math.BigDecimal";
    permission org.elasticsearch.script.ClassPermission "org.joda.time.*";
};
And then I start the tests, passing my security policy file on the command line:
-Djava.security.policy=security.policy
You can see the thread on the Elastic discussion forum which helped me reach this solution: https://discuss.elastic.co/t/2-2-0-esintegtestcase-classnotfoundexception-when-executing-groovy-script-in-search/43579

Spring 3 to Spring 4 (Java 7 to Java 8) - AspectJ compilation error : "cannot access ApplicationEventPublisherAware"

I am in the process of upgrading a project from Java 7 to Java 8 (and with it Spring 3 to Spring 4).
I am getting the following compilation error:
DomainSecurityAspect.java:[88,39] error: cannot access ApplicationEventPublisherAware
The code it is complaining about is:
@Around("target(com.mycompany.automation.domain.framework.DomainEntityImpl+) && !execution(* equals(..)) && !execution(* hashCode()) "
        + "&& !execution(* toString()) && !execution(* get*(..)) && !execution(* is*(..)) && execution(public * *(..)) "
        + "&& !within(com.mycompany.iecc.data.automation.domain.aop.DomainSecurityAspect)"
        + "&& !execution(* com.mycompany.iecc.data.automation.domain.framework.BaseObject.*(..))"
        + "&& !execution(* com.mycompany.iecc.data.automation.domain.framework.DomainObjectImpl.*(..))")
public Object domainObjectInstanceExecution(final ProceedingJoinPoint thisJoinPoint) throws Throwable
{
    if (this.securityInterceptor == null)
    {
        return thisJoinPoint.proceed();
    }
    final AspectJCallback callback = new AspectJCallback()
    {
        @Override
        public Object proceedWithObject()
        {
            try
            {
                return thisJoinPoint.proceed();
            }
            catch (Error e)
            {
                throw e;
            }
            catch (RuntimeException re)
            {
                throw re;
            }
            catch (Throwable th)
            {
                throw new RuntimeException(th);
            }
        }
    };
    // The compiler complains about this line
    return this.securityInterceptor.invoke(thisJoinPoint, callback);
}
Software Versions I am using:
Java 8 (jdk1.8.0_66)
Spring 4.2.4.RELEASE
AspectJ version 1.8.8
cglib-nodep version 3.2.0
Note: this compiles OK with Java 7 and Spring 4, but when compiled with Java 8 it gives that compilation error.
It turns out the error was in my POM: to fix it I had to change the scope of spring-context from runtime to compile. (Presumably the type hierarchy involved references ApplicationEventPublisherAware, which lives in spring-context, so that class must be visible on the compile classpath.)
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-context</artifactId>
    <scope>compile</scope>
</dependency>
