Here is my simple code: it reads from a Pub/Sub subscription and saves the body of each message to a Cassandra table with the current timestamp.
The messages are consumed from the subscription, but no records are inserted into the table, and there are no error messages.
However, if I change the type of the Timestamp field to Long in the TestTable class, the code works and inserts records into the table.
Here is the script that creates the table:
DROP TABLE IF EXISTS test_table;
CREATE TABLE IF NOT EXISTS test_table(
post_index int,
ingestion_time TIMESTAMP,
body text,
PRIMARY KEY ((post_index))
);
@Table(keyspace = "{keyspace_name}", name = "{table_name}",
readConsistency = "LOCAL_QUORUM",
writeConsistency = "LOCAL_QUORUM",
caseSensitiveKeyspace = false,
caseSensitiveTable = false)
class TestTable implements Serializable {
@PartitionKey
@Column(name = "post_index")
Integer postIndex;
@Column(name = "ingestion_time")
Timestamp ingestionTime;
@Column(name = "body")
String body;
public Integer getPostIndex() {
return postIndex;
}
public void setPostIndex(Integer postIndex) {
this.postIndex = postIndex;
}
public Timestamp getIngestionTime() {
return ingestionTime;
}
public void setIngestionTime(Timestamp ingestionTime) {
this.ingestionTime = ingestionTime;
}
public String getBody() {
return body;
}
public void setBody(String body) {
this.body = body;
}
public TestTable(Integer postIndex, Timestamp ingestionTime, String body) {
this.body = body;
this.ingestionTime = ingestionTime;
this.postIndex = postIndex;
}
public TestTable() {
this.body = "";
this.ingestionTime = Timestamp.from(Instant.now());
this.postIndex = 0;
}
}
public class TestCassandraJobJava {
public static void main(String[] args) {
Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
PCollection<String> data = pipeline.apply("ReadStringsFromPubsub",
PubsubIO.readStrings().fromSubscription("projects/{project_id}/subscriptions/{subscription_name}"))
.apply("window", Window.into(FixedWindows.of(Duration.standardSeconds(5))))
.apply("CreateMutation", ParDo.of(new DoFn<String, TestTable>() {
@ProcessElement
public void processElement(@Element String word, OutputReceiver<TestTable> out) {
TestTable t = new TestTable(new Random().nextInt(), java.sql.Timestamp.from(Instant.now()), word);
out.output(t);
}
})).apply(CassandraIO.<TestTable>write()
.withHosts(Arrays.asList("127.0.0.1"))
.withPort(9042)
.withKeyspace("{keyspace}")
.withLocalDc("Cassandra")
.withEntity(TestTable.class)
);
pipeline.run().waitUntilFinish();
}
}
To get this working you need a codec that converts between Cassandra's timestamp type and java.sql.Timestamp. By default, in Java driver 3.x, a timestamp is converted into java.util.Date (see the mapping documentation), although you can also use Joda-Time or the Java 8 time API via extra codecs. In Java driver 4.x, Instant is used to represent timestamps.
There is no built-in codec for java.sql.Timestamp, but it shouldn't be very hard to implement your own; the documentation describes the process of creating and using a custom codec in detail.
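For illustration, here is a minimal sketch of such a codec for driver 3.x, assuming the driver-extras module (which provides MappingCodec) is on the classpath. It delegates the wire format to the built-in timestamp codec and only converts between java.util.Date and java.sql.Timestamp:

import com.datastax.driver.core.TypeCodec;
import com.datastax.driver.extras.codecs.MappingCodec;
import java.sql.Timestamp;
import java.util.Date;

// Sketch: bridges Cassandra's timestamp type (java.util.Date in driver 3.x)
// and java.sql.Timestamp by delegating to the built-in timestamp codec.
public class SqlTimestampCodec extends MappingCodec<Timestamp, Date> {
    public SqlTimestampCodec() {
        super(TypeCodec.timestamp(), Timestamp.class);
    }

    @Override
    protected Timestamp deserialize(Date value) {
        return value == null ? null : new Timestamp(value.getTime());
    }

    @Override
    protected Date serialize(Timestamp value) {
        return value; // a java.sql.Timestamp already is a java.util.Date
    }
}

You would then register it via cluster.getConfiguration().getCodecRegistry().register(new SqlTimestampCodec()). With Beam's CassandraIO, where you don't build the Cluster yourself, the simpler fix may be to declare the entity field as java.util.Date, which the default codec already handles.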
I need to validate that my UI data and API responses are the same.
Here is the code I tried:
private ValidateContentPage cp = new ValidateContentPage();
public void getTitle() {
String UITitle = driver.findElement(titlepage).getText();
System.out.println(UITitle);
Assert.assertEquals(UITitle, cp.getAPICall(),"Passed");
}
Here is where I'm getting my API responses:
public class ValidateContentPage {
public common cm = new common();
public Properties prop;
public void baseURI() {
prop = cm.getProperties("./src/test/API/IndiaOne/propertyfile/EndpointURL.properties");
RestAssured.baseURI = prop.getProperty("baseURI");
}
public String getAPICall() {
objectpojo ps = given().expect().defaultParser(Parser.JSON).when().get(prop.getProperty("resources")).as(objectpojo.class, cm.getMapper());
int number = ps.getPosts().size();
System.out.println(number);
System.out.println(ps.getPosts().get(0).getTitle());
return ps.getPosts().get(0).getTitle();
}
}
If I validate the two using a TestNG assertion, it throws a NullPointerException. Can anyone help me with how to validate that my UI data and API responses match?
You need to call your ValidateContentPage from the @Test method itself or from a @BeforeTest method:
@Test
public void getTitle() {
String UITitle = driver.findElement(titlepage).getText();
System.out.println(UITitle);
ValidateContentPage cp = new ValidateContentPage();
Assert.assertEquals(UITitle, cp.getAPICall(),"Passed");
}
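If you'd rather initialize things once, here is a possible sketch using TestNG's @BeforeTest (driver and titlepage are assumed to come from the asker's test class; note that baseURI() also has to run before getAPICall(), since it is what loads the properties into prop):

private ValidateContentPage cp;

@BeforeTest
public void setUp() {
    cp = new ValidateContentPage();
    cp.baseURI(); // loads EndpointURL.properties, so prop is not null later
}

@Test
public void getTitle() {
    String UITitle = driver.findElement(titlepage).getText();
    Assert.assertEquals(UITitle, cp.getAPICall(), "UI title should match API title");
}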
I'm new to the Spring Boot Java framework and I'm building a simple application. When my service starts, I want to read a raw file from a URL, parse that data, and upload it into my MongoDB Atlas database. So far this is what I have:
@Service
public class CoronaVirusDataService {
private List<LocationStats> allConfirmedStats = new ArrayList<>();
MongoOperations mongoOperations;
@PostConstruct // run this method as soon as the application starts
@Scheduled(cron = "0 0 1 * * *") // then execute it every day at 1 AM
public void fetchVirusData() {
List<LocationStats> newStats = new ArrayList<>(); // to hold the stats of each state
HttpClient client = HttpClient.newHttpClient();
// creating a new http request
HttpRequest request = HttpRequest.newBuilder()
.uri(URI.create(ConstantsUtil.VIRUS_CONFIRMED_DATA_URL))
.build();
// get a response by having the client send the request
try {
HttpResponse<String> httpResponse = client.send(request, HttpResponse.BodyHandlers.ofString());
// parse the body of the response from CSV format into records
StringReader csvBodyReader = new StringReader(httpResponse.body());
Iterable<CSVRecord> records = CSVFormat.DEFAULT.withFirstRecordAsHeader().parse(csvBodyReader);
for (CSVRecord record: records) {
// create a model with the parsed data
LocationStats stats = new LocationStats();
stats.setState(record.get("Province/State"));
stats.setCountry(record.get("Country/Region"));
// the latest day
int latestCases = Integer.parseInt(record.get(record.size() - 1));
int prevDayCases = Integer.parseInt(record.get(record.size() - 2));
stats.setLatestTotalCases(latestCases);
stats.setDiffFromPreviousDay(latestCases - prevDayCases);
mongoOperations.save(stats);
// add to new stats
newStats.add(stats);
}
// assign to class array -> we use this array to display the data
this.allConfirmedStats = newStats;
} catch (IOException | InterruptedException e) {
e.printStackTrace();
}
}
}
The main issue is that the data is not saved to MongoDB when I call mongoOperations.save(). Also, I've learned that it is bad practice to maintain state in a service. What is the best practice here? Will inserting the data into MongoDB take care of that, since we would no longer be managing state?
Here is the model class that I want to save to MongoDB:
@Document(collection = "LocationStats")
public class LocationStats {
/** Location model to show corona virus statistics in each state*/
@Id
private String state;
private String country;
private int latestTotalCases;
private int diffFromPreviousDay;
public String getState() {
return state;
}
public void setState(String state) {
this.state = state;
}
public String getCountry() {
return country;
}
public void setCountry(String country) {
this.country = country;
}
public int getLatestTotalCases() {
return latestTotalCases;
}
public void setLatestTotalCases(int latestTotalCases) {
this.latestTotalCases = latestTotalCases;
}
public int getDiffFromPreviousDay() {
return diffFromPreviousDay;
}
public void setDiffFromPreviousDay(int diffFromPreviousDay) {
this.diffFromPreviousDay = diffFromPreviousDay;
}
@Override
public String toString() {
return "LocationStats{" +
"state='" + state + '\'' +
", country='" + country + '\'' +
", latestTotalCases=" + latestTotalCases +
'}';
}
}
Once I have my models saved into MongoDB, I want to read all the data back from the collection and display it on the webpage. I'm thinking I'd fetch that data within the controller class and pass it to the frontend. Is this good practice? Here is my controller class:
@Controller
public class HomeController {
/** Controller class to generate/render the html UI */
@Autowired
CoronaVirusDataService coronaVirusDataService;
@Autowired
MongoOperations mongoOperations;
@GetMapping("/") // map this to the root template
public String home(Model model) {
List<LocationStats> allStats = coronaVirusDataService.getAllConfirmedStats();
// instead of above getter method, have a method call that fetches all data from mongoDB and return it as a List<LocationStats>
// get the total confirmed cases
int totalConfirmedCases = allStats.stream().mapToInt(LocationStats::getLatestTotalCases).sum();
int totalNewCases = allStats.stream().mapToInt(LocationStats::getDiffFromPreviousDay).sum();
// send the models to the view
model.addAttribute("locationStats", allStats);
model.addAttribute("totalReportedCases", totalConfirmedCases);
model.addAttribute("totalNewCases", totalNewCases);
return "home";
}
}
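For reference, here is a minimal sketch of the repository-based approach that the comment in home() alludes to; the interface and variable names are assumptions, not from the original post. Spring Data generates the implementation at runtime, the service writes through it, and the controller reads through it, so no list needs to be held in the service. Note also that, as posted, mongoOperations is a plain field that is never injected, so it is null by the time fetchVirusData() calls save(); injecting the dependency (for example through the constructor) avoids that:

import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.stereotype.Service;

// A Spring Data repository: the implementation is generated at runtime.
public interface LocationStatsRepository extends MongoRepository<LocationStats, String> {
}

@Service
public class CoronaVirusDataService {
    private final LocationStatsRepository repository;

    // Constructor injection guarantees the dependency is present.
    public CoronaVirusDataService(LocationStatsRepository repository) {
        this.repository = repository;
    }

    // ... inside fetchVirusData(), persist each parsed record with:
    //     repository.save(stats);
}

In the controller, an injected LocationStatsRepository and a call to its findAll() would then replace the service getter and the stored list.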
I'm using DynamoDB with the Java SDK, but I'm having some issues querying nested documents. I've included simplified code below. If I remove the filter expression, everything is returned; with the filter expression, nothing is returned. I've also tried using withQueryFilterEntry (which I'd prefer to use) and I get the same results. Any help is appreciated. Most of the documentation and forums online seem to use an older version of the Java SDK than the one I'm using.
Here's the JSON:
{
  "conf": { "type": "some" },
  "desc": "else"
}
Here's the query
DynamoDBQueryExpression<JobDO> queryExpression = new DynamoDBQueryExpression<JobDO>();
queryExpression.withFilterExpression("conf.Type = :type").addExpressionAttributeValuesEntry(":type", new AttributeValue(type));
return dbMapper.query(getItemType(), queryExpression);
Is it a naming issue? (your sample json has "type" but the query is using "Type")
e.g. the following is working for me using DynamoDB Local:
public static void main(String [] args) {
AmazonDynamoDBClient client = new AmazonDynamoDBClient(new BasicAWSCredentials("akey1", "skey1"));
client.setEndpoint("http://localhost:8000");
DynamoDBMapper mapper = new DynamoDBMapper(client);
client.createTable(new CreateTableRequest()
.withTableName("nested-data-test")
.withAttributeDefinitions(new AttributeDefinition().withAttributeName("desc").withAttributeType("S"))
.withKeySchema(new KeySchemaElement().withKeyType("HASH").withAttributeName("desc"))
.withProvisionedThroughput(new ProvisionedThroughput().withReadCapacityUnits(1L).withWriteCapacityUnits(1L)));
NestedData u = new NestedData();
u.setDesc("else");
Map<String, String> c = new HashMap<String, String>();
c.put("type", "some");
u.setConf(c);
mapper.save(u);
DynamoDBQueryExpression<NestedData> queryExpression = new DynamoDBQueryExpression<NestedData>();
queryExpression.withHashKeyValues(u);
queryExpression.withFilterExpression("conf.#t = :type")
.addExpressionAttributeNamesEntry("#t", "type") // returns nothing if use "Type"
.addExpressionAttributeValuesEntry(":type", new AttributeValue("some"));
for(NestedData u2 : mapper.query(NestedData.class, queryExpression)) {
System.out.println(u2.getDesc()); // "else"
}
}
NestedData.java:
@DynamoDBTable(tableName = "nested-data-test")
public class NestedData {
private String desc;
private Map<String, String> conf;
@DynamoDBHashKey
public String getDesc() { return desc; }
public void setDesc(String desc) { this.desc = desc; }
@DynamoDBAttribute
public Map<String, String> getConf() { return conf; }
public void setConf(Map<String, String> conf) { this.conf = conf; }
}
I am facing an issue (getting a null response) when I am trying to query in Java using Spring Data MongoDB.
I need to query based on the placed timestamp range plus the release's ffmCenterDesc and relStatus.
My document is as follows:
<ordersAuditRequest>
<ordersAudit>
<createTS>2013-04-19 12:19:17.165</createTS>
<orderSnapshot>
<orderId>43060151</orderId>
<placedTS>2013-04-19 12:19:17.165</placedTS>
<releases>
<ffmCenterDesc>TW</ffmCenterDesc>
<relStatus>d </relStatus>
</releases>
</orderSnapshot>
</ordersAudit>
</ordersAuditRequest>
I am using the following query, but it returns null:
Query query = new Query();
query.addCriteria(Criteria.where("orderSnapshot.releases.ffmCenterDesc").is(ffmCenterDesc)
.and("orderSnapshot.releases.relStatus").is(relStatus)
.andOperator(
Criteria.where("orderSnapshot.placedTS").gt(orderPlacedStart),
Criteria.where("orderSnapshot.placedTS").lt(orderPlacedEnd)
)
);
I can't reproduce your problem, which suggests that the issue is a mismatch between the values in the database and the values you're passing in to the query. This is not unusual when you're trying to match dates: you need to make sure they're stored as ISODates in the database and queried with java.util.Date values in the query.
I have a test that shows your query working, but I've made a number of assumptions about your data.
My test looks like this, hopefully this will help point you in the correct direction, or if you give me more feedback I can re-create your problem more accurately.
@Test
public void shouldBeAbleToQuerySpringDataWithDates() throws Exception {
// Setup - insert test data into the DB
SimpleDateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd' 'HH:mm:ss.SSS");
MongoTemplate mongoTemplate = new MongoTemplate(new Mongo(), "TheDatabase");
// cleanup old test data
mongoTemplate.getCollection("ordersAudit").drop();
Release release = new Release("TW", "d");
OrderSnapshot orderSnapshot = new OrderSnapshot(43060151, dateFormat.parse("2013-04-19 12:19:17.165"), release);
OrdersAudit ordersAudit = new OrdersAudit(dateFormat.parse("2013-04-19 12:19:17.165"), orderSnapshot);
mongoTemplate.save(ordersAudit);
// Create and run the query
Date from = dateFormat.parse("2013-04-01 01:00:05.000");
Date to = dateFormat.parse("2014-04-01 01:00:05.000");
Query query = new Query();
query.addCriteria(Criteria.where("orderSnapshot.releases.ffmCenterDesc").is("TW")
.and("orderSnapshot.releases.relStatus").is("d")
.andOperator(
Criteria.where("orderSnapshot.placedTS").gt(from),
Criteria.where("orderSnapshot.placedTS").lt(to)
)
);
// Check the results
List<OrdersAudit> results = mongoTemplate.find(query, OrdersAudit.class);
Assert.assertEquals(1, results.size());
}
public class OrdersAudit {
private Date createdTS;
private OrderSnapshot orderSnapshot;
public OrdersAudit(final Date createdTS, final OrderSnapshot orderSnapshot) {
this.createdTS = createdTS;
this.orderSnapshot = orderSnapshot;
}
}
public class OrderSnapshot {
private long orderId;
private Date placedTS;
private Release releases;
public OrderSnapshot(final long orderId, final Date placedTS, final Release releases) {
this.orderId = orderId;
this.placedTS = placedTS;
this.releases = releases;
}
}
public class Release {
String ffmCenterDesc;
String relStatus;
public Release(final String ffmCenterDesc, final String relStatus) {
this.ffmCenterDesc = ffmCenterDesc;
this.relStatus = relStatus;
}
}
Notes:
This is a TestNG class, not JUnit.
I've used SimpleDateFormat to create Java Date classes, this is just for ease of use.
The XML value you pasted for relStatus included spaces, which I have stripped.
You showed us the document structure in XML, not JSON, so I've had to assume what your data looks like. I've translated it almost directly into JSON, so it looks like this in the database:
{
"_id" : ObjectId("51d689843004ec60b17f50de"),
"_class" : "OrdersAudit",
"createdTS" : ISODate("2013-04-18T23:19:17.165Z"),
"orderSnapshot" : {
"orderId" : NumberLong(43060151),
"placedTS" : ISODate("2013-04-18T23:19:17.165Z"),
"releases" : {
"ffmCenterDesc" : "TW",
"relStatus" : "d"
}
}
}
You can find what yours really looks like by doing a db.<collectionName>.findOne() call in the mongoDB shell.
I have a simple entity class, and it is supposed to enforce unique names.
@Entity
class Package {
@PrimaryKey(sequence = "ID")
public Long id;
@SecondaryKey(relate = Relationship.ONE_TO_ONE)
public String name;
private Package() {}
public Package(String name) { this.name = name; }
@Override
public String toString() { return id + " : " + name; }
}
I want to use the deferred-write option because of extensive modifications. Here is the test I tried, and its output:
final String dbfilename = "test01";
new File(dbfilename).mkdirs();
EnvironmentConfig config = new EnvironmentConfig().setAllowCreate(true);
Environment environment = new Environment(new File(dbfilename), config);
StoreConfig storeConfig = new StoreConfig().setAllowCreate(true).setDeferredWrite(true);
EntityStore store = new EntityStore(environment, "", storeConfig);
PrimaryIndex<Long, Package> primaryIndex = store.getPrimaryIndex(Long.class, Package.class);
try {
primaryIndex.put(new Package("package01")); // will be put.
primaryIndex.put(new Package("package01")); // throws exception.
} catch (UniqueConstraintException ex) {
System.out.println(ex.getMessage());
}
store.sync(); // flush them all
// expecting to find one element
SortedMap<Long,Package> sortedMap = primaryIndex.sortedMap();
for (Package entity : sortedMap.values()) {
System.out.println(entity);
}
Output
(JE 5.0.73) Unique secondary key is already present
1 : package01
2 : package01
So my question is: even though it throws an exception while putting the second package, why does it list two packages? Is there any way to avoid this without using transactions?
Thanks.
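One possible workaround, sketched here as an assumption based on the output above (where the primary record evidently survives the failed put), is to delete the just-inserted record by its sequence-assigned id when the unique-constraint exception fires; this reuses the variables from the test:

Package second = new Package("package01");
try {
    primaryIndex.put(second); // the sequence assigns second.id before the secondary check fails
} catch (UniqueConstraintException ex) {
    // The primary record was already written, so remove it by hand.
    if (second.id != null) {
        primaryIndex.delete(second.id);
    }
}
store.sync();

A transaction would make the put atomic instead; without one, manual cleanup along these lines appears to be necessary.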