Groovy Jenkins script println doesn't work - java

As part of my promote-artifact task I am building a query to send to Artifactory. Unfortunately it does not work, so I am trying to debug it step by step. Here is the message I am preparing to send. Somehow println prints "check", but shows nothing for message in the logs. Why?
stage('Promote') {
    id = "0.1-2020-01-28-18-08.zip"
    try {
        message = """
        items.find(
            {
                "$and": [
                    { "repo": {"$eq": "generic-dev-local"} },
                    { "path": {"$match": "mh/*"} },
                    { "name": {"$eq": ${id}}}
                ]
            }
        ).include("artifact.module.build.number")
        """
        println "check"
        println message
    } catch (e) {
        return [''] + e.toString().tokenize('\n')
    }
}
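A likely cause (an assumption, since the full build log isn't shown): inside a triple-double-quoted Groovy string, `$and`, `$eq`, and `$match` are GString interpolation points, so Groovy tries to resolve properties named `and`, `eq`, and `match` instead of emitting the literal text, which typically fails with a MissingPropertyException that the catch block swallows. Escaping those dollar signs, and quoting `${id}` so the AQL stays valid JSON, is the usual fix; a sketch:

```groovy
def id = "0.1-2020-01-28-18-08.zip"
// Escape \$ so the AQL operators are emitted literally;
// quote ${id} so the value is a JSON string.
def message = """items.find({
    "\$and": [
        { "repo": { "\$eq": "generic-dev-local" } },
        { "path": { "\$match": "mh/*" } },
        { "name": { "\$eq": "${id}" } }
    ]
}).include("artifact.module.build.number")"""
println message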


Network changed error not caught by amplify error callback

Storage.put(key, file, {
  resumable: true,
  completeCallback(eve) {
  },
  errorCallback: (err) => {
    console.error('Unexpected error while uploading', err);
  },
  progressCallback(prog) {
  }
});
I want to handle this error, but I cannot find any way to do that. This is what appears in the console:
xhr.js?b50d:187 PUT https://xyz.amazonaws.com/&x-id=UploadPart net::ERR_NETWORK_CHANGED
[ERROR] 59:15.420 axios-http-handler - Network Error
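One option, assuming a recent Amplify version in which `Storage.put(..., { resumable: true })` returns an upload task exposing `pause()` and `resume()` (check this against your Amplify version): pause the upload when the browser goes offline and resume it when connectivity returns, instead of relying on `errorCallback`, which may not fire for transient `ERR_NETWORK_CHANGED` failures. The helper name below is my own.

```javascript
// Sketch: wire a resumable upload task to the browser's connectivity events.
// `uploadTask` is whatever Storage.put({ resumable: true, ... }) returned;
// `target` defaults to `window` but is injectable for testing.
function wireNetworkHandling(uploadTask, target = window) {
  target.addEventListener("offline", () => uploadTask.pause());
  target.addEventListener("online", () => uploadTask.resume());
}

// Usage (assumed Amplify API, verify against your version):
// const upload = Storage.put(key, file, { resumable: true, /* callbacks */ });
// wireNetworkHandling(upload);
```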

How to list objects in Angular by calling a Java API with header and body?

I want to list the student details by calling a Java API that returns the response below. I want the values from data, which can contain multiple entries. Below is the method I am calling in my component.ts class:
ngOnInit() {
  this.loading = true;
  this.httpService.get('/student').subscribe(
    result => {
      console.log('received results');
      this.loading = false;
      this.scouts = result;
    },
    error => {
      console.log('failed');
      this.loading = false;
    });
}
This is the API response:
{
  data: [
    {
      id: 101,
      name: "John doe",
      status: "enrolled"
    }
  ],
  errors: [ ],
  warnings: [ ],
  meta: { }
}
I tried using this HTML, but it does not work; in the component I call the http service with a GET request in ngOnInit:
<tr *ngFor="let student of students">
  <td>{{student.id}}</td>
  <td>{{student.name}}</td>
  <td>{{student.status}}</td>
</tr>
Can you please guide me on how to get the student details from the data part and list them in the front end? Thank you.
From best practices:
Use url at the service level (since it is a constant used only inside the service).
Define a StudentResponse model so that you can type your JSON response.
Define a Student model that will be a property of StudentResponse.
Then your component becomes:
ngOnInit() {
  this.loading = true;
  this.httpService.getStudents().subscribe(
    result => {
      console.log('received results');
      this.loading = false;
      this.scouts = result.data;
      /* or this.scouts = [...result.data] */
      // warning = result.warning
      // meta = result.meta
      // errors = [...result.errors]
    },
    error => {
      console.log('failed');
      this.message = error.message;
      this.loading = false;
    });
}
and the service:

const url = "http://replace_with_service_url";

@Injectable()
export class MyHttpService {
  constructor(private client: HttpClient) {
  }

  getStudents(): Observable<StudentResponse> {
    return this.client.get<StudentResponse>(url);
  }
}
export class StudentResponse {
  data: Student[];
  warning: any;
  errors: any[];
  meta: any;
}

export class Student {
  id: number;
  name = "";
  status: string; /* or an enum such as "enrolled" */
}
You can replace the service url in this code to test it.
Just change this assignment:
this.scouts = result;
to
this.scouts = result.data;
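Note also that the template in the question iterates over `students` while the component assigns to `this.scouts`; whichever name you choose, the two must match. A minimal plain-TypeScript sketch (no Angular) of the response-to-template mapping:

```typescript
interface Student { id: number; name: string; status: string; }
interface StudentResponse {
  data: Student[];
  errors: unknown[];
  warnings: unknown[];
  meta: Record<string, unknown>;
}

// Shape taken from the API response in the question
const response: StudentResponse = {
  data: [{ id: 101, name: "John doe", status: "enrolled" }],
  errors: [],
  warnings: [],
  meta: {},
};

// The property the template iterates over (*ngFor="let student of students")
const students: Student[] = response.data;
console.log(students.length, students[0].name); // → 1 "John doe"
```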

DynamoDB updated only partially

I want to update my DynamoDB table from Java using the DynamoDBMapper library.
What I did was push messages (the updates I want executed) to an SQS queue, and let my Java code consume these messages and update DynamoDB.
I found that when I push more than 150 messages in a short time using a script, all the messages are consumed, but only some of the records in DynamoDB are updated.
The code that updates DynamoDB looks like this:
@Service
public class PersistenceMessageProcessingServiceImpl implements PersistenceMessageProcessingService {

    @Override
    public void process(TextMessage textMessage) throws Exception {
        String eventData = textMessage.getText();
        updateEventStatus(eventData);
    }

    /*
     * Each input is a caseDetail message for the Event table:
     * get the data, parse it, and partially update the matching records in DynamoDB.
     * Finally, check whether any cases are still open; if not, change the state of the event.
     */
    private void updateEventStatus(String eventData) throws ParseException, IOException {
        RetryUtils retryUtils = new RetryUtils(maxRetries, waitTimeInMilliSeconds, influxService);
        SNowResponse serviceNowResponse = parseData(eventData); // sysId is extracted from the parsed message
        EventCaseMap eventCaseMap = eventCaseMapRepository.findBySysId(sysId);
        if (eventCaseMap != null) {
            Event event = eventRepository.findByEventId(eventCaseMap.getSecurityEventManagerId());
            CaseManagementDetails caseManagementDetails = event.getCaseManagementDetails();
            Case caseDetails = getCaseByCaseSystemId(caseManagementDetails, sysId);
            caseDetails.setCaseStatus("Resolved");
            caseDetails.setResolution(serviceNowResponse.getCloseCode());
            caseDetails.setResolvedBy("A");
            caseDetails.setAssessment(serviceNowResponse.getAssessment());
            caseDetails.setResolutionSource("SEM");
            retryUtils.run(() -> {
                return eventRepository.updateEvent(event);
            }, RETRY_MEASUREMENT);

            boolean stillOpen = false;
            for (Case existingCase : caseManagementDetails.getCases()) {
                if ("OPEN".equals(existingCase.getCaseStatus().toString())) {
                    stillOpen = true;
                    break;
                }
            }
            if (!stillOpen) {
                event.setState("CLOSED");
            }
        }
    }

    private Case getCaseByCaseSystemId(CaseManagementDetails caseManagementDetails, String sysId) {
        Case caseDetails = null;
        if (caseManagementDetails != null) {
            for (Case c : caseManagementDetails.getCases()) {
                if (c.getCaseSystemId() != null && c.getCaseSystemId().equalsIgnoreCase(sysId)) {
                    caseDetails = c;
                    break;
                }
            }
        }
        return caseDetails;
    }
}
/* Data model for the EventCaseMap table:
{
    "caseSystemId": "bb9cc488dbf67b40b3d57709af9619f8",
    "securityEventManagerId": "756813a4-4e48-4abb-b37e-da00e931583b"
}
*/
@Repository
public class EventCaseMapRepositoryImpl implements EventCaseMapRepository {

    @Autowired
    DynamoDBMapper dynamoDBMapper;

    @Override
    public EventCaseMap findBySysId(String sysId) {
        EventCaseMap eventCaseMap = new EventCaseMap();
        eventCaseMap.setCaseSystemId(sysId);
        return dynamoDBMapper.load(eventCaseMap, DynamoDBMapperConfig.ConsistentReads.CONSISTENT.config());
    }
}
/* Data model for the Event table:
{
    "caseManagementDetails": {
        "cases": [
            { "caseId": "SIR0123456", "caseStatus": "OPEN" },
            { "caseId": "SIR0654321", "caseStatus": "OPEN" },
            ... many other cases (about two hundred) ...
        ]
    },
    "state": "OPEN",
    "securityEventManagerId": "756813a4-4e48-4abb-b37e-da00e931583b"
}
*/
@Repository
public class EventRepositoryImpl implements EventRepository {

    @Autowired
    DynamoDBMapper dynamoDBMapper;

    @Override
    public Event findByEventId(String eventId) {
        Event event = new Event();
        event.setSecurityEventManagerId(eventId);
        return dynamoDBMapper.load(event, DynamoDBMapperConfig.ConsistentReads.CONSISTENT.config());
    }

    @Override
    public boolean updateEvent(Event event) {
        dynamoDBMapper.save(event, DynamoDBMapperConfig.SaveBehavior.UPDATE_SKIP_NULL_ATTRIBUTES.config());
        return false;
    }
}
I already tried pushing and consuming the messages one by one, in both 'Run' and 'Debug' mode in IntelliJ; everything works fine and all the cases are updated.
So I wondered whether there are consistency problems in DynamoDB, but I am already using strongly consistent reads in my code.
Does anybody know what is happening in my code?
Here are the input, the output, and the expected output.
Input: many JSON files like this:
{
    "number": "SIR0123456",
    "state": "Resolved",
    "sys_id": "bb9cc488dbf67b40b3d57709af9619f8",
    "MessageAttributes": {
        "TransactionGuid": {
            "Type": "String",
            "Value": "093ddb36-626b-4ecc-8943-62e30ffa2e26"
        }
    }
}
{
    "number": "SIR0654321",
    "state": "Resolved",
    "sys_id": "bb9cc488dbf67b40b3d57709af9619f7",
    "MessageAttributes": {
        "TransactionGuid": {
            "Type": "String",
            "Value": "093ddb36-626b-4ecc-8943-62e30ffa2e26"
        }
    }
}
Output for the Event table:
{
    "caseManagementDetails": {
        "cases": [
            { "caseId": "SIR0123456", "caseStatus": "RESOLVED" },
            { "caseId": "SIR0654321", "caseStatus": "OPEN" },
            ... many other cases (about two hundred) ...
        ]
    },
    "state": "OPEN",
    "securityEventManagerId": "756813a4-4e48-4abb-b37e-da00e931583b"
}
Expected output for the Event table:
{
    "caseManagementDetails": {
        "cases": [
            { "caseId": "SIR0123456", "caseStatus": "RESOLVED" },
            { "caseId": "SIR0654321", "caseStatus": "RESOLVED" },
            ... many other cases (about two hundred) ...
        ]
    },
    "state": "OPEN",
    "securityEventManagerId": "756813a4-4e48-4abb-b37e-da00e931583b"
}
I think the problem is that the messages are consumed by multiple threads. When we consume all of this data in a short time, several threads read and save the same item concurrently, and some of them had not finished before the next save, so the result we saw was only that of the last thread to write, not of all the threads.
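That hypothesis (concurrent read-modify-write, last writer wins) can be made concrete without DynamoDB at all. The sketch below simulates two consumers that each load the same Event snapshot, resolve a different case, and save the whole item back: the unconditional save loses one update, while a compare-and-set retry loop, the moral equivalent of DynamoDBMapper's optimistic locking via a `@DynamoDBVersionAttribute` version field you would add to the Event model, keeps both. The class and record names here are illustrative, not from the original code.

```java
import java.util.concurrent.atomic.AtomicReference;
import java.util.function.UnaryOperator;

// Stand-in for an Event item whose whole `cases` list is rewritten on save.
record EventRecord(String caseA, String caseB) {}

public class LostUpdateDemo {

    // Conditional save: retry until the snapshot we read is still current.
    // This mirrors what DynamoDBMapper does when the model has a
    // @DynamoDBVersionAttribute field.
    static void updateWithRetry(AtomicReference<EventRecord> table,
                                UnaryOperator<EventRecord> change) {
        EventRecord current;
        do {
            current = table.get();
        } while (!table.compareAndSet(current, change.apply(current)));
    }

    public static void main(String[] args) {
        AtomicReference<EventRecord> table =
                new AtomicReference<>(new EventRecord("OPEN", "OPEN"));

        // Unconditional save: both writers start from the same snapshot,
        // so whichever saves last silently discards the other's change.
        EventRecord snap1 = table.get();
        EventRecord snap2 = table.get();
        table.set(new EventRecord("RESOLVED", snap1.caseB())); // writer 1
        table.set(new EventRecord(snap2.caseA(), "RESOLVED")); // writer 2 wins
        System.out.println(table.get()); // caseA is OPEN again: lost update

        // With compare-and-set, both updates survive.
        table.set(new EventRecord("OPEN", "OPEN"));
        updateWithRetry(table, r -> new EventRecord("RESOLVED", r.caseB()));
        updateWithRetry(table, r -> new EventRecord(r.caseA(), "RESOLVED"));
        System.out.println(table.get()); // both cases RESOLVED
    }
}
```

With a version attribute on the real Event model, `dynamoDBMapper.save` throws a ConditionalCheckFailedException on a stale write, and your existing RetryUtils loop would re-load and re-apply the change instead of silently losing it.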

Druid - longSum metrics are not populating

I am doing batch ingestion in Druid using the wikiticker-index.json file that comes with the Druid quickstart.
The following is the data schema in my wikiticker-index.json file:
{
  "type": "index_hadoop",
  "spec": {
    "ioConfig": {
      "type": "hadoop",
      "inputSpec": {
        "type": "static",
        "paths": "quickstart/wikiticker-2015-09-12-sampled.json"
      }
    },
    "dataSchema": {
      "dataSource": "wikiticker",
      "granularitySpec": {
        "type": "uniform",
        "segmentGranularity": "day",
        "queryGranularity": "none",
        "intervals": ["2015-09-12/2015-09-13"]
      },
      "parser": {
        "type": "hadoopyString",
        "parseSpec": {
          "format": "json",
          "dimensionsSpec": {
            "dimensions": [
              "channel", "cityName", "comment", "countryIsoCode", "countryName",
              "isAnonymous", "isMinor", "isNew", "isRobot", "isUnpatrolled",
              "metroCode", "namespace", "page", "regionIsoCode", "regionName",
              "user"
            ]
          },
          "timestampSpec": {
            "format": "auto",
            "column": "time"
          }
        }
      },
      "metricsSpec": [
        { "name": "count", "type": "count" },
        { "name": "added", "type": "longSum", "fieldName": "added" },
        { "name": "deleted", "type": "longSum", "fieldName": "deleted" },
        { "name": "delta", "type": "longSum", "fieldName": "delta" },
        { "name": "user_unique", "type": "hyperUnique", "fieldName": "user" }
      ]
    },
    "tuningConfig": {
      "type": "hadoop",
      "partitionsSpec": {
        "type": "hashed",
        "targetPartitionSize": 5000000
      },
      "jobProperties": {}
    }
  }
}
After ingesting the sample JSON, only the following metrics show up.
I am unable to find the longSum metrics, i.e. added, deleted, and delta.
Is there any particular reason? Does anybody know about this?
OP confirmed that this comment from Slim Bougerra worked:
You need to add the metrics yourself in the Superset UI. Superset doesn't populate the metrics automatically.
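Independent of Superset, you can confirm that the metrics were actually ingested by querying the Druid broker directly. A minimal native timeseries query (assuming the quickstart broker at localhost:8082):

```json
{
  "queryType": "timeseries",
  "dataSource": "wikiticker",
  "intervals": ["2015-09-12/2015-09-13"],
  "granularity": "all",
  "aggregations": [
    { "type": "longSum", "name": "added", "fieldName": "added" },
    { "type": "longSum", "name": "deleted", "fieldName": "deleted" },
    { "type": "longSum", "name": "delta", "fieldName": "delta" }
  ]
}
```

POST this to http://localhost:8082/druid/v2 with Content-Type: application/json; non-zero sums mean the longSum metrics are in the datasource and the issue is only the metric definitions on the Superset side.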

How to save URL of every page I visit to a .txt file

What I want is that every time I visit a new page, click a link, etc., the URL is automatically saved to a .txt file.
Either Chrome or Firefox is okay.
PHP, HTML, Java, or JavaScript is okay too.
If anyone can help me it would be awesome.
chrome.browserAction.onClicked.addListener(createFile);
createFile();

function createFile() {
    // Note: chrome.tabs.getSelected and WebKitBlobBuilder are deprecated APIs.
    chrome.tabs.getSelected(null, function(tab) {
        window.webkitRequestFileSystem(window.TEMPORARY, 1024 * 1024, function(fs) {
            fs.root.getFile('test', {create: true}, function(fileEntry) {
                fileEntry.createWriter(function(fileWriter) {
                    var builder = new WebKitBlobBuilder();
                    builder.append("Saurabh");
                    builder.append("\n");
                    builder.append("Saxena");
                    var blob = builder.getBlob('text/plain');
                    fileWriter.onwriteend = function() {
                        chrome.tabs.create({"url": fileEntry.toURL(), "selected": true}, function(tab) {});
                    };
                    fileWriter.write(blob);
                }, errorHandler);
            }, errorHandler);
        }, errorHandler);
    });
}

function errorHandler(e) {
    var msg = '';
    switch (e.code) {
        case FileError.QUOTA_EXCEEDED_ERR:
            msg = 'QUOTA_EXCEEDED_ERR';
            break;
        case FileError.NOT_FOUND_ERR:
            msg = 'NOT_FOUND_ERR';
            break;
        case FileError.SECURITY_ERR:
            msg = 'SECURITY_ERR';
            break;
        case FileError.INVALID_MODIFICATION_ERR:
            msg = 'INVALID_MODIFICATION_ERR';
            break;
        case FileError.INVALID_STATE_ERR:
            msg = 'INVALID_STATE_ERR';
            break;
        default:
            msg = 'Unknown Error';
            break;
    }
    console.log('Error: ' + msg);
}
I already tried that JavaScript code using Tampermonkey in Chrome, but it didn't work.
I know how to save URLs into Chrome storage by creating a Chrome extension.
You have to create a manifest.json with the structure described below:
{
  "name": "Save URLs",
  "description": "Save URLs",
  "version": "0.7",
  "permissions": [
    "tabs", "storage"
  ],
  "background": {
    "scripts": ["store.js"]
  },
  "manifest_version": 2
}
and the JavaScript file store.js:

var urlList = [];
chrome.tabs.onUpdated.addListener(function(tabId, changeInfo, tab) {
  if (changeInfo.url) {
    urlList.push(tab.url);
    chrome.storage.sync.set({'urlList': urlList}, function() {
      // callback body
    });
    chrome.storage.sync.get('urlList', function(items) {
      alert(items.urlList);
    });
  }
});
Information about loading the extension into browser: https://developer.chrome.com/extensions/getstarted#unpacked
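Since the question asks for a .txt file rather than chrome.storage, one alternative to the deprecated webkitRequestFileSystem approach (a sketch; the helper names are my own) is to build the file body in memory and hand it to `chrome.downloads.download` as a `data:` URL. This requires the `"downloads"` permission in the manifest.

```javascript
// Background script sketch: accumulate visited URLs and save them as urls.txt.
const visited = [];

// Pure helper: turn a list of lines into a text/plain data: URL.
function toDataUrl(lines) {
  return "data:text/plain;charset=utf-8," +
      encodeURIComponent(lines.join("\n") + "\n");
}

function saveAsTxt() {
  // Requires "downloads" in the manifest's "permissions" array.
  chrome.downloads.download({ url: toDataUrl(visited), filename: "urls.txt" });
}

// Guarded so the pure helper can also run outside an extension context.
if (typeof chrome !== "undefined" && chrome.tabs) {
  chrome.tabs.onUpdated.addListener((tabId, changeInfo, tab) => {
    if (changeInfo.url) {
      visited.push(changeInfo.url);
      saveAsTxt(); // re-save the full list on every navigation
    }
  });
}
```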
