I have the following code:
@Controller
public class GatesController {

    @RequestMapping("/gates")
    public static String qualityGates(String x) throws IOException {
        try {
            System.out.println("\n------QualityGates------");
            URL toConnect = new URL(x);
            HttpURLConnection con = (HttpURLConnection) toConnect.openConnection();
            System.out.println("Sending 'GET' request to URL : " + x);
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(con.getInputStream()));
            String inputLine;
            StringBuffer response = new StringBuffer();
            while ((inputLine = in.readLine()) != null) {
                response.append(inputLine);
            }
            in.close();
            // Parse the JSON response into a JSONObject
            JSONObject res = new JSONObject(response.toString());
            JSONArray gates = new JSONArray(res.getJSONObject("projectStatus").getJSONArray("conditions").toString());
            JSONObject test = new JSONObject(res.getJSONObject("projectStatus").toString());
            String a = ("\nThe current Project-Status is: " + test.get("status") + "\n");
            String b = "";
            for (int i = 0; i < gates.length(); i++) {
                String status = gates.getJSONObject(i).getString("status");
                String metric = gates.getJSONObject(i).getString("metricKey");
                b = b + ("<\b>Status: " + status + " | Metric: " + metric);
            }
            System.out.println(a + b);
            return a + b;
        } catch (Exception e) {
            System.out.println(e);
            return String.format("Error");
        }
    }
}
@SpringBootApplication
@RestController
public class SonarQualityGatesApplication {

    public static void main(String[] args) throws IOException {
        ConfigurableApplicationContext context = SpringApplication.run(SonarQualityGatesApplication.class, args);
        TestController b = context.getBean(TestController.class);
    }

    @GetMapping("/hello")
    public String hello(@RequestParam(value = "name", defaultValue = "World") String name) {
        return String.format("Hello %s!", name);
    }

    @GetMapping("/gates")
    public String gates() throws IOException {
        String temp = qualityGates("http://localhost:9000/api/qualitygates/project_status?projectKey={PROJECT_KEY}");
        return temp;
    }
}
The problem is that the website currently looks like this:
Website_Curr
But I want a new line for every metric, not everything in one row.
As you can see, I tried adding <\b> to the string concatenation.
Do you have an idea how to fix this? This is my first web application and I am a bit stuck.
I appreciate any help!
Your "<\b>" breaks it. If you remove it and add a newLine "\n" it should work.
Like this:
String a = ("\nThe current Project-Status is: " + test.get("status") + "\n");
String b = "";
for (int i = 0; i < gates.length(); i++) {
    String status = gates.getJSONObject(i).getString("status");
    String metric = gates.getJSONObject(i).getString("metricKey");
    b = b + ("Status: " + status + " | Metric: " + metric + "\n");
}
Also, you are returning plain text, so to have it displayed correctly add produces = "text/plain" to the mapping of the method that returns the formatted String:
@GetMapping(value = "/gates", produces = "text/plain")
Then your output will be displayed with line breaks, and this way you can apply further formatting.
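Put together, the endpoint from the question could then look roughly like this (a minimal sketch; the URL with the {PROJECT_KEY} placeholder is the one from the question and qualityGates is the method shown above):

@GetMapping(value = "/gates", produces = "text/plain")
public String gates() throws IOException {
    // qualityGates now appends "\n" after every "Status ... | Metric ..." entry,
    // and text/plain makes the browser keep those line breaks.
    return qualityGates("http://localhost:9000/api/qualitygates/project_status?projectKey={PROJECT_KEY}");
}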
I have a MapReduce program which can process delimited, fixed-width and Excel files. There is no problem reading delimited and fixed-width files. The problem with Excel files is that setup() and cleanup() are getting called, but not map(). I tried adding annotations to map(), but it still didn't work.
public class RulesDriver extends Configured implements Tool {
private static Logger LOGGER = LoggerFactory.getLogger(RulesDriver.class);
RuleValidationService aWSS3Service = new RuleValidationService();
HashMap<String, Object> dataMap = new HashMap<String, Object>();
HashMap<String, String> controlMap = new HashMap<String, String>();
public String inputPath = "";
public String outputPath = "";
private static DateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd-HH-mm");
ControlFileReader ctrlReader = new ControlFileReader();
CSVToExcel csv2Excel = new CSVToExcel();
HashMap<Integer,String> countMap = new HashMap<Integer,String>();
HashMap<String,Integer> numberValueMap = new HashMap<String,Integer>();
HashMap<String,Object> rulesMap = new HashMap<String,Object>();
CharsetConvertor charsetConvertor = new CharsetConvertor();
ControlFileComparison controlFileComparison = new ControlFileComparison();
boolean isControlFileValid = false;
public static void main(String[] args) throws Exception {
int res = ToolRunner.run(new Configuration(), new RulesDriver(), args);
System.exit(res);
}
@Override
public int run(String[] args) throws Exception {
LOGGER.info("HADOOP MapReduce Driver started");
if (args.length < 3) {
LOGGER.info("Args ");
return 1;
}
int j = -1;
//Prop - Starts
String cacheBucket = args[0];
String s3accesskey = args[1];
String s3accesspass = args[2];
//Prop - ends
// file2InputPath Value is from DB
String file2InputLocation = "";
String fileComparisonInd = "";
String inputPath = "";
String outputPath = "";
String url = "";
String userId = "";
String password = "";
String fileType = "";
String ctrlCompResult = "N";
try {
Configuration conf = new Configuration();
Properties prop = new Properties();
//Prop - starts
prop.setProperty("qatool.cacheBucket", cacheBucket);
prop.setProperty("qatool.s3accesskey", s3accesskey);
prop.setProperty("qatool.s3accesspass", s3accesspass);
String propertiesFile = aWSS3Service.getObjectKey(cacheBucket, "application",prop);
if(null==propertiesFile && "".equals(propertiesFile)){
return 0;
}
S3Object s3Object = aWSS3Service.getObject(cacheBucket, propertiesFile, prop);
LOGGER.info("Loading App properties");
InputStreamReader in = new InputStreamReader(s3Object.getObjectContent());
Properties appProperties = new Properties();
try {
appProperties.load(in);
prop.putAll(appProperties);
LOGGER.info(" ",prop);
}
catch (IOException e1) {
LOGGER.error("Exception while reading properties file :" , e1);
return 0;
}
initialize(prop);
if (!(dataMap == null)) {
if (("N").equals(dataMap.get("SuccessIndicator"))) {
return 0;
}
List value = (ArrayList) dataMap.get("LookUpValList");
LOGGER.info("lookUpVallist",value);
}
if (dataMap != null) {
controlMap = (HashMap<String, String>) dataMap.get("ControlMap");
}
if (controlMap != null) {
inputPath = (prop.getProperty("qatool.rulesInputLocation") + "/").concat((String) dataMap.get("InputFileName")); //TEMP
LOGGER.info(inputPath);
fileType = (String) dataMap.get("FileType");
} else {
inputPath = (prop.getProperty("qatool.rulesInputLocation") + "/").concat((String) dataMap.get("InputFileName"));
LOGGER.info(inputPath);
fileType = (String) dataMap.get("FileType");
}
rulesMap = (HashMap<String,Object>)dataMap.get("RulesMap");
isControlFileValid = controlFileComparison.compareControlFile(controlMap, aWSS3Service, prop, rulesMap); //TEMP
LOGGER.info("isControlFileValid in driver : "+isControlFileValid);
if(isControlFileValid){
ctrlCompResult = "Y";
}
conf.set("isControlFileValid", ctrlCompResult);
// ** DATABASE Connection **/
String ctrlFileId = controlMap.get("ControlFileIdentifier");
url = prop.getProperty(QaToolRulesConstants.DB_URL);
userId = prop.getProperty(QaToolRulesConstants.DB_USER_ID);
password = prop.getProperty(QaToolRulesConstants.DB_USER_DET);
InpflPrcsSumm inpflPrcsSumm = new InpflPrcsSumm();
DBConnectivity dbConnectivity = new DBConnectivity(url, userId, password);
inpflPrcsSumm = dbConnectivity.getPreviousFileDetail(ctrlFileId);
dbConnectivity.closeConnection();
LOGGER.info( " inpflPrcsSumm.getPrevFileId() " + inpflPrcsSumm.getPrevFileId());
prop.setProperty(QaToolRulesConstants.PREV_FILE_ID, inpflPrcsSumm.getPrevFileId().toString());
file2InputLocation = inpflPrcsSumm.getPrevFileLocation();
boolean file2Available = file2InputLocation.isEmpty();
String folderPath = "";
String bucket = "";
if (!file2Available) {
String arr[] = file2InputLocation.split("/");
if(file2InputLocation.startsWith("http")){
bucket = arr[3];
}else{
bucket = arr[2];
}
folderPath = file2InputLocation.substring(file2InputLocation.lastIndexOf(bucket) + bucket.length() + 1, file2InputLocation.length());
}
// File 2 input path
prop.setProperty("qatool.file2InputPath", file2InputLocation);
if(!file2Available){
file2InputLocation = file2InputLocation + "/Success";
String file2Name = aWSS3Service.getObjectKey(bucket, folderPath,prop);
LOGGER.info("bucket->"+bucket);
LOGGER.info("folderPath->"+folderPath);
file2Name = file2Name.substring(file2Name.lastIndexOf("/")+1, file2Name.length());
prop.setProperty("file2Name", (null!=file2Name && "".equals(file2Name))?"":file2Name);
LOGGER.info(prop.getProperty("file2Name"));
}
prop.setProperty("qatool.auditPrevFolderPath", folderPath);
prop.setProperty("qatool.auditBucketPrevFolderPath", bucket);
LOGGER.info("ctrlFileId : " + ctrlFileId);
LOGGER.info("BUCKET : " + bucket);
LOGGER.info("folder : " + folderPath);
Date dateobj = new Date();
outputPath = (String) prop.getProperty("qatool.rulesOutputLocation") + "/" + dateFormat.format(dateobj); //TEMP
fileComparisonInd = controlMap.get("FileComparisonIndicator");
Gson gson = new Gson();
String propSerilzation = gson.toJson(prop);
conf.set("application.properties", propSerilzation);
Job job = Job.getInstance(conf);
job.setJarByClass(RulesDriver.class);
job.setJobName("Rule Validation and Comparison");
job.getConfiguration().set("fs.s3n.awsAccessKeyId", (String) prop.getProperty("qatool.s3accesskey"));
job.getConfiguration().set("fs.s3n.awsSecretAccessKey", (String) prop.getProperty("qatool.s3accesspass"));
job.getConfiguration().set("fs.s3.awsAccessKeyId", (String) prop.getProperty("qatool.s3accesskey"));
job.getConfiguration().set("fs.s3.awsSecretAccessKey", (String) prop.getProperty("qatool.s3accesspass"));
job.getConfiguration().set("fs.s3a.awsAccessKeyId", (String) prop.getProperty("qatool.s3accesskey"));
job.getConfiguration().set("fs.s3a.awsSecretAccessKey", (String) prop.getProperty("qatool.s3accesspass"));
job.getConfiguration().set("fs.s3n.endpoint", "s3.amazonaws.com");
job.getConfiguration().set("fs.s3.endpoint", "s3.amazonaws.com");
job.getConfiguration().set("fs.s3a.endpoint", "s3.amazonaws.com");
job.setMapOutputKeyClass(Text.class);
job.setMapOutputValueClass(Text.class);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(Text.class);
job.setReducerClass(RulesCountReducer.class);
job.setNumReduceTasks(1);
job.setMaxMapAttempts(1);
job.setMaxReduceAttempts(1);
if("UTF-16".equalsIgnoreCase(controlMap.get("FileCodePage"))){
convertEncoding((String)dataMap.get("InputFileName"),rulesInputLocation,prop);
if (!file2Available && "Y".equals(ctrlCompResult)) {
convertEncoding(inpflPrcsSumm.getPrevFileName(),file2InputLocation,prop);
}
}
LOGGER.info("fileComparisonInd + "+ fileComparisonInd + " file2Available + " + file2Available + " ctrlCompResult + " + ctrlCompResult);
if (fileType != null && fileType.equals(QaToolRulesConstants.INPUT_FILE_TYPE_DELIMI)) {
job.setInputFormatClass(TextInputFormat.class);
job.setOutputFormatClass(TextOutputFormat.class);
MultipleInputs.addInputPath(job, new Path(rulesInputLocation), TextInputFormat.class, TextRulesMapper.class);
if (fileComparisonInd.equalsIgnoreCase(QaToolRulesConstants.FILE_COMP_INDICATOR) && !file2Available && "Y".equals(ctrlCompResult)) {
Path file2InputPath = new Path(file2InputLocation);
if (isInputPathAvail(file2InputPath, conf)) {
MultipleInputs.addInputPath(job, file2InputPath, TextInputFormat.class, TextRulesMapperFile2.class);
}
}
} else if (fileType != null && fileType.equals(QaToolRulesConstants.INPUT_FILE_TYPE_EXCEL)) {
job.setInputFormatClass(ExcelInputFormat.class);
job.setOutputFormatClass(TextOutputFormat.class);
MultipleInputs.addInputPath(job, new Path(rulesInputLocation), ExcelInputFormat.class, ExcelMapper.class);
String inputFileName = controlMap.get("InputFileName");
String fileExtn = inputFileName.substring(inputFileName.lastIndexOf(".") + 1);
prop.setProperty("File", "Excel");
prop.setProperty("fileExtension", fileExtn);
if (fileComparisonInd.equalsIgnoreCase(QaToolRulesConstants.FILE_COMP_INDICATOR) && !file2Available && "Y".equals(ctrlCompResult)) {
Path file2InputPath = new Path(file2InputLocation);
if (isInputPathAvail(file2InputPath, conf)) {
MultipleInputs.addInputPath(job, file2InputPath, ExcelInputFormat.class, ExcelMapper2.class);
}
}
} else if (fileType != null && fileType.equals(QaToolRulesConstants.INPUT_FILE_TYPE_FIXED)) {
prop.setProperty("File", "DAT");
MultipleInputs.addInputPath(job, new Path(rulesInputLocation), TextInputFormat.class, FixedWidthMapper.class);
if (fileComparisonInd.equalsIgnoreCase(QaToolRulesConstants.FILE_COMP_INDICATOR) && !file2Available && "Y".equals(ctrlCompResult)) {
Path file2InputPath = new Path(file2InputLocation);
if (isInputPathAvail(file2InputPath, conf)) {
MultipleInputs.addInputPath(job, file2InputPath, TextInputFormat.class, FixedWidthMapper2.class);
}
}
}
MultipleOutputs.addNamedOutput(job, "error", TextOutputFormat.class, Text.class, Text.class);
MultipleOutputs.addNamedOutput(job, "success", TextOutputFormat.class, Text.class, Text.class);
MultipleOutputs.addNamedOutput(job, QaToolRulesConstants.ADDED_DELETED, TextOutputFormat.class, Text.class, Text.class );
/*MultipleOutputs.addNamedOutput(job, QaToolRulesConstants.ADDED_UPDATED, TextOutputFormat.class, Text.class, Text.class );*/ //TEMP ADDED FOR ADDED AND UPDATED
MultipleOutputs.addNamedOutput(job, QaToolRulesConstants.DETAIL, TextOutputFormat.class, Text.class, Text.class);
FileOutputFormat.setOutputPath(job, new Path(outputPath));
j = job.waitForCompletion(true) ? 0 : 1;
LOGGER.info("Program Complted with return " + j);
//Code Added for Control file Movement -- starts
String outputBucket = rulesOutputLocation;
outputBucket = outputBucket.substring(outputBucket.indexOf("//")+2, outputBucket.length());
outputBucket = outputBucket.substring(0,(outputBucket.indexOf("/")));
String controlFileNamekey = aWSS3Service.getObjectKey(outputBucket, "delivery/"+ dataMap.get("ControlFileName"),prop);
if (controlFileNamekey != null) {
controlFileNamekey = (String) controlFileNamekey.substring(controlFileNamekey.lastIndexOf("/") + 1,controlFileNamekey.length());
String outputCtrlFilePath = "delivery/"+ dateFormat.format(dateobj) +"/" + controlFileNamekey;
LOGGER.info("controlFileNamekey "+controlFileNamekey+" outputCtrlFilePath "+outputCtrlFilePath);
aWSS3Service.moveObjects(outputBucket, "delivery/"+controlFileNamekey, outputBucket, outputCtrlFilePath,prop);
}
//Code Added for Control file Movement -- Ends
if (j == 0) {
// Get counters
LOGGER.info("Transfer");
final Counters counters = job.getCounters();
long duplicates = counters.findCounter(MATCH_COUNTER.DUPLICATES).getValue();
LOGGER.info("duplicates->"+duplicates);
long groupingThreshold = counters.findCounter(MATCH_COUNTER.GROUPING_ERR_THRESHOLD).getValue();
LOGGER.info("groupingThreshold->"+groupingThreshold);
if(groupingThreshold==1 || duplicates==1){
if(duplicates==1){
writeOutputFile(folderName,dateobj,"DuplicateRecords",prop,cacheBucket);
}else{
writeOutputFile(folderName,dateobj,"GroupingThreshold",prop,cacheBucket);
}
}else{
long successCount = counters.findCounter(MATCH_COUNTER.SUCCESS_COUNT).getValue();
if (controlMap.get("ColumnHeaderPresentIndicator") != null
&& controlMap.get("ColumnHeaderPresentIndicator").equals("Y")) {
successCount = successCount-1;
}
LOGGER.info("successCount "+successCount);
LOGGER.info("TOLERANCEVALUE " + counters.findCounter(MATCH_COUNTER.TOLERANCEVALUE).getValue());
LOGGER.info("RULES_ERRORTHRESHOLD " + counters.findCounter(MATCH_COUNTER.RULES_ERRORTHRESHOLD).getValue());
long errorThreshold = counters.findCounter(MATCH_COUNTER.RULES_ERRORTHRESHOLD).getValue();
LOGGER.info("COMPARISION_ERR_THRESHOLD " + counters.findCounter(MATCH_COUNTER.COMPARISION_ERR_THRESHOLD).getValue());
writeOutputFile(folderName,dateobj, outputPath + "," + successCount + "," + counters.findCounter(MATCH_COUNTER.TOLERANCEVALUE).getValue() + "," + errorThreshold + ","
+counters.findCounter(MATCH_COUNTER.COMPARISION_ERR_THRESHOLD).getValue()+","+ctrlCompResult,prop,cacheBucket);
String auditBucketName = "";
auditBucketName = rulesOutputLocation;
auditBucketName = auditBucketName.substring(auditBucketName.indexOf("//") + 2, auditBucketName.length() - 1);
auditBucketName = auditBucketName.substring(0, (auditBucketName.indexOf("/")));
String auditFileMovementPath = "delivery/" + dateFormat.format(dateobj);
auditFile = auditFile.replace(".xlsx","");
String inputFileName = (String) dataMap.get("InputFileName");
inputFileName = inputFileName.substring(0,inputFileName.lastIndexOf(".")).concat(".xlsx");
try {
LOGGER.info("Audit Bucket Name : " + auditBucketName);
LOGGER.info("Move parameter >>> outputbucketname : auditFileLocation : auditBucketName : auditFileMovementPath auditFile ");
LOGGER.info("Move parameter " + outputbucketname + ", " + auditFileLocation + " , " + auditBucketName + " , " + auditFileMovementPath + "/" + auditFile + "_" + inputFileName);
aWSS3Service.moveObjects(outputbucketname, auditFileLocation, auditBucketName, auditFileMovementPath +"/"+ auditFile +"_"+ inputFileName, prop);
} catch (Exception e) {
LOGGER.error("Exception while moving audit file ",e);
}
}
}else{
writeOutputFile(folderName,dateobj,"DuplicateRecords",prop,cacheBucket);
}
} catch (Exception e) {
LOGGER.error("Error in RulesDriver ", e);
}
return j;
}
}
Excel Mapper :
public class ExcelMapper extends Mapper<LongWritable, Text, Text, Text> {

    @Override
    protected void setup(Mapper<LongWritable, Text, Text, Text>.Context context) throws InterruptedException, IOException {
        LOGGER.info("Inside Mapper Setup");
    }

    @Override
    public void map(LongWritable key, Text value, Context context) throws InterruptedException, IOException {
    }

    @Override
    protected void cleanup(Context context) throws IOException, InterruptedException {
    }
}
How can I split a flat string based on "0102**"? StringTokenizer only works for "**". Is there any way to split based on "0102**"? Please suggest.
Here is my complete method:
private String handleCibil(InterfaceRequestVO ifmReqDto, String szExtIntType) throws MalformedURLException, org.apache.axis.AxisFault, RemoteException {
/* Declaration and initiliazation */
ConfVO confvo = ifmReqDto.getExtConfVo();
String szResponse = null;
String cibilResponse = null;
String errorResponse = null;
String endpointURL = null;
long timeOut = confvo.getBurMgr().getBurInfo(szExtIntType).getTimeOut();
endpointURL = formWebServiceURL(confvo, szExtIntType);
URL url = new URL(endpointURL);
log.debug("Input xml for cibil "+ifmReqDto.getIfmReqXML());
BasicHttpStub stub= new BasicHttpStub(url,new org.apache.axis.client.Service());
szResponse = stub.executeXMLString(ifmReqDto.getIfmReqXML());
//szResponse = szResponse.replaceAll("&amp;", "&");
log.debug("szResponse "+szResponse);
/* Validate if the obtained response is as expected by IFM */
try {
extDao = new ExtInterfaceXMLTransDAO(ifmReqDto.getSemCallNo(), ifmReqDto.getIdService());
extDao.updateRqstRespXML10g(ifmReqDto.getInterfaceReqNum(), szResponse, GGIConstants.IFM_RESPONSE);
//log.debug("CIBIL_RESPONSE_XPATH " + GGIConstants.CIBIL_RESPONSE_XPATH);
Document xmlDocument = DocumentHelper.parseText(szResponse);
String xPath = GGIConstants.RESPONSE_XPATH;
List<Node> nodes = xmlDocument.selectNodes(xPath);
for (Node node : nodes) {
String keyValue = node.valueOf(GGIConstants.RESPONSE_XPATH_KEY);
// log.debug("keyValue : " + keyValue);
if (keyValue.equalsIgnoreCase(GGIConstants.RESPONSE_XPATH_KEY_VALUE)) {
// log.debug("node value : " + node.getText());
cibilResponse = node.getText();
}
}
log.debug("cibilResponse " + cibilResponse);
String errorResponseXPATH = GGIConstants.CIBIL_ERROR_RESPONSE_XPATH;
List<Node> errorResponseNode = xmlDocument.selectNodes(errorResponseXPATH);
for (Node node : errorResponseNode) {
errorResponse = node.getText();
}
log.debug("errorResponse " + errorResponse);
if(cibilResponse!=null && cibilResponse.length()>0)
{
StringTokenizer cibilResponseResults = new StringTokenizer(cibilResponse,"**");
String tempResponse="";
ArrayList probableMatchList = new ArrayList();
while (cibilResponseResults.hasMoreElements()) {
tempResponse = (String) cibilResponseResults.nextElement();
if(tempResponse.length()>=80)
{
String memberRefNo = tempResponse.substring(69, 80).replaceAll(" ", "");
log.debug("memberRefNo " + memberRefNo);
if (memberRefNo.length() > 0) {
if (Integer.parseInt(memberRefNo) > 0) {
cibilResponse = tempResponse;
cibilResponse = cibilResponse+"**";
}
else
{
probableMatchList.add(tempResponse+"**");
}
}
else
{
probableMatchList.add(tempResponse+"**");
}
}
else
{
cibilResponse = tempResponse+"**";
}
}
log.debug("After finding the Member reference number cibilResponse " + cibilResponse);
log.debug("After finding the Probable reference list " + probableMatchList);
// TKN 008
cibilResponse=StringEscapeUtils.unescapeXml(cibilResponse).replaceAll("[^\\x20-\\x7e]","");
ifmReqDto.setIfmTransformedResult(cibilResponse);
ifmReqDto.setProbableMatchList(probableMatchList);
}
if (errorResponse!=null && errorResponse.length()>0) {
throw new GenericInterfaceException(errorResponse
+ " for the seq_request " + ifmReqDto.getSeqRequest() + " Seq_Interface_req is >> "
+ ifmReqDto.getInterfaceReqNum(),
GGIConstants.SEND_REQUEST_CONSTANT + Strings.padStart(String.valueOf(ifmReqDto.getIdService()), 2, GGIConstants.DEFAULT_NUMBER_STRING)
+ GGIConstants.CIBIL_ERROR_CODE);
}
else if (cibilResponse==null || StringUtils.isEmpty(cibilResponse) ) {
throw new GenericInterfaceException("Cibil TUEF response is empty >> cibil Service "
+ "for the seq_request " + ifmReqDto.getSeqRequest() + "Seq_Interface_req is >> "
+ ifmReqDto.getInterfaceReqNum(),
GGIConstants.SEND_REQUEST_CONSTANT + Strings.padStart(String.valueOf(ifmReqDto.getIdService()), 2, GGIConstants.DEFAULT_NUMBER_STRING)
+ GGIConstants.INTERFACE_ERROR_RESPONSE);
}
/* Setting Instinct response to ifmReqDto object */
} catch (SQLException e) {
log.error("SQLException while connecting to DataBase. Exception message is ", e);
throw new GenericInterfaceException("SQLException >> Instinct Service "
+ "for the seq_request " + ifmReqDto.getSeqRequest() + "Seq_Interface_req is >> "
+ ifmReqDto.getInterfaceReqNum(),
GGIConstants.SEND_REQUEST_CONSTANT + Strings.padStart(String.valueOf(ifmReqDto.getIdService()), 2, GGIConstants.DEFAULT_NUMBER_STRING)
+ GGIConstants.DB_OPERATION_ERROR);
} catch (GenericInterfaceException exp) {
log.error("Exception occured while valid:", exp);
throw exp;
} catch (Exception exp) {
log.error("Exception occured while valid:", exp);
throw new GenericInterfaceException("GeneralException >> Instinct Service "
+ "for the seq_request " + ifmReqDto.getSeqRequest() + "Seq_Interface_req is >> "
+ ifmReqDto.getInterfaceReqNum(),
GGIConstants.SEND_REQUEST_CONSTANT + Strings.padStart(String.valueOf(ifmReqDto.getIdService()), 2, GGIConstants.DEFAULT_NUMBER_STRING)
+ GGIConstants.UNKNOWN_ERROR);
}
return szResponse;
}
I recommend checking out the Java documentation; it is a really good reference to start with. The .split method uses a regex to split a string on a delimiter.
String[] tokens = myString.split("0102\\*\\*");
I suspect that you forgot to escape * in the split regex.
Try
String[] result = yourString.split("0102\\*\\*");
In case you want * to represent any character, use . instead of *:
String[] result = yourString.split("0102..");
In case you want * to represent any digit, use \\d instead:
String[] result = yourString.split("0102\\d\\d");
String string = "blabla0102**dada";
String[] parts = string.split("0102\\*\\*");
String part1 = parts[0]; // blabla
String part2 = parts[1]; // dada
Here we have a String, "blabla0102**dada", which we call string. Every String object has a split() method; with it we can split a string on any regex we like.
Do you mean literally split by "0102**"? Couldn't you use regex for that?
String[] tokens = "My text 0102** hello!".split("0102\\*\\*");
System.out.println(tokens[0]);
System.out.println(tokens[1]);
I created a button that creates a zip file. The function that zips the file works correctly, but when I call it via the button (in the JS file) it crashes and gives a blank page (I think I am not managing the output stream correctly).
Would you please have an idea?
Here is my code:
Button
isc.ToolStripButton.create({
ID: "BooksApp_GetXmlImage_Button"
,autoDraw:false
,icon: getUIIcon("icon_xml_16")
,prompt: getUIMsg("book_report_get_xml",4)
,showHover:true
,hoverStyle:"book_hover_style"
,click : function () {
BooksApp_Action_loadFile("objx");
// isc.say("test");
}
});
The function that calls the ZipFile() method:
function BooksApp_Action_loadFile(p_UsedFormat) {
var tmpBookID = BooksApp_Application.FP_BookID;
var tmpIDs = BooksApp_Application.FP_fct_getSelectedPOVIDs();
var tmpUsr_ID = FPIUser.FP_fct_getID();
var tmpFormat = p_UsedFormat;
var showInWindow=false;
books_objects.exportData(
{
r_book_idnum : tmpBookID
,sBook_ID : tmpBookID
,sPOV_IDs : tmpIDs
,sUser_ID : tmpUsr_ID
,sFormat : tmpFormat
}
,{ operationId: "customExport"
,exportDisplay: (showInWindow ? "window" : "download") }
,function (dsResponse, data, dsRequest) {
//Never called
BooksApp_Action_Log("BooksApp_Action_loadFile:"+data);
}
);
}
customExport() function
public static String customExport(RPCManager rpc,
HttpServletResponse response) throws Exception {
String sReturn = _Return_OK;
try {
// setting doCustomResponse() notifies the RPCManager that we'll
// bypass RPCManager.send
// and instead write directly to the servletResponse output stream
rpc.doCustomResponse();
RequestContext.setNoCacheHeaders(response);
writeServerDebug("customExport : start");
DSRequest req = rpc.getDSRequest();
List<?> results = req.execute().getDataList();
String sReqData = (String) req.getParameter("exportDisplay");
String sReqData_sBook_ID = "" + req.getCriteriaValue("sBook_ID");
String sReqData_sPOV_IDs = "" + req.getCriteriaValue("sPOV_IDs");
String sReqData_sUser_ID = "" + req.getCriteriaValue("sUser_ID");
String sReqData_sFormat = "" + req.getCriteriaValue("sFormat");
StringBuilder content = new StringBuilder("get (sReqData:"
+ sReqData + ",sBook_ID:" + sReqData_sBook_ID
+ ",sPOV_IDs:" + sReqData_sPOV_IDs + ",sUser_ID:"
+ sReqData_sUser_ID + ",sFormat:" + sReqData_sFormat + ")"
+ results.size() + " line(s):");
for (Iterator<?> i = results.iterator(); i.hasNext();) {
Map<?, ?> record = (Map<?, ?>) i.next();
content.append("\n" + Books.Column_IDNum + ":"
+ record.get(Books.Column_IDNum));
content.append("\n" + Books.Column_Name + ":"
+ record.get(Books.Column_Name));
}
writeServerDebug("The content is \n" + content.toString());
// Create the new Office Engine
OfficeEngine myOfficeEngine = new OfficeEngine();
boolean bIsConnected = myOfficeEngine._RepositoryBridge
.connectSourceDataBase(false);
if (bIsConnected) {
//Connected to the repository, so get the files
if (sReqData_sFormat.equalsIgnoreCase("pdf") || sReqData_sFormat.equalsIgnoreCase("pptx")) {
//The book end user format
String sReturnPptx = myOfficeEngine.performGeneratePptx(
req.getHttpServletRequest(), response,
sReqData_sBook_ID, sReqData_sPOV_IDs,
sReqData_sUser_ID, sReqData_sFormat);
writeServerDebug("customExport call performGeneratePptx, return is "
+ sReturnPptx);
}
else {
AppZip appZip = new AppZip();
appZip.ZipFile(" ", " ");
String r = "sReturn_OK";;
return r;
}
//Free the connection to repository
myOfficeEngine._RepositoryBridge.freeConnectionSource();
} else {
response.setContentType("text/plain");
response.addHeader("content-disposition",
"attachment; filename=book.txt");
ServletOutputStream os = response.getOutputStream();
os.print(content.toString());
os.flush();
}
} catch (Exception e) {
writeServerDebug("ERROR:" + e.getLocalizedMessage());
sReturn = Repository._Return_KO;
}
return sReturn;
}
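Since rpc.doCustomResponse() tells the RPCManager that the servlet will write the download itself, a blank page is what you tend to get when nothing is written to the response in the zip branch. As a point of comparison only, here is a minimal sketch of streaming an already-created zip back to the client, modelled on the text/plain branch above (the path and file name are hypothetical placeholders, not part of the original code):

// Hypothetical sketch: stream the archive produced by AppZip back as a download
// (uses java.io.File/FileInputStream plus the ServletOutputStream already used above).
File zipFile = new File("/tmp/book_export.zip"); // placeholder path, not from the original code
response.setContentType("application/zip");
response.addHeader("content-disposition", "attachment; filename=book_export.zip");
try (FileInputStream in = new FileInputStream(zipFile);
        ServletOutputStream os = response.getOutputStream()) {
    byte[] buffer = new byte[8192];
    int len;
    while ((len = in.read(buffer)) != -1) {
        os.write(buffer, 0, len);
    }
    os.flush();
}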
I am using the twitter4j Twitter Streaming API to get the tweets for a specific tag.
I have a number of keywords, and I want to collect 100 tweets that contain each tag.
Currently I have written code that gets the tweets for a single word:
public class StreamAPI {
    public static void main(String[] args) {
        ConfigurationBuilder cb = new ConfigurationBuilder();
        cb.setDebugEnabled(true);
        cb.setOAuthConsumerKey("xxxx");
        cb.setOAuthConsumerSecret("xxxxx");
        cb.setOAuthAccessToken("xxxx");
        cb.setOAuthAccessTokenSecret("xxxx");
        cb.setUseSSL(true);
        cb.setUserStreamRepliesAllEnabled(true);
        TwitterStream twitterStream = new TwitterStreamFactory(cb.build()).getInstance();
        twitterStream.setOAuthAccessToken(accestoken);
        StatusListener listener = new StatusListener() {
            int countTweets = 0;

            public void onStatus(Status status) {
                System.out.println("#" + status.getUser().getScreenName() + " - " + status.getText());
                countTweets++;
                System.out.println(countTweets);
            }

            public void onDeletionNotice(StatusDeletionNotice statusDeletionNotice) {
                System.out.println("Got a status deletion notice id:" + statusDeletionNotice.getStatusId());
            }

            public void onTrackLimitationNotice(int numberOfLimitedStatuses) {
                System.out.println("Got track limitation notice:" + numberOfLimitedStatuses);
            }

            public void onScrubGeo(long userId, long upToStatusId) {
                System.out.println("Got scrub_geo event userId:" + userId + " upToStatusId:" + upToStatusId);
            }

            @Override
            public void onStallWarning(StallWarning stallWarning) {
                //To change body of implemented methods use File | Settings | File Templates.
            }

            public void onException(Exception ex) {
                ex.printStackTrace();
            }
        };
        FilterQuery fq = new FilterQuery();
        String keywords[] = {"ipl"};
        fq.track(keywords);
        twitterStream.addListener(listener);
        twitterStream.filter(fq);
    }
}
How would I stop the process after it reaches a count of 100, and return those 100 tweets as a list?
Please help me.
See the code below, maybe it is helpful for you (it uses the search API to gather tweets in batches of 100):
String token= "Key Word";
Query query = new Query(token);
FileWriter outFile = new FileWriter(token.replaceAll("^#","").concat(".txt"), true);
int numberOfTweets = 1500;
long lastID = Long.MAX_VALUE;
ArrayList<Status> tweets = new ArrayList<Status>();
while (tweets.size () < numberOfTweets) {
if (numberOfTweets - tweets.size() > 100)
query.setCount(100);
else
query.setCount(numberOfTweets - tweets.size());
try {
QueryResult result = twitter.search(query);
tweets.addAll(result.getTweets());
System.out.println("Gathered " + tweets.size() + " tweets");
for (Status t: tweets)
if(t.getId() < lastID) lastID = t.getId(); }
catch (TwitterException te) {
System.out.println("Couldn't connect: " + te); };
query.setMaxId(lastID-1);
}
PrintWriter out1 = new PrintWriter(outFile);
for (int i = 0; i < tweets.size(); i++) {
Status t = (Status) tweets.get(i);
GeoLocation loc = t.getGeoLocation();
String user = t.getUser().getScreenName();
String msg = t.getText();
String time = "";
if (loc!=null) {
Double lat = t.getGeoLocation().getLatitude();
Double lon = t.getGeoLocation().getLongitude();
System.out.println(i + " USER: " + user + " wrote: " + msg + " located at " + lat + ", " + lon);}
else
// System.out.println(i + " USER: " + user + " wrote: " + msg.replaceAll("\n",""));
out1.append(i + " USER: " + user + " wrote: " +msg.replaceAll("\n"," ") );
out1.print("\n");
}
System.out.println("file write succefully");