Generating a number within a range using JSON - Java

How can we generate a number within a range using JSON?
For example, we have to generate a number between 0 and 50 - how can we do this in Java using JSON?
This is my JSON data:
{
"rand": {
"type": "number",
"minimum": 0,
"exclusiveMinimum": false,
"maximum": 50,
"exclusiveMaximum": true
}
}
This is what I have tried in Java
public class JavaApplication1 {
    public static void main(String[] args) {
        ObjectMapper mapper = new ObjectMapper();
        try {
            for (int i = 0; i < 5; i++) {
                FileInputStream fileInputStream = new FileInputStream("C://users/user/Desktop/V.xls");
                HSSFWorkbook workbook = new HSSFWorkbook(fileInputStream);
                HSSFSheet worksheet = workbook.getSheet("POI Worksheet");
                HSSFRow row1 = worksheet.getRow(0);
                HSSFCell cellE1 = row1.getCell((short) 4); // cell E1 is assumed to hold the JSON string
                String e1Val = cellE1.getStringCellValue();
                System.out.println("E1: " + e1Val);
                JSONObject obj = new JSONObject();
                obj.put("value", e1Val);
                System.out.print(obj + "\n");
                Map<String, Object> c_data = mapper.readValue(e1Val, Map.class);
                System.out.println(c_data);
            }
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
The JSON data is stored in an Excel sheet, and I am reading it from there in the Java program.

Get a JSON reader like Gson.
Read the JSON into an equivalent object, like:
public class rand{
private String type;
private int minimum;
private boolean exclusiveMinimum;
private int maximum;
private boolean exclusiveMaximum;
//this standard-constructor is needed for the JsonReader
public rand(){
}
//Getter for all Values
}
After reading in your JSON, you can access your data via the getter methods.
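For example, a minimal sketch with Gson (this assumes Gson 2.8.6+ on the classpath, a file rand.json holding the JSON above, and that the getters on rand are named getMinimum(), getMaximum(), isExclusiveMinimum() and isExclusiveMaximum()):
import com.google.gson.Gson;
import com.google.gson.JsonObject;
import com.google.gson.JsonParser;
import java.io.FileReader;
import java.io.IOException;
import java.io.Reader;
import java.util.Random;
public class GsonRangeExample {
    public static void main(String[] args) throws IOException {
        try (Reader reader = new FileReader("rand.json")) {
            // The range object is wrapped under the "rand" key, so unwrap it first.
            JsonObject root = JsonParser.parseReader(reader).getAsJsonObject();
            rand r = new Gson().fromJson(root.get("rand"), rand.class);
            // Honour the exclusive flags, then pick a value in [min, max].
            int min = r.isExclusiveMinimum() ? r.getMinimum() + 1 : r.getMinimum();
            int max = r.isExclusiveMaximum() ? r.getMaximum() - 1 : r.getMaximum();
            System.out.println(new Random().nextInt(max - min + 1) + min);
        }
    }
}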

I think that Jackson may be of help here.
I suggest that you create a data model in Java that reflects the JSON. This can be along the lines of:
// This is the root object. It contains the input data (RandomizerInput) and a
// generate-function that is used for generating new random ints.
public class RandomData {
private RandomizerInput input;
@JsonCreator
public RandomData(@JsonProperty("rand") final RandomizerInput input) {
this.input = input;
}
@JsonProperty("rand")
public RandomizerInput getInput() {
return input;
}
@JsonProperty("generated")
public int generateRandomNumber() {
int max = input.isExclusiveMaximum()
? input.getMaximum() - 1 : input.getMaximum();
int min = input.isExclusiveMinimum()
? input.getMinimum() + 1 : input.getMinimum();
return new Random().nextInt((max - min) + 1) + min;
}
}
// This is the input data (pretty much what is described in the question).
public class RandomizerInput {
private final boolean exclusiveMaximum;
private final boolean exclusiveMinimum;
private final int maximum;
private final int minimum;
private final String type;
@JsonCreator
public RandomizerInput(
@JsonProperty("type") final String type,
@JsonProperty("minimum") final int minimum,
@JsonProperty("exclusiveMinimum") final boolean exclusiveMinimum,
@JsonProperty("maximum") final int maximum,
@JsonProperty("exclusiveMaximum") final boolean exclusiveMaximum) {
this.type = type; // Not really used...
this.minimum = minimum;
this.exclusiveMinimum = exclusiveMinimum;
this.maximum = maximum;
this.exclusiveMaximum = exclusiveMaximum;
}
public int getMaximum() {
return maximum;
}
public int getMinimum() {
return minimum;
}
public String getType() {
return type;
}
public boolean isExclusiveMaximum() {
return exclusiveMaximum;
}
public boolean isExclusiveMinimum() {
return exclusiveMinimum;
}
}
To use these classes the ObjectMapper from Jackson can be used like this:
public static void main(String... args) throws IOException {
String json =
"{ " +
"\"rand\": { " +
"\"type\": \"number\", " +
"\"minimum\": 0, " +
"\"exclusiveMinimum\": false, " +
"\"maximum\": 50, " +
"\"exclusiveMaximum\": true " +
"} " +
"}";
// Create the mapper
ObjectMapper mapper = new ObjectMapper();
// Convert JSON to POJO
final RandomData randomData = mapper.readValue(json, RandomData.class);
// Either you can get the random this way...
final int random = randomData.generateRandomNumber();
// Or, you can serialize the whole thing as JSON....
String str = mapper.writeValueAsString(randomData);
// Output is:
// {"rand":{"type":"number","minimum":0,"exclusiveMinimum":false,"maximum":50,"exclusiveMaximum":true},"generated":21}
System.out.println(str);
}
The actual generation of a random number is based on this SO question.

Related

How to add the remaining batch of n elements into an ArrayList?

I'm currently learning to develop a simple blockchain program that reads sample data from a .txt file and creates a new block for every 10 transactions. I was wondering: if the given sample data is 23 lines of transactions, is there a way to make a new block that consists of the last 3 transactions?
Current Output
Block[header=Header[index=0,currHash=51aa6b7cf5fb821189d58b5c995b4308370888efcaac469d79ad0a5d94fb0432, prevHash=0, timestamp=1654785847112], tranx=null]
Block[header=Header[index=0,currHash=92b3582095e2403c68401448e8a34864e8465d0ea51c05f11c23810ec36b4868, prevHash=0, timestamp=1654785847385], tranx=Transaction [tranxLst=[alice|bob|credit|1.0, alice|bob|debit|2.0, alice|bob|debit|3.0, alice|bob|credit|4.0, alice|bob|debit|5.0, alice|bob|credit|6.0, alice|bob|debit|7.0, alice|bob|debit|8.0, alice|bob|debit|9.0, alice|bob|debit|10.0]]]
Block[header=Header[index=0,currHash=7488c600433d78e0fb8586e71a010b1d39a040cb101cc6e3418668d21b614519, prevHash=0, timestamp=1654785847386], tranx=Transaction [tranxLst=[alice|bob|credit|11.0, alice|bob|credit|12.0, alice|bob|debit|13.0, alice|bob|debit|14.0, alice|bob|credit|15.0, alice|bob|credit|16.0, alice|bob|credit|17.0, alice|bob|debit|18.0, alice|bob|credit|19.0, alice|bob|credit|20.0]]]
What I want
Block[header=Header[index=0,currHash=51aa6b7cf5fb821189d58b5c995b4308370888efcaac469d79ad0a5d94fb0432, prevHash=0, timestamp=1654785847112], tranx=null]
Block[header=Header[index=0,currHash=92b3582095e2403c68401448e8a34864e8465d0ea51c05f11c23810ec36b4868, prevHash=0, timestamp=1654785847385], tranx=Transaction [tranxLst=[alice|bob|credit|1.0, alice|bob|debit|2.0, alice|bob|debit|3.0, alice|bob|credit|4.0, alice|bob|debit|5.0, alice|bob|credit|6.0, alice|bob|debit|7.0, alice|bob|debit|8.0, alice|bob|debit|9.0, alice|bob|debit|10.0]]]
Block[header=Header[index=0,currHash=7488c600433d78e0fb8586e71a010b1d39a040cb101cc6e3418668d21b614519, prevHash=0, timestamp=1654785847386], tranx=Transaction [tranxLst=[alice|bob|credit|11.0, alice|bob|credit|12.0, alice|bob|debit|13.0, alice|bob|debit|14.0, alice|bob|credit|15.0, alice|bob|credit|16.0, alice|bob|credit|17.0, alice|bob|debit|18.0, alice|bob|credit|19.0, alice|bob|credit|20.0]]]
Block[header=Header[index=0,currHash=7488c600433d78e0fb8586e71a010b1d39a040cb101cc6e3418668d21b614520, prevHash=0, timestamp=1654785847387], tranx=Transaction [tranxLst=[alice|bob|credit|21.0, alice|bob|credit|22.0, alice|bob|debit|23.0]]]
my code:
Client app
public static void main(String[] args) throws IOException {
homework();
}
static void homework() throws IOException {
int count = 0;
Transaction tranxLst = new Transaction();
Block genesis = new Block("0");
System.out.println(genesis);
BufferedReader bf = new BufferedReader(new FileReader("dummytranx.txt"));
String line = bf.readLine();
while (line != null) {
tranxLst.add(line);
line = bf.readLine();
count++;
if (count % 10 == 0) {
Block newBlock = new Block(genesis.getHeader().getPrevHash());
newBlock.setTranx(tranxLst);
System.out.println(newBlock);
tranxLst.getTranxLst().clear();
}
}
bf.close();
}
Transaction class
public class Transaction implements Serializable {
public static final int SIZE = 10;
/**
* we will comeback to generate the merkle root ie., hash of merkle tree
* merkleRoot = hash
*/
private String merkleRoot = "9a0885f8cd8d94a57cd76150a9c4fa8a4fed2d04c244f259041d8166cdfeca1b8c237b2c4bca57e87acb52c8fa0777da";
// private String merkleRoot;
public String getMerkleRoot() {
return merkleRoot;
}
public void setMerkleRoot(String merkleRoot) {
this.merkleRoot = merkleRoot;
}
/**
* For the data collection, u may want to choose classic array or collection api
*/
private List<String> tranxLst;
public List<String> getTranxLst() {
return tranxLst;
}
public Transaction() {
tranxLst = new ArrayList<>(SIZE);
}
/**
* add()
*/
public void add(String tranx) {
tranxLst.add(tranx);
}
@Override
public String toString() {
return "Transaction [tranxLst=" + tranxLst + "]";
}
}
Block class
public class Block implements Serializable {
private Header header;
public Header getHeader() {
return header;
}
private Transaction tranx;
public Block(String previousHash) {
header = new Header();
header.setTimestamp(new Timestamp(System.currentTimeMillis()).getTime());
header.setPrevHash(previousHash);
String blockHash = Hasher.sha256(getBytes());
header.setCurrHash(blockHash);
}
/**
* getBytes of the Block object
*/
private byte[] getBytes() {
try (ByteArrayOutputStream baos = new ByteArrayOutputStream();
ObjectOutputStream out = new ObjectOutputStream(baos);) {
out.writeObject(this);
return baos.toByteArray();
} catch (Exception e) {
e.printStackTrace();
return null;
}
}
public Transaction getTranx() {
return tranx;
}
/**
* aggregation rel
*/
public void setTranx(Transaction tranx) {
this.tranx = tranx;
}
/**
* composition rel
*/
public class Header implements Serializable {
private int index;
private String currHash, prevHash;
private long timestamp;
// getset methods
public String getCurrHash() {
return currHash;
}
public int getIndex() {
return index;
}
public void setIndex(int index) {
this.index = index;
}
public void setCurrHash(String currHash) {
this.currHash = currHash;
}
public String getPrevHash() {
return prevHash;
}
public void setPrevHash(String prevHash) {
this.prevHash = prevHash;
}
public long getTimestamp() {
return timestamp;
}
public void setTimestamp(long timestamp) {
this.timestamp = timestamp;
}
@Override
public String toString() {
return "Header [index=" + index + ", currHash=" + currHash + ", prevHash=" + prevHash + ", timestamp="
+ timestamp + "]";
}
}
@Override
public String toString() {
return "Block [header=" + header + ", tranx=" + tranx + "]";
}
}
Instead of using a counter in the conditional statement, try a for loop:
static void homework() throws IOException {
Transaction tranxLst = new Transaction();
Block genesis = new Block("0");
System.out.println(genesis);
BufferedReader bf = new BufferedReader(new FileReader("dummytranx.txt"));
String line = bf.readLine();
while (line != null) {
for (int i = 0; i < 10; i++) {
tranxLst.add(line);
line = bf.readLine();
if (line == null) {
break;
}
}
Block newBlock = new Block(genesis.getHeader().getPrevHash());
newBlock.setTranx(tranxLst);
System.out.println(newBlock);
tranxLst.getTranxLst().clear();
}
bf.close();
}
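An alternative sketch, if you prefer to keep the counter, is to flush whatever is left over once the while loop ends (this assumes the same Transaction and Block classes as above):
static void homework() throws IOException {
    int count = 0;
    Transaction tranxLst = new Transaction();
    Block genesis = new Block("0");
    System.out.println(genesis);
    BufferedReader bf = new BufferedReader(new FileReader("dummytranx.txt"));
    String line = bf.readLine();
    while (line != null) {
        tranxLst.add(line);
        count++;
        if (count % 10 == 0) {
            Block newBlock = new Block(genesis.getHeader().getPrevHash());
            newBlock.setTranx(tranxLst);
            System.out.println(newBlock);
            tranxLst.getTranxLst().clear();
        }
        line = bf.readLine();
    }
    // Any leftover transactions (e.g. the last 3 of 23) still become a final block.
    if (!tranxLst.getTranxLst().isEmpty()) {
        Block lastBlock = new Block(genesis.getHeader().getPrevHash());
        lastBlock.setTranx(tranxLst);
        System.out.println(lastBlock);
    }
    bf.close();
}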

Creating an XML file from Java with JAXB

This is the service class. I am creating an XML file by reading values from a database. The code uses three more POJO classes: MT700, Header, and SwiftDetails. MT700 is the main class holding the Header and Swift details. The problem is that I am only able to store everything once: no matter how many rows of data I have, the generated file contains only one record, with one Header and one SwiftDetails. How can I make this work in a loop? I think I have to use a list, but I am not sure how to use it to make this work.
public void generateEliteExtracts(int trdCustomerKy, Date lastRunDate, Date currentDate) throws TradeException {
FileOutputStream fout = null;
try {
MT700 mt700 = getMT700(trdCustomerKy,lastRunDate,currentDate);
if (null != mt700){
StringBuffer fileName = new StringBuffer(1024);
fileName.append(mConfiguration.getOutDirectory()).append(MT700_MSGTYPE)
.append(DOT).append(mConfiguration.getOutputFileExtn());
smLog.debug("Generated Extract for BankRef" + fileName.toString());
mTracer.log("Generated Extract for BankRef" + fileName.toString());
File xmlFile = new File(fileName.toString());
fout = new FileOutputStream(xmlFile);
fout.write(MT700_XMLHEADER.getBytes());
JAXBContext jaxbContext = JAXBContext.newInstance(MT700.class);
Marshaller marshaller = jaxbContext.createMarshaller();
marshaller.setProperty(Marshaller.JAXB_ENCODING, ENCODING_ASCII);
marshaller.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, Boolean.TRUE);
marshaller.setProperty(Marshaller.JAXB_FRAGMENT, Boolean.FALSE);
marshaller.setProperty("com.sun.xml.internal.bind.xmlDeclaration", Boolean.FALSE);
marshaller.marshal(mt700, fout);
IOUtils.closeQuietly(fout);
}
}catch(
Exception ex)
{
smLog.error("Caught unexpected error while creating extracts. ", ex);
throw new TradeException("Caught unexpected error while creating extracts.", ex);
} finally
{
IOUtils.closeQuietly(fout);
}
}
private MT700 getMT700(int trdCustomerKy, Date lastRunDate, Date currentDate) throws TradeException {
MT700 mt700 = new MT700();
AbInBevEliteExtractDAO dao = new AbInBevEliteExtractDAO(mConnection);
CompanyCodesHelper ccHelper = new CompanyCodesHelper(mConnection);
String cifCodes = ccHelper.getDescription(trdCustomerKy, "CIF Codes", "CIF Codes");
if (false == TradeUtil.isStringNull(cifCodes)) {
mTracer.log("Fetching records for CIFs: " + StringUtils.replace(cifCodes, PIPE, COMMA));
String[] codes = StringUtils.split(cifCodes, PIPE);
List<ExportAdvicesData> exportList = dao.getExportAdvices(trdCustomerKy, lastRunDate, currentDate, codes);
for (int i = 0; i < exportList.size(); i++) {
ExportAdvicesData exportData = exportList.get(i);
if ("XXLC".equalsIgnoreCase(exportData.getDocAcronym())) {
Header header = new Header();
header.setMessageType("N");
header.setVersionNo("1.0");
header.setRevisionNo("00");
header.setDocumentDate(DateUtil.formatDate(new Date(), DATE_FORMAT_YYYY_MM_DD_HHMMSS));
header.setBankId("BOA" + StringUtils.substring(exportData.getCustRef(), 0, 4));
header.setCustId("XOM");
SwiftDetails swiftTest = new SwiftDetails();
header.setDocumentType(MT700_MSGTYPE);
SwiftParserBankDocs parser = new SwiftParserBankDocs(exportData.getDocumentContent());
String bankRef = parser.getTagValue("21");
String custRef = parser.getTagValue("20");
if (TradeUtil.isStringNull(bankRef)) {
header.setCustRefNo("NONREF");
header.setBankRefNo(custRef);
} else {
header.setCustRefNo(custRef);
header.setBankRefNo(bankRef);
}
swiftTest.setTAG_27("1/1");
swiftTest.setTAG_20(custRef);
swiftTest.setTAG_23(EMPTY_STRING);
String issueDate = parser.getTagValue("31C");
swiftTest.setTAG_31C(getDateInYYMMDD(issueDate));
swiftTest.setTAG_40E("UCP LATEST VERSION");
String datePlaceOfExpiry = parser.getTagValue("31D");
swiftTest.setTAG_31D(getFormattedDatePlaceOfExpiry(datePlaceOfExpiry));
swiftTest.setTAG_50(parser.getTagValue("50"));
swiftTest.setTAG_59(parser.getTagValue("59"));
swiftTest.setTAG_32B(getCurrencyCdAmount(parser.getTagValue("32B")));
if (false == TradeUtil.isStringNull(exportData.getPositiveTolerance())) {
swiftTest.setTAG_39A(
exportData.getPositiveTolerance() + "/" + exportData.getPositiveTolerance());
} else {
swiftTest.setTAG_39A(EMPTY_STRING);
}
swiftTest.setTAG_39B(EMPTY_STRING);
swiftTest.setTAG_39C(EMPTY_STRING);
swiftTest.setTAG_41A(parser.getTagValue("41D"));
String tag42A = parser.getTagValue("42A");
swiftTest.setTAG_42A(tag42A);
if (TradeUtil.isStringNull(tag42A)) {
swiftTest.setTAG_42A(parser.getTagValue("42D"));
}
swiftTest.setTAG_42C(parser.getTagValue("42C"));
swiftTest.setTAG_42M(parser.getTagValue("42M"));
swiftTest.setTAG_42P(parser.getTagValue("42P"));
swiftTest.setTAG_43P(parser.getTagValue("43P"));
swiftTest.setTAG_43T(parser.getTagValue("43T"));
if (!(TradeUtil.isStringNull(parser.getTagValue("44A")))) {
swiftTest.setTAG_44A(parser.getTagValue("44A"));
}
if (!(TradeUtil.isStringNull(parser.getTagValue("44B")))) {
swiftTest.setTAG_44B(parser.getTagValue("44B"));
}
if (!(TradeUtil.isStringNull(parser.getTagValue("44E")))) {
swiftTest.setTAG_44E(parser.getTagValue("44E"));
}
if (!(TradeUtil.isStringNull(parser.getTagValue("44F")))) {
swiftTest.setTAG_44F(parser.getTagValue("44F"));
}
Date latestShipDate = exportData.getLatestShipDate();
if (null != latestShipDate) {
swiftTest.setTAG_44C(DateUtil.formatDate(latestShipDate, DATE_FORMAT_YYMMDD));
} else {
swiftTest.setTAG_44C(EMPTY_STRING);
}
swiftTest.setTAG_44D(parser.getTagValue("44D"));
swiftTest.setTAG_45A(parser.getTagValue("45") + BLANK_STRING + parser.getTagValue("45A")
+ BLANK_STRING + parser.getTagValue("45B"));
swiftTest.setTAG_46A(parser.getTagValue("46") + BLANK_STRING + parser.getTagValue("46A")
+ BLANK_STRING + parser.getTagValue("46B"));
swiftTest.setTAG_47A(parser.getTagValue("47") + BLANK_STRING + parser.getTagValue("47A")
+ BLANK_STRING + parser.getTagValue("47B"));
swiftTest.setTAG_71B(parser.getTagValue("71B"));
swiftTest.setTAG_48(parser.getTagValue("48"));
swiftTest.setTAG_49(parser.getTagValue("49"));
swiftTest.setTAG_50B(EMPTY_STRING);
swiftTest.setTAG_51A(EMPTY_STRING);
String issuingBank = parser.getAddress(SwiftParserBankDocs.ISSUING_BANK);
if (TradeUtil.isStringNull(issuingBank)) {
String errorMsg = "Issuing Bank address not found in bankdoc text, SWIFT content is possibly invalid, skipped processed record: "
+ exportData.getCustRef();
smLog.error(errorMsg);
mTracer.log("ERROR: " + errorMsg);
}
issuingBank = StringUtils.replace(issuingBank, CRLF, BLANK_STRING + CRLF);
swiftTest.setTAG_52A(issuingBank);
swiftTest.setTAG_53A(parser.getTagValue("53A"));
swiftTest.setTAG_78(parser.getTagValue("78"));
swiftTest.setTAG_57A(parser.getAddress("TO:"));
swiftTest.setTAG_72(parser.getTagValue("72"));
swiftTest.setTAG_40A(parser.getTagValue("40B"));
if (parser.is710Advice()) {
swiftTest.setTAG_20(parser.getTagValue("21"));
}
mt700.setSwift700(swiftTest);
mt700.setHeader(header);
} else if ("XAMD".equalsIgnoreCase(exportData.getDocAcronym())) {
Header header = new Header();
header.setMessageType("N");
header.setVersionNo("1.0");
header.setRevisionNo("00");
header.setDocumentDate(DateUtil.formatDate(new Date(), DATE_FORMAT_YYYY_MM_DD_HHMMSS));
header.setBankId("BOA" + StringUtils.substring(exportData.getCustRef(), 0, 4));
header.setCustId("XOM");
SwiftDetails swift = new SwiftDetails();
header.setDocumentType(MT707_MSGTYPE);
SwiftParserBankDocs parser = new SwiftParserBankDocs(exportData.getDocumentContent());
String custRef = parser.getTagValue("20");
String bankRef = parser.getTagValue("23");
if (TradeUtil.isStringNull(bankRef)) {
header.setBankRefNo("NONREF");
} else {
header.setBankRefNo(bankRef);
}
header.setCustRefNo(custRef);
swift.setTAG_20(custRef);
swift.setTAG_21(parser.getTagValue("21"));
swift.setTAG_23(EMPTY_STRING);
String issuingBank = parser.getAddress(SwiftParserBankDocs.ISSUING_BANK);
if (TradeUtil.isStringNull(issuingBank)) {
String errorMsg = "Issuing Bank address not found in bankdoc text, SWIFT content is possibly invalid, skipped processed record: "
+ exportData.getCustRef();
smLog.error(errorMsg);
mTracer.log("ERROR: " + errorMsg);
swift.setTAG_52A(EMPTY_STRING);
} else {
issuingBank = StringUtils.replace(issuingBank, CRLF, BLANK_STRING + CRLF);
swift.setTAG_52A(issuingBank);
}
swift.setTAG_31C(getDateInYYMMDD(parser.getTagValue("31C")));
swift.setTAG_30(getDateInYYMMDD(parser.getTagValue("30")));
swift.setTAG_26E(parser.getTagValue("26E"));
swift.setTAG_59(parser.getTagValue("59"));
swift.setTAG_31E(getDateInYYMMDD(parser.getTagValue("31E")));
swift.setTAG_79(parser.getTagValue("79"));
swift.setTAG_72(parser.getTagValue("72"));
swift.setTAG_32B(getCurrencyCdAmount(parser.getTagValue("32B")));
swift.setTAG_33B(getCurrencyCdAmount(parser.getTagValue("33B")));
swift.setTAG_34B(getCurrencyCdAmount(parser.getTagValue("34B")));
swift.setTAG_39A(parser.getTagValue("39A"));
swift.setTAG_39B(parser.getTagValue("39B"));
swift.setTAG_39C(parser.getTagValue("39C"));
swift.setTAG_44A(parser.getTagValue("44A"));
swift.setTAG_44B(parser.getTagValue("44B"));
swift.setTAG_44C(parser.getTagValue("44C"));
swift.setTAG_44D(parser.getTagValue("44D"));
swift.setTAG_44E(parser.getTagValue("44E"));
swift.setTAG_44F(parser.getTagValue("44F"));
mt700.setHeader(header);
mt700.setSwift700(swift);
}
}
}
return mt700;
}
This is the MT700 POJO class. In this class I am referencing the Header and SwiftDetails POJO classes.
@XmlRootElement(name = "MT700")
public class MT700 implements Serializable
{
/**
* serialVersionUID
*/
private static final long serialVersionUID = 1L;
private Header header;
private SwiftDetails swift700;
private String version = "1.0";
public Header getHeader()
{
return header;
}
@XmlElement(name = "Header")
public void setHeader(Header header)
{
this.header = header;
}
/**
* #return the swift700
*/
public SwiftDetails getSwift700()
{
return swift700;
}
@XmlElement(name = "Swift_Details_700")
public void setSwift700(SwiftDetails swift700)
{
this.swift700 = swift700;
}
public String getVersion()
{
return version;
}
@XmlAttribute(name = "Version")
public void setVersion(String version)
{
this.version = version;
}
}
This is the Header class. I have a similar class for the Swift details, which holds the tags.
@XmlRootElement(name = "Header")
@XmlType(propOrder = { "documentType", "messageType", "versionNo",
"revisionNo", "documentDate", "bankId", "custId", "custRefNo",
"bankRefNo" })
public class Header implements Serializable
{
private static final long serialVersionUID = 1L;
private String documentType;
private String messageType;
private String versionNo;
private String revisionNo;
private String documentDate;
private String bankId;
private String custId;
private String custRefNo;
private String bankRefNo;
I am not adding the getters and setters for this class, to keep the post simple.
You are creating one MT700 instance and then in this loop, you are reassigning the header and swift fields each time through the loop:
MT700 mt700 = new MT700();
for (int i = 0; i < exportList.size(); i++) {
...
mt700.setHeader(header);
mt700.setSwift700(swift);
}
This means that the document you are outputting contains just the last header/swift returned from the database query.
You need to make one or more of these three into a list of some sort. Either your MT700 contains a list of headers and swifts, or more likely you want to have a list of MT700s each with one header and one swift.
In other words, you want to have a fourth type that will be the actual root of your XML document. That element will contain one MT700 element for each row found by the query. Each MT700 element will have a header element and a swift element.
So, more specifically, here is what you want to do:
@XmlRootElement
class MT700s {
@XmlElement(name = "MT700")
private List<MT700> mt700s = new ArrayList<>();
public List<MT700> getMT700s() { return mt700s; }
// Etc.
}
MT700s mt700s = new MT700s();
for (int i = 0; i < exportList.size(); i++) {
MT700 mt700 = new MT700();
...
mt700.setHeader(header);
mt700.setSwift700(swift);
mt700s.getMT700s().add(mt700);
}
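The marshalling code then targets the wrapper instead of a single MT700 - a sketch, reusing the marshaller setup from the question:
JAXBContext jaxbContext = JAXBContext.newInstance(MT700s.class);
Marshaller marshaller = jaxbContext.createMarshaller();
marshaller.setProperty(Marshaller.JAXB_ENCODING, ENCODING_ASCII);
marshaller.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, Boolean.TRUE);
marshaller.setProperty(Marshaller.JAXB_FRAGMENT, Boolean.FALSE);
// Marshal the wrapper; each MT700 child element carries its own Header and Swift_Details_700.
marshaller.marshal(mt700s, fout);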

Java I/O FileStream issue

I have an input file stream method that will load a file; I just can't figure out how to then use the file in another method. The file has one UTF string and two integers. How can I use each of these ints or the string in a main method? Let's say I want to print the three different variables to the console - how would I go about doing that? Here are a few things I've tried with the method:
public static dataStreams() throws IOException {
int i = 0;
char c;
try (DataInputStream input = new DataInputStream(
new FileInputStream("input.dat"));
) {
while((i=input.read())!=-1){
// converts integer to character
c=(char)i;
}
return c;
return i;
/*
String stringUTF = input.readUTF();
int firstInt = input.readInt();
int secondInt = input.readInt();
*/
}
}
Maybe use one container for those properties, like this:
public static void main(String [] args) {
DataContainer dContainer = null;
try {
dContainer = dataStreams();
} catch (IOException e) {
e.printStackTrace();
}
//do some logging with properties
System.out.println(dContainer.getFirst());
System.out.println(dContainer.getSecond());
System.out.println(dContainer.getUtf());
}
public static DataContainer dataStreams() throws IOException {
try (DataInputStream input = new DataInputStream(
new FileInputStream("input.dat"))) {
// Read the values in the same order they were written;
// draining the stream with read() first would leave nothing for readUTF() to read.
String stringUTF = input.readUTF();
int firstInt = input.readInt();
int secondInt = input.readInt();
return new DataContainer(stringUTF, firstInt, secondInt);
}
}
static class DataContainer {
String utf;
int first;
int second;
DataContainer(String utf, int first, int second) {
this.utf = utf;
this.first = first;
this.second = second;
}
public String getUtf() {
return utf;
}
public int getFirst() {
return first;
}
public int getSecond() {
return second;
}
}
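For completeness, here is a sketch of how such an input.dat could be written (an assumption - the question does not show the writer); the read order in dataStreams() has to match this write order:
// Hypothetical writer: one UTF string followed by two ints, matching the reads above.
try (DataOutputStream output = new DataOutputStream(new FileOutputStream("input.dat"))) {
    output.writeUTF("some text");
    output.writeInt(1);
    output.writeInt(2);
}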

Difficulty checking the data when it is passed to an array from a text file in Java

I am creating three classes.
First, the ExchangeRate class for storing the data from the text file:
public class ExchangeRate {
private String Local;
private String Foreign;
private double Rate;
public ExchangeRate(String Px, String Py, double ER) {
Local = Px;
Foreign = Py;
Rate = ER;
}
public void setLocal(String L) {
Local = L;
}
public void setForeign(String F) {
Foreign = F;
}
public void setRate(double R) {
Rate = R;
}
public String getLocal() {
return Local;
}
public String getForeign() {
return Foreign;
}
public double getRate() {
return Rate;
}
}
Then I create the CurrencyExchange class for converting currencies using the exchange rates it receives.
public class CurrencyExchange {
public int ratesize = 0;
public ExchangeRate[] allrecord = new ExchangeRate[42];
private String name1;
private String name2;
private double num;
private double num2;
public void convert(String currencyCode1, String currencyCode2,
double amount, boolean printFlag) {
name1 = currencyCode1;
name2 = currencyCode2;
num = amount; //change getLocal() to static?
if (name1 == ExchangeRate.getLocal()
&& name2 == ExchangeRate.getForeign()) {
num2 = num * ExchangeRate.getRate();
}
if (printFlag == true) {
printInfo();
}
}
public void addExchangeRate(ExchangeRate exRate) {
allrecord[ratesize] = exRate;
setratesize();
}
public void setratesize() {
ratesize++;
}
public String getname1() {
return name1;
}
public String getname2() {
return name2;
}
public double getnum() {
return num;
}
public void printInfo() {
System.out.println("Direct Conversion: Converted " + name1 + " " + num
+ " to " + name2 + " "+num2);
}
}
But I have difficulty checking whether the currency can be converted according to the currency codes passed in from the testing class, such as 'eur' and 'jpy', meaning convert EUR to JPY according to the exchange rates in the text file. If I make the fields static so that the check
"if (name1 == ExchangeRate.getLocal())" compiles, Local will end up holding only the last record from the text file, so the check cannot work. How can I solve this problem?
Testing class
import java.io.File;
import java.io.FileNotFoundException;
import java.util.Scanner;
public class MP2_Task1 {
public static void main(String[] args) {
CurrencyExchange currencyExchange = new CurrencyExchange();
String fileName = "exchange_rate.txt";
Scanner in = null;
try { // start reading data file
in = new Scanner(new File(fileName));
while (in.hasNextLine()) {
String line = in.nextLine();
String token[] = line.split(",");
if (token.length == 3) {
// create ExchangeRate instance for storing the exchange
// rate record
ExchangeRate exRate = new ExchangeRate(token[0], token[1],
Double.parseDouble(token[2]));
// adding the new exchange rate record to the
// CurrencyExchange instance
currencyExchange.addExchangeRate(exRate);
}
}
} catch (FileNotFoundException e) {
System.out.println(fileName + " cannot be found!");
} finally {
if (in != null) {
in.close();
}
}
String hkd = "HKD";
String usd = "USD";
String jpy = "JPY";
String gbp = "GBP";
String cny = "CNY";
String eur = "EUR";
String chf = "CHF";
// Task 1 - Simple money conversions
double oriAmount1 = 1000;
currencyExchange.convert(hkd, gbp, oriAmount1, true);
double oriAmount2 = 55;
currencyExchange.convert(cny, usd, oriAmount2, true);
double oriAmount3 = 300;
currencyExchange.convert(eur, jpy, oriAmount3, true);
double oriAmount4 = 8000;
currencyExchange.convert(hkd, chf, oriAmount4, true);
System.out.println();
}
}
Expected Output:
Direct Conversion: Converted HKD 1000.0 to GBP 83.8
Direct Conversion: Converted CNY 55.0 to USD 8.6735
Direct Conversion: Converted EUR 300.0 to JPY 39739.23
Direct Conversion: Converted HKD 8000.0 to CHF 1026.4
The whole text.file about exchange rate
HKD,USD,1.290000e-01
HKD,JPY,1.569860e+01
HKD,GBP,8.380000e-02
HKD,CNY,8.178000e-01
HKD,EUR,1.185000e-01
HKD,CHF,1.283000e-01
USD,HKD,7.750800e+00
USD,JPY,1.216885e+02
USD,GBP,6.499000e-01
USD,CNY,6.342400e+00
USD,EUR,9.187000e-01
USD,CHF,9.951000e-01
JPY,HKD,6.370000e-02
JPY,USD,8.200000e-03
JPY,GBP,5.300000e-03
JPY,CNY,5.210000e-02
JPY,EUR,7.500000e-03
JPY,CHF,8.200000e-03
GBP,HKD,1.192560e+01
GBP,USD,1.538600e+00
GBP,JPY,1.872341e+02
GBP,CNY,9.758600e+00
GBP,EUR,1.413500e+00
GBP,CHF,1.531000e+00
CNY,HKD,1.222100e+00
CNY,USD,1.577000e-01
CNY,JPY,1.918650e+01
CNY,GBP,1.025000e-01
CNY,EUR,1.448000e-01
CNY,CHF,1.569000e-01
EUR,HKD,8.437100e+00
EUR,USD,1.088600e+00
EUR,JPY,1.324641e+02
EUR,GBP,7.075000e-01
EUR,CNY,6.904000e+00
EUR,CHF,1.083100e+00
CHF,HKD,7.789700e+00
CHF,USD,1.005000e+00
CHF,JPY,1.222988e+02
CHF,GBP,6.532000e-01
CHF,CNY,6.374200e+00
CHF,EUR,9.233000e-01
if (name1.equals(ExchangeRate.getLocal())
&& name2.equals(ExchangeRate.getForeign())) {
num2 = num * ExchangeRate.getRate();
}
With == you compare the object references and not the values.
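Note also that getLocal(), getForeign() and getRate() are instance methods, so the comparison has to be made against each stored record rather than against the class itself. A sketch of convert() that iterates over the stored rates (using the same fields as in the question):
public void convert(String currencyCode1, String currencyCode2,
        double amount, boolean printFlag) {
    name1 = currencyCode1;
    name2 = currencyCode2;
    num = amount;
    // Look through every stored record instead of calling getLocal()/getForeign() statically.
    for (int i = 0; i < ratesize; i++) {
        ExchangeRate record = allrecord[i];
        if (name1.equals(record.getLocal()) && name2.equals(record.getForeign())) {
            num2 = num * record.getRate();
            break;
        }
    }
    if (printFlag) {
        printInfo();
    }
}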

Slow chunk response in Play 2.2

In my Play-framework-based web application users can download all the rows of different database tables in CSV or JSON format. The tables are relatively large (100k+ rows) and I am trying to stream back the result using chunking in Play 2.2.
However, the problem is that although println statements show that the rows get written to the Chunks.Out object, they do not show up on the client side! If I limit the rows being sent back it works, but there is also a big delay at the beginning, which gets bigger if I try to send back all the rows and causes a timeout or makes the server run out of memory.
I use the Ebean ORM, the tables are indexed, and querying from psql doesn't take much time. Does anyone have any idea what the problem might be?
I appreciate your help a lot!
Here is the code for one of the controllers:
@SecureSocial.UserAwareAction
public static Result showEpex() {
User user = getUser();
if(user == null || user.getRole() == null)
return ok(views.html.profile.render(user, Application.NOT_CONFIRMED_MSG));
DynamicForm form = DynamicForm.form().bindFromRequest();
final UserRequest req = UserRequest.getRequest(form);
if(req.getFormat().equalsIgnoreCase("html")) {
Page<EpexEntry> page = EpexEntry.page(req.getStart(), req.getFinish(), req.getPage());
return ok(views.html.epex.render(page, req));
}
// otherwise chunk result and send back
final ResultStreamer<EpexEntry> streamer = new ResultStreamer<EpexEntry>();
Chunks<String> chunks = new StringChunks() {
@Override
public void onReady(play.mvc.Results.Chunks.Out<String> out) {
Page<EpexEntry> page = EpexEntry.page(req.getStart(), req.getFinish(), 0);
ResultStreamer<EpexEntry> streamer = new ResultStreamer<EpexEntry>();
streamer.stream(out, page, req);
}
};
return ok(chunks).as("text/plain");
}
And the streamer:
public class ResultStreamer<T extends Entry> {
private static ALogger logger = Logger.of(ResultStreamer.class);
public void stream(Out<String> out, Page<T> page, UserRequest req) {
if(req.getFormat().equalsIgnoreCase("json")) {
JsonContext context = Ebean.createJsonContext();
out.write("[\n");
for(T e: page.getList())
out.write(context.toJsonString(e) + ", ");
while(page.hasNext()) {
page = page.next();
for(T e: page.getList())
out.write(context.toJsonString(e) + ", ");
}
out.write("]\n");
out.close();
} else if(req.getFormat().equalsIgnoreCase("csv")) {
for(T e: page.getList())
out.write(e.toCsv(CSV_SEPARATOR) + "\n");
while(page.hasNext()) {
page = page.next();
for(T e: page.getList())
out.write(e.toCsv(CSV_SEPARATOR) + "\n");
}
out.close();
}else {
out.write("Invalid format! Only CSV, JSON and HTML can be generated!");
out.close();
}
}
public static final String CSV_SEPARATOR = ";";
}
And the model:
@Entity
@Table(name="epex")
public class EpexEntry extends Model implements Entry {
@Id
@Column(columnDefinition = "pg-uuid")
private UUID id;
private DateTime start;
private DateTime finish;
private String contract;
private String market;
private Double low;
private Double high;
private Double last;
@Column(name="weight_avg")
private Double weightAverage;
private Double index;
private Double buyVol;
private Double sellVol;
private static final String START_COL = "start";
private static final String FINISH_COL = "finish";
private static final String CONTRACT_COL = "contract";
private static final String MARKET_COL = "market";
private static final String ORDER_BY = MARKET_COL + "," + CONTRACT_COL + "," + START_COL;
public static final int PAGE_SIZE = 100;
public static final String HOURLY_CONTRACT = "hourly";
public static final String MIN15_CONTRACT = "15min";
public static final String FRANCE_MARKET = "france";
public static final String GER_AUS_MARKET = "germany/austria";
public static final String SWISS_MARKET = "switzerland";
public static Finder<UUID, EpexEntry> find =
new Finder(UUID.class, EpexEntry.class);
public EpexEntry() {
}
public EpexEntry(UUID id, DateTime start, DateTime finish, String contract,
String market, Double low, Double high, Double last,
Double weightAverage, Double index, Double buyVol, Double sellVol) {
this.id = id;
this.start = start;
this.finish = finish;
this.contract = contract;
this.market = market;
this.low = low;
this.high = high;
this.last = last;
this.weightAverage = weightAverage;
this.index = index;
this.buyVol = buyVol;
this.sellVol = sellVol;
}
public static Page<EpexEntry> page(DateTime from, DateTime to, int page) {
if(from == null && to == null)
return find.order(ORDER_BY).findPagingList(PAGE_SIZE).getPage(page);
ExpressionList<EpexEntry> exp = find.where();
if(from != null)
exp = exp.ge(START_COL, from);
if(to != null)
exp = exp.le(FINISH_COL, to.plusHours(24));
return exp.order(ORDER_BY).findPagingList(PAGE_SIZE).getPage(page);
}
@Override
public String toCsv(String s) {
return id + s + start + s + finish + s + contract +
s + market + s + low + s + high + s +
last + s + weightAverage + s +
index + s + buyVol + s + sellVol;
}
1. Most browsers wait for 1-5 kB of data before showing any results. You can check whether Play Framework actually sends data with the command curl http://localhost:9000.
2. You create the streamer twice; remove the first final ResultStreamer<EpexEntry> streamer = new ResultStreamer<EpexEntry>();
3. You use the Page class to retrieve a large data set - this is incorrect. You actually do one big initial request and then one request per iteration. This is slow. Use a simple findIterate().
add this to EpexEntry (feel free to change it as you need)
public static QueryIterator<EpexEntry> all() {
return find.order(ORDER_BY).findIterate();
}
your new stream method implementation:
public void stream(Out<String> out, QueryIterator<T> iterator, UserRequest req) {
if(req.getFormat().equalsIgnoreCase("json")) {
JsonContext context = Ebean.createJsonContext();
out.write("[\n");
while (iterator.hasNext()) {
out.write(context.toJsonString(iterator.next()) + ", ");
}
iterator.close(); // it's important to close the iterator
out.write("]\n");
out.close();
} else // csv implementation here
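The elided CSV branch can mirror the CSV loop from the question, just driven by the iterator - a sketch:
} else if (req.getFormat().equalsIgnoreCase("csv")) {
    while (iterator.hasNext()) {
        out.write(iterator.next().toCsv(CSV_SEPARATOR) + "\n");
    }
    iterator.close();
    out.close();
} else {
    out.write("Invalid format! Only CSV, JSON and HTML can be generated!");
    out.close();
}
}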
And your onReady method:
QueryIterator<EpexEntry> iterator = EpexEntry.all();
ResultStreamer<EpexEntry> streamer = new ResultStreamer<EpexEntry>();
streamer.stream(new BuffOut(out, 10000), iterator, req); // note the buffering here (stream() then takes a BuffOut instead of Out<String>)
4. Another problem is that you call Out<String>.write() too often. Each call to write() means the server has to send a new chunk of data to the client immediately, and every call to Out<String>.write() has significant overhead.
The overhead appears because the server needs to wrap the response in the chunked transfer format - 6-7 extra bytes for each chunk. Since you send small messages, the overhead is significant.
Also, the server needs to wrap your reply in a TCP packet whose size will be far from optimal.
And the server needs to perform some internal work to send each chunk, which also requires some resources. As a result, the download bandwidth will be far from optimal.
Here is a simple test: send 10000 lines of text, TEST0 to TEST9999, in chunks. This takes 3 seconds on my computer on average, but with buffering it takes 65 ms. The download sizes are 136 kB and 87.5 kB respectively.
Example with buffering:
Controller
public class Application extends Controller {
public static Result showEpex() {
Chunks<String> chunks = new StringChunks() {
@Override
public void onReady(play.mvc.Results.Chunks.Out<String> out) {
new ResultStreamer().stream(out);
}
};
return ok(chunks).as("text/plain");
}
}
The new BuffOut class. It's dumb, I know:
public class BuffOut {
private StringBuilder sb;
private Out<String> dst;
public BuffOut(Out<String> dst, int bufSize) {
this.dst = dst;
this.sb = new StringBuilder(bufSize);
}
public void write(String data) {
if ((sb.length() + data.length()) > sb.capacity()) {
dst.write(sb.toString());
sb.setLength(0);
}
sb.append(data);
}
public void close() {
if (sb.length() > 0)
dst.write(sb.toString());
dst.close();
}
}
This implementation has a 3-second download time and a 136 kB download size:
public class ResultStreamer {
public void stream(Out<String> out) {
for (int i = 0; i < 10000; i++) {
out.write("TEST" + i + "\n");
}
out.close();
}
}
This implementation has a 65 ms download time and an 87.5 kB download size:
public class ResultStreamer {
public void stream(Out<String> out) {
BuffOut out2 = new BuffOut(out, 1000);
for (int i = 0; i < 10000; i++) {
out2.write("TEST" + i + "\n");
}
out2.close();
}
}
