I have written some code to generate passwords for users that were previously written to SQL. Then I wanted to write each user, with username and password, to XML. The code seems to work fine, except that at around the 200th user it suddenly stops halfway through an XML tag and ends, which is pretty weird. I'm using XStream as my library. The ArrayList has about 215 users.
I tried StaxDriver and DomDriver. The StaxDriver result was the same as with the empty XStream constructor, but DomDriver was even worse.
XStream xstream = new XStream();
xstream.alias("Zakaznici", ListZakazniku.class);
try {
    PrintWriter out = new PrintWriter("Zakaznici.xml");
    out.write(xstream.toXML(ListZakazniku.zakaznici));
} catch (Exception e) {
    e.printStackTrace();
}
public class ListZakazniku {
    public static ArrayList<Zakaznik> zakaznici = new ArrayList<>();

    public ListZakazniku() {
        zakaznici = new ArrayList<Zakaznik>();
    }

    public void setZakaznici(ArrayList<Zakaznik> zakaznik) {
        this.zakaznici.clear();
        this.zakaznici = zakaznik;
    }

    public static ArrayList<Zakaznik> getZakaznici() {
        return zakaznici;
    }

    public void add(Zakaznik elbow) {
        zakaznici.add(elbow);
    }
}
Zakaznik is a pretty basic object with username, password, id, and so on.
The cut looked like this:
</Zakaznik>
<Zaka
I don't know what's wrong with it. I'm looking forward to any suggestions :)
Your list should not be static; I also slightly modified your printing code. An approach like this will work fine:
@XStreamAlias("listZakazniku")
public class ListZakazniku {

    private List<Zakaznik> zakaznicis;

    public ListZakazniku() {
        zakaznicis = new ArrayList<Zakaznik>();
    }

    public void add(Zakaznik user) {
        zakaznicis.add(user);
    }

    @XStreamAlias("zakaznik")
    private static class Zakaznik {
        private String user;
        private String pwd;

        public Zakaznik(String user, String pwd) {
            this.user = user;
            this.pwd = pwd;
        }
    }

    public static void main(String[] args) {
        XStream xstream = new XStream();
        xstream.processAnnotations(ListZakazniku.class);

        ListZakazniku ll = new ListZakazniku();
        ll.add(new Zakaznik("user1", "pwd1"));
        ll.add(new Zakaznik("user2", "pwd2"));

        try {
            try (PrintWriter out = new PrintWriter("Zakaznici.xml")) {
                out.println(xstream.toXML(ll));
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Output:
<listZakazniku>
  <zakaznicis>
    <zakaznik>
      <user>user1</user>
      <pwd>pwd1</pwd>
    </zakaznik>
    <zakaznik>
      <user>user2</user>
      <pwd>pwd2</pwd>
    </zakaznik>
  </zakaznicis>
</listZakazniku>
Don't forget the processAnnotations call for each annotated class! (Also, your Zakaznik is presumably not a static inner class like in my example above; I only did that to squeeze the complete code into one listing.)
The program must accept requests to add and remove tasks from the list through the server. After starting, the server accepts connections in an infinite loop and reads from each of them a line containing JSON of the form:
{ "type": "ADD", "task": "Task name" }
where type is the type of operation (ADD or REMOVE) and task is the task itself. After processing the request, the list of all tasks should be printed to the console. After connecting, my console prints null. What can be wrong?
Server class:
public class TodoServer {

    public TodoServer(int port, Todos todos) {
        while (true) {
            try (ServerSocket serverSocket = new ServerSocket(port);
                 Socket clientSocket = serverSocket.accept();
                 PrintWriter out = new PrintWriter(clientSocket.getOutputStream(), true);
                 BufferedReader in = new BufferedReader(new InputStreamReader(clientSocket.getInputStream()))) {
                System.out.println("New connection accepted");
                final String json = in.readLine();
                Gson gson = new Gson();
                String type = gson.fromJson("\"type\"", String.class);
                String task = gson.fromJson("\"task\"", String.class);
                if (type.equals("ADD")) {
                    todos.addTask(task);
                } else if (type.equals("REMOVE")) {
                    todos.removeTask(task);
                }
                System.out.println(todos.getAllTasks());
            } catch (IOException e) {
                System.out.println("Connection lost");
            }
        }
    }

    public void start() throws IOException {
        int port = 8989;
        System.out.println("Starting server at " + port + "...");
    }
}
Task class:
public class Todos {

    static ArrayList<String> tasks = new ArrayList<>();

    public void addTask(String task) {
        tasks.add(task);
        Collections.sort(tasks);
    }

    public void removeTask(String task) {
        tasks.remove(task);
    }

    public String getAllTasks() {
        return tasks.toString();
    }

    public ArrayList<String> getListTask() {
        return tasks;
    }
}
The Main class which starts the server:
public class Main {
    public static void main(String[] args) throws IOException {
        Todos todos = new Todos();
        TodoServer server = new TodoServer(8989, todos);
        server.start();
    }
}
From what you've shown here, your parsing and use of JSON is the issue. As a starting point, you read a String json but then do nothing with it.
You'll want to parse that value into an object and then access values out of it (like you would a dictionary or map). How to do that with Gson is well documented, with plenty of examples readily available.
If you are using an IDE for development, I also recommend using this as a great opportunity for trying the debugger out - setting breakpoints, inspecting values, etc!
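For illustration only, here is a minimal sketch of that parsing step using Gson's JsonObject; the field names type and task come from the JSON in the question, and the rest of the server loop is unchanged:
final String json = in.readLine();
// Parse the whole line instead of passing literal strings to fromJson.
JsonObject request = new Gson().fromJson(json, JsonObject.class); // com.google.gson.JsonObject
String type = request.get("type").getAsString();
String task = request.get("task").getAsString();
if ("ADD".equals(type)) {
    todos.addTask(task);
} else if ("REMOVE".equals(type)) {
    todos.removeTask(task);
}
System.out.println(todos.getAllTasks());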
It would be better to define a simple POJO to represent a task:
@Data
class MyTask {
    private String type;
    private String task;
}
Here @Data is a Lombok annotation which provides the boilerplate code of getters/setters/default constructor/toString/hashCode/equals.
Then an instance of such a POJO is deserialized from the JSON and processed as needed:
final String json = in.readLine();
MyTask task = new Gson().fromJson(json, MyTask.class);
if ("ADD".equals(task.getType())) {
    todos.addTask(task.getTask());
} else if ("REMOVE".equals(task.getType())) {
    todos.removeTask(task.getTask());
}
System.out.println(todos.getAllTasks());
I am writing a JUnit test for a method that uses FileInputStream, and only the file name is passed to the constructor. The file is created as part of a servlet request and is not stored anywhere.
I am trying to mock FileInputStream using PowerMockito so that it gives me a mocked file object. Unfortunately I get a FileNotFoundException, which is valid, but I am not sure how to test this method then, because the file doesn't exist.
Method under test:
public String viewReport() throws Exception {
    this.inputStream = new FileInputStream(DOCUSIGN_REPORT_FILE);
    try {
        boolean returnReport = validateRequest();
        if (returnReport) {
            intgList = this.generateViewIntegrationReportData(getESignUIConfig());
            this.createCSVFile(intgList, new FileWriter(DOCUSIGN_REPORT_FILE));
        } else {
            failureResponse(msgs, 400);
            return null;
        }
    } catch (Exception e) {
        e.printStackTrace();
        msgs.add(new Message(ESignatureIntegrationMessageTypeEnum.MESSAGE_TYPE_ERROR,
            UiIntegrationKeyConstants.UI_INTEGRATION_ERROR_CODE_500, UiIntegrationKeyConstants.UI_INTEGRATION_ERROR_TEXT_SERVICE_ERROR));
        failureResponse(msgs, 500);
        return null;
    }
    return UiIntegrationKeyConstants.REPORT_REPSONSE;
}
JUnit test so far:
@Test
public void testViewReport() throws Exception {
    Map<String, Object> actionMap = new HashMap<>();
    actionMap.put("application", "ESignatureIntegrationAction");
    ActionContext.setContext(new ActionContext(actionMap));

    FileInputStream inputStream = Mockito.mock(FileInputStream.class);
    PowerMockito.whenNew(FileInputStream.class).withAnyArguments().thenReturn(inputStream);

    action = new ESignatureIntegrationAction();
    action.viewReport();
}
I get an exception when the code reaches new FileInputStream(DOCUSIGN_REPORT_FILE).
Thanks for the help.
I would suggest refactoring your code in a way that allows testing without a mocking framework.
It could look somewhat like this:
public class YourClass {
    // ...
    public String viewReport() {
        try {
            boolean isValidRequest = validateRequest();
            if (isValidRequest) {
                IntegrationReportCsvFileHandler fileHandler = new IntegrationReportCsvFileHandler();
                IntegrationReportData inputData = fileHandler.readData(new FileInputStream(DOCUSIGN_REPORT_FILE));

                IntegrationReportGenerator generator = new IntegrationReportGenerator();
                IntegrationReportData outputData = generator.processData(inputData, getESignUIConfig());

                fileHandler.writeData(outputData, new FileWriter(DOCUSIGN_REPORT_FILE));
            } else {
                failureResponse(msgs, 400);
                return UiIntegrationKeyConstants.FAILURE_RESPONSE;
            }
        } catch (Exception e) {
            e.printStackTrace();
            msgs.add(new Message(ESignatureIntegrationMessageTypeEnum.MESSAGE_TYPE_ERROR,
                UiIntegrationKeyConstants.UI_INTEGRATION_ERROR_CODE_500, UiIntegrationKeyConstants.UI_INTEGRATION_ERROR_TEXT_SERVICE_ERROR));
            failureResponse(msgs, 500);
            return UiIntegrationKeyConstants.FAILURE_RESPONSE;
        }
        return UiIntegrationKeyConstants.REPORT_RESPONSE;
    }
    // ...
}
public class IntegrationReportData {
    // your custom data structure
    // may as well just be a List<Data>
    // may be different for input and output
}

public class IntegrationReportException extends Exception {
    // your custom exception
    public IntegrationReportException(String message) { super(message); }
}
public class IntegrationReportGenerator {
    public IntegrationReportData processData(IntegrationReportData data, ESignConfig config) throws IntegrationReportException {
        // here's your logic that requires testing
    }
}
public class IntegrationReportCsvFileHandler {

    public IntegrationReportData readData(InputStream input) throws IOException {
        // read data from given input stream
    }

    public void writeData(IntegrationReportData data, OutputStreamWriter outputWriter) throws IOException {
        // write data to given output stream
    }
}
That way the IntegrationReportGenerator would be easily testable.
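For example, a test for the extracted logic can then be a plain JUnit test with no mocking framework at all. This is only a sketch: how IntegrationReportData gets populated and what processData is supposed to produce depend on your code, and the no-argument ESignConfig constructor is an assumption.
import static org.junit.Assert.assertNotNull;

import org.junit.Test;

public class IntegrationReportGeneratorTest {

    @Test
    public void processDataHandlesSimpleInput() throws Exception {
        // Build the input entirely in memory; no file, servlet request, or mock needed.
        IntegrationReportData input = new IntegrationReportData();
        // ... populate input with a couple of representative rows ...

        IntegrationReportGenerator generator = new IntegrationReportGenerator();
        IntegrationReportData output = generator.processData(input, new ESignConfig());

        // Replace with assertions about the rows you expect processData to produce.
        assertNotNull(output);
    }
}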
I am wondering how to run the same Java class with different command-line options without changing those options manually.
Basically, for inputFile and treeFile, I have more than 100 different combinations of the two files. I cannot go through "Edit Configurations" in IntelliJ to get a result manually for each combination of treeFile and inputFile.
Could anybody suggest how to create a loop over those inputFile and treeFile values so that I do not need to specify them manually for each combination?
Your help is highly appreciated!
@Option(gloss="File of provided alignment")
public File inputFile;
@Option(gloss="File of the tree topology")
public File treeFile;
My Java class code is below:
public class UniformizationSample implements Runnable {

    @Option(gloss="File of provided alignment")
    public File inputFile;

    @Option(gloss="File of the tree topology")
    public File treeFile;

    @Option(gloss="ESS Experiment Number")
    public int rep = 1;

    @Option(gloss="Rate Matrix Method")
    public RateMtxNames selectedRateMtx = RateMtxNames.POLARITYSIZEGTR;

    @Option(gloss = "True rate matrix generating data")
    public File rateMtxFile;

    @Option(gloss="Use cache or not")
    public boolean cached = true;

    private final PrintWriter detailWriter = BriefIO.output(Results.getFileInResultFolder("experiment.details.txt"));
    public void run() {
        ObjectMapper mapper = new ObjectMapper();
        double[][] array;
        EndPointSampler.cached = cached;
        try (FileInputStream in = new FileInputStream(rateMtxFile)) {
            array = mapper.readValue(in, double[][].class);
            long startTime = System.currentTimeMillis();
            UnrootedTreeLikelihood<MultiCategorySubstitutionModel<ExpFamMixture>> likelihood1 =
                UnrootedTreeLikelihood
                    .fromFastaFile(inputFile, selectedRateMtx)
                    .withSingleRateMatrix(array)
                    .withExpFamMixture(ExpFamMixture.rateMtxModel(selectedRateMtx))
                    .withTree(treeFile);
            Random rand = new Random(1);
            likelihood1.evolutionaryModel.samplePosteriorPaths(rand, likelihood1.observations, likelihood1.tree);
            logToFile("Total time in seconds: " + ((System.currentTimeMillis() - startTime) / 1000.0));
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (JsonMappingException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
    public static void main(String[] args) {
        Mains.instrumentedRun(args, new UniformizationSample());
    }

    public void logToFile(String someline) {
        this.detailWriter.println(someline);
        this.detailWriter.flush();
    }
}
There is no way to do this in IntelliJ IDEA itself. However, you can modify your UniformizationSample class so that it takes the input data as parameters, and write another Java class that loops through your inputs and calls your class with the necessary parameters.
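As a sketch of that second class (the directory names here are made up, and which other @Option fields you have to set, e.g. rateMtxFile, depends on your setup; the point is simply that the fields are public, so a loop can assign them directly and then call run()):
import java.io.File;

public class UniformizationSampleBatch {

    public static void main(String[] args) {
        // Hypothetical directories holding the alignment and tree files.
        File[] inputFiles = new File("alignments").listFiles();
        File[] treeFiles = new File("trees").listFiles();

        for (File input : inputFiles) {
            for (File tree : treeFiles) {
                UniformizationSample sample = new UniformizationSample();
                sample.inputFile = input;
                sample.treeFile = tree;
                // sample.rateMtxFile = ...; // set any other required options the same way
                sample.run();
            }
        }
    }
}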
Currently I'm trying to use a SAX parser, but about 3/4 of the way through the file it just completely freezes up. I have tried allocating more memory, etc., but I'm not seeing any improvement.
Is there any way to speed this up? A better method?
I stripped it down to the bare bones, so I now have the following code, and when running it from the command line it still doesn't go as fast as I would like.
Running it with "java -Xms4096m -Xmx8192m -jar reader.jar" I get a GC overhead limit exceeded error around article 700000.
Main:
public class Read {
    public static void main(String[] args) {
        ArrayList<Page> pages = XMLManager.getPages();
    }
}
XMLManager:
public class XMLManager {

    public static ArrayList<Page> getPages() {
        ArrayList<Page> pages = null;
        SAXParserFactory factory = SAXParserFactory.newInstance();
        try {
            SAXParser parser = factory.newSAXParser();
            File file = new File("..\\enwiki-20140811-pages-articles.xml");
            PageHandler pageHandler = new PageHandler();
            parser.parse(file, pageHandler);
            pages = pageHandler.getPages();
        } catch (ParserConfigurationException e) {
            e.printStackTrace();
        } catch (SAXException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
        return pages;
    }
}
PageHandler:
public class PageHandler extends DefaultHandler {

    private ArrayList<Page> pages = new ArrayList<>();
    private Page page;
    private StringBuilder stringBuilder;
    private boolean idSet = false;

    public PageHandler() {
        super();
    }

    @Override
    public void startElement(String uri, String localName, String qName, Attributes attributes) throws SAXException {
        stringBuilder = new StringBuilder();
        if (qName.equals("page")) {
            page = new Page();
            idSet = false;
        } else if (qName.equals("redirect")) {
            if (page != null) {
                page.setRedirecting(true);
            }
        }
    }
    @Override
    public void endElement(String uri, String localName, String qName) throws SAXException {
        if (page != null && !page.isRedirecting()) {
            if (qName.equals("title")) {
                page.setTitle(stringBuilder.toString());
            } else if (qName.equals("id")) {
                if (!idSet) {
                    page.setId(Integer.parseInt(stringBuilder.toString()));
                    idSet = true;
                }
            } else if (qName.equals("text")) {
                String articleText = stringBuilder.toString();
                articleText = articleText.replaceAll("(?s)<ref(.+?)</ref>", " "); // remove references
                articleText = articleText.replaceAll("(?s)\\{\\{(.+?)\\}\\}", " "); // remove links underneath headings
                articleText = articleText.replaceAll("(?s)==See also==.+", " "); // remove everything after see also
                articleText = articleText.replaceAll("\\|", " "); // separate multiple links
                articleText = articleText.replaceAll("\\n", " "); // remove new lines
                articleText = articleText.replaceAll("[^a-zA-Z0-9- \\s]", " "); // remove all non-alphanumeric except dashes and spaces
                articleText = articleText.trim().replaceAll(" +", " "); // collapse multiple spaces into one
                Pattern pattern = Pattern.compile("([\\S]+\\s*){1,75}"); // get first 75 words of text
                Matcher matcher = pattern.matcher(articleText);
                matcher.find();
                try {
                    page.setSummaryText(matcher.group());
                } catch (IllegalStateException se) {
                    page.setSummaryText("None");
                }
                page.setText(articleText);
            } else if (qName.equals("page")) {
                pages.add(page);
                page = null;
            }
        } else {
            page = null;
        }
    }
    @Override
    public void characters(char[] ch, int start, int length) throws SAXException {
        stringBuilder.append(ch, start, length);
    }

    public ArrayList<Page> getPages() {
        return pages;
    }
}
Your parsing code is likely working fine, but the volume of data you're loading is probably just too large to hold in memory in that ArrayList.
You need some sort of pipeline to pass the data on to its actual destination without ever storing it all in memory at once.
What I've sometimes done for this sort of situation is similar to the following.
Create an interface for processing a single element:
public interface PageProcessor {
    void process(Page page);
}
Supply an implementation of this to the PageHandler through a constructor:
public class Read {

    public static void main(String[] args) {
        XMLManager.load(new PageProcessor() {
            @Override
            public void process(Page page) {
                // Obviously you want to do something other than just printing,
                // but I don't know what that is...
                System.out.println(page);
            }
        });
    }
}
public class XMLManager {

    public static void load(PageProcessor processor) {
        SAXParserFactory factory = SAXParserFactory.newInstance();
        try {
            SAXParser parser = factory.newSAXParser();
            File file = new File("pages-articles.xml");
            PageHandler pageHandler = new PageHandler(processor);
            parser.parse(file, pageHandler);
        } catch (ParserConfigurationException e) {
            e.printStackTrace();
        } catch (SAXException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
Send data to this processor instead of putting it in the list:
public class PageHandler extends DefaultHandler {

    private final PageProcessor processor;
    private Page page;
    private StringBuilder stringBuilder;
    private boolean idSet = false;

    public PageHandler(PageProcessor processor) {
        this.processor = processor;
    }

    @Override
    public void startElement(String uri, String localName, String qName, Attributes attributes) throws SAXException {
        // Unchanged from your implementation
    }

    @Override
    public void characters(char[] ch, int start, int length) throws SAXException {
        // Unchanged from your implementation
    }

    @Override
    public void endElement(String uri, String localName, String qName) throws SAXException {
        // Elide code not needing change
            } else if (qName.equals("page")) {
                processor.process(page);
                page = null;
            }
        } else {
            page = null;
        }
    }
}
Of course, you can make your interface handle chunks of multiple records rather than just one, and have the PageHandler collect pages locally in a smaller list, periodically sending that list off for processing and clearing it.
Or (perhaps better) you could implement the PageProcessor interface as defined here and build the buffering logic into it, sending the data on for further handling in chunks.
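For instance, a buffering implementation might look roughly like this (only a sketch: BatchingPageProcessor, BATCH_SIZE, and handleBatch are illustrative names, and what actually happens to each batch is up to you):
import java.util.ArrayList;
import java.util.List;

public class BatchingPageProcessor implements PageProcessor {

    private static final int BATCH_SIZE = 1000;
    private final List<Page> buffer = new ArrayList<>();

    @Override
    public void process(Page page) {
        buffer.add(page);
        if (buffer.size() >= BATCH_SIZE) {
            flush();
        }
    }

    // Call this once after parsing completes so the final partial batch is not lost.
    public void flush() {
        if (!buffer.isEmpty()) {
            handleBatch(new ArrayList<>(buffer));
            buffer.clear();
        }
    }

    private void handleBatch(List<Page> batch) {
        // Placeholder: hand the batch to its real destination (database, index, ...).
        System.out.println("Handled a batch of " + batch.size() + " pages");
    }
}
The only extra wiring is a call to flush() after parser.parse(...) returns.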
Don Roby's approach is somewhat reminiscent of the approach I followed when creating a code generator designed to solve this particular problem (an early version was conceived in 2008). Basically, each complexType has its Java POJO equivalent, and handlers for the particular type are activated when the context changes to that element. I used this approach for SEPA, transaction banking, and, for instance, Discogs (30 GB). You can specify which elements you want to process at runtime, declaratively, using a properties file.
XML2J maps complexTypes to Java POJOs on the one hand, but on the other lets you specify which events you want to listen for.
E.g.
account/#process = true
account/accounts/#process = true
account/accounts/#detach = true
The essence is in the third line: the detach makes sure individual accounts are not added to the accounts list, so the list won't overflow.
class AccountType {

    private List<AccountType> accounts = new ArrayList<>();

    public void addAccount(AccountType tAccount) {
        accounts.add(tAccount);
    }

    // etc.
}
In your code you need to implement the process method (by default the code generator generates an empty method):
class AccountsProcessor implements MessageProcessor {

    private static Logger logger = LoggerFactory.getLogger(AccountsProcessor.class);

    // assuming Spring Data persistence here
    final String path = new ClassPathResource("spring-config.xml").getPath();
    ClassPathXmlApplicationContext context = new ClassPathXmlApplicationContext(path);
    AccountsTypeRepo repo = context.getBean(AccountsTypeRepo.class);

    @Override
    public void process(XMLEvent evt, ComplexDataType data) throws ProcessorException {
        if (evt == XMLEvent.END) {
            if (data instanceof AccountType) {
                process((AccountType) data);
            }
        }
    }

    private void process(AccountType data) {
        if (logger.isInfoEnabled()) {
            // do some logging
        }
        repo.save(data);
    }
}
Note that XMLEvent.END marks the closing tag of an element, so when you are processing it, the element is complete. If you have to relate it (using a FK) to its parent object in the database, you could process the XMLEvent.BEGIN for the parent, create a placeholder in the database, and store its key with each of its children. In the final XMLEvent.END you would then update the parent.
Note that the code generator generates everything you need. You just have to implement that method and of course the DB glue code.
There are samples to get you started. The code generator even generates your POM files, so you can build your project immediately after generation.
The default process method is like this:
@Override
public void process(XMLEvent evt, ComplexDataType data) throws ProcessorException {
    /*
     * TODO Auto-generated method stub: implement your own handling here.
     * Use the runtime configuration file to determine which events are to be sent to the processor.
     */
    if (evt == XMLEvent.END) {
        data.print(ConsoleWriter.out);
    }
}
Downloads:
https://github.com/lolkedijkstra/xml2j-core
https://github.com/lolkedijkstra/xml2j-gen
https://sourceforge.net/projects/xml2j/
First run mvn clean install on the core (it has to be in the local Maven repository), then on the generator. And don't forget to set the XML2J_HOME environment variable as described in the user manual.
I have a class which reads a properties file; please see below.
The method readProperties() is called many times while the application is running. Does that mean there is a memory issue here?
public class PropertyReader {

    private static Properties configKeyValuePairs = null;
    private static String configPropertiesFileName = "Config.properties";

    static void readProperties() throws FileNotFoundException, IOException {
        configKeyValuePairs = new Properties();
        InputStream input = PropertyReader.class
                .getResourceAsStream(configPropertiesFileName);
        configKeyValuePairs.load(input);
        input.close();
    }

    static String getUserName() {
        // return the user name read from the properties file
    }
}
Assuming your properties file never changes, you can do the following:
public class MyApplicationConfiguration {

    private static Properties configKeyValuePairs = new Properties();
    private static String configPropertiesFileName = "Config.properties";

    static {
        try (InputStream input = MyApplicationConfiguration.class
                .getResourceAsStream(configPropertiesFileName)) {
            configKeyValuePairs.load(input);
        } catch (IOException e) {
            // Deal with not being able to load config, could be a fatal error!
        }
    }

    public static String getUsername() {
        // ...
    }

    // Implement getters for other configuration key-value pairs
    // DO NOT let configKeyValuePairs be returned to anyone
}
Load the properties object once and store it in a class member.
I find it hard to believe that you will have memory issues because of it.
If you find out that you do, you can always come back and rethink it, but don't prematurely optimize a problem that probably doesn't exist.
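A minimal sketch of that idea (the getConfig helper and the "username" key are made up for illustration):
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class PropertyReader {

    private static Properties config; // loaded once, then reused

    static synchronized Properties getConfig() throws IOException {
        if (config == null) {
            Properties props = new Properties();
            try (InputStream in = PropertyReader.class.getResourceAsStream("Config.properties")) {
                props.load(in);
            }
            config = props;
        }
        return config;
    }

    static String getUserName() throws IOException {
        return getConfig().getProperty("username");
    }
}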
Yes, there could be a very big memory problem, depending on whether or not calling classes hold a reference to the newly created Properties object.
Try something like this:
public class PropertyReader {

    private static Properties configKeyValuePairs = null;
    private static final String configPropertiesFileName = "Config.properties";

    public static void readProperties() throws FileNotFoundException, IOException {
        if (null == configKeyValuePairs) {
            InputStream input = null;
            synchronized (PropertyReader.class) {
                try {
                    configKeyValuePairs = new Properties();
                    input = PropertyReader.class
                            .getResourceAsStream(configPropertiesFileName);
                    configKeyValuePairs.load(input);
                } finally {
                    // this can still throw IOException!
                    if (null != input) {
                        input.close();
                    }
                }
            }
        }
    }
}