Groovy/Java: Ini4j - insert multiple values for a single parameter on different lines

I'm trying to add a multi-valued option to my ini file from Groovy using ini4j, with the following code (I tried a couple of variants):
import org.ini4j.Wini

List valuesList = [ 'val1', 'val2', 'val3' ]
( new Wini( new File( "test.ini" ) ) ).with {
    valuesList.each {
        put( 'sectionName', 'optionName', it )
    }
    store()
}
and the second variant:
import org.ini4j.Wini
import org.ini4j.Profile.Section

List valuesList = [ 'val1', 'val2', 'val3' ]
( new Wini( new File( "test.ini" ) ) ).with {
    Section sectionObject = get( 'sectionName' )
    sectionObject.put( 'optionName', 'val1' )
    sectionObject.put( 'optionName', 'val2' )
    sectionObject.put( 'optionName', 'val3' )
    store()
}
I end up with an ini file like this one:
[sectionName]
optionName = val3
But I want to get:
[sectionName]
optionName = val1
optionName = val2
optionName = val3
Could you please advise me how to resolve this? Thanks in advance!
Update 1
I'm still waiting for a more elegant solution, but in the meantime I wrote the direct ini-file editing below. Please give me any feedback on it:
List newLines = []
File currentFile = new File( "test.ini" )
List currentLines = currentFile.readLines()
// find the section header line (the file contains '[sectionName]', not bare 'sectionName')
int indexSectionStart = currentLines.indexOf( '[sectionName]' )
// copy everything up to and including the section header
( 0..indexSectionStart ).each {
    newLines.add( currentLines[ it ] )
}
// insert one line per value
List valuesList = 'val1,val2,val3'.split( ',' )
valuesList.each {
    newLines.add( "optionName = $it" )
}
// copy the rest of the original file
( indexSectionStart + 1 .. currentLines.size() - 1 ).each {
    newLines.add( currentLines[ it ] )
}
File newFile = new File( "new_test.ini" )
if ( newFile.exists() ) newFile.delete()
newLines.each {
    newFile.append( it + '\n' )
}
Then I simply delete the old file and rename the new one. I implemented it this way because I didn't find any insertLine()-like methods in the standard File class.

Right, how's this:
import org.ini4j.*

List valuesList = [ 'val1', 'val2', 'val3' ]

new File( "/tmp/test.ini" ).with { file ->
    new Wini().with { ini ->
        // Configure to allow multiple options
        ini.config = new Config().with { it.multiOption = true ; it }
        // Load the ini file
        ini.load( file )
        // Get or create the section
        ( ini.get( 'sectionName' ) ?: ini.add( 'sectionName' ) ).with { section ->
            valuesList.each {
                // Then ADD the options
                section.add( 'optionName', it )
            }
        }
        // And write it back out
        store( file )
    }
}
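If you need to read the multi-valued option back later, the same multiOption flag matters on the read side as well, since getAll is what returns every stored value. A minimal Java sketch (assuming the /tmp/test.ini produced above; the class name is just for illustration):
import java.io.File;
import java.util.List;

import org.ini4j.Config;
import org.ini4j.Wini;

public class ReadMultiOption {
    public static void main( String[] args ) throws Exception {
        // Enable multi-valued options before loading, otherwise later
        // occurrences of optionName overwrite the earlier ones
        Config config = new Config();
        config.setMultiOption( true );
        Wini ini = new Wini();
        ini.setConfig( config );
        ini.load( new File( "/tmp/test.ini" ) );
        // getAll returns every value stored under the key
        List<String> values = ini.get( "sectionName" ).getAll( "optionName" );
        System.out.println( values ); // expected: [val1, val2, val3]
    }
}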

Related

Writing a Jagged Array in HDF5 using the Java Native Library

I have tried numerous ways and followed some of the examples that are scattered around the web on how to write a jagged array (an array of arrays that may be of differing lengths) in HDF5.
Most of the examples are in C and rather low-level. Anyhow, I can't seem to get it working, and having just looked at the C source code, it pretty much says that any variable-length datatypes that are not strings are not supported (if I understood correctly).
My miserable dysfunctional code (as is):
public void WIP_createVLenFloatDataSet( List<? extends Number> floats ) throws Exception
{
    String group = "/test";
    long groupId = createGroupIfNotExist( group );
    MDataQualifier qualifier = new MDataQualifierImpl( group, "float", "0.0.0" );
    long datasetId = openDataSet( qualifier );
    long heapType = H5.H5Tcopy( MDataType.FLOAT_ARRAY.getHDFType() );
    heapType = H5.H5Tvlen_create( heapType );
    // heapType = H5.H5Tarray_create( heapType, 1, new long[]{ 1 } );
    if( !exists( datasetId ) )
    {
        long[] maxDims = new long[]{ HDF5Constants.H5S_UNLIMITED };
        long dataspaceId = H5.H5Screate_simple( 1, new long[]{ 1 }, null );
        // Create the dataset.
        long datasetId1 = -1;
        try
        {
            if( exists( m_fileId ) && exists( dataspaceId ) && exists( heapType ) )
            {
                long creationProperties = H5.H5Pcreate( HDF5Constants.H5P_DATASET_CREATE );
                H5.H5Pset_chunk( creationProperties, /*ndims*/1, new long[]{ 1 } );
                datasetId1 = H5.H5Dcreate( groupId, qualifier.getVersionedName(), heapType, dataspaceId, H5P_DEFAULT, creationProperties, H5P_DEFAULT );
                // H5.H5Pclose( creationProperties );
            }
        }
        catch( Exception e )
        {
            LOG.error( "Problems creating the dataset: " + e.getMessage(), e );
        }
        datasetId = datasetId1;
        if( exists( datasetId ) )
        {
            // flushIfNecessary();
            LOG.trace( "Wrote empty dataset {}", qualifier.getVersionedName() );
        }
    }
    List<? extends Number> data = ( List<? extends Number> )floats;
    // H5.H5Dwrite( datasetId, heapType, dataspaceId, memSpaceId, HDF5Constants.H5P_DEFAULT, Floats.toArray( data ) );
    ByteBuffer bb = ByteBuffer.allocate( data.size() * 4 );
    floats.forEach( f -> bb.putFloat( f.floatValue() ) );
    // H5.H5Dwrite( datasetId, heapType, H5S_ALL, H5S_ALL, H5P_DEFAULT, Floats.toArray( data ) );
    H5.H5Dwrite( datasetId, heapType, H5S_ALL, H5S_ALL, H5P_DEFAULT, bb.array() );
}
Has anyone done this before and can at least confirm that it's not possible?
The most I can get out of HDF5 is the message "buf does not support variable length type".
Apparently the "glue code" of the JNI wrapper doesn't support this. If you want to use this feature you either have to implement your own JNI or wait for a newer version. The official JNI code is open source and can be found here.

How to join two files via Cascading

Let's see what we have. First file [Interface Class]:
list arrayList
list linkedList
Second file[Class countOfInstanse]:
arrayList 120
linkedList 4
I would like to join these two files by the key [Class] and get a count per Interface:
list 124
and here is the code:
public class Main
{
    public static void main( String[] args )
    {
        String docPath = args[ 0 ];
        String wcPath = args[ 1 ];
        String stopPath = args[ 2 ];

        Properties properties = new Properties();
        AppProps.setApplicationJarClass( properties, Main.class );
        AppProps.setApplicationName( properties, "Part 1" );
        AppProps.addApplicationTag( properties, "lets:do:it" );
        AppProps.addApplicationTag( properties, "technology:Cascading" );
        FlowConnector flowConnector = new Hadoop2MR1FlowConnector( properties );

        // create source and sink taps
        Tap docTap = new Hfs( new TextDelimited( true, "\t" ), docPath );
        Tap wcTap = new Hfs( new TextDelimited( true, "\t" ), wcPath );
        Fields stop = new Fields( "class" );
        Tap classTap = new Hfs( new TextDelimited( true, "\t" ), stopPath );

        // specify a regex operation to split the "document" text lines into a token stream
        Fields token = new Fields( "token" );
        Fields text = new Fields( "interface" );
        RegexSplitGenerator splitter = new RegexSplitGenerator( token, "[ \\[\\]\\(\\),.]" );
        Fields fieldSelector = new Fields( "interface", "class" );
        Pipe docPipe = new Each( "token", text, splitter, fieldSelector );

        // define "ScrubFunction" to clean up the token stream
        Fields scrubArguments = new Fields( "interface", "class" );
        docPipe = new Each( docPipe, scrubArguments, new ScrubFunction( scrubArguments ), Fields.RESULTS );

        Fields text1 = new Fields( "amount" );
        // RegexSplitGenerator splitter = new RegexSplitGenerator( token, "[ \\[\\]\\(\\),.]" );
        Fields fieldSelector1 = new Fields( "class", "amount" );
        Pipe stopPipe = new Each( "token1", text1, splitter, fieldSelector1 );

        Pipe tokenPipe = new CoGroup( docPipe, token, stopPipe, text, new InnerJoin() );
        tokenPipe = new Each( tokenPipe, text, new RegexFilter( "^$" ) );

        // determine the word counts
        Pipe wcPipe = new Pipe( "wc", tokenPipe );
        wcPipe = new Retain( wcPipe, token );
        wcPipe = new GroupBy( wcPipe, token );
        wcPipe = new Every( wcPipe, Fields.ALL, new Count(), Fields.ALL );

        // connect the taps, pipes, etc., into a flow
        FlowDef flowDef = FlowDef.flowDef()
            .setName( "wc" )
            .addSource( docPipe, docTap )
            .addSource( stopPipe, classTap )
            .addTailSink( wcPipe, wcTap );

        // write a DOT file and run the flow
        Flow wcFlow = flowConnector.connect( flowDef );
        wcFlow.writeDOT( "dot/wc.dot" );
        wcFlow.complete();
    }
}
[I decided to resolve this issue step by step and left the final result here for others. So, first step: couldn't join two files on one key via Cascading (not completed yet).]
I would convert the two files to two Map objects, iterate through the keys and sum up the numbers. Then you can write them back to a file.
Map<String, String> nameToType = new HashMap<String, String>();
Map<String, Integer> nameToCount = new HashMap<String, Integer>();
// fill the Maps from the files here
Map<String, Integer> result = new HashMap<String, Integer>();
for ( String name : nameToType.keySet() )
{
    String type = nameToType.get( name );    // e.g. arrayList -> list
    int count = nameToCount.get( name );     // e.g. arrayList -> 120
    if ( !result.containsKey( type ) )
        result.put( type, 0 );
    result.put( type, result.get( type ) + count );
}
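The "fill the Maps from the files here" step isn't spelled out; a minimal sketch could look like the following (the file names interfaces.txt and counts.txt are assumptions, since the question doesn't name the files, and whitespace-separated columns are assumed from the samples):
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.HashMap;
import java.util.Map;

public class FillMaps {
    public static void main( String[] args ) throws Exception {
        // class -> interface, e.g. "list arrayList" becomes arrayList -> list
        Map<String, String> nameToType = new HashMap<>();
        for ( String line : Files.readAllLines( Paths.get( "interfaces.txt" ) ) ) {
            if ( line.trim().isEmpty() ) continue;
            String[] parts = line.trim().split( "\\s+" );
            nameToType.put( parts[ 1 ], parts[ 0 ] );
        }
        // class -> count, e.g. "arrayList 120" becomes arrayList -> 120
        Map<String, Integer> nameToCount = new HashMap<>();
        for ( String line : Files.readAllLines( Paths.get( "counts.txt" ) ) ) {
            if ( line.trim().isEmpty() ) continue;
            String[] parts = line.trim().split( "\\s+" );
            nameToCount.put( parts[ 0 ], Integer.parseInt( parts[ 1 ] ) );
        }
        System.out.println( nameToType );
        System.out.println( nameToCount );
    }
}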

Java Properties.getProperty() with an Array of Objects

I have a configuration file formatted like this:
object1=1
object2=2
object3=3
array={
sub_object1=sub_1
sub_object2=sub_2
sub_object3=sub_3
}
object4=4
object5=5
I have been trying to process this with Properties.getProperty, but am unable to find an effective method to process the array.
try {
    Properties props = new Properties();
    props.load( new FileInputStream( "settings.conf" ) );
    if( !props.isEmpty() )
    {
        props.stringPropertyNames().stream().forEach( ( key ) ->
        {
            if( key.equals( "array" ) )
            {
                // this is the part I can't get right: I haven't found an API
                // that returns the sub-keys of "array" as a list
                List<Object> subkeys = props.list();
                for( Object subkey : subkeys )
                {
                    System.out.println( "Subkey: " + props.getProperty( subkey ) );
                }
            }
        });
    }
} catch ( Exception e ) {
    e.printStackTrace();
}
I realize the above is erroneous, but I have been unable to find a solution. Anyone have an idea?
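For what it's worth, java.util.Properties has no notion of nested blocks: loading the file above just yields a flat key set, where array gets the value { and the sub-keys (and even the closing }) become ordinary top-level properties, which is why getProperty can't return the array as a unit. A small sketch to see what actually gets loaded (assuming the settings.conf shown above):
import java.io.FileInputStream;
import java.util.Properties;

public class FlatPropertiesDemo {
    public static void main( String[] args ) throws Exception {
        Properties props = new Properties();
        props.load( new FileInputStream( "settings.conf" ) );
        // Prints flat entries such as: array = {, sub_object1 = sub_1, } =
        // i.e. the braces carry no structure for Properties
        props.stringPropertyNames().forEach( key ->
                System.out.println( key + " = " + props.getProperty( key ) ) );
    }
}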

Null Pointer Exception when replacing character " with no text then casting value to Float

I am writing a Java program that fetches JSON data through HTTP GET, and it returns the following after I navigate the object tree:
{
"year":"2015",
"period":"M03",
"periodName":"March",
"value":"141178",
"footnotes":[{}]
}
Now I want to take value and cast it to a float; I tried to do it like so:
JSONParser parser = new JSONParser();
try
{
    JSONObject BLSemployment = ( JSONObject ) parser.parse( _RDATA );
    BLSemployment = ( ( JSONObject ) BLSemployment.get( "Results" ) );
    JSONArray BLSemploymentseries = ( ( JSONArray ) BLSemployment.get( "series" ) );
    BLSemployment = ( ( JSONObject ) BLSemploymentseries.get( 0 ) );
    BLSemploymentseries = ( ( JSONArray ) BLSemployment.get( "data" ) );
    for( int i = 0; i < 12; i++ )
    {
        BLSemployment = ( ( JSONObject ) BLSemploymentseries.get( i ) );
        HistoricalNonFarmPayrollData[ i ] = Float.parseFloat( JSONValue.toJSONString( BLSemployment.get( "value" ) ).replace( "\"", "" ) );
        HistoricalNonFarmPayrollYear[ i ] = JSONValue.toJSONString( BLSemployment.get( "year" ) );
        HistoricalNonFarmPayrollMonth[ i ] = JSONValue.toJSONString( BLSemployment.get( "periodName" ) );
    }
}
catch ( ParseException pe )
{
    System.out.println( pe );
}
However, now I get this error:
Exception in thread "main" java.lang.NullPointerException
at BLSFramework.getNonFarmPayrolls(playground.java:354)
at playground.main(playground.java:27)
RESOLVED: occasionally the BLS website doesn't send data for some periods, and I happened to be coding during one of those gaps.
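Given that resolution, a defensive check for missing entries avoids the NullPointerException when the feed is incomplete. A hedged sketch (the class and method names are just for illustration; it reuses the json-simple types from the snippet above):
import org.json.simple.JSONArray;
import org.json.simple.JSONObject;

public class BlsSeriesGuard {
    // Parses up to maxPeriods "value" fields, skipping entries the feed omitted
    static float[] parseValues( JSONArray series, int maxPeriods ) {
        int periods = Math.min( maxPeriods, series.size() );
        float[] values = new float[ periods ];
        for( int i = 0; i < periods; i++ ) {
            JSONObject entry = ( JSONObject ) series.get( i );
            Object raw = entry.get( "value" );
            if( raw == null ) {
                continue; // BLS sometimes sends no data for a period
            }
            // the raw value is already an unquoted string such as "141178"
            values[ i ] = Float.parseFloat( raw.toString() );
        }
        return values;
    }
}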

Returning JSON from Java to Worklight Adapter

I have a class FileHandling with a method called readAllLayouts, which reads all the files from a specified folder and returns their content as a JSONArray to a Worklight adapter.
I get this type of error when invoking the Worklight procedure:
{
"errors": [
"Evaluator: Java class \"org.json.simple.JSONArray\" has no public instance field or method named \"isSuccessful\"."
],
"info": [
],
"isSuccessful": false,
"warnings": [
]
}
and this is the code of my Java method:
public static JSONArray readAllLayoutFiles( ){
    File layoutDir = new File( LAYOUT_PARENT_DIR );
    String allFiles[] = layoutDir.list();
    System.out.println( "All Files Length : " + allFiles.length );
    JSONObject obj = null; // new JSONObject[ allFiles.length ];
    JSONArray retArr = new JSONArray();
    for ( String f : allFiles ){
        obj = new JSONObject();
        obj.put( "layoutname", f.replaceAll( ".txt", "" ) );
        obj.put( "layouthtml", readLayoutFile( f ) );
        retArr.add( obj );
    }
    obj = new JSONObject();
    obj.put( "isSuccessful", true );
    retArr.add( obj );
    System.out.println( retArr.toString() );
    return retArr;
}
Any help would be appreciated.
Why is "isSuccessful" false in the output whereas it is set to true in your code?
Because of this problem I changed my code so that it works, but now it is not returning JSON; it returns a String that contains JSON.
Here is the code, which is working fine:
public static String readAllLayoutFiles( ){
    File layoutDir = new File( LAYOUT_PARENT_DIR );
    String allFiles[] = layoutDir.list();
    System.out.println( "All Files Length : " + allFiles.length );
    JSONObject obj = null; // new JSONObject[ allFiles.length ];
    JSONArray retArr = new JSONArray();
    for ( String f : allFiles ){
        obj = new JSONObject();
        obj.put( "layoutname", f.replaceAll( ".txt", "" ) );
        obj.put( "layouthtml", readLayoutFile( f ) );
        retArr.add( obj );
    }
    obj = new JSONObject();
    obj.put( "isSuccessful", true );
    retArr.add( obj );
    System.out.println( retArr.toJSONString() );
    return retArr.toJSONString();
}
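The original error comes from the adapter layer looking for an isSuccessful member on the returned object, which org.json.simple.JSONArray simply doesn't have. If you'd rather keep returning JSON than a String, one hedged alternative (not from the original thread, and it assumes the adapter accepts a JSONObject return) is to wrap the array in a top-level object that carries its own isSuccessful flag:
import org.json.simple.JSONArray;
import org.json.simple.JSONObject;

public class LayoutResponse {
    // Hypothetical wrapper: puts the layouts array inside a JSONObject so the
    // top-level return value can expose its own "isSuccessful" member
    @SuppressWarnings( "unchecked" )
    public static JSONObject wrapLayouts( JSONArray layouts ) {
        JSONObject response = new JSONObject();
        response.put( "isSuccessful", true );
        response.put( "layouts", layouts );
        return response;
    }
}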
