We are using Spring Batch chunk-oriented processing to read messages from a JMS destination and write them to a flat file. In this regard, we have the following observations:
1. If the message broker goes down while the reader is reading and the commit count has not yet been reached, whatever messages have been read so far are passed to the writer, and the batch then goes into the FAILED state. Is this the default behaviour of chunk processing?
2. If the answer to point 1 is yes, how do we make sure that this partial chunk is not sent to the writer? (To give more background: the JMS session is transacted in the JmsTemplate, so when the chunk fails to read the full commit-interval's worth of messages, everything read in the partial chunk is rolled back to the JMS destination, while that same partial chunk is still written to the file. This causes duplicates in the file when we restart the batch job.)
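For reference, the session-transacted JmsTemplate mentioned above is set up roughly like this (a minimal sketch only; the factory wiring and timeout are placeholders, not our actual configuration):

import javax.jms.ConnectionFactory;
import org.springframework.jms.core.JmsTemplate;

public class JmsTemplateSketch {
    public JmsTemplate jmsTemplate(ConnectionFactory connectionFactory) {
        JmsTemplate template = new JmsTemplate(connectionFactory);
        // Transacted session: messages received in a chunk that fails are rolled back to the destination.
        template.setSessionTransacted(true);
        template.setReceiveTimeout(1000L); // illustrative receive timeout
        return template;
    }
}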
Any help would be greatly appreciated.
EDIT
The configuration is shown below.
Chunk:
<batch:step id="step-1" next="step-2">
<batch:tasklet allow-start-if-complete="false">
<batch:chunk reader="jms-reader-1-1" writer="file-writer-1-1" commit-interval="1000">
</batch:chunk>
</batch:step>
Writer (Flat File):
<bean scope="step" class="o.s.b.i.f.FlatFileItemWriter" id="file-writer-1-1">
<property name="resource" value="file:#{T(com.test.core.BatchConfiguration).BATCH_VFS_LOCAL_TEMP_LOCATION}/#{T(com.test.utils.ThreadContextUtils).getJobInstanceIdAsString()}/AssetMesage"/>
<property name="lineAggregator">
<bean class="o.s.b.i.f.t.DelimitedLineAggregator">
<property name="delimiter" value=","/>
<property name="fieldExtractor">
<bean class="o.s.b.i.f.t.BeanWrapperFieldExtractor">
<property name="names" value="assetId,assetName,assetDesc"/>
</bean>
</property>
</bean>
</property>
</bean>
Reader (JMS):
<bean scope="step" class="com.test.runtime.impl.item.readers.JMSItemReader" id="jms-reader-1-1">
<property name="adapter">
<bean class="com.test.adapter.impl.JMSAdapter">
<property name="resource" ref="JMS.vmsmartbatch02_Regression"/>
<property name="retryerId" value="JMS.vmsmartbatch02_Regression-retryer"/>
</bean>
</property>
<property name="destination" value="#{jobParameters[source1jmsdestination] != null ? jobParameters[source1jmsdestination] : "sourceTopic"}"/><property name="durableSubscriberName" value="sourceTopicDS"/><property name="destinationType" value="Topic"/>
<property name="ackMode" value="#{T(javax.jms.Session).CLIENT_ACKNOWLEDGE}"/>
<property name="maxMessageCount" value="2000"/>
</bean>
EDIT 2
Below is the core reader logic I am using.
Reader
public Object read() throws Exception, UnexpectedInputException,
        ParseException, NonTransientResourceException {
    Object item = null;
    // Only propagate a non-transacted acknowledge mode (AUTO=1, CLIENT=2, DUPS_OK=3) to the session.
    if (ackMode >= 1 && ackMode <= 3) {
        adapter.getResource().setSessionAcknowledgeMode(ackMode);
    }
    // Track how many messages have been read so far in the step execution context.
    if (maxMessageCount > 0) {
        ThreadContextUtils.addToExecutionContext("maxMessageCount", maxMessageCount);
        if (ThreadContextUtils.getExecutionContext().containsKey("readMessageCount")) {
            readMessageCount = ThreadContextUtils.getExecutionContext().getInt("readMessageCount");
        }
    }
    // Receive a single message, using the durable subscriber when the destination is a topic.
    if (TOPIC_KEY.equalsIgnoreCase(destinationType) && durableSubscriberName != null) {
        item = adapter.invoke(REC_DS_AND_CONVERT_SELECTED,
                OBJECT_CLASS, destination, durableSubscriberName,
                receiveTimeout, filter == null ? "" : filter);
    } else {
        item = adapter.invoke(REC_AND_CONVERT_SELECTED,
                OBJECT_CLASS, destination,
                receiveTimeout <= 0 ? adapter.getResource().getReceiveTimeout() : receiveTimeout,
                filter == null ? "" : filter);
    }
    // Count only successful reads and persist the counter for restartability.
    if (maxMessageCount > 0 && item != null) {
        readMessageCount++;
        ThreadContextUtils.addToExecutionContext("readMessageCount", readMessageCount);
    }
    return item;
}
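Note: in the snippet above the counter is persisted but never used to stop reading. If the intent of maxMessageCount is to cap the number of messages per execution (an assumption on my part), a minimal sketch of that check, placed at the top of read(), would be:

// Sketch only (not part of the original reader): returning null tells Spring Batch there is
// no more input, so the step finishes the current chunk and ends instead of polling again.
if (maxMessageCount > 0 && readMessageCount >= maxMessageCount) {
    return null;
}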
Related
I have a spring-batch application that reads a file with this reader:
<bean id="tradeItemReader" class="org.springframework.batch.item.file.FlatFileItemReader" scope="step">
<property name="resource">
<bean class="org.springframework.core.io.FileSystemResource">
<constructor-arg value="${input.file.path}/#{jobExecutionContext['trades']}" type="java.lang.String"/>
</bean>
</property>
<property name="linesToSkip" value="1" />
<property name="lineMapper">
<bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
<!-- split it -->
<property name="lineTokenizer">
<bean
class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
<beans:property name="strict" value="false" />
<beans:property name="includedFields" value="0,2,3,6" />
<property name="names"
value="field0,field2,field3,field6" />
</bean>
</property>
<property name="fieldSetMapper">
<bean
class="org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper">
<property name="prototypeBeanName" value="trade" />
</bean>
</property>
</bean>
</property>
</bean>
The fields are delimited by a comma, and here is the catch: some fields look like [LON, TGT], and the line ends up wrongly parsed because of the comma inside the square brackets.
Example:
Input: Global,,VERIFIED,[LON, TGT],ERerd,3456585,QTR,20190929,20231020
Desired output: Global, VERIFIED, [LON, TGT], QTR
Actual output: Global, VERIFIED, [LON, 3456585
How can I achieve that? I don't have control over the input file.
EDIT
This is not a duplicate, as the proposed solution would not work: here we don't have a single quote character, but two different ones, the opening bracket and the closing bracket.
As explained by Luca Basso Ricci, my input CSV is invalid, but I still have to deal with it because I have no control over it.
So I wrote my own delimited line tokenizer, which is just the DelimitedLineTokenizer with a rewritten isDelimiter() method, and replaced it in the conf file:
private boolean isDelimiter(char[] chars, int i, String token, int endIndexLastDelimiter) {
    boolean result = false;
    // Count how many brackets have been opened and closed before position i:
    // a comma inside [ ... ] must not be treated as a field delimiter.
    int openingBrackets = StringUtils.countOccurrencesOf(new String(Arrays.copyOfRange(chars, 0, i)), "[");
    int closingBrackets = StringUtils.countOccurrencesOf(new String(Arrays.copyOfRange(chars, 0, i)), "]");
    boolean inBrackets = (openingBrackets - closingBrackets > 0);
    if ((i - endIndexLastDelimiter >= this.delimiter.length()) &&
            (i >= token.length() - 1)) {
        String end = new String(chars, i - token.length() + 1, token.length());
        if (token.equals(end)) {
            // only treat it as a delimiter when we are not inside a bracketed value
            result = !inBrackets;
        }
    }
    return result;
}
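For illustration, here is a standalone sketch (not the tokenizer class itself) of the same bracket-depth idea applied to the sample line: a comma only splits the record when the running count of [ minus ] is zero.

import java.util.ArrayList;
import java.util.List;

public class BracketAwareSplitDemo {
    public static void main(String[] args) {
        String line = "Global,,VERIFIED,[LON, TGT],ERerd,3456585,QTR,20190929,20231020";
        List<String> tokens = new ArrayList<>();
        StringBuilder current = new StringBuilder();
        int depth = 0; // open brackets minus closed brackets seen so far
        for (char c : line.toCharArray()) {
            if (c == '[') depth++;
            if (c == ']') depth--;
            if (c == ',' && depth == 0) {
                // a real delimiter: close the current token
                tokens.add(current.toString());
                current.setLength(0);
            } else {
                current.append(c);
            }
        }
        tokens.add(current.toString());
        // prints: [Global, , VERIFIED, [LON, TGT], ERerd, 3456585, QTR, 20190929, 20231020]
        System.out.println(tokens);
    }
}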
I have a program that contains some for-loops. The idea of the program is to log into a website using multiple accounts and retrieve a list (each login brings a different list). So the way I have it set up is with an enhanced for loop:
loginsList.put( "firstUsername", "firstPassword" );
loginsList.put( "secondUsername", "secondPassword" );
loginsList.put( "thirdUsername", "thirdPassword" );
loginsList.put( "fourthUsername", "fourthPassword" );
loginsList.put( "fifthUsername", "fifthPassword" );
for ( Entry<String, String> nextLogin : logins.entrySet() ) {
String nextUser = nextLogin.getKey();
String nextPass = nextLogin.getValue();
Response authenticateUserResponse = Jsoup.connect( WEBSITE_I_NEED_TO_LOGIN_TO )
.data( "username", nextUser )
.data( "password", nextPass )
.execute();
Basically, here is what I want the flow to be:
read() --> obtain list --> send list to write() to write it to the database --> loop back around and get the next login --> read() --> obtain list --> send it to write() ... etc.
However, the issue I'm having is that my loop runs inside the read method and does not reach the write method until all the lists have been traversed for all the accounts. Essentially, write is only being called once at the end, so what I have right now is something like this (this is the flawed design):
read() --> obtain list --> next account --> obtain list --> next account --> obtain list --> write()
How can I organize the chunk processing in Spring Batch so that it writes after reading each chunk?
for (Entry<String, String> nextLogin : logins.entrySet()) {
    String nextUser = nextLogin.getKey();
    String nextPass = nextLogin.getValue();
    // do something
    ......
    // call write function
    writeValues(x, y, z);
}
Is this all you want?
Otherwise it seems like a traditional Spring Batch Read > Process > Proceed case.
You will have your reader, which gets a record;
the processor, which saves a record;
and Spring Batch moves you to the next record if there was no error.
<step id="processUpdates">
<tasklet task-executor="batchThreadPoolTaskExecutor" throttle-limit="${batch.cviscoreupdate.threadcount}">
<chunk reader="Reader" processor="ItemProcessor" writer="ItemWriter" commit-interval="${batch.commit.interval}" skip-limit="${batch.skip.limit}" >
<skippable-exception-classes>
<include class="batch.support.SkipRecordException" />
</skippable-exception-classes>
</chunk>
</tasklet>
<next on="FAILED" to="errorExit"/>
<next on="*" to="moveFilesFromWorkToDone" />
<listeners>
<listener ref="UpdateSkipListener"/>
</listeners>
</step>
<bean id="CVIScoreUpdateItemProcessor" class="com.batch.MyUpdateItemProcessor" scope="step" init-method="init" />
I have to migrate some data from one database to another and I'm using Spring Batch with partitioning. The configuration of the job is the following:
...
...
<bean id="migrationProcessor" class="it.migrazione.MigrazioneProcessor" scope="step"/>
<bean id="migrationWriter" class="it.migrazione.MigrazioneWriter" scope="step"/>
<bean id="migrationReader" class="it.migrazione.MigrazioneReader" scope="step"/>
<bean id="partitioner" class="it.migrazione.MigrazionePartitioner" />
<bean id="taskExecutor" class="org.springframework.core.task.SimpleAsyncTaskExecutor"/>
<bean id="threadPoolTaskExecutor" class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
<property name="corePoolSize" value="10" />
<property name="maxPoolSize" value="10" />
<property name="allowCoreThreadTimeOut" value="true" />
</bean>
<job id="migrationJob" xmlns="http://www.springframework.org/schema/batch">
<step id="masterStep">
<partition step="slave" partitioner="partitioner">
<handler grid-size="10" task-executor="threadPoolTaskExecutor" />
</partition>
</step>
</job>
<step id="slave" xmlns="http://www.springframework.org/schema/batch">
<tasklet throttle-limit="1" transaction-manager="transactionManager">
<chunk reader="migrationReader"
processor="migrationProcessor"
writer="migrationWriter"
commit-interval="1"/>
</tasklet>
</step>
<bean id="transactionManager" class="org.springframework.orm.jpa.JpaTransactionManager">
<property name="entityManagerFactory" ref="entityManagerFactory"/>
</bean>
...
...
The job knows how many rows must be migrated, so the partitioner creates 10 execution contexts, each covering a specific row range:
for (int threadCount = 1; threadCount <= gridSize; threadCount++) {
    if (threadCount == 1) {
        fromRow = 0;
    } else {
        fromRow = toRow + 1;
    }
    toRow += delta;

    context = new ExecutionContext();
    context.putInt("fromRow", fromRow);
    context.putInt("toRow", toRow);
    context.putString("name", "Processing Thread" + threadCount);
    result.put("partition" + threadCount, context);

    logger.info("Partition number >> " + threadCount + " from Row#: "
            + fromRow + " to Row#: " + toRow);
}
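Each step-scoped reader can then pick its own range out of the step execution context created above. A minimal sketch (this is an assumption about MigrazioneReader, not its actual code):

import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Value;

public class RangeBoundReader implements ItemReader<Object> {

    @Value("#{stepExecutionContext['fromRow']}")
    private int fromRow;

    @Value("#{stepExecutionContext['toRow']}")
    private int toRow;

    private int current = -1;

    @Override
    public Object read() {
        if (current < 0) {
            current = fromRow; // first call for this partition
        }
        if (current > toRow) {
            return null; // end of this partition's range: the chunk loop stops calling read()
        }
        return fetchRow(current++); // hypothetical single-row lookup
    }

    private Object fetchRow(int rowNumber) {
        return new Object(); // placeholder for the real database access
    }
}

The key contract is the null return: a chunk-oriented step keeps calling read() until it gets null, so the reader has to keep track of where it is in its range across calls.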
When I run this job, some threads read a second time. For example, Thread#1 calls the reader, processor and writer again. I don't understand why. Is it possible to have each thread execute the chunk only once, without it checking whether read has already been called? When the writer for a specific partition finishes, why does the thread call the reader again? It is as if the reader doesn't immediately see the changes made by the writer.
I've been stuck on this for the last couple of days, and after a lot of googling and trial and error I'm back where I started.
I'm currently working on a Java application that connects to a third party via JAX-WS. They provide a WSDL which we run through the jaxws-maven-plugin to generate the services. Implemented via Spring, HTTPConduit is then used to change the endpoints and provide relevant config (e.g. keystores) for connecting to various environments (e.g. SysTest, UAT, Production).
The issue is that I haven't set up any logging (in fact I removed the two interceptors that were there previously), yet the XML message being sent to the third party is appearing in the logs. This is a major issue, as we're sending credit card information to the third parties, which must not be logged for obvious reasons. I can change the log4j properties to prevent the logging that way, but that's not a real fix.
Here is some code:
This is our beans file.
<jaxws:client id="client1"
xmlns:hsn="http://example.com"
serviceClass="com.example.Service1"
address="${service1.url}"
endpointName="hsn:service1"/>
<jaxws:client id="client2"
xmlns:hsn="http://example.com"
serviceClass="com.example.Service2"
address="${service2.url}"
endpointName="hsn:service2"/>
<jaxws:client id="client3"
xmlns:hsn="http://example.com"
serviceClass="com.example.Service3"
address="${service3.url}"
endpointName="hsn:service3"/>
<http:conduit name="https://*/.*">
<http:tlsClientParameters disableCNCheck="${service.disable-cn-check}">
<sec:keyManagers keyPassword="${service.keystore.password}">
<sec:keyStore type="JKS" password="${service.keystore.password}"
resource="${service.keystore.name}"/>
</sec:keyManagers>
<sec:trustManagers>
<sec:keyStore type="JKS" password="${service.truststore.password}"
resource="${service.truststore.name}"/>
</sec:trustManagers>
<sec:cipherSuitesFilter>
<sec:include>.*_EXPORT_.*</sec:include>
<sec:include>.*_EXPORT1024_.*</sec:include>
<sec:include>.*_WITH_DES_.*</sec:include>
<sec:include>.*_WITH_AES_.*</sec:include>
<sec:include>.*_WITH_NULL_.*</sec:include>
<sec:exclude>.*_DH_anon_.*</sec:exclude>
</sec:cipherSuitesFilter>
</http:tlsClientParameters>
<http:client AutoRedirect="true" Connection="Keep-Alive"
ConnectionTimeout="${service.max-response-time}"
ReceiveTimeout="${service.max-response-time}"/>
</http:conduit>
<http:conduit name="http://*/.*">
<http:client AutoRedirect="true" Connection="Keep-Alive"
ConnectionTimeout="${service.max-response-time}"
ReceiveTimeout="${service.max-response-time}"/>
</http:conduit>
As you can see there are no logging interceptors or logging explicitly turned on using:
<cxf:bus>
<cxf:features>
<cxf:logging/>
</cxf:features>
</cxf:bus>
The only other related file I can think of is META-INF/cxf/org.apache.cxf.Logger which contains:
org.apache.cxf.common.logging.Slf4jLogger
Even with this file removed, nothing changes.
Just so you can see, here is a sample from the logs as well:
15:05:45.742 DEBUG | org.apache.cxf.phase.PhaseInterceptorChain - Invoking handleMessage on interceptor org.apache.cxf.interceptor.MessageSenderInterceptor$MessageSenderEndingInterceptor@5e62b59d
15:05:45.742 DEBUG | org.apache.cxf.transport.http.Headers - Accept: */*
15:05:45.743 DEBUG | org.apache.cxf.transport.http.Headers - Connection: Keep-Alive
15:05:45.743 DEBUG | org.apache.cxf.transport.http.Headers - SOAPAction: ""
15:05:45.744 DEBUG | org.apache.cxf.transport.http.HTTPConduit - No Trust Decider for Conduit '{http://example.com}service1.http-conduit'. An afirmative Trust Decision is assumed.
15:05:45.746 DEBUG | org.apache.cxf.transport.http.HTTPConduit - Sending POST Message with Headers to http://localhost:8080/stubs/Service1 Conduit :{http://example.com}service1.http-conduit
15:05:45.746 DEBUG | org.apache.cxf.transport.http.HTTPConduit - Conduit "{http://example.com}service1.http-conduit" Transmit cached message to: http://localhost:8080/stubs/Service1: <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"><soap:Body>********************HERE LIES THE XML MESSAGE*********************</soap:Body></soap:Envelope>
15:05:45.766 DEBUG | org.apache.cxf.endpoint.ClientImpl - Interceptors contributed by bus: [org.apache.cxf.ws.policy.PolicyInInterceptor@24ec87dc]
15:05:45.767 DEBUG | org.apache.cxf.endpoint.ClientImpl - Interceptors contributed by client: []
15:05:45.767 DEBUG | org.apache.cxf.endpoint.ClientImpl - Interceptors contributed by endpoint: [org.apache.cxf.jaxws.interceptors.WrapperClassInInterceptor@52d1f1fb, org.apache.cxf.jaxws.interceptors.HolderInInterceptor@5565c037, org.apache.cxf.jaxws.interceptors.SwAInInterceptor@b2e86ae, org.apache.cxf.frontend.WSDLGetInterceptor@1ca801a2]
15:05:45.768 DEBUG | org.apache.cxf.endpoint.ClientImpl - Interceptors contributed by binding: [org.apache.cxf.interceptor.AttachmentInInterceptor@1b8c0f3e, org.apache.cxf.interceptor.StaxInInterceptor@83cbd93, org.apache.cxf.binding.soap.interceptor.SoapActionInInterceptor@4bc2021e, org.apache.cxf.interceptor.DocLiteralInInterceptor@2e19266d, org.apache.cxf.binding.soap.interceptor.SoapHeaderInterceptor@7529d5bf, org.apache.cxf.binding.soap.interceptor.ReadHeadersInterceptor@d902ab1, org.apache.cxf.binding.soap.interceptor.StartBodyInterceptor@73e2d16b, org.apache.cxf.binding.soap.interceptor.CheckFaultInterceptor@3023033d, org.apache.cxf.binding.soap.interceptor.MustUnderstandInterceptor@4aa9b27b]
15:05:45.768 DEBUG | org.apache.cxf.endpoint.ClientImpl - Interceptors contributed by databinging: [org.apache.cxf.jaxb.attachment.JAXBAttachmentSchemaValidationHack@331fef77]
15:05:45.769 DEBUG | org.apache.cxf.phase.PhaseInterceptorChain - Chain org.apache.cxf.phase.PhaseInterceptorChain@273221e was created. Current flow:
receive [PolicyInInterceptor, AttachmentInInterceptor]
post-stream [StaxInInterceptor]
read [WSDLGetInterceptor, ReadHeadersInterceptor, SoapActionInInterceptor, StartBodyInterceptor]
pre-protocol [MustUnderstandInterceptor]
post-protocol [CheckFaultInterceptor, JAXBAttachmentSchemaValidationHack]
unmarshal [DocLiteralInInterceptor, SoapHeaderInterceptor]
post-logical [WrapperClassInInterceptor]
pre-invoke [SwAInInterceptor, HolderInInterceptor]
15:05:45.769 DEBUG | org.apache.cxf.phase.PhaseInterceptorChain - Invoking handleMessage on interceptor org.apache.cxf.ws.policy.PolicyInInterceptor@24ec87dc
A few months back I came across a similar problem, where I needed to mask a few fields of my XML.
The custom LoggingInInterceptor:
import org.apache.commons.lang3.StringUtils;
import org.apache.cxf.interceptor.LoggingInInterceptor;
import org.apache.cxf.interceptor.LoggingMessage;
public class KPLogInInterceptor extends LoggingInInterceptor {
@Override
protected String formatLoggingMessage(LoggingMessage loggingMessage) {
String str = loggingMessage.toString();
String output = maskPasswords(str);
//output = maskRestPasswords(output);
return(output);
}
private String maskPasswords(String str) {
// String str =
// "<password1>asdasdad</password1><Password3></Password3><Password5/><PassWord6>fdsfsf</PassWord6>";
final String[] keys = { "password", "authpass", "accountnumber", "authphrase" };
for (String key : keys) {
int beginIndex = 0;
int lastIndex = -1;
boolean emptyPass = false;
boolean multiline = false;
if(key.equals("authphrase") || key.equals("authpass"))
{
// when the value spans multiple elements, e.g. <name>authphrase</name><value>vals</value>
multiline = true;
}
while (beginIndex != -1
&& (beginIndex = StringUtils.indexOfIgnoreCase(str, key,
beginIndex)) > 0) {
if(multiline){
beginIndex = StringUtils.indexOfIgnoreCase(str, "value", beginIndex);
}
beginIndex = StringUtils.indexOf(str, ">", beginIndex);
if (beginIndex != -1) {
char ch = str.charAt(beginIndex - 1);
if (ch == '/') {
emptyPass = true;
}
if (!emptyPass) {
lastIndex = StringUtils.indexOf(str, "<", beginIndex);
if (lastIndex != -1) {
String overlay = "*";
String str2 = StringUtils.substring(str,
beginIndex + 1, lastIndex);
if (str2 != null && str2.length() > 1) {
overlay = StringUtils.rightPad(overlay,
str2.length(), "*");
str = StringUtils.overlay(str, overlay,
beginIndex + 1, lastIndex);
}
}
}
if (emptyPass) {
emptyPass = false;
lastIndex = beginIndex + 1;
} else {
if (lastIndex != -1) {
lastIndex = StringUtils
.indexOf(str, ">", lastIndex);
}
}
}
beginIndex = lastIndex;
}
}
return str;
}
}
And the CXF config XML:
<bean id="kpInInterceptor" class="com.kp.swasthik.KPLogInInterceptor"></bean>
<bean id="kpOutInterceptor" class="com.kp.swasthik.KPLogOutInterceptor"></bean>
<cxf:bus>
<cxf:inInterceptors>
<ref bean="kpInInterceptor" />
</cxf:inInterceptors>
<cxf:outInterceptors>
<ref bean="kpOutInterceptor" />
</cxf:outInterceptors>
<cxf:outFaultInterceptors>
<ref bean="kpOutInterceptor" />
</cxf:outFaultInterceptors>
<cxf:inFaultInterceptors>
<ref bean="kpInInterceptor" />
</cxf:inFaultInterceptors>
</cxf:bus>
You also need to create one more class that extends LoggingOutInterceptor.
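For example, a minimal sketch of that out-interceptor (the real KPLogOutInterceptor is not shown here; the masking regex is only an illustration and would need to cover the same fields as maskPasswords()):

import org.apache.cxf.interceptor.LoggingMessage;
import org.apache.cxf.interceptor.LoggingOutInterceptor;

public class KPLogOutInterceptor extends LoggingOutInterceptor {

    @Override
    protected String formatLoggingMessage(LoggingMessage loggingMessage) {
        // mask sensitive fields on the outbound side as well
        return maskSensitiveFields(loggingMessage.toString());
    }

    private String maskSensitiveFields(String str) {
        // placeholder: reuse the same masking routine as KPLogInInterceptor
        return str.replaceAll("(?i)(<password>)[^<]*(</password>)", "$1********$2");
    }
}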
EDIT
Create a class that sets the log level to INFO for HTTPConduit:
import org.apache.cxf.transport.http.HTTPConduit;
import org.apache.log4j.Level;
import org.apache.log4j.LogManager;

public class KPLogicSupresser {
    public void kpinit() {
        LogManager.getLogger(HTTPConduit.class).setLevel(Level.INFO);
    }
}
And create a bean for it in the CXF configuration file:
<bean id="kpLog4Jsupresser" class="com.kp.swasthik.KPLogicSupresser" init-method="kpinit" ></bean>
Just add a logback.xml file to your classpath with the logger level set to INFO; it will disable all DEBUG output from CXF.
Sample File
Filename: logback.xml
Location: src/main/resources (in my project it's resources; place the file accordingly in your project's classpath)
File Content:
<contextListener class="ch.qos.logback.classic.jul.LevelChangePropagator">
<resetJUL>true</resetJUL>
</contextListener>
<!-- To enable JMX Management -->
<jmxConfigurator/>
<appender name="console" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
</encoder>
</appender>
<logger name="com.mycompany.subpackage" level="INFO"/>
<root level="INFO">
<appender-ref ref="console"/>
</root>
I have two XML files that contain information about some classes. After parsing the XML I want to instantiate these classes via reflection.
I parse the XML with DOM and recursion. What I want to know is the most generic way to implement this, and the optimal way to transfer the information and build the GUI.
I really cannot think of anything other than many if...else statements, like this:
if (node.getNodeName() == "class") {
Class cls = Class.forName(node.getNodeValue());
}
but I do not think that this is the optimal way.
The DOM parser:
for (int count = 0; count < nodeList.getLength(); count++) {
    Node tempNode = nodeList.item(count);
    // make sure it's an element node
    if (tempNode.getNodeType() == Node.ELEMENT_NODE) {
        // get node name and value
        System.out.println("\nNode Name =" + tempNode.getNodeName() + " [OPEN]");
        System.out.println("Node Value =" + tempNode.getNodeValue());
        if (tempNode.hasAttributes()) {
            // get attribute names and values
            NamedNodeMap nodeMap = tempNode.getAttributes();
            for (int i = 0; i < nodeMap.getLength(); i++) {
                Node node = nodeMap.item(i);
                System.out.println("attr name : " + node.getNodeName());
                System.out.println("attr value : " + node.getNodeValue());
                if ("class".equals(node.getNodeName())) {
                    Class<?> cls = Class.forName(node.getNodeValue());
                }
            }
        }
        if (tempNode.hasChildNodes()) {
            // recurse into child nodes
            printNote(tempNode.getChildNodes());
        }
        System.out.println("Node Name =" + tempNode.getNodeName() + " [CLOSE]");
    }
}
The XML files look like this:
<ui-model>
<waui>
<abstract-container wauiId = '1'>
<abstract-button wauiId = '2'></abstract-button>
<abstract-button wauiId = '3'></abstract-button>
<abstract-button wauiId = '4'></abstract-button>
</abstract-container>
</waui>
<wrm>
<wr-item wauiId = '2'>
<abstract-properties>
<abstract-property name='text'>Button1</abstract-property>
</abstract-properties>
<polymorphic-properties>
<polymorphic-instance piId='swingRectButton'>
<polymorphic-property name='width'>100</polymorphic-property>
<polymorphic-property name='height'>50</polymorphic-property>
</polymorphic-instance>
<polymorphic-instance piId='swingRoundButton'>
<polymorphic-property name='radius'>80</polymorphic-property>
<polymorphic-property name='background-color'>red</polymorphic-property>
</polymorphic-instance>
</polymorphic-properties>
</wr-item>
<wr-item wauiId = '3'>
<abstract-properties>
<abstract-property name='text'>Button2</abstract-property>
</abstract-properties>
<polymorphic-properties>
<polymorphic-instance piId='swingRectButton'>
<polymorphic-property name='width'>200</polymorphic-property>
<polymorphic-property name='height'>60</polymorphic-property>
</polymorphic-instance>
</polymorphic-properties>
</wr-item>
<wr-item wauiId = '4'>
<abstract-properties>
<abstract-property name='text'>Button3</abstract-property>
</abstract-properties>
<polymorphic-properties>
<polymorphic-instance piId='swingRoundButton'>
<polymorphic-property name='radius'>9</polymorphic-property>
<polymorphic-property name='background-color'>blue</polymorphic-property>
</polymorphic-instance>
</polymorphic-properties>
</wr-item>
</wrm>
<widget name='abstract-button'>
<abstract-properties>
<property name='text' id='wsl_1'/>
</abstract-properties>
<polymorphic-instances>
<instance name='swingRectButton'>
<polymorphic-properties>
<property name='width' />
<property name='height' />
</polymorphic-properties>
</instance>
<instance name='swingRoundButton'>
<property name='radius' />
<property name='background-color' />
</instance>
</polymorphic-instances>
<polymorphic-instances-api>
<polymorphic-instance id='swingRectButton' class='javax.swing.JButton'>
<property name='text'>
<native-method>setText</native-method>
<param-type>String</param-type>
</property>
<property name='width'>
<native-method>setWidth</native-method>
<param-type>Integer</param-type>
</property>
<property name='height'>
<native-method>setHeight</native-method>
<param-type>Integer</param-type>
</property>
</polymorphic-instance>
<polymorphic-instance id='swingRoundButton' class='gr.epp.aswing.RoundButton'>
<property name='text'>
<native-method>setLabel</native-method>
<param-type>String</param-type>
</property>
<property name='radius'>
<native-method>setRadius</native-method>
<param-type>Integer</param-type>
</property>
<property name='background-color'>
<native-method>setBackgroundColor</native-method>
<param-type>String</param-type>
</property>
</polymorphic-instance>
</polymorphic-instances-api>
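A minimal sketch (assuming the class, native-method and param-type values have already been parsed out of the polymorphic-instances-api section above) of how those triples could drive instantiation and property setting through reflection instead of hard-coded if...else branches:

import java.lang.reflect.Method;

public class ReflectiveWidgetBuilder {

    public Object buildAndSet(String className, String methodName, String paramType, String rawValue)
            throws Exception {
        Class<?> cls = Class.forName(className);            // e.g. javax.swing.JButton
        Object widget = cls.getDeclaredConstructor().newInstance();

        // the XML only declares String and Integer parameter types
        Class<?> argType = "Integer".equals(paramType) ? int.class : String.class;
        Object argValue = "Integer".equals(paramType) ? Integer.valueOf(rawValue) : rawValue;

        Method setter = cls.getMethod(methodName, argType); // e.g. setText(String)
        setter.invoke(widget, argValue);
        return widget;
    }
}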
I thought about writing this as a comment to your question, but after more thought, I think it is an appropriate answer.
Avoid premature optimization.
If you've already written code that works and you're running into a specific problem then explain that problem. But you should not try to optimize your code unless there is an identifiable problem with it.
See http://c2.com/cgi/wiki?PrematureOptimization