How to add a new field to a group in QuickFIX? - java

I'm using QuickFIX to connect to a FIX engine and receive data. But the incoming market data is being rejected by my app, which states that a tag appears more than once.
20160624-12:44:36.770 : 8=FIX.4.49=21835=W34=2649=CfhDemoPrices52=20160624-12:44:37.79356=PrimoDEMOFIX55=GBPUSD262=PrimoApp13268=2269=0270=1.37203271=1500000290=164=20160628278=30/34-920771269=1270=1.37228271=1500000290=1278=30/34-92077610=038
20160624-12:44:36.798 : 8=FIX.4.49=12635=334=2749=PrimoDEMOFIX52=20160624-12:44:36.79456=CfhDemoPrices45=2658=Tag appears more than once371=278372=W373=1310=139
After a lot of analysis, we found that tag 278 (MDEntryID) is not included in the NoMDEntries group in the standard FIX44 dictionary. I want to include the field in that group in my QuickFIX data dictionary and rebuild it. Any idea how to do that? Or please let me know your suggestions for solving this problem.

You've modified your DD (data dictionary) incorrectly, because repeating groups don't work the way you've assumed.
This is the standard FIX44 DD definition for your message. I've added some comments to indicate tag numbers.
<message name="MarketDataSnapshotFullRefresh" msgtype="W" msgcat="app">
  <field name="MDReqID" required="N" />
  <component name="Instrument" required="Y" />
  <group name="NoUnderlyings" required="N">
    <component name="UnderlyingInstrument" required="N" />
  </group>
  <group name="NoLegs" required="N">
    <component name="InstrumentLeg" required="N" />
  </group>
  <field name="FinancialStatus" required="N" />
  <field name="CorporateAction" required="N" />
  <field name="NetChgPrevDay" required="N" />
  <group name="NoMDEntries" required="Y"> <!-- 268 -->
    <field name="MDEntryType" required="Y" /> <!-- 269 -->
    <field name="MDEntryPx" required="N" /> <!-- 270 -->
    <field name="Currency" required="N" />
    <field name="MDEntrySize" required="N" /> <!-- 271 -->
    <field name="MDEntryDate" required="N" />
    ... and so on ...
Fields inside repeating groups must appear in the prescribed order. When QF processes a group and encounters an unexpected field, it assumes the group is over.
Your DD does not match the order your sender is sending, so your engine is misbehaving.
Your sender is sending fields in this order:
268-> (group 1) 269 270 271 290 64 278
(group 2) 269 270 271 290 278
(The above is directly from your rejected message.)
Your DD, however, is expecting 269 278 271 270. As soon as it hits 278, it ends the group and weird things start happening.
Revert your DD back to the default, then add 64/SettlDate and 278/MDEntryID to the end of the NoMDEntries component. Given the evidence you've provided, it's clear that your counterparty has added those fields to the end of the group.
Surely these are not the only modifications that your counterparty has made to the DD. GET THEIR DOCUMENTATION AND READ IT. Then modify your DD accordingly.
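As a sketch only (field names assumed from the standard FIX44 dictionary; the omitted middle fields must keep their standard order), the modified group would look something like this, with the counterparty's custom fields appended at the end:

```xml
<group name="NoMDEntries" required="Y"> <!-- 268 -->
  <field name="MDEntryType" required="Y" />  <!-- 269 -->
  <field name="MDEntryPx" required="N" />    <!-- 270 -->
  <field name="MDEntrySize" required="N" />  <!-- 271 -->
  <!-- ... the rest of the standard group fields, unchanged and in order ... -->
  <!-- counterparty additions, appended at the end of the group: -->
  <field name="SettlDate" required="N" />    <!-- 64 -->
  <field name="MDEntryID" required="N" />    <!-- 278 -->
</group>
```

After editing the XML, point your session config at the modified file and restart; whether you also need to regenerate message classes depends on how your build consumes the DD.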

Related

Invalid message length Error in jpos when parsing the response from server

I am sending requests to an ISO server.
Sometimes I get success and sometimes I get a parse error.
Here is my client's parse-error jPOS log:
<send>
<isomsg direction="outgoing">
<field id="0" value="0200"/>
<field id="2" value="9841414141"/>
<field id="3" value="401010"/>
<field id="4" value="200"/>
<field id="7" value="0812104837"/>
<field id="11" value="002356"/>
<field id="12" value="104837"/>
<field id="13" value="0812"/>
<field id="14" value="0000"/>
<field id="15" value="0812"/>
<field id="18" value="6011"/>
<field id="22" value="901"/>
<field id="25" value="00"/>
...........
............
</isomsg>
</send>
<warn>
channel-receiver-RBB_OUT
<iso-exception>
Invalid message length
03
org.jpos.iso.ISOException: Invalid message length
03
at org.jpos.iso.channel.ASCIIChannel.getMessageLength(ASCIIChannel.java:118)
at org.jpos.iso.BaseChannel.receive(BaseChannel.java:704)
at org.jpos.q2.iso.ChannelAdaptor$Receiver.run(ChannelAdaptor.java:318)
at java.base/java.lang.Thread.run(Thread.java:834)
</iso-exception>
</warn>
The request succeeds at the server end, and it looks like the server is sending a response back as well.
I think the issue is in how jPOS parses the response from the server.
Here is the response log from the server:
03410210F23A40010E8384000000000006020030109841414141401010000000000200081210483700235610385008120812601108............................................................
.............................
I think the issue is with the 0341 message length: the server is sending 0341, but jPOS is unable to parse it.
When I send a request to the server I get a success response about 50% of the time without changing any jPOS config, so I think my config is OK.
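For context, ASCIIChannel expects every message to be preceded by a 4-character ASCII decimal length header. A simplified sketch of that length read (assumed behavior, not jPOS's actual code) shows why a desynchronized stream produces exactly this exception: if the four bytes read are not all digits — for example "03" followed by leftover bytes from a previous message — the length is rejected:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class AsciiLengthDemo {
    // Reads a 4-character ASCII decimal length header, the framing
    // ASCIIChannel uses. Throws if any of the four bytes is not a
    // digit -- the symptom seen when the receiver is out of sync
    // with the message boundaries on the wire.
    static int readLength(InputStream in) throws IOException {
        byte[] b = new byte[4];
        if (in.read(b) != 4) throw new IOException("Short read");
        int len = 0;
        for (byte c : b) {
            if (c < '0' || c > '9')
                throw new IOException("Invalid message length");
            len = len * 10 + (c - '0');
        }
        return len;
    }

    public static void main(String[] args) throws IOException {
        // In-sync stream: the header "0341" parses to length 341.
        System.out.println(readLength(new ByteArrayInputStream("0341".getBytes())));
        // Out-of-sync stream: "03" followed by non-digit bytes fails,
        // matching the "Invalid message length 03" in the log above.
        try {
            readLength(new ByteArrayInputStream("03\u0001\u0002".getBytes()));
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

The intermittent 50% failure pattern is consistent with framing desync (e.g. the server occasionally sending extra bytes, or a length/charset mismatch between packager and channel) rather than a wholly wrong config.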
Here is my Channel config:
<channel-adaptor name='abcca' class="org.jpos.q2.iso.ChannelAdaptor" logger="Q2">
  <channel class="org.jpos.iso.channel.ASCIIChannel" logger="Q2" packager="org.jpos.iso.packager.GenericPackager">
    <property name="host" value="176.17.7.30" />
    <property name="port" value="9002" />
    <property name="packager-config" value="cfg/packager_abc.xml" />
    <property name="timeout" value="300000" />
    <property name="keep-alive" value="true" />
  </channel>
  <in>ABC_IN</in>
  <out>ABC_OUT</out>
  <reconnect-delay>10000</reconnect-delay>
</channel-adaptor>

Received different FIX message in fromApp method and message file in FileStorePath

I am trying to capture MarketDataIncrementalRefresh messages.
Each has a NoMDEntries group, which I am trying to iterate over.
I have two problems:
1st - I printed the message that arrives in the fromApp method, and it differs from what I get in the file created through FileStorePath.
How is this possible? I then crack it through the message cracker, but even after cracking it is the same message as received in fromApp.
Message from fromApp:
8=FIX.4.29=7535=X34=3649=XXXXXXXX52=2019011605:09:51.00056=XXXXXXXX262=1268=110=223
Message received in file :
8=FIX.4.29=0019435=X49=XXXXXXXX56=XXXXXXXX34=3652=20190116-05:09:51.000262=1268=1279=1269=055=ES167=FUT200=201903541=20190315205=1518211=M207=CME100=XCME461=F15=USD270=249375271=20290=110=123
2nd - While iterating through the group I get this error:
quickfix.FieldNotFound: 268, index=1
at quickfix.FieldMap.getGroup(FieldMap.java:633)
but as you can see, field 268 is set; I also verified it via message.isSetField(268), which returns true. Just after that I tried:
MarketDataIncrementalRefresh.NoMDEntries mdEntriesGroup = new MarketDataIncrementalRefresh.NoMDEntries();
message.getGroup(1, mdEntriesGroup);
and it gives the above error.
My FIX42.xml looks like this; it has the same order of fields as in the message received:
<message name='MarketDataIncrementalRefresh' msgtype='X' msgcat='app'>
  <field name='MDReqID' required='Y' />
  <field name='PriceFeedStatus' required='N' />
  <group name='NoMDEntries' required='Y'>
    <field name='MDUpdateAction' required='Y' />
    <field name='MDEntryType' required='Y' />
    <field name='Symbol' required='N' />
    <field name='SecurityType' required='N' />
    <field name='SecuritySubType' required='N' />
    <field name='MaturityMonthYear' required='N' />
    <field name='MaturityDate' required='N' />
    <field name='MaturityDay' required='N' />
    <field name='PutOrCall' required='N' />
    <field name='StrikePrice' required='N' />
    <field name='OptAttribute' required='N' />
    <field name='DeliveryTerm' required='N' />
    <field name='DeliveryDate' required='N' />
    <field name='SecurityID' required='N' />
    <field name='SecurityExchange' required='N' />
    <field name='ExDestination' required='N' />
    <field name='CFICode' required='N' />
    <field name='Currency' required='N' />
    <field name='MDEntryPx' required='N' />
    <field name='MDEntrySize' required='N' />
    <field name='MDEntryDate' required='N' />
    <field name='MDEntryTime' required='N' />
    <field name='MDEntryPositionNo' required='N' />
    <field name='SecondaryOrderID' required='N' />
    <field name='NumberOfOrders' required='N' />
  </group>
  <field name='ExchangeSendingTime' required='N' />
  <field name='ExchangeTransactTime' required='N' />
  <field name='ExchangeSeqNum' required='N' />
</message>
Always trust the message log over fromApp() and fromAdmin(). The log is recorded before the engine attempts to parse the message; the callbacks happen after. If an error occurs during parsing (as in your case), what you see in the callback will be wrong, which is what you are seeing.
I've seen your issue before. What is probably happening is that your configuration has a mistake, or your configured DD xml file does not accurately match your counterparty's specification.
First, your config file should have these lines:
UseDataDictionary=Y
# for FIX4
DataDictionary=path/to/your/dd.xml
# for FIX5+
AppDataDictionary=path/to/your/FIX5whatever.xml
TransportDataDictionary=path/to/your/FIXT1.1.xml
Second, check your message against the DD file that's in your config. Probably something in the repeating group definition is not correct. For instance, a field could be missing from the group definition and the parser is exiting the group early when it hits it. Make sure your config points to the correct xml file.
(If you fix your message paste above to include visible field separators, I'll come back and help take a look.)
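To see why a missing field ends a repeating group early, here is a simplified illustration of the delimiter logic (assumed behavior for illustration, not QuickFIX/J's actual implementation): the parser starts a new entry at the group's first declared field and stops at the first tag outside the declared set, so a tag your DD doesn't list truncates the group:

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class GroupScanDemo {
    // Counts repeating-group entries in a flat tag sequence: a new entry
    // starts at the delimiter tag (the group's first declared field), and
    // the first tag outside the declared set ends the group -- so a tag
    // missing from the DD truncates the count.
    static int countEntries(List<Integer> tags, int delimiter, Set<Integer> groupTags) {
        int count = 0;
        for (int tag : tags) {
            if (tag == delimiter) count++;            // start of a new entry
            else if (!groupTags.contains(tag)) break; // unknown tag: group is over
        }
        return count;
    }

    public static void main(String[] args) {
        Set<Integer> declared = new HashSet<>(Arrays.asList(279, 269, 55, 270, 271, 290));
        // All tags declared: both entries are seen.
        System.out.println(countEntries(
                Arrays.asList(279, 269, 270, 271, 279, 269, 270), 279, declared));
        // Tag 200 missing from the DD: scanning stops inside entry 1,
        // and getGroup(2, ...) would then throw FieldNotFound.
        System.out.println(countEntries(
                Arrays.asList(279, 269, 200, 271, 279, 269, 270), 279, declared));
    }
}
```

In your case the file log shows fields like 200 (MaturityMonthYear) and 461 (CFICode) inside the group; if any one of them is absent from (or misplaced in) your configured DD, the parse stops there and the callback sees a mangled message.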

Key-value mapping in BeanIO fixed-length record

I have the following specification for a fixed-length data file (refer to record-C type of specification, page 4)
a second part, having a length of 1,800 characters, consisting of a table of 75 elements to be used for the display of the only data present in the communication; each of these elements is constituted by a field-code
of 8 characters and by a field-value of 16 characters
It means that the first 89 characters (omitted in the above summary) are plain old fixed-length fields, and then, for the remaining 1800, I have to read them as groups of key-value pairs, each up to 24 characters. Blank spaces are trimmed and empty pairs are not considered in the process.
Ideally, my bean may be constructed like
public class RecordC{
private List<Pair<String, String>> table = new ArrayList<>(MAX_TABLE_SIZE); //I don't want to use Map **yet**
}
The pair type can be, e.g., Apache Commons' Pair<String,String> or anything suitable for KVP mapping.
I understand that I could write a whole TypeHandler that takes the full 1800 bytes, but I wanted to exploit the power of BeanIO.
Here is what I have done so far:
<record name="RECORD_C" class="it.csttech.ftt.data.beans.ftt2017.RecordC" order="3" minOccurs="1" maxOccurs="1" maxLength="2000">
  <field name="tipoRecord" rid="true" at="0" ignore="true" required="true" length="1" lazy="true" literal="C" />
  <field name="cfContribuente" at="1" length="16" align="left" trim="true" lazy="true" />
  <field name="progressivoModulo" at="17" length="8" padding="0" align="right" trim="true" lazy="true" />
  <field name="spazioDisposizioneUtente" at="25" length="3" align="left" trim="true" lazy="true" />
  <field name="spazioUtente" at="53" length="20" align="left" trim="true" lazy="true" />
  <field name="cfProduttoreSoftware" at="73" length="16" align="left" trim="true" lazy="true" />
  <segment name="table" collection="list" lazy="true" class="org.apache.commons.lang3.tuple.ImmutablePair">
    <field name="key" type="java.lang.String" at="0" length="8" trim="true" lazy="true" setter="#1" />
    <field name="value" type="java.lang.String" at="8" length="16" trim="true" lazy="true" setter="#2" />
  </segment>
  <field name="terminatorA" at="1897" length="1" rid="true" literal="A" ignore="true" />
</record>
Unfortunately, this does not work in testing. I get only a single element in the list, decoded at positions [0-7] and [8-23] instead of the expected [89-113][114-???][....][....]
Question is: how do I declare repeating fixed-length fields in BeanIO?
I have since resolved my unmarshalling problem by removing all at attributes from the RecordC specification. As I found out, the at attribute is absolute to the record, not relative to the repeating segment. However, this forced me to add some ignored filler fields to the unmarshalling, at the sole cost of a few ignores.
I will test the marshalling against the official validator once I have data.
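For reference, the shape that follows from the fix above can be sketched like this (attribute values are illustrative, not verified against the spec; check the BeanIO 2.x reference for the exact occurs semantics of your version). The key points are: no at attributes, sequential fields with ignored fillers covering the gaps, and a repeating segment bounded by maxOccurs:

```xml
<record name="RECORD_C" class="it.csttech.ftt.data.beans.ftt2017.RecordC" order="3" minOccurs="1" maxOccurs="1">
  <field name="tipoRecord" rid="true" ignore="true" required="true" length="1" literal="C" />
  <field name="cfContribuente" length="16" align="left" trim="true" lazy="true" />
  <!-- ... remaining header fields in sequence, with ignored filler
       fields of the right length wherever the spec leaves gaps ... -->
  <segment name="table" collection="list" lazy="true" minOccurs="0" maxOccurs="75"
           class="org.apache.commons.lang3.tuple.ImmutablePair">
    <field name="key" type="java.lang.String" length="8" trim="true" lazy="true" setter="#1" />
    <field name="value" type="java.lang.String" length="16" trim="true" lazy="true" setter="#2" />
  </segment>
  <field name="terminatorA" length="1" rid="true" literal="A" ignore="true" />
</record>
```

With lazy="true" on the segment, all-blank 24-character slots should unmarshal to no element rather than an empty pair, which matches the "empty pairs are not considered" requirement.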

Mapping both xml element and its attribute using BeanIO

I would like to map the totalAmt tag in the XML file below: both its value, 100, and its attribute, Ccy.
<?xml version="1.0" encoding="UTF-8"?>
<transaction>
  <id>
    <eId>transactionId001</eId>
  </id>
  <amount>
    <totalAmt Ccy="XXX">100</totalAmt>
  </amount>
</transaction>
By reading BeanIO reference guide and posts here I got the impression that only one of them can be mapped.
So my question is: Can BeanIO handle this tag and could you show me how?
What I have tried, which didn't work:
<segment name="amount">
  <field name="totalAmount" xmlName="totalAmt"></field>
  <field name="currency" xmlName="Ccy" xmlType="attribute"></field>
</segment>
Close, but you still need to add a segment element inside the amount segment to tell BeanIO which element the attribute belongs to.
Example:
<segment name="amount">
  <field name="totalAmount" xmlName="totalAmt"></field>
  <segment name="totalAmt">
    <field name="type" xmlName="Ccy" xmlType="attribute"></field>
  </segment>
</segment>
I am using BeanIO version 2.1. The mapping:
<segment name="totalAmt">
  <field name="totalAmount" xmlType="text"></field> <!-- the bean variable "totalAmount" will hold, say, 100 -->
  <field name="Cctype" xmlName="Ccy" xmlType="attribute" default="XXX"></field> <!-- either set the default value XXX, or it is taken from the Cctype variable -->
</segment>

How can I create a blob index field correctly with Solr 5?

I think the title of my question explains much of what I need. I am using the Data Import Handler of Apache Solr 5. I configured my solrconfig.xml, schema.xml and data-config.xml, and it's working so far.
However, I need to add one more field: an Oracle BLOB column. First, let me show my configuration:
data-config.xml
<dataConfig>
  <!-- Datasource -->
  <dataSource name="myDS"
              setReadOnly="true"
              driver="oracle.jdbc.OracleDriver"
              url="jdbc:oracle:thin:@//server.example.com:1521/service_name"
              user="user"
              password="pass"/>
  <document name="products">
    <entity name="product"
            dataSource="myDS"
            query="select * from products"
            pk="id"
            processor="SqlEntityProcessor">
      <field column="id" name="id" />
      <field column="name" name="name" />
      <field column="price" name="price" />
      <field column="store" name="store" />
      <!-- I've added this blob field -->
      <field column="picture" name="picture" />
    </entity>
  </document>
</dataConfig>
solrconfig.xml
<requestHandler name="/products" class="org.apache.solr.handler.dataimport.DataImportHandler">
  <lst name="defaults">
    <str name="config">data-config.xml</str>
  </lst>
</requestHandler>
<!-- JDBCs -->
<lib dir="../../../lib" />
My fields in schema.xml
<field name="id" type="string" indexed="true" stored="true" required="true" multiValued="false" />
<field name="_version_" type="long" indexed="true" stored="true"/>
<field name="_text" type="string" indexed="true" stored="false" multiValued="true"/>
<field name="name" type="string" indexed="true" stored="true"/>
<field name="price" type="float" indexed="true" stored="true"/>
<!-- BLOB field -->
<field name="picture" type="binary" indexed="true" stored="true"/>
<copyField source="*" dest="_text"/>
<!-- ommited solr default fields -->
Now, when I start a full import, Solr only indexes some of the records. This is the output after Solr finishes the import:
Indexing completed. Added/Updated: 64 documents. Deleted 0 documents. (Duration: 04s)
Requests: 1 (0/s), Fetched: 1369 (342/s), Skipped: 0, Processed: 64 (16/s)
Started: less than a minute ago
As you can see, I have 1369 records, but Solr only indexes 64 documents. If I remove the field picture from the schema, or set its indexed and stored attributes to false, Solr imports all the documents.
I opened the Solr log and found this error when importing the blob field:
3436212 [Thread-19] WARN org.apache.solr.handler.dataimport.SolrWriter – Error creating document : SolrInputDocument(fields: [name=PRODUCTNAME, price=PRICE, store=STORE, picture=oracle.sql.BLOB@4130607a, _version_=1497915495941144576])
org.apache.solr.common.SolrException: ERROR: [doc=<ID>] Error adding field 'picture'='oracle.sql.BLOB@4130607a' msg=Illegal character .
at org.apache.solr.update.DocumentBuilder.toDocument(DocumentBuilder.java:176)
at org.apache.solr.update.AddUpdateCommand.getLuceneDocument(AddUpdateCommand.java:78)
at org.apache.solr.update.DirectUpdateHandler2.addDoc0(DirectUpdateHandler2.java:240)
at org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:166)
at org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:69)
at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:51)
at org.apache.solr.update.processor.DistributedUpdateProcessor.doLocalAdd(DistributedUpdateProcessor.java:931)
at org.apache.solr.update.processor.DistributedUpdateProcessor.versionAdd(DistributedUpdateProcessor.java:1085)
at org.apache.solr.update.processor.DistributedUpdateProcessor.processAdd(DistributedUpdateProcessor.java:697)
at org.apache.solr.update.processor.LogUpdateProcessor.processAdd(LogUpdateProcessorFactory.java:104)
at org.apache.solr.handler.dataimport.SolrWriter.upload(SolrWriter.java:71)
at org.apache.solr.handler.dataimport.DataImportHandler$1.upload(DataImportHandler.java:263)
at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:511)
at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:415)
at org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:330)
at org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:232)
at org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:416)
at org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:480)
at org.apache.solr.handler.dataimport.DataImporter$1.run(DataImporter.java:461)
Caused by: java.lang.IllegalArgumentException: Illegal character .
at org.apache.solr.common.util.Base64.base64toInt(Base64.java:150)
at org.apache.solr.common.util.Base64.base64ToByteArray(Base64.java:117)
at org.apache.solr.schema.BinaryField.createField(BinaryField.java:89)
at org.apache.solr.schema.FieldType.createFields(FieldType.java:305)
at org.apache.solr.update.DocumentBuilder.addField(DocumentBuilder.java:48)
at org.apache.solr.update.DocumentBuilder.toDocument(DocumentBuilder.java:123)
... 18 more
I checked by querying directly against the database, and it works fine. I am using Solr 5, ojdbc7 and Java 8. How can I use the binary field correctly in Solr?
Update
I've changed the properties of picture in schema.xml, setting indexed="false":
<!-- BLOB field -->
<field name="picture" type="binary" indexed="false" stored="true"/>
Then I restarted Solr, reloaded my core, and ran a full import again. No success, and the same exception. The same 64 documents described above were imported, and the field picture does not appear in the JSON response. The query I execute is:
/select?q=*%3A*&wt=json&indent=true
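The stack trace explains the failure: BinaryField calls Base64.base64ToByteArray on the incoming value, but DIH handed it the string "oracle.sql.BLOB@4130607a" (the toString() of the JDBC object), which contains characters illegal in base64. So whatever route you take (a DIH transformer, or converting the BLOB yourself before indexing), the value reaching the picture field must be base64 text, not the raw JDBC object. As a minimal sketch of that encoding step in plain Java (java.util.Base64 is available on the Java 8 you're running; reading the BLOB's bytes from JDBC is elided):

```java
import java.util.Base64;

public class BlobBase64Demo {
    // Encodes raw BLOB bytes as base64 text, the representation Solr's
    // BinaryField expects to decode on the indexing side.
    static String toBase64(byte[] blobBytes) {
        return Base64.getEncoder().encodeToString(blobBytes);
    }

    public static void main(String[] args) {
        // Stand-in for bytes read from the BLOB (here, a PNG magic prefix).
        byte[] fakeBlob = {(byte) 0x89, 'P', 'N', 'G'};
        System.out.println(toBase64(fakeBlob)); // prints iVBORw==
    }
}
```

Also worth checking against the DIH documentation for your Solr version (this is an assumption to verify, not something confirmed by your logs): the dataSource attribute convertType="true", which asks DIH to coerce JDBC types toward the target field type instead of passing the driver object through.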
