I am currently having issues displaying data (i.e. itemSummaries with a certain ContainerType) for ISP sites, for example www.telstra.com.au from Australia.
Logging into the site works fine and the credentials for the site work (in other words, it does a refresh which succeeds); however, there doesn't appear to be a way to display the itemSummary data.
The SOAP command getItemSummaries doesn't display data for the item (it displays item data from financial institutions fine). Examining the sample code provided by Yodlee for the Java SOAP API, you are meant to use getItemSummaries1 together with a SummaryRequest that sets the ContainerTypes.
The problem is that getItemSummaries1 returns a CoreExceptionFaultMessage. Trying different combinations of ContainerTypes (i.e. ISP, Telephone, Bills) didn't alleviate the issue.
The same error message is returned by Yodlee's own sample code, i.e. java_soap_example (run the com.yodlee.sampleapps.accountsummary.DisplayBillsData main method and provide the Yodlee login info as command-line arguments).
For reference, the code provided by the Yodlee sample app is below.
Running the getItemSummaries1 command
public void displayBillsData (UserContext userContext)
{
/*SummaryRequest sr = new SummaryRequest(
new String[] {ContainerTypes.BILL, ContainerTypes.TELEPHONE},
new DataExtent[] { DataExtent.getDataExtentForAllLevels(),DataExtent.getDataExtentForAllLevels() }
);*/
SummaryRequest sr = new SummaryRequest();
List list = new List();
list.setElements(new String[] {ContainerTypesHelper.BILL, ContainerTypesHelper.TELEPHONE});
sr.setContainerCriteria(list);
Object[] itemSummaries = null;
List itemSummariesList = null;
try {
itemSummariesList = dataService.getItemSummaries1(userContext, sr);
if (itemSummariesList != null){
itemSummaries = itemSummariesList.getElements();
}
} catch (StaleConversationCredentialsExceptionFault e) {
e.printStackTrace();
} catch (InvalidConversationCredentialsExceptionFault e) {
e.printStackTrace();
} catch (CoreExceptionFault e) {
e.printStackTrace();
} catch (IllegalArgumentTypeExceptionFault e) {
e.printStackTrace();
} catch (IllegalArgumentValueExceptionFault e) {
e.printStackTrace();
} catch (InvalidUserContextExceptionFault e) {
e.printStackTrace();
} catch (IllegalDataExtentExceptionFault e) {
e.printStackTrace();
} catch (RemoteException e) {
e.printStackTrace();
}
if (itemSummaries == null || itemSummaries.length == 0) {
System.out.println ("No bills data available");
return;
}
for (int i = 0; i < itemSummaries.length; i++) {
ItemSummary is = (ItemSummary) itemSummaries[i];
displayBillsDataForItem(is);
// Dump the BillsData Object
// dumpBillsDataForItem(is);
}
}
Printing the item data
public void displayBillsDataForItem (ItemSummary is)
{
String containerType = is.getContentServiceInfo ().
getContainerInfo ().getContainerName ();
System.out.println("containerType = " + containerType );
if (!(containerType.equals(ContainerTypesHelper.BILL ) || containerType.equals(ContainerTypesHelper.TELEPHONE)
|| containerType.equals(ContainerTypesHelper.MINUTES))) {
throw new RuntimeException ("displayBillsDataForItem called with " +
"invalid container type: " + containerType);
}
DisplayItemInfo displayItemInfo = new DisplayItemInfo ();
System.out.println("DisplayItemInfo:");
displayItemInfo.displayItemSummaryInfo (is);
System.out.println("");
ItemData id = is.getItemData();
if(id == null){
System.out.println("ItemData == null");
}else{
List accountsList = id.getAccounts();
Object[] accounts = null;
if (accountsList != null){
accounts = accountsList.getElements();
}
if (accounts == null || accounts.length == 0) {
System.out.println ("\tNo accounts");
}else {
for (int accts = 0; accts < accounts.length; accts++) {
BillsData billsData = (BillsData) accounts[accts];
System.out.println("\tAccount Holder: " + billsData.getAccountHolder() );
System.out.println("\tAccount Id: " + billsData.getAccountId());
System.out.println("\tItemAccountId: " + billsData.getItemAccountId() );
System.out.println("\tAccountName: " + billsData.getAccountName() );
System.out.println("\tAccountNumber: " + billsData.getAccountNumber() );
System.out.println("");
// Get List of Bill Objects
List billsList = billsData.getBills();
Object[] bills = null;
if (billsList != null){
bills = billsList.getElements();
}
if (bills == null || bills.length == 0) {
System.out.println ("\t\tNo Bill objects");
}else {
for (int b = 0; b < bills.length; b++) {
Bill bill = (Bill) bills[b];
System.out.println("\t\tBill Account Number: " + bill.getAccountNumber() );
System.out.println("\t\tBill Acct Type: " + bill.getAcctType() );
System.out.println("\t\tBill Due Date: " + Formatter.formatDate(bill.getDueDate().getDate(), Formatter.DATE_SHORT_FORMAT) );
System.out.println("\t\tBill Date: " + Formatter.formatDate(bill.getBillDate().getDate(), Formatter.DATE_SHORT_FORMAT) );
System.out.println("\t\tBill Past Due: "
+ (bill.getPastDue() != null ? bill
.getPastDue().getAmount() : 0.0));
System.out
.println("\t\tBill Last payment: "
+ (bill.getLastPayment() != null ? bill
.getLastPayment()
.getAmount()
: 0.0));
System.out.println("\t\tBill Amount Due: "
+ (bill.getAmountDue() != null ? bill
.getAmountDue().getAmount() : 0.0));
System.out
.println("\t\tBill Min Payment: "
+ (bill.getMinPayment() != null ? bill
.getMinPayment()
.getAmount()
: 0.0));
System.out.println("");
// Get List of AccountUsageData
List acctUsageDataList = bill.getAccountUsages();
Object[] acctUsageData = null;
if (acctUsageDataList != null){
acctUsageData = acctUsageDataList.getElements();
}
if (acctUsageData == null || acctUsageData.length == 0) {
System.out.println ("\t\t\tNo AccountUsageData objects");
}else {
for (int usage = 0; usage < acctUsageData.length; usage++) {
AccountUsageData aud = (AccountUsageData) acctUsageData[usage];
System.out.println("\t\t\tAccount Usage Bill ID: " + aud.getBillId() );
System.out.println("\t\t\tAccount Usage Units Used: " + aud.getUnitsUsed() );
}
}
}
}
System.out.println("");
// Get List of AccountUsageData
List acctUsageDataList = billsData.getAccountUsages();
Object[] acctUsageData = null;
if (acctUsageDataList != null){
acctUsageData = acctUsageDataList.getElements();
}
if (acctUsageData == null || acctUsageData.length == 0) {
System.out.println ("\t\tNo AccountUsageData objects");
}else {
for (int usageData = 0; usageData < acctUsageData.length; usageData++) {
AccountUsageData aud = (AccountUsageData) acctUsageData[usageData];
System.out.println("\t\tAccount Usage Bill ID: " + aud.getBillId() );
System.out.println("\t\tAccount Usage Units Used: " + aud.getUnitsUsed() );
}
}
}
}
}
}
EDIT2:
I have updated the getItemSummaries1 command to look like this
ContainerCriteria bills = new ContainerCriteria();
ContainerCriteria telephone = new ContainerCriteria();
ContainerCriteria isp = new ContainerCriteria();
ContainerCriteria utilities = new ContainerCriteria();
bills.setContainerType(ContainerTypesHelper.BILL);
telephone.setContainerType(ContainerTypesHelper.TELEPHONE);
isp.setContainerType(ContainerTypesHelper.ISP);
utilities.setContainerType(ContainerTypesHelper.UTILITIES);
Object[] containerList = {
bills,telephone,isp,utilities
};
SummaryRequest sr = new SummaryRequest();
List list = new List();
list.setElements(containerList);
sr.setContainerCriteria(list);
The command now executes and works correctly; however, it returns a list of 0 elements (using DataExtents with different values didn't change anything). My suspicion is that the Telstra.com.au site is broken on Yodlee's end (when a full refresh is done on the Telstra site, Yodlee returns null for refreshing that specific site).
I can see some deviation so far, so modify your container criteria as mentioned below:
object[] list = {
new ContainerCriteria { containerType = "bills" },
new ContainerCriteria { containerType = "telephone" }
};
sr.containerCriteria = list;
You may additionally provide data extent as follows
DataExtent de = new DataExtent();
de.startLevel = 0; //as per your needs
de.endLevel = 0; //as per your needs
object[] list = {
new ContainerCriteria { containerType = "bills", dataExtent = de },
new ContainerCriteria { containerType = "telephone", dataExtent = de }
};
sr.containerCriteria = list;
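The snippets above use C#-style property syntax; in the Java SOAP client the same request could look roughly like the sketch below. This is only a hedged illustration: the setDataExtent, setStartLevel and setEndLevel setters are assumed to mirror the dataExtent/startLevel/endLevel properties used above, everything else comes from the question's own code.
// Hedged sketch for the Java SOAP client. Assumption: ContainerCriteria exposes a
// setDataExtent(..) setter and DataExtent exposes setStartLevel/setEndLevel,
// mirroring the properties shown in the C#-style snippet above.
DataExtent de = new DataExtent();
de.setStartLevel(0); // as per your needs
de.setEndLevel(0);   // as per your needs
ContainerCriteria bills = new ContainerCriteria();
bills.setContainerType(ContainerTypesHelper.BILL);
bills.setDataExtent(de);
ContainerCriteria telephone = new ContainerCriteria();
telephone.setContainerType(ContainerTypesHelper.TELEPHONE);
telephone.setDataExtent(de);
List list = new List(); // Yodlee's own collections List, as in the sample code
list.setElements(new Object[] { bills, telephone });
SummaryRequest sr = new SummaryRequest();
sr.setContainerCriteria(list);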
This should solve your issue. If not, try to get the detail node in the response for the CoreExceptionFaultMessage; that detail may help diagnose the exact issue.
To get data for any container, you first need to add an account for a site belonging to that container. Only once you have added the account successfully will you be able to pull in the data for that container. You can also check the status tag returned by the getItemSummaries API; only when its statusCode = 0 do you have data present for that account.
You can test using the dummy accounts which Yodlee provides. Please refer to the Yodlee Dummy Account Generator page for more info on dummy accounts.
Related
I am trying to perform a batch insert operation with a list object, but while inserting I am getting "String cannot be converted to Receiver" (my DAO) in the iterator loop.
I have tried printing the list object, and it prints the values from the list, but whether I use generics or a plain list it still shows the error, and I can't find a way to make the insert work.
With this method I am reading the Excel file and storing it into a list:
public List collect(Receiver rec)
{
//ReadFromExcel rd = new ReadFromExcel();
List<String> up = new ArrayList<String>();
//List<String> details = rd.reader();
//System.out.println(details);
try( InputStream fileToRead = new FileInputStream(new File(rec.getFilePath())))
{
XSSFWorkbook wb = new XSSFWorkbook(fileToRead);
wb.setMissingCellPolicy(Row.MissingCellPolicy.RETURN_BLANK_AS_NULL);
XSSFSheet sheet = wb.getSheetAt(0);
DataFormatter fmt = new DataFormatter();
String data ="";
for(int sn = 0;sn<wb.getNumberOfSheets()-2;sn++)
{
sheet = wb.getSheetAt(sn);
for(int rn =sheet.getFirstRowNum();rn<=sheet.getLastRowNum();rn++)
{
Row row = sheet.getRow(rn);
if(row == null)
{
System.out.println("no data in row ");
}
else
{
for(int cn=0;cn<row.getLastCellNum();cn++)
{
Cell cell = row.getCell(cn);
if(cell == null)
{
// System.out.println("no data in cell ");
// data = data + " " + "|";
}
else
{
String cellStr = fmt.formatCellValue(cell);
data = data + cellStr + "|";
}
}
}
}
}
up = Arrays.asList(data.split("\\|"));
// System.out.println(details);
}
catch (FileNotFoundException ex)
{
Logger.getLogger(BImplementation.class.getName()).log(Level.SEVERE, null, ex);
}
catch (IOException ex)
{
Logger.getLogger(BImplementation.class.getName()).log(Level.SEVERE, null, ex);
}
Iterator iter = up.iterator();
while(iter.hasNext())
{
System.out.println(iter.next());
}
String row="";
Receiver info = null;
String cid = "";
String cname = "";
String address = "";
String mid = "";
boolean b = false;
List<Receiver> res = new ArrayList<Receiver>();
int c = 0;
try
{
String str = Arrays.toString(up.toArray());
//System.out.println(str);
String s = "";
s = s + str.substring(1,str.length());
// System.out.println("S:"+s);
StringTokenizer sttoken = new StringTokenizer(s,"|");
int count = sttoken.countTokens();
while(sttoken.hasMoreTokens())
{
if(sttoken.nextToken() != null)
{
// System.out.print(sttoken.nextToken());
cid = sttoken.nextToken();
cname = sttoken.nextToken();
address = sttoken.nextToken();
mid = sttoken.nextToken();
info = new Receiver(cid,cname,address,mid);
res.add(info);
System.out.println("cid :"+cid+ " cname : "+cname +" address : "+address+" mid : "+mid);
c = res.size();
// System.out.println(c);
}
else
{
break;
}
}
System.out.println(count);
// System.out.println("s");
}
catch(NoSuchElementException ex)
{
System.out.println("No Such Element Found Exception" +ex);
}
return up;
}
With this method I'm trying to insert into the database:
public boolean insert(List res)
{
String sqlQuery = "insert into records(c_id) values (?)";
DBConnection connector = new DBConnection();
boolean flag = false;
// Iterator itr=res.iterator();
// while(it.hasNext())
// {
// System.out.println(it.next());
// }
try( Connection con = connector.getConnection();)
{
con.setAutoCommit(false);
PreparedStatement pstmt = con.prepareStatement(sqlQuery);
Iterator it = res.iterator();
while(it.hasNext())
{
Receiver rs =(Receiver) it.next();
pstmt.setString(1,rs.getcID());
pstmt.setString(2,rs.getcName());
pstmt.setString(3,rs.getAddress());
pstmt.setString(4,rs.getMailID());
pstmt.addBatch();
}
int [] numUpdates=pstmt.executeBatch();
for (int i=0; i < numUpdates.length; i++)
{
if (numUpdates[i] == -2)
{
System.out.println("Execution " + i +": unknown number of rows updated");
flag=false;
}
else
{
System.out.println("Execution " + i + "successful: " + numUpdates[i] + " rows updated");
flag=true;
}
}
con.commit();
} catch(BatchUpdateException b)
{
System.out.println(b);
flag=false;
}
catch (SQLException ex)
{
Logger.getLogger(BImplementation.class.getName()).log(Level.SEVERE, null, ex);
System.out.println(ex);
flag=false;
}
return flag;
}
I want to insert the list object into the database using JDBC batch insertion.
Your method collect(Receiver rec) returns the List of strings called up.
return up;
However (if you are really passing the result of collect into the insert(List res) method), you are expecting this list to contain Receiver objects. That is incorrect, since collect(..) returns a list of Strings.
And that causes an error when you try to cast Receiver rs =(Receiver) it.next();
You need to review and fix your code, so you will pass the list of Receiver objects instead of strings.
And I really recommend that you start using generics wherever you use the List class. Then the compiler will show you all data-type errors immediately.
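As a rough sketch of that fix (not a drop-in replacement; the workbook-reading code stays exactly as in the question), collect(..) could return the Receiver list it already builds, and insert(..) could declare that element type so the cast disappears:
// Hedged sketch only. Receiver and its getters come from the question; the column
// names in the example SQL comment are hypothetical. Requires java.util.List/ArrayList.
public List<Receiver> collect(Receiver rec) {
    List<Receiver> res = new ArrayList<Receiver>();
    // ... read the workbook and build each Receiver exactly as in the question ...
    // res.add(new Receiver(cid, cname, address, mid));
    return res; // return the Receiver objects, not the raw string list "up"
}
public boolean insert(List<Receiver> res) {
    // With List<Receiver> the compiler guarantees every element is a Receiver,
    // so the (Receiver) cast and its runtime error go away.
    for (Receiver r : res) {
        // Note: the INSERT statement must declare as many columns/placeholders as
        // parameters you bind, e.g. (hypothetical column names)
        // insert into records(c_id, c_name, address, mail_id) values (?, ?, ?, ?)
        // pstmt.setString(1, r.getcID()); ... pstmt.addBatch();
    }
    return true;
}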
I'm trying to pull the list of users from large AD groups via Java, but I only get 1500 back. How can I get all the users?
// Step1 method - Pulling ADGroups from Active Directory
private static void getADGroups() {
Hashtable<String, Object> env = new Hashtable<String, Object>(11);
env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
env.put(Context.PROVIDER_URL, "ldap://");
env.put(Context.SECURITY_PRINCIPAL, "xxxx");
env.put(Context.SECURITY_CREDENTIALS, "1233");
env.put(Context.REFERRAL, "follow");
LdapContext ctx = null;
try {
ctx = new InitialLdapContext(env, null);
// Activate paged results
int pageSize = 10000;
byte[] cookie = null;
ctx.setRequestControls(new Control[] { new PagedResultsControl(pageSize, Control.NONCRITICAL) });
int total;
do {
SearchControls searchControls = new SearchControls();
searchControls.setSearchScope(SearchControls.SUBTREE_SCOPE);
String[] attrIDs = { "cn" };
searchControls.setReturningAttributes(attrIDs);
String searchBase = "OU=Groups,DC=cof,DC=ds,DC=com";
String searchFilter = "CN=*Ranger*";
/* perform the search */
NamingEnumeration results = ctx.search(searchBase, searchFilter, searchControls);
/* for each entry print out name + all attrs and values */
int count = 0;
while (results != null && results.hasMore()) {
SearchResult entry = (SearchResult) results.next();
//System.out.println(count + ")" + entry.getName());
count = count + 1;
String gname = entry.getName();
//System.out.println("gname before split " + gname);
String[] gnames = gname.split(",");
gname = gnames[0];
//System.out.println("gname after split - 1 " + gname);
gname = gname.substring(3);
//System.out.println("gname after split - 2 " + gname);
groups.add(gname);
}
//System.out.println("count : " + count);
// Examine the paged results control response
Control[] controls = ctx.getResponseControls();
//System.out.println("controls-size : " + controls.length);
if (controls != null) {
for (int i = 0; i < controls.length; i++) {
if (controls[i] instanceof PagedResultsResponseControl) {
PagedResultsResponseControl prrc = (PagedResultsResponseControl) controls[i];
total = prrc.getResultSize();
//System.out.println("total : " + total);
if (total != 0) {
//System.out.println("*****************
cookie = prrc.getCookie();
//System.out.println("cookie : " + cookie);
}
}
}
} else {
System.out.println("No controls were sent from the server");
}
// Re-activate paged results
ctx.setRequestControls(new Control[] { new PagedResultsControl(pageSize, cookie, Control.CRITICAL) });
} while (cookie != null);
} catch (NamingException e) {
System.out.println("PagedSearch failed." + e.getMessage());
e.printStackTrace();
} catch (IOException ie) {
System.out.println("PagedSearch failed." + ie.getMessage());
ie.printStackTrace();
} finally {
try {
ctx.close();
} catch (NamingException e) {
System.out.println("PagedSearch failed (error occured in closing context)." + e.getMessage());
e.printStackTrace();
}
}
}
// Step2 method - to pull users from ADgroups that we got for above
private static void getGroupMembers(String groupName) {
searchBase = "Ou=users";
String returnedAtts[] = { "member" };
searchControls.setReturningAttributes(returnedAtts);
searchFilter = String.format("(cn=%s)", groupName);
// System.out.println(searchFilter);
getSearchResult();
filterSearchResultsForGroupMembers(groupName);
} // end of method.
The key is where you request the member attribute. If you get back exactly 1500 results, you know there might be more. This is how you request the next batch:
String[] returnedAtts = { "member;range=1500-*" };
Then if you get exactly 1500 back again, you need to ask for more (`member;range=3000-*`). Keep asking for more until you get fewer than 1500 back.
So setup a loop with a counter and use that counter in the range string.
There is a full example here (search the page for "setReturningAttributes" to find that section of the code): https://community.oracle.com/thread/1157644
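Below is a rough, untested sketch of such a loop. It assumes an already-bound LdapContext ctx, the group's distinguished name in groupDn, and the default server range step of 1500; the javax.naming and javax.naming.directory imports used in the question's code are assumed, and the calls can throw NamingException.
// Hedged sketch of ranged retrieval of the "member" attribute.
int step = 1500;
int start = 0;
boolean finished = false;
java.util.List<String> members = new java.util.ArrayList<String>();
while (!finished) {
    SearchControls sc = new SearchControls();
    sc.setSearchScope(SearchControls.OBJECT_SCOPE);
    sc.setReturningAttributes(new String[] { "member;range=" + start + "-*" });
    NamingEnumeration<SearchResult> results = ctx.search(groupDn, "(objectClass=*)", sc);
    boolean sawAttribute = false;
    while (results != null && results.hasMore()) {
        SearchResult entry = results.next();
        NamingEnumeration<? extends Attribute> attrs = entry.getAttributes().getAll();
        while (attrs.hasMore()) {
            Attribute attr = attrs.next();
            sawAttribute = true;
            // On the last page the returned attribute ID ends in "-*",
            // e.g. "member;range=3000-*"; otherwise it ends with the last index.
            if (attr.getID().endsWith("-*")) {
                finished = true;
            }
            for (int i = 0; i < attr.size(); i++) {
                members.add((String) attr.get(i)); // each value is a member DN
            }
        }
    }
    if (!sawAttribute) {
        finished = true; // no (more) member values were returned
    }
    start += step; // next slice: member;range=1500-*, member;range=3000-*, ...
}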
I am trying to use Twitter4J to query Twitter status data. I need only the tweets that a user posted on his/her own timeline for a given day.
So far, I used this code to achieve this:
try {
for (int i = 0; i < userNames.length; i++) {
int totalCount = 0;
Query query = new Query(userNames[i]);
query.setCount(100);
int searchResultCount;
long lowestTweetId = Long.MAX_VALUE;
totalCount = 0;
Date date = new Date(
DateTimeUtils.getNDaysbackDailySliceStamp(1));
String modifiedDate = new SimpleDateFormat("yyyy-MM-dd")
.format(date);
System.out.println(modifiedDate);
query.setSince(modifiedDate);
date = new Date(DateTimeUtils.getDailySliceStamp());
modifiedDate = new SimpleDateFormat("yyyy-MM-dd").format(date);
System.out.println(modifiedDate);
query.setUntil(modifiedDate);
List<DBObject> dbl = new ArrayList<DBObject>();
Set<Long> ste = new HashSet<Long>();
do {
QueryResult queryResult = twitter.search(query);
searchResultCount = queryResult.getTweets().size();
for (Status st : queryResult.getTweets()) {
if (!st.isRetweet()) {
URLEntity[] uEn = st.getURLEntities();
StringBuilder url = new StringBuilder();
for (URLEntity urle : uEn) {
if (urle.getURL() != null && !urle.getURL().isEmpty()) {
url.append(urle.getExpandedURL());
}
}
ste.add(st.getId());
dbl.add(createTweetObject(userNames[i]/*, total*/,
st.getText(), st.getRetweetCount(), st.getId(),
url.toString(), st.getCreatedAt(), st.isRetweet()));
}
}
} while (searchResultCount != 0 && searchResultCount % 100 == 0);
System.out.println(dbl.size());
System.out.println(dbl);
if (dbl != null && !dbl.isEmpty()) {
// populateTweetCollection(dbl);
}
System.out.println("TweetCount"+ste.size());
System.out.println(ste);
}
} catch (Exception e) {
log.error("Exception in TwitterTime line api---"
+ Config.getStackTrace(e));
}
But this code gives me tweets made by others mentioning the user I am looking for.
For example, I searched for my tweets in a day, which were actually 8, but it gave me 12 results because some of my friends tweeted on their own timelines mentioning my Twitter name with the @username operator.
Also, one thing I want to confirm: does a truncated tweet have the same ID for the whole group?
Try this
try {
ResponseList<User> users = twitter.lookupUsers("user name");
for (User auser : users) {
System.out.println("Friend's Name " + auser.getName());
if (auser.getStatus() != null) {
System.out.println("Friend timeline");
List<Status> statusess =
twitter.getHomeTimeline();
for (Status status3 : statusess) {
System.out.println(status3.getText());
}
}
}
} catch (TwitterException e) {
e.printStackTrace();
}
It worked using this code
if (!st.isRetweet() && (st.getUser().getScreenName().equalsIgnoreCase(userNames[i]))) {
URLEntity[] uEn = st.getURLEntities();
StringBuilder url = new StringBuilder();
}
if (st.getId() < lowestTweetId) {
lowestTweetId = st.getId();
query.setMaxId(lowestTweetId);
}
I verify that it is not a retweet and that the screen name matches the user I am looking for.
I'm using the FlexiCapture processor to recognize my documents. I have a case where a document has multiple pages, i.e. the document consists of multiple images and each image needs to be recognized.
I'm following the procedure below for the general task, whether the document contains one image or multiple images:
create a processor
add document definition file or afl file
run recognition as IDocument document = processor.RecognizeNextDocument();
But when it returns a document, that document has only one page, which is the first page of the document. Why is that?
On the other hand, if I use a project (IProject) instead of a processor, with the procedure below:
create a project
get the batches from the project (project.getBatches())
add a document (that has multiple pages) to the batch
recognize them
then I have all the pages of the document (IDocuments documents = batch.getDocuments()).
How can I achieve the same task with the processor? I want the processor to recognize all pages in a document and return a document with all the pages in it.
If something is unclear, please ask for more information.
Code 1: using the FlexiCapture processor
/**
*
*/
/**
* @author Nitin
*
*/
import java.sql.BatchUpdateException;
import com.abbyy.FCEngine.*;
public class FlexicaputreVerificationUsingProcessor {
private static Object verificationWorkSet(Object object) {
// TODO Auto-generated method stub
return null;
}
private static void trace( String txt )
{
System.out.println( txt );
}
static private String samplesFolder;
static private String projectFolder;
static private String serialNumber;
static private String dllPath;
static {
samplesFolder = "C:\\ProgramData\\ABBYY\\SDK\\10\\FlexiCapture Engine\\Samples\\";
projectFolder = "C:\\Users\\Nitin\\FlexicaptureTest\\flexiverificationtest" ;
try {
java.io.FileInputStream file = new java.io.FileInputStream( samplesFolder + "SampleConfig\\SamplesConfig.txt" );
java.io.BufferedReader reader = new java.io.BufferedReader( new java.io.InputStreamReader( file ) );
serialNumber = reader.readLine();
dllPath = reader.readLine();
file.close();
} catch( java.io.IOException e ) {
System.out.println( e.getMessage() );
e.printStackTrace();
}
}
/**
* #param args
*/
public static void main(String[] args) {
// Load Engine
try {
trace("Loading engine");
IEngineLoader engineLoader= Engine.CreateEngineOutprocLoader();
IEngine engine = engineLoader.Load(serialNumber,dllPath);
try {
// Create and configure FlexiCaptureProcessor
trace("Creating and configureing FlexiCaptureProcessor");
IFlexiCaptureProcessor processor = engine.CreateFlexiCaptureProcessor();
processor.AddDocumentDefinitionFile( projectFolder + "\\Document_Definition_1.fcdot" );
trace("Adding images/pdf to processor");
final int fileCount = 1 ;
processor.AddImageFile(projectFolder + "\\don't upload to big .pdf");
engine.EnableRecognitionVariants( true );
trace("Creating Document collection");
IDocumentsCollection documentsCollection = engine.CreateDocumentsCollection();
trace( "Reconizing Images/pdfs..." );
int totalErrors = 0 ;
for ( int iterator = 0 ; iterator<fileCount; iterator++ ){
trace("Recongnizing image/pdf number: " +(iterator+1));
IDocument document = processor.RecognizeNextDocument();
trace("Getting last processing error for checksum");
IProcessingError lastProcessingError = processor.GetLastProcessingError() ;
if ( lastProcessingError !=null)
{
String errormsg = lastProcessingError.MessageText();
totalErrors++;
trace("Error occured while recognizeing document, Document number: "+(iterator+1)+ " with Error msg: "+errormsg);
//since we are not handling error (right now) so moving to next document for recognization
processor.ResumeProcessing(false);
}else {
trace("No error occured while recognization of document number : "+(iterator+1));
}
trace("Adding documents in Documents collection");
documentsCollection.Add(document);
}
if ( totalErrors == fileCount){
trace("Facing Error for all document while recongnization");
return ;
}
trace("Creaing Verification session");
try {
IVerificationSession verificationSession = engine.CreateVerificationSession(documentsCollection) ;
try {
//enabling context verification
verificationSession.getOptions().setVerifyFields(true);
//disabling group verification
verificationSession.getOptions().setVerifyBaseSymbols(false);
verificationSession.getOptions().setVerifyExtraSymbols(false);
try {
trace("Get NextWork Set");
IVerificationWorkSet verificationWorkSet = verificationSession.NextWorkSet();
if ( verificationWorkSet == null){
trace("first verificationWork set is null");
}else {
//process each work set in Verification session
trace("Processing Work Set");
while ( verificationWorkSet != null ){
try{
trace("Geting Verification group");
//get next group for verification
IVerificationGroup verificationGroup = verificationWorkSet.NextGroup();
if ( verificationGroup == null ){
trace("First verification group is null");
}else {
trace("processing each group of a workset");
//processing each group of a work set
while ( verificationGroup!= null){
int verificationObjectInAGroupCount = verificationGroup.getCount();
trace("Total number of verification object: " +verificationObjectInAGroupCount);
for ( int iterator = 0; iterator<verificationObjectInAGroupCount; iterator++){
trace ( "getting and Processing "+(iterator +1 ) + " verification object of A group");
//getting verification object
IVerificationObject verificationObject = verificationGroup.getElement(iterator);
if ( verificationObject == null){
trace("verification object is null");
}else {
if ( verificationObject.getType() == VerificationObjectTypeEnum.VOT_Group ) {
IGroupVerificationObject groupVerificationObject = verificationObject.AsGroupVerificationObject();
if ( groupVerificationObject == null){
System.out.println("group verification object is null ");
}
}else if ( verificationObject.getType() == VerificationObjectTypeEnum.VOT_Context) {
IContextVerificationObject contextVerificationObject = verificationObject.AsContextVerificationObject();
if ( contextVerificationObject == null){
trace("ContextVerification object is null");
}else {
IField field = contextVerificationObject.getField();
if ( field == null){
trace("field getting null");
}else {
System.out.println(" field full name: " +field.getFullName() + "\n Name: " +field.getName());
IFieldValue fieldValue = field.getValue();
if ( fieldValue == null){
trace("Field Value is Null");
}else {
trace ( "getting text from field value");
IText text = fieldValue.getAsText() ;
if ( text == null){
trace("text getting null in field value");
}else {
int wordCount = text.getRecognizedWordsCount() ;
trace("recognized word count: "+wordCount);
//getting words from text
for ( int wordIndex = 0 ; wordIndex<wordCount; wordIndex++ ){
trace ("processing word number :" +wordIndex);
IRecognizedWordInfo recognizedWordInfo = engine.CreateRecognizedWordInfo() ;
if ( recognizedWordInfo == null){
trace("Can't create recognizedWordInfo object using engine");
}else {
text.GetRecognizedWord(wordIndex, -1, recognizedWordInfo);
//getting characters from word
for (int characterIndex = 0 ; characterIndex<recognizedWordInfo.getText().length(); characterIndex++ ){
trace("processing character number : " +characterIndex);
IRecognizedCharacterInfo recognizedCharacterInfo = engine.CreateRecognizedCharacterInfo();
if ( recognizedCharacterInfo == null) {
trace("can't create recognizedCharacterInfo object");
}else {
recognizedWordInfo.GetRecognizedCharacter(characterIndex, -1, recognizedCharacterInfo);
System.out.println(" Character: " + recognizedCharacterInfo.getCharacter());
System.out.println(" Confidence level : " +recognizedCharacterInfo.getCharConfidence());
}
}
}
}
}
System.out.println(" Field Value : " +fieldValue.getAsString());
}
}
}
}
}
}
trace("Geting next Verification group");
verificationGroup = verificationWorkSet.NextGroup();
}
}
}catch (Exception e){
trace("Exception occured in getting next work group");
e.printStackTrace();
}
trace("Get next worksets");
//get next work set
verificationWorkSet = verificationSession.NextWorkSet();
}
}
}catch (Exception e){
e.printStackTrace();
}
}finally {
trace("closing Verification object");
verificationSession.Close();
}
} catch (Exception e) {
trace("Exception occured in creating verification sessions");
}
}catch (Exception e){
trace ("Exception occured in");
}
}catch (Exception e) {
// TODO: handle exception
e.printStackTrace();
}
finally {
trace("unloading Engine");
Engine.Unload();
}
}
}
Code 2: using a project
import java.io.File;
import java.io.IOException;
import java.sql.BatchUpdateException;
import com.abbyy.FCEngine.*;
public class VerificationStep {
//same as above
public static void main( String[] args )
{
// Load Engine
try {
trace("Loading engine");
IEngineLoader engineLoader= Engine.CreateEngineOutprocLoader();
IEngine engine = engineLoader.Load(serialNumber,dllPath);
try{
IProject project = engine.OpenProject( projectFolder + "\\flexitest.fcproj" );
try {
IBatch batch = null ;
trace( "Creating Batch..." );
IBatches batchs = project.getBatches();
if (batchs == null || batchs.getCount() == 0){
batch = project.getBatches().AddNew("TestBatch");
}
batch = batchs.getElement(0);
assert(batch == null);
try{
trace("opening batch");
batch.Open();
trace( "Adding pdfs..." );
batch.AddImage(projectFolder + "\\don't upload to big .pdf");
trace( "Reconizing pdfs..." );
batch.Recognize(null, RecognitionModeEnum.RM_ReRecognizeAll,null);
trace("Creating Verification object");
try {
IVerificationSession verificationSession = project.StartVerification(null);
try {
//enabling context verification
verificationSession.getOptions().setVerifyFields(true);
//disabling group verification
verificationSession.getOptions().setVerifyBaseSymbols(false);
verificationSession.getOptions().setVerifyExtraSymbols(false);
try {
trace("Get NextWork Set");
IVerificationWorkSet verificationWorkSet = verificationSession.NextWorkSet();
if ( verificationWorkSet == null){
trace("first verificationWork set is null");
}else {
//process each work set in Verification session
trace("Processing Work Set");
while ( verificationWorkSet != null ){
try{
trace("Geting Verification group");
//get next group for verification
IVerificationGroup verificationGroup = verificationWorkSet.NextGroup();
if ( verificationGroup == null ){
trace("First verification group is null");
}else {
trace("processing each group of a workset");
//processing each group of a work set
while ( verificationGroup!= null){
int verificationObjectInAGroupCount = verificationGroup.getCount();
trace("Total number of verification object: " +verificationObjectInAGroupCount);
for ( int iterator = 0; iterator<verificationObjectInAGroupCount; iterator++){
trace ( "getting and Processing "+(iterator +1 ) + " verification object of A group");
//getting verification object
IVerificationObject verificationObject = verificationGroup.getElement(iterator);
if ( verificationObject == null){
trace("verification object is null");
}else {
if ( verificationObject.getType() == VerificationObjectTypeEnum.VOT_Group ) {
IGroupVerificationObject groupVerificationObject = verificationObject.AsGroupVerificationObject();
if ( groupVerificationObject == null){
System.out.println("group verification object is null ");
}
}else if ( verificationObject.getType() == VerificationObjectTypeEnum.VOT_Context) {
IContextVerificationObject contextVerificationObject = verificationObject.AsContextVerificationObject();
if ( contextVerificationObject == null){
trace("ContextVerification object is null");
}else {
IField field = contextVerificationObject.getField();
if ( field == null){
trace("field getting null");
}else {
System.out.println(" field full name: " +field.getFullName() + "\n Name: " +field.getName());
IFieldValue fieldValue = field.getValue();
if ( fieldValue == null){
trace("Field Value is Null");
}else {
trace ( "getting text from field value");
IText text = fieldValue.getAsText() ;
if ( text == null){
trace("text getting null in field value");
}else {
int wordCount = text.getRecognizedWordsCount() ;
trace("recognized word count: "+wordCount);
//getting words from text
for ( int wordIndex = 0 ; wordIndex<wordCount; wordIndex++ ){
trace ("processing word number :" +wordIndex);
IRecognizedWordInfo recognizedWordInfo = engine.CreateRecognizedWordInfo() ;
if ( recognizedWordInfo == null){
trace("Can't create recognizedWordInfo object using engine");
}else {
text.GetRecognizedWord(wordIndex, -1, recognizedWordInfo);
//getting characters from word
for (int characterIndex = 0 ; characterIndex<recognizedWordInfo.getText().length(); characterIndex++ ){
trace("processing character number : " +characterIndex);
IRecognizedCharacterInfo recognizedCharacterInfo = engine.CreateRecognizedCharacterInfo();
if ( recognizedCharacterInfo == null) {
trace("can't create recognizedCharacterInfo object");
}else {
recognizedWordInfo.GetRecognizedCharacter(characterIndex, -1, recognizedCharacterInfo);
System.out.println(" Character: " + recognizedCharacterInfo.getCharacter());
System.out.println(" Confidence level : " +recognizedCharacterInfo.getCharConfidence());
}
}
}
}
}
System.out.println(" Field Value : " +fieldValue.getAsString());
}
}
}
}
}
}
verificationGroup = verificationWorkSet.NextGroup();
}
}
}catch (Exception e){
e.printStackTrace();
}
//get next work set
verificationWorkSet = verificationSession.NextWorkSet();
}
}
}catch (Exception e){
e.printStackTrace();
}
}finally {
verificationSession.Close();
}
}catch (Exception e){
e.printStackTrace();
}
trace ("Getting Documents");
IDocuments documents = batch.getDocuments();
trace ("Getting Fields and printing");
for ( int j = 0 ; j < documents.getCount(); j++){
trace ("Getting documnets:" +(j+1));
IDocument document = documents.getElement(j);
IDocumentDefinition definition = document.getDocumentDefinition();
assert( definition != null );
assert( document.getPages().getCount() == 1 );
trace( "DocumentType: " + document.getDocumentDefinition().getName() );
try {
trace("opening document");
document.Open(true);
IFields fields = document.getSections().Item( 0 ).getChildren();
for( int i = 0; i < fields.getCount(); i++ ) {
IField field = fields.getElement( i );
trace( field.getName() + ": " +
( field.getValue() != null ? field.getValue().getAsString() : "." ) );
}
}finally {
trace("closing document");
document.Close(true);
}
}
}finally {
trace("Closing Batch");
batch.Close();
}
}catch (Exception e){
System.out.println("Exception in creating Batch");
e.printStackTrace();
}
finally {
trace("closing project");
project.Close();
}
}catch (Exception e){
System.out.println("Exception occured while loading project");
e.printStackTrace();
}
}catch (Exception e) {
// TODO: handle exception
System.out.println("Exception occured while loading engine");
e.printStackTrace();
}
finally {
trace("unloading Engine");
Engine.Unload();
}
}
}
Finally I got my solution: it actually recognizes correctly; I was just handling the results the wrong way.
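The post doesn't spell out what the handling mistake was, but as a hedged sketch of the intended processor loop (using only calls already shown in Code 1 above): keep calling RecognizeNextDocument() until it returns null, and walk each returned document's pages instead of assuming one page per document.
// Hedged sketch only: assumes the same IFlexiCaptureProcessor setup as in Code 1.
IDocument document = processor.RecognizeNextDocument();
while (document != null) {
    int pageCount = document.getPages().getCount();
    trace("Recognized a document with " + pageCount + " page(s)");
    // process the fields/pages of this document here, as in the verification code above
    document = processor.RecognizeNextDocument();
}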
I'm using the Twitter4j library to retrieve tweets, but I'm not getting nearly enough for my purposes. Currently, I'm getting the maximum of 100 from one page. How do I implement maxId and sinceId into the below code in Processing in order to retrieve more than 100 results from the Twitter search API? I'm totally new to Processing (and programming in general), so any bit of direction on this would be awesome! Thanks!
void setup() {
ConfigurationBuilder cb = new ConfigurationBuilder();
cb.setOAuthConsumerKey("xxxx");
cb.setOAuthConsumerSecret("xxxx");
cb.setOAuthAccessToken("xxxx");
cb.setOAuthAccessTokenSecret("xxxx");
Twitter twitter = new TwitterFactory(cb.build()).getInstance();
Query query = new Query("#peace");
query.setCount(100);
try {
QueryResult result = twitter.search(query);
ArrayList tweets = (ArrayList) result.getTweets();
for (int i = 0; i < tweets.size(); i++) {
Status t = (Status) tweets.get(i);
GeoLocation loc = t.getGeoLocation();
if (loc!=null) {
tweets.get(i++);
String user = t.getUser().getScreenName();
String msg = t.getText();
Double lat = t.getGeoLocation().getLatitude();
Double lon = t.getGeoLocation().getLongitude();
println("USER: " + user + " wrote: " + msg + " located at " + lat + ", " + lon);
}
}
}
catch (TwitterException te) {
println("Couldn't connect: " + te);
};
}
void draw() {
}
Unfortunately you can't, at least not in a direct way such as doing
query.setCount(101);
As the javadoc says, it will only allow up to 100 tweets.
To overcome this, you have to ask for the tweets in batches, and for every batch set the maximum ID to 1 less than the lowest ID you got from the previous batch. To wrap this up, gather every tweet from the process into an ArrayList (which, by the way, should not stay raw but be declared as ArrayList<Status>, an ArrayList that carries Status objects) and then print everything. Here's an implementation:
void setup() {
ConfigurationBuilder cb = new ConfigurationBuilder();
cb.setOAuthConsumerKey("xxxx");
cb.setOAuthConsumerSecret("xxxx");
cb.setOAuthAccessToken("xxxx");
cb.setOAuthAccessTokenSecret("xxxx");
Twitter twitter = new TwitterFactory(cb.build()).getInstance();
Query query = new Query("#peace");
int numberOfTweets = 512;
long lastID = Long.MAX_VALUE;
ArrayList<Status> tweets = new ArrayList<Status>();
while (tweets.size () < numberOfTweets) {
if (numberOfTweets - tweets.size() > 100)
query.setCount(100);
else
query.setCount(numberOfTweets - tweets.size());
try {
QueryResult result = twitter.search(query);
tweets.addAll(result.getTweets());
println("Gathered " + tweets.size() + " tweets");
for (Status t: tweets)
if(t.getId() < lastID) lastID = t.getId();
}
catch (TwitterException te) {
println("Couldn't connect: " + te);
};
query.setMaxId(lastID-1);
}
for (int i = 0; i < tweets.size(); i++) {
Status t = (Status) tweets.get(i);
GeoLocation loc = t.getGeoLocation();
String user = t.getUser().getScreenName();
String msg = t.getText();
String time = "";
if (loc!=null) {
Double lat = t.getGeoLocation().getLatitude();
Double lon = t.getGeoLocation().getLongitude();
println(i + " USER: " + user + " wrote: " + msg + " located at " + lat + ", " + lon);
}
else
println(i + " USER: " + user + " wrote: " + msg);
}
}
Note: The line
ArrayList<Status> tweets = new ArrayList<Status>();
should properly be:
List<Status> tweets = new ArrayList<Status>();
because you should always use the interface in case you want to switch to a different implementation. If you are on Processing 2.x, this will also require the following import at the beginning:
import java.util.List;
Here's the function I made for my app based on the past answers. Thank you everybody for your solutions.
List<Status> tweets = new ArrayList<Status>();
void getTweets(String term)
{
int wantedTweets = 112;
long lastSearchID = Long.MAX_VALUE;
int remainingTweets = wantedTweets;
Query query = new Query(term);
try
{
while(remainingTweets > 0)
{
remainingTweets = wantedTweets - tweets.size();
if(remainingTweets > 100)
{
query.count(100);
}
else
{
query.count(remainingTweets);
}
QueryResult result = twitter.search(query);
tweets.addAll(result.getTweets());
Status s = tweets.get(tweets.size()-1);
long firstQueryID = s.getId();
query.setMaxId(firstQueryID);
remainingTweets = wantedTweets - tweets.size();
}
println("tweets.size() "+tweets.size() );
}
catch(TwitterException te)
{
System.out.println("Failed to search tweets: " + te.getMessage());
System.exit(-1);
}
}
From the Twitter search API doc:
At this time, users represented by access tokens can make 180 requests/queries per 15 minutes. Using application-only auth, an application can make 450 queries/requests per 15 minutes on its own behalf without a user context.
You can wait for 15 min and then collect another batch of 400 Tweets, something like:
if(tweets.size() % 400 == 0 ) {
try {
Thread.sleep(900000);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
Just keep track of the lowest Status id and use that to set the max_id for subsequent search calls. This will allow you to step back through the results 100 at a time until you've got enough, e.g.:
boolean finished = false;
while (!finished) {
final QueryResult result = twitter.search(query);
final List<Status> statuses = result.getTweets();
long lowestStatusId = Long.MAX_VALUE;
for (Status status : statuses) {
// do your processing here and work out if you are 'finished' etc...
// Capture the lowest (earliest) Status id
lowestStatusId = Math.min(status.getId(), lowestStatusId);
}
// Subtracting one here because 'max_id' is inclusive
query.setMaxId(lowestStatusId - 1);
}
See Twitter's guide on Working with Timelines for more information.