I'm using the Domino Java API to query a database on a remote server. The server is processing documents, and I'm trying to get their status. However, once I create the session and run a query, my code never sees those documents update, even if I loop and re-check every 30 seconds; it only sees the status from the time of the first query. I have a few more loops, but the basic code outline is below. Can someone tell me what I'm doing wrong?
Is there a way to update the current Database view from the Java API? The databases are not full text indexed, and cannot be due to outside constraints.
public static boolean queryDatabase(String adminFilePath, String targetItem) {
    NotesThread.sinitThread();
    Session session = NotesFactory.createSession((String) null, (String) null, (String) null);
    Registration reg = session.createRegistration();
    reg.switchToID(adminFilePath, password);
    DocumentCollection dc = getRecentDocsFromDB(session);
    int numResults = dc.getCount();
    if (numResults > 0) {
        // loop through documents to find what I'm looking for
        // if the documents contain "done", finish, else:
        Thread.sleep(60000);
        session.recycle();
        session = SessionFactory.newSession(adminFilePath, "password");
        dc = getRecentDocsFromDB(session);
        found = searchDocumentCollection(dc, targetItem);
        // this is essentially doing the same thing again: create a session, get docs made in the
        // past day or so, and loop through looking for the ones I need.
    }
private static DocumentCollection getRecentDocsFromDB(Session session) {
    Database db = SessionFactory.openDatabase(session, server, database);
    Calendar cal = Calendar.getInstance();
    cal.add(Calendar.DATE, -1);
    DateTime dt = session.createDateTime(cal);
    return searchNotesDBUsingDate(session, db, "Form=\"event\"", dt);
}
public static DocumentCollection searchNotesDBUsingDate(Session session,
        Database database, String query, DateTime dt) throws NotesException {
    return database.search(query, dt);
}
I've updated the code with a session.recycle() call (thanks for the suggestion!). In testing, it's not having any effect: the code works for the first document, but then never sees a second document being created.
It's insane, because it seems to be caching the session anyway!
I tried to reproduce the problem, but I wasn't able to. In my tests, Database.search() always returns the latest documents, even if a document is added after the database is opened. I suspect there is a subtle problem in your code (perhaps what Richard Schwartz suggested in his comment).
It may or may not be relevant, but I wasn't able to compile your version of getRecentDocsFromDB() because I don't have a SessionFactory class. My version just uses Session.getDatabase() like so:
private static DocumentCollection getRecentDocsFromDB(Session session, String server, String database) throws NotesException {
Database db = session.getDatabase(server, database);
Calendar cal = Calendar.getInstance();
cal.add(Calendar.DATE, -1);
DateTime dt = session.createDateTime(cal);
DocumentCollection dc = searchNotesDBUsingDate(session, db,"Form=\"event\"", dt);
return dc;
}
Also, as Richard mentioned, you are not reading a view. You are searching the database for all documents created (or modified) by the "event" form in the last 24 hours. And you are doing this in a tight loop. Even if you get it to work, this approach isn't the best for production. You might want to research the use of lotus.domino.View and lotus.domino.ViewNavigator.
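Whichever lookup you settle on (a fresh Database.search() per iteration, or a View/ViewNavigator), the polling loop itself can be separated from the Notes API so it is easy to unit-test. Below is a minimal sketch under that assumption; the Supplier is hypothetical glue that would wrap your getRecentDocsFromDB() and searchDocumentCollection() calls, re-running them on every attempt so each iteration sees fresh documents:

```java
import java.util.function.Supplier;

public class DocumentPoller {

    // Repeatedly evaluates check (which should re-run the database search on
    // every call) until it returns true or maxAttempts is exhausted.
    public static boolean pollUntilFound(Supplier<Boolean> check,
                                         int maxAttempts,
                                         long sleepMillis) throws InterruptedException {
        for (int attempt = 0; attempt < maxAttempts; attempt++) {
            if (check.get()) {
                return true; // target document found
            }
            Thread.sleep(sleepMillis); // wait before querying again
        }
        return false; // gave up without seeing the document
    }
}
```

In the Domino case, the Supplier would create a fresh cutoff DateTime and run the search each time it is called; recycling the DocumentCollection and DateTime inside it keeps Notes handles from leaking across iterations.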
I am developing a Spring MVC project in which a notification is sent to a user's mobile device. The user selects a datetime in the format 'yyyy-MM-dd HH:mm' on the frontend, and it is saved to the database. When the system time reaches that time, a notification is sent to the user's mobile.
I created a scheduler like this:
@Component
public class NotificationScheduler extends BaseService {
    @Scheduled(fixedRate = 300000)
    public void sendNotification() {
        Date currentDate = new Date();
        System.err.println("HHHHHHHHHKKKKKK");
        List<ImageInfo> listImageInfo = imageInfoDao.getImageOfParticularDate(currentDate);
    }
}
This is my DAO method, which runs a query against the database:
public List<ImageInfo> getImageOfParticularDate(Date date) {
    session = getSession();
    session.beginTransaction();
    List<ImageInfo> imageInfoList = session
            .createQuery("select img from ImageInfo img where img.publishedTime = :publishedTime")
            .setParameter("publishedTime", date)
            .list();
    session.getTransaction().commit();
    return imageInfoList;
}
This code checks repeatedly, at a 5-minute interval, whether the system time equals the publish time; if it is equal, a notification is sent. I used the Date type in the model and a Date-type column in the database. I want to know whether my approach is right or wrong, because I cannot get the desired output.
Why don't you use debug mode? Change the time rate you use so you don't wait too long, but I think you could determine that yourself, couldn't you?
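A likely root cause worth checking: the scheduler fires every 5 minutes, but the query matches publishedTime with exact equality, and a stored timestamp will almost never equal the exact instant the scheduler runs. A common fix is to query a half-open window covering the interval since the last run, e.g. `where img.publishedTime >= :start and img.publishedTime < :end`. A small sketch of the window logic (class and method names are illustrative, not from the original code):

```java
import java.util.Date;

public class NotificationWindow {

    // The half-open window [now - intervalMillis, now) covered by one
    // scheduler run; index 0 is the start, index 1 the end.
    public static Date[] currentWindow(Date now, long intervalMillis) {
        return new Date[] { new Date(now.getTime() - intervalMillis), now };
    }

    // True if published falls inside [windowStart, windowEnd).
    public static boolean isDue(Date published, Date windowStart, Date windowEnd) {
        return !published.before(windowStart) && published.before(windowEnd);
    }
}
```

Because the window is half-open, a timestamp equal to "now" is not matched this run but falls inside the next run's window, so nothing is sent twice and nothing is skipped.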
A cron job is being used to fire this script off once a day. When the script runs it seems to work as expected. The code builds a map, iterates over that map, creates points which are added to a batch, and finally writes those batched points to influxDB. I can connect to the influxDB and I can query my database and see that the points were added. I am using influxdb-java 2.2.
The issue I am having is that when InfluxDB is restarted, all of my data is removed. The database still exists and the series still exist; however, all of the points/rows are gone (each table is empty). My database is not the only one; there are several others, and those databases are restored correctly. My guess is that the transaction is not being finalized. I am not aware of a way to force a flush and ensure that my points are persisted. I tried adding:
influxDB.write(batchPoints);
influxDB.disableBatch(); // calls this.batchProcessor.flush() in InfluxDBImpl.java
This was an attempt to force a flush, but it didn't work as expected. I am using InfluxDB 0.13.x.
InfluxDB influxDB = InfluxDBFactory.connect(host, user, pass);
String dbName = "dataName";
influxDB.createDatabase(dbName);
BatchPoints batchPoints = BatchPoints
.database(dbName)
.tag("async", "true")
.retentionPolicy("default")
.consistency(ConsistencyLevel.ALL)
.build();
for (Tags type: Tags.values()) {
List<LinkedHashMap<String, Object>> myList = this.trendsMap.get(type.getDisplay());
if (myList != null) {
for (LinkedHashMap<String, Object> data : myList) {
Point point = null;
long time = (long) data.get("time");
if (data.get("date").equals(this.sdf.format(new Date()))) {
time = System.currentTimeMillis();
}
point = Point.measurement(type.getDisplay())
.time(time, TimeUnit.MILLISECONDS)
.field("count", data.get("count"))
.field("date", data.get("date"))
.field("day_of_week", data.get("day_of_week"))
.field("day_of_month", data.get("day_of_month"))
.build();
batchPoints.point(point);
}
}
}
influxDB.write(batchPoints);
Can you upgrade InfluxDB to 0.11.0? There have been many important changes since then and it would be best to test against that.
I have an App which shows special Days. I want to integrate them into the calendar.
The events are static, they don't change, so I don't have to update the calendar very often.
I first thought of creating a local calendar and adding the events, but new Android versions (since 2.3?) don't seem to support that; to implement it, I would have to create a Calendar Provider.
I saw this project on GitHub: https://github.com/dschuermann/birthday-adapter. It is very complicated; its main use is adding the contacts' birthdays to a new calendar.
There is a lot of code, much of which I don't think I need. Do I really need to register with Android's account manager to integrate a Calendar Provider? I just need a new calendar with my events...
Would it be easier to take the user's default calendar and add all the events there? I could add some identifier to the description so I can remove the events if the user doesn't want them.
Any tips, tutorials, or further readings are appreciated.
Metin Kale
You can create events in your device's calendar via a ContentResolver. I think it could be useful for you.
public long addEventToCalender(ContentResolver cr, String title, String addInfo, String place, int status,
long startDate, boolean isRemind,long endDate) {
String eventUriStr = "content://com.android.calendar/events";
ContentValues event = new ContentValues();
// calendar_id: which calendar to insert into; on most devices the primary calendar has id 1
event.put("calendar_id", 1);
event.put("title", title);
event.put("description", addInfo);
event.put("eventLocation", place);
event.put("eventTimezone", "UTC/GMT +2:00");
// For next 1hr
event.put("dtstart", startDate);
event.put("dtend", endDate);
// If it is a birthday alarm or similar (which should remind for the whole day): 0 for false, 1 for true
// values.put("allDay", 1);
// event.put("eventStatus", status);
event.put("hasAlarm", 1);
Uri eventUri = cr.insert(Uri.parse(eventUriStr), event);
long eventID = Long.parseLong(eventUri.getLastPathSegment());
if (isRemind) {
String reminderUriString = "content://com.android.calendar/reminders";
ContentValues reminderValues = new ContentValues();
reminderValues.put("event_id", eventID);
// Reminder lead time before the event; minutes is an integer
reminderValues.put("minutes", 5);
// Alert Methods: Default(0), Alert(1), Email(2), SMS(3)
reminderValues.put("method", 1);
cr.insert(Uri.parse(reminderUriString), reminderValues); //Uri reminderUri =
}
return eventID;
}
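Two details in the snippet above may be worth hardening (these are assumptions based on the calendar provider's documented columns, not part of the original answer): eventTimezone expects a time-zone ID such as the one returned by TimeZone.getDefault().getID(), not a display string like "UTC/GMT +2:00", and the "next 1hr" end time can be computed instead of being supplied by the caller. A small helper sketch:

```java
import java.util.TimeZone;
import java.util.concurrent.TimeUnit;

public class CalendarEventHelper {

    // The calendar provider expects a time-zone ID (e.g. "Europe/Istanbul")
    // in eventTimezone, not a human-readable offset string.
    public static String deviceTimezoneId() {
        return TimeZone.getDefault().getID();
    }

    // dtend for an event that lasts durationMinutes after dtstart.
    public static long endMillis(long startMillis, long durationMinutes) {
        return startMillis + TimeUnit.MINUTES.toMillis(durationMinutes);
    }
}
```

With this, the event values would be filled as event.put("eventTimezone", CalendarEventHelper.deviceTimezoneId()) and event.put("dtend", CalendarEventHelper.endMillis(startDate, 60)).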
For more information visit http://developer.android.com/reference/java/util/Calendar.html
This reply shows you how to get contacts and their birthdays:
Android Application - How to get birthday of a contact
And this library offers a powerful and flexible calendar that you can use:
Caldroid Library
I couldn't find anything relevant by searching. I am doing bulk inserts and/or updates, depending on whether a previous record exists, based on data read in from exported CSVs.
The code below persists two records to the database, whereas I would like it to insert and then update that same record.
If there is already a record in the db, it works as expected, just updating the existing row.
Previously, I was checking for null and doing if/else for inserting/updating, and everything worked fine. That had a lot of redundancy in the setters, so I changed it to this. My hope is that it will always run the update, but in the event a record isn't found, it will create a minimally viable pathology record that is subsequently updated. The update works fine; the problem is that when no prior record exists, it creates an additional record.
Is this because of where and how the transaction is being merged and committed? I do not understand why the second merge creates a new insert when I am passing it an object to update that must already exist.
I can get everything working as I would like, but I want to better understand why this isn't behaving as I would expect. Am I missing a core concept of merge and transactions?
Here is my code.
Pathology pathology = getPathology(entityManager, slideId, mdCallString, CALLTYPE);
if(null==pathology) {
entityManager.getTransaction().begin();
pathology = new Pathology();
pathology.setSlideId(slideId);
pathology.setCallType(CALLTYPE);
pathology.setCallBy(mdCallString);
entityManager.merge(pathology);
entityManager.getTransaction().commit();
}
entityManager.getTransaction().begin();
pathology.setSample(sample);
pathology.setSlides(slides);
pathology.setLocation(location);
pathology.setCallDx(sampleLevelDx);
pathology.setPatientDx(patientLevelDx);
pathology.setRadiologyReports(radiologyReports);
pathology.setModality(MODALITY);
pathology.setUpdatedOn(new Date());
String dtm = lineContainer.get(lineContainer.size() - 1);
DateFormat sdf = new SimpleDateFormat("MM/dd/yyyy hh:mm a", Locale.US);
Date callDate = sdf.parse(dtm);
pathology.setCallDate(callDate);
if(mdCallString.equals(colbyCallString)) { // compare String contents with equals(), not ==
pathology.setRepresentativeSlideDigitized(currentSlide.get(3));
}
entityManager.merge(pathology);
entityManager.getTransaction().commit();
edit: So, everything works as I would hope if I run
Pathology pathology = getPathology(entityManager, slideId, mdCallString, CALLTYPE);
if(null==pathology) {
entityManager.getTransaction().begin();
pathology = new Pathology();
pathology.setSlideId(slideId);
pathology.setCallType(CALLTYPE);
pathology.setCallBy(mdCallString);
entityManager.merge(pathology);
entityManager.getTransaction().commit();
}
pathology = getPathology(entityManager, slideId, mdCallString, CALLTYPE);
entityManager.getTransaction().begin();
pathology.setSample(sample);
pathology.setSlides(slides);
pathology.setLocation(location);
pathology.setCallDx(sampleLevelDx);
pathology.setPatientDx(patientLevelDx);
pathology.setRadiologyReports(radiologyReports);
pathology.setModality(MODALITY);
pathology.setUpdatedOn(new Date());
String dtm = lineContainer.get(lineContainer.size() - 1);
DateFormat sdf = new SimpleDateFormat("MM/dd/yyyy hh:mm a", Locale.US);
Date callDate = sdf.parse(dtm);
pathology.setCallDate(callDate);
if(mdCallString.equals(colbyCallString)) { // compare String contents with equals(), not ==
pathology.setRepresentativeSlideDigitized(currentSlide.get(3));
}
entityManager.merge(pathology);
entityManager.getTransaction().commit();
a second time, after the block that checks for null. But I would like to avoid the duplication. And regarding the original question: what is the difference between what is passed in the pathology variable in the two cases that causes a different outcome? In both cases they refer to the same existing record in the db, so why does one lead to an insert while the other executes an update?
Thanks
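One likely explanation, worth checking against your provider's documentation: EntityManager.merge() never modifies the instance you pass in. It returns a separate managed copy, and for a new entity the generated identifier lands on that copy. After the first commit, the local pathology variable is still the detached, id-less object, so the second merge treats it as new and inserts again; re-assigning the return value (pathology = entityManager.merge(pathology)) would make the re-query unnecessary. The miniature fake below mimics that contract so the effect can be seen without a database (this is a toy model, not real JPA code):

```java
import java.util.HashMap;
import java.util.Map;

public class MergeDemo {

    static class Entity {
        Integer id;      // null until "persisted"
        String value;
        Entity(String value) { this.value = value; }
    }

    // Toy stand-in for EntityManager.merge(): like the real thing, it never
    // touches the argument; it stores/updates a managed COPY (assigning the
    // generated id to the copy) and returns that copy.
    static class ToyEntityManager {
        private final Map<Integer, Entity> table = new HashMap<>();
        private int nextId = 1;

        Entity merge(Entity detached) {
            Entity managed = new Entity(detached.value);
            managed.id = (detached.id == null) ? nextId++ : detached.id;
            table.put(managed.id, managed);  // insert if new id, update otherwise
            return managed;
        }

        int rowCount() { return table.size(); }
    }
}
```

Merging the same detached, id-less object twice produces two rows, while keeping and re-merging the returned managed copy produces an update, which mirrors the insert-then-insert behavior described in the question.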
When I create a new H2 database via ORMLite, the database file gets created, but after I close my application, all the data stored in the database is lost:
JdbcConnectionSource connection =
new JdbcConnectionSource("jdbc:h2:file:" + path.getAbsolutePath() + ".h2.db");
TableUtils.createTable(connection, SomeClass.class);
Dao<SomeClass, Integer> dao = DaoManager.createDao(connection, SomeClass.class);
SomeClass sc = new SomeClass(id, ...);
dao.create(sc);
SomeClass retrieved = dao.queryForId(id);
System.out.println("" + retrieved);
This code will produce good results. It will print the object that I stored.
But when I start the application again, this time without creating the table and storing a new object, I get an exception telling me that the required table does not exist:
JdbcConnectionSource connection =
new JdbcConnectionSource("jdbc:h2:file:" + path.getAbsolutePath() + ".h2.db");
Dao<SomeClass, Integer> dao = DaoManager.createDao(connection, SomeClass.class);
SomeClass retrieved = dao.queryForId(id); // will produce an exception..
System.out.println("" + retrieved);
The following worked fine for me when I ran it once, and then a second time with the createTable call turned off. The second insert gave me a primary-key violation, of course, but that was expected. It created the file with (as @Thomas mentioned) a ".h2.db.h2.db" suffix.
Some questions:
After you run your application the first time, can you see the path file being created?
Is it on permanent storage and not in some temporary location cleared by the OS?
Any chance some other part of your application is clearing it before the database code begins?
Hope this helps.
@Test
public void testStuff() throws Exception {
File path = new File("/tmp/x");
JdbcConnectionSource connection = new JdbcConnectionSource("jdbc:h2:file:"
+ path.getAbsolutePath() + ".h2.db");
// TableUtils.createTable(connection, SomeClass.class);
Dao<SomeClass, Integer> dao = DaoManager.createDao(connection,
SomeClass.class);
int id = 131233;
SomeClass sc = new SomeClass(id, "fopewjfew");
dao.create(sc);
SomeClass retrieved = dao.queryForId(id);
System.out.println("" + retrieved);
connection.close();
}
I can see Russia from my house:
> ls -l /tmp/
...
-rw-r--r-- 1 graywatson wheel 14336 Aug 31 08:47 x.h2.db.h2.db
Did you close the database? It is closed automatically but it's better to close it manually (so recovery is faster).
In many cases the database URL is the problem. Are you sure the same path is used in both cases? Otherwise you end up with two databases. By the way, ".h2.db" is added automatically, you don't need to add it manually.
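Since H2 appends the storage suffix itself, the URL in the question produces a file named x.h2.db.h2.db, and any later run that builds the URL without the extra ".h2.db" silently opens a second, empty database. A small guard that strips an accidentally supplied suffix before building the URL (helper class name is illustrative):

```java
public class H2Url {

    private static final String SUFFIX = ".h2.db";

    // Builds an H2 file URL, stripping the storage suffix if the caller
    // already appended it, because H2 adds ".h2.db" to the path on its own.
    public static String fileUrl(String basePath) {
        String path = basePath.endsWith(SUFFIX)
                ? basePath.substring(0, basePath.length() - SUFFIX.length())
                : basePath;
        return "jdbc:h2:file:" + path;
    }
}
```

With this, both the table-creating run and the later read-only run resolve to the same on-disk file regardless of whether the caller passed "/tmp/x" or "/tmp/x.h2.db".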
To better analyze the problem, you could append ;TRACE_LEVEL_FILE=2 to the database URL, and then check in the *.trace.db file what SQL statements were executed against the database.