neo4j - Property AutoIndexer not storing values - java

The problem I'm running into is that after I configure the auto-indexer to index a certain property, I can set a key/value pair on a node and the index won't show that it's there. I'm relatively new to neo4j, so my understanding of what this class is supposed to do might be wrong. The test code creates an ImpermanentGraphDatabase, instantiates my data service class with it, and then creates a user. The properties are added to the auto-index when the data service class is instantiated. Inside the createUser() function I print out the user I've just created, which should be in the auto-index, but it prints null.
This is the code I'm testing:
@Before
public void setup() throws IOException {
    graphDb = new ImpermanentGraphDatabase();
    Transaction tx = graphDb.beginTx();
    try {
        nA = graphDb.createNode();
        nB = graphDb.createNode();
        packetA = graphDb.createNode();
        packetB = graphDb.createNode();
        dataService = new DataServiceImpl(graphDb);
        tx.success();
    }
    finally {
        tx.finish();
    }
}
/****************** Test User Creation Functionality *******************/
@Test
public void createUser() throws ExistsException {
    Transaction tx = graphDb.beginTx();
    try {
        UserWrapper user = (UserWrapper) dataService.createUser(BigInteger.valueOf(1));
        tx.success();
    }
    finally {
        tx.finish();
    }
}
Here's the code in DataServiceImpl:
/**
 * Node property keys that should be auto-indexed.
 */
private static final Set<String> NODE_KEYS_INDEXABLE = new HashSet<String>(
        Arrays.asList(new String[] { UserWrapper.KEY_IDENTIFIER }));

/**
 * Relationship property keys that should be auto-indexed.
 */
private static final Set<String> REL_KEYS_INDEXABLE = new HashSet<String>(
        Arrays.asList(new String[] { SentWrapper.TIME_PROP }));
private void initIndices() {
    /* Get the auto-indexers */
    AutoIndexer<Node> nodeAutoIndexer = this.graphDb.index()
            .getNodeAutoIndexer();
    RelationshipAutoIndexer relAutoIndexer = this.graphDb.index()
            .getRelationshipAutoIndexer();

    this.updateIndexProperties(nodeAutoIndexer,
            DataServiceImpl.NODE_KEYS_INDEXABLE);
    this.nodeIndex = nodeAutoIndexer.getAutoIndex();

    this.updateIndexProperties(relAutoIndexer,
            DataServiceImpl.REL_KEYS_INDEXABLE);
    this.relIndex = relAutoIndexer.getAutoIndex();
}
/**
 * Sets the indexed properties of an {@link AutoIndexer} to the specified
 * set, removing old properties and adding new ones.
 *
 * @param autoIndexer
 *            the AutoIndexer to update.
 * @param properties
 *            the properties to be indexed.
 * @return autoIndexer, the given AutoIndexer (useful for chaining calls).
 */
private <T extends PropertyContainer> AutoIndexer<T> updateIndexProperties(
        AutoIndexer<T> autoIndexer, Set<String> properties) {
    Set<String> indexedProps = autoIndexer.getAutoIndexedProperties();
    // Remove unneeded properties (difference() here is presumably a static
    // import of Guava's Sets.difference).
    for (String prop : difference(indexedProps, properties)) {
        autoIndexer.stopAutoIndexingProperty(prop);
    }
    // Add new properties.
    for (String prop : difference(properties, indexedProps)) {
        autoIndexer.startAutoIndexingProperty(prop);
    }
    // Enable the index, if needed.
    if (!autoIndexer.isEnabled()) {
        autoIndexer.setEnabled(true);
    }
    return autoIndexer;
}
public User createUser(BigInteger identifier) throws ExistsException {
    // Verify that user doesn't already exist.
    if (this.nodeIndex.get(UserWrapper.KEY_IDENTIFIER, identifier).getSingle() != null) {
        throw new ExistsException("User with identifier '"
                + identifier.toString() + "' already exists.");
    }
    // Create new user.
    final Node userNode = graphDb.createNode();
    final User user = new UserWrapper(userNode);
    user.setIdentifier(identifier);
    Node userNode2 = this.nodeIndex.get(UserWrapper.KEY_IDENTIFIER, identifier).getSingle();
    System.out.println(userNode2);
    userParent.getNode().createRelationshipTo(userNode, NodeRelationships.PARENT);
    return user;
}
Here's the code in UserWrapper:
/**
 * Mapping to neo4j key for the identifier property.
 */
// Changed to public in order to test in Test class
static final String KEY_IDENTIFIER = "identifier";

@Override
public void setIdentifier(BigInteger newIdentity) {
    neo4jNode.setProperty(KEY_IDENTIFIER, newIdentity.toByteArray());
}

Where do you add the second user? By running the test twice? In that case ImpermanentGraphDatabase will have deleted all data before the second run (that is what it is intended for in testing). The indexing happens at commit time and aggregates all changes made during the transaction, which is why you don't see the node inside the tx (userNode2). If you'd like, you can add a check for this to the test (see below).
It is also not visible from your code where you call initIndices; could you please indicate the place? Please also verify that the auto-indexer is indexing the correct properties.
Try changing your test to the following; the second call should then throw the exception:
@Test(expected = ExistsException.class)
public void createUser() throws ExistsException {
    Transaction tx = graphDb.beginTx();
    try {
        UserWrapper user = (UserWrapper) dataService.createUser(BigInteger.valueOf(1));
        tx.success();
    }
    finally {
        tx.finish();
    }

    // setIdentifier() stores the identifier as a byte array, so look it up the same way.
    Node userNode2 = graphDb.index().getNodeAutoIndexer().getAutoIndex()
            .get(UserWrapper.KEY_IDENTIFIER, BigInteger.valueOf(1).toByteArray()).getSingle();
    assertNotNull(userNode2);

    tx = graphDb.beginTx();
    try {
        UserWrapper user = (UserWrapper) dataService.createUser(BigInteger.valueOf(1));
        tx.success();
    }
    finally {
        tx.finish();
    }
}

I found the answer to this question indirectly from Michael's answer. I realized that assertNotNull(userNode2); failed if it was placed directly after dataService.createUser(), but passed if it was placed after the try/finally block. Thus, if I wanted to use the node I created, I would need a second try/finally block after the one that creates the node. I believe this is because the node isn't added to the index until the transaction actually commits (in tx.finish(), after tx.success() has been called), although that is just a guess.
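For reference, here is a minimal sketch of that two-transaction pattern, using the same graphDb/dataService fields as the test above; the byte-array lookup mirrors how setIdentifier() stores the value, and ReadableIndex comes from org.neo4j.graphdb.index:

// Transaction 1: create the user; the auto-index is not updated yet.
Transaction tx = graphDb.beginTx();
try {
    dataService.createUser(BigInteger.valueOf(1));
    tx.success();
} finally {
    tx.finish(); // the commit happens here, and with it the index update
}

// Only now is the node visible in the auto-index.
ReadableIndex<Node> autoIndex = graphDb.index().getNodeAutoIndexer().getAutoIndex();
Node found = autoIndex.get(UserWrapper.KEY_IDENTIFIER,
        BigInteger.valueOf(1).toByteArray()).getSingle();
assertNotNull(found);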

Related

Testing a Spring controller method using Mockito which alters an entry in the db with Optional.map: NullPointerException

I'm currently testing my controller methods. In one method I add a Reisepunkt (travel point) to a Reise (travel) that is already saved in the database.
private final ReiseRepository repository;
private final ReisepunktRepository reisepunktRepository;

ReiseController(ReiseRepository reiseRepository, ReisepunktRepository reisepunktRepository) {
    this.repository = reiseRepository;
    this.reisepunktRepository = reisepunktRepository;
}
/**
 * Adds a new Reisepunkt to the Reise. Both have to exist in the database already.
 * Will throw an exception if the Reise already contains the same Reisepunkt.
 * @param idReisepunkt ID of the Reisepunkt.
 * @param idReise ID of the Reise.
 * @return Configured Reise with the new Reisepunkt.
 */
@PutMapping(path = "/reise/reisepunkt/{idReise}")
Reise addReisepunkt(@RequestParam Long idReisepunkt, @PathVariable Long idReise) {
    return repository.findById(idReise).map(reise -> {
        reisepunktRepository.findById(idReisepunkt).map(reisepunkt -> {
            for (int i = 0; i < reise.getReisepunkte().size(); i++) {
                if (reisepunkt.getId().equals(reise.getReisepunkte().get(i).getId())) {
                    throw new IllegalStateException("Reise already contains the Reisepunkt");
                }
            }
            reise.addReisepunkt(reisepunkt);
            return reisepunktRepository.save(reisepunkt);
        }).orElseThrow(() -> new IllegalStateException("Could not save Reisepunkt"));
        return repository.save(reise);
    }).orElseThrow(() -> new IllegalStateException("Could not add Reisepunkt to Reise"));
}
Using the generated-request.http API I can use the method to make entries in the db. Now I wanted to write a test method, just so I can get the hang of it.
@Mock
private ReiseRepository reiseRepository;
private ReisepunktRepository reisepunktRepository;
private ReiseController underTest;

@BeforeEach
void setUp() {
    underTest = new ReiseController(reiseRepository, reisepunktRepository);
}
@Test
void canAddaReisepunktToReise() {
    // given
    Reisepunkt reisepunkt = new Reisepunkt(12L, 10.41f, 51.32f,
            "nutzer@web.de", "Aussicht");
    List<Reisepunkt> reisepunkte = new ArrayList<>();
    reisepunkte.add(new Reisepunkt(34L, 4.1f, 32.32f,
            "nutzer", "jas"));
    List<Reisekatalog> reisekatalogs = new ArrayList<>();
    Reise reise = new Reise(new Date(), "TestReise", true,
            reisepunkte, reisekatalogs);
    long idReise = 1;
    long idReisepunkt = 12;
    given(reiseRepository.findById(idReise)).willReturn(java.util.Optional.of(reise));
    given(reiseRepository.save(reise)).willReturn(reise);
    given(reisepunktRepository.findById(idReisepunkt))
            .willReturn(java.util.Optional.of(reisepunkt));
    given(reisepunktRepository.save(reisepunkt)).willReturn(reisepunkt);
    // when
    underTest.addReisepunkt(idReisepunkt, idReise);
    // then
    ArgumentCaptor<Reise> reiseArgumentCaptor = ArgumentCaptor.forClass(Reise.class);
    verify(reiseRepository).save(reiseArgumentCaptor.capture());
    Reise capturedReise = reiseArgumentCaptor.getValue();
    reise.addReisepunkt(reisepunkt);
    assertThat(capturedReise).isEqualTo(reise);
}
I always get a NullPointerException in these lines:
given(reisepunktRepository.findById(idReisepunkt))
        .willReturn(java.util.Optional.of(reisepunkt));
given(reisepunktRepository.save(reisepunkt)).willReturn(reisepunkt);
Apparently Mockito has a problem when I map an Optional inside another Optional.map and then use given for the second repository request. I guess there is some special way to implement the test method for a given repository request inside an Optional.map.
You forgot to mock ReisepunktRepository, and this causes the NullPointerException.
Update from
private ReisepunktRepository reisepunktRepository;
to
@Mock
private ReisepunktRepository reisepunktRepository;
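For completeness, here is a minimal sketch of the corrected test class skeleton. It assumes JUnit 5 with the Mockito extension (MockitoExtension), which initializes the @Mock fields; if the original test instead used MockitoAnnotations.openMocks(this) in setUp(), the effect is the same:

import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.Mock;
import org.mockito.junit.jupiter.MockitoExtension;

@ExtendWith(MockitoExtension.class)
class ReiseControllerTest {

    @Mock
    private ReiseRepository reiseRepository;

    @Mock // this was missing: without it the field stays null and stubbing it throws NPE
    private ReisepunktRepository reisepunktRepository;

    private ReiseController underTest;

    @BeforeEach
    void setUp() {
        underTest = new ReiseController(reiseRepository, reisepunktRepository);
    }
}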

NullPointerException occurring infinitely

I want to display a list of records from the database in an output text field. I'm having a problem with the method that brings records from the database: it causes an infinite loop because it is called in the constructor of the managed bean class. Here is the code.
Constructor of managed bean class:
public InterViewDto() throws SQLException {
    User u = getCurrentUser();
    InterviewDao d = new InterviewDao();
    List<InterViewDto> dao1 = d.getCall(u.getEmailAddress());
    setDto(dao1);
}
The method bringing records from the database:
public List<InterViewDto> getCall(String email) throws SQLException {
    System.out.print("fyc");
    List<InterViewDto> list = new ArrayList<InterViewDto>();
    String job = null;
    boolean exists = false;
    Connection c = null;
    try {
        c = openConnection();
        String query_check = "SELECT * FROM interviewcall WHERE useremail = '" + email + "' ";
        Statement st = c.createStatement();
        ResultSet rs = st.executeQuery(query_check);
        while (rs.next()) {
            InterViewDto dto = new InterViewDto(); // recursion: this constructor calls getCall() again
            dto.setDate(rs.getDate("time"));
            dto.setJobtitle(rs.getString("jobtitle"));
            dto.setJobtitle(rs.getString("useremail"));
            list.add(dto);
            System.out.print(list.get(0).getJobtitle());
        }
        rs.close();
    } catch (Exception e) {
        System.out.println(e);
    } finally {
        c.close();
    }
    return list;
}
You have a circular dependency. Your constructor for the DTO reaches out to the database, which in turn creates a new DTO to represent the data loaded from the database, which goes to the database and back and forth until you overflow the call stack.
Quite simply, you have merged two complementary design approaches.
Either your InterViewDto constructor loads data from the DAO or the DAO constructs a new InterViewDto object. Pick one or the other.
In my opinion, it makes more sense for the DAO to create the DTO objects. If you want the DTO to delegate to the DAO as a matter of convenience, consider a static method.
public class InterViewDto {
    public InterViewDto() {
    }
    ...
    public static List<InterViewDto> fromCurrentUser() throws SQLException {
        return new InterviewDao().getCall(getCurrentUser().getEmailAddress());
    }
}
Then change your constructor to be empty.
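With that split in place, the managed bean no longer triggers any database access when the DAO instantiates DTOs. A minimal sketch of the call site (setDto() and getCurrentUser() are the asker's existing helpers):

// Managed bean initialization: the empty InterViewDto constructor lets the
// DAO create DTO instances inside getCall() without re-entering the database.
List<InterViewDto> calls = InterViewDto.fromCurrentUser();
setDto(calls);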

Transaction for POJOs

I'm implementing a method that does something like:
...
try {
    myPojo.setProperty("foo");
    myService.execute(myPojo);
} catch (Exception e) {
    logger.error(e.getMessage(), e);
}
...
If my service throws an exception from inside this try block, the POJO property will keep the new value. Is there some way to start a kind of transaction for POJO changes and roll it back if something goes wrong?
Something like:
PojoTransaction pt = startPojoTransaction();
transactionedPojo = pt.handleByTransaction(myPojo);
try {
    transactionedPojo.setProperty("foo");
    myService.execute(transactionedPojo);
    pt.commit();
} catch (Exception e) {
    logger.error(e.getMessage(), e);
}
Or something similar...
Take a look at the Memento pattern; the Wikipedia article includes a Java example:
http://en.wikipedia.org/wiki/Memento_pattern
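For a rough idea of how the pattern maps onto this problem, a POJO can snapshot its own state before the risky call and restore it on failure. This is a minimal hand-rolled sketch; MyPojo and its single property are invented for illustration:

public class MyPojo {
    private String property;

    public void setProperty(String property) { this.property = property; }
    public String getProperty() { return property; }

    // Memento: an opaque snapshot of the state we may need to restore.
    public static class Memento {
        private final String property;
        private Memento(String property) { this.property = property; }
    }

    public Memento save() { return new Memento(property); }
    public void restore(Memento m) { this.property = m.property; }
}

The calling code would take a snapshot via save() before mutating the POJO and call restore(memento) in the catch block.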
I toyed around with the idea; this is far from perfect, just a simple proof of concept. There are pitfalls in this implementation:
- It only tries to call a parameterless constructor of the given source object to create the target copy; it would need some logic to select a correct constructor (or only support Cloneables?)
- It only copies fields declared in the class, not fields from superclasses (this can be solved by walking through the inheritance tree and copying any superclass fields)
- If the fields are complex types, only the references are copied to the target object, so any changes to them will not be transactional, as both the source and target share the same instance (solvable by recursively creating copies of nested objects and copying their values, which requires walking through the entire object graph starting from the source, and then doing it vice versa at commit time)
But, improving from here, I believe it could become very usable. Here's the POC:
import java.lang.reflect.Field;

import org.junit.Assert;
import org.junit.Test;

public class PojoTransactionTest
{
    public static class PojoTransaction<T>
    {
        /**
         * This is the original (unmodified) object
         */
        private T source;

        /**
         * This is the object modified within the transaction
         */
        private T target;

        /**
         * Creates a new transaction for the given source object
         * @param source Source object to modify transactionally
         */
        public PojoTransaction(T source)
        {
            try
            {
                this.source = source;
                this.target = (T) source.getClass().newInstance(); // Note: this only supports parameterless constructors
                copyState(source, target);
            }
            catch (Exception e)
            {
                throw new RuntimeException("Failed to create PojoTransaction", e);
            }
        }

        /**
         * Copies state (member fields) from one object to another
         * @param from Object to copy from
         * @param to Object to copy to
         * @throws IllegalAccessException
         */
        private void copyState(T from, T to) throws IllegalAccessException
        {
            // Copy internal state to target; note that this will NOT copy fields from superclasses
            for (Field f : from.getClass().getDeclaredFields())
            {
                f.setAccessible(true);
                f.set(to, f.get(from));
            }
        }

        /**
         * Returns the transaction target object; this is the one you should modify during the transaction
         * @return Target object
         */
        public T getTransactionTarget()
        {
            return target;
        }

        /**
         * Copies the changes from the target object back to the original object
         */
        public void commit()
        {
            try
            {
                copyState(target, source);
            }
            catch (Exception e)
            {
                throw new RuntimeException("Failed to change state of original object", e);
            }
        }
    }

    public static class TestData
    {
        private String strValue = "TEST";
        private int intValue = 1;
        private float floatValue = 3.1415f;

        public String getStrValue()
        {
            return strValue;
        }

        public void setStrValue(String strValue)
        {
            this.strValue = strValue;
        }

        public int getIntValue()
        {
            return intValue;
        }

        public void setIntValue(int intValue)
        {
            this.intValue = intValue;
        }

        public float getFloatValue()
        {
            return floatValue;
        }

        public void setFloatValue(float floatValue)
        {
            this.floatValue = floatValue;
        }
    }

    @Test
    public void testTransaction()
    {
        // Create some test data
        TestData orig = new TestData();

        // Create a transaction for the test data, get the "transaction target" object from the transaction
        PojoTransaction<TestData> tx = new PojoTransaction<TestData>(orig);
        TestData target = tx.getTransactionTarget();
        target.setFloatValue(1.0f);
        target.setIntValue(5);
        target.setStrValue("Another string");

        // The original object still has the original values
        Assert.assertEquals(1, orig.getIntValue());
        Assert.assertEquals(3.1415f, orig.getFloatValue(), 0.001f);
        Assert.assertEquals("TEST", orig.getStrValue());

        // Commit the transaction
        tx.commit();

        // The "orig" object should now have the changes made to the "transaction target" object
        Assert.assertEquals(5, orig.getIntValue());
        Assert.assertEquals(1.0f, orig.getFloatValue(), 0.001f);
        Assert.assertEquals("Another string", orig.getStrValue());
    }
}
The question is a bit vague, but it sounds like you are wrestling with the basic design pattern for transaction management. You would benefit greatly from the experience that has gone into the production of the pattern used here:
http://static.springsource.org/spring/docs/3.0.x/spring-framework-reference/html/transaction.html
Perhaps Spring Transaction management would suit you well for your project anyway.

How to access seam component that was created at the time the parameter value was set?

My URL has the parameter name=SSA-COLUMB. The Seam page parameter (set in DevicesList.page.xml) is
<param name="name" value="#{searchDeviceName.paramValue}"/>
When I load the page, the searchDeviceName component is instantiated and paramValue is set; I verified this with a print statement in setParamValue(). Here is the code for SearchDeviceName.java:
#Name("searchDeviceName")
#Scope(ScopeType.CONVERSATION)
public class SearchDeviceName {
public String paramValue;
public Table table;
public String sqlString;
public ArrayList<JoinClause> joinList;
public SearchDeviceName() {
Table devTable = new Table("devices","d","dev_id");
setTable(devTable);
setSqlString(" d.name like '%"+paramValue+"%'");
}
<getters and setters>
}
I have a stateless session bean that grabs an instance of this component to use in building an SQL statement. But when I grab the component instance, it does not have paramValue set. Apparently it is a new instance of the component.
SearchDeviceName searchObj = (SearchDeviceName) Component.getInstance("searchDeviceName", ScopeType.CONVERSATION);
This is not the same instance of searchDeviceName that was instantiated when the parameter value got set; I know this because paramValue is null rather than "SSA-COLUMB". Is there any way to use in my session bean the same component instance that was created when the parameter value was set? How do I grab it?
Here is the code that grabs the component
#Name("idListBuilder")
public class IdListBuilder {
#In
private Context conversationContext;
#In(create = true)
EntityManager entityManager;
private String sqlQuery;
private Table headTable;
private ArrayList<String> restrictionValues = new ArrayList<String>();
private ArrayList<Table> restrictionJoinTables = new ArrayList<Table>();
public IdListBuilder(Table table) {
this.headTable = table;
this.sqlQuery = "SELECT " + table.alias + "." + table.primaryKey + " FROM " +
table.name + " " + table.alias;
searchObjects.add("searchDeviceName");
searchObjects.add("searchOfficeState");
}
public List<Integer> getIdList(){
evaluateSearchObjects();
createQuery();
/*
Code for grabbing resultlist of query goes here.
return resultList;
*/
return null;
}
public void evaluateSearchObjects() {
SearchDeviceName searchObj = (SearchDeviceName) Component.getInstance("searchDeviceName", ScopeType.CONVERSATION);
if ( searchObj != null ) {
restrictionValues.add(searchObj.sqlString);
restrictionJoinTables.add(searchObj.table);
}
}
void createQuery(){
StringBuilder strBuilder = new StringBuilder(sqlQuery);
strBuilder.append(" where ");
for ( String str : restrictionValues ){
strBuilder.append(str);
}
System.out.println("done");
}
}
I am not 100% sure, but maybe contextual events are the way to go.
Take a look at this link.
Maybe you can listen to an event:
org.jboss.seam.preSetVariable.<name> // called before the context variable <name> is set
org.jboss.seam.postSetVariable.<name> // called after the context variable <name> is set
@Observer("org.jboss.seam.postSetVariable.YOUR_COMPONENT")
public void method() {
    // Maybe you need to retrieve the changed object as a parameter also; look at the documentation
}
Have you confirmed that setParamValue() is being called before evaluateSearchObjects()?
What is invoking IdListBuilder.getIdList()? Perhaps that is occurring earlier in the JSF lifecycle than setParamValue().
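If the session bean participates in the same conversation, another option may be to let Seam inject the conversation-scoped component rather than looking it up manually. This is only a sketch under that assumption, not a confirmed fix:

@Name("idListBuilder")
public class IdListBuilder {

    // Seam resolves this from the current conversation context at injection
    // time, so it should be the same instance that received the page parameter.
    @In(required = false)
    private SearchDeviceName searchDeviceName;

    public void evaluateSearchObjects() {
        if (searchDeviceName != null) {
            restrictionValues.add(searchDeviceName.sqlString);
            restrictionJoinTables.add(searchDeviceName.table);
        }
    }
}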

Automatically opening and closing connection

NOTE: Please ignore my use of MultivaluedMap instead of varargs (String... args).
Is there a standard way in Java of doing this?
What I have is a resource that is returned from a remote server. Before each query the remote connection must be opened, and after the results are returned it must be closed.
So a natural way of doing this is something like:
Connection c = config.configureConnection();
c.open(); // open
List<Car> cars;
try {
    cars = c.getCars();
} finally {
    c.close(); // close
}
Now I want to implement something that operates on the level of the resources themselves, without worrying about the connection, for example:
List<Car> cars = new CarResource().all(); // opens and closes the connection
The way I am currently doing it is by having one abstract class, AbstractQueriable, call the abstract methods query(String... args) and query(int id), which any class extending it must implement.
AbstractQueriable implements the Queriable interface, which exposes the three public-facing methods filter(String... args), all() and get(int id).
Here is the Queriable interface:
public interface Queriable<T> {
    public T get(String id);

    /** Simply returns all resources */
    public Collection<T> all();

    public Collection<T> filter(MultivaluedMap<String, String> args);
}
Here is the AbstractQueriable class that implements it:
public abstract class AbstractQueriable<T> implements Queriable<T> {

    @Override
    public final T get(String id) {
        setup();
        try {
            return query(id);
        } finally {
            cleanup();
        }
    }

    @Override
    public final Collection<T> filter(MultivaluedMap<String, String> args) {
        setup();
        try {
            return query(args);
        } finally {
            cleanup();
        }
    }

    /**
     * Returns all resources.
     *
     * This is a convenience method that is equivalent to passing an empty
     * arguments list to the filter function.
     *
     * @return The collection of all resources if possible
     */
    @Override
    public final Collection<T> all() {
        return filter(null);
    }

    /**
     * Queries for a resource by id.
     *
     * @param id
     *            id of the resource to return
     * @return
     */
    protected abstract T query(String id);

    /**
     * Queries for a resource by the given arguments.
     *
     * @param args
     *            Map of arguments, where each key is the argument name, and the
     *            corresponding values are the values
     * @return The collection of resources found
     */
    protected abstract Collection<T> query(MultivaluedMap<String, String> args);

    private void cleanup() {
        Repository.close();
    }

    private void setup() {
        Repository.open();
    }
}
and finally my resource, which I want to use in the code, must extend the AbstractQueriable class, for example (please note that the details of these methods are not important):
public class CarRepositoryResource extends AbstractQueriable<Car> {

    @Override
    protected Car query(String id) {
        MultivaluedMap<String, String> params = new MultivaluedMapImpl();
        params.add("CarID", id);
        // Delegate the query to the parameterized version
        Collection<Car> cars = query(params);
        if (cars == null || cars.size() == 0) {
            throw new WebApplicationException(Response.Status.NOT_FOUND);
        }
        if (cars.size() > 1) {
            throw new WebApplicationException(Response.Status.NOT_FOUND);
        }
        return cars.iterator().next();
    }

    @Override
    protected Collection<Car> query(MultivaluedMap<String, String> params) {
        Collection<Car> cars = new ArrayList<Car>();
        Response response = Repository.getConnection().doQuery("Car");
        while (response.next()) {
            Returned returned = response.getResult();
            if (returned != null) {
                cars.add(returned);
            }
        }
        return cars;
    }
}
which, finally, I can use in my code:
Collection<Car> cars = new CarRepositoryResource().all();
//... display cars to the client etc...
There are a few things I don't like about this kind of setup:
I must instantiate a new instance of my "CarRepositoryResource" every time I do a query.
The method names "query", while internal and private, are still confusing and clunky.
I am not sure if there is a better pattern or framework out there.
The connection that I am using does not support/implement the JDBC API and is not SQL-based.
You could use a variation of the (in)famous Open session in view pattern.
Basically it comes down to this:
- Define a "context" in which connections are available (usually the request in web applications)
- Handle (possibly lazy) initialization and release of a connection when entering/exiting the context
- Code your methods taking for granted that they will only be used inside such a context
It is not difficult to implement (store the connection in a static ThreadLocal to make it thread-safe), and it will definitely spare a few open/close calls (performance-wise that could be a big gain, depending on how heavy your connection is).
The context class could look something like this (consider it pseudo-code):
public class MyContext {

    private static final
    ThreadLocal<Connection> connection = new ThreadLocal<Connection>();

    public static void enter() {
        connection.set(initializeConnection());
        // This is eager initialization. If you think it will often be the
        // case that no connection is actually required inside a context,
        // you can defer the actual initialization until the first call to get().
    }

    public static void exit() {
        try { connection.get().close(); } // close the connection itself, not the ThreadLocal
        catch (Throwable t) { /* panic! */ }
        finally { connection.remove(); }
    }

    public static Connection get() {
        Connection c = connection.get();
        if (c == null) throw new IllegalStateException("blah blah");
        return c;
    }
}
Then you would use connections like this:
MyContext.enter();
try {
    // Connections are available here: anything that calls MyContext.get()
    // gets (the same) valid connection instance
} finally {
    MyContext.exit();
}
This block can be put wherever you want (in webapps it usually wraps the processing of each request) - from the main method if you are coding a simple case when you want a single connection available for the whole lifespan of your application, to the finest methods in your API.
You might want to take a look at fluent interfaces (with an interesting example here) and their "Builder" pattern.
You would query like this:
cars().in(DB).where(id().isEqualTo(1234));
This way you can hide the connection/disconnection code in the outermost cars() method, for example.
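As a rough sketch of how such a fluent entry point could hide the open/close logic (all names here are invented stand-ins, not a real API, and the query body is stubbed):

import java.util.ArrayList;
import java.util.Collection;

// Hypothetical fluent wrapper: the terminal where(...) call wraps the
// actual query in the open/close handling, so callers never see it.
public final class Cars {

    public static Cars cars() {
        return new Cars();
    }

    public Collection<Car> where(MultivaluedMap<String, String> args) {
        Repository.open();         // open before the query...
        try {
            return runQuery(args); // ...run it...
        } finally {
            Repository.close();    // ...and always close afterwards
        }
    }

    private Collection<Car> runQuery(MultivaluedMap<String, String> args) {
        // Stub: in real code this would hold the body of
        // CarRepositoryResource.query(MultivaluedMap) from above.
        return new ArrayList<Car>();
    }

    private Cars() { }
}

A call site would then read Cars.cars().where(params), with no connection handling in sight.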