I am trying to use DynamoDBScanExpression with a limit of 1, using the Java aws-sdk version 1.11.140.
Even if I use .withLimit(1), i.e.
List<DomainObject> result = mapper.scan(new DynamoDBScanExpression().withLimit(1));
it returns me the list of all entries, i.e. 7. Am I doing something wrong?
P.S. I tried using the CLI for this query, and
aws dynamodb scan --table-name auditlog --limit 1 --endpoint-url http://localhost:8000
it returns me just 1 result.
DynamoDBMapper.scan returns a PaginatedScanList. Paginated results are loaded on demand when the user executes an operation that requires them. Some operations, such as size(), must fetch the entire list, but results are lazily fetched page by page when possible.
Hence, the limit parameter set on DynamoDBScanExpression is the maximum number of items to fetch per page.
So in your case, a PaginatedList is returned, and when you call size() on it, it attempts to load all items from DynamoDB. Under the hood, the items are loaded 1 per page (each page is a fetch request to DynamoDB) until it gets to the end of the PaginatedList.
Since you're only interested in the first result, a good way to get it without fetching all 7 items from DynamoDB would be:
Iterator<DomainObject> it = mapper.scan(DomainObject.class, new DynamoDBScanExpression().withLimit(1)).iterator();
if (it.hasNext()) {
    DomainObject dob = it.next();
}
With the above code, only the first item will be fetched from DynamoDB.
The takeaway is: the limit parameter in DynamoDBScanExpression (and DynamoDBQueryExpression) is for pagination purposes only. It's a limit on the number of items per page, not a cap on the total number of items that will eventually be returned.
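If you genuinely want a single request that returns at most one item, you can also ask the mapper for a single page rather than a lazy list. A minimal sketch, assuming the same mapper and DomainObject from the question (ScanResultPage lives in com.amazonaws.services.dynamodbv2.datamodeling):
// scanPage issues exactly one Scan request and returns only that page,
// so nothing beyond the first page is ever fetched.
ScanResultPage<DomainObject> page =
        mapper.scanPage(DomainObject.class, new DynamoDBScanExpression().withLimit(1));
List<DomainObject> results = page.getResults();
DomainObject first = results.isEmpty() ? null : results.get(0);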
Related
I want to fetch a specific item from DynamoDB by using table.getItem(xxx), and while fetching the item I need to apply some filters.
select * from emp where empid=1 and isadmin=true;
How can I write the above query in DynamoDB?
Below is the sample code I have tried:
GetItemSpec spec = new GetItemSpec()
.withPrimaryKey("empid", "1", "deptno", "12");
Item outcome = table.getItem(spec);
System.out.println("outcome " + outcome);
In the above code I want to apply one more filter, like "isadmin=true".
Please give me some suggestions on how to add filters in getItem(...).
Note:
I am able to resolve my use case by using table.query(spec), but this method returns a collection that I have to parse to get the first item every time, which should not be necessary in my case.
As you noticed, DynamoDB's GetItem operation does not accept a filter. It always returns a single item if it exists, so often there is no need for such filtering. But if you really do want such filtering - e.g., your items are large and you don't want to waste network bandwidth on sending them if they don't pass your filter - as you suggested yourself, you can use a Query.
Yes, Query's response format is a little different from GetItem, but any DynamoDB library in any language makes getting the first (and only) item in the response trivial. If you're worried about the possibility of getting more than the first item, you can always set Limit=1 but this isn't necessary if your key condition ensures just one item will match.
You should know that regardless of whether you do an unconditional GetItem and filter in the client or do a Query with a filter at the server side, in both cases the read cost will be the same as the entire item will be read from disk in both cases. The only - small - difference may be in the network bandwidth costs, and this only becomes relevant if the items are large.
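For illustration, a rough sketch of that Query approach with the document API follows; empid, deptno and isadmin are taken from the question and assumed to be the hash key, the range key and a boolean attribute, and table is an already initialized Table:
// QuerySpec and ValueMap come from com.amazonaws.services.dynamodbv2.document.spec
// and com.amazonaws.services.dynamodbv2.document.utils respectively.
QuerySpec spec = new QuerySpec()
        .withKeyConditionExpression("empid = :empid and deptno = :deptno")
        .withFilterExpression("isadmin = :isadmin")
        .withValueMap(new ValueMap()
                .withString(":empid", "1")
                .withString(":deptno", "12")
                .withBoolean(":isadmin", true))
        .withMaxResultSize(1); // mirrors Limit=1; not required if the key condition matches one item

ItemCollection<QueryOutcome> items = table.query(spec);
Iterator<Item> it = items.iterator();
Item employee = it.hasNext() ? it.next() : null; // the single matching item, or null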
You can't specify a filter in the GetItem API call in DynamoDB.
It just supports projection, for example:
GetItemSpec spec = new GetItemSpec()
.withPrimaryKey("Id", 206)
.withProjectionExpression("Id, Title, RelatedItems[0], Reviews.FiveStar")
.withConsistentRead(true);
However, there is a QuerySpec API, which can be used to get an item with a filter condition in DynamoDB.
Table table = dynamoDB.getTable("test");
QuerySpec spec = new QuerySpec()
    .withKeyConditionExpression("apartId = :v_id")
    .withFilterExpression("apartName = :apartName")
    .withValueMap(new ValueMap()
        .withString(":v_id", "123")
        .withString(":apartName", "somewhere")
    );
ItemCollection<QueryOutcome> items = table.query(spec);
Iterator<Item> iterator = items.iterator();
Item item = null;
while (iterator.hasNext()) {
    item = iterator.next();
    System.out.println(item.toJSONPretty());
}
Can anybody help me with how to find records from LDAP using Spring LDAP?
My problem is this: I have created a REST service and it accepts some parameters. One is offset and the other is limit. The offset parameter skips some records. For example, suppose my LDAP server has 500 records.
Now if I give an offset value of 1 and a limit of 100, it should give the first 100 records from LDAP.
If I give an offset value of 100 and a limit of 100, it should give the 100 records after the first 100 records.
If I give an offset value of 50 and a limit of 10, it should give the 10 records after the first 50 records.
I am stuck on how to set the offset value in the Spring LDAP template. I have set the limit value and it is working fine.
I am sharing a piece of code.
public OrganisationGroups getOrganisationGroup()
{
    SearchControls controls = new SearchControls();
    controls.setSearchScope(SearchControls.SUBTREE_SCOPE);
    controls.setCountLimit(MAXIMUM_SEARCH_GROUP_COUNT);
    AndFilter filter = new AndFilter();
    filter.and(new EqualsFilter("objectclass", "groupOfUniqueNames"));
    List<OrganisationGroup> organisationGroup = ldapTemplate.search("", filter.toString(), controls, new GroupSearchMapper());
    OrganisationGroups groups = new OrganisationGroups();
    groups.setOrganisationGroup(organisationGroup);
    groups.setCount(organisationGroup.size());
    return groups;
}
In this code I have set the MAXIMUM_SEARCH_GROUP_COUNT variable to cap the maximum number of records returned from LDAP, but I am not able to find a parameter or any other way to skip some records from the beginning.
Your best option is to use the Virtual List View request control (link to specification), also known as VLV. Note that to use VLV, you will need to make configuration changes on your LDAP server (I assume you are using OpenDJ, which supports VLV).
There is a code example for the JNDI LDAP provider with the VLV request control at this forum.
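For orientation, here is a rough sketch of such a JNDI search. It assumes the LDAP booster pack's com.sun.jndi.ldap.ctl.VirtualListViewControl (the constructor arguments used below, target offset, content count, before count, after count and criticality, are an assumption and should be checked against its javadoc), placeholder connection settings, and a server with VLV enabled. VLV also has to be paired with a server-side sort control in the same request:
import java.util.Hashtable;
import javax.naming.NamingEnumeration;
import javax.naming.directory.SearchControls;
import javax.naming.directory.SearchResult;
import javax.naming.ldap.Control;
import javax.naming.ldap.InitialLdapContext;
import javax.naming.ldap.LdapContext;
import javax.naming.ldap.SortControl;
import com.sun.jndi.ldap.ctl.VirtualListViewControl; // from ldapbp.jar (LDAP booster pack)

public class VlvSearchSketch {
    public static void main(String[] args) throws Exception {
        Hashtable<String, String> env = new Hashtable<>();
        env.put("java.naming.factory.initial", "com.sun.jndi.ldap.LdapCtxFactory");
        env.put("java.naming.provider.url", "ldap://localhost:389/dc=example,dc=com"); // placeholder

        LdapContext ctx = new InitialLdapContext(env, null);

        int offset = 50; // 1-based index of the first entry to return
        int limit = 10;  // number of entries to return

        // VLV must be combined with a server-side sort control; the VLV
        // constructor signature here is an assumption, see the lead-in above.
        ctx.setRequestControls(new Control[] {
                new SortControl("cn", Control.CRITICAL),
                new VirtualListViewControl(offset, 0, 0, limit - 1, Control.CRITICAL)
        });

        SearchControls sc = new SearchControls();
        sc.setSearchScope(SearchControls.SUBTREE_SCOPE);

        NamingEnumeration<SearchResult> results =
                ctx.search("", "(objectclass=groupOfUniqueNames)", sc);
        while (results.hasMore()) {
            System.out.println(results.next().getNameInNamespace());
        }
        ctx.close();
    }
}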
I am creating a view in code using the Couchbase Java API 2.1.2 like this:
DefaultView view = DefaultView.create(viewName, jsMapFunctionAsString);
List<View> views = new ArrayList<>();
views.add(view);
DesignDocument doc = DesignDocument.create(name, views);
bucket.bucketManager().insertDesignDocument(doc);
Calling ViewResult result = bucket.query(ViewQuery.from(name, viewName)) directly after inserting the design document, result.success() always returns true, but both rows() and iterator() return 0 rows (there are definitely results present; when I execute the view in the web interface, it returns the correct values).
The workaround I found after several hours is to call query twice, with enough waiting time in between, like:
ViewResult result = bucket.query(ViewQuery.from(name, viewName));
Thread.sleep(10000);
result = bucket.query(ViewQuery.from(name, viewName));
The second call will then return the correct result.
It seems as if Couchbase has to build the index for the query first but returns directly, even before the index has been built.
Waiting 10 seconds is of course not optimal; creating the index may take even more time in the future.
So my question is: how can I make sure that I only wait until the index has been built?
Is this a bug in the API?
You can use the stale method and set it to false on the ViewQuery; this will force the query to wait for indexing to finish before returning results.
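A minimal sketch of what that looks like with the code from the question (Stale is the enum in com.couchbase.client.java.view):
// Force the view engine to index all persisted documents before the query
// runs, so freshly written rows are visible in the result.
ViewResult result = bucket.query(
        ViewQuery.from(name, viewName).stale(Stale.FALSE));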
When using MyBatis, I need to get a very large result set from the DB and perform a sequential operation on it (such as a CSV export).
I am afraid that if the return type is List, all the returned data will be held in memory and will cause an OutOfMemoryError.
So, I want to get the result as a ResultSet or an Iterable<MyObject> using MyBatis.
Please tell me any solutions.
Starting from MyBatis 3.4.1 you can return a Cursor, which is Iterable and can be used like this (on condition that the result is ordered; see the Cursor API javadoc for details):
MyEntityMapper.java
@Select({
    "SELECT *",
    "FROM my_entity",
    "ORDER BY id"
})
Cursor<MyEntity> getEntities();
MapperClient.java
MyEntityMapper mapper = session.getMapper(MyEntityMapper.class);
try (Cursor<MyEntity> entities = mapper.getEntities()) {
    for (MyEntity entity : entities) {
        // process one entity
    }
}
You should use fetchSize (refer here). Based on the heap size and the data size per row, you can choose how many rows to fetch from the database at a time. Alternatively, since you are basically exporting data to CSV, you can use Spring Batch, which has a MyBatis paging item reader. The drawback of that item reader is that a request is fired to the database for each page, which increases the load on it. If you are not worried about the load you can go ahead with the paging item reader; otherwise there is another simple item reader called JdbcCursorItemReader.
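If upgrading to 3.4.1 for Cursor is not possible, one way to apply fetchSize without materializing a List is MyBatis's ResultHandler, which hands you rows one at a time; a rough sketch with placeholder statement and entity names:
// Statement in the mapper XML with a driver-level fetch size, so the JDBC
// driver streams rows instead of buffering the whole result set:
// <select id="exportAll" resultType="com.example.MyEntity" fetchSize="500">
//   SELECT * FROM my_entity ORDER BY id
// </select>
try (SqlSession session = sqlSessionFactory.openSession()) {
    session.select("com.example.MyEntityMapper.exportAll", new ResultHandler<MyEntity>() {
        @Override
        public void handleResult(ResultContext<? extends MyEntity> context) {
            MyEntity row = context.getResultObject();
            // write one CSV line for this row; nothing is accumulated in memory
        }
    });
}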
I am trying to create a JUnit test. Scenario:
setUp: I'm adding two JSON documents to the database
Test: I'm getting those documents using a view
tearDown: I'm removing both objects
My view:
function (doc, meta) {
    if (doc.type && doc.type == "UserConnection") {
        emit([doc.providerId, doc.providerUserId], doc.userId);
    }
}
This is how I add those documents to the database and make sure that "add" is synchronous:
public boolean add(String key, Object element) throws Exception {
    String json = gson.toJson(element);
    OperationFuture<Boolean> result = couchbaseClient.add(key, 0, json);
    return result.get();
}
JSON Documents that I'm adding are:
{"userId":"1","providerId":"test_pId","providerUserId":"test_pUId","type":"UserConnection"}
{"userId":"2","providerId":"test_pId","providerUserId":"test_pUId","type":"UserConnection"}
This is how I call the view:
View view = couchbaseClient.getView(DESIGN_DOCUMENT_NAME, VIEW_NAME);
Query query = new Query();
query.setKey(ComplexKey.of("test_pId", "test_pUId"));
ViewResponse viewResponse = couchbaseClient.query(view, query);
Problem:
The test fails due to an invalid number of elements being fetched from the view.
My observations:
Sometimes the tests pass
The number of elements fetched from the view is not consistent (from 0 to 2)
When I added those documents to the database beforehand, instead of in setUp, the test passed every time
According to this documentation, http://www.couchbase.com/docs/couchbase-sdk-java-1.1/create-update-docs.html, I am adding those JSON documents synchronously by calling get() on the returned Future object.
My question:
Is there something wrong with how I've approached fetching data from the view right after this data was inserted into the DB? Is there any good practice for solving this problem? And can someone please explain to me what I did wrong?
Thanks,
Dariusz
In Couchbase 2.0, documents are required to be written to disk before they will show up in a view. There are three ways you can do an operation with the Java SDK. The first is asynchronous, which means that you just send the data and at a later time check to make sure that the data was received correctly. The second is synchronous: if you do an asynchronous operation and then immediately call .get(), as you did above, you have created a synchronous operation. When an operation returns success in these two cases, you are only guaranteed that the item has been written into memory. Your test passed sometimes only because you were lucky enough that both items were written to disk before you did your query.
The third way to do an operation is with durability requirements, and this is the one you want for your tests. Durability requirements allow you to say that you want an item to be written to disk or replicated before success is returned to the client. Take a look at the following function:
https://github.com/couchbase/couchbase-java-client/blob/1.1.0/src/main/java/com/couchbase/client/CouchbaseClient.java#L1293
You will want to use this function and set the PersistTo parameter to MASTER.
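Applied to the add method from the question, that means roughly the following sketch; it assumes the durability overload of add shown at the link above, and PersistTo is the durability enum that ships with the 1.x SDK (its package may differ between client versions):
public boolean add(String key, Object element) throws Exception {
    String json = gson.toJson(element);
    // Report success only once the item has been persisted to disk on the
    // master node, so a view query run afterwards can pick it up once the
    // index has been updated.
    OperationFuture<Boolean> result = couchbaseClient.add(key, 0, json, PersistTo.MASTER);
    return result.get();
}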