I set up syncing from a local Couchbase Server to my Android and iOS applications, and it works fine in both directions (mobile to server and server to mobile). Then I tried to insert a document from a Java web application into the local server, and that succeeded. The problem is that the document inserted by the Java web application does not sync to either the iOS or the Android application. My Java code to insert a document into the local server is as follows:
public class CouchBase {

    public static void main(String[] args) {
        Cluster cluster = CouchbaseCluster.create("127.0.0.1");
        Bucket bucket = cluster.openBucket("test");

        JsonObject user = JsonObject.empty()
            .put("name", "amol")
            .put("city", "mumbai");

        JsonDocument doc = JsonDocument.create("102", user);
        bucket.insert(doc);

        System.out.println(doc.content().getString("name"));
    }
}
In this code I open a bucket, create a JSON object holding the required values, wrap that object in a JSON document, and finally insert the document into the bucket.
Now, my mobile-side code to create a document:
Document document = database.getDocument(etId.getText().toString());

Map<String, Object> map = new HashMap<String, Object>();
map.put("name", etName.getText().toString());
map.put("city", etCity.getText().toString());

try {
    document.putProperties(map);
} catch (CouchbaseLiteException e) {
    Log.e(TAG, "Error putting", e);
}
In this code I simply create a document and put the values into it.
My syncing code is as follows:
Replication pullReplication = database.createPullReplication(syncUrl);
Replication pushReplication = database.createPushReplication(syncUrl);
pullReplication.setContinuous(true);
pushReplication.setContinuous(true);
pullReplication.start();
pushReplication.start();
Here I set up bi-directional syncing.
I cannot figure out where I am going wrong in the Java code. Please help me solve this problem.
Sync Gateway does not track documents inserted through the Couchbase Server Java SDK. It is also not advisable to insert data directly into the Sync Gateway bucket through the Java SDK; you can use bucket shadowing for that.
If you want to insert data through your web application, you can make use of the Sync Gateway REST API: http://developer.couchbase.com/documentation/mobile/1.1.0/develop/references/sync-gateway/rest-api/index.html
At the time of this writing, it's not possible to use the Server SDKs on the bucket used by Sync Gateway. That's because when a new document revision is saved in a Sync Gateway database, it goes through the Sync Function to route the document to channels and to grant users and roles access to channels. Some of that metadata is persisted under the _sync property of the document in Couchbase Server. The Server SDKs are not currently aware of the revision-based system, so they will update fields on the document without creating a new revision.
The recommended way to read and write Sync Gateway data from a Java web app is to use the Sync Gateway REST API.
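For illustration, here is a minimal sketch of how a Java web app could create a document through the Sync Gateway REST API using only the JDK's HttpURLConnection. It assumes Sync Gateway is listening on its default public port 4984 and that the database is named test; host, port, database name, and the document body are placeholders to adjust for your setup.

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class SyncGatewayRestInsert {

    public static void main(String[] args) throws Exception {
        // PUT /{db}/{docid} on the public REST port creates (or updates) a document.
        URL url = new URL("http://127.0.0.1:4984/test/102");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("PUT");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);

        String body = "{\"name\":\"amol\",\"city\":\"mumbai\"}";
        try (OutputStream os = conn.getOutputStream()) {
            os.write(body.getBytes(StandardCharsets.UTF_8));
        }

        // A 201 response indicates the revision was created.
        System.out.println("Response code: " + conn.getResponseCode());
    }
}

Because the write goes through Sync Gateway, the document gets its _sync metadata and a revision, so it will be picked up by the pull replications on the mobile side.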
I have some Dataflow pipelines written in Java that run on GCP in different environments/projects (development, UAT, production). Currently, the environment configuration (mainly connection parameters for Cloud SQL instances and BigQuery datasets) is managed using a static map in a Java class (key = env, value = map of properties) and a utility class that dynamically loads additional files from Cloud Storage.
What are the best practices (if any) for managing the configuration in such a context?
Essentially, I see two kinds of configuration parameters:
plain values (something that in a Spring application you'd store in a plain property file)
secret values (property files containing data that must be encrypted - username/password for a database, API keys - something that in a K8S context can be mounted as Secret)
Thanks.
I think you will find this tutorial helpful on how to access Secret Manager from a Dataflow pipeline:
"As of today, Dataflow does not provide native support for storing and
accessing secrets. To secure those secrets, the common approach is to
use Cloud KMS to encrypt the secret and decrypt it when running the
data pipeline. With the newly launched Secret Manager, we can now
store those secrets in Secret Manager and access them from our
pipeline to provide better security and ease of use."
The following code uses the Secret Manager SDK to access the secret, given a JDBC URL secret name.
private static String jdbcUrlTranslator(String jdbcUrlSecretName) {
    try (SecretManagerServiceClient client = SecretManagerServiceClient.create()) {
        AccessSecretVersionResponse response = client.accessSecretVersion(jdbcUrlSecretName);
        return response.getPayload().getData().toStringUtf8();
    } catch (IOException e) {
        throw new RuntimeException("Unable to read JDBC URL secret");
    }
}
public static void main(String[] args) {
    PipelineOptionsFactory.register(MainPipelineOptions.class);
    MainPipelineOptions options =
        PipelineOptionsFactory.fromArgs(args)
            .withValidation()
            .as(MainPipelineOptions.class);

    NestedValueProvider<String, String> jdbcUrlValueProvider =
        NestedValueProvider.of(
            options.getJdbcUrlSecretName(), MainPipeline::jdbcUrlTranslator);

    Pipeline pipeline = Pipeline.create(options);

    pipeline
        .apply("SQL Server - Read Sales.Customers_Archive",
            JdbcIO.<KV<Integer, String>>read()
                .withDataSourceConfiguration(
                    JdbcIO.DataSourceConfiguration.create(
                        StaticValueProvider.of("com.microsoft.sqlserver.jdbc.SQLServerDriver"),
                        jdbcUrlValueProvider)));
                // query, row mapper, and coder configuration omitted

    // Other transforms
    pipeline.run();
}
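The snippet above references a MainPipelineOptions interface that is not shown; a minimal sketch of what it might look like (the property name and description are assumptions based on the getJdbcUrlSecretName() call) is:

import org.apache.beam.sdk.options.Description;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.ValueProvider;

public interface MainPipelineOptions extends PipelineOptions {

    @Description("Fully qualified Secret Manager name of the secret holding the JDBC URL")
    ValueProvider<String> getJdbcUrlSecretName();

    void setJdbcUrlSecretName(ValueProvider<String> value);
}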
One way to handle secret values inside Google Cloud Platform is to use Secret Manager, which handles encryption and access control for the stored passwords.
Inside your Java code you can use the Google Cloud Secret Manager Maven module to get the stored secret values.
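For example, with the google-cloud-secretmanager Maven artifact on the classpath, a minimal sketch of reading a secret could look like the following (the project id, secret id, and use of the "latest" version are illustrative assumptions):

import com.google.cloud.secretmanager.v1.AccessSecretVersionResponse;
import com.google.cloud.secretmanager.v1.SecretManagerServiceClient;
import com.google.cloud.secretmanager.v1.SecretVersionName;

public class SecretManagerExample {

    public static String readSecret(String projectId, String secretId) throws Exception {
        // "latest" resolves to the newest enabled version of the secret.
        try (SecretManagerServiceClient client = SecretManagerServiceClient.create()) {
            SecretVersionName name = SecretVersionName.of(projectId, secretId, "latest");
            AccessSecretVersionResponse response = client.accessSecretVersion(name);
            return response.getPayload().getData().toStringUtf8();
        }
    }
}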
I am using the Couchbase Java SDK to query a Couchbase Analytics service. The process is covered in this tutorial: https://docs.couchbase.com/java-sdk/2.7/analytics-using-sdk.html
The Java SDK provides a Bucket object as the means of accessing Couchbase. However, a bucket is a separate entity from the analytics data-set. For example, my bucket is called data, and I have an analytics data-set that I want to query called requests.
I cannot find a means of connecting to just the requests data-set. The SDK will only connect to the data bucket. From there I can query the requests data-set by writing some N1QL. This work-around means that the user credentials I'm using to run analytics queries must also have access to my main production data bucket, which I'd rather prevent.
Is there a way to connect to just the analytics data-set using the SDK?
The code I currently have creating the connection looks like this:
public class CouchbaseConfig {

    @Bean
    public Bucket bucket(CouchbaseProperties properties) {
        return cluster().openBucket("data"); // Changing this to the data-set name returns error
    }

    private Cluster cluster() {
        Cluster cluster = CouchbaseCluster.create("localhost");
        cluster.authenticate("Administrator", "password");
        return cluster;
    }
}
Using the requests data-set name as the bucket name results in this error:
Failed to instantiate [com.couchbase.client.java.Bucket]: Factory method 'bucket' threw exception; nested exception is com.couchbase.client.java.error.BucketDoesNotExistException: Bucket "requests" does not exist.
Using the data bucket name, but with the username/password "analytics-reader"/"password" (which has only the Analytics Reader role), results in this error:
Could not load bucket configuration: FAILURE({"message":"Forbidden. User needs one of the following permissions","permissions":["cluster.bucket[data].settings!read"]})
The only workaround I have found is to give the analytics-reader user 'Application Access' to the data bucket 😢
Connecting directly to Analytics is possible with SDK 3 and Couchbase 6.5. In all previous versions (SDK 2.7 included), the only way to query Analytics is to connect to a bucket first.
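For illustration, a minimal sketch with SDK 3 against Couchbase 6.5, where the analytics query is issued at the cluster level without opening the data bucket (the credentials are placeholders; the requests dataset name follows the question):

import com.couchbase.client.java.Cluster;
import com.couchbase.client.java.analytics.AnalyticsResult;
import com.couchbase.client.java.json.JsonObject;

public class AnalyticsQueryExample {

    public static void main(String[] args) {
        // The user only needs analytics permissions; no bucket is opened.
        Cluster cluster = Cluster.connect("localhost", "analytics-reader", "password");

        AnalyticsResult result = cluster.analyticsQuery("SELECT * FROM requests LIMIT 10");
        for (JsonObject row : result.rowsAsObject()) {
            System.out.println(row);
        }

        cluster.disconnect();
    }
}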
What is the best and correct way to list Azure Database for PostgreSQL servers present in my Resource Group using Azure Java SDK?
Currently, we have deployments that happen using ARM templates and once the resources have been deployed we want to be able to get the information about those resources from Azure itself.
I have tried doing it in the following way:
PagedList<SqlServer> azureSqlServers = azure1.sqlServers().listByResourceGroup("resourceGrpName");
//PagedList<SqlServer> azureSqlServers = azure1.sqlServers().list();

for (SqlServer azureSqlServer : azureSqlServers) {
    System.out.println(azureSqlServer.fullyQualifiedDomainName());
}

System.out.println(azureSqlServers.size());
But the list size returned is 0.
However, for virtual machines, I am able to get the information in the following way:
PagedList<VirtualMachine> vms = azure1.virtualMachines().listByResourceGroup("resourceGrpName");

for (VirtualMachine vm : vms) {
    System.out.println(vm.name());
    System.out.println(vm.powerState());
    System.out.println(vm.size());
    System.out.println(vm.tags());
}
So, what is the right way of getting information about Azure Database for PostgreSQL using the Azure Java SDK?
P.S.
Once I get the information regarding Azure Database for PostgreSQL, I would need similar information about the Azure Database for MySQL Servers.
Edit: I have seen this question, which was asked two years ago, and would like to know whether Azure has added support for Azure Database for PostgreSQL/MySQL servers since then:
Azure Java SDK for MySQL/PostgreSQL databases?
So, I implemented it in the following way; it can be treated as an alternative approach...
Looking at the Azure SDK for Java repo on GitHub (https://github.com/Azure/azure-sdk-for-java/tree/master/sdk/postgresql), it looks like they have it in beta, so I searched for the pom on mvnrepository. I imported the following dependency in my project (azure-mgmt-postgresql is still in beta):
<!-- https://mvnrepository.com/artifact/com.microsoft.azure.postgresql.v2017_12_01/azure-mgmt-postgresql -->
<dependency>
    <groupId>com.microsoft.azure.postgresql.v2017_12_01</groupId>
    <artifactId>azure-mgmt-postgresql</artifactId>
    <version>1.0.0-beta-5</version>
</dependency>
In code, the following is the gist of how I did it:
I already have a service principal created and have its information with me.
Anyone trying this will need the clientId, tenantId, clientSecret, and subscriptionId, the way @Jim Xu explained.
// create the credentials object
ApplicationTokenCredentials credentials =
    new ApplicationTokenCredentials(clientId, tenantId, clientSecret, AzureEnvironment.AZURE);

// build a rest client object configured with the credentials created above
RestClient restClient = new RestClient.Builder()
    .withBaseUrl(credentials.environment(), AzureEnvironment.Endpoint.RESOURCE_MANAGER)
    .withCredentials(credentials)
    .withSerializerAdapter(new AzureJacksonAdapter())
    .withResponseBuilderFactory(new AzureResponseBuilder.Factory())
    .withInterceptor(new ProviderRegistrationInterceptor(credentials))
    .withInterceptor(new ResourceManagerThrottlingInterceptor())
    .build();

// use the PostgreSQLManager
PostgreSQLManager psqlManager = PostgreSQLManager.authenticate(restClient, subscriptionId);

PagedList<Server> azurePsqlServers = psqlManager.servers().listByResourceGroup(resourceGrpName);
for (Server azurePsqlServer : azurePsqlServers) {
    System.out.println(azurePsqlServer.fullyQualifiedDomainName());
    System.out.println(azurePsqlServer.userVisibleState().toString());
    System.out.println(azurePsqlServer.sku().name());
}
Note: the Server class here refers to com.microsoft.azure.management.postgresql.v2017_12_01.Server.
Also, if you take a look at the Azure class, you will notice this is how they do it internally.
For reference, you can look at how SqlServerManager is used in the Azure class and how an authenticated manager is created, in case you want to use services that are still in preview or beta.
According to my test, we can use the Java SDK azure-mgmt-resources to implement this. For example:
Create a service principal
az login
# this creates a service principal and assigns the Contributor role to it
az ad sp create-for-rbac -n "MyApp" --scope "/subscriptions/<subscription id>" --sdk-auth
Code:
String tenantId = "<the tenantId you copied>";
String clientId = "<the clientId you copied>";
String clientSecret = "<the clientSecret you copied>";
String subscriptionId = "<the subscription id you copied>";

ApplicationTokenCredentials creds =
    new ApplicationTokenCredentials(clientId, tenantId, clientSecret, AzureEnvironment.AZURE);

RestClient restClient = new RestClient.Builder()
    .withBaseUrl(AzureEnvironment.AZURE, AzureEnvironment.Endpoint.RESOURCE_MANAGER)
    .withSerializerAdapter(new AzureJacksonAdapter())
    .withReadTimeout(150, TimeUnit.SECONDS)
    .withLogLevel(LogLevel.BODY)
    .withResponseBuilderFactory(new AzureResponseBuilder.Factory())
    .withCredentials(creds)
    .build();

ResourceManager resourceClient = ResourceManager.authenticate(restClient).withSubscription(subscriptionId);
ResourceManagementClientImpl client = resourceClient.inner();

String filter = "resourceType eq 'Microsoft.DBforPostgreSQL/servers'"; // the filter to apply on the operation
String expand = null; // the $expand query parameter, e.g. "changedTime,createdTime" to expand both properties
Integer top = null;   // the number of results to return; if null is passed, all matching resources are returned

PagedList<GenericResourceInner> results = client.resources().list(filter, expand, top);
while (true) {
    for (GenericResourceInner resource : results.currentPage().items()) {
        System.out.println(resource.id());
        System.out.println(resource.name());
        System.out.println(resource.type());
        System.out.println(resource.location());
        System.out.println(resource.sku().name());
        System.out.println("------------------------------");
    }
    if (results.hasNextPage()) {
        results.loadNextPage();
    } else {
        break;
    }
}
Besides, you can also use the Azure REST API to implement this. For more details, please refer to https://learn.microsoft.com/en-us/rest/api/resources/resources
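For illustration, a rough sketch of calling the resource-listing REST endpoint for PostgreSQL servers from Java with plain HttpURLConnection; the subscription id, resource group, and the way you obtain the bearer token (for example from the ApplicationTokenCredentials above) are assumptions to adapt:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class ListPostgresServersRest {

    public static String listServers(String subscriptionId, String resourceGroup, String accessToken) throws Exception {
        String endpoint = "https://management.azure.com/subscriptions/" + subscriptionId
                + "/resourceGroups/" + resourceGroup
                + "/providers/Microsoft.DBforPostgreSQL/servers?api-version=2017-12-01";

        HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setRequestMethod("GET");
        conn.setRequestProperty("Authorization", "Bearer " + accessToken);

        // Read the JSON response listing the servers in the resource group.
        StringBuilder body = new StringBuilder();
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                body.append(line);
            }
        }
        return body.toString();
    }
}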
I am following https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-bulk-identity-mgmt to do a bulk upload of device identities in Azure IoT Hub. All the code given there is in C#, so I am converting it to the Java equivalent.
Using the "Import devices example – bulk device provisioning", I am getting the following JSON:
{"id":"d3d78b0d-6c8c-4ef5-a321-91fbb6a4b7d1","importMode":"create","status":"enabled","authentication":{"symmetricKey":{"primaryKey":"f8/UZcYbhPxnNdbSl2J+0Q==","secondaryKey":"lbq4Y4Z8qWmfUxAQjRsDjw=="}}}
{"id":"70bbe407-8d65-4f57-936f-ef402aa66d07","importMode":"create","status":"enabled","authentication":{"symmetricKey":{"primaryKey":"9e7fDNIFbMu/NmOfxo/vGg==","secondaryKey":"nwFiKR4HV9KYHzkeyu8nLA=="}}}
To import the file from blob storage, the following function is called:
CompletableFuture<JobProperties> importJob =
    registryManager.importDevicesAsync(inURI, outURI);
In the importDevicesAsync call above, we need to provide URIs with SAS tokens. The Java equivalent of "Get the container SAS URI" is below:
static String GetContainerSasUri(CloudBlobContainer container) {
    SharedAccessBlobPolicy sasConstraints = new SharedAccessBlobPolicy();
    sasConstraints.setSharedAccessExpiryTime(new Date(new Date().getTime() + TimeUnit.DAYS.toMillis(1)));
    sasConstraints.setPermissions(EnumSet.of(SharedAccessBlobPermissions.READ, SharedAccessBlobPermissions.WRITE,
            SharedAccessBlobPermissions.LIST, SharedAccessBlobPermissions.DELETE));

    BlobContainerPermissions permissions = new BlobContainerPermissions();
    permissions.setPublicAccess(BlobContainerPublicAccessType.CONTAINER);
    permissions.getSharedAccessPolicies().put("testpolicy", sasConstraints);

    try {
        container.uploadPermissions(permissions);
    } catch (StorageException e1) {
        e1.printStackTrace();
    }

    String sasContainerToken = null;
    try {
        sasContainerToken = container.generateSharedAccessSignature(sasConstraints, "testpolicy");
    } catch (InvalidKeyException e) {
        e.printStackTrace();
    } catch (StorageException e) {
        e.printStackTrace();
    }

    System.out.println("URI " + container.getUri() + "?" + sasContainerToken);
    return container.getUri() + "?" + sasContainerToken;
}
Now the problem comes here: for the output container I am getting the following error:
java.util.concurrent.ExecutionException: com.microsoft.azure.sdk.iot.service.exceptions.IotHubBadFormatException: Bad message format! ErrorCode:BlobContainerValidationError;Unauthorized to write to output blob container. Tracking ID:2dcb2efbf1e14e33ba60dc8415dc03c3-G:4-TimeStamp:11/08/2017 16:16:10
Please help me understand why I am getting this bad message format error. Is there a problem with the SAS-generating code, or does my blob container not have write permission?
Are you using a service SAS or an account-level SAS? The error thrown suggests the service isn't authorized or doesn't have the delegated permissions to write to the designated blob container. Check out this resource on how to set up an account-level SAS and how to delegate read, write, and delete operations on blob containers: https://learn.microsoft.com/en-us/rest/api/storageservices/Delegating-Access-with-a-Shared-Access-Signature?redirectedfrom=MSDN
Snipped content: "An account-level SAS, introduced with version 2015-04-05. The account SAS delegates access to resources in one or more of the storage services. All of the operations available via a service SAS are also available via an account SAS. Additionally, with the account SAS, you can delegate access to operations that apply to a given service, such as Get/Set Service Properties and Get Service Stats. You can also delegate access to read, write, and delete operations on blob containers, tables, queues, and file shares that are not permitted with a service SAS. See Constructing an Account SAS for more information about account SAS."
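As a rough sketch only, assuming the same legacy azure-storage Java SDK used in the question, an account-level SAS scoped to blob containers and objects with read/write/list/delete permissions might be generated along these lines (the storage account and expiry are placeholders):

import java.util.Date;
import java.util.EnumSet;
import java.util.concurrent.TimeUnit;

import com.microsoft.azure.storage.CloudStorageAccount;
import com.microsoft.azure.storage.SharedAccessAccountPermissions;
import com.microsoft.azure.storage.SharedAccessAccountPolicy;
import com.microsoft.azure.storage.SharedAccessAccountResourceType;
import com.microsoft.azure.storage.SharedAccessAccountService;

public class AccountSasExample {

    static String getContainerSasUri(CloudStorageAccount account, String containerUri) throws Exception {
        SharedAccessAccountPolicy policy = new SharedAccessAccountPolicy();
        policy.setPermissions(EnumSet.of(
                SharedAccessAccountPermissions.READ,
                SharedAccessAccountPermissions.WRITE,
                SharedAccessAccountPermissions.LIST,
                SharedAccessAccountPermissions.DELETE));
        policy.setServices(EnumSet.of(SharedAccessAccountService.BLOB));
        policy.setResourceTypes(EnumSet.of(
                SharedAccessAccountResourceType.CONTAINER,
                SharedAccessAccountResourceType.OBJECT));
        policy.setSharedAccessExpiryTime(new Date(new Date().getTime() + TimeUnit.DAYS.toMillis(1)));

        // The account SAS token is appended to the container URI handed to the IoT Hub import job.
        String sasToken = account.generateSharedAccessSignature(policy);
        return containerUri + "?" + sasToken;
    }
}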
I was facing the same issue while using a private storage account as the import/output container.
It started working smoothly after I switched to a public storage account.
Anyway, it should work even with a private storage account, so I have raised an issue. For more info, you may refer to this link.
I am developing an Android app which takes the user's current location and displays a list of restaurants close to that location. The restaurant data is available to me (i.e., I have the lat/long of each restaurant I want to display in the search results). I can't use the Google Places API, because I need to show only those restaurants that are available in our database (on our website). My question is: how do I access my database (or even a URL), which is on a computer, to extract the restaurant data and display it as search results in my Android app?
I am actually making a Seamless ( http://bit.ly/Jp7pUN ) type application for my company.
I am a complete newbie to Android app development, so pardon me if this is a very broad or stupid question. Please just tell me what topics I need to study to implement this; I will study and do it myself.
Thanks.
You will need:
a Sqlite database to store the restaurants and their longitude/latitude
a MapView to display the map (Don't forget to register your Google Maps API key)
a map overlay to show the markers on the map
GPS access to get the user's location (needs the appropriate Android permission)
a simple search algorithm that retrieves a result set of restaurants within x distance of the user's location
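For the last point, here is a minimal sketch of such a distance filter in plain Java, using the Haversine formula; the Restaurant class and the in-memory list are placeholders for your own data access code:

import java.util.ArrayList;
import java.util.List;

public final class NearbySearch {

    static class Restaurant {
        String name;
        double lat;
        double lon;
    }

    // Haversine formula: great-circle distance between two points, in kilometers.
    static double distanceKm(double lat1, double lon1, double lat2, double lon2) {
        double earthRadiusKm = 6371.0;
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * earthRadiusKm * Math.asin(Math.sqrt(a));
    }

    // Returns all restaurants within maxKm of the user's location.
    static List<Restaurant> within(List<Restaurant> all, double userLat, double userLon, double maxKm) {
        List<Restaurant> result = new ArrayList<>();
        for (Restaurant r : all) {
            if (distanceKm(userLat, userLon, r.lat, r.lon) <= maxKm) {
                result.add(r);
            }
        }
        return result;
    }
}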
EDIT
If your database is stored on a server, you will need a way to query the server, preferably using an HTTP-based protocol such as REST. It is useful (but not required) to cache the restaurant locations on the Android device (using SQLite) in case the user is offline. (The good news: since you can use Java both on Android and on the server, you will only need to write 90% of your data access layer once.)
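A minimal sketch of such a local cache on Android, assuming a simple table holding each restaurant's name and coordinates (the database, table, and column names are illustrative):

import android.content.Context;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;

public class RestaurantCacheHelper extends SQLiteOpenHelper {

    public RestaurantCacheHelper(Context context) {
        super(context, "restaurants.db", null, 1);
    }

    @Override
    public void onCreate(SQLiteDatabase db) {
        db.execSQL("CREATE TABLE restaurant ("
                + "_id INTEGER PRIMARY KEY AUTOINCREMENT, "
                + "name TEXT NOT NULL, "
                + "lat REAL NOT NULL, "
                + "lon REAL NOT NULL)");
    }

    @Override
    public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
        // For a simple cache, dropping and recreating on upgrade is acceptable.
        db.execSQL("DROP TABLE IF EXISTS restaurant");
        onCreate(db);
    }
}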
For the data transfer from server to the Android client, JSON is a popular format.
To access a database on your computer (not SQLite on Android) you should use the URL of your database server, changing localhost to 10.0.2.2. But if your database will be on the Internet, you should create some kind of REST API to get the data you need, and then use HttpClient to fetch the data from the server.
Everything that you need is in the Developer Guide: MapView
And for retrieving the current location I advise using MyLocationOverlay.
For example (URL of the server):
//public static final String SERVER_ADDRESS = "http://10.0.2.2:3000"; // for localhost server
public static final String SERVER_ADDRESS = "http://railsserver.herokuapp.com"; //for remote server
Accessing data on your server depends on how (and with what technology) you implement your server (REST API? Web service? Plain HTML?) and what the format of the server's response will be (JSON? XML? etc.).
I suggest using JSON because it is easy to parse with the classes included in the Android SDK:
// execute(...) is assumed to be a helper that runs the request with HttpClient
// and returns the response body as a String.
String json = execute(new HttpGet(Constants.SERVER_URL + "/fetchData"));
JSONObject responseJSON = new JSONObject(json);

if (responseJSON.has("auth_error")) {
    throw new IOException("fetchData_error");
}