Download Stripe invoice through Java with defined language

I'm working on a Spring Boot application that integrates Stripe for payment management.
When a user makes a payment, an invoice is generated through Stripe.
The application downloads this invoice from Stripe and copies it to a cloud provider using the S3 API.
All of this works correctly.
My concern is that application users can choose different languages.
When downloading a Stripe invoice through the Stripe dashboard, the invoice is automatically generated in the language defined in the web browser.
I would like to be able to "set" the language when downloading the invoice through the API, depending on the user's settings.
Here is what my current code looks like:
public void copyInvoice(Etude etude, String invoiceName, URL invoiceURL, String bucketName) {
    var invoiceTmp = new File(System.getProperty("java.io.tmpdir"), invoiceName);
    try {
        var defaultLocale = Locale.getDefault();
        log.debug("copyInvoice - defaultLocale : {}", defaultLocale);
        // Download the invoice PDF from Stripe to a temporary file
        FileUtils.copyURLToFile(invoiceURL, invoiceTmp);
        var s3 = this.getCredentials();
        // Upload the temporary file to the bucket under the "invoices/" prefix
        s3.putObject(new PutObjectRequest(bucketName, "invoices/" + invoiceName, invoiceTmp));
        invoiceTmp.delete();
    } catch (IOException e) {
        log.error("copyInvoice, IOException when copying invoice from Stripe", e);
    }
}
When I try it on my side, the invoice downloaded by this code is always in English, even though the linked Customer in Stripe is set to French, the invoice account country is FR, and my default JVM locale is fr_FR.
Thanks in advance for any suggestions and advice!

Unfortunately it's not possible to define a language when getting the PDF from the Invoice. The language is determined by the browser locale and cannot be set via the API.


Spring doesn't track changes to files stored in the "./resources/" folder

I'm new to Spring Boot, so I'm not sure how to store and manipulate files (i.e. use persistence within Spring). Use case: store a list of films (title, director, ...) in a JSON file kept on the API server, with persistence, instead of using a DB.
I have a favorites.json at src/main/resources. This file is updated when a request arrives, as I said. Code here: GitHub Repo
A kind person has left in the comments what is probably the problem: changing files in the classpath won't work. I'm still struggling with how to store data in JSON without a database.
Problem I'm facing:
Files are updated correctly on a POST request via an OutputStream, but it seems like favorites.json is treated as a static resource, so any update is ignored until the API starts again (I have tried restarting the API when the file is updated, see this, but it doesn't change anything; it still needs to be stopped and started manually. A bash script may help, but I would prefer another solution if possible).
Maybe I'm looking for a file-based repository, or a specific project path where Spring detects updates to this file.
I think I'm missing some important concepts of Spring's behaviour.
Here is the POST resource:
@CrossOrigin(origins = "http://localhost:3000")
@PostMapping(path = TaskLinks.FAVORITES, consumes = "application/json", produces = "application/json")
@ResponseBody
public String updateFavs(@RequestBody List<Show> newFavorites) {
    showService.updateFavorites(newFavorites);
    return "All right";
}
Methods that modify the file:
public boolean updateFavorites(List<Show> newFavorites) {
    if (newFavorites == null)
        return false;
    setNewFavorites(newFavorites);
    return true;
}

private void setNewFavorites(List<Show> newFavorites) {
    Gson gson = new Gson();
    // try-with-resources closes (and flushes) the writer even if an exception is thrown
    try (FileWriter fileW = new FileWriter(FAVORITES_PATH)) {
        String strNewFavs = gson.toJson(newFavorites);
        fileW.write(strNewFavs);
    } catch (JsonIOException | IOException e) {
        e.printStackTrace();
    }
}
If someone needs file persistence with Spring Boot, I'll leave here what I've found.
The only solution I found for file persistence in a Spring Boot API was to hard-reload the whole API, which I don't think is clean.
So I ended up storing the JSON data in MySQL.
Maybe Spring has specific tools that I've overlooked, but I don't have time to check right now.
The closest approach I got was writing to the system temporary directory, which is updated correctly because it lives outside the application.
I didn't manage to access files outside the application other than temporary ones.
Now I'm working with Node.js/Express and implemented a PNG delivery API. I don't really know how I would have done it with Spring, but there's probably a file-focused database or something that may work fine with Spring. If I have to face this situation again, I will post the solution I find most favourable. At the moment Express works fine.
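For anyone landing here with the same problem, the usual way around the classpath limitation is to write the JSON file to a directory outside the packaged application and point the app at it through configuration. Below is a minimal, untested sketch; the app.data-dir property and the FavoritesRepository class are made-up names for illustration, and the Show type is assumed from the question:
// Hypothetical sketch: keep favourites outside the classpath so updates survive
// without redeploying the application.
import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Repository;

import java.io.IOException;
import java.io.Reader;
import java.io.Writer;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Collections;
import java.util.List;

@Repository
public class FavoritesRepository {

    private final Path favoritesFile;
    private final Gson gson = new Gson();

    // e.g. app.data-dir=/var/lib/myapp in application.properties (assumed property name)
    public FavoritesRepository(@Value("${app.data-dir}") String dataDir) throws IOException {
        Path dir = Path.of(dataDir);
        Files.createDirectories(dir);            // make sure the external directory exists
        this.favoritesFile = dir.resolve("favorites.json");
    }

    public void save(List<Show> favorites) throws IOException {
        try (Writer w = Files.newBufferedWriter(favoritesFile)) {
            gson.toJson(favorites, w);           // overwrite the file with the new list
        }
    }

    public List<Show> load() throws IOException {
        if (!Files.exists(favoritesFile)) {
            return Collections.emptyList();
        }
        try (Reader r = Files.newBufferedReader(favoritesFile)) {
            return gson.fromJson(r, new TypeToken<List<Show>>() {}.getType());
        }
    }
}
Because the file is not part of the jar or the static resources, changes made at runtime are visible on the next read without restarting the API.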

Firestore; add user email identifier without creating a collection

I'm planning to create a web page (let's say for an admin) to add a user identifier only, which uses an email, and create a temporary password for the user, without creating a collection.
I'm considering using either Python or Java, but I couldn't find any answer on whether this can be done with Cloud Firestore. Does anyone have any idea?
You can create users for your Firebase app without adding them to a collection.
The Firestore Admin SDK supports Node.js, Java, Python, Go, C#, and .NET, so it really depends on what language you wish to use for your web server. You could choose either of the two you mention, or not use an Admin SDK at all, since you can also use Firestore from client-side JavaScript. It really depends on your requirements.
Here is more information on how to set up your environment for whichever option you need.
To create a user with an email and a password you can use this code:
JavaScript
firebase.auth().createUserWithEmailAndPassword(email, password)
  .then((userCredential) => {
    // Signed in
    var user = userCredential.user;
    // ...
  })
  .catch((error) => {
    var errorCode = error.code;
    var errorMessage = error.message;
    // ..
  });
Python
user = auth.create_user(
    uid='some-uid', email='user@example.com', phone_number='+15555550100')
print('Successfully created new user: {0}'.format(user.uid))
Java
CreateRequest request = new CreateRequest()
    .setUid("some-uid")
    .setEmail("user@example.com")
    .setPhoneNumber("+11234567890");
UserRecord userRecord = FirebaseAuth.getInstance().createUser(request);
System.out.println("Successfully created new user: " + userRecord.getUid());
More information on creating users with an email address can be found here.

Tensorflow serving get all active models metadata in Java maven

I'm looking for a way to get the model metadata of all currently active models on TensorFlow Serving, from Java (Maven).
I have some working code for retrieving the metadata of a specific model and version, so if it were possible to get a list of all model names and versions through gRPC (or an API), that would be great. Working code using tensorflow-client (com.yesup.oss):
static ManagedChannel channel = ManagedChannelBuilder.forAddress(TF_SERVICE_HOST, TF_SERVICE_PORT)
        .usePlaintext(true).build();
static PredictionServiceGrpc.PredictionServiceBlockingStub stub = PredictionServiceGrpc.newBlockingStub(channel);

public static void getMetadata(String model, Integer version) {
    System.out.println("Create request");
    GetModelMetadataRequest request = GetModelMetadataRequest.newBuilder()
            .setModelSpec(ModelSpec.newBuilder()
                    .setName(model)
                    .setSignatureName("serving_default")
                    .setVersion(Int64Value.newBuilder().setValue(version)))
            .addMetadataField("signature_def")
            .build();
    System.out.println("Collecting metadata...");
    GetModelMetadataResponse response = stub.getModelMetadata(request);
    System.out.println("Done");
    try {
        SignatureDefMap sdef = SignatureDefMap.parseFrom(
                response.getMetadataMap().get("signature_def").getValue());
        System.out.println(sdef);
    } catch (InvalidProtocolBufferException e1) {
        e1.printStackTrace();
    }
}
Own thoughts
I have thought about a couple of solutions, but none of them seem ideal.
Run a server on the same device as TensorFlow Serving that can share the contents of the TensorFlow Serving config file. The config file contains model names and versions, but it does not tell us whether they are currently active (see the sketch below for one way to check).
Use Jython or Python to access other libraries (tensorflow-serving-api), which seem to contain "list-all-model-names" and "retriveConfig" functionality.
Any advice is appreciated, thanks in advance!
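For the first idea, one way to confirm whether a model name taken from the config file is actually being served is to probe it with the same getModelMetadata call used above and treat a gRPC error status as "not active". This is only a sketch built on the static stub from the question; how you obtain the candidate names and versions (for example by parsing the config file) is left out:
// Probe a model taken from the Serving config file; returns true if the server
// currently answers metadata requests for it (i.e. it is loaded), false on a gRPC error.
// Reuses the static `stub` from the snippet above.
public static boolean isModelActive(String model, Integer version) {
    GetModelMetadataRequest request = GetModelMetadataRequest.newBuilder()
            .setModelSpec(ModelSpec.newBuilder()
                    .setName(model)
                    .setVersion(Int64Value.newBuilder().setValue(version)))
            .addMetadataField("signature_def")
            .build();
    try {
        stub.getModelMetadata(request);
        return true;                       // the server answered, so the model/version is loaded
    } catch (io.grpc.StatusRuntimeException e) {
        // NOT_FOUND (or UNAVAILABLE) means this model/version is not currently served
        return false;
    }
}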

Why does the Google Datastore console behave differently to the GAE Java library for Datastore?

I have a Google App Engine + Java app which has been happily running for many years (using JDO + datastore for persistence) and I have had no problem (occasionally, and reluctantly) updating a property of an entity in the Google Datastore console manually.
Recently (maybe the last 2-3 months) I have noticed a change in behaviour which breaks our app. I do not understand exactly what's going wrong or how we could handle it.
So my question is:
Why is it behaving differently and what can I do about it?
Let me first try to explain the behaviour I am seeing and then show my smallest possible replicating test case.
Suppose you had a simple persistence class:
@PersistenceCapable
public class Account implements Serializable {
    @Persistent private ShortBlob testShortBlob;
    @Persistent private String name;
    // ...etc...
}
If I edited the name via the Datastore web console in the past, it would work as expected: the name field would change and everything else would keep working fine.
The behaviour I am seeing now is that after saving the entity via the console, I can no longer query and load the entity in JDO; I get:
java.lang.ClassCastException: com.google.appengine.api.datastore.Blob cannot be cast to com.google.appengine.api.datastore.ShortBlob
Which points to some underlying datastore change that means the ShortBlob field is having its type changed from ShortBlob to Blob (even though I make no edits to that field via the console).
This test case will replicate the issue:
DatastoreService datastore = DatastoreServiceFactory.getDatastoreService();

// this one really is a ShortBlob - it will load fine in JDO
Entity account = new Entity("Account", "123");
account.setProperty("name", "Test Name");
account.setUnindexedProperty("testShortBlob", new ShortBlob("blah".getBytes()));
datastore.put(account);

// this one really is not a ShortBlob, it's a Blob - it will fail for the same reason I am seeing in production
account = new Entity("Account", "124");
account.setProperty("name", "Test Name 2");
account.setUnindexedProperty("testShortBlob", new Blob("blah".getBytes()));
datastore.put(account);

// then load the entity via JDO (pm is a PersistenceManager, key is the key of entity "124")
try {
    Account accountFromJdo = pm.getObjectById(Account.class, key);
} catch (Exception ex) {
    System.out.println("We get here, the object won't load with the ClassCastException");
}
So that's the issue, but why would saving via the Cloud Datastore console change the ShortBlobs to Blobs?
My workaround currently is to set the ShortBlob fields to null in the Datastore console - that then allows the entity to load. But that sucks if the data in the blob is important!
Update:
I have been doing more testing on this, using the low-level JSON API to see if I could see a difference in the raw JSON responses before and after saving the entity via the console. The good news is, I can!
Before editing the entity via the console, a ShortBlob field saved to the Datastore via the JDO App Engine interface will look like this:
},
"testShortBlob": {
"blobValue": "tNp7MfsjhdfjkahsdvfkjhsdvfIItWyzy6glmIrow4WWhRPbhQ/U+MGX3opVvpxu"
},
But if I go into the Datastore console and edit the entity (leave the blob field unchanged, edit an unrelated field such as name), then when I run the same query I get:
},
"testShortBlob": {
"blobValue": "tNp7MfsjhdfjkahsdvfkjhsdvfIItWyzy6glmIrow4WWhRPbhQ/U+MGX3opVvpxu",
"excludeFromIndexes": true
},
A subtle difference, but I think it's important: according to the Java docs, ShortBlob values are indexed and Blob values are not.
So I think my question now is: why does editing an entity via the Cloud Datastore console change the indexed status of blob fields?
Thanks for the detailed question and debugging. This seems fishy. I will make sure https://issuetracker.google.com/issues/79547492 gets assigned to the correct team.
As far as workarounds go:
The JSON API you noticed is the Cloud Datastore API v1; there are a variety of client libraries that make it easy to access.
It is possible to use that API to transactionally read/modify/write entities. In your case it would allow you to perform the desired transforms. Alternatively, making modifications through JDO would also work.
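As an illustration of that read/modify/write approach, here is a sketch using the same low-level DatastoreService API as the test case in the question: it re-reads an affected entity and, where the property has come back as a Blob, writes it back as a ShortBlob. The kind, key, and property names are taken from the question; run it inside a transaction if other writers may touch the same entities, and note it assumes the value still fits within a ShortBlob (at most 1500 bytes).
// Sketch: repair an entity whose "testShortBlob" property was converted to a Blob
// by the console edit, by rewriting it as a ShortBlob via the low-level API.
DatastoreService datastore = DatastoreServiceFactory.getDatastoreService();
Key key = KeyFactory.createKey("Account", "124");   // key of the affected entity
try {
    Entity broken = datastore.get(key);
    Object value = broken.getProperty("testShortBlob");
    if (value instanceof Blob) {
        byte[] bytes = ((Blob) value).getBytes();
        // write the same bytes back with the type JDO expects
        broken.setUnindexedProperty("testShortBlob", new ShortBlob(bytes));
        datastore.put(broken);
    }
} catch (EntityNotFoundException e) {
    e.printStackTrace();
}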

How to fetch history of elements using ClearCase CM API?

I want to fetch the history of file elements (pdf files, doc files, etc.) which are under ClearCase control, using the Rational CM API provided by ClearCase. I have written the following code to fetch the history, but it is incomplete, so please help me out here.
public void fetchFileElementHistory()
{
    try
    {
        // file under ClearCase control
        CcFile fetchElement = provider.ccFile(provider.filePathLocation(testFile));
        PropertyRequest wantedProps = new PropertyRequest(
                CcFile.DISPLAY_NAME, CcFile.CREATION_DATE, CcFile.VIEW_RELATIVE_PATH,
                CcFile.CLIENT_PATH, CcFile.VERSION_HISTORY, CcFile.PREDECESSOR_LIST, CcFile.ELEMENT);
        fetchElement = (CcFile) fetchElement.doReadProperties(wantedProps);
        VersionHistory versionHistory = fetchElement.getVersionHistory();
        versionHistory = (VersionHistory) versionHistory.doReadProperties(new PropertyRequest(
                VersionHistory.CHILD_LIST, VersionHistory.ROOT_VERSION,
                VersionHistory.CHILD_MAP, VersionHistory.PARENT_LIST,
                VersionHistory.PROVIDER_LIST, VersionHistory.WORKSPACE_FOLDER_LIST));
        /*
         * what to do here ?
         */
    }
    catch (Exception e) {
        e.printStackTrace();
    }
}
Thanks in advance
The official documentation for CM API 7.1.x.
Make sure you have selected the "CM Library Samples and Documentation" feature under the Client Components section of the install, in order to check the code examples included with the javadoc.
From the object model overview, check whether collections apply to your case.
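As a rough, unverified sketch of the "what to do here" part: the CM API is built on javax.wvcm, where a VersionHistory is a Folder, so once CHILD_LIST has been requested its versions can usually be walked with getChildList(). The getter and property names below (getChildList, getVersionName, getCreationDate, getDisplayName) follow the javax.wvcm conventions and should be checked against the CM API javadoc and samples before relying on them:
// Assumed continuation of the try block above; verify names against the CM API javadoc.
for (Object child : versionHistory.getChildList()) {
    Version version = (Version) child;
    // load the properties we want for each version, same pattern as the calls above
    version = (Version) version.doReadProperties(new PropertyRequest(
            Version.DISPLAY_NAME, Version.VERSION_NAME, Version.CREATION_DATE));
    System.out.println(version.getVersionName() + " created on " + version.getCreationDate()
            + " (" + version.getDisplayName() + ")");
}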
