How can I encrypt database fields when using Hibernate?
We have developed a product and some of our clients are using the application. Some clients are asking about database encryption.
Is it possible to encrypt the data at the application level without many changes to the code?
Please give me a suggestion as soon as possible.
Try this:
Put an attribute in your entity:
private byte[] encryptedBody;
Use this getter and setter:
@Column(columnDefinition = "LONGBLOB", name = "encryptedBody")
@ColumnTransformer(
read = "AES_DECRYPT(encryptedBody, 'yourkey')",
write = "AES_ENCRYPT(?, 'yourkey')")
public byte[] getEncryptedBody() {
return encryptedBody;
}
public void setEncryptedBody(byte[] encryptedBody) {
this.encryptedBody = encryptedBody;
}
And then, when you retrieve the column, use:
private final Charset UTF8_CHARSET = Charset.forName("UTF-8");
String decodeUTF8(byte[] bytes) {
return new String(bytes, UTF8_CHARSET);
}
String s = decodeUTF8(entity.getEncryptedBody());
BEWARE: AES_DECRYPT and AES_ENCRYPT belong to MySQL. If you have a different database engine, find similar functions.
Hope this helps.
You can use the @ColumnTransformer annotation like this:
@ColumnTransformer(
read = "pgp_sym_decrypt(" +
" storage, " +
" current_setting('encrypt.key')" +
")",
write = "pgp_sym_encrypt( " +
" ?, " +
" current_setting('encrypt.key')" +
") "
)
@Column(columnDefinition = "bytea")
private String storage;
This way, Hibernate will be able to encrypt the entity attribute when you persist or merge it and decrypt it when you read the entity.
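Note that pgp_sym_encrypt and pgp_sym_decrypt come from PostgreSQL's pgcrypto extension, and current_setting('encrypt.key') only resolves if that setting has actually been defined. A minimal setup sketch over plain JDBC (the key name and value are placeholders; how you provision and scope the key is up to your deployment):
// One-time per database: install pgcrypto; then, per connection/session,
// expose the key so current_setting('encrypt.key') can find it.
try (java.sql.Statement st = connection.createStatement()) {
    st.execute("CREATE EXTENSION IF NOT EXISTS pgcrypto");
    st.execute("SET encrypt.key = 'mySecretKey'");
}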
I think that you are looking for column transformers. You can find how to do it in the Hibernate reference:
http://docs.jboss.org/hibernate/core/3.6/reference/en-US/html/mapping.html#mapping-column-read-and-write
I hope that helps!
You could use jasypt. It has a Hibernate integration that allows you to encrypt properties while saving (and decrypt them while loading).
http://www.jasypt.org/hibernate.html
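For reference, a rough sketch of what that integration looks like, based on the jasypt docs (the class names below come from the hibernate4 module; jasypt also ships hibernate3/hibernate5 variants, and the encryptor name and key are placeholders):
import org.hibernate.annotations.Parameter;
import org.hibernate.annotations.Type;
import org.hibernate.annotations.TypeDef;
import org.jasypt.encryption.pbe.StandardPBEStringEncryptor;
import org.jasypt.hibernate4.encryptor.HibernatePBEEncryptorRegistry;
import org.jasypt.hibernate4.type.EncryptedStringType;

// Somewhere at application startup: create an encryptor and register it by name.
StandardPBEStringEncryptor encryptor = new StandardPBEStringEncryptor();
encryptor.setPassword("yourkey"); // placeholder; load from secure config in practice
HibernatePBEEncryptorRegistry.getInstance()
        .registerPBEStringEncryptor("myStringEncryptor", encryptor);

// On the entity: map the property through jasypt's Hibernate type.
@TypeDef(name = "encryptedString", typeClass = EncryptedStringType.class,
        parameters = @Parameter(name = "encryptorRegisteredName", value = "myStringEncryptor"))
@Entity
public class Account {
    @Type(type = "encryptedString")
    private String secret; // stored encrypted, transparent to the application code
}
The property is encrypted on write and decrypted on read without the rest of your code changing, which matches the "without many changes" requirement; the trade-off is that the database can no longer index or search these values meaningfully.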
I'm looking for a way to do something like this. I don't know what to call it, so I don't know if it exists or how to find it. Some keywords would be welcome :)
String var_1 = "user data";
String fix_1 = "supply data";
String mix = mixer(var_1,fix_1);
// mix = "something fully random"
String var_2 = "user data changed";
String fix_2 = fixer(var_2,mix);
And mix == mixer(var_2, fix_2);
So to summarize, I need to generate random-looking data from 2 variables: one comes from the user and one is supplied by me.
The first time, I generate the data from these 2 variables with one function.
Then, if the user data changes, I compute the new supplied data with another function, from the first result and the new user data. And if I use the computed data together with the new user data again, I must obtain the same data as computed the first time.
Is there something that does that? Some cipher technique or so?
Thanks for the intel.
In fact there is already something like this which may satisfy your needs, and you know this function too: the good old XOR. Yes, it is used in crypto a lot; it's the core idea of stream ciphers and the One-Time Pad.
It goes like this:
Assume you have a byte array of length n called var_1.
Assume you have a random value fix_1 of the same length.
If you do var_1 XOR fix_1 you get mix.
If you do mix XOR fix_1 you get var_1 again. (Basic math: fix_1 XOR fix_1 equals a chain of zero bytes, and var_1 XOR zero bytes = var_1.)
This whole scheme is only as random and secure as fix_1 remains random and secret. If one of the values is not random, the approach is not secure at all.
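A minimal sketch in Java (assuming, for simplicity, that var_1 and var_2 have the same length; otherwise pad or hash them to a fixed length first):
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Arrays;

public class Mixer {
    // mixer and fixer are the same operation: byte-wise XOR.
    static byte[] xor(byte[] a, byte[] b) {
        byte[] out = new byte[a.length];
        for (int i = 0; i < a.length; i++) {
            out[i] = (byte) (a[i] ^ b[i]);
        }
        return out;
    }

    public static void main(String[] args) {
        byte[] var1 = "user data".getBytes(StandardCharsets.UTF_8);
        byte[] fix1 = new byte[var1.length];
        new SecureRandom().nextBytes(fix1);   // fix_1: random supply data

        byte[] mix = xor(var1, fix1);         // mix = mixer(var_1, fix_1)

        byte[] var2 = "new data!".getBytes(StandardCharsets.UTF_8); // same length
        byte[] fix2 = xor(var2, mix);         // fix_2 = fixer(var_2, mix)

        System.out.println(Arrays.equals(mix, xor(var2, fix2))); // prints true
    }
}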
So following the idea of User253751 in the comments, I was able to do it.
Steps:
generate the private constant key => privateKey = encrypt(publicKey, Password_1) (the first public key is random)
if the password changes, generate a new public key by decoding the private constant key with password_2 => publicKey_Updated = decrypt(privateKey, Password_2)
check that the new public key is valid: privateKey_Rebuild = encrypt(publicKey_Updated, Password_2) ====> if everything is ok, privateKey == privateKey_Rebuild.
---> I tested it only with a weak cipher I use just for obfuscation, but it should work with a symmetric key too. I'm not sure about asymmetric keys, because to make this work you need an encryption scheme that always gives you the same ciphertext for the same input, and RSA does not give you the same ciphertext even with the same input.
Here is my code (not a copy/paste snippet because it uses my own library), but you can catch the idea easily from the function names; a self-contained sketch with standard JCE classes follows after it.
KeyObfusc publicKey_1 = KeyObfusc.fromPassword("publicKey_1");
KeyObfusc password_1 = KeyObfusc.fromPassword("password_1");
Encoder encoder_1 = new Encoder(password_1, CipherFormat.HEX);
Decoder decoder_1 = new Decoder(password_1, CipherFormat.HEX);
byte[] privateKey = encoder_1.toBytes(publicKey_1.getEncoded());
byte[] publicKey_1_Rebuild = decoder_1.fromBytesToBytes(privateKey);
LogDelay.send("password_1 : " + BytesTo.stringHex(password_1.getEncoded()));
LogDelay.send("publicKey_1 : " + BytesTo.stringHex(publicKey_1.getEncoded()));
LogDelay.send("privateKey : " + BytesTo.stringHex(privateKey));
LogDelay.send("publicKey_1 Rebuild : " + Arrays.equals(publicKey_1.getEncoded(), publicKey_1_Rebuild) +
" " + BytesTo.stringHex(publicKey_1_Rebuild));
LogDelay.send();
KeyObfusc password_2 = KeyObfusc.fromPassword("password_2");
Encoder encoder_2 = new Encoder(password_2, CipherFormat.HEX);
Decoder decoder_2 = new Decoder(password_2, CipherFormat.HEX);
byte[] publicKey_2 = decoder_2.fromBytesToBytes(privateKey);
byte[] privateKey_Rebuild = encoder_2.toBytes(publicKey_2);
LogDelay.send("password_2 : " + BytesTo.stringHex(password_2.getEncoded()));
LogDelay.send("publicKey_2 : " + BytesTo.stringHex(publicKey_2));
LogDelay.send("privateKey Rebuild: " + Arrays.equals(privateKey, privateKey_Rebuild) +
" " + BytesTo.stringHex(privateKey_Rebuild));
LogDelay.send();
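For comparison, here is a self-contained sketch of the same three steps using only standard JCE classes (assumptions: keys are derived by hashing the passwords down to 16 bytes, and AES in ECB mode over a single block is used only because the scheme needs deterministic output; do not use ECB for general-purpose encryption):
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Arrays;
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;

public class KeyRotation {
    // Derive a 16-byte AES key from a password (sketch; use a real KDF in practice).
    static byte[] key16(String password) throws Exception {
        byte[] h = MessageDigest.getInstance("SHA-256")
                .digest(password.getBytes(StandardCharsets.UTF_8));
        return Arrays.copyOf(h, 16);
    }

    // Deterministic single-block AES: same input + key always gives the same output.
    static byte[] aes(int mode, String password, byte[] block) throws Exception {
        Cipher c = Cipher.getInstance("AES/ECB/NoPadding");
        c.init(mode, new SecretKeySpec(key16(password), "AES"));
        return c.doFinal(block);
    }

    public static void main(String[] args) throws Exception {
        byte[] publicKey = key16("publicKey_1"); // any 16-byte block works here

        // Step 1: privateKey = encrypt(publicKey, password_1)
        byte[] privateKey = aes(Cipher.ENCRYPT_MODE, "password_1", publicKey);

        // Step 2: password changes; derive the new public key from the constant private key.
        byte[] publicKeyUpdated = aes(Cipher.DECRYPT_MODE, "password_2", privateKey);

        // Step 3: re-encrypting must rebuild the same private key.
        byte[] privateKeyRebuild = aes(Cipher.ENCRYPT_MODE, "password_2", publicKeyUpdated);
        System.out.println(Arrays.equals(privateKey, privateKeyRebuild)); // prints true
    }
}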
I have a table TestTable with columns ID as binary(16) and name as varchar(50).
I've been trying to store an ordered UUID as the PK, as in this article: Store UUID in an optimized way.
I see the UUID is saved in the database as HEX (blob).
So I want to save this ID from Java, but I am getting this error:
Data truncation: Data too long for column 'ID' at row 1
I am currently using the sql2o library to interact with MySQL.
So basically this is my code:
String suuid = UUID.randomUUID().toString();
String partial_id = suuid.substring(14,18) + suuid.substring(9, 13) + suuid.substring(0, 8) + suuid.substring(19, 23) + suuid.substring(24);
String final_id = String.format("%040x", new BigInteger(1, partial_id.getBytes()));
con.createQuery("INSERT INTO TestTable(ID, Name) VALUES(:id, :name)")
.addParameter("id", final_id)
.addParameter("name", "test1").executeUpdate();
The partial ID should be something like this: 11d8eebc58e0a7d796690800200c9a66
I tried this statement in MySQL without issue:
insert into testtable(id, name) values(UNHEX(CONCAT(SUBSTR(uuid(), 15, 4),SUBSTR(uuid(), 10, 4),SUBSTR(uuid(), 1, 8),SUBSTR(uuid(), 20, 4),SUBSTR(uuid(), 25))), 'Test2');
But I got the same error when I removed the UNHEX function. So how can I send the correct ID from Java to MySQL?
UPDATE
I solved my problem, inspired by the answer of David Ehrmann. But in my case I used Tomcat's HexUtils to transform my sorted UUID string into a byte[]:
byte[] final_id = HexUtils.fromHexString(partial_id);
Try storing it as bytes:
UUID uuid = UUID.randomUUID();
byte[] uuidBytes = new byte[16];
ByteBuffer.wrap(uuidBytes)
.order(ByteOrder.BIG_ENDIAN)
.putLong(uuid.getMostSignificantBits())
.putLong(uuid.getLeastSignificantBits());
con.createQuery("INSERT INTO TestTable(ID, Name) VALUES(:id, :name)")
.addParameter("id", uuidBytes)
.addParameter("name", "test1").executeUpdate();
A bit of an explanation: your table is using BINARY(16), so serializing UUID as its raw bytes is a really straightforward approach. UUIDs are essentially 128-bit ints with a few reserved bits, so this code writes it out as a big-endian 128-bit int. The ByteBuffer is just an easy way to turn two longs into a byte array.
Now in practice, all the conversion effort and headaches won't be worth the 20 bytes you save per row.
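If you later need the UUID back in Java, the same layout reads straight back out (a small sketch, assuming the bytes come out of the BINARY(16) column unchanged):
ByteBuffer bb = ByteBuffer.wrap(uuidBytes).order(ByteOrder.BIG_ENDIAN);
UUID restored = new UUID(bb.getLong(), bb.getLong()); // most, then least significant bits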
Using test cases I was able to see how ELKI can be used directly from Java, but now I want to read my data from MongoDB and then use ELKI to cluster geographic (long, lat) data.
I can only cluster data from a CSV file using ELKI. Is it possible to connect de.lmu.ifi.dbs.elki.database.Database with MongoDB? I can see from the Java debugger that there is a databaseconnection field in de.lmu.ifi.dbs.elki.database.Database.
I query MongoDB, creating a POJO for each row, and now I want to cluster these objects using ELKI.
It is possible to read data from MongoDB, write it to a CSV file and then use ELKI to read that CSV file, but I would like to know if there is a simpler solution.
---------FINDINGS_1:
From ELKI - Use List<String> of objects to populate the Database I found that I need to implement de.lmu.ifi.dbs.elki.datasource.DatabaseConnection and specifically override the loadData() method, which returns an instance of MultipleObjectsBundle.
So I think I should wrap a list of POJOs in a MultipleObjectsBundle. Now I'm looking at MultipleObjectsBundle and it looks like the data should be held in columns. Why is the columns datatype List<List<?>>? Shouldn't it be List<?>, just a list of the items you want to cluster?
I'm a little confused. How is ELKI going to know that it should look at the long and lat of the POJO? Where do I tell ELKI to do this? Using de.lmu.ifi.dbs.elki.data.type.SimpleTypeInformation?
---------FINDINGS_2:
I have tried to use ArrayAdapterDatabaseConnection and I have tried implementing DatabaseConnection. Sorry, I need things in very simple terms to understand them.
This is my code for clustering:
int minPts=3;
double eps=0.08;
double[][] data1 = {{-0.197574246, 51.49960695}, {-0.084605692, 51.52128377}, {-0.120973687, 51.53005939}, {-0.156876, 51.49313},
{-0.144228881, 51.51811784}, {-0.1680743, 51.53430039}, {-0.170134484,51.52834133}, { -0.096440751, 51.5073853},
{-0.092754157, 51.50597426}, {-0.122502346, 51.52395143}, {-0.136039674, 51.51991453}, {-0.123616824, 51.52994371},
{-0.127854211, 51.51772703}, {-0.125979294, 51.52635795}, {-0.109006325, 51.5216612}, {-0.12221963, 51.51477076}, {-0.131161087, 51.52505093} };
// ArrayAdapterDatabaseConnection dbcon = new ArrayAdapterDatabaseConnection(data1);
DatabaseConnection dbcon = new MyDBConnection();
ListParameterization params = new ListParameterization();
params.addParameter(de.lmu.ifi.dbs.elki.algorithm.clustering.DBSCAN.Parameterizer.MINPTS_ID, minPts);
params.addParameter(de.lmu.ifi.dbs.elki.algorithm.clustering.DBSCAN.Parameterizer.EPSILON_ID, eps);
params.addParameter(DBSCAN.DISTANCE_FUNCTION_ID, EuclideanDistanceFunction.class);
params.addParameter(AbstractDatabase.Parameterizer.DATABASE_CONNECTION_ID, dbcon);
params.addParameter(AbstractDatabase.Parameterizer.INDEX_ID,
RStarTreeFactory.class);
params.addParameter(RStarTreeFactory.Parameterizer.BULK_SPLIT_ID,
SortTileRecursiveBulkSplit.class);
params.addParameter(AbstractPageFileFactory.Parameterizer.PAGE_SIZE_ID, 1000);
Database db = ClassGenericsUtil.parameterizeOrAbort(StaticArrayDatabase.class, params);
db.initialize();
GeneralizedDBSCAN dbscan = ClassGenericsUtil.parameterizeOrAbort(GeneralizedDBSCAN.class, params);
Relation<DoubleVector> rel = db.getRelation(TypeUtil.DOUBLE_VECTOR_FIELD);
Relation<ExternalID> relID = db.getRelation(TypeUtil.EXTERNALID);
DBIDRange ids = (DBIDRange) rel.getDBIDs();
Clustering<Model> result = dbscan.run(db);
int i =0;
for(Cluster<Model> clu : result.getAllClusters()) {
System.out.println("#" + i + ": " + clu.getNameAutomatic());
System.out.println("Size: " + clu.size());
System.out.print("Objects: ");
for(DBIDIter it = clu.getIDs().iter(); it.valid(); it.advance()) {
DoubleVector v = rel.get(it);
ExternalID exID = relID.get(it);
System.out.print("DoubleVec: ["+v+"]");
System.out.print("ExID: ["+exID+"]");
final int offset = ids.getOffset(it);
System.out.print(" " + offset);
}
System.out.println();
++i;
}
The ArrayAdapterDatabaseConnection produces two clusters; I just had to play around with the value of epsilon. When I set epsilon=0.008, DBSCAN started creating clusters; when I set epsilon=0.04, all the items ended up in one cluster.
I have also tried to implement DatabaseConnection:
@Override
public MultipleObjectsBundle loadData() {
MultipleObjectsBundle bundle = new MultipleObjectsBundle();
List<Station> stations = getStations();
List<DoubleVector> vecs = new ArrayList<DoubleVector>();
List<ExternalID> ids = new ArrayList<ExternalID>();
for (Station s : stations){
String strID = Integer.toString(s.getId());
ExternalID i = new ExternalID(strID);
ids.add(i);
double[] st = {s.getLongitude(), s.getLatitude()};
DoubleVector dv = new DoubleVector(st);
vecs.add(dv);
}
SimpleTypeInformation<DoubleVector> type = new VectorFieldTypeInformation<>(DoubleVector.FACTORY, 2, 2, DoubleVector.FACTORY.getDefaultSerializer());
bundle.appendColumn(type, vecs);
bundle.appendColumn(TypeUtil.EXTERNALID, ids);
return bundle;
}
These long/lat values are associated with an ID, and I need to link the clustered values back to this ID. Is using the ID offset (as in the code above) the only way to go? I have tried to add an ExternalID column, but I don't know how to retrieve the ExternalID for a particular NumberVector.
Also, after seeing Using ELKI's Distance Function, I tried to use ELKI's longLatDistance, but it doesn't work and I could not find any examples of how to use it.
The interface for data sources is called DatabaseConnection.
JavaDoc of DatabaseConnection
You can implement a MongoDB-based interface to get the data.
It is not a complicated interface; it has a single method.
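A rough sketch of such an implementation (assumptions: the MongoClients API of a recent MongoDB Java driver, a stations collection with double fields named lng and lat, and the same VectorFieldTypeInformation setup as in the loadData() shown in the question; adapt the names to your schema):
import java.util.ArrayList;
import java.util.List;
import org.bson.Document;
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import de.lmu.ifi.dbs.elki.data.DoubleVector;
import de.lmu.ifi.dbs.elki.data.type.SimpleTypeInformation;
import de.lmu.ifi.dbs.elki.data.type.VectorFieldTypeInformation;
import de.lmu.ifi.dbs.elki.datasource.DatabaseConnection;
import de.lmu.ifi.dbs.elki.datasource.bundle.MultipleObjectsBundle;

public class MongoDBConnection implements DatabaseConnection {
    @Override
    public MultipleObjectsBundle loadData() {
        List<DoubleVector> vecs = new ArrayList<>();
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            MongoCollection<Document> coll =
                    client.getDatabase("geo").getCollection("stations");
            for (Document d : coll.find()) {
                // One 2-d vector per document: (longitude, latitude).
                vecs.add(new DoubleVector(
                        new double[] { d.getDouble("lng"), d.getDouble("lat") }));
            }
        }
        SimpleTypeInformation<DoubleVector> type = new VectorFieldTypeInformation<>(
                DoubleVector.FACTORY, 2, 2, DoubleVector.FACTORY.getDefaultSerializer());
        MultipleObjectsBundle bundle = new MultipleObjectsBundle();
        bundle.appendColumn(type, vecs);
        return bundle;
    }
}
You can then pass this connection to StaticArrayDatabase via the DATABASE_CONNECTION_ID parameter, exactly as in the question's code.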
I am inserting a record into Hazelcast from a C application using the Memcached client library APIs, where the record is as follows:
typedef struct _activeClient
{
char ID[25];
int IP;
char aMethod[16];
}activeClient;
Now I am trying to read the same record using the Hazelcast Java native APIs. Here is my Java program:
IMap<String, MemcacheEntry> mapInst = client.getMap("hz_memcache_ABC_MAP");
System.out.println("Map Size:" + mapInst.size());
String key = "70826892122991";
MemcacheEntry tmpValRec = mapInst.get(key);
System.out.println("Key:" + key + "ID:" + tmpValRec.getValue());
Here tmpValRec.getValue() prints the record content as a single String. But I want to retrieve each member value from tmpValRec into my own Java class object. Here is the class:
class ActiveClients
{
String ueID;
int Ip;
String aMethod;
ActiveClients()
{
ueID = "";
Ip = 0;
aMethod = "";
}
}
Pointing me to an example would be a great help.
I guess the only option is to parse the value to deserialize your object. I know this is a pain, but I don't see a better alternative, unless of course you store a blob as the value in memcached, where the blob is the serialized content of the class.
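If the value is the raw bytes of the C struct, a fixed-layout parse is possible. A sketch (assumptions: getValue() returns the raw struct bytes, the C side is little-endian x86, and the compiler inserts 3 padding bytes after ID[25] to align the int; verify with sizeof(activeClient) on the C side):
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.charset.StandardCharsets;

static ActiveClients parse(byte[] raw) {
    ByteBuffer buf = ByteBuffer.wrap(raw).order(ByteOrder.LITTLE_ENDIAN);
    ActiveClients ac = new ActiveClients();
    byte[] id = new byte[25];
    buf.get(id);                        // char ID[25]
    buf.position(buf.position() + 3);   // skip assumed alignment padding
    ac.ueID = cString(id);
    ac.Ip = buf.getInt();               // int IP
    byte[] method = new byte[16];
    buf.get(method);                    // char aMethod[16]
    ac.aMethod = cString(method);
    return ac;
}

// Cut at the first NUL terminator, like a C string.
static String cString(byte[] bytes) {
    int end = 0;
    while (end < bytes.length && bytes[end] != 0) {
        end++;
    }
    return new String(bytes, 0, end, StandardCharsets.US_ASCII);
}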
I am trying to insert a byte array into a blob column in my Cassandra table. I am using the Datastax Java driver. Below is my code:
for (Map.Entry<String, byte[]> entry : attributes.entrySet()) {
System.out.println("Key = " + entry.getKey() + ", Value = " + entry.getValue());
String cql = "insert into test_data (user_id, name, value) values ('"+userId+"', '"+entry.getKey()+"', '"+entry.getValue()+"');";
System.out.println(cql);
CassandraDatastaxConnection.getInstance();
CassandraDatastaxConnection.getSession().execute(cql);
}
And this is the exception I am getting back:
InvalidQueryException: cannot parse '[B@50908fa9' as hex bytes
I guess the problem is the way I am building the CQL above; something is missing for sure...
I have created the table like this:
create table test_data (user_id text, name text, value blob, primary key (user_id, name));
Can anybody help me? Thanks...
The problem is that when you append the byte array to the String, it calls toString on the byte[], which prints the unhelpful pointer you are seeing. You need to manually convert it to a String representation suitable for your data type. In your case you are using a blob, so you need to convert it to a hex string.
This question has code for converting the byte[] to a String:
How to convert a byte array to a hex string in Java?
You can use one of those functions and prepend '0x' to it. Then you should have a valid String for your blob.
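A minimal sketch of that conversion, plus a prepared-statement alternative that skips hex entirely by binding the blob as a ByteBuffer (this reuses the CassandraDatastaxConnection helper from the question):
import java.nio.ByteBuffer;
import com.datastax.driver.core.PreparedStatement;

// Option 1: hex-encode the byte[] and prepend 0x so Cassandra parses a blob literal.
StringBuilder hex = new StringBuilder("0x");
for (byte b : entry.getValue()) {
    hex.append(String.format("%02x", b));
}
String cql = "insert into test_data (user_id, name, value) values ('"
        + userId + "', '" + entry.getKey() + "', " + hex + ");";

// Option 2: a prepared statement avoids string building and escaping issues.
PreparedStatement ps = CassandraDatastaxConnection.getSession()
        .prepare("insert into test_data (user_id, name, value) values (?, ?, ?)");
CassandraDatastaxConnection.getSession()
        .execute(ps.bind(userId, entry.getKey(), ByteBuffer.wrap(entry.getValue())));
The prepared statement is also the safer choice in general, since the string-concatenated version is open to CQL injection.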