I have a table TestTable with the columns ID as binary(16) and Name as varchar(50).
I've been trying to store an ordered UUID as the primary key, as described in the article Store UUID in an optimized way.
I can see that the UUID is saved in the database as hex (a blob).
So I want to save this ID from Java, but I am getting this error:
Data truncation: Data too long for column 'ID' at row 1
I am currently using the sql2o library to interact with MySQL.
This is basically my code:
String suuid = UUID.randomUUID().toString();
String partial_id = suuid.substring(14, 18) + suuid.substring(9, 13) + suuid.substring(0, 8) + suuid.substring(19, 23) + suuid.substring(24);
String final_id = String.format("%040x", new BigInteger(1, partial_id.getBytes()));
con.createQuery("INSERT INTO TestTable(ID, Name) VALUES(:id, :name)")
.addParameter("id", final_id)
.addParameter("name", "test1").executeUpdate();
The partial id should be something like this: 11d8eebc58e0a7d796690800200c9a66
I tried this statement directly in MySQL without issue:
insert into testtable(id, name) values(UNHEX(CONCAT(SUBSTR(uuid(), 15, 4),SUBSTR(uuid(), 10, 4),SUBSTR(uuid(), 1, 8),SUBSTR(uuid(), 20, 4),SUBSTR(uuid(), 25))), 'Test2');
But I got the same error when I removed the UNHEX function. So how can I send the correct ID from Java to MySQL?
UPDATE
I solved my problem, inspired by David Ehrmann's answer. But in my case I used Tomcat's HexUtils to transform my reordered UUID string into a byte[]:
byte[] final_id = HexUtils.fromHexString(partial_id);
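Put together, the working approach looks roughly like this (a sketch, untested here; the reordering is the same as in the question, and org.apache.tomcat.util.buf.HexUtils is assumed to be on the classpath):

import java.util.UUID;
import org.apache.tomcat.util.buf.HexUtils;

String suuid = UUID.randomUUID().toString();
// Reorder the UUID's hex groups so the value sorts roughly by timestamp
String partial_id = suuid.substring(14, 18) + suuid.substring(9, 13)
        + suuid.substring(0, 8) + suuid.substring(19, 23) + suuid.substring(24);
// Decode the 32 hex characters into the 16 raw bytes that BINARY(16) expects
byte[] final_id = HexUtils.fromHexString(partial_id);

con.createQuery("INSERT INTO TestTable(ID, Name) VALUES(:id, :name)")
   .addParameter("id", final_id)
   .addParameter("name", "test1")
   .executeUpdate();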
Try storing it as bytes:
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.util.UUID;

UUID uuid = UUID.randomUUID();
byte[] uuidBytes = new byte[16];
ByteBuffer.wrap(uuidBytes)
.order(ByteOrder.BIG_ENDIAN)
.putLong(uuid.getMostSignificantBits())
.putLong(uuid.getLeastSignificantBits());
con.createQuery("INSERT INTO TestTable(ID, Name) VALUES(:id, :name)")
.addParameter("id", uuidBytes)
.addParameter("name", "test1").executeUpdate();
A bit of an explanation: your table is using BINARY(16), so serializing UUID as its raw bytes is a really straightforward approach. UUIDs are essentially 128-bit ints with a few reserved bits, so this code writes it out as a big-endian 128-bit int. The ByteBuffer is just an easy way to turn two longs into a byte array.
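Going the other way, reading the BINARY(16) column back into a UUID, is just the same trick in reverse; a minimal sketch, assuming you already fetched the 16 bytes into uuidBytes:

// Rebuild the UUID from the 16 big-endian bytes read from the column
ByteBuffer bb = ByteBuffer.wrap(uuidBytes).order(ByteOrder.BIG_ENDIAN);
UUID restored = new UUID(bb.getLong(), bb.getLong());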
Now in practice, all the conversion effort and headaches won't be worth the 20 bytes you save per row.
Related
I need to communicate a Guid that was generated in .NET to a Java application.
This is my GUID:
ce095552-b466-4d03-ac41-430ec9286806
and I want to assign it to a UUID variable. I have tried:
UUID.nameUUIDFromBytes(stringUUID.getBytes())
UUID.fromString(stringUUID)
I am getting this error:
Caused by: java.lang.NumberFormatException: Invalid long: ""ce095552"
How can I convert the GUID to a UUID?
When you get a Microsoft objectGUID (for example, the Active Directory objectGUID of a group object), you need to read the binary value, convert it to hexadecimal, and then generate an MS GUID from it (look at the byte ordering inside the convertToDashedString function).
The byte order of a UUID and a GUID is different: try converting it with an online converter such as the robobunny converter.
For now I'm storing and working with the MS GUID like this:
// Strip the dashes, swap the byte order of the first three groups, and return plain hex
public static String convertMSGUIDToHexFormat(String guid) {
    guid = guid.replaceAll("-", "");
    // Re-insert dashes (8-4-4-4-12), then reverse the bytes of the first 4-2-2 byte groups
    guid = guid.replaceAll("(.{8})(.{4})(.{4})(.{4})(.{12})", "$1-$2-$3-$4-$5")
               .replaceAll("(.{2})(.{2})(.{2})(.{2}).(.{2})(.{2}).(.{2})(.{2})(.{18})", "$4$3$2$1-$6$5-$8$7$9");
    guid = guid.replaceAll("-", "");
    return guid;
}

public static String convertHexToMSGUIDFormat(String hex) {
    // Same 4-2-2 byte-group reversal, but keep the dashed GUID format
    return hex.replaceAll("(.{8})(.{4})(.{4})(.{4})(.{12})", "$1-$2-$3-$4-$5")
              .replaceAll("(.{2})(.{2})(.{2})(.{2}).(.{2})(.{2}).(.{2})(.{2})(.{18})", "$4$3$2$1-$6$5-$8$7$9");
}
When I find a better way, I'll replace this quick fix.
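An alternative, if you'd rather avoid the regexes, is to do the same 4-2-2 byte-group reversal directly on the bytes; this is just a sketch of that idea, not part of the original answer:

import java.nio.ByteBuffer;
import java.util.UUID;

// Convert a Java UUID to the byte layout .NET's Guid.ToByteArray() / AD's objectGUID
// use: the first three fields are stored little-endian, the rest is unchanged.
public static byte[] toMsGuidBytes(UUID uuid) {
    byte[] be = new byte[16];
    ByteBuffer.wrap(be)
              .putLong(uuid.getMostSignificantBits())
              .putLong(uuid.getLeastSignificantBits());
    byte[] le = new byte[16];
    le[0] = be[3]; le[1] = be[2]; le[2] = be[1]; le[3] = be[0];   // 4-byte group reversed
    le[4] = be[5]; le[5] = be[4];                                 // 2-byte group reversed
    le[6] = be[7]; le[7] = be[6];                                 // 2-byte group reversed
    System.arraycopy(be, 8, le, 8, 8);                            // last 8 bytes as-is
    return le;
}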
UUID.fromString() works fine:
String guid = "ce095552-b466-4d03-ac41-430ec9286806";
UUID uuid = UUID.fromString(guid);
System.out.println(uuid);
Invalid long: ""ce095552"
Looks like the GUID you are passing to UUID.fromString() still includes quote characters ("). Make sure the GUID string does not include any extra characters and it should work.
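If the value really does arrive with stray quotes, stripping them first should be enough; a minimal sketch (the raw value here is just an assumption about what you are receiving):

String raw = "\"ce095552-b466-4d03-ac41-430ec9286806\"";  // value with stray quotes
UUID uuid = UUID.fromString(raw.replace("\"", ""));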
String guid = "41e72bd6-d38f-4f78-855f-160562262a54";
UUID uuid = UUID.fromString(guid);
ReadValueId readValueId = new ReadValueId(
        new NodeId(2, uuid),
        AttributeId.Value.uid(), null, QualifiedName.NULL_VALUE);
I am trying to insert a byte array into a blob column in my Cassandra table. I am using the DataStax Java driver. Below is my code:
for (Map.Entry<String, byte[]> entry : attributes.entrySet()) {
System.out.println("Key = " + entry.getKey() + ", Value = " + entry.getValue());
String cql = "insert into test_data (user_id, name, value) values ('"+userId+"', '"+entry.getKey()+"', '"+entry.getValue()+"');";
System.out.println(cql);
CassandraDatastaxConnection.getInstance();
CassandraDatastaxConnection.getSession().execute(cql);
}
And this is the exception I am getting back:
InvalidQueryException: cannot parse '[B@50908fa9' as hex bytes
I guess the problem is the way I am building the CQL above. Something is missing for sure.
I have created the table like this -
create table test_data (user_id text, name text, value blob, primary key (user_id, name));
Can anybody help me? Thanks...
The problem is that when you append the byte array to the String, Java calls toString() on the byte[], which prints the unhelpful object reference you are seeing. You need to manually convert it to a String appropriate for your data type. In your case you are using a blob, so you need to convert it to a hex string.
This question has code for converting the byte[] to String:
How to convert a byte array to a hex string in Java?
You can use one of those functions and prepend '0x' to it. Then you should have a valid String for your blob.
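For example, something along these lines should produce a literal the CQL parser accepts (a sketch; DatatypeConverter is just one stock way to hex-encode a byte[], and the table and variable names are taken from the question):

import javax.xml.bind.DatatypeConverter;

// Hex-encode the byte[] and prepend 0x so CQL parses it as a blob literal
String hex = "0x" + DatatypeConverter.printHexBinary(entry.getValue());
String cql = "insert into test_data (user_id, name, value) values ('"
        + userId + "', '" + entry.getKey() + "', " + hex + ");";
CassandraDatastaxConnection.getSession().execute(cql);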
I have an image stored in an Oracle database and I'm using Spring Data to retrieve the image.
@Query("SELECT c.binaryContent from ContentEntity c join c.ParentContentEntities pce where pce.SpecificEntity.id = :id and pce.contentType.id = 11")
byte[] getImageBinaryContent(@Param("id") Long id);

@Lob
@Column(name = "BINARY_CONTENT")
private byte[] binaryContent;
byte[] testImageArray = serviceLayer.getImageBinaryContent(id); // returns null
Tested individually, this query works and finds content, but when the call is made to getImageBinaryContent with an id, I get nothing back, just a null result. If I change the return type to Blob, I successfully get a Blob back.
Why can't I read a BLOB into a byte array directly? My searches have shown examples of returning the Blob and then converting it to a byte array with an InputStream, but it seems like I should be able to do this directly.
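For reference, the workaround mentioned above (returning the Blob and converting it yourself) looks roughly like this; a hedged sketch, where the repository method returning java.sql.Blob is an assumption, not code from the question:

import java.sql.Blob;
import java.sql.SQLException;

// Hypothetical: repository/service method declared to return Blob instead of byte[]
Blob blob = serviceLayer.getImageBinaryContentAsBlob(id);
byte[] imageBytes = blob.getBytes(1, (int) blob.length());  // Blob positions are 1-based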
There is a column message in the table fsr_system_log in the schema fsr_appl. This table is supposed to store system log messages. The size of the column is 255 and the datatype is VARCHAR2. The logic implemented for saving messages longer than 255 characters is:
public void saveSystemLog(SystemLogRequest systemLog){
User user = systemLog.getUser();
String system = systemLog.getSystem();
Log log = systemLog.getSystemLog();
try {
initializeDelegate();
delegate.beginTransaction();
LogEntry[] logEntries = log.getItemArray();
for (LogEntry logEntry : logEntries) {
// Save each entry
ParamVector<Object> params = new ParamVector<Object>();
//Check if message is greater than 255 characters
String message = logEntry.getMsg();
notifier.debug("Log Message is : " + message);
if(message.length()>255){
message = message.substring(0,255);
notifier().debug("Message string greater than 255 characters : " + message);
}
params.add(message, 255, false);
}
But despite this check, I still get the following error:
Failed to save system log due to SQL error: ORA-12899: value too large for column "FSR_APPL"."FSR_SYSTEM_LOG"."MESSAGE" (actual: 257, maximum: 255)
A sample log being used is:
<xbe:systemLogRequest xmlns:xbe="http://tdc.dk/fsr/common/xbean">
<user>
<userNumber>a62267</userNumber>
</user>
<system>Client</system>
<systemLog>
<item>
<timestamp>2011-10-27T17:03:08.404+02:00</timestamp>
<type>Info</type>
<msg><![CDATA[<html><center>Din registrering er nu sendt<br><br>Tak for indmeldingen</center></html>]]></msg>
</item>
<item>
<timestamp>2011-10-27T17:03:13.701+02:00</timestamp>
<type>Info</type>
<msg><![CDATA[<html><center>Din registrering er nu sendt<br><br>Tak for indmeldingen</center></html>]]></msg>
</item>
<item>
<timestamp>2011-10-28T12:45:47.801+02:00</timestamp>
<type>Info</type>
<msg><![CDATA[<html><center>Din registrering er nu sendt<br><br>Tak for indmeldingen</center></html>]]></msg>
</item>
<item>
<timestamp>2011-10-28T12:45:57.926+02:00</timestamp>
<type>Info</type>
<msg>Afsluttet uden at gemme fejlregistering</msg>
</item>
</systemLog>
</xbe:systemLogRequest>
Please help!
Note: The error is thrown only for certain system log messages, and the actual value is always exactly 257 when it happens.
Apparently you have some characters that are represented with more than one byte.
In Java you get the length of the String in characters, whereas Oracle checks the byte limit because of the way the column was defined.
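You can see the mismatch directly in Java; a tiny illustration (the Danish character is just an example, and UTF-8 is assumed as the database character set):

import java.nio.charset.StandardCharsets;

String s = "ø";                                                 // one character in Java
System.out.println(s.length());                                  // prints 1
System.out.println(s.getBytes(StandardCharsets.UTF_8).length);   // prints 2: two bytes in UTF-8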
Check the definition of your table: most probably the column was created with the default length semantics, which is BYTE (though this depends on your Oracle installation), so it effectively is VARCHAR2(255 BYTE).
If you redefine the column as VARCHAR2(255 CHAR), things should be fine.
Something like:
CREATE TABLE FSR_SYSTEM_LOG
(
...
MESSAGE VARCHAR2(255 Char),
...
);
To me it seems rather strange to store XML in such a length-limited column. Can you make sure your XML is never longer than 255 characters? Why not store the message as a CLOB?
You can change your field type to CLOB, BLOB, or XMLType. Looking at the Oracle data types, VARCHAR2 lengths can also be counted in bytes.
You might, understandably, not want to change your field type; in that case, replace
if(message.length()>255){
message = message.substring(0,255);
notifier().debug("Message string greater than 255 characters : " + message);
}
with
int pos = Math.min(message.length(), 255);
while (message.getBytes().length > 255) {
    message = message.substring(0, pos--);
}
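Note that getBytes() without an argument uses the platform default charset; a safer variant of the same idea pins the charset to whatever the database actually stores (UTF-8 assumed here):

import java.nio.charset.StandardCharsets;

// Trim characters until the encoded size fits the 255-byte column
int pos = Math.min(message.length(), 255);
while (message.substring(0, pos).getBytes(StandardCharsets.UTF_8).length > 255) {
    pos--;
}
message = message.substring(0, pos);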
At first sight, I suspect an encoding issue...
Java uses Unicode: a single character may take one or more bytes when encoded.
A Java string's length may be 1, but when marshalled to storage it can take 2 bytes, 4 bytes, or more.
To check the byte length, maybe you should try:
if (message.getBytes(StandardCharsets.US_ASCII).length > 255) {
What is the encoding of your Oracle DB?
Another possible issue is that the class that writes to the DB adds a CR/LF (0x0D0A) to the msg parameter, or something similar.
How can I encrypt database fields when using Hibernate?
We have developed a product that several clients are using, and some of them are asking about database encryption.
Is it possible to encrypt the data at the application level without many changes to the code?
Please give me a suggestion as soon as possible.
Try this:
Put an attribute in your entity:
private byte[] encryptedBody;
Use these getters and setters:
@Column(columnDefinition = "LONGBLOB", name = "encryptedBody")
@ColumnTransformer(
    read = "AES_DECRYPT(encryptedBody, 'yourkey')",
    write = "AES_ENCRYPT(?, 'yourkey')")
public byte[] getEncryptedBody() {
return encryptedBody;
}
public void setEncryptedBody(byte[] encryptedBody) {
this.encryptedBody = encryptedBody;
}
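Persisting then works like any other entity; the encryption happens in the SQL that Hibernate generates. A minimal sketch (the entity and session names are assumptions, not from the mapping above):

import java.nio.charset.StandardCharsets;

// Hibernate applies AES_ENCRYPT(?, 'yourkey') in the INSERT/UPDATE it generates
MyEntity entity = new MyEntity();   // hypothetical entity that owns encryptedBody
entity.setEncryptedBody("secret text".getBytes(StandardCharsets.UTF_8));
session.persist(entity);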
And then when you retrieve the column, use:
private final Charset UTF8_CHARSET = Charset.forName("UTF-8");
String decodeUTF8(byte[] bytes) {
return new String(bytes, UTF8_CHARSET);
}
String s = decodeUTF8(entity.getEncryptedBody());
BEWARE: AES_DECRYPT and AES_ENCRYPT belong to MySQL. If you have a different database engine, find similar functions.
Hope this helps.
You can use the @ColumnTransformer annotation like this:
@ColumnTransformer(
read = "pgp_sym_decrypt(" +
" storage, " +
" current_setting('encrypt.key')" +
")",
write = "pgp_sym_encrypt( " +
" ?, " +
" current_setting('encrypt.key')" +
") "
)
@Column(columnDefinition = "bytea")
private String storage;
This way, Hibernate will be able to encrypt the entity attribute when you persist or merge it and decrypt it when you read the entity.
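Since pgp_sym_encrypt and pgp_sym_decrypt read the key from current_setting('encrypt.key'), that setting has to be present on the session Hibernate uses. One way to do that, as a sketch (the property name matches the mapping above, everything else is an assumption):

import java.sql.Statement;

// Set the PostgreSQL session variable the column transformer reads the key from
session.doWork(connection -> {
    try (Statement st = connection.createStatement()) {
        st.execute("SET encrypt.key = 'my-secret-key'");
    }
});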
I think that you are looking for column transformers. You can find how to do it in the Hibernate reference:
http://docs.jboss.org/hibernate/core/3.6/reference/en-US/html/mapping.html#mapping-column-read-and-write
I hope that helps!
You could use jasypt. It has a Hibernate integration that allows you to encrypt properties while saving (and decrypt them while loading).
http://www.jasypt.org/hibernate.html
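Roughly, the jasypt route looks like this; a sketch from memory of the jasypt Hibernate integration docs, so treat the exact package and class names as assumptions and check the page above (they differ per Hibernate version):

import org.hibernate.annotations.Parameter;
import org.hibernate.annotations.Type;
import org.hibernate.annotations.TypeDef;
import org.jasypt.encryption.pbe.StandardPBEStringEncryptor;
import org.jasypt.hibernate4.encryptor.HibernatePBEEncryptorRegistry;
import org.jasypt.hibernate4.type.EncryptedStringType;

@TypeDef(name = "encryptedString", typeClass = EncryptedStringType.class,
         parameters = @Parameter(name = "encryptorRegisteredName",
                                 value = "strongHibernateStringEncryptor"))
public class Customer {

    // Values are encrypted transparently before they reach the column
    @Type(type = "encryptedString")
    private String creditCardNumber;

    // Call once at startup to register the encryptor under the name used above
    public static void registerEncryptor() {
        StandardPBEStringEncryptor encryptor = new StandardPBEStringEncryptor();
        encryptor.setPassword("my-encryption-password");
        HibernatePBEEncryptorRegistry.getInstance()
                .registerPBEStringEncryptor("strongHibernateStringEncryptor", encryptor);
    }
}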