This thread looks to be a little on the old side and therefore may no longer be relevant. Please see if there is a newer thread on the subject and ensure you're using the most recent build of any software if your question regards a particular product.
This thread has been locked and is no longer accepting new posts. If you have a question regarding this topic, please email us at support@mindscape.co.nz
|
Hello,
I am getting the following error when I try to save a 34K file as a BLOB to an Oracle database:
ORA-01460: unimplemented or unreasonable conversion requested
This is the code I am using to convert a file into a byte array:
private byte[] ConvertFileToBlob(string path)
{
    if (!File.Exists(path))
        throw new FileNotFoundException("File not found.", path);

    byte[] blob = new byte[new FileInfo(path).Length];
    using (FileStream fs = new FileStream(path, FileMode.Open, FileAccess.Read))
    {
        int offset = 0;
        // FileStream.Read may return fewer bytes than requested, so read in a loop.
        while (offset < blob.Length)
        {
            int read = fs.Read(blob, offset, blob.Length - offset);
            if (read == 0)
                throw new EndOfStreamException();
            offset += read;
        }
    }
    return blob;
}
Any help would be appreciated.
Thanks!
|
|
|
We can't reproduce this in our test environment. Could you provide us with your Oracle CREATE TABLE statement, your entity class definition, and the code which populates and saves the entity instance please? Also let us know whether you are using the Oracle9 provider or the Oracle9Odp (ODP.NET) provider. Thanks! |
|
|
I am attaching a sample program that shows the problem. Please note that I was able to get it to work in the sample program by calling SaveChanges twice; however, that trick did not work in my main system. I think this has something to do with the total size of BLOB data being saved in the transaction - but I could be wrong.
Thanks!
|
|
|
Thanks! We can now reproduce the problem and we are investigating. However I'd be interested to learn more about the problem you're seeing on the main system. I've tried saving a 1 MB BLOB as a single item, and that seems to work okay, so I don't think this is a size issue. How big is the BLOB that your main system is not able to save? |
|
|
Okay, here is some more info. There *is* a size issue, as you suspected, *but* it only affects batching, and it concerns individual BLOB size, not total size.

Basically, if you are committing more than one record at the same time, each BLOB is limited to about 32K. However, you *can* commit multiple BLOBs of up to 32K in the same batch (I have tried committing up to 4 just-under-32K BLOBs and that worked fine). And if you are committing only one record, rather than a batch, then the BLOB can be as big as you like. (I note that you may be seeing problems with saving even a single record containing a BLOB: if you can provide additional info then we will investigate this, but it seems to work fine for us. I have also tried saving a record containing two large BLOBs, and that works fine too.)

How can you address this problem? Obviously you can call SaveChanges after each add, but what if that's incorrect (e.g. for transactional purposes) or inconvenient? The simplest workaround is to set the LightSpeedContext's UpdateBatchSize to 1. This effectively suppresses LightSpeed batching: it will issue a separate command to the database for each entity, instead of batching 10 or so entities per command. In this non-batched scenario, the ORA-01460 issue does not kick in, and BLOBs of arbitrary size are allowed.

Note that suppressing batching is less efficient for 'small' records (those without BLOBs, or where all BLOBs are under 32000 bytes). If you have many 'small' records and fewer 'large' ones, you may want to see if you can add them separately, using a suitable UpdateBatchSize for each. I will log a feature request for us to identify inserts and updates that may run afoul of this issue and run them non-batched, but I can't make any promises! |
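To illustrate, here is a minimal sketch of the workaround. The entity name (Document), its Data property, and the file path are illustrative only; it assumes a LightSpeedContext named context and the usual LightSpeed unit-of-work pattern:

```csharp
// Suppress batching so each entity is sent to Oracle in its own command;
// the ~32K-per-BLOB batching limit then no longer applies.
context.UpdateBatchSize = 1;

using (var uow = context.CreateUnitOfWork())
{
    var doc = new Document
    {
        // Hypothetical path for illustration
        Data = ConvertFileToBlob(@"C:\files\report.pdf")
    };
    uow.Add(doc);
    uow.SaveChanges();
}
```

Setting UpdateBatchSize on the context affects all subsequent saves, so you may want to restore the original value afterwards if only some of your saves involve large BLOBs.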
|
|
That worked!!!
I just created a function that sets it to 1, saves the changes, and then restores the original value so the rest of the system isn't affected. It seems to work great.
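For anyone else reading, a sketch of that kind of wrapper might look like this (names are illustrative; it assumes access to the LightSpeedContext and the unit of work being saved):

```csharp
private void SaveChangesUnbatched(LightSpeedContext context, IUnitOfWork uow)
{
    int originalBatchSize = context.UpdateBatchSize;
    try
    {
        // Disable batching so large BLOBs avoid ORA-01460
        context.UpdateBatchSize = 1;
        uow.SaveChanges();
    }
    finally
    {
        // Restore the original value so the rest of the system is unaffected
        context.UpdateBatchSize = originalBatchSize;
    }
}
```

The try/finally ensures the batch size is restored even if SaveChanges throws.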
Thank you very much for your quick response!
Regards,
Gil
|
|