DBD::Oracle and CLOBs
on 28.10.2005 15:50:13 by steves06
I've read all the threads out there on this but am not sure if
there's a known general solution or not.
We have a table with one CLOB and a number of other large VARCHAR2
columns. The VARCHAR2 columns are mostly 4000 characters. We can
get the CLOB data to bind and insert but eventually we start getting
those ORA-01461 errors:
ORA-01461: can bind a LONG value only for insert into a LONG column
The fewer of the VARCHAR2 columns we include in the insert, the
better it seems to go. That is, it seems to be the combination of
the CLOB and the big VARCHAR2 columns together that makes things
fail. I've tried explicitly binding all columns using ora_type
settings (ORA_VARCHAR2) and I've tried using the ora_maxdata_size
with those. It fails faster when I bind the VARCHAR2 columns.
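For reference, the kind of explicit bind described above would look roughly like this in DBD::Oracle (table, column, and variable names are hypothetical; this is a sketch of the attempt, not a known fix):

```perl
use DBI;
use DBD::Oracle qw(:ora_types);

# Hypothetical schema; sketches the explicit per-column binds tried above.
my $dbh = DBI->connect('dbi:Oracle:mydb', 'user', 'pass',
                       { RaiseError => 1, AutoCommit => 0 });

my ($summary, $clob_data) = ('...', '...');

my $sth = $dbh->prepare(
    'INSERT INTO docs (id, summary, body) VALUES (:id, :summary, :body)');

$sth->bind_param(':id', 1);
$sth->bind_param(':summary', $summary,
    { ora_type => ORA_VARCHAR2, ora_maxdata_size => 4000 });
$sth->bind_param(':body', $clob_data, { ora_type => ORA_CLOB });
$sth->execute;
```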
In all cases so far, the actual data is a lot smaller than the
maximum column sizes. I've read about the possible character set
issue, where it may use twice as much space for a double byte
encoding. But the failures start with data that's less than half
the size of the column maximums.
It really seems like there's some shared buffer size under there
or something.
We have not tried inserting the CLOB data by itself (doing a two
pass insert).
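For what it's worth, that two-pass approach would look something like this under the same hypothetical schema (assumes an open $dbh): insert the row with EMPTY_CLOB(), then write the CLOB through its own bind.

```perl
use DBD::Oracle qw(:ora_types);

# Two-pass insert (sketch; table and column names are hypothetical):
# pass 1 creates the row with an empty CLOB locator, pass 2 writes
# the CLOB data through its own ORA_CLOB bind.
$dbh->do('INSERT INTO docs (id, body) VALUES (?, EMPTY_CLOB())',
         undef, $id);

my $upd = $dbh->prepare('UPDATE docs SET body = ? WHERE id = ?');
$upd->bind_param(1, $clob_data, { ora_type => ORA_CLOB });
$upd->bind_param(2, $id);
$upd->execute;
```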
Any ideas appreciated. This is on Solaris using DBD::Oracle version
1.16; DBI version 1.37.
--
Steve Sapovits steves06@comcast.net
Re: DBD::Oracle and CLOBs
on 28.10.2005 21:41:29 by Tim.Bunce
Last time this came up (which was recently) Jared Still [CC'd]
reported that he'd found several non-perl related references to this
error on Metalink. They were related to improper NLS settings.
Perhaps Jared, or someone else, could provide some specific
Metalink references.
Tim.
On Fri, Oct 28, 2005 at 09:50:13AM -0400, Steve Sapovits wrote:
> [original message quoted in full; snipped]
Re: DBD::Oracle and CLOBs
on 30.10.2005 01:50:54 by steves06
Tim Bunce wrote:
> Last time this came up (which was recently) Jared Still [CC'd]
> reported that he'd found several non-perl related references to this
> error on Metalink. They were related to improper NLS settings.
This does appear either NLS or character set related. First, it's
not the CLOB column that's a problem. We were looking there since
that was relatively new. The problem columns are VARCHAR2(4000)
columns. The data coming in is from an XML package and is in UTF-8.
Our default Oracle character set is ISO-8859-1. If I add a filter
to convert from UTF-8 to ISO-8859-1, everything works. My guess at
this point is that Oracle is choking trying to convert the UTF-8 to
ISO-8859 and that this is only an issue for the VARCHAR2 columns, not
the CLOB. I'll have to investigate the whole Oracle character set
thing a bit more. Some threads I found Googling this indicated that
a VARCHAR2 of 4000 might need to have its input data truncated to 2000
(halved) if there was a conversion that would take one-byte characters
to two. Even though that should not be the case here (ISO-8859 is a
single byte character set) I tried that and still got errors. One thing
I want to try yet is setting Oracle's character set to UTF-8 and see
what the behavior is.
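The doubling concern is easy to see in isolation: any Latin-1 character above 0x7F takes two bytes in UTF-8, so a conversion in that direction can double byte lengths. A self-contained illustration using the core Encode module:

```perl
use Encode qw(encode);

# A 4000-character string of 'é' (one byte each in ISO-8859-1)
# needs 8000 bytes once encoded as UTF-8.
my $latin1 = "\xe9" x 4000;
my $utf8   = encode('UTF-8', $latin1);
print length($latin1), " characters, ", length($utf8), " UTF-8 bytes\n";
# prints "4000 characters, 8000 UTF-8 bytes"
```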
If these points ring a bell with anyone and you can fill in the gaps
it would be appreciated. I'd like to write something up on what this
all means for future internal reference.
The good news is that there do not seem to be any DBI or DBD::Oracle
issues.
For UTF-8 conversions, CPAN has Unicode::MapUTF8, if anyone else ends up
down this path.
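As an alternative to Unicode::MapUTF8, the Encode module (core since Perl 5.8) can do the same UTF-8-to-Latin-1 filtering; a minimal sketch:

```perl
use Encode qw(from_to);

# Convert UTF-8 octets to ISO-8859-1 in place before binding;
# characters outside Latin-1 become a substitution character.
my $text = "caf\xc3\xa9";                 # "café" as UTF-8 bytes
from_to($text, 'UTF-8', 'iso-8859-1');
# $text now holds the 4-byte Latin-1 encoding.
```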
--
Steve Sapovits steves06@comcast.net
Re: DBD::Oracle and CLOBs
on 02.11.2005 02:19:45 by jkstill
Here are a few likely-looking suspects. There were quite a few,
but I just grabbed some that seemed to hold some promise.
You should probably get on MetaLink to do a full review.
----
These are excerpts only from the notes, so you can get an idea
of the problem each document addresses; I won't post the entire
documents.
=============================================================================
ML Note: 241358.1
There are a number of ways to "hit" this bug, and some of the
workarounds can be more or less relevant depending on the exact
circumstances. However, in all cases the problem will be down to using a
single byte client character set and a multibyte database character set.
If that is not a setup you use then this is not a problem you have hit.
If you indeed have a setup like that then there is a good chance that
some of the workarounds given below will solve your problem.
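To check whether that single-byte-client/multibyte-database combination applies, one can compare the client's NLS_LANG with the database character set; a quick sketch (assumes an open $dbh):

```perl
# Client side: the NLS_LANG environment variable (unset means the default).
my $client = $ENV{NLS_LANG} || '(not set)';

# Database side: the database character set from the data dictionary.
my ($db_cs) = $dbh->selectrow_array(
    q{SELECT value FROM nls_database_parameters
      WHERE parameter = 'NLS_CHARACTERSET'});

print "client NLS_LANG:  $client\n";
print "database charset: $db_cs\n";
```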
=============================================================================
ML Note: 97047.1
Problem Description
-------------------
When attempting to load data into a CLOB field from a varchar2 variable
using an INSERT statement, you receive the following error:
ORA-01461: can bind a LONG value only for insert into a LONG column
The error occurs if the number of characters you are trying to insert exceeds
a certain value (i.e. more than 4000).
=============================================================================
ML Note: 280341.1
(This one may have to do with DBD::Oracle - someone else will have to check)
Errors
ORA-1461 can bind a LONG value only for insert into a LONG column
Symptoms
When you are inserting data from an OCI application you get
ORA-1461 can bind a LONG value only for insert into a LONG column
Cause
In the OCIBindByName function you gave a value for the Parameter
value_sz (IN) larger than the corresponding column on the database where
the data was to be inserted.
Fix
Set the value of the parameter value_sz at or smaller than the size of
the corresponding column on the database.
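In DBD::Oracle terms, the closest handle on OCI's value_sz appears to be the ora_maxdata_size bind attribute (an assumption worth verifying against the DBD::Oracle docs), so the equivalent of this fix would be to cap it at the column's size. Sketch, with a hypothetical column name:

```perl
# Cap the maximum post-conversion data size at the column's byte
# size so OCI does not promote the bind to a LONG.
$sth->bind_param(':summary', $summary,
    { ora_type => ORA_VARCHAR2, ora_maxdata_size => 4000 });
```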
=============================================================================
On Fri, 2005-10-28 at 20:41 +0100, Tim Bunce wrote:
> Last time this came up (which was recently) Jared Still [CC'd]
> reported that he'd found several non-perl related references to this
> error on Metalink. They were related to improper NLS settings.
>
> Perhaps Jared, or someone else, could provide some specific
> Metalink references.
>
> Tim.
> [earlier messages quoted in full; snipped]