MeenakshiM 11 posts Joined 10/15
08 Apr 2016
Will increasing block size help handle the error: "The maximum possible Row length in the table is too large"?

Hi,
I am getting error 3933: "The maximum possible Row length in the table is too large". The table contains around 800 columns and is being pulled from Greenplum as the source.
Will increasing the block size help in this scenario? If yes, how do I do that?
Thanks,
Meenakshi

david.craig 73 posts Joined 05/13
08 Apr 2016

The 3933 error will not be removed by increasing the block size. Perm and response rows are currently limited to ~64KB, and that is the limit being exceeded. The source table must be split into two or more tables.
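For illustration only, here is a minimal sketch of such a vertical split, assuming a hypothetical wide staging table with an integer key id_col (all table and column names below are invented):

  -- Keep the key in both halves; distribute the ~800 columns between them.
  CREATE TABLE stg_wide_part1 (
      id_col  INTEGER NOT NULL,
      col_001 VARCHAR(50),
      col_002 VARCHAR(50)
      -- ... further columns, up to roughly half the total
  ) UNIQUE PRIMARY INDEX (id_col);

  CREATE TABLE stg_wide_part2 (
      id_col  INTEGER NOT NULL,
      col_401 VARCHAR(50),
      col_402 VARCHAR(50)
      -- ... remaining columns
  ) UNIQUE PRIMARY INDEX (id_col);

  -- Optional: a view that presents the two parts as one wide row again.
  CREATE VIEW v_stg_wide AS
  SELECT p1.id_col, p1.col_001, p1.col_002, p2.col_401, p2.col_402
  FROM stg_wide_part1 p1
  JOIN stg_wide_part2 p2 ON p1.id_col = p2.id_col;

Because both parts share the same primary index column, rows with the same id_col land on the same AMP and the join in the view stays AMP-local.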

ToddAWalter 316 posts Joined 10/11
08 Apr 2016

A review of data types may be in order as well. If there are a lot of CHAR/VARCHAR columns, for instance, review whether the defined lengths are really required.
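As a rough starting point (table and column names below are hypothetical), the actual data lengths can be compared against the defined lengths, and oversized columns redefined:

  -- Check how long the data in a few suspect columns really is.
  SELECT MAX(CHARACTER_LENGTH(col_001)) AS max_len_col_001,
         MAX(CHARACTER_LENGTH(col_002)) AS max_len_col_002
  FROM stg_wide;

  -- If a column never exceeds, say, 20 characters, something like
  --   col_001 VARCHAR(20) CHARACTER SET LATIN
  -- keeps the maximum possible row length down; UNICODE columns count
  -- 2 bytes per character, LATIN only 1.

Note that the 3933 check is based on the defined maximum lengths, not on the data actually loaded, so trimming the declarations helps even when the columns are mostly short or empty.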

MeenakshiM 11 posts Joined 10/15
10 Apr 2016

We had reviewed the column lengths and kept only the sizes really needed for our data, but the row is still exceeding the row length limit and throwing error 3933. Is there some way other than splitting the table into smaller tables?

dnoeth 4628 posts Joined 11/04
11 Apr 2016

Hi Meenakshi,
Greenplum is based on PostgreSQL, which has a much higher record size limit, so you will probably need to split the table (or start normalizing it).
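Just as a sketch of the normalization option (all names invented): groups of similar attribute columns can be turned into rows of a child table instead of hundreds of columns, e.g.

  -- One row per (id, attribute) instead of one column per attribute.
  CREATE TABLE stg_attr (
      id_col     INTEGER NOT NULL,
      attr_name  VARCHAR(60)  CHARACTER SET LATIN NOT NULL,
      attr_value VARCHAR(200) CHARACTER SET LATIN
  ) PRIMARY INDEX (id_col);

  -- Pivot selected attributes back into columns when needed.
  SELECT id_col,
         MAX(CASE WHEN attr_name = 'attr_001' THEN attr_value END) AS attr_001,
         MAX(CASE WHEN attr_name = 'attr_002' THEN attr_value END) AS attr_002
  FROM stg_attr
  GROUP BY id_col;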

Dieter

david.craig 73 posts Joined 05/13
11 Apr 2016

Large objects only take up about 50 bytes in the base table row for the object identifier, as the data is stored in subtable rows. This may avoid the 3933 error, but it can introduce other issues when processing LOBs. Refer to chapter 14 (database-level capacity planning) of the Database Design reference and the Data Types and Literals reference for more details.
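A minimal sketch of that approach, with invented names, declares the very long character columns as CLOBs so that only the identifier stays in the base row:

  -- Only the ~50-byte object identifier for long_text is stored in the
  -- base table row; the CLOB data itself goes to a LOB subtable.
  CREATE TABLE stg_wide_lob (
      id_col    INTEGER NOT NULL,
      long_text CLOB(2M) CHARACTER SET LATIN
  ) PRIMARY INDEX (id_col);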
