
Hi All,
What is the best option to compress BLOB data: BLC vs. ALC?
We are loading large volumes of binary files into Teradata and are thinking of using ALC. Are there UDFs to compress and uncompress BLOB data?
Thanks,
Khom
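For reference, ALC is declared per column with the COMPRESS USING / DECOMPRESS USING clauses. The 13.10 built-in ALC functions target character data, so a binary column generally needs a user-supplied UDF pair. A minimal sketch, where the UDF names are hypothetical placeholders, not shipped functions:

```sql
CREATE MULTISET TABLE blob_stage
(
  file_id  INTEGER NOT NULL,
  -- SYSUDTLIB.my_byte_zip / my_byte_unzip are hypothetical
  -- user-written ALC UDFs, registered before this DDL runs
  payload  VARBYTE(64000)
           COMPRESS USING SYSUDTLIB.my_byte_zip
           DECOMPRESS USING SYSUDTLIB.my_byte_unzip
)
PRIMARY INDEX (file_id);
```

The compress UDF is applied when a row is stored and the decompress UDF when it is read, so the choice of algorithm trades CPU on every access against space savings.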

Hi Guys,

I am doing a FastLoad into an empty table that has one column with algorithmic compression (ALC) using TransUnicodeToUTF8. Below is my table structure.

CREATE MULTISET TABLE SCPLN_W.SAMPLE1, NO FALLBACK
(
C1 VARCHAR(1) NOT NULL,
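The posted DDL is cut off above. For context, a complete column declaration using the built-in TransUnicodeToUTF8 / TransUTF8ToUnicode pair typically looks like this (a sketch; the second column and its size are made up):

```sql
CREATE MULTISET TABLE SCPLN_W.SAMPLE_ALC, NO FALLBACK
(
  C1 VARCHAR(1) NOT NULL,
  -- C2 is a hypothetical Unicode column; ALC stores it as UTF-8
  C2 VARCHAR(200) CHARACTER SET UNICODE
     COMPRESS USING TD_SYSFNLIB.TransUnicodeToUTF8
     DECOMPRESS USING TD_SYSFNLIB.TransUTF8ToUnicode
)
PRIMARY INDEX (C1);
```

This pair pays off when the Unicode data is mostly single-byte characters, since UTF-8 then needs roughly half the bytes of the UTF-16 internal form.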

We have a basic (test) ALC UDF (actually just a dummy), and it works.
But we are not able to return -1 to indicate that we cannot compress the value, because the INSERT fails with:
INSERT Failed. 7509: Result Exceeded maximum length for UDF/XSP/UDM SYSUDTLIB.alc1_num.
 
void alc1_num(   VARCHAR_LATIN *inStr,
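Error 7509 is usually tied to the declared result length of the UDF rather than to the C body: whatever the function returns, including any "cannot compress" fallback, must fit in the RETURNS type it was registered with. A hedged registration sketch, with hypothetical names and sizes:

```sql
-- Hypothetical registration of a Latin ALC compress UDF.
-- RETURNS must cover the worst case the C body can produce
-- (e.g. incompressible input returned unchanged), otherwise
-- the INSERT fails with error 7509.
CREATE FUNCTION SYSUDTLIB.alc1_num (p VARCHAR(100) CHARACTER SET LATIN)
RETURNS VARBYTE(200)   -- sized above the input as a cushion
LANGUAGE C
NO SQL
DETERMINISTIC
PARAMETER STYLE TD_GENERAL
EXTERNAL NAME 'CS!alc1_num!alc1_num.c';
```

If the intent is "do not compress this value", returning the input bytes unchanged (within the declared result length) is a safer convention than signaling with a negative length.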

Go in-depth on how NULLs and Compression are managed in a Teradata system.

The ALC (Algorithmic Compression) test package contains UDFs that simulate the TD 13.10 built-in compression functions, test templates for Latin and Unicode character columns, and step-by-step instructions. It is intended for Teradata users to run against specific data at the column level to determine the compression rates of the TD 13.10 built-in compression algorithms; the results provide the information needed to select an appropriate algorithm for the data at hand. The tests use read-only operations and can be executed on any release that supports UDFs (V2R6.2 and forward). It is recommended to run them off peak hours, as they are CPU-bound and use a significant amount of system resources.
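In the same spirit as the test package, a rough compression-rate estimate can be obtained with a read-only query that compares byte counts before and after applying a candidate function. A sketch, assuming TD_SYSFNLIB.TransUnicodeToUTF8 can be invoked directly on your release and that the table and column names (SCPLN_W.SAMPLE1, descr) stand in for your own:

```sql
-- Rough ALC savings estimate for a Unicode VARCHAR column.
-- Raw size assumes the UTF-16 internal form (2 bytes per character).
SELECT
  SUM(CHAR_LENGTH(descr) * 2)                        AS raw_bytes,
  SUM(BYTES(TD_SYSFNLIB.TransUnicodeToUTF8(descr)))  AS utf8_bytes
FROM SCPLN_W.SAMPLE1;
```

Comparing raw_bytes with utf8_bytes over a representative sample gives the expected saving before any DDL is changed; the same pattern works with the package's test UDFs substituted for the built-in.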

Teradata 13.10 provides the Algorithmic Compression (ALC) feature, which allows users to apply compression/decompression functions to a specific column of character or byte type. These functions may be the Teradata built-in functions shipped with ALC, or user-provided compression/decompression algorithms registered as UDFs.