Teradata Studio
mahesh_mm1 6 posts Joined 05/14
09 May 2014
How to load key column data that contains leading spaces

Hi Friends,
 
In the source data, some key column values contain leading spaces, and we want to load that data as-is. Could you please give any suggestions on how to load it?
 

fgrimmer 553 posts Joined 04/09
09 May 2014

Mahesh, Are you loading data into Teradata Database? What tool are you using to load the data?

mahesh_mm1 6 posts Joined 05/14
10 May 2014

'Cust_Type VARCHAR(10)' is the key column. We would like to load data like ' CST1234', where some of the values have leading spaces. Our customer wants that type of data loaded as-is. Could you please provide any suggestions?
We are using Teradata 12.

dnoeth 4628 posts Joined 11/04
10 May 2014

I still don't get it:
Do you want to load those leading spaces or not?
If yes, then you should be fine, as TD does not remove leading blanks by default. If they are being removed, it must be happening explicitly during the load process.
If no, you can easily add a TRIM(col) during INSERT.
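A minimal sketch of that TRIM-during-INSERT approach, assuming a hypothetical staging table and target table (customer_stg and customer_tgt are illustrative names, not from the thread):

```sql
-- Strip the leading blanks while inserting from staging into the target;
-- TRIM(LEADING ... FROM ...) is ANSI SQL and is supported by Teradata.
INSERT INTO customer_tgt (cust_type)
SELECT TRIM(LEADING ' ' FROM cust_type)
FROM customer_stg;
```

Plain TRIM(cust_type) would remove trailing blanks as well, so LEADING is the safer choice if only the spaces at the front should go.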

Dieter

fgrimmer 553 posts Joined 04/09
12 May 2014

Mahesh, Teradata Studio provides a Load utility that accepts Excel or delimited files. You can also specify the character string delimiter. You can quote the strings, as you show above, to capture the leading blank.
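For example, a delimited input file along these lines (illustrative data, comma as the delimiter, double quote as the character string delimiter) would preserve the leading blank inside the quoted value:

```
cid,cust_type
101," CST1234"
102,"CST5678"
```

Here the first row's cust_type value is loaded as ' CST1234', with the leading space intact.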

mahesh_mm1 6 posts Joined 05/14
13 May 2014

The Teradata Load utility is OK. But is there also any possibility of a function to handle data like 'CSTT1234'? Could you please advise?

fgrimmer 553 posts Joined 04/09
13 May 2014

Mahesh, Sorry, I am not sure what you are asking. The Load utility is part of Teradata Studio. You can download Teradata Studio from the Developer Exchange Download page. Have you downloaded Teradata Studio?

mahesh_mm1 6 posts Joined 05/14
13 Jun 2014

Hi friends,
Why do we say that the primary index is a single-AMP operation? Suppose we are loading a large volume of data through that AMP. Why does it not use more AMPs, the way a secondary index does? Meanwhile, if that AMP fails, what is the situation? Can I get a clarification?

VandeBergB 182 posts Joined 09/06
13 Jun 2014

Mahesh,
You don't load through an AMP directly. The data is hashed by the system and distributed across the AMPs via the PI. Single-AMP operations are typically queries where the only argument in the predicate is something like 'WHERE PIColumn = scalar value', with no residual conditions.
Your large-volume data loads will all be hashed through the hash map and distributed across the AMPs, if you've picked a good PI.
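If you want to check how evenly a candidate PI spreads the rows, a query along these lines (using Teradata's built-in hash functions, against the customer table and cartid column mentioned later in this thread) shows the row count landing on each AMP:

```sql
-- HASHROW computes the row hash for the PI value, HASHBUCKET maps that
-- hash to a hash bucket, and HASHAMP maps the bucket to its owning AMP.
SELECT HASHAMP(HASHBUCKET(HASHROW(cartid))) AS amp_no,
       COUNT(*) AS row_cnt
FROM customer
GROUP BY 1
ORDER BY 2 DESC;
```

A heavily skewed result (a few AMPs with most of the rows) is the sign of a poorly chosen PI.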
Cheers!

Some drink from the fountain of knowledge, others just gargle.

mahesh_mm1 6 posts Joined 05/14
17 Jun 2014

In my case, the Customer table has cid as the primary key column and cartid as the primary index column. I want to load 30 lakh (3 million) records into this customer table, with some calculations as well, and for that I prepared a BTEQ script. Does that script run on a single AMP or on multiple AMPs? The baseline understanding is that a primary index load is a one-AMP operation. How is this handled?
Do you mean it is the script level that is not single-AMP? Can I know the reason?
              
