When I changed the target DDL in TD to remove the DOC_ID column and make the table NOPI / MULTISET, I get an error:
$FILE_READER[1]: TPT19003 LOB column data length error
The longest line in the sample JSON data file above is 1800 characters or less.
Regards,
Babar
If you want to read in only character data, you can only use record format "Text" or "Delimited".
You cannot use "Unformatted", as that is used for binary data where the schema defines EXACTLY how the data will be read in. And since your JSON data will vary in length, JSON(32000) will not work in the schema, because the file reader will be expecting exactly 32000 bytes.
Since you only have a single column, try "Delimited". The fact that you have no delimiter does not matter, because the end-of-record will supersede the delimiter at the end of the record.
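For example, the reader operator attributes might look something like this (directory path, file name, and schema name are placeholders, not taken from the original post):

```
DEFINE OPERATOR JSON_READER()
TYPE DATACONNECTOR PRODUCER
SCHEMA JSON_SCHEMA                           /* single JSON/VARCHAR column */
ATTRIBUTES
(
    VARCHAR DirectoryPath = 'C:\json',       /* placeholder path */
    VARCHAR FileName  = 'myRecords.json',
    VARCHAR OpenMode  = 'Read',
    VARCHAR Format    = 'Delimited'          /* end-of-record acts as the delimiter */
);
```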
--SteveF
Another thing you must do is change your INSERT statement from:
'INSERT INTO D1_STG_WEB.FRESH_JSON (
:DATA_JSON
);'
to:
'INSERT INTO D1_STG_WEB.FRESH_JSON VALUES (
:DATA_JSON
);'
--SteveF
Has this issue been resolved?
I am still facing the same issue. Here is my code:
DEFINE JOB LOAD_JSON_FROM_FILE
DESCRIPTION 'Load file using inserter operator'
(
    /*****************************/
    DEFINE SCHEMA FLIGHT_SCHEMA
    DESCRIPTION 'FLIGHT RECORD SCHEMA'
    (
        FLIGHTPACK JSON(6400)
    );
    /*****************************/
    DEFINE OPERATOR FLIGHT_READER()
    DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DATA CONNECTOR OPERATOR'
    TYPE DATACONNECTOR PRODUCER
    SCHEMA FLIGHT_SCHEMA
    ATTRIBUTES
    (
        VARCHAR DirectoryPath = 'C:\Research\TD 15 to Mongo',
        VARCHAR FileName = 'MBA.json',
        VARCHAR OpenMode = 'Read',
        VARCHAR Format = 'Delimited'
    );
    /*****************************/
    DEFINE OPERATOR FLIGHT_INSERTER
    DESCRIPTION 'Teradata PT INSERTER OPERATOR'
    TYPE INSERTER
    SCHEMA *
    ATTRIBUTES
    (
        VARCHAR TdpId = '192.168.111.129',
        VARCHAR UserName = 'dbc',
        VARCHAR UserPassword = 'dbc',
        VARCHAR PrivateLogName = 'SQL_inserter_log'
    );
    /*****************************/
    STEP export_to_file
    (
        APPLY ('INSERT INTO SAMPLES.FLIGHT_JSON VALUES (:FLIGHTPACK);')
        TO OPERATOR (FLIGHT_INSERTER())
        SELECT FLIGHTPACK FROM OPERATOR (FLIGHT_READER());
    );
);
Now, as you can see, here is the table as defined:
CREATE MULTISET TABLE SAMPLES.FLIGHT_JSON ,NO FALLBACK ,
     NO BEFORE JOURNAL,
     NO AFTER JOURNAL,
     CHECKSUM = DEFAULT,
     DEFAULT MERGEBLOCKRATIO
     (
      DOC_ID BIGINT GENERATED BY DEFAULT AS IDENTITY
           (START WITH 1
            INCREMENT BY 1
            MINVALUE -99999999999999999
            MAXVALUE 99999999999999999
            NO CYCLE),
      FLIGHTPACK JSON(8388096) CHARACTER SET UNICODE)
PRIMARY INDEX ( DOC_ID );
Now, when I try to run the above, I get the error -
Hello there,
Figured this out, guys. It was a simple syntax issue (see below):
STEP export_to_file
(
APPLY (
'INSERT INTO SAMPLES.FLIGHT_JSON (
FLIGHTPACK) VALUES (:FLIGHTPACK);'
)
TO
OPERATOR (FLIGHT_INSERTER() )
SELECT
FLIGHTPACK
FROM
OPERATOR (FLIGHT_READER());
);
);
Thanks.
Dears,
I have been unable to read JSON data from a flat file into Teradata v15. Here is my setup.
myRecords.json file contents: (4 JSON documents, each terminated by a newline character, i.e. one JSON doc per line; the first attribute of each JSON doc is "_id")
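For illustration only, a file with that layout would look something like this (the documents below are made up; only the one-document-per-line structure and the leading "_id" attribute match the description):

```
{"_id": 1, "flight": "BA117", "from": "LHR", "to": "JFK"}
{"_id": 2, "flight": "BA178", "from": "JFK", "to": "LHR"}
{"_id": 3, "flight": "QF2", "from": "SYD", "to": "LHR"}
{"_id": 4, "flight": "QF1", "from": "LHR", "to": "SYD"}
```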
Here is the table in Teradata v15 that I am trying to load this data into
SourceFormat is set to "unformatted". I am using the Data Connector Producer operator to read the file and sending the data to the Selector operator for inserting into TD.
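For context, the producer-to-consumer wiring in my APPLY step looks roughly like this (operator and table names here are placeholders, not my actual definitions):

```
STEP load_json
(
    APPLY ('INSERT INTO MyDb.MyJsonTable (JsonCol) VALUES (:JsonCol);')
    TO OPERATOR (JSON_WRITER())                     /* consumer: inserts rows */
    SELECT JsonCol FROM OPERATOR (JSON_READER());   /* producer: reads the file */
);
```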
However, I get this error all the time.
Can anyone point out the issue here? Any help / pointers would be much appreciated.