Code = 2644 No more room in database - response (8) by KVB
Teradata Parallel Transporter - Session Character Set - response (24) by MaximeV
Hi Santanu84,
2 questions:
Can you provide the character set used on your Oracle source, which seems to manage both those extended ASCII chars and Chinese characters?
On the Informatica side, the only parameter you can modify related to the character set is the one on the relational source used by TPT, am I right?
Data mismatch while migrating data from SAS to Teradata using TPT - response (9) by feinholz
Again, I cannot help until I have more detailed information about the TPT operator.
I need verification that the job is using a tool such as DataStage to run the TPT operators (via TPTAPI).
I then need some type of tracing diagnostics for the TPT operators.
The information you are providing might be ok for some, but not when trying to diagnose a TPT task.
(For example, the diagnostic information will tell me exactly how many rows were given to the operator to send to Teradata. That might be a clue as to what is going on.)
End of Record marker with TPT - response (2) by feinholz
Please explain this answer:
"When the value (no. of character) is more/less than the buffer size, it will error out. It need to be exactly same number of character defined in the schema if you are not using delimiter in the last column"
I am unaware of such a rule (that the size of the field must be exactly the same number of characters defined in the schema).
The use of a terminating delimiter is OPTIONAL.
Where can I download the Teradata Parallel Transport API for all platforms? - response (2) by richa.prvr
Hi,
I have installed Teradata 14.00 on the SUSE Linux VMware machine provided by Teradata. Could anyone tell me where I can download the TPT API, and the steps for its installation?
Thanks,
Richa
Where can I download the Teradata Parallel Transport API for all platforms? - response (3) by feinholz
It is part of the TPT installation. There is no separate installation for TPTAPI.
Teradata Wallet - response (5) by MaxG
Quoting "EOF" and escaping the dollar-sign do the same thing -- prevent the shell from interpreting the $tdwallet keyword as a shell script variable.
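A minimal shell sketch of the two equivalent approaches (generic Bourne-shell behavior; the logon string and wallet key name here are made up for illustration):

```shell
# 1. Unquoted here-document with the dollar sign escaped:
line1=$(cat <<EOF
.logon tdpid/user,\$tdwallet(MyPasswordKey)
EOF
)

# 2. Here-document delimiter quoted, so the shell expands nothing:
line2=$(cat <<'EOF'
.logon tdpid/user,$tdwallet(MyPasswordKey)
EOF
)

# Both produce the literal text $tdwallet(MyPasswordKey),
# which is what the Teradata client needs to see.
echo "$line1"
echo "$line2"
```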
Issue while loading space delimited flat file using fastload on TPT - forum topic by EUsha
I am trying to load a flat file that is delimited with spaces and has 8 columns. While loading this data using TPT, I am getting the error "Delimited Data Parsing error: Column length overflow(s) in row 1". Even when I try to convert the space delimiter to a pipe, I get a number-of-columns mismatch in the flat file.
Any suggestions as to why a flat file with a SPACE delimiter is not working with TPT? Also, there are missing columns in this flat file, but we have used the ACCEPTMISSINGCOLUMNS option. Is this correct?
Sample data (which has multiple spaces between columns) is given below along with the log and the script that is run.
Data:
====
d1 d2 d3 d4 d5 d6 d7 d8
1001 09/01/1995 09/03/1997 2 112 14233 1001 0
1001 05/02/2008 12/02/2008 2 447 14189 9001 Odkp27uEjCEaByuZLQgXw6kbb88bmPwUfGAEnaH0mg0=
1001 09/01/1995 09/03/1997 2 112 84 1001 kVbMu8z5RMJprze4ob1AX/IU6X3lDT6oIMbPgJyPbt0=
1001 18/07/2003 18/07/2003 2 325 35 9001 0
1001 05/02/2008 12/02/2008 2 447 14172 9001 0
1001 05/02/2008 12/02/2008 2 447 8687 9001 0
1001 05/02/2008 12/02/2008 2 447 14173 9001 0
1001 03/06/2010 03/06/2010 2 551 15987 219001 MJTjTPGHOlOGg1LgSqUMXrB/4yfz4EGHeUfsrzGj780=
1001 20/07/2010 20/07/2010 2 561 2632 0 27YEsg3G78SuGTcI4zcPCiIHK8wNsWcTrtU6glepyCA=
1001 18/12/2001 19/12/2001 2 271 13607 9001 0
1001 18/12/2001 19/12/2001 2 271 7578 9001 0
1001 18/12/2001 19/12/2001 2 271 77 9001 0
1001 16/05/2001 17/05/2001 2 253 63 9001 Lc8ybk+g5Iu9iF9eyCF0hL+9E8AlC3wTTOvRPzOQYf4=
1001 16/10/2003 13/11/2003 2 349 13745 9001 1lKpwbahAH9QZW7hSqCaBqrIhSGoKF4TDYCWwNf/ZbE=
1001 16/10/2003 28/10/2003 2 346 7926 9001 OXPTHvQ39elodq8CjRdpj5iqH40bhSLscBZjGicPIPU=
1001 11/02/2008 11/02/2008 2 444 7326 0 2R/fiQqH9AlFd2tpGWii+peuoh9x2DywdGn+wG1QpVc=
1001 03/11/1998 03/12/1998 2 173 1004 9001 0
1001 26/05/2010 26/05/2010 2 548 7326 0 MJTjTPGHOlOGg1LgSqUMXr0rffpBEfqVFyScrFH5Gzc=
1001 20/01/2010 21/01/2010 2 523 10289 219001 0
1001 16/10/2003 20/10/2003 2 339 8491 9001 0
1001 16/10/2003 23/10/2003 2 344 8491 9001 0NaMPHiuGhfoZy7cv1QNxqHxIsPjhQRUny2OZ69CMqA=
1001 16/05/2001 17/05/2001 2 252 66 9001 0
1001 16/05/2001 17/05/2001 2 252 23 9001 0
1001 18/12/2001 19/12/2001 2 271 13721 9001 0
1001 16/05/2001 17/05/2001 2 252 24 9001 0
1001 16/05/2001 17/05/2001 2 252 32 9001 0
1001 20/01/2010 21/01/2010 2 522 64 219001 lfYay0a9dVu2wcF1OHpuBguP5n57iCep6GxIliM1DV0=
1001 29/05/2009 29/05/2009 2 485 14563 219001 0
TPT Log:
==========
Teradata Parallel Transporter Version 14.00.00.10
Job log: /opt/teradata/client/14.00/tbuild/logs/plk1-40.out
Job id is plk1-40, running on us111
Teradata Parallel Transporter DataConnector_C2: TPT19006 Version 14.00.00.10
DataConnector_C2 Instance 1 directing private log report to 'STG_DB.TPT_TEST'.
DataConnector_C2: TPT19008 DataConnector Producer operator Instances: 1
Teradata Parallel Transporter SQL Inserter Operator Version 14.00.00.10
Insert_TPT_TEST2: private log specified: STG_DB.TEST_Space
DataConnector_C2: TPT19003 ECI operator ID: DataConnector_C2-382
Insert_TPT_TEST2: connecting sessions
DataConnector_C2: TPT19222 Operator instance 1 processing file '/COPY/RAW_DATA/test_10000.txtab'.
Insert_TPT_TEST2: Total Rows Sent To RDBMS: 0
Insert_TPT_TEST2: Total Rows Applied: 0
Insert_TPT_TEST2: disconnecting sessions
Insert_TPT_TEST2: Total processor time used = '0.16 Second(s)'
Insert_TPT_TEST2: Start : Wed Mar 19 12:04:37 2014
Insert_TPT_TEST2: End : Wed Mar 19 12:04:41 2014
Job step Load_TPT_TEST2 terminated (status 12)
Job plk1 terminated (status 12)
DataConnector_C2: TPT19350 I/O error on file '/COPY/RAW_DATA/test_10000.txtab'.
DataConnector_C2: TPT19003 Delimited Data Parsing error: Column length overflow(s) in row 1
DataConnector_C2: TPT19003 TPT Exit code set to 12.
DataConnector_C2: TPT19221 Total files processed: 0.
TPT Script:
===========
DEFINE JOB Load_TPT_TEST2
DESCRIPTION 'Load a Teradata table from a space delimited flat file' (
DEFINE SCHEMA Schema_TAB2 (
D1 VARCHAR(20),
D2 VARCHAR(10),
D3 VARCHAR(10),
D4 VARCHAR(3),
D5 VARCHAR(20),
D6 VARCHAR(20),
D7 VARCHAR(20),
D8 VARCHAR(50)
);
DEFINE OPERATOR DataConnector_C2
TYPE DATACONNECTOR PRODUCER
SCHEMA Schema_C2
ATTRIBUTES (
VARCHAR PrivateLogName = 'STG_DB.TPT_TEST',
VARCHAR FileName = '/COPY/RAW_DATA/test_10000.txtab',
VARCHAR TraceLevel = 'All',
VARCHAR FORMAT = 'Delimited',
VARCHAR TextDelimiter = 'space',
VARCHAR OpenMode = 'read',
VARCHAR AcceptMissingColumns = 'Y'
);
DEFINE OPERATOR Insert_TPT_TEST2
TYPE INSERTER
SCHEMA *
ATTRIBUTES (
VARCHAR PrivateLogName = 'STG_DB.TEST_Space',
VARCHAR TdpId = 'xx.xxx.xxx.xx',
VARCHAR UserName = 'USER1',
VARCHAR UserPassword = 'USER1',
VARCHAR TargetTable = 'STG_DB.TPT_TEST',
VARCHAR LogTable = 'STG_DB.TPT_TEST_L',
VARCHAR ErrorTable1 = 'STG_DB.TPT_TEST_E1',
VARCHAR ErrorTable2 = 'STG_DB.TPT_TEST_E2',
VARCHAR WorkTable = 'STG_DB.TPT_TEST_WT'
);
STEP Load_TPT_TEST2 (
APPLY (
'INSERT INTO STG_DB.TEST_Space (
D1,
D2,
D3,
D4,
D5,
D6,
D7,
D8
)
VALUES (
:D1,
:D2,
:D3,
:D4,
:D5,
:D6,
:D7,
:D8
);'
)
TO OPERATOR (
Insert_TPT_TEST2[1]
)
SELECT
D1,
D2,
D3,
D4,
D5,
D6,
D7,
D8
FROM OPERATOR (
DataConnector_C2[1]
);
);
);
Code = 2644 No more room in database - response (9) by cs.sonu@gmail.com
Hi, I am loading a file using a FastLoad script.
The file was created using FastExport from a Teradata table of 50 columns.
Only 8 columns are used in the file creation; the rest are '' so they go in as NULL. The file is pipe delimited.
When I run FastLoad, the loaded table is twice the size of the first Teradata table (the one used for creating the file).
What could be the reason for this big size?
Thanks,
Abhishek
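To pin down where the extra space went, a size query against the DBC views can help (a sketch; the view name differs on older releases, and the database/table names here are placeholders):

```sql
SELECT DatabaseName, TableName, SUM(CurrentPerm) AS TotalPerm
FROM DBC.TableSizeV              -- DBC.TableSize on older releases
WHERE DatabaseName = 'MyDb'      -- placeholder
  AND TableName IN ('SourceTable', 'LoadedTable')   -- placeholders
GROUP BY 1, 2
ORDER BY 3 DESC;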
Issue while loading space delimited flat file using fastload on TPT - response (1) by krishaneesh
Can you skip the first row and try? I believe the header is also being read.
Issue while loading space delimited flat file using fastload on TPT - response (2) by feinholz
The TextDelimiter you have defined will look for a delimiter made up of the characters (the word) "space". You need to actually tell us the characters, and they must all be the same for every column separator on every row.
In other words, your delimiter cannot be 4 spaces one time and 3 spaces another.
Thus, if you want the delimiter to be 4 space characters, you need to do:
VARCHAR TextDelimiter = '    '     <----- 4 space characters within the quotes
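When the file cannot be regenerated with a fixed-width separator, one common workaround (a pre-processing sketch, not a TPT feature) is to collapse every run of spaces into a single pipe and load with TextDelimiter = '|'; this only works if the data values themselves contain no spaces:

```shell
# Build a small sample line with uneven runs of spaces between columns.
printf '1001 09/01/1995  2   112\n' > /tmp/space_delim.txt

# Collapse each run of one or more spaces into a single pipe delimiter.
awk '{ gsub(/ +/, "|"); print }' /tmp/space_delim.txt > /tmp/pipe_delim.txt

cat /tmp/pipe_delim.txt
```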
BTEQ in Windows Environment - response (2) by john.cantu
Thank you, Fred. Appreciate the feedback!
Error in MLOAD - forum topic by Jugalkishorebhatt1
Hello All,
This is the first time I am running an MLOAD script. Please help me find the error.
Script:
.LOGTABLE DB.logs2;
.LOGON Jugal/jbhatt,jugal;
CREATE MULTISET TABLE DB.Mload_Input ,NO FALLBACK ,
NO BEFORE JOURNAL,
NO AFTER JOURNAL,
CHECKSUM = DEFAULT,
DEFAULT MERGEBLOCKRATIO
(
Empid INTEGER,
EmpName VARCHAR(5) CHARACTER SET LATIN CASESPECIFIC)
PRIMARY INDEX ( Empid );
.BEGIN IMPORT MLOAD TABLES DB.Mload_Input;
.LAYOUT S1;
.FIELD EmpId * VARCHAR(10);
.FIELD EmpName * VARCHAR(5);
.DML LABEL L1;
INSERT into DB.Mload_Input values(:EmpId,:EmpName);
.IMPORT INFILE /home/jbhatt/data.txt FORMAT VARTEXT ','
LAYOUT S1
APPLY L1;
.END MLOAD;
.LOGOFF;
Logs:
$ mload<MLOAD.txt
========================================================================
= =
= MultiLoad Utility Release MLOD.14.00.00.08 =
= Platform LINUX =
= =
========================================================================
= =
= Copyright 1990-2011 Teradata Corporation. ALL RIGHTS RESERVED. =
= =
========================================================================
**** 09:10:51 UTY2411 Processing start date: FRI MAR 21, 2014
========================================================================
= =
= Logon/Connection =
= =
========================================================================
0001 .LOGTABLE DB.logs2;
0002 .LOGON Jugal/jbhatt,;
**** 09:10:52 UTY8400 Teradata Database Release: 14.00.05.02
**** 09:10:52 UTY8400 Teradata Database Version: 14.00.05.03
**** 09:10:52 UTY8400 Default character set: ASCII
**** 09:10:52 UTY8400 Current RDBMS has interval support
**** 09:10:52 UTY8400 Current RDBMS has UDT support
**** 09:10:52 UTY8400 Current RDBMS has Large Decimal support
**** 09:10:52 UTY8400 Current RDBMS has TASM support
**** 09:10:52 UTY8400 Maximum supported buffer size: 1M
**** 09:10:52 UTY8400 Data Encryption supported by RDBMS server
**** 09:10:52 UTY6211 A successful connect was made to the RDBMS.
**** 09:10:52 UTY6210 Logtable 'DB.logs2' indicates that a restart is
in progress.
========================================================================
= =
= Processing Control Statements =
= =
========================================================================
0003 .BEGIN IMPORT MLOAD TABLES DB.Mload_Input;
========================================================================
= =
= Processing MultiLoad Statements =
= =
========================================================================
0004 .LAYOUT S1;
0005 .FIELD EmpId * VARCHAR(10);
0006 .FIELD EmpName * VARCHAR(5);
0007 .DML LABEL L1;
0008 INSERT into DB.Mload_Input values(:EmpId,:EmpName);
0009 .IMPORT INFILE /home/jbhatt/data.txt FORMAT VARTEXT ','
LAYOUT S1
APPLY L1;
0010 .END MLOAD;
========================================================================
= =
= MultiLoad Initial Phase =
= =
========================================================================
**** 09:10:52 UTY0829 Options in effect for this MultiLoad import task:
. Sessions: One session per available amp.
. Checkpoint: 15 minute(s).
. Tenacity: 4 hour limit to successfully connect load sessions.
. Errlimit: No limit in effect.
. AmpCheck: In effect for apply phase transitions.
**** 09:10:52 UTY0817 MultiLoad submitting the following request:
Select NULL from DB.logs2 where (LogType = 125) and (Seq = 1)
and (MloadSeq = 0);
**** 09:10:52 UTY0817 MultiLoad submitting the following request:
Select NULL from DB.logs2 where (LogType = 120) and (Seq = 1);
**** 09:10:52 UTY0817 MultiLoad submitting the following request:
SET QUERY_BAND='UTILITYNAME=MULTLOAD;' UPDATE FOR SESSION;
**** 09:10:52 UTY0817 MultiLoad submitting the following request:
CHECK WORKLOAD FOR BEGIN MLOAD DB.Mload_Input;
**** 09:10:52 UTY0817 MultiLoad submitting the following request:
CHECK WORKLOAD END;
**** 09:10:52 UTY0844 Session count 16 returned by the DBS overrides
user-requested session count.
**** 09:10:56 UTY0815 MLOAD session(s) connected: 16.
**** 09:10:56 UTY0817 MultiLoad submitting the following request:
BEGIN MLOAD DB.Mload_Input WITH INTERVAL;
**** 09:10:56 UTY0817 MultiLoad submitting the following request:
Select NULL from DB.logs2 where (LogType = 130) and (Seq = 1)
and (MloadSeq = 20);
**** 09:10:56 UTY0832 This MultiLoad import task cannot proceed: an unexpected
MultiLoad phase, data acquisition, was reported by the RDBMS.
========================================================================
= =
= Logoff/Disconnect =
= =
========================================================================
**** 09:10:58 UTY6212 A successful disconnect was made from the RDBMS.
**** 09:10:58 UTY2410 Total processor time used = '1.84 Seconds'
. Start : 09:10:51 - FRI MAR 21, 2014
. End : 09:10:58 - FRI MAR 21, 2014
. Highest return code encountered = '12'.
$
Mload error while using Update/Insert for null PI fields - forum topic by moloy_kundu
Hi All,
We are trying to use an update-else-insert loading strategy for a Teradata table. The table has its PI defined on fields which are nullable. The issue is that whenever we try to update a record having a null value in any of the PI fields, it gets rejected into the UV table (MultiLoad is unable to identify the update and thus tries to do an insert).
To solve the issue, we have created two DML labels, one for null records and the other for not null. However, when we try to use two APPLY statements (with IS NULL and IS NOT NULL), we are facing an error.
-----------------------------------------------------
.DML Label tagDML_Null
Do insert for missing update rows;
UPDATE :CF.DatabaseName.TAB1
SET
R_ID = :R_ID ,
WHERE
STAT IS NULL;
INSERT INTO :CF.DatabaseName.TAB1 (
R_ID ,
STAT) VALUES(
:R_ID,
NULL);
.DML Label tagDML
Do insert for missing update rows;
UPDATE :CF.DatabaseName.TAB1
SET
R_ID = :R_ID ,
WHERE
STAT = :STAT;
INSERT INTO :CF.DatabaseName.TAB1 (
R_ID ,
STAT) VALUES(
:R_ID,
:STAT);
Import Infile ':CF.ImportFileName'
Layout InputFileLayout
Format Unformat
Apply tagDML WHERE STAT IS NULL
Apply tagDML_Null WHERE STAT IS NOT NULL
;
Can anyone please help me here what can be done to fix the issue?
Thanks,
moloy
Error in MLOAD - response (1) by sgarlapa
It seems MLOAD failed past the point of no return. The log shows it is a rerun.
Better not to include the CREATE TABLE statement in the MLOAD script.
To fix it and run from the 0th record onwards, you can try the steps below:
1) Run from SQL Assistant: "release mload DB.Mload_Input;"
2) Run from SQL Assistant: "release mload DB.Mload_Input in apply;"
3) Drop all the supporting tables (log, error1, error2 and work)
4) Remove the CREATE TABLE statement from the script. Execute the CREATE TABLE statement from either BTEQ or SQL Assistant (or it can be automated with a shell script that runs BTEQ, if required)
5) Run the MLOAD script
--Sri
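In BTEQ, the recovery sequence above might look like this (a sketch; the ET_/UV_/WT_ names assume MultiLoad's default error/work table naming, which should be checked against the actual job before dropping anything):

```sql
RELEASE MLOAD DB.Mload_Input;
RELEASE MLOAD DB.Mload_Input IN APPLY;   -- only needed if the first RELEASE fails
DROP TABLE DB.logs2;                     -- restart log table from the script
DROP TABLE DB.ET_Mload_Input;            -- acquisition error table (default name)
DROP TABLE DB.UV_Mload_Input;            -- application error table (default name)
DROP TABLE DB.WT_Mload_Input;            -- work table (default name)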
Issue while loading space delimited flat file using fastload on TPT - response (3) by EUsha
In the flat file, there is no header.
Though the delimiter is defined to be ' ', the number of spaces between the columns was not even in the flat file. As stated, this could be the issue. So we modified the flat file to ensure that we have only one single space as the delimiter between columns. With this change too, we are getting the same error. Is there any alternate solution for this issue?
Issue while loading space delimited flat file using fastload on TPT - response (4) by sgarlapa
As feinholz said above, please try changing
VARCHAR TextDelimiter = 'space',
to
VARCHAR TextDelimiter = ' ',
in the script and try.
“Warning: RDBMS CRASHED OR SESSIONS RESET. RECOVERY IN PROGRESS” running bteq on AIX - response (2) by gneal
I will try that. Thank you.
Error with Teradata connector for Hadoop with HCAT -> TD fastload - response (1) by RyanBerti
Forgive us for the delayed response.
The comma-separated list of jars specified alongside the '-libjars' option should include the HCatalog jar; take a look at the Environment Variables subsection of the Use Case Examples chapter of the TDCH Tutorial for more info. The NoClassDefFoundError is usually thrown when the '-libjars' value, or the HADOOP_CLASSPATH env variable, is not set up correctly. Let me know if this doesn't resolve your issue. Thanks,
Ryan
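A minimal sketch of the environment setup being described (the jar path is a placeholder that depends entirely on the local Hadoop/Hive layout; consult the TDCH Tutorial for the real locations):

```shell
# Placeholder path -- substitute the actual HCatalog jar location on your cluster.
HCAT_JAR=/path/to/hcatalog-core.jar

# Put the jar on the client-side classpath...
export HADOOP_CLASSPATH=$HCAT_JAR:$HADOOP_CLASSPATH

# ...and also pass the same jar in the comma-separated -libjars list
# of the TDCH job invocation, e.g.:
#   -libjars "$HCAT_JAR,<other jars>"
echo "$HADOOP_CLASSPATH"
```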
Issue while loading space delimited flat file using fastload on TPT - response (5) by EUsha
Tried with TextDelimiter = ' ' (both single and multiple spaces) but got the same error.
You can maintain another backup database to hold the tables which were unused for 2 to 3 months, and delete them from the main database, so that you can reclaim space in the frequently used database.