Teradata tpt - response (3) by akd2k6
Teradata tpt - response (4) by feinholz
3. Why would you want to export from TPT and load back in with MultiLoad? Why not just load back in with TPT?
2. For delimited output, it depends on which version of TPT you are using. Earlier versions of TPT only supported writing out delimited data in VARCHAR format, so you had to CAST the columns in the SELECT statement, or use the SQL Selector operator and specify ReportMode='Y', which forces the DBS to convert all columns to character. In later versions (I think starting in 14.10 or 15.0) the DataConnector file writer will do the conversion for you, and the schema does not have to consist entirely of VARCHAR columns.
1. Please explain why you need to save the DML away for future use.
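For reference, the CAST approach from item 2 might look something like this in the export job, with a matching all-VARCHAR schema (a sketch only; the schema name, table, column names, and lengths here are made up):

```
DEFINE SCHEMA DELIM_SCHEMA
(
  EmpNo_c VARCHAR(11),  /* e.g. an INTEGER rendered as characters */
  Name_c  VARCHAR(30)
);
...
VARCHAR SelectStmt = 'SELECT CAST(EmpNo AS VARCHAR(11)),
                             CAST(Name  AS VARCHAR(30))
                      FROM   mydb.emp;'
```

With the SQL Selector operator you could instead set VARCHAR ReportMode = 'Y' and let the DBS do the conversion, as described above.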
TPT ODBC Operator - response (3) by TDUser2000
Hello,
We are using TPT 13.1 and we are trying to connect to Oracle using the ODBC operator on AIX. Could you please confirm the following:
"DataDirect drivers need to be downloaded and installed on AIX"?
Is there any other way?
Regards
PT job with Oracle and Informix ODBC operator error - response (4) by TDUser2000
Hi Feinholz,
To be more specific: are TPT and the ODBC operator only certified with 32-bit ODBC drivers from DataDirect?
Regards
TPT - passing CHAR SET as variable - forum topic by Gupta_Pieeater
Hi all
I'm new to TPT so please be gentle with me. I have a generic script which will load data files to a Teradata table. One of my files contains Unicode (UTF16) data.
I have successfully loaded the data by invoking a TPT script with the following statement:
USING CHAR SET UTF16
DEFINE JOB EXTRACT_TABLE_LOAD
.....
I have also successfully loaded the data by declaring a variable in the .vars file to define the CHAR SET E.g.
DATA_CHARACTER_SET='UTF16'
...
USING CHAR SET @DATA_CHARACTER_SET
DEFINE JOB EXTRACT_TABLE_LOAD
However, what I want to do is define a variable which holds the entire "USING" statement, but having tried lots of different permutations I continually get the error "Keywords 'DEFINE JOB' are missing from the beginning of the job script.". Is there any restriction on doing this? I often see/hear the statement "you can use variables everywhere in TPT", but it appears that I can't in this scenario.
My attempt:
Vars:
DATA_CHARACTER_SET='UTF16'
COMMAND_LINE ='USING CHAR SET ' || @DATA_CHARACTER_SET
TPT script statements
@COMMAND_LINE
DEFINE JOB EXTRACT_TABLE_LOAD
Thanks for any assistance.
TPT & Quoted data bug? - forum topic by danielA
Hi
I'm currently using TPT 14.10.00.05 with the default configuration:
Native I/O (non-PIPE)
Data Format: 'DELIMITED'
Character set: ASCII
Delimiter = '|' (length 1), x'7C'
EscapeTextCharacter = '' (length 0), x''
Quoted data: No
OpenQuoteMark = '"' (length 1), x'22'
CloseQuoteMark = '"' (length 1), x'22'
I'm getting a parse error on this line:
1|2|string3|"string4|string5|6|7
Parameter 4 is not being parsed correctly: because of the open quote, everything up to end-of-line is being read as the parameter.
Quoted data is set to No!
So why is TPT attaching importance to the (") and expecting a CloseQuoteMark? Surely it should just be treated as an ordinary character?
Is this a bug in TPT?
Script Repository - forum topic by kmiesse
In a regionalized environment where no single administrator has access to all servers, what tool/repository do you use to share scripts between all administrators?
Thanks!
TPT ODBC Operator - response (4) by feinholz
TPT is only certified with ODBC drivers from Progress DataDirect.
You are free to use whatever ODBC drivers you like, but we will only offer support when you use the ones from Datadirect.
Prior to TPT 15.0, it is up to the user to obtain the drivers. Starting in 15.0 we bundle the drivers with TPT (however, you still need to contact us to obtain the license).
PT job with Oracle and Informix ODBC operator error - response (5) by feinholz
TPT 13.0 only runs in 32-bit mode, so yes we are only certified with the 32-bit versions of the drivers.
Mload utility is loading unexpected data into table - response (3) by Ivyuan
You can try the following layout:
.FIELD EmpNo * char(4);
.FIELD DeptNo * char(4);
.FIELD Salary * char(3);
for this specific data file.
BTW, text format requires all CHAR or ANSIDATE data types.
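A complete MultiLoad script around that layout might look like the following sketch; the table, file, and label names here are hypothetical, and note the FORMAT TEXT clause, which is why every field must be CHAR (or ANSIDATE):

```
.LOGTABLE logt_emp;
.LOGON <tdpid>/<userid>,<password>;
.BEGIN IMPORT MLOAD TABLES emptbl;
.LAYOUT LAY1;
.FIELD EmpNo  * CHAR(4);
.FIELD DeptNo * CHAR(4);
.FIELD Salary * CHAR(3);
.DML LABEL INS1;
INSERT INTO emptbl VALUES (:EmpNo, :DeptNo, :Salary);
.IMPORT INFILE empdata.txt
  FORMAT TEXT
  LAYOUT LAY1
  APPLY INS1;
.END MLOAD;
.LOGOFF;
```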
Cached credentials causing ABU jobs to fail? - forum topic by Demosthenes
Hello,
We are running into issues with performing backups using ABU.
We are running version 14 of Teradata on 2 nodes.
The specific issue is that the jobs are failing at the initial logon with the error:
Failure 8017: The UserId, Password or Account is invalid.
When the ABU GUI is initialized, it prompts for an Administrator password; the password field is empty, and when I enter the p/w, it is accepted.
From within the GUI, selecting Task > Backup Selected Objects... results in an additional credential check.
The UserId and Password fields here are populated; when I click Connect with the cached credentials, the ABU errors:
Error 505: Query to Teradata failed with DBS error 8017: The UserId, Password or Account is invalid.
If I manually enter the correct password for the listed account and click Connect, it is accepted.
If I attempt to run a saved backup job, or create a new backup job, the job fails with the same error.
If I manually edit the saved ARC script for a saved job, and manually enter the logon credentials into the script instead of using the LOGON$ string, the job succeeds.
I have followed the directions in the ABU Installation and User Guide for re-running the ABU configuration script (pg. 21-22), but this does not seem to have corrected the ABU cached credentials. This has resulted in the creation of a new 'defaults' configuration file. However, since the password values in the file are hashed, I cannot absolutely verify that the file has the correct passwords for the listed UserID's.
Is there some other place that could be caching the credentials used by ABU, or is there something else here that I'm not seeing?
Thanks,
Mike
TPT ODBC Operator - response (6) by TDUser2000
Hello,
Thank you for your quick reply.
I have tried with the Teradata native 32-bit ODBC driver but am getting the following error:
TPT_INFRA: TPT01036: Error: Task (TaskID: 5, Task Name: SELECT_2[0001]) terminated due to the receipt of signal number 11
I have checked the previous posts and gathered that we need to move to the latest TPT efix. We use the following TPT version and would like to know the latest version number:
Teradata Parallel Transporter Version 13.10.00.04
Regards
TERADATA PROFILER QUERIES - forum topic by Rohan_Sawant
Hi All,
Can anyone please help me out in extracting the Teradata Profiler queries? I would like to know, in detail if possible, what kind of queries Teradata Profiler makes.
TPT ODBC Operator - response (7) by feinholz
TPT 13.10.00.04 is quite old.
I think the infrastructure (which includes the ODBC operator) is up to 13.10.00.20.
I suggest you go to the patch server and download the latest patches for all of the TPT components. It is never a good idea to only upgrade one component.
MLOAD and loading of: Empty Strings and Null Strings - forum topic by TDDeveloper
When loading data from flat files (ASCII text) to tables, I find that if the file is delimited, with all the fields defined as VARCHAR, then columns containing blank values (all spaces, not just two consecutive delimiters, for a column defined in the target table as CHAR or VARCHAR) are loaded as NULLs. Whereas if the file is fixed-width, with the layout defined as CHAR for the columns, those columns are loaded with an empty string. Questions: (1) What is the theory behind this behavior, if it is really by design? (2) How do I make MLOAD load an empty string (without using a CASE expression) when loading data from delimited files?
TERADATA PROFILER QUERIES - response (1) by Raja_KT
I vaguely remember how I resolved this issue once, when I tweaked a query two years back. I do not know if TD Profiler has changed its look and feel since then. I think among INPUT, OUTPUT, RESULTS... it may be hiding under RESULTS, in tabs like [REPORTS, DATA, SQL]. If my memory does not fail me, it is on the SQL tab that I could see the generated SQL.
Like other profilers, it must be there. Just check the tabs.
TERADATA PROFILER QUERIES - response (2) by RS255043
Hi Raja_KT,
I can see the SQL in the SQL tab, but I just want the standard queries used in Profiler. I want to understand the standard ways in which a profiler profiles data. If someone has these segregated already, that would help; otherwise I will need to research them myself.
MLOAD and loading of: Empty Strings and Null Strings - response (1) by Ivyuan
Hi,
I tried a MultiLoad script; MultiLoad can load blank space characters and NULLs respectively.
Here is the MultiLoad script:
.LOGTABLE logt_a;
.LOGON <tdpid>/<userid>, <password>;
drop table testtbl;
drop table et_testtbl;
drop table wt_testtbl;
drop table uv_testtbl;
/*************************************************/
/* Create and insert table */
/*************************************************/
CREATE MultiSET TABLE testtbl, FALLBACK (
c1 varchar(5),
c2 varchar(5),
c3 varchar(5))
PRIMARY INDEX(C1);
.BEGIN IMPORT MLOAD TABLES testtbl
;
.LAYOUT LAY1A;
.FIELD c1 * varchar(5) ;
.FIELD c2 * varchar(5) ;
.FIELD c3 * varchar(5) ;
.DML LABEL LABELA;
INSERT INTO testtbl VALUES (:c1,:c2,:c3);
.IMPORT INFILE atrdata.txt
FORMAT VARTEXT '|' NOSTOP
LAYOUT LAY1A
APPLY LABELA;
.END MLOAD;
.logoff ;
The data file atrdata.txt looks like:
12345|12345|12345
||
12|as |123
After MultiLoading, here is the result:
*** Query completed. 3 rows found. 3 columns returned.
*** Total elapsed time was 1 second.
c1 c2 c3
----- ----- -----
12 as 123
12345 12345 12345
?     ?     ?
"?" means a NULL charcater was inserted,
MLOAD and loading of: Empty Strings and Null Strings - response (2) by Fred
This behavior is documented in the MultiLoad manual. Supplying NULL rather than a zero-length string was a conscious choice.
As the original poster noted, you can use CASE/COALESCE in the INSERT statement to convert the NULL values to something else.
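For example, the DML in Ivyuan's script above could be changed along these lines (a sketch; COALESCE replaces each NULL with a zero-length string):

```
.DML LABEL LABELA;
INSERT INTO testtbl VALUES (
  COALESCE(:c1, ''),
  COALESCE(:c2, ''),
  COALESCE(:c3, ''));
```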
TERADATA PROFILER QUERIES - response (3) by dnoeth
If I want to know the exact SQL submitted by an application, I simply extract it from DBQL :)
Thanks Steve, I have gone through TPT scripts from this forum and understood the approach.
1. Is there any way I can generate the DML of the file to a flat file and save it for future use?
2. Also, if I am creating a delimited unload file (say '|'-separated), it fails for tables where not all columns are VARCHAR. Is there any solution? I am confused by the many solutions on this in the forum.
3. Is there any way that I can unload a table with TPT and load the unloaded file with MultiLoad? In this case I need the DML, which I have not been able to generate in TPT.