Channel: Teradata Forums - Tools

Installing BTEQ on Solaris - response (4) by kfiras

Yes. Where is the link? I see instructions in all the posts but no download links anywhere.


Installing BTEQ on Solaris - response (5) by dnoeth

There's no public download for the Solaris TTU via DevEx; you must be a Teradata customer to get it.
You need to log on to the Teradata @ Your Service patch server at tays.teradata.com

TTU for solaris - response (1) by Adeel Chaudhry

Please have a look at the following link:
 
http://forums.teradata.com/forum/tools/installing-bteq-on-solaris

Skip header row of file using TDLoad - response (1) by Adeel Chaudhry

If by tdload you mean TPT, I don't think the situation has changed from what is described in the post below:
 
http://forums.teradata.com/forum/tools/using-tpt-how-to-specify-a-starting-line-when-importing-from-a-file

BTEQ examples - response (23) by vishaman

Hi,
I have a run file that contains COLLECT STATISTICS statements for hundreds of databases. When I execute the run file with the .RUN command, I want any tables on which stats collection could not be completed due to an error to be redirected to my log file.
Can someone please help me with this?
 

TPT12106: Cannot restart !!! Please resubmit the job. - response (8) by TDJeet1982

Reason: checkpoint files are being created with the same name for all jobs.
You need to give each job a unique job name with the -j option.
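For example, two jobs run from the same directory can be kept apart like this (script and job names here are placeholders):

tbuild -f load_job_a.txt -j job_a
tbuild -f load_job_b.txt -j job_b

With distinct -j names each job gets its own checkpoint files, so restarting one job won't collide with the other.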

BTEQ examples - response (24) by dnoeth

Simply redirecting errors should work:
bteq < myfile > mylog.log 2> myerrors.log
And within BTEQ there's .SET ERROROUT to switch error messages from STDOUT to STDERR.
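A minimal sketch of the run file (logon details and table names are placeholders):

.SET ERROROUT STDERR
.LOGON mytdpid/myuser,mypassword
COLLECT STATISTICS ON mydb.table1;
COLLECT STATISTICS ON mydb.table2;
.LOGOFF
.QUIT

Run with the redirection above, any failed COLLECT STATISTICS messages end up in myerrors.log while normal output goes to mylog.log.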

Skip header row of file using TDLoad - response (2) by dnoeth

In newer releases of TPT there's a SkipRows/SkipRowsEveryFile option.
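A sketch of how those attributes look on a DataConnector producer (operator, schema, and file names are placeholders; check the TPT reference for your release before relying on the attribute types):

DEFINE OPERATOR FILE_READER
DESCRIPTION 'DataConnector producer'
TYPE DATACONNECTOR PRODUCER
SCHEMA source_schema
ATTRIBUTES
(
VARCHAR FileName = 'TestFile',
VARCHAR Format = 'DELIMITED',
VARCHAR TextDelimiter = '|',
INTEGER SkipRows = 1,             /* skip the header row */
VARCHAR SkipRowsEveryFile = 'Y'   /* re-skip in every file when reading multiple files */
);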


TPT - ODBC connector missing - response (8) by urarunprasad

Hi Feinholz,
I got the sample script from "*\Teradata\Client\14.00\Teradata Parallel Transporter\sample\userguide\uguide06.txt".
Thank you for assisting me on this!

BTEQ and the Commit statement - response (5) by dnoeth

SQL Assistant automatically adds a COMMIT to each request in an ANSI session (you can easily verify that by looking at DBQL) unless you use .BeginTx/.CommitTx.
BTEQ never autocommits; if you don't specify SET SESSION TRANSACTION to set the session mode, BTEQ will use the system default mode, which seems to be ANSI in your case.
Possible solutions:
- add a COMMIT after each CREATE (a search & replace can simplify this; see the sketch below)
- use SQL Assistant to submit the script (there's a command-line mode, too)
- switch to BTET mode, but as this results in different defaults for SET/MULTISET and [NOT] CASESPECIFIC, it might be even more work than adding COMMITs
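A minimal sketch of the first option (logon and DDL are placeholders):

.SET SESSION TRANSACTION ANSI
.LOGON mytdpid/myuser,mypassword
CREATE TABLE mydb.t1 (c1 INTEGER);
COMMIT;
CREATE TABLE mydb.t2 (c1 INTEGER);
COMMIT;
.LOGOFF
.QUIT

Note that .SET SESSION TRANSACTION must come before .LOGON; in an ANSI-mode session nothing is committed without an explicit COMMIT.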

 

Skip header row of file using TDLoad - response (3) by manharrishi

Hi,
Thank you very much for the responses. I was actually trying to get info on the tdload utility, which, unlike TPT, doesn't need a script to load a flat file into Teradata.
I am running a command like this:
tdload -f TestFile -u ETL_USER -t TestTable -h ETL_DEV -p xxxxx
While doing the load, I would like the option of skipping the first line of the file, as it's a header.
Is there any way of achieving this directly with tdload?
Thanks,
Manjeeth

Skip header row of file using TDLoad - response (4) by dnoeth

Hi Manjeeth,
I didn't test it, but you might try to put it in a job variable file and use that with the -j option.
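A sketch of that approach; whether your Easy Loader version exposes the DataConnector attribute under these variable names is an assumption, so check the TPT reference for your TTU release:

/* jobvars.txt - hypothetical job variable file */
SourceSkipRows = 1,
SourceSkipRowsEveryFile = 'Y'

tdload -f TestFile -u ETL_USER -t TestTable -h ETL_DEV -p xxxxx -j jobvars.txt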

DEFINE SCHEMA target_schema FROM TABLE - response (8) by Donald.Iuppa@teoco.com

Ok - I see that won't work for me even if I get past the login issue.
The script below creates a schema file and then uses it.
One issue I need to get past is using a variable as part of the SQL statement. Below you will see:

from Dbc.COLUMNS c
where TableName = ''tablename''
AND DatabaseName = ''dbname''

I tried various ways to substitute job variables but couldn't seem to get it to work.
Is there some special quoting necessary?
My table name is defined in @jobVar_TargetTable.
Code below:

DEFINE JOB FILE_LOAD
DESCRIPTION 'Load a Teradata table from a file'
(

DEFINE SCHEMA SCHEMA_GEN
(
schema_out char(128)
);

/*****************************/
DEFINE OPERATOR FILE_WRITER()
DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DATA CONNECTOR OPERATOR'
TYPE DATACONNECTOR CONSUMER
SCHEMA *
ATTRIBUTES
(
VARCHAR PrivateLogName    = 'file_writer_privatelog',
VARCHAR FileName          = 'gen_schema_output.txt',
VARCHAR IndicatorMode     = 'N',
VARCHAR OpenMode          = 'Write',
VARCHAR Format            = 'TEXT',
VARCHAR TrimColumns       = 'TRAILING'
);

/*****************************/
DEFINE OPERATOR EXPORT_OPERATOR()
DESCRIPTION 'TERADATA PARALLEL TRANSPORTER EXPORT OPERATOR'
TYPE EXPORT
SCHEMA SCHEMA_GEN
ATTRIBUTES
(
VARCHAR PrivateLogName = '_load_log_' || @LoadId ,
VARCHAR TdpId = @jobvar_tdpid,
VARCHAR UserName = @jobvar_username,
VARCHAR UserPassword = @jobvar_password,
INTEGER MaxSessions       =  32,
INTEGER MinSessions       =  1,
VARCHAR AccountId,
VARCHAR SelectStmt        =
'SELECT
CAST(
CASE rn
when 1 then ''DEFINE SCHEMA source_schema (''
else '',''
end || s
||
case
when rn = cnt then '');''
else ''''
end
AS CHAR(128)) AS schema_out
from (
select
ColumnName ||
CASE columnType
WHEN ''CV'' THEN ''VARCHAR('' || TRIM( TRAILING '')'' FROM SUBSTRING(columnFormat FROM 3)) || '')''
WHEN ''CF'' THEN ''VARCHAR('' || TRIM( TRAILING '')'' FROM SUBSTRING(columnFormat FROM 3)) || '')''
WHEN ''DA'' THEN ''ANSIDATE''
WHEN ''I''  THEN ''INTEGER''
WHEN ''I1'' THEN ''BYTEINT''
WHEN ''I2'' THEN ''SMALLINT''
WHEN ''I8'' THEN ''BIGINT''
WHEN ''TS'' THEN ''VARCHAR(27)''
ELSE ''50''
END as s,
row_number() over ( order by ColumnId) as rn,
count(*) over () as cnt,
ColumnId
from Dbc.COLUMNS c
where TableName = ''tablename''
AND DatabaseName = ''dbname''
) x
order by ColumnId
;'
);

/*****************************/
STEP export_to_file
(
APPLY TO OPERATOR (FILE_WRITER() )
SELECT * FROM OPERATOR (EXPORT_OPERATOR() [1] );
);

/*****************************/
INCLUDE 'gen_schema_output.txt'

DEFINE OPERATOR DDL_OPERATOR
TYPE DDL
ATTRIBUTES
(
VARCHAR PrivateLogName = '_ddl_log_' || @LoadId ,
VARCHAR TdpId = @jobvar_tdpid,
VARCHAR UserName = @jobvar_username,
VARCHAR UserPassword = @jobvar_password,
VARCHAR ErrorList = '3807'
);

DEFINE OPERATOR ODBC_OPERATOR
DESCRIPTION 'ODBC_OPERATOR'
TYPE ODBC
SCHEMA source_schema
ATTRIBUTES
(
VARCHAR UserName = 'stage_adm',
VARCHAR UserPassword = 'stage_adm',
VARCHAR DSNName = 'stage' ,
VARCHAR SelectStmt = ' select * FROM tablename limit 1000',
VARCHAR PrivateLogName = '_dataconnector_log_' || @LoadId
);

DEFINE OPERATOR LOAD_OPERATOR
TYPE LOAD
SCHEMA *
ATTRIBUTES
(
VARCHAR PrivateLogName = '_load_log_' || @LoadId ,
VARCHAR TdpId = @jobvar_tdpid,
VARCHAR UserName = @jobvar_username,
VARCHAR UserPassword = @jobvar_password,
VARCHAR TargetTable = @jobVar_TargetTable,
VARCHAR LogTable = @jobvar_LogTable ,
VARCHAR ErrorTable1 = @jobvar_ErrorTable1,
VARCHAR ErrorTable2 = @jobvar_ErrorTable2
);

STEP Setup_Tables
(
APPLY
('DROP TABLE ' || @jobvar_LogTable || ';'),
('DROP TABLE ' || @jobvar_ErrorTable1 || ';'),
('DROP TABLE ' || @jobvar_ErrorTable2 || ';')
TO OPERATOR (DDL_OPERATOR);
);

STEP Load_Trans_Table
(
APPLY $INSERT @jobVar_TargetTable TO OPERATOR (LOAD_OPERATOR[2]) SELECT * FROM OPERATOR(ODBC_OPERATOR[2]);
);
);

 

DEFINE SCHEMA target_schema FROM TABLE - response (9) by feinholz

TPT will not parse/touch anything in single quotes.
Thus, if you want to use job variables inside a SQL statement, you need to break the statement up into pieces and use the concatenation operator.
But you already did that when you implemented the DROP TABLE statements.
You would do the same thing with the SELECT.
The "@" character denotes the name of a job variable. Thus, you would do:

SelectStmt = 'SELECT ..... from Dbc.COLUMNS c where TableName = ' || @my_tablename || ' AND DatabaseName = ' || @my_dbname || ';'
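Note that if the job variable holds a bare object name, the generated SQL still needs its own quotes around the value; the pattern that ends up working in response (10) below doubles the quotes around the concatenation:

SelectStmt = 'SELECT ..... from Dbc.COLUMNS c where TableName = ''' || @my_tablename || ''' AND DatabaseName = ''' || @my_dbname || ''';'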
 

Can I perform Delete and Insert in the same trigger - forum topic by terankit

Hi,

I have to create a trigger on table A so that whenever there is an insert or delete on table A, the same insert or delete happens on table B.
Please let me know if this is possible. If you can provide the syntax as well, that would be great.

Thanks.

Can I perform Delete and Insert in the same trigger - response (1) by dnoeth

Of course this is possible; you just need separate triggers for Insert and Delete.
Create AFTER INSERT/DELETE triggers with FOR EACH STATEMENT and use INSERT ... SELECT FROM NEW_TABLE for the insert and DELETE ... FROM OLD_TABLE for the delete.
You'll find the syntax and examples in the DDL manual, e.g.
CREATE TRIGGER/REPLACE TRIGGER
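A sketch of the pair, assuming table B mirrors table A's structure and both have a key column named id (adjust all names to your DDL):

CREATE TRIGGER a_ins_to_b
AFTER INSERT ON TableA
REFERENCING NEW TABLE AS NEW_TABLE
FOR EACH STATEMENT
(
INSERT INTO TableB SELECT * FROM NEW_TABLE;
);

CREATE TRIGGER a_del_to_b
AFTER DELETE ON TableA
REFERENCING OLD TABLE AS OLD_TABLE
FOR EACH STATEMENT
(
DELETE FROM TableB WHERE id IN (SELECT id FROM OLD_TABLE);
);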

DEFINE SCHEMA target_schema FROM TABLE - response (10) by Donald.Iuppa@teoco.com

Thank you, feinholz. So I don't confuse anyone: I thought I had a single script working that creates the schema file and then uses it via an INCLUDE. It appeared to work, but in reality it was using the existing file from a previous run, read before the current run recreated it.
I wish there were a way to defer the INCLUDE until after I create the schema. For now I have split this into two executions of tbuild.
I also had some copy-and-paste issues above, so here is the working schema generation script:
 

DEFINE JOB FILE_LOAD
DESCRIPTION 'Load a Teradata table from a file'
(

DEFINE SCHEMA SCHEMA_GEN
(
schema_out char(128)
);

/*****************************/
DEFINE OPERATOR FILE_WRITER()
DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DATA CONNECTOR OPERATOR'
TYPE DATACONNECTOR CONSUMER
SCHEMA *
ATTRIBUTES
(
VARCHAR PrivateLogName    = 'file_writer_privatelog',
VARCHAR FileName          = 'gen_schema_output.txt',
VARCHAR IndicatorMode     = 'N',
VARCHAR OpenMode          = 'Write',
VARCHAR Format            = 'TEXT',
VARCHAR TrimColumns       = 'TRAILING'
);

/*****************************/
DEFINE OPERATOR EXPORT_OPERATOR()
DESCRIPTION 'TERADATA PARALLEL TRANSPORTER EXPORT OPERATOR'
TYPE EXPORT
SCHEMA SCHEMA_GEN
ATTRIBUTES
(
VARCHAR PrivateLogName = '_load_log_' || @LoadId ,
VARCHAR TdpId = @jobvar_tdpid,
VARCHAR UserName = @jobvar_username,
VARCHAR UserPassword = @jobvar_password,
INTEGER MaxSessions       =  32,
INTEGER MinSessions       =  1,
VARCHAR AccountId,
VARCHAR SelectStmt        =
'SELECT
CAST(
CASE rn
when 1 then ''DEFINE SCHEMA source_schema (''
else '',''
end || s
||
case
when rn = cnt then '');''
else ''''
end
AS CHAR(128)) AS schema_out
from (
select
ColumnName ||
CASE columnType
WHEN ''CV'' THEN ''VARCHAR('' || TRIM( TRAILING '')'' FROM SUBSTRING(columnFormat FROM 3)) || '')''
WHEN ''CF'' THEN ''VARCHAR('' || TRIM( TRAILING '')'' FROM SUBSTRING(columnFormat FROM 3)) || '')''
WHEN ''DA'' THEN ''ANSIDATE''
WHEN ''I''  THEN ''INTEGER''
WHEN ''I1'' THEN ''BYTEINT''
WHEN ''I2'' THEN ''SMALLINT''
WHEN ''I8'' THEN ''BIGINT''
WHEN ''TS'' THEN ''VARCHAR(27)''
ELSE ''50''
END as s,
row_number() over ( order by ColumnId) as rn,
count(*) over () as cnt,
ColumnId
from Dbc.COLUMNS c
where TableName = ''' || @TargetTable || '''
AND DatabaseName = ''' || @jobvar_tgt_dbname || '''
) x
order by ColumnId
;'
);

/*****************************/
STEP export_to_file
(
APPLY TO OPERATOR (FILE_WRITER() )
SELECT * FROM OPERATOR (EXPORT_OPERATOR() [1] );
);
);

 

Teradata Parallel Transporter - queries - response (1) by feinholz

For #1:
 
Ok, so in TPT we do things a little differently in our script language, but we accomplish the same thing. The schema object defines what the incoming data record looks like.
 
The SELECT statement (in the APPLY-SELECT) allows you to define which fields from the input record you would like to send from the producer operator (i.e. the file reader) to the consumer operator (i.e. the Update operator).
 
Thus, if your schema had these 5 columns, but you only wanted to send a subset of those columns to the DBS, you would do something like this:
 

DEFINE SCHEMA abc
(
   F1 INTEGER,
   F2 CHAR(10),
   F3 DECIMAL(18,2),
   F4 CHAR(2),
   F5 VARCHAR(50)
);
. . . .
APPLY
   <some-DML>
TO OPERATOR ($UPDATE[1])
SELECT F2, F3, F5 FROM OPERATOR ($FILE_READER[1]);

Teradata Parallel Transporter - queries - response (2) by feinholz

For #2, the DML statement is enclosed in single quotes. Thus, if you have text inside those single quotes that is itself enclosed in single quotes, you need to escape it (by doubling each single quote).
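For example, to pass the literal value 'closed' through to the DBS (object names here are placeholders):

'DELETE FROM mydb.trans_tab WHERE status = ''closed'';'

Each doubled single quote inside the outer quotes reaches the DBS as one single quote.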
 
For #3, no, you define the operators first. The APPLY-SELECT statement is just an executable statement that references the operators. You do not put an operator definition in the APPLY-SELECT.
 
Please refer to the documentation for proper script syntax.
 

DEFINE SCHEMA target_schema FROM TABLE - response (11) by Donald.Iuppa@teoco.com

Looking back at my initial approach:
DEFINE SCHEMA EOD_USAGE FROM TABLE 'SONARTEST.EOD_USAGE';

I got it sort of working by supplying the source login information on the command line:

T:\td>tbuild -u "sourceTdpid='10.xx.xx.xx2' sourceUserName='sonartest' sourceUserPassword='sonartest'" -o

My current error is:

FILE_READER: TPT19003 NotifyMethod: 'None (default)'
FILE_READER: TPT19008 DataConnector Producer operator Instances: 2
FILE_READER: TPT19108 Data Format 'DELIMITED' requires all 'VARCHAR/VARDATE' schema.
FILE_READER: TPT19003 TPT Exit code set to 12.
FILE_READER: TPT19221 Total files processed: 0.
FILE_READER: TPT19108 Data Format 'DELIMITED' requires all 'VARCHAR/VARDATE' schema.
FILE_READER: TPT19003 TPT Exit code set to 12.
LOAD_OPERATOR: connecting sessions

 

The manual talks about a special job variable, SourceFormat, which when set to 'Delimited' changes the generated schema output:

"SourceFormat
Before generating the DEFINE SCHEMA statement, Teradata PT queries the special job variable SourceFormat. If it has the value Delimited, then the generated schema will be in delimited-file format; otherwise, it will be in the normal format in which the schema column definitions closely match the Teradata Database table's column descriptions."
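Presumably the variable could also go in a job variable file passed with -v instead of the -u string; a minimal sketch (file and script names are placeholders):

/* jobvars.txt */
SourceFormat = 'Delimited'

tbuild -f gen_schema.txt -v jobvars.txt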

 

I tried supplying that on the command line as part of the -u string:

-u" ...  sourceFormat='Delimited'"

Is there something wrong with my syntax? I saw no change in the generated schema output.

 
