Channel: Teradata Forums - Tools

How to make use of ARCMAIN utility to take DB backup. - response (8) by dnoeth


You can't use more than 8 characters for FILE, nor add a path, within the ARC script:

archive data tables (financial.custonline),
release lock,
file=ARCHIVE;

Then you use a runtime parameter to map ARCHIVE to the actual name:

ARCMAIN FILEDEF=(ARCHIVE,/root/Teradata/custonline_Data) < yourscript
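As a sketch of the same idea, the runtime mapping can live in a small shell wrapper that builds a date-stamped path; `echo` stands in for the real `arcmain` call here so the resulting command line can be inspected (the path and script name are just the placeholders from the post):

```shell
# Dry-run sketch: map the 8-character logical name ARCHIVE to a real,
# date-stamped path at run time. Path and script name are placeholders.
backup_file="/root/Teradata/custonline_Data_$(date +%Y%m%d)"
# echo stands in for the real arcmain invocation:
echo arcmain "FILEDEF=(ARCHIVE,$backup_file)" "<" yourscript
```

Dropping the `echo` (and redirecting the script in properly) would run the actual archive.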

 


Can we pass schema and sql statement as variables to tpt script - response (2) by vennelakanti00


Hi Steve,
Thanks for the input on this. We were able to pass the values by using a shell wrapper around the actual tbuild command, along with a schema file as you mentioned. We can use a similar approach for INSERT as well.
In Shell:
sql=`cat sql.txt`
ins=`cat insert.txt`
schema="schema.txt"
tbuild -f tpt_script -u "sql='$sql' schema='$schema'"
tpt script:
INCLUDE @schema
sql=@sql
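A minimal, runnable sketch of that wrapper pattern (file names and job-variable names are taken from the snippet above; `echo` replaces the real `tbuild` call so the quoting can be inspected):

```shell
# Dry-run sketch of the wrapper: read the SQL from a file and pass it
# to tbuild as a job variable. File names are placeholders.
printf 'SELECT 1;' > sql.txt
sql=$(cat sql.txt)
schema="schema.txt"
# echo stands in for the real tbuild invocation:
echo tbuild -f tpt_script -u "sql='$sql' schema='$schema'"
```

The double quotes around the `-u` argument keep the embedded single quotes intact, which is what lets the SQL text survive as a single job variable.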
 

TPT - Delimited Data Parsing error: Invalid multi-byte character - response (1) by feinholz


When you want assistance, it is always a good idea to provide:
1. the version of TPT you are using
2. the actual failure (is it a DBS failure? a TPT failure?)
 
The word "fail" can mean many things.
Did the job complete, but the row(s) with the aforementioned character ended up in the error table?
If so, that would indicate the character is not supported by Teradata.
Did TPT fail?
If so, what was the error message?
 

TPT Stream Loader - Create Macro Access error on target database (not working database). - response (1) by feinholz


You might want to look at the Stream operator information in the TPT reference manual for the use of the MacroDatabase attribute.

FastExport ISSUE !!! - forum topic by prasshanth


Greetings TD experts, 
I am a noob at using FastExport and I am facing a problem at runtime. After executing the script from the command line, I get the following error:

**** 14:36:41 UTY0847 Warning: RDBMS or network down. Trying to logon again.

**** 14:37:41 UTY8400 Network or RDBMS down,Cli error 207

**** 14:37:41 UTY2410 Total processor time used = '0.0936006 Seconds'

     .       Start : 14:28:25 - WED MAR 25, 2015

     .       End   : 14:37:41 - WED MAR 25, 2015

     .       Highest return code encountered = '12'.

 

Here is my script:

 

.LOGTABLE <db>.<tablename_lg>;

.RUN FILE <path>.logon.fxp; 

<databasename>;

.BEGIN EXPORT SESSIONS 20;

 

.EXPORT OUTFILE <path><filename>.txt MODE RECORD FORMAT TEXT;

 

SELECT cast(Ven_ID as VARCHAR(2))||','||cast(VERTICAL as VARCHAR(17))||','||cast(CATEGORY as VARCHAR(8))

FROM <db>.<table>;

 

.END EXPORT;

.LOGOFF; 

 

I created a logon file which contains the username and password, as someone suggested it's more secure this way.

script:

.LOGON username,pwd;

 

 

Please help me figure out what's going wrong here, because I am not able to move further. From the error, it appears the network is down, but in reality it isn't. The tool hangs for about 10-15 minutes after executing the logon. Is there something wrong with my logon? I am working on a Windows platform. I am also not sure where to find my tdpid in case I need to specify it, but the syntax says it's optional and a default value will be used if it's not given.

Thanks a lot people in advance :) 

Have a great day !

FastExport ISSUE !!! - response (1) by ulrich


Your logon also needs to point to the server.
In the hosts file (/etc/hosts on Linux) you need to specify a COP entry for the database system.
Then specify the system in the logon:
.LOGON system/username,pwd;

Merging of Bteq and shell scripts - forum topic by mukuljain015


I want to use shell variables in BTEQ commands and shell commands in a BTEQ script, e.g. create table $variable (some xyz column list); and some other shell commands within BTEQ scripts.
Can you please assist me in doing this.
 
Thanks


FastExport ISSUE !!! - response (2) by prasshanth


Loading from DB2 to Teradata with SSIS 2012 converts content with special characters ("zero characters") to NULL. - forum topic by pleino


Loading from DB2 to Teradata with SSIS 2012 converts content with special characters ("zero characters") to NULL.

 

When I hit PREVIEW in the SSIS DB2 OLE DB Source component, I can see the data correctly, but when I load it into a destination file or database table, the content is lost and becomes NULL. Other rows are OK, but the ones with some special characters get lost.

http://hot.ee/phil/work/DB2_zero_character.png

 

How to fix this? How to load correctly from DB2 with SQL Server Integration Services?

 

I have tried the Microsoft OLE DB driver for DB2 and also the IBM OLE DB provider for DB2; they both have the same issue.


Merging of Bteq and shell scripts - response (1) by ULICKERT


Hi,
you may just use a here-document:

#!/usr/bin/ksh
# \$tdwallet(...) must stay literal for BTEQ, so it is escaped here:
LOGON=".logon $TDPID/$user,\$tdwallet($user)"
variable='MYDatabase.MyTable'
bteq <<-END
$LOGON
create table $variable ( some xyz column list);
.remark $(date)
END

All UNIX variables will be expanded before the text is passed to bteq.
Hope it helps
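Substituting `cat` for `bteq` shows that expansion step on its own (the table name is a placeholder):

```shell
# Demonstrates that shell variables inside an unquoted here-document
# are expanded before the text ever reaches the consuming command.
variable='MYDatabase.MyTable'
cat <<END
create table $variable ( c1 INTEGER );
END
```

This prints `create table MYDatabase.MyTable ( c1 INTEGER );`, which is exactly the text bteq would receive.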

FASTLOAD: Can it skip blank lines? - forum topic by Rik Bitter


I did a few searches and couldn't turn up anything on this topic, so I thought I'd ask. I've already worked around it by preprocessing my data file through sed to remove any blank lines. For future reference, is there an option in FastLoad that will have it skip blank lines?
Thx,
Rik
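For reference, the sed preprocessing mentioned above can be as simple as deleting empty lines (file names here are placeholders):

```shell
# Remove blank lines before handing the file to FastLoad.
printf 'a|1\n\nb|2\n\n' > data.txt       # sample input with blank lines
sed '/^$/d' data.txt > data_clean.txt    # delete empty lines
cat data_clean.txt
```

The cleaned file contains only `a|1` and `b|2`; lines consisting solely of whitespace would need a broader pattern such as `/^[[:space:]]*$/d`.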


FASTLOAD: Can it skip blank lines? - response (1) by feinholz


No, FastLoad cannot skip blank lines.
 

Updating SQL assistant 15 - forum topic by zkimble


Hello, I recently downloaded TTU v15.00, which seems to have a lot of bugs on my machine. I see some comments about updates to SQL Assistant, but I can't find an updated version or an updater. How can I update SQL Assistant to the latest version?
Thanks


bteq .OS calls to UNIX "date" command - forum topic by smartyollie


I have a bteq .RUN command sandwiched by .OS calls which log the start and end datetimes of the .RUN. The dates are obtained by calls to the UNIX 'date' command. The START date is fine; however, my COMPLETED date is always equal to my START date. I don't see how that's possible, because the 'date' command is invoked fresh on the COMPLETED line. How can I fix this to get the real, actual COMPLETED date to print?
Here's the code:
.OS echo Running TERADATA script <script name>        on `date`>&2;                      <=========== this date is correct.
.RUN FILE = <script file>       ;
.IF ERRORCODE <> 0 THEN .GOTO BADEXIT;

.LABEL GOODEXIT
.OS echo TERADATA file <script name>         COMPLETED on `date` >&2;                  <=========== this date is equal to the date above, even though it should be several minutes later.
.QUIT 0;
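One possible cause, offered as an assumption since the post doesn't show how the BTEQ script is produced: if the script is generated or fed to bteq through an unquoted here-document in a shell wrapper, the backticks around `date` are expanded once by the outer shell before BTEQ ever runs, so both timestamps come out identical. Quoting the here-document delimiter keeps the backticks literal so they are only evaluated when each .OS command actually executes:

```shell
# With an unquoted delimiter (<<END) the outer shell expands `date`
# immediately; with a quoted delimiter (<<'END') the backticks survive
# verbatim for BTEQ's .OS command to evaluate later.
cat <<'END'
.OS echo COMPLETED on `date` >&2;
END
```

This prints the line with the backticks intact, which is what the BTEQ script needs to contain.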


TTU 15 Installed, SQL Assistant Error - response (7) by kneelame


Was this problem solved? I am facing the same error. I understand that TTU 15.00.03 may have resolved it, but I don't see this package for download. Any help is appreciated.


Teradata Parallel Transporter Wizard 13.10 Issues - response (21) by doognek


It is simply an issue with the TPT Wizard trying to use the 64-bit instead of the 32-bit Java client. Edit tptwizard.cmd and prepend the 32-bit Java bin directory to the PATH environment variable so it is found before the 64-bit client:
 @Echo off
[cut]
set PATH=C:\Program Files (x86)\Java\jre1.8.0_31\bin;%PATH%
set TPT_MIN_JAVA_VERSION=1.4.2_06
[cut]
Save the tptwizard.cmd file and TPT Wizard will load when you try it again.

 

Load data from Oracle to Teradata with correct character set - response (5) by paulogil


Hi Steve, 
Could you talk to your colleague and ask him my question?
Thanks.
 

String-generated SQL statement into BTEQ - forum topic by denilsson10


Hi!
I have created a SQL-statement that is in a string.
Ex:
'insert into MetadataTable.TestTable ' !!
'sel ''' !! trim(substr(Col.databasename,1,3)) !! ''' as Environment, ''' !!

trim(T1322.Id) !! ''' as Id, ''' !!

trim(T1324.RId) !! ''' as RId, ''' !!

trim(substr(Col.databasename,4,20)) !! ''' as DatabaseName, ''' !!

trim(Col.tablename) !! ''' as TableName, ''' !!....etc etc.
As it is right now, I'm running it in SQL Assistant, but my goal is to run it as a scheduled batch job on z/OS.
Can someone help out here with how to:
1. Run the generate-SQL statement, which will generate an outfile.
2. Run the generated SQL statement in a new script, sourcing the outfile from step 1.
Brgds


Load data from Oracle to Teradata with correct character set - response (6) by vishnuvardhan


Hi,
There is a checkbox under the Advanced options tab in the ODBC Administrator's Oracle Wire Protocol DSN settings called "Enable N-Char Support".
You should check it to enable UTF support.
 

Fastexport with MLOAD option - forum topic by Gnana Reddy


Hi,
To save Teradata space, we planned to export the old data to a dat file and load it back in whenever required. I am using the FastExport script below to export data with the MLOAD option.
Script:
====
.logtable DB.logtable;
.logon xx/xxxxx/xxxxx;
.BEGIN EXPORT SESSIONS 20;
.EXPORT OUTFILE temp.dat
 MLSCRIPT ML_SCRIPT.mlds;
LOCKING ROW FOR ACCESS
SELECT * FROM DB.TABLE_NAME;
.END EXPORT;
.LOGOFF;
==========================
Observation:
====
> The dat file is not in a readable format (see sample data below).
=========================
04163311041
           ¹3   1130Ó$!
0 2916221161¹4   1130ÓL
02111211030Ï    Õ·1   1230Ó/
==========================
> The row count also does not match the table record count.
Can you please clarify the questions below.
1. How can I ensure the exported dat file is correct upon successful completion of my job?
2. Can I load a sample of 100 records by adding some condition in the MLOAD script? If yes, please let me know the option and where to change it in the mload script.
Thank You.

