Channel: Teradata Forums - Tools

Getting export results in same date format as in Teradata table using BTEQ - forum topic by prateek.bansal07


I am fairly new to Teradata. I have been struggling to get the exact format for all the dates in my SELECT query when doing an export using BTEQ.
 
Here is my script :

/opt/teradata/client/15.10/bin/bteq <<EOI
.LOGON hostname/username,password;
.EXPORT REPORT FILE = sample_table.csv
.SET RECORDMODE OFF
.SET TITLEDASHES OFF
.SET FORMAT OFF
.SET WIDTH 65531
.SET SEPARATOR ","
SELECT top 100 * from table;
.EXPORT RESET
.LOGOFF;
.EXIT;
EOI

 

The result gets populated in sample_table.csv, but all the dates come out in yy-MM-dd format, whereas the requirement is to get them in exactly the same format as they appear in the Teradata table.

Thanks


Getting export results in same date format as in Teradata table using BTEQ - response (1) by Fred


"Exactly the same format as they are in the Teradata table" would be an internal binary representation; probably not what you really want.
BTEQ in FIELDMODE will use the FORMAT associated with the column to convert to character form. Other tools (e.g. SQL Assistant) typically convert the ODBC/JDBC/etc. driver format to the client's local form. You could ALTER the column format in the table, or explicitly CAST the date field(s) to some other format, or possibly SET SESSION DATEFORM=ANSIDATE; if what you want is yyyy-mm-dd format.
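As a sketch of the CAST approach mentioned above (the table and column names here are placeholders, not from the original post):

```sql
-- Render a DATE column as fixed-format text during a BTEQ export.
-- my_table / order_date are hypothetical names; pick any FORMAT string
-- the column should be displayed in.
SELECT CAST(CAST(order_date AS FORMAT 'YYYY-MM-DD') AS CHAR(10)) AS order_date_txt
FROM my_table;
```

The inner CAST applies the display FORMAT; the outer CAST to CHAR makes the result a plain character field, so REPORT-mode export writes it exactly as formatted.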

SQL Assistant v.15.00 Keyboard Shortcut problems - Comment - response (2) by bhushan.koli


Here are a few keyboard shortcuts (GUI mode only, on Windows/Linux):
F2 : Open Query Builder, with syntax for all SQL statements
F5 : Execute SQL query
F6 : Explain plan for SQL query
F9 : Execute SQL queries in parallel
F10 : Abort SQL query
F11 : Display last error encountered
Ctrl + N : New SQL query window
Ctrl + Q : Format SQL query
Ctrl + U : Convert to uppercase
Ctrl + H : Find & replace

Getting export results in same date format as in Teradata table using BTEQ - response (2) by prateek.bansal07


 

I tried using SET SESSION DATEFORM=ANSIDATE; to get all date fields in YYYY-MM-DD format, but the end result is always yy-MM-dd :( Is there anything wrong with my shell script?

/opt/teradata/client/15.10/bin/bteq <<EOI
.LOGON hostname/username,password;
.EXPORT REPORT FILE = sample_table.csv
.SET RECORDMODE OFF
.SET TITLEDASHES OFF
SET SESSION DATEFORM=ANSIDATE;
.SET FORMAT OFF
.SET WIDTH 65531
.SET SEPARATOR ","
SELECT top 100 * from table;
.EXPORT RESET
.LOGOFF;
.EXIT;
EOI

Getting export results in same date format as in Teradata table using BTEQ - response (3) by ToddAWalter


Please provide the SHOW TABLE output. We need to see the data type and format of the column that contains the date.

CLI error" MTDP: EM_GSSINITFAIL:... When logging on BTEQ in remote server - forum topic by ericawang0829


Hi all,
I'm trying to use the BTEQ utility on a remote server. What I did was install TTU 15.10.05 for Windows on my local machine first and then copy the installed folder to the remote server, since I can't install or download any packages there.
Here is the error message:
"
.logon XXX/dbc
password:
 *** CLI error: MTDP: EM_GSSINITFAIL<235>: call to gss_init failed. [terasso] Can not load TDGSS library. The TDGSS Library currently has initialization in progress already
 *** Return code from CLI is: 235
 *** Error: Logon failed!
"
I'm not sure if it's wrong to copy the installed package like this. If so, is there another way to use the BTEQ utility on a remote server without installing it directly?
Any help would be great, thanks!


Month days !!! - forum topic by RevathiS


Hi,
Could anyone explain this code logic:
DAY(DATEADD(ms,-2,DATEADD(MONTH, DATEDIFF(MONTH,0,CURRENT_DATE.DAY)+1,0)))
and provide me with alternate code in Teradata...
 
Thanks..


SQL Assistant Change Password - response (1) by Swati@td


Hi,
To change the password in SQL Assistant you must first log on to the system.


Month days !!! - response (1) by dnoeth


Which DBMS is this? It looks like MS SQL Server, but CURRENT_DATE.DAY is strange.
It would be easier if you added the current result, but this seems to return the last day of the current month:
EXTRACT(DAY FROM LAST_DAY(CURRENT_DATE))
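For reference, a quick sketch of both forms in Teradata (LAST_DAY is available on newer releases; the ADD_MONTHS variant is a suggested equivalent for releases without it):

```sql
-- Last day-of-month number via LAST_DAY:
SELECT EXTRACT(DAY FROM LAST_DAY(CURRENT_DATE));

-- Equivalent without LAST_DAY:
-- first of this month, advance one month, step back one day.
SELECT EXTRACT(DAY FROM
       ADD_MONTHS(CURRENT_DATE - EXTRACT(DAY FROM CURRENT_DATE) + 1, 1) - 1);
```

Both expressions return, e.g., 31 in August and 28 or 29 in February.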

 
 

TPT export into AWS S3 buckets. - response (2) by srivigneshkn

Unicode Export through TPT - forum topic by srivigneshkn


Hi,

I need to export a few Chinese and Japanese characters stored in Teradata to a file.

I have tried different options to export the data through TPT, but either I am not able to see the Unicode characters in the generated file, or I face a conflicting data length error.

Please find the details below.

Requirement:
1. Export Chinese / Japanese characters out of Teradata to a file.
2. The table has Japanese characters stored in the ClassNm and DeptNm fields, e.g. 本菊地, 竹益下

 

TPT options tried:

1. Generated TPT with the same schema and column lengths as in TD.
   Output: exports data, but Unicode data not displayed in the file.
2. Generated TPT with column lengths 2 times larger for Unicode columns.
   Output: exports data, but Unicode data not displayed in the file.
3. Generated TPT with column lengths 2 times larger for Unicode columns and USING CHARACTER SET UTF8 as the first line of the TPT file.
   Output: fails with this error:

FILE_WRITER[1]: Operator instance 1 processing file 'unitest.out'.
EXPORT_OPERATOR: connecting sessions
TPT_INFRA: TPT02638: Error: Conflicting data length for column(1) - EmpCatNumber. Source column's data length (20) Target column's data length (60).
EXPORT_OPERATOR: TPT12108: Output Schema does not match data from SELECT statement
FILE_WRITER[1]: Total files processed: 0.
EXPORT_OPERATOR: disconnecting sessions
EXPORT_OPERATOR: Total processor time used = '0.020462 Second(s)'

TPT execution command:

tbuild -f unitest.ctl -v TPTParameter.param -e UTF8 -s 1

 

Table structure:

CREATE SET TABLE master_t.unitest ,NO FALLBACK ,
     NO BEFORE JOURNAL,
     NO AFTER JOURNAL,
     CHECKSUM = DEFAULT,
     DEFAULT MERGEBLOCKRATIO
     (
      EmpCatNumber VARCHAR(10) CHARACTER SET UNICODE NOT CASESPECIFIC NOT NULL,
      EmpType CHAR(5) CHARACTER SET UNICODE NOT CASESPECIFIC NOT NULL,
      EmpCd VARCHAR(10) CHARACTER SET LATIN NOT CASESPECIFIC,
      ClassNm VARCHAR(180) CHARACTER SET UNICODE NOT CASESPECIFIC,
      DeptNm VARCHAR(180) CHARACTER SET UNICODE NOT CASESPECIFIC,
      CrtdBy VARCHAR(25) CHARACTER SET LATIN NOT CASESPECIFIC NOT NULL DEFAULT USER ,
      CrtTmstmp TIMESTAMP(0) NOT NULL DEFAULT CURRENT_TIMESTAMP(0))
PRIMARY INDEX Emp_Nm_NUPI ( EmpCatNumber ,EmpType );

 

TPT script:

USING CHARACTER SET UTF8
DEFINE JOB EXPORT_unitest_TABLE_TO_FILE
DESCRIPTION 'EXPORT unitest TABLE TO A FILE'
(
/*****************************/
DEFINE SCHEMA unitest_SCHEMA
DESCRIPTION 'SAMPLE unitest SCHEMA'
(
EmpCatNumber Varchar(20) , EmpType Varchar(12) , EmpCd Varchar(10) , ClassNm Varchar(360) , DeptNm Varchar(360) , CrtdBy Varchar(25) , CrtTmstmp Varchar(30)
);
/*****************************/
DEFINE OPERATOR FILE_WRITER()
DESCRIPTION 'TERADATA PARALLEL TRANSPORTER DATA CONNECTOR OPERATOR'
TYPE DATACONNECTOR CONSUMER
SCHEMA *
ATTRIBUTES
(
VARCHAR PrivateLogName    = 'file_writer_privatelog',
VARCHAR FileName          = 'unitest.out',
VARCHAR IndicatorMode     = 'N',
VARCHAR OpenMode          = 'Write',
VARCHAR Format            = 'DELIMITED',
VARCHAR TextDelimiter     = @MyDelimiter,
VARCHAR TRACELEVEL        = 'ALL'
);
/*****************************/
DEFINE OPERATOR EXPORT_OPERATOR()
DESCRIPTION 'TERADATA PARALLEL TRANSPORTER EXPORT OPERATOR'
TYPE EXPORT
SCHEMA unitest_SCHEMA
ATTRIBUTES
(
VARCHAR PrivateLogName    = 'export_privatelog',
INTEGER MaxSessions       = @MaxSessions,
INTEGER MinSessions       = @MinSessions,
VARCHAR TdpId             = @MyTdPid,
VARCHAR UserName          = @MyUserName,
VARCHAR UserPassword      = @MyPassword,
VARCHAR AccountId,
VARCHAR SelectStmt        = 'select
  cast (EmpCatNumber as Varchar(20)), cast (EmpType as Varchar(12)), cast (EmpCd as Varchar(10)), cast (ClassNm as Varchar(360)), cast (DeptNm as Varchar(360)), cast (CrtdBy as Varchar(25)), cast (CrtTmstmp as Varchar(30))
from eis_t.unitest  where  1=1 ; '
);
/*****************************/
STEP export_to_file
(
APPLY TO OPERATOR (FILE_WRITER() )
SELECT * FROM OPERATOR (EXPORT_OPERATOR() [1] );
);
);

 

 

Request your help on the same.

Thanks & Regards,
Srivignesh KN

 


Unicode Export through TPT - response (1) by feinholz


In order to see the chinese/japanese characters, you MUST specify either UTF8 or UTF16 as the character set in the script.
 
If you specify UTF8, you must triple the size of the character fields (CHAR, VARCHAR).
If you specify UTF16, you must double the size of the character fields (CHAR,VARCHAR).
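Applied to the table posted above, the sizing rule would look roughly like this (a sketch, not a verified script; the VARCHAR(30) length for the text-rendered TIMESTAMP is an assumption):

```sql
/* Script schema under UTF8: multiply CHAR/VARCHAR sizes from the table by 3.
   (With UTF16 the factor would be 2.) */
DEFINE SCHEMA unitest_SCHEMA
(
  EmpCatNumber VARCHAR(30),   /* table: VARCHAR(10) UNICODE, x3  */
  EmpType      VARCHAR(15),   /* table: CHAR(5)     UNICODE, x3  */
  EmpCd        VARCHAR(30),   /* table: VARCHAR(10) LATIN,   x3  */
  ClassNm      VARCHAR(540),  /* table: VARCHAR(180) UNICODE, x3 */
  DeptNm       VARCHAR(540),  /* table: VARCHAR(180) UNICODE, x3 */
  CrtdBy       VARCHAR(75),   /* table: VARCHAR(25) LATIN,   x3  */
  CrtTmstmp    VARCHAR(30)    /* TIMESTAMP(0) rendered as text   */
);
```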
 

Unicode Export through TPT - response (2) by srivigneshkn


Hi Steve,
I have mentioned UTF8 and tripled the size of the Unicode fields; I still get the following error.
 

EXPORT_OPERATOR: connecting sessions
TPT_INFRA: TPT02638: Error: Conflicting data length for column(1) - EmpCatNumber. Source column's data length (30) Target column's data length (90).
EXPORT_OPERATOR: TPT12108: Output Schema does not match data from SELECT statement

Thanks

Unicode Export through TPT - response (3) by feinholz


In the example script you sent, EmpCatNumber was defined in the table as VARCHAR(10).
If that is the case, then the script must specify VARCHAR(30).
If this is done correctly, I am not sure how TPT could think that the target column's data length would be 90.
 
What version of TPT are you running?
 

Unicode Export through TPT - response (4) by srivigneshkn


I have changed EmpCatNumber to 30. I even tried increasing the size to 60/90, but the error message remains the same.
My TPT version is Teradata Parallel Transporter 15.10.01.02 64-Bit.
Can you please help me understand whether we need to set any system-wide property for handling Unicode exports?

$ tbuild -f unitest.ctl -v TPTParameter.param -e UTF8 -s 1
Teradata Parallel Transporter Version 15.10.01.02 64-Bit
TPT_INFRA: TPT03624: Warning: tbuild -s option argument specifies the first job step;
  no job steps will be skipped (unless this is a restarted job).
Job log: /opt/teradata/client/15.10/tbuild/logs/edwexp-97.out
Job id is edwexp-97, running on ushexpd1ltdexp1
Teradata Parallel Transporter Export Operator Version 15.10.01.02
EXPORT_OPERATOR: private log specified: export_privatelog
Teradata Parallel Transporter DataConnector Operator Version 15.10.01.02
FILE_WRITER[1]: Instance 1 directing private log report to 'file_writer_privatelog-1'.
FILE_WRITER[1]: DataConnector Consumer operator Instances: 1
FILE_WRITER[1]: ECI operator ID: 'FILE_WRITER-63622'
FILE_WRITER[1]: Operator instance 1 processing file 'unitest.out'.
EXPORT_OPERATOR: connecting sessions
TPT_INFRA: TPT02638: Error: Conflicting data length for column(1) - EmpCatNumber. Source column's data length (30) Target column's data length (90).
EXPORT_OPERATOR: TPT12108: Output Schema does not match data from SELECT statement
FILE_WRITER[1]: Total files processed: 0.
EXPORT_OPERATOR: disconnecting sessions
EXPORT_OPERATOR: Total processor time used = '0.020582 Second(s)'
EXPORT_OPERATOR: Start : Fri Aug 19 14:00:09 2016
EXPORT_OPERATOR: End   : Fri Aug 19 14:00:10 2016
Job step export_to_file terminated (status 8)
Job edwexp terminated (status 8)
Job start: Fri Aug 19 14:00:05 2016
Job end:   Fri Aug 19 14:00:10 2016

Thanks,
Srivignesh KN


Unicode Export through TPT - response (5) by feinholz


Please send me your script.
(I may make changes to it.)
 

Zero records are fetching when using where statement in TPT - response (16) by venkata_k01


Hi Steve,
 
We tried many ways of putting the SELECT statement in the script, but still no luck.
It would be very helpful if you could provide one simple sample example.
 
Thanks & Regards,
Hanu

Unicode Export through TPT - response (6) by dnoeth


You must triple the VARCHAR size in your export schema, not in your SELECT.
Additionally, you don't need to specify a schema at all when the Format is DELIMITED, and your SELECT can be a simple SELECT * (unless you want to use a specific FORMAT for the cast), as typecasts to VARCHAR are done automatically.
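A minimal sketch of that schema-free style, using the TPT operator templates shipped with TTU ($EXPORT, $FILE_WRITER); this is written from memory of the template syntax and reuses the job-variable and table names from the posted script, so treat it as an illustration rather than a tested job:

```sql
/* Run with: tbuild -f export_job.tpt -v TPTParameter.param -e UTF8
   No DEFINE SCHEMA: with Format 'DELIMITED' and SELECT *, TPT
   derives the output schema and casts columns to VARCHAR itself. */
DEFINE JOB export_unitest
(
  APPLY TO OPERATOR
  (
    $FILE_WRITER()
    ATTR
    (
      FileName      = 'unitest.out',
      Format        = 'DELIMITED',
      TextDelimiter = @MyDelimiter
    )
  )
  SELECT * FROM OPERATOR
  (
    $EXPORT()
    ATTR
    (
      TdpId        = @MyTdPid,
      UserName     = @MyUserName,
      UserPassword = @MyPassword,
      SelectStmt   = 'SELECT * FROM master_t.unitest;'
    )
  );
);
```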
 

Teradata Tool and Utilities package for Linux - response (5) by ChrisRx


Is there a particular reason why the TTU for Windows is made available and the TTU for Linux is not?

Easyloader Schema Error 15.10 Only - forum topic by toadrw


Hello:

We're receiving the following error on 15.10 which we were not receiving on 15.00. Could you please help? Here's our tdload command line:

 

tdload -r "C:\Users\Leona\AppData\Roaming\Coffing Data Warehousing\Nexus Portal\Scripts\TPT\checkpoint" -L "C:\Users\Leona\AppData\Roaming\Coffing Data Warehousing\Nexus Portal\Scripts\TPT\logs" --TargetTable "SJB_TESTING"."addresses4444" --TargetTdpId tdat15 --TargetUserName <TARGET_PASSWORD> --TargetUserPassword <TARGET_PASSWORD> --TargetWorkingDatabase "SJB_TESTING" --SourceFileName "C:\Users\Leona\AppData\Roaming\Coffing Data Warehousing\Nexus Portal\Scripts\TPT\temp\sql_class.csv" --SourceTextDelimiter "|" addresses4444_LoadJob

 

CSV flat file:

5555555|121 Jump St.|Big City|NY|334566598|310|4531111
2222222|123 Some St.|Sometown|CA|256781212|475|5651213
4444444|12 Jump St.|Big City|NY|334566576|310|4530097
1111111|123 Any St.|Anytown|AL|456780000|435|5551213
3333333|2468 Appreciate Ave.|Mytown|MI|123561111|937|3334567

 

Original DDL:

CREATE MULTISET TABLE SJB_TESTING.addresses244 ,NO FALLBACK ,
     NO BEFORE JOURNAL,
     NO AFTER JOURNAL,
     CHECKSUM = DEFAULT,
     DEFAULT MERGEBLOCKRATIO
     (
      SUBSCRIBER_NO FLOAT,
      STREET VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
      CITY VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
      STATE VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
      ZIP FLOAT,
      AREACODE FLOAT,
      PHONE FLOAT)
NO PRIMARY INDEX ;

 

Modified DDL (all varchar still fails):

CREATE MULTISET TABLE SJB_TESTING.addresses4444 ,NO FALLBACK ,
     NO BEFORE JOURNAL,
     NO AFTER JOURNAL,
     CHECKSUM = DEFAULT,
     DEFAULT MERGEBLOCKRATIO
     (
      SUBSCRIBER_NO FLOAT,
      STREET VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
      CITY VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
      STATE VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
      ZIP VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
      AREACODE VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
      PHONE VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC)
PRIMARY INDEX ( SUBSCRIBER_NO );

 

Error log:

C:\Code\Nexus\Main-Branch-DaveDev\ CDW.Admin.Presentation\bin\Debug>cd "C:\Users\Leona\AppData\Roaming\Coffing Data Warehousing\Nexus Portal\Scripts\TPT\"
C:\Users\Leona\AppData\Roaming\Coffing Data Warehousing\Nexus Portal\Scripts\TPT>tdload -r "C:\Users\Leona\AppData\Roaming\Coffing Data Warehousing\Nexus Portal\Scripts\TPT\checkpoint" -L "C:\Users\Leona\AppData\Roaming\Coffing Data Warehousing\Nexus Portal\Scripts\TPT\logs" --TargetTable "SJB_TESTING"."addresses4444" --TargetTdpId tdat15 --TargetUserName <TARGET_PASSWORD> --TargetUserPassword <TARGET_PASSWORD> --TargetWorkingDatabase "SJB_TESTING" --SourceFileName "C:\Users\Leona\AppData\Roaming\Coffing Data Warehousing\Nexus Portal\Scripts\TPT\temp\sql_class.csv" --SourceTextDelimiter "|" addresses4444_LoadJob
Teradata Parallel Transporter Version 15.10.01.02 32-Bit
Job log: C:\Users\Leona\AppData\Roaming\Coffing Data Warehousing\Nexus Portal\Scripts\TPT\logs/addresses4444_LoadJob-34.out
Job id is addresses4444_LoadJob-34, running on Coffingdw
Teradata Parallel Transporter DataConnector Operator Version 15.10.01.02
Teradata Parallel Transporter Load Operator Version 15.10.01.02
$LOAD: private log specified: LoadLog
$FILE_READER[1]: DataConnector Producer operator Instances: 1
$FILE_READER[1]: TPT19108 Data Format 'DELIMITED' requires all 'VARCHAR/JSON/JSON BY NAME/CLOB BY NAME/BLOB BY NAME/XML BY NAME/XML/CLOB' schema.
$FILE_READER[1]: TPT19015 TPT Exit code set to 12.
$LOAD: connecting sessions
$LOAD: preparing target table
$LOAD: entering Acquisition Phase
$FILE_READER[1]: Total files processed: 0.
$LOAD: disconnecting sessions
$LOAD: Total processor time used = '0.28125 Second(s)'
$LOAD: Start : Mon Aug 22 10:07:07 2016
$LOAD: End   : Mon Aug 22 10:07:18 2016
Job step MAIN_STEP terminated (status 8)
Job addresses4444_LoadJob terminated (status 8)
Job start: Mon Aug 22 10:07:07 2016
Job end:   Mon Aug 22 10:07:18 2016
Teradata Load Utility Version 15.10.01.02 32-Bit
C:\Users\Leona\AppData\Roaming\Coffing Data Warehousing\Nexus Portal\Scripts\TPT>Exit
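Worth noting: the TPT19108 line in the log says the DELIMITED format requires an all-VARCHAR schema, and tdload derives its schema from the target table. Both DDL versions posted above still declare SUBSCRIBER_NO as FLOAT (and the original also has ZIP/AREACODE/PHONE as FLOAT). A target table with only VARCHAR columns would satisfy that requirement; the following is a sketch with an assumed table name and length, not a tested fix:

```sql
/* All-VARCHAR staging table so tdload's derived DELIMITED schema
   contains no FLOAT columns (addresses_stage is a hypothetical name;
   numeric types can be applied afterward with INSERT-SELECT casts). */
CREATE MULTISET TABLE SJB_TESTING.addresses_stage ,NO FALLBACK
     (
      SUBSCRIBER_NO VARCHAR(20),
      STREET        VARCHAR(255),
      CITY          VARCHAR(255),
      STATE         VARCHAR(255),
      ZIP           VARCHAR(255),
      AREACODE      VARCHAR(255),
      PHONE         VARCHAR(255))
NO PRIMARY INDEX ;
```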
