Channel: Teradata Forums - Tools

BTEQ and OReplace - response (2) by dnoeth


Check the data type returned by oReplace: it's probably a VarChar(8000), but your .WIDTH is set to 2000, which leads to truncation. You need to cast to a properly sized VarChar.
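
A minimal BTEQ sketch of the fix (table, column, and replacement values are placeholders):

.SET WIDTH 2000
SELECT CAST(OREPLACE(col1, 'old', 'new') AS VARCHAR(100)) AS col1
FROM mytab;

Without the CAST, the report column is sized for the full VarChar(8000) return type, which exceeds the 2000-character .WIDTH.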


Understanding teradata sample scripts - forum topic by kchinnasamy@snaplogic.com


Hi Guys,
I'm new to Teradata, so I'm trying to understand the sample/qstart programs in the Teradata directory. I was looking at qstart1 and couldn't find where the OPERATOR "$FILE_WRITER" or "$LOAD" is defined.
Can someone please help me with that?
--Thanks,
Kannan 


Understanding teradata sample scripts - response (1) by dnoeth


You mean the scripts found in Teradata\Client...? Those are examples for TPT (Teradata Parallel Transporter).

 

The files in quickstart are based on the TPT Quick Start Guide, and the userguide scripts are based on the TPT User Guide.

 

Download those PDFs matching your TPT version at http://www.info.teradata.com/templates/eSrchResults.cfm?txtpid=&txtttlkywrd=TBUILD&txtrelno=&prodline=&nm=Teradata%20Parallel%20Transporter&srtord=Desc&rdSort=Date

Understanding teradata sample scripts - response (2) by feinholz


The $FILE_WRITER and $LOAD syntax describes the use of templates.
Those templates define the operators you are trying to use, and those operator definitions are found in the "templates" directory where TPT is installed.
You should see $FILE_WRITER.txt and $LOAD.txt files in that directory.
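
For illustration, a template-based job never defines those operators inline; the job just references the template names, roughly like this (job name assumed):

DEFINE JOB template_demo
(
  APPLY $INSERT TO OPERATOR ( $LOAD )
  SELECT * FROM OPERATOR ( $FILE_READER );
);

At job launch, TPT resolves $LOAD and $FILE_READER (and likewise $FILE_WRITER) from the matching .txt files in the templates directory.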
 

BTEQ and OReplace - response (3) by KellnerK


Thank you Dieter, that was exactly the case. I did not realize that, without an explicit cast, oReplace would expand the field size so significantly. Casting back to VarChar(32) corrected the issue.

Casting Date in Bteq utilities - response (2) by Fred


FastLoad supports only limited data type conversions, either implicit or using Teradata-style syntax, and at a minimum you would need to change the input to always have 2 digits for the hour. If you are saving from Excel as CSV, apply a custom format to the column to make it compatible with the Teradata / ANSI standard: yyyy-mm-dd hh:mm:ss
Also, if the input is delimited text, all the fields in your FastLoad DEFINE must be VARCHAR.
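
A minimal FastLoad sketch along those lines (logon, file, table, and field names are placeholders; every DEFINE field is VARCHAR because the input is delimited):

LOGON tdpid/user,password;
SET RECORD VARTEXT ",";
DEFINE
  diary_date (VARCHAR(19)),
  note_txt   (VARCHAR(200))
FILE = diary.csv;
BEGIN LOADING stg_diary ERRORFILES stg_err1, stg_err2;
INSERT INTO stg_diary (diary_date, note_txt)
VALUES (:diary_date, :note_txt);
END LOADING;
LOGOFF;

With the data staged as VARCHAR in the yyyy-mm-dd hh:mm:ss layout, the character-to-TIMESTAMP conversion can then be done in SQL afterwards.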

Fatal Error with TD TPT API - forum topic by mitsukiefi


Dear All
Our reporting tool generates the following SQL query to be executed on TD 15.10:

select	a11.Gl_Acct_Prod_Grp_Id  Gl_Acct_Prod_Grp_Id,
	a11.Gl_Acct_Prod_Grp_Src_Id  Gl_Acct_Prod_Grp_Src_Id,
	a13.Prod_Grp_Desc  Prod_Grp_Desc,
	a13.Prod_Grp_Id_L1  Prod_Grp_Id_L1,
	a13.Prod_Grp_Src_Id_L1  Prod_Grp_Src_Id_L1,
	a13.Prod_Grp_Desc_L1  Prod_Grp_Desc_L1,
	a12.Sme_Service_Area_Id  Service_Area_Id,
	a12.Sme_Service_Area_Src_Id  Service_Area_Src_Id,
	a111.Service_Area_Desc  Service_Area_Desc,
	a11.Serv_Connect_Prod_Grp_Id  Prod_Grp_Id,
	a11.Serv_Connect_Prod_Grp_Src_Id  Prod_Grp_Src_Id,
	a19.Prod_Grp_Desc  Prod_Grp_Desc0,
	a11.Serv_Prod_Grp_Id  Serv_Prod_Grp_Id,
	a11.Serv_Prod_Grp_Src_Id  Serv_Prod_Grp_Src_Id,
	a110.Prod_Grp_Desc  Prod_Grp_Desc1,
	a11.Am_Login_Id  Ac_Login_Id,
	a11.Am_Org_Unit_Long_Name  Ac_Login_Id0,
	a11.Scs_Customer_Id  Scs_Customer_Id,
	a11.Seg_Name_L1  Seg_Name_L1,
	a11.Seg_Name_L2  Seg_Name_L2,
	a14.Calendar_Year  Calendar_Year,
	a14.Month_Of_Year  Month_Of_Year,
	a14.Month_Of_Year_Short_Name  Month_Of_Year_Short_Name,
	a11.Calendar_Month  Calendar_Month,
	a11.Ac_Login_Id  Ac_Login_Id1,
	a11.Ac_Org_Unit_Long_Name  Ac_Login_Id2,
	a11.Sales_Area_Id  Sales_Area_Id,
	a11.Sales_Area_Typ_Id  Sales_Area_Typ_Id,
	a15.Sme_Cust_Sales_Area_Name  Sme_Cust_Sales_Area_Name,
	a11.Sales_Region_Id  Sme_Cust_Sales_Region_Id,
	a11.Sales_Region_Typ_Id  Sme_Cust_Sales_Region_Typ_Id,
	a18.Sme_Cust_Sales_Region_Name  Sme_Cust_Sales_Region_Name,
	a12.Sme_Business_Area_Id  Business_Area_Id,
	a12.Sme_Business_Area_Src_Id  Business_Area_Src_Id,
	a112.Business_Area_Desc  Business_Area_Desc,
	a11.Connect_Prod_Grp_Id  Prod_Grp_Id_L4,
	a11.Connect_Prod_Grp_Src_Id  Prod_Grp_Src_Id_L4,
	a17.Connect_Prod_Grp_Desc  Prod_Grp_Desc_L4,
	a11.Bill_Prod_Id  Bill_Prod_Id,
	a11.Bill_Prod_Src_Id  Bill_Prod_Src_Id,
	a16.Prod_Short_Desc  Prod_Short_Desc,
	a11.Item_Ut_Of_Meas  Item_Ut_Of_Meas,
	a11.Master_Scs_Company_Id  Master_Scs_Company_Id,
	a15.Sme_Cust_Sales_Region_Id  Sme_Cust_Sales_Region_Id0,
	a15.Sme_Cust_Sales_Region_Typ_Id  Sme_Cust_Sales_Region_Typ_Id0,
	sum(a11.Item_Net_Amt)  Billed_Rev,
	sum(sum(a11.Item_Net_Amt)) over(partition by a14.Calendar_Year, a11.Serv_Prod_Grp_Id, a11.Serv_Prod_Grp_Src_Id, a11.Serv_Connect_Prod_Grp_Id, a11.Serv_Connect_Prod_Grp_Src_Id, a12.Sme_Service_Area_Id, a12.Sme_Service_Area_Src_Id, a15.Sme_Cust_Sales_Region_Id, a15.Sme_Cust_Sales_Region_Typ_Id, a11.Sales_Area_Id, a11.Sales_Area_Typ_Id, a11.Gl_Acct_Prod_Grp_Id, a11.Gl_Acct_Prod_Grp_Src_Id, a11.Connect_Prod_Grp_Id, a11.Connect_Prod_Grp_Src_Id, a12.Sme_Business_Area_Id, a12.Sme_Business_Area_Src_Id, a11.Item_Ut_Of_Meas, a11.Bill_Prod_Id, a11.Bill_Prod_Src_Id order by a11.Calendar_Month asc rows unbounded preceding)  Billed_Rev_Ytm
from	Dm_Billed_Rev_Sme	a11
	join	Scp_Serv_Con_Map	a12
	  on 	(a11.Serv_Connect_Prod_Grp_Id = a12.Serv_Connect_Prod_Grp_Id and 
	a11.Serv_Connect_Prod_Grp_Src_Id = a12.Serv_Connect_Prod_Grp_Src_Id)
	join	Prod_Gl_Acct_Hier_X	a13
	  on 	(a11.Gl_Acct_Prod_Grp_Id = a13.Gl_Acct_Prod_Grp_Id and 
	a11.Gl_Acct_Prod_Grp_Src_Id = a13.Gl_Acct_Prod_Grp_Src_Id)
	join	Cal_Window_Month_Ext_X	a14
	  on 	(a11.Calendar_Month = a14.Calendar_Month)
	join	Sme_Sales_Region_Hier_Cur_X	a15
	  on 	(a11.Sales_Area_Id = a15.Sme_Cust_Sales_Area_Id and 
	a11.Sales_Area_Typ_Id = a15.Sme_Cust_Sales_Area_Typ_Id)
	join	Prod_Bill_Item_Inv_Hier_Cur_X	a16
	  on 	(a11.Bill_Prod_Id = a16.Prod_Id and 
	a11.Bill_Prod_Src_Id = a16.Prod_Src_Id)
	join	Base_Connect_Hier_X	a17
	  on 	(a11.Connect_Prod_Grp_Id = a17.Connect_Prod_Grp_Id and 
	a11.Connect_Prod_Grp_Src_Id = a17.Connect_Prod_Grp_Src_Id)
	join	Sme_Sales_Region_Cur_X	a18
	  on 	(a11.Sales_Region_Id = a18.Sme_Cust_Sales_Region_Id and 
	a11.Sales_Region_Typ_Id = a18.Sme_Cust_Sales_Region_Typ_Id)
	join	Prod_Service_Connectivity_X	a19
	  on 	(a11.Serv_Connect_Prod_Grp_Id = a19.Prod_Grp_Id and 
	a11.Serv_Connect_Prod_Grp_Src_Id = a19.Prod_Grp_Src_Id)
	join	Prod_Serv_Hier_X	a110
	  on 	(a11.Serv_Prod_Grp_Id = a110.Serv_Prod_Grp_Id and 
	a11.Serv_Prod_Grp_Src_Id = a110.Serv_Prod_Grp_Src_Id)
	join	Scp_Service_Area	a111
	  on 	(a12.Sme_Service_Area_Id = a111.Service_Area_Id and 
	a12.Sme_Service_Area_Src_Id = a111.Service_Area_Src_Id)
	join	Scp_Business_Area	a112
	  on 	(a12.Sme_Business_Area_Id = a112.Business_Area_Id and 
	a12.Sme_Business_Area_Src_Id = a112.Business_Area_Src_Id)
where	a14.Calendar_Year in (2016)
group by
        a11.Gl_Acct_Prod_Grp_Id,
	a11.Gl_Acct_Prod_Grp_Src_Id,
	a13.Prod_Grp_Desc,
	a13.Prod_Grp_Id_L1,
	a13.Prod_Grp_Src_Id_L1,
	a13.Prod_Grp_Desc_L1,
	a12.Sme_Service_Area_Id,
	a12.Sme_Service_Area_Src_Id,
	a111.Service_Area_Desc,
	a11.Serv_Connect_Prod_Grp_Id,
	a11.Serv_Connect_Prod_Grp_Src_Id,
	a19.Prod_Grp_Desc,
	a11.Serv_Prod_Grp_Id,
	a11.Serv_Prod_Grp_Src_Id,
	a110.Prod_Grp_Desc,
	a11.Am_Login_Id,
	a11.Am_Org_Unit_Long_Name,
	a11.Scs_Customer_Id,
	a11.Seg_Name_L1,
	a11.Seg_Name_L2,
	a14.Calendar_Year,
	a14.Month_Of_Year,
	a14.Month_Of_Year_Short_Name,
	a11.Calendar_Month,
	a11.Ac_Login_Id,
	a11.Ac_Org_Unit_Long_Name,
	a11.Sales_Area_Id,
	a11.Sales_Area_Typ_Id,
	a15.Sme_Cust_Sales_Area_Name,
	a11.Sales_Region_Id,
	a11.Sales_Region_Typ_Id,
	a18.Sme_Cust_Sales_Region_Name,
	a12.Sme_Business_Area_Id,
	a12.Sme_Business_Area_Src_Id,
	a112.Business_Area_Desc,
	a11.Connect_Prod_Grp_Id,
	a11.Connect_Prod_Grp_Src_Id,
	a17.Connect_Prod_Grp_Desc,
	a11.Bill_Prod_Id,
	a11.Bill_Prod_Src_Id,
	a16.Prod_Short_Desc,
	a11.Item_Ut_Of_Meas,
	a11.Master_Scs_Company_Id,
	a15.Sme_Cust_Sales_Region_Id,
	a15.Sme_Cust_Sales_Region_Typ_Id
;

The reporting tool is configured to use the TPT API for this query with the following settings:
TD_TDP_ID=dwhpprd;TD_MAX_SESSIONS=10;TD_MIN_SESSIONS=10;TD_MAX_INSTANCES=10;
When executing the query, it immediately fails with an error message:
Status: Execution failed

Error: SQL Generation Complete

QueryEngine encountered error: Coordinator::RunExport failed. Teradata TPT API encountered an error. 

Error type: TeradataWrapper Error. Error occured in exporter thread 6. Error type is 0, Thread[6]: Error occured in Initiate(). Error type is 0. Error message is Operator(libexportop.so) instance(1): INITIATE method failed with status = Fatal Error

 

No error is thrown when executing the same query over a traditional ODBC connection. Do you have any ideas why this is throwing an error in TPT mode?

 

Kind regards

  Christoph
 


TPT with JDBC to Kafka - response (2) by skunkwerk


Thanks, Tom.
 
Does anyone have a working Teradata -> Kafka configuration they can share?
I have Kafka, Zookeeper, & a Schema Registry up and running.
But when I run the JDBC connector, the consumer of the topic doesn't seem to show any entries.
I'm testing with a single table in the whitelist, and a quickstart-jdbc.properties file like this:
http://pastebin.com/yqSrqQS8
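
For readers who can't reach the paste, a Teradata-flavored quickstart-jdbc.properties for the Confluent JDBC source connector typically looks something like this (host, credentials, table, and column names are assumptions, and the Teradata JDBC driver jar must be on the connector's classpath):

name=teradata-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=1
connection.url=jdbc:teradata://tdhost/DATABASE=mydb,USER=myuser,PASSWORD=mypass
table.whitelist=mytable
mode=incrementing
incrementing.column.name=id
topic.prefix=teradata-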
 
regards,
imran


Casting Date in Bteq utilities - response (3) by krishna1985


Hi Fred,
Thank you very much for your inputs. However, I have tried to create the staging table with the query below, using the oTranslate function to pad a 0 when the hour is a single digit:
 

SUBSTRING(DIARY_DATE FROM 6 FOR 4) || '-' ||
SUBSTRING(DIARY_DATE FROM 3 FOR 3) || '-' ||
SUBSTRING(DIARY_DATE FROM 1 FOR 2) || '' ||
oTRANSLATE(SUBSTRING('0' FROM 1 FOR 18 - LENGTH(DIARY_DATE)) ||
           SUBSTRING(DIARY_DATE FROM 10 FOR 10), '', '')

TPT Wizard 15.10.01.02 64-Bit: error occurred while accessing the log - response (6) by pinaldba


The error can be avoided by editing the tptwizard.properties config file.
It's recommended to take a backup of the existing tptwizard.properties file first.
Edit the LOG_DIRECTORY path as below:
LOG_DIRECTORY=C:\\Program Files\\Teradata\\client\\15.10\\Teradata Parallel Transporter\\logs

ARCMAIN - response (2) by seven11


Note: the CHECKPOINT parameter is a deprecated feature and is now really only applicable to mainframe-based arcmain.

SSIS Error: The Teradata TPT registry key cannot be opened. - response (1) by larry_white


I am having the same issue with SQL Server 2016. Has there been a resolution?

TPT - Instances Vs Sessions - response (9) by ssavoye


Hello,
I have seen a few posts indicating that multiple instances can be used to read from a SINGLE file. I have some rather large files, but the log (below) shows the 2nd and 3rd instances are ignored because no data files are assigned to them. What might I be doing wrong? Thanks.
My file loader script looks like this:
DEFINE JOB File_Load
DESCRIPTION 'Load a Teradata table from a file'
(
  STEP MAIN_STEP
  (
    APPLY $INSERT TO OPERATOR ( $LOAD [@LoadInstances] )
    SELECT * FROM OPERATOR ( $FILE_READER [@ReaderInstances] );
  );
);
My jobvariable file has many variables; the last ones are:
,LoadInstances     = 3
,UpdateInstances   = 1
,ExportInstances   = 1
,StreamInstances   = 1
,InserterInstances = 1
,SelectorInstances = 1
,ReaderInstances   = 3
,WriterInstances   = 1
,OutmodInstances   = 1
--------------------------
 
TLOGVIEW indicates only a single instance is active because no files are assigned to the others:
...
$FILE_READER[1]: DataConnector Producer operator Instances: 3
$FILE_READER[3]: TPT19012 No files assigned to instance 3.  This instance will be inactive.
$FILE_READER[2]: TPT19012 No files assigned to instance 2.  This instance will be inactive.
...
$LOAD: Statistics for Target Table:  'ORDERS'
$LOAD: Total Rows Sent To RDBMS:      318671360
$LOAD: Total Rows Applied:            318671360
$LOAD: Total Rows in Error Table 1:   0
$LOAD: Total Rows in Error Table 2:   0
$LOAD: Total Duplicate Rows:          0
TPT_INFRA: TPT02255: Message Buffers Sent/Received = 476805, Total Rows Received = 0, Total Rows Sent = 0
TPT_INFRA: TPT02255: Message Buffers Sent/Received = 0, Total Rows Received = 0, Total Rows Sent = 0
TPT_INFRA: TPT02255: Message Buffers Sent/Received = 0, Total Rows Received = 0, Total Rows Sent = 0
TPT_INFRA: TPT02255: Message Buffers Sent/Received = 0, Total Rows Received = 0, Total Rows Sent = 0
TPT_INFRA: TPT02255: Message Buffers Sent/Received = 13, Total Rows Received = 0, Total Rows Sent = 0
TPT_INFRA: TPT02255: Message Buffers Sent/Received = 476792, Total Rows Received = 0, Total Rows Sent = 0
$LOAD: disconnecting sessions
$FILE_READER[1]: Total files processed: 1.

TPT - Instances Vs Sessions - response (10) by feinholz


In order to use multiple instances to read from a single file, you must set the MultipleReaders attribute to 'yes'.
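
In a hand-coded script (as opposed to the templates), that attribute goes into the operator definition itself; a rough sketch, with the schema name and file attributes assumed:

DEFINE OPERATOR FILE_READER
TYPE DATACONNECTOR PRODUCER
SCHEMA ORDERS_SCHEMA
ATTRIBUTES
(
  VARCHAR FileName        = 'orders.dat',
  VARCHAR Format          = 'Delimited',
  VARCHAR TextDelimiter   = '|',
  VARCHAR MultipleReaders = 'Yes'
);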

Teradata 14.10 History Window Gone - forum topic by tduqu


My history window no longer appears at login, and I can't select Show History from the View menu; the Show History item is greyed out. I can still access the SQL history in a database, but it is much easier to view and name my history within the application.


TPT - Instances Vs Sessions - response (11) by ssavoye


Thanks for responding, Steve. I set it in the jobvars file as shown below. Am I setting it in the wrong place?
,MultipleReaders = 'yes'
,LoadInstances     = 3
,...
,ReaderInstances   = 3
 
TLOGVIEW still says: 
$FILE_READER[1]: DataConnector Producer operator Instances: 3
$FILE_READER[1]: ECI operator ID: '$FILE_READER-33489'
$FILE_READER[2]: TPT19012 No files assigned to instance 2.  This instance will be inactive.
$FILE_READER[3]: TPT19012 No files assigned to instance 3.  This instance will be inactive.
$FILE_READER[1]: Operator instance 1 processing file .......

TPT - Instances Vs Sessions - response (12) by feinholz


Since you are using templates, you will need to use the template job variable name.
It could be FileReaderMultipleInstances.
To be sure, take a look at the $FILE_READER.txt template file.
 

TPT - Instances Vs Sessions - response (13) by ssavoye

$
0
0

Thanks Steve!
It looks like FileReaderMultipleReaders = 'Yes' works.
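
For later readers, the jobvars fragment that resolved the thread:

,FileReaderMultipleReaders = 'Yes'
,ReaderInstances           = 3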

Fatal Error with TD TPT API - response (1) by Fred


On current releases of Teradata, the number of sessions is limited via TASM/TIWM settings. You can't have more instances than sessions, and you can't have more sessions than workload management allows.
Try using one instance and 4 sessions; those are the recommended / default values.
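
Applied to the connection settings above, that would be roughly (TD_MIN_SESSIONS=1 is an assumption):

TD_TDP_ID=dwhpprd;TD_MAX_SESSIONS=4;TD_MIN_SESSIONS=1;TD_MAX_INSTANCES=1;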

Casting Date in Bteq utilities - response (4) by Fred


Not sure I understand your question at this point. Are you saying that you loaded into a staging table as character, and now you want a SQL expression that will CAST the character string to a TIMESTAMP, such as:
CAST(
  CASE WHEN SUBSTRING(DIARY_DATE FROM 12 FOR 1) = ':'
       THEN SUBSTRING(DIARY_DATE FROM 1 FOR 10) || '0' || SUBSTRING(DIARY_DATE FROM 11 FOR 7)
       ELSE DIARY_DATE
  END
  AS TIMESTAMP FORMAT 'ddmmmyyyyBhh:mi:ss')
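
A quick self-contained check (the sample literal is assumed):

SELECT CAST(
  CASE WHEN SUBSTRING(dd FROM 12 FOR 1) = ':'
       THEN SUBSTRING(dd FROM 1 FOR 10) || '0' || SUBSTRING(dd FROM 11 FOR 7)
       ELSE dd
  END
  AS TIMESTAMP FORMAT 'ddmmmyyyyBhh:mi:ss') AS ts
FROM (SELECT '01JAN2016 9:05:30' AS dd) AS t;

For '01JAN2016 9:05:30', position 12 holds ':', so the expression builds '01JAN2016 ' || '0' || '9:05:30' = '01JAN2016 09:05:30'; an 18-character value with a two-digit hour falls through the ELSE unchanged.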
