Channel: Teradata Forums - Tools
Viewing all 4252 articles

FASTEXPORT MLSCRIPT Getting junk values like p1 - forum topic by BalaChuppala


Hi,
While exporting data from a Teradata table using FastExport with the MLSCRIPT option (to generate the MultiLoad script as well), I am getting strings like p1<96> or p1 for the DATE/TIME/TIMESTAMP fields in the layout section:
.....
p1<96>.FIELD COLUMN1 * DATE;
p1<96>p1<96>.FIELD COLUMN2 * CHAR(1);
.......
Could someone help me understand why it generates output like this for DATE/TIME/TIMESTAMP fields alone? Does this string change (p1<96>, p1, p1<65>, ...) for different date/timestamp fields?
Thanks,
Bala Chuppala
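One way to see what those p1<96> strings actually contain is to hex-dump the generated script. A minimal sketch, where the script file name and the simulated line are hypothetical:

```shell
#!/bin/sh
# Reproduce one suspect line (octal \226 = hex 96, the byte the terminal
# renders as <96>) and dump it byte-by-byte with od.
printf 'p1\226.FIELD COLUMN1 * DATE;\n' > mlscript.ml
od -c mlscript.ml
# od -c prints non-printable bytes in octal (the \x96 byte shows as 226),
# so you can see exactly what FastExport wrote into the file.
```

Running `od -c` against the real generated script shows whether the script itself contains stray bytes or whether only the display is mangling the field names.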


named parameter - sql assistant 15.00.00.03 - forum topic by achikan01


If the “named parameter” you’re attempting to use starts with the letter “B” or “C”, it doesn’t work.
 
Here’s a clearer example:
 
Prompts correctly for parameter:
select *
from DBC.ALL_RI_CHILDREN
where ChildDB = '?A'
 
Fails to prompt for parameter:
select *
from DBC.ALL_RI_CHILDREN
where ChildDB = '?B'
Also, further testing revealed that of the 26 English alphabetic characters, only B, C, and Y cause the problem.
 
 
 


TPUMP Question - response (1) by vikas_yadav


You can't delete a flat file using TPump or any other Teradata load utility. You need to delete the flat file using rm or an equivalent UNIX command.
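A minimal sketch of the usual pattern, deleting the file from the wrapper shell script after the load step (the file path is hypothetical):

```shell
#!/bin/sh
# Stand-in for the flat file the TPump job consumed (hypothetical path).
DATAFILE=./tpump_input.dat
touch "$DATAFILE"

# ... the tpump job would run here, reading $DATAFILE ...

# The delete has to happen at the OS level; no Teradata load utility can do it.
rm -f "$DATAFILE"
```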

bteq utility - forum topic by prasadh_p


I have inserted n records from a job in BTEQ. I want to know how many records I inserted through BTEQ.


bteq utility - response (1) by ulrich


As long as you are not sharing your BTEQ file, it is difficult to suggest the best option...
Can't you simply query the table with a COUNT(*)?

bteq utility - response (2) by prasadh_p


Hi Ulrich, I don't want the table record count. Suppose I inserted 100 rows through BTEQ from a job; I want to see how many were inserted via a log table. Can you explain, please?

bteq utility - response (3) by BalaChuppala


You can use the built-in ACTIVITYCOUNT variable and combine it with UNIX functionality to direct the count to a log.
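For illustration, a sketch of the two pieces: branching on ACTIVITYCOUNT inside BTEQ, and scraping the row count out of the session log from UNIX. The log file name and exact message text are assumptions based on typical BTEQ output:

```shell
#!/bin/sh
# Inside the BTEQ script itself you can test the built-in variable, e.g.:
#   .IF ACTIVITYCOUNT = 0 THEN .QUIT 1;
# From the shell, the count can be pulled out of the BTEQ log afterwards.
LOG=bteq_job.log
printf '*** Insert completed. 100 rows added.\n' > "$LOG"   # simulated log line
ROWS=$(sed -n 's/.*Insert completed\. \([0-9][0-9]*\) rows added.*/\1/p' "$LOG")
echo "rows inserted: $ROWS"   # prints "rows inserted: 100"
```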

Errors connecting to SQL Server source using TPT - forum topic by Campdave


We are trying our first job with TPT and are having problems connecting to our source database.  We are receiving this error...
TPT17101: Fatal error received from ODBC driver:
STATE=IM002, CODE=0,
MSG='[DataDirect][ODBC lib] Data source name not found and no default driver specified'
I am including copies of the files in use as well as a copy of the output.
-----------------UNIX Shell
#!/bin/ksh
#set -x
LOGONDIR=/etl/ST/ABC/DEF/LOGON
. $LOGONDIR/DEF_DB.sh
echo SQL_ABC_ODBC_SRC_DB $SQL_ABC_ODBC_SRC_DB
ODBC_DSNName=$SQL_ABC_ODBC_SRC_DB
echo ODBC_DSNName $ODBC_DSNName
TERADATA_HOME=/usr/odbc
TD_TPT=/opt/teradata/client/14.10
TD_TPT_HOME=$TD_TPT/tbuild
TD_ICU_DATA=/opt/teradata/client/14.10/tdicu/lib
TWB_ROOT=/opt/teradata/client/14.10/tbuild
TD_HOME=$TERADATA_HOME
ODBCHOME=/opt/teradata/client/ODBC_32
export ODBCINST=/etl/ST/ABC/LOGON/odbcinst.ini
export ODBCINI=/etl/ST/ABC/LOGON/odbc.ini
export TD_TPT ODBCHOME ODBCINST ODBCINI TERADATA_HOME TD_TPT_HOME TD_ICU_DATA TWB_ROOT
PATH=$TD_TPT:$TERADATA_HOME/bin:$TD_TPT_HOME/bin:$TD_ICU_DATA:$TWB_ROOT/lib:/usr/bin:/etc:/usr/java5/bin:/usr/sbin:/usr/ucb:$HOME/bin:/usr/bin/X11:/sbin:$ODBCHOME/bin:$ODBCHOME/lib:$TERADATA_HOME/bin:/opt/quest/bin:$MY_HOME:.
LIBPATH=$TD_TPT:$TERADATA_HOME/lib:$TD_TPT_HOME/lib:$TD_ICU_DATA:$TWB_ROOT/lib:/usr/lib/lib:$ODBCHOME/bin:$ODBCHOME/lib:$TERADATA_HOME/lib:/usr/opt/$DB2HOME/lib64:.
export PATH LIBPATH

tbuild -f /HOME/ABC/drh7742/TPTScripts/DRH_GUIDE_ABC.tpt -j DRH_GUIDE_ABC
---------------------------TPT Script
/***************************************************************************************************/
DEFINE JOB ODBC_LOAD
(
DEFINE SCHEMA qa_result
  (
  uname VARCHAR(50));
  
DEFINE OPERATOR ABC_operator
   TYPE ODBC
   SCHEMA qa_result
   ATTRIBUTES
   (
     VARCHAR UserName = 'myuser', 
     VARCHAR UserPassword = 'mypassword',
     VARCHAR DSNName = '@ODBC_DSNName',
   VARCHAR SelectStmt  = 'Select uname FROM mydb.dbo.mytable where uid =601039843;',
   VARCHAR PrivateLogName = 'PrivateQuality_Audit_Result_log'
   ); 
DEFINE OPERATOR STREAM_operator
   TYPE STREAM
   SCHEMA *
   ATTRIBUTES
   (
  VARCHAR TdpId = 'my_td_db_id',
    VARCHAR UserName = 'my_td_User',
    VARCHAR UserPassword = 'my_td_Password',
    VARCHAR LogTable = 'my_td_db.Result_log',
  VARCHAR WorkingDatabase  = 'my_td_db',
    VARCHAR TargetTable = 'my_td_db.drhTPT'
   ); 
 STEP step1
(
  APPLY
   ('INSERT INTO my_td_db.drhTPT
   (columnone)
   VALUES ( :uname);')
    TO OPERATOR (STREAM_operator)
   SELECT uname FROM OPERATOR (ABC_operator);
      );
);
-------------------DEF_DB.sh
#  @@START EXPORTED_VARIABLES

export SQL_ABC_ODBC_SRC_DB='SQL_ABC'
export SQL_ABC_ODBC_SRC_USER='myuser'
export SQL_ABC_ODBC_SRC_PASSWORD='mypassword'

#  @@END EXPORTED_VARIABLES
#  End of Exported variables section

-------------------------------------------------------
--------------------------------ODBC ini
[ODBC]
IANAAppCodePage=4
InstallDir=/opt/Progress/DataDirect/Connect64_for_ODBC_61
Trace=Yes
TraceFile=/etl/admin/odbctrace.out
TraceDll=/opt/Progress/DataDirect/Connect64_for_ODBC_61/lib/ddtrc25.so
[ODBC Data Sources]
SQL_ABC=DataDirect 6.1 SQL Server Wire Protocol
[SQL_ABC]
Driver=/etl/usr/dmexpress/ThirdParty/DataDirect/lib/_Ssqls26.so
Description=SQL Server
Databasee=mydb
Address=myserver, 98765
LogonID=
Password=
QuoteId=NO
AnsiNPW=No
---------------------------------------------------
--------------Output
SQL_ABC_ODBC_SRC_DB SQL_ABC
ODBC_DSNName SQL_ABC
Teradata Parallel Transporter Version 14.10.00.02
Job log: /opt/teradata/client/14.10/tbuild/logs/DRH_GUIDE_ABC-297.out
Job id is DRH_GUIDE_ABC-297, running on n#####11
Found CheckPoint file: /opt/teradata/client/14.10/tbuild/checkpoint/DRH_GUIDE_ABCLVCP
This is a restart job; it restarts at step step1.
Teradata Parallel Transporter Stream Operator Version 14.10.00.02
STREAM_operator: private log not specified
Teradata Parallel Transporter ODBC Operator Version 14.10.00.02
ABC_operator: private log specified: PrivateQuality_Audit_Result_log
ABC_operator: connecting sessions
ABC_operator: TPT17122: Error: unable to connect to data source
ABC_operator: TPT17101: Fatal error received from ODBC driver:
              STATE=IM002, CODE=0,
              MSG='[DataDirect][ODBC lib] Data source name not found and no default driver specified'
ABC_operator: disconnecting sessions
ABC_operator: TPT17124: Error: unable to disconnect from data source
ABC_operator: TPT17101: Fatal error received from ODBC driver:
              STATE=08003, CODE=0,
              MSG='[DataDirect][ODBC lib] Connection not open'
ABC_operator: Total processor time used = '0.006269 Second(s)'
ABC_operator: Start : Sun Dec  7 15:33:34 2014
ABC_operator: End   : Sun Dec  7 15:33:34 2014
STREAM_operator: Start-up Rate: UNLIMITED statements per Minute
STREAM_operator: Operator Command ID for External Command Interface: STREAM_operator31064610
STREAM_operator: connecting sessions
STREAM_operator: disconnecting sessions
STREAM_operator: Total processor time used = '0.321904 Second(s)'
STREAM_operator: Start : Sun Dec  7 15:33:34 2014
STREAM_operator: End   : Sun Dec  7 15:33:40 2014
Job step step1 terminated (status 12)
Job DRH_GUIDE_ABC terminated (status 12)
Job start: Sun Dec  7 15:33:30 2014
Job end:   Sun Dec  7 15:33:40 2014


SQL Assistant install error - forum topic by Greyghost


Hello,
I have downloaded the TTU 15 for Windows zip file from the downloads page. After unzipping, all the components installed properly EXCEPT SQL Assistant. I downloaded the .zip file a couple more times, thinking something was corrupt in the original download. Unfortunately, SQL Assistant would not install from any of the downloads.
Has anyone run into this issue or have any suggestions on how to resolve this download issue?
Thanks!
Paul


Avoid User to work with Bteq, FastLoad, FastExport - forum topic by marcodom


Hello,
we have Teradata 13.10, and we would like to prevent some users from launching utilities like FastLoad, FastExport, and BTEQ.
Do you know if this is possible?
 
thanks 
Marco


How to skip BOM (Byte Order Mark )in BTeq scripts - response (3) by Fred


Either -c UTF16 or -c UTF8, as appropriate.
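To check which one applies, you can inspect the script's first bytes before invoking BTEQ. A sketch, where the script name is hypothetical and the bteq invocation is shown but not run:

```shell
#!/bin/sh
# A UTF-8 BOM is the three bytes EF BB BF at the start of the file.
SCRIPT=myjob.bteq
printf '\357\273\277.LOGOFF;\n' > "$SCRIPT"   # simulated script with a BOM
BOM=$(head -c 3 "$SCRIPT" | od -An -tx1 | tr -d ' \n')
if [ "$BOM" = "efbbbf" ]; then
  echo "UTF-8 BOM found; invoke as: bteq -c UTF8 < $SCRIPT"
fi
```

A UTF-16 file would instead start with FE FF or FF FE, in which case `bteq -c UTF16` is the appropriate form.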

Attempted to read or write protected memory error - forum topic by rjperry611


Hello,
I am pretty new to Teradata, so I was hoping somebody could give me some insight on a problem I am having. Using Teradata SQL Assistant, I am trying to load a text file (477MB) from my computer into a database on a server. The file is very large, and the load always fails with a memory error when it has read in about 1.0-1.1 million rows. It does not make sense to me, because I have 8GB of RAM, and at the point the error is thrown only about 4GB of total RAM is used, with Teradata using about 1.7GB of it.
The code used is:

INSERT INTO DB_NAME.TABLE_NAME VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,DB_NAME.TIME_STAMP())

The error window is titled "Unable to open History database" and the error says "Attempted to read or write protected memory. This is often an indication that other memory is corrupt..."
The log says: 

12/10/2014 9:05:23 AM
SQLA Version: 13.10.0.2
System.AccessViolationException
Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
   at Microsoft.Office.Interop.Access.Dao.DBEngineClass.OpenDatabase(String Name, Object Options, Object ReadOnly, Object Connect)
   at Teradata.SQLA.HistoryTbl.OpenDatabase() in F:\ttu1310_efix_snap\tdcli\qman\sqla\HistoryTbl.vb:line 508

 


Avoid User to work with Bteq, FastLoad, FastExport - response (1) by gryback


I believe you could consider a TASM / Workload Management filter on these utilities.

Fast Export Data File Issue - forum topic by KamleshKhollam


Hi,
I am trying to export the column below using FastExport.
 
SELECT
CAST(COL1 as VARCHAR(4000))
from
TABLE;
 
When COL1's actual size exceeds 2559 characters, a new record containing only one alphabetic character is added after that row.
Also, I observed one strange behaviour: usually when we export VARCHAR data using FEXP, each value is preceded by control characters which are nothing but the VARCHAR data length (2 bytes). But in the above case, when the VARCHAR record length exceeds 2559, that record comes with no control characters, and one new record with a single character is added after it in the file.
 


Fast Export Data File Issue - response (1) by feinholz


Please provide more detailed information.
Please provide the output of the job itself (all of the commands and information), and then provide the actual data from the data file (even if you have to provide it in hex mode; I want to see every byte from the resulting data file).
 


Trimming spaces from fields - response (1) by abhijit.p


Hello.
I feel that simply using "Select Trim(Column) From Table" will give the desired result.
This will remove leading and trailing blank spaces from the applied column.
Thanks

Different Setting of MLoadDiscardDupRowUPI on Teradata 14.10 and Teradata 12 - forum topic by skywalker69692


Hi all,
we have just migrated from Teradata 12 to Teradata 14.10, and we have a problem with MultiLoad and duplicate rows: on the old version (TD12) we skipped these rows due to the setting MLoadDiscardDupRowUPI = TRUE.
On the new system (TD14.10), MLoadDiscardDupRowUPI = FALSE. In the Teradata 14.10 manual it is written:

  • TERADATA 14.10: MLoadDiscardDupRowUPI = FALSE: "MultiLoad logs rows with UPI violations to the application error table."

In the TERADATA 12 manual is written:
MLoadDiscardDupRowUPI = TRUE: "MultiLoad logs rows with UPI violations to the application error table."
so it seems we should have the same behavior on the two systems, but on Teradata 14.10 we are seeing rows discarded to the ERR2 table due to UPI violations, which we did not have on Teradata 12. We have tested with the same data on the two systems.
Could you please help me understand this strange behavior?
 
Thanks
 


confidence levels wrt date of stats - forum topic by Rahul A


Will the optimizer check the date on which the stats were collected before suggesting the confidence levels?


Teradata tpt - forum topic by akd2k6


Hi All, I am new to this forum and to Teradata.
I am trying to create a UNIX script that will sync a table from one system to another for COB sync using the TPT utility.
I need the script to unload from the source table and load to the target table with TPT, if possible unloading the file in delimited format. I also need to generate the DML of the unload file and save it for future use.
Can you please tell me whether there is anything in TPT like the .ml script that FastExport creates automatically; if not, how can I achieve that?
Also, I want the TPT load to be an upsert, which tdload is not doing. I tried tdload, but it only does inserts.

 


Teradata tpt - response (2) by feinholz


The TPT documentation has sample scripts to help you with this.
When you install TPT, there is also a samples directory with sample scripts for all of the popular scenarios (like the one you seek).
TPT does not have a feature like FastExport's (to create an ML script).
As for tdload, right now it is only used for inserts.
(The original concept behind the Easy Loader feature is to quickly and easily perform inserts from a flat file into a table without the need for a script.)


