Channel: Teradata Forums - Tools

tdload having issues loading data where the delimiter is part of a field - response (5) by manharrishi

I am trying to use just tdload, which does not require any scripting. Can the v14 tdload handle data where the delimiter appears inside a field? If a hand-written TPT script is the only option, I will have to start working on one.
tdload will in turn generate a TPT script, I believe; can't we provide some option for tdload to treat the data as quoted?


tdload having issues loading data where the delimiter is part of a field - response (6) by feinholz

thomspsonhab: TPT does not yet support embedded CR/LF.

manharrishi: tdload (aka EasyLoader) does not yet support quoted data (but it should, so I will make sure we get an efix for this).
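For anyone who does end up hand-writing a TPT script for this case, below is a minimal sketch of the DataConnector producer attributes typically used for quoted delimited input. It assumes TPT 14.0 or later, where the QuotedData, OpenQuoteMark, and CloseQuoteMark attributes are available; the schema name, file name, delimiter, and quote characters are placeholders.

DEFINE OPERATOR FILE_READER
TYPE DATACONNECTOR PRODUCER
SCHEMA INPUT_SCHEMA                          /* placeholder schema          */
ATTRIBUTES
(
    VARCHAR FileName       = 'input.txt',    /* placeholder file name       */
    VARCHAR Format         = 'Delimited',
    VARCHAR TextDelimiter  = ',',            /* delimiter that also appears */
                                             /* inside quoted fields        */
    VARCHAR QuotedData     = 'Optional',     /* honor quoting when present  */
    VARCHAR OpenQuoteMark  = '"',
    VARCHAR CloseQuoteMark = '"',
    VARCHAR OpenMode       = 'Read'
);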

Fast export and mload in vartext mode - Data item too large for field issue - response (6) by feinholz

You should consider switching to TPT.
In 14.10, TPT supports writing out data (retrieved from Teradata or any other ODBC-compliant data source) in delimited format without the need to specially code the SELECT statement.
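To illustrate that combination, below is a minimal sketch of a TPT export job that pairs the Export operator with a DataConnector consumer writing a delimited file. The job name, schema, table, credentials (job variables), and delimiter are placeholders; per the statement above, in 14.10 the DataConnector is expected to convert the non-VARCHAR columns to delimited text itself, whereas earlier releases required casting every column to VARCHAR in the SELECT.

DEFINE JOB EXPORT_TO_DELIMITED
DESCRIPTION 'Export a table to a delimited flat file'
(
    DEFINE SCHEMA SRC_SCHEMA                 /* placeholder schema/columns */
    (
        order_id   INTEGER,
        order_date DATE,
        amount     DECIMAL(12,2)
    );

    DEFINE OPERATOR EXPORT_OP
    TYPE EXPORT
    SCHEMA SRC_SCHEMA
    ATTRIBUTES
    (
        VARCHAR TdpId        = @jobvar_tdpid,
        VARCHAR UserName     = @jobvar_user,
        VARCHAR UserPassword = @jobvar_pwd,
        VARCHAR SelectStmt   = 'SELECT order_id, order_date, amount FROM mydb.orders;'
    );

    DEFINE OPERATOR FILE_WRITER
    TYPE DATACONNECTOR CONSUMER
    SCHEMA SRC_SCHEMA
    ATTRIBUTES
    (
        VARCHAR FileName      = 'orders_out.txt',   /* placeholder output file */
        VARCHAR Format        = 'Delimited',
        VARCHAR TextDelimiter = '|',
        VARCHAR OpenMode      = 'Write'
    );

    APPLY TO OPERATOR (FILE_WRITER)
    SELECT * FROM OPERATOR (EXPORT_OP);
);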

Error in FastLoad Script - forum topic by Jugalkishorebhatt1

Hi All,
 
Below is the script for the FastLoad job:
.LOGON home/jbhatt,jugal;
BEGIN LOADING DB.tables1
ERRORFILES ERR1,ERR2;
DEFINE col1 (VARCHAR(50)),
        col2 (VARCHAR(50)) 
FILE= /home/jugal/finsert.txt;
insert into DB.tables1 values(:col1,:col2);
END LOADING;
LOGOFF;
When I ran the script, I got the below error in the logs:
  ===================================================================
     =                                                                 =
     =          FASTLOAD UTILITY     VERSION 14.00.00.07               =
     =          PLATFORM LINUX                                         =
     =                                                                 =
     ===================================================================
     ===================================================================
     =                                                                 =
     =          Copyright 1984-2012, Teradata Corporation.             =
     =          ALL RIGHTS RESERVED.                                   =
     =                                                                 =
     ===================================================================
**** 10:25:58 Processing starting at: Fri Feb 28 10:25:58 2014
     ===================================================================
     =                                                                 =
     =          Logon/Connection                                       =
     =                                                                 =
     ===================================================================
0001 .LOGON home/Jbhatt,
**** 10:25:59 Teradata Database Release: 14.00.05.02
**** 10:25:59 Teradata Database Version: 14.00.05.03
**** 10:25:59 Number of AMPs available: 96
**** 10:25:59 Current CLI or RDBMS allows maximum row size: 64K
**** 10:25:59 Character set for this job: ASCII

0002 BEGIN LOADING DB.tables1
     ERRORFILES ERR1,ERR2;
**** 10:25:59 Session count 16 returned by the DBS overrides
              user-requested session count
**** 10:26:03 Number of FastLoad sessions connected = 16
**** 10:26:03 FDL4808 LOGON successful
**** 10:26:03 FastLoad is continuing a multifile job
**** 10:26:03 Number of AMPs available: 96
**** 10:26:03 BEGIN LOADING COMPLETE
0003 DEFINE col1 (VARCHAR(50)),
             col1 (VARCHAR(50))
     FILE= /home/jugal/finsert.txt;
**** 10:26:03 FDL4803 DEFINE statement processed
     ===================================================================
     =                                                                 =
     =          Insert Phase                                           =
     =                                                                 =
     ===================================================================
0004 insert into DB.tables1 values(:col1,:col2);
**** 10:26:03 I/O Error on File Checkpoint: 42, Text: Unable to obtain
              data signature Unexpected data format !ERROR! EOF
              encountered before expected EOR
     ===================================================================
     =                                                                 =
     =          Logoff/Disconnect                                      =
     =                                                                 =
     ===================================================================
**** 10:26:03 Logging off all sessions
**** 10:26:05 Total processor time used = '1.86 Seconds'
     .        Start : Fri Feb 28 10:25:58 2014
     .        End   : Fri Feb 28 10:26:05 2014
     .        Highest return code encountered = '12'.
**** 10:26:05 FastLoad Paused
I think I am not inserting the data from the text file properly.
Please help me with the error.
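For what it is worth, that particular error ("EOF encountered before expected EOR") typically appears when the input is a plain delimited text file but the script never tells FastLoad so, leaving it to expect its default record format. A minimal sketch of the same script with a record-format declaration added is below; the delimiter is a placeholder and must match whatever separator finsert.txt actually uses, and with VARTEXT every DEFINEd field must be VARCHAR (which they already are here).

.LOGON home/jbhatt,jugal;
/* Declare the input as delimited text; "|" is a placeholder delimiter */
SET RECORD VARTEXT "|";
BEGIN LOADING DB.tables1
ERRORFILES ERR1,ERR2;
DEFINE col1 (VARCHAR(50)),
       col2 (VARCHAR(50))
FILE = /home/jugal/finsert.txt;
INSERT INTO DB.tables1 VALUES (:col1, :col2);
END LOADING;
LOGOFF;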


Issue with SQL assistant - forum topic by renu.l.b

I have installed the TD 13 demo version on my machine. When I run Teradata Service Control, I get the expected status message "Teradata is running". When I connect to the database using BTEQ, I can connect successfully and run queries. However, when I try to run SQL Assistant, the connection times out and the PDE state changes to RESET/PDEDUMP. If I try to log in again, I get the error message "Teradata server can't be reached over the network". Could you please suggest a solution for this? Thank you in advance.


Issue with SQL assistant - response (1) by wmmiteff

Try adjusting the connection timeout parameter on your connection, and try setting a value for the Data Source DNS Entries property. For ODBC these are found under Options, then Advanced, on the DSN. For Teradata.Net connections they are found on the Advanced tab.
For the connection timeout I have found that 60 works well, and for Data Source DNS Entries I would try 1 or 2.

Teradata Parallel Transporter - Session Character Set - response (6) by david.craig

Which Unicode characters are you referring to? Use U+ code point notation, for example:
U+000A (<control>)
U+00BC (VULGAR FRACTION ONE QUARTER)
These characters should load from UTF8 into either the Unicode or the Latin server character set. Note that Unicode supports various Latin scripts: Basic Latin (ASCII), Latin-1, Latin Extended (see http://www.unicode.org/charts/).
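As a small illustration of that combination (table and column names are placeholders): the target column is declared with the Unicode server character set, and the client job requests a UTF8 session character set.

/* Server side: store the column as Unicode (placeholder table/column names) */
CREATE TABLE mydb.mytable
(
    id        INTEGER,
    free_text VARCHAR(1000) CHARACTER SET UNICODE
)
PRIMARY INDEX (id);

/* Client side: a TPT script can request the UTF8 session character set by
   placing USING CHARACTER SET UTF8 immediately before its DEFINE JOB clause. */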
 

Teradata Parallel Transporter - Session Character Set - response (7) by Santanu84

Hi David,
Thanks for the reply.

Well, there is the problem. I too thought that defining the column as Unicode and using UTF8 as the session charset would work.
But we are supposed to load the data through the ETL tool Informatica, where we are using the UTF8 code page.

The problems we are facing:
1. The source is Oracle.
2. In the same source table we have 2 types of columns:
   a) Columns with extended ASCII characters (U+000A <control> and U+00BC (VULGAR FRACTION ONE QUARTER) within a large string).
   b) Columns with Asian characters, such as Chinese.
3. When we use the MS-Latin code page and load into a Unicode column, the ASCII characters get loaded, but the Asian characters become garbled.
4. On the other hand, when the UTF8 code page is used, the Asian characters get loaded, but those 8-bit ASCII characters get rejected with error code 6705.

I am looking for a solution which will load both in a single shot.
I tried changing the 8-bit ASCII characters to hexadecimal using the CHAR2HEXINT function, but then how do I convert those hex values back to characters with a function (not by using the 'XC' format)?

I hope I have explained the actual problem. Any solution or guidance is appreciated.

Thanking You
Santanu
 


Teradata Parallel Transporter - Session Character Set - response (8) by david.craig

All columns in the source table need to use the same character set. In this case, a Unicode character set is the only one that supports both Chinese and Latin, so UTF8 is a good choice. There should be a way in Informatica to convert Latin to UTF8 before the load.

Teradata Parallel Transporter - Session Character Set - response (9) by arun_tim1

Hi,

I would like to share an issue I am currently facing with a Teradata Parallel Transporter script on z/OS. I have a TPT job that loads file data into a table, but the file volume is high.
I am using the DATA CONNECTOR AS PRODUCER to read the file. I specified a producer instance count of 2, but it is using only 1 instance to read the file,
e.g. Instance 1 Reading file 'DD:PTYIN'. I have used the attribute MultipleReaders='Y', but the TPT job gets disconnected during the acquisition phase.
Kindly help on this!

Thanks in advance.
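For reference, a minimal sketch of how multiple reader instances are usually requested is below; the same syntax appears in the full script posted later in this thread. MultipleReaders goes in the producer's attribute list, while the instance counts go on the operator references in the APPLY statement. The schema, DML, and operator names are placeholders.

DEFINE OPERATOR FILE_READER
TYPE DATACONNECTOR PRODUCER
SCHEMA INPUT_SCHEMA                       /* placeholder schema                   */
ATTRIBUTES
(
    VARCHAR FileName        = 'DD:PTYIN',
    VARCHAR Format          = 'Delimited',
    VARCHAR TextDelimiter   = '¬',
    VARCHAR OpenMode        = 'Read',
    VARCHAR MultipleReaders = 'Y'         /* let several instances share one file */
);

APPLY
    'INSERT INTO mydb.mytable VALUES (:col1, :col2);'   /* placeholder DML        */
TO OPERATOR (UPDATE_OPERATOR[3])
SELECT * FROM OPERATOR (FILE_READER[2]);         /* [2] = two reader instances    */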

Teradata Parallel Transporter - Session Character Set - response (10) by ratchetandclank

@Santanu84, try using the attributes ValidUTF8 and ReplaceUTF8Char. Refer to the documentation to see if they help.

Teradata Parallel Transporter - Session Character Set - response (11) by ratchetandclank

@arun_tim1, the information provided is not enough to dig into the issue you are facing. Please provide the operators' logs so we can see where the problem might lie.

Also, I think you should open a new thread for this question.

TPT 14.10 output to named pipe and then gzip to final files - response (21) by asilby

Hi, I am getting this same issue in the Express version. We are currently evaluating Teradata. Is there some way I can update TPT, as I'm using the VMware download?
Regards, 
Andy

TPT 14.10 output to named pipe and then gzip to final files - response (22) by dnoeth

Hi Andy,
there was a new version of TD Express a few days ago, 14.10.01.01; you might check whether it includes TPT 14.10.00.03.
Dieter

TPT 14.10 output to named pipe and then gzip to final files - response (23) by asilby

Hi Dieter, thank you. The version listed on the download page for VMware seems to be the one we downloaded a few weeks ago. Do you know when a new VMware version might be released?
Andy
 


TPT 14.10 output to named pipe and then gzip to final files - response (24) by dnoeth

It can't be "a few weeks ago"; don't use the one at the top, scroll down :-)

Version: 14.10.01.01

Released: 28 Feb 2014

Teradata Parallel Transporter - Session Character Set - response (12) by arun_tim1

Hi,
I am using the syntax below for the DATA CONNECTOR AS PRODUCER operator in my TPT job.
ATTRIBUTES                                           
 (                                                    
      VARCHAR FILENAME='DD:PTYIN',                    
      VARCHAR MULTIPLEREADERS='Y',                    
      VARCHAR FORMAT='DELIMITED',                     
      VARCHAR OPENMODE='READ',                        
      VARCHAR TEXTDELIMITER ='¬',                     
      VARCHAR PRIVATELOGNAME = 'data_logfile',
      VARCHAR INDICATORMODE='N'                       
) ;    
INSERT INTO TABLENAME
(
)
TO OPERATOR (UPDATE_OPERATOR[3])
SELECT * FROM OPERATOR(FILE_READER[2]);
If I specify 2 instances for the producer operator, I get the following abend in the mainframe job.
FILE_READER: TPT19008 DataConnector Producer operator Instances: 2
FILE_READER: TPT19003 ECI operator ID: FILE_READER-197561         
FILE_READER: TPT19221 Total files processed: 0.                   
UPDATE_OPERATOR: connecting sessions                              
UPDATE_OPERATOR: preparing target table(s)                        
UPDATE_OPERATOR: entering DML Phase                               
UPDATE_OPERATOR: entering Acquisition Phase                       
UPDATE_OPERATOR: disconnecting sessions                           
UPDATE_OPERATOR: Total processor time used = '0.034438 Second(s)' 
UPDATE_OPERATOR: Start : Mon Mar  3 07:05:29 2014                 
UPDATE_OPERATOR: End   : Mon Mar  3 07:05:32 2014                 
Job step LOAD_TABLES terminated (status 8)   
ABEND NAME :
07.05.42 JOB37991 $HASP165 ZKLD76AE ENDED AT IMF9S - ABENDED S000 U3000 CN(INTERNAL)
IN SPOOL JESMSG: I got the following abend details from the JCL log.
IDI0044I Current fault is a duplicate of fault ID F08604 in history file SYS3B.FLTANLZR.HIST2 - the duplicate count is 18
IDI0053I Fault history file entry suppressed due to: Duplicate fault or End Processing user exit.
Kindly help on this!
 
                     

Teradata Parallel Transporter - Session Character Set - response (13) by Santanu84

Hi ratchetandclank,
Are these attributes present somewhere in Informatica? I am asking because I could not find such attributes in Teradata. Can you suggest any reference document for this?
Please let me know.
 
Thanking You
Santanu

Data mover 14.xx pool of users - forum topic by sanushks

Hi,

Our Data Mover (DM) jobs use a pool of users for Data Mover arcmain and tptapi. Assume a scenario where DM has launched multiple tptapi sessions with the username "john" (available from the user pool). Now another DM job runs in parallel and launches an arcmain; is there a possibility of that DM job's arcmain picking up "john", or will it pick up another available user from the pool?
Based on the documentation, DM jobs pick up unique user IDs when several DM arcmain jobs run in parallel. I would like to confirm whether the same mechanism applies when DM tptapi and DM arcmain jobs run in parallel.

Thanks in advance!
 
 


Fast Export parallelism - forum topic by Piotr_Skrzypiec

Hi,
I am successfully exporting with the FastExport tool. However, I am not sure about the parallelism of the export.
From the FastExport logs I have:
**** 13:49:28 UTY8715 FastExport is submitting the following request:
     CHECK WORKLOAD END;
**** 13:49:28 UTY0844 Session count 8 returned by the DBS overrides
     user-requested session count.
**** 13:49:36 UTY8705 EXPORT session(s) requested: 8.
**** 13:49:36 UTY8706 EXPORT session(s) connected: 8.
**** 13:49:36 UTY8715 FastExport is submitting the following request:
     BT;BEGIN FASTEXPORT;
 
But when I view the job in Viewpoint, I see 8 connections: 7 are idle, and only 1 is in the transferring state <->.
I would rather expect all 8 connections to be transferring <->, but as I am not a Teradata expert, I would like an answer from advanced users.
 
Any explanation?
 
Regards,
Piotr
