Channel: Teradata Forums - Tools
Viewing all 4252 articles

Help Needed on bteq export/import option - response (4) by baji

Hi, I am new to Teradata. When I import my file into a table I get the following error. Please help me.
.sessions 5;
.logon 127.0.0.1/dbc,dbc;
.import data file=D:\input.dat;
.repeat*;
using custid  (integer),
income        (integer),
age           (smallint),
yearswithbank (smallint),
nbrchildren   (smallint),
gender        (char(1))
insert into cust
(
cust_id  ,
income  ,
age     ,      
yearswithbank,
nchildren,
gender)values
(:custid,
:income,
:age,
:yearswithbank,
:nbrchildren,
:gender     );
.logoff;
.quit;
 *** Error: Use IMPORT to open a file first before
           trying to read from it.
 *** Failure 3593 No DATA parcel sent and request uses a USING clause.
                Statement# 1, Info =0
 *** Total elapsed time was 1 second.
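For reference, Failure 3593 appears when the USING request runs without a successfully opened import file — here the preceding message says the .IMPORT never opened the file. A minimal sketch of a working structure (the delimiter and field sizes are assumptions; if input.dat really is a binary FastLoad-format file, keep DATA and the original INTEGER/SMALLINT types):

```
.SESSIONS 5;
.LOGON 127.0.0.1/dbc,dbc;

/* DATA expects a binary (FastLoad-format) file; for a delimited text
   file use VARTEXT and make every USING field VARCHAR instead.       */
.IMPORT VARTEXT ',' FILE = D:\input.dat
.REPEAT *
USING custid        (VARCHAR(11)),
      income        (VARCHAR(11)),
      age           (VARCHAR(6)),
      yearswithbank (VARCHAR(6)),
      nbrchildren   (VARCHAR(6)),
      gender        (VARCHAR(1))
INSERT INTO cust (cust_id, income, age, yearswithbank, nchildren, gender)
VALUES (:custid, :income, :age, :yearswithbank, :nbrchildren, :gender);

.LOGOFF;
.QUIT;
```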


cannot connect to teradata database by using teradata studio express 14.1 on Mac os 10.8 - response (1) by swatisvl

com.teradata.jdbc.jdbc_4.util.JDBCException: [Teradata JDBC Driver] [TeraJDBC 14.00.00.21] [Error 1000] [SQLState 08S01] Login failure for Connection to testServer Wed Mar 13 02:05:34 PDT 2013 socket orig=testServer cid=1bc6cc7 sess=0
    at com.teradata.jdbc.jdbc_4.util.ErrorFactory.makeDriverJDBCException(ErrorFactory.java:93)
    at com.teradata.jdbc.jdbc_4.util.ErrorFactory.makeDriverJDBCException(ErrorFactory.java:68)
    at com.teradata.jdbc.jdbc_4.util.ErrorFactory.makeIoJDBCException(ErrorFactory.java:206)
    at com.teradata.jdbc.jdbc_4.util.ErrorAnalyzer.analyzeIoError(ErrorAnalyzer.java:61)
    at com.teradata.jdbc.jdbc_4.io.TDNetworkIOIF.createSocketConnection(TDNetworkIOIF.java:138)
    at com.teradata.jdbc.jdbc_4.io.TDNetworkIOIF.<init>(TDNetworkIOIF.java:117)
    at com.teradata.jdbc.jdbc_4.TDSession.getIO(TDSession.java:585)
    at com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:95)
    at com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:214)
    at com.teradata.jdbc.jdk6.JDK6_SQL_Connection.<init>(JDK6_SQL_Connection.java:34)
    at com.teradata.jdbc.jdk6.JDK6ConnectionFactory.constructConnection(JDK6ConnectionFactory.java:22)
    at com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:130)
    at com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:120)
    at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:232)
    at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:158)
    at com.teradata.datatools.dtp.connectivity.db.teradata.TeradataJDBCConnection.makeConnection(TeradataJDBCConnection.java:293)
    at com.teradata.datatools.dtp.connectivity.db.teradata.TeradataJDBCConnection.createConnection(TeradataJDBCConnection.java:121)
    at org.eclipse.datatools.connectivity.DriverConnectionBase.internalCreateConnection(DriverConnectionBase.java:105)
    at org.eclipse.datatools.connectivity.DriverConnectionBase.open(DriverConnectionBase.java:54)
    at org.eclipse.datatools.connectivity.drivers.jdbc.JDBCConnection.open(JDBCConnection.java:73)
    at com.teradata.datatools.dtp.connectivity.db.teradata.TeradataPingFactory.createConnection(TeradataPingFactory.java:36)
    at org.eclipse.datatools.connectivity.internal.ConnectionFactoryProvider.createConnection(ConnectionFactoryProvider.java:83)
    at org.eclipse.datatools.connectivity.internal.ConnectionProfile.createConnection(ConnectionProfile.java:359)
    at org.eclipse.datatools.connectivity.ui.PingJob.createTestConnection(PingJob.java:76)
    at org.eclipse.datatools.connectivity.ui.PingJob.run(PingJob.java:59)
    at org.eclipse.core.internal.jobs.Worker.run(Worker.java:54)
Caused by: java.net.UnknownHostException: testServer
    at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
    at java.net.InetAddress$1.lookupAllHostAddr(Unknown Source)
    at java.net.InetAddress.getAddressesFromNameService(Unknown Source)
    at java.net.InetAddress.getAllByName0(Unknown Source)
    at java.net.InetAddress.getAllByName(Unknown Source)
    at java.net.InetAddress.getAllByName(Unknown Source)
    at com.teradata.jdbc.jdbc_4.io.TDNetworkIOIF$Lookup.<init>(TDNetworkIOIF.java:183)
    at com.teradata.jdbc.jdbc_4.io.TDNetworkIOIF.connectToHost(TDNetworkIOIF.java:282)
    at com.teradata.jdbc.jdbc_4.io.TDNetworkIOIF.createSocketConnection(TDNetworkIOIF.java:131)
    ... 21 more

Same issue with credentials like:

username: testUser
server name: testServer
pwd: dbc

TD 13 Query Scheduler Startup Error on Windows 7 64-bit - response (7) by msitekkie

I have the same problem. I discovered that I get the error when running as an ordinary user but not as an Admin, so it seems to be permissions-related. Unfortunately, Process Monitor didn't point a smoking gun at exactly where the problem lay.
I worked around the problem by installing without the Scheduler components, as they weren't actually required, so I'm sorry I can't give a definitive fix, but the above may help.
Incidentally, I noticed that the strange problem where files install to C:\Program Files (x86)\(x86) went away when I installed without the Scheduler components.

TPT 14 WorkingDatabase and TargetTable getting crossed - forum topic by rmattson

I'm on Day 2 of learning TPT, so far so good, except TPT seems to be crossing the WorkingDatabase and TargetTable variables.
My setup:
1. Local Job Variables file:
TdpId = 'xxxxx',
UserName = 'xxxxx',
UserPassword = 'xxxxx',
TargetTable = 'BUB_LOAD',
WorkingDatabase = 'SND_PMRA',
DirectoryPath = '/sdlopr/teradata',
TextDelimiter = '|',
Format = 'Delimited',
OpenMode = 'Read'
 
2. Job Script:
DEFINE JOB load_sample
DESCRIPTION 'This is a sample load job'
(
        DEFINE SCHEMA bub_load
        (
                BUB_COL VARCHAR(4),
                BUB_COL1 VARCHAR(4)
        );
        DEFINE OPERATOR get_file
        DESCRIPTION 'This is the producer operator'
        TYPE DATACONNECTOR PRODUCER
        SCHEMA bub_load
        ATTRIBUTES
        (
                PrivateLogName = 'load_sample.log',
                DirectoryPath  = @DirectoryPath,
                FileName       = 'bub.txt',
                Format         = @Format,
                OpenMode       = @OpenMode,
                TextDelimiter  = @TextDelimiter
        );
        DEFINE OPERATOR load_file
        DESCRIPTION 'This is the consumer operator'
        TYPE LOAD
        /* Can use * if the input and output schema are the same */
        SCHEMA *
        ATTRIBUTES
        (
                 /* Only need to specify VARCHAR or INTEGER if not assigning a value */
                TdpId                   = @TdpId,
                UserName                = @UserName,
                UserPassword            = @UserPassword,
                TargetTable             = @TargetTable,
                WorkingDatabase         = @WorkingDatabase,
                LogTable                = @TargetTable || '.LG_Trans'
        );
   APPLY
        (
                'INSERT INTO ' || @TargetTable || '(BUB_COL, BUB_COL1) VALUES (:BUB_COL, :BUB_COL1);'
        )
        TO OPERATOR (load_file)
        SELECT * FROM OPERATOR (get_file);
);
3. Command: tbuild -f load_sample.txt -v local.jobvars -j bub1
 
When I run my command I get the following:
Teradata Parallel Transporter Version 14.00.00.08
Job log: /opt/teradata/client/14.00/tbuild/logs/bub1-150.out
Job id is bub1-150, running on sdlompa2
Found CheckPoint file: /opt/teradata/client/14.00/tbuild/checkpoint/bub1LVCP
This is a restart job; it restarts at step MAIN_STEP.
Teradata Parallel Transporter Load Operator Version 14.00.00.08
Teradata Parallel Transporter get_file: TPT19006 Version 14.00.00.08
load_file: private log not specified
get_file Instance 1 directing private log report to 'load_sample.log-1'.
get_file: TPT19008 DataConnector Producer operator Instances: 1
get_file: TPT19003 ECI operator ID: get_file-21349
get_file: TPT19222 Operator instance 1 processing file '/sdlopr/teradata/bub.txt'.
load_file: connecting sessions
load_file: TPT10508: RDBMS error 3802: Database ''BUB_LOAD'' does not exist.
load_file: disconnecting sessions
load_file: Total processor time used = '0.05 Second(s)'
load_file: Start : Fri Mar 15 16:16:18 2013
load_file: End   : Fri Mar 15 16:16:22 2013
get_file: TPT19221 Total files processed: 0.
Job step MAIN_STEP terminated (status 12)
Job bub1 terminated (status 12)
 
As you can see, BUB_LOAD is my TargetTable and not my WorkingDatabase. To verify that it's treating my TargetTable as my WorkingDatabase, I changed the value to:
TargetTable = 'WHOAMI'
 
With the output being:
Teradata Parallel Transporter Version 14.00.00.08
Job log: /opt/teradata/client/14.00/tbuild/logs/bub1-151.out
Job id is bub1-151, running on sdlompa2
Found CheckPoint file: /opt/teradata/client/14.00/tbuild/checkpoint/bub1LVCP
This is a restart job; it restarts at step MAIN_STEP.
Teradata Parallel Transporter Load Operator Version 14.00.00.08
Teradata Parallel Transporter get_file: TPT19006 Version 14.00.00.08
load_file: private log not specified
get_file Instance 1 directing private log report to 'load_sample.log-1'.
get_file: TPT19008 DataConnector Producer operator Instances: 1
get_file: TPT19003 ECI operator ID: get_file-21480
get_file: TPT19222 Operator instance 1 processing file '/sdlopr/teradata/bub.txt'.
load_file: connecting sessions
load_file: TPT10508: RDBMS error 3802: Database ''WHOAMI'' does not exist.
load_file: disconnecting sessions
load_file: Total processor time used = '0.05 Second(s)'
load_file: Start : Fri Mar 15 16:17:55 2013
load_file: End   : Fri Mar 15 16:17:58 2013
get_file: TPT19221 Total files processed: 0.
Job step MAIN_STEP terminated (status 12)
Job bub1 terminated (status 12)
 
I'm sure I'm missing something here, as I can't imagine TPT is actually crossing the variable values, but I don't know what. Any ideas?
 
 

How to load EBCDIC file to db using TPT? - forum topic by novalyn

Is it possible to load an IBM EBCDIC file using TPT? There is an EBCDIC file on the mainframe, and we are trying to load it into Teradata, which is installed on a Linux server. Is it possible for TPT to connect to the mainframe and load the file directly?
If it can't connect to the mainframe directly, is it possible to load the binary EBCDIC file with TPT on the Linux server?
Thanks in advance.


BTEQ not remove trailing blank - response (5) by skrafi

Hi Dnoeth,

While I'm trying to load data from staging to EDW using Teradata macros and Informatica, how can I execute the macro in Informatica?
Could you please share any solution for this?

Thanks & Regards

Mload acquisition and application phase - response (1) by dnoeth

Hi Cheeli,
in the Acquisition phase the rows are loaded (similar to a FastLoad) into the worktable, one row for each APPLY WHERE.
At the end of this phase the worktable is sorted by the so-called "match tag" columns (automatically added), which ensures that all modifications are done in the correct order based on the input sequence number.
Both target and work table have the same PI, and during the application phase all those modifications are merged into the target using a sequential scan on both tables.
So your friend was not really right: the data for the inserts/updates/deletes is recorded in the worktable, but the changes are actually done on the target rows.
If you read the MLoad manual there should be more detailed info.
Dieter

Fast load Vs Mload when AMP is down - response (1) by dnoeth

Hi Cheeli,
FastLoad directly loads into the target table and is simply not implemented to handle down AMPs.
MLoad uses worktables, which are always Fallback-protected; when an AMP is offline there's still the Fallback copy of the row to work with. And if the target table is also Fallback, there's TD's "Down AMP Recovery Journal" to deal with it.
But when an AMP is down (data loss in a RAID?) you usually have more urgent things to do than running a load job :-)
Dieter
 
 


Mload error - Highest return code encountered = '23' - response (2) by dnoeth

Hi Cheeli,
in the UV table there are not only duplicates, but also errors based on the MARK option in the DML label and errors during the update, like numeric overflow.
When you don't care about duplicate row errors, etc., you might better use IGNORE instead of MARK.
The MLoad will not fail when there are errors in the UV table, but of course you should never drop an error table without investigating the cause of the errors. Did you check the error codes for those rows?
If you drop error tables without checking them, your load process is faulty.
Dieter
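For illustration, the IGNORE option Dieter mentions goes on the .DML label in the MLoad script — a sketch, with label, table and column names as placeholders:

```
.DML LABEL ins_cust
     IGNORE DUPLICATE INSERT ROWS
     IGNORE DUPLICATE UPDATE ROWS;
INSERT INTO tgt_db.cust VALUES (:cust_id, :income, :age);
```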
 
 
 

Need of Collect Statistics - response (9) by Adeel Chaudhry

Collection of unnecessary stats does have a downside, but considering your scenario, it shouldn't be an issue.
 
Moreover, it is always an evolving task to come up with the best possible set of stats to collect. So it's better to analyze for a while and then tweak, add or remove stats.
 
HTH!

Teradata SQL Assistant - Database Explorer Window - forum topic by Shadyguy904

I am having problems with Teradata SQL Assistant (version Teradata.Net 13.0.0.14). The problem is with the Database Explorer window: it is fixed in length and width, and I can't adjust it as I can with the Answer and History windows. It seems to be docked, with no way of unpinning it. I have tried seemingly everything. Any assistance would be greatly appreciated.
I found a post on StackOverflow with the exact same issue:
http://stackoverflow.com/questions/14339475/teradata-assistant-database-explorer


TPT 14 WorkingDatabase and TargetTable getting crossed - response (1) by TonyL

The problem is this line in the TPT job script:
               LogTable                = @TargetTable || '.LG_Trans'
The log table is a fully qualified table name with the target table name as the database name.
I think that was not your intention.
I believe this is your intention:
               LogTable                = @TargetTable || '_LG_Trans'

FASTLOAD Error - Loading Fixed Length file to Table - response (1) by ThomasNguyen

Hi Kiranwt,
You need to remove the ':' character in front of the fields defined in the DEFINE command, so it becomes:
DEFINE
EID(CHAR(1)),
ENM(CHAR(5)),
ECD(CHAR(2)),
ESAL(CHAR(5))
FILE=EMP_FLAT.TXT;
 
Also, the fields in the input record must match the DEFINE command. With the above DEFINE command, each record has 13 characters, while in your data file each record has only 10 characters.
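To illustrate the arithmetic: the DEFINE field widths add up to 1 + 5 + 2 + 5 = 13, so every input record (excluding the line terminator) must be exactly 13 characters wide. A made-up sample layout:

```
position: 1    2-6    7-8   9-13
field:    EID  ENM    ECD   ESAL
sample:   1    JOHN_  10    _5000    ('_' marks a blank; 13 characters per record)
```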
Thomas

Tpump Error, Invalid session mode - response (1) by feinholz

The user cannot change the session mode.
TPump will always run in Teradata mode.

Teradata SQL Assistant - Database Explorer Window - response (1) by Shadyguy904


BTEQ not remove trailing blank - response (6) by KS42982

You can create a new task in Informatica and, in its properties, add "exec macroname", then run that task from Informatica (assuming the connection between Informatica and Teradata is already set up).
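The command itself is just the SQL macro call — database and macro names here are placeholders:

```sql
EXEC edw_db.my_load_macro;
```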

Date Format In Teradata SQL Assistant - response (4) by ZQkal

How do you display the last row (record) per Name from a table?
Given this table:

Name  create_tmp
AA    09-11-2009 01:02:00
AA    09-11-2011 01:02:00
AA    09-11-2012 01:12:09
AA    09-11-2013 01:02:59
BB    09-11-2010 01:02:00
BB    09-11-2011 10:02:10
CC    09-15-2012 01:02:00

Desired result:

Name  create_tmp
AA    09-11-2013 01:02:59
BB    09-11-2011 10:02:10
CC    09-15-2012 01:02:00
 
Thanks in advance.
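One common Teradata answer for "latest row per Name" — the table name here is a placeholder, and create_tmp is assumed to be a timestamp column:

```sql
SELECT Name, create_tmp
FROM   my_table
QUALIFY ROW_NUMBER() OVER (PARTITION BY Name
                           ORDER BY create_tmp DESC) = 1;
```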
 

SQL_SELECTOR: TPT15105: Error 13 in finalizing the table schema definition - response (6) by ericsun2

Hi feinholz,
For a table with 3 columns:
NET_ID DECIMAL(38,0) NOT NULL,
NET_NAME VARCHAR(300),
NET_KEY DECIMAL(9,0) NOT NULL

What will the FILE_SCHEMA be in a TPT 13.10 script? I tried the following but got "TPT15105: Error 13 in finalizing the table schema definition":
DEFINE SCHEMA OUTPUT_FILE_SCHEMA
(
"NET_ID"  Varchar(18),
"NET_NAME" VARCHAR(300),
"NET_KEY"  Varchar(9)
);
DEFINE OPERATOR SQL_SELECTOR
TYPE SELECTOR
SCHEMA OUTPUT_FILE_SCHEMA
ATTRIBUTES
(
  INTEGER MaxDecimalDigits = 18,
  VARCHAR DateForm = 'ANSIDATE',
  VARCHAR PrivateLogName = 'selector_log',

 
 

SQL_SELECTOR: TPT15105: Error 13 in finalizing the table schema definition - response (7) by ericsun2

Does UTF8 change the byte sizes in the SCHEMA definition?

USING CHARACTER SET UTF8
DEFINE JOB EXPORT_DELIMITED_FILE
DESCRIPTION 'Export rows from a Teradata table to a delimited file'
(
DEFINE SCHEMA OUTPUT_FILE_SCHEMA
(
"NET_ID"   Varchar(39),
"NET_NAME" VARCHAR(900),
"NET_KEY"  Varchar(39)
);
DEFINE OPERATOR SQL_SELECTOR
TYPE SELECTOR
SCHEMA OUTPUT_FILE_SCHEMA
ATTRIBUTES

 

SQL_SELECTOR: TPT15105: Error 13 in finalizing the table schema definition - response (8) by feinholz

@ericsun2: the schema must match the data. Thus the Selector is retrieving 2 DECIMALs and a VARCHAR. Your schema does not match that.
And yes, if you are using a client session character set of UTF8, you must adjust the sizes of the VARCHAR fields in the schema definition to account for the extra data. You must multiply by 3.
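Putting both points together for the schema above — a sketch only: the sizes assume each DECIMAL is CAST to VARCHAR in the Selector's SELECT (e.g. CAST(NET_ID AS VARCHAR(39))) so the retrieved data really matches an all-VARCHAR schema, with each size then tripled for the UTF8 session:

```
DEFINE SCHEMA OUTPUT_FILE_SCHEMA
(
  "NET_ID"   VARCHAR(117),  /* 39 chars (38 digits + sign) x 3 for UTF8 */
  "NET_NAME" VARCHAR(900),  /* 300 x 3 for UTF8                         */
  "NET_KEY"  VARCHAR(30)    /* 10 chars (9 digits + sign) x 3 for UTF8  */
);
```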
 
