
Connect to Microsoft SQL Server from Teradata Studio 14.10 - forum topic by PeterSchwennesen


Is it possible to use the new Teradata Studio 14.10 to also connect to a Microsoft SQL Server?
In the “New Connection Profile” wizard, “SQL Server” is listed. Selecting it brings up a “Specify a Driver and Connection Details” page; however, no driver is installed. Pressing “New Driver Definition” brings up a “Specify a Driver Template and Definition Name” dialog, where Microsoft SQL Server 2000, 2005, and 2008 driver templates are listed, but selecting any of them just gives the message “Unable to locate JAR/zip in file system as specified by the driver definition: sqljdbc.jar”.
Any help on how to connect to SQL Server from Teradata Studio, or any description of how to do this, would be appreciated.
Peter Schwennesen 


BTEQ losing header comments of view definitions - response (1) by dnoeth


Hi Matthias,
AFAIK you can't do that as-is in BTEQ.
The only way I know is to move the comment after the REPLACE, e.g.

REPLACE VIEW SampleDB.TestView AS
/*
    Header comment:
    This is a test view.
*/
LOCKING ROW FOR ACCESS
SELECT *
FROM DBC.TablesV;
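
For reference (an editorial addition, not part of the original reply): placed after the REPLACE keyword, the comment becomes part of the stored request text, so it should survive. A quick check, assuming the view above:

SHOW VIEW SampleDB.TestView;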

 

Starting TPUMP on Windows 2008 R2 fails - forum topic by rzenkert


Hi,
I have installed Teradata utilities 14.10 on Windows 2008 R2 (64-bit). When I try to start TPUMP I get the error message: The application was unable to start correctly (0x000007b). Click OK to close the application.
Does this mean that the utilities do not run on this platform?
 
thanks


Connect to Microsoft SQL Server from Teradata Studio 14.10 - response (1) by Chuckbert


You need to specify the full path names to the SQL Server jars in the "JAR List" tab of the New Driver Definition dialog. Remove the default jar file names listed there (they don't carry any path information), then use the Add JAR/Zip button to add the jars via the dialog, browsing to their actual locations.

Re: Delete Syntax in MLOAD - response (1) by Fred


In a standard BEGIN [IMPORT] MLOAD, the DELETE statement must specify values for the PI with an equality condition, just as an UPDATE would. Alternatively, you can use BEGIN DELETE MLOAD with a non-PI equality WHERE condition. Check the manual for details.
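
A minimal sketch of the second option, a MultiLoad delete task (an editorial illustration; the logon string and all object names are hypothetical, not from this thread). The WHERE condition here deliberately avoids the primary index, which is what distinguishes a delete task from an import-task DELETE:

.LOGTABLE mydb.ml_del_log;
.LOGON tdpid/user,password;
.BEGIN DELETE MLOAD TABLES mydb.sales_hist;
DELETE FROM mydb.sales_hist WHERE sale_date < DATE '2012-01-01';
.END MLOAD;
.LOGOFF;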

Connect to Microsoft SQL Server from Teradata Studio 14.10 - response (2) by PeterSchwennesen


OK, but next question: I searched my Teradata folder under Program Files, but it seems there are no *.jar files. From where can I download the JAR (or zip) files needed to use Teradata Studio to connect to SQL Server? I looked around but was not able to locate anything. Are these files somewhere on the Teradata web site for download? And if so, where is the link to them?
Peter Schwennesen

Error with Teradata connector for Hadoop with HCAT -> TD fastload - forum topic by rupert160


I have an HCatalog table:

CREATE TABLE src.t (
   msgtype string
   , ts string
   , source string
   , msgclass int
   , msgtext string
) PARTITIONED BY (device_range string, device_id string);

and a TD table:

CREATE SET TABLE tgt.t ,FALLBACK ,
     NO BEFORE JOURNAL,
     NO AFTER JOURNAL,
     CHECKSUM = DEFAULT,
     DEFAULT MERGEBLOCKRATIO
     (
      msgtype VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
      ts VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
      source VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
      msgclass INTEGER,
      msgtext VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
      device_range VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC,
      device_id VARCHAR(255) CHARACTER SET LATIN NOT CASESPECIFIC)
PRIMARY INDEX ( ts );

After exporting a HADOOP_CLASSPATH of:

export HADOOP_HOME=/usr/lib/hadoop
export HIVE_HOME=/usr/lib/hive
export HCAT_HOME=/usr/lib/hcatalog
export TDCH_HOME=/usr/lib/tdch

export HADOOP_CLASSPATH=$HIVE_HOME/conf:\
${HIVE_HOME}/lib/antlr-runtime-3.4.jar:\
${HIVE_HOME}/lib/commons-dbcp-1.4.jar:\
${HIVE_HOME}/lib/commons-pool-1.5.4.jar:\
${HIVE_HOME}/lib/datanucleus-core-3.0.9.jar:\
${HIVE_HOME}/lib/datanucleus-enhancer-3.0.1.jar:\
${HIVE_HOME}/lib/datanucleus-rdbms-3.0.8.jar:\
${HIVE_HOME}/lib/hive-cli-0.11.0.1.3.2.0-111.jar:\
${HIVE_HOME}/lib/hive-exec-0.11.0.1.3.2.0-111.jar:\
${HIVE_HOME}/lib/hive-metastore-0.11.0.1.3.2.0-111.jar:\
${HIVE_HOME}/lib/jdo2-api-2.3-ec.jar:\
${HIVE_HOME}/lib/libfb303-0.9.0.jar:\
${HIVE_HOME}/lib/libthrift-0.9.0.jar:\
${HIVE_HOME}/lib/mysql-connector-java.jar:\
${HIVE_HOME}/lib/slf4j-api-1.6.1.jar:\
${HCAT_HOME}/usr/lib/hcatalog/share/hcatalog/hcatalog-core-0.11.0.1.3.2.0-111.jar:\
${TDCH_HOME}/hive-builtins-0.9.0.jar

And using the teradata connector for hadoop command:

hadoop jar /usr/lib/tdch/teradata-connector-1.2.jar \
com.teradata.hadoop.tool.TeradataExportTool \
-libjars /usr/lib/hive/lib/hive-cli-0.11.0.1.3.2.0-111.jar,/usr/lib/hive/lib/hive-exec-0.11.0.1.3.2.0-111.jar,/usr/lib/hive/lib/hive-metastore-0.11.0.1.3.2.0-111.jar,/usr/lib/hive/lib/jdo2-api-2.3-ec.jar,/usr/lib/hive/lib/libfb303-0.9.0.jar,/usr/lib/hive/lib/libthrift-0.9.0.jar,/usr/lib/hive/lib/slf4j-api-1.6.1.jar,/usr/lib/tdch/hive-builtins-0.9.0.jar \
-classname com.teradata.jdbc.TeraDriver \
-url jdbc:teradata://td_server/DATABASE=tgt \
-username myuser \
-password mypasswd \
-jobtype hcat \
-method multiple.fastload \
-sourcedatabase src \
-sourcetable t \
-targettable t

I get the following error:

ERROR tool.TeradataExportTool: java.lang.NoClassDefFoundError: org/apache/hcatalog/mapreduce/HCatInputFormat

I have been playing around with the following arguments, but no combination helps so far...

-targettableschema "msgtype VARCHAR(255),ts VARCHAR(255),source VARCHAR(255),msgclass INT,msgtext VARCHAR(255),device_range VARCHAR(255),device_id VARCHAR(255)" 
-targetfieldnames "msgtype,ts,source,msgclass,msgtext,device_range,device_id"
-targetfieldcount "7" 
-sourcetableschema "msgtype STRING,ts STRING,source STRING,msgclass INT,msgtext STRING,device_range STRING,device_id STRING"
-sourcefieldnames "msgtype,ts,source,msgclass,msgtext,device_range,device_id"
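
(Editorial aside, not an answer from the thread: a NoClassDefFoundError for org/apache/hcatalog/mapreduce/HCatInputFormat usually means the HCatalog core jar never made it onto the classpath. Note, for example, that the hcatalog-core entry in the HADOOP_CLASSPATH above prefixes ${HCAT_HOME} to a path that already starts with /usr/lib/hcatalog, which expands to a doubled path. Two quick shell checks:)

# Does the classpath entry expand to a file that actually exists?
ls ${HCAT_HOME}/usr/lib/hcatalog/share/hcatalog/hcatalog-core-0.11.0.1.3.2.0-111.jar

# Which jar actually provides the missing class?
jar tf /usr/lib/hcatalog/share/hcatalog/hcatalog-core-0.11.0.1.3.2.0-111.jar | grep HCatInputFormat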

 

 


Connect to Microsoft SQL Server from Teradata Studio 14.10 - response (3) by Chuckbert


The JDBC jars are provided by Microsoft. Try their Microsoft JDBC Driver for SQL Server page at http://msdn.microsoft.com/en-us/sqlserver/aa937724.aspx
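
For reference (an editorial addition): once the Microsoft jars are registered in the driver definition, the connection URL Studio builds follows the general shape of Microsoft's JDBC driver URLs; host, port, and database below are hypothetical:

jdbc:sqlserver://myserver:1433;databaseName=mydb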
 


Starting TPUMP on Windows 2008 R2 fails - response (1) by Ivyuan


Hi,
TPump should be able to run on this platform. Please re-install all 32-bit packages (including dependencies such as TERAGSS, TDICU, TDCLI, and TD Data Connector (PIOM)) and retry.
Thanks!
--Ivy.

Load rows of data from teradata tables to mysql tables - response (2) by VJI


Hi Sudhansu,
     Thank you for the response; I will explore TPT further.
 
Thank you

TPT 14.10 error with fifo input files - forum topic by datamo


Hi everyone!!
I have several TPT scripts (whose input is a FIFO file) that used to run properly in TPT 13.0. Now the system I'm working on is upgrading to 14.10, and the scripts fail.
The error is:
        "TPT19120 !ERROR! multi-instance read feature requires REGULAR file names."
I've tried to run the TPT sample scripts found in
/opt/teradata/client/14.10/tbuild/sample/userguide/02b    (your installation path may differ). This sample shows how to run TPT with an input FIFO file, but I get the same error:

"TPT19120 !ERROR! multi-instance read feature requires REGULAR file names. 'data/datapipe' is not a REGULAR file."

Does anyone know what is happening? Is there any parameter I have to define in order to read input data from a FIFO file?

I can't find any documented differences related to this error... and the scripts run fine in 13.0!

Thanks in advance for your help!!!
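
(Editorial aside, not an answer from the thread: the message points at the multi-instance read feature, which requires regular files. One thing worth trying, sketched here with hypothetical operator and column names, is to pin the DataConnector producer to a single instance so that multi-instance reading is never engaged for the pipe:)

APPLY ('INSERT INTO tgt_table (:col1);')
TO OPERATOR (LOAD_OPERATOR)
SELECT * FROM OPERATOR (FILE_READER[1]);  /* [1] = exactly one reader instance for the FIFO */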
 


BTEQ exporting multiple files based on column value in a table - forum topic by uco


Hi,
Can anyone please give me some thoughts on this?
I have a scenario where I need to create multiple extract files based on a column's values, using a BTEQ script in a Unix environment.
Example: table abc

C_Name   ID
xxxxx     1
yyyy      1
aaaaa     2
bbbbb     2
ccccc     1

Now I need to create files based upon ID, and the outfile name should be Name_ID.txt (e.g., Name_1.txt).
Name_1.txt should contain only the rows with ID 1.
There are many more columns in the extract, but for this example I am using two columns. (One possible approach is sketched below.)

 

thanks

Krish
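
A minimal sketch of one common approach (an editorial illustration, not from the thread; the table and column names follow the example above, everything else — logon string, database, file names — is hypothetical): first export the distinct IDs into a driver file, then loop over them, running one BTEQ export per ID.

#!/bin/sh
# Step 1: export the distinct ID values into a driver file
bteq <<EOF
.logon tdpid/user,password;
.set titledashes off;
.export report file=ids.txt;
select distinct cast(ID as varchar(10)) (title '') from mydb.abc;
.export reset;
.logoff;
EOF

# Step 2: one export file per ID value
for id in $(cat ids.txt); do
bteq <<EOF
.logon tdpid/user,password;
.set titledashes off;
.export report file=Name_${id}.txt;
select C_Name (title '') from mydb.abc where ID = ${id};
.export reset;
.logoff;
EOF
done

The design trades one BTEQ logon per ID for simplicity; with many distinct IDs, a single export sorted by ID, split afterwards with awk, would log on only once.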


FASTLOAD SCRIPT ISSUE - forum topic by Gans


I have an issue with my FastLoad script. It loads the data into my target table; however, characters like '*' in my fixed-width flat file need to be converted to NULL.
My client's requirement is that runs of '********' in the fixed-width file be treated as NULL while loading to the table. Could anyone help with this?
Thanks. I appreciate your answers.


Handle records in Mload ET table in TEXT mode - A column or character expression is larger than the max size - forum topic by cheeli


Greetings experts,
I have fastexported a table in TEXT mode and loaded the data into the target table using MultiLoad in TEXT mode (I am using the 13.00 demo version on Windows 7).
Source/target table structure:

      L_ORDERKEY INTEGER,
      L_PARTKEY INTEGER,
      L_QUANTITY DECIMAL(15,2),
      L_LINESTATUS CHAR(1) CHARACTER SET LATIN NOT CASESPECIFIC
PRIMARY INDEX ( L_ORDERKEY )

Now I have manually edited some records in the fastexported file to force numeric overflow, so some records end up in the ET table. I was trying to handle the records in the ET table.
Following is the BTEQ script that I tried to use to export the records in REPORT mode from the ET table, which fails:

.logon localtd/tduser,tduser;
.set format on;

.export report file="G:\Users\cheeli\Desktop\bteq_op\et_itemppi_text.txt";

select hostdata from samples.et_itemppi_wodate;

.export reset;

 
Error message:

select hostdata from samples.et_itemppi_wodate;
 *** Failure 3798 A column or character expression is larger than the max size.
                Statement# 1, Info =0
 *** Total elapsed time was 1 second.

However, when I tried the same with FastExport, it worked, and I successfully loaded the records into the target table.

 

Fastexport script:

.logtable LT_itemppi_;
.logon localtd/tduser,tduser;
.begin export sessions 12;
.export outfile "G:\Users\cheeli\Desktop\fexp_out\et_fexp_itemppi_text.txt" format text mode record;
select hostdata from samples.et_itemppi_wodate;
.end export;
 

Mload script:

.LOGTABLE SAMPLES.ML_ITEMPPI_wodate;
.logon localtd/tduser,tduser;

.begin import mload tables samples.itemppi_wodate
checkpoint 70
errlimit 3;

.LAYOUT DATA_LAYOUT;
.filler abc * char(2);
.field L_ORDERKEY * char(12);
.filler l_partkey_filler * char(7);
.field L_PARTKEY * char(5); 
.field L_QUANTITY * char(20); 
.field L_LINESTATUS * CHAR(2); 

.dml label insert_itemppi;
insert into samples.itemppi_wodate values (:L_ORDERKEY, :L_PARTKEY, :L_QUANTITY, :L_LINESTATUS);

.import infile "G:\Users\cheeli\Desktop\fexp_out\et_fexp_itemppi_text.txt" 
format text
layout data_layout 
apply insert_itemppi;
.end mload;

.logoff;

Can you please let me know how to export the records using BTEQ?
I tried to cast the hostdata to CHAR(1000), and it failed because CAST is not allowed on VARBYTE.

 


FASTLOAD SCRIPT ISSUE - response (1) by ThomasNguyen


Hi Gans,
I'm not sure about your question, but if records in the input file have a field with the value '********' and you want to load it as NULL, then you can use the NULLIF clause in the DEFINE command, as in the sketch below.
Thomas
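
A minimal sketch of the DEFINE syntax (an editorial illustration; field name, width, and file name are hypothetical):

DEFINE
    trans_id (CHAR(8), NULLIF = '********')  /* eight asterisks, one per byte of the field */
FILE = input.dat;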


FASTLOAD SCRIPT ISSUE - response (2) by SuSeSi


Use the NULLIF clause in FastLoad. Check the manual for details.

Handle records in Mload ET table in TEXT mode - A column or character expression is larger than the max size - response (1) by SuSeSi


Try this:

.logon localtd/tduser,tduser;
.export data file="G:\Users\cheeli\Desktop\bteq_op\et_itemppi_text.txt";
select hostdata from samples.et_itemppi_wodate;
.export reset;

 

FASTLOAD SCRIPT ISSUE - response (3) by Gans


 trans_id (char (28) , NULLIF = '*' ),
 item_name (char (127), NULLIF ='*'),
 
This is how I am defining my statement... but the table is still loaded with '*******' from the fixed-width file.

FASTLOAD SCRIPT ISSUE - response (4) by feinholz


FastLoad does an exact match, byte-for-byte.
It does not do pattern matching.
So, if your field will have 7 '*'s, then your NULLIF clause must have 7 '*'s.
If each row has a different number of them, then the NULLIF feature will not work.
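
To make that concrete, a minimal sketch (editorial, with hypothetical names; the NULLIF literal must reproduce the field's bytes exactly):

    msg_text (CHAR(7), NULLIF = '*******'),  /* exactly seven asterisks, matching the field byte-for-byte */

When the number of asterisks varies from row to row, one fallback (an editorial suggestion, not from this thread) is to load the raw values and null them afterwards in SQL, assuming the OTRANSLATE function is available on your release:

UPDATE mydb.stg_table
SET msg_text = NULL
WHERE msg_text <> ''
AND OTRANSLATE(TRIM(msg_text), '*', '') = '';  /* the field consists solely of asterisks */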
 

Handle records in Mload ET table in TEXT mode - A column or character expression is larger than the max size - response (2) by cheeli


Hi Sudhansu,
That worked, thank you. I was expecting the data to be in an unreadable format due to DATA mode, but I could see that the content is readable. Any thoughts on this?
Also, why did the above one fail in REPORT mode? Can you please elucidate?
