Loading data files from Hadoop to Teradata using TPT - response (3) by...
Hi, Is the 'TPT for Hadoop' feature available in TD 14.0? I'm able to unload data from Teradata to Hadoop using TDCH, but the customer prefers to unload huge volumes of data using TPT for Hadoop...
TPT 15.00 Teradata HDP Sandbox data movement - response (3) by acnithin
Hi, Can anyone please share code examples for exporting from Teradata to Hadoop via TPT?
Fastload Skips Header when using from Java - forum topic by sshekhard
Hi All, I am new to Teradata and am currently trying out JDBC FastLoad in my application. The first record is skipped every time I try to export the...
Fastload Skips Header when using from Java - response (1) by tomnolan
I assume that you are using the JDBC FastLoad CSV feature of the Teradata JDBC Driver. The JDBC FastLoad CSV feature always expects the first line of the CSV text file to contain the column headers....
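As a rough illustration of how that feature is typically driven, here is a minimal sketch; the host, credentials, table name, and file name are placeholders, and the exact connection-parameter and INSERT-statement form should be confirmed against the Teradata JDBC Driver User Guide and its sample programs:

import java.io.FileInputStream;
import java.io.InputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class FastLoadCsvSketch {
    public static void main(String[] args) throws Exception {
        // TYPE=FASTLOADCSV selects the JDBC FastLoad CSV feature; host and credentials are placeholders.
        String url = "jdbc:teradata://dbshost/TYPE=FASTLOADCSV";
        try (Connection con = DriverManager.getConnection(url, "user", "password");
             InputStream csv = new FileInputStream("mydata.csv");
             PreparedStatement ps = con.prepareStatement("insert into mytable")) {
            // The first line of mydata.csv must be the column header line.
            // -1 indicates the stream length is not known in advance (verify against the driver documentation).
            ps.setAsciiStream(1, csv, -1);
            ps.executeUpdate();
        }
    }
}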
EXPORT_OPERATOR: TPT12108: Output Schema does not match data from SELECT...
Hello Steve, We are converting our data loads to use the TPT interface. We're having a similar problem and wanted to find a fix. Here's the setup: Moving data from Teradata system A to Teradata...
EXPORT_OPERATOR: TPT12108: Output Schema does not match data from SELECT...
You will have to contact Informatica. If both source and target tables are the same, I do not see how (or why) TPTAPI would report that they are not. BTW, TPTAPI does not do any conversions.
Loading data files from Hadoop to Teradata using TPT - response (4) by feinholz
TPT-Hadoop integration is supported in TPT 15.0. The TPT User Guide has sample scripts.
TPT 15.00 Teradata HDP Sandbox data movement - response (4) by feinholz
The TPT User Guide has sample scripts.
Fastload Skips Header when using from Java - response (2) by tomnolan
By the way, here is a code snippet to illustrate how to dynamically prepend a column header line to a CSV InputStream. The basic idea is to use a SequenceInputStream to combine the column header line...
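A minimal sketch of that idea using only standard Java I/O (the header text and file name are placeholders; the combined stream is what would be handed to the JDBC FastLoad CSV PreparedStatement):

import java.io.ByteArrayInputStream;
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.SequenceInputStream;
import java.nio.charset.StandardCharsets;

public class PrependCsvHeader {
    public static void main(String[] args) throws Exception {
        // Column header line the JDBC FastLoad CSV feature expects as the first line.
        String header = "col1,col2,col3\n";

        // Headerless CSV data file (placeholder path).
        InputStream data = new FileInputStream("data_without_header.csv");

        // SequenceInputStream reads the header bytes first and then the file,
        // so the driver sees a CSV stream that begins with the column headers.
        InputStream combined = new SequenceInputStream(
                new ByteArrayInputStream(header.getBytes(StandardCharsets.US_ASCII)),
                data);

        // 'combined' can now be passed to setAsciiStream on the FastLoad CSV PreparedStatement.
    }
}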
Loading data files from Hadoop to Teradata using TPT - response (5) by Raja_KT
This link may be of help: http://developer.teradata.com/connectivity/articles/teradata-connector-for-hadoop-now-available I think people are using Hadapt because it can split work into MR...
Arcmain Estimates - forum topic by Harpreet Singh
Hi, Can we estimate the time for an arcmain data copy/restore by scaling linearly from sample data? For example, if 5 GB is copied in 5 minutes, then 1 TB in 1000 minutes. Regards, Harpreet Tags: arcmain Forums: Tools
Fastload Skips Header when using from Java - response (3) by sshekhard
Thank you @tomnolan. It was very helpful.
Concatenate one column values into single row value - forum topic by Gnana Reddy
Hi, Can you help me fetch multi-column index columns, table-wise? From dbc.Indices we get output as below: ============================== DBName TBName IndexType ColumnName ABC Table1...
Concatenate one column values into single row value - response (1) by dnoeth
If your TD system includes XML, this is the simplest way: SELECT DatabaseName, TableName, IndexNumber, IndexType, TRIM(TRAILING ',' FROM (XMLAGG(TRIM(ColumnName) || ',' ORDER BY ColumnPosition)...
Arcmain Estimates - response (1) by Glass
Harpreet, If your network speed is consistent and the same as for the 5 GB copy, then yes. If you are not already, you may want to use a multi-stream job for higher volumes of data so this will be...
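For a rough sense of scale, assuming the sampled copy rate holds: 5 GB in 5 minutes is 1 GB per minute, so 1 TB (about 1000 GB) would take roughly 1000 minutes, a little under 17 hours.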
Teradata Parallel Transporter Wizard 14.10 Issue - response (10) by...
Hi, I have a Windows Vista 32-bit OS. I am trying to run the TPT Wizard but get an error while opening it. I can see a command prompt appear after I click it, with an error message - syntax error while parsing, etc. I...
unable to varchar(max) / nvarchar(max) data from sql server to Teradata using...
Hi Steven, Thanks for the response. Is there any other way I can load a column from SQL Server to Teradata that exceeds 64000 bytes?
unable to varchar(max) / nvarchar(max) data from sql server to Teradata using...
Right now, the only way is to export the LOB-sized data from SQL Server via some other tool, write the data out to a file, and then load from that file into Teradata with TPT.
Concatenate one column values into single row value - response (2) by Gnana...
Thank you very much Dieter. Working as expected.
TPT - source data enclosed with double quotes and delimited by pipe (|) -...
Hi Steve, Thanks for answering all my queries, and sorry for bothering you again and again. I have a few more queries - 1. In the TPT script, why do we require - (The schema in the script must be all VARCHAR, but...