TPT Script - Import from TD into HDFS. - forum topic by aaibinder
I've been searching for this in documentation all day, maybe someone knows this already. I have read that since version 15.0, it's possible to go from a TPT script directly into HDFS. Right now my...
TPT Script - Import from TD into HDFS. - response (1) by feinholz
As with any other TD-to-flat-file TPT scenario, you can use the Export-operator-to-DC-operator scenario. This will export data from Teradata and write to HDFS. Just provide the information for the...
SchemaLoader results? Views via EF to EDMX? - forum topic by imgregduh
Is there a way to get the SchemaLoader results? The cmd prompt it opens closes immediately and I can't seem to pause it to read what it output. I can't tell what went wrong or what went right. Also if...
Teradata 14.10 History Window Gone - response (2) by Fred
Under Tools / Options / File paths tab, be sure the history file path exists, is writeable, and has space.
TPT Wizard driver error - response (4) by TDHS9
I'm having the same issue on 2 different machines, one with Ver 15.00 and the other with 15.10. I've updated the client software with the latest version on one of the machines, as below. Database Version 14.10...
Error Table Mload - forum topic by Thiru07
On running the following mload script : .LOGTABLE EDB.LOG_prepd_usge; .LOGON abc.net/user,pass; DATABASE EITOPSDB; .BEGIN IMPORT MLOAD TABLES prepd_usge WORKTABLES EDB.WT_prepd_usge ERRORTABLES...
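For context, a complete MultiLoad script of this shape would look roughly like the sketch below; the layout, field, and error-table names are hypothetical placeholders, not taken from the thread:

```
.LOGTABLE EDB.LOG_prepd_usge;
.LOGON abc.net/user,pass;
DATABASE EITOPSDB;
.BEGIN IMPORT MLOAD
    TABLES prepd_usge
    WORKTABLES EDB.WT_prepd_usge
    ERRORTABLES EDB.ET_prepd_usge EDB.UV_prepd_usge;  /* hypothetical names */
.LAYOUT rec_layout;
.FIELD usge_dt  * VARCHAR(10);   /* hypothetical fields; VARTEXT requires VARCHAR */
.FIELD usge_amt * VARCHAR(18);
.DML LABEL ins_dml;
INSERT INTO prepd_usge (usge_dt, usge_amt)
VALUES (:usge_dt, :usge_amt);
.IMPORT INFILE datafile.txt
    FORMAT VARTEXT '|'
    LAYOUT rec_layout
    APPLY ins_dml;
.END MLOAD;
.LOGOFF;
```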
TPT attributes SkipRows and SkipRowsEveryFile - response (6) by harshainocu
Teradata Parallel Transporter SQL DDL Operator Version 15.00.00.05 DDL_OPR_table: private log specified: table_log DDL_OPR_table: connecting sessions DDL_OPR_table: sending SQL requests DDL_OPR_table:...
MULTILOAD BUFFER ERROR!!! urgent - response (1) by Ivyuan
Hi, 1. Can you let us know the MultiLoad version and the Teradata Data Connector version? The Data Connector version could be obtained by issuing a ".version;" command using MultiLoad. The version will...
TDCH error limit for Export from HDFS with method batch.insert - forum topic...
I am trying to load data from HDFS/Hive into Teradata using the method batch.insert. Is there a way to set the errorlimit property for the load so that the job will not fail when X-1 or fewer records...
Error Table Mload - response (1) by Ivyuan
Hi, 1. Can you let us know the error code recorded in the error table? If it is in the Application Error Table, the column name is DBCErrorCode; if it is in the Acquisition Error Table, the column name...
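To pull the application-phase error codes, a query along these lines can be used; the error-table name below is a hypothetical placeholder (MultiLoad names these tables from the script's ERRORTABLES clause):

```
/* Hypothetical application error table name */
SELECT DBCErrorCode, COUNT(*) AS err_rows
FROM EDB.UV_prepd_usge
GROUP BY DBCErrorCode;
```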
Error Table Mload - response (2) by Thiru07
Hi Ivyuan, the Cingular_BID.txt data file contains pipe-delimited records as below...
FastExport Remove Binary/Indicator Values in Outmod - response (5) by...
Is there an example of OUTMOD somewhere that I can reference?
Error Table Mload - response (3) by Thiru07
Hi Ivyuan, adding all the details below. $ mload < prepd.mload.ctl ======================================================================== =...
Automatic table creation based on the header in raw file during data loading...
Hi, I have a set of CSV, XML, XLS data files available in a document library and the header (columns) will be different in each data file. I am trying to automatically load these data files into...
Automatic table creation based on the header in raw file during data loading...
TPT does not automatically create tables. The commands to create a table must be supplied by the user through the DDL operator.
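A minimal sketch of supplying the table DDL through TPT's DDL operator (the database, table, and job-variable names here are hypothetical placeholders):

```
DEFINE OPERATOR DDL_OPERATOR
TYPE DDL
ATTRIBUTES
(
    VARCHAR TdpId        = @TdpId,
    VARCHAR UserName     = @UserName,
    VARCHAR UserPassword = @UserPassword,
    VARCHAR ErrorList    = '3807'  /* ignore "object does not exist" on the DROP */
);

STEP Create_Target
(
    APPLY
        ('DROP TABLE MyDB.raw_stage;'),
        ('CREATE TABLE MyDB.raw_stage (col1 VARCHAR(100), col2 VARCHAR(100));')
    TO OPERATOR (DDL_OPERATOR);
);
```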
Error Table Mload - response (4) by Ivyuan
Hi, Here is related information on 2665: 2665 Invalid date. Explanation: This error occurs when date arithmetic is attempted on an invalid date. Generated By: AMP Steps. For Whom: End User. Notes:...
FastExport Remove Binary/Indicator Values in Outmod - response (6) by Ivyuan
Hi, There are some sample OUTMOD routines in the Teradata FastExport manual (B035-2410), Appendix C. Thanks!
Can TTU 14.10 on Linux connect to a 15.10 RDBMS - forum topic by GNS
Hello, I am working with an Informatica application where the most recent TTU supported is 14.10. The Teradata RDBMS will be upgraded to 15.10. Could you let me know if the TTU and all its components...
TDCH Export from HDFS timestamp issue. - forum topic by Cvinodh
I am loading a file from HDFS into Teradata using Teradata Hadoop connector. All my records are getting rejected due to error code 6760 (invalid timestamp). All the timestamp fields are of format...
fastload cannot start with error "The request exceeds the length limit, The...
Hi SteveF, Thanks for your reply. It works. Another question: if I have the delimiter as "DEL" or ^P or 0x10, how do I write "SET RECORD VARTEXT"? Is there a mapping somewhere between the Teradata...
Can TTU 14.10 on Linux connect to a 15.10 RDBMS - response (1) by Johannes Vink
Informatica 9.6.1 probably? Some story here. TTU 14.10 is 2 versions upward compatible (excluding new features in newer versions of course). So 15.00 and 15.10 are supported, but 16.00 (not out yet) is...
TDCH: escapedby and enclosedby - forum topic by Cvinodh
I am not able to use the enclosedby and escapedby arguments in the Teradata Hadoop connector. I get the following error when I pass these arguments. Here I am trying to set enclosedby with a double quote and...
TPT and Windows PowerShell - response (7) by jody_larsen
I was running into this issue as well. For some reason, launching PowerShell and typing tbuild at the command prompt throws this error. However, the following did work just fine for...
MULTILOAD BUFFER ERROR!!! urgent - response (2) by krishna1985
Hi Ivyuan, thanks for getting back to me. Actually, there was an issue with the tab spacing: we had an extra tab, so the columns got shifted over, and so forth. Now it's resolved.
TPT Script - Import from TD into HDFS. - response (2) by aaibinder
Is it possible to ask for an example of this code? I am not finding it in the documentation anywhere and I've been looking. Please and thank you!
TPT Script - Import from TD into HDFS. - response (3) by feinholz
TPT provides samples in a "samples" directory where TPT is installed. Look in the directory called "userguide" inside "samples". PTS00029 shows an example of reading from HDFS and loading into...
TPT and Windows PowerShell - response (8) by feinholz
From the PowerShell documentation, it looks like that would be the best solution to running TPT under PowerShell. Even typing just tbuild.exe indicates the environment (and environment variables)...
TPT Script - Import from TD into HDFS. - response (4) by aaibinder
Thanks, I got it. Just gotta put HadoopHost = 'default' in the target attributes and use hdfs://server in the FileName!
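Putting those two attributes together, the DataConnector consumer definition would look roughly like this sketch (the schema name, server, and file path are hypothetical placeholders):

```
DEFINE OPERATOR FILE_WRITER
TYPE DATACONNECTOR CONSUMER
SCHEMA EXPORT_SCHEMA  /* hypothetical schema name */
ATTRIBUTES
(
    VARCHAR HadoopHost    = 'default',
    VARCHAR FileName      = 'hdfs://myserver/tmp/export.dat',  /* hypothetical path */
    VARCHAR Format        = 'Delimited',
    VARCHAR TextDelimiter = '|',
    VARCHAR OpenMode      = 'Write'
);
```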
SQL Assistant 15.00 TD 14.11.0.1 Doesn't Properly read first Parameter Key -...
Hi TD Forum, I am running on TD Ver 14 and have SQL Assistant 15.00. When I run a query that has a prompt for two dates to be input by the user, the parameter prompts are something like: where...
fastload cannot start with error "The request exceeds the length limit, The...
The delimiter must be a printable character. FastLoad does not support hex values.
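So a pipe-delimited file, for example, would be declared like this (the pipe is just an illustrative choice of printable delimiter):

```
SET RECORD VARTEXT "|";
```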
TPT attributes SkipRows and SkipRowsEveryFile - response (7) by feinholz
In your data file, is the EOF on the same line as the last header record?
File Writer operator writing out 0 for decimal columns with precision greater...
Hello, we have the same problem. Has this issue been fixed?
Error Table Mload - response (5) by Thiru07
Thank you Ivyuan. Your sample script solved my problem.
File Writer operator writing out 0 for decimal columns with precision greater...
I am trying to look up when this was fixed. Since it looks like several people are having issues, I need to know what version of TPT everyone on this thread is running.
File Writer operator writing out 0 for decimal columns with precision greater...
Ok, if the "issue" that we fixed is what I think, these are the releases in which the issue was resolved: 14.10.00.014 and 15.00.00.001
Passing parameter in Teradata Rest Service - forum topic by davidtzoor
Hi, I am using Teradata Rest Service and trying to pass a parameter for my query: select ? as ip_address from my_database.ip_table as g where internallib.ip2ip3(ip_address) between g.start_ip and...
TPT Multiple JobVar files - forum topic by ColinPretty
Hi there, I'm new to TPT and have got it working. I'm wanting to use Job Variable files to house things such as username and password and have got this working with the main script by using tbuild -f...
Arcmain - forum topic by arteaga7
I need to perform a database backup but I am not able to find a place to download the tool or a package containing the tool. I think it might be included in a previous version of the utilities but I am...
TPT Multiple JobVar files - response (1) by feinholz
Yes, TPT does support multiple job variable files. You just add additional -v <filename> options to the command line. This feature was implemented recently (targeted for 16.0), but...
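The command line would then look something like this sketch (the script, variable-file, and job names are hypothetical):

```
tbuild -f load_job.tpt -v common.jobvars -v credentials.jobvars my_job
```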
BTEQ: saving output in multiple file - forum topic by vc
Hello All. I am writing a BTEQ script which saves the output in one file. The BTEQ has 3 different queries. I need to save the output of all three queries in three separate files using one BTEQ script. Is it...
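One common BTEQ pattern for this (not from the thread; the file and table names below are hypothetical) is to wrap each query in its own .EXPORT / .EXPORT RESET pair:

```
.LOGON tdpid/user,pass;

.EXPORT REPORT FILE = query1.out
SELECT * FROM mydb.table1;
.EXPORT RESET

.EXPORT REPORT FILE = query2.out
SELECT * FROM mydb.table2;
.EXPORT RESET

.EXPORT REPORT FILE = query3.out
SELECT * FROM mydb.table3;
.EXPORT RESET

.LOGOFF;
```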
Arcmain - response (2) by arteaga7
What does T@YS mean? Do you have a sample script you use to back up a Teradata database instance?
Arcmain - response (3) by VandeBergB
T@YS = Teradata At Your Service. Somebody in your organization, probably the lead DBA, should have an account set up to download all the wonders of TD documentation, hotfixes and new releases...
FastLoad - Data Conversions - forum topic by ZAtkinso
Hello, I am having issues performing data conversions, especially with dates and timestamps, while using FastLoad. My script is pasted below. All my rows are being imported and sent to the error_1 table...
FastLoad - Data Conversions - response (1) by feinholz
To which type of data conversions are you referring? The FastLoad script is just loading data. Teradata expects the Date/Time/Timestamp data to be in a very specific format. If the incoming data does...
FastLoad - Data Conversions - response (2) by ZAtkinso
Hey Steve, I am exporting the data from the same table, creating a new empty version of the same table, and fastloading the data that was exported from the original table. I am working on a tool to...
FastLoad - Data Conversions - response (3) by ZAtkinso
Sorry to double post but I believe I side stepped your question. The data in the exported file is DATE with FORMAT 'YY/MM/DD'. The table has this same type and format. However, as I understand RECORD...
FastLoad - Data Conversions - response (4) by feinholz
FastExport itself cannot export the data and write it out in delimited format. So, I guess you must be CASTing your SELECT statement. Do you have to use SQLAssistant? Have you tried TPT? Can you...
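For reference, a FastLoad VARTEXT sketch of this scenario (the file, table, and column names are hypothetical): in VARTEXT mode every field is defined as VARCHAR, and the conversion to DATE happens on insert, provided each incoming string matches the target column's FORMAT:

```
SET RECORD VARTEXT "|";

DEFINE
    sale_date (VARCHAR(10)),   /* e.g. a date string matching FORMAT 'YY/MM/DD' */
    amount    (VARCHAR(18))
FILE = export.dat;

BEGIN LOADING mydb.stage_copy ERRORFILES mydb.err1, mydb.err2;

INSERT INTO mydb.stage_copy (sale_date, amount)
VALUES (:sale_date, :amount);

END LOADING;
LOGOFF;
```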
TPT Multiple JobVar files - response (2) by ColinPretty
I did forget to mention that I'm on 14 (soon to be upgrading to 15). I'm guessing that means I can't do it as yet, so I'll hold fire until then. Thanks for the response Steve.