FASTEXPORT Looping - response (5) by Raja_KT
Fast load Error - forum topic by Rohit Ranjan
Hi,
I am getting an error while trying to run a FastLoad script. The error says "FDL4822 DEFINE statement rejected".
Below is the script:
.logon abcd/passwd;
.set record vartext "";
create table abc
(
eno. interger,
ename varchar(20),
dno integer,
sal decimal(10,2)
)
unique primary index(eno);
.DEFINE
eno (varchar(20)),
ename (varchar(20)),
dno (varchar(20)),
sal (varchar(20));
.BEGIN LOADING abc
.ERRORFILES
emp_err1,emp_err2
.CHECKPOINT 10000
File = file path
show;
insert into abc
(
:eno,
:ename,
:dno,
:sal
);
.END LOADING;
.logoff;
Fast load Error - response (1) by feinholz
When in doubt, please read the manual.
The "File" specification is part of the DEFINE statement.
You have the BEGIN LOADING command in between the DEFINE and the "File".
And the syntax of your BEGIN LOADING is incorrect.
The FastLoad Reference manual will provide you with everything you need.
Fast load Error - response (2) by dnoeth
And there's another error, remove the period before DEFINE:
• Commands may begin with a period, but do not have to begin with a period.
• If there is no leading period, then there must be a semicolon at the end.
• If the command has a leading period, it must all be on one line; commands that begin with a period cannot span multiple lines.
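Putting both responses together, a corrected version of the script could look roughly like this (the logon string, delimiter, and file path are placeholders, and the original typos in the CREATE TABLE are fixed; check the FastLoad Reference for your release):

```
LOGON tdpid/user,password;

CREATE TABLE abc
(
    eno   INTEGER,
    ename VARCHAR(20),
    dno   INTEGER,
    sal   DECIMAL(10,2)
)
UNIQUE PRIMARY INDEX (eno);

SET RECORD VARTEXT ",";

DEFINE
    eno   (VARCHAR(20)),
    ename (VARCHAR(20)),
    dno   (VARCHAR(20)),
    sal   (VARCHAR(20))
FILE = /path/to/datafile;

SHOW;

BEGIN LOADING abc
    ERRORFILES emp_err1, emp_err2
    CHECKPOINT 10000;

INSERT INTO abc VALUES (:eno, :ename, :dno, :sal);

END LOADING;
LOGOFF;
```

Note that with no leading periods, every command ends with a semicolon, and the FILE specification now sits inside the DEFINE, before BEGIN LOADING.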
TPT - Scripts from 13.10 doesn't work with 15.10 using named pipes - forum topic by EricSantAnna
Hi,
I was using TTU 13.10 until now, when I upgraded my production machine to TTU 15.00.
But now I'm getting this error:
TPT19435 pmRead failed. EOF encountered before end of record (35)
The process that loads data into the named pipes doesn't have time to connect to the pipes.
When I uninstalled 15.00 and installed 13.10 again, everything went back to normal.
I also tried to run tbuild 15.00 using the np_AXSMOD.dll from the 13.10 package, but the result was the same.
I can't find anything wrong in the docs.
Here's the dataconnector producer:
DEFINE OPERATOR PIPE_READER1()
DESCRIPTION 'Define opcoes de leitura de arquivos'
TYPE DATACONNECTOR PRODUCER
SCHEMA LOAD_AS_IS_TEST_SCHEMA
ATTRIBUTES
(
    VARCHAR AccessModuleName = 'np_axsmod.dll',
    VARCHAR AccessModuleInitStr,
    VARCHAR FileName = '\\.\pipe\LoadAsIs_1',
    VARCHAR Format = 'TEXT',
    VARCHAR IndicatorMode = 'N',
    VARCHAR OpenMode = 'Read',
    VARCHAR RowErrFileName = @errorFilePath,
    INTEGER MaxSessions = @maxExportSessions,
    INTEGER MinSessions = @minExportSessions
);
Log (using 13.10):
15:24:07 INFO > Teradata Parallel Transporter Version 13.10.00.02 15:24:07 INFO > Job log: D:\Programacao\Java\Projetos\teradata-loader\src\test\resources\temp/LoadAsIs-1.out 15:24:07 ERROR> WARN:Failed to lookup account administrators 15:24:07 ERROR> WARN:Failed to lookup account administrators 15:24:07 INFO > Job id is LoadAsIs-1, running on DragonZord 15:24:10 INFO > Teradata Parallel Transporter SQL DDL Operator Version 13.10.00.02 15:24:10 INFO > PREPARES_LOAD: private log not specified 15:24:13 INFO > PREPARES_LOAD: connecting sessions 15:24:14 INFO > PREPARES_LOAD: sending SQL requests 15:24:16 INFO > PREPARES_LOAD: TPT10508: RDBMS error 3807: Object 'DCT_PROD_TDM.LOAD_AS_IS_TEST' does not exist. 15:24:16 INFO > PREPARES_LOAD: TPT18046: Warning: error is ignored as requested in ErrorList 15:24:16 INFO > PREPARES_LOAD: disconnecting sessions 15:24:17 INFO > PREPARES_LOAD: Total processor time used = '0.140401 Second(s)' 15:24:17 INFO > PREPARES_LOAD: Start : Mon Aug 11 15:24:10 2014 15:24:17 INFO > PREPARES_LOAD: End : Mon Aug 11 15:24:17 2014 15:24:17 INFO > Job step PREPARES_LOAD_LOAD_AS_IS_TEST completed successfully 15:24:26 INFO > Teradata Parallel Transporter DataConnector Version 13.10.00.02 15:24:26 INFO > PIPE_READER4: TPT19008 DataConnector Producer operator Instances: 1 15:24:26 INFO > Teradata Parallel Transporter DataConnector Version 13.10.00.02 15:24:26 INFO > Teradata Parallel Transporter Load Operator Version 13.10.00.02 15:24:26 INFO > DATA_LOAD: private log not specified 15:24:26 INFO > PIPE_READER5: TPT19008 DataConnector Producer operator Instances: 1 15:24:26 INFO > Teradata Parallel Transporter DataConnector Version 13.10.00.02 15:24:26 INFO > Teradata Parallel Transporter DataConnector Version 13.10.00.02 15:24:26 INFO > PIPE_READER3: TPT19008 DataConnector Producer operator Instances: 1 15:24:26 INFO > PIPE_READER2: TPT19008 DataConnector Producer operator Instances: 1 15:24:26 INFO > Teradata Parallel Transporter DataConnector 
Version 13.10.00.02 15:24:26 INFO > Teradata Parallel Transporter DataConnector Version 13.10.00.02 15:24:26 INFO > PIPE_READER8: TPT19008 DataConnector Producer operator Instances: 1 15:24:26 INFO > PIPE_READER6: TPT19008 DataConnector Producer operator Instances: 1 15:24:26 INFO > Teradata Parallel Transporter DataConnector Version 13.10.00.02 15:24:26 INFO > PIPE_READER7: TPT19008 DataConnector Producer operator Instances: 1 15:24:26 INFO > Teradata Parallel Transporter DataConnector Version 13.10.00.02 15:24:26 INFO > PIPE_READER1: TPT19008 DataConnector Producer operator Instances: 1 15:24:26 INFO > PIPE_READER1: TPT19003 ECI operator ID: PIPE_READER1-6656 15:24:26 INFO > PIPE_READER4: TPT19003 ECI operator ID: PIPE_READER4-2596 15:24:26 INFO > PIPE_READER7: TPT19003 ECI operator ID: PIPE_READER7-6900 15:24:26 INFO > PIPE_READER2: TPT19003 ECI operator ID: PIPE_READER2-6128 15:24:26 INFO > PIPE_READER5: TPT19003 ECI operator ID: PIPE_READER5-3504 15:24:26 INFO > PIPE_READER8: TPT19003 ECI operator ID: PIPE_READER8-5456 15:24:26 INFO > PIPE_READER6: TPT19003 ECI operator ID: PIPE_READER6-5728 15:24:26 INFO > PIPE_READER3: TPT19003 ECI operator ID: PIPE_READER3-3396 15:24:28 INFO > DATA_LOAD: connecting sessions 15:24:37 INFO > PIPE_READER2: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_2'. 15:24:37 INFO > PIPE_READER1: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_1'. 15:24:37 INFO > PIPE_READER5: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_5'. 15:24:37 INFO > PIPE_READER3: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_3'. 15:24:37 INFO > PIPE_READER8: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_8'. 15:24:37 INFO > PIPE_READER6: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_6'. 15:24:37 INFO > PIPE_READER4: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_4'. 
15:24:37 INFO > PIPE_READER7: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_7'. 15:24:39 INFO > DATA_LOAD: preparing target table 15:24:40 INFO > DATA_LOAD: entering Acquisition Phase 15:24:49 INFO > DATA_LOAD: entering Application Phase 15:24:50 INFO > DATA_LOAD: Statistics for Target Table: 'DCT_PROD_TDM.LOAD_AS_IS_TEST' 15:24:50 INFO > DATA_LOAD: Total Rows Sent To RDBMS: 3352 15:24:50 INFO > DATA_LOAD: Total Rows Applied: 3352 15:24:52 INFO > DATA_LOAD: disconnecting sessions 15:24:52 INFO > PIPE_READER4: TPT19221 Total files processed: 1. 15:24:52 INFO > PIPE_READER7: TPT19221 Total files processed: 1. 15:24:52 INFO > PIPE_READER8: TPT19221 Total files processed: 1. 15:24:52 INFO > PIPE_READER6: TPT19221 Total files processed: 1. 15:24:52 INFO > PIPE_READER3: TPT19221 Total files processed: 1. 15:24:52 INFO > PIPE_READER2: TPT19221 Total files processed: 1. 15:24:52 INFO > PIPE_READER5: TPT19221 Total files processed: 1. 15:24:52 INFO > PIPE_READER1: TPT19221 Total files processed: 1. 15:24:57 INFO > DATA_LOAD: Total processor time used = '0.405603 Second(s)' 15:24:57 INFO > DATA_LOAD: Start : Mon Aug 11 15:24:26 2014 15:24:57 INFO > DATA_LOAD: End : Mon Aug 11 15:24:57 2014 15:24:57 INFO > Job step LOAD_LOAD_AS_IS_TEST completed successfully 15:24:57 INFO > Job LoadAsIs completed successfully
Log (using 15.00):
14:24:54 INFO > Teradata Parallel Transporter Version 15.00.00.00 14:24:54 INFO > Job log: D:\Programacao\Java\Projetos\teradata-loader\src\test\resources\temp/LoadAsIs-42.out 14:24:54 ERROR> WARN:Failed to lookup account administrators 14:24:54 ERROR> WARN:Failed to lookup account administrators 14:24:54 INFO > Job id is LoadAsIs-42, running on DragonZord 14:24:57 INFO > Teradata Parallel Transporter SQL DDL Operator Version 15.00.00.00 14:24:57 INFO > PREPARES_LOAD: private log not specified 14:24:59 INFO > PREPARES_LOAD: connecting sessions 14:25:00 INFO > PREPARES_LOAD: The RDBMS retryable error code list was not found 14:25:00 INFO > PREPARES_LOAD: The job will use its internal retryable error codes 14:25:00 INFO > PREPARES_LOAD: sending SQL requests 14:25:07 INFO > PREPARES_LOAD: disconnecting sessions 14:25:08 INFO > PREPARES_LOAD: Total processor time used = '0.0936006 Second(s)' 14:25:08 INFO > PREPARES_LOAD: Start : Mon Aug 11 14:24:57 2014 14:25:08 INFO > PREPARES_LOAD: End : Mon Aug 11 14:25:08 2014 14:25:08 INFO > Job step PREPARES_LOAD_LOAD_AS_IS_TEST completed successfully 14:25:18 INFO > Teradata Parallel Transporter PIPE_READER2[1]: TPT19006 Version 15.00.00.00 14:25:18 INFO > Teradata Parallel Transporter PIPE_READER6[1]: TPT19006 Version 15.00.00.00 14:25:18 INFO > Teradata Parallel Transporter PIPE_READER7[1]: TPT19006 Version 15.00.00.00 14:25:18 INFO > Teradata Parallel Transporter PIPE_READER3[1]: TPT19006 Version 15.00.00.00 14:25:18 INFO > PIPE_READER2[1]: TPT19010 Instance 1 directing private log report to 'dtacop-Home Work-5760-1'. 14:25:18 INFO > PIPE_READER6[1]: TPT19010 Instance 1 directing private log report to 'dtacop-Home Work-4060-1'. 14:25:18 INFO > PIPE_READER7[1]: TPT19010 Instance 1 directing private log report to 'dtacop-Home Work-5348-1'. 14:25:18 INFO > PIPE_READER3[1]: TPT19010 Instance 1 directing private log report to 'dtacop-Home Work-3300-1'. 
14:25:18 INFO > Teradata Parallel Transporter PIPE_READER5[1]: TPT19006 Version 15.00.00.00 14:25:18 INFO > PIPE_READER2[1]: TPT19003 NotifyMethod: 'None (default)' 14:25:18 INFO > PIPE_READER2[1]: TPT19008 DataConnector Producer operator Instances: 1 14:25:18 INFO > PIPE_READER7[1]: TPT19003 NotifyMethod: 'None (default)' 14:25:18 INFO > PIPE_READER7[1]: TPT19008 DataConnector Producer operator Instances: 1 14:25:18 INFO > Teradata Parallel Transporter PIPE_READER8[1]: TPT19006 Version 15.00.00.00 14:25:18 INFO > PIPE_READER5[1]: TPT19010 Instance 1 directing private log report to 'dtacop-Home Work-1488-1'. 14:25:18 INFO > PIPE_READER6[1]: TPT19003 NotifyMethod: 'None (default)' 14:25:18 INFO > PIPE_READER3[1]: TPT19003 NotifyMethod: 'None (default)' 14:25:18 INFO > PIPE_READER6[1]: TPT19008 DataConnector Producer operator Instances: 1 14:25:18 INFO > PIPE_READER3[1]: TPT19008 DataConnector Producer operator Instances: 1 14:25:18 INFO > PIPE_READER5[1]: TPT19003 NotifyMethod: 'None (default)' 14:25:18 INFO > PIPE_READER5[1]: TPT19008 DataConnector Producer operator Instances: 1 14:25:18 INFO > PIPE_READER8[1]: TPT19010 Instance 1 directing private log report to 'dtacop-Home Work-1196-1'. 14:25:18 INFO > PIPE_READER8[1]: TPT19003 NotifyMethod: 'None (default)' 14:25:18 INFO > PIPE_READER8[1]: TPT19008 DataConnector Producer operator Instances: 1 14:25:18 INFO > Teradata Parallel Transporter PIPE_READER4[1]: TPT19006 Version 15.00.00.00 14:25:18 INFO > Teradata Parallel Transporter Load Operator Version 15.00.00.00 14:25:18 INFO > DATA_LOAD: private log not specified 14:25:18 INFO > PIPE_READER4[1]: TPT19010 Instance 1 directing private log report to 'dtacop-Home Work-6988-1'. 
14:25:18 INFO > PIPE_READER4[1]: TPT19003 NotifyMethod: 'None (default)' 14:25:18 INFO > PIPE_READER4[1]: TPT19008 DataConnector Producer operator Instances: 1 14:25:18 INFO > PIPE_READER6[1]: TPT19003 ECI operator ID: 'PIPE_READER6-4060' 14:25:18 INFO > PIPE_READER5[1]: TPT19003 ECI operator ID: 'PIPE_READER5-1488' 14:25:18 INFO > PIPE_READER8[1]: TPT19003 ECI operator ID: 'PIPE_READER8-1196' 14:25:18 INFO > PIPE_READER7[1]: TPT19003 ECI operator ID: 'PIPE_READER7-5348' 14:25:18 INFO > PIPE_READER2[1]: TPT19003 ECI operator ID: 'PIPE_READER2-5760' 14:25:18 INFO > PIPE_READER3[1]: TPT19003 ECI operator ID: 'PIPE_READER3-3300' 14:25:18 INFO > PIPE_READER6[1]: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_6'. 14:25:18 INFO > PIPE_READER8[1]: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_8'. 14:25:18 INFO > PIPE_READER7[1]: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_7'. 14:25:18 INFO > PIPE_READER3[1]: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_3'. 14:25:18 INFO > PIPE_READER2[1]: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_2'. 14:25:18 INFO > Teradata Parallel Transporter PIPE_READER1[1]: TPT19006 Version 15.00.00.00 14:25:18 INFO > PIPE_READER1[1]: TPT19010 Instance 1 directing private log report to 'dtacop-Home Work-5768-1'. 14:25:18 INFO > PIPE_READER4[1]: TPT19003 ECI operator ID: 'PIPE_READER4-6988' 14:25:18 INFO > PIPE_READER1[1]: TPT19003 NotifyMethod: 'None (default)' 14:25:18 INFO > PIPE_READER1[1]: TPT19008 DataConnector Producer operator Instances: 1 14:25:18 INFO > PIPE_READER5[1]: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_5'. 14:25:18 INFO > PIPE_READER4[1]: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_4'. 14:25:18 INFO > PIPE_READER1[1]: TPT19003 ECI operator ID: 'PIPE_READER1-5768' 14:25:18 INFO > PIPE_READER1[1]: TPT19222 Operator instance 1 processing file '\\.\pipe\LoadAsIs_1'. 
14:25:21 INFO > DATA_LOAD: connecting sessions 14:25:21 INFO > DATA_LOAD: The RDBMS retryable error code list was not found 14:25:21 INFO > DATA_LOAD: The job will use its internal retryable error codes 14:25:52 INFO > DATA_LOAD: preparing target table 14:25:58 INFO > DATA_LOAD: entering Acquisition Phase 14:26:01 INFO > PIPE_READER1[1]: TPT19435 pmRead failed. EOF encountered before end of record (35) 14:26:01 INFO > PIPE_READER2[1]: TPT19435 pmRead failed. EOF encountered before end of record (35) 14:26:01 INFO > PIPE_READER3[1]: TPT19435 pmRead failed. EOF encountered before end of record (35) 14:26:01 INFO > PIPE_READER1[1]: TPT19305 Fatal error reading data. 14:26:01 INFO > PIPE_READER2[1]: TPT19305 Fatal error reading data. 14:26:01 INFO > PIPE_READER3[1]: TPT19305 Fatal error reading data. 14:26:01 INFO > PIPE_READER1[1]: TPT19015 TPT Exit code set to 12. 14:26:01 INFO > PIPE_READER2[1]: TPT19015 TPT Exit code set to 12. 14:26:01 INFO > PIPE_READER3[1]: TPT19015 TPT Exit code set to 12. 14:26:01 INFO > PIPE_READER4[1]: TPT19435 pmRead failed. EOF encountered before end of record (35) 14:26:01 INFO > PIPE_READER5[1]: TPT19435 pmRead failed. EOF encountered before end of record (35) 14:26:01 INFO > PIPE_READER4[1]: TPT19305 Fatal error reading data. 14:26:01 INFO > PIPE_READER5[1]: TPT19305 Fatal error reading data. 14:26:01 INFO > PIPE_READER4[1]: TPT19015 TPT Exit code set to 12. 14:26:01 INFO > PIPE_READER6[1]: TPT19435 pmRead failed. EOF encountered before end of record (35) 14:26:01 INFO > PIPE_READER5[1]: TPT19015 TPT Exit code set to 12. 14:26:01 INFO > PIPE_READER6[1]: TPT19305 Fatal error reading data. 14:26:01 INFO > PIPE_READER6[1]: TPT19015 TPT Exit code set to 12. 14:26:01 INFO > PIPE_READER7[1]: TPT19435 pmRead failed. EOF encountered before end of record (35) 14:26:01 INFO > PIPE_READER8[1]: TPT19435 pmRead failed. EOF encountered before end of record (35) 14:26:01 INFO > PIPE_READER7[1]: TPT19305 Fatal error reading data. 
14:26:01 INFO > PIPE_READER8[1]: TPT19305 Fatal error reading data. 14:26:01 INFO > PIPE_READER7[1]: TPT19015 TPT Exit code set to 12. 14:26:01 INFO > PIPE_READER8[1]: TPT19015 TPT Exit code set to 12. 14:26:01 INFO > DATA_LOAD: disconnecting sessions 14:26:01 INFO > PIPE_READER2[1]: TPT19221 Total files processed: 0. 14:26:01 INFO > PIPE_READER3[1]: TPT19221 Total files processed: 0. 14:26:01 INFO > PIPE_READER5[1]: TPT19221 Total files processed: 0. 14:26:01 INFO > PIPE_READER6[1]: TPT19221 Total files processed: 0. 14:26:01 INFO > PIPE_READER4[1]: TPT19221 Total files processed: 0. 14:26:01 INFO > PIPE_READER1[1]: TPT19221 Total files processed: 0. 14:26:01 INFO > PIPE_READER8[1]: TPT19221 Total files processed: 0. 14:26:01 INFO > PIPE_READER7[1]: TPT19221 Total files processed: 0. 14:26:18 INFO > DATA_LOAD: Total processor time used = '0.312002 Second(s)' 14:26:18 INFO > DATA_LOAD: Start : Mon Aug 11 14:25:18 2014 14:26:18 INFO > DATA_LOAD: End : Mon Aug 11 14:26:18 2014 14:26:18 INFO > Job step LOAD_LOAD_AS_IS_TEST terminated (status 12) 14:26:18 INFO > Job LoadAsIs terminated (status 12) 14:26:18 INFO > Job start: Mon Aug 11 14:24:54 2014 14:26:18 INFO > Job end: Mon Aug 11 14:26:18 2014
Sorry for the long post.
Thanks for the help.
TPT - Scripts from 13.10 doesn't work with 15.10 using named pipes - response (1) by EricSantAnna
Before my workmates read this post: when I said production machine, I meant development machine...
TPT - Scripts from 13.10 doesn't work with 15.10 using named pipes - response (2) by feinholz
That is a known issue and has been fixed in these versions of TPT:
13.10.00.19
14.00.00.12
14.10.00.05
15.00.00.02
Fast load Error - response (3) by Rohit Ranjan
Thanks Dieter and Steve for your help. Much appreciated
FASTEXPORT Looping - response (6) by Jugalkishorebhatt1
Hi Raja,
You're right, it is a conditional statement; I was wrong to call it looping. What I don't understand is that the example has no section explaining how a conditional statement can be applied in FastExport. Could you post a small example script showing how the conditional statement is applied?
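For what it's worth, a conditional in a FastExport script usually takes this shape (a rough sketch only; the parameter file, variable name, and exact conditional syntax are assumptions and should be verified against the FastExport Reference):

```
.LOGTABLE utillog;
.LOGON tdpid/user,password;

/* Read a parameter written by an earlier job step */
.ACCEPT loadmode FROM FILE params.txt;

/* Run the export only when the parameter says FULL */
.IF '&loadmode' = 'FULL' THEN;
  .BEGIN EXPORT SESSIONS 4;
  .EXPORT OUTFILE emp_full.dat;
  SELECT * FROM emp;
  .END EXPORT;
.ENDIF;

.LOGOFF;
```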
TPT export timeout issue - forum topic by mkravikumar
A TPT extract that creates a very large file (about 100 GB uncompressed) times out after approximately 1 hour, having completed about 40% of the extract. This has happened repeatedly with different numbers of instances and different buffer sizes. This is the largest extract in our system; we didn't have this issue with any other extracts.
The error is:
EXPORT_OPERATOR: sending SELECT request
EXPORT_OPERATOR: TPT10508: RDBMS error 2594: One of the FastExport session has been logged off
EXPORT_OPERATOR: TPT10508: RDBMS error 2594: One of the FastExport session has been logged off
EXPORT_OPERATOR: disconnecting sessions
Any idea how to fix this? The database timeout is set to 20 minutes. Even changing the database timeout to 60 minutes didn't solve the problem.
Any thoughts/suggestions?
TPT_ISSUE - response (8) by alchang
I have the same issue. Could you help me find the solution? Thanks so much.
The error log:
FastLoad/MultiLoad/TPT Log start here
Try to open: /ETL/DEV/LOG/ext_rmd_discode.log
Teradata Parallel Transporter Version 14.00.00.10
Job log: /ETL/DEV/LOG/ext_rmd_discode-2348.out
Job id is ext_rmd_discode-2348, running on TDExpress1403_Sles10
Found CheckPoint file: /ETL/DEV/checkpoint/ext_rmd_discodeLVCP
This is a restart job; it restarts at step MAIN_STEP.
Teradata Parallel Transporter W_dc_op_loadext_rmd_discode: TPT19006 Version 14.00.00.10
W_dc_op_loadext_rmd_discode Instance 1 directing private log report to 'dtacop-etl-25716-1'.
W_dc_op_loadext_rmd_discode Instance 1 restarting.
W_dc_op_loadext_rmd_discode: TPT19008 DataConnector Producer operator Instances: 1
W_dc_op_loadext_rmd_discode: TPT19003 ECI operator ID: W_dc_op_loadext_rmd_discode-25716
Teradata Parallel Transporter Load Operator Version 14.00.00.10
W_l_op_loadext_rmd_discode: private log specified: /ETL/DEV/LOG/ext_rmd_discode.log-1
W_l_op_loadext_rmd_discode: connecting sessions
W_l_op_loadext_rmd_discode: preparing target table
W_l_op_loadext_rmd_discode: entering Acquisition Phase
W_dc_op_loadext_rmd_discode: TPT19404 pmOpen failed. Requested file not found (4)
W_dc_op_loadext_rmd_discode: TPT19304 Fatal error opening file.
W_dc_op_loadext_rmd_discode: TPT19003 TPT Exit code set to 12.
TPT_INFRA: TPT02263: Error: Operator restart error, status = Fatal Error
Task(SELECT_2[0001]): restart completed, status = Fatal Error
W_l_op_loadext_rmd_discode: disconnecting sessions
W_dc_op_loadext_rmd_discode: TPT19221 Total files processed: 0.
W_l_op_loadext_rmd_discode: Total processor time used = '0.44 Second(s)'
W_l_op_loadext_rmd_discode: Start : Thu Aug 14 10:36:52 2014
W_l_op_loadext_rmd_discode: End : Thu Aug 14 10:36:55 2014
Job step MAIN_STEP terminated (status 12)
Job ext_rmd_discode terminated (status 12)
Teradata to Oracle data transfer - forum topic by raju197402
Currently I have BTEQ scripts written to move data from schema1 to schema2. However, there is a plan to move schema2 from Teradata to an Oracle DB. Can BTEQ scripts be used to move data from Teradata to Oracle, or what is the best alternative for me to achieve this?
Thanks in advance,
Cheers
Teradata to Oracle data transfer - response (1) by VandeBergB
Fast export from TD and bulk load into ORA... or use a data warehouse automation tool like WhereScape RED.
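On the Oracle side, the bulk-load step would typically use SQL*Loader. A hypothetical control file for a pipe-delimited extract might look like this (file, table, and column names are placeholders):

```
-- hypothetical SQL*Loader control file for a FastExport extract
LOAD DATA
INFILE 'schema2_extract.txt'
APPEND
INTO TABLE schema2_target
FIELDS TERMINATED BY '|'
(col1, col2, col3)
```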
TPT_ISSUE - response (9) by feinholz
In the script, I will need you to add an attribute to the DataConnector operator definition, as:
VARCHAR TraceLevel = 'all'
then re-run the failing job and send me the .out file for that job (it can be found in the "logs" subdirectory in the TPT install directory).
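For reference, a minimal DataConnector operator definition with the attribute added might look like this (operator, schema, and file names here are placeholders, not taken from the failing script):

```
DEFINE OPERATOR FILE_READER
TYPE DATACONNECTOR PRODUCER
SCHEMA MY_SCHEMA
ATTRIBUTES
(
    VARCHAR FileName   = 'source.dat',
    VARCHAR Format     = 'Formatted',
    VARCHAR OpenMode   = 'Read',
    VARCHAR TraceLevel = 'all'   /* diagnostic attribute requested above */
);
```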
steven.feinholz@teradata.com
TPT scripts - forum topic by raju197402
Currently I have TPT scripts that transfer data from schema1 (Teradata box1) to schema2 (Teradata box2). There is a plan to move the contents of Teradata box2 to Oracle.
Can the TPT scripts be reused to transfer data to Oracle? If so, do they require any major changes to work with Oracle? If not, what is the alternative?
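The Load/Update operators in such scripts target Teradata only, so they cannot load Oracle directly. One common pattern (sketched below with placeholder names) is to keep the producer side of the job and swap the Teradata consumer for a DataConnector consumer that writes a delimited flat file, which an Oracle tool such as SQL*Loader can then load:

```
DEFINE OPERATOR FILE_WRITER
TYPE DATACONNECTOR CONSUMER
SCHEMA MY_SCHEMA
ATTRIBUTES
(
    VARCHAR FileName      = 'schema2_extract.txt',
    VARCHAR Format        = 'Delimited',
    VARCHAR TextDelimiter = '|',
    VARCHAR OpenMode      = 'Write'
);
```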
Thanks,
TPT export timeout issue - response (1) by dnoeth
Hi Ravikumar,
there's no timeout on the Teradata side (only for logons); otherwise it's always a restriction on the client side.
But this might be due to a TASM rule automatically aborting prolonged idle sessions.
You should check the query log for what actually happened and/or ask your DBA for help.
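For example, assuming DBQL is enabled, a query along these lines against DBC.QryLogV could show what happened to the FastExport sessions around the failure (the user name and date are placeholders):

```sql
SELECT QueryID, UserName, StartTime, FirstRespTime, ErrorCode, ErrorText
FROM DBC.QryLogV
WHERE UserName = 'ETL_USER'                        -- placeholder
  AND StartTime > TIMESTAMP '2014-08-11 00:00:00'  -- placeholder
ORDER BY StartTime;
```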
TPT - Scripts from 13.10 doesn't work with 15.10 using named pipes - response (4) by EricSantAnna
Where can I get 15.00.00.02?
I can only find 15.00.00.00 on this site (http://downloads.teradata.com/download/tools/teradata-tools-and-utilities-windows-installation-package).
I can't find "older", "non-stable", or "RC" versions of TTU or TPT anywhere.
TPT - Scripts from 13.10 doesn't work with 15.10 using named pipes - response (5) by feinholz
Efixes are not shipped to the Developer Exchange.
They are available on the patch server.
RDBMS error 3029: Associated LSN was not found - response (4) by Rufuce
How did you resolve the problem? I have the same issue.
TPT_INFRA TPT01057 Error Insufficient main storage for attempted allocation - forum topic by fabianoal
Hi guys,
I have no idea what could be causing this error. The TD docs (http://www.info.teradata.com/htmlpubs/DB_TTU_14_10/index.html#page/General_Reference/B035_1096_112K/TPT.41.1267.html) explain the message but don't say how to solve the problem.
The thing is: I wrote a vbs script to generate a TPT script and an amj file to migrate an entire SQLServer database to TD. After several adjustments, I've reached a script that works very well, but it can generate pretty big TPT scripts. The one that is causing the problem has almost 14k lines. Although the script is big, it is simple: basically a DDL operator, then one schema per table, then one DataConnector Producer (access module OLEDB_AXSMOD) per table (each one references a job in the .amj file), and finally one Load operator per table. Together with the script, I also create an amj file with one job per table (actually, a select statement which I use to do some transformations).
In the steps, I first call the DDL operator with some DDL instructions to drop and create the stage tables. Then come the steps to pull the data from SQLServer and push it into TD (one per table), and at the end a last call to the DDL operator renames the current tables to backup tables and the stage tables to the official tables.
As I said before, the generated script can get a little big, with hundreds of steps. With this script of 257 steps (apart from the initial and final steps), when it gets to about the 90th step I get the following message: "TPT_INFRA TPT01057 Error Insufficient main storage for attempted allocation".
That is the end: TPT hangs and I can't get anywhere from there. There isn't an exact point where it hangs; it can be at the 90th, 91st, 92nd step, etc. If I Ctrl+C the job and reissue the tbuild command, TPT processes one more step and again issues the TPT01057 message. The TPT infrastructure doesn't respond to commands like twbcmd <job> resume or twbcmd <job> pause (it just keeps saying it is processing the command), and twbstat says the job is running.
One important detail: this only happens between the end of a step and the start of the next step. In other words, it doesn't happen in the middle of a step.
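One workaround, independent of whatever is leaking storage inside TPT, is to have the generating vbs script emit several smaller jobs instead of one 257-step job, and run tbuild once per chunk. The helper below is a hypothetical illustration of that batching logic in Python (the function names, the 50-step limit, and the header/footer strings are assumptions for the sketch, not anything TPT mandates):

```python
def chunk_steps(steps, max_steps):
    """Group the generated STEP bodies into batches of at most max_steps."""
    return [steps[i:i + max_steps] for i in range(0, len(steps), max_steps)]

def build_scripts(header, steps, footer, max_steps=50):
    """Render one self-contained TPT script per batch; each script repeats
    the shared header (operator/schema definitions) and closing footer."""
    return ["\n".join([header] + batch + [footer])
            for batch in chunk_steps(steps, max_steps)]

# Example: 257 generated steps split into jobs of at most 50 steps each,
# to be run as separate tbuild invocations.
steps = ["STEP step_%d ( ... );" % i for i in range(1, 258)]
scripts = build_scripts("DEFINE JOB FILE_LOAD ( /* shared definitions */", steps, ");")
```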
This is one example of the output from tbuild:
Job step step_91 completed successfully Teradata Parallel Transporter src_operator_ativa_ModeloPapelTrabalho_TipoOs: TPT19006 Version 14.10.00.01 src_operator_ativa_ModeloPapelTrabalho_TipoOs Instance 1 directing private log report to 'producer_log-1'. src_operator_ativa_ModeloPapelTrabalho_TipoOs: TPT19003 NotifyMethod: 'None (default)' src_operator_ativa_ModeloPapelTrabalho_TipoOs: TPT19008 DataConnector Producer operator Instances: 1 Teradata Parallel Transporter Load Operator Version 14.10.00.01 load_operator_ativa_ModeloPapelTrabalho_TipoOs: private log specified: load_log_ativa_ModeloPapelTrabalho_TipoOs src_operator_ativa_ModeloPapelTrabalho_TipoOs: TPT19003 ECI operator ID: src_operator_ativa_ModeloPapelTrabalho_TipoOs-4624 load_operator_ativa_ModeloPapelTrabalho_TipoOs: connecting sessions src_operator_ativa_ModeloPapelTrabalho_TipoOs: TPT19222 Operator instance 1 processing file 'corp_ativa_acesso.amj'. load_operator_ativa_ModeloPapelTrabalho_TipoOs: preparing target table load_operator_ativa_ModeloPapelTrabalho_TipoOs: entering Acquisition Phase load_operator_ativa_ModeloPapelTrabalho_TipoOs: entering Application Phase load_operator_ativa_ModeloPapelTrabalho_TipoOs: Statistics for Target Table: 'Desenvolvimento.bdc_carga_ativa_ModeloPapelTra balho_TipoOs' load_operator_ativa_ModeloPapelTrabalho_TipoOs: Total Rows Sent To RDBMS: 99 load_operator_ativa_ModeloPapelTrabalho_TipoOs: Total Rows Applied: 99 load_operator_ativa_ModeloPapelTrabalho_TipoOs: Total Rows in Error Table 1: 0 load_operator_ativa_ModeloPapelTrabalho_TipoOs: Total Rows in Error Table 2: 0 load_operator_ativa_ModeloPapelTrabalho_TipoOs: Total Duplicate Rows: 0 src_operator_ativa_ModeloPapelTrabalho_TipoOs: TPT19221 Total files processed: 1. 
load_operator_ativa_ModeloPapelTrabalho_TipoOs: disconnecting sessions load_operator_ativa_ModeloPapelTrabalho_TipoOs: Total processor time used = '1.46641 Second(s)' load_operator_ativa_ModeloPapelTrabalho_TipoOs: Start : Mon Aug 18 18:53:19 2014 load_operator_ativa_ModeloPapelTrabalho_TipoOs: End : Mon Aug 18 18:53:30 2014 Job step step_92 completed successfully TPT_INFRA: TPT01057: Error: Insufficient main storage for attempted allocation
As you can see, after TPT completed the 92nd step, it hung. From tlogview:
Task(SELECT_2[0001]): checkpoint completed, status = Success Task(APPLY_1[0004]): checkpoint completed, status = Success Task(APPLY_1[0003]): checkpoint completed, status = Success Task(APPLY_1[0001]): checkpoint completed, status = Success Task(APPLY_1[0002]): checkpoint completed, status = Success load_operator_ativa_ModeloPapelTrabalho_TipoOs: entering Application Phase load_operator_ativa_ModeloPapelTrabalho_TipoOs: Statistics for Target Table: 'Desenvolvimento.bdc_carga_ativa_ModeloPapelTra balho_TipoOs' load_operator_ativa_ModeloPapelTrabalho_TipoOs: Total Rows Sent To RDBMS: 99 load_operator_ativa_ModeloPapelTrabalho_TipoOs: Total Rows Applied: 99 load_operator_ativa_ModeloPapelTrabalho_TipoOs: Total Rows in Error Table 1: 0 load_operator_ativa_ModeloPapelTrabalho_TipoOs: Total Rows in Error Table 2: 0 load_operator_ativa_ModeloPapelTrabalho_TipoOs: Total Duplicate Rows: 0 TPT_INFRA: TPT02255: Message Buffers Sent/Received = 1, Total Rows Received = 0, Total Rows Sent = 0 TPT_INFRA: TPT02255: Message Buffers Sent/Received = 0, Total Rows Received = 0, Total Rows Sent = 0 TPT_INFRA: TPT02255: Message Buffers Sent/Received = 0, Total Rows Received = 0, Total Rows Sent = 0 TPT_INFRA: TPT02255: Message Buffers Sent/Received = 0, Total Rows Received = 0, Total Rows Sent = 0 TPT_INFRA: TPT02255: Message Buffers Sent/Received = 1, Total Rows Received = 0, Total Rows Sent = 0 src_operator_ativa_ModeloPapelTrabalho_TipoOs: TPT19221 Total files processed: 1. 
load_operator_ativa_ModeloPapelTrabalho_TipoOs: disconnecting sessions load_operator_ativa_ModeloPapelTrabalho_TipoOs: Total processor time used = '1.46641 Second(s)' load_operator_ativa_ModeloPapelTrabalho_TipoOs: Start : Mon Aug 18 18:53:19 2014 load_operator_ativa_ModeloPapelTrabalho_TipoOs: End : Mon Aug 18 18:53:30 2014 Job step step_92 completed successfully TPT_INFRA: TPT01057: Error: Insufficient main storage for attempted allocation TPT_INFRA: TPT02813: Error: Failed to create the Job variable Teradata Parallel Transporter Executor Version 14.10.00.01 Teradata Parallel Transporter Coordinator Version 14.10.00.01 Teradata Parallel Transporter Executor Version 14.10.00.01 Teradata Parallel Transporter Executor Version 14.10.00.01 Teradata Parallel Transporter Executor Version 14.10.00.01 Teradata Parallel Transporter Executor Version 14.10.00.01
From the TPT script, here is a small extract (only the parts related to steps 92 and 93). Again, as I said before, it hangs after the 90th step, and not necessarily between the 91st and 92nd.
DEFINE JOB FILE_LOAD
DESCRIPTION 'Job corp_ativa_acesso'
(
    DEFINE OPERATOR DDL_Operator
    TYPE DDL
    ATTRIBUTES
    (
        VARCHAR PrivateLogName = 'ddl_log',
        VARCHAR UserName = 'DBADMIN',
        VARCHAR UserPassword = '*********',
        VARCHAR ARRAY ErrorList = ['3807','3803'],
        VARCHAR TdpId = 'maxcgu01-1-1'
    );

    DEFINE SCHEMA schema_ativa_ModeloPapelTrabalho_TipoOs
    (
        "IdModeloPapelTrabalho_TipoOs" INTEGER,
        "IdModeloPapelTrabalho" INTEGER,
        "IdTipoOs" INTEGER,
        "BolObrigatorio" SMALLINT,
        "IdPerfil" INTEGER
    );

    DEFINE SCHEMA schema_ativa_ModeloPapelTrabalhoVinculado
    (
        "IdModeloPapelTrabalhoVinculado" INTEGER,
        "IdModeloPapelTrabalhoA" INTEGER,
        "IdModeloPapelTrabalhoB" INTEGER
    );

    DEFINE OPERATOR src_operator_ativa_ModeloPapelTrabalho_TipoOs
    DESCRIPTION 'Odbc Operator para tabela [ativa].[ModeloPapelTrabalho_TipoOs]'
    TYPE DATACONNECTOR PRODUCER
    SCHEMA schema_ativa_ModeloPapelTrabalho_TipoOs
    ATTRIBUTES
    (
        VARCHAR AccessModuleName = 'OLEDB_AXSMOD',
        VARCHAR FileName = 'corp_ativa_acesso.amj',
        VARCHAR Format = 'Formatted',
        VARCHAR AccessModuleInitStr = 'noprompt jobid=92',
        VARCHAR OpenMode = 'Read',
        VARCHAR EnableScan = 'No',
        VARCHAR IndicatorMode = 'Yes',
        VARCHAR PrivateLogName = 'producer_log'
    );

    DEFINE OPERATOR src_operator_ativa_ModeloPapelTrabalhoVinculado
    DESCRIPTION 'Odbc Operator para tabela [ativa].[ModeloPapelTrabalhoVinculado]'
    TYPE DATACONNECTOR PRODUCER
    SCHEMA schema_ativa_ModeloPapelTrabalhoVinculado
    ATTRIBUTES
    (
        VARCHAR AccessModuleName = 'OLEDB_AXSMOD',
        VARCHAR FileName = 'corp_ativa_acesso.amj',
        VARCHAR Format = 'Formatted',
        VARCHAR AccessModuleInitStr = 'noprompt jobid=93',
        VARCHAR OpenMode = 'Read',
        VARCHAR EnableScan = 'No',
        VARCHAR IndicatorMode = 'Yes',
        VARCHAR PrivateLogName = 'producer_log'
    );

    DEFINE OPERATOR load_operator_ativa_ModeloPapelTrabalho_TipoOs
    TYPE LOAD
    SCHEMA schema_ativa_ModeloPapelTrabalho_TipoOs
    ATTRIBUTES
    (
        VARCHAR PrivateLogName = 'load_log_ativa_ModeloPapelTrabalho_TipoOs',
        VARCHAR UserName = 'DBADMIN',
        VARCHAR UserPassword = '*****',
        VARCHAR TdpId = 'maxcgu01-1-1',
        VARCHAR TargetTable = 'Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs',
        VARCHAR LogTable = 'Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs_log',
        VARCHAR ErrorTable1 = 'Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs_error_1',
        VARCHAR ErrorTable2 = 'Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs_error_2'
    );

    DEFINE OPERATOR load_operator_ativa_ModeloPapelTrabalhoVinculado
    TYPE LOAD
    SCHEMA schema_ativa_ModeloPapelTrabalhoVinculado
    ATTRIBUTES
    (
        VARCHAR PrivateLogName = 'load_log_ativa_ModeloPapelTrabalhoVinculado',
        VARCHAR UserName = 'DBADMIN',
        VARCHAR UserPassword = '******',
        VARCHAR TdpId = 'maxcgu01-1-1',
        VARCHAR TargetTable = 'Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado',
        VARCHAR LogTable = 'Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado_log',
        VARCHAR ErrorTable1 = 'Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado_error_1',
        VARCHAR ErrorTable2 = 'Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado_error_2'
    );

    STEP step_inicial_1
    (
        APPLY
        ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs;'),
        ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs_log'),
        ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs_error_1'),
        ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs_error_2'),
        ('create table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs( IdModeloPapelTrabalho_TipoOs INTEGER,IdModeloPapelTrabalho INTEGER,IdTipoOs INTEGER,BolObrigatorio SMALLINT,IdPerfil INTEGER )UNIQUE PRIMARY INDEX(IdModeloPapelTrabalho_TipoOs)'),
        ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado;'),
        ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado_log'),
        ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado_error_1'),
        ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado_error_2'),
        ('create table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado( IdModeloPapelTrabalhoVinculado INTEGER,IdModeloPapelTrabalhoA INTEGER,IdModeloPapelTrabalhoB INTEGER )UNIQUE PRIMARY INDEX(IdModeloPapelTrabalhoVinculado)')
        TO OPERATOR (DDL_Operator);
    );

    STEP step_92
    (
        APPLY
        ('Insert into Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs ("IdModeloPapelTrabalho_TipoOs","IdModeloPapelTrabalho","IdTipoOs","BolObrigatorio","IdPerfil") values (:"IdModeloPapelTrabalho_TipoOs",:"IdModeloPapelTrabalho",:"IdTipoOs",:"BolObrigatorio",:"IdPerfil")')
        TO OPERATOR (load_operator_ativa_ModeloPapelTrabalho_TipoOs[4])
        SELECT "IdModeloPapelTrabalho_TipoOs","IdModeloPapelTrabalho","IdTipoOs","BolObrigatorio","IdPerfil"
        FROM OPERATOR (src_operator_ativa_ModeloPapelTrabalho_TipoOs);
    );

    STEP step_93
    (
        APPLY
        ('Insert into Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado ("IdModeloPapelTrabalhoVinculado","IdModeloPapelTrabalhoA","IdModeloPapelTrabalhoB") values (:"IdModeloPapelTrabalhoVinculado",:"IdModeloPapelTrabalhoA",:"IdModeloPapelTrabalhoB")')
        TO OPERATOR (load_operator_ativa_ModeloPapelTrabalhoVinculado[4])
        SELECT "IdModeloPapelTrabalhoVinculado","IdModeloPapelTrabalhoA","IdModeloPapelTrabalhoB"
        FROM OPERATOR (src_operator_ativa_ModeloPapelTrabalhoVinculado);
    );

    STEP step_final_1
    (
        APPLY
        ('drop table Desenvolvimento.bdc_ativa_ModeloPapelTrabalho_TipoOs_bkp;'),
        ('rename table Desenvolvimento.bdc_ativa_ModeloPapelTrabalho_TipoOs to Desenvolvimento.bdc_ativa_ModeloPapelTrabalho_TipoOs_bkp;'),
        ('rename table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs to Desenvolvimento.bdc_ativa_ModeloPapelTrabalho_TipoOs;'),
        ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs_log'),
        ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs_error_1'),
        ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalho_TipoOs_error_2'),
        ('drop table Desenvolvimento.bdc_ativa_ModeloPapelTrabalhoVinculado_bkp;'),
        ('rename table Desenvolvimento.bdc_ativa_ModeloPapelTrabalhoVinculado to Desenvolvimento.bdc_ativa_ModeloPapelTrabalhoVinculado_bkp;'),
        ('rename table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado to Desenvolvimento.bdc_ativa_ModeloPapelTrabalhoVinculado;'),
        ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado_log'),
        ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado_error_1'),
        ('drop table Desenvolvimento.bdc_carga_ativa_ModeloPapelTrabalhoVinculado_error_2')
        TO OPERATOR (DDL_Operator);
    );
);
And finally, a piece of the .amj file:
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?OLE_DB_AXSMOD_FirstCompatibleVersion 14.00.00.00?>
<!--Configuration information for the OLE DB AXSMOD-->
<OLE_DB_AXSMOD_Jobs>
  <Job Id="92">
    <Source>
      <DataSourceParseName>{397C2819-8272-4532-AD3A-FB5E43BEAA39}<!--SQL Server Native Client 11.0 (SQLNCLI11)--></DataSourceParseName>
      <DataSourceProperties>
        <PropertySet>
        ...
        </PropertySet>
      </DataSourceProperties>
      <TableCommand>SELECT [IdModeloPapelTrabalho_TipoOs],[IdModeloPapelTrabalho],[IdTipoOs],convert(smallint,[BolObrigatorio]) as [BolObrigatorio],[IdPerfil] FROM [ativa].[ModeloPapelTrabalho_TipoOs]</TableCommand>
      <Columns>
        <Column><Selected/><SourceName>IdModeloPapelTrabalho_TipoOs</SourceName><DestinationName>IdModeloPapelTrabalho_TipoOs</DestinationName><TypeName>INTEGER</TypeName></Column>
        <Column><Selected/><SourceName>IdModeloPapelTrabalho</SourceName><DestinationName>IdModeloPapelTrabalho</DestinationName><TypeName>INTEGER</TypeName></Column>
        <Column><Selected/><SourceName>IdTipoOs</SourceName><DestinationName>IdTipoOs</DestinationName><TypeName>INTEGER</TypeName></Column>
        <Column><Selected/><SourceName>BolObrigatorio</SourceName><DestinationName>BolObrigatorio</DestinationName><TypeName>SMALLINT</TypeName></Column>
        <Column><Selected/><SourceName>IdPerfil</SourceName><DestinationName>IdPerfil</DestinationName><TypeName>INTEGER</TypeName></Column>
      </Columns>
      <LocationOfLogTables>0<!--User's default database 0, Source database 1, Other database 2--></LocationOfLogTables>
      <OtherDatabase/>
      <CharDataTransferUTF8>1<!--OldMethod 0, NewMethod 1--></CharDataTransferUTF8>
    </Source>
    <CharacterEncoding>ASCII</CharacterEncoding>
    <CheckpointInterval/>
    <LargeDecimalSupport>Supported</LargeDecimalSupport>
    <RowsPerFetch>15000</RowsPerFetch>
    <BufferSize/>
  </Job>
  <Job Id="93">
    <Source>
      <DataSourceParseName>{397C2819-8272-4532-AD3A-FB5E43BEAA39}<!--SQL Server Native Client 11.0 (SQLNCLI11)--></DataSourceParseName>
      <DataSourceProperties>
        <PropertySet>
        ...
        </PropertySet>
      </DataSourceProperties>
      <TableCommand>SELECT [IdModeloPapelTrabalhoVinculado],[IdModeloPapelTrabalhoA],[IdModeloPapelTrabalhoB] FROM [ativa].[ModeloPapelTrabalhoVinculado]</TableCommand>
      <Columns>
        <Column><Selected/><SourceName>IdModeloPapelTrabalhoVinculado</SourceName><DestinationName>IdModeloPapelTrabalhoVinculado</DestinationName><TypeName>INTEGER</TypeName></Column>
        <Column><Selected/><SourceName>IdModeloPapelTrabalhoA</SourceName><DestinationName>IdModeloPapelTrabalhoA</DestinationName><TypeName>INTEGER</TypeName></Column>
        <Column><Selected/><SourceName>IdModeloPapelTrabalhoB</SourceName><DestinationName>IdModeloPapelTrabalhoB</DestinationName><TypeName>INTEGER</TypeName></Column>
      </Columns>
      <LocationOfLogTables>0<!--User's default database 0, Source database 1, Other database 2--></LocationOfLogTables>
      <OtherDatabase/>
      <CharDataTransferUTF8>1<!--OldMethod 0, NewMethod 1--></CharDataTransferUTF8>
    </Source>
    <CharacterEncoding>ASCII</CharacterEncoding>
    <CheckpointInterval/>
    <LargeDecimalSupport>Supported</LargeDecimalSupport>
    <RowsPerFetch>15000</RowsPerFetch>
    <BufferSize/>
  </Job>
</OLE_DB_AXSMOD_Jobs>
Any ideas?
IF ELSE is a conditional statement. You read the logic like this:
In the following example, the user has created the table named &TABLE and a variable named
CREATERC, into which is set the system return code resulting from the execution of the
CREATE TABLE statement:
.SET CREATERC TO &SYSRC;
.IF &CREATERC = 3803 /* Table &TABLE exists */ THEN;
.RUN FILE RUN01;
.ELSE;
.IF &CREATERC <> 0 THEN;
.LOGOFF &CREATERC;
.ENDIF;
.ENDIF;
Looping is achieved with for, while, or do-while constructs, or by calling a program recursively. FastExport itself has no loop construct, so you could call the FastExport script from a Unix shell script, applying whatever conditions or loop your requirement demands.
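As a rough illustration of that idea, here is a minimal Unix shell sketch that loops over a list of tables and generates one FastExport script per table. All table names, paths, and logon details are placeholders, and the fexp invocation is left commented out so the sketch can run on a machine without TTU installed:

```shell
#!/bin/sh
# Loop over a (hypothetical) list of tables; for each one, generate a
# FastExport script via a here-document, then run it with fexp.
for TBL in tab1 tab2 tab3; do
  SCRIPT=/tmp/fexp_${TBL}.fx
  cat > "$SCRIPT" <<EOF
.LOGTABLE mydb.fexp_log_${TBL};
.LOGON tdpid/user,password;
.BEGIN EXPORT SESSIONS 4;
.EXPORT OUTFILE /tmp/${TBL}.out;
SELECT * FROM mydb.${TBL};
.END EXPORT;
.LOGOFF;
EOF
  # fexp < "$SCRIPT"   # uncomment on a machine with TTU installed
done
```

The shell does the looping and conditional logic (you could just as well test a previous run's return code with an if before each iteration), while each FastExport invocation stays a plain, single-purpose script.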