Using 10g datapump and scheduler to copy schemas

Update 13-Feb-07: Setting the datapump table_exists_action to replace does not overwrite views, sequences, or plsql objects. They have to be dropped separately before datapump is called; see gotchas and code below.

Update 02-Mar-07: dba_ views rather than all_ views have to be used to identify what objects to drop prior to datapump, for reasons explained in this post.

For a nightly process to copy small schemas from one database to another, the normal (old-style) way is export/import in a cron-controlled shell script, including a SQL-generating-SQL script that drops all the target schema objects before the import runs.

But that presents problems in RAC, which is what I was working on – which node does the cron script reside on? What happens if that node is down?

An option would be to have the script run on another machine, one unrelated to the RAC cluster, but that introduces more points of failure, and means sending data back and forward over sql*net.

So instead I used Oracle 10g's scheduler and datapump, as below. This data pumps straight over a database link, without having to write to a dump file in between, which is nice. And because it is all in the database, it ought to be unaffected by particular RAC nodes being down.

Some gotchas:

  • The documentation for dbms_datapump isn’t at all clear on what the syntax for SCHEMA_LIST should be. It should be this type of format:
    name => 'SCHEMA_LIST' , value => '''SCHEMA1'',''SCHEMA2'''

    which is a lot of quotes. Thanks to Sunit Joshi for that tip.

  • The dbms_datapump documentation has a bug: dbms_datapump.open does not have a parameter called “mode“, it has one called “job_mode” instead.
  • When calling dbms_datapump from within a stored procedure, as opposed to an anonymous pl/sql block, the owner of the stored procedure has to have had “create table” privilege granted to them explicitly, rather than through a role. That’s detailed in metalink note 315488.1, but I wasted a lot of time trying to debug that before I went on to metalink. The error reported doesn’t help much:
    ERROR at line 1:
    ORA-31626: job does not exist
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 911
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 4330
    ORA-06512: at "NIGHTLY_IMPORT", line 4
    ORA-06512: at line 1
  • Jobs are created disabled by default with dbms_scheduler, you have to specify “enabled => TRUE” to get them to run.
  • new 13-Feb-07 Setting table_exists_action to ‘replace’ overwrites any existing tables ok, but does not replace view definitions, plsql objects, sequences, and so on. They all have to be explicitly dropped before datapump is called. That is a tricky, counterintuitive limitation of the table_exists_action replace functionality – why would you want new tables copied over but old sequences left in place? It is surprises like this that make old-fashioned (but well understood) exp/imp still worth using.
  • new 13-Feb-07 Be aware that the dropped tables will stay on in the recyclebin until purged.
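As a sketch of how to deal with that last gotcha: the tables dropped by the replace action sit in the recyclebin, consuming space, until purged. Something like the following clears them (purging the whole database recyclebin needs DBA privilege; schema names are the ones used in the import below):

```sql
-- Connected as a DBA: purge every schema's recyclebin in one go.
purge dba_recyclebin ;

-- Or, connected as one of the imported schemas (e.g. WHSUSR),
-- purge just that schema's own recyclebin:
purge recyclebin ;
```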


Here’s my code. I included a check against a flag table which allows developers to switch the job on and off:

create table run_copy ( run_copy varchar2(3) ) tablespace users ;
grant select, insert, update, delete, references on run_copy to public ;
create public synonym run_copy for run_copy ;
insert into run_copy ( run_copy ) values ( 'Yes' ) ;
create database link nightly_import_sourcedb connect to username identified by password using 'sourcedb' ;
-- procedure owner needs explicit create table
grant create table to username ;
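With the run_copy flag table above, developers can switch the nightly copy off and on without touching the scheduler job itself. A minimal sketch, assuming the table only ever holds the single row inserted above:

```sql
-- Switch the nightly import off:
update run_copy set run_copy = 'No' ;
commit ;

-- Switch it back on again:
update run_copy set run_copy = 'Yes' ;
commit ;
```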

create or replace procedure nightly_import as
  -- Andrew Fraser 30 January 2007
  run_import NUMBER ;
  hand NUMBER ;
begin
  run_import := 0 ;
  SELECT COUNT(*) INTO run_import FROM run_copy WHERE UPPER(run_copy) IN ('N','NO') ;
  IF run_import > 0 THEN
    -- don't run import if someone has set NO flag in run_copy table
    NULL ;
  ELSE
    -- first have to drop plsql and other objects that table_exists_action
    -- does not deal with. Use dba_objects rather than all_objects (per the
    -- 02-Mar-07 update above), restricted to the schemas being imported.
    FOR d1 IN ( select object_type, owner, object_name from dba_objects
      where object_type in ('FUNCTION','PACKAGE','PROCEDURE','SEQUENCE','SYNONYM','VIEW')
      and owner in ('BACKUP','GATEKEEPER','REFDATA','TRANSDATA','WHSUSR','WHS_VIEWER') )
    LOOP
      execute immediate 'drop '||d1.object_type||' '||d1.owner||'.'||d1.object_name ;
    END LOOP ;
    -- then run datapump import itself
    hand := dbms_datapump.open (
      operation    => 'IMPORT' ,
      job_mode     => 'SCHEMA' ,
      remote_link  => 'NIGHTLY_IMPORT_SOURCEDB' ) ;
    dbms_datapump.metadata_filter (
      handle       => hand ,
      name         => 'SCHEMA_LIST' ,
      value        => '''BACKUP'',''GATEKEEPER'',''REFDATA'',''TRANSDATA'',''WHSUSR'',''WHS_VIEWER''' ) ;
    dbms_datapump.set_parameter (
      handle       => hand ,
      name         => 'TABLE_EXISTS_ACTION' ,
      value        => 'REPLACE' ) ;
    dbms_datapump.start_job(hand) ;
  END IF ;
end ;
/

begin
  dbms_scheduler.create_job (
    job_name           =>  'run_nightly_import' ,
    job_type           =>  'STORED_PROCEDURE' ,
    job_action         =>  'nightly_import' ,
    start_date         =>  TRUNC(sysdate+1)+4/24 , /* start 4am tomorrow */
    repeat_interval    =>  'FREQ=DAILY; INTERVAL=1', /* run every night */
    enabled            =>  TRUE ,
    comments           =>  'Andrew Fraser 30 January 2007');
end ;
/
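Once the job is created, its status and run history can be checked from the scheduler views. A sketch, assuming access to the dba_ views:

```sql
-- Is the job enabled, and when will it next run?
select job_name, enabled, state, next_run_date
from dba_scheduler_jobs
where job_name = 'RUN_NIGHTLY_IMPORT' ;

-- Did last night's run succeed?
select log_date, status, error#
from dba_scheduler_job_run_details
where job_name = 'RUN_NIGHTLY_IMPORT'
order by log_date desc ;
```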

Even with the code for this method being so short, I’m not sure I would always want to use it over cron and old-fashioned export/import. But for RAC, and also for Windows servers (which lack a decent shell scripting language), this is the way to go.

8 comments on “Using 10g datapump and scheduler to copy schemas”
  1. Adoer says:

    Nice script…

    How about an Export?

  2. This was on RAC, so an export/import running from cron on one node wouldn’t work when that node was down. Would have to introduce failover detection or run the export/import on a third server over sqlnet.

    But given the issues I ran into doing this with datapump compared to export, it would have been easier – and more supportable for other DBAs – to stick with old-fashioned export/import. Datapump does not work as easily in practice as the documentation implies.

  3. Rob Prince says:

    I am trying to use a db link to perform an import, but am running into various errors.

    Do you know where I could find a list of the required privileges I’d need to use dbms_datapump from a stored procedure to import a schema using a db link?


  4. Moise says:

    Hello Rob,

    I have just installed this procedure, and I granted these privileges and roles:

    GRANT create table TO MYUSER;
    GRANT create session TO MYUSER;

  5. vishnu says:

    Great, thanks. That solved my problem.
    However, I am running two different schema exports in one package. The current user/schema exports successfully, but the next schema, formed dynamically from some input, is not exported. I don't see any errors, but the export doesn't happen either.

    I tried granting permissions as you said to the second user/schema as well. No result! Any guess about it?

  6. Sam says:

    Hi Andrew,

    Firstly thanks a lot for pointing out that other objects like procedures and functions need to be manually cleaned up. Is it the same case with using “Import/Export” Utility, or will it replace all the existing objects with the new ones?


  7. Robert says:

    That tip about the CREATE TABLE privilege was just what I needed. Thanks!

  8. Sven says:

    Hi, Andrew,

    Thanks for sharing this tip. I would really like to know how to generate a report and/or log file from this scheduled process, because it was not possible to obtain the updated schemas in the target Oracle instance as described.

    Please advise, as soon as possible.

    Thanks in advance.
