DataPump
Pre-Requisites
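None of the commands on this page will work until a DIRECTORY object exists in the database and the connecting user can read and write it. A minimal sketch, where ORAOUT, the OS path and SCOTT are all example values:

```sql
-- Run as a DBA. ORAOUT, the OS path and SCOTT are example values.
CREATE OR REPLACE DIRECTORY ORAOUT AS '/u01/app/oracle/ora_out';
GRANT READ, WRITE ON DIRECTORY ORAOUT TO scott;
```

To export or import schemas other than your own you will also need the DATAPUMP_EXP_FULL_DATABASE / DATAPUMP_IMP_FULL_DATABASE roles.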
Export
Single Schema Export
export SCHEMA=schema1
export OUTDIR=ORAOUT
expdp SCHEMAS=${SCHEMA} DIRECTORY=${OUTDIR} DUMPFILE=${SCHEMA}.dmp LOGFILE=${SCHEMA}.explog
...where schema1 is an example Schema name and ORAOUT is an example Directory name.
Single Schema Export (no data)
To export a schema definition only (without any data):
export SCHEMA=schema1
export OUTDIR=ORAOUT
expdp SCHEMAS=${SCHEMA} DIRECTORY=${OUTDIR} DUMPFILE=${SCHEMA}.dmp LOGFILE=${SCHEMA}.explog CONTENT=metadata_only
Multiple Schema Export
export SCHEMA=schema1,schema2,schema3,etc
export DMPNAME=NameOfDumpfileWithoutDmpSuffix
export OUTDIR=ORAOUT
expdp SCHEMAS=${SCHEMA} DIRECTORY=${OUTDIR} DUMPFILE=${DMPNAME}.dmp LOGFILE=${DMPNAME}.explog
Single Table Export
export SCHEMA=schema1
export OUTDIR=ORAOUT
export TABLE=table
expdp TABLES=${SCHEMA}.${TABLE} DIRECTORY=${OUTDIR} DUMPFILE=${TABLE}.dmp LOGFILE=${TABLE}.explog
NOTE: if you connect as the table owner you can omit the schema from the TABLES= section.
Multiple Table Export (PARFILE)
cat > /tmp/parfile.txt <<EOF
SCHEMAS=schema1
DIRECTORY=ORAOUT
DUMPFILE=dumpfile.dmp
INCLUDE=TABLE:"IN ('table1','table2','table3','etc')"
LOGFILE=dumpfile.explog
EOF
expdp PARFILE=/tmp/parfile.txt
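If the table list changes often, the parfile can be generated rather than typed by hand. A sketch, assuming a space-separated table list (all names below are example values):

```shell
#!/bin/sh
# Sketch: build the PARFILE from a space-separated table list.
# SCHEMA, OUTDIR, DMPNAME and the table names are example values.
SCHEMA=schema1
OUTDIR=ORAOUT
DMPNAME=dumpfile
TABLES="table1 table2 table3"

# Turn "table1 table2 table3" into 'table1','table2','table3'
TABLE_LIST=$(printf ",'%s'" ${TABLES})
TABLE_LIST=${TABLE_LIST#,}

cat > /tmp/parfile.txt <<EOF
SCHEMAS=${SCHEMA}
DIRECTORY=${OUTDIR}
DUMPFILE=${DMPNAME}.dmp
INCLUDE=TABLE:"IN (${TABLE_LIST})"
LOGFILE=${DMPNAME}.explog
EOF
```

Then run expdp PARFILE=/tmp/parfile.txt as before.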
Compression
If you have a license for the Advanced Compression option, you can add this parameter to your exports...
COMPRESSION=ALL
Consistent Export
CONSISTENT=Y
...which is a legacy (original exp) parameter that expdp (11.2 onwards) translates into a flashback parameter for you. It gives the same result as this (without the work of finding out the SCN)...
FLASHBACK_SCN=SCN
Alternatively use something like...
FLASHBACK_TIME=SYSTIMESTAMP
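If you do want to pin the export to an explicit SCN, the current SCN can be read from the source database immediately before the export starts:

```sql
-- Current SCN of the source database.
SELECT current_scn FROM v$database;
-- Equivalent, via the flashback package:
SELECT dbms_flashback.get_system_change_number FROM dual;
```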
Package Export
The easiest way to do this is to use SQL Developer. Right click on the Package and select "export". This will start a Wizard to guide you through the steps to create an SQL file that can be run to recreate the package.
User Export
If you want to copy a user from one instance to another without taking any objects then the easiest way to do this is by using DBMS_METADATA. See the "Users" page for an example of doing this.
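As a sketch of the DBMS_METADATA approach (SCOTT is an example user; see the "Users" page for the full version):

```sql
-- CREATE USER statement for the user.
SELECT dbms_metadata.get_ddl('USER','SCOTT') FROM dual;
-- System privileges and role grants need separate calls.
SELECT dbms_metadata.get_granted_ddl('SYSTEM_GRANT','SCOTT') FROM dual;
SELECT dbms_metadata.get_granted_ddl('ROLE_GRANT','SCOTT') FROM dual;
```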
Import
Single Schema Import
export SCHEMA=schema1
export OUTDIR=ORAOUT
impdp SCHEMAS=${SCHEMA} DIRECTORY=${OUTDIR} DUMPFILE=${SCHEMA}.dmp LOGFILE=${SCHEMA}.implog
Multiple Schema Import
export SCHEMA=schema1,schema2,schema3,etc
export DMPNAME=NameOfDumpfileWithoutDmpSuffix
export OUTDIR=ORAOUT
impdp SCHEMAS=${SCHEMA} DIRECTORY=${OUTDIR} DUMPFILE=${DMPNAME}.dmp LOGFILE=${DMPNAME}.implog
...or, to exclude grants and synonyms...
impdp SCHEMAS=${SCHEMA} DIRECTORY=${OUTDIR} DUMPFILE=${DMPNAME}.dmp LOGFILE=${DMPNAME}.implog EXCLUDE=GRANT,SYNONYM
Table Import
export SCHEMA=schema1
export TABLE=table1
export OUTDIR=ORAOUT
impdp TABLES=${SCHEMA}.${TABLE} DIRECTORY=${OUTDIR} DUMPFILE=${TABLE}.dmp LOGFILE=${TABLE}.implog
Multiple Table Import (PARFILE)
cat > /tmp/parfile.txt <<EOF
SCHEMAS=schema1
DIRECTORY=ORAOUT
DUMPFILE=dumpfile.dmp
INCLUDE=TABLE:"IN ('table1','table2','table3','etc')"
LOGFILE=dumpfile.implog
REMAP_TABLESPACE=source_ts:target_ts
EOF
impdp PARFILE=/tmp/parfile.txt
Schema Duplicate
See also: "Schema Duplication Notes" later on this page for some checks to avoid common errors.
To duplicate a schema from an existing Schema Export file...
export SOURCE_SCHEMA=schema1
export TARGET_SCHEMA=schema2
export SOURCE_TS=tablespace1
export TARGET_TS=tablespace2
export OUTDIR=ORAOUT
impdp SCHEMAS=${SOURCE_SCHEMA} DIRECTORY=${OUTDIR} DUMPFILE=${SOURCE_SCHEMA}.dmp LOGFILE=${SOURCE_SCHEMA}.duplog \
REMAP_SCHEMA=${SOURCE_SCHEMA}:${TARGET_SCHEMA} REMAP_TABLESPACE=${SOURCE_TS}:${TARGET_TS}
NOTE: For an alternative way of achieving this see the DataPump API section below.
Generate DDL
NOTE: This does NOT load anything into the target database; it only writes the DDL that would have been executed to the SQLFILE.
export SCHEMA=schema1,schema2,schema3,etc
export DMPNAME=NameOfDumpfileWithoutDmpSuffix
export OUTDIR=ORAOUT
impdp SCHEMAS=${SCHEMA} DIRECTORY=${OUTDIR} DUMPFILE=${DMPNAME}.dmp LOGFILE=${DMPNAME}.ddllog SQLFILE=${DMPNAME}.sql
DataPump API
Schema Duplicate
See also: "Schema Duplication Notes" later on this page for some checks to avoid common errors.
The example below gives the minimum recommended commands to clone an existing schema to a new schema.
NOTE: If the target schema already exists, you should consider using the TABLE_EXISTS_ACTION Parameter.
NOTE: If you require the export to be consistent, then you should specify the FLASHBACK_SCN Parameter value.
NOTE: If you need to exclude certain parts of the export, then you should use the METADATA_FILTER.
DECLARE
  jobhandle NUMBER;
BEGIN
  jobhandle := DBMS_DATAPUMP.OPEN(operation   => 'IMPORT',
                                  job_mode    => 'SCHEMA',
                                  remote_link => 'DBLINKNAME',
                                  job_name    => 'SOURCE_SCHEMA_CLONE');
  DBMS_DATAPUMP.ADD_FILE(handle    => jobhandle,
                         filename  => 'SOURCE_SCHEMA_CLONE.log',
                         directory => 'ORAOUT',
                         filetype  => DBMS_DATAPUMP.ku$_file_type_log_file);
  DBMS_DATAPUMP.METADATA_FILTER(jobhandle,'SCHEMA_LIST','''SOURCE_SCHEMA''');
  DBMS_DATAPUMP.METADATA_REMAP(jobhandle,'REMAP_SCHEMA','SOURCE_SCHEMA','TARGET_SCHEMA');
  DBMS_DATAPUMP.METADATA_REMAP(jobhandle,'REMAP_TABLESPACE','SOURCE_TS','TARGET_TS');
  DBMS_DATAPUMP.START_JOB(jobhandle);
END;
/
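START_JOB returns once the job is underway, so the anonymous block above completes before the clone finishes. Progress can be watched from another session; a sketch (job_name as in the example above):

```sql
-- Watch the running job from any session.
SELECT job_name, state, degree
FROM   dba_datapump_jobs
WHERE  job_name = 'SOURCE_SCHEMA_CLONE';
```

Alternatively, call DBMS_DATAPUMP.WAIT_FOR_JOB(jobhandle, jobstate) after START_JOB to block until the job completes.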
...where DBLINKNAME is an example Database Link to the source instance...
...and SOURCE_SCHEMA_CLONE is an example job_name...
...and ORAOUT is an example Directory name (target for the log file only)...
...and SOURCE_SCHEMA is an example source Schema and TARGET_SCHEMA is an example target Schema...
...and SOURCE_TS and TARGET_TS are example Tablespace names.
Some useful optional parameters (to be placed before DBMS_DATAPUMP.START_JOB in the example above)...
DBMS_DATAPUMP.METADATA_FILTER(jobhandle,'EXCLUDE_PATH_EXPR','IN (''GRANT'')');
...where the path expression excludes grants from the import.
DBMS_DATAPUMP.SET_PARAMETER(jobhandle,'FLASHBACK_SCN',999999999999);
...where 999999999999 is an example SCN.
DBMS_DATAPUMP.SET_PARAMETER(jobhandle,'TABLE_EXISTS_ACTION','REPLACE');
...to replace existing objects in the target schema with objects from the export file.
Schema Duplication Notes
Characterset
Run on both Source and Target databases...
SELECT value
FROM NLS_DATABASE_PARAMETERS
WHERE parameter='NLS_CHARACTERSET';
If the target database has a characterset that is different to the source database then there could be data loss (unless the source database characterset is a subset of the target database characterset). For example, if you export from a database with WE8MSWIN1252 and import into a database with WE8ISO8859P1 then any occurrences of the Euro symbol will be lost. You are likely to get errors in the DataPump import log (e.g. ORA-39346).
Grants
Run on the Source database...
SELECT DISTINCT grantee
FROM dba_tab_privs
WHERE owner = '&SCHEMA';
If the grantees do not exist in the target database then you will get errors in the DataPump import log (e.g. ORA-01917).
Synonyms
Run on the Source database...
SET LINESIZE 132
SET PAGESIZE 40
COLUMN SYNONYM_NAME FORMAT A30
COLUMN TABLE_OWNER FORMAT A30
COLUMN TABLE_NAME FORMAT A30
SELECT synonym_name, table_owner, table_name
FROM dba_synonyms
WHERE owner = '&&SCHEMA'
AND table_owner != '&SCHEMA';
If the table_owners and table_names do not exist in the target database then you will get errors in the DataPump import log (e.g. ORA-39082).
Troubleshooting
Poor Performance
If DataPump export becomes very slow then check "EXPDP And IMPDP Slow Performance In 11gR2 and 12cR1 And Waits On Streams AQ: Enqueue Blocked On Low Memory (Doc ID 1596645.1)"
ALTER SYSTEM SET STREAMS_POOL_SIZE=200M SCOPE=BOTH;
If increasing STREAMS_POOL_SIZE does not help then you may also have to restart the database.
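The current value can be checked before (and after) the change:

```sql
-- Current streams pool size in MB (0 means it is auto-managed within the SGA).
SELECT value/1024/1024 AS streams_pool_mb
FROM   v$parameter
WHERE  name = 'streams_pool_size';
```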
ORA-02304: invalid object identifier literal
ORA-39083: Object type TYPE failed to create with error:
ORA-02304: invalid object identifier literal
CAUSE
During Schema Duplication... The object types already exist in the source schema. When the types are exported, the object identifier (OID) of each type is exported with it. Within the current architecture, the object identifier needs to be unique in the database. During import (into the same database), we try to create the same object types in the target schema. As the OIDs of the types already exist in the source schema, the types cannot be created in the target schema.
SOLUTION
Use the Import DataPump parameter TRANSFORM which enables you to alter object creation DDL for the types. The value N for the transform name OID implies that a new OID is assigned.
export SOURCE_SCHEMA=schema1
export TARGET_SCHEMA=schema2
export SOURCE_TS=tablespace1
export TARGET_TS=tablespace2
export OUTDIR=ORAOUT
impdp SCHEMAS=${SOURCE_SCHEMA} DIRECTORY=${OUTDIR} DUMPFILE=${SOURCE_SCHEMA}.dmp LOGFILE=${SOURCE_SCHEMA}.duplog \
REMAP_SCHEMA=${SOURCE_SCHEMA}:${TARGET_SCHEMA} TRANSFORM=oid:n
ORA-39346: data loss in character set conversion for object
ORA-39346: data loss in character set conversion for object PROCOBJ:"OWNER"."OBJECT_NAME"
CAUSE
NLS_CHARACTERSET mismatch between source and target databases during DataPump Import.
Check with:
SELECT value FROM NLS_DATABASE_PARAMETERS WHERE parameter='NLS_CHARACTERSET';
SOLUTION
Ideally, both databases should be converted to a common characterset before you start. For 12.1+ it is recommended that this is AL32UTF8.
ORA-01917: user or role 'NAME' does not exist
ORA-39083: Object type OBJECT_GRANT failed to create with error:
ORA-01917: user or role 'NAME' does not exist
CAUSE
A User or Role does not exist in the target database so the GRANT cannot be made.
SOLUTION
Create the User in the target database
ORA-39082: Object type ... created with compilation warnings
ORA-39082: Object type OBJECT TYPE:"OWNER"."OBJECT_NAME" created with compilation warnings
CAUSE
It is likely that a dependency does not exist in the target database (i.e. this error is more common when performing a schema duplicate between two different Oracle Instances).
SOLUTION
Use SQL Developer to attempt to compile the object on the target instance and review the errors
ORA-56920: a prepare or ... time zone ... is in an active state
ORA-56920: a prepare or upgrade window or an on-demand or datapump-job loading of a secondary time zone data file is in an active state
SYMPTOM
impdp SCHEMAS=${SCHEMA} DIRECTORY=${OUTDIR} DUMPFILE=${DMPNAME}.dmp LOGFILE=${DMPNAME}.implog
UDI-31626: operation generated ORACLE error 31626
ORA-31626: job does not exist
ORA-39086: cannot retrieve job information
ORA-06512: at "SYS.DBMS_DATAPUMP", line 3326
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4551
ORA-06512: at line 1
cat ${DMPNAME}.implog
ORA-39097: Data Pump job encountered unexpected error -56920
ORA-39065: unexpected master process exception in DISPATCH
ORA-56920: a prepare or upgrade window or an on-demand or datapump-job loading of a secondary time zone data file is in an active state
CAUSE
This was due to a DST_UPGRADE_STATE status of UPGRADE...
SELECT property_name, SUBSTR(property_value, 1, 30) value
FROM database_properties
WHERE property_name LIKE 'DST_%'
ORDER BY property_name;
PROPERTY_NAME                  VALUE
------------------------------ ------------------------------
DST_PRIMARY_TT_VERSION         14
DST_SECONDARY_TT_VERSION       3
DST_UPGRADE_STATE              UPGRADE
SOLUTION
CONNECT / AS SYSDBA
ALTER SESSION SET "_with_subquery"=materialize;
ALTER SESSION SET "_simple_view_merging"=TRUE;
SET SERVEROUTPUT ON
VAR numfail NUMBER
BEGIN
DBMS_DST.UPGRADE_DATABASE(:numfail,
parallel => TRUE,
log_errors => TRUE,
log_errors_table => 'SYS.DST$ERROR_TABLE',
log_triggers_table => 'SYS.DST$TRIGGER_TABLE',
error_on_overlap_time => FALSE,
error_on_nonexisting_time => FALSE);
DBMS_OUTPUT.PUT_LINE('Failures:'|| :numfail);
END;
/
VAR fail number
BEGIN
DBMS_DST.END_UPGRADE(:fail);
DBMS_OUTPUT.PUT_LINE('Failures:'|| :fail);
END;
/
SELECT property_name, SUBSTR(property_value, 1, 30) value
FROM database_properties
WHERE property_name LIKE 'DST_%'
ORDER BY property_name;
ORA-30094: failed to find the time zone data file...
ORA-30094: failed to find the time zone data file for version 18 in $ORACLE_HOME/oracore/zoneinfo
SYMPTOM
impdp SCHEMAS=${SCHEMA} DIRECTORY=${OUTDIR} DUMPFILE=${DMPNAME}.dmp LOGFILE=${DMPNAME}.implog
UDI-31626: operation generated ORACLE error 31626
ORA-31626: job does not exist
ORA-39086: cannot retrieve job information
ORA-06512: at "SYS.DBMS_DATAPUMP", line 3326
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4551
ORA-06512: at line 1
cat ${DMPNAME}.implog
ORA-39097: Data Pump job encountered unexpected error -30094
ORA-39065: unexpected master process exception in DISPATCH
ORA-30094: failed to find the time zone data file for version 18 in $ORACLE_HOME/oracore/zoneinfo
CAUSE
The source database has a higher timezone data file version than the target database.
SOLUTION
Upgrade the timezone data file of the target database.
Bibliography
DataPump Import (IMPDP) Fails With Errors ORA-39097 ORA-56920 secondary time zone data file is in an active state (Doc ID 1488316.1)
ORA-30094: Failed To Find The Time Zone Data File For Version <DST VERSION> In $ORACLE_HOME/oracore/zoneinfo (Doc ID 2214695.1)