Oracle 12c DataPump EXPORT (EXPDP) and IMPORT (IMPDP) new features

Audit all expdp/impdp operations with unified auditing

You can perform auditing on Data Pump jobs in order to monitor and record selected user database actions. Data Pump uses unified auditing, in which all audit records are centralized in one place.

For example:

CREATE AUDIT POLICY [policy_name] ACTIONS COMPONENT=DATAPUMP { EXPORT | IMPORT | ALL };
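
For instance, a policy that audits both export and import jobs (the policy name audit_dp_ops is hypothetical):

SQL> CREATE AUDIT POLICY audit_dp_ops ACTIONS COMPONENT=DATAPUMP ALL;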

Then enable the policy (the BY clause restricts auditing to the named user, here SYSTEM):

AUDIT POLICY [policy_name] BY SYSTEM;
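
To enable the hypothetical policy above for all users rather than only SYSTEM:

SQL> AUDIT POLICY audit_dp_ops;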

You can query the UNIFIED_AUDIT_TRAIL data dictionary view to find Oracle Data Pump audited events:

SQL> SELECT DP_TEXT_PARAMETERS1, DP_BOOLEAN_PARAMETERS1 FROM UNIFIED_AUDIT_TRAIL WHERE AUDIT_TYPE = 'Datapump';

DP_TEXT_PARAMETERS1                                     DP_BOOLEAN_PARAMETERS1
------------------------------------------------------  ----------------------------------
MASTER TABLE: SCOTT.SYS_EXPORT_TABLE_01, MASTER_ONLY:   FALSE,
JOB_TYPE: EXPORT, DATA_ONLY:                            FALSE,
METADATA_JOB_MODE: TABLE_EXPORT, METADATA_ONLY:         FALSE,
JOB VERSION: 12.1.0.0, DUMPFILE_PRESENT:                TRUE,
ACCESS METHOD: DIRECT_PATH, JOB_RESTARTED:              FALSE
DATA OPTIONS:                                           0,
DUMPER DIRECTORY:                                       NULL
REMOTE LINK:                                            NULL,
TABLE EXISTS:                                           NULL,
PARTITION OPTIONS:                                      NONE

Export one or more views as tables

The new VIEWS_AS_TABLES parameter allows you to export one or more views as tables. Data Pump exports a table with the same columns as the view and with row data fetched from the view. Data Pump also exports objects dependent on the view, such as grants and constraints. Dependent objects that do not apply to tables (for example, grants of the UNDER object privilege) are not exported. The VIEWS_AS_TABLES parameter can be used by itself or along with the TABLES parameter.

For example:

$ expdp scott/[PASSWORD] DIRECTORY=dpump1 DUMPFILE=scott1.dmp VIEWS_AS_TABLES=empview TABLES=emp
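
VIEWS_AS_TABLES can also be used with impdp for network imports; a sketch, assuming a database link named src_db pointing to the source database:

$ impdp scott/[PASSWORD] DIRECTORY=dpump1 NETWORK_LINK=src_db VIEWS_AS_TABLES=empview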

Silent encryption password for expdp job

You can now supply the encryption password silently at expdp runtime. When ENCRYPTION_PWD_PROMPT=YES is specified on the command line, Data Pump prompts you for the encryption password, rather than you entering it on the command line with the ENCRYPTION_PASSWORD parameter. The advantage is that the encryption password is not echoed to the screen when it is entered at the prompt.

When it is entered on the command line with the ENCRYPTION_PASSWORD parameter, by contrast, it appears in plain text. With the prompt, the password is also not visible to commands such as ps and does not have to be stored in scripts. If you specify an encryption password on the export operation, you must also supply it on the import operation.

For example:

$ expdp scott/[PASSWORD] DIRECTORY=dpump1 DUMPFILE=export.dmp ENCRYPTION_PWD_PROMPT=YES
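
Since the same password must be supplied on import, the prompt can be used there as well, for example:

$ impdp scott/[PASSWORD] DIRECTORY=dpump1 DUMPFILE=export.dmp ENCRYPTION_PWD_PROMPT=YES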

Transportable feature

The TRANSPORTABLE parameter specifies whether transportable mode should be used during a table mode export (specified with the TABLES parameter) or a full mode export (specified with the FULL parameter).

Full Transportable Export/Import (Full Database)

Example:

1. Make the tablespaces read only.
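
For instance, assuming the SALES, CUSTOMERS, and EMPLOYEES tablespaces shown in the log in step 3:

SQL> ALTER TABLESPACE sales READ ONLY;
SQL> ALTER TABLESPACE customers READ ONLY;
SQL> ALTER TABLESPACE employees READ ONLY;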

2. Export the database:

$ expdp user_name full=y dumpfile=expdat.dmp directory=data_pump_dir transportable=always logfile=export.log

3. Check the export log to determine which datafiles must be copied to the target system. For example:

******************************************************************************
Dump file set for SYSTEM.SYS_EXPORT_TRANSPORTABLE_01 is:
/u01/app/oracle/admin/mydb/dpdump/expdat.dmp
******************************************************************************
Datafiles required for transportable tablespace SALES:
/u01/app/oracle/oradata/mydb/sales01.dbf
Datafiles required for transportable tablespace CUSTOMERS:
/u01/app/oracle/oradata/mydb/cust01.dbf
Datafiles required for transportable tablespace EMPLOYEES:
/u01/app/oracle/oradata/mydb/emp01.dbf

4. Check the endian formats of the source and target platforms to see whether datafile conversion is required:

SQL> SELECT d.PLATFORM_NAME, ENDIAN_FORMAT
FROM V$TRANSPORTABLE_PLATFORM tp, V$DATABASE d
WHERE tp.PLATFORM_NAME = d.PLATFORM_NAME;
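
If the endian formats differ, the datafiles must be converted, for example with RMAN on the source system (the platform name and FORMAT path below are illustrative):

RMAN> CONVERT TABLESPACE sales
      TO PLATFORM 'Linux x86 64-bit'
      FORMAT '/tmp/convert/%U';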

5. Copy the datafiles and the dump file to the target environment.

6. Import the database:

$ impdp user_name full=y dumpfile=expdat.dmp directory=data_pump_dir transport_datafiles='/u01/app/oracle/oradata/targetdb/sales01.dbf','/u01/app/oracle/oradata/targetdb/cust01.dbf','/u01/app/oracle/oradata/targetdb/emp01.dbf' logfile=import.log
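
7. The transported tablespaces remain read only on the target after the import; set them back to read write when ready (using the tablespace names from this example):

SQL> ALTER TABLESPACE sales READ WRITE;
SQL> ALTER TABLESPACE customers READ WRITE;
SQL> ALTER TABLESPACE employees READ WRITE;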

Transportable Export/Import (Tables/Partitions)

Example:

1. Mark the tablespace associated with the table partitions as read only:

SQL> ALTER TABLESPACE sales_prt_tbs READ ONLY;

2. Export the table partitions with expdp:

$ expdp user_name dumpfile=sales_prt.dmp directory=data_pump_dir transportable=always logfile=exp.log tables=sh.sales_prt:sales_q1_2000,sh.sales_prt:sales_q2_2000

3. Check the log for the datafiles to be copied. For example:

Datafiles required for transportable tablespace SALES_PRT_TBS:
/u01/app/oracle/oradata/sourcedb/sales_prt.dbf
Job SYSTEM.SYS_EXPORT_TABLE_01 successfully completed at 11:32:13

4. Copy the datafile(s) and the dump file to the target environment.

5. Import the dump:

$ impdp user_name dumpfile=sales_prt.dmp directory=data_pump_dir logfile=imp.log transport_datafiles='/u01/app/oracle/oradata/targetdb/sales_prt.dbf' tables=sh.sales_prt:sales_q1_2000,sh.sales_prt:sales_q2_2000
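
6. Optionally, verify the result on the target by listing what landed in the transported tablespace:

SQL> SELECT table_name FROM dba_tables WHERE tablespace_name = 'SALES_PRT_TBS';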

Compressing Tables

During import jobs, you can now change the compression type for all tables in the job, including tables that provide storage for materialized views, using the TABLE_COMPRESSION_CLAUSE option of the TRANSFORM parameter.

Example:

$ impdp scott/[PASSWORD] DIRECTORY=dpump1 DUMPFILE=export.dmp TRANSFORM=TABLE_COMPRESSION_CLAUSE:\"COMPRESS FOR OLTP\"
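
The escaped quotation marks are needed on most shells because the clause contains spaces; a parameter file avoids the shell quoting (compress.par is an illustrative name):

$ cat compress.par
DIRECTORY=dpump1
DUMPFILE=export.dmp
TRANSFORM=TABLE_COMPRESSION_CLAUSE:"COMPRESS FOR OLTP"

$ impdp scott/[PASSWORD] PARFILE=compress.par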

Creating SecureFile LOBs

You can now change the LOB storage (either SECUREFILE or BASICFILE) for all tables in the job, including tables that provide storage for materialized views, using the LOB_STORAGE option of the TRANSFORM parameter.

Example:

$ impdp scott/[PASSWORD] DIRECTORY=dpump1 DUMPFILE=export.dmp TRANSFORM=LOB_STORAGE:SECUREFILE
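
To confirm the LOB storage after the import, you can query DBA_LOBS (shown here for the SCOTT schema):

SQL> SELECT table_name, column_name, securefile FROM dba_lobs WHERE owner = 'SCOTT';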

Disabling Logging For Tables/Indexes

The 12c DISABLE_ARCHIVE_LOGGING option of the TRANSFORM parameter disables logging for tables, indexes, or both during an import job. Logging is not completely disabled; a small amount of redo is still generated. Also keep in mind that if the database is in FORCE LOGGING mode, that setting overrides this feature.

Example:

- For schema:

$ impdp scott/[PASSWORD] DIRECTORY=dpump1 DUMPFILE=scott1.dmp SCHEMAS=scott TRANSFORM=DISABLE_ARCHIVE_LOGGING:Y

- No logging for indexes only:

$ impdp scott/[PASSWORD] DIRECTORY=dpump1 DUMPFILE=scott1.dmp SCHEMAS=scott TRANSFORM=DISABLE_ARCHIVE_LOGGING:Y:INDEX

- No logging for all objects except tables, which keep normal logging:

$ impdp scott/[PASSWORD] DIRECTORY=dpump1 DUMPFILE=scott1.dmp SCHEMAS=scott TRANSFORM=DISABLE_ARCHIVE_LOGGING:Y TRANSFORM=DISABLE_ARCHIVE_LOGGING:N:TABLE
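
You can check whether the database is in force logging mode, which would override the transform, with:

SQL> SELECT force_logging FROM v$database;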

Timestamp printed for every DataPump operation

The new 12c LOGTIME parameter timestamps the messages displayed during a Data Pump operation. Valid values are NONE, STATUS, LOGFILE, and ALL.

Example:

$ impdp test/test LOGTIME=ALL DIRECTORY=dpump1 DUMPFILE=export.dmp SCHEMAS=test

Import: Release 12.1.0.1.0 - Production on Tue Sep 15 13:33:16 2013
Copyright (c) 1982, 2013, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.1.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options

15-Sep-13 13:33:19.192: Starting "TEST"."SYS_IMPORT_SCHEMA_01": test/******** logtime=all directory=test
15-Sep-13 13:33:19.347: Estimate in progress using BLOCKS method...
...
15-Sep-13 13:33:46.884: . . imported TEST 0 KB 0 rows ... ...