Loading and Unloading Database Content Using Oracle Data Pump

This tutorial describes how to unload and load data and metadata.

Time to complete: approximately 30 minutes

This tutorial covers the following topics:

Creating directory objects
Determining table dependencies
Unloading data with Data Pump Export
Loading data with Data Pump Import
Cleanup


Oracle Database 10g offers a variety of methods for unloading and loading data. In this tutorial, you use the Oracle Data Pump Export utility to unload data and then use the Oracle Data Pump Import utility to load data.

What is Oracle Data Pump?

Oracle Data Pump is a new feature of Oracle Database 10g that provides high-speed, parallel, bulk movement of data and metadata in Oracle databases. A new public PL/SQL package, DBMS_DATAPUMP, provides a server-side infrastructure for fast data and metadata movement. Oracle Database 10g includes new Export (expdp) and Import (impdp) clients that use this interface. The Data Pump Export and Import tools offer vastly improved performance and greatly enhanced functionality, such as restartability, flexible object selection, and better monitoring and control of export and import jobs. Because of these improvements, Oracle recommends that you use the Data Pump Export and Import clients rather than the original Export (exp) and Import (imp) clients.
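Because DBMS_DATAPUMP is a public package, export and import jobs can also be driven directly from PL/SQL. The following is a minimal sketch (not part of this tutorial's scripts) of a table-mode export through the API, run from a SYSTEM session; the file name api_demo.dmp is illustrative, and the DATADIR1 directory object is the one you create later in this tutorial:

SET SERVEROUTPUT ON
DECLARE
  h        NUMBER;
  v_state  VARCHAR2(30);
BEGIN
  -- Open a table-mode export job
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'TABLE');
  -- Write the dump file through a directory object (illustrative file name)
  DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'api_demo.dmp',
                         directory => 'DATADIR1');
  -- Restrict the job to the COSTS table in the SH schema
  DBMS_DATAPUMP.METADATA_FILTER(handle => h, name => 'SCHEMA_EXPR',
                                value => 'IN (''SH'')');
  DBMS_DATAPUMP.METADATA_FILTER(handle => h, name => 'NAME_EXPR',
                                value => 'IN (''COSTS'')');
  -- Start the job and block until it finishes
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(handle => h, job_state => v_state);
  DBMS_OUTPUT.PUT_LINE('Job completed with state: ' || v_state);
END;
/

The expdp and impdp clients used in the rest of this tutorial are front ends to this same server-side infrastructure.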

MyCompany is evaluating its product portfolio to determine which products contribute most to its bottom line. To accomplish this, it uses Data Pump Export and Import to unload and load various database objects for later analysis. The Sales History (SH) schema contains several tables that must first be unloaded and then loaded into a different schema for analysis.

Before starting this tutorial, you should:

1.

Perform the Installing Oracle Database 10g on Windows tutorial.

2.

Download datapump.zip and unzip it into your working directory (c:\wkdir).

During this tutorial, you may either execute these files or enter the commands directly at the command prompt.

Data Pump is server-based rather than client-based. Dump files, log files, and SQL files are accessed relative to server-based directory paths so that appropriate file security can be enforced. Data Pump requires you to specify directory paths as directory objects. A directory object maps a name to a directory path on the server's file system.

Before you can run Data Pump Export or Data Pump Import, a directory object must be created by a DBA or by any user with the CREATE ANY DIRECTORY privilege. Then, when you run Export or Import, you specify the directory object with the DIRECTORY parameter.
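To confirm which directory objects already exist and the paths they map to, you can query the DBA_DIRECTORIES view from a suitably privileged account such as SYSTEM, for example:

SELECT directory_name, directory_path
FROM dba_directories
ORDER BY directory_name;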

Creating Directory Objects

To create directory objects, perform the following steps:

1.

To start a SQL*Plus session, select Start > Programs > Oracle - OraDb10g_home1 > Application Development > SQL Plus. Enter system as the User Name and oracle as the Password, and click OK.


 

2.

Execute the following commands to create two directories for this tutorial:

@c:\wkdir\dir01

The dir01.sql file includes the following commands:

CREATE DIRECTORY datadir1 AS 'c:\wkdir';
CREATE DIRECTORY datadir2 AS 'c:\wkdir'; 


 

3.

After the directory objects are created, you need to grant READ and WRITE permissions on them to other users. To allow the database to read and write files in the datadir1 and datadir2 directories on behalf of the SH user, execute the following command:

@c:\wkdir\dir02

The dir02.sql file includes the following commands:

CONNECT / AS SYSDBA
GRANT READ, WRITE ON DIRECTORY datadir1 TO sh;
GRANT READ, WRITE ON DIRECTORY datadir2 TO sh;
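If you want to verify the grants, note that privileges on directory objects are exposed through the DBA_TAB_PRIVS view. A query such as the following should show READ and WRITE for SH on both directories:

SELECT grantee, table_name AS directory_name, privilege
FROM dba_tab_privs
WHERE table_name IN ('DATADIR1', 'DATADIR2');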

 

You need to unload the SALES, PRODUCTS, and COSTS tables. To determine whether these tables have dependencies on other tables (which you may also want to unload), perform the following steps. A SQL*Plus alternative to these steps is sketched after the list.

1.

Open your browser and enter the following URL (Replace <hostname> with your own host name or IP address):

http://<hostname>:1158/em

Enter system as the User Name, oracle as the Password, select Normal in the Connect As field, and click Login.


If this is the first time you have logged in to Enterprise Manager as this user, click I Agree to acknowledge the Oracle Database 10g licensing information.

 

2.

 

Select the Administration tab.

 

3.

Scroll down to the Schema section and click the Tables link.

 

4.

Enter SH in the Schema field, then click Go.

 

5.

Select Costs, then select Show Dependencies from the drop-down list. Click Go.

 

6.

Notice that the COSTS table is dependent on four other tables: CHANNELS, PRODUCTS, PROMOTIONS, and TIMES. Apart from PRODUCTS, none of these tables is needed for the analysis. Click OK.

 

7.

Select Products, then select Show Dependencies from the drop-down list. Click Go.

 

8.

The PRODUCTS table is not dependent on any other table. Click OK.

 

9.

Select Sales, then select Show Dependencies from the drop-down list. Click Go.

 

10.

The SALES table is dependent on several other tables: CHANNELS, COUNTRIES, CUSTOMERS, PRODUCTS, PROMOTIONS, and TIMES. Apart from PRODUCTS, none of these tables is needed for the analysis. Click the Database Instance: orcl breadcrumb.
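As an alternative to stepping through Enterprise Manager, the same foreign key dependencies can be read from the data dictionary. The following SQL*Plus sketch (run as SYSTEM, or any user who can query DBA_CONSTRAINTS) lists, for each of the three tables, the parent tables referenced by its foreign keys:

SELECT c.table_name, p.table_name AS referenced_table
FROM dba_constraints c
JOIN dba_constraints p
  ON c.r_owner = p.owner
 AND c.r_constraint_name = p.constraint_name
WHERE c.owner = 'SH'
  AND c.constraint_type = 'R'
  AND c.table_name IN ('SALES', 'PRODUCTS', 'COSTS')
ORDER BY c.table_name, p.table_name;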

 

Data Pump Export is a utility for unloading data and metadata into a set of operating system files called a dump file set. The dump file set can be copied to another system and loaded by the Data Pump Import utility. The dump file set is made up of one or more disk files that contain table data, database object metadata, and control information. The files are written in a proprietary, binary format. During an import operation, the Data Pump Import utility uses these files to locate each database object in the dump file set. Data Pump Export allows you to specify that a job should move a subset of the data and metadata, as determined by the export mode. This is done using data filters and metadata filters, which are implemented through Export parameters.
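For example, a data filter and a metadata filter can be combined in a parameter file. The following sketch is not part of this tutorial's scripts; the dump file name, predicate, and parameter file name are illustrative. It exports only the SALES rows that match a WHERE clause and excludes table statistics from the dump file. Create a parameter file (for example, c:\wkdir\exp_filter.par) containing:

DIRECTORY=datadir1
DUMPFILE=filtered.dmp
TABLES=sales
EXCLUDE=STATISTICS
QUERY=sales:"WHERE amount_sold > 100"

Then run Export with:

expdp sh/sh PARFILE=c:\wkdir\exp_filter.par

Using a parameter file also avoids operating system-specific quoting rules for the QUERY clause.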

Oracle Data Pump Export can be used in the following ways:

Using Enterprise Manager
Using Command Line

Using Enterprise Manager

To unload your company's sales data using Enterprise Manager, perform the following steps:

1.

Click the Maintenance tab.

 

2.

Click the Export to Export Files link in the Data Movement section.

 

3.

To export tables, select Tables, enter oracle as the Username and Password, select Save as Preferred Credential, and click Continue.

 

4.

Click the Add button to see the tables available for export.

 

5.

Enter SH in the Schema field and click Go.

 

6.

Select the Costs, Products, and Sales tables, and click the Select button.

 

7.

Click Next.

 

8.

Enter 4 for the Maximum Number of Threads in Export Job and select DATADIR1 for the Directory Object. Change the name of the export log file to EXPDAT_<today's date in mmddyy format>.log and click Next.

 

9.

Select DATADIR1 for the Directory Object and change the name of the DMP file to EXPDAT_<today's date in mmddyy format>.DMP. Then click Next.

 

10.

Enter EXPORT_<today's date> as the job name, make sure Start is set to Immediately, and click Next.

 

11.

Click the Submit Job button to submit the export job.

 

12.

Your export job has been submitted successfully. Click the EXPORT_<today's date> link.

 

13.

Click the Status link.

 

14.

Select the Export link to see the Export log status.

 

15.

The job is still running. Click Reload in your browser window.

 

16.

When the job has finished, scroll down to the bottom to see all the messages in the log. Your export has completed successfully. Note the name of the dump file for use later in the tutorial.
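Instead of reloading the browser, you can also watch a Data Pump job from SQL*Plus: active jobs are listed in the DBA_DATAPUMP_JOBS view. For example:

SELECT owner_name, job_name, operation, state
FROM dba_datapump_jobs;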

 


Using Command Line

Use the Data Pump Export command-line interface for the following tasks:

Performing a Table Mode Export
Estimating How Much Disk Space Will Be Consumed in a Schema Mode Export


Performing a Table Mode Export

A table-mode export is specified using the TABLES parameter. In the following example, the NOLOGFILE parameter indicates that no log file will be generated for the operation.

Issue the following commands to perform a table export of the COSTS and SALES tables:

1.

Open a terminal or command prompt window and navigate to the c:\wkdir directory with the following command:

cd c:\wkdir

 

2.

Execute the following command to export tables:

exp01

The exp01.bat file includes the following command:

expdp system/oracle TABLES=sh.costs,sh.sales DUMPFILE=datadir2:table.dmp NOLOGFILE=y

Scroll down to view the successful completion of the export.

 


Estimating How Much Disk Space Will Be Consumed in a Schema Mode Export

The ESTIMATE_ONLY parameter estimates the space that would be consumed in a schema export, but stops without actually performing the export operation. The estimate is printed in the log file and displayed on the client's standard output device. The estimate is for table row data only; it does not include metadata.

Perform the following steps:

1.

From your terminal window, execute the following command to estimate the number of blocks required to export the data in the SALES, PRODUCTS, and COSTS tables of the Sales History (SH) schema:

est01

The est01.bat file includes the following command:

expdp sh/sh DIRECTORY=datadir2 ESTIMATE_ONLY=y TABLES=SALES,PRODUCTS,COSTS
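By default, the estimate is computed from block counts (ESTIMATE=blocks). If the tables have been analyzed, you can instead request an estimate based on optimizer statistics, which is often more accurate. For example (a variation on est01.bat, not included in the tutorial files):

expdp sh/sh DIRECTORY=datadir2 ESTIMATE_ONLY=y ESTIMATE=statistics TABLES=SALES,PRODUCTS,COSTS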

 


Data Pump Import is a utility for loading an export dump file set into a target system. The dump file set is made up of one or more disk files that contain table data, database object metadata, and control information. The files are written by the Data Pump Export utility in a proprietary, binary format. During an import operation, the Data Pump Import utility uses these files to locate each database object in the dump file set.

Import can also be used to load a target database directly from a source database with no intervening files, which allows export and import operations to run concurrently. This avoids the creation of dump files on the file system, and may also minimize the total elapsed time for the entire export and import operation. This is known as network import.
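A network import is requested with the NETWORK_LINK parameter, which names an existing database link to the source database. The following is only a sketch: it assumes a database link called source_db has already been created, and a directory object is still required for the log file:

impdp system/oracle DIRECTORY=datadir1 NETWORK_LINK=source_db TABLES=sh.costs LOGFILE=net_imp.log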

Data Pump Import allows you to specify whether a job should move a subset of the data and metadata, as determined by the import mode. This is done using data filters and metadata filters, which are implemented through Import parameters.

Oracle Data Pump Import can be used in the following ways:

Using Enterprise Manager
Using Command Line

Using Enterprise Manager

To import tables through Enterprise Manager, perform the following steps:

1.

Open your browser and enter the following URL (Replace <hostname> with your own host name or IP address):

http://<hostname>:1158/em

Enter system as the User Name, oracle as the Password, select Normal in the Connect As field, and click Login.


 

2.

Click the Maintenance tab.

 

3.

Click the Import from Export Files link.

 

4.

Select DATADIR1 for the Directory Object and change the file name to EXPDAT_<today's date in mmddyy format>.DMP (the dump file that you created in the "Unloading Data" section of this tutorial). Select Tables for the Import Type, enter oracle as the Username and Password, then click Continue.

 

5.

Click the Add button to see the tables available for import.

 

6.

Enter SH in the Schema field and click Go.

 

7.

Select the checkboxes for Costs, Products, and Sales then click Select.

 

8.

Click Next.

 

9.

Click the Add Another Row button under the Re-Map Schemas section.

 

10.

Under the Destination Schema column, select PM, then click Next.

 

11.

Select DATADIR1 for the Directory Object and change the Log File name to IMPORT_<today's date>.LOG. Then click Next.

 

12.

Enter IMPORT_<today's date> in the Job field, ensure that the job starts Immediately, then click Next.

 

13.

Click Submit Job.

 

14.

Click the IMPORT_<today's date> link.

 

15.

Select the Running link.

 

16.

Click Import.

 

17.

The job is still running. Click Reload in your browser window.

 

18.

When the job has finished, scroll down to the bottom to see all the messages in the log. Your import has completed successfully even though the log file displays some errors. These errors were generated because the SALES and COSTS tables depend on several tables that were not included in the export and thus were not imported. In this case, the omission was intentional, and you can ignore the errors in the log file.
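For reference, a command-line sketch that approximates this Enterprise Manager import would combine the TABLES and REMAP_SCHEMA parameters, roughly as follows (substitute the actual name of your dump file; the log file name is illustrative):

impdp system/oracle DIRECTORY=datadir1 DUMPFILE=EXPDAT_<today's date in mmddyy format>.DMP TABLES=sh.costs,sh.products,sh.sales REMAP_SCHEMA=sh:pm LOGFILE=import_remap.log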

 


Using Command Line

Perform a data-only, table-mode import using the Data Pump Import command-line interface.

The CONTENT parameter allows you to filter the data and metadata that Import loads. The DATA_ONLY value loads only table row data; no database object definitions (metadata) are recreated. Perform the following steps:

1.

The upcoming import duplicates the data in the SH.COSTS table. So that you can restore the original version of SH.COSTS during cleanup, create a backup copy now. From your SQL*Plus session, execute the following command:

@c:\wkdir\costs2

The costs2.sql file includes the following command:

create table sh.costs2 as select * from sh.costs;

 

2.

Extract the COSTS table data from the TABLE.DMP dump file (created earlier in the Export section of this tutorial). From a terminal window, in the c:\wkdir directory, execute the following command:

imp01

The imp01.bat file includes the following command:

impdp system/oracle TABLES=sh.costs CONTENT=data_only DUMPFILE=datadir2:table.dmp NOLOGFILE=y

 


Drop the directory objects and clean up the changes made to the SH and PM schemas during this tutorial.

1.

From your SQL*Plus session, execute the following command:

@c:\wkdir\cleanup

The cleanup.sql file includes the following commands:

CONNECT / AS SYSDBA

DROP DIRECTORY datadir1;
DROP DIRECTORY datadir2;

DROP TABLE SH.COSTS PURGE;
CREATE TABLE SH.COSTS AS SELECT * FROM SH.COSTS2;
DROP TABLE SH.COSTS2 PURGE;

DROP TABLE PM.COSTS CASCADE CONSTRAINTS PURGE;
DROP TABLE PM.PRODUCTS CASCADE CONSTRAINTS PURGE;
DROP TABLE PM.SALES CASCADE CONSTRAINTS PURGE;

EXIT

 

2.

Close all open tutorial windows.

 

 


In this tutorial, you learned how to:

Create directory objects for Data Pump
Determine table dependencies before unloading
Unload data using Data Pump Export, from both Enterprise Manager and the command line
Load data using Data Pump Import, from both Enterprise Manager and the command line
