Migrating an Oracle 11g Database to a 12c Pluggable Database in the Oracle Cloud


Before You Begin

Purpose

In this tutorial, you learn how to migrate and upgrade an on-premises Oracle 11gR2 database to a 12c pluggable database in the Oracle Database Cloud Service (DBaaS).

Last Updated

April 2015, Release 15.2.1

Please note that Oracle develops its cloud services in an agile manner and makes rapid changes to meet our users' needs. Therefore, this tutorial may not reflect all of the changes made to this service since the last update. For the very latest information, see Oracle Database Cloud Service on Oracle Help Center.

Time to Complete

Approximately 40 minutes.

Background

A variety of methods exist for migrating data from on-premises databases to databases in the Oracle Database Cloud Service (DBaaS). These include Data Pump export and import with transportable tablespaces, exporting to external tables, remote cloning, and unplugging and plugging databases. Some of these methods support migrating on-premises 11gR2 to cloud 11gR2, some support migrating on-premises 12c to cloud 12c, and some (including the method shown in this tutorial) support migrating and upgrading on-premises 11gR2 to cloud 12c.

Scenario

You have an on-premises Oracle 11gR2 database. You want not only to migrate it to the Oracle Public Cloud, but also to upgrade it at the same time from 11gR2 to a 12c pluggable database in the Oracle Database Cloud Service (DBaaS). In this tutorial, you will use Data Pump export/import with a transportable tablespace to perform both the migration and the upgrade. You will:

  • Create data on an on-premises Oracle 11gR2 database:
    • Create a tablespace.
    • Create a user in the tablespace.
    • Create a table and insert data into it.
  • Prepare the tablespace in the on-premises database for transport to the cloud database:
    • Create a Data Pump export directory.
    • Set permissions on the Data Pump export directory.
    • Change the tablespace to READ ONLY.
    • Run the export command to create a dump file from the tablespace.
  • Copy the export dump file and tablespace file from the on-premises database to the cloud database instance:
    • Use the scp program to copy the dump file from the on-premises database to the cloud database.
    • Use the scp program to copy the tablespace file from the on-premises database to the cloud database.
  • Prepare the Oracle 12c pluggable database in the Oracle Database Cloud Service (DBaaS):
    • Create a Data Pump export directory in the pluggable database.
    • Set permissions on the Data Pump export directory.
    • Create a user in the database.
    • Import the tablespace into the pluggable database.
    • Use SQL*Plus to change the tablespace to READ WRITE.

What Do You Need?

  • An on-premises Oracle 11gR2 database instance:
    • Preferably created with passphrase-protected SSH keys for secure access (passphrase-protected SSH keys also help ensure that the scp command can copy files successfully from on-premises to cloud in all environments)
    • Created with Unicode (AL32UTF8) character set (to match the AL32UTF8 character set used by database instances created in DBaaS) 
      • The migration method used in this tutorial, “Data Pump Transportable Tablespace”, requires matching character sets or the prior use of Database Migration Assistant for Unicode (DMU) to convert the source database character set to Unicode
        • To use the migration method used in this tutorial, do not attempt to use an on-premises database created with WE8MSWIN1252 or other character sets.
      • To migrate on-premises 11gR2 databases with character sets other than AL32UTF8, see the Oracle Database Cloud Service (DBaaS) documentation. One alternative is to migrate non-matching character set databases using the conventional Data Pump export/import method (see Oracle Database Administrator's Guide).
  • An Oracle 12c pluggable database in the Oracle Database Cloud Service (DBaaS)
    • During the Data Pump import step, you will be prompted for a password. You will need the "Administration" password that was specified when the DBaaS database instance was created.
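
Before starting, you can confirm that the source database meets the character set requirement. A quick check, run in SQL*Plus on the on-premises 11gR2 database:

```sql
-- Should return AL32UTF8 for the transportable tablespace
-- method used in this tutorial.
SELECT value
FROM   nls_database_parameters
WHERE  parameter = 'NLS_CHARACTERSET';
```

If the query returns a different character set, convert the database with DMU first, or use one of the alternative migration methods noted above.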

Creating Data in an On-premises Oracle 11gR2 Database

This tutorial assumes that you do not yet have data in your on-premises 11gR2 database, so it first creates sample data. If you already have data to migrate, you can skip this section.

  1. Log in to SQL*Plus as sys with sysdba privileges. Create a tablespace upgrade_me in your on-premises 11gR2 database.
    Note: The example code in this tutorial is for a database named onpremdb1. Substitute your actual database name and location in the datafile path.

    CREATE SMALLFILE TABLESPACE UPGRADE_ME
        DATAFILE
            '/u01/app/oracle/database/oradata/onpremdb1/upgrademe01.dbf' SIZE 52428800 AUTOEXTEND ON NEXT 1 MAXSIZE 104857600
        BLOCKSIZE 8192
        NOLOGGING
        DEFAULT NOCOMPRESS
        ONLINE
        SEGMENT SPACE MANAGEMENT AUTO
        EXTENT MANAGEMENT LOCAL AUTOALLOCATE;

  2. Create the user bob in the tablespace.

    CREATE USER bob IDENTIFIED BY bob
    DEFAULT TABLESPACE "UPGRADE_ME"
    TEMPORARY TABLESPACE "TEMP";

  3. Specify unlimited quota for bob.

      ALTER USER bob QUOTA UNLIMITED ON upgrade_me;

  4. Grant CONNECT and RESOURCE to bob.

    GRANT CONNECT, RESOURCE to bob;

  5. Create the table test in bob.

    CREATE TABLE bob.test (id number, name varchar2(100));

  6. Insert a record in the table.

    INSERT INTO bob.test (ID, NAME) VALUES (1, 'Bob');


Preparing the Tablespace in the On-premises Database for Transport

Perform the following steps on the on-premises 11gR2 database. 

  1. Define a Data Pump export directory, data_pump_exp, in the on-premises database.
    Note: For speed and simplicity of this tutorial, we are defining this database directory to point to the pre-existing dpdump directory in the operating system. As a best practice, you may wish to first use the mkdir command to create a new operating system directory and then point the database directory to that new directory.
     
    CREATE OR REPLACE DIRECTORY DATA_PUMP_EXP as '/u01/app/oracle/database/admin/onpremdb1/dpdump/';  

  2. Set permissions on the Data Pump export directory, granting READ and WRITE to sys.

    GRANT READ, WRITE ON DIRECTORY DATA_PUMP_EXP to sys;

  3. Change the tablespace to READ ONLY.

    ALTER TABLESPACE UPGRADE_ME READ ONLY;

  4. Exit SQL*Plus. At the command line prompt, run the Data Pump Export utility expdp as system.

    expdp system@onpremdb1 dumpfile=export.dmp logfile=export.log directory=DATA_PUMP_EXP transport_tablespaces=upgrade_me transport_full_check=y

    When the export process is done, you will see a message similar to this:
    Dump file set for SYSTEM.SYS_EXPORT_TRANSPORTABLE_01 is:
      /u01/app/oracle/database/admin/onpremdb1/dpdump/export.dmp

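
The transport_full_check=y flag above makes expdp verify that the tablespace is self-contained. You can also run the same containment check yourself in SQL*Plus before exporting, using the DBMS_TTS package:

```sql
-- Check that UPGRADE_ME is self-contained (no objects spanning
-- into other tablespaces); any problems appear in the view below.
EXECUTE DBMS_TTS.TRANSPORT_SET_CHECK('UPGRADE_ME', TRUE);
SELECT * FROM transport_set_violations;
-- No rows selected means the tablespace can be transported.
```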

Copying the Dump File and Tablespace File to the Cloud Database Instance

Next, use the scp command in your on-premises environment to copy the dump file and tablespace file from the on-premises 11gR2 database to the Cloud virtual machine (VM) hosting the 12c database.

The datafile required for your transportable tablespace UPGRADE_ME is located (in this tutorial example) in:
  /u01/app/oracle/database/oradata/onpremdb1/upgrademe01.dbf

  1. Before using the scp command to copy the export file to the cloud instance, prepare the SSH private key that is paired with the SSH public key used to create your target cloud database instance. If you will be running the scp command from your home directory, you can place your SSH private key file in that directory. In a command shell, use the chmod 600 command to set the file permissions of the private key file so that only you can access it.
    Note:  For the following example, you need to swap in your SSH private key name; this example uses a private key named ua_key_openssh.

    chmod 600 ua_key_openssh

  2. Use the scp command to copy the export file to the cloud instance.
    Note:  For the following example, you need to swap in your SSH private key name, your directory paths, and your public cloud IP address. If this is your first time connecting via SSH, you may get the warning asking if you wish to continue connecting. If so, reply yes.

    scp -i private_key.ppk /u01/app/oracle/database/admin/onpremdb1/dpdump/export.dmp oracle@PUBLIC_CLOUD_IP_ADDRESS:/u01/app/oracle/admin/ORCL/dpdump/export.dmp

  3. Use the scp command to copy the tablespace file to the cloud instance.
    Note: Remember to replace LOCAL_PDB_NAME with your pluggable database (PDB) name, usually PDB1 with a default cloud instance.
    Note: By default, the directory path on the Cloud instance uses UPPER CASE for the names of the container database (CDB) and PDB.

    scp -i private_key.ppk /u01/app/oracle/database/oradata/onpremdb1/upgrademe01.dbf oracle@PUBLIC_CLOUD_IP_ADDRESS:/u02/app/oracle/oradata/ORCL/LOCAL_PDB_NAME/upgrademe01.dbf

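
A corrupted datafile copy will surface only later, as an error at import time, so it can be worth comparing checksums of the source and copied files before moving on. This is an optional extra step, not part of the original tutorial: in practice you would run sha256sum on the on-premises host, run it again over ssh on the cloud VM, and compare the digests. A minimal local sketch of the comparison, using a temporary stand-in file:

```shell
# Stand-in for the real datafile; in practice checksum the actual
# upgrademe01.dbf on both the on-premises host and the cloud VM.
src=$(mktemp)
echo "sample datafile contents" > "$src"
cp "$src" "$src.copy"              # stands in for the scp transfer

# Compare the two digests; a mismatch means the copy is corrupt.
a=$(sha256sum "$src" | awk '{print $1}')
b=$(sha256sum "$src.copy" | awk '{print $1}')
if [ "$a" = "$b" ]; then
    echo "checksums match"
else
    echo "checksums differ; re-copy the file" >&2
fi
rm -f "$src" "$src.copy"
```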

Preparing the Oracle 12c Pluggable Database in the Cloud

Perform the following steps in your Oracle Database Cloud Service (DBaaS) account.

  1. Start a SQL*Plus session, and create a Data Pump export directory, DATA_PUMP_EXP, in the pluggable database.
    Note: Remember to replace our generic CLOUD_PDB_NAME with your actual pluggable database name, and CLOUD_CDB_NAME with your container database name. (In this example, the pluggable database name is PDB1 and the container database name is ORCL.)
    Note: For speed and simplicity of this tutorial, we are defining this database directory to point to the pre-existing dpdump directory in the operating system. As a best practice, you may wish to first use the mkdir command to create a new operating system directory and then point to that new directory.

    ALTER SESSION SET CONTAINER = CLOUD_PDB_NAME;

    CREATE OR REPLACE DIRECTORY DATA_PUMP_EXP as '/u01/app/oracle/admin/CLOUD_CDB_NAME/dpdump/';

  2. Set permissions on the DATA_PUMP_EXP export directory, granting READ and WRITE to sys.

    GRANT READ, WRITE ON DIRECTORY DATA_PUMP_EXP to sys;

  3. Create the user bob in the cloud database instance.

    CREATE USER bob IDENTIFIED BY bob;

  4. Grant CONNECT and RESOURCE to bob.

    GRANT CONNECT, RESOURCE to bob;

  5. Exit SQL*Plus, and import the tablespace (datafile upgrademe01.dbf) into the pluggable database using the impdp Data Pump Import utility.
    Note: Remember to replace LOCAL_PDB_NAME with your 12c cloud pluggable database name.
    Note: When you invoke the impdp command, you are prompted for a password. Provide the "Administration" password that was specified when the Oracle Database Cloud Service database instance was created.

    impdp system@LOCAL_PDB_NAME dumpfile=export.dmp directory=data_pump_exp transport_datafiles=/u02/app/oracle/oradata/ORCL/LOCAL_PDB_NAME/upgrademe01.dbf

  6. Log in again to SQL*Plus as sys. Set the container to your pluggable database.
    Note: Remember to replace LOCAL_PDB_NAME with your pluggable database name. This example uses the default name PDB1.

    ALTER SESSION SET CONTAINER = LOCAL_PDB_NAME;

  7. Change the upgrade_me tablespace back to READ WRITE.

    ALTER TABLESPACE upgrade_me READ WRITE;


Verifying that the 11gR2 Data Migrated to the 12c Cloud Database

Verify that the table with data from the 11gR2 on-premises database migrated successfully to the 12c cloud pluggable database.

  1. Log in to SQL*Plus in the 12c cloud pluggable database to verify that the data from the on-premises 11gR2 database has been imported into the DBaaS 12c pluggable database. The bob.test table and its record should now be accessible in the pluggable database.

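
For example, after setting the container to your PDB (shown here with the default name PDB1), a simple query confirms the migrated row:

```sql
-- As sys in the cloud 12c instance; PDB1 stands for your PDB name.
ALTER SESSION SET CONTAINER = PDB1;
SELECT id, name FROM bob.test;
-- The single row inserted on-premises (1, Bob) should be returned.
```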

Credits

  • Lead Curriculum Developer: Rick Green

  • Other Contributors: Brian Spendolini, Jim Spiller, Donna Keesling, Dominique Jeunot