18 October 2011

Oracle Database Administration: Chapters 7 and 8


To use Data Pump, the DBA has to create a directory on the server machine and then create a directory object in the database that maps to the directory created in the file system.
The following example creates a directory in the file system, creates the corresponding directory object in the database, and grants privileges on the directory object to the SCOTT user.
$mkdir /u01/oracle/my_dump_dir
$sqlplus
Enter user-name: / as sysdba
SQL> create directory data_pump_dir as '/u01/oracle/my_dump_dir';
Now grant access on this directory object to the SCOTT user:
SQL> grant read,write on directory data_pump_dir to scott;
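To verify that the directory object was created correctly, you can query the DBA_DIRECTORIES view (a quick check; any account with access to this view will do):
SQL> select directory_name, directory_path from dba_directories;
This should list DATA_PUMP_DIR mapped to /u01/oracle/my_dump_dir.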
To export the full database, give the following command:
$expdp  scott/tiger FULL=y DIRECTORY=data_pump_dir DUMPFILE=full.dmp
             LOGFILE=myfullexp.log JOB_NAME=myfullJob
The above command exports the full database and creates the dump file full.dmp in the server directory /u01/oracle/my_dump_dir.
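Note that a full database export requires more than the default privileges: the exporting user needs the EXP_FULL_DATABASE role. If SCOTT does not already have it, a privileged user can grant it like this (a minimal sketch):
SQL> grant exp_full_database to scott;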
In some cases, where the database is in terabytes, the above command is not feasible, since the dump file will be larger than the operating-system file size limit and the export will fail. In this situation you can create multiple dump files by typing the following command:
$expdp  scott/tiger FULL=y DIRECTORY=data_pump_dir DUMPFILE=full%U.dmp
       FILESIZE=5G  LOGFILE=myfullexp.log JOB_NAME=myfullJob
This will create multiple dump files named full01.dmp, full02.dmp, full03.dmp and so on. The FILESIZE parameter specifies the maximum size of each dump file.
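If the server has spare CPU and I/O capacity, you can speed up a multi-file export by combining the %U substitution variable with the PARALLEL parameter, so several worker processes write dump files at the same time (a sketch; PARALLEL greater than 1 requires Enterprise Edition):
$expdp scott/tiger FULL=y DIRECTORY=data_pump_dir DUMPFILE=full%U.dmp
       FILESIZE=5G PARALLEL=4 LOGFILE=myfullexp.log JOB_NAME=myfullJob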
To export all the objects of SCOTT's schema, you can run the following Data Pump export command:
$expdp scott/tiger DIRECTORY=data_pump_dir DUMPFILE=scott_schema.dmp
         SCHEMAS=SCOTT
You can omit the SCHEMAS parameter here, since the default mode of Data Pump Export is schema mode, which exports the schema of the connected user.
If you want to export objects of multiple schemas, you can specify the following command:
$expdp scott/tiger DIRECTORY=data_pump_dir DUMPFILE=scott_schema.dmp
         SCHEMAS=SCOTT,HR,ALI
You can use the Data Pump Export utility to export individual tables. The following example shows the syntax to export tables:
$expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=tables.dmp  
                               TABLES=employees,jobs,departments
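The TABLES parameter also accepts a table:partition notation, so you can export just one partition of a large table. For example (emp_part1 here is a hypothetical partition name):
$expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=emp_part.dmp
                               TABLES=employees:emp_part1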

Exporting Tables located in a Tablespace
If you want to export tables located in a particular tablespace, you can type the following command:
$expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=tbs.dmp
TABLESPACES=tbs_4,tbs_5,tbs_6
The above will export all the objects located in the tbs_4, tbs_5, and tbs_6 tablespaces.
You can exclude objects while performing an export by using the EXCLUDE option of the Data Pump utility. For example, if you are exporting a schema and don't want to export tables whose names start with "A", you can type the following command:
$expdp scott/tiger DIRECTORY=data_pump_dir DUMPFILE=scott_schema.dmp
         SCHEMAS=SCOTT EXCLUDE=TABLE:"LIKE 'A%'"
Then all tables in SCOTT's schema whose names start with "A" will not be exported.
Similarly, you can use the INCLUDE option to export only certain objects, like this:
$expdp scott/tiger DIRECTORY=data_pump_dir DUMPFILE=scott_schema.dmp
         SCHEMAS=SCOTT INCLUDE=TABLE:"LIKE 'A%'"
This is the opposite of the EXCLUDE option, i.e. it will export only those tables of SCOTT's schema whose names start with "A".
Similarly, you can exclude or include other object types such as INDEX, CONSTRAINT, GRANT, USER, and SCHEMA.
You can use the QUERY option to export only the required rows. For example, the following will export only those rows of the emp table whose sal is above 10000 and whose dept_id is greater than 10.
$expdp hr/hr QUERY=emp:'"WHERE dept_id > 10 AND sal > 10000"'
        NOLOGFILE=y DIRECTORY=dpump_dir1 DUMPFILE=exp1.dmp
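The doubled quoting in EXCLUDE, INCLUDE, and QUERY is only needed to get the string safely past the operating-system shell. Putting the parameters in a parameter file avoids most of the escaping. For example, with a file named scott_exp.par (a hypothetical name) containing:
DIRECTORY=data_pump_dir
DUMPFILE=scott_schema.dmp
SCHEMAS=SCOTT
EXCLUDE=TABLE:"LIKE 'A%'"
QUERY=emp:"WHERE dept_id > 10 AND sal > 10000"
you can run the whole export with:
$expdp scott/tiger PARFILE=scott_exp.par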

You can suspend running export jobs and later resume or kill them using Data Pump Export. You can start a job on one client machine, suspend it if other work comes up, and, once that work is finished, continue the job from the same client where you stopped it or restart it from another client machine.
For example, suppose a DBA starts a full database export at one client machine, CLNT1, by typing the following command:
$expdp scott/tiger@mydb FULL=y DIRECTORY=data_pump_dir  
     DUMPFILE=full.dmp LOGFILE=myfullexp.log JOB_NAME=myfullJob
After some time, the DBA wants to stop this job temporarily, so he presses CTRL+C to enter interactive mode. He then gets the Export> prompt, where he can type interactive commands.
Now he wants to stop this export job, so he types the following command:
Export> STOP_JOB=IMMEDIATE
Are you sure you wish to stop this job ([y]/n): y
The job is placed in a stopped state and the client exits.
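While the job is stopped (or still running), you can look up its name and state from any session by querying the DBA_DATAPUMP_JOBS view (non-privileged users can query USER_DATAPUMP_JOBS instead):
SQL> select owner_name, job_name, operation, state from dba_datapump_jobs;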
After finishing his other work, the DBA wants to resume the export job, but the client machine from which he originally started the job is unavailable (the user has locked his cabin). So the DBA goes to another client machine and reattaches to the job by typing the following command:
$expdp scott/tiger@mydb ATTACH=myfulljob
After the job status is displayed, he can issue the CONTINUE_CLIENT command to resume logging mode and restart the myfulljob job.
Export> CONTINUE_CLIENT
A message is displayed that the job has been reopened, and processing status is output to the client.
Note: After reattaching to the job, a DBA can also kill it by typing KILL_JOB if he doesn't want to continue with the export job.
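Besides STOP_JOB and KILL_JOB, interactive mode supports several other commands. For example, STATUS shows the job's progress, PARALLEL changes the degree of parallelism, ADD_FILE adds a dump file to the set, and EXIT_CLIENT detaches from the job while leaving it running on the server:
Export> STATUS
Export> PARALLEL=4
Export> ADD_FILE=full2.dmp
Export> EXIT_CLIENT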




Data Pump Import
Objects exported by the Data Pump Export utility can be imported into a database using the Data Pump Import utility. The following describes how to use Data Pump Import to import objects.
If you want to import all the objects in a dump file, you can type the following command:
$impdp hr/hr DUMPFILE=dpump_dir1:expfull.dmp FULL=y
LOGFILE=dpump_dir2:full_imp.log
This example imports everything from the expfull.dmp dump file. In this example, a DIRECTORY parameter is not provided. Therefore, a directory object must be provided on both the DUMPFILE parameter and the LOGFILE parameter
The following example loads all tables belonging to the hr schema into the scott schema:
 $impdp SYSTEM/password DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp
REMAP_SCHEMA=hr:scott
If the SCOTT account exists in the database, the hr objects will be loaded into the scott schema. If the scott account does not exist, the Import utility will create the SCOTT account with an unusable password; this is possible because the dump file was exported by the user SYSTEM and is being imported by a user with DBA privileges.
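Because the newly created SCOTT account has an unusable password, a DBA must reset the password before SCOTT can log in (new_password below is a placeholder):
SQL> alter user scott identified by new_password;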
You can use the REMAP_TABLESPACE option to import objects of one tablespace into another tablespace by giving the following command:
$impdp SYSTEM/password DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp
REMAP_TABLESPACE=users:sales

The above example loads the tables stored in the users tablespace into the sales tablespace.
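REMAP_SCHEMA and REMAP_TABLESPACE can also be combined in a single command, for example to move the hr objects into the scott schema and the sales tablespace at the same time (a sketch built from the two examples above):
$impdp SYSTEM/password DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp
REMAP_SCHEMA=hr:scott REMAP_TABLESPACE=users:sales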

You can generate a SQL file containing all the DDL commands that Import would have executed if you had actually run the Import utility.
The following is an example of using the SQLFILE parameter.
$impdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=expfull.dmp
SQLFILE=dpump_dir2:expfull.sql
A SQL file named expfull.sql is written to dpump_dir2.
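SQLFILE can be combined with INCLUDE to extract only the DDL of a particular object type. For example, the following writes just the index definitions to a file (indexes.sql is an arbitrary name):
$impdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=expfull.dmp
INCLUDE=INDEX SQLFILE=dpump_dir2:indexes.sql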
If you have the IMP_FULL_DATABASE role, you can use the SCHEMAS parameter to perform a schema-mode import by specifying a single schema other than your own or a list of schemas to import. First, the schemas themselves are created (if they do not already exist), including system and role grants, password history, and so on. Then all objects contained within the schemas are imported. Nonprivileged users can specify only their own schemas. In that case, no information about the schema definition is imported, only the objects contained within it.
The following is an example of using the SCHEMAS parameter. You can create the expdat.dmp file used in this example by running the example provided for the Export SCHEMAS parameter.
$impdp hr/hr SCHEMAS=hr,oe DIRECTORY=dpump_dir1 LOGFILE=schemas.log
DUMPFILE=expdat.dmp

The hr and oe schemas are imported from the expdat.dmp file. The log file, schemas.log, is written to dpump_dir1.
The following example shows a simple use of the TABLES parameter to import only the employees and jobs tables from the expfull.dmp file. You can create the expfull.dmp dump file used in this example by running the example provided for the full database export in the previous topic.
$impdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=expfull.dmp TABLES=employees,jobs
This will import only the employees and jobs tables from the dump file.
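By default, Import skips a table that already exists in the target database. The TABLE_EXISTS_ACTION parameter changes this behavior; its values are SKIP, APPEND, TRUNCATE, and REPLACE. For example, to drop and re-create the existing tables (a sketch):
$impdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=expfull.dmp
TABLES=employees,jobs TABLE_EXISTS_ACTION=REPLACE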
Like Data Pump Export jobs, Data Pump Import jobs can also be suspended, resumed, or killed, and you can attach to an already running import job from any client machine.
For example, suppose a DBA starts a full import at one client machine, CLNT1, by typing the following command:
$impdp scott/tiger@mydb FULL=y DIRECTORY=data_pump_dir  
     DUMPFILE=full.dmp LOGFILE=myfullexp.log JOB_NAME=myfullJob
After some time, the DBA wants to stop this job temporarily, so he presses CTRL+C to enter interactive mode. He then gets the Import> prompt, where he can type interactive commands.
Now he wants to stop this import job, so he types the following command:
Import> STOP_JOB=IMMEDIATE
Are you sure you wish to stop this job ([y]/n): y
The job is placed in a stopped state and the client exits.
After finishing his other work, the DBA wants to resume the import job, but the client machine from which he originally started the job is unavailable (the user has locked his cabin). So the DBA goes to another client machine and reattaches to the job by typing the following command:
$impdp scott/tiger@mydb ATTACH=myfulljob
After the job status is displayed, he can issue the CONTINUE_CLIENT command to resume logging mode and restart the myfulljob job.
Import> CONTINUE_CLIENT
A message is displayed that the job has been reopened, and processing status is output to the client.
Note: After reattaching to the job, a DBA can also kill it by typing KILL_JOB if he doesn't want to continue with the import job.
