Data Pump
• The Data Pump facility runs on the server side.
• It enables very high-speed movement of data and metadata from one database to another.
• Dump files generated by Data Pump export are not compatible with the original export utility (exp).
Benefits of using Data Pump over normal export and import
• The ability to restart Data Pump jobs.
• The ability to detach from and reattach to long-running jobs without affecting the job itself.
• Data Pump export and import use parallel execution rather than a single stream of execution, for improved performance.
Data Pump Export Modes
The Data Pump export utility is invoked using the expdp command. A backup can be taken in the following modes:
• Full Export Mode:- Specified using the FULL parameter in the expdp command
• Schema Mode:- Specified using the SCHEMAS parameter
• Table Mode:- Specified using the TABLES parameter
• Tablespace Mode:- Specified using the TABLESPACES parameter
• Transportable Tablespace Mode:- Specified using the TRANSPORT_TABLESPACES parameter
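As an alternative to listing everything on the command line, expdp also accepts a parameter file via the PARFILE parameter. A minimal sketch for a schema-mode export (the file name schema_exp.par is illustrative; the directory and schema match the examples below):

```
# schema_exp.par -- hypothetical parameter file for a schema-mode export
directory=expdp_bkp
dumpfile=scott.dmp
logfile=scott.log
schemas=scott
```

It is then invoked as:

$ expdp system/<password> parfile=schema_exp.par

A parameter file also avoids shell-escaping problems with parameters such as QUERY.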
Data Pump Import Modes
The Data Pump import utility is invoked using the impdp command. A backup can be imported in the following modes:
• Full Import Mode:- Specified using the FULL parameter in the impdp command
• Schema Mode:- Specified using the SCHEMAS parameter
• Table Mode:- Specified using the TABLES parameter
• Tablespace Mode:- Specified using the TABLESPACES parameter
• Transportable Tablespace Mode:- Specified using the TRANSPORT_DATAFILES parameter
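For completeness, a hedged sketch of a transportable tablespace import; the dump file name tts.dmp and the datafile path are illustrative, and the tablespace's datafiles must already have been copied to the target system:

```
$ impdp system/<password> directory=expdp_bkp dumpfile=tts.dmp \
    transport_datafiles='/u02/oradata/users01.dbf'
```

TRANSPORT_DATAFILES tells impdp where the copied datafiles live on the target; the dump file itself contains only the metadata.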
Export Examples
Before taking a Data Pump backup, we need to create a backup directory at the OS level and then register it at the DB level as the SYS user.
1. Create /u02/expdp_bkp at the OS level.
2. At the SQL prompt:
SQL> create or replace directory expdp_bkp as '/u02/expdp_bkp';
SQL> grant read, write on directory expdp_bkp to public;
Example 1:- Taking a schema (SCOTT user) backup
$ expdp system/<password> dumpfile=scott.dmp logfile=scott.log directory=expdp_bkp schemas=scott
Giving a job name to the backup:
$ expdp scott/<password> dumpfile=scott.dmp logfile=scott.log directory=expdp_bkp job_name=dpschema
We can monitor Data Pump jobs by querying dba_datapump_jobs as the SYS/SYSTEM user.
SQL> select * from dba_datapump_jobs;
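A more targeted monitoring query can help when several jobs exist; the columns below are standard columns of the dba_datapump_jobs view:

```
SQL> select owner_name, job_name, operation, job_mode, state
     from dba_datapump_jobs;
```

The STATE column shows values such as EXECUTING or NOT RUNNING, which is useful when deciding whether a stopped job can be reattached.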
Example 2:- Taking a full database backup
$ expdp system/<password> dumpfile=fulldb.dmp logfile=fulldb.log directory=expdp_bkp full=y
Example 3:- Taking a table-level backup
$ expdp scott/<password> dumpfile=tables.dmp logfile=tables.log directory=expdp_bkp job_name=dptables tables=emp,dept
Example 4:- Taking a row-level backup
To back up only the rows of the EMP table where deptno = 10:
$ expdp scott/<password> dumpfile=query.dmp logfile=query.log directory=expdp_bkp tables=emp query=\'where deptno=10\'
Example 5:- Stopping and restarting Data Pump jobs
Stopping a Data Pump job:-
While the backup job is running, press Ctrl+C at the OS level, then answer the prompts:
Export> stop_job=immediate
Are you sure you wish to stop this job ([yes]/no): yes
Restarting a stopped job (attach as the user who owns the job):-
$ expdp <username>/<password> attach=<job name>
Export> continue_client
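Besides stop_job and continue_client, the interactive Export> prompt accepts a few other standard Data Pump commands worth knowing:

```
Export> status        -- show the job's current progress
Export> exit_client   -- leave the client; the job keeps running on the server
Export> kill_job      -- abort the job and delete its dump files
```

Note the difference between exit_client (job continues) and kill_job (job is destroyed and cannot be restarted).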
Example 6:- Using the parallel option during export or import
$ expdp system/<password> dumpfile=fulldb_%U.dmp logfile=fulldb.log directory=expdp_bkp full=y parallel=4
With parallel=4, the %U wildcard in the dump file name lets each worker process write to its own dump file.
Import Examples
Example 1:- Importing the SCOTT backup into another database
NOTE:- If the backup was exported by the SCOTT user, you need to create the SCOTT user explicitly on the target before the import. If the backup was exported by the SYS/SYSTEM user, there is no need to create the target user explicitly; IMPDP will take care of creating the user.
$ impdp system/<password> dumpfile=scott.dmp logfile=scottimp.log directory=expdp_bkp schemas=scott
Example 2:- Importing the dump of the SCOTT user into the USER3 schema
$ impdp system/<password> dumpfile=scott.dmp logfile=scottimp.log directory=expdp_bkp remap_schema=scott:USER3 table_exists_action=replace