

Showing posts from October, 2018

Export Import Over Network Link

How to Use the EXPDP/IMPDP NETWORK_LINK Parameter

SOURCE: TEST1T, HOST: test1.oracle.com
TARGET: TEST2T, HOST: test2.oracle.com

Objective: Trigger the export of the database schema TEST_EXP@TEST1T from test2.oracle.com and write the export dumps on test2.oracle.com.

Preparation:

Step 1: Add the source database TNS entry to the target database tnsnames.ora

TEST1T =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = test1.oracle.com)(PORT = 1521))
    (CONNECT_DATA =
      (SERVER = DEDICATED)
      (SERVICE_NAME = TEST1T)
    )
  )

Step 2: Create a private database link in the target database pointing to the source database

SQL> create database link exp_test connect to TEST_EXP identified by xxxx using 'TEST1T';

Step 3: Create directories and grant privileges to the user (who will initiate the export) in the target database

SQL> create directo...
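The excerpt above is cut off before the actual export command. A minimal sketch of how such a network-link export is typically invoked from the target host (the local user, directory name, and dump file names below are assumptions for illustration, not taken from the original post):

$ expdp test_local@TEST2T directory=EXP_DIR network_link=exp_test schemas=TEST_EXP dumpfile=test_exp_%U.dmp logfile=test_exp.log

Because NETWORK_LINK is used, the data is read from TEST1T over the database link exp_test and the dump files are written to EXP_DIR on test2.oracle.com.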

ORA-31625 and ORA-01031 IMPDP

ORA-31625: Schema XXX is needed to import this object, but is unaccessible
ORA-01031: insufficient privileges

$ impdp test/xxxx@pdb01 directory=exp NETWORK_LINK=exp_test logfile=exp.log

Import: Release 12.2.0.1.0 - Production on Mon Oct 29 14:26:48 2018
Copyright (c) 1982, 2017, Oracle and/or its affiliates.  All rights reserved.
Connected to: Oracle Database 12c Enterprise Edition Release 12.2.0.1.0 - 64bit Production
Starting "TEST"."SYS_IMPORT_SCHEMA_01":  test/********@pdb01 directory=exp NETWORK_LINK=exp_test logfile=exp.log
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 64 KB
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
ORA-39083: Object type PROCACT_SCHEMA failed to create with error:
ORA-31625: Schema TEST_EXP is needed to import this object, but is unaccessible
ORA-01031: insufficient privileges...
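The excerpt is truncated before the post's resolution. One common fix for this combination of errors (an assumption here, not quoted from the original post) is to give the importing user the privileges needed to create objects in another schema, for example by granting the IMP_FULL_DATABASE role, and then re-running the import:

SQL> grant imp_full_database to test;

$ impdp test/xxxx@pdb01 directory=exp NETWORK_LINK=exp_test logfile=exp.log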

Track the Time Spent on Each Operation in Expdp and Impdp

Trace Expdp and Impdp Performance Issues Using the LOGTIME and METRICS Parameters

You can track the time of each export/import operation and direct the export/import job to write timestamps in the logfile using the LOGTIME parameter (more on Expdp/Impdp LOGTIME: Read Here). But if you would like additional information about the job (e.g. which process is doing what and taking how much time), you can use the METRICS=Y parameter. These parameters are useful for investigating Data Pump performance and estimating the time of similar export/import operations.

METRICS
Default: NO
Syntax and Description
METRICS=[YES | NO]

An Example
$ cat exp_full_pdb01.par
DIRECTORY=DUMP
DUMPFILE=EXP_PDB01_FULL%U.DMP
LOGFILE=EXP_PDB01_FULL.LOG
SCHEMAS=TEST
REUSE_DUMPFILES=Y
FLASHBACK_TIME=SYSTIMESTAMP
METRICS=Y

$ expdp system@PDB01 parfile=exp_full_pdb01.par
Export: Release 12.2.0.1.0 - Production on Fri Oct 26 13:14:59 2018
Copyright (c) 1982, 2017, Oracle and/or its affili...
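The log output is cut off above, so as a companion, here is a minimal sketch of using METRICS=Y together with LOGTIME=ALL on the import side (the parfile name and its contents are assumptions modelled on the export example, not part of the original post):

$ cat imp_test_pdb01.par
DIRECTORY=DUMP
DUMPFILE=EXP_PDB01_FULL%U.DMP
LOGFILE=IMP_PDB01_TEST.LOG
SCHEMAS=TEST
LOGTIME=ALL
METRICS=Y

$ impdp system@PDB01 parfile=imp_test_pdb01.par

With both parameters set, every log line carries a timestamp and each object-type phase additionally reports how many objects it processed and how long it took.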

LOGFILE and LOGTIME in Expdp and Impdp

LOGFILE and LOGTIME in Expdp and Impdp

LOGFILE
Specifies the name of the export/import logfile.
Default: export.log
Syntax and Description
LOGFILE=[directory_object:]file_name

LOGTIME
Specifies that messages displayed during export operations be timestamped. Using timestamps in the logfile, you can investigate which operation took how long.
Default: NONE <= No timestamps are recorded
Syntax and Description
LOGTIME=[NONE | STATUS | LOGFILE | ALL]
• STATUS -- Timestamps on status messages only
• LOGFILE -- Timestamps on log file messages only
• ALL -- Timestamps on both status and log file messages

An Example
$ cat exp_full_pdb01.par
DIRECTORY=DUMP
DUMPFILE=EXP_PDB01_FULL%U.DMP
LOGFILE=EXP_PDB01_FULL.LOG
FULL=Y
REUSE_DUMPFILES=Y
FLASHBACK_TIME=SYSTIMESTAMP
LOGTIME=ALL

$ expdp system@PDB01 parfile=exp_full_pdb01.par
Export: Release 12.2.0.1.0 - Production on Fri Oct 26 11:53:35 2018
Copyright (c) 1982, 2017, Oracle and/or its affil...
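To illustrate the LOGFILE=[directory_object:]file_name form and a LOGTIME value other than ALL, a small command-line sketch (the schema, dump file, and directory names are assumptions for illustration):

$ expdp system@PDB01 schemas=TEST directory=DUMP dumpfile=exp_test.dmp logfile=DUMP:exp_test.log logtime=STATUS

Here the log file is written to the DUMP directory object, and timestamps appear only on the status messages, not in the log file itself.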

ORA-46646: file from which keys are to be imported is invalid

ORA-46646: file from which keys are to be imported is invalid

SQL> administer key management import encryption keys with secret "xxxxx" from '/tmp/pdb02_tde_key.exp' force keystore identified by "xxxxx";
administer key management import encryption keys with secret "xxxx" from '/tmp/pdb02_tde_key.exp' force keystore identified by "xxxx"
*
ERROR at line 1:
ORA-46646: file from which keys are to be imported is invalid

Cause: Wrong file permissions on the file '/tmp/pdb02_tde_key.exp'.
Solution: Verify and correct the permissions on the encryption key dump file and re-run the import. The importing user must have read/write privileges on the file to import it successfully.
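A quick way to check and correct the file ownership and permissions at the OS level (a sketch; the oracle:oinstall owner and the 640 mode are assumptions that depend on your installation):

$ ls -l /tmp/pdb02_tde_key.exp
$ chown oracle:oinstall /tmp/pdb02_tde_key.exp
$ chmod 640 /tmp/pdb02_tde_key.exp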

ORA-46655: no valid keys in the file from which keys are to be imported

SQL> administer key management import encryption keys with secret "xxxx" from '/tmp/pdb02_tde_key.exp' force keystore identified by "xxxx" with backup;
administer key management import encryption keys with secret "xxxxxx" from '/tmp/pdb02_tde_key.exp' force keystore identified by "xxxxxx" with backup
*
ERROR at line 1:
ORA-46655: no valid keys in the file from which keys are to be imported

Cause: Either the keys to be imported are already present in the target database, or the correct container (PDB) is not set.
Solution: In my case I got the error because I attempted to import the keys for the newly plugged-in database PDB02 from the CDB$ROOT container. To solve the issue, just switch to the correct container and re-run the import.

SQL> show con_name
CON_NAME
------------------------------
CDB$ROOT <=== Wrong container selected

SQL> alter session set container=PDB02;
Session alt...
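The excerpt is cut off after the container switch; the remaining step, as described in the post, is simply to re-run the same key import from within PDB02 (a sketch repeating the statement from above):

SQL> alter session set container=PDB02;
SQL> administer key management import encryption keys with secret "xxxx" from '/tmp/pdb02_tde_key.exp' force keystore identified by "xxxx" with backup;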

How to Track the Status of Completed Export or Import Jobs

How to Retain the Master Table of Completed Export or Import Jobs

Use the KEEP_MASTER parameter.
Default: NO
Syntax and Description
KEEP_MASTER=[YES | NO]

$ cat exp_full_pdb01.par
DIRECTORY=DUMP
DUMPFILE=EXP_PDB01_FULL%U.DMP
LOGFILE=EXP_PDB01_FULL.LOG
FULL=Y
REUSE_DUMPFILES=Y
FLASHBACK_TIME=SYSTIMESTAMP
JOB_NAME=MY_EXPORT_JOB
KEEP_MASTER=Y

You can query the status of an export/import job by attaching to it even after it has completed, but only if you used KEEP_MASTER=Y while performing the export/import. If you did not use the parameter and you try to query the status, you will see errors similar to this:

$ expdp system@PDB01 attach=MY_EXPORT_JOB
Export: Release 12.2.0.1.0 - Production on Wed Oct 24 16:21:38 2018
Copyright (c) 1982, 2017, Oracle and/or its affiliates.  All rights reserved.
Password:
Connected to: Oracle Database 12c Enterprise Edition Release 12.2.0.1.0 - 64bit Production
ORA-31626: job does not exist
ORA-06512: at "SYS.DB...
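When the master table has been retained, it remains as a regular table named after the job in the schema of the user that ran the export (SYSTEM in this example), and the job also stays visible in DBA_DATAPUMP_JOBS. A small sketch of checking both (based on the job name above; the queries are illustrative, not from the original post):

SQL> select owner_name, job_name, state from dba_datapump_jobs;

SQL> select count(*) from system.MY_EXPORT_JOB;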

User-Defined Name for Export or Import Jobs

How to Give a Name to an Export or Import Job

In some cases you don't want to use a system-generated name for an export or import job; instead you want to assign a name so you can recognize the job easily when attaching to it or when querying information from DBA_DATAPUMP_JOBS. You can use the JOB_NAME parameter of the expdp or impdp utility to achieve this.

JOB_NAME
Default: system-generated name of the form SYS_EXPORT_<mode>_NN <= in export
Default: system-generated name of the form SYS_<IMPORT or SQLFILE>_<mode>_NN <= in import
Syntax and Description
JOB_NAME=jobname_string

An example:
$ cat exp_full_pdb01.par
DIRECTORY=DUMP
DUMPFILE=EXP_PDB01_FULL%U.DMP
LOGFILE=EXP_PDB01_FULL.LOG
FULL=Y
REUSE_DUMPFILES=Y
FLASHBACK_TIME=SYSTIMESTAMP
JOB_NAME=MY_EXPORT_JOB

$ expdp system@PDB01 parfile=exp_full_pdb01.par
Export: Release 12.2.0.1.0 - Production on Wed Oct 24 16:14:38 2018
Copyright (c) 1982, 2017, O...
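While the job is running, the user-defined name makes it easy to find it in the dictionary and to re-attach to it; a short sketch of both (the job name matches the parfile above):

SQL> select owner_name, job_name, operation, job_mode, state from dba_datapump_jobs;

$ expdp system@PDB01 attach=MY_EXPORT_JOB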

How to Take Consistent Database Export Using expdp

Taking a Consistent Database Export Using expdp

To perform a consistent database export using expdp, you can use either the FLASHBACK_SCN or the FLASHBACK_TIME parameter.

Expdp FLASHBACK_SCN
Syntax and Description
FLASHBACK_SCN=scn_value

$ cat exp_full_pdb01.par
DIRECTORY=DUMP
DUMPFILE=EXP_PDB01_FULL%U.DMP
LOGFILE=EXP_PDB01_FULL.LOG
FULL=Y
FLASHBACK_SCN=4170479

You can find the current_scn in the v$database view.

Expdp FLASHBACK_TIME
Syntax and Description
FLASHBACK_TIME="TO_TIMESTAMP(time-value)"

$ cat exp_full_pdb01.par
DIRECTORY=DUMP
DUMPFILE=EXP_PDB01_FULL%U.DMP
LOGFILE=EXP_PDB01_FULL.LOG
FULL=Y
REUSE_DUMPFILES=Y
FLASHBACK_TIME="TO_TIMESTAMP('23-10-2018 14:40:00', 'DD-MM-YYYY HH24:MI:SS')"

$ cat exp_full_pdb01.par
DIRECTORY=DUMP
DUMPFILE=EXP_PDB01_FULL%U.DMP
LOGFILE=EXP_PDB01_FULL.LOG
FULL=Y
REUSE_DUMPFILES=Y
FLASHBACK_TIME=SYSTIMESTAMP
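As a companion to the note about v$database, a quick way to capture the SCN value to plug into FLASHBACK_SCN, and to derive one from a point in time (the timestamp shown simply reuses the example value from above):

SQL> select current_scn from v$database;

SQL> select timestamp_to_scn(to_timestamp('23-10-2018 14:40:00','DD-MM-YYYY HH24:MI:SS')) as scn from dual;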

How to Exclude or Include a Particular Object from a Database Export: expdp

Excluding Objects While Exporting a Database

You can exclude objects while exporting a database using the EXCLUDE parameter.
Syntax and Description
EXCLUDE=object_type[:name_clause] [, ...]
The object_type specifies the type of object to be excluded.
NOTE: If an object is excluded, then all of its dependent objects are also excluded. For example, excluding a table will also exclude all indexes and triggers on the table.

Run an export using expdp with a par file as below.
$ expdp system@PDB01 parfile=exp_full_pdb01.par

Excluding a Particular Schema From a Full Database Export
$ cat exp_full_pdb01.par
DIRECTORY=DUMP
DUMPFILE=EXP_PDB01_FULL%U.DMP
LOGFILE=EXP_PDB01_FULL.LOG
FULL=Y
EXCLUDE=SCHEMA:"='TEST'"
REUSE_DUMPFILES=Y

Excluding Multiple Schemas From a Full Database Export
$ cat exp_full_pdb01.par
DIRECTORY=DUMP
DUMPFILE=EXP_PDB01_FULL%U.DMP
LOGFILE=EXP_PDB01_FULL.LOG
FULL=Y
EXCLUDE=SCHEMA:" IN ('TEST...
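The same syntax also works for whole object types rather than named schemas; a small parfile sketch (this example is an assumption modelled on the ones above, not part of the original post) that skips indexes and optimizer statistics for a leaner export:

$ cat exp_full_pdb01_noidx.par
DIRECTORY=DUMP
DUMPFILE=EXP_PDB01_FULL%U.DMP
LOGFILE=EXP_PDB01_FULL.LOG
FULL=Y
EXCLUDE=INDEX
EXCLUDE=STATISTICS
REUSE_DUMPFILES=Y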

How to Estimate Space Requirement for an Oracle Export

expdp ESTIMATE_ONLY=Y

To estimate the space requirement for an export, use the ESTIMATE_ONLY parameter.
Default: NO
Syntax and Description
ESTIMATE_ONLY=[YES | NO]
If ESTIMATE_ONLY=YES, then Export estimates the space that would be consumed, but quits without actually performing the export operation.

ESTIMATE_ONLY by default uses ESTIMATE=BLOCKS. If you want to use STATISTICS instead of BLOCKS for the space calculation, use ESTIMATE=STATISTICS together with ESTIMATE_ONLY=Y. For ESTIMATE=STATISTICS to calculate an accurate space requirement, all tables should have been analyzed recently.

Note: The DUMPFILE parameter is not supported together with ESTIMATE_ONLY, and you will receive the error below if you use them together; therefore, while using ESTIMATE_ONLY=Y, comment out the DUMPFILE parameter in the export par file.
ORA-39002: invalid operation
ORA-39201: Dump files are not supported for estimate only jobs.

$ cat exp_full_pdb01.par
DIRECTORY=...
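The example parfile is cut off above; a minimal sketch of how such an estimate-only parfile plausibly looks (the contents beyond DIRECTORY are assumptions based on the other examples in this series):

$ cat exp_full_pdb01.par
DIRECTORY=DUMP
# DUMPFILE=EXP_PDB01_FULL%U.DMP   <= commented out for the estimate-only run
LOGFILE=EXP_PDB01_FULL.LOG
FULL=Y
ESTIMATE_ONLY=Y
ESTIMATE=STATISTICS

$ expdp system@PDB01 parfile=exp_full_pdb01.par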

ORA-15040: diskgroup is incomplete

ORA-15040: diskgroup is incomplete

SQL> startup
ORA-00099: warning: no parameter file specified for ASM instance
ASM instance started
Total System Global Area 1140850688 bytes
Fixed Size                  8629704 bytes
Variable Size            1107055160 bytes
ASM Cache                  25165824 bytes
ORA-15110: no diskgroups mounted

Reason: The reason for this error is simply that ASM is not able to find some or all of the disks.
Solution: Investigate and make all the disks available to ASM so that it can mount the disk group. Make sure the disks have the proper permissions. If you are using AFD, check that the following services are online:
oracleacfs
oracleadvm
oracleoks
oracleafd
Source of Problem: Issue started af...
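A few checks that typically help in this situation (a sketch; the exact commands depend on your Grid Infrastructure setup, and the AFD commands apply only if ASM Filter Driver is in use):

$ lsmod | grep oracle        # oracleafd, oracleacfs, oracleadvm and oracleoks modules should be loaded
$ asmcmd afd_state           # confirm the ASM Filter Driver is enabled
$ asmcmd afd_lsdsk           # list AFD-labelled disks and their state
$ ls -l /dev/oracleafd/disks # verify ownership and permissions of the disk devices

SQL> select path, header_status, mount_status from v$asm_disk;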