Transporting Statistics from One Database to Another
Notes:
I use the TOOLS schema to hold the statistics tables.
The examples use schema APP, table ORDERS, and stat table STAT_TAB_APP.
Data Pump requires a valid DIRECTORY object with read/write privileges.
Transporting statistics is useful to reproduce plans between environments (PROD > DEV > QA > UAT).
Version:
9iR2 / 2002
Licensing:
Does not require Enterprise Edition
Summary
Short guide on how to export and import optimizer statistics for a schema using DBMS_STATS and Oracle Data Pump. On the source, create a stats table, gather schema stats, export them into that table, and dump it with expdp. On the target, load the dump with impdp and import the stats into the dictionary with DBMS_STATS.IMPORT_SCHEMA_STATS. Use this to move execution plans between environments consistently.
Quick Checklist
- DBMS_STATS.CREATE_STAT_TABLE
- DBMS_STATS.EXPORT_*_STATS (schema/table/index/column and optional system/dictionary)
- Export the stat table with Data Pump (expdp) and move the dump file to the destination
- DBMS_STATS.UPGRADE_STAT_TABLE (if needed)
- (Optional) DBMS_STATS.LOCK_*_STATS
- (Optional) Back up the current stats using DBMS_STATS.EXPORT_*_STATS and a distinct STATID
- DBMS_STATS.IMPORT_*_STATS (with appropriate FORCE and NO_INVALIDATE)
Statistics Transport
Source
First, let's create a table to hold the statistics.
SQL> BEGIN
DBMS_STATS.CREATE_STAT_TABLE(
ownname => 'TOOLS', -- owner of the stat table
stattab => 'STAT_TAB_APP', -- table name
tblspace => 'USERS' -- optional tablespace
);
END;
/
Now we use DBMS_STATS.EXPORT_SCHEMA_STATS to write the statistics into the table we previously created with DBMS_STATS.CREATE_STAT_TABLE.
SQL> BEGIN
DBMS_STATS.EXPORT_SCHEMA_STATS(
ownname => 'APP', -- schema containing the objects
stattab => 'STAT_TAB_APP',
statid => 'APP_2025-09-03' -- label for identification
);
END;
/
This step is optional: export system and dictionary stats if you want full reproducibility.
SQL> BEGIN
DBMS_STATS.EXPORT_SYSTEM_STATS(stattab => 'STAT_TAB_APP', statid => 'SYS_2025-09-03');
DBMS_STATS.EXPORT_DICTIONARY_STATS(stattab => 'STAT_TAB_APP', statid => 'DICT_2025-09-03');
END;
/
With that done, let's export the table holding the statistics and move the .dmp file to the other server. A full table export (rather than CONTENT=DATA_ONLY) also carries the table definition, so the import can create STAT_TAB_APP on the destination.
$ expdp tools/t@SRC DIRECTORY=DP_DIR \
TABLES=TOOLS.STAT_TAB_APP \
DUMPFILE=stats_APP_20250903.dmp LOGFILE=stats_APP_20250903.exp.log
Destination
Import the table that we exported.
$ impdp tools/t@DST DIRECTORY=DP_DIR \
TABLES=TOOLS.STAT_TAB_APP \
DUMPFILE=stats_APP_20250903.dmp LOGFILE=stats_APP_20250903.imp.log
If the destination database is a higher version, upgrade the stat table:
SQL> BEGIN
DBMS_STATS.UPGRADE_STAT_TABLE(ownname => 'TOOLS', stattab => 'STAT_TAB_APP');
END;
/
Lock the stats before importing, to prevent them from being overwritten by the auto-stats job or any other job you may have (the imports below pass FORCE => TRUE, which overrides the lock):
SQL> BEGIN
DBMS_STATS.LOCK_SCHEMA_STATS('APP');
END;
/
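You can confirm the lock took effect by checking the STATTYPE_LOCKED column, which shows ALL for fully locked stats (a quick check; assumes access to DBA_TAB_STATISTICS, available from 10g — the USER_/ALL_ variants work too):
SQL> SELECT table_name, stattype_locked
FROM dba_tab_statistics
WHERE owner = 'APP'
AND stattype_locked IS NOT NULL;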
Backup the current stats before import
This is a best practice: before importing new stats, export the current ones using a different STATID.
SQL> BEGIN
DBMS_STATS.CREATE_STAT_TABLE('TOOLS','STAT_SAFETY_NET');
DBMS_STATS.EXPORT_TABLE_STATS(
ownname => 'APP', tabname => 'ORDERS',
stattab => 'STAT_SAFETY_NET', statid => 'PRE_IMPORT_2025-09-03'
);
END;
/
Import the whole schema stats from the source
Now we just need to run DBMS_STATS.IMPORT_SCHEMA_STATS; Oracle reads the stat table we transported and loads the statistics into the data dictionary.
The example below will import the whole schema stats.
SQL> BEGIN
DBMS_STATS.IMPORT_SCHEMA_STATS(
ownname => 'APP',
stattab => 'STAT_TAB_APP',
statid => 'APP_2025-09-03',
no_invalidate => DBMS_STATS.AUTO_INVALIDATE,
force => TRUE
);
END;
/
Import only one table's stats from the source
If you want to bring over the stats for just one table, or for specific tables, you can import them this way:
SQL> BEGIN
DBMS_STATS.IMPORT_TABLE_STATS(
ownname => 'APP',
tabname => 'ORDERS',
stattab => 'STAT_TAB_APP',
statid => 'APP_2025-09-03', -- label used in the EXPORT
force => TRUE, -- overwrites the current stats
no_invalidate => DBMS_STATS.AUTO_INVALIDATE
);
END;
/
If you exported system/dictionary stats, import them as well:
SQL> BEGIN
DBMS_STATS.IMPORT_SYSTEM_STATS(stattab => 'STAT_TAB_APP', statid => 'SYS_2025-09-03');
DBMS_STATS.IMPORT_DICTIONARY_STATS(stattab => 'STAT_TAB_APP', statid => 'DICT_2025-09-03');
END;
/
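After the imports, a quick sanity check is to compare the imported figures against the source (a sketch using DBA_TAB_STATISTICS; NUM_ROWS and LAST_ANALYZED usually suffice):
SQL> SELECT table_name, num_rows, last_analyzed
FROM dba_tab_statistics
WHERE owner = 'APP'
ORDER BY table_name;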
Rollback
If you need to roll back your statistics and you used version control (STATID), you can do it this way:
SQL> BEGIN
DBMS_STATS.IMPORT_TABLE_STATS(
ownname => 'APP',
tabname => 'ORDERS',
stattab => 'STAT_TAB_APP',
statid => 'PRE_IMPORT_2025-09-03',
force => TRUE,
no_invalidate => DBMS_STATS.AUTO_INVALIDATE
);
END;
/
If you already imported, did not export a backup before the operation, and still want to roll back, Oracle's automatic statistics history offers an alternative.
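In that scenario, Oracle keeps a statistics history (31 days by default, from 10g onward) that can restore earlier stats with DBMS_STATS.RESTORE_TABLE_STATS. A sketch; the timestamp below is illustrative:
SQL> BEGIN
DBMS_STATS.RESTORE_TABLE_STATS(
ownname => 'APP',
tabname => 'ORDERS',
as_of_timestamp => SYSTIMESTAMP - INTERVAL '1' DAY, -- point in time to restore
force => TRUE,
no_invalidate => DBMS_STATS.AUTO_INVALIDATE
);
END;
/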
Best Practices
- Always use STATID to tag each batch (by date/release) to simplify rollbacks.
- FORCE => TRUE ensures existing stats are overwritten on import.
- NO_INVALIDATE: understand the behavior you want; usually you will want dependent cursors invalidated so queries are reparsed with the new stats.
- Export your current stats before importing the new ones, as a backup plan.
- Lock stats with DBMS_STATS.LOCK_*_STATS to prevent auto-gather jobs from altering imported stats.
- Histograms are transported. If you don’t want them in the target, adjust METHOD_OPT and recollect.
- Partitioned tables: if using incremental stats, ensure preferences match across environments.
- System/Dictionary stats: transport only if you want the same CPU/I/O cost model; otherwise gather fresh ones in the target when necessary.
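For the incremental-stats point above, DBMS_STATS.GET_PREFS (available from 11g) lets you compare preferences between environments (a sketch; run it on both source and target):
SQL> SELECT DBMS_STATS.GET_PREFS('INCREMENTAL', 'APP', 'ORDERS') AS incremental,
DBMS_STATS.GET_PREFS('METHOD_OPT', 'APP', 'ORDERS') AS method_opt
FROM dual;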