Data Pump Monitoring

During database maintenance, one of the main questions is when long-running processes
such as export, import, backup, recovery, or flashback will finish.

I usually use v$session_longops, which gives an estimate of when a process will finish:

select message, time_remaining/3600 hrs from v$session_longops where sofar <> totalwork;

After one of the recent Data Pump executions, I tested import and export on Oracle 11.1.0.7 (Linux x86-64)
and found that for both export and import master processes, the message column in v$session_longops
stays at something like “0 out of … MB done” for the entire run, even when the job is far past the beginning.

However, it is the Data Pump worker processes that do the actual job, and for export they
report how many blocks have been scanned out of the total number:

d117> @longops
     SOFAR  TOTALWORK UNITS                            TIME_REMAINING ELAPSED_SECONDS MESSAGE
---------- ---------- -------------------------------- -------------- --------------- ------------------------------------------------------------
     16613     512501 Blocks                                      239               8 Rowid Range Scan:  AG.T1: 16613 out of 512501 Blocks done
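
For reference, the @longops script used above is essentially the query from the beginning
of this post with a few extra columns; a minimal sketch that would produce this output:

select sofar, totalwork, units, time_remaining, elapsed_seconds, message
from v$session_longops
where sofar <> totalwork;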

For import, however, I did not find any information in v$session_longops that identifies
the progress of the work done by the worker process.

In addition, there are some other observations from experiments on a test database
(Oracle 11.1.0.7 on Linux x86-64) with a 4 GB table:

1. The number of bytes written to the export file stays at ~4 KB until the end of the export,
even though the file has grown to several gigabytes.

2. For import, the percent done in the worker status is set to 100% from the beginning and never changes.

3. For import, the number of completed bytes shows the size of the whole import, while completed rows
reflects the number of processed rows.

4. For import, the number of processed rows does not necessarily mean inserted rows, since it is not
reset by a stop_job action, which rolls back all rows inserted into the table.

5. If an import is stopped with stop_job and then resumed with start_job, v$session_longops
no longer has any information about the import.

6. time_remaining in v$session_longops is null for the Data Pump Import master process.

7. Abbreviated command names can be used in interactive mode: stat for status, star for start_job.
The even shorter sta behaves as start_job and status combined. Guess what “s” is up to
(the attach command that opens this prompt is shown after the listing) …

Import> s
ORA-39004: invalid state
ORA-39016: Operation not supported when job is in EXECUTING state.
Are you sure you wish to stop this job ([yes]/no):
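
To reach the Import> prompt above, attach to the running job in interactive mode,
for example (the job name here is an assumption matching the PL/SQL example later in this post):

$ impdp system attach=SYS.SYS_IMPORT_FULL_01

Import> status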

So, if there is not much useful information in v$session_longops and space for the segment
is already allocated, how do we get the same “rows completed” value of the worker status
as in interactive mode, but from a SQL script?

With a small modification of an example from the documentation, I came up with a PL/SQL block
that reads the Data Pump Import worker status attributes:

set serveroutput on

DECLARE
  ind NUMBER;              -- Loop index
  h1 NUMBER;               -- Data Pump job handle
  job_state VARCHAR2(30);  -- To keep track of job state
  js ku$_JobStatus;        -- The job status from get_status
  ws ku$_WorkerStatusList; -- Worker status list
  sts ku$_Status;          -- The status object returned by get_status
BEGIN
  h1 := DBMS_DATAPUMP.attach('SYS_IMPORT_FULL_01', 'SYS'); -- job name and owner
  dbms_datapump.get_status(h1,
             dbms_datapump.ku$_status_job_error +
             dbms_datapump.ku$_status_job_status +
             dbms_datapump.ku$_status_wip, 0, job_state, sts);
  js := sts.job_status;
  ws := js.worker_status_list;
  dbms_output.put_line('*** Job percent done = ' || to_char(js.percent_done));
  dbms_output.put_line('restarts - ' || js.restart_count);
  -- Walk the worker status list and print rows completed by each worker
  ind := ws.first;
  while ind is not null loop
    dbms_output.put_line('rows completed - ' || ws(ind).completed_rows);
    ind := ws.next(ind);
  end loop;
  DBMS_DATAPUMP.detach(h1); -- release the job handle
END;
/

This PL/SQL block allows you not only to monitor the progress of a Data Pump export/import,
but also to take actions based on the results, which is definitely more flexible than the
Data Pump interactive mode.
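
For example, wrapping get_status in a loop, along the lines of the documentation example,
turns the block into a simple progress monitor (a minimal sketch; the job name and owner
are the same assumptions as above):

DECLARE
  h1 NUMBER;               -- Data Pump job handle
  job_state VARCHAR2(30);  -- Current job state
  sts ku$_Status;          -- Status object returned by get_status
BEGIN
  h1 := DBMS_DATAPUMP.attach('SYS_IMPORT_FULL_01', 'SYS');
  job_state := 'UNDEFINED';
  while job_state not in ('COMPLETED', 'STOPPED') loop
    -- A timeout of -1 blocks until the job reports new work in progress
    dbms_datapump.get_status(h1,
               dbms_datapump.ku$_status_job_error +
               dbms_datapump.ku$_status_job_status +
               dbms_datapump.ku$_status_wip, -1, job_state, sts);
    dbms_output.put_line('state: ' || job_state || ', percent done: ' ||
                         to_char(sts.job_status.percent_done));
  end loop;
  DBMS_DATAPUMP.detach(h1);
END;
/

One could extend the loop to call dbms_datapump.stop_job on the handle when, say,
the job exceeds a time budget.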

Have a good day!
