impdp error - bad dump file specification (Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production)
impdp error - bad dump file specification [message #655722] Fri, 09 September 2016 16:24
jturck
Messages: 7
Registered: September 2016
Junior Member
I'm very new to Oracle's expdp/impdp.
I received a dump file (exp_CNTRJTURCK.dmp) from the Oracle DBA, who created it with expdp, and I'm trying to load it into a new Oracle RDS instance. I used a Perl script to copy the dump file to the DATA_PUMP_DIR, and I'm now attempting to import/build my new schema and load the data.

The command I'm running is:
impdp oracleadmin@ORCL schemas=CNTRJTURCK directory=DATA_PUMP_DIR dumpfile=exp_CNTRJTURCK.dmp logfile=exp_CNTRJTURCK.log

And the message I received is:
Import: Release 11.2.0.1.0 - Production on Fri Sep 9 21:16:07 2016

Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Password:

Connected to: Oracle Database 11g Release 11.2.0.4.0 - 64bit Production
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31619: invalid dump file "/rdsdbdata/datapump/exp_CNTRJTURCK.dmp"
ORA-27072: File I/O error
Linux-x86_64 Error: 22: Invalid argument
Additional information: 4
Additional information: 1
Additional information: -1

I believe the dump file was loaded to the DATA_PUMP_DIR because I can see this:
SQL> select DIRECTORY_NAME, DIRECTORY_PATH from dba_directories;

DIRECTORY_NAME                 DIRECTORY_PATH
------------------------------ --------------------------------------------------
BDUMP                          /rdsdbdata/log/trace
DATA_PUMP_DIR                  /rdsdbdata/datapump
ADUMP                          /rdsdbdata/log/audit


SQL> SELECT * from table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR'));

FILENAME                 TYPE             FILESIZE MTIME
------------------------ ---------- -------------- ---------
datapump/                directory            4096 09-SEP-16
import.log               file                  460 07-SEP-16
exp_CNTRJTURCK.log       file                  468 07-SEP-16
exp_CNTRJTURCK.dmp       file           1575583744 09-SEP-16

Ideas?

Re: impdp error - bad dump file specification [message #655724 is a reply to message #655722] Fri, 09 September 2016 16:41
Solomon Yakobson
Messages: 3269
Registered: January 2010
Location: Connecticut, USA
Senior Member
Does the dump file reside on a NAS drive? Most likely it does. You must mount the NAS drive with specific options so Oracle can read it. There is a document on MOS listing which options should be used depending on the OS.

SY.
Re: impdp error - bad dump file specification [message #655727 is a reply to message #655724] Fri, 09 September 2016 16:51
jturck
Messages: 7
Registered: September 2016
Junior Member
Hi Solomon,

I'm not sure. I asked my cloud guy and he said, "OK, I don't know what kind of drive the RDS has, it's hidden, only a service."
I do know the dump file is on EC2 and on the RDS back end.

Does this make sense?

J
Re: impdp error - bad dump file specification [message #655729 is a reply to message #655727] Fri, 09 September 2016 18:28
jturck
Messages: 7
Registered: September 2016
Junior Member
I should add that I copied the dump file from EC2 to the DATA_PUMP_DIR via a Perl script.

And just for clarification, my source DB is an on-premises Oracle DB and my target DB is an AWS RDS Oracle database.

I also tried loading via a PL/SQL script:
declare h1 number;

begin

h1 := dbms_datapump.open (operation => 'IMPORT', job_mode => 'schema', job_name => 'IMPORT_DUMP3', version => 'COMPATIBLE');
dbms_datapump.add_file(handle => h1, filename => 'IMPORT.LOG', directory => 'DATA_PUMP_DIR', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
dbms_datapump.add_file(handle => h1, filename => 'IMPORT.LOG', directory => 'DATA_PUMP_DIR', filetype => 3);
dbms_datapump.set_parameter(handle => h1, name => 'KEEP_MASTER', value => 0);
dbms_datapump.add_file(
handle => h1,
filename => 'exp_CNTRJTURCK.dmp',
directory => 'DATA_PUMP_DIR',
filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE ,
reusefile => 1
);
dbms_datapump.set_parameter(handle => h1, name => 'INCLUDE_METADATA', value => 1);
dbms_datapump.set_parameter(handle => h1, name => 'DATA_ACCESS_METHOD', value => 'AUTOMATIC');
dbms_datapump.set_parameter(handle => h1, name => 'REUSE_DATAFILES', value => 0);
dbms_datapump.set_parameter(handle => h1, name => 'SKIP_UNSUSABLE_INDEXES', value => 0);
dbms_datapump.start_job(handle => h1, skip_current => 0, abort_step => 0);
dbms_datapump.detach(handle => h1);
end;
/

When I execute the SQL I get the following:
SQL> @import2.sql
declare h1 number;
*
ERROR at line 1:
ORA-39001: invalid argument value
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 3507
ORA-06512: at "SYS.DBMS_DATAPUMP", line 3756
ORA-06512: at line 9
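
Side note for anyone hitting the same generic ORA-39001 from DBMS_DATAPUMP: the underlying cause can usually be pulled out with DBMS_DATAPUMP.GET_STATUS from inside an exception handler. A minimal sketch (not the script above; the job name IMPORT_DUMP_DEBUG and the skeleton block are just illustrative):

declare
  h1        number;
  job_state varchar2(30);
  sts       ku$_Status;
begin
  h1 := dbms_datapump.open(operation => 'IMPORT', job_mode => 'SCHEMA', job_name => 'IMPORT_DUMP_DEBUG');
  -- ... add_file / set_parameter / start_job calls as above ...
exception
  when others then
    -- ask the Data Pump job for its error log entries, then re-raise
    dbms_datapump.get_status(h1, dbms_datapump.ku$_status_job_error, 0, job_state, sts);
    if bitand(sts.mask, dbms_datapump.ku$_status_job_error) != 0 then
      for i in sts.error.FIRST .. sts.error.LAST loop
        dbms_output.put_line(sts.error(i).LogText);
      end loop;
    end if;
    raise;
end;
/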
Re: impdp error - bad dump file specification [message #655730 is a reply to message #655729] Fri, 09 September 2016 19:31
BlackSwan
Messages: 26766
Registered: January 2009
Location: SoCal
Senior Member
Post results from OS commands below

id
ls -l /rdsdbdata/datapump/exp_CNTRJTURCK.dmp
ls -ld /rdsdbdata/datapump/
ls -ld /rdsdbdata/
Re: impdp error - bad dump file specification [message #655737 is a reply to message #655722] Sat, 10 September 2016 04:46
John Watson
Messages: 8922
Registered: January 2010
Location: Global Village
Senior Member
The problem may be to do with the log file, as I see there is already a log file of the same name. Can you try the import again, specifying a different log file name?

@BS, I don't think one gets shell access to an RDS instance.
@SY, I don't think RDS gives you that level of control.
Re: impdp error - bad dump file specification [message #655751 is a reply to message #655737] Sun, 11 September 2016 20:36
jturck
Messages: 7
Registered: September 2016
Junior Member
Hi John,
Thanks for the response. I tried with a new log file name and still got the same error:

oracle Cloud DB RDBMS: impdp oracleadmin@ORCL schemas=CNTRJTURCK directory=DATA_PUMP_DIR dumpfile=exp_CNTRJTURCK.dmp logfile=exp_CNTRJTURCK2.log


Import: Release 11.2.0.1.0 - Production on Mon Sep 12 01:34:09 2016

Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Password:

Connected to: Oracle Database 11g Release 11.2.0.4.0 - 64bit Production
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-31619: invalid dump file "/rdsdbdata/datapump/exp_CNTRJTURCK.dmp"
ORA-27072: File I/O error
Linux-x86_64 Error: 22: Invalid argument
Additional information: 4
Additional information: 1
Additional information: -1

And yes, you are right. I cannot access the dump file from the command line; I can only see that it has been loaded to the RDS.

Re: impdp error - bad dump file specification [message #655761 is a reply to message #655751] Mon, 12 September 2016 03:23
John Watson
Messages: 8922
Registered: January 2010
Location: Global Village
Senior Member
I would raise a TAR, but you may not get any help. Oracle Support (assuming you can call them - is it a BYOL installation?) sometimes won't help with Amazon systems, as they say that Amazon's Linux and virtualization are not certified. Amazon may refuse to help because they could say it is an Oracle issue. Try them both; you may get a solution.
Re: impdp error - bad dump file specification [message #655856 is a reply to message #655730] Wed, 14 September 2016 08:55
jturck
Messages: 7
Registered: September 2016
Junior Member
Hi John and BlackSwan,

I got the data loaded, so I thought I would post the solution here in case others run into something similar. The actual problem was that the expdp dump file loaded to RDS had somehow become corrupted. I checked the existence and file size when I first transmitted the file, but not after that. I'm not sure what caused the reduction/corruption in the file.

These are the notes I put together on the entire process. This was a migration of an on-premises Oracle schema to a new RDS Oracle database/schema. The on-premises database is government owned, so I did not have direct access to it and had to rely on the dump file provided to me. No database link was available, which is the method of connectivity most of the documentation assumes.

Steps to load EXPDP dump to Oracle RDS

Pull the dump file from S3 to EC2
The dump file created from the expdp command was zipped and loaded to the S3 bucket. In this case the dump was pushed to s3://sba-utils-general/data_migration
Run an AWS S3 copy command from the EC2 command line.
$ aws s3 cp s3://sba-utils-general/data_migration .
Unzip the dump from the EC2 command line.
$ unzip CNTRJTURCK.zip

Push the dump from EC2 to the RDS back end via the Perl script (attached below)
Execute the Perl script from the EC2 command line.
$ perl put_data_RDS.pl exp_CNTRJTURCK.dmp
Verify the dump was pushed to RDS by running the following query. The file size here can be compared to the file size on EC2.
SQL> SELECT * FROM table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR'));
SQL> SELECT * FROM table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) ORDER BY FILENAME;
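
To make the size comparison easier, a narrower variant of the same query (just a sketch; it assumes the dump keeps the name exp_CNTRJTURCK.dmp) returns only the row of interest:
SQL> SELECT filename, filesize, mtime FROM table(RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')) WHERE filename = 'exp_CNTRJTURCK.dmp';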

Create tablespace and grant DB privileges
Open a SQL*Plus session and enter the password when prompted.
$ sqlplus oracleadmin@ORCL
From the SQL prompt, run the following commands.
SQL> create tablespace tempadmin2;
SQL> create user CNTRJTURCK identified by turckpassword default tablespace tempadmin2;
SQL> grant read, write on directory data_pump_dir to CNTRJTURCK;
SQL> grant execute on dbms_datapump to CNTRJTURCK;
SQL> grant dba to CNTRJTURCK;
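
As an optional sanity check before the import (DBA_USERS is the standard data dictionary view), confirm the user and its default tablespace were created:
SQL> SELECT username, default_tablespace FROM dba_users WHERE username = 'CNTRJTURCK';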

Run the IMPDP command
From the EC2 command line, run the impdp command with the required parameters.
$ impdp CNTRJTURCK/<password>@ORCL schemas=CNTRJTURCK directory=DATA_PUMP_DIR dumpfile=exp_CNTRJTURCK.dmp logfile=exp_CNTRJTURCK.log TRANSFORM=SEGMENT_ATTRIBUTES:n TABLE_EXISTS_ACTION=REPLACE
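
To monitor the import while it runs (the DBA grant above makes the standard DBA_DATAPUMP_JOBS view accessible), a query like this should work:
SQL> SELECT owner_name, job_name, operation, job_mode, state FROM dba_datapump_jobs;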


Perl script put_data_RDS.pl
oracle Cloud DB RDBMS: cat put_data_RDS.pl
use DBI;
use warnings;
##use strict;

# RDS instance info
my $RDS_PORT = 1521;
my $RDS_HOST = "<HOST>";
my $RDS_LOGIN = "oracleadmin/<password>";
my $RDS_SID = "ORCL";
my $dirname = "DATA_PUMP_DIR";
my $fname = $ARGV[0];
my $data = "dummy";
my $chunk = 8192;

# PL/SQL snippets: open/write/close a file on the RDS host through UTL_FILE,
# keeping the file handle in a small helper package between calls
my $sql_open = "BEGIN perl_global.fh := utl_file.fopen(:dirname, :fname, 'wb', :chunk); END;";
my $sql_write = "BEGIN utl_file.put_raw(perl_global.fh, :data, true); END;";
my $sql_close = "BEGIN utl_file.fclose(perl_global.fh); END;";
my $sql_global = "create or replace package perl_global as fh utl_file.file_type; end;";

my $conn = DBI->connect('dbi:Oracle:host='.$RDS_HOST.';sid='.$RDS_SID.';port='.$RDS_PORT, $RDS_LOGIN, '') || die ($DBI::errstr . "\n");

my $updated = $conn->do($sql_global);
my $stmt = $conn->prepare($sql_open);
$stmt->bind_param_inout(":dirname", \$dirname, 12);
$stmt->bind_param_inout(":fname", \$fname, 12);
$stmt->bind_param_inout(":chunk", \$chunk, 4);
$stmt->execute() || die ($DBI::errstr . "\n");

# Read the local dump in binary chunks and append each chunk to the file on RDS
open (INF, $fname) || die "\nCan't open $fname for reading: $!\n";
binmode(INF);
$stmt = $conn->prepare($sql_write);
my %attrib = ('ora_type', '24');
my $val;
while (($val = read(INF, $data, $chunk)) > 0) {
    $stmt->bind_param(":data", $data, \%attrib);
    $stmt->execute() || die ($DBI::errstr . "\n");
}
die "Problem copying: $!\n" if $!;
close INF || die "Can't close $fname: $!\n";
$stmt = $conn->prepare($sql_close);
$stmt->execute() || die ($DBI::errstr . "\n");
Re: impdp error - bad dump file specification [message #655857 is a reply to message #655856] Wed, 14 September 2016 11:13
John Watson
Messages: 8922
Registered: January 2010
Location: Global Village
Senior Member
Sussed! A right scholarly piece of work. It is a topic many people will find helpful. Not me, though - we always use EC2 and do the Oracle install ourselves; I find that Amazon RDS is not up to date enough with the Oracle and APEX releases. But if you can live with what they provide, it's a pretty good and cost-effective environment.
Re: impdp error - bad dump file specification [message #655858 is a reply to message #655857] Wed, 14 September 2016 12:21
jturck
Messages: 7
Registered: September 2016
Junior Member
Hi John,
We are only loading to Oracle RDS to use their AWS Schema Conversion Tool. We are converting Oracle to PostgreSQL and will be cloud based. Interesting to say the least!
Re: impdp error - bad dump file specification [message #655869 is a reply to message #655858] Thu, 15 September 2016 00:11
Littlefoot
Messages: 21806
Registered: June 2005
Location: Croatia, Europe
Senior Member
Account Moderator
jturck

I checked the existence and file size when I first transmitted the file, but not after that. I'm not sure what caused the reduction/corruption in the file.

Just guessing: if you FTPed it, maybe it was ASCII mode first time you did that, while the second time you used BINARY mode.
Re: impdp error - bad dump file specification [message #655890 is a reply to message #655869] Thu, 15 September 2016 07:41
jturck
Messages: 7
Registered: September 2016
Junior Member
Nope, I transmitted binary both times (same script, no modifications). The file size when I transmitted it the first time matched EC2 and was close to 2 GB, but when I checked it a couple of days later it was about 400 KB.

But happy now the data is loaded. Onto the AWS Schema Conversion Tool!