Reviewing Migration Options
For a database of medium size or larger, the best method for moving it to the Autonomous Database Cloud Service is Data Pump import.
SQL*Loader is also an option, but the unload step is slow and error-prone, and it does not guarantee data consistency, because each table is unloaded at a different point in time.
A final, somewhat viable method is GoldenGate, which carries an additional licensing cost (in most cases) and requires complex configuration and expertise to implement properly. Even then, GoldenGate is best used in conjunction with Data Pump for larger databases, and it can be quite costly in terms of both licensing and duration.
SQL*Loader and GoldenGate can work for some use cases, but they are not the subject of this post.
A faster method might involve Oracle's backup and recovery product, RMAN, but it is not available as a migration tool for Autonomous at this point in 2024.
(This may change in the future - and I predict it will - stay tuned for a new blog should this happen!).
Why is Data Pump the best alternative?
It is important to understand that Oracle's Autonomous Database is a serverless implementation. Because of this, you will not have direct access to the back-end Linux server and operating system hosting the database. This differs from other Oracle Database implementations, where such access is standard.
Please also understand that this offers Oracle and the customer some great economies. Operating system infrastructure is a variable cost and a maintenance headache on its own, and by taking care of it for you in a consistent way, this model saves both time and money.
It also means the customer can get by without even needing these skills.
Also understand that this is the way the industry is headed. Unless you pay more, all the major cloud vendors are pushing towards the serverless model for databases - and other services as well.
Using DATAPUMP with Autonomous Database in OCI:
Oracle has put its best foot forward in making enhancements that facilitate the use of Data Pump for loading ADB. A new parameter, CREDENTIAL, has been added to the impdp client. You are also now allowed to specify a web URL in the DUMPFILE parameter to indicate where your source files are located.
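As a rough sketch, an import against ADB using these parameters might look like the following. The connection service name, credential name, namespace, and bucket URL here are all placeholders for illustration, not real values:

```shell
# Hypothetical example: importing dump files directly from OCI Object Storage.
# The service name, credential name, and URL below are placeholders.
impdp admin@myadb_high \
  credential=OBJ_STORE_CRED \
  dumpfile=https://objectstorage.us-ashburn-1.oraclecloud.com/n/mynamespace/b/mybucket/o/export%U.dmp \
  parallel=4 \
  transform=segment_attributes:n
```

The `transform=segment_attributes:n` option is commonly used when importing into ADB, since physical storage clauses from the source database do not apply there.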
Important Note:
It is now MANDATORY that the DUMPFILE parameter reference a URL pointing to the cloud storage location of your dump files.
In the past, a local or network file system on the database server was the only supported option.
That no longer works with Autonomous!
In order to load data from a specified URL into an Autonomous Database, several steps must be taken. The first is to ensure the storage location is supported by OCI ADB. At this writing, there are several options. Oracle's own Object Storage is among them, of course, but Amazon S3, Microsoft Azure, and Google Cloud Storage are also supported. This reflects a multicloud initiative from Oracle that changes by the day, so more may be coming soon!
Check here for the latest features and changes, including multi-cloud options for hosting the database itself!
Connecting ADB and Object Storage
Because of the nature of cloud infrastructure, and security requirements above and beyond those of just a few years ago, your database will not have immediate access to your object storage by default, even if both are in the same tenancy/region/compartment. Establishing this connection requires security configuration both in your cloud tenancy and in the target database.
First, your account in OCI must have permission to access the storage. This is granted through the proper IAM policies in your tenancy. That configuration and its details are covered separately, but more information is also in the documentation link above.
Second, the database needs a credential that allows it to authenticate against the storage URL. This is created using the DBMS_CLOUD package, which comes pre-built and installed with the database.
DBMS_CLOUD.CREATE_CREDENTIAL is the first procedure which will be needed.
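A minimal sketch of creating such a credential follows. The credential name, username, and auth token value are hypothetical; in practice the username is your OCI user and the password is an auth token generated in the OCI console:

```sql
-- Hypothetical example: register an object storage credential in the database.
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'OBJ_STORE_CRED',      -- name you will reference later
    username        => 'oci_user@example.com', -- your OCI user (placeholder)
    password        => 'auth_token_value'      -- an OCI auth token (placeholder)
  );
END;
/
```

Once created, this credential name can be referenced by other DBMS_CLOUD calls and by Data Pump.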
DBMS_CLOUD.PUT_OBJECT might also be needed to load a local file to cloud object storage.
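If your dump file is sitting in a database directory object and needs to go up to object storage first, a call like the following sketch could be used. The directory, file, and bucket URL are placeholders:

```sql
-- Hypothetical example: upload a local dump file to cloud object storage.
BEGIN
  DBMS_CLOUD.PUT_OBJECT(
    credential_name => 'OBJ_STORE_CRED',
    object_uri      => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/mynamespace/b/mybucket/o/export01.dmp',
    directory_name  => 'DATA_PUMP_DIR',   -- database directory holding the file
    file_name       => 'export01.dmp'
  );
END;
/
```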
Depending upon the Oracle version, more commands may be required, one of which includes setting defaults for the storage location.
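On recent versions, one such command sets a database-level default credential so that individual calls need not name it each time. A sketch, with a hypothetical credential name owned by the ADMIN user:

```sql
-- Hypothetical example: make OBJ_STORE_CRED the default credential
-- for object storage access (version-dependent).
ALTER DATABASE PROPERTY SET DEFAULT_CREDENTIAL = 'ADMIN.OBJ_STORE_CRED';
```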
Important Notes: All users to be imported must be pre-created and given a sufficiently large quota on the main supported tablespace, DATA.
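Pre-creating a schema might look like the following sketch; the user name, password, and privilege list are illustrative only and should be adjusted to your own schema's needs:

```sql
-- Hypothetical example: pre-create a schema to receive imported objects.
CREATE USER app_owner IDENTIFIED BY "ChangeMe#Now123";
GRANT CREATE SESSION, CREATE TABLE, CREATE VIEW TO app_owner;
-- Quota on the DATA tablespace is required for the import to succeed.
ALTER USER app_owner QUOTA UNLIMITED ON DATA;
```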
Finally, the special Data Pump parameters (CREDENTIAL and the DUMPFILE URL) will validate the supplied credential and storage URL at import time.
The background and purpose of all this is security. Since the cloud and the internet are public, all vendors are required (by law, in many cases) to authenticate all access to and movement of data.
The purpose of this blog is to present a conceptual overview and an understanding of how Autonomous Database is different and decidedly new for most Oracle Database professionals. A very good summary set of documentation, with current syntax for the entire process, is available here.
Please respond in the comments if you would like more information, more details, or to discuss this complex topic.