Introduction
This time I finally get to play with MySQL Shell's dump utilities, something I've been meaning to try for a while, and at the end I'll import the result into the MDS instance I built last time.
Importing into MDS requires an oci client configuration and an Object Storage bucket, so we'll set those up along the way.
Preparation
1. Load the official sample databases into a local MySQL 8.0.21 instance created with dbdeployer.
I fetched them with git clone, but a zip download is also available.
$ git clone https://github.com/datacharmer/test_db.git
2. Install MySQL Shell as well, of course.
My local machine runs Ubuntu and the OCI side runs CentOS, so the steps are as follows.
Ubuntu
$ sudo dpkg -i /PATH/version-specific-package-name.deb   # mysql-apt-config_0.8.15-1_all.deb is the latest as of this writing
$ sudo apt-get update
$ sudo apt-get install mysql-shell
CentOS (covered last time, but transcribed here for completeness)
$ sudo yum install https://dev.mysql.com/get/mysql80-community-release-el7-3.noarch.rpm
$ sudo yum install mysql-shell
3. Import the schemas obtained in step 1 into the local 8.0.21 instance.
$ cd test_db/
$ mysql -uroot -p -h127.0.0.1 -P8021 < employees.sql
$ mysql -uroot -p -h127.0.0.1 -P8021 < test_employees_md5.sql
$ mysql -uroot -p -h127.0.0.1 -P8021 < test_employees_sha.sql
$ mysql -uroot -p -h127.0.0.1 -P8021 < sakila/sakila-mv-schema.sql
$ mysql -uroot -p -h127.0.0.1 -P8021 < sakila/sakila-mv-data.sql
$
$ # Check the created schemas (dbdeployer creates `test` automatically)
$ mysql -uroot -p -h127.0.0.1 -P8021
(root@127.0.0.1) [(none)]> show databases;
+--------------------+
| Database           |
+--------------------+
| employees          |
| information_schema |
| mysql              |
| performance_schema |
| sakila             |
| sys                |
| test               |
+--------------------+
7 rows in set (0.00 sec)
Using MySQL Shell's dumpInstance
For taking dumps there are dumpInstance and dumpSchemas; the only difference is whether you work per schema (database) or across the whole instance.
I expect to use dumpInstance more often, and the two seem to differ only in whether schemas are passed as an argument, so dumpInstance is what I'll explore here.
Trying dumpInstance
The dump utility's options are covered in the official documentation, but real-world examples are still scarce.
The syntax can be confusing at first, so below are the invocations I actually got to run.
The first argument is the output directory (e.g. /tmp/dumpdir).
Writing into an existing dump directory raises an error, so point each run at a fresh directory.
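Since dumpInstance refuses to write into an existing directory, one simple workaround is to generate a timestamped target for every run. This is just a sketch; the /tmp/dumpdir_ prefix is an arbitrary example, not anything the tooling requires:

```shell
# Generate a unique dump directory name per run so that
# util.dumpInstance() never hits the "already exists" error.
# The /tmp/dumpdir_ prefix is arbitrary; pick whatever suits you.
DUMP_DIR="/tmp/dumpdir_$(date +%Y%m%d_%H%M%S)"
echo "$DUMP_DIR"
```

You would then pass the printed path as the first argument to util.dumpInstance() inside MySQL Shell.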
excludeSchemas
Excludes the specified schemas from the dump.
Example: exclude the test schema
MySQL 127.0.0.1:8021 ssl JS > util.dumpInstance("/tmp/dumpdir",{excludeSchemas:["test"]})
Acquiring global read lock
All transactions have been started
Locking instance for backup
Global read lock has been released
Writing global DDL files
Writing users DDL
Preparing data dump for table `sakila`.`actor`
Writing DDL for schema `sakila`
Writing DDL for view `sakila`.`customer_list`
Writing DDL for view `sakila`.`actor_info`
Writing DDL for view `sakila`.`film_list`
Data dump for table `sakila`.`actor` will be chunked using column `actor_id`
Preparing data dump for table `sakila`.`address`
Data dump for table `sakila`.`address` will be chunked using column `address_id`
Preparing data dump for table `sakila`.`category`
Data dump for table `sakila`.`category` will be chunked using column `category_id`
Preparing data dump for table `sakila`.`city`
Data dump for table `sakila`.`city` will be chunked using column `city_id`
Preparing data dump for table `sakila`.`country`
Data dump for table `sakila`.`country` will be chunked using column `country_id`
Preparing data dump for table `sakila`.`customer`
Data dump for table `sakila`.`customer` will be chunked using column `customer_id`
Preparing data dump for table `sakila`.`film`
Data dump for table `sakila`.`film` will be chunked using column `film_id`
Preparing data dump for table `sakila`.`film_actor`
Data dump for table `sakila`.`film_actor` will be chunked using column `actor_id`
Preparing data dump for table `sakila`.`film_category`
Data dump for table `sakila`.`film_category` will be chunked using column `film_id`
Preparing data dump for table `sakila`.`film_text`
Writing DDL for view `sakila`.`nicer_but_slower_film_list`
Data dump for table `sakila`.`film_text` will be chunked using column `film_id`
Preparing data dump for table `sakila`.`inventory`
Data dump for table `sakila`.`inventory` will be chunked using column `inventory_id`
Preparing data dump for table `sakila`.`language`
Writing DDL for view `sakila`.`sales_by_film_category`
Data dump for table `sakila`.`language` will be chunked using column `language_id`
Preparing data dump for table `sakila`.`payment`
Writing DDL for view `sakila`.`sales_by_store`
Writing DDL for view `sakila`.`staff_list`
Data dump for table `sakila`.`payment` will be chunked using column `payment_id`
Preparing data dump for table `sakila`.`rental`
Data dump for table `sakila`.`rental` will be chunked using column `rental_id`
Preparing data dump for table `sakila`.`staff`
Writing DDL for table `sakila`.`actor`
Data dump for table `sakila`.`staff` will be chunked using column `staff_id`
Preparing data dump for table `sakila`.`store`
Data dump for table `sakila`.`store` will be chunked using column `store_id`
Preparing data dump for table `employees`.`departments`
Data dump for table `employees`.`departments` will be chunked using column `dept_no`
Preparing data dump for table `employees`.`dept_emp`
Data dump for table `employees`.`dept_emp` will be chunked using column `emp_no`
Preparing data dump for table `employees`.`dept_manager`
Writing DDL for table `sakila`.`address`
Data dump for table `employees`.`dept_manager` will be chunked using column `emp_no`
Preparing data dump for table `employees`.`employees`
Writing DDL for table `sakila`.`category`
Data dump for table `employees`.`employees` will be chunked using column `emp_no`
Preparing data dump for table `employees`.`salaries`
Data dump for table `employees`.`salaries` will be chunked using column `emp_no`
Preparing data dump for table `employees`.`titles`
Data dump for table `employees`.`titles` will be chunked using column `emp_no`
Running data dump using 4 threads.
NOTE: Progress information uses estimated values and may not be accurate.
Writing DDL for table `sakila`.`city`
Writing DDL for table `sakila`.`country`
Writing DDL for table `sakila`.`customer`
Writing DDL for table `sakila`.`film`
Writing DDL for table `sakila`.`film_actor`
Writing DDL for table `sakila`.`film_category`
Writing DDL for table `sakila`.`film_text`
Writing DDL for table `sakila`.`inventory`
Writing DDL for table `sakila`.`language`
Writing DDL for table `sakila`.`payment`
Writing DDL for table `sakila`.`rental`
Writing DDL for table `sakila`.`staff`
Writing DDL for table `sakila`.`store`
Writing DDL for schema `employees`
Writing DDL for view `employees`.`current_dept_emp`
Writing DDL for view `employees`.`dept_emp_latest_date`
Writing DDL for table `employees`.`departments`
Writing DDL for table `employees`.`dept_emp`
Writing DDL for table `employees`.`dept_manager`
Writing DDL for table `employees`.`employees`
Writing DDL for table `employees`.`salaries`
Writing DDL for table `employees`.`titles`
Data dump for table `sakila`.`actor` will be written to 1 file
Data dump for table `sakila`.`category` will be written to 1 file
Data dump for table `sakila`.`address` will be written to 1 file
Data dump for table `sakila`.`country` will be written to 1 file
Data dump for table `sakila`.`customer` will be written to 1 file
Data dump for table `sakila`.`city` will be written to 1 file
Data dump for table `sakila`.`film_actor` will be written to 1 file
Data dump for table `sakila`.`film` will be written to 1 file
Data dump for table `sakila`.`film_category` will be written to 1 file
Data dump for table `sakila`.`film_text` will be written to 1 file
Data dump for table `sakila`.`language` will be written to 1 file
Data dump for table `sakila`.`payment` will be written to 1 file
Data dump for table `sakila`.`inventory` will be written to 1 file
Data dump for table `sakila`.`rental` will be written to 1 file
Data dump for table `sakila`.`store` will be written to 1 file
Data dump for table `sakila`.`staff` will be written to 1 file
Data dump for table `employees`.`departments` will be written to 1 file
Data dump for table `employees`.`dept_manager` will be written to 1 file
Data dump for table `employees`.`dept_emp` will be written to 1 file
Data dump for table `employees`.`employees` will be written to 1 file
Data dump for table `employees`.`salaries` will be written to 1 file
Data dump for table `employees`.`titles` will be written to 1 file
1 thds dumping - 100% (3.97M rows / ~3.93M rows), 733.12K rows/s, 26.80 MB/s uncompressed, 7.02 MB/s compressed
Duration: 00:00:05s
Schemas dumped: 2
Tables dumped: 22
Uncompressed data size: 144.53 MB
Compressed data size: 37.75 MB
Compression ratio: 3.8
Rows written: 3966288
Bytes written: 37.75 MB
Average uncompressed throughput: 26.31 MB/s
Average compressed throughput: 6.87 MB/s
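As a quick sanity check of the summary numbers: the reported compression ratio is simply the uncompressed size divided by the compressed size (144.53 MB / 37.75 MB):

```shell
# Reproduce the "Compression ratio: 3.8" line from the dump summary:
# uncompressed size / compressed size, rounded to one decimal place.
awk 'BEGIN { printf "%.1f\n", 144.53 / 37.75 }'
# prints: 3.8
```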
Sure enough, no backup was taken of the test schema.
I showed the full output above for the first run only; it's long, so from here on I'll omit it.
Below I'll just show the basic command and option usage.
excludeTables
Excludes arbitrary tables from the dump.
According to the docs, there is no option here to exclude only the DDL or only the data.
Incidentally, the data in the mysql.apply_status, mysql.general_log, mysql.schema, and mysql.slow_log tables is always excluded automatically.
Example: exclude the test.test02 table
MySQL 127.0.0.1:8021 ssl JS > util.dumpInstance("/tmp/dumpdir2",{excludeTables:["test.test02"]})
Excluding user information
Excludes user-related information, i.e. CREATE USER statements, roles, grants and the like are not carried over to the destination.
MySQL 127.0.0.1:8021 ssl JS > util.dumpInstance("/tmp/dumpdir3",{users:false})
DDL only
The equivalent of mysqldump's --no-data: it dumps only the DDL.
MySQL 127.0.0.1:8021 ssl JS > util.dumpInstance("/tmp/dumpdir4",{ddlOnly:true})
Data only
The equivalent of mysqldump's --no-create-info: it dumps only the data.
MySQL 127.0.0.1:8021 ssl JS > util.dumpInstance("/tmp/dumpdir5",{dataOnly:true})
Dry run
In short: nothing is actually written, but you see what would happen.
I tried outputting to an already-created dump directory → it errors even though it's a dry run (quite handy).
MySQL 127.0.0.1:8021 ssl JS > util.dumpInstance("/tmp/dumpdir5",{excludeSchemas:["test"],dryRun:true})
Util.dumpInstance: Cannot proceed with the dump, '/tmp/dumpdir5' already exists at the target location /tmp/dumpdir5. (ArgumentError)
Try it without the test schema → nothing is actually written (OK).
MySQL 127.0.0.1:8021 ssl JS > util.dumpInstance("/tmp/dumpdir6",{excludeSchemas:["test"],dryRun:true})
/tmp$ ll
total 92
drwxrwxrwt 20 root   root   4096 Oct  4 06:29 ./
drwxr-xr-x 26 root   root   4096 Sep 23 09:59 ../
drwxr-x---  2 (user) (user) 4096 Oct  4 06:21 dumpdir5/
srwxrwxrwx  1 (user) (user)    0 Oct  4 06:11 mysql_sandbox8021.sock=
...
/tmp/dumpdir6 is indeed not there.
Checking whether the dump can go into MDS
It looks like a parade of errors…
MySQL 127.0.0.1:8021 ssl JS > util.dumpInstance("/tmp/dumpdir6",{excludeSchemas:["test"],dryRun:true, ocimds: true})
Checking for compatibility with MySQL Database Service 8.0.21
ERROR: User root@localhost is granted restricted privileges: RELOAD, FILE, SUPER, BINLOG_ADMIN, SET_USER_ID
ERROR: User R_DO_IT_ALL@% is granted restricted privileges: RELOAD, FILE, SUPER, BINLOG_ADMIN, SET_USER_ID
ERROR: User R_POWERFUL@% is granted restricted privileges: RELOAD, FILE, SUPER, BINLOG_ADMIN, SET_USER_ID
NOTE: Database sakila had unsupported ENCRYPTION option commented out
ERROR: Function sakila.get_customer_balance - definition uses DEFINER clause set to user `root`@`localhost` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges
ERROR: Function sakila.inventory_held_by_customer - definition uses DEFINER clause set to user `root`@`localhost` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges
ERROR: Function sakila.inventory_in_stock - definition uses DEFINER clause set to user `root`@`localhost` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges
ERROR: Procedure sakila.film_in_stock - definition uses DEFINER clause set to user `root`@`localhost` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges
ERROR: Procedure sakila.film_not_in_stock - definition uses DEFINER clause set to user `root`@`localhost` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges
ERROR: Procedure sakila.rewards_report - definition uses DEFINER clause set to user `root`@`localhost` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges
NOTE: Database employees had unsupported ENCRYPTION option commented out
ERROR: Trigger sakila.customer_create_date - definition uses DEFINER clause set to user `root`@`localhost` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges
ERROR: Trigger sakila.ins_film - definition uses DEFINER clause set to user `root`@`localhost` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges
ERROR: Trigger sakila.upd_film - definition uses DEFINER clause set to user `root`@`localhost` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges
ERROR: Trigger sakila.del_film - definition uses DEFINER clause set to user `root`@`localhost` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges
ERROR: Trigger sakila.payment_date - definition uses DEFINER clause set to user `root`@`localhost` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges
ERROR: Trigger sakila.rental_date - definition uses DEFINER clause set to user `root`@`localhost` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges
ERROR: View sakila.actor_info - definition uses DEFINER clause set to user `root`@`localhost` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges
ERROR: View sakila.customer_list - definition uses DEFINER clause set to user `root`@`localhost` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges
ERROR: View sakila.film_list - definition uses DEFINER clause set to user `root`@`localhost` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges
ERROR: View sakila.nicer_but_slower_film_list - definition uses DEFINER clause set to user `root`@`localhost` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges
ERROR: View sakila.sales_by_film_category - definition uses DEFINER clause set to user `root`@`localhost` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges
ERROR: View sakila.sales_by_store - definition uses DEFINER clause set to user `root`@`localhost` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges
ERROR: View sakila.staff_list - definition uses DEFINER clause set to user `root`@`localhost` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges
ERROR: View employees.current_dept_emp - definition uses DEFINER clause set to user `root`@`localhost` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges
ERROR: View employees.dept_emp_latest_date - definition uses DEFINER clause set to user `root`@`localhost` which can only be executed by this user or a user with SET_USER_ID or SUPER privileges
Compatibility issues with MySQL Database Service 8.0.21 were found. Please use the 'compatibility' option to apply compatibility adaptations to the dumped DDL.
Util.dumpInstance: Compatibility issues were found (RuntimeError)
So, as things stand, moving to MDS requires some conversions, which means the compatibility option is needed.
Using compatibility
MySQL 127.0.0.1:8021 ssl JS > util.dumpInstance("/tmp/dumpdir6",{excludeSchemas:["test"],dryRun:true,ocimds:true,compatibility:["strip_definers", "strip_restricted_grants"]})
Checking for compatibility with MySQL Database Service 8.0.21
NOTE: User root@localhost had restricted privileges (RELOAD, FILE, SUPER, BINLOG_ADMIN, SET_USER_ID) removed
NOTE: User R_DO_IT_ALL@% had restricted privileges (RELOAD, FILE, SUPER, BINLOG_ADMIN, SET_USER_ID) removed
NOTE: User R_POWERFUL@% had restricted privileges (RELOAD, FILE, SUPER, BINLOG_ADMIN, SET_USER_ID) removed
NOTE: Database sakila had unsupported ENCRYPTION option commented out
NOTE: Function sakila.get_customer_balance had definer clause removed and SQL SECURITY characteristic set to INVOKER
NOTE: Function sakila.inventory_held_by_customer had definer clause removed and SQL SECURITY characteristic set to INVOKER
NOTE: Function sakila.inventory_in_stock had definer clause removed and SQL SECURITY characteristic set to INVOKER
NOTE: Procedure sakila.film_in_stock had definer clause removed and SQL SECURITY characteristic set to INVOKER
NOTE: Procedure sakila.film_not_in_stock had definer clause removed and SQL SECURITY characteristic set to INVOKER
NOTE: Procedure sakila.rewards_report had definer clause removed and SQL SECURITY characteristic set to INVOKER
NOTE: Database employees had unsupported ENCRYPTION option commented out
NOTE: Trigger sakila.customer_create_date had definer clause removed
NOTE: Trigger sakila.ins_film had definer clause removed
NOTE: Trigger sakila.upd_film had definer clause removed
NOTE: Trigger sakila.del_film had definer clause removed
NOTE: Trigger sakila.payment_date had definer clause removed
NOTE: Trigger sakila.rental_date had definer clause removed
NOTE: View sakila.actor_info had definer clause removed
NOTE: View sakila.customer_list had definer clause removed and SQL SECURITY characteristic set to INVOKER
NOTE: View sakila.film_list had definer clause removed and SQL SECURITY characteristic set to INVOKER
NOTE: View sakila.nicer_but_slower_film_list had definer clause removed and SQL SECURITY characteristic set to INVOKER
NOTE: View sakila.sales_by_film_category had definer clause removed and SQL SECURITY characteristic set to INVOKER
NOTE: View sakila.sales_by_store had definer clause removed and SQL SECURITY characteristic set to INVOKER
NOTE: View sakila.staff_list had definer clause removed and SQL SECURITY characteristic set to INVOKER
NOTE: View employees.current_dept_emp had definer clause removed and SQL SECURITY characteristic set to INVOKER
NOTE: View employees.dept_emp_latest_date had definer clause removed and SQL SECURITY characteristic set to INVOKER
Compatibility issues with MySQL Database Service 8.0.21 were found and repaired. Please review the changes made before loading them.
...
The compatibility check now runs up front.
For those who want to tune performance
For example, to run with 2 parallel threads and cap the data-read throughput during the dump at 10 MB per second per thread:
MySQL 127.0.0.1:8021 ssl JS > util.dumpInstance("/tmp/dumpdir7",{threads:2,maxRate:"10M"})
Other options
There are more options, but I'm skipping the ones I'm unlikely to use this time.
If you want to know about them, read the documentation closely and tell me about it.
Sending the dump straight to MDS
How do you upload a local dump when the target is OCI?
Looking into it, there is a thing called Object Storage (roughly OCI's answer to AWS's S3).
So I just create one of those and upload the dump there, right?
Wait, though. Can't MySQL Shell upload there directly?
It can!
Setup
Create Object Storage (strictly speaking, a bucket). (See the documentation for how.)
Install the OCI client. (Again, see the documentation for installation steps.)
That said, the OCI client tripped me up a bit, so here are some quick notes.
Install
$ sudo bash -c "$(curl -L https://raw.githubusercontent.com/oracle/oci-cli/master/scripts/install/install.sh)" --accept-all-defaults
Create the oci cli config
$ oci setup config
You get a dialog along these lines:
This command provides a walkthrough of creating a valid CLI config file.

The following links explain where to find the information required by this script:

User API Signing Key, OCID and Tenancy OCID:
    https://docs.cloud.oracle.com/Content/API/Concepts/apisigningkey.htm#Other
Region:
    https://docs.cloud.oracle.com/Content/General/Concepts/regions.htm
General config documentation:
    https://docs.cloud.oracle.com/Content/API/Concepts/sdkconfig.htm

Enter a location for your config [/home/(user)/.oci/config]:
Config file: /home/(user)/.oci/config already exists. Do you want add a profile here? (If no, you will be prompted to overwrite the file) [Y/n]: n
File: /home/(user)/.oci/config already exists. Do you want to overwrite (Removes existing profiles!!!)? [y/N]: y
Enter a user OCID: ocid1.user.oc1..aaaaaaaaxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Enter a tenancy OCID: ocid1.tenancy.oc1..aaaaaaaavyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyy
Enter a region (e.g. ap-chiyoda-1, ap-chuncheon-1, ap-hyderabad-1, ap-melbourne-1, ap-mumbai-1, ap-osaka-1, ap-seoul-1, ap-sydney-1, ap-tokyo-1, ca-montreal-1, ca-toronto-1, eu-amsterdam-1, eu-frankfurt-1, eu-zurich-1, me-jeddah-1, sa-saopaulo-1, uk-gov-cardiff-1, uk-gov-london-1, uk-london-1, us-ashburn-1, us-gov-ashburn-1, us-gov-chicago-1, us-gov-phoenix-1, us-langley-1, us-luke-1, us-phoenix-1, us-sanjose-1): ap-tokyo-1
Do you want to generate a new API Signing RSA key pair? (If you decline you will be asked to supply the path to an existing key.) [Y/n]: Y
Enter a directory for your keys to be created [/home/(user)/.oci]:
Enter a name for your key [oci_api_key]:
Public key written to: /home/(user)/.oci/oci_api_key_public.pem
Enter a passphrase for your private key (empty for no passphrase):
Private key written to: /home/(user)/.oci/oci_api_key.pem
Fingerprint: 1e:2b:f5:83:93:93:9e:03:85:05:66:7f:b1:f8:ac:ca
Config written to /home/(user)/.oci/config
If you haven't already uploaded your API Signing public key through the console, follow the instructions on the page linked below in the section 'How to upload the public key':
    https://docs.cloud.oracle.com/Content/API/Concepts/apisigningkey.htm#How2
Upload the public key
Following the documentation, register the public key under User > Add Public Key.
You can view the public key like this:
cat /home/(user)/.oci/oci_api_key_public.pem
Validate the setup
oci iam region list
Run the above; if the output looks like the following, you're good.
{
  "data": [
    {
      "key": "AMS",
      "name": "eu-amsterdam-1"
    },
    {
      "key": "BOM",
      "name": "ap-mumbai-1"
    },
    {
      "key": "DXB",
      "name": "me-dubai-1"
    },
    ...
    {
      "key": "YYZ",
      "name": "ca-toronto-1"
    },
    {
      "key": "ZRH",
      "name": "eu-zurich-1"
    }
  ]
}
Dumping straight to OCI
osBucketNameとosNamespace
Writing to an OCI Object Storage bucket requires these two options.
osBucketName is, as the name says, the Object Storage bucket name; osNamespace specifies the bucket's namespace.
MySQL 127.0.0.1:8021 ssl JS > util.dumpInstance("testbackup",{ocimds:true,compatibility:["strip_definers", "strip_restricted_grants"],"osBucketName": "bucket-hogehoge", "osNamespace": "xxxxxxx"})
You can obtain osNamespace with the command below, or check it under Object Storage > Bucket Details > Namespace.
oci os ns get
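oci os ns get prints JSON of the form {"data": "<namespace>"}. If you want just the bare namespace string for scripting, one way (shown here against a sample payload with the placeholder value "mynamespace"; with the real CLI you would pipe oci os ns get instead of the echo) is:

```shell
# Extract the "data" field from the JSON that `oci os ns get` prints.
# "mynamespace" is a placeholder; in practice you would run:
#   oci os ns get | python3 -c 'import sys,json; print(json.load(sys.stdin)["data"])'
echo '{"data": "mynamespace"}' | python3 -c 'import sys,json; print(json.load(sys.stdin)["data"])'
# prints: mynamespace
```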
Importing into MDS
Preliminaries
Spin up a compute instance on OCI and install the OCI client on it as well.
Import
Then run the following in MySQL Shell and the import goes through.
MySQL xxxxxx:3306 ssl JS > util.loadDump("testbackup", {osBucketName: "bucket-hogehoge", osNamespace: "xxxxxxx"})
You could raise the parallelism depending on the instance's CPU power, but since this was just a trial run I left it at the default.
I didn't dig much into the import-side parameters; apart from performance settings, I suspect most people handle everything on the export side anyway, so I cut corners here.
Wrapping up
This time I got to learn a little of both MySQL Shell and OCI.
The import part was a bit rushed, but I don't think it's particularly difficult.
MySQL Shell's dump utility is only available as of 8.0.21, and I imagine many people are still at the evaluation stage with MDS too.
Which is exactly why getting hands-on now, so that both are ready when you need them, is sure to pay off down the road (probably).