MongoDB Backup and Recovery
1. Backup
MongoDB ships with a backup tool, mongodump.exe, located in the bin directory. It is used as follows:
mongodump.exe -h localhost -d database_name -o d:\mongodump
-h: the address of the MongoDB server, e.g. 127.0.0.1; a port can also be specified, e.g. 127.0.0.1:27017
-d: the database to back up, e.g. database_name
-o: the directory in which to place the backup, e.g. d:\mongodump; after the dump finishes, a database_name subdirectory is created under this directory holding the backup data for that database
An example invocation is given below.
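For example, a full backup of a hypothetical database named mydb running locally on the default port, written to d:\mongodump, might look like this (the database name, credentials and paths are placeholders, not from the original article):
mongodump.exe -h 127.0.0.1:27017 -d mydb -o d:\mongodump
If the server requires authentication, the -u, -p and --authenticationDatabase options listed in the help output below can be added:
mongodump.exe -h 127.0.0.1:27017 -u admin -p secret --authenticationDatabase admin -d mydb -o d:\mongodump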
mongodump help output:
Export MongoDB data to BSON files.
Options:
--help produce help message
-v [ --verbose ] be more verbose (include multiple times
for more verbosity e.g. -vvvvv)
--quiet silence all non error diagnostic
messages
--version print the program's version and exit
-h [ --host ] arg mongo host to connect to ( <set
name>/s1,s2 for sets)
--port arg server port. Can also use --host
hostname:port
--ipv6 enable IPv6 support (disabled by
default)
-u [ --username ] arg username
-p [ --password ] arg password
--authenticationDatabase arg user source (defaults to dbname)
--authenticationMechanism arg (=MONGODB-CR)
authentication mechanism
--gssapiServiceName arg (=mongodb) Service name to use when authenticating
using GSSAPI/Kerberos
--gssapiHostName arg Remote host name to use for purpose of
GSSAPI/Kerberos authentication
--dbpath arg directly access mongod database files
in the given path, instead of
connecting to a mongod server - needs
to lock the data directory, so cannot
be used if a mongod is currently
accessing the same path
--directoryperdb each db is in a separate directory
(relevant only if dbpath specified)
--journal enable journaling (relevant only if
dbpath specified)
-d [ --db ] arg database to use
-c [ --collection ] arg collection to use (some commands)
-o [ --out ] arg (=dump) output directory or "-" for stdout
-q [ --query ] arg json query
--oplog Use oplog for point-in-time
snapshotting
--repair try to recover a crashed database
--forceTableScan force a table scan (do not use
$snapshot)
--dumpDbUsersAndRoles Dump user and role definitions for the
given database
2. Recovery
mongorestore.exe -h localhost -d database_name --directoryperdb d:\mongodump\database_name
-h: the address of the MongoDB server
-d: the database to restore into, e.g. database_name; it does not have to be the original name, so the dump can be restored into a different database such as test2
--directoryperdb: the location of the backup data, e.g. d:\mongodump\database_name
--drop: drop the existing data before restoring; note that anything added or modified after the backup was taken will be lost, so use this option with care
An example invocation is given below.
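For example, restoring the hypothetical mydb dump created in the backup sketch above into a differently named database, dropping any existing collections first (all names are placeholders):
mongorestore.exe -h 127.0.0.1:27017 -d mydb_restore --drop d:\mongodump\mydb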
mongorestore help output:
Import BSON files into MongoDB.
usage: mongorestore [options] [directory or filename to restore from]
Options:
--help produce help message
-v [ --verbose ] be more verbose (include multiple times
for more verbosity e.g. -vvvvv)
--quiet silence all non error diagnostic
messages
--version print the program's version and exit
-h [ --host ] arg mongo host to connect to ( <set
name>/s1,s2 for sets)
--port arg server port. Can also use --host
hostname:port
--ipv6 enable IPv6 support (disabled by
default)
-u [ --username ] arg username
-p [ --password ] arg password
--authenticationDatabase arg user source (defaults to dbname)
--authenticationMechanism arg (=MONGODB-CR)
authentication mechanism
--gssapiServiceName arg (=mongodb) Service name to use when authenticating
using GSSAPI/Kerberos
--gssapiHostName arg Remote host name to use for purpose of
GSSAPI/Kerberos authentication
--dbpath arg directly access mongod database files
in the given path, instead of
connecting to a mongod server - needs
to lock the data directory, so cannot
be used if a mongod is currently
accessing the same path
--directoryperdb each db is in a separate directory
(relevant only if dbpath specified)
--journal enable journaling (relevant only if
dbpath specified)
-d [ --db ] arg database to use
-c [ --collection ] arg collection to use (some commands)
--objcheck validate object before inserting
(default)
--noobjcheck don't validate object before inserting
--filter arg filter to apply before inserting
--drop drop each collection before import
--oplogReplay replay oplog for point-in-time restore
--oplogLimit arg include oplog entries before the
provided Timestamp (seconds[:ordinal])
during the oplog replay; the ordinal
value is optional
--keepIndexVersion don't upgrade indexes to newest version
--noOptionsRestore don't restore collection options
--noIndexRestore don't restore indexes
--restoreDbUsersAndRoles Restore user and role definitions for
the given database
--w arg (=0) minimum number of replicas per write
3. Backing Up a Single Collection
mongoexport -h dbhost -d dbname -c collectionname -f collectionKey -o filename
-h: the address of the MongoDB server
-d: the database the collection belongs to
-c: the collection to export
-f: the fields to export (comma separated)
-o: the name of the output file
An example invocation is given below.
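For example, exporting only the name and age fields of a hypothetical users collection in mydb to a CSV file (the collection, field and file names are assumptions for illustration):
mongoexport -h 127.0.0.1:27017 -d mydb -c users -f name,age --csv -o d:\mongodump\users.csv
A -q filter (see the help output below) can be added to export only the documents matching a JSON query.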
mongoexport help output:
Export MongoDB data to CSV, TSV or JSON files.
Options:
--help produce help message
-v [ --verbose ] be more verbose (include multiple times
for more verbosity e.g. -vvvvv)
--quiet silence all non error diagnostic
messages
--version print the program's version and exit
-h [ --host ] arg mongo host to connect to ( <set
name>/s1,s2 for sets)
--port arg server port. Can also use --host
hostname:port
--ipv6 enable IPv6 support (disabled by
default)
-u [ --username ] arg username
-p [ --password ] arg password
--authenticationDatabase arg user source (defaults to dbname)
--authenticationMechanism arg (=MONGODB-CR)
authentication mechanism
--gssapiServiceName arg (=mongodb) Service name to use when authenticating
using GSSAPI/Kerberos
--gssapiHostName arg Remote host name to use for purpose of
GSSAPI/Kerberos authentication
--dbpath arg directly access mongod database files
in the given path, instead of
connecting to a mongod server - needs
to lock the data directory, so cannot
be used if a mongod is currently
accessing the same path
--directoryperdb each db is in a separate directory
(relevant only if dbpath specified)
--journal enable journaling (relevant only if
dbpath specified)
-d [ --db ] arg database to use
-c [ --collection ] arg collection to use (some commands)
-f [ --fields ] arg comma separated list of field names
e.g. -f name,age
--fieldFile arg file with field names - 1 per line
-q [ --query ] arg query filter, as a JSON string, e.g.,
'{x:{$gt:1}}'
--csv export to csv instead of json
-o [ --out ] arg output file; if not specified, stdout
is used
--jsonArray output to a json array rather than one
object per line
-k [ --slaveOk ] arg (=1) use secondaries for export if
available, default true
--forceTableScan force a table scan (do not use
$snapshot)
--skip arg (=0) documents to skip, default 0
--limit arg (=0) limit the numbers of documents
returned, default all
--sort arg sort order, as a JSON string, e.g.,
'{x:1}'
4. Restoring a Single Collection
mongoimport -d dbname -c collectionname --type csv --headerline --file filename
--type: the type of file to import (json, csv or tsv; the default is json)
--headerline: treat the first line of the file as the list of field names rather than as data (CSV and TSV only)
--file: the file to import from
An example invocation is given below.
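For example, loading the users.csv file produced by the mongoexport sketch above back into the users collection of mydb, replacing the existing collection (all names are placeholders):
mongoimport -h 127.0.0.1:27017 -d mydb -c users --type csv --headerline --drop --file d:\mongodump\users.csv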
mongoimport help output:
Import CSV, TSV or JSON data into MongoDB.
When importing JSON documents, each document must be a separate line of the input file.
Example:
mongoimport --host myhost --db my_cms --collection docs < mydocfile.json
Options:
--help produce help message
-v [ --verbose ] be more verbose (include multiple times
for more verbosity e.g. -vvvvv)
--quiet silence all non error diagnostic
messages
--version print the program's version and exit
-h [ --host ] arg mongo host to connect to ( <set
name>/s1,s2 for sets)
--port arg server port. Can also use --host
hostname:port
--ipv6 enable IPv6 support (disabled by
default)
-u [ --username ] arg username
-p [ --password ] arg password
--authenticationDatabase arg user source (defaults to dbname)
--authenticationMechanism arg (=MONGODB-CR)
authentication mechanism
--gssapiServiceName arg (=mongodb) Service name to use when authenticating
using GSSAPI/Kerberos
--gssapiHostName arg Remote host name to use for purpose of
GSSAPI/Kerberos authentication
--dbpath arg directly access mongod database files
in the given path, instead of
connecting to a mongod server - needs
to lock the data directory, so cannot
be used if a mongod is currently
accessing the same path
--directoryperdb each db is in a separate directory
(relevant only if dbpath specified)
--journal enable journaling (relevant only if
dbpath specified)
-d [ --db ] arg database to use
-c [ --collection ] arg collection to use (some commands)
-f [ --fields ] arg comma separated list of field names
e.g. -f name,age
--fieldFile arg file with field names - 1 per line
--ignoreBlanks if given, empty fields in csv and tsv
will be ignored
--type arg type of file to import. default: json
(json,csv,tsv)
--file arg file to import from; if not specified
stdin is used
--drop drop collection first
--headerline first line in input file is a header
(CSV and TSV only)
--upsert insert or update objects that already
exist
--upsertFields arg comma-separated fields for the query
part of the upsert. You should make
sure this is indexed
--stopOnError stop importing at first error rather
than continuing
--jsonArray load a json array, not one item per
line. Currently limited to 16MB.