So today I couldn't find any way (well, at least not without manually iterating with mysqldump) to export only a subset of a database's tables, so I wrote the following bash script:
partial-mysqldump
#!/usr/bin/env bash

usage() { echo "Usage: $0 [-h <host>] [-u <user>] [-p <pass>] -d <database> -t <table> [-t <table> ...] -f <output_file> [-l <limit>]" 1>&2; exit 1; }

while getopts ":t:f:d:l:p:u:h:" opt; do
    case "${opt}" in
        f) f=${OPTARG} ;;
        d) d=${OPTARG} ;;
        p) p=${OPTARG} ;;
        h) h=${OPTARG} ;;
        u) u=${OPTARG} ;;
        l) l=${OPTARG} ;;
        t) t+=("${OPTARG}") ;;   # -t can be passed multiple times, once per table
        *)
            echo
            echo "[ERROR]: unknown option, aborting"
            usage
            ;;
    esac
done
shift $((OPTIND-1))

# default to localhost when no host is given
if [ -z "${h}" ]; then
    h='127.0.0.1'
fi

# keep the --where argument in an array so the embedded spaces
# survive word splitting when it is expanded below
if [ -z "${l}" ]; then
    limit=()
else
    limit=(--where="1 limit ${l}")
fi

if [ -z "${d}" ] || [ ${#t[@]} -eq 0 ] || [ -z "${f}" ]; then
    usage
else
    # one mysqldump run per table, all appended to the same file
    for val in "${t[@]}"; do
        mysqldump \
            -h "${h}" \
            -u "${u}" \
            -p"${p}" \
            --no-create-info \
            --lock-tables=false \
            "${limit[@]}" \
            --databases "${d}" \
            --tables "${val}" \
            >> "${f}"
    done
    # compress the result and only remove the plain dump if gzip succeeded
    gzip -9 -c "${f}" > "${f}.gz" && rm "${f}"
fi
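For reference, each pass of the loop boils down to roughly this single mysqldump call (host, credentials, database and table names here are just placeholder values):

mysqldump \
    -h 127.0.0.1 \
    -u myuser \
    -p'mypass' \
    --no-create-info \
    --lock-tables=false \
    --where="1 limit 1000" \
    --databases mydb \
    --tables table1 \
    >> test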
So now I can just limit my data and export exactly what I want:
./partial-mysqldump -u <user> -h <host> -p <password> -d <database> -t <table1> -t <table2> -t <table3> -f test -l 1000
will dump the first 1000 rows of table1, table2, and table3 into test, then compress the result to test.gz.
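To load such a dump back in, pipe it straight into mysql (the target tables must already exist on that server, since the dump was taken with --no-create-info; host, user and database are placeholders):

gunzip -c test.gz | mysql -h <host> -u <user> -p <database>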
The problem with mysqldump's --where="1 limit <yourlimit>" option is that the limit counts against the overall result set, not per table.
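If you want to sanity-check the per-table limit yourself, one option (assuming you also add --skip-extended-insert to the mysqldump flags, so every row gets its own INSERT statement) is to count the INSERTs per table in the finished dump:

gunzip -c test.gz | grep -c 'INSERT INTO `table1`'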
Cheers
Tiago