Previously I covered how to export and import an Elasticsearch index from one ES cluster to another. This time, instead of storing a backup of a particular index locally, we'll see how to put it in an S3 bucket. Let's jump in.
Prerequisites
- AWS S3
- Elasticsearch cluster
- Docker
Export ES index data to S3
The command for exporting ES index data to S3 is much the same as the one for exporting to another ES cluster; you just need to provide the S3 access key ID and secret access key.
docker run --rm -ti elasticdump/elasticsearch-dump \
  --s3AccessKeyId "${access_key_id}" \
  --s3SecretAccessKey "${access_key_secret}" \
  --input=http://prod.es.io:9200/index_name \
  --output="s3://${bucket_name}/${file_name}.json"
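For a more complete backup you can also save the index mapping, not just the data, by running the export twice with elasticdump's `--type` option. Below is a minimal wrapper sketch around the command above; the cluster URL, bucket, and file names are placeholders, and the `DRY_RUN` switch is a hypothetical convenience that only prints the commands so you can check the flags before a real run.

```shell
#!/bin/sh
# Sketch: back up the mapping first, then the data, so the index can be
# recreated with the correct field types on restore. Placeholders:
# ES_URL, bucket_name, file_name. DRY_RUN defaults to 1 and only prints
# the commands; set DRY_RUN=0 to actually execute them.
set -eu

ES_URL="http://prod.es.io:9200/index_name"
S3_PREFIX="s3://${bucket_name:-my-bucket}/${file_name:-index_backup}"
DRY_RUN="${DRY_RUN:-1}"

dump() {
  type="$1"
  output="$2"
  # Build the elasticdump command; the credentials stay as literal
  # $access_key_id / $access_key_secret until eval time.
  cmd="docker run --rm -ti elasticdump/elasticsearch-dump \
    --s3AccessKeyId \"\$access_key_id\" \
    --s3SecretAccessKey \"\$access_key_secret\" \
    --input=$ES_URL \
    --output=$output \
    --type=$type"
  if [ "$DRY_RUN" = "1" ]; then
    echo "$cmd"
  else
    eval "$cmd"
  fi
}

# Mapping first, then data.
dump mapping "${S3_PREFIX}_mapping.json"
dump data "${S3_PREFIX}_data.json"
```

On restore, import the mapping file before the data file so field types are in place before documents are indexed.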
Import the index data stored in S3 into an ES cluster
The procedure is the same as for exporting; we just need to swap the input and output endpoints, so S3 becomes the source and the ES cluster the destination.
docker run --rm -ti elasticdump/elasticsearch-dump \
  --s3AccessKeyId "${access_key_id}" \
  --s3SecretAccessKey "${access_key_secret}" \
  --input="s3://${bucket_name}/${file_name}.json" \
  --output=http://prod.es.io:9200/index_name
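After the import finishes, it's worth checking that the restored index actually contains the documents you expect. A minimal sketch of such a sanity check, using Elasticsearch's standard `_count` API; the URLs and the `sed` extraction are assumptions, adjust them to your environment:

```shell
#!/bin/sh
# Sketch of a post-restore sanity check: compare the restored index's
# document count against an expected value (e.g. the source index's
# count recorded at export time).
set -eu

# Fetch the document count of an index, e.g.
#   count_docs "http://prod.es.io:9200/index_name"
count_docs() {
  curl -s "$1/_count" | sed 's/.*"count":\([0-9]*\).*/\1/'
}

# Compare expected and actual counts; non-zero exit on mismatch.
verify_restore() {
  expected="$1"
  actual="$2"
  if [ "$actual" -eq "$expected" ]; then
    echo "restore OK: $actual documents"
  else
    echo "restore MISMATCH: expected $expected, got $actual" >&2
    return 1
  fi
}

# Example usage (requires both clusters to be reachable):
# verify_restore "$(count_docs http://source.es.io:9200/index_name)" \
#                "$(count_docs http://prod.es.io:9200/index_name)"
```

Note that counts can legitimately differ if the source index received writes during the export, so treat a mismatch as a prompt to investigate rather than proof of a failed restore.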
Conclusion
Backing up crucial ES indexes to S3 is good practice, because it lets you quickly access and restore them if something goes wrong with your ES cluster. Feel free to leave a comment below, and if you found this tutorial useful, follow our official channel on Telegram.