Backup solution for Kubernetes with Velero
Hello folks, I have come up with a new blog on a Kubernetes backup solution. Losing deployed applications or the whole Kubernetes cluster is a nightmare for any DevOps engineer. So in this blog, I am going to explain how to set up Velero in a Kubernetes cluster for scheduled backups of the resources you choose.
What is Velero?
Velero is an open-source tool to safely back up and restore, perform disaster recovery, and migrate Kubernetes cluster resources and persistent volumes. Velero creates backups and stores the backup files in configured cloud storage, which makes it very useful for cluster disaster recovery and migration tasks.
Setup
Step 1: Install the Velero CLI
Windows (using chocolatey)
choco install velero
Linux
curl -L -o /tmp/velero.tar.gz https://github.com/vmware-tanzu/velero/releases/download/v1.5.1/velero-v1.5.1-linux-amd64.tar.gz
tar -C /tmp -xvf /tmp/velero.tar.gz
mv /tmp/velero-v1.5.1-linux-amd64/velero /usr/local/bin/velero
chmod +x /usr/local/bin/velero
velero --help
MacOs
brew install velero
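Whichever installer you use, you can confirm the CLI is on your PATH by printing its version; the --client-only flag skips contacting a cluster, which we have not set up yet:
velero version --client-only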
Step 2: AWS storage setup
Log in to AWS -
# Access your "My Security Credentials" section in your profile.
# Create an access key
aws configure
Default region name: ap-southeast-2
Default output format: json
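Before moving on, you can confirm the configured credentials actually work by asking AWS who you are:
aws sts get-caller-identity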
Create a storage bucket -
export BUCKET=valero-backup
export REGION=us-east-1
export PROFILE=myawsprofile
aws s3api create-bucket --bucket $BUCKET --region $REGION --acl private
Note: If you have multiple AWS accounts configured on your system, add the --profile flag to the above command to select the appropriate account.
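For example, with the PROFILE variable exported above (assuming a profile named myawsprofile exists in your AWS config), the same command becomes:
aws s3api create-bucket --bucket $BUCKET --region $REGION --acl private --profile $PROFILE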
Create an IAM user to access the created storage -
aws iam create-user --user-name velero --profile $PROFILE
Set up a policy for the user -
cat > velero-policy.json <<EOF
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "ec2:DescribeVolumes",
                "ec2:DescribeSnapshots",
                "ec2:CreateTags",
                "ec2:CreateVolume",
                "ec2:CreateSnapshot",
                "ec2:DeleteSnapshot"
            ],
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:DeleteObject",
                "s3:PutObject",
                "s3:AbortMultipartUpload",
                "s3:ListMultipartUploadParts"
            ],
            "Resource": [
                "arn:aws:s3:::${BUCKET}/*"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::${BUCKET}"
            ]
        }
    ]
}
EOF
aws iam put-user-policy \
--user-name velero \
--policy-name velero \
--policy-document file://velero-policy.json
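If you want to verify that the inline policy was attached to the user, you can list it and fetch it back:
aws iam list-user-policies --user-name velero
aws iam get-user-policy --user-name velero --policy-name velero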
Create an access key for that IAM user -
aws iam create-access-key --user-name velero > /tmp/key.json
AWS_ACCESS_ID=`cat /tmp/key.json | jq .AccessKey.AccessKeyId | sed s/\"//g`
AWS_ACCESS_KEY=`cat /tmp/key.json | jq .AccessKey.SecretAccessKey | sed s/\"//g`
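A quick sanity check that jq captured both values; an empty variable here would produce an unusable credentials file in the next step:
[ -n "$AWS_ACCESS_ID" ] && [ -n "$AWS_ACCESS_KEY" ] && echo "credentials captured"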
Step 3: Velero setup
Install Velero in the cluster -
cat > /tmp/credentials-velero <<EOF
[default]
aws_access_key_id=$AWS_ACCESS_ID
aws_secret_access_key=$AWS_ACCESS_KEY
EOF
velero install \
--provider aws \
--plugins velero/velero-plugin-for-aws:v1.1.0 \
--bucket $BUCKET \
--backup-location-config region=$REGION \
--snapshot-location-config region=$REGION \
--secret-file /tmp/credentials-velero
kubectl -n velero get pods
kubectl logs deployment/velero -n velero
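Once the Velero pod is running, you can also confirm that it can reach the S3 bucket; the backup storage location should show as Available:
velero backup-location get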
Create a backup -
velero backup create default-namespace-backup --include-namespaces default
# describe created backup
velero backup describe default-namespace-backup
# logs of the backup
velero backup logs default-namespace-backup
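You can also list all backups and watch the status change as the backup completes:
velero backup get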
After the backup completes, you will see backup tarball files created in the AWS storage bucket.
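You can confirm this from the AWS side by listing the bucket; by default Velero writes each backup under the backups/ prefix:
aws s3 ls s3://$BUCKET/backups/ --recursive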
Schedule backups -
You can also create a scheduled backup, which creates backups on a regular basis according to a given cron expression. When using scheduled backups, it is a good idea to set an expiry (TTL) on each backup so you maintain a fixed number of recent backups. In the schedule below, each backup lives for only 72 hours and a new backup is created every 12 hours.
velero create schedule $NAME --schedule="@every 12h" --exclude-namespaces kube-system --ttl 72h0m0s
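After creating the schedule (with $NAME set to whatever you want to call it), you can list it and the backups it has produced; scheduled backups are named after the schedule with a timestamp suffix:
velero schedule get
velero backup get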
Restore backup -
velero restore create default-namespace-backup --from-backup default-namespace-backup
# describe
velero restore describe default-namespace-backup
#logs
velero restore logs default-namespace-backup
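As with backups, you can list all restores to check their status:
velero restore get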
You can set up alerts using BotKube so that you receive notifications on Slack for every new backup/restore.