I wrote this script with the help of a lot of DuckDuckGo searching and countless how-to posts. I was looking for a simple way to periodically package up and archive all of my GitLab repos so that I always have a local copy available.
This script makes use of a personal access token that you generate on GitLab under your account settings. I like to set the token to expire a few weeks out so that I don't have to worry about account security; you can always generate a new one when you need it.
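If you want to sanity-check a fresh token before running the full backup, you can ask the API who you are. This is just a quick test; the header value below is the same placeholder used in the script:

```bash
# Returns a JSON description of your account if the token is valid,
# or a 401 error message if it is not.
curl --silent --header "PRIVATE-TOKEN: your-token-goes-here" \
    "https://gitlab.com/api/v4/user"
```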
For this script to work you will need to paste in your own token and username. You will also need git (obviously) and jq installed to parse the JSON that comes back from the GitLab API.
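On a Debian/Ubuntu style system both dependencies are one command away (swap in your own package manager otherwise):

```bash
sudo apt install git jq
```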
After that, just set the .sh file to executable and run ./backup.sh; it will clone your repos one by one and then tar and gzip the entire folder with the date and time in the filename.
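Assuming you saved the script as backup.sh, that is just:

```bash
chmod +x backup.sh
./backup.sh
```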
The long curl command in getrepoList is split over a few lines with backslash continuations so that it stays readable inside the code block.
Hopefully this helps you archive your precious repos locally too!
```bash
#!/bin/bash

# Paste your own token and username here.
gitlab_token="your-token-goes-here"
gitlab_username="gitlab_username"

# Timestamped directory for this backup run, e.g. gitlab_2024-01-01-12-00
date=$(/bin/date '+%Y-%m-%d-%H-%M')
NEW_DIR="gitlab_${date}"
mkdir "${NEW_DIR}"
# Pull your project list from the GitLab API and extract each repo's
# URL slug with jq. Using .path instead of .name means repos whose
# display name differs from their slug still clone correctly.
# per_page=100 is the API maximum; with more repos you would need pagination.
function getrepoList {
    curl --silent \
        "https://gitlab.com/api/v4/users/$1/projects?private_token=$2&per_page=100" \
        | jq -r '.[].path'
}
repoList=$(getrepoList "${gitlab_username}" "${gitlab_token}")
# Clone each repo, one per line of the list.
while IFS= read -r repo; do
    git clone "git@gitlab.com:${gitlab_username}/${repo}.git" "${NEW_DIR}/${repo}"
    #echo "${repo}"
done <<< "${repoList}"
# Tar and gzip the whole folder, deleting the clones once they are archived.
tar --remove-files -czvf "${NEW_DIR}.tar.gz" "${NEW_DIR}"
```
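Since the whole point is periodic backups, you can let cron run the script for you. This is just a sketch: the paths are placeholders for wherever you keep the script and want the archives to land, and the schedule (Sunday at 03:00) is whatever suits you. Add it with crontab -e:

```bash
# m h dom mon dow  command
0 3 * * 0 cd /home/you/backups && /home/you/backup.sh
```

Note that the clone step uses SSH, so the user the cron job runs as needs your GitLab SSH key set up. Restoring later is just the reverse of the last step: `tar -xzvf gitlab_2024-01-01-12-00.tar.gz` unpacks the folder with all of the clones inside.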