15 Nov

Dumb script for Picasaweb backup on Linux server & Amazon S3

Just wrote a quick script to pull a dump of my Picasaweb albums onto my server and further on to Amazon S3. Overall I trust Google with my data, but it’s always a poor idea to leave all your eggs in a single bucket.

OK, here’s the script (poorly written code; I literally spent 10 minutes on this, so suggestions to improve my coding are more than welcome!)


#!/bin/bash
# Destination must point at your backup directory (set this before running)
Destination=/path/to/backup

# Pull the album list; cut keeps the album name (first comma-separated field)
google picasa list-albums | cut -d"," -f1 > $Destination/tmp/album_list.txt

# Download each album one by one
while read album; do
    google picasa get "$album" $Destination/tmp
done < $Destination/tmp/album_list.txt

# Archive, encrypt and upload
FileName=PicsBackup$(date '+%d-%B-%Y').tar
tar -cpzf $Destination/$FileName $Destination/tmp
gpg --output $Destination/$FileName.pgp -r <YOUR-PGP-KEY-HERE> --always-trust --encrypt $Destination/$FileName
s3cmd put $Destination/$FileName.pgp s3://YOUR-AWS-S3-BUCKET-ADDRESS-HERE

# Clean up local copies
rm -r $Destination/tmp/*
rm $Destination/$FileName
rm $Destination/$FileName.pgp



How to use

Simply download the GoogleCL scripts and get your Google account working with the installed stack. If you also need Amazon S3 backup support, install & configure s3cmd. Once you have both of these configured with your accounts, simply set the executable bit on the script & run!
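For reference, here's a rough setup sequence on a Debian/Ubuntu box. The package names and the script file name here are assumptions, so adjust for your distro:

# install the tools (package names may differ on your distro)
sudo apt-get install googlecl s3cmd
# first run triggers the Google account authentication prompt
google picasa list-albums
# interactive wizard asking for your AWS access & secret keys
s3cmd --configure
# set the executable bit and run (script name is just an example)
chmod +x picasa-backup.sh
./picasa-backup.sh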


Code logic

Yes, it’s super crappy code, but it does the job anyway.

I couldn’t find an easy way to download the entire album base from Picasa. There seems to be a bug in the GoogleCL tools around directory creation, and hence google picasa get .* . fails right after the first album pull. GoogleCL does offer a list of album names (along with hyperlinks) via the list-albums parameter. So the first part of the code pulls that list and cuts out the first field of the output, using the comma as delimiter. Next, the output is written to a text file, which is read line by line in a loop, and the loop simply downloads each album one by one. Once the download completes, tar runs to create a compressed archive, followed by gpg to encrypt the tarball. The encrypted file is then uploaded to Amazon S3 using the s3cmd tool, and lastly all the downloaded files are deleted!
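To illustrate the cut step: list-albums prints each album name and its hyperlink separated by a comma, so cutting the first field leaves just the names (the album names and URLs below are made up for illustration):

$ google picasa list-albums
Holidays,https://picasaweb.google.com/someuser/Holidays
Family,https://picasaweb.google.com/someuser/Family

$ google picasa list-albums | cut -d"," -f1
Holidays
Family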


On Amazon S3 I have a bucket expiry rule which takes care of rotation and removal of old data. I could spend a few more minutes making it fancier, but this one just works! 😉
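If you want to set such an expiry rule from the command line, newer s3cmd releases can do that too (I'm assuming an s3cmd version that has the expire command; the 90-day value is just an example):

# lifecycle rule: expire objects in the bucket after 90 days (illustrative)
s3cmd expire s3://YOUR-AWS-S3-BUCKET-ADDRESS-HERE --expiry-days=90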


Moral: My programming is crappy, no doubt!

28 May

Domain to IP/ASN/BGP block mapping script

Sleepless night. Reading more about Quagga and its options.


In the meanwhile, a quick 5-minute script to do domain to BGP/IP/ASN mapping. The script uses the basic dig command (to find the IP address) and the Team Cymru whois service for IP to ASN/block mapping.



# Script for domain name to IP/ASN/BGP block mapping
hostname=whois.cymru.com                # Team Cymru's whois server
IP=$(dig "$1" a +short | head -n1)      # head guards against multiple A records
whois -h $hostname " -c -p $IP"         # -c adds country code, -p adds the BGP prefix


Yeah, just a 3-line script! Less code = more power!

I set this one up in my ~/.bashrc (see the snippet after the examples). Here’s a live working example:


anurag@laptop ~ $ bwhois he.net
AS | IP | BGP Prefix | CC | AS Name
6939 | | | US | HURRICANE – Hurricane Electric, Inc.

anurag@laptop ~ $ bwhois vsnl.in
AS | IP | BGP Prefix | CC | AS Name
4755 | | | IN | TATACOMM-AS TATA Communications formerly VSNL is Leading ISP

anurag@laptop ~ $ bwhois airtel.in
AS | IP | BGP Prefix | CC | AS Name
9498 | | | IN | BBIL-AP BHARTI Airtel Ltd.

anurag@laptop ~ $ bwhois rcom.co.in
AS | IP | BGP Prefix | CC | AS Name
18101 | | | IN | RELIANCE-COMMUNICATIONS-IN Reliance Communications Ltd.DAKC MUMBAI
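For reference, here's roughly what the ~/.bashrc entry looks like. Since the script takes a domain as an argument, a shell function works better than a plain alias (an alias can't take positional arguments); this sketch assumes the same Team Cymru server as in the script above:

# in ~/.bashrc: domain -> IP/ASN/BGP prefix lookup via Team Cymru
bwhois() {
    local IP=$(dig "$1" a +short | head -n1)
    whois -h whois.cymru.com " -c -p $IP"
}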



Time for me to complete my CN file for tomorrow’s viva. Most boring & dumb work, but it has to be done anyway.

Feel free to leave comments below about just anything!