WGet for Premium Accounts

So, some of you on Linux may be like me and want to download from Megaupload or Rapidshare using your premium accounts… but from a terminal, with low overhead. Well, here's how you do it!

The tool of the day is wget, and it's a wonderfully resourceful piece of code… but here we'll be focussing on using it to download from the above-mentioned sites, from now on known as 1 (Rapidshare) and 2 (Megaupload).

The first thing you will need to do is grab the cookies for your premium accounts. Replace USERNAME and PASSWORD with your account details:
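One thing to note before running the commands below: wget won't create the cookie directory for you, so make it first (and keep it private, since it will hold your login cookies):

```shell
# create the hidden cookie folder once, readable only by you
mkdir -p ~/.cookies
chmod 700 ~/.cookies
```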

1:

wget --save-cookies ~/.cookies/rapidshare --post-data "login=USERNAME&password=PASSWORD" -O - https://ssl.rapidshare.com/cgi-bin/premiumzone.cgi > /dev/null

2:

wget --save-cookies ~/.cookies/megaupload --post-data "login=1&redir=1&username=USERNAME&password=PASSWORD" -O - "http://megaupload.com/?c=login" > /dev/null

This saves a cookie for each account into your hidden cookie folder… run ls ~/.cookies to make sure they're there if you want.
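If you'd rather check from a script before kicking off a big batch, a small helper like this works (the function name is my own, not part of wget — a failed login can leave an empty cookie file behind):

```shell
# check_cookie - succeed only if the given cookie file exists and is non-empty
check_cookie() {
    if [ -s "$1" ]; then
        echo "ok: $1"
    else
        echo "missing or empty: $1" >&2
        return 1
    fi
}
```

e.g. `check_cookie ~/.cookies/rapidshare` before starting your downloads.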

Believe it or not, that's the hard work done… for the download itself, just use the following, replacing URI with the download link:

1.

wget -c --load-cookies ~/.cookies/rapidshare URI

2.

wget -c --load-cookies ~/.cookies/megaupload URI

If you have problems with the megaupload downloads, login to your account and make sure that the direct download option is active.

And that’s it… to make it a bit more automated, create a script like so:

Put the following into a file (e.g. dlfiles), adjusting it where necessary:

#!/bin/bash

wget -c --load-cookies ~/.cookies/megaupload http://www.megaupload.com/s.....

Save it and make it executable:

chmod +x dlfiles

And then execute it with:

./dlfiles
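If you'd rather not edit the script for every new link, a slightly more general version can take the URLs as arguments instead (the cookie path and script layout here are just my assumptions — adjust per site):

```shell
#!/bin/bash
# dlfiles - fetch every URL passed on the command line, resuming partial
# downloads and sending the saved megaupload cookie
cookies="$HOME/.cookies/megaupload"

dl() {
    for url in "$@"; do
        wget -c --load-cookies "$cookies" "$url"
    done
}

dl "$@"
```

Then run it as `./dlfiles URI1 URI2 …`.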

You can also save all the links to the files you want downloaded and use wget's input option, replacing LINKLIST with the file containing the URIs of the files you wish to download:

wget -c --load-cookies ~/.cookies/megaupload -i LINKLIST
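LINKLIST is just a plain text file with one link per line, for example (dummy links, not real files):

```
http://www.megaupload.com/?d=EXAMPLE1
http://rapidshare.com/files/123456/example.rar
```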

Later I will be looking into creating a script to run as a download server to almost completely automate the process based on info from this post, but for now, this should do you well!
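As a rough preview of that idea, a polling loop along these lines could watch a queue file and hand new links to wget in batches — everything here (file names, layout, interval) is just a sketch, not a finished server:

```shell
#!/bin/bash
# dlserver - sketch of a polling "download server": links appended to a
# queue file get handed to wget in batches
queue="$HOME/queue.txt"
cookies="$HOME/.cookies/megaupload"

process_queue() {
    [ -s "$queue" ] || return 0        # nothing queued yet
    mv "$queue" "$queue.work"          # grab the current batch
    wget -c --load-cookies "$cookies" -i "$queue.work"
    rm -f "$queue.work"
}

# run it forever, checking every 30 seconds:
# while true; do process_queue; sleep 30; done
```

To queue a download from anywhere, you'd just `echo URI >> ~/queue.txt`.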
