Download a whole website

A small company has a pretty old website. They outsourced the administration of the site and only have access to the content via a web interface; no database, FTP, or other login credentials are available anymore, because the company to which the administration was outsourced no longer exists and the login data was lost. A new website will be put in place, and they want the old website to stay on a subdomain; static HTML pages are enough for this purpose. The goal is to download the whole website, including all graphics, documents, etc., without following external links to other sites, for backup and archival reasons. Scriptable solutions are preferred (no GUI apps).
1 answer

wget & curl

Use wget or curl from the command line (almost every Linux distro and the like ships with them).

wget -mk http://example.com
or use the curlmirror script:
http://curl.haxx.se/programs/curlmirror.txt
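
If the goal is a fully static, self-contained copy, a slightly longer wget invocation may work better (a sketch, with example.com standing in for the real site):

wget -m -k -p -E http://example.com

Here -m mirrors the site recursively, -k converts links for offline browsing, -p also fetches page requisites such as images and stylesheets, and -E saves files with .html extensions so any web server can serve them. wget stays on the starting host by default, so external links to other sites are not followed.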
