Web Site Backup Script

They have: 5 posts

Joined: Apr 2001

Just wondering if any of you could recommend a good Perl or PHP script to back up my web site. I have found a few by looking in the usual places (hotscripts.com, etc.), but would like to know how others back up their sites.

Thanks!


They have: 1,015 posts

Joined: Apr 1999

Usually your host will have backups, but you should keep your own as well. I would recommend keeping an active copy of your web site on your home computer; you can also tar up your directory on the web server and download the archive to your computer.
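A minimal sketch of that approach, assuming shell access and that your site lives under ~/public_html (adjust the path and host name for your setup; these are placeholders):

<--begin snippet-->
# On the server: bundle the whole site into one compressed archive
tar -czvf site-backup.tar.gz ~/public_html

# Then download site-backup.tar.gz with your FTP client, or, if
# scp is available on your home machine:
scp you@yourhost.example.com:site-backup.tar.gz .
<--end snippet-->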

Brian

They have: 5 posts

Joined: Apr 2001

"tar up your directory on your web server and download it to your computer"

I guess that's what I would like to do, but do not yet know how. I DO have telnet access to the server, though.

They have: 5 posts

Joined: Apr 2001

In searching for more information, I came up with the following:

<--begin snippet-->
Tar + gzip archive

gtar -czvf archive.tar.gz dir
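# In the line above: c = create, z = compress with gzip,
# v = verbose (list files as they are added), f = write to the named file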
<--end snippet-->

Do I telnet to the server and do something like this? Would this archive the whole web site?

(Thank you for your help!)

They have: 5 posts

Joined: Apr 2001

Or is the following closer to what you would use to make a backup of a web site:

<--begin snippet-->

find . -name "*.[ch]" -print | zip source -@

<--end snippet-->
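(I realize that pattern only matches C source files; for a whole site, I suppose it would be something more like the following?)

<--begin snippet-->

# Feed every regular file under the current directory to zip
# (zip's -@ option reads the list of file names from standard input)
find . -type f -print | zip sitebackup -@

<--end snippet-->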


They have: 1,015 posts

Joined: Apr 1999

You should be able to use the tar and gzip commands to do what you want... You can get more information, such as available options and usage info, by typing:
tar --help
or
gzip --help

at the command line.
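For example, assuming GNU tar and the archive created earlier in the thread, checking and restoring it would look something like:

<--begin snippet-->
# List the archive's contents without extracting anything
tar -tzvf archive.tar.gz

# Unpack the archive into the current directory
tar -xzvf archive.tar.gz
<--end snippet-->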

Brian

They have: 5 posts

Joined: Apr 2001

I will go try it right now.

Thank you.

They have: 488 posts

Joined: Feb 2000

If you have experience with Unix and are comfortable in a shell (and with an editor such as Pico), then Telnet shell access works fine, but some hosts do not provide this feature, or you have to pay extra for it.

For most users, to make a full backup of your site on your host's server, I would suggest using FTP. I use it often for some of my clients' sites that have interactive, database-driven, password-protected directories, so if my host has problems with their server, I can just upload all my files again.

For beginners only:

1) Create a folder on your hard disk.
2) Connect to your host's server using FTP.
3) Move up to your root directory.
4) Switch your FTP client to automatic (ASCII/binary) transfer mode.
5) Highlight all the folders/directories on the host's server.
6) Make sure the destination folder on your hard disk is the right one.
7) Click the Download/Transfer button.

All the files you have on the server will be downloaded to the chosen folder on your hard disk.
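If you prefer the command line, a scripted equivalent is possible. For example, a sketch using wget's FTP mirroring (substitute your own login, password, host name, and directory; these are placeholders):

<--begin snippet-->
# Recursively mirror everything under public_html into ./backup
# (-m turns on mirroring: recursion plus timestamping; -P sets
# the local destination directory)
wget -m -P backup "ftp://user:password@ftp.example.com/public_html/"
<--end snippet-->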

They have: 5,633 posts

Joined: Jan 1970

If you're running on Linux or other Unices, check out the utility wget (gnu.org/software/wget/wget.html) as well. It recursively traverses from a starting URL, and can pull over the user-visible portions of a site and build a backup/mirror.
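For example (a minimal sketch; www.example.com is a placeholder):

<--begin snippet-->
# Mirror the publicly visible pages, rewrite links so the local
# copy is browsable offline, and grab the images/stylesheets each
# page needs
wget --mirror --convert-links --page-requisites http://www.example.com/
<--end snippet-->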

Leo
