Currently Viewing Posts Tagged shell

Remove Old Kernels on CentOS

I knew how to remove old kernels on Ubuntu, but I didn’t know how to do it on CentOS. After some googling, I found the answer.

First, check how many kernels are installed:

[root@juntixiao ~]# rpm -q kernel
kernel-2.6.32-573.3.1.el6.x86_64
kernel-2.6.32-573.7.1.el6.x86_64
kernel-2.6.32-573.8.1.el6.x86_64

There are three kernels on this machine.

Now delete the old kernels.
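A common approach is the package-cleanup helper; this is a sketch, assuming the yum-utils package (which provides package-cleanup on CentOS 6) is available in your repositories:

```shell
# Install yum-utils, which provides the package-cleanup helper
yum install -y yum-utils

# Remove all but the newest installed kernel
# (use --count=2 to keep the two newest as a safety net)
package-cleanup --oldkernels --count=1
```

The running kernel is never removed, but it is still safest to reboot into the newest kernel before cleaning up.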

Continue reading “Remove Old Kernels on CentOS”

Switch to bzip2 compression on my server backup scripts

Previously I used gzip as the compression tool to compress the web site files and MySQL database. It is still working on my Linode server. I learned something from the comparison in this post.

So I tried bzip2 instead of gzip in my backup script.

Here are two examples.

1) A mysqldump file, which is a plain-text SQL file.

The original SQL file is 52,317 KB.

  • gzip with level 9 (best) compression: 23,227 KB – compression ratio 44.39%
  • bzip2 with default compression (I think it is level 9): 19,666 KB – compression ratio 37.59%

The bzip2 file is smaller than the gzip file by about 15%.

2) Website files, including PHP, JPEG, PNG, CSS, and any other files used on the site.

The original tar file is 187,670 KB.

  • gzip compression: 42,101 KB – ratio 22.43%
  • bzip2 compression: 34,753 KB – ratio 18.51%

Bzip2 is better here too; the backup file size is reduced by about 17%.
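In the backup script, the switch itself is a one-flag change in tar: -z selects gzip and -j selects bzip2. A minimal sketch (the paths are placeholders for your own site directory):

```shell
# gzip-compressed archive of the site files
tar -czf site-backup.tar.gz /var/www/mysite

# bzip2-compressed archive of the same files: smaller, but slower to create
tar -cjf site-backup.tar.bz2 /var/www/mysite

# Compare the resulting sizes
ls -l site-backup.tar.gz site-backup.tar.bz2
```

The trade-off is CPU time: bzip2 takes noticeably longer to compress than gzip, which may matter on a small VPS.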

Continue reading “Switch to bzip2 compression on my server backup scripts”

How to know the disk usage of your VPS

It is a simple and very common task for a VPS user. You have a VPS and want to know the disk usage of your data and system. You also want to know which folder takes up the most disk space.

The command is du

du

Usage

du takes a single argument specifying a pathname for du to work on; if none is specified, the current directory is used. The SUS mandates the following options for du:

-a, display an entry for each file (and not directory) contained in the current directory
-c, display a grand total of the disk usage found by the other arguments
-d #, the depth at which summing should occur. -d 0 sums at the current level, -d 1 sums at the subdirectory, -d 2 at sub-subdirectories, etc.
-H, calculate disk usage for link references specified on the command line
-k, show sizes as multiples of 1024 bytes, not 512 bytes
-L, calculate disk usage for link references anywhere
-s, report only the sum of the usage in the current directory, not for each file
-x, only traverse files and directories on the device on which the pathname argument is specified.

The usage summary above was retrieved from Wikipedia.

Here is what I entered to get the result I want.

[Screenshot: du command output]
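As a sketch of a typical invocation combining the options above (replace . with any path you want to inspect):

```shell
# Total usage of the current directory tree, in 1 KB blocks
du -s -k .

# Usage of each immediate subdirectory, largest first
du -k -d 1 . | sort -rn | head
```

On older GNU systems, -d 1 may need to be written as --max-depth=1.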

Continue reading “How to know the disk usage of your VPS”

bc not found error when running Mysql tuning primer script

To fine-tune a MySQL server, run the MySQL tuning primer script on it.

But I got an error saying bc was not found. Just run yum install bc to install it:

[root@vps test]# ./tuning-primer.sh
which: no bc in (/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin)
Error: Command line calculator 'bc' not found!
[root@vps test]# yum install bc

How to Synchronize files between two Dreamhost accounts

If you have two Dreamhost FTP accounts with SSH access and you want to transfer the content from account A to account B, the following script will do it.
It is simply a one-line shell command. I saved it into a .sh file and scheduled it to run every two hours.
Here is the script for you.
Name of Account A: ftp2010
Host name for ftp2010: hosting.mydomain.com
Name of Account B: ftp2011
Host name of ftp2011: hosting.yourdomain.com
The goal is to transfer all files under ftp2010 to ftp2011 at hosting.yourdomain.com.

Continue reading “How to Synchronize files between two Dreamhost accounts”

Large MySQL database transferring

To transfer a large MySQL database, you need to learn how to do it with commands.
Large here means the database is bigger than 8 MB after compression.
phpMyAdmin is a wonderful web tool for MySQL, but its upload limit is 8 MB. If the backup file is larger than that, you cannot upload it, and of course cannot import it.
The platform I am working on is Linux (CentOS), in an SSH command-line environment.
1) Backup: export all tables of one database into one file.

mysqldump -u root -p databasename | gzip > backup.sql.gz

I use this command to back up my forum database. The result is a gzip file, i.e. already compressed. I have a database above 500 MB; the gzipped file is about 125 MB.
Once I have the .gz file, I can download it to my PC or put it on another server for transferring.
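On the receiving end, the dump can be streamed straight from the compressed file into MySQL without uncompressing it to disk first. A sketch, assuming the dump file is named backup.sql.gz and the target database already exists:

```shell
# Restore the gzipped dump into an existing database
gunzip < backup.sql.gz | mysql -u root -p databasename
```

This avoids needing twice the disk space for the uncompressed .sql file during the import.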

Continue reading “Large MySQL database transferring”

Schedule Backup MySQL database and FTP offsite in Centos

I have a VPS powered by CentOS 5.2.
There are some very important forums on it, and I would like to back them up every day.
Because HyperVM/LxAdmin’s Backup Home cannot back up MySQL (or at least not easily), I have to write a shell script to do the backup job.
To do the FTP transfer I need the lftp tool.
The whole picture looks like this:
1) Dump all MySQL databases and compress them, save them in /home/usera/backup/mysql/
2) FTP them to Backup server of Dreamhost at /vps-backup/mysql/
3) Remove backup files more than 4 days old.
You have to use your own account name and password.
usera is my sample user name. You can also use any other FTP server to store your offsite backups.
Create a file at /home/usera/mysql.backup.sh.
Put the content below in it, save it, and make it executable.
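The script itself sits behind the link; here is a hedged sketch of the three steps, with placeholder credentials, hostnames, and paths that must be replaced with your own:

```shell
#!/bin/sh
# Sketch of a daily MySQL backup + offsite FTP script.
# All names, passwords, and hosts below are placeholders.

BACKUP_DIR=/home/usera/backup/mysql
STAMP=$(date +%Y%m%d)
FILE="$BACKUP_DIR/all-databases-$STAMP.sql.gz"

# 1) Dump all databases and compress the result
mysqldump -u root -pMYSQLPASS --all-databases | gzip > "$FILE"

# 2) FTP the dump to the Dreamhost backup server with lftp
lftp -u backupuser,FTPPASS backup.dreamhost.com \
     -e "put -O /vps-backup/mysql $FILE; bye"

# 3) Remove local backup files more than 4 days old
find "$BACKUP_DIR" -name '*.sql.gz' -mtime +4 -delete
```

Schedule it with cron, e.g. a crontab line like `0 3 * * * /home/usera/mysql.backup.sh` for a daily 3 a.m. run.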


Continue reading “Schedule Backup MySQL database and FTP offsite in Centos”
