Monday, February 24, 2014

Installing 'sendEmail' (SMTP Send Email Client Software) on Linux CentOS

1. Introduction


This post shows how to install the 'sendEmail' SMTP e-mail client software on Linux CentOS.


2. Step-by-Step

  • Step #1: Install the software under /opt

$ mkdir -p /opt
$ cd /opt
$ wget http://caspian.dotconf.net/menu/Software/SendEmail/sendEmail-v1.56.tar.gz
$ tar xvf sendEmail-v1.56.tar.gz

  • Step #2: Create a symbolic link on the PATH called 'sendEmail'

$ ln -s /opt/sendEmail-v1.56/sendEmail /usr/sbin/sendEmail
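The unpack-then-symlink pattern above can be rehearsed safely under a temporary prefix before touching the real /opt and /usr/sbin. This sketch is purely illustrative: the stub script stands in for the real sendEmail binary, and all paths are throwaway.

```shell
# Illustrative only: rehearse the install layout under a temporary prefix
PREFIX=$(mktemp -d)
mkdir -p "$PREFIX/opt/sendEmail-v1.56" "$PREFIX/usr/sbin"

# Stand-in for the unpacked sendEmail executable
printf '#!/bin/sh\necho sendEmail stub\n' > "$PREFIX/opt/sendEmail-v1.56/sendEmail"
chmod +x "$PREFIX/opt/sendEmail-v1.56/sendEmail"

# The symlink makes the tool callable from a directory on the PATH
ln -s "$PREFIX/opt/sendEmail-v1.56/sendEmail" "$PREFIX/usr/sbin/sendEmail"
"$PREFIX/usr/sbin/sendEmail"   # runs the stub through the link

rm -rf "$PREFIX"
```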



  • Step #3: Test the installation


$ sendEmail -f noreply@josemarsilva.com.br -t josemarsilva@yahoo.com.br -u "Subject for sendEmail " -m "Send Email Body" -s 10.1.0.2



  • Step #4: Command Line Help


$ sendEmail --help






Installing, configuring, and useful backup scripts for the Amazon s3cmd (v1.5 beta) tool on Linux


1. Introduction

This post shows how to install the Amazon s3cmd tool on Linux and how to use it to back up your files to Amazon S3.

2. Step-by-Step

2.1. Installing Amazon s3cmd tool

  • Step #1: Download the s3cmd tool

$ mkdir -p /opt  # Make directory for installation
$ cd /opt
$ wget -O s3cmd-1.5.0-beta1.tar.gz http://sourceforge.net/projects/s3tools/files/s3cmd/1.5.0-beta1/s3cmd-1.5.0-beta1.tar.gz/download

  • Step #2: Unpack the downloaded files

$ tar -xvf s3cmd-1.5.0-beta1.tar.gz


  • Step #3: Run the Python install script

$ cd ./s3cmd-1.5.0-beta1
$ python setup.py install

  • Step #4: Configure s3cmd with your Amazon S3 account and parameters
To configure the s3cmd tool, you need the following information from your Amazon S3 account: a) access_key; b) secret_key; c) an S3 encryption password. Connect to the Amazon Console and collect these parameters before running "s3cmd --configure". In the example below, the masked values are the data you enter interactively.

$ s3cmd --configure

Enter new values or accept defaults in brackets with Enter.
Refer to user manual for detailed description of all options.

Access key and Secret key are your identifiers for Amazon S3
Access Key: ********************
Secret Key: ****************************************

Encryption password is used to protect your files from reading
by unauthorized persons while in transfer to S3
Encryption password: *************
Path to GPG program [/usr/bin/gpg]:

When using secure HTTPS protocol all communication with Amazon S3
servers is protected from 3rd party eavesdropping. This method is
slower than plain HTTP and can't be used if you're behind a proxy
Use HTTPS protocol [No]:

On some networks all internet access must go through a HTTP proxy.
Try setting it here if you can't conect to S3 directly
HTTP Proxy server name:

New settings:
  Access Key: ********************
  Secret Key: ****************************************
  Encryption password: ************
  Path to GPG program: /usr/bin/gpg
  Use HTTPS protocol: False
  HTTP Proxy server name:
  HTTP Proxy server port: 0

Test access with supplied credentials? [Y/n] Y
Please wait, attempting to list all buckets...
Success. Your access key and secret key worked fine :-)

Now verifying that encryption works...
Success. Encryption and decryption worked fine :-)

Save settings? [y/N] y
Configuration saved to '/root/.s3cfg'
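Under the hood, s3cmd stores these answers in a plain-text INI file. A rough sketch of what '/root/.s3cfg' ends up holding (key names as used by s3cmd; the values here are placeholders, not real credentials):

```
[default]
access_key = YOUR_ACCESS_KEY
secret_key = YOUR_SECRET_KEY
gpg_passphrase = YOUR_ENCRYPTION_PASSWORD
gpg_command = /usr/bin/gpg
use_https = False
```

Keep this file readable only by root (chmod 600), since it contains your AWS credentials in clear text.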




2.2. Test your 's3cmd'


    • Listing a "directory" in your bucket: the 'ls' command lists all files and directories under an S3 path

    $ s3cmd ls s3://your-bucket-name

                           DIR   s3://your-bucket-name/area_archive/
                           DIR   s3://your-bucket-name/area_backup/




    • Putting (copying) a single file to your bucket in Amazon S3: 

    $ s3cmd put /etc/hosts s3://your-bucket-name
    WARNING: Module python-magic is not available. Guessing MIME types based on file extensions.
    /etc/hosts -> s3://your-bucket-name/hosts  [1 of 1]
     174 of 174   100% in    0s   514.89 B/s  done



    • Deleting a single file from your bucket in Amazon S3: 

    $ s3cmd del s3://your-bucket-name/hosts
    File s3://your-bucket-name/hosts deleted



    2.3. Useful Backup Scripts for Amazon 's3cmd'

    Here are my favorite backup-to-Amazon-S3 scripts. You can use, copy, edit, and adapt them to your needs. 


    2.3.1. Script to put (copy) a list of sub-directories to Amazon S3 (rotating over day-numbered sub-directories, one per day of the month)


    • This script separates backups into sub-directories using the current day of the month, so a copy executed on the 1st day of the month is placed in a sub-directory named ".../1/...".  
    • The "Settings" section has configuration variables you can edit for your needs, such as "SUBDIRECTORY_LIST", which holds a space-separated list of the files or directories that must be backed up.
    • At the end of the script an e-mail is sent with the copy log, so you can put this script in your crontab. 
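The day-numbered rotation described above boils down to stripping the padding from `date +%e`; a standalone sketch (the bucket path mirrors the script's settings and is illustrative):

```shell
# Day of month without a leading zero; '%e' pads single digits with a
# space, so strip it before using the value in an S3 path
DAY=$(date +%e | tr -d ' ')

# Each run lands in a day-numbered sub-directory, so a month's worth of
# backups rotates automatically: day 1 overwrites last month's day 1
echo "s3://bucket-backup/area_backup/${DAY}/srv0108/disco/"
```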

    #!/bin/bash
    ###############################################################################
    #       Filename:
    #               s3cmdput-to-amazon.srv0108.sh
    #       Log-changes:
    #               2014-02-25 [josemarsilva@yahoo.com.br] Backup daily to Amazon S3
    #       Pre-reqs:
    #               - sendEmail: SMTP Client
    ###############################################################################

    # Start ...
    date
    echo ""
    PATH=/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin

    # Settings ...
    SERVERNAME_PUT_TO_AMAZON="srv0108"
    SUBDIRECTORY_LIST="/root/backup"
    AMAZON_S3_AREA_BACKUP=s3://bucket-backup/area_backup
    AMAZON_S3_SUFIXO_DISCO=disco
    DAY=$(date +%e)
    DAY=${DAY/ /}
    SCRIPT_FILENAME=$0
    SCRIPT_LOG_FILENAME="${SCRIPT_FILENAME/.sh/.log}"
    SMTP_SERVER="smtp.server.com.br"
    MAIL_FROM=josemarsilva@yahoo.com.br
    MAIL_TO=josemarsilva@yahoo.com.br
    MAIL_SUBJECT="[BACKUP] ${SERVERNAME_PUT_TO_AMAZON} S3CMDPUT-TO-AMAZON - Day: ${DAY}"
    MAIL_BODY="Script: ${SCRIPT_FILENAME} \nServer: ${SERVERNAME_PUT_TO_AMAZON} \nContent: ${SUBDIRECTORY_LIST}\n"
    MAIL_ATTACH=${SCRIPT_LOG_FILENAME}


    #
    # Send each subdirectory from SUBDIRECTORY_LIST
    #
    for SUBDIRECTORY in ${SUBDIRECTORY_LIST}
    do
      echo ""
      date
      echo ""
      echo "s3cmd put -r ${SUBDIRECTORY}/* ${AMAZON_S3_AREA_BACKUP}/${DAY}/${SERVERNAME_PUT_TO_AMAZON}/${AMAZON_S3_SUFIXO_DISCO}/"
      s3cmd put -r ${SUBDIRECTORY}/* ${AMAZON_S3_AREA_BACKUP}/${DAY}/${SERVERNAME_PUT_TO_AMAZON}/${AMAZON_S3_SUFIXO_DISCO}/
    done

    #
    # Send e-mail log ...
    #
    echo ""
    echo "Send e-mail log ..."
    sendEmail -f ${MAIL_FROM} -t ${MAIL_TO} -u "${MAIL_SUBJECT}" -s ${SMTP_SERVER}  -m "${MAIL_BODY}" -a ${MAIL_ATTACH}
    echo ""

    # Finish ...
    echo ""
    date




    2.3.2. Script to stage a backup of important Linux configuration files in a sub-directory

    #!/bin/bash
    ###############################################################################
    #       Filename:
    #               backup.srv0112.sh
    #       Log-changes:
    #               2014-02-25 [josemarsilva@yahoo.com.br] Backup Linux SO
    #       Pre-reqs:
    #               - n/a
    ###############################################################################

    export BACKUP_STAGE_DIR="/root/backup"

    date
    ###############################################################################
    echo "Step 1: Scripts, configurations, etc..."
    ###############################################################################

    tar -cvf ${BACKUP_STAGE_DIR}/etc.rc.d.tar               /etc/rc.d
    tar -cvf ${BACKUP_STAGE_DIR}/etc.sysconfig.tar          /etc/sysconfig
    tar -cvf ${BACKUP_STAGE_DIR}/var.spool.cron.tar         /var/spool/cron
    tar -cvf ${BACKUP_STAGE_DIR}/root.script.tar            /root/script
    tar -cvf ${BACKUP_STAGE_DIR}/root.ssh.tar               /root/.ssh
    tar -cvf ${BACKUP_STAGE_DIR}/etc.hosts.tar              /etc/hosts*
    tar -cvf ${BACKUP_STAGE_DIR}/etc.resolv.conf.tar        /etc/resolv.conf
    tar -cvf ${BACKUP_STAGE_DIR}/etc.rsyslog.tar            /etc/rsyslog*
    tar -cvf ${BACKUP_STAGE_DIR}/root.bash.tar              /root/.bash*

    date
    ###############################################################################
    echo "Step 2: MySQL "
    ###############################################################################

    /root/script/backup-mysql.srv0112.sh > /root/script/backup-mysql.srv0112.log 2>&1


    date
    ###############################################################################
    echo "Step 3: SVN "
    ###############################################################################

    /root/script/backup-svn.sh > /root/script/backup-svn.log 2>&1


    date
    ###############################################################################
    echo ""
    echo "Step 4.1: Sending backup to HDEX ..."
    echo ""
    ###############################################################################
    nohup /root/script/scp-to-hdex.srv0112.sh > /root/script/scp-to-hdex.srv0112.log 2>&1 &

    date
    ###############################################################################
    echo ""
    echo "Step 4.2: Sending backup to AMAZON ..."
    echo ""
    ###############################################################################
    nohup /root/script/s3cmdput-to-amazon.srv0112.sh > /root/script/s3cmdput-to-amazon.srv0112.log 2>&1 &


    date
    ###############################################################################
    echo "Finish"
    ###############################################################################
    date
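    Before trusting a staging script like this in cron, it is worth confirming that each archive it produces is actually listable. A small self-contained rehearsal of the tar steps, using a throwaway directory instead of the real /etc paths:

```shell
# Illustrative check: pack a directory and list the archive back
STAGE=$(mktemp -d)                           # stand-in for BACKUP_STAGE_DIR
mkdir -p "$STAGE/demo"
echo "hello" > "$STAGE/demo/file.txt"

tar -cf "$STAGE/demo.tar" -C "$STAGE" demo   # pack, like the steps above
tar -tf "$STAGE/demo.tar"                    # list contents; a corrupt
                                             # archive would fail here
rm -rf "$STAGE"
```

The same `tar -tf` check could be appended after each `tar -cvf` line in the script, so a truncated archive is caught before it is shipped to S3.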


