Differences between revisions 7 and 14 (spanning 7 versions)
Revision 7 as of 2008-10-31 05:49:09
Size: 2004
Editor: DannyCheung
Comment:
Revision 14 as of 2009-08-27 01:42:02
Size: 3550
Editor: JonCo
Comment:

SQL Mirroring

How it works

  1. A record in one of the .dat files is altered. This includes changes to quantities and balances, including some changes that polling would not normally send or receive.
  2. The type of record and the record number are recorded in dblog.dat
  3. log2odbc takes unprocessed dblog.dat entries, generates an SQL query for the whole record and sends it via ODBC
  4. dblog.dat gets updated so that newly processed entries are not re-processed.
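The cycle above can be sketched with plain files. This is only an illustrative simulation: dblog.sim and dblog.mark are invented stand-ins, not real Control files, and the echo stands in for the SQL that log2odbc would send via ODBC.

```shell
# Simulate the dblog.dat append-and-replay cycle with plain files.
# dblog.sim stands in for dblog.dat; dblog.mark tracks how many
# entries have already been processed.
log=dblog.sim
mark=dblog.mark
: > "$log"
echo 0 > "$mark"

# Steps 1-2: a record change appends "type:recno" to the log
echo "stock:42" >> "$log"
echo "stock:43" >> "$log"

# Step 3: replay only the unprocessed entries (log2odbc would build
# a real SQL statement for each record and send it via ODBC)
done_n=$(cat "$mark")
total=$(grep -c '' "$log")
sed -n "$((done_n + 1)),\$p" "$log" | while read -r entry; do
    echo "would send SQL for $entry"
done

# Step 4: record that everything so far has been processed,
# so a second run re-sends nothing
echo "$total" > "$mark"
```

Running the replay again immediately afterwards processes zero entries, which is the property that makes the per-minute cron job in the installation steps safe.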

Installation

These instructions explain how Control and the SQL database work together, and will help when troubleshooting or implementing an SQL database for a client.

All control data files are stored in .dat files and .idx files are indexes for the .dat files.

Steps to configure the SQL database and Control

  1. Install the SQL database. This could be either on the same server that Control resides on or on a different server.
    # yum install postgresql
  2. Download and install the ODBC driver for your database.
    # yum install unixODBC
  3. Configure /etc/odbc.ini. You will need to set:
    • [xxxxxx]: The ODBC connection name
    • Driver: The ODBC driver used to connect to the database (different for different SQL vendors)
    • Database: Name of the database that you created
    • ServerName: The IP of the SQL server (possibly this machine)
    • Port: Port the SQL server listens on (different for different SQL vendors)
    • UserName: SQL login name
    • Password: SQL password

    [control]
    Driver     = /usr/local/lib/psqlodbcw.so
    Port       = 5432
    ServerName = localhost
    Database   = control
    UserName   = ccc
    Password   = support
  4. Create the database using the following command
    # createdb databasename
    At this point, if you run into problems using postgres, check the ["Postgresql problems"] page.
  5. Create the tables
    # createtb --help (to see options)
    # createtb -d <databasename>
  6. Log off all Control users
  7. Set the flag in cooad:

    Enable update of odbc data file = Y

    This flag will start logging to dblog.dat.
  8. If the SQL database has not been in use from the beginning of Control, existing records will not have been logged to dblog.dat, so simply enabling the flag only captures new changes. To log the existing records as well, use this command.
    # fdb2log
  9. Check the contents of dblog.dat with
    # dblogdump
  10. Now start dumping Control data to the SQL database using these commands
    • # log2odbc --help
    • # log2odbc -d <ODBC_Connection_Name>
  11. Test whether the data dump was successful.
    # psql dannydb postgresql
      => select * from stock;
    If you see the data you are looking for, then it is working.
  12. log2odbc has the following default values:

    ODBC Connection Name = control
    Username = ccc
    Password = support 

    You will need to alter the /u/cc/binl/incremental_dump script so that log2odbc accesses the correct DB

  13. Now after everything is done, place a job in ccc's crontab to perform the automatic update.

    * * * * * /u/cc/binl/incremental_dump >> /u/cc/LOG/incremental_dump.log 2>&1
  14. dblog.dat does not get re-initialised automatically. You will need to run rolldblog to initialise dblog.dat.
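Before relying on the cron job from step 13, it is worth sanity-checking the DSN set up in step 3. unixODBC ships an isql tool that tests the live connection directly (isql control ccc support). As an offline illustration, the sketch below only checks that a DSN stanza contains the expected keys; odbc.ini.sample is a throwaway local copy, not /etc/odbc.ini.

```shell
# Check that a DSN stanza has all the keys used in step 3.
# Writes a throwaway sample file; point ini= at /etc/odbc.ini on a real system.
ini=odbc.ini.sample
cat > "$ini" <<'EOF'
[control]
Driver     = /usr/local/lib/psqlodbcw.so
Port       = 5432
ServerName = localhost
Database   = control
UserName   = ccc
Password   = support
EOF

# Fail loudly if any expected key is missing from the stanza
for key in Driver Port ServerName Database UserName Password; do
    grep -q "^$key" "$ini" || { echo "missing: $key"; exit 1; }
done
echo "DSN stanza looks complete"
```

A key-presence check like this catches the common case of a typo'd key name before log2odbc fails silently from cron; the live isql test then confirms the driver, server and credentials actually work.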

Maybe add to backup script
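Following on from step 14, rolldblog could also be scheduled rather than run by hand. The sketch below is an assumption, not documented behaviour: the rolldblog path and the weekly 03:00 Sunday schedule are guesses, so adjust both for the site.

```
# Illustrative crontab for the ccc user: the per-minute incremental dump
# from step 13, plus a weekly rolldblog to re-initialise dblog.dat
# (rolldblog path and schedule are assumptions).
* * * * * /u/cc/binl/incremental_dump >> /u/cc/LOG/incremental_dump.log 2>&1
0 3 * * 0 /u/cc/binl/rolldblog >> /u/cc/LOG/rolldblog.log 2>&1
```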


SQL_Mirroring (last edited 2013-09-18 06:09:33 by localhost)