18 October 2011

Oracle Database Administration chapter-6


The SQL*Loader utility is used to load data from other data sources into Oracle. For example, if you have a table in FoxPro, Access, Sybase, or any other third-party database, you can use SQL*Loader to load the data into Oracle tables. SQL*Loader reads data only from flat files, so if you want to load data from FoxPro or any other database, you first have to convert that data into a delimited-format or fixed-length-format flat file, and then use SQL*Loader to load the data into Oracle.
The following is the procedure to load data from a third-party database into Oracle using SQL*Loader.
  1. Convert the data into a flat file using a third-party database command.
  2. Create the table structure in the Oracle database using appropriate datatypes.
  3. Write a control file describing how to interpret the flat file and the options for loading the data.
  4. Execute the SQL*Loader utility, specifying the control file as a command line argument.
To understand this better, let us look at the following case study.
Suppose you have a table named EMP in MS-ACCESS, running under Windows, with the following structure:
            EMPNO          INTEGER
            NAME           TEXT(50)
            SAL            CURRENCY
            JDATE          DATE
This table contains some 10,000 rows. Now you want to load the data from this table into an Oracle table. The Oracle database is running on Linux.
Solution
Steps
  1. Start MS-ACCESS and convert the table into a comma-delimited flat file (popularly known as CSV) by clicking the File/Save As menu. Let the delimited file be named emp.csv.
  2. Now transfer this file to the Linux server using the FTP command:
    1. Go to the command prompt in Windows.
    2. At the command prompt, type FTP followed by the IP address of the server running Oracle.
FTP will then prompt you for a username and password to connect to the Linux server. Supply a valid username and password of the Oracle user on the Linux server.
For example:-
C:\>ftp 200.200.100.111
Name: oracle
Password:oracle
FTP>
    3. Now give the PUT command to transfer the file from the current Windows machine to the Linux machine.
FTP>put
Local file:C:\>emp.csv
remote-file:/u01/oracle/emp.csv
File transferred in 0.29 Seconds
FTP>
    4. After the file is transferred, quit the FTP utility by typing the bye command.
FTP>bye
Good-Bye
  3. Now go to the Linux machine and create a table in Oracle with the same structure as in MS-ACCESS, choosing appropriate datatypes. For example, create a table like this:
$sqlplus scott/tiger
SQL>CREATE TABLE emp (empno number(5),
                      name  varchar2(50),
                      sal   number(10,2),
                      jdate date);
  4. After creating the table, you have to write a control file describing the actions SQL*Loader should perform. You can use any text editor to write the control file. Now let us write a control file for our case study:
$vi emp.ctl
1        LOAD DATA
2        INFILE '/u01/oracle/emp.csv'
3        BADFILE '/u01/oracle/emp.bad'
4        DISCARDFILE '/u01/oracle/emp.dsc'
5        INSERT INTO TABLE emp
6        FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
7        (empno, name, sal, jdate date 'mm/dd/yyyy')
Notes:
(Do not type the line numbers; they are shown for explanation only.)
1.       The LOAD DATA statement is required at the beginning of the control file.
2.       The INFILE option specifies where the input file is located.
3.       Specifying BADFILE is optional. If you specify it, bad records found during loading will be stored in this file.
4.       Specifying DISCARDFILE is optional. If you specify it, records which do not meet a WHEN condition will be written to this file.
5.       You can use any of the following loading options:
1.       INSERT: Loads rows only if the target table is empty.
2.       APPEND: Loads rows whether the target table is empty or not.
3.       REPLACE: First deletes all the rows in the existing table and then loads the rows.
4.       TRUNCATE: First truncates the table and then loads the rows.
6.       This line indicates how the fields are separated in the input file. Since in our case the fields are separated by commas, we specified "," as the terminating character for fields. You can replace this with any character that is used to terminate fields; some popularly used terminating characters are the semicolon ";", colon ":", and pipe "|". TRAILING NULLCOLS means that missing trailing columns are treated as NULL; without it, SQL*Loader would treat a record with missing trailing columns as bad.
7.       In this line, specify the columns of the target table. Note how the format for DATE columns is specified.
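For illustration, the field-splitting behavior described in lines 6 and 7 of the control file can be sketched in Python with the standard csv module. This is only a sketch of the parsing rules, not SQL*Loader itself, and the sample record below is made up:

```python
import csv
import io
from datetime import datetime

# A sample record in the layout the control file describes:
# comma-separated fields, optionally enclosed in double quotes.
record = '101,"Smith, John",4500.50,10/18/2011'

# csv.reader behaves like FIELDS TERMINATED BY "," OPTIONALLY
# ENCLOSED BY '"': the quoted field keeps its embedded comma.
empno, name, sal, jdate = next(csv.reader(io.StringIO(record)))

row = {
    "empno": int(empno),
    "name": name,
    "sal": float(sal),
    # jdate date 'mm/dd/yyyy' in the control file maps to this mask:
    "jdate": datetime.strptime(jdate, "%m/%d/%Y").date(),
}
print(row)
```

Note how the enclosing quotes let the name field contain the delimiter character itself, which is why OPTIONALLY ENCLOSED BY is useful for free-text columns.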
  5. After you have written the control file, save it and then call the SQL*Loader utility by typing the following command:
$sqlldr userid=scott/tiger control=emp.ctl log=emp.log
After you execute the above command, SQL*Loader shows output describing how many rows it has loaded.
The LOG option of sqlldr specifies where the log file of this SQL*Loader session should be created. The log file records all the actions SQL*Loader performed: how many rows were loaded, how many were rejected, how much time was taken to load the rows, and so on. You should check this file for any errors encountered while running SQL*Loader.
Suppose we have a fixed-length format file containing employee data, as shown below, and we want to load this data into an Oracle table.
7782 CLARK      MANAGER   7839  2572.50          10
7839 KING       PRESIDENT       5500.00          10
7934 MILLER     CLERK     7782   920.00          10
7566 JONES      MANAGER   7839  3123.75          20
7499 ALLEN      SALESMAN  7698  1600.00   300.00 30
7654 MARTIN     SALESMAN  7698  1312.50  1400.00 30
7658 CHAN       ANALYST   7566  3450.00          20
7654 MARTIN     SALESMAN  7698  1312.50  1400.00 30

SOLUTION:
Steps:
1.      First open the file in a text editor and note the positions of the fields. For example, in our fixed-length file, the employee number runs from position 1 to position 4, the employee name from position 6 to position 15, and the job name from position 17 to position 25. The other columns are located similarly.
2.      Create a table in Oracle with any name, but its columns should match those specified in the fixed-length file. In our case, give the following command to create the table.


SQL> CREATE TABLE emp (empno  NUMBER(5),
                       name   VARCHAR2(20),
                       job    VARCHAR2(10),
                       mgr    NUMBER(5),
                       sal    NUMBER(10,2),
                       comm   NUMBER(10,2),
                       deptno NUMBER(3) );
3.      After creating the table, write a control file using any text editor:
$vi empfix.ctl
1)   LOAD DATA
2)   INFILE '/u01/oracle/fix.dat'
3)   INTO TABLE emp
4)   (empno         POSITION(01:04)   INTEGER EXTERNAL,
       name         POSITION(06:15)   CHAR,
       job          POSITION(17:25)   CHAR,
       mgr          POSITION(27:30)   INTEGER EXTERNAL,
       sal          POSITION(32:39)   DECIMAL EXTERNAL,
       comm         POSITION(41:48)   DECIMAL EXTERNAL,
5)   deptno         POSITION(50:51)   INTEGER EXTERNAL)

Notes:
(Do not type the line numbers; they are shown for explanation only.)
1.       The LOAD DATA statement is required at the beginning of the control file.
2.       The name of the file containing data follows the INFILE parameter.
3.       The INTO TABLE statement is required to identify the table to be loaded into.
4.       Lines 4 and 5 identify a column name and the location of the data in the datafile to be loaded into that column. empno, name, job, and so on are names of columns in table emp. The datatypes (INTEGER EXTERNAL, CHAR, DECIMAL EXTERNAL) identify the datatype of data fields in the file, not of corresponding columns in the emp table.
5.       Note that the set of column specifications is enclosed in parentheses.

4.      After saving the control file, start the SQL*Loader utility by typing the following command.

$sqlldr userid=scott/tiger control=empfix.ctl log=empfix.log direct=y
After you execute the above command, SQL*Loader shows output describing how many rows it has loaded.
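The way SQL*Loader interprets the POSITION clauses can be sketched in Python: POSITION(m:n) is 1-based and inclusive, so it corresponds to the slice line[m-1:n]. This is purely illustrative (not SQL*Loader itself), using the first record of the sample file:

```python
# Slice one record of the fixed-length file exactly as the POSITION
# clauses in empfix.ctl describe. POSITION(m:n) is 1-based and
# inclusive, so it maps to the Python slice [m-1:n].
line = "7782 CLARK      MANAGER   7839  2572.50          10"

layout = {            # (start, end) as written in the control file
    "empno":  (1, 4),
    "name":   (6, 15),
    "job":    (17, 25),
    "mgr":    (27, 30),
    "sal":    (32, 39),
    "comm":   (41, 48),
    "deptno": (50, 51),
}

# Strip the blank padding; an all-blank field (here, comm) becomes None,
# much as SQL*Loader would load NULL for it.
row = {col: line[m - 1:n].strip() or None for col, (m, n) in layout.items()}
print(row)
```

This also shows why step 1 above (counting the field positions carefully) matters: an off-by-one in any POSITION range silently shifts data into the wrong column.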
You can simultaneously load data into multiple tables in the same session. You can also use a WHEN condition to load only those rows which meet a particular condition (only equal-to "=" and not-equal-to "<>" comparisons are allowed).
For example, suppose we have a fixed-length file as shown below:
7782 CLARK      MANAGER   7839  2572.50          10
7839 KING       PRESIDENT       5500.00          10
7934 MILLER     CLERK     7782   920.00          10
7566 JONES      MANAGER   7839  3123.75          20
7499 ALLEN      SALESMAN  7698  1600.00   300.00 30
7654 MARTIN     SALESMAN  7698  1312.50  1400.00 30
7658 CHAN       ANALYST   7566  3450.00          20
7654 MARTIN     SALESMAN  7698  1312.50  1400.00 30

Now we want to load all employees whose deptno is 10 into the emp1 table and those whose deptno is not equal to 10 into the emp2 table. To do this, first create the tables emp1 and emp2 with appropriate columns and datatypes. Then write a control file as shown below:
$vi emp_multi.ctl
Load Data
infile '/u01/oracle/empfix.dat'
append into table scott.emp1
WHEN (deptno='10')
  (empno        POSITION(01:04)   INTEGER EXTERNAL,
   name         POSITION(06:15)   CHAR,
   job          POSITION(17:25)   CHAR,
   mgr          POSITION(27:30)   INTEGER EXTERNAL,
   sal          POSITION(32:39)   DECIMAL EXTERNAL,
   comm         POSITION(41:48)   DECIMAL EXTERNAL,
   deptno       POSITION(50:51)   INTEGER EXTERNAL)
    INTO TABLE scott.emp2
  WHEN (deptno<>'10')
  (empno        POSITION(01:04)   INTEGER EXTERNAL,
   name         POSITION(06:15)   CHAR,
   job          POSITION(17:25)   CHAR,
   mgr          POSITION(27:30)   INTEGER EXTERNAL,
   sal          POSITION(32:39)   DECIMAL EXTERNAL,
   comm         POSITION(41:48)   DECIMAL EXTERNAL,
   deptno       POSITION(50:51)   INTEGER EXTERNAL)

After saving the file emp_multi.ctl, run sqlldr:
$sqlldr userid=scott/tiger control=emp_multi.ctl
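The routing that the two WHEN clauses perform can be sketched in Python (illustrative only; the sample lines are taken from the fixed-length file above):

```python
# Route each record to emp1 or emp2 the way the two WHEN clauses in
# emp_multi.ctl do: deptno = '10' goes to emp1, anything else to emp2.
lines = [
    "7782 CLARK      MANAGER   7839  2572.50          10",
    "7566 JONES      MANAGER   7839  3123.75          20",
    "7499 ALLEN      SALESMAN  7698  1600.00   300.00 30",
]

emp1, emp2 = [], []
for line in lines:
    deptno = line[49:51]          # POSITION(50:51), 1-based inclusive
    (emp1 if deptno == "10" else emp2).append(line)

print(len(emp1), len(emp2))
```

Every input record matches exactly one of the two conditions here, so nothing lands in the discard file; with a narrower WHEN condition, non-matching rows would be discarded instead.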
SQL*Loader can load the data into the Oracle database using the conventional path method or the direct path method. You can specify the method by using the DIRECT command line option. If you give DIRECT=TRUE, then SQL*Loader will use direct path loading; if you omit this option or specify DIRECT=FALSE, then SQL*Loader will use the conventional path loading method.
Conventional Path
Conventional path load (the default) uses the SQL INSERT statement and a bind array buffer to load data into database tables.
When SQL*Loader performs a conventional path load, it competes equally with all other processes for buffer resources. This can slow the load significantly. Extra overhead is added as SQL statements are generated, passed to Oracle, and executed.
The Oracle database looks for partially filled blocks and attempts to fill them on each insert. Although appropriate during normal use, this can slow bulk loads dramatically.
In direct path loading, Oracle does not use SQL INSERT statements to load rows. Instead, it writes the rows directly into fresh blocks beyond the high-water mark in the datafiles; that is, it does not scan for free blocks below the high-water mark. Direct path loading is very fast because:
  • Partial blocks are not used, so no reads are needed to find them, and fewer writes are performed.
  • SQL*Loader need not execute any SQL INSERT statements; therefore, the processing load on the Oracle database is reduced.
  • A direct path load calls on Oracle to lock tables and indexes at the start of the load and releases them when the load is finished. A conventional path load calls Oracle once for each array of rows to process a SQL INSERT statement.
  • A direct path load uses multiblock asynchronous I/O for writes to the database files.
  • During a direct path load, processes perform their own write I/O, instead of using Oracle's buffer cache. This minimizes contention with other Oracle users.
The following conditions must be satisfied for you to use the direct path load method:
  • Tables are not clustered.
  • Tables to be loaded do not have any active transactions pending.
Direct path loading cannot be used in situations such as:
  • Loading a parent table together with a child table.
  • Loading BFILE columns.
Starting with Oracle 10g, Oracle introduced an enhanced version of the EXPORT and IMPORT utilities known as DATA PUMP. Data Pump is similar to the EXPORT and IMPORT utilities, but it has many advantages. Some of the advantages are:
  • Most Data Pump export and import operations occur on the Oracle database server; i.e., all the dump files are created on the server even if you run the Data Pump utility from a client machine. This results in increased performance because data is not transferred through the network.
  • You can stop and restart export and import jobs. This is particularly useful if you have started an export or import job and, after some time, you need to do some other urgent work.
  • The ability to detach from and reattach to long-running jobs without affecting the job itself. This allows DBAs and other operations personnel to monitor jobs from multiple locations.
  • The ability to estimate how much space an export job would consume, without actually performing the export.
  • Support for an interactive-command mode that allows monitoring of and interaction with ongoing jobs.
