The multi-table insert is one of those great features that benefits developers doing ETL-like processing. Basically, you select the data from the data store once and insert it into multiple tables.
Let us look at an example.
SQL> select count(*) from all_objects;
COUNT(*)
----------
69495
SQL> Create table t1 as Select object_id , object_name from all_objects Where 1 = 2;
Table created.
SQL> Create table t2 as Select object_id , object_name from all_objects Where 1 = 2;
Table created.
SQL> Insert
2 All
3 When (Object_Type) <> 'TABLE' Then
4 Into T1
5 (Object_Id , Object_Name )
6 Values
7 (500 , Object_Name )
8 When (Object_Type) = 'TABLE' Then
9 Into T2
10 (Object_Id , Object_Name )
11 Values
12 (600 , Object_Name )
13 Select Object_Name , Object_Type
14 From All_Objects;
69497 rows created.
SQL> select count(*) from t1;
COUNT(*)
----------
66604
SQL> select count(*) from t2;
COUNT(*)
----------
2893
As you can see, we loaded the data from all_objects into two different tables, t1 and t2, in a single pass. (The statement created 69497 rows, two more than the initial count of 69495, because all_objects itself grew by the t1 and t2 entries created after that count was taken; 66604 + 2893 = 69497.)
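One variant worth noting: the example above uses Insert All, which evaluates every When clause for each row, so a row can be routed into more than one table when the conditions overlap. Insert First stops at the first matching When clause, and an Else branch catches anything left over. A minimal sketch against the same t1 and t2 (written as a plain statement rather than a captured run):
Insert First
  When Object_Type = 'TABLE' Then
    Into T2 (Object_Id , Object_Name)
    Values (600 , Object_Name)
  Else -- rows that matched no When clause above
    Into T1 (Object_Id , Object_Name)
    Values (500 , Object_Name)
Select Object_Name , Object_Type
From All_Objects;
Here the two branches happen to be mutually exclusive, so All and First would load the same rows; the difference only shows up when the When conditions overlap.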
In my tenure with Barnes & Noble, I used this feature (along with the error-logging feature, which I will cover in the next few posts) heavily, thereby reducing data load times from hours to minutes.