Oracle export from Oracle 7.3 (AIX) to Oracle 11g (Linux)

If you have an old IBM AIX server you will know that there is a limit on the size of a single file. What do you do if the database export is much bigger than that limit? In the example below we split the export into 1 GB chunks as it is written, using named pipes.

---------- Export Section

L1:/home5/dump > mknod /tmp/split_pipe p

L1:/home5/dump > mknod /tmp/exp.pipe p

mknod: /tmp/exp.pipe: Do not specify an existing file.

(The pipe was left over from a previous run; the error is harmless and the existing pipe is reused.)

L1:/home5/dump > nohup split -b1000m /tmp/split_pipe /home5/dump/x &

(Note that split does not write to standard output; it reads the pipe and writes 1 GB chunks named xaa, xab, ... under /home5/dump.)

[1]     18820

L1:/home5/dump > nohup compress < /tmp/exp.pipe > /tmp/split_pipe &

[2]     16918

L1:/home5/dump > nohup exp system/12345 file=/tmp/exp.pipe full=Y log=/home5/dump/exp_live.txt &

[3]     19830

L1:/home5/dump > Sending nohup output to nohup.out.
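The three background jobs form a pipeline: exp writes into /tmp/exp.pipe, compress reads that pipe and writes into /tmp/split_pipe, and split cuts the compressed stream into fixed-size chunks. Here is a minimal, self-contained sketch of the same mechanism, with dummy data standing in for exp and gzip standing in for compress (compress(1) is often absent on modern systems; all file names here are made up):

```shell
#!/bin/sh
# Sketch of the exp -> compress -> split named-pipe pipeline.
set -e
dir=$(mktemp -d)
mkfifo "$dir/exp.pipe" "$dir/split.pipe"      # same role as mknod ... p

seq 1 50000 > "$dir/orig"                     # stand-in for the exp dump

# Reader end: cut the compressed stream into 64 KB chunks (chunk.aa, ...).
split -b 64k "$dir/split.pipe" "$dir/chunk." &
# Middle of the pipeline: compress whatever arrives on exp.pipe.
gzip -c < "$dir/exp.pipe" > "$dir/split.pipe" &
# Writer end: this is where "exp ... file=/tmp/exp.pipe" would sit.
cat "$dir/orig" > "$dir/exp.pipe"
wait

# Reassemble the chunks in name order and check the round trip.
cat "$dir/chunk."* | gunzip -c | cmp -s - "$dir/orig" && echo "round-trip OK"
```

Nothing ever lands on disk uncompressed or as one oversized file, which is the whole point on a filesystem with a small file-size limit.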

---------- Import Section

# The import script assumes in this example that the above export created two split files,

# called xaa and xab. The order of the files for the cat command is very important.

Copy the files over to the Linux machine.

mknod /tmp/split_pipe p

(The pipe is not actually used in this example, since we reassemble into a regular file below.)

cat /cima1/backup/export/infobase/xaa /cima1/backup/export/infobase/xab > /cima1/backup/export/infobase/L1.dmp.Z

uncompress L1.dmp.Z

SQL> CREATE DIRECTORY aix AS '/backup/export/aix';

Directory created.

SQL> GRANT read, write ON DIRECTORY aix TO starsky;

Grant succeeded.

imp starsky/12345 full=y file=L1.dmp show=y log=import.sql

(show=y only writes the contents of the dump to the log without importing anything; once you are happy with it, rerun without show=y to do the actual import.)
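Because the chunks must be concatenated in exactly the order split wrote them, it is worth sanity-checking the reassembled file before running imp. A small sketch of that check, using dummy data and made-up names in place of the real dump:

```shell
#!/bin/sh
# Sketch: verify that cat reassembles the chunks into the original stream.
set -e
dir=$(mktemp -d)
seq 1 60000 | gzip -c > "$dir/original.Z"     # pretend compressed dump
( cd "$dir" && split -b 32k original.Z )      # produces xaa, xab, ...

# The shell expands xa* in name order, which is exactly the split order.
cat "$dir"/xa* > "$dir/L1.dmp.Z"
cmp -s "$dir/L1.dmp.Z" "$dir/original.Z" && echo "reassembly OK"
# Reversing the order (cat xab xaa > ...) would corrupt the stream, and
# uncompress/gunzip would fail on it.
```

In the real transfer you cannot cmp against the original, but you can compare a checksum (e.g. cksum) taken on the AIX side before copying.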

Conclusion

That's it: you have exported a database from AIX by splitting the file on the fly, moved the chunks to Linux, reassembled them, and imported the result into an Oracle 11g database.

V$BACKUP_DATAFILE Shows Corruptions for File #0

Just saw the following in some database backups:

SQL> select recid, file#, MARKED_CORRUPT, MEDIA_CORRUPT,LOGICALLY_CORRUPT,COMPLETION_TIME from v$backup_datafile where logically_corrupt = 1;

All of the flagged rows were for file# 0, which is the control file.

Querying all the backup records for that file shows that only some of them carry the flag:

SQL> select recid, file#, MARKED_CORRUPT, MEDIA_CORRUPT,LOGICALLY_CORRUPT,COMPLETION_TIME from v$backup_datafile where file#=0;

 

The actual backup logs contain no errors, and there is nothing in the alert log file either. It turns out this is expected behaviour if you see it:

The non-zero values do not indicate a corruption in the controlfile(s); instead it is a result of the underlying fields being used by RMAN for timestamp and sequence information for autobackups, i.e. this is intended behavior.

TimesTen import from Oracle utility

The other day, someone wanted to load some Oracle tables into a TimesTen database for testing; here is a great quick way to do it:

1)     Make sure you have a connection to your Oracle database in the tnsnames.ora that TimesTen uses. I have the following entry in tnsnames.ora under /u01/app/oracle/product/TimesTen/tt1122/network/admin/samples:

MYDB =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = XXX)(PORT = 1521))
    (CONNECT_DATA =
      (SERVICE_NAME = live)
    )
  )

2)     Set your environment

TNS_ADMIN=/u01/app/oracle/product/TimesTen/tt1122/network/admin/samples
export TNS_ADMIN

cd /u01/app/oracle/product/TimesTen/tt1122/ttoracle_home/instantclient_11_2

./sqlplus read/only@mydb

(read/only is the example username/password; this step just verifies that the Oracle connection works.)

3)     Create a directory to place the files that will be generated for the import, and cd to the directory.

4)     Run the following command:

ttImportFromOracle -oraConn user/oracle@myoracle -tables my_small_tab -typeMap 2,1 -prefix tm1

where myoracle is the host:port/service of the Oracle database, and tm1 is the prefix for the files that will be generated.

5)     Now add MYDB to the TimesTen settings:

cd /u01/app/oracle/product/TimesTen/tt1122/info

vi sys.odbc.ini

Add the following to your TimesTen data store entry; ours is called TT_AGGR_STORE. The line that ties it back to Oracle is oraclenetservicename, which must match the tnsnames.ora entry from step 1:

[TT_AGGR_STORE]

Driver=/u01/app/oracle/product/fmw/../TimesTen/tt1122/lib/libtten.so

DataStore=/u01/app/oracle/product/fmw/../aggregate_store/tt_aggr_store

LogDir=/u01/app/oracle/product/fmw/../aggregate_store/logs

DatabaseCharacterSet=AL32UTF8

ConnectionCharacterSet=AL32UTF8

LogFileSize=1024

LogBufMB=1024

LogBufParallelism=16

Preallocate=0

PermSize=400000

TempSize=200000

MemoryLock=4

CkptFrequency=30

CkptLogVolume=0

CkptRate=20

PrivateCommands=1

RecoveryThreads=40

oraclenetservicename=MYDB

6)     Restart TimesTen

cd /sbin

./sysctl -p

(sysctl -p reloads kernel parameters; it is only needed if you changed things such as shared memory limits to accommodate the large PermSize.)

ttDaemonAdmin -stop

ttDaemonAdmin -start

7)     In the example below, we import the table SIEBEL.WC_KPI_F_041174:

[oracle@CIMAlyticsSrv1 TT_INSTALLERS]$  ./ttImportFromOracle -oraConn read/only@8x.2xx.1xx.1x:1521/live -tables siebel.wc_KPI_F_041174 -typeMap 2,1 -prefix ncmp1

Beginning processing

Resolving any tablename wildcards

Eliminating any duplicate tables

Getting metadata from source

Generating database user list

Assigning TimesTen datatypes

Analyzing source tables

Analyzing table 'SIEBEL.WC_KPI_F_041174' ...

Estimating table sizes

Evaluating parallel data load

Generating output files

Finished processing

 

8)     The following files will be generated:

ncmp1CreateUsers.sql

ncmp1CreateTables.sql

ncmp1LoadData.sql

ncmp1CreateIndexes.sql

ncmp1UpdateStats.sql

9)     Connect to your TimesTen DB and run the above SQL files in the order listed:

[oracle@CIMAlyticsSrv1 bin]$ ttisql -connstr "dsn=TT_AGGR_STORE;uid=user;pwd=password;oraclepwd=opassword"

** opassword is the password for your Oracle instance; TimesTen needs it so that the LoadData step can pull the rows across from Oracle.

Command> @/home/oracle/TT_INSTALLERS/ncmp1CreateUsers.sql

..
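The five scripts have to run in the order from step 8 (users before tables, tables before data, data before indexes and stats). A sketch of driving them in a loop; the ttIsql line is left commented out since it needs a live TimesTen instance, and the prefix and connection string are the example values from above:

```shell
#!/bin/sh
# Run the ttImportFromOracle output files in dependency order.
set -e
prefix=ncmp1
for step in CreateUsers CreateTables LoadData CreateIndexes UpdateStats; do
    f="${prefix}${step}.sql"
    echo "running $f"
    # ttIsql -connStr "dsn=TT_AGGR_STORE;uid=user;pwd=password;oraclepwd=opassword" -f "$f"
done
```

Running them out of order fails in obvious ways: LoadData before CreateTables has nothing to insert into, and CreateTables before CreateUsers has no schema to own the tables.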