ODC Appreciation Day: Database In-Memory

This year I have been working mostly with Oracle Database In-Memory and, so far, I have been impressed. As part of a project to move an analytical database to the cloud, we decided to use the In-Memory option. So far so good – it just works. Once the option is enabled, a simple command loads a table into memory:

Alter table w_expense_f inmemory priority critical;

The columnar compression has worked really well for us – we have managed to get up to 85% compression on some tables. We are currently testing the performance against an Exalytics system as well as TimesTen. So, watch this space.
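Once the table has been populated, a rough way to see the population status and the space saved is to query v$im_segments – this is only a minimal sketch (which columns you care about may vary):

sqlplus / as sysdba <<'EOF'
-- Population status and a rough compression figure for the table loaded above
SELECT segment_name,
       populate_status,
       bytes AS on_disk_bytes,
       inmemory_size,
       ROUND((1 - inmemory_size / bytes) * 100, 1) AS pct_space_saved
FROM   v$im_segments
WHERE  segment_name = 'W_EXPENSE_F';
EOF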

OUG Ireland & Promotion to Oracle Ace Director

I set out on Thursday at 4am to catch a 6.30am flight to Dublin to attend the OUG Ireland Conference, as I had a session to present after lunch. This conference has grown over the years and I am sure the number of attendees is increasing every year. That's a credit to the organisers, UKOUG, for the organisation and the quality of the talks. It was great seeing everyone - speakers and delegates.
User group conferences are a great way to learn old and new technologies - you can never know everything. What I enjoy about speaking at these conferences is that I get to attend other sessions and learn something new 😃.
I missed the first few sessions as I was on a Sev 1 WebEx with Oracle Support over the conference WiFi, and it was funny seeing Philippe also on a similar call. Work continues no matter where you are 😱. Once that was over I mingled with everyone, then went to Mike's talk on 'Upgrading database to 12.2' - nice to see they are making upgrades easier. 😉.
After lunch it was my talk on the DataSync tool with a live demo - it was the first outing for this new talk, but it went OK.
Later I went to Christian's talk on 'Oracle Analytics and EPM' - it was interesting - I think he is a secret accountant 😉.
As the day came to a close, it was time for the social event. This is where we had some nibbles, drinks and jokes, as well as chocolate tasting and some gin that Heli had brought with her. Once the social was over, some of the speakers went out for the customary dinner together :). We went to a nice restaurant called the Rustic Stone.
Once everyone was seated, Brendan stood up and said he had a special announcement - a promotion. The Ace Program had promoted me to 'Oracle Ace Director'. It had been a long day, but one I will never forget, surrounded by great friends to share the moment 😃. This new way of announcing promotions is a nice touch. I couldn't attend the 2nd day as I had to be at the Oracle Dublin offices - working with development on resolving some work issues that needed attention and presenting on 'Analytics and Data in the Cloud' to some of the staff.
In the end, an awesome conference, and I hope to return again...

Copying an existing configuration for Cloud DataSync

We are assuming you have already installed DataSync and you now want to copy an existing configuration that you had already set up on another machine.

  1. Copy the entire folder of the existing DataSync installation whose configuration you want to reuse, and rename the copied folder with an _Old suffix.
  2. Go to the newly installed DataSync folder.
  3. Double-click datasync.bat.

  4. Select Next.
  5. Select Next.
  6. Select the folder where the configuration exists that you want copied, then select OK.
  7. Select Yes.
  8. Select Next.

That's it - you have copied a previously configured DataSync to your new environment.

Oracle Management Cloud (OMC) Part 2 – Monitoring Database

OMC Monitoring an On-Premises Database

In my previous blog, we looked at monitoring an on-premises Linux server (agent install). In this blog we will add monitoring of the database to the agent. For this we need two files:

 

File 1 (a credential file for the Oracle database - you will need to update the password to what you have set in your environment): omc_oracle_db_creds.json

{
    "credentials" : [
        {
            "id" : "SQLCreds",
            "name" : "SQLCreds",
            "credType" : "DBCreds",
            "properties" : [
                {
                    "name" : "DBUserName",
                    "value" : "CLEAR[dbsnmp]"
                },
                {
                    "name" : "DBPassword",
                    "value" : "CLEAR[w1ll0w]"
                },
                {
                    "name" : "DBRole",
                    "value" : "CLEAR[Normal]"
                }
            ]
        }
    ]
}

 

File 2 (the agent configuration file - you will need to update the entity name, host name, port, SID and time zone to match your environment): oracle_agent_add.json

{
    "entities": [{
        "name": "C_stest1",
        "type": "omc_oracle_db",
        "displayName": "stest1",
        "timezoneRegion": "BST",
        "credentialRefs": ["SQLCreds"],
        "properties": {
            "host_name": {
                "displayName": "host_name",
                "value": "e33392"
            },
            "port": {
                "displayName": "port",
                "value": "1521"
            },
            "sid": {
                "displayName": "sid",
                "value": "stest1"
            },
            "capability": {
                "displayName": "capability",
                "value": "monitoring"
            }
        }
    }]
}

Let's get the database monitored. We register the new entity with the agent using the two files above and then check the status of the upload; a sketch of the omcli commands is below.
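This is only a minimal sketch, run from the agent's bin directory; the JSON file locations are an assumption (here placed under /cima1/oms/agentsw), and the exact omcli options can vary by agent version:

cd /cima1/oms/agent/agent_inst/bin
# Register the database entity defined in oracle_agent_add.json, passing the credential file
./omcli add_entity agent /cima1/oms/agentsw/oracle_agent_add.json -credential_file /cima1/oms/agentsw/omc_oracle_db_creds.json
# Check that the new entity has been added and is uploading
./omcli status_entity agent /cima1/oms/agentsw/oracle_agent_add.json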
That's all. Now if we log onto OMC and drill into Infrastructure Monitoring, we can see the server and our database being monitored. We can drill into the database to see Top SQL and Top Wait Events, as well as the tablespace usage!

Oracle Management Cloud (OMC) – Installing Agent

Pre-Requisites

Once you have received all your credentials for OMC, you will be able to log into OMC and look at your account details. The first thing we want to do is get the Master Installer. In OMC:
  1. On the navigation bar, click the navigator icon, and then click Oracle Management Cloud Agents.
  2. On the Oracle Management Cloud Agents page, click Download on the left navigation pane.
  3. Click the Learn more icon to access other information resources.
  4. Click the Download icon to download the master installer.
Also from this section, you can get the registration key that you will need with the installer. If you do not have one, just create one.

On-Premises Linux Database Server

Step 1 Install the agent

[x33245 oms]$ cd agentsw

[x33245 agentsw]$ ls

AgentInstall.zip

[x33245 agentsw]$ unzip AgentInstall.zip

Archive: AgentInstall.zip

  inflating: AgentInstall.sh

  inflating: AgentInstall.bat

  inflating: README

[x33245 agentsw]$ pwd

/cima1/oms/agentsw

[x33245 agentsw]$ mkdir temp

[x33245 agentsw]$ pwd

/cima1/oms/agentsw

[x33245 agentsw]$ ls

AgentInstall.bat  AgentInstall.sh  AgentInstall.zip  README temp

[x33245 agentsw]$ ./AgentInstall.sh AGENT_TYPE=cloud_agent STAGE_LOCATION=./temp AGENT_REGISTRATION_KEY=<ENTER REGKEY> -download_only

WARNING: The current cURL version 7.15.5 does not support TLS1.2 protocol; the downloader script will attempt to continue but may not work. Please install 7.49.1 or later versions of cURL for error-free execution.

Downloading lama agent software ...

Downloading AgentInstall.sh ...

[x33245 agentsw]$

[x33245 agentsw]$ cd temp

[x33245 temp]$ ls

AgentInstall.sh  lama.zip

[x33245 temp]$ ./AgentInstall.sh AGENT_TYPE=cloud_agent AGENT_REGISTRATION_KEY=<ENTER REGKEY> AGENT_BASE_DIR=/cima1/oms/agent -staged -ignoreUlimitCheck

WARNING: The current cURL version 7.15.5 does not support TLS1.2 protocol; the downloader script will attempt to continue but may not work. Please install 7.49.1 or later versions of cURL for error-free execution.

Generating emaas.properties ...

Extracting Agent Software ...

Installing the Agent ...

Registering the Agent ...

Downloading Certificates ...

Configuring the Agent ...

Cleanup temporary files ...

The following configuration scripts need to be executed as the root user

#!/bin/sh

#Root script to run

/cima1/oms/agent/core/1.16.0/root.sh

[x33245 temp]$

[root@ x33245 temp]# /cima1/oms/agent/core/1.16.0/root.sh

/etc exist

Creating /etc/oragchomelist file...

[root@x33245 temp]#

  • We can check in OMC to see if the agent is now visible.

Step 2 Configure the Agent

/cima1/oms/agent/plugins/oracle.em.sgfm.zip/1.16.0/configs

[root@eux3300392 configs]# ls

discovery.properties  entityid.properties  explorer.properties

[root@eux3300392 configs]# vi discovery.properties

  • What we are doing here is enabling the monitoring of Linux by commenting out the relevant disable entry in this file.

[x33245 bin]$ pwd

/cima1/oms/agent/agent_inst/bin

[x33245 bin]$ ./omcli stop agent

Oracle Management Cloud Agent

Copyright (c) 1996, 2016 Oracle Corporation.  All rights reserved.

Stopping agent ... stopped.

[x33245 bin]$ ./omcli start agent

Oracle Management Cloud Agent

Copyright (c) 1996, 2016 Oracle Corporation.  All rights reserved.

Starting agent .............. started.
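As a quick local check before heading to the console, you can also ask the agent for its status - a simple sketch from the same bin directory:

cd /cima1/oms/agent/agent_inst/bin
# Show whether the agent is running and able to upload to OMC
./omcli status agent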

  • If we now log into OMC, we can see the on-premises server being monitored.

Using EXPDP to export table data based on a filter against a different table using ku$


Scenario – we want to export table data based on a filter against a different table. For example, we have table WC_CAMP_HIST_FX and we only want to export the rows whose ROW_WID exists in table W_CAMP_HIST_F, and only where the W_CAMP_HIST_F data is less than a year old. In this situation we can use the 'ku$' alias in the QUERY filter; this is standard Data Pump syntax, so it works in the cloud as well as on-premises.

 

Solution

 

1. First we create a par file; in the QUERY clause we refer to the table being exported using ku$.

 

$ vi WC_CAMP_HIST_FX.par

userid="aaa/bbb"
directory=TTS_EXP
dumpfile=WC_CAMP_HIST_FX.dmp
logfile=WC_CAMP_HIST_FX.txt
tables=WC_CAMP_HIST_FX
query=WC_CAMP_HIST_FX:"where exists (select 1 from W_CAMP_HIST_F f where ku$.row_wid = f.ROW_WID and f.created_dt_wid >= 20161001)"

 

2. Execute the expdp.

 

$ nohup expdp parfile=WC_CAMP_HIST_FX.par &

 

Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS

. . exported "SIEBEL"."WC_CAMP_HIST_FX"                  3.163 GB 23611775 rows

Master table "SIEBEL"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded

******************************************************************************

Dump file set for SIEBEL.SYS_EXPORT_TABLE_01 is:

  /backup/export/WC_CAMP_HIST_FX.dmp

Job "SIEBEL"."SYS_EXPORT_TABLE_01" successfully completed at Tue Apr 25 13:50:51 2017 elapsed 0 00:28:09

 

Cloud – RMAN-06026: some targets not found – aborting restore

RMAN Failures when database tooling has been updated

   

When checking the backup status on an Oracle Database Cloud Service server, I noticed the following errors:

RMAN-06026: some targets not found - aborting restore

RMAN-08526: channel ORA_SBT_TAPE_5: Oracle Database Backup Service Library VER=3.16.9.21

RMAN-00571: ===========================================================

RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============

RMAN-00571: ===========================================================

RMAN-03002: failure of restore command at 03/19/2017 16:49:39

RMAN-06026: some targets not found - aborting restore

RMAN-06025: no backup of archived log for thread 1 with sequence 1332 and starting SCN of 416745785 found to restore

RMAN-06025: no backup of archived log for thread 1 with sequence 1331 and starting SCN of 416745691 found to restore

RMAN-06025: no backup of archived log for thread 1 with sequence 1330 and starting SCN of 416633416 found to restore

RMAN-06025: no backup of archived log for thread 1 with sequence 1329 and starting SCN of 416632713 found to restore

RMAN-06025: no backup of archived log for thread 1 with sequence 1328 and starting SCN of 415836751 found to restore

RMAN-06025: no backup of archived log for thread 1 with sequence 1327 and starting SCN of 414639947 found to restore

Solution

Why did this happen? Oracle periodically updates the cloud tooling, and when they do you can see the above error. In this case, updating the cloud tooling fixed the error. To check or update the version of the cloud tooling, you can follow the steps on the Oracle website.
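As a rough sketch - assuming the standard dbaastools RPM packaging on the Database Cloud Service VM - you can see which tooling version is currently installed before deciding whether it needs updating:

# Run as root (or via sudo) on the cloud database server
rpm -qa | grep -i dbaastools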

Using Oracle Database Cloud Service