Michael Dinh

Michael T. Dinh, Oracle DBA

Using sshUserSetup.sh for Passwordless ssh

Sun, 2020-04-05 07:03

Quick, short, and sweet. I am creating a POC for a Data Guard configuration with multiple standbys, using/hacking Vagrant VirtualBox.

Being as lazy as I am and not liking to enter passwords, I use sshUserSetup.sh to set up passwordless ssh.

[oracle@ol7-121-dg1 ~]$ cd /u01/software/database/sshsetup/
[oracle@ol7-121-dg1 sshsetup]$


[oracle@ol7-121-dg1 sshsetup]$ ./sshUserSetup.sh -h
Please specify a valid and existing cluster configuration file.
Either user name or host information is missing
Usage ./sshUserSetup.sh -user <user name> [ -hosts "<space separated hostlist>" | -hostfile <absolute path of cluster configuration file> ] [ -advanced ]  [ -verify] [ -exverify ] [ -logfile <desired absolute path of logfile> ] [-confirm] [-shared] [-help] [-usePassphrase] [-noPromptPassphrase]
[oracle@ol7-121-dg1 sshsetup]$


[oracle@ol7-121-dg1 sshsetup]$ ./sshUserSetup.sh -user oracle -hosts "ol7-121-dg1 ol7-121-dg2 ol7-121-dg3" -noPromptPassphrase
The output of this script is also logged into /tmp/sshUserSetup_2020-04-05-11-53-56.log
Hosts are ol7-121-dg1 ol7-121-dg2 ol7-121-dg3
user is oracle
Platform:- Linux
Checking if the remote hosts are reachable
PING ol7-121-dg1.localdomain (192.168.56.101) 56(84) bytes of data.
64 bytes from ol7-121-dg1.localdomain (192.168.56.101): icmp_seq=1 ttl=64 time=0.016 ms
64 bytes from ol7-121-dg1.localdomain (192.168.56.101): icmp_seq=2 ttl=64 time=0.019 ms
64 bytes from ol7-121-dg1.localdomain (192.168.56.101): icmp_seq=3 ttl=64 time=0.036 ms
64 bytes from ol7-121-dg1.localdomain (192.168.56.101): icmp_seq=4 ttl=64 time=0.045 ms
64 bytes from ol7-121-dg1.localdomain (192.168.56.101): icmp_seq=5 ttl=64 time=0.041 ms

--- ol7-121-dg1.localdomain ping statistics ---
5 packets transmitted, 5 received, 0% packet loss, time 4293ms
rtt min/avg/max/mdev = 0.016/0.031/0.045/0.012 ms
PING ol7-121-dg2.localdomain (192.168.56.102) 56(84) bytes of data.
64 bytes from ol7-121-dg2.localdomain (192.168.56.102): icmp_seq=1 ttl=64 time=0.333 ms
64 bytes from ol7-121-dg2.localdomain (192.168.56.102): icmp_seq=2 ttl=64 time=0.657 ms
64 bytes from ol7-121-dg2.localdomain (192.168.56.102): icmp_seq=3 ttl=64 time=0.547 ms
64 bytes from ol7-121-dg2.localdomain (192.168.56.102): icmp_seq=4 ttl=64 time=0.539 ms
64 bytes from ol7-121-dg2.localdomain (192.168.56.102): icmp_seq=5 ttl=64 time=0.514 ms

--- ol7-121-dg2.localdomain ping statistics ---
5 packets transmitted, 5 received, 0% packet loss, time 4310ms
rtt min/avg/max/mdev = 0.333/0.518/0.657/0.104 ms
PING ol7-121-dg3.localdomain (192.168.56.103) 56(84) bytes of data.
64 bytes from ol7-121-dg3.localdomain (192.168.56.103): icmp_seq=1 ttl=64 time=0.356 ms
64 bytes from ol7-121-dg3.localdomain (192.168.56.103): icmp_seq=2 ttl=64 time=0.554 ms
64 bytes from ol7-121-dg3.localdomain (192.168.56.103): icmp_seq=3 ttl=64 time=0.463 ms
64 bytes from ol7-121-dg3.localdomain (192.168.56.103): icmp_seq=4 ttl=64 time=0.362 ms
64 bytes from ol7-121-dg3.localdomain (192.168.56.103): icmp_seq=5 ttl=64 time=0.472 ms

--- ol7-121-dg3.localdomain ping statistics ---
5 packets transmitted, 5 received, 0% packet loss, time 4517ms
rtt min/avg/max/mdev = 0.356/0.441/0.554/0.076 ms
Remote host reachability check succeeded.
The following hosts are reachable: ol7-121-dg1 ol7-121-dg2 ol7-121-dg3.
The following hosts are not reachable: .
All hosts are reachable. Proceeding further...
firsthost ol7-121-dg1
numhosts 3
The script will setup SSH connectivity from the host ol7-121-dg1.localdomain to all
the remote hosts. After the script is executed, the user can use SSH to run
commands on the remote hosts or copy files between this host ol7-121-dg1.localdomain
and the remote hosts without being prompted for passwords or confirmations.

NOTE 1:
As part of the setup procedure, this script will use ssh and scp to copy
files between the local host and the remote hosts. Since the script does not
store passwords, you may be prompted for the passwords during the execution of
the script whenever ssh or scp is invoked.

NOTE 2:
AS PER SSH REQUIREMENTS, THIS SCRIPT WILL SECURE THE USER HOME DIRECTORY
AND THE .ssh DIRECTORY BY REVOKING GROUP AND WORLD WRITE PRIVILEDGES TO THESE
directories.

Do you want to continue and let the script make the above mentioned changes (yes/no)?
yes

The user chose yes
User chose to skip passphrase related questions.
Creating .ssh directory on local host, if not present already
Creating authorized_keys file on local host
Changing permissions on authorized_keys to 644 on local host
Creating known_hosts file on local host
Changing permissions on known_hosts to 644 on local host
Creating config file on local host
If a config file exists already at /home/oracle/.ssh/config, it would be backed up to /home/oracle/.ssh/config.backup.
Creating .ssh directory and setting permissions on remote host ol7-121-dg1
THE SCRIPT WOULD ALSO BE REVOKING WRITE PERMISSIONS FOR group AND others ON THE HOME DIRECTORY FOR oracle. THIS IS AN SSH REQUIREMENT.
The script would create ~oracle/.ssh/config file on remote host ol7-121-dg1. If a config file exists already at ~oracle/.ssh/config, it would be backed up to ~oracle/.ssh/config.backup.
The user may be prompted for a password here since the script would be running SSH on host ol7-121-dg1.
Warning: Permanently added 'ol7-121-dg1,192.168.56.101' (ECDSA) to the list of known hosts.
oracle@ol7-121-dg1's password:
Done with creating .ssh directory and setting permissions on remote host ol7-121-dg1.
Creating .ssh directory and setting permissions on remote host ol7-121-dg2
THE SCRIPT WOULD ALSO BE REVOKING WRITE PERMISSIONS FOR group AND others ON THE HOME DIRECTORY FOR oracle. THIS IS AN SSH REQUIREMENT.
The script would create ~oracle/.ssh/config file on remote host ol7-121-dg2. If a config file exists already at ~oracle/.ssh/config, it would be backed up to ~oracle/.ssh/config.backup.
The user may be prompted for a password here since the script would be running SSH on host ol7-121-dg2.
Warning: Permanently added 'ol7-121-dg2,192.168.56.102' (ECDSA) to the list of known hosts.
oracle@ol7-121-dg2's password:
Done with creating .ssh directory and setting permissions on remote host ol7-121-dg2.
Creating .ssh directory and setting permissions on remote host ol7-121-dg3
THE SCRIPT WOULD ALSO BE REVOKING WRITE PERMISSIONS FOR group AND others ON THE HOME DIRECTORY FOR oracle. THIS IS AN SSH REQUIREMENT.
The script would create ~oracle/.ssh/config file on remote host ol7-121-dg3. If a config file exists already at ~oracle/.ssh/config, it would be backed up to ~oracle/.ssh/config.backup.
The user may be prompted for a password here since the script would be running SSH on host ol7-121-dg3.
Warning: Permanently added 'ol7-121-dg3,192.168.56.103' (ECDSA) to the list of known hosts.
oracle@ol7-121-dg3's password:
Done with creating .ssh directory and setting permissions on remote host ol7-121-dg3.
Copying local host public key to the remote host ol7-121-dg1
The user may be prompted for a password or passphrase here since the script would be using SCP for host ol7-121-dg1.
oracle@ol7-121-dg1's password:
Done copying local host public key to the remote host ol7-121-dg1
Copying local host public key to the remote host ol7-121-dg2
The user may be prompted for a password or passphrase here since the script would be using SCP for host ol7-121-dg2.
oracle@ol7-121-dg2's password:
Done copying local host public key to the remote host ol7-121-dg2
Copying local host public key to the remote host ol7-121-dg3
The user may be prompted for a password or passphrase here since the script would be using SCP for host ol7-121-dg3.
oracle@ol7-121-dg3's password:
Done copying local host public key to the remote host ol7-121-dg3
cat: /home/oracle/.ssh/known_hosts.tmp: No such file or directory
cat: /home/oracle/.ssh/authorized_keys.tmp: No such file or directory
SSH setup is complete.

------------------------------------------------------------------------
Verifying SSH setup
===================
The script will now run the date command on the remote nodes using ssh
to verify if ssh is setup correctly. IF THE SETUP IS CORRECTLY SETUP,
THERE SHOULD BE NO OUTPUT OTHER THAN THE DATE AND SSH SHOULD NOT ASK FOR
PASSWORDS. If you see any output other than date or are prompted for the
password, ssh is not setup correctly and you will need to resolve the
issue and set up ssh again.
The possible causes for failure could be:
1. The server settings in /etc/ssh/sshd_config file do not allow ssh
for user oracle.
2. The server may have disabled public key based authentication.
3. The client public key on the server may be outdated.
4. ~oracle or ~oracle/.ssh on the remote host may not be owned by oracle.
5. User may not have passed -shared option for shared remote users or
may be passing the -shared option for non-shared remote users.
6. If there is output in addition to the date, but no password is asked,
it may be a security alert shown as part of company policy. Append the
additional text to the <OMS HOME>/sysman/prov/resources/ignoreMessages.txt file.
------------------------------------------------------------------------
--ol7-121-dg1:--
Running /usr/bin/ssh -x -l oracle ol7-121-dg1 date to verify SSH connectivity has been setup from local host to ol7-121-dg1.
IF YOU SEE ANY OTHER OUTPUT BESIDES THE OUTPUT OF THE DATE COMMAND OR IF YOU ARE PROMPTED FOR A PASSWORD HERE, IT MEANS SSH SETUP HAS NOT BEEN SUCCESSFUL. Please note that being prompted for a passphrase may be OK but being prompted for a password is ERROR.
Sun Apr  5 11:54:28 UTC 2020
------------------------------------------------------------------------
--ol7-121-dg2:--
Running /usr/bin/ssh -x -l oracle ol7-121-dg2 date to verify SSH connectivity has been setup from local host to ol7-121-dg2.
IF YOU SEE ANY OTHER OUTPUT BESIDES THE OUTPUT OF THE DATE COMMAND OR IF YOU ARE PROMPTED FOR A PASSWORD HERE, IT MEANS SSH SETUP HAS NOT BEEN SUCCESSFUL. Please note that being prompted for a passphrase may be OK but being prompted for a password is ERROR.
Sun Apr  5 11:54:28 UTC 2020
------------------------------------------------------------------------
--ol7-121-dg3:--
Running /usr/bin/ssh -x -l oracle ol7-121-dg3 date to verify SSH connectivity has been setup from local host to ol7-121-dg3.
IF YOU SEE ANY OTHER OUTPUT BESIDES THE OUTPUT OF THE DATE COMMAND OR IF YOU ARE PROMPTED FOR A PASSWORD HERE, IT MEANS SSH SETUP HAS NOT BEEN SUCCESSFUL. Please note that being prompted for a passphrase may be OK but being prompted for a password is ERROR.
Sun Apr  5 11:54:28 UTC 2020
------------------------------------------------------------------------
SSH verification complete.
[oracle@ol7-121-dg1 sshsetup]$
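After the script finishes, a quick loop can re-verify connectivity independently. This is a hypothetical helper (not part of sshUserSetup.sh); BatchMode=yes makes ssh fail instead of prompting, so any remaining password prompt shows up as FAILED:

```shell
# Hypothetical helper to re-verify passwordless ssh after setup.
# BatchMode=yes forces ssh to fail rather than prompt for a password.
check_passwordless() {
  for h in "$@"; do
    if ssh -o BatchMode=yes -o ConnectTimeout=5 "$h" true 2>/dev/null; then
      echo "$h: passwordless OK"
    else
      echo "$h: FAILED (ssh would have prompted for a password)"
    fi
  done
}

# Example, using the hosts from the run above:
# check_passwordless ol7-121-dg1 ol7-121-dg2 ol7-121-dg3
```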

I wonder if Tim reads my blog?


Why Name Listener?!

Sun, 2020-04-05 06:48

Maybe I am too naive to know. If you have a reason, then please share.

With the following configuration, the environment can be migrated or duplicated with minimal or no change; the only change would be the local_listener port, if the port number changes.

It is also easier to use and maintain.

local_listener=(ADDRESS=(PROTOCOL=TCP)(HOST=)(PORT=1591))
lsnrctl status
tnsnames.ora entry ***not*** required

With the following configuration, migrating or duplicating the environment requires multiple changes.

local_listener=LISTENER_NAME
lsnrctl status LISTENER_NAME
tnsnames.ora entry required for LISTENER_NAME
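As a minimal sketch from SQL*Plus (the port and LISTENER_NAME are placeholders; leaving HOST= empty defaults to the local host), the two styles amount to:

```sql
-- Style 1: full address; no tnsnames.ora entry required
alter system set local_listener='(ADDRESS=(PROTOCOL=TCP)(HOST=)(PORT=1591))' scope=spfile sid='*';

-- Style 2: named listener; requires a matching tnsnames.ora alias
alter system set local_listener=LISTENER_NAME scope=spfile sid='*';
```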

If the listener is named, then does every environment have a different name?

Security will most likely be used as justification, but I don't see it.

DATAGUARD sqlnet.ora NAMES.DEFAULT_DOMAIN

Sun, 2020-04-05 00:20

If you just want the solution, then read Database Startup Fails With ORA-00119 (Doc ID 471767.1).

From the standby database, startup mount resulted in the following errors:

ORA-00119: invalid specification for system parameter LOCAL_LISTENER
ORA-00132: syntax error or unresolved network name 'LISTENER'

When setting local_listener, my preference is:

alter system set local_listener='(ADDRESS=(PROTOCOL=TCP)(HOST=)(PORT=1521))' scope=spfile sid='*';

However, some implementations will use the following:

alter system set local_listener=LISTENER scope=spfile sid='*';

There are pros and cons to both.

When using LISTENER with tnsnames, modifications can be performed from tnsnames.ora without having to modify database parameters.

However, it is not forgiving when there is a misconfiguration.

Demo 1:
Modify local_listener and restart DB.

SQL> alter system set local_listener='(ADDRESS=(PROTOCOL=TCP)(HOST=)(PORT=1521))' scope=spfile sid='*';

System altered.

SQL> shu abort
ORACLE instance shut down.
SQL> startup mount
ORACLE instance started.

Total System Global Area 1610612736 bytes
Fixed Size                  2924928 bytes
Variable Size             520097408 bytes
Database Buffers         1073741824 bytes
Redo Buffers               13848576 bytes
Database mounted.
SQL> show parameter listener

NAME                                 TYPE        VALUE
------------------------------------ ----------- ---------------------------------------------
listener_networks                    string
local_listener                       string      (ADDRESS=(PROTOCOL=TCP)(HOST=)(PORT=1521))
remote_listener                      string
SQL>

Demo 2:
Modify local_listener to the named LISTENER; the DB restart fails. The fix is to modify sqlnet.ora (delete NAMES.DEFAULT_DOMAIN=world).

### This is a bad omen and no changes should be made to DB until tnsping is resolved.
[oracle@ol7-121-dg2 ~]$ tnsping LISTENER

TNS Ping Utility for Linux: Version 12.1.0.2.0 - Production on 05-APR-2020 04:32:05

Copyright (c) 1997, 2014, Oracle.  All rights reserved.

Used parameter files:
/u01/app/oracle/product/12.1.0.2/dbhome_1/network/admin/sqlnet.ora

TNS-03505: Failed to resolve name
[oracle@ol7-121-dg2 ~]$

### Restart DB FAILED
SQL> alter system set local_listener=LISTENER scope=spfile sid='*';

System altered.

SQL> shu abort
ORACLE instance shut down.
SQL> startup mount
ORA-00119: invalid specification for system parameter LOCAL_LISTENER
ORA-00132: syntax error or unresolved network name 'LISTENER'
SQL>

That’s not good!

Check sqlnet.ora

[oracle@ol7-121-dg2 ~]$ cat /u01/app/oracle/product/12.1.0.2/dbhome_1/network/admin/sqlnet.ora
SQLNET.INBOUND_CONNECT_TIMEOUT=400
SQLNET.ENCRYPTION_SERVER=REQUIRED
SQLNET.ENCRYPTION_TYPES_SERVER=(AES256)

SQLNET.ENCRYPTION_CLIENT=REQUIRED
SQLNET.ENCRYPTION_TYPES_CLIENT=(AES256)

SQLNET.CRYPTO_CHECKSUM_SERVER=REQUIRED
SQLNET.CRYPTO_CHECKSUM_TYPES_SERVER = (SHA256)

SQLNET.CRYPTO_CHECKSUM_CLIENT=REQUIRED
SQLNET.CRYPTO_CHECKSUM_TYPES_CLIENT = (SHA256)
NAMES.DEFAULT_DOMAIN=world

[oracle@ol7-121-dg2 ~]$ vi /u01/app/oracle/product/12.1.0.2/dbhome_1/network/admin/sqlnet.ora

[oracle@ol7-121-dg2 ~]$ cat /u01/app/oracle/product/12.1.0.2/dbhome_1/network/admin/sqlnet.ora
SQLNET.INBOUND_CONNECT_TIMEOUT=400
SQLNET.ENCRYPTION_SERVER=REQUIRED
SQLNET.ENCRYPTION_TYPES_SERVER=(AES256)

SQLNET.ENCRYPTION_CLIENT=REQUIRED
SQLNET.ENCRYPTION_TYPES_CLIENT=(AES256)

SQLNET.CRYPTO_CHECKSUM_SERVER=REQUIRED
SQLNET.CRYPTO_CHECKSUM_TYPES_SERVER = (SHA256)

SQLNET.CRYPTO_CHECKSUM_CLIENT=REQUIRED
SQLNET.CRYPTO_CHECKSUM_TYPES_CLIENT = (SHA256)
[oracle@ol7-121-dg2 ~]$

Did you see the problem? I am not sure why NAMES.DEFAULT_DOMAIN=world was set, but removing it solved the issue: with a default domain set, the unqualified name LISTENER is resolved as LISTENER.world, and no such alias existed in tnsnames.ora.

SQL> startup mount
ORACLE instance started.

Total System Global Area 1610612736 bytes
Fixed Size                  2924928 bytes
Variable Size             520097408 bytes
Database Buffers         1073741824 bytes
Redo Buffers               13848576 bytes
Database mounted.
SQL> show parameter listener

NAME                                 TYPE        VALUE
------------------------------------ ----------- ------------------------------
listener_networks                    string
local_listener                       string      LISTENER
remote_listener                      string
SQL>

Demo 3:
Roll back sqlnet.ora (add NAMES.DEFAULT_DOMAIN=world back), modify tnsnames.ora (qualify the alias as LISTENER.world), and restart the DB.

[oracle@ol7-121-dg2 ~]$ cat /u01/app/oracle/product/12.1.0.2/dbhome_1/network/admin/sqlnet.ora
SQLNET.INBOUND_CONNECT_TIMEOUT=400
SQLNET.ENCRYPTION_SERVER=REQUIRED
SQLNET.ENCRYPTION_TYPES_SERVER=(AES256)

SQLNET.ENCRYPTION_CLIENT=REQUIRED
SQLNET.ENCRYPTION_TYPES_CLIENT=(AES256)

SQLNET.CRYPTO_CHECKSUM_SERVER=REQUIRED
SQLNET.CRYPTO_CHECKSUM_TYPES_SERVER = (SHA256)

SQLNET.CRYPTO_CHECKSUM_CLIENT=REQUIRED
SQLNET.CRYPTO_CHECKSUM_TYPES_CLIENT = (SHA256)

NAMES.DEFAULT_DOMAIN=world

[oracle@ol7-121-dg2 ~]$ tnsping LISTENER

TNS Ping Utility for Linux: Version 12.1.0.2.0 - Production on 05-APR-2020 04:54:52

Copyright (c) 1997, 2014, Oracle.  All rights reserved.

Used parameter files:
/u01/app/oracle/product/12.1.0.2/dbhome_1/network/admin/sqlnet.ora

TNS-03505: Failed to resolve name
[oracle@ol7-121-dg2 ~]$

[oracle@ol7-121-dg2 ~]$ cat /u01/app/oracle/product/12.1.0.2/dbhome_1/network/admin/tnsnames.ora
LISTENER.world = (ADDRESS = (PROTOCOL = TCP)(HOST = ol7-121-dg2.localdomain)(PORT = 1521))

hawka.world =
  (DESCRIPTION =
    (ADDRESS_LIST =
      (ADDRESS = (PROTOCOL = TCP)(HOST = ol7-121-dg1.localdomain)(PORT = 1521))
    )
    (CONNECT_DATA =
      (SERVER = DEDICATED)
      (SID = hawk)
    )
  )

hawkb.world =
  (DESCRIPTION =
    (ADDRESS_LIST =
      (ADDRESS = (PROTOCOL = TCP)(HOST = ol7-121-dg2.localdomain)(PORT = 1521))
    )
    (CONNECT_DATA =
      (SERVER = DEDICATED)
      (SID = hawk)
    )
  )

[oracle@ol7-121-dg2 ~]$ vi /u01/app/oracle/product/12.1.0.2/dbhome_1/network/admin/tnsnames.ora

[oracle@ol7-121-dg2 ~]$ tnsping LISTENER

TNS Ping Utility for Linux: Version 12.1.0.2.0 - Production on 05-APR-2020 05:04:18

Copyright (c) 1997, 2014, Oracle.  All rights reserved.

Used parameter files:
/u01/app/oracle/product/12.1.0.2/dbhome_1/network/admin/sqlnet.ora


Used TNSNAMES adapter to resolve the alias
Attempting to contact (ADDRESS = (PROTOCOL = TCP)(HOST = ol7-121-dg2.localdomain)(PORT = 1521))
OK (0 msec)

[oracle@ol7-121-dg2 ~]$

SQL> shu abort
ORACLE instance shut down.
SQL> startup mount
ORACLE instance started.

Total System Global Area 1610612736 bytes
Fixed Size                  2924928 bytes
Variable Size             520097408 bytes
Database Buffers         1073741824 bytes
Redo Buffers               13848576 bytes
Database mounted.
SQL> show parameter listener

NAME                                 TYPE        VALUE
------------------------------------ ----------- ------------------------------
listener_networks                    string
local_listener                       string      LISTENER
remote_listener                      string
SQL>

As demonstrated, having more options is not always good, as it increases the likelihood of errors. Choose your evil wisely.
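The failure mode above can be checked mechanically. Here is a throwaway sketch (sample files created in a temp directory, not read from a real ORACLE_HOME) that qualifies a name with NAMES.DEFAULT_DOMAIN the way the resolver does and confirms the resulting alias exists in tnsnames.ora:

```shell
# Sketch: reproduce the LISTENER -> LISTENER.world resolution logic
# against sample files; paths and contents are illustrative only.
tmpdir=$(mktemp -d)

cat > "$tmpdir/sqlnet.ora" <<'EOF'
NAMES.DEFAULT_DOMAIN=world
EOF

cat > "$tmpdir/tnsnames.ora" <<'EOF'
LISTENER.world = (ADDRESS = (PROTOCOL = TCP)(HOST = ol7-121-dg2.localdomain)(PORT = 1521))
EOF

# Extract the default domain, if any, and qualify the unqualified name.
domain=$(sed -n 's/^NAMES\.DEFAULT_DOMAIN=//p' "$tmpdir/sqlnet.ora")
name="LISTENER${domain:+.$domain}"
echo "resolving: $name"

# Startup with local_listener=LISTENER only works if this alias resolves.
if grep -q "^$name *=" "$tmpdir/tnsnames.ora"; then
  echo "OK: $name found in tnsnames.ora"
else
  echo "MISSING: $name not in tnsnames.ora"
fi

rm -rf "$tmpdir"
```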

DATAGUARD Using DBCA Silent Mode Is Not Setting DB_UNIQUE_NAME

Sat, 2020-04-04 18:30

Unfortunately, DBCA sets db_unique_name equal to db_name, which is not ideal for a Data Guard environment.

Even without Data Guard, my preference is to have db_name different from db_unique_name.

Typically, I like to append a letter to db_name rather than a number, which can be confused with a RAC instance number.

Another option is using an airport code; however, this becomes inaccurate when the database is migrated to a new data center.

If I can help it, I would like the implementation to be robust.

To resolve the issue, pass -initparams db_unique_name=${NODE1_DB_UNIQUE_NAME} to dbca.

dbca -silent -createDatabase                                                 \
  -responseFile NO_VALUE                                                     \
  -templateName General_Purpose.dbc                                          \
  -sid ${ORACLE_SID}                                                         \
  -gdbname ${ORACLE_SID}                                                     \
  -characterSet AL32UTF8                                                     \
  -sysPassword ${SYS_PASSWORD}                                               \
  -systemPassword ${SYS_PASSWORD}                                            \
  -createAsContainerDatabase false                                           \
  -databaseType MULTIPURPOSE                                                 \
  -automaticMemoryManagement false                                           \
  -totalMemory 2048                                                          \
  -storageType FS                                                            \
  -datafileDestination "${DATA_DIR}"                                         \
  -redoLogFileSize 50                                                        \
  -emConfiguration NONE                                                      \
  -sampleSchema false                                                        \
  -initparams db_unique_name=${NODE1_DB_UNIQUE_NAME}                         \
  -ignorePreReqs

set lines 100
column NAME_COL_PLUS_SHOW_PARAM format a30
column VALUE_COL_PLUS_SHOW_PARAM format a55

SQL> set lines 100
SQL> column NAME_COL_PLUS_SHOW_PARAM format a30
SQL> column VALUE_COL_PLUS_SHOW_PARAM format a55

SQL> show parameter db%name

NAME                           TYPE        VALUE
------------------------------ ----------- -------------------------------------------------------
db_file_name_convert           string
db_name                        string      hawk
db_unique_name                 string      hawka
pdb_file_name_convert          string

SQL> show parameter create%dest

NAME                           TYPE        VALUE
------------------------------ ----------- -------------------------------------------------------
db_create_file_dest            string      /u01/oradata
db_create_online_log_dest_1    string      /u01/oradata
db_create_online_log_dest_2    string
db_create_online_log_dest_3    string
db_create_online_log_dest_4    string
db_create_online_log_dest_5    string

SQL> show parameter dump_dest

NAME                           TYPE        VALUE
------------------------------ ----------- -------------------------------------------------------
background_dump_dest           string      /u01/app/oracle/product/12.1.0.2/dbhome_1/rdbms/log
core_dump_dest                 string      /u01/app/oracle/diag/rdbms/hawka/hawk/cdump
user_dump_dest                 string      /u01/app/oracle/product/12.1.0.2/dbhome_1/rdbms/log
SQL>

Silent Install 11.2.0.4 DB Software With GI 18c On OEL 7.7

Sat, 2020-03-28 15:45

Just some notes:

One good thing about a GUI install is that it allows one to fix any issues and retry; not so much with a silent install.

================================================================================
Requirements for Installing Oracle 11.2.0.4 RDBMS on OL7 or RHEL7 64-bit (x86-64) (Doc ID 1962100.1)	

PRVF-4037 : CRS is not installed on any of the nodes (Doc ID 1316815.1)	

Installation of Oracle 11.2.0.4 Database Software on OL7 fails with 'Error in invoking target 'agent nmhs' of makefile ' & 
"undefined reference to symbol 'B_DestroyKeyObject'" error (Doc ID 1965691.1)	
================================================================================


================================================================================
### First install attempt without -ignorePrereq
================================================================================

$ ./runInstaller -ignorePrereq

Note that the above command does not perform any pre-requisite checks. 
Hence, ensure that all the software requirements documented in the install guide are fulfilled before executing the installer using the above option.

================================================================================

[oracle@ol7-183-rac1 ~]$ ./install_db_software.sh

+ /u01/app/oracle/software/database/runInstaller -force -silent -waitforcompletion
-responseFile /u01/app/oracle/software/database/response/db_install.rsp 
oracle.install.option=INSTALL_DB_SWONLY 
ORACLE_HOSTNAME=ol7-183-rac1.localdomain 
UNIX_GROUP_NAME=oinstall 
INVENTORY_LOCATION=/u01/app/oraInventory 
SELECTED_LANGUAGES=en ORACLE_HOME=/u01/app/oracle/product/11.2.0.4/dbhome_1 
ORACLE_BASE=/u01/app/oracle 
oracle.install.db.InstallEdition=EE 
oracle.install.db.EEOptionsSelection=false 
oracle.install.db.DBA_GROUP=dba 
oracle.install.db.OPER_GROUP=oper 
oracle.install.db.CLUSTER_NODES=ol7-183-rac1,ol7-183-rac2 
oracle.installer.autoupdates.option=SKIP_UPDATES 
oracle.install.db.isRACOneInstall=false 
SECURITY_UPDATES_VIA_MYORACLESUPPORT=false 
DECLINE_SECURITY_UPDATES=true

Starting Oracle Universal Installer...

Checking Temp space: must be greater than 120 MB.   Actual 25005 MB    Passed
Checking swap space: must be greater than 150 MB.   Actual 17391 MB    Passed
Preparing to launch Oracle Universal Installer from /tmp/OraInstall2020-03-26_04-15-06PM. Please wait ...

[FATAL] [INS-13013] Target environment do not meet some mandatory requirements.
   CAUSE: Some of the mandatory prerequisites are not met. See logs for details. /u01/app/oraInventory/logs/installActions2020-03-26_04-15-06PM.log
   ACTION: Identify the list of failed prerequisite checks from the log: /u01/app/oraInventory/logs/installActions2020-03-26_04-15-06PM.log. 
   Then either from the log file or from installation manual find the appropriate configuration to meet the prerequisites and fix it manually.
[oracle@ol7-183-rac1 ~]$


================================================================================
### Review types of errors
================================================================================

[oracle@ol7-183-rac1 ~]$ grep -e '[[:upper:]]: ' /u01/app/oraInventory/logs/installActions2020-03-26_04-15-06PM.log |cut -d ":" -f1 |sort -u
   ACTION
   CAUSE
INFO
SEVERE
WARNING
[oracle@ol7-183-rac1 ~]$


================================================================================
### Review List of failed Tasks
================================================================================

[oracle@ol7-183-rac1 ~]$ grep -A100 "List of failed Tasks" /u01/app/oraInventory/logs/installActions2020-03-26_04-15-06PM.log
INFO: ------------------List of failed Tasks------------------
INFO: *********************************************
INFO: Package: pdksh-5.2.14: This is a prerequisite condition to test whether the package "pdksh-5.2.14" is available on the system.
INFO: Severity:IGNORABLE
INFO: OverallStatus:VERIFICATION_FAILED
INFO: *********************************************
INFO: CRS Integrity: This test checks the integrity of Oracle Clusterware stack across the cluster nodes.
INFO: Severity:CRITICAL
INFO: OverallStatus:OPERATION_FAILED
INFO: *********************************************
INFO: Cluster Manager Integrity: This test checks the integrity of cluster manager across the cluster nodes.
INFO: Severity:CRITICAL
INFO: OverallStatus:OPERATION_FAILED
INFO: *********************************************
INFO: Node Application Existence: This test checks the existence of Node Applications on the system.
INFO: Severity:CRITICAL
INFO: OverallStatus:OPERATION_FAILED
INFO: *********************************************
INFO: Clock Synchronization: This test checks the Oracle Cluster Time Synchronization Services across the cluster nodes.
INFO: Severity:CRITICAL
INFO: OverallStatus:OPERATION_FAILED
INFO: *********************************************
INFO: Database Clusterware Version Compatibility: This test ensures that the Database version is compatible with the CRS version.
INFO: Severity:CRITICAL
INFO: OverallStatus:OPERATION_FAILED
INFO: -----------------End of failed Tasks List----------------
INFO: Adding ExitStatus PREREQUISITES_NOT_MET to the exit status set
SEVERE: [FATAL] [INS-13013] Target environment do not meet some mandatory requirements.
   CAUSE: Some of the mandatory prerequisites are not met. See logs for details. /u01/app/oraInventory/logs/installActions2020-03-26_04-15-06PM.log
   ACTION: Identify the list of failed prerequisite checks from the log: /u01/app/oraInventory/logs/installActions2020-03-26_04-15-06PM.log. Then either from the log file or from installation manual find the appropriate configuration to meet the prerequisites and fix it manually.
INFO: Advice is ABORT
INFO: Adding ExitStatus INVALID_USER_INPUT to the exit status set
INFO: Completed validating state {performChecks}
INFO: Terminating all background operations
INFO: Terminated all background operations
INFO: Finding the most appropriate exit status for the current application
INFO: Exit Status is -3
INFO: Shutdown Oracle Database 11g Release 2 Installer
[oracle@ol7-183-rac1 ~]$


================================================================================
### Search for "Error Message"
================================================================================

[oracle@ol7-183-rac1 ~]$ grep -i 'error message' /u01/app/oraInventory/logs/installActions2020-03-26_04-15-06PM.log
INFO: Error Message:PRVF-7532 : Package "pdksh" is missing on node "ol7-183-rac2"
INFO: Error Message:PRVF-7532 : Package "pdksh" is missing on node "ol7-183-rac1"
INFO: Error Message:PRVF-4037 : CRS is not installed on any of the nodes
INFO: Error Message:PRVF-4037 : CRS is not installed on any of the nodes
INFO: Error Message:PRVF-4037 : CRS is not installed on any of the nodes
INFO: Error Message:PRVF-4037 : CRS is not installed on any of the nodes
INFO: Error Message:PRVF-4037 : CRS is not installed on any of the nodes
[oracle@ol7-183-rac1 ~]$


================================================================================
PRVF-4037 : CRS is not installed on any of the nodes (Doc ID 1316815.1)	
The bug is fixed in 11.2.0.3; the workaround is to update the GI home with the CRS="true" flag.
================================================================================


================================================================================
### Check inventory for GI RAC install
================================================================================

[oracle@ol7-183-rac1 ContentsXML]$ cat inventory.xml
<?xml version="1.0" standalone="yes" ?>
<!-- Copyright (c) 1999, 2020, Oracle and/or its affiliates.
All rights reserved. -->
<!-- Do not modify the contents of this file by hand. -->
<INVENTORY>
<VERSION_INFO>
   <SAVED_WITH>12.2.0.4.0</SAVED_WITH>
   <MINIMUM_VER>2.1.0.6.0</MINIMUM_VER>
</VERSION_INFO>
<HOME_LIST>
<HOME NAME="OraGI18Home1" LOC="/u01/app/18.0.0/grid" TYPE="O" IDX="1" CRS="true"/>
</HOME_LIST>
<COMPOSITEHOME_LIST>
</COMPOSITEHOME_LIST>
</INVENTORY>


================================================================================
### UPDATE inventory for GI RAC install
================================================================================

[oracle@ol7-183-rac1 ContentsXML]$ . oraenv <<< +ASM1
ORACLE_SID = [cdbrac1] ? The Oracle base remains unchanged with value /u01/app/oracle

[oracle@ol7-183-rac1 ContentsXML]$ export GRID_HOME=$ORACLE_HOME

[oracle@ol7-183-rac1 ContentsXML]$ $GRID_HOME/oui/bin/runInstaller -silent -ignoreSysPrereqs -updateNodeList ORACLE_HOME=$GRID_HOME "CLUSTER_NODES={ol7-183-rac1,ol7-183-rac2}" CRS=true
Starting Oracle Universal Installer...

Checking swap space: must be greater than 500 MB.   Actual 17391 MB    Passed
The inventory pointer is located at /etc/oraInst.loc
'UpdateNodeList' was successful.


================================================================================
### VERIFY inventory for GI RAC install
================================================================================

[oracle@ol7-183-rac1 ContentsXML]$ cat inventory.xml
<?xml version="1.0" standalone="yes" ?>
<!-- Copyright (c) 1999, 2020, Oracle and/or its affiliates.
All rights reserved. -->
<!-- Do not modify the contents of this file by hand. -->
<INVENTORY>
<VERSION_INFO>
   <SAVED_WITH>12.2.0.4.0</SAVED_WITH>
   <MINIMUM_VER>2.1.0.6.0</MINIMUM_VER>
</VERSION_INFO>
<HOME_LIST>
<HOME NAME="OraGI18Home1" LOC="/u01/app/18.0.0/grid" TYPE="O" IDX="1" CRS="true">
   <NODE_LIST>
      <NODE NAME="ol7-183-rac1"/>
      <NODE NAME="ol7-183-rac2"/>
   </NODE_LIST>
</HOME>
</HOME_LIST>
<COMPOSITEHOME_LIST>
</COMPOSITEHOME_LIST>
</INVENTORY>
[oracle@ol7-183-rac1 ContentsXML]$
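Instead of reading the whole file, the node list under the home can be confirmed with a quick grep. A sketch with the inventory fragment inlined so it is self-contained (on a real system, point `INV` at /u01/app/oraInventory/ContentsXML/inventory.xml):

```shell
# Count NODE entries registered under the Grid home.
# Sample inventory fragment inlined; the real file is
# /u01/app/oraInventory/ContentsXML/inventory.xml.
INV=$(mktemp)
cat > "$INV" <<'EOF'
<HOME NAME="OraGI18Home1" LOC="/u01/app/18.0.0/grid" TYPE="O" IDX="1" CRS="true">
   <NODE_LIST>
      <NODE NAME="ol7-183-rac1"/>
      <NODE NAME="ol7-183-rac2"/>
   </NODE_LIST>
</HOME>
EOF
# Expect 2 for a two-node cluster.
grep -A4 'OraGI18Home1' "$INV" | grep -c 'NODE NAME'
rm -f "$INV"
```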


================================================================================
### Retry Install
================================================================================

[oracle@ol7-183-rac1 ~]$ cat install_db_software.sh
#!/bin/sh -x
/u01/app/oracle/software/database/runInstaller -force \
-silent -waitforcompletion -ignorePrereq \
-responseFile /u01/app/oracle/software/database/response/db_install.rsp \
oracle.install.option=INSTALL_DB_SWONLY \
ORACLE_HOSTNAME=ol7-183-rac1.localdomain \
UNIX_GROUP_NAME=oinstall \
INVENTORY_LOCATION=/u01/app/oraInventory \
SELECTED_LANGUAGES=en \
ORACLE_HOME=/u01/app/oracle/product/11.2.0.4/dbhome_1 \
ORACLE_BASE=/u01/app/oracle \
oracle.install.db.InstallEdition=EE \
oracle.install.db.EEOptionsSelection=false \
oracle.install.db.DBA_GROUP=dba \
oracle.install.db.OPER_GROUP=oper \
oracle.install.db.CLUSTER_NODES=ol7-183-rac1,ol7-183-rac2 \
oracle.installer.autoupdates.option=SKIP_UPDATES \
oracle.install.db.isRACOneInstall=false \
SECURITY_UPDATES_VIA_MYORACLESUPPORT=false \
DECLINE_SECURITY_UPDATES=true
[oracle@ol7-183-rac1 ~]$


[oracle@ol7-183-rac1 ~]$ ./install_db_software.sh
+ /u01/app/oracle/software/database/runInstaller -force -silent -waitforcompletion -ignorePrereq 
-responseFile /u01/app/oracle/software/database/response/db_install.rsp 
oracle.install.option=INSTALL_DB_SWONLY 
ORACLE_HOSTNAME=ol7-183-rac1.localdomain 
UNIX_GROUP_NAME=oinstall 
INVENTORY_LOCATION=/u01/app/oraInventory 
SELECTED_LANGUAGES=en 
ORACLE_HOME=/u01/app/oracle/product/11.2.0.4/dbhome_1 
ORACLE_BASE=/u01/app/oracle 
oracle.install.db.InstallEdition=EE 
oracle.install.db.EEOptionsSelection=false 
oracle.install.db.DBA_GROUP=dba 
oracle.install.db.OPER_GROUP=oper 
oracle.install.db.CLUSTER_NODES=ol7-183-rac1,ol7-183-rac2 
oracle.installer.autoupdates.option=SKIP_UPDATES 
oracle.install.db.isRACOneInstall=false 
SECURITY_UPDATES_VIA_MYORACLESUPPORT=false 
DECLINE_SECURITY_UPDATES=true

Starting Oracle Universal Installer...

Checking Temp space: must be greater than 120 MB.   Actual 24578 MB    Passed
Checking swap space: must be greater than 150 MB.   Actual 17391 MB    Passed
Preparing to launch Oracle Universal Installer from /tmp/OraInstall2020-03-26_05-17-28PM. Please wait ...

You can find the log of this install session at:
 /u01/app/oraInventory/logs/installActions2020-03-26_05-17-28PM.log

The installation of Oracle Database 11g was successful.
Please check '/u01/app/oraInventory/logs/silentInstall2020-03-26_05-17-28PM.log' for more details.

As a root user, execute the following script(s):
        1. /u01/app/oracle/product/11.2.0.4/dbhome_1/root.sh

Execute /u01/app/oracle/product/11.2.0.4/dbhome_1/root.sh on the following nodes:
[ol7-183-rac1, ol7-183-rac2]

Successfully Setup Software.
[oracle@ol7-183-rac1 ~]$


[root@ol7-183-rac1 ~]# /u01/app/oracle/product/11.2.0.4/dbhome_1/root.sh
Check /u01/app/oracle/product/11.2.0.4/dbhome_1/install/root_ol7-183-rac1.localdomain_2020-03-26_17-44-13.log for the output of root script
[root@ol7-183-rac1 ~]#


[root@ol7-183-rac2 ~]# /u01/app/oracle/product/11.2.0.4/dbhome_1/root.sh
Check /u01/app/oracle/product/11.2.0.4/dbhome_1/install/root_ol7-183-rac2.localdomain_2020-03-26_17-44-55.log for the output of root script
[root@ol7-183-rac2 ~]#


================================================================================
### FROM silentInstall*.log - Known Issues - (Doc ID 1965691.1)	
================================================================================

[oracle@ol7-183-rac1 ~]$ cat /u01/app/oraInventory/logs/silentInstall2020-03-26_05-17-28PM.log
silentInstall2020-03-26_05-17-28PM.log
sNativeVolName:/u01/app/oracle/product/11.2.0.4/dbhome_1/
m_asNodeArray:ol7-183-rac1,ol7-183-rac2
m_sLocalNode:ol7-183-rac1
sNativeVolName:/tmp/
m_asNodeArray:ol7-183-rac1,ol7-183-rac2
m_sLocalNode:ol7-183-rac1
Error in invoking target 'agent nmhs' of makefile '/u01/app/oracle/product/11.2.0.4/dbhome_1/sysman/lib/ins_emagent.mk'. See '/u01/app/oraInventory/logs/installActions2020-03-26_05-17-28PM.log' for details.
sNativeVolName:/u01/app/oracle/
m_asNodeArray:ol7-183-rac1,ol7-183-rac2
m_sLocalNode:ol7-183-rac1
sNativeVolName:/u01/app/oraInventory/
m_asNodeArray:ol7-183-rac1,ol7-183-rac2
m_sLocalNode:ol7-183-rac1
The installation of Oracle Database 11g was successful.
[oracle@ol7-183-rac1 ~]$


================================================================================
### Check installActions*.log
================================================================================

[oracle@ol7-183-rac1 ~]$ grep -e '[[:upper:]]: ' /u01/app/oraInventory/logs/installActions2020-03-26_05-17-28PM.log |cut -d ":" -f1 |sort -u
INFO
WARNING
[oracle@ol7-183-rac1 ~]$
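Once the keyword classes are known, the same log can be filtered for the lines behind each class. A self-contained sketch with an inlined sample log (the real path is the installActions*.log above; the WARNING text here is invented for illustration):

```shell
# Sample log inlined so the sketch runs anywhere; on a real system set
# LOG to the installActions*.log path shown above.
LOG=$(mktemp)
cat > "$LOG" <<'EOF'
INFO: Checking swap space
WARNING: [INS-13014] Target environment does not meet some optional requirements.
INFO: Done
WARNING: [INS-13014] Target environment does not meet some optional requirements.
EOF
# Unique WARNING lines only.
grep '^WARNING: ' "$LOG" | sort -u
rm -f "$LOG"
```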


================================================================================
### Check inventory for DB RAC install
================================================================================

[oracle@ol7-183-rac1 ContentsXML]$ cat inventory.xml
<?xml version="1.0" standalone="yes" ?>
<!-- Copyright (c) 1999, 2013, Oracle and/or its affiliates.
All rights reserved. -->
<!-- Do not modify the contents of this file by hand. -->
<INVENTORY>
<VERSION_INFO>
   <SAVED_WITH>11.2.0.4.0</SAVED_WITH>
   <MINIMUM_VER>2.1.0.6.0</MINIMUM_VER>
</VERSION_INFO>
<HOME_LIST>
<HOME NAME="OraGI18Home1" LOC="/u01/app/18.0.0/grid" TYPE="O" IDX="1" CRS="true">
   <NODE_LIST>
      <NODE NAME="ol7-183-rac1"/>
      <NODE NAME="ol7-183-rac2"/>
   </NODE_LIST>
</HOME>
<HOME NAME="OraDb11g_home1" LOC="/u01/app/oracle/product/11.2.0.4/dbhome_1" TYPE="O" IDX="2">
   <NODE_LIST>
      <NODE NAME="ol7-183-rac1"/>
      <NODE NAME="ol7-183-rac2"/>
   </NODE_LIST>
</HOME>
</HOME_LIST>
<COMPOSITEHOME_LIST>
</COMPOSITEHOME_LIST>
</INVENTORY>
[oracle@ol7-183-rac1 ContentsXML]$


================================================================================
### cluvfy comp healthcheck
================================================================================

[oracle@ol7-183-rac1 cvu]$ . oraenv <<< +ASM1
ORACLE_SID = [cdbrac1] ? The Oracle base remains unchanged with value /u01/app/oracle

[oracle@ol7-183-rac1 ~]$ cluvfy comp healthcheck

Verification of Health Check was unsuccessful.
Checks did not pass for the following nodes:
        ol7-183-rac2,ol7-183-rac1


Failures were encountered during execution of CVU verification request "Health Check".

Verifying Physical Memory ...FAILED
ol7-183-rac2: PRVF-7530 : Sufficient physical memory is not available on node
              "ol7-183-rac2" [Required physical memory = 8GB (8388608.0KB)]

ol7-183-rac1: PRVF-7530 : Sufficient physical memory is not available on node
              "ol7-183-rac1" [Required physical memory = 8GB (8388608.0KB)]

Verifying Ethernet Jumbo Frames ...FAILED
ol7-183-rac2: PRVE-0293 : Jumbo Frames are not configured for interconnects
              "eth2" on node "ol7-183-rac2.localdomain". [Expected="eth2=9000";
              Found="eth2=1500"]

ol7-183-rac1: PRVE-0293 : Jumbo Frames are not configured for interconnects
              "eth2" on node "ol7-183-rac1.localdomain". [Expected="eth2=9000";
              Found="eth2=1500"]


CVU operation performed:      Health Check
Date:                         Mar 26, 2020 6:07:08 PM
CVU home:                     /u01/app/18.0.0/grid/
User:                         oracle
[oracle@ol7-183-rac1 cvu]$
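The jumbo-frames failure can be cross-checked straight from the interface MTU. A sketch that parses a sample `ip link` line (the interface name eth2 and the sample line are assumptions; on a real node, use `line=$(ip link show eth2)` instead):

```shell
# Sample `ip link` output line (assumed); on a real node replace with:
#   line=$(ip link show eth2)
line='3: eth2: <BROADCAST,MULTICAST,UP> mtu 1500 qdisc pfifo_fast state UP'
# Pull the numeric MTU out of the line.
mtu=$(printf '%s\n' "$line" | sed -n 's/.* mtu \([0-9]*\).*/\1/p')
if [ "$mtu" -ge 9000 ]; then
  echo "jumbo frames OK (mtu=$mtu)"
else
  echo "jumbo frames NOT configured (mtu=$mtu)"
fi
```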

Query Required RPM Using Text File

Wed, 2020-03-25 19:50

There’s a request to install the 11.2.0.4 database software for RHEL7.

Requirements for Installing Oracle 11.2.0.4 RDBMS on OL7 or RHEL7 64-bit (x86-64) (Doc ID 1962100.1)

binutils-2.23.52.0.1-12.el7.x86_64
compat-libcap1-1.10-3.el7.x86_64
compat-libstdc++-33-3.2.3
gcc-4.8.2-3.el7.x86_64
gcc-c++-4.8.2-3.el7.x86_64
glibc-2.17-36.el7.x86_64
glibc-devel-2.17-36.el7.x86_64
ksh
libaio-0.3.109-9.el7.x86_64
libaio-devel-0.3.109-9.el7.x86_64
libgcc-4.8.2-3.el7.x86_64
libstdc++-4.8.2-3.el7.x86_64
libstdc++-devel-4.8.2-3.el7.x86_64
libXi-1.7.2-1.el7.x86_64
libXtst-1.2.2-1.el7.x86_64
make-3.82-19.el7.x86_64
sysstat-10.1.5-1.el7.x86_64

Create /tmp/req-rpm.txt

[root@ol7-183-rac1 ~]# cat /tmp/req-rpm.txt
binutils
compat-libcap1
compat-libstdc++-33
gcc
gcc-c++
glibc
glibc-devel
ksh
libaio
libaio-devel
libgcc
libstdc++
libstdc++-devel
libXi
libXtst
make
sysstat

Verify required RPM using /tmp/req-rpm.txt

[root@ol7-183-rac1 ~]# rpm -q --qf '%{NAME}-%{VERSION}-%{RELEASE}(%{ARCH})\n' `awk '{print $1}' /tmp/req-rpm.txt`
binutils-2.27-41.base.0.7.el7_7.3(x86_64)
compat-libcap1-1.10-7.el7(x86_64)
package compat-libstdc++ is not installed
package gcc is not installed
compat-libstdc++-33-3.2.3-72.el7(x86_64)
compat-libstdc++-33-3.2.3-72.el7(i686)
glibc-2.17-292.0.1.el7(x86_64)
glibc-2.17-292.0.1.el7(i686)
glibc-devel-2.17-292.0.1.el7(x86_64)
glibc-devel-2.17-292.0.1.el7(i686)
ksh-20120801-140.0.1.el7_7(x86_64)
libaio-0.3.109-13.el7(x86_64)
libaio-0.3.109-13.el7(i686)
libaio-devel-0.3.109-13.el7(x86_64)
libaio-devel-0.3.109-13.el7(i686)
libgcc-4.8.5-39.0.3.el7(x86_64)
libgcc-4.8.5-39.0.3.el7(i686)
libstdc++-4.8.5-39.0.3.el7(x86_64)
libstdc++-4.8.5-39.0.3.el7(i686)
libstdc++-devel-4.8.5-39.0.3.el7(x86_64)
libstdc++-devel-4.8.5-39.0.3.el7(i686)
libXi-1.7.9-1.el7(x86_64)
libXi-1.7.9-1.el7(i686)
libXtst-1.2.3-1.el7(x86_64)
libXtst-1.2.3-1.el7(i686)
make-3.82-24.el7(x86_64)
sysstat-10.1.5-18.el7_7.1(x86_64)

Find missing RPM.

[root@ol7-183-rac1 ~]# rpm -q --qf '%{NAME}-%{VERSION}-%{RELEASE}(%{ARCH})\n' `awk '{print $1}' /tmp/req-rpm.txt` | grep not
package gcc is not installed
package gcc-c++ is not installed
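The missing-package lines can feed yum directly instead of retyping the names. A sketch (the yum step is left commented out; run it as root after reviewing the list):

```shell
# Extract package names from rpm's "package X is not installed" lines.
# Sample input inlined; on a real system, pipe the rpm -q command above in.
printf '%s\n' \
  'package gcc is not installed' \
  'package gcc-c++ is not installed' |
  awk '/is not installed/ {print $2}'
# Append: | xargs -r yum -y install   (run as root)
```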

Install missing RPM

[root@ol7-183-rac1 ~]# yum -y install gcc gcc-c++

Verify NO MISSING RPM

[root@ol7-183-rac1 ~]# rpm -q --qf '%{NAME}-%{VERSION}-%{RELEASE}(%{ARCH})\n' `awk '{print $1}' /tmp/req-rpm.txt`
binutils-2.27-41.base.0.7.el7_7.3(x86_64)
compat-libcap1-1.10-7.el7(x86_64)
compat-libstdc++-33-3.2.3-72.el7(x86_64)
compat-libstdc++-33-3.2.3-72.el7(i686)
gcc-4.8.5-39.0.3.el7(x86_64)
gcc-c++-4.8.5-39.0.3.el7(x86_64)
glibc-2.17-292.0.1.el7(x86_64)
glibc-2.17-292.0.1.el7(i686)
glibc-devel-2.17-292.0.1.el7(x86_64)
glibc-devel-2.17-292.0.1.el7(i686)
ksh-20120801-140.0.1.el7_7(x86_64)
libaio-0.3.109-13.el7(x86_64)
libaio-0.3.109-13.el7(i686)
libaio-devel-0.3.109-13.el7(x86_64)
libaio-devel-0.3.109-13.el7(i686)
libgcc-4.8.5-39.0.3.el7(x86_64)
libgcc-4.8.5-39.0.3.el7(i686)
libstdc++-4.8.5-39.0.3.el7(x86_64)
libstdc++-4.8.5-39.0.3.el7(i686)
libstdc++-devel-4.8.5-39.0.3.el7(x86_64)
libstdc++-devel-4.8.5-39.0.3.el7(i686)
libXi-1.7.9-1.el7(x86_64)
libXi-1.7.9-1.el7(i686)
libXtst-1.2.3-1.el7(x86_64)
libXtst-1.2.3-1.el7(i686)
make-3.82-24.el7(x86_64)
sysstat-10.1.5-18.el7_7.1(x86_64)

[root@ol7-183-rac1 ~]# rpm -q --qf '%{NAME}-%{VERSION}-%{RELEASE}(%{ARCH})\n' `awk '{print $1}' /tmp/req-rpm.txt` | grep not
[root@ol7-183-rac1 ~]#

ORA-14758: Last partition in the range section cannot be dropped

Sat, 2020-03-14 14:27

Quick and dirty post.

Dropping the last range partition of an interval-partitioned table works in 12.2.0.1 but fails with ORA-14758 in releases below 12.2.0.1, as shown below.

00:51:45 DINH @ HAWK:HAWK:>@p.sql

TABLE_OWNER  TABLE_NAME   PARTITION_NAME  INT HIGH_VALUE
------------ ------------ --------------- --- --------------------------------------------------------------------------------
DINH         TEST         D1              NO  TO_DATE(' 2001-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA
DINH         TEST         D2              NO  TO_DATE(' 2002-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA
DINH         TEST         D3              NO  TO_DATE(' 2003-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA
DINH         TEST         SYS_P661        YES TO_DATE(' 2004-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA
DINH         TEST         SYS_P662        YES TO_DATE(' 2005-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA
DINH         TEST         SYS_P663        YES TO_DATE(' 2006-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA
DINH         TEST         SYS_P664        YES TO_DATE(' 2007-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA
DINH         TEST         SYS_P665        YES TO_DATE(' 2008-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA
DINH         TEST         SYS_P666        YES TO_DATE(' 2009-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA
DINH         TEST         SYS_P667        YES TO_DATE(' 2010-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA

10 rows selected.

00:51:49 DINH @ HAWK:HAWK:>alter table test drop partition D1;

Table altered.

00:52:00 DINH @ HAWK:HAWK:>alter table test drop partition D2;

Table altered.

00:52:08 DINH @ HAWK:HAWK:>alter table test drop partition D3;

Table altered.

00:52:20 DINH @ HAWK:HAWK:>@p.sql

TABLE_OWNER  TABLE_NAME   PARTITION_NAME  INT HIGH_VALUE
------------ ------------ --------------- --- --------------------------------------------------------------------------------
DINH         TEST         SYS_P661        NO  TO_DATE(' 2004-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA
DINH         TEST         SYS_P662        YES TO_DATE(' 2005-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA
DINH         TEST         SYS_P663        YES TO_DATE(' 2006-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA
DINH         TEST         SYS_P664        YES TO_DATE(' 2007-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA
DINH         TEST         SYS_P665        YES TO_DATE(' 2008-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA
DINH         TEST         SYS_P666        YES TO_DATE(' 2009-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA
DINH         TEST         SYS_P667        YES TO_DATE(' 2010-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA

7 rows selected.

00:52:23 DINH @ HAWK:HAWK:>alter table test drop partition SYS_P662;

Table altered.

00:52:46 DINH @ HAWK:HAWK:>alter table test drop partition SYS_P663;

Table altered.

00:52:58 DINH @ HAWK:HAWK:>alter table test drop partition SYS_P664;

Table altered.

00:53:09 DINH @ HAWK:HAWK:>@p.sql

TABLE_OWNER  TABLE_NAME   PARTITION_NAME  INT HIGH_VALUE
------------ ------------ --------------- --- --------------------------------------------------------------------------------
DINH         TEST         SYS_P661        NO  TO_DATE(' 2004-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA
DINH         TEST         SYS_P665        YES TO_DATE(' 2008-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA
DINH         TEST         SYS_P666        YES TO_DATE(' 2009-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA
DINH         TEST         SYS_P667        YES TO_DATE(' 2010-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA

00:53:12 DINH @ HAWK:HAWK:>alter table test drop partition SYS_P665;

Table altered.

00:53:43 DINH @ HAWK:HAWK:>alter table test drop partition SYS_P666;

Table altered.

00:53:53 DINH @ HAWK:HAWK:>@p.sql

TABLE_OWNER  TABLE_NAME   PARTITION_NAME  INT HIGH_VALUE
------------ ------------ --------------- --- --------------------------------------------------------------------------------
DINH         TEST         SYS_P661        NO  TO_DATE(' 2004-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA
DINH         TEST         SYS_P667        YES TO_DATE(' 2010-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA

00:54:04 DINH @ HAWK:HAWK:>alter table test drop partition SYS_P667;

Table altered.

00:54:22 DINH @ HAWK:HAWK:>@p.sql

TABLE_OWNER  TABLE_NAME   PARTITION_NAME  INT HIGH_VALUE
------------ ------------ --------------- --- --------------------------------------------------------------------------------
DINH         TEST         SYS_P661        NO  TO_DATE(' 2004-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA

00:54:28 DINH @ HAWK:HAWK:>insert into test values(to_date('01.01.2009', 'dd.mm.yyyy'), 1);

1 row created.

00:54:36 DINH @ HAWK:HAWK:>commit;

Commit complete.

00:54:41 DINH @ HAWK:HAWK:>@p.sql

TABLE_OWNER  TABLE_NAME   PARTITION_NAME  INT HIGH_VALUE
------------ ------------ --------------- --- --------------------------------------------------------------------------------
DINH         TEST         SYS_P661        NO  TO_DATE(' 2004-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA
DINH         TEST         SYS_P668        YES TO_DATE(' 2010-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIA

00:54:44 DINH @ HAWK:HAWK:>exit
Disconnected from Oracle Database 12c Enterprise Edition Release 12.2.0.1.0 - 64bit Production


====================================================================================================

18:59:19 DB-FS-1:(MDINH@hawk):PRIMARY> @p.sql

TABLE_OWNER  TABLE_NAME   PARTITION_NAME  INT HIGH_VALUE
------------ ------------ --------------- --- -------------------------------------------------------------------------------------
MDINH        TEST         D1              NO  TO_DATE(' 2001-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIAN')
MDINH        TEST         D2              NO  TO_DATE(' 2002-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIAN')
MDINH        TEST         D3              NO  TO_DATE(' 2003-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIAN')
MDINH        TEST         SYS_P68         YES TO_DATE(' 2004-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIAN')
MDINH        TEST         SYS_P69         YES TO_DATE(' 2005-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIAN')
MDINH        TEST         SYS_P70         YES TO_DATE(' 2006-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIAN')
MDINH        TEST         SYS_P71         YES TO_DATE(' 2007-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIAN')
MDINH        TEST         SYS_P72         YES TO_DATE(' 2008-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIAN')
MDINH        TEST         SYS_P73         YES TO_DATE(' 2009-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIAN')
MDINH        TEST         SYS_P74         YES TO_DATE(' 2010-01-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIAN')

10 rows selected.

18:59:23 DB-FS-1:(MDINH@hawk):PRIMARY> drop alter table test drop partition D1;
drop alter table test drop partition D1
     *
ERROR at line 1:
ORA-00950: invalid DROP option


18:59:40 DB-FS-1:(MDINH@hawk):PRIMARY> alter table test drop partition D1;

Table altered.

18:59:50 DB-FS-1:(MDINH@hawk):PRIMARY> alter table test drop partition D2;

Table altered.

18:59:58 DB-FS-1:(MDINH@hawk):PRIMARY> alter table test drop partition D3;
alter table test drop partition D3
                                *
ERROR at line 1:
ORA-14758: Last partition in the range section cannot be dropped


19:00:06 DB-FS-1:(MDINH@hawk):PRIMARY> exit
Disconnected from Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
[oracle@db-fs-1 sql]$
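On releases below 12.2.0.1, a commonly cited workaround is to disable and re-enable interval partitioning, which converts the existing interval partitions to range partitions so the old boundary partition is no longer the last one. A hedged sketch, assuming the same TEST table (this is not from the session above, and the interval expression is an assumption):

```sql
-- Disabling the interval converts existing interval partitions to range
-- partitions, so D3 is no longer the last range partition.
ALTER TABLE test SET INTERVAL ();
ALTER TABLE test DROP PARTITION D3;
-- Re-enable interval partitioning (interval expression is an assumption).
ALTER TABLE test SET INTERVAL (NUMTOYMINTERVAL(1, 'YEAR'));
```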

Follow Release Numbers Major.0.0.0.0 To Avoid Unexpected Issues

Sat, 2020-03-07 18:35

Here is an example where I was trying to move the GI home from 19.0.0 to 19.6.0.

Obviously, the directory /u01/app/19.6.0.0 does not exist.

[oracle@ol7-19-rac2 ~]$ rhpctl move gihome -sourcehome /u01/app/19.0.0/grid -desthome /u01/app/19.6.0/grid -eval
ol7-19-rac2.localdomain: Audit ID: 6
ol7-19-rac2.localdomain: Evaluation in progress for "move gihome" ...
ol7-19-rac2.localdomain: PRGO-1774 : The evaluation revealed potential failure for command "move gihome".
PRCT-1402 : Attempt to retrieve version of SRVCTL from Oracle Home /u01/app/19.6.0/grid/bin failed.
PRCT-1011 : Failed to run "srvctl". Detailed error: PRCZ-3002 : failed to locate Oracle base

*** /u01/app/19.6.0/grid/bin/srvctl: line 259: /u01/app/19.6.0.0/grid/srvm/admin/getcrshome: No such file or directory ***
[oracle@ol7-19-rac2 ~]$

Rename directory 19.6.0 to 19.6.0.0 and retry.

[root@ol7-19-rac2 app]# ls -l
total 0
drwxrwxr-x. 3 root   oinstall 18 Mar  6 22:33 19.0.0
drwxrwxr-x. 3 oracle oinstall 18 Mar  7 04:33 19.6.0
drwxrwxr-x. 8 oracle oinstall 91 Mar  6 23:17 oracle
drwxrwxr-x. 5 oracle oinstall 92 Mar  7 12:28 oraInventory
[root@ol7-19-rac2 app]# mv 19.6.0 19.6.0.0
[root@ol7-19-rac2 app]# ls -l
total 0
drwxrwxr-x. 3 root   oinstall 18 Mar  6 22:33 19.0.0
drwxrwxr-x. 3 oracle oinstall 18 Mar  7 04:33 19.6.0.0
drwxrwxr-x. 8 oracle oinstall 91 Mar  6 23:17 oracle
drwxrwxr-x. 5 oracle oinstall 92 Mar  7 12:28 oraInventory
[root@ol7-19-rac2 app]#

The previous error is resolved, but a new one appears.

[oracle@ol7-19-rac2 ~]$ rhpctl move gihome -sourcehome /u01/app/19.0.0/grid -desthome /u01/app/19.6.0.0/grid -eval
ol7-19-rac2.localdomain: Audit ID: 10
ol7-19-rac2.localdomain: Evaluation in progress for "move gihome" ...
ol7-19-rac2.localdomain: PRGO-1774 : The evaluation revealed potential failure for command "move gihome".
PRGP-1033 : The specified destination home '/u01/app/19.6.0.0/grid' is not a software only home.
[oracle@ol7-19-rac2 ~]$

There’s a first time for everything and, honestly, this is the first time I have hit this one.

My conclusion: follow the Major.0.0.0.0 release format and avoid FPP local mode.
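The naming pitfall can be caught before calling rhpctl. A small sketch that flags a Grid home whose version directory has fewer than four fields (the path below is hypothetical; srvctl in the session above expected 19.6.0.0):

```shell
# Hypothetical Grid home path; adjust to the home being checked.
home=/u01/app/19.6.0/grid
# Version directory is the parent of the grid directory.
ver=$(basename "$(dirname "$home")")
case "$ver" in
  *.*.*.*) echo "OK: $ver" ;;   # at least four fields, e.g. 19.6.0.0
  *)       echo "WARN: '$ver' lacks the full release format" ;;
esac
```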

Mining gridSetupActions Log

Tue, 2020-03-03 22:22

After completing a GI upgrade, what’s the most efficient way to mine the results?

Upgrade GI to 19.6: typical information provided from terminal

[oracle@ol7-122-rac1 ~]$ /u01/app/19.6.0.0/grid/gridSetup.sh -applyRU /u01/app/oracle/patch/30501910

Preparing the home to patch...

Applying the patch /u01/app/oracle/patch/30501910...
Successfully applied the patch.

The log can be found at: /u01/app/oraInventory/logs/GridSetupActions2020-03-04_00-24-53AM/installerPatchActions_2020-03-04_00-24-53AM.log
Launching Oracle Grid Infrastructure Setup Wizard...

The response file for this session can be found at:
 /u01/app/19.6.0.0/grid/install/response/grid_2020-03-04_00-24-53AM.rsp

You can find the log of this install session at:
 /u01/app/oraInventory/logs/GridSetupActions2020-03-04_00-24-53AM/gridSetupActions2020-03-04_00-24-53AM.log

[oracle@ol7-122-rac1 ~]$

Example response file from 12.2 install:

[oracle@ol7-122-rac1 response]$ pwd
/u01/app/12.2.0.1/grid/install/response

[oracle@ol7-122-rac1 response]$ sdiff -iEZbWBs -w 150 gridsetup.rsp grid_*.rsp
INVENTORY_LOCATION=                                                       |     INVENTORY_LOCATION=/u01/app/oraInventory
oracle.install.option=                                                    |     oracle.install.option=CRS_CONFIG
ORACLE_BASE=                                                              |     ORACLE_BASE=/u01/app/oracle
oracle.install.asm.OSDBA=                                                 |     oracle.install.asm.OSDBA=dba
oracle.install.asm.OSASM=                                                 |     oracle.install.asm.OSASM=dba
oracle.install.crs.config.gpnp.scanName=                                  |     oracle.install.crs.config.gpnp.scanName=ol7-122-scan
oracle.install.crs.config.gpnp.scanPort=                                  |     oracle.install.crs.config.gpnp.scanPort=1521
oracle.install.crs.config.ClusterConfiguration=                           |     oracle.install.crs.config.ClusterConfiguration=STANDALONE
oracle.install.crs.config.configureAsExtendedCluster=                     |     oracle.install.crs.config.configureAsExtendedCluster=false
oracle.install.crs.config.clusterName=                                    |     oracle.install.crs.config.clusterName=ol7-122-cluster
oracle.install.crs.config.gpnp.configureGNS=                              |     oracle.install.crs.config.gpnp.configureGNS=false
oracle.install.crs.config.autoConfigureClusterNodeVIP=                    |     oracle.install.crs.config.autoConfigureClusterNodeVIP=false
oracle.install.crs.config.gpnp.gnsOption=                                 |     oracle.install.crs.config.gpnp.gnsOption=CREATE_NEW_GNS
oracle.install.crs.config.clusterNodes=                                   |     oracle.install.crs.config.clusterNodes=ol7-122-rac1.localdomain:ol7-12
oracle.install.crs.config.networkInterfaceList=                           |     oracle.install.crs.config.networkInterfaceList=eth1:192.168.56.0:1,eth
oracle.install.asm.configureGIMRDataDG=                                   |     oracle.install.asm.configureGIMRDataDG=false
oracle.install.crs.config.useIPMI=                                        |     oracle.install.crs.config.useIPMI=false
oracle.install.asm.storageOption=                                         |     oracle.install.asm.storageOption=ASM
oracle.install.asmOnNAS.configureGIMRDataDG=                              |     oracle.install.asmOnNAS.configureGIMRDataDG=false
oracle.install.asm.diskGroup.name=                                        |     oracle.install.asm.diskGroup.name=DATA
oracle.install.asm.diskGroup.redundancy=                                  |     oracle.install.asm.diskGroup.redundancy=EXTERNAL
oracle.install.asm.diskGroup.AUSize=                                      |     oracle.install.asm.diskGroup.AUSize=4
oracle.install.asm.diskGroup.disksWithFailureGroupNames=                  |     oracle.install.asm.diskGroup.disksWithFailureGroupNames=/dev/oracleasm
oracle.install.asm.diskGroup.disks=                                       |     oracle.install.asm.diskGroup.disks=/dev/oracleasm/asm-disk3,/dev/oracl
oracle.install.asm.diskGroup.diskDiscoveryString=                         |     oracle.install.asm.diskGroup.diskDiscoveryString=/dev/oracleasm/*
oracle.install.asm.gimrDG.AUSize=                                         |     oracle.install.asm.gimrDG.AUSize=1
oracle.install.asm.configureAFD=                                          |     oracle.install.asm.configureAFD=false
oracle.install.crs.configureRHPS=                                         |     oracle.install.crs.configureRHPS=false
oracle.install.crs.config.ignoreDownNodes=                                |     oracle.install.crs.config.ignoreDownNodes=false
oracle.install.config.managementOption=                                   |     oracle.install.config.managementOption=NONE
oracle.install.config.omsPort=                                            |     oracle.install.config.omsPort=0
oracle.install.crs.rootconfig.executeRootScript=                          |     oracle.install.crs.rootconfig.executeRootScript=false
[oracle@ol7-122-rac1 response]$

Review the response file: compare the original response file with the one used for the upgrade (grid_2020-03-04_00-24-53AM.rsp)

[oracle@ol7-122-rac1 response]$ pwd
/u01/app/19.6.0.0/grid/install/response

[oracle@ol7-122-rac1 response]$ ls -l
total 76
-rw-r--r--. 1 oracle oinstall 36450 Mar  4 00:38 grid_2020-03-04_00-24-53AM.rsp
-rw-r-----. 1 oracle oinstall 36221 Jan 19  2019 gridsetup.rsp
-rw-r-----. 1 oracle oinstall  1541 May 21  2016 sample.ccf

[oracle@ol7-122-rac1 response]$ sdiff -iEZbWBs -w 150 gridsetup.rsp grid_*.rsp
INVENTORY_LOCATION=                                                       |     INVENTORY_LOCATION=/u01/app/oraInventory
oracle.install.option=                                                    |     oracle.install.option=UPGRADE
ORACLE_BASE=                                                              |     ORACLE_BASE=/u01/app/oracle
oracle.install.crs.config.scanType=                                       |     oracle.install.crs.config.scanType=LOCAL_SCAN
oracle.install.crs.config.ClusterConfiguration=                           |     oracle.install.crs.config.ClusterConfiguration=STANDALONE
oracle.install.crs.config.configureAsExtendedCluster=                     |     oracle.install.crs.config.configureAsExtendedCluster=false
oracle.install.crs.config.clusterName=                                    |     oracle.install.crs.config.clusterName=ol7-122-cluster
oracle.install.crs.config.gpnp.configureGNS=                              |     oracle.install.crs.config.gpnp.configureGNS=false
oracle.install.crs.config.autoConfigureClusterNodeVIP=                    |     oracle.install.crs.config.autoConfigureClusterNodeVIP=false
oracle.install.crs.config.gpnp.gnsOption=                                 |     oracle.install.crs.config.gpnp.gnsOption=CREATE_NEW_GNS
oracle.install.crs.config.clusterNodes=                                   |     oracle.install.crs.config.clusterNodes=ol7-122-rac2:,ol7-122-rac1:
oracle.install.crs.configureGIMR=                                         |     oracle.install.crs.configureGIMR=true
oracle.install.asm.configureGIMRDataDG=                                   |     oracle.install.asm.configureGIMRDataDG=false
oracle.install.crs.config.storageOption=                                  |     oracle.install.crs.config.storageOption=FLEX_ASM_STORAGE
oracle.install.crs.config.useIPMI=                                        |     oracle.install.crs.config.useIPMI=false
oracle.install.asm.diskGroup.name=                                        |     oracle.install.asm.diskGroup.name=DATA
oracle.install.asm.diskGroup.AUSize=                                      |     oracle.install.asm.diskGroup.AUSize=0
oracle.install.asm.gimrDG.AUSize=                                         |     oracle.install.asm.gimrDG.AUSize=1
oracle.install.asm.configureAFD=                                          |     oracle.install.asm.configureAFD=false
oracle.install.crs.configureRHPS=                                         |     oracle.install.crs.configureRHPS=false
oracle.install.crs.config.ignoreDownNodes=                                |     oracle.install.crs.config.ignoreDownNodes=false
oracle.install.config.managementOption=                                   |     oracle.install.config.managementOption=NONE
oracle.install.config.omsPort=                                            |     oracle.install.config.omsPort=0
oracle.install.crs.rootconfig.executeRootScript=                          |     oracle.install.crs.rootconfig.executeRootScript=false
[oracle@ol7-122-rac1 response]$

Review log directory:

[oracle@ol7-122-rac1 GridSetupActions2020-03-04_00-24-53AM]$ pwd
/u01/app/oraInventory/logs/GridSetupActions2020-03-04_00-24-53AM

[oracle@ol7-122-rac1 GridSetupActions2020-03-04_00-24-53AM]$ ls -alrt
total 17988
-rw-r-----.  1 oracle oinstall   11578 Mar  4 00:31 installerPatchActions_2020-03-04_00-24-53AM.log
-rw-r-----.  1 oracle oinstall       0 Mar  4 00:31 gridSetupActions2020-03-04_00-24-53AM.err
drwxrwx---.  3 oracle oinstall      21 Mar  4 00:31 temp_ob
-rw-r-----.  1 oracle oinstall       0 Mar  4 00:38 oraInstall2020-03-04_00-24-53AM.err
-rw-r-----.  1 oracle oinstall     157 Mar  4 00:38 oraInstall2020-03-04_00-24-53AM.out
-rw-r-----.  1 oracle oinstall 9728749 Mar  4 00:39 gridSetupActions2020-03-04_00-24-53AM.out
-rw-r-----.  1 oracle oinstall       0 Mar  4 00:44 oraInstall2020-03-04_00-24-53AM.err.ol7-122-rac2
-rw-r-----.  1 oracle oinstall     142 Mar  4 00:44 oraInstall2020-03-04_00-24-53AM.out.ol7-122-rac2
-rw-r-----.  1 oracle oinstall   29328 Mar  4 02:05 time2020-03-04_00-24-53AM.log
-rw-r-----.  1 oracle oinstall 8624226 Mar  4 02:05 gridSetupActions2020-03-04_00-24-53AM.log
drwxrwx---. 12 oracle oinstall    4096 Mar  4 02:18 ..
drwxrwx---.  3 oracle oinstall    4096 Mar  4 03:20 .

Review the .err files: 0 bytes is good

[oracle@ol7-122-rac1 GridSetupActions2020-03-04_00-24-53AM]$ ls -l *.err
-rw-r-----. 1 oracle oinstall 0 Mar  4 00:31 gridSetupActions2020-03-04_00-24-53AM.err
-rw-r-----. 1 oracle oinstall 0 Mar  4 00:38 oraInstall2020-03-04_00-24-53AM.err

Review grid actions: for verification purposes, grep the logs to compare a run where Grid was configured versus one where it was upgraded.

[oracle@ol7-122-rac1 GridSetupActions2020-03-03_01-26-02AM]$ grep -i getInstallOption gridSetupActions*.log
INFO:  [Mar 3, 2020 1:26:05 AM] getInstallOption: CRS_CONFIG
[oracle@ol7-122-rac1 GridSetupActions2020-03-03_01-26-02AM]$

[oracle@ol7-122-rac1 GridSetupActions2020-03-04_00-24-53AM]$ grep -i getInstallOption gridSetupActions*.log
INFO:  [Mar 4, 2020 12:32:07 AM] getInstallOption: UPGRADE
[oracle@ol7-122-rac1 GridSetupActions2020-03-04_00-24-53AM]$

Check for distinct keywords:

[oracle@ol7-122-rac1 GridSetupActions2020-03-04_00-24-53AM]$ grep -e '[[:upper:]]: ' gridSetupActions*.log | cut -d ":" -f1 | sort -u
   ACTION
          APPLICATION_ERROR
   CAUSE
INFO
Output
TaskUsersWithSameID
WARNING

Check APPLICATION_ERROR:

[oracle@ol7-122-rac1 GridSetupActions2020-03-04_00-24-53AM]$ grep -B3 -A1 APPLICATION_ERROR gridSetupActions*.log
INFO:  [Mar 4, 2020 12:35:27 AM] INFO: [Task.perform:873]
TaskCheckRPMPackageManager:RPM Package Manager database[TASKCHECKRPMPACKAGEMANAGER]:TASK_SUMMARY:FAILED:INFORMATION:INFORMATION:Total time taken []
          ERRORMSG(GLOBAL): PRVG-11250 : The check "RPM Package Manager database" was not performed because it needs 'root' user privileges.
          APPLICATION_ERROR: NodeResultsUnavailableException thrown when hasNodeResults() returns true
INFO:  [Mar 4, 2020 12:35:27 AM] INFO: [Task.perform:799]

Did you notice that I used a wildcard for the search?

It does not matter, since the logs for each task will typically be in different directories.

This is one thing I noticed Oracle did correctly, as it's much easier to run the same commands in any environment.
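Since the same commands work anywhere, they can be bundled into a small helper. This `check_grid_log` function is hypothetical (my own, not Oracle-provided); it is just a sketch wrapping the three checks shown above:

```shell
# check_grid_log: run the three log checks against a GridSetupActions* directory.
check_grid_log() {
  dir=$1
  cd "$dir" || return 1
  # What kind of run was this? (e.g. CRS_CONFIG vs UPGRADE)
  grep -i getInstallOption gridSetupActions*.log
  # Distinct message keywords in the log
  grep -e '[[:upper:]]: ' gridSetupActions*.log | cut -d ':' -f1 | sort -u
  # Any APPLICATION_ERROR entries, with context
  grep -B3 -A1 APPLICATION_ERROR gridSetupActions*.log || echo "No APPLICATION_ERROR found"
}
```

Usage: `check_grid_log /u01/app/oraInventory/logs/GridSetupActions2020-03-04_00-24-53AM`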

gridSetup.sh -creategoldimage Failed With Missing LINUX.X64_193000_grid_home.zip

Mon, 2020-03-02 06:38

Start with Grid 12.2, upgrade to Grid 19.3, patch Grid to 19.6, and create Grid 19.6 Gold Image.

1. Copy LINUX.X64_193000_grid_home.zip to GRID_HOME, extract zip, upgrade from Grid 12.2 to 19.3
2. Apply 19.6.0.0.200114 RU (p30501910_190000_Linux-x86-64.zip) to Grid.
3. Create Grid 19.6 Gold Image.

$GRID_HOME/gridSetup.sh -creategoldimage -exclFiles $ORACLE_HOME/log,$ORACLE_HOME/.patch_storage -destinationlocation /u01/app/oracle/goldimage -silent

4. Use LINUX.X64_19600_grid_home.zip from step 3 to successfully upgrade Grid 12.2 on another RAC cluster.
5. Remove LINUX.X64_193000_grid_home.zip from the new RAC cluster.
6. Creating a new Grid 19.6 Gold Image fails with missing LINUX.X64_193000_grid_home.zip.

7. Copy LINUX.X64_193000_grid_home.zip back to the new Grid 19.6 home.
8. Creating a new Grid 19.6 Gold Image succeeds.
9. LINUX.X64_193000_grid_home.zip is 2.7G; the Grid 19.6 Gold Image is 5.9G.

[oracle@ol7-122-rac1 ~]$ $ORACLE_HOME/OPatch/opatch lspatches
30655595;TOMCAT RELEASE UPDATE 19.0.0.0.0 (30655595)
30557433;Database Release Update : 19.6.0.0.200114 (30557433)
30489632;ACFS RELEASE UPDATE 19.6.0.0.0 (30489632)
30489227;OCW RELEASE UPDATE 19.6.0.0.0 (30489227)

OPatch succeeded.
[oracle@ol7-122-rac1 ~]$
[oracle@ol7-122-rac1 ~]$ $ORACLE_HOME/OPatch/opatch lsinventory -details; echo $?
--------------------------------------------------------------------------------

OPatch succeeded.
0
[oracle@ol7-122-rac1 ~]$
[oracle@ol7-122-rac1 ~]$ $ORACLE_HOME/gridSetup.sh -creategoldimage -exclFiles $ORACLE_HOME/log,$ORACLE_HOME/.patch_storage -destinationlocation /u01/app/oracle/goldimage -silent
Launching Oracle Grid Infrastructure Setup Wizard...

[FATAL] [INS-42505] The installer has detected that the Oracle Grid Infrastructure home software at (/u01/app/19.6.0.0/grid) is not complete.
   CAUSE: Following files are missing:
[/u01/app/19.6.0.0/grid/LINUX.X64_193000_grid_home.zip]
   ACTION: Ensure that the Oracle Grid Infrastructure home at (/u01/app/19.6.0.0/grid) includes the files listed above.
[oracle@ol7-122-rac1 ~]$
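
Given this failure mode, a pre-flight check before `-creategoldimage` can save a run. The `check_base_zip` helper below is hypothetical (not Oracle-provided) and assumes the base-release zip name from the FATAL message above:

```shell
# check_base_zip: verify the base-release zip the installer expects is
# still present in the Grid home before attempting -creategoldimage.
check_base_zip() {
  zip="$1/LINUX.X64_193000_grid_home.zip"
  if [ ! -f "$zip" ]; then
    echo "Missing $zip -- restore it before -creategoldimage" >&2
    return 1
  fi
  echo "Base zip present in $1"
}
```

Usage: `check_base_zip "$ORACLE_HOME" && $ORACLE_HOME/gridSetup.sh -creategoldimage ...`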

DEBUG

[oracle@ol7-122-rac1 ~]$ echo $ORACLE_HOME
/u01/app/19.6.0.0/grid

[oracle@ol7-122-rac1 ~]$ ls -l $ORACLE_HOME/*.zip
ls: cannot access /u01/app/19.6.0.0/grid/*.zip: No such file or directory

[oracle@ol7-122-rac1 ~]$ $ORACLE_HOME/gridSetup.sh -creategoldimage -exclFiles $ORACLE_HOME/log,$ORACLE_HOME/.patch_storage \
> -destinationlocation /u01/app/oracle/goldimage -silent -debug

Launching Oracle Grid Infrastructure Setup Wizard...

[main] [ 2020-03-02 11:59:29.626 UTC ] [Version.isPre:757]  version to be checked 19.0.0.0.0 major version to check against 10
[main] [ 2020-03-02 11:59:29.627 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:29.627 UTC ] [Version.isPre:757]  version to be checked 19.0.0.0.0 major version to check against 10
[main] [ 2020-03-02 11:59:29.628 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:29.628 UTC ] [UnixSystem.isCRSConfigured:3549]  configFile=/etc/oracle/ocr.loc
[main] [ 2020-03-02 11:59:29.631 UTC ] [Utils.getPropertyValue:320]  keyName=ocrconfig_loc props.val=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587 propValue=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:29.631 UTC ] [Utils.getPropertyValue:380]  propName=ocrconfig_loc propValue=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:29.632 UTC ] [UnixSystem.isCRSConfigured:3556]  ocrconfig_loc=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:29.632 UTC ] [Utils.getPropertyValue:320]  keyName=ocrconfig_loc props.val=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587 propValue=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:29.633 UTC ] [Utils.getPropertyValue:380]  propName=local_only propValue=false
[main] [ 2020-03-02 11:59:29.633 UTC ] [Version.isPre:757]  version to be checked 19.0.0.0.0 major version to check against 10
[main] [ 2020-03-02 11:59:29.633 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:29.634 UTC ] [Version.isPre:757]  version to be checked 19.0.0.0.0 major version to check against 10
[main] [ 2020-03-02 11:59:29.634 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:29.634 UTC ] [UnixSystem.isCRSConfigured:3549]  configFile=/etc/oracle/ocr.loc
[main] [ 2020-03-02 11:59:29.637 UTC ] [Utils.getPropertyValue:320]  keyName=ocrconfig_loc props.val=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587 propValue=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:29.638 UTC ] [Utils.getPropertyValue:380]  propName=ocrconfig_loc propValue=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:29.638 UTC ] [UnixSystem.isCRSConfigured:3556]  ocrconfig_loc=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:29.638 UTC ] [Utils.getPropertyValue:320]  keyName=ocrconfig_loc props.val=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587 propValue=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:29.639 UTC ] [Utils.getPropertyValue:380]  propName=local_only propValue=false
[main] [ 2020-03-02 11:59:29.640 UTC ] [UnixSystem.getCRSHome:3689]  olrFileName = /etc/oracle/olr.loc
[main] [ 2020-03-02 11:59:29.640 UTC ] [UnixSystem.getCRSHome:3733]  configFile=/etc/oracle/olr.loc
[main] [ 2020-03-02 11:59:29.640 UTC ] [Utils.getPropertyValue:320]  keyName=olrconfig_loc props.val=/u01/app/oracle/crsdata/ol7-122-rac1/olr/ol7-122-rac1_19.olr propValue=/u01/app/oracle/crsdata/ol7-122-rac1/olr/ol7-122-rac1_19.olr
[main] [ 2020-03-02 11:59:29.641 UTC ] [Utils.getPropertyValue:320]  keyName=crs_home props.val=/u01/app/19.6.0.0/grid propValue=/u01/app/19.6.0.0/grid
[main] [ 2020-03-02 11:59:29.641 UTC ] [Utils.getPropertyValue:380]  propName=crs_home propValue=/u01/app/19.6.0.0/grid
[main] [ 2020-03-02 11:59:29.642 UTC ] [UnixSystem.getCRSHome:3741]  crs_home=/u01/app/19.6.0.0/grid
[main] [ 2020-03-02 11:59:29.643 UTC ] [Version.isPre:757]  version to be checked 19.0.0.0.0 major version to check against 10
[main] [ 2020-03-02 11:59:29.644 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:29.644 UTC ] [Version.isPre:757]  version to be checked 19.0.0.0.0 major version to check against 10
[main] [ 2020-03-02 11:59:29.644 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:29.645 UTC ] [UnixSystem.isCRSConfigured:3549]  configFile=/etc/oracle/ocr.loc
[main] [ 2020-03-02 11:59:29.646 UTC ] [Utils.getPropertyValue:320]  keyName=ocrconfig_loc props.val=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587 propValue=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:29.646 UTC ] [Utils.getPropertyValue:380]  propName=ocrconfig_loc propValue=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:29.646 UTC ] [UnixSystem.isCRSConfigured:3556]  ocrconfig_loc=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:29.647 UTC ] [Utils.getPropertyValue:320]  keyName=ocrconfig_loc props.val=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587 propValue=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:29.647 UTC ] [Utils.getPropertyValue:380]  propName=local_only propValue=false
[main] [ 2020-03-02 11:59:29.652 UTC ] [UnixSystem.getHostName:485]  unixcmd=/bin/hostname
[Thread-3] [ 2020-03-02 11:59:29.662 UTC ] [StreamReader.run:62]  In StreamReader.run
[main] [ 2020-03-02 11:59:29.664 UTC ] [RuntimeExec.runCommand:294]  runCommand: Waiting for the process
[main] [ 2020-03-02 11:59:29.664 UTC ] [RuntimeExec.runCommand:296]  runCommand: process returns 0
[Thread-2] [ 2020-03-02 11:59:29.665 UTC ] [StreamReader.run:62]  In StreamReader.run
[Thread-2] [ 2020-03-02 11:59:29.665 UTC ] [StreamReader.run:66]  OUTPUT>ol7-122-rac1.localdomain
[main] [ 2020-03-02 11:59:29.666 UTC ] [RuntimeExec.runCommand:323]  RunTimeExec: error>
[main] [ 2020-03-02 11:59:29.667 UTC ] [RuntimeExec.runCommand:349]  Returning from RunTimeExec.runCommand
[main] [ 2020-03-02 11:59:29.667 UTC ] [ClusterInfo.getHostName:462]  Hostname = ol7-122-rac1
[main] [ 2020-03-02 11:59:29.905 UTC ] [Version.isPre:757]  version to be checked 19.0.0.0.0 major version to check against 11
[main] [ 2020-03-02 11:59:29.905 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:29.905 UTC ] [Version.isPre:789]  version to be checked 19.0.0.0.0 major version to check against 11 minor version to check against 2
[main] [ 2020-03-02 11:59:29.906 UTC ] [Version.isPre:798]  isPre: Returning FALSE for major version check
[main] [ 2020-03-02 11:59:29.906 UTC ] [CRSCTLUtil.checkCRSRunning:538]  Checking if CRS is running for post 11 release
[main] [ 2020-03-02 11:59:29.906 UTC ] [Utils.getLocalHost:487]  Hostname retrieved: ol7-122-rac1.localdomain, returned: ol7-122-rac1
[main] [ 2020-03-02 11:59:29.906 UTC ] [Utils.getNodeName:897]  Hostname : ol7-122-rac1 is converted to nodeName : ol7-122-rac1
[main] [ 2020-03-02 11:59:29.906 UTC ] [CRSCTLUtil.runCrsctlCmd:620]  Fetched Local Node Name: ol7-122-rac1
[main] [ 2020-03-02 11:59:29.907 UTC ] [CmdToolUtil.doexecuteLocally:1467]  OS Name is...Linux
[main] [ 2020-03-02 11:59:29.907 UTC ] [CmdToolUtil.enforceEnglishVars:1662]  Set environment for English language
[main] [ 2020-03-02 11:59:29.907 UTC ] [CmdToolUtil.enforceEnglishVars:1663]  OS Name is...Linux
[Thread-5] [ 2020-03-02 11:59:29.924 UTC ] [StreamReader.run:62]  In StreamReader.run
[main] [ 2020-03-02 11:59:29.936 UTC ] [RuntimeExec.runCommand:294]  runCommand: Waiting for the process
[Thread-4] [ 2020-03-02 11:59:29.939 UTC ] [StreamReader.run:62]  In StreamReader.run
[Thread-4] [ 2020-03-02 11:59:29.999 UTC ] [StreamReader.run:66]  OUTPUT>CRS-4537: Cluster Ready Services is online
[Thread-4] [ 2020-03-02 11:59:29.999 UTC ] [StreamReader.run:66]  OUTPUT>CRS-4529: Cluster Synchronization Services is online
[Thread-4] [ 2020-03-02 11:59:29.999 UTC ] [StreamReader.run:66]  OUTPUT>CRS-4533: Event Manager is online
[main] [ 2020-03-02 11:59:30.010 UTC ] [RuntimeExec.runCommand:296]  runCommand: process returns 0
[main] [ 2020-03-02 11:59:30.011 UTC ] [RuntimeExec.runCommand:323]  RunTimeExec: error>
[main] [ 2020-03-02 11:59:30.011 UTC ] [RuntimeExec.runCommand:349]  Returning from RunTimeExec.runCommand
[main] [ 2020-03-02 11:59:30.012 UTC ] [CmdToolUtil.doexecuteLocally:1477]  retval =  0
[main] [ 2020-03-02 11:59:30.012 UTC ] [CmdToolUtil.doexecuteLocally:1478]  exitval =  0
[main] [ 2020-03-02 11:59:30.012 UTC ] [CmdToolUtil.doexecuteLocally:1479]  rtErrLength =  0
[main] [ 2020-03-02 11:59:30.013 UTC ] [Version.isPre:757]  version to be checked 19.0.0.0.0 major version to check against 10
[main] [ 2020-03-02 11:59:30.013 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:30.014 UTC ] [Version.isPre:757]  version to be checked 19.0.0.0.0 major version to check against 10
[main] [ 2020-03-02 11:59:30.014 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:30.014 UTC ] [UnixSystem.isCRSConfigured:3549]  configFile=/etc/oracle/ocr.loc
[main] [ 2020-03-02 11:59:30.014 UTC ] [Utils.getPropertyValue:320]  keyName=ocrconfig_loc props.val=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587 propValue=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:30.014 UTC ] [Utils.getPropertyValue:380]  propName=ocrconfig_loc propValue=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:30.015 UTC ] [UnixSystem.isCRSConfigured:3556]  ocrconfig_loc=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:30.015 UTC ] [Utils.getPropertyValue:320]  keyName=ocrconfig_loc props.val=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587 propValue=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:30.015 UTC ] [Utils.getPropertyValue:380]  propName=local_only propValue=false
[main] [ 2020-03-02 11:59:30.016 UTC ] [Version.isPre:757]  version to be checked 19.0.0.0.0 major version to check against 10
[main] [ 2020-03-02 11:59:30.016 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:30.016 UTC ] [ClusterInfo.<init>:248]  m_olsnodesPath=/u01/app/19.6.0.0/grid/bin/olsnodes
[Thread-7] [ 2020-03-02 11:59:30.018 UTC ] [StreamReader.run:62]  In StreamReader.run
[main] [ 2020-03-02 11:59:30.019 UTC ] [RuntimeExec.runCommand:294]  runCommand: Waiting for the process
[Thread-6] [ 2020-03-02 11:59:30.019 UTC ] [StreamReader.run:62]  In StreamReader.run
[Thread-6] [ 2020-03-02 11:59:30.144 UTC ] [StreamReader.run:66]  OUTPUT>Oracle Clusterware active version on the cluster is [19.0.0.0.0]
[main] [ 2020-03-02 11:59:30.147 UTC ] [RuntimeExec.runCommand:296]  runCommand: process returns 0
[main] [ 2020-03-02 11:59:30.148 UTC ] [RuntimeExec.runCommand:323]  RunTimeExec: error>
[main] [ 2020-03-02 11:59:30.148 UTC ] [RuntimeExec.runCommand:349]  Returning from RunTimeExec.runCommand
[main] [ 2020-03-02 11:59:30.149 UTC ] [ClusterInfo.getCRSActiveVersionString:1545]  output[0]=Oracle Clusterware active version on the cluster is [19.0.0.0.0]
[main] [ 2020-03-02 11:59:30.149 UTC ] [ClusterInfo.getCRSActiveVersionString:1556]  Active version = 19.0.0.0.0
[main] [ 2020-03-02 11:59:30.149 UTC ] [Version.isPre:757]  version to be checked 19.0.0.0.0 major version to check against 10
[main] [ 2020-03-02 11:59:30.149 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:30.152 UTC ] [Version.isPre:757]  version to be checked 19.0.0.0.0 major version to check against 10
[main] [ 2020-03-02 11:59:30.152 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:30.153 UTC ] [UnixSystem.isCRSConfigured:3549]  configFile=/etc/oracle/ocr.loc
[main] [ 2020-03-02 11:59:30.153 UTC ] [Utils.getPropertyValue:320]  keyName=ocrconfig_loc props.val=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587 propValue=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:30.153 UTC ] [Utils.getPropertyValue:380]  propName=ocrconfig_loc propValue=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:30.153 UTC ] [UnixSystem.isCRSConfigured:3556]  ocrconfig_loc=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:30.156 UTC ] [Utils.getPropertyValue:320]  keyName=ocrconfig_loc props.val=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587 propValue=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:30.156 UTC ] [Utils.getPropertyValue:380]  propName=local_only propValue=false
[main] [ 2020-03-02 11:59:30.156 UTC ] [Version.isPre:757]  version to be checked 19.0.0.0.0 major version to check against 10
[main] [ 2020-03-02 11:59:30.156 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:30.156 UTC ] [Version.isPre:757]  version to be checked 19.0.0.0.0 major version to check against 10
[main] [ 2020-03-02 11:59:30.158 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:30.158 UTC ] [UnixSystem.isCRSConfigured:3549]  configFile=/etc/oracle/ocr.loc
[main] [ 2020-03-02 11:59:30.159 UTC ] [Utils.getPropertyValue:320]  keyName=ocrconfig_loc props.val=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587 propValue=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:30.159 UTC ] [Utils.getPropertyValue:380]  propName=ocrconfig_loc propValue=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:30.159 UTC ] [UnixSystem.isCRSConfigured:3556]  ocrconfig_loc=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:30.159 UTC ] [Utils.getPropertyValue:320]  keyName=ocrconfig_loc props.val=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587 propValue=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:30.159 UTC ] [Utils.getPropertyValue:380]  propName=local_only propValue=false
[main] [ 2020-03-02 11:59:30.161 UTC ] [Version.isPre:757]  version to be checked 19.0.0.0.0 major version to check against 10
[main] [ 2020-03-02 11:59:30.162 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:30.162 UTC ] [ClusterInfo.<init>:248]  m_olsnodesPath=/u01/app/19.6.0.0/grid/bin/olsnodes
[Thread-9] [ 2020-03-02 11:59:30.167 UTC ] [StreamReader.run:62]  In StreamReader.run
[main] [ 2020-03-02 11:59:30.167 UTC ] [RuntimeExec.runCommand:294]  runCommand: Waiting for the process
[Thread-8] [ 2020-03-02 11:59:30.168 UTC ] [StreamReader.run:62]  In StreamReader.run
[Thread-8] [ 2020-03-02 11:59:30.374 UTC ] [StreamReader.run:66]  OUTPUT>Oracle Clusterware active version on the cluster is [19.0.0.0.0]
[main] [ 2020-03-02 11:59:30.377 UTC ] [RuntimeExec.runCommand:296]  runCommand: process returns 0
[main] [ 2020-03-02 11:59:30.378 UTC ] [RuntimeExec.runCommand:323]  RunTimeExec: error>
[main] [ 2020-03-02 11:59:30.379 UTC ] [RuntimeExec.runCommand:349]  Returning from RunTimeExec.runCommand
[main] [ 2020-03-02 11:59:30.379 UTC ] [ClusterInfo.getCRSActiveVersionString:1545]  output[0]=Oracle Clusterware active version on the cluster is [19.0.0.0.0]
[main] [ 2020-03-02 11:59:30.379 UTC ] [ClusterInfo.getCRSActiveVersionString:1556]  Active version = 19.0.0.0.0
[main] [ 2020-03-02 11:59:30.386 UTC ] [UnixSystem.getCRSHome:3689]  olrFileName = /etc/oracle/olr.loc
[main] [ 2020-03-02 11:59:30.387 UTC ] [UnixSystem.getCRSHome:3733]  configFile=/etc/oracle/olr.loc
[main] [ 2020-03-02 11:59:30.387 UTC ] [Utils.getPropertyValue:320]  keyName=olrconfig_loc props.val=/u01/app/oracle/crsdata/ol7-122-rac1/olr/ol7-122-rac1_19.olr propValue=/u01/app/oracle/crsdata/ol7-122-rac1/olr/ol7-122-rac1_19.olr
[main] [ 2020-03-02 11:59:30.387 UTC ] [Utils.getPropertyValue:320]  keyName=crs_home props.val=/u01/app/19.6.0.0/grid propValue=/u01/app/19.6.0.0/grid
[main] [ 2020-03-02 11:59:30.388 UTC ] [Utils.getPropertyValue:380]  propName=crs_home propValue=/u01/app/19.6.0.0/grid
[main] [ 2020-03-02 11:59:30.388 UTC ] [UnixSystem.getCRSHome:3741]  crs_home=/u01/app/19.6.0.0/grid
[main] [ 2020-03-02 11:59:30.388 UTC ] [CmdToolUtil.doexecuteLocally:1467]  OS Name is...Linux
[main] [ 2020-03-02 11:59:30.388 UTC ] [CmdToolUtil.enforceEnglishVars:1662]  Set environment for English language
[main] [ 2020-03-02 11:59:30.388 UTC ] [CmdToolUtil.enforceEnglishVars:1663]  OS Name is...Linux
[Thread-11] [ 2020-03-02 11:59:30.397 UTC ] [StreamReader.run:62]  In StreamReader.run
[Thread-10] [ 2020-03-02 11:59:30.397 UTC ] [StreamReader.run:62]  In StreamReader.run
[main] [ 2020-03-02 11:59:30.397 UTC ] [RuntimeExec.runCommand:294]  runCommand: Waiting for the process
[Thread-10] [ 2020-03-02 11:59:30.493 UTC ] [StreamReader.run:66]  OUTPUT>CRS-4003: Resource 'ora.crsd' is registered.
[main] [ 2020-03-02 11:59:30.499 UTC ] [RuntimeExec.runCommand:296]  runCommand: process returns 0
[main] [ 2020-03-02 11:59:30.499 UTC ] [RuntimeExec.runCommand:323]  RunTimeExec: error>
[main] [ 2020-03-02 11:59:30.499 UTC ] [RuntimeExec.runCommand:349]  Returning from RunTimeExec.runCommand
[main] [ 2020-03-02 11:59:30.502 UTC ] [CmdToolUtil.doexecuteLocally:1477]  retval =  0
[main] [ 2020-03-02 11:59:30.502 UTC ] [CmdToolUtil.doexecuteLocally:1478]  exitval =  0
[main] [ 2020-03-02 11:59:30.502 UTC ] [CmdToolUtil.doexecuteLocally:1479]  rtErrLength =  0
[main] [ 2020-03-02 11:59:30.502 UTC ] [UnixSystem.getCRSHome:3689]  olrFileName = /etc/oracle/olr.loc
[main] [ 2020-03-02 11:59:30.502 UTC ] [UnixSystem.getCRSHome:3733]  configFile=/etc/oracle/olr.loc
[main] [ 2020-03-02 11:59:30.505 UTC ] [Utils.getPropertyValue:320]  keyName=olrconfig_loc props.val=/u01/app/oracle/crsdata/ol7-122-rac1/olr/ol7-122-rac1_19.olr propValue=/u01/app/oracle/crsdata/ol7-122-rac1/olr/ol7-122-rac1_19.olr
[main] [ 2020-03-02 11:59:30.505 UTC ] [Utils.getPropertyValue:320]  keyName=crs_home props.val=/u01/app/19.6.0.0/grid propValue=/u01/app/19.6.0.0/grid
[main] [ 2020-03-02 11:59:30.506 UTC ] [Utils.getPropertyValue:380]  propName=crs_home propValue=/u01/app/19.6.0.0/grid
[main] [ 2020-03-02 11:59:30.506 UTC ] [UnixSystem.getCRSHome:3741]  crs_home=/u01/app/19.6.0.0/grid
[main] [ 2020-03-02 11:59:30.506 UTC ] [CmdToolUtil.doexecuteLocally:1467]  OS Name is...Linux
[main] [ 2020-03-02 11:59:30.506 UTC ] [CmdToolUtil.enforceEnglishVars:1662]  Set environment for English language
[main] [ 2020-03-02 11:59:30.506 UTC ] [CmdToolUtil.enforceEnglishVars:1663]  OS Name is...Linux
[Thread-13] [ 2020-03-02 11:59:30.509 UTC ] [StreamReader.run:62]  In StreamReader.run
[main] [ 2020-03-02 11:59:30.509 UTC ] [RuntimeExec.runCommand:294]  runCommand: Waiting for the process
[Thread-12] [ 2020-03-02 11:59:30.509 UTC ] [StreamReader.run:62]  In StreamReader.run
[Thread-12] [ 2020-03-02 11:59:30.613 UTC ] [StreamReader.run:66]  OUTPUT>CRS-4003: Resource 'ora.crsd' is registered.
[main] [ 2020-03-02 11:59:30.619 UTC ] [RuntimeExec.runCommand:296]  runCommand: process returns 0
[main] [ 2020-03-02 11:59:30.620 UTC ] [RuntimeExec.runCommand:323]  RunTimeExec: error>
[main] [ 2020-03-02 11:59:30.620 UTC ] [RuntimeExec.runCommand:349]  Returning from RunTimeExec.runCommand
[main] [ 2020-03-02 11:59:30.620 UTC ] [CmdToolUtil.doexecuteLocally:1477]  retval =  0
[main] [ 2020-03-02 11:59:30.620 UTC ] [CmdToolUtil.doexecuteLocally:1478]  exitval =  0
[main] [ 2020-03-02 11:59:30.620 UTC ] [CmdToolUtil.doexecuteLocally:1479]  rtErrLength =  0
[main] [ 2020-03-02 11:59:30.621 UTC ] [CmdToolUtil.doexecuteLocally:1467]  OS Name is...Linux
[main] [ 2020-03-02 11:59:30.621 UTC ] [CmdToolUtil.enforceEnglishVars:1662]  Set environment for English language
[main] [ 2020-03-02 11:59:30.622 UTC ] [CmdToolUtil.enforceEnglishVars:1663]  OS Name is...Linux
[Thread-15] [ 2020-03-02 11:59:30.623 UTC ] [StreamReader.run:62]  In StreamReader.run
[main] [ 2020-03-02 11:59:30.624 UTC ] [RuntimeExec.runCommand:294]  runCommand: Waiting for the process
[Thread-14] [ 2020-03-02 11:59:30.624 UTC ] [StreamReader.run:62]  In StreamReader.run
[Thread-14] [ 2020-03-02 11:59:30.778 UTC ] [StreamReader.run:66]  OUTPUT>CRS-6539: The cluster type is 'flex'.
[main] [ 2020-03-02 11:59:30.784 UTC ] [RuntimeExec.runCommand:296]  runCommand: process returns 0
[main] [ 2020-03-02 11:59:30.784 UTC ] [RuntimeExec.runCommand:323]  RunTimeExec: error>
[main] [ 2020-03-02 11:59:30.785 UTC ] [RuntimeExec.runCommand:349]  Returning from RunTimeExec.runCommand
[main] [ 2020-03-02 11:59:30.786 UTC ] [CmdToolUtil.doexecuteLocally:1477]  retval =  0
[main] [ 2020-03-02 11:59:30.786 UTC ] [CmdToolUtil.doexecuteLocally:1478]  exitval =  0
[main] [ 2020-03-02 11:59:30.786 UTC ] [CmdToolUtil.doexecuteLocally:1479]  rtErrLength =  0
[main] [ 2020-03-02 11:59:30.786 UTC ] [CRSCTLUtil.getClustInfo:1710]  cmdOutString   : CRS-6539: The cluster type is 'flex'.
[main] [ 2020-03-02 11:59:30.799 UTC ] [CmdToolUtil.doexecuteLocally:1467]  OS Name is...Linux
[main] [ 2020-03-02 11:59:30.799 UTC ] [CmdToolUtil.enforceEnglishVars:1662]  Set environment for English language
[main] [ 2020-03-02 11:59:30.800 UTC ] [CmdToolUtil.enforceEnglishVars:1663]  OS Name is...Linux
[Thread-17] [ 2020-03-02 11:59:30.806 UTC ] [StreamReader.run:62]  In StreamReader.run
[main] [ 2020-03-02 11:59:30.807 UTC ] [RuntimeExec.runCommand:294]  runCommand: Waiting for the process
[Thread-16] [ 2020-03-02 11:59:30.807 UTC ] [StreamReader.run:62]  In StreamReader.run
[Thread-16] [ 2020-03-02 11:59:30.850 UTC ] [StreamReader.run:66]  OUTPUT>ol7-122-rac1  Hub
[Thread-16] [ 2020-03-02 11:59:30.850 UTC ] [StreamReader.run:66]  OUTPUT>ol7-122-rac2  None
[main] [ 2020-03-02 11:59:30.854 UTC ] [RuntimeExec.runCommand:296]  runCommand: process returns 0
[main] [ 2020-03-02 11:59:30.854 UTC ] [RuntimeExec.runCommand:323]  RunTimeExec: error>
[main] [ 2020-03-02 11:59:30.854 UTC ] [RuntimeExec.runCommand:349]  Returning from RunTimeExec.runCommand
[main] [ 2020-03-02 11:59:30.854 UTC ] [CmdToolUtil.doexecuteLocally:1477]  retval =  0
[main] [ 2020-03-02 11:59:30.854 UTC ] [CmdToolUtil.doexecuteLocally:1478]  exitval =  0
[main] [ 2020-03-02 11:59:30.856 UTC ] [CmdToolUtil.doexecuteLocally:1479]  rtErrLength =  0
[main] [ 2020-03-02 11:59:30.856 UTC ] [OLSNODESUtil.getClusterNodeActiveRoles:545]  result string = ol7-122-rac1       Hub\nol7-122-rac2       None
[main] [ 2020-03-02 11:59:30.856 UTC ] [OLSNODESUtil.getClusterNodeActiveRoles:555]  outputs[0] = ol7-122-rac1  Hub
[main] [ 2020-03-02 11:59:30.856 UTC ] [OLSNODESUtil.parseNodeRole:570]  values length = 2
[main] [ 2020-03-02 11:59:30.857 UTC ] [OLSNODESUtil.parseNodeRole:573]  putting values ol7-122-rac1 Hub in map
[main] [ 2020-03-02 11:59:30.857 UTC ] [OLSNODESUtil.getClusterNodeActiveRoles:555]  outputs[1] = ol7-122-rac2  None
[main] [ 2020-03-02 11:59:30.857 UTC ] [OLSNODESUtil.parseNodeRole:570]  values length = 2
[main] [ 2020-03-02 11:59:30.857 UTC ] [OLSNODESUtil.parseNodeRole:573]  putting values ol7-122-rac2 None in map
[main] [ 2020-03-02 11:59:30.864 UTC ] [Version.isPre:757]  version to be checked 19.0.0.0.0 major version to check against 10
[main] [ 2020-03-02 11:59:30.864 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:30.864 UTC ] [ClusterInfo.<init>:248]  m_olsnodesPath=/u01/app/19.6.0.0/grid/bin/olsnodes
[main] [ 2020-03-02 11:59:30.864 UTC ] [Version.isHigher:954]  Calling isPre to compare software version 19.0.0.0.0against major ver 19 minor ver 0 and patch set ver 0
[main] [ 2020-03-02 11:59:30.864 UTC ] [Version.isPre:757]  version to be checked 19.0.0.0.0 major version to check against 19
[main] [ 2020-03-02 11:59:30.867 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:30.868 UTC ] [Version.isPre:853]  version to be checked 19.0.0.0.0 major version to check against 19 minor version to check against 0 patchset version to check against 0
[main] [ 2020-03-02 11:59:30.868 UTC ] [Version.isPre:885]  Patchset version in isPre is0
[main] [ 2020-03-02 11:59:30.868 UTC ] [Version.isPre:894]  isPre: Returning FALSE for patchset version check
[main] [ 2020-03-02 11:59:30.869 UTC ] [Version$VersionEnum.getEnumMember:228]  majorVer=19
minorVer=0
releaseVer=0
patchsetVer=0
osVer=0

[main] [ 2020-03-02 11:59:30.869 UTC ] [Version$VersionEnum.getEnumMember:264]  Version Match Successful: returning version object 19.0.0.0
[main] [ 2020-03-02 11:59:30.869 UTC ] [ClusterInfo.getVoteDiskLocations:1091]  ENTRY
[main] [ 2020-03-02 11:59:30.869 UTC ] [Version.isPre:757]  version to be checked 19.0.0.0.0 major version to check against 11
[main] [ 2020-03-02 11:59:30.869 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:30.869 UTC ] [Version.isPre:789]  version to be checked 19.0.0.0.0 major version to check against 11 minor version to check against 2
[main] [ 2020-03-02 11:59:30.870 UTC ] [Version.isPre:798]  isPre: Returning FALSE for major version check
[Thread-19] [ 2020-03-02 11:59:30.882 UTC ] [StreamReader.run:62]  In StreamReader.run
[main] [ 2020-03-02 11:59:30.897 UTC ] [RuntimeExec.runCommand:294]  runCommand: Waiting for the process
[Thread-18] [ 2020-03-02 11:59:30.897 UTC ] [StreamReader.run:62]  In StreamReader.run
[Thread-18] [ 2020-03-02 11:59:30.920 UTC ] [StreamReader.run:66]  OUTPUT>##  STATE    File Universal Id                File Name Disk group
[Thread-18] [ 2020-03-02 11:59:30.920 UTC ] [StreamReader.run:66]  OUTPUT>--  -----    -----------------                --------- ---------
[Thread-18] [ 2020-03-02 11:59:30.921 UTC ] [StreamReader.run:66]  OUTPUT> 1. ONLINE   9168fb9dc8764f76bf1c3ec141995e46 (/dev/oracleasm/asm-disk3) [DATA]
[Thread-18] [ 2020-03-02 11:59:30.921 UTC ] [StreamReader.run:66]  OUTPUT>Located 1 voting disk(s).
[main] [ 2020-03-02 11:59:30.923 UTC ] [RuntimeExec.runCommand:296]  runCommand: process returns 0
[main] [ 2020-03-02 11:59:30.924 UTC ] [RuntimeExec.runCommand:323]  RunTimeExec: error>
[main] [ 2020-03-02 11:59:30.924 UTC ] [RuntimeExec.runCommand:349]  Returning from RunTimeExec.runCommand
[main] [ 2020-03-02 11:59:30.924 UTC ] [ClusterInfo.getVoteDiskLocations:1165]  Number of tokens: 9
output[0]=##  STATE    File Universal Id                File Name Disk group
[main] [ 2020-03-02 11:59:30.925 UTC ] [ClusterInfo.getVoteDiskLocations:1165]  Number of tokens: 5
output[1]=--  -----    -----------------                --------- ---------
[main] [ 2020-03-02 11:59:30.925 UTC ] [ClusterInfo.getVoteDiskLocations:1193]  Number Format Exception:
java.lang.NumberFormatException: For input string: "--"
[main] [ 2020-03-02 11:59:30.925 UTC ] [ClusterInfo.getVoteDiskLocations:1165]  Number of tokens: 5
output[2]= 1. ONLINE   9168fb9dc8764f76bf1c3ec141995e46 (/dev/oracleasm/asm-disk3) [DATA]
[main] [ 2020-03-02 11:59:30.925 UTC ] [Version.isPre:757]  version to be checked 19.0.0.0.0 major version to check against 11
[main] [ 2020-03-02 11:59:30.925 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:30.925 UTC ] [Version.isPre:789]  version to be checked 19.0.0.0.0 major version to check against 11 minor version to check against 2
[main] [ 2020-03-02 11:59:30.925 UTC ] [Version.isPre:798]  isPre: Returning FALSE for major version check
[main] [ 2020-03-02 11:59:30.926 UTC ] [Version.isPre:757]  version to be checked 19.0.0.0.0 major version to check against 11
[main] [ 2020-03-02 11:59:30.926 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:30.926 UTC ] [Version.isPre:789]  version to be checked 19.0.0.0.0 major version to check against 11 minor version to check against 2
[main] [ 2020-03-02 11:59:30.926 UTC ] [Version.isPre:798]  isPre: Returning FALSE for major version check
[main] [ 2020-03-02 11:59:30.926 UTC ] [Version.isPre:757]  version to be checked 19.0.0.0.0 major version to check against 11
[main] [ 2020-03-02 11:59:30.927 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:30.927 UTC ] [Version.isPre:789]  version to be checked 19.0.0.0.0 major version to check against 11 minor version to check against 2
[main] [ 2020-03-02 11:59:30.927 UTC ] [Version.isPre:798]  isPre: Returning FALSE for major version check
[main] [ 2020-03-02 11:59:30.927 UTC ] [Version.isPre:757]  version to be checked 19.0.0.0.0 major version to check against 11
[main] [ 2020-03-02 11:59:30.927 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:30.927 UTC ] [Version.isPre:789]  version to be checked 19.0.0.0.0 major version to check against 11 minor version to check against 2
[main] [ 2020-03-02 11:59:30.927 UTC ] [Version.isPre:798]  isPre: Returning FALSE for major version check
[main] [ 2020-03-02 11:59:30.927 UTC ] [Version.isPre:757]  version to be checked 19.0.0.0.0 major version to check against 11
[main] [ 2020-03-02 11:59:30.927 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:30.928 UTC ] [Version.isPre:789]  version to be checked 19.0.0.0.0 major version to check against 11 minor version to check against 2
[main] [ 2020-03-02 11:59:30.928 UTC ] [Version.isPre:798]  isPre: Returning FALSE for major version check
[main] [ 2020-03-02 11:59:30.928 UTC ] [ClusterInfo.getVoteDiskLocations:1243]  |Number|Size|Status|VDIN|Path|Group|isASMPath|
|1|0|2|9168fb9dc8764f76bf1c3ec141995e46|/dev/oracleasm/asm-disk3|DATA|1|
[main] [ 2020-03-02 11:59:30.929 UTC ] [Version.isHigher:954]  Calling isPre to compare software version 19.0.0.0.0against major ver 10 minor ver 2 and patch set ver 0
[main] [ 2020-03-02 11:59:30.929 UTC ] [Version.isPre:757]  version to be checked 19.0.0.0.0 major version to check against 10
[main] [ 2020-03-02 11:59:30.929 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:30.929 UTC ] [Version.isPre:853]  version to be checked 19.0.0.0.0 major version to check against 10 minor version to check against 2 patchset version to check against 0
[main] [ 2020-03-02 11:59:30.929 UTC ] [Version.isPre:864]  isPre: Returning FALSE for major version check
[main] [ 2020-03-02 11:59:30.929 UTC ] [Version$VersionEnum.getEnumMember:228]  majorVer=10
minorVer=2
releaseVer=0
patchsetVer=0
osVer=0

[main] [ 2020-03-02 11:59:30.930 UTC ] [Version$VersionEnum.getEnumMember:264]  Version Match Successful: returning version object 10.2.0.0
[main] [ 2020-03-02 11:59:30.930 UTC ] [Version.isPre:757]  version to be checked 10.2.0.0.0 major version to check against 11
[main] [ 2020-03-02 11:59:30.931 UTC ] [Version.isPre:763]  isPre.java: Returning TRUE
[main] [ 2020-03-02 11:59:30.931 UTC ] [Version.isPre:757]  version to be checked 10.2.0.0.0 major version to check against 10
[main] [ 2020-03-02 11:59:30.931 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:30.931 UTC ] [Version.isPre:757]  version to be checked 10.2.0.0.0 major version to check against 10
[main] [ 2020-03-02 11:59:30.931 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:30.932 UTC ] [Utils.getPropertyValue:320]  keyName=ocrconfig_loc props.val=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587 propValue=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:30.933 UTC ] [Utils.getPropertyValue:380]  propName=ocrconfig_loc propValue=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:30.933 UTC ] [Version.isPre:757]  version to be checked 10.2.0.0.0 major version to check against 10
[main] [ 2020-03-02 11:59:30.933 UTC ] [Version.isPre:768]  isPre.java: Returning FALSE
[main] [ 2020-03-02 11:59:30.933 UTC ] [Utils.getPropertyValue:320]  keyName=ocrconfig_loc props.val=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587 propValue=+DATA/ol7-122-cluster/OCRFILE/registry.255.1033916587
[main] [ 2020-03-02 11:59:30.933 UTC ] [Utils.getPropertyValue:380]  propName=ocrmirrorconfig_loc propValue=null
[main] [ 2020-03-02 11:59:31.063 UTC ] [ClusterVerification.getInstance:642]  Method Entry. path=/u01/app/19.6.0.0/grid type=FRAMEWORK_HOME
[main] [ 2020-03-02 11:59:31.067 UTC ] [CVUVariables.initialize:1848]  Start parse all variables from variables.xml...
[main] [ 2020-03-02 11:59:31.069 UTC ] [VerificationUtil.getVariablesXmlURI:10964]  ====  XML variables file: file:/u01/app/19.6.0.0/grid/cv/cvdata/variables.xml
[main] [ 2020-03-02 11:59:31.069 UTC ] [VerificationUtil.getVariablesXmlSchemaURI:10946]  ==== XML variables schema file: file:/u01/app/19.6.0.0/grid/cv/cvdata/variables.xsd
[main] [ 2020-03-02 11:59:31.069 UTC ] [CVUVariables.getRootElement:2118]  ==== URIs obtained :xsd URI = file:/u01/app/19.6.0.0/grid/cv/cvdata/variables.xsd
[main] [ 2020-03-02 11:59:31.070 UTC ] [CVUVariables.getRootElement:2119]  ==== URIs obtained :xml URI = file:/u01/app/19.6.0.0/grid/cv/cvdata/variables.xml
[main] [ 2020-03-02 11:59:31.070 UTC ] [CVUVariables.getRootElement:2132]  xsdFile exists : file:/u01/app/19.6.0.0/grid/cv/cvdata/variables.xsd
[main] [ 2020-03-02 11:59:31.070 UTC ] [CVUVariables.getRootElement:2147]  xmlFile exists : file:/u01/app/19.6.0.0/grid/cv/cvdata/variables.xml
[main] [ 2020-03-02 11:59:31.070 UTC ] [CVUVariables.getRootElement:2160]  setting xmlFactory to use xsdFile : file:/u01/app/19.6.0.0/grid/cv/cvdata/variables.xsd
[main] [ 2020-03-02 11:59:31.128 UTC ] [CVUVariables.getRootElement:2191]  The xml variables file: file:/u01/app/19.6.0.0/grid/cv/cvdata/variables.xml, was parsed correctly
[main] [ 2020-03-02 11:59:31.128 UTC ] [CVUVariables.parse:1913]  Version found ALL
[main] [ 2020-03-02 11:59:31.128 UTC ] [CVUVariables.parse:1921]  Process common variables
[main] [ 2020-03-02 11:59:31.131 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "s_silent,SILENT"
[main] [ 2020-03-02 11:59:31.131 UTC ] [CVUVariables.parse:1913]  Version found CURRENT_RELEASE
[main] [ 2020-03-02 11:59:31.131 UTC ] [CVUVariables.parse:1926]  Process variables for the release: 19.0
[main] [ 2020-03-02 11:59:31.132 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "OwnerId,INSTALL_USER,BACKUP_USER,ORACLE_OWNER,LISTENER_USERNAME"
[main] [ 2020-03-02 11:59:31.132 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "s_usergroup,oracle.install.asm.OSDBA,DBA_GROUP,ORA_DBA_GROUP"
[main] [ 2020-03-02 11:59:31.132 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "s_racdbagroup,oracle.install.OSRACDBA,RACDBA_GROUP,ORA_RACDBA_GROUP"
[main] [ 2020-03-02 11:59:31.133 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_OSASM,oracle.install.asm.OSASM,ASM_GROUP,ORA_ASM_GROUP"
[main] [ 2020-03-02 11:59:31.134 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.asm.OSOPER,OPER_GROUP"
[main] [ 2020-03-02 11:59:31.134 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "s_languageid,LANGUAGE_ID"
[main] [ 2020-03-02 11:59:31.135 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_crs_Timezone,TZ"
[main] [ 2020-03-02 11:59:31.136 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_crs_isRolling,ISROLLING,ISROLLING"
[main] [ 2020-03-02 11:59:31.136 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_asm_UseExistingDG,oracle.install.asm.useExistingDiskGroup,ASM_DISKGROUP_REUSE_OPTION,REUSEDG"
[main] [ 2020-03-02 11:59:31.136 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_asm_DisGroupAUSize,oracle.install.asm.diskGroup.AUSize,ASM_AU_SIZE,CDATA_AUSIZE"
[main] [ 2020-03-02 11:59:31.137 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_cres_prereq_Ignored,ORA_IGNORE_CVU_ERRORS,USER_IGNORED_PREREQ"
[main] [ 2020-03-02 11:59:31.137 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "ORACLE_HOSTNAME,ORACLE_HOSTNAME,INSTALL_NODE,INSTALL_NODE"
[main] [ 2020-03-02 11:59:31.138 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_crs_ConfigureMgmtDB,MGMT_DB"
[main] [ 2020-03-02 11:59:31.139 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.config.managementOption"
[main] [ 2020-03-02 11:59:31.139 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_IsBigCluster,BIGCLUSTER,BIG_CLUSTER"
[main] [ 2020-03-02 11:59:31.140 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_crs_TargetHubSize,TARGET_HUB_SIZE,HUB_SIZE"
[main] [ 2020-03-02 11:59:31.141 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_crs_HUBNodesList,HUB_LIST,HUB_NODE_LIST"
[main] [ 2020-03-02 11:59:31.141 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_crs_RIMNodesList,RIM_LIST,RIM_NODE_LIST"
[main] [ 2020-03-02 11:59:31.142 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_crs_HUBVipList,HUB_NODE_VIPS"
[main] [ 2020-03-02 11:59:31.143 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_crs_Ping_Targets,PING_TARGETS,PING_TARGETS"
[main] [ 2020-03-02 11:59:31.143 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "ORACLE_HOME,ORACLE_HOME,ORACLE_HOME,ORACLE_HOME,GPNPCONFIGDIR,GPNPGCONFIGDIR,CRFHOME"
[main] [ 2020-03-02 11:59:31.143 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "ORACLE_BASE,ORACLE_BASE,ORACLE_BASE,ORACLE_BASE"
[main] [ 2020-03-02 11:59:31.144 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_crs_configuredCRSHome,CONFIGURED_CRS_HOME,OLD_CRS_HOME"
[main] [ 2020-03-02 11:59:31.145 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "s_jreLocation,JREDIR"
[main] [ 2020-03-02 11:59:31.149 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "s_jlibDir,JLIBDIR"
[main] [ 2020-03-02 11:59:31.150 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "s_cluster,VNDR_CLUSTER"
[main] [ 2020-03-02 11:59:31.150 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_crs_OCRDeviceList,oracle.install.crs.config.sharedFileSystemStorage.ocrLocations,OCR_LOCATIONS,OCR_LOCATIONS"
[main] [ 2020-03-02 11:59:31.150 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_crs_ClusterName,oracle.install.crs.config.clusterName,CLUSTER_NAME,CLUSTER_NAME"
[main] [ 2020-03-02 11:59:31.152 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.crs.config.clusterNodes,RSP_CLUSTER_NODES"
[main] [ 2020-03-02 11:59:31.152 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_crs_CommaSeparatedNodes,NODE_LIST,NODE_NAME_LIST,HOST_NAME_LIST,NODELIST"
[main] [ 2020-03-02 11:59:31.154 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "s_privatenamelist,PRIVATE_NAME_LIST"
[main] [ 2020-03-02 11:59:31.155 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "QUORUM_DISKS,QUORUM_DISKS,QUORUM_DISKS"
[main] [ 2020-03-02 11:59:31.155 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_crs_VotingDiskList,oracle.install.crs.config.sharedFileSystemStorage.VotingDiskLocations,VOTING_LOCATIONS,VOTING_DISKS"
[main] [ 2020-03-02 11:59:31.155 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_asm_DiskGroupName,oracle.install.asm.diskGroup.name,ASM_DISKGROUP,CDATA_DISK_GROUP"
[main] [ 2020-03-02 11:59:31.156 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_asm_DiskDiscoveryString,oracle.install.asm.diskGroup.diskDiscoveryString,ASM_DISK_DISCOVERY_STRING,ASM_DISCOVERY_STRING"
[main] [ 2020-03-02 11:59:31.156 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_asm_Disks,oracle.install.asm.diskGroup.disks,ASM_DISKGROUP_DISKS,CDATA_DISKS"
[main] [ 2020-03-02 11:59:31.157 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_asm_backupDG_Disks,oracle.install.asm.gimrDG.disks,ASM_GIMR_DISKGROUP_DISKS"
[main] [ 2020-03-02 11:59:31.157 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_asm_DiskGroupRedundancy,oracle.install.asm.diskGroup.redundancy,CDATA_REDUNDANCY"
[main] [ 2020-03-02 11:59:31.159 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_asm_FailureGroups,oracle.install.asm.diskGroup.FailureGroups,ASM_DISKGROUP_FAILUREGROUPS,CDATA_FAILURE_GROUPS"
[main] [ 2020-03-02 11:59:31.160 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.asm.diskGroup.disksWithFailureGroupNames,ASM_DISKGROUP_DISKSWITHFAILUREGROUPNAMES"
[main] [ 2020-03-02 11:59:31.161 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.asm.diskGroup.quorumFailureGroupNames,ASM_QUORUM_FG_NAMES"
[main] [ 2020-03-02 11:59:31.163 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_asm_ConfigurationType,ASM_CONFIG"
[main] [ 2020-03-02 11:59:31.164 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_asm_ClientDataFile,oracle.install.asm.ClientDataFile,ASM_CLIENTDATA,ASM_CREDENTIALS"
[main] [ 2020-03-02 11:59:31.165 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_crs_StorageOption,CRS_STORAGE_OPTION"
[main] [ 2020-03-02 11:59:31.166 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.crs.config.storageOption,RSP_STORAGE_OPT"
[main] [ 2020-03-02 11:59:31.168 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "CSS_LEASEDURATION"
[main] [ 2020-03-02 11:59:31.170 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "HOST_VIP_NAMES"
[main] [ 2020-03-02 11:59:31.171 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_crs_NodeVips,CFG_CRS_NODEVIPS,CRS_NODEVIPS,NEW_NODEVIPS"
[main] [ 2020-03-02 11:59:31.172 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_crs_FinalInterfaceList,CFG_NETWORKS,NETWORKS"
[main] [ 2020-03-02 11:59:31.173 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.crs.config.networkInterfaceList,RSP_NETWORK_IFLIST"
[main] [ 2020-03-02 11:59:31.173 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_crs_SCANName,oracle.install.crs.config.gpnp.scanName,SCAN_NAME,SCAN_NAME"
[main] [ 2020-03-02 11:59:31.173 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_crs_SCANPortNumber,oracle.install.crs.config.gpnp.scanPort,SCAN_PORT,SCAN_PORT"
[main] [ 2020-03-02 11:59:31.174 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "s_paLocation,GPNP_PA"
[main] [ 2020-03-02 11:59:31.175 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_crs_ConfigureGNS,oracle.install.crs.config.gpnp.configureGNS,CONFIGURE_GNS,GNS_CONF"
[main] [ 2020-03-02 11:59:31.175 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_crs_GNSType,GNS_TYPE"
[main] [ 2020-03-02 11:59:31.176 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.crs.config.gpnp.gnsOption,RSP_GNS_OPT"
[main] [ 2020-03-02 11:59:31.177 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_crs_GNSVIPAddress,oracle.install.crs.config.gpnp.gnsVIPAddress,GNS_VIP_ADDRESS,GNS_ADDR_LIST"
[main] [ 2020-03-02 11:59:31.177 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_crs_GNSSubDomain,oracle.install.crs.config.gpnp.gnsSubDomain,GNS_SUB_DOMAIN,GNS_DOMAIN_LIST"
[main] [ 2020-03-02 11:59:31.178 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "sl_allowDomainList,GNS_ALLOW_NET_LIST"
[main] [ 2020-03-02 11:59:31.178 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "sl_denyDomainList,GNS_DENY_NET_LIST"
[main] [ 2020-03-02 11:59:31.181 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "sl_denyInterfaceList,GNS_DENY_ITF_LIST"
[main] [ 2020-03-02 11:59:31.181 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_crs_GNSClientDataFile,oracle.install.crs.config.gpnp.gnsClientDataFile,GNS_CLIENT_DATA_FILE,GNS_CREDENTIALS"
[main] [ 2020-03-02 11:59:31.182 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "slHostNames,NEW_HOST_NAME_LIST"
[main] [ 2020-03-02 11:59:31.183 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "slNewNodes,NEW_NODE_NAME_LIST"
[main] [ 2020-03-02 11:59:31.185 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "slPrivateNodes,NEW_PRIVATE_NAME_LIST"
[main] [ 2020-03-02 11:59:31.186 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "OCRLOC"
[main] [ 2020-03-02 11:59:31.187 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "OLRLOC"
[main] [ 2020-03-02 11:59:31.190 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "OCRID"
[main] [ 2020-03-02 11:59:31.191 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "CLUSTER_GUID"
[main] [ 2020-03-02 11:59:31.192 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "CLSCFG_MISSCOUNT"
[main] [ 2020-03-02 11:59:31.193 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.responseFileVersion"
[main] [ 2020-03-02 11:59:31.194 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "INVENTORY_LOCATION"
[main] [ 2020-03-02 11:59:31.195 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "SELECTED_LANGUAGES"
[main] [ 2020-03-02 11:59:31.197 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.option,INSTALL_OPTION"
[main] [ 2020-03-02 11:59:31.198 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.asm.SYSASMPassword"
[main] [ 2020-03-02 11:59:31.199 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.asm.monitorPassword"
[main] [ 2020-03-02 11:59:31.200 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.crs.config.ClusterConfiguration,RSP_CLUSTERTYPE_OPT"
[main] [ 2020-03-02 11:59:31.201 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.crs.config.configureAsExtendedCluster,EXTENDED_CLUSTER,EXTENDED_CLUSTER"
[main] [ 2020-03-02 11:59:31.202 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "CLUSTER_TYPE"
[main] [ 2020-03-02 11:59:31.202 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.crs.config.autoConfigureClusterNodeVIP,AUTO_CONFIGURE_CLUSTER_NODE_VIP"
[main] [ 2020-03-02 11:59:31.204 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.crs.config.sharedFileSystemStorage.votingDiskRedundancy"
[main] [ 2020-03-02 11:59:31.205 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.crs.config.sharedFileSystemStorage.ocrRedundancy"
[main] [ 2020-03-02 11:59:31.206 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.crs.config.useIPMI,USE_IPMI"
[main] [ 2020-03-02 11:59:31.207 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.crs.config.ipmi.bmcUsername"
[main] [ 2020-03-02 11:59:31.208 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.crs.config.ipmi.bmcPassword"
[main] [ 2020-03-02 11:59:31.210 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.crs.config.ignoreDownNodes"
[main] [ 2020-03-02 11:59:31.211 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.installer.autoupdates.option"
[main] [ 2020-03-02 11:59:31.212 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.installer.autoupdates.downloadUpdatesLoc"
[main] [ 2020-03-02 11:59:31.213 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "AUTOUPDATES_MYORACLESUPPORT_USERNAME"
[main] [ 2020-03-02 11:59:31.215 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "AUTOUPDATES_MYORACLESUPPORT_PASSWORD"
[main] [ 2020-03-02 11:59:31.216 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "PROXY_HOST"
[main] [ 2020-03-02 11:59:31.217 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "PROXY_PORT"
[main] [ 2020-03-02 11:59:31.218 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "PROXY_USER"
[main] [ 2020-03-02 11:59:31.219 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "PROXY_PWD"
[main] [ 2020-03-02 11:59:31.220 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "PROXY_REALM"
[main] [ 2020-03-02 11:59:31.222 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.config.omsHost"
[main] [ 2020-03-02 11:59:31.223 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.config.omsPort"
[main] [ 2020-03-02 11:59:31.224 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.config.emAdminUser"
[main] [ 2020-03-02 11:59:31.225 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.config.emAdminPassword"
[main] [ 2020-03-02 11:59:31.226 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle_install_crs_app_VIPAddress,oracle.install.crs.app.applicationAddress,APPLICATION_VIP"
[main] [ 2020-03-02 11:59:31.227 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "POST_SW_INSTALL_API_MODE,POST_SW_INSTALL_API_MODE"
[main] [ 2020-03-02 11:59:31.228 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.crs.configureAFD,CONFIGURE_AFD"
[main] [ 2020-03-02 11:59:31.228 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "IS_BUILTIN_ACCOUNT,oracle.install.IsBuiltInAccount,IS_BUILTIN_ACCOUNT"
[main] [ 2020-03-02 11:59:31.229 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "IS_VIRTUAL_ACCOUNT,oracle_install_IsVirtualAccount,IS_VIRTUAL_ACCOUNT"
[main] [ 2020-03-02 11:59:31.231 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "ORACLE_SERVICE_USER_NAME,oracle.install.OracleHomeUserName,WINSEC_SERVICE_USERNAME"
[main] [ 2020-03-02 11:59:31.231 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "ORACLE_SERVICE_USER_PASSWORD,oracle.install.OracleHomeUserPassword,WINSEC_SERVICE_PASSWORD"
[main] [ 2020-03-02 11:59:31.232 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "oracle.install.crs.config.memberClusterManifestFile,CLUSTER_MANIFEST_FILE"
[main] [ 2020-03-02 11:59:31.232 UTC ] [CVUVariables.setReleaseForVariablesXml:296]  Initialize CVUVariables for release : 19.0
[main] [ 2020-03-02 11:59:31.233 UTC ] [CVUVariables.setValue:368]  CVUVarConstant : MODE_API
[main] [ 2020-03-02 11:59:31.233 UTC ] [CVUVariables.getCVUVariable:624]  The variable name : <MODE_API> is not listed
[main] [ 2020-03-02 11:59:31.233 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "MODE_API"
[main] [ 2020-03-02 11:59:31.233 UTC ] [CVUVariables.secureVariableValueTrace:779]  setting CVUVariableConstant : VAR = MODE_API VAL = TRUE
[main] [ 2020-03-02 11:59:31.233 UTC ] [CVUVariables.setValue:368]  CVUVarConstant : MODE_CLI
[main] [ 2020-03-02 11:59:31.234 UTC ] [CVUVariables.getCVUVariable:624]  The variable name : <MODE_CLI> is not listed
[main] [ 2020-03-02 11:59:31.235 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "MODE_CLI"
[main] [ 2020-03-02 11:59:31.236 UTC ] [CVUVariables.secureVariableValueTrace:779]  setting CVUVariableConstant : VAR = MODE_CLI VAL = FALSE
[main] [ 2020-03-02 11:59:31.236 UTC ] [ParamManager.<init>:394]  m_paramInstantiated set to TRUE
[main] [ 2020-03-02 11:59:31.236 UTC ] [VerificationUtil.getLocalHost:1767]  Hostname retrieved: ol7-122-rac1.localdomain, returned: ol7-122-rac1
[main] [ 2020-03-02 11:59:31.239 UTC ] [CVUVariables.setValue:368]  CVUVarConstant : LOCAL_NODE_NAME
[main] [ 2020-03-02 11:59:31.239 UTC ] [CVUVariables.getCVUVariable:624]  The variable name : <LOCAL_NODE_NAME> is not listed
[main] [ 2020-03-02 11:59:31.241 UTC ] [CVUVariableData.<init>:102]  CVUVariableData created with names:  "LOCAL_NODE_NAME"
[main] [ 2020-03-02 11:59:31.241 UTC ] [CVUVariables.secureVariableValueTrace:779]  setting CVUVariableConstant : VAR = LOCAL_NODE_NAME VAL = ol7-122-rac1
[main] [ 2020-03-02 11:59:31.242 UTC ] [Library.load:205]  library.load
[main] [ 2020-03-02 11:59:31.244 UTC ] [sPlatform.isHybrid:66]  osName=Linux osArch=amd64 JVM=64 rc=false
[main] [ 2020-03-02 11:59:31.246 UTC ] [Library.load:271]  Property oracle.installer.library_loc is set to value=/u01/app/19.6.0.0/grid/oui/lib/linux64
[main] [ 2020-03-02 11:59:31.246 UTC ] [Library.load:273]  Loading  library /u01/app/19.6.0.0/grid/oui/lib/linux64/libsrvm19.so
[main] [ 2020-03-02 11:59:31.251 UTC ] [VerificationUtil.getEnv:8845]  ==== getEnv reports: CVU_TEST_ENV=null
[main] [ 2020-03-02 11:59:31.251 UTC ] [ClusterVerification.<init>:493]  Setting property CV_HOME /u01/app/19.6.0.0/grid
[main] [ 2020-03-02 11:59:31.252 UTC ] [VerificationUtil.getDestLoc:4812]  ==== CV_DESTLOC(pre-fetched value): '/tmp/GridSetupActions2020-03-02_11-59-28AM/'
[main] [ 2020-03-02 11:59:31.252 UTC ] [VerificationUtil.getExecutionEnvironment:9338]  RDBMS Version is -->19.0.0.0.0
[main] [ 2020-03-02 11:59:31.255 UTC ] [ConfigUtil.importConfig:97]  ==== CVU config file: /u01/app/19.6.0.0/grid/cv/admin/cvu_config
[main] [ 2020-03-02 11:59:31.256 UTC ] [ConfigUtil.importConfig:114]  ==== Picked up config variable: cv_raw_check_enabled : TRUE
[main] [ 2020-03-02 11:59:31.256 UTC ] [ConfigUtil.importConfig:114]  ==== Picked up config variable: cv_sudo_binary_location : /usr/local/bin/sudo
[main] [ 2020-03-02 11:59:31.256 UTC ] [ConfigUtil.importConfig:114]  ==== Picked up config variable: cv_pbrun_binary_location : /usr/local/bin/pbrun
[main] [ 2020-03-02 11:59:31.256 UTC ] [ConfigUtil.importConfig:114]  ==== Picked up config variable: cv_assume_cl_version : 19.1.0.0.0
[main] [ 2020-03-02 11:59:31.257 UTC ] [ConfigUtil.isDefined:200]  ==== Is ORACLE_SRVM_REMOTESHELL defined? : false
[main] [ 2020-03-02 11:59:31.257 UTC ] [VerificationUtil.getEnv:8845]  ==== getEnv reports: ORACLE_SRVM_REMOTESHELL=null
[main] [ 2020-03-02 11:59:31.257 UTC ] [ConfigUtil.getConfiguredValue:182]  ==== Fallback to env var 'ORACLE_SRVM_REMOTESHELL'=null
[main] [ 2020-03-02 11:59:31.257 UTC ] [ConfigUtil.isDefined:200]  ==== Is ORACLE_SRVM_REMOTECOPY defined? : false
[main] [ 2020-03-02 11:59:31.257 UTC ] [VerificationUtil.getEnv:8845]  ==== getEnv reports: ORACLE_SRVM_REMOTECOPY=null
[main] [ 2020-03-02 11:59:31.257 UTC ] [ConfigUtil.getConfiguredValue:182]  ==== Fallback to env var 'ORACLE_SRVM_REMOTECOPY'=null
[main] [ 2020-03-02 11:59:31.257 UTC ] [CVUVariables.getCVUVariable:624]  The variable name : <SSH_ONLY> is not listed
[main] [ 2020-03-02 11:59:31.260 UTC ] [sVerificationUtil.resolve:1229]  condition variable SSH_ONLY not handled
[main] [ 2020-03-02 11:59:31.260 UTC ] [CVUVariables.secureVariableValueTrace:779]  getting CVUVariableConstant : VAR = SSH_ONLY VAL = null
[main] [ 2020-03-02 11:59:31.262 UTC ] [UnixSystem.getHostName:485]  unixcmd=/bin/hostname
[Thread-21] [ 2020-03-02 11:59:31.263 UTC ] [StreamReader.run:62]  In StreamReader.run
[main] [ 2020-03-02 11:59:31.266 UTC ] [RuntimeExec.runCommand:294]  runCommand: Waiting for the process
[main] [ 2020-03-02 11:59:31.266 UTC ] [RuntimeExec.runCommand:296]  runCommand: process returns 0
[Thread-20] [ 2020-03-02 11:59:31.266 UTC ] [StreamReader.run:62]  In StreamReader.run
[Thread-20] [ 2020-03-02 11:59:31.266 UTC ] [StreamReader.run:66]  OUTPUT>ol7-122-rac1.localdomain
[main] [ 2020-03-02 11:59:31.266 UTC ] [RuntimeExec.runCommand:323]  RunTimeExec: error>
[main] [ 2020-03-02 11:59:31.266 UTC ] [RuntimeExec.runCommand:349]  Returning from RunTimeExec.runCommand
[main] [ 2020-03-02 11:59:31.267 UTC ] [ClusterInfo.getHostName:462]  Hostname = ol7-122-rac1
[main] [ 2020-03-02 11:59:32.952 UTC ] [CmdToolUtil.enforceEnglishVars:1662]  Set environment for English language
[main] [ 2020-03-02 11:59:32.952 UTC ] [CmdToolUtil.enforceEnglishVars:1663]  OS Name is...Linux
[main] [ 2020-03-02 11:59:32.952 UTC ] [NativeSystem.isCmdScv:601]  isCmdScv: cmd=[]
[main] [ 2020-03-02 11:59:32.953 UTC ] [UnixSystem.dorunRemoteExecCmd:4279]  Final unix SSH command: /bin/rpm -q --qf %{version} sles-release
[Thread-24] [ 2020-03-02 11:59:32.956 UTC ] [StreamReader.run:62]  In StreamReader.run
[main] [ 2020-03-02 11:59:32.958 UTC ] [RuntimeExec.runCommand:294]  runCommand: Waiting for the process
[Thread-23] [ 2020-03-02 11:59:32.959 UTC ] [StreamReader.run:62]  In StreamReader.run
[Thread-23] [ 2020-03-02 11:59:32.991 UTC ] [StreamReader.run:66]  OUTPUT>package sles-release is not installed
[main] [ 2020-03-02 11:59:32.993 UTC ] [RuntimeExec.runCommand:296]  runCommand: process returns 1
[main] [ 2020-03-02 11:59:32.993 UTC ] [RuntimeExec.runCommand:323]  RunTimeExec: error>
[main] [ 2020-03-02 11:59:32.993 UTC ] [RuntimeExec.traceCmdEnv:516]  Calling Runtime.exec() with the command
[main] [ 2020-03-02 11:59:32.993 UTC ] [RuntimeExec.traceCmdEnv:518]  /bin/sh
[main] [ 2020-03-02 11:59:32.994 UTC ] [RuntimeExec.traceCmdEnv:518]  -c
[main] [ 2020-03-02 11:59:32.994 UTC ] [RuntimeExec.traceCmdEnv:518]  /bin/rpm -q --qf %{version} sles-release
[main] [ 2020-03-02 11:59:32.994 UTC ] [RuntimeExec.traceCmdEnv:521]  runCommand: env =
[main] [ 2020-03-02 11:59:32.994 UTC ] [RuntimeExec.traceCmdEnv:524]   0:LANG=en_US.UTF-8
[main] [ 2020-03-02 11:59:32.994 UTC ] [RuntimeExec.traceCmdEnv:524]   1:LC_ALL=en_US.UTF-8
[main] [ 2020-03-02 11:59:32.994 UTC ] [RuntimeExec.traceCmdEnv:524]   2:NLS_LANG=American_America.UTF8
[main] [ 2020-03-02 11:59:32.994 UTC ] [RuntimeExec.runCommand:349]  Returning from RunTimeExec.runCommand
[main] [ 2020-03-02 11:59:32.994 UTC ] [UnixSystem.dorunRemoteExecCmd:4291]  retval = 1
[main] [ 2020-03-02 11:59:32.995 UTC ] [UnixSystem.dorunRemoteExecCmd:4296]  exitValue = 1
[main] [ 2020-03-02 11:59:32.995 UTC ] [CmdToolUtil.doexecute:1131]  nativeSystem.runRemoteExecCmd failed. Command = /bin/rpm arguments = [-q, --qf, %{version}, sles-release] env = null error = null output = [package sles-release is not installed]
[main] [ 2020-03-02 11:59:32.999 UTC ] [CmdToolUtil.getSLESRelease:1939]  Getting SLES Version failed with exception:PRCT-1011 : Failed to run "rpm". Detailed error: [package sles-release is not installed]
[main] [ 2020-03-02 11:59:33.000 UTC ] [OFSUtil.<init>:331]  Using provided binary location:/sbin/
[main] [ 2020-03-02 11:59:33.000 UTC ] [Utils.getLocalHost:487]  Hostname retrieved: ol7-122-rac1.localdomain, returned: ol7-122-rac1
[main] [ 2020-03-02 11:59:33.000 UTC ] [Utils.getNodeName:897]  Hostname : ol7-122-rac1 is converted to nodeName : ol7-122-rac1
[main] [ 2020-03-02 11:59:33.000 UTC ] [CmdToolUtil.enforceEnglishVars:1662]  Set environment for English language
[main] [ 2020-03-02 11:59:33.000 UTC ] [CmdToolUtil.enforceEnglishVars:1663]  OS Name is...Linux
[main] [ 2020-03-02 11:59:33.002 UTC ] [NativeSystem.isCmdScv:601]  isCmdScv: cmd=[]
[main] [ 2020-03-02 11:59:33.002 UTC ] [UnixSystem.dorunRemoteExecCmd:4279]  Final unix SSH command: /sbin//acfsutil version
[Thread-26] [ 2020-03-02 11:59:33.018 UTC ] [StreamReader.run:62]  In StreamReader.run
[main] [ 2020-03-02 11:59:33.028 UTC ] [RuntimeExec.runCommand:294]  runCommand: Waiting for the process
[main] [ 2020-03-02 11:59:33.028 UTC ] [RuntimeExec.runCommand:296]  runCommand: process returns 0
[Thread-25] [ 2020-03-02 11:59:33.028 UTC ] [StreamReader.run:62]  In StreamReader.run
[Thread-25] [ 2020-03-02 11:59:33.028 UTC ] [StreamReader.run:66]  OUTPUT>acfsutil version: 19.0.0.0.0
[main] [ 2020-03-02 11:59:33.029 UTC ] [RuntimeExec.runCommand:323]  RunTimeExec: error>
[main] [ 2020-03-02 11:59:33.029 UTC ] [RuntimeExec.runCommand:349]  Returning from RunTimeExec.runCommand
[main] [ 2020-03-02 11:59:33.029 UTC ] [UnixSystem.dorunRemoteExecCmd:4291]  retval = 0
[main] [ 2020-03-02 11:59:33.030 UTC ] [UnixSystem.dorunRemoteExecCmd:4296]  exitValue = 0
[main] [ 2020-03-02 11:59:33.030 UTC ] [OFSUtil.print_outputTrace:3195]  result[0] = acfsutil version: 19.0.0.0.0
[main] [ 2020-03-02 11:59:33.030 UTC ] [OFSUtil.doGetACFSActiveVersion:480]  ACFS version for the node localnode = 19.0.0.0.0
[main] [ 2020-03-02 11:59:33.030 UTC ] [OFSUtil.<init>:331]  Using provided binary location:/sbin/
[main] [ 2020-03-02 11:59:33.030 UTC ] [Utils.getLocalHost:487]  Hostname retrieved: ol7-122-rac1.localdomain, returned: ol7-122-rac1
[main] [ 2020-03-02 11:59:33.030 UTC ] [Utils.getNodeName:897]  Hostname : ol7-122-rac1 is converted to nodeName : ol7-122-rac1
[main] [ 2020-03-02 11:59:33.030 UTC ] [CmdToolUtil.enforceEnglishVars:1662]  Set environment for English language
[main] [ 2020-03-02 11:59:33.030 UTC ] [CmdToolUtil.enforceEnglishVars:1663]  OS Name is...Linux
[main] [ 2020-03-02 11:59:33.030 UTC ] [NativeSystem.isCmdScv:601]  isCmdScv: cmd=[]
[main] [ 2020-03-02 11:59:33.031 UTC ] [UnixSystem.dorunRemoteExecCmd:4279]  Final unix SSH command: /sbin//acfsutil version
[Thread-28] [ 2020-03-02 11:59:33.032 UTC ] [StreamReader.run:62]  In StreamReader.run
[main] [ 2020-03-02 11:59:33.034 UTC ] [RuntimeExec.runCommand:294]  runCommand: Waiting for the process
[Thread-27] [ 2020-03-02 11:59:33.035 UTC ] [StreamReader.run:62]  In StreamReader.run
[Thread-27] [ 2020-03-02 11:59:33.051 UTC ] [StreamReader.run:66]  OUTPUT>acfsutil version: 19.0.0.0.0
[main] [ 2020-03-02 11:59:33.054 UTC ] [RuntimeExec.runCommand:296]  runCommand: process returns 0
[main] [ 2020-03-02 11:59:33.054 UTC ] [RuntimeExec.runCommand:323]  RunTimeExec: error>
[main] [ 2020-03-02 11:59:33.055 UTC ] [RuntimeExec.runCommand:349]  Returning from RunTimeExec.runCommand
[main] [ 2020-03-02 11:59:33.055 UTC ] [UnixSystem.dorunRemoteExecCmd:4291]  retval = 0
[main] [ 2020-03-02 11:59:33.055 UTC ] [UnixSystem.dorunRemoteExecCmd:4296]  exitValue = 0
[main] [ 2020-03-02 11:59:33.055 UTC ] [OFSUtil.print_outputTrace:3195]  result[0] = acfsutil version: 19.0.0.0.0
[main] [ 2020-03-02 11:59:33.056 UTC ] [OFSUtil.doGetACFSActiveVersion:480]  ACFS version for the node localnode = 19.0.0.0.0
[main] [ 2020-03-02 11:59:33.056 UTC ] [OFSUtil.<init>:331]  Using provided binary location:/sbin/
[main] [ 2020-03-02 11:59:33.056 UTC ] [Utils.getLocalHost:487]  Hostname retrieved: ol7-122-rac1.localdomain, returned: ol7-122-rac1
[main] [ 2020-03-02 11:59:33.056 UTC ] [Utils.getNodeName:897]  Hostname : ol7-122-rac1 is converted to nodeName : ol7-122-rac1
[main] [ 2020-03-02 11:59:33.056 UTC ] [CmdToolUtil.enforceEnglishVars:1662]  Set environment for English language
[main] [ 2020-03-02 11:59:33.056 UTC ] [CmdToolUtil.enforceEnglishVars:1663]  OS Name is...Linux
[main] [ 2020-03-02 11:59:33.056 UTC ] [NativeSystem.isCmdScv:601]  isCmdScv: cmd=[]
[main] [ 2020-03-02 11:59:33.056 UTC ] [UnixSystem.dorunRemoteExecCmd:4279]  Final unix SSH command: /sbin//acfsutil version
[Thread-30] [ 2020-03-02 11:59:33.060 UTC ] [StreamReader.run:62]  In StreamReader.run
[main] [ 2020-03-02 11:59:33.060 UTC ] [RuntimeExec.runCommand:294]  runCommand: Waiting for the process
[Thread-29] [ 2020-03-02 11:59:33.064 UTC ] [StreamReader.run:62]  In StreamReader.run
[Thread-29] [ 2020-03-02 11:59:33.076 UTC ] [StreamReader.run:66]  OUTPUT>acfsutil version: 19.0.0.0.0
[main] [ 2020-03-02 11:59:33.079 UTC ] [RuntimeExec.runCommand:296]  runCommand: process returns 0
[main] [ 2020-03-02 11:59:33.080 UTC ] [RuntimeExec.runCommand:323]  RunTimeExec: error>
[main] [ 2020-03-02 11:59:33.081 UTC ] [RuntimeExec.runCommand:349]  Returning from RunTimeExec.runCommand
[main] [ 2020-03-02 11:59:33.081 UTC ] [UnixSystem.dorunRemoteExecCmd:4291]  retval = 0
[main] [ 2020-03-02 11:59:33.081 UTC ] [UnixSystem.dorunRemoteExecCmd:4296]  exitValue = 0
[main] [ 2020-03-02 11:59:33.081 UTC ] [OFSUtil.print_outputTrace:3195]  result[0] = acfsutil version: 19.0.0.0.0
[main] [ 2020-03-02 11:59:33.081 UTC ] [OFSUtil.doGetACFSActiveVersion:480]  ACFS version for the node localnode = 19.0.0.0.0
[main] [ 2020-03-02 11:59:33.082 UTC ] [OFSUtil.<init>:331]  Using provided binary location:/sbin/
[main] [ 2020-03-02 11:59:33.082 UTC ] [Utils.getLocalHost:487]  Hostname retrieved: ol7-122-rac1.localdomain, returned: ol7-122-rac1
[main] [ 2020-03-02 11:59:33.084 UTC ] [Utils.getNodeName:897]  Hostname : ol7-122-rac1 is converted to nodeName : ol7-122-rac1
[main] [ 2020-03-02 11:59:33.084 UTC ] [CmdToolUtil.enforceEnglishVars:1662]  Set environment for English language
[main] [ 2020-03-02 11:59:33.084 UTC ] [CmdToolUtil.enforceEnglishVars:1663]  OS Name is...Linux
[main] [ 2020-03-02 11:59:33.084 UTC ] [NativeSystem.isCmdScv:601]  isCmdScv: cmd=[]
[main] [ 2020-03-02 11:59:33.084 UTC ] [UnixSystem.dorunRemoteExecCmd:4279]  Final unix SSH command: /sbin//acfsutil version
[Thread-32] [ 2020-03-02 11:59:33.088 UTC ] [StreamReader.run:62]  In StreamReader.run
[main] [ 2020-03-02 11:59:33.089 UTC ] [RuntimeExec.runCommand:294]  runCommand: Waiting for the process
[Thread-31] [ 2020-03-02 11:59:33.089 UTC ] [StreamReader.run:62]  In StreamReader.run
[Thread-31] [ 2020-03-02 11:59:33.110 UTC ] [StreamReader.run:66]  OUTPUT>acfsutil version: 19.0.0.0.0
[main] [ 2020-03-02 11:59:33.116 UTC ] [RuntimeExec.runCommand:296]  runCommand: process returns 0
[main] [ 2020-03-02 11:59:33.118 UTC ] [RuntimeExec.runCommand:323]  RunTimeExec: error>
[main] [ 2020-03-02 11:59:33.121 UTC ] [RuntimeExec.runCommand:349]  Returning from RunTimeExec.runCommand
[main] [ 2020-03-02 11:59:33.122 UTC ] [UnixSystem.dorunRemoteExecCmd:4291]  retval = 0
[main] [ 2020-03-02 11:59:33.122 UTC ] [UnixSystem.dorunRemoteExecCmd:4296]  exitValue = 0
[main] [ 2020-03-02 11:59:33.122 UTC ] [OFSUtil.print_outputTrace:3195]  result[0] = acfsutil version: 19.0.0.0.0
[main] [ 2020-03-02 11:59:33.123 UTC ] [OFSUtil.doGetACFSActiveVersion:480]  ACFS version for the node localnode = 19.0.0.0.0
[main] [ 2020-03-02 11:59:33.123 UTC ] [OFSUtil.<init>:331]  Using provided binary location:/sbin/
[main] [ 2020-03-02 11:59:33.123 UTC ] [Utils.getLocalHost:487]  Hostname retrieved: ol7-122-rac1.localdomain, returned: ol7-122-rac1
[main] [ 2020-03-02 11:59:33.123 UTC ] [Utils.getNodeName:897]  Hostname : ol7-122-rac1 is converted to nodeName : ol7-122-rac1
[main] [ 2020-03-02 11:59:33.123 UTC ] [CmdToolUtil.enforceEnglishVars:1662]  Set environment for English language
[main] [ 2020-03-02 11:59:33.124 UTC ] [CmdToolUtil.enforceEnglishVars:1663]  OS Name is...Linux
[main] [ 2020-03-02 11:59:33.124 UTC ] [NativeSystem.isCmdScv:601]  isCmdScv: cmd=[]
[main] [ 2020-03-02 11:59:33.124 UTC ] [UnixSystem.dorunRemoteExecCmd:4279]  Final unix SSH command: /sbin//acfsutil info fs -o mountpoints
[Thread-34] [ 2020-03-02 11:59:33.138 UTC ] [StreamReader.run:62]  In StreamReader.run
[Thread-33] [ 2020-03-02 11:59:33.138 UTC ] [StreamReader.run:62]  In StreamReader.run
[main] [ 2020-03-02 11:59:33.138 UTC ] [RuntimeExec.runCommand:294]  runCommand: Waiting for the process
[Thread-34] [ 2020-03-02 11:59:33.147 UTC ] [StreamReader.run:66]  ERROR>acfsutil info fs: ACFS-03036: no mounted ACFS file systems
[main] [ 2020-03-02 11:59:33.151 UTC ] [RuntimeExec.runCommand:296]  runCommand: process returns 1
[main] [ 2020-03-02 11:59:33.151 UTC ] [RuntimeExec.runCommand:323]  RunTimeExec: error>
[main] [ 2020-03-02 11:59:33.151 UTC ] [RuntimeExec.runCommand:326]  acfsutil info fs: ACFS-03036: no mounted ACFS file systems
[main] [ 2020-03-02 11:59:33.151 UTC ] [RuntimeExec.traceCmdEnv:516]  Calling Runtime.exec() with the command
[main] [ 2020-03-02 11:59:33.152 UTC ] [RuntimeExec.traceCmdEnv:518]  /bin/sh
[main] [ 2020-03-02 11:59:33.152 UTC ] [RuntimeExec.traceCmdEnv:518]  -c
[main] [ 2020-03-02 11:59:33.152 UTC ] [RuntimeExec.traceCmdEnv:518]  /sbin//acfsutil info fs -o mountpoints
[main] [ 2020-03-02 11:59:33.153 UTC ] [RuntimeExec.traceCmdEnv:521]  runCommand: env =
[main] [ 2020-03-02 11:59:33.153 UTC ] [RuntimeExec.traceCmdEnv:524]   0:LANG=en_US.UTF-8
[main] [ 2020-03-02 11:59:33.153 UTC ] [RuntimeExec.traceCmdEnv:524]   1:LC_ALL=en_US.UTF-8
[main] [ 2020-03-02 11:59:33.154 UTC ] [RuntimeExec.traceCmdEnv:524]   2:NLS_LANG=American_America.UTF8
[main] [ 2020-03-02 11:59:33.155 UTC ] [RuntimeExec.runCommand:349]  Returning from RunTimeExec.runCommand
[main] [ 2020-03-02 11:59:33.155 UTC ] [UnixSystem.dorunRemoteExecCmd:4291]  retval = 1
[main] [ 2020-03-02 11:59:33.155 UTC ] [UnixSystem.dorunRemoteExecCmd:4296]  exitValue = 1
[main] [ 2020-03-02 11:59:33.155 UTC ] [CmdToolUtil.doexecute:1131]  nativeSystem.runRemoteExecCmd failed. Command = /sbin//acfsutil arguments = [info, fs, -o, mountpoints] env = null error = acfsutil info fs: ACFS-03036: no mounted ACFS file systems output =
[FATAL] [INS-42505] The installer has detected that the Oracle Grid Infrastructure home software at (/u01/app/19.6.0.0/grid) is not complete.
   CAUSE: Following files are missing:
[/u01/app/19.6.0.0/grid/LINUX.X64_193000_grid_home.zip]
   ACTION: Ensure that the Oracle Grid Infrastructure home at (/u01/app/19.6.0.0/grid) includes the files listed above.
[oracle@ol7-122-rac1 ~]$
[oracle@ol7-122-rac1 grid]$ sudo su -
Last login: Mon Mar  2 12:21:16 UTC 2020
[root@ol7-122-rac1 ~]# . oraenv <<< +ASM1
ORACLE_SID = [root] ? The Oracle base has been set to /u01/app/oracle
[root@ol7-122-rac1 ~]# time cp -fv /vagrant_software/LINUX.X64_193000_grid_home.zip $ORACLE_HOME; echo $?
‘/vagrant_software/LINUX.X64_193000_grid_home.zip’ -> ‘/u01/app/19.6.0.0/grid/LINUX.X64_193000_grid_home.zip’

real    0m21.647s
user    0m0.023s
sys     0m5.177s
0
[root@ol7-122-rac1 ~]# chmod 775 /u01/app/19.6.0.0/grid/LINUX.X64_193000_grid_home.zip
[root@ol7-122-rac1 ~]# logout
[oracle@ol7-122-rac1 grid]$ cd
[oracle@ol7-122-rac1 ~]$
[oracle@ol7-122-rac1 ~]$
[oracle@ol7-122-rac1 ~]$ $ORACLE_HOME/gridSetup.sh -creategoldimage -exclFiles $ORACLE_HOME/log,$ORACLE_HOME/.patch_storage -destinationlocation /u01/app/oracle/goldimage -silent
Launching Oracle Grid Infrastructure Setup Wizard...

Successfully Setup Software.
Gold Image location: /u01/app/oracle/goldimage/grid_home_2020-03-02_12-23-57PM.zip


[oracle@ol7-122-rac1 ~]$ ls -lh /u01/app/oracle/goldimage/grid_home_2020-03-02_12-23-57PM.zip
-rw-r--r--. 1 oracle oinstall 5.9G Mar  2 12:32 /u01/app/oracle/goldimage/grid_home_2020-03-02_12-23-57PM.zip
[oracle@ol7-122-rac1 ~]$ ls -lh /vagrant_software/LINUX.X64_193000_grid_home.zip
-rwxrwxrwx. 1 vagrant vagrant 2.7G Aug  5  2019 /vagrant_software/LINUX.X64_193000_grid_home.zip
[oracle@ol7-122-rac1 ~]$ 
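The [INS-42505] failure earlier was resolved by copying LINUX.X64_193000_grid_home.zip back into the grid home. A small guard along these lines could verify the file is in place before attempting -creategoldimage again (the `require_file` helper is hypothetical, not part of any Oracle tooling; the path is the one from this environment):

```shell
#!/bin/sh
# Hypothetical guard: -creategoldimage failed with INS-42505 when the
# original installer zip was missing from the home, so check for it first.
require_file() {
  if [ -f "$1" ]; then
    echo "present: $1"
  else
    echo "missing: $1" >&2
    return 1
  fi
}

require_file "${GRID_HOME:-/u01/app/19.6.0.0/grid}/LINUX.X64_193000_grid_home.zip" \
  || echo "copy the zip back into the home before running gridSetup.sh -creategoldimage"
```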

Upgrade Grid 12.2 to 19.6 Using Gold Image

Sun, 2020-03-01 16:00

Quick and dirty OPatch Update for All nodes:

[oracle@ol7-122-rac1 JAN2019]$ echo $ORACLE_HOME
/u01/app/12.2.0.1/grid
[oracle@ol7-122-rac1 JAN2019]$ $ORACLE_HOME/OPatch/opatch version
OPatch Version: 12.2.0.1.6

OPatch succeeded.
[oracle@ol7-122-rac1 JAN2019]$ rm -rf $ORACLE_HOME/OPatch/*
[oracle@ol7-122-rac1 JAN2019]$ unzip -qo p6880880_122010_Linux-x86-64.zip -d $ORACLE_HOME
[oracle@ol7-122-rac1 JAN2019]$ $ORACLE_HOME/OPatch/opatch version
OPatch Version: 12.2.0.1.19

OPatch succeeded.
[oracle@ol7-122-rac1 JAN2019]$

------------------------------

[oracle@ol7-122-rac2 JAN2019]$ echo $ORACLE_HOME
/u01/app/12.2.0.1/grid
[oracle@ol7-122-rac2 JAN2019]$ $ORACLE_HOME/OPatch/opatch version
OPatch Version: 12.2.0.1.6

OPatch succeeded.
[oracle@ol7-122-rac2 JAN2019]$ rm -rf $ORACLE_HOME/OPatch/*
[oracle@ol7-122-rac2 JAN2019]$ unzip -qo p6880880_122010_Linux-x86-64.zip -d $ORACLE_HOME
[oracle@ol7-122-rac2 JAN2019]$ $ORACLE_HOME/OPatch/opatch version
OPatch Version: 12.2.0.1.19

OPatch succeeded.
[oracle@ol7-122-rac2 JAN2019]$
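One thing worth noting about the version check above: OPatch versions like 12.2.0.1.6 and 12.2.0.1.19 do not compare correctly with a plain string comparison (lexically "19" sorts before "6"). A minimal sketch of a numeric comparison using sort -V (available in GNU coreutils; `opatch_at_least` is a hypothetical helper name):

```shell
#!/bin/sh
# Hypothetical helper: succeeds when the installed OPatch version is at
# least the required one. Lexical comparison would wrongly rank
# 12.2.0.1.19 below 12.2.0.1.6, so sort -V is used for version ordering.
opatch_at_least() {
  required="$1"; installed="$2"
  [ "$(printf '%s\n%s\n' "$required" "$installed" | sort -V | head -n1)" = "$required" ]
}

opatch_at_least 12.2.0.1.19 12.2.0.1.19 && echo "OK"
opatch_at_least 12.2.0.1.19 12.2.0.1.6 || echo "needs update"
```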

Create Grid 19.6 directory for All nodes:

[root@ol7-122-rac1 ~]# mkdir -p /u01/app/19.6.0.0/grid
[root@ol7-122-rac1 ~]# chown oracle:oinstall /u01/app/19.6.0.0/grid
[root@ol7-122-rac1 ~]# chmod 775 /u01/app/19.6.0.0/grid

------------------------------

[root@ol7-122-rac2 ~]# mkdir -p /u01/app/19.6.0.0/grid
[root@ol7-122-rac2 ~]# chown oracle:oinstall /u01/app/19.6.0.0/grid
[root@ol7-122-rac2 ~]# chmod 775 /u01/app/19.6.0.0/grid

Verify required Grid 12.2 patch for All nodes:

[oracle@ol7-122-rac1 ~]$ $ORACLE_HOME/OPatch/opatch lspatches -oh /u01/app/12.2.0.1/grid
28553832;OCW Interim patch for 28553832

OPatch succeeded.
[oracle@ol7-122-rac1 ~]$

------------------------------

[oracle@ol7-122-rac2 ~]$ $ORACLE_HOME/OPatch/opatch lspatches -oh /u01/app/12.2.0.1/grid
28553832;OCW Interim patch for 28553832

OPatch succeeded.
[oracle@ol7-122-rac2 ~]$
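The eyeball check above can be scripted. A sketch that greps opatch lspatches output for a required patch ID (28553832 is the one shown in the listings; `has_patch` is a hypothetical helper, and in practice the output would come from running opatch rather than a sample string):

```shell
#!/bin/sh
# Hypothetical pre-upgrade check: confirm a required interim patch appears
# in "opatch lspatches" output. lspatches prints one "id;description" line
# per patch, so anchor the match on "id;" at the start of a line.
has_patch() {
  patch_id="$1"; lspatches_output="$2"
  printf '%s\n' "$lspatches_output" | grep -q "^${patch_id};"
}

sample='28553832;OCW Interim patch for 28553832'
if has_patch 28553832 "$sample"; then
  echo "required patch present"
else
  echo "required patch missing"
fi
```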

Unzip Grid 19.6 Gold Image for First node:

[oracle@ol7-122-rac1 ~]$ time unzip -qo /vagrant_software/LINUX.X64_19600_grid_home.zip -d /u01/app/19.6.0.0/grid; echo $?

real    4m56.824s
user    0m24.313s
sys     0m53.903s
0

[oracle@ol7-122-rac1 ~]$ ls /u01/app/19.6.0.0/grid
acfs        cha          dmu            javavm                          ologgerd       plsql          root.sh.old.3   utl
acfsccm     client       env.ora        jdbc                            OPatch         precomp        rootupgrade.sh  welcome.html
acfsccreg   clone        evm            jdk                             opatchautocfg  QOpatch        runcluvfy.sh    wlm
acfscm      crs          gipc           jlib                            opmn           qos            sdk             wwg
acfsiob     css          gnsd           ldap                            oracore        racg           slax            xag
acfsrd      ctss         gpnp           lib                             ord            rdbms          sqlpatch        xdk
acfsrm      cv           gridSetup.sh   LINUX.X64_193000_grid_home.zip  ords           relnotes       sqlplus
addnode     dbjava       has            md                              oss            rhp            srvm
advmccb     dbs          hs             mdns                            osysmond       root.sh        suptools
assistants  deinstall    install        network                         oui            root.sh.old    tomcat
bin         demo         instantclient  nls                             owm            root.sh.old.1  ucp
cdp         diagnostics  inventory      ohasd                           perl           root.sh.old.2  usm

[oracle@ol7-122-rac1 ~]$ du -sh /u01/app/19.6.0.0/grid
9.4G    /u01/app/19.6.0.0/grid
[oracle@ol7-122-rac1 ~]$

[root@ol7-122-rac1 ~]# /u01/app/19.6.0.0/grid/rootupgrade.sh

Performing root user operation.

The following environment variables are set as:
    ORACLE_OWNER= oracle
    ORACLE_HOME=  /u01/app/19.6.0.0/grid

Enter the full pathname of the local bin directory: [/usr/local/bin]:
The contents of "dbhome" have not changed. No need to overwrite.
The file "oraenv" already exists in /usr/local/bin.  Overwrite it? (y/n)
[n]: y
   Copying oraenv to /usr/local/bin ...
The file "coraenv" already exists in /usr/local/bin.  Overwrite it? (y/n)
[n]: y
   Copying coraenv to /usr/local/bin ...

Entries will be added to the /etc/oratab file as needed by
Database Configuration Assistant when a database is created
Finished running generic part of root script.
Now product-specific root actions will be performed.
Relinking oracle with rac_on option
Using configuration parameter file: /u01/app/19.6.0.0/grid/crs/install/crsconfig_params
The log of current session can be found at:
  /u01/app/oracle/crsdata/ol7-122-rac1/crsconfig/rootcrs_ol7-122-rac1_2020-03-01_05-05-31PM.log
2020/03/01 17:05:49 CLSRSC-595: Executing upgrade step 1 of 18: 'UpgradeTFA'.
2020/03/01 17:05:49 CLSRSC-4015: Performing install or upgrade action for Oracle Trace File Analyzer (TFA) Collector.
2020/03/01 17:05:49 CLSRSC-595: Executing upgrade step 2 of 18: 'ValidateEnv'.
2020/03/01 17:05:54 CLSRSC-595: Executing upgrade step 3 of 18: 'GetOldConfig'.
2020/03/01 17:05:54 CLSRSC-464: Starting retrieval of the cluster configuration data
2020/03/01 17:09:07 CLSRSC-4003: Successfully patched Oracle Trace File Analyzer (TFA) Collector.
2020/03/01 17:09:36 CLSRSC-692: Checking whether CRS entities are ready for upgrade. This operation may take a few minutes.
2020/03/01 17:11:40 CLSRSC-693: CRS entities validation completed successfully.
2020/03/01 17:11:44 CLSRSC-515: Starting OCR manual backup.
2020/03/01 17:11:51 CLSRSC-516: OCR manual backup successful.
2020/03/01 17:11:58 CLSRSC-486:
 At this stage of upgrade, the OCR has changed.
 Any attempt to downgrade the cluster after this point will require a complete cluster outage to restore the OCR.
2020/03/01 17:11:58 CLSRSC-541:
 To downgrade the cluster:
 1. All nodes that have been upgraded must be downgraded.
2020/03/01 17:11:58 CLSRSC-542:
 2. Before downgrading the last node, the Grid Infrastructure stack on all other cluster nodes must be down.
2020/03/01 17:12:04 CLSRSC-465: Retrieval of the cluster configuration data has successfully completed.
2020/03/01 17:12:04 CLSRSC-595: Executing upgrade step 4 of 18: 'GenSiteGUIDs'.
2020/03/01 17:12:05 CLSRSC-595: Executing upgrade step 5 of 18: 'UpgPrechecks'.
2020/03/01 17:12:08 CLSRSC-363: User ignored prerequisites during installation
2020/03/01 17:12:17 CLSRSC-595: Executing upgrade step 6 of 18: 'SetupOSD'.
2020/03/01 17:12:17 CLSRSC-595: Executing upgrade step 7 of 18: 'PreUpgrade'.
2020/03/01 17:17:16 CLSRSC-468: Setting Oracle Clusterware and ASM to rolling migration mode
2020/03/01 17:17:16 CLSRSC-482: Running command: '/u01/app/12.2.0.1/grid/bin/crsctl start rollingupgrade 19.0.0.0.0'
CRS-1131: The cluster was successfully set to rolling upgrade mode.
2020/03/01 17:17:20 CLSRSC-482: Running command: '/u01/app/19.6.0.0/grid/bin/asmca -silent -upgradeNodeASM -nonRolling false -oldCRSHome /u01/app/12.2.0.1/grid -oldCRSVersion 12.2.0.1.0 -firstNode true -startRolling false '

ASM configuration upgraded in local node successfully.

2020/03/01 17:18:22 CLSRSC-469: Successfully set Oracle Clusterware and ASM to rolling migration mode
2020/03/01 17:18:26 CLSRSC-466: Starting shutdown of the current Oracle Grid Infrastructure stack
2020/03/01 17:19:10 CLSRSC-467: Shutdown of the current Oracle Grid Infrastructure stack has successfully completed.
2020/03/01 17:19:12 CLSRSC-595: Executing upgrade step 8 of 18: 'CheckCRSConfig'.
2020/03/01 17:19:13 CLSRSC-595: Executing upgrade step 9 of 18: 'UpgradeOLR'.
2020/03/01 17:19:25 CLSRSC-595: Executing upgrade step 10 of 18: 'ConfigCHMOS'.
2020/03/01 17:19:25 CLSRSC-595: Executing upgrade step 11 of 18: 'UpgradeAFD'.
2020/03/01 17:19:32 CLSRSC-595: Executing upgrade step 12 of 18: 'createOHASD'.
2020/03/01 17:19:38 CLSRSC-595: Executing upgrade step 13 of 18: 'ConfigOHASD'.
2020/03/01 17:19:38 CLSRSC-329: Replacing Clusterware entries in file 'oracle-ohasd.service'
2020/03/01 17:21:09 CLSRSC-595: Executing upgrade step 14 of 18: 'InstallACFS'.
2020/03/01 17:23:09 CLSRSC-595: Executing upgrade step 15 of 18: 'InstallKA'.
2020/03/01 17:23:15 CLSRSC-595: Executing upgrade step 16 of 18: 'UpgradeCluster'.
2020/03/01 17:26:57 CLSRSC-343: Successfully started Oracle Clusterware stack
clscfg: EXISTING configuration version 5 detected.
Successfully taken the backup of node specific configuration in OCR.
Successfully accumulated necessary OCR keys.
Creating OCR keys for user 'root', privgrp 'root'..
Operation successful.
2020/03/01 17:27:15 CLSRSC-595: Executing upgrade step 17 of 18: 'UpgradeNode'.
2020/03/01 17:27:18 CLSRSC-474: Initiating upgrade of resource types
2020/03/01 17:33:51 CLSRSC-475: Upgrade of resource types successfully initiated.
2020/03/01 17:34:01 CLSRSC-595: Executing upgrade step 18 of 18: 'PostUpgrade'.
2020/03/01 17:34:08 CLSRSC-325: Configure Oracle Grid Infrastructure for a Cluster ... succeeded
[root@ol7-122-rac1 ~]#

[root@ol7-122-rac2 ~]# /u01/app/19.6.0.0/grid/rootupgrade.sh

Performing root user operation.

The following environment variables are set as:
    ORACLE_OWNER= oracle
    ORACLE_HOME=  /u01/app/19.6.0.0/grid

Enter the full pathname of the local bin directory: [/usr/local/bin]:
The contents of "dbhome" have not changed. No need to overwrite.
The file "coraenv" already exists in /usr/local/bin.  Overwrite it? (y/n)
[n]: y
   Copying coraenv to /usr/local/bin ...

Entries will be added to the /etc/oratab file as needed by
Database Configuration Assistant when a database is created
Finished running generic part of root script.
Now product-specific root actions will be performed.
Relinking oracle with rac_on option
Using configuration parameter file: /u01/app/19.6.0.0/grid/crs/install/crsconfig_params
The log of current session can be found at:
  /u01/app/oracle/crsdata/ol7-122-rac2/crsconfig/rootcrs_ol7-122-rac2_2020-03-01_05-39-49PM.log
2020/03/01 17:39:57 CLSRSC-595: Executing upgrade step 1 of 18: 'UpgradeTFA'.
2020/03/01 17:39:57 CLSRSC-4015: Performing install or upgrade action for Oracle Trace File Analyzer (TFA) Collector.
2020/03/01 17:39:57 CLSRSC-595: Executing upgrade step 2 of 18: 'ValidateEnv'.
2020/03/01 17:39:58 CLSRSC-595: Executing upgrade step 3 of 18: 'GetOldConfig'.
2020/03/01 17:39:58 CLSRSC-464: Starting retrieval of the cluster configuration data
2020/03/01 17:40:12 CLSRSC-465: Retrieval of the cluster configuration data has successfully completed.
2020/03/01 17:40:12 CLSRSC-595: Executing upgrade step 4 of 18: 'GenSiteGUIDs'.
2020/03/01 17:40:12 CLSRSC-595: Executing upgrade step 5 of 18: 'UpgPrechecks'.
2020/03/01 17:40:13 CLSRSC-363: User ignored prerequisites during installation
2020/03/01 17:40:14 CLSRSC-595: Executing upgrade step 6 of 18: 'SetupOSD'.
2020/03/01 17:40:14 CLSRSC-595: Executing upgrade step 7 of 18: 'PreUpgrade'.

ASM configuration upgraded in local node successfully.

2020/03/01 17:41:21 CLSRSC-466: Starting shutdown of the current Oracle Grid Infrastructure stack
2020/03/01 17:43:07 CLSRSC-4003: Successfully patched Oracle Trace File Analyzer (TFA) Collector.
2020/03/01 17:47:30 CLSRSC-467: Shutdown of the current Oracle Grid Infrastructure stack has successfully completed.
2020/03/01 17:47:32 CLSRSC-595: Executing upgrade step 8 of 18: 'CheckCRSConfig'.
2020/03/01 17:47:32 CLSRSC-595: Executing upgrade step 9 of 18: 'UpgradeOLR'.
2020/03/01 17:47:40 CLSRSC-595: Executing upgrade step 10 of 18: 'ConfigCHMOS'.
2020/03/01 17:47:40 CLSRSC-595: Executing upgrade step 11 of 18: 'UpgradeAFD'.
2020/03/01 17:47:42 CLSRSC-595: Executing upgrade step 12 of 18: 'createOHASD'.
2020/03/01 17:47:43 CLSRSC-595: Executing upgrade step 13 of 18: 'ConfigOHASD'.
2020/03/01 17:47:43 CLSRSC-329: Replacing Clusterware entries in file 'oracle-ohasd.service'
2020/03/01 17:49:01 CLSRSC-595: Executing upgrade step 14 of 18: 'InstallACFS'.
2020/03/01 17:50:42 CLSRSC-595: Executing upgrade step 15 of 18: 'InstallKA'.
2020/03/01 17:50:44 CLSRSC-595: Executing upgrade step 16 of 18: 'UpgradeCluster'.
2020/03/01 17:51:35 CLSRSC-343: Successfully started Oracle Clusterware stack
clscfg: EXISTING configuration version 19 detected.
Successfully taken the backup of node specific configuration in OCR.
Successfully accumulated necessary OCR keys.
Creating OCR keys for user 'root', privgrp 'root'..
Operation successful.
2020/03/01 17:52:29 CLSRSC-595: Executing upgrade step 17 of 18: 'UpgradeNode'.
Start upgrade invoked..
2020/03/01 17:52:33 CLSRSC-478: Setting Oracle Clusterware active version on the last node to be upgraded
2020/03/01 17:52:33 CLSRSC-482: Running command: '/u01/app/19.6.0.0/grid/bin/crsctl set crs activeversion'
Started to upgrade the active version of Oracle Clusterware. This operation may take a few minutes.
Started to upgrade CSS.
CSS was successfully upgraded.
Started to upgrade Oracle ASM.
Started to upgrade CRS.
CRS was successfully upgraded.
Started to upgrade Oracle ACFS.
Oracle ACFS was successfully upgraded.
Successfully upgraded the active version of Oracle Clusterware.
Oracle Clusterware active version was successfully set to 19.0.0.0.0.
2020/03/01 17:53:42 CLSRSC-479: Successfully set Oracle Clusterware active version
2020/03/01 17:53:42 CLSRSC-476: Finishing upgrade of resource types
2020/03/01 17:53:49 CLSRSC-477: Successfully completed upgrade of resource types
2020/03/01 17:57:54 CLSRSC-595: Executing upgrade step 18 of 18: 'PostUpgrade'.
Successfully updated XAG resources.
2020/03/01 17:58:37 CLSRSC-325: Configure Oracle Grid Infrastructure for a Cluster ... succeeded
[root@ol7-122-rac2 ~]#

Check $ORACLE_HOME/.patch_storage and $ORACLE_HOME/log in the new GI 19.6 home:

[oracle@ol7-122-rac2 ~]$ ls -l $ORACLE_HOME/.patch_storage $ORACLE_HOME/log
ls: cannot access /u01/app/19.6.0.0/grid/.patch_storage: No such file or directory
/u01/app/19.6.0.0/grid/log:
total 4
drwxr-x---.  4 oracle oinstall   57 Mar  1 17:51 diag
drwxr-xr-t. 20 root   oinstall 4096 Mar  1 17:39 ol7-122-rac2
[oracle@ol7-122-rac2 ~]$

Create 19c Gold Image

Sat, 2020-02-29 23:01

What the hell. My blog posts can be so terrible that at times I don’t even understand what I post.

Anyhow, here is a reminder for creating 19c Gold Image.

From my notes, the log and .patch_storage directories need to be excluded. I am not sure if there are any others, as I have not fully tested deployment using the 19c Gold Image.
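The exclusion list for -exclFiles can be assembled from whichever of those directories actually exist in the home, so a missing .patch_storage does not end up in the argument. A sketch, untested against a real install (`build_excl_files` is a hypothetical helper; the example runs against a throwaway layout):

```shell
#!/bin/sh
# Hypothetical helper: build a comma-separated -exclFiles value from the
# directories under the home that actually exist.
build_excl_files() {
  home="$1"; shift
  list=""
  for d in "$@"; do
    [ -e "$home/$d" ] && list="${list:+$list,}$home/$d"
  done
  printf '%s\n' "$list"
}

# Example against a throwaway layout:
tmp=$(mktemp -d)
mkdir -p "$tmp/log" "$tmp/.patch_storage"
build_excl_files "$tmp" log .patch_storage
rm -rf "$tmp"
```

The result would then be passed as `-exclFiles "$(build_excl_files $ORACLE_HOME log .patch_storage)"` in the gridSetup.sh -creategoldimage command shown below.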

[oracle@ol7-122-rac1 ~]$ echo $ORACLE_HOME; ls -l $ORACLE_HOME/.patch_storage $ORACLE_HOME/log
/u01/app/19.3.0.0/grid

/u01/app/19.3.0.0/grid/log:
total 4
drwxr-xr-x.  2 oracle oinstall    6 Feb 29 18:16 crs
drwxr-x---.  4 oracle oinstall   57 Feb 29 18:49 diag
drwxr-xr-t. 20 root   oinstall 4096 Feb 29 18:32 ol7-122-rac1

/u01/app/19.3.0.0/grid/.patch_storage:
total 48
drwxr-xr-x.  3 oracle oinstall    74 Apr 18  2019 29517242_Apr_17_2019_23_27_10
drwxr-xr-x.  3 oracle oinstall    74 Apr 18  2019 29517247_Apr_1_2019_15_08_20
drwxr-xr-x.  3 oracle oinstall    74 Apr 18  2019 29585399_Apr_9_2019_19_12_47
drwxr-xr-x.  4 oracle oinstall    87 Feb 29 20:10 30489227_Jan_7_2020_03_37_45
drwxr-xr-x.  4 oracle oinstall    87 Feb 29 20:11 30489632_Dec_24_2019_03_32_55
drwxr-xr-x.  4 oracle oinstall    87 Feb 29 20:15 30557433_Jan_6_2020_19_07_34
drwxr-xr-x.  4 oracle oinstall    87 Feb 29 20:15 30655595_Dec_12_2019_04_55_54
-rw-r--r--.  1 oracle oinstall 17034 Feb 29 20:15 interim_inventory.txt
-rw-r--r--.  1 oracle oinstall   101 Feb 29 20:15 LatestOPatchSession.properties
drwxr-xr-x. 18 oracle oinstall  4096 Feb 29 20:15 NApply
-rw-r--r--.  1 oracle oinstall 16892 Feb 29 20:15 record_inventory.txt

[oracle@ol7-122-rac1 ~]$ $ORACLE_HOME/OPatch/opatch lspatches
30655595;TOMCAT RELEASE UPDATE 19.0.0.0.0 (30655595)
30557433;Database Release Update : 19.6.0.0.200114 (30557433)
30489632;ACFS RELEASE UPDATE 19.6.0.0.0 (30489632)
30489227;OCW RELEASE UPDATE 19.6.0.0.0 (30489227)

OPatch succeeded.

[oracle@ol7-122-rac1 ~]$ $ORACLE_HOME/gridSetup.sh -creategoldimage -exclFiles $ORACLE_HOME/log,$ORACLE_HOME/.patch_storage -destinationlocation /u01/app/oracle/goldimage -silent
Launching Oracle Grid Infrastructure Setup Wizard...

Successfully Setup Software.
Gold Image location: /u01/app/oracle/goldimage/grid_home_2020-03-01_04-40-18AM.zip

[oracle@ol7-122-rac1 ~]$

[oracle@ol7-122-rac1 goldimage]$ cp -fv grid_home_2020-03-01_04-40-18AM.zip /vagrant_software/LINUX.X64_19600_grid_home.zip; echo $?
‘grid_home_2020-03-01_04-40-18AM.zip’ -> ‘/vagrant_software/LINUX.X64_19600_grid_home.zip’
0

[oracle@ol7-122-rac1 goldimage]$ ls -l /vagrant_software/
total 18236448
drwxrwxrwx. 1 vagrant vagrant          0 Feb 29 16:41 JAN2019
drwxrwxrwx. 1 vagrant vagrant          0 Feb 29 16:35 JAN2020
-rwxrwxrwx. 1 vagrant vagrant 3453696911 Feb 19  2019 linuxx64_12201_database.zip
-rwxrwxrwx. 1 vagrant vagrant 2994687209 Feb 19  2019 linuxx64_12201_grid_home.zip
-rwxrwxrwx. 1 vagrant vagrant 3059705302 Sep  5 15:45 LINUX.X64_193000_db_home.zip
-rwxrwxrwx. 1 vagrant vagrant 2889184573 Aug  5  2019 LINUX.X64_193000_grid_home.zip
-rwxrwxrwx. 1 vagrant vagrant 6276814110 Mar  1 05:05 LINUX.X64_19600_grid_home.zip
-rwxrwxrwx. 1 vagrant vagrant      21848 Feb 23 15:24 sshpass-1.06-1.el7.x86_64.rpm
[oracle@ol7-122-rac1 goldimage]$

As you can see, this can be confusing.

The environment started as the Vagrant RAC build ol7-122-rac.
GI was then upgraded to 19.3 and patched to 19.6.

Does it even make sense to create the home with a version-specific name (19.6.0.0) versus a generic 19c?

One day when I am really bored, I will probably try to upgrade GI 12.2 to 19.6 using Gold Image.

Is Peace Of Mind Better Than Best Practice

Sat, 2020-02-29 21:18

There’s a discussion on Twitter about a nasty bug with the GI upgrade to 19.6.

It’s unclear whether it is the use of gridSetup.sh -applyRU that leads to the bug.

So for anyone upgrading GI from 12.2 to 19.6 cc @RACMasterPM @mdinh235
After rootupgrade on node 1 finished successfully, the subnet of the interconnect changed. So the first node ASM ( or any other instance) now in 19c and the second node in 12c , failed to see each other

— Rene Antunez (@rene_ace) February 28, 2020

Truthfully, I like the concept of gridSetup.sh -applyRU; however, I am often reminded of a manager who used to coach me, “Slow and steady wins the race.”

With that being said, I suggested that it may be better and simpler to complete the upgrade first and then patch, versus upgrading and patching at the same time.
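The two strategies can be contrasted as command lines. This is a sketch only: it prints the commands rather than executing them, the paths are placeholders, and the exact gridSetup.sh and patching invocations should be verified against the Oracle documentation for the release in question (`print_plan` is a hypothetical helper):

```shell
#!/bin/sh
# Sketch contrasting the two upgrade strategies; commands are echoed,
# not run. grid_home and ru_patch are placeholder paths.
print_plan() {
  grid_home="$1"; ru_patch="$2"
  # One-shot: upgrade and patch together (the approach tied to the reported bug)
  echo "one-shot: $grid_home/gridSetup.sh -applyRU $ru_patch"
  # Two-step: finish the upgrade on the base release first, patch afterwards
  # (e.g. with opatchauto as root; exact invocation depends on the environment)
  echo "two-step: $grid_home/gridSetup.sh ; then opatchauto apply $ru_patch"
}

print_plan /u01/app/19.3.0.0/grid /u01/software/RU
```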

Then I am asked, “So the best practice should be install the base one first and patch after?”

What’s the price for Peace Of Mind?

Out of curiosity, I upgraded GI from 12.2 to 19.6 by completing the upgrade first and then patching.

I am not going to explain the process, but here are the relevant terminal outputs. gridSetup.sh was run using the GUI, as I was lazy.

==================================================

[oracle@ol7-122-rac1 ~]$ $ORACLE_HOME/OPatch/opatch lspatches -oh /u01/app/12.2.0.1/grid
28553832;OCW Interim patch for 28553832

OPatch succeeded.
[oracle@ol7-122-rac1 ~]$ $ORACLE_HOME/OPatch/opatch lspatches -oh /u01/app/oracle/product/12.2.0.1/dbhome_1
28553832;OCW Interim patch for 28553832

OPatch succeeded.
[oracle@ol7-122-rac1 ~]$

--------------------------------------------------

[oracle@ol7-122-rac2 ~]$ $ORACLE_HOME/OPatch/opatch lspatches -oh /u01/app/12.2.0.1/grid
28553832;OCW Interim patch for 28553832

OPatch succeeded.
[oracle@ol7-122-rac2 ~]$ $ORACLE_HOME/OPatch/opatch lspatches -oh /u01/app/oracle/product/12.2.0.1/dbhome_1
28553832;OCW Interim patch for 28553832

OPatch succeeded.
[oracle@ol7-122-rac2 ~]$

==================================================

[oracle@ol7-122-rac1 ~]$ /u01/app/19.3.0.0/grid/runcluvfy.sh stage -pre crsinst -upgrade -rolling \
> -src_crshome /u01/app/12.2.0.1/grid -dest_crshome /u01/app/19.3.0.0/grid \
> -dest_version 19.0.0.0.0 -fixup -verbose

Pre-check for cluster services setup was unsuccessful.
Checks did not pass for the following nodes:
        ol7-122-rac2,ol7-122-rac1


Failures were encountered during execution of CVU verification request "stage -pre crsinst".

Verifying Physical Memory ...FAILED
ol7-122-rac2: PRVF-7530 : Sufficient physical memory is not available on node
              "ol7-122-rac2" [Required physical memory = 8GB (8388608.0KB)]

ol7-122-rac1: PRVF-7530 : Sufficient physical memory is not available on node
              "ol7-122-rac1" [Required physical memory = 8GB (8388608.0KB)]

Verifying ACFS Driver Checks ...FAILED
PRVG-6096 : Oracle ACFS driver is not supported on the current operating system
version for Oracle Clusterware release version "19.0.0.0.0".

Verifying RPM Package Manager database ...INFORMATION
PRVG-11250 : The check "RPM Package Manager database" was not performed because
it needs 'root' user privileges.


CVU operation performed:      stage -pre crsinst
Date:                         Feb 29, 2020 5:49:54 PM
CVU home:                     /u01/app/19.3.0.0/grid/
User:                         oracle
[oracle@ol7-122-rac1 ~]$

==================================================

[root@ol7-122-rac1 ~]# /u01/app/19.3.0.0/grid/rootupgrade.sh
Performing root user operation.

The following environment variables are set as:
    ORACLE_OWNER= oracle
    ORACLE_HOME=  /u01/app/19.3.0.0/grid

Enter the full pathname of the local bin directory: [/usr/local/bin]:
The contents of "dbhome" have not changed. No need to overwrite.
The file "oraenv" already exists in /usr/local/bin.  Overwrite it? (y/n)
[n]: y
   Copying oraenv to /usr/local/bin ...
The file "coraenv" already exists in /usr/local/bin.  Overwrite it? (y/n)
[n]: y
   Copying coraenv to /usr/local/bin ...

Entries will be added to the /etc/oratab file as needed by
Database Configuration Assistant when a database is created
Finished running generic part of root script.
Now product-specific root actions will be performed.
Relinking oracle with rac_on option

Using configuration parameter file: /u01/app/19.3.0.0/grid/crs/install/crsconfig_params
The log of current session can be found at:
  /u01/app/oracle/crsdata/ol7-122-rac1/crsconfig/rootcrs_ol7-122-rac1_2020-02-29_06-32-37PM.log
2020/02/29 18:33:02 CLSRSC-595: Executing upgrade step 1 of 18: 'UpgradeTFA'.
2020/02/29 18:33:02 CLSRSC-4015: Performing install or upgrade action for Oracle Trace File Analyzer (TFA) Collector.
2020/02/29 18:33:02 CLSRSC-595: Executing upgrade step 2 of 18: 'ValidateEnv'.
2020/02/29 18:33:09 CLSRSC-595: Executing upgrade step 3 of 18: 'GetOldConfig'.
2020/02/29 18:33:09 CLSRSC-464: Starting retrieval of the cluster configuration data
2020/02/29 18:33:21 CLSRSC-692: Checking whether CRS entities are ready for upgrade. This operation may take a few minutes.
2020/02/29 18:35:26 CLSRSC-693: CRS entities validation completed successfully.
2020/02/29 18:35:33 CLSRSC-515: Starting OCR manual backup.
2020/02/29 18:35:46 CLSRSC-516: OCR manual backup successful.
2020/02/29 18:36:30 CLSRSC-4003: Successfully patched Oracle Trace File Analyzer (TFA) Collector.
2020/02/29 18:39:40 CLSRSC-486:
 At this stage of upgrade, the OCR has changed.
 Any attempt to downgrade the cluster after this point will require a complete cluster outage to restore the OCR.
2020/02/29 18:39:41 CLSRSC-541:
 To downgrade the cluster:
 1. All nodes that have been upgraded must be downgraded.
2020/02/29 18:39:42 CLSRSC-542:
 2. Before downgrading the last node, the Grid Infrastructure stack on all other cluster nodes must be down.
2020/02/29 18:40:03 CLSRSC-465: Retrieval of the cluster configuration data has successfully completed.
2020/02/29 18:40:04 CLSRSC-595: Executing upgrade step 4 of 18: 'GenSiteGUIDs'.
2020/02/29 18:40:07 CLSRSC-595: Executing upgrade step 5 of 18: 'UpgPrechecks'.
2020/02/29 18:40:14 CLSRSC-363: User ignored prerequisites during installation
2020/02/29 18:40:33 CLSRSC-595: Executing upgrade step 6 of 18: 'SetupOSD'.
2020/02/29 18:40:33 CLSRSC-595: Executing upgrade step 7 of 18: 'PreUpgrade'.
2020/02/29 18:46:04 CLSRSC-468: Setting Oracle Clusterware and ASM to rolling migration mode
2020/02/29 18:46:04 CLSRSC-482: Running command: '/u01/app/12.2.0.1/grid/bin/crsctl start rollingupgrade 19.0.0.0.0'
CRS-1131: The cluster was successfully set to rolling upgrade mode.
2020/02/29 18:46:10 CLSRSC-482: Running command: '/u01/app/19.3.0.0/grid/bin/asmca -silent -upgradeNodeASM -nonRolling false -oldCRSHome /u01/app/12.2.0.1/grid -oldCRSVersion 12.2.0.1.0 -firstNode true -startRolling false '

ASM configuration upgraded in local node successfully.

2020/02/29 18:46:19 CLSRSC-469: Successfully set Oracle Clusterware and ASM to rolling migration mode
2020/02/29 18:46:29 CLSRSC-466: Starting shutdown of the current Oracle Grid Infrastructure stack
2020/02/29 18:47:15 CLSRSC-467: Shutdown of the current Oracle Grid Infrastructure stack has successfully completed.
2020/02/29 18:47:18 CLSRSC-595: Executing upgrade step 8 of 18: 'CheckCRSConfig'.
2020/02/29 18:47:21 CLSRSC-595: Executing upgrade step 9 of 18: 'UpgradeOLR'.
2020/02/29 18:47:32 CLSRSC-595: Executing upgrade step 10 of 18: 'ConfigCHMOS'.
2020/02/29 18:47:32 CLSRSC-595: Executing upgrade step 11 of 18: 'UpgradeAFD'.
2020/02/29 18:47:42 CLSRSC-595: Executing upgrade step 12 of 18: 'createOHASD'.
2020/02/29 18:47:51 CLSRSC-595: Executing upgrade step 13 of 18: 'ConfigOHASD'.
2020/02/29 18:47:51 CLSRSC-329: Replacing Clusterware entries in file 'oracle-ohasd.service'
2020/02/29 18:48:31 CLSRSC-595: Executing upgrade step 14 of 18: 'InstallACFS'.
2020/02/29 18:48:46 CLSRSC-595: Executing upgrade step 15 of 18: 'InstallKA'.
2020/02/29 18:48:54 CLSRSC-595: Executing upgrade step 16 of 18: 'UpgradeCluster'.
2020/02/29 18:50:44 CLSRSC-343: Successfully started Oracle Clusterware stack
clscfg: EXISTING configuration version 5 detected.
Successfully taken the backup of node specific configuration in OCR.
Successfully accumulated necessary OCR keys.
Creating OCR keys for user 'root', privgrp 'root'..
Operation successful.
2020/02/29 18:51:33 CLSRSC-595: Executing upgrade step 17 of 18: 'UpgradeNode'.
2020/02/29 18:51:38 CLSRSC-474: Initiating upgrade of resource types
2020/02/29 18:52:58 CLSRSC-475: Upgrade of resource types successfully initiated.
2020/02/29 18:53:13 CLSRSC-595: Executing upgrade step 18 of 18: 'PostUpgrade'.
2020/02/29 18:53:22 CLSRSC-325: Configure Oracle Grid Infrastructure for a Cluster ... succeeded
[root@ol7-122-rac1 ~]#

--------------------------------------------------

[root@ol7-122-rac2 ~]# /u01/app/19.3.0.0/grid/rootupgrade.sh
Performing root user operation.

The following environment variables are set as:
    ORACLE_OWNER= oracle
    ORACLE_HOME=  /u01/app/19.3.0.0/grid

Enter the full pathname of the local bin directory: [/usr/local/bin]:
The contents of "dbhome" have not changed. No need to overwrite.
The file "oraenv" already exists in /usr/local/bin.  Overwrite it? (y/n)
[n]: y
   Copying oraenv to /usr/local/bin ...
The file "coraenv" already exists in /usr/local/bin.  Overwrite it? (y/n)
[n]: y
   Copying coraenv to /usr/local/bin ...

Entries will be added to the /etc/oratab file as needed by
Database Configuration Assistant when a database is created
Finished running generic part of root script.
Now product-specific root actions will be performed.
Relinking oracle with rac_on option
Using configuration parameter file: /u01/app/19.3.0.0/grid/crs/install/crsconfig_params
The log of current session can be found at:
  /u01/app/oracle/crsdata/ol7-122-rac2/crsconfig/rootcrs_ol7-122-rac2_2020-02-29_06-57-24PM.log
2020/02/29 18:57:34 CLSRSC-595: Executing upgrade step 1 of 18: 'UpgradeTFA'.
2020/02/29 18:57:34 CLSRSC-4015: Performing install or upgrade action for Oracle Trace File Analyzer (TFA) Collector.
2020/02/29 18:57:34 CLSRSC-595: Executing upgrade step 2 of 18: 'ValidateEnv'.
2020/02/29 18:57:36 CLSRSC-595: Executing upgrade step 3 of 18: 'GetOldConfig'.
2020/02/29 18:57:36 CLSRSC-464: Starting retrieval of the cluster configuration data
2020/02/29 18:57:49 CLSRSC-465: Retrieval of the cluster configuration data has successfully completed.
2020/02/29 18:57:49 CLSRSC-595: Executing upgrade step 4 of 18: 'GenSiteGUIDs'.
2020/02/29 18:57:49 CLSRSC-595: Executing upgrade step 5 of 18: 'UpgPrechecks'.
2020/02/29 18:57:51 CLSRSC-363: User ignored prerequisites during installation
2020/02/29 18:57:52 CLSRSC-595: Executing upgrade step 6 of 18: 'SetupOSD'.
2020/02/29 18:57:53 CLSRSC-595: Executing upgrade step 7 of 18: 'PreUpgrade'.

ASM configuration upgraded in local node successfully.

2020/02/29 18:58:01 CLSRSC-466: Starting shutdown of the current Oracle Grid Infrastructure stack
2020/02/29 18:58:32 CLSRSC-467: Shutdown of the current Oracle Grid Infrastructure stack has successfully completed.
2020/02/29 18:58:34 CLSRSC-595: Executing upgrade step 8 of 18: 'CheckCRSConfig'.
2020/02/29 18:58:38 CLSRSC-595: Executing upgrade step 9 of 18: 'UpgradeOLR'.
2020/02/29 18:58:42 CLSRSC-595: Executing upgrade step 10 of 18: 'ConfigCHMOS'.
2020/02/29 18:58:43 CLSRSC-595: Executing upgrade step 11 of 18: 'UpgradeAFD'.
2020/02/29 18:58:44 CLSRSC-595: Executing upgrade step 12 of 18: 'createOHASD'.
2020/02/29 18:58:46 CLSRSC-595: Executing upgrade step 13 of 18: 'ConfigOHASD'.
2020/02/29 18:58:46 CLSRSC-329: Replacing Clusterware entries in file 'oracle-ohasd.service'
2020/02/29 18:59:12 CLSRSC-595: Executing upgrade step 14 of 18: 'InstallACFS'.
2020/02/29 18:59:20 CLSRSC-595: Executing upgrade step 15 of 18: 'InstallKA'.
2020/02/29 18:59:21 CLSRSC-595: Executing upgrade step 16 of 18: 'UpgradeCluster'.
2020/02/29 19:00:42 CLSRSC-4003: Successfully patched Oracle Trace File Analyzer (TFA) Collector.
2020/02/29 19:01:12 CLSRSC-343: Successfully started Oracle Clusterware stack
clscfg: EXISTING configuration version 19 detected.
Successfully taken the backup of node specific configuration in OCR.
Successfully accumulated necessary OCR keys.
Creating OCR keys for user 'root', privgrp 'root'..
Operation successful.
2020/02/29 19:01:45 CLSRSC-595: Executing upgrade step 17 of 18: 'UpgradeNode'.
Start upgrade invoked..
2020/02/29 19:01:53 CLSRSC-478: Setting Oracle Clusterware active version on the last node to be upgraded
2020/02/29 19:01:53 CLSRSC-482: Running command: '/u01/app/19.3.0.0/grid/bin/crsctl set crs activeversion'
Started to upgrade the active version of Oracle Clusterware. This operation may take a few minutes.
Started to upgrade CSS.
CSS was successfully upgraded.
Started to upgrade Oracle ASM.
Started to upgrade CRS.
CRS was successfully upgraded.
Started to upgrade Oracle ACFS.
Oracle ACFS was successfully upgraded.
Successfully upgraded the active version of Oracle Clusterware.
Oracle Clusterware active version was successfully set to 19.0.0.0.0.
2020/02/29 19:03:04 CLSRSC-479: Successfully set Oracle Clusterware active version
2020/02/29 19:03:05 CLSRSC-476: Finishing upgrade of resource types
2020/02/29 19:03:17 CLSRSC-477: Successfully completed upgrade of resource types
2020/02/29 19:03:50 CLSRSC-595: Executing upgrade step 18 of 18: 'PostUpgrade'.
Successfully updated XAG resources.
2020/02/29 19:04:19 CLSRSC-325: Configure Oracle Grid Infrastructure for a Cluster ... succeeded
[root@ol7-122-rac2 ~]#

==================================================

[oracle@ol7-122-rac1 ~]$ $ORACLE_HOME/OPatch/opatch lspatches -oh /u01/app/19.3.0.0/grid
29585399;OCW RELEASE UPDATE 19.3.0.0.0 (29585399)
29517247;ACFS RELEASE UPDATE 19.3.0.0.0 (29517247)
29517242;Database Release Update : 19.3.0.0.190416 (29517242)
29401763;TOMCAT RELEASE UPDATE 19.0.0.0.0 (29401763)

OPatch succeeded.
[oracle@ol7-122-rac1 ~]$

--------------------------------------------------

[oracle@ol7-122-rac2 ~]$ $ORACLE_HOME/OPatch/opatch lspatches -oh /u01/app/19.3.0.0/grid
29585399;OCW RELEASE UPDATE 19.3.0.0.0 (29585399)
29517247;ACFS RELEASE UPDATE 19.3.0.0.0 (29517247)
29517242;Database Release Update : 19.3.0.0.190416 (29517242)
29401763;TOMCAT RELEASE UPDATE 19.0.0.0.0 (29401763)

OPatch succeeded.
[oracle@ol7-122-rac2 ~]$

==================================================

[oracle@ol7-122-rac1 ~]$ cluvfy stage -post crsinst -allnodes -collect cluster -gi_upgrade

Post-check for cluster services setup was successful.

CVU operation performed:      stage -post crsinst
Date:                         Feb 29, 2020 7:39:23 PM
CVU home:                     /u01/app/19.3.0.0/grid/
User:                         oracle
[oracle@ol7-122-rac1 ~]$

==================================================

[oracle@ol7-122-rac1 ~]$ crsctl check cluster -all
**************************************************************
ol7-122-rac1:
CRS-4537: Cluster Ready Services is online
CRS-4529: Cluster Synchronization Services is online
CRS-4533: Event Manager is online
**************************************************************
ol7-122-rac2:
CRS-4537: Cluster Ready Services is online
CRS-4529: Cluster Synchronization Services is online
CRS-4533: Event Manager is online
**************************************************************

[oracle@ol7-122-rac1 ~]$ crsctl query crs softwareversion
Oracle Clusterware version on node [ol7-122-rac1] is [19.0.0.0.0]

[oracle@ol7-122-rac1 ~]$ crsctl query crs softwarepatch
Oracle Clusterware patch level on node ol7-122-rac1 is [724960844].

[oracle@ol7-122-rac1 ~]$ crsctl query crs releaseversion
Oracle High Availability Services release version on the local node is [19.0.0.0.0]

[oracle@ol7-122-rac1 ~]$ crsctl query crs releasepatch
Oracle Clusterware release patch level is [724960844] and the complete list of patches [29401763 29517242 29517247 29585399 ] have been applied on the local node. The release patch string is [19.3.0.0.0].

[oracle@ol7-122-rac1 ~]$ crsctl query crs activeversion -f
Oracle Clusterware active version on the cluster is [19.0.0.0.0]. The cluster upgrade state is [NORMAL]. The cluster active patch level is [724960844].
[oracle@ol7-122-rac1 ~]$

--------------------------------------------------

[oracle@ol7-122-rac2 ~]$ crsctl check cluster -all
**************************************************************
ol7-122-rac1:
CRS-4537: Cluster Ready Services is online
CRS-4529: Cluster Synchronization Services is online
CRS-4533: Event Manager is online
**************************************************************
ol7-122-rac2:
CRS-4537: Cluster Ready Services is online
CRS-4529: Cluster Synchronization Services is online
CRS-4533: Event Manager is online
**************************************************************

[oracle@ol7-122-rac2 ~]$ crsctl query crs softwareversion
Oracle Clusterware version on node [ol7-122-rac2] is [19.0.0.0.0]

[oracle@ol7-122-rac2 ~]$ crsctl query crs softwarepatch
Oracle Clusterware patch level on node ol7-122-rac2 is [724960844].

[oracle@ol7-122-rac2 ~]$ crsctl query crs releaseversion
Oracle High Availability Services release version on the local node is [19.0.0.0.0]

[oracle@ol7-122-rac2 ~]$ crsctl query crs releasepatch
Oracle Clusterware release patch level is [724960844] and the complete list of patches [29401763 29517242 29517247 29585399 ] have been applied on the local node. The release patch string is [19.3.0.0.0].

[oracle@ol7-122-rac2 ~]$

==================================================

[oracle@ol7-122-rac1 ~]$ echo $ORACLE_HOME
/u01/app/19.3.0.0/grid
[oracle@ol7-122-rac1 ~]$ $ORACLE_HOME/OPatch/opatch version
OPatch Version: 12.2.0.1.17

OPatch succeeded.
[oracle@ol7-122-rac1 ~]$ rm -rf $ORACLE_HOME/OPatch/*
[oracle@ol7-122-rac1 ~]$ unzip -qo /u01/app/oracle/patch/p6880880_190000_Linux-x86-64.zip -d $ORACLE_HOME
[oracle@ol7-122-rac1 ~]$ $ORACLE_HOME/OPatch/opatch version
OPatch Version: 12.2.0.1.19

OPatch succeeded.
[oracle@ol7-122-rac1 ~]$

--------------------------------------------------

[oracle@ol7-122-rac2 ~]$ echo $ORACLE_HOME
/u01/app/19.3.0.0/grid
[oracle@ol7-122-rac2 ~]$ $ORACLE_HOME/OPatch/opatch version
OPatch Version: 12.2.0.1.17

OPatch succeeded.
[oracle@ol7-122-rac2 ~]$ rm -rf $ORACLE_HOME/OPatch/*
[oracle@ol7-122-rac2 ~]$ unzip -qo /u01/app/oracle/patch/p6880880_190000_Linux-x86-64.zip -d $ORACLE_HOME
[oracle@ol7-122-rac2 ~]$ $ORACLE_HOME/OPatch/opatch version
OPatch Version: 12.2.0.1.19

OPatch succeeded.
[oracle@ol7-122-rac2 ~]$
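Repeating the OPatch refresh by hand on every node gets tedious as the cluster grows. Since passwordless ssh is already configured between the nodes, the same two commands could be driven from one host. The following is only a sketch under that assumption – refresh_opatch_cmd is a hypothetical helper, not an Oracle tool, and the paths are the ones from the transcripts above:

```shell
# Sketch only: refresh OPatch on every node from one host, assuming
# passwordless ssh. refresh_opatch_cmd is a hypothetical helper.
refresh_opatch_cmd() {
  local oh="$1" zip="$2"
  # remove the old OPatch and unzip the new one into the same home
  printf 'rm -rf %s/OPatch/* && unzip -qo %s -d %s' "$oh" "$zip" "$oh"
}

OH=/u01/app/19.3.0.0/grid
ZIP=/u01/app/oracle/patch/p6880880_190000_Linux-x86-64.zip
DRYRUN=${DRYRUN:-1}   # default to dry-run; set DRYRUN=0 to actually run

for node in ol7-122-rac1 ol7-122-rac2; do
  if [ "$DRYRUN" = 1 ]; then
    echo "would run on $node: $(refresh_opatch_cmd "$OH" "$ZIP")"
  else
    ssh "$node" "$(refresh_opatch_cmd "$OH" "$ZIP")"
  fi
done
```

The dry-run default is deliberate: print what would be removed before letting anything near an Oracle home.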

==================================================

[root@ol7-122-rac1 ~]# $ORACLE_HOME/OPatch/opatchauto apply /u01/app/oracle/patch/30501910

OPatchauto session is initiated at Sat Feb 29 20:04:21 2020

System initialization log file is /u01/app/19.3.0.0/grid/cfgtoollogs/opatchautodb/systemconfig2020-02-29_08-04-24PM.log.

Session log file is /u01/app/19.3.0.0/grid/cfgtoollogs/opatchauto/opatchauto2020-02-29_08-04-50PM.log
The id for this session is TMIS

Executing OPatch prereq operations to verify patch applicability on home /u01/app/19.3.0.0/grid
Patch applicability verified successfully on home /u01/app/19.3.0.0/grid


Bringing down CRS service on home /u01/app/19.3.0.0/grid
Prepatch operation log file location: /u01/app/oracle/crsdata/ol7-122-rac1/crsconfig/crspatch_ol7-122-rac1_2020-02-29_05-04-37PM.log
CRS service brought down successfully on home /u01/app/19.3.0.0/grid


Start applying binary patch on home /u01/app/19.3.0.0/grid
Binary patch applied successfully on home /u01/app/19.3.0.0/grid


Starting CRS service on home /u01/app/19.3.0.0/grid
Postpatch operation log file location: /u01/app/oracle/crsdata/ol7-122-rac1/crsconfig/crspatch_ol7-122-rac1_2020-02-29_05-04-37PM.log
CRS service started successfully on home /u01/app/19.3.0.0/grid

OPatchAuto successful.

--------------------------------Summary--------------------------------

Patching is completed successfully. Please find the summary as follows:

Host:ol7-122-rac1
CRS Home:/u01/app/19.3.0.0/grid
Version:19.0.0.0.0
Summary:

==Following patches were SUCCESSFULLY applied:

Patch: /u01/app/oracle/patch/30501910/30489227
Log: /u01/app/19.3.0.0/grid/cfgtoollogs/opatchauto/core/opatch/opatch2020-02-29_20-09-33PM_1.log

Patch: /u01/app/oracle/patch/30501910/30489632
Log: /u01/app/19.3.0.0/grid/cfgtoollogs/opatchauto/core/opatch/opatch2020-02-29_20-09-33PM_1.log

Patch: /u01/app/oracle/patch/30501910/30557433
Log: /u01/app/19.3.0.0/grid/cfgtoollogs/opatchauto/core/opatch/opatch2020-02-29_20-09-33PM_1.log

Patch: /u01/app/oracle/patch/30501910/30655595
Log: /u01/app/19.3.0.0/grid/cfgtoollogs/opatchauto/core/opatch/opatch2020-02-29_20-09-33PM_1.log



Following homes are skipped during patching as patches are not applicable:

/u01/app/oracle/product/12.2.0.1/dbhome_1



OPatchauto session completed at Sat Feb 29 20:22:52 2020
Time taken to complete the session 18 minutes, 31 seconds
[root@ol7-122-rac1 ~]#

--------------------------------------------------

[root@ol7-122-rac2 ~]# $ORACLE_HOME/OPatch/opatchauto apply /u01/app/oracle/patch/30501910

OPatchauto session is initiated at Sat Feb 29 20:24:25 2020

System initialization log file is /u01/app/19.3.0.0/grid/cfgtoollogs/opatchautodb/systemconfig2020-02-29_08-24-28PM.log.

Session log file is /u01/app/19.3.0.0/grid/cfgtoollogs/opatchauto/opatchauto2020-02-29_08-24-53PM.log
The id for this session is X5R1

Executing OPatch prereq operations to verify patch applicability on home /u01/app/19.3.0.0/grid
Patch applicability verified successfully on home /u01/app/19.3.0.0/grid


Bringing down CRS service on home /u01/app/19.3.0.0/grid
Prepatch operation log file location: /u01/app/oracle/crsdata/ol7-122-rac2/crsconfig/crspatch_ol7-122-rac2_2020-02-29_05-32-25PM.log
CRS service brought down successfully on home /u01/app/19.3.0.0/grid


Start applying binary patch on home /u01/app/19.3.0.0/grid
Binary patch applied successfully on home /u01/app/19.3.0.0/grid


Starting CRS service on home /u01/app/19.3.0.0/grid

Postpatch operation log file location: /u01/app/oracle/crsdata/ol7-122-rac2/crsconfig/crspatch_ol7-122-rac2_2020-02-29_05-32-25PM.log
CRS service started successfully on home /u01/app/19.3.0.0/grid

OPatchAuto successful.

--------------------------------Summary--------------------------------

Patching is completed successfully. Please find the summary as follows:

Host:ol7-122-rac2
CRS Home:/u01/app/19.3.0.0/grid
Version:19.0.0.0.0
Summary:

==Following patches were SUCCESSFULLY applied:

Patch: /u01/app/oracle/patch/30501910/30489227
Log: /u01/app/19.3.0.0/grid/cfgtoollogs/opatchauto/core/opatch/opatch2020-02-29_20-30-45PM_1.log

Patch: /u01/app/oracle/patch/30501910/30489632
Log: /u01/app/19.3.0.0/grid/cfgtoollogs/opatchauto/core/opatch/opatch2020-02-29_20-30-45PM_1.log

Patch: /u01/app/oracle/patch/30501910/30557433
Log: /u01/app/19.3.0.0/grid/cfgtoollogs/opatchauto/core/opatch/opatch2020-02-29_20-30-45PM_1.log

Patch: /u01/app/oracle/patch/30501910/30655595
Log: /u01/app/19.3.0.0/grid/cfgtoollogs/opatchauto/core/opatch/opatch2020-02-29_20-30-45PM_1.log



Following homes are skipped during patching as patches are not applicable:

/u01/app/oracle/product/12.2.0.1/dbhome_1



OPatchauto session completed at Sat Feb 29 20:54:46 2020
Time taken to complete the session 30 minutes, 21 seconds
[root@ol7-122-rac2 ~]#

==================================================

[oracle@ol7-122-rac1 ~]$ $ORACLE_HOME/OPatch/opatch lspatches -oh /u01/app/19.3.0.0/grid
30655595;TOMCAT RELEASE UPDATE 19.0.0.0.0 (30655595)
30557433;Database Release Update : 19.6.0.0.200114 (30557433)
30489632;ACFS RELEASE UPDATE 19.6.0.0.0 (30489632)
30489227;OCW RELEASE UPDATE 19.6.0.0.0 (30489227)

OPatch succeeded.
[oracle@ol7-122-rac1 ~]$

--------------------------------------------------

[oracle@ol7-122-rac2 ~]$ $ORACLE_HOME/OPatch/opatch lspatches -oh /u01/app/19.3.0.0/grid
30655595;TOMCAT RELEASE UPDATE 19.0.0.0.0 (30655595)
30557433;Database Release Update : 19.6.0.0.200114 (30557433)
30489632;ACFS RELEASE UPDATE 19.6.0.0.0 (30489632)
30489227;OCW RELEASE UPDATE 19.6.0.0.0 (30489227)

OPatch succeeded.
[oracle@ol7-122-rac2 ~]$

==================================================

[oracle@ol7-122-rac1 ~]$ crsctl query crs releasepatch
Oracle Clusterware release patch level is [2701864972] and the complete list of patches [30489227 30489632 30557433 30655595 ] have been applied on the local node. The release patch string is [19.6.0.0.0].
[oracle@ol7-122-rac1 ~]$

--------------------------------------------------

[oracle@ol7-122-rac2 ~]$ crsctl query crs releasepatch
Oracle Clusterware release patch level is [2701864972] and the complete list of patches [30489227 30489632 30557433 30655595 ] have been applied on the local node. The release patch string is [19.6.0.0.0].
[oracle@ol7-122-rac2 ~]$

Try, Try, And Try Again

Fri, 2020-02-28 20:12

I hope you don’t judge a post by its title.

While working with a Vagrant build, I experienced periodic timeouts.

    default: ******************************************************************************
    default: Unzip database software. Sat Feb 29 00:34:50 UTC 2020
    default: ******************************************************************************
    default: ‘/vagrant_software/LINUX.X64_193000_db_home.zip’ -> ‘./LINUX.X64_193000_db_home.zip’
    default: cp: error reading ‘/vagrant_software/LINUX.X64_193000_db_home.zip’: Protocol error
    default: cp: failed to extend ‘./LINUX.X64_193000_db_home.zip’: Protocol error
    default: 1
    default:
    default: real       0m3.468s
    default: user       0m0.000s
    default: sys        0m0.558s
    default: unzip:  cannot find or open LINUX.X64_193000_db_home.zip, LINUX.X64_193000_db_home.zip.zip or LINUX.X64_193000_db_home.zip.ZIP.
    default: 9
    default:
    default: real       0m0.006s
    default: user       0m0.001s
    default: sys        0m0.001s
    default: 0
    default:
    default: real       0m0.001s
    default: user       0m0.000s
    default: sys        0m0.001s
    default: ******************************************************************************
    default: Do database software-only installation. Sat Feb 29 00:34:54 UTC 2020
    default: ******************************************************************************
    default: /vagrant/scripts/oracle_db_software_installation.sh: line 15: /u01/app/oracle/product/19.0.0/dbhome_1/runInstaller: No such file or directory
    default: ******************************************************************************

I have probably brought this upon myself, since I Create Windows Symlinks to Vagrant Software Folder.

However, what agitates me is that the copy works for the GI software but not for the DB software.

Enough of the rant – here is the resolution.

Each while loop re-runs the previous command (via fc -s) until it succeeds.

echo "******************************************************************************"
echo "Unzip database software." `date`
echo "******************************************************************************"

time cp -fv /vagrant_software/${DB_SOFTWARE} ${ORACLE_BASE}
while [ $? -ne 0 ] ; do fc -s ; done
time unzip -oq ${ORACLE_BASE}/${DB_SOFTWARE} -d ${ORACLE_HOME}
while [ $? -ne 0 ] ; do fc -s ; done
time rm -fv ${ORACLE_BASE}/${DB_SOFTWARE}
while [ $? -ne 0 ] ; do fc -s ; done
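As an aside, fc -s replays the previous command from shell history, which may fail in a non-interactive script unless history is enabled, and it retries forever. A more conventional alternative would be a bounded retry wrapper – retry below is a hypothetical helper, not from the original script:

```shell
# Hypothetical bounded-retry helper: re-run a command until it succeeds
# or the retry budget is exhausted.
retry() {
  local max="$1"; shift
  local n=0
  until "$@"; do
    n=$((n + 1))
    if [ "$n" -ge "$max" ]; then
      echo "giving up after $n attempts: $*" >&2
      return 1
    fi
    sleep 1   # brief pause before retrying
  done
}
```

Usage would then be, for example, retry 5 cp -fv /vagrant_software/${DB_SOFTWARE} ${ORACLE_BASE} instead of each cp/while pair.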

Somehow, it just magically works.

    default: ******************************************************************************
    default: Unzip database software. Sat Feb 29 01:57:23 UTC 2020
    default: ******************************************************************************
    default: ‘/vagrant_software/LINUX.X64_193000_db_home.zip’ -> ‘/u01/app/oracle/LINUX.X64_193000_db_home.zip’
    default:
    default: real       0m11.455s
    default: user       0m0.008s
    default: sys        0m1.409s
    default:
    default: real       1m50.354s
    default: user       0m46.963s
    default: sys        0m8.285s
    default: removed ‘/u01/app/oracle/LINUX.X64_193000_db_home.zip’
    default:
    default: real       0m0.133s
    default: user       0m0.000s
    default: sys        0m0.128s
    default: ******************************************************************************
    default: Do database software-only installation. Sat Feb 29 01:59:25 UTC 2020
    default: ******************************************************************************
    default: Launching Oracle Database Setup Wizard...

Back to work!

Create Windows Symlinks to Vagrant Software Folder

Sat, 2020-02-22 06:45

Most Vagrant implementations have some derivative of a software folder containing the file put_software_here.txt.

It does not make sense to copy the software into that folder, and then delete it afterwards, for every new Vagrant build.

Luckily, there is a way to create symlinks to a centralized software location.

Starting vagrant failed with ERROR: gi_software does not exist, because the software folder was purposely deleted for this demo.
Note: Using Git BASH for Windows.

resetlogs@ghost MINGW64 /g/oracle/vagrant-boxes/OracleFPP (master)
$ ls -l
total 28
drwxr-xr-x 1 dinh 197121     0 Feb 22 06:35 config/
drwxr-xr-x 1 dinh 197121     0 Feb 22 06:35 images/
-rw-r--r-- 1 dinh 197121  1896 Feb 22 06:35 LICENSE.txt
-rw-r--r-- 1 dinh 197121 10645 Feb 22 06:35 README.md
drwxr-xr-x 1 dinh 197121     0 Feb 22 06:35 scripts/
-rw-r--r-- 1 dinh 197121  5449 Feb 22 06:35 THIRD_PARTY_LICENSES.txt
drwxr-xr-x 1 dinh 197121     0 Feb 22 06:35 userscripts/
-rw-r--r-- 1 dinh 197121 16610 Feb 22 06:45 Vagrantfile

resetlogs@ghost MINGW64 /g/oracle/vagrant-boxes/OracleFPP (master)
$ vagrant up
ERROR: gi_software does not exist

resetlogs@ghost MINGW64 /g/oracle/vagrant-boxes/OracleFPP (master)

Create the symlink from the Windows command line:

C:\Windows\System32>mklink
Creates a symbolic link.

MKLINK [[/D] | [/H] | [/J]] Link Target

        /D      Creates a directory symbolic link.  Default is a file
                symbolic link.
        /H      Creates a hard link instead of a symbolic link.
        /J      Creates a Directory Junction.
        Link    Specifies the new symbolic link name.
        Target  Specifies the path (relative or absolute) that the new link
                refers to.

C:\Windows\System32>mklink /J "G:\oracle\vagrant-boxes\OracleFPP\ORCL_software" "E:\ORCL_software"
Junction created for G:\oracle\vagrant-boxes\OracleFPP\ORCL_software <<===>> E:\ORCL_software

C:\Windows\System32>

Read more about it: The Complete Guide to Creating Symbolic Links (aka Symlinks) on Windows

Review symlinks and start vagrant successfully:

resetlogs@ghost MINGW64 /g/oracle/vagrant-boxes/OracleFPP (master)
$ ls -l
total 28
drwxr-xr-x 1 dinh 197121     0 Feb 22 06:35 config/
drwxr-xr-x 1 dinh 197121     0 Feb 22 06:35 images/
-rw-r--r-- 1 dinh 197121  1896 Feb 22 06:35 LICENSE.txt
lrwxrwxrwx 1 dinh 197121    16 Feb 22 07:05 ORCL_software -> /e/ORCL_software/
-rw-r--r-- 1 dinh 197121 10645 Feb 22 06:35 README.md
drwxr-xr-x 1 dinh 197121     0 Feb 22 06:35 scripts/
-rw-r--r-- 1 dinh 197121  5449 Feb 22 06:35 THIRD_PARTY_LICENSES.txt
drwxr-xr-x 1 dinh 197121     0 Feb 22 06:35 userscripts/
-rw-r--r-- 1 dinh 197121 16610 Feb 22 06:45 Vagrantfile

resetlogs@ghost MINGW64 /g/oracle/vagrant-boxes/OracleFPP (master)
$ vagrant up
getting Proxy Configuration from Host...
Bringing machine 'host1' up with 'virtualbox' provider...
Bringing machine 'host2' up with 'virtualbox' provider...
==> host1: Importing base box 'ol74'...

Simplify RMAN Restore With Meaningful Tag

Wed, 2020-02-19 15:44

Here is a simple demo of how to restore an RMAN backup by its tag in the event of a failed migration.
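The listings below locate every backup piece by tag. The restore step itself is not part of this excerpt, but with a meaningful tag it reduces to roughly the following sketch (my addition, assuming the controlfile is already in place and the needed archivelogs are in the tagged backups; the resetlogs open applies only if recovery is incomplete):

```
RMAN> restore database from tag 'MIGRATION';
RMAN> recover database;
RMAN> alter database open resetlogs;
```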

List backup from File System:

[oracle@db-fs-1 ~]$ ls -alrt /u01/backup/*MIGRATION*
-rw-r----- 1 oracle oinstall  12886016 Feb 18 21:56 /u01/backup/HAWK_3291419015_20200218_1cuosf50_1_1_MIGRATION_44
-rw-r----- 1 oracle oinstall   1073152 Feb 18 21:56 /u01/backup/HAWK_3291419015_20200218_1duosf58_1_1_MIGRATION_45
-rw-r----- 1 oracle oinstall 112263168 Feb 18 21:57 /u01/backup/HAWK_3291419015_20200218_1buosf50_1_1_MIGRATION_43
-rw-r----- 1 oracle oinstall 212926464 Feb 18 21:57 /u01/backup/HAWK_3291419015_20200218_1auosf50_1_1_MIGRATION_42
-rw-r----- 1 oracle oinstall   2946560 Feb 18 21:57 /u01/backup/HAWK_3291419015_20200218_1euosf61_1_1_MIGRATION_46
-rw-r----- 1 oracle oinstall    114688 Feb 18 21:57 /u01/backup/HAWK_3291419015_20200218_1fuosf63_1_1_MIGRATION_47
-rw-r----- 1 oracle oinstall   1114112 Feb 18 21:57 /u01/backup/HAWK_3291419015_20200218_1guosf64_1_1_MIGRATION_48
-rw-r----- 1 oracle oinstall      3584 Feb 18 21:57 /u01/backup/HAWK_3291419015_20200218_1iuosf67_1_1_MIGRATION_50
-rw-r----- 1 oracle oinstall   2946560 Feb 18 21:57 /u01/backup/HAWK_3291419015_20200218_1huosf67_1_1_MIGRATION_49
-rw-r----- 1 oracle oinstall   1114112 Feb 18 21:57 /u01/backup/CTL_HAWK_3291419015_20200218_1juosf6a_1_1_MIGRATION_51
-rw-r----- 1 oracle oinstall      3584 Feb 18 21:57 /u01/backup/CTL_HAWK_3291419015_20200218_1kuosf6c_1_1_MIGRATION_52
-rw-r----- 1 oracle oinstall    114688 Feb 18 21:57 /u01/backup/CTL_HAWK_3291419015_20200218_1luosf6e_1_1_MIGRATION_53
-rw-r----- 1 oracle oinstall   1114112 Feb 18 21:57 /u01/backup/CTL_HAWK_3291419015_20200218_1muosf6f_1_1_MIGRATION_54
[oracle@db-fs-1 ~]$

List backup from RMAN:

[oracle@db-fs-1 ~]$ rman target /

Recovery Manager: Release 12.2.0.1.0 - Production on Wed Feb 19 04:21:17 2020

Copyright (c) 1982, 2017, Oracle and/or its affiliates.  All rights reserved.

connected to target database: HAWK (DBID=3291419015)

RMAN> list backup summary tag='MIGRATION';


List of Backups
===============
Key     TY LV S Device Type Completion Time      #Pieces #Copies Compressed Tag
------- -- -- - ----------- -------------------- ------- ------- ---------- ---
39      B  0  A DISK        2020-FEB-18 21:56:52 1       1       YES        MIGRATION
40      B  0  A DISK        2020-FEB-18 21:56:56 1       1       YES        MIGRATION
41      B  0  A DISK        2020-FEB-18 21:57:11 1       1       YES        MIGRATION
42      B  0  A DISK        2020-FEB-18 21:57:17 1       1       YES        MIGRATION
43      B  A  A DISK        2020-FEB-18 21:57:22 1       1       YES        MIGRATION
44      B  F  A DISK        2020-FEB-18 21:57:23 1       1       YES        MIGRATION
45      B  F  A DISK        2020-FEB-18 21:57:25 1       1       YES        MIGRATION
46      B  A  A DISK        2020-FEB-18 21:57:27 1       1       YES        MIGRATION
47      B  A  A DISK        2020-FEB-18 21:57:27 1       1       YES        MIGRATION
48      B  F  A DISK        2020-FEB-18 21:57:31 1       1       YES        MIGRATION
49      B  A  A DISK        2020-FEB-18 21:57:32 1       1       YES        MIGRATION
50      B  F  A DISK        2020-FEB-18 21:57:34 1       1       YES        MIGRATION
52      B  F  A DISK        2020-FEB-18 21:57:36 1       1       YES        MIGRATION

RMAN> list backup of database summary tag='MIGRATION';


List of Backups
===============
Key     TY LV S Device Type Completion Time      #Pieces #Copies Compressed Tag
------- -- -- - ----------- -------------------- ------- ------- ---------- ---
39      B  0  A DISK        2020-FEB-18 21:56:52 1       1       YES        MIGRATION
40      B  0  A DISK        2020-FEB-18 21:56:56 1       1       YES        MIGRATION
41      B  0  A DISK        2020-FEB-18 21:57:11 1       1       YES        MIGRATION
42      B  0  A DISK        2020-FEB-18 21:57:17 1       1       YES        MIGRATION

RMAN> list backup of archivelog all summary tag='MIGRATION';


List of Backups
===============
Key     TY LV S Device Type Completion Time      #Pieces #Copies Compressed Tag
------- -- -- - ----------- -------------------- ------- ------- ---------- ---
43      B  A  A DISK        2020-FEB-18 21:57:22 1       1       YES        MIGRATION
46      B  A  A DISK        2020-FEB-18 21:57:27 1       1       YES        MIGRATION
47      B  A  A DISK        2020-FEB-18 21:57:27 1       1       YES        MIGRATION
49      B  A  A DISK        2020-FEB-18 21:57:32 1       1       YES        MIGRATION

RMAN> list backup of controlfile summary tag='MIGRATION';


List of Backups
===============
Key     TY LV S Device Type Completion Time      #Pieces #Copies Compressed Tag
------- -- -- - ----------- -------------------- ------- ------- ---------- ---
45      B  F  A DISK        2020-FEB-18 21:57:25 1       1       YES        MIGRATION
48      B  F  A DISK        2020-FEB-18 21:57:31 1       1       YES        MIGRATION
52      B  F  A DISK        2020-FEB-18 21:57:36 1       1       YES        MIGRATION

RMAN> list backup of spfile summary tag='MIGRATION';


List of Backups
===============
Key     TY LV S Device Type Completion Time      #Pieces #Copies Compressed Tag
------- -- -- - ----------- -------------------- ------- ------- ---------- ---
44      B  F  A DISK        2020-FEB-18 21:57:23 1       1       YES        MIGRATION
50      B  F  A DISK        2020-FEB-18 21:57:34 1       1       YES        MIGRATION

RMAN> list backupset 42,49,50,52;


List of Backup Sets
===================


BS Key  Type LV Size       Device Type Elapsed Time Completion Time
------- ---- -- ---------- ----------- ------------ --------------------
42      Incr 0  203.05M    DISK        00:00:29     2020-FEB-18 21:57:17
        BP Key: 42   Status: AVAILABLE  Compressed: YES  Tag: MIGRATION
        Piece Name: /u01/backup/HAWK_3291419015_20200218_1auosf50_1_1_MIGRATION_42
        Keep: BACKUP_LOGS        Until: 2020-AUG-18 21:56:48
  List of Datafiles in backup set 42
  File LV Type Ckp SCN    Ckp Time             Abs Fuz SCN Sparse Name
  ---- -- ---- ---------- -------------------- ----------- ------ ----
  1    0  Incr 1428959    2020-FEB-18 21:56:48              NO    /u02/oradata/HAWK/datafile/o1_mf_system_h4s874gt_.dbf

BS Key  Size       Device Type Elapsed Time Completion Time
------- ---------- ----------- ------------ --------------------
49      3.00K      DISK        00:00:00     2020-FEB-18 21:57:32
        BP Key: 49   Status: AVAILABLE  Compressed: YES  Tag: MIGRATION
        Piece Name: /u01/backup/CTL_HAWK_3291419015_20200218_1kuosf6c_1_1_MIGRATION_52
        Keep: BACKUP_LOGS        Until: 2020-AUG-18 21:57:32

  List of Archived Logs in backup set 49
  Thrd Seq     Low SCN    Low Time             Next SCN   Next Time
  ---- ------- ---------- -------------------- ---------- ---------
  1    3       1429002    2020-FEB-18 21:57:26 1429040    2020-FEB-18 21:57:32

BS Key  Type LV Size       Device Type Elapsed Time Completion Time
------- ---- -- ---------- ----------- ------------ --------------------
50      Full    96.00K     DISK        00:00:00     2020-FEB-18 21:57:34
        BP Key: 50   Status: AVAILABLE  Compressed: YES  Tag: MIGRATION
        Piece Name: /u01/backup/CTL_HAWK_3291419015_20200218_1luosf6e_1_1_MIGRATION_53
        Keep: BACKUP_LOGS        Until: 2020-AUG-18 21:57:33
  SPFILE Included: Modification time: 2020-FEB-18 21:51:45
  SPFILE db_unique_name: HAWK

BS Key  Type LV Size       Device Type Elapsed Time Completion Time
------- ---- -- ---------- ----------- ------------ --------------------
52      Full    1.05M      DISK        00:00:01     2020-FEB-18 21:57:36
        BP Key: 52   Status: AVAILABLE  Compressed: YES  Tag: MIGRATION
        Piece Name: /u01/backup/CTL_HAWK_3291419015_20200218_1muosf6f_1_1_MIGRATION_54
        Keep: BACKUP_LOGS        Until: 2020-AUG-18 21:57:35
  Control File Included: Ckp SCN: 1429047      Ckp time: 2020-FEB-18 21:57:35

RMAN>

You are probably wondering why BS 49, with piece name /u01/backup/CTL_HAWK_3291419015_20200218_1kuosf6c_1_1_MIGRATION_52, contains an archived log. For a KEEP backup, RMAN automatically backs up the archived logs required to recover it (note "archived logs required to recover from this backup will be backed up" in the log snippet), and that archivelog backup piece inherited the CTL_ channel format.

RMAN backup script:

[oracle@db-fs-1 ~]$ cat /u01/backup/backup_keep.rman
spool log to /u01/backup/rman_keep_backup_migration.log
connect target;
set echo on
show all;
run {
allocate channel c1 device type disk format '/u01/backup/%d_%I_%T_%U_MIGRATION_%s' MAXPIECESIZE 2G MAXOPENFILES 1;
allocate channel c2 device type disk format '/u01/backup/%d_%I_%T_%U_MIGRATION_%s' MAXPIECESIZE 2G MAXOPENFILES 1;
allocate channel c3 device type disk format '/u01/backup/%d_%I_%T_%U_MIGRATION_%s' MAXPIECESIZE 2G MAXOPENFILES 1;
backup as compressed backupset incremental level 0 filesperset 1 check logical database
keep until time 'ADD_MONTHS(SYSDATE,6)' TAG='MIGRATION';
backup as compressed backupset archivelog from time 'trunc(sysdate)'
filesperset 8
keep until time 'ADD_MONTHS(SYSDATE,6)' TAG='MIGRATION';
}
run {
allocate channel c1 device type disk format '/u01/backup/CTL_%d_%I_%T_%U_MIGRATION_%s';
backup as compressed backupset current controlfile
keep until time 'ADD_MONTHS(SYSDATE,6)' TAG='MIGRATION';
}
LIST BACKUP OF DATABASE SUMMARY TAG='MIGRATION';
LIST BACKUP OF ARCHIVELOG ALL SUMMARY TAG='MIGRATION';
LIST BACKUP OF CONTROLFILE TAG='MIGRATION';
report schema;
exit
[oracle@db-fs-1 ~]$
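
One hedged way to drive the script above and sanity-check its spool log afterward. This is my own sketch, not from the original post: the check_rman_log function name is hypothetical, and only the RMAN-/ORA- grep pattern matters.

```shell
#!/bin/bash
# Sketch (not the author's script): run the keep-backup command file and
# scan the spool log for RMAN-/ORA- error messages. Paths match the post.

check_rman_log() {
  # Return non-zero if the log contains RMAN- or ORA- error messages.
  if grep -E 'RMAN-[0-9]+|ORA-[0-9]+' "$1"; then
    echo "Errors found in $1" >&2
    return 1
  fi
  echo "No errors found in $1"
}

# Typical invocation (commented out; requires an Oracle environment):
# rman @/u01/backup/backup_keep.rman
# check_rman_log /u01/backup/rman_keep_backup_migration.log
```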

RMAN backup log snippet:

allocated channel: c1
channel c1: SID=57 device type=DISK

Starting backup at 2020-FEB-18 21:57:30

backup will be obsolete on date 2020-AUG-18 21:57:30
archived logs required to recover from this backup will be backed up
channel c1: starting compressed full datafile backup set
channel c1: specifying datafile(s) in backup set
including current control file in backup set
channel c1: starting piece 1 at 2020-FEB-18 21:57:31
channel c1: finished piece 1 at 2020-FEB-18 21:57:32
piece handle=/u01/backup/CTL_HAWK_3291419015_20200218_1juosf6a_1_1_MIGRATION_51 tag=MIGRATION comment=NONE
channel c1: backup set complete, elapsed time: 00:00:01
current log archived

backup will be obsolete on date 2020-AUG-18 21:57:32
archived logs required to recover from this backup will be backed up
channel c1: starting compressed archived log backup set
channel c1: specifying archived log(s) in backup set

******* input archived log thread=1 sequence=3 RECID=30 STAMP=1032731852 *******

channel c1: starting piece 1 at 2020-FEB-18 21:57:32
channel c1: finished piece 1 at 2020-FEB-18 21:57:33
piece handle=/u01/backup/CTL_HAWK_3291419015_20200218_1kuosf6c_1_1_MIGRATION_52 tag=MIGRATION comment=NONE
channel c1: backup set complete, elapsed time: 00:00:01

Restore the backup from RMAN:

RMAN> startup force nomount;

Oracle instance started

Total System Global Area     805306368 bytes

Fixed Size                     8625856 bytes
Variable Size                314573120 bytes
Database Buffers             478150656 bytes
Redo Buffers                   3956736 bytes

RMAN> restore controlfile from '/u01/backup/CTL_HAWK_3291419015_20200218_1muosf6f_1_1_MIGRATION_54';

Starting restore at 2020-FEB-19 03:41:37
allocated channel: ORA_DISK_1
channel ORA_DISK_1: SID=35 device type=DISK

channel ORA_DISK_1: restoring control file
channel ORA_DISK_1: restore complete, elapsed time: 00:00:01
output file name=/u02/fra/HAWK/controlfile/o1_mf_h4r8xqh6_.ctl
Finished restore at 2020-FEB-19 03:41:38

RMAN> alter database mount;

RMAN> catalog start with '/u01/backup' noprompt;

RMAN> restore database preview summary from tag='MIGRATION';

Starting restore at 2020-FEB-19 03:43:05
using channel ORA_DISK_1


List of Backups
===============
Key     TY LV S Device Type Completion Time      #Pieces #Copies Compressed Tag
------- -- -- - ----------- -------------------- ------- ------- ---------- ---
42      B  0  A DISK        2020-FEB-18 21:57:17 1       1       YES        MIGRATION
41      B  0  A DISK        2020-FEB-18 21:57:11 1       1       YES        MIGRATION
39      B  0  A DISK        2020-FEB-18 21:56:52 1       1       YES        MIGRATION
40      B  0  A DISK        2020-FEB-18 21:56:56 1       1       YES        MIGRATION


List of Backups
===============
Key     TY LV S Device Type Completion Time      #Pieces #Copies Compressed Tag
------- -- -- - ----------- -------------------- ------- ------- ---------- ---
47      B  A  A DISK        2020-FEB-18 21:57:27 1       1       YES        MIGRATION
recovery will be done up to SCN 1428959
Media recovery start SCN is 1428959
Recovery must be done beyond SCN 1428964 to clear datafile fuzziness
Finished restore at 2020-FEB-19 03:43:05

RMAN> restore database from tag='MIGRATION';
RMAN> recover database until scn 1428965;

Starting recover at 2020-FEB-19 03:44:45
using channel ORA_DISK_1
RMAN-00571: ===========================================================
RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
RMAN-00571: ===========================================================
RMAN-03002: failure of recover command at 02/19/2020 03:44:45
RMAN-20208: UNTIL CHANGE is before RESETLOGS change

RMAN> list incarnation of database;


List of Database Incarnations
DB Key  Inc Key DB Name  DB ID            STATUS  Reset SCN  Reset Time
------- ------- -------- ---------------- --- ---------- ----------
1       1       HAWK     3291419015       PARENT  1          2017-JAN-26 13:52:29
2       2       HAWK     3291419015       PARENT  1408558    2020-FEB-18 18:49:45
3       3       HAWK     3291419015       PARENT  1424305    2020-FEB-18 20:02:49
4       4       HAWK     3291419015       PARENT  1425161    2020-FEB-18 20:19:50
5       5       HAWK     3291419015       PARENT  1425162    2020-FEB-18 20:33:05
6       6       HAWK     3291419015       PARENT  1426203    2020-FEB-18 21:13:15
7       7       HAWK     3291419015       CURRENT 1428966    2020-FEB-18 22:05:54

The UNTIL SCN must not be before the current incarnation's RESETLOGS SCN (1428966), which is why 1428965 failed:

RMAN> recover database until scn 1428967;

Starting recover at 2020-FEB-19 03:47:41
using channel ORA_DISK_1

starting media recovery

channel ORA_DISK_1: starting archived log restore to default destination
channel ORA_DISK_1: restoring archived log
archived log thread=1 sequence=1
channel ORA_DISK_1: reading from backup piece /u01/backup/HAWK_3291419015_20200218_1huosf67_1_1_MIGRATION_49
channel ORA_DISK_1: piece handle=/u01/backup/HAWK_3291419015_20200218_1huosf67_1_1_MIGRATION_49 tag=MIGRATION
channel ORA_DISK_1: restored backup piece 1
channel ORA_DISK_1: restore complete, elapsed time: 00:00:01
archived log file name=/u02/oradata/HAWK/archivelog/2020_02_19/o1_mf_1_1_h4s8gfjc_.arc thread=1 sequence=1
archived log file name=/u02/oradata/HAWK/archivelog/2020_02_19/o1_mf_1_1_h4rx8c8b_.arc thread=1 sequence=1
channel default: deleting archived log(s)
archived log file name=/u02/oradata/HAWK/archivelog/2020_02_19/o1_mf_1_1_h4s8gfjc_.arc RECID=32 STAMP=1032752861
media recovery complete, elapsed time: 00:00:00
Finished recover at 2020-FEB-19 03:47:42

RMAN> alter database open resetlogs;

Statement processed

RMAN> report schema;

Report of database schema for database with db_unique_name HAWK

List of Permanent Datafiles
===========================
File Size(MB) Tablespace           RB segs Datafile Name
---- -------- -------------------- ------- ------------------------
1    800      SYSTEM               YES     /u02/oradata/HAWK/datafile/o1_mf_system_h4s874gt_.dbf
3    470      SYSAUX               NO      /u02/oradata/HAWK/datafile/o1_mf_sysaux_h4s86of7_.dbf
4    70       UNDOTBS1             YES     /u02/oradata/HAWK/datafile/o1_mf_undotbs1_h4s86kbl_.dbf
7    5        USERS                NO      /u02/oradata/HAWK/datafile/o1_mf_users_h4s86ncz_.dbf

List of Temporary Files
=======================
File Size(MB) Tablespace           Maxsize(MB) Tempfile Name
---- -------- -------------------- ----------- --------------------
1    20       TEMP                 32767       /u02/oradata/HAWK/datafile/o1_mf_temp_h4s8jc3n_.tmp

RMAN> delete force noprompt backup tag='MIGRATION';

using target database control file instead of recovery catalog
allocated channel: ORA_DISK_1
channel ORA_DISK_1: SID=53 device type=DISK

List of Backup Pieces
BP Key  BS Key  Pc# Cp# Status      Device Type Piece Name
------- ------- --- --- ----------- ----------- ----------
39      39      1   1   AVAILABLE   DISK        /u01/backup/HAWK_3291419015_20200218_1cuosf50_1_1_MIGRATION_44
40      40      1   1   AVAILABLE   DISK        /u01/backup/HAWK_3291419015_20200218_1duosf58_1_1_MIGRATION_45
41      41      1   1   AVAILABLE   DISK        /u01/backup/HAWK_3291419015_20200218_1buosf50_1_1_MIGRATION_43
42      42      1   1   AVAILABLE   DISK        /u01/backup/HAWK_3291419015_20200218_1auosf50_1_1_MIGRATION_42
43      43      1   1   AVAILABLE   DISK        /u01/backup/HAWK_3291419015_20200218_1euosf61_1_1_MIGRATION_46
44      44      1   1   AVAILABLE   DISK        /u01/backup/HAWK_3291419015_20200218_1fuosf63_1_1_MIGRATION_47
45      45      1   1   AVAILABLE   DISK        /u01/backup/HAWK_3291419015_20200218_1guosf64_1_1_MIGRATION_48
46      46      1   1   AVAILABLE   DISK        /u01/backup/HAWK_3291419015_20200218_1iuosf67_1_1_MIGRATION_50
47      47      1   1   AVAILABLE   DISK        /u01/backup/HAWK_3291419015_20200218_1huosf67_1_1_MIGRATION_49
48      48      1   1   AVAILABLE   DISK        /u01/backup/CTL_HAWK_3291419015_20200218_1juosf6a_1_1_MIGRATION_51
49      49      1   1   AVAILABLE   DISK        /u01/backup/CTL_HAWK_3291419015_20200218_1kuosf6c_1_1_MIGRATION_52
50      50      1   1   AVAILABLE   DISK        /u01/backup/CTL_HAWK_3291419015_20200218_1luosf6e_1_1_MIGRATION_53
52      52      1   1   AVAILABLE   DISK        /u01/backup/CTL_HAWK_3291419015_20200218_1muosf6f_1_1_MIGRATION_54
deleted backup piece
backup piece handle=/u01/backup/HAWK_3291419015_20200218_1cuosf50_1_1_MIGRATION_44 RECID=39 STAMP=1032731809
deleted backup piece
backup piece handle=/u01/backup/HAWK_3291419015_20200218_1duosf58_1_1_MIGRATION_45 RECID=40 STAMP=1032731816
deleted backup piece
backup piece handle=/u01/backup/HAWK_3291419015_20200218_1buosf50_1_1_MIGRATION_43 RECID=41 STAMP=1032731808
deleted backup piece
backup piece handle=/u01/backup/HAWK_3291419015_20200218_1auosf50_1_1_MIGRATION_42 RECID=42 STAMP=1032731808
deleted backup piece
backup piece handle=/u01/backup/HAWK_3291419015_20200218_1euosf61_1_1_MIGRATION_46 RECID=43 STAMP=1032731841
deleted backup piece
backup piece handle=/u01/backup/HAWK_3291419015_20200218_1fuosf63_1_1_MIGRATION_47 RECID=44 STAMP=1032731843
deleted backup piece
backup piece handle=/u01/backup/HAWK_3291419015_20200218_1guosf64_1_1_MIGRATION_48 RECID=45 STAMP=1032731845
deleted backup piece
backup piece handle=/u01/backup/HAWK_3291419015_20200218_1iuosf67_1_1_MIGRATION_50 RECID=46 STAMP=1032731847
deleted backup piece
backup piece handle=/u01/backup/HAWK_3291419015_20200218_1huosf67_1_1_MIGRATION_49 RECID=47 STAMP=1032731847
deleted backup piece
backup piece handle=/u01/backup/CTL_HAWK_3291419015_20200218_1juosf6a_1_1_MIGRATION_51 RECID=48 STAMP=1032731851
deleted backup piece
backup piece handle=/u01/backup/CTL_HAWK_3291419015_20200218_1kuosf6c_1_1_MIGRATION_52 RECID=49 STAMP=1032731852
deleted backup piece
backup piece handle=/u01/backup/CTL_HAWK_3291419015_20200218_1luosf6e_1_1_MIGRATION_53 RECID=50 STAMP=1032731854
deleted backup piece
backup piece handle=/u01/backup/CTL_HAWK_3291419015_20200218_1muosf6f_1_1_MIGRATION_54 RECID=52 STAMP=1032752561
Deleted 13 objects


RMAN> exit


Recovery Manager complete.

[oracle@db-fs-1 ~]$ ls -alrt /u01/backup/
total 28
drwxrwxr-x 6 oracle oinstall  4096 Feb 18 19:11 ..
-rw-r--r-- 1 oracle oinstall  1104 Feb 18 20:40 backup_keep.rman
-rw-r--r-- 1 oracle oinstall 12346 Feb 18 21:57 rman_keep_backup_migration.log
drwxr-xr-x 2 oracle oinstall  4096 Feb 19 04:28 .
[oracle@db-fs-1 ~]$

Just a crazy idea: keep the same backup tag for all backups until the next level 0.

Backup TAG for daily level 0 backup:

[oracle@db-fs-1 ~]$ echo "$(date +%Y%b%d)"
2020Feb19
[oracle@db-fs-1 ~]$

Backup TAG for weekly level 0 backup:

[oracle@db-fs-1 ~]$ echo "$(date +%Y%b)_WK$(date +%U)"
2020Feb_WK07
[oracle@db-fs-1 ~]$

Backup TAG for monthly level 0 backup:

[oracle@db-fs-1 ~]$ echo "$(date +%Y%b)"
2020Feb
[oracle@db-fs-1 ~]$
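
The three tag formats above could be wrapped in a small helper that picks the tag by backup frequency. This is my own sketch, not the author's script: the make_tag function name is hypothetical, and it assumes GNU date (for the -d option).

```shell
# Sketch: build an RMAN backup tag for a given frequency and date.
# make_tag is a hypothetical helper; the date formats come from the post.
# Assumes GNU date for the -d option.
make_tag() {
  local freq=$1 d=${2:-now}
  case $freq in
    daily)   date -d "$d" +%Y%b%d ;;                                # e.g. 2020Feb19
    weekly)  echo "$(date -d "$d" +%Y%b)_WK$(date -d "$d" +%U)" ;;  # e.g. 2020Feb_WK07
    monthly) date -d "$d" +%Y%b ;;                                  # e.g. 2020Feb
    *)       return 1 ;;
  esac
}

# Possible use in an RMAN command (commented; requires Oracle):
# rman target / <<EOF
# backup as compressed backupset incremental level 0 database tag='$(make_tag daily)';
# EOF
```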

The tag ARCH is used for archivelog backups, but it may not add much value since LV=A in the summary already identifies an archivelog backup.

RMAN> list backup summary;


List of Backups
===============
Key     TY LV S Device Type Completion Time      #Pieces #Copies Compressed Tag
------- -- -- - ----------- -------------------- ------- ------- ---------- ---
69      B  A  A DISK        2020-FEB-19 13:29:02 1       1       NO         ARCH
70      B  A  A DISK        2020-FEB-19 13:29:03 1       1       NO         ARCH
71      B  F  A DISK        2020-FEB-19 13:29:04 1       1       NO         TAG20200219T132904

RMAN>

In writing this post, I realized that my own backup script will need some improvements.

Note to self: how to easily add a user to the sudoers file

Sun, 2020-02-16 07:49

Unable to sudo su -:

[grid@ol7-fpp-fpps ~]$ sudo su -

We trust you have received the usual lecture from the local System
Administrator. It usually boils down to these three things:

    #1) Respect the privacy of others.
    #2) Think before you type.
    #3) With great power comes great responsibility.

[sudo] password for grid:
grid is not in the sudoers file.  This incident will be reported.

[grid@ol7-fpp-fpps ~]$ 

Group wheel has ALL privileges:

[root@ol7-fpp-fpps ~]# grep wheel /etc/sudoers
## Allows people in group wheel to run all commands
%wheel  ALL=(ALL)       ALL
# %wheel        ALL=(ALL)       NOPASSWD: ALL
[root@ol7-fpp-fpps ~]#

Modify user grid:

[root@ol7-fpp-fpps ~]# usermod -aG wheel grid

Or, on distributions that use a sudo group (e.g. Debian/Ubuntu): usermod -aG sudo username

Test sudo su -:

[grid@ol7-fpp-fpps ~]$ sudo su -
[sudo] password for grid:
Last login: Sun Feb 16 08:35:52 -05 2020 on pts/0
[root@ol7-fpp-fpps ~]#
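
To confirm the group membership took effect, a generic check can be scripted (my own sketch, not from the post; the user_in_group name is hypothetical, and a fresh login is needed before the new group applies to sudo):

```shell
# Sketch: check whether a user belongs to a group.
# On the host above you would run: user_in_group grid wheel
user_in_group() {
  # id -nG lists the user's group names; match the target name exactly.
  id -nG "$1" 2>/dev/null | tr ' ' '\n' | grep -qx "$2"
}

# Example:
# user_in_group grid wheel && echo "grid is in wheel"
```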

Modify VAGRANT_HOME For Windows

Tue, 2020-02-11 19:21

I know what you are thinking: “Why make it so complicated!”

Unfortunately, the C: drive is only 500GB while the other attached drives are 1TB, so VAGRANT_HOME was moved to the D: drive.

Download vagrant box from outside of VAGRANT_HOME

resetlogs@ghost MINGW64 /d/VirtualBoxVMs
$ vagrant box add --name ol77 https://yum.oracle.com/boxes/oraclelinux/ol77/ol77.box
==> box: Box file was not detected as metadata. Adding it directly...
==> box: Adding box 'ol77' (v0) for provider:
    box: Downloading: https://yum.oracle.com/boxes/oraclelinux/ol77/ol77.box
    box:
==> box: Successfully added box 'ol77' (v0) for 'virtualbox'!

resetlogs@ghost MINGW64 /d/VirtualBoxVMs
$ ls -l
total 0

resetlogs@ghost MINGW64 /d/VirtualBoxVMs
$

Verify vagrant box is downloaded to VAGRANT_HOME

resetlogs@ghost MINGW64 /d/vagrant.d
$ ls -l
total 5
drwxr-xr-x 1 dinh 197121    0 Feb 11 20:00 boxes/
drwxr-xr-x 1 dinh 197121    0 Feb 11 19:49 data/
drwxr-xr-x 1 dinh 197121    0 Feb 11 19:49 gems/
-rw-r--r-- 1 dinh 197121 1675 Feb 11 19:49 insecure_private_key
drwxr-xr-x 1 dinh 197121    0 Feb 11 19:49 rgloader/
-rw-r--r-- 1 dinh 197121    3 Feb 11 19:49 setup_version
drwxr-xr-x 1 dinh 197121    0 Feb 11 20:00 tmp/

resetlogs@ghost MINGW64 /d/vagrant.d
$ cd boxes

resetlogs@ghost MINGW64 /d/vagrant.d/boxes
$ ls -l
total 0
drwxr-xr-x 1 dinh 197121 0 Feb 11 20:00 ol77/

resetlogs@ghost MINGW64 /d/vagrant.d/boxes
$ env|grep -i vagrant
PWD=/d/vagrant.d/boxes
PATH=/c/Users/Michael Dinh/bin:/mingw64/bin:/usr/local/bin:/usr/bin:/bin:/mingw64/bin:/usr/bin:/c/Users/Michael Dinh/bin:/c/Windows/system32:/c/Windows:/c/Windows/System32/Wbem:/c/Windows/System32/WindowsPowerShell/v1.0:/c/Windows/System32/OpenSSH:/c/Program Files/WinMerge:/cmd:/c/Program Files/PuTTY:/c/Program Files (x86)/Intel/Intel(R) Management Engine Components/DAL:/c/Program Files/Intel/Intel(R) Management Engine Components/DAL:/c/Program Files/Intel/WiFi/bin:/c/Program Files/Common Files/Intel/WirelessCommon:/d/HashiCorp/Vagrant/bin:/c/Users/Michael Dinh/AppData/Local/Microsoft/WindowsApps:/c/Program Files/Java/jdk-12.0.1/bin:/usr/bin/vendor_perl:/usr/bin/core_perl
ORIGINAL_PATH=/mingw64/bin:/usr/bin:/c/Users/Michael Dinh/bin:/c/Windows/system32:/c/Windows:/c/Windows/System32/Wbem:/c/Windows/System32/WindowsPowerShell/v1.0:/c/Windows/System32/OpenSSH:/c/Program Files/WinMerge:/cmd:/c/Program Files/PuTTY:/c/Program Files (x86)/Intel/Intel(R) Management Engine Components/DAL:/c/Program Files/Intel/Intel(R) Management Engine Components/DAL:/c/Program Files/Intel/WiFi/bin:/c/Program Files/Common Files/Intel/WirelessCommon:/d/HashiCorp/Vagrant/bin:/c/Users/Michael Dinh/AppData/Local/Microsoft/WindowsApps:/c/Program Files/Java/jdk-12.0.1/bin
***** VAGRANT_HOME=D:\vagrant.d *****
OLDPWD=/d/vagrant.d

resetlogs@ghost MINGW64 /d/vagrant.d/boxes
$

Delete vagrant box.

resetlogs@ghost MINGW64 /d/vagrant.d/boxes
$ vagrant box remove ol77
Removing box 'ol77' (v0) with provider 'virtualbox'...

resetlogs@ghost MINGW64 /d/vagrant.d/boxes
$

Here are the modifications made to the Windows registry and environment variables.
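
The environment-variable part can also be done from the command line (my own sketch; the value comes from the env output above, and the registry edits from the original post are not reproduced here):

```shell
# Windows cmd, persists for the user (commented; Windows-only):
#   setx VAGRANT_HOME "D:\vagrant.d"

# Git Bash equivalent for the current session only:
export VAGRANT_HOME=/d/vagrant.d
env | grep -i vagrant_home
```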

Use srvctl stop home When Stopping Many Database Instances

Tue, 2020-01-28 12:13

=============================================================================
### Stops all Oracle clusterware resources that run from the Oracle home.
=============================================================================

Usage: srvctl stop home -oraclehome <oracle_home> -statefile <state_file> -node <node_name> [-stopoption <stop_options>] [-force]
    -oraclehome <path>             Oracle home path
    -statefile <file>              Specify a file path for the 'srvctl stop home' command to store the state of the resources
    -node <node_name>              Node name
    -stopoption <stop_option>      Stop options for the database. Examples of shutdown options are NORMAL, TRANSACTIONAL, IMMEDIATE, or ABORT.
    -force                         Force stop
    -help                          Print usage
[oracle@ol7-19-rac1 ~]$

=============================================================================
### Check ALL DB status running from same ORACLE_HOME for NODE
=============================================================================

[oracle@ol7-19-rac1 ~]$ srvctl status database -d cdbrac -v
Instance cdbrac1 is running on node ol7-19-rac1. Instance status: Open.
Instance cdbrac2 is running on node ol7-19-rac2. Instance status: Open.

[oracle@ol7-19-rac1 ~]$ srvctl status home -o $ORACLE_HOME -s $ORACLE_HOME/statushome.txt -node ol7-19-rac1
Database cdbrac is running on node ol7-19-rac1

[oracle@ol7-19-rac1 ~]$ cat $ORACLE_HOME/statushome.txt
db-cdbrac
[oracle@ol7-19-rac1 ~]$

=============================================================================
### STOP ALL DB running from same ORACLE_HOME for NODE
=============================================================================

[oracle@ol7-19-rac1 ~]$ srvctl stop home -o $ORACLE_HOME -s $ORACLE_HOME/stophome.txt -node ol7-19-rac1

[oracle@ol7-19-rac1 ~]$ cat $ORACLE_HOME/stophome.txt
db-cdbrac

[oracle@ol7-19-rac1 ~]$ srvctl status database -d cdbrac -v
Instance cdbrac1 is not running on node ol7-19-rac1
Instance cdbrac2 is running on node ol7-19-rac2. Instance status: Open.
[oracle@ol7-19-rac1 ~]$

=============================================================================
### START ALL DB running from same ORACLE_HOME for NODE
=============================================================================

[oracle@ol7-19-rac1 ~]$ srvctl start home -o $ORACLE_HOME -s $ORACLE_HOME/stophome.txt -node ol7-19-rac1

[oracle@ol7-19-rac1 ~]$ srvctl status database -d cdbrac -v
Instance cdbrac1 is running on node ol7-19-rac1. Instance status: Open.
Instance cdbrac2 is running on node ol7-19-rac2. Instance status: Open.
[oracle@ol7-19-rac1 ~]$
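
The status/stop/start steps above could be scripted. This is a dry-run sketch of my own (it only builds and echoes the commands, since srvctl needs a real cluster; build_home_cmd is a hypothetical helper):

```shell
# Sketch: build the 'srvctl stop|start home' command line for a node,
# matching the long-form options shown in the usage text above.
build_home_cmd() {
  local action=$1 home=$2 statefile=$3 node=$4
  echo "srvctl $action home -oraclehome $home -statefile $statefile -node $node"
}

# Dry run; on a real cluster you would eval the output instead of echoing it:
build_home_cmd stop  "$ORACLE_HOME" "$ORACLE_HOME/stophome.txt" ol7-19-rac1
build_home_cmd start "$ORACLE_HOME" "$ORACLE_HOME/stophome.txt" ol7-19-rac1
```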

Data Guard Fast-Start Failover Test – Shutdown Standby Host

Fri, 2020-01-17 15:41

Data Guard Fast-Start Failover Test – Shutdown Primary Host

Review primary host and start observer:
[oracle@ol7-121-dg1 sql]$ sqlplus / as sysdba

SQL*Plus: Release 12.1.0.2.0 Production on Fri Jan 17 20:42:54 2020

Copyright (c) 1982, 2014, Oracle.  All rights reserved.


Connected to:
Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options

OL7-121-DG1:(SYS@cdb1):PRIMARY> exit
Disconnected from Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options

[oracle@ol7-121-dg1 sql]$ dgmgrl /
DGMGRL for Linux: Version 12.1.0.2.0 - 64bit Production

Copyright (c) 2000, 2013, Oracle. All rights reserved.

Welcome to DGMGRL, type "help" for information.
Connected as SYSDG.
DGMGRL> show configuration

Configuration - my_dg_config

  Protection Mode: MaxPerformance
  Members:
  cdb1      - Primary database
    cdb1_stby - Physical standby database

Fast-Start Failover: DISABLED

Configuration Status:
SUCCESS   (status updated 13 seconds ago)

DGMGRL> enable fast_start failover
Enabled.
DGMGRL> show configuration

Configuration - my_dg_config

  Protection Mode: MaxPerformance
  Members:
  cdb1      - Primary database
    cdb1_stby - (*) Physical standby database

Fast-Start Failover: ENABLED

Configuration Status:
SUCCESS   (status updated 12 seconds ago)

DGMGRL> validate database cdb1

  Database Role:    Primary database

  Ready for Switchover:  Yes

DGMGRL> validate database cdb1_stby

  Database Role:     Physical standby database
  Primary Database:  cdb1

  Ready for Switchover:  Yes
  Ready for Failover:    Yes (Primary Running)

DGMGRL> show database cdb1

Database - cdb1

  Role:               PRIMARY
  Intended State:     TRANSPORT-ON
  Instance(s):
    cdb1

  Database Error(s):
    ORA-16820: fast-start failover observer is no longer observing this database

Database Status:
ERROR

DGMGRL> show database cdb1_stby

Database - cdb1_stby

  Role:               PHYSICAL STANDBY
  Intended State:     APPLY-ON
  Transport Lag:      0 seconds (computed 0 seconds ago)
  Apply Lag:          0 seconds (computed 0 seconds ago)
  Average Apply Rate: 2.00 KByte/s
  Real Time Query:    ON
  Instance(s):
    cdb1

  Database Error(s):
    ORA-16820: fast-start failover observer is no longer observing this database

Database Status:
ERROR

DGMGRL> start observer
[P001 01/17 20:46:01.38] Authentication failed.
DGM-16979: Unable to log on to the primary or standby database as SYSDBA
Failed.
DGMGRL> connect sys@cdb1
Password:
Connected as SYSDBA.
DGMGRL> start observer
Observer started

Restart the standby host, listener, and database:
resetlogs@ghost MINGW64 /g/oraclebase/vagrant/dataguard/ol7_121/node2 (master)
$ vagrant status
Current machine states:

default                   poweroff (virtualbox)

The VM is powered off. To restart the VM, simply run `vagrant up`

resetlogs@ghost MINGW64 /g/oraclebase/vagrant/dataguard/ol7_121/node2 (master)
$ vagrant up
Bringing machine 'default' up with 'virtualbox' provider...

====================================================================
resetlogs@ghost MINGW64 /g/oraclebase/vagrant/dataguard/ol7_121/node2 (master)
$ vagrant status
Current machine states:

default                   running (virtualbox)

The VM is running. To stop this VM, you can run `vagrant halt` to
shut it down forcefully, or you can run `vagrant suspend` to simply
suspend the virtual machine. In either case, to restart it again,
simply run `vagrant up`.

====================================================================
resetlogs@ghost MINGW64 /g/oraclebase/vagrant/dataguard/ol7_121/node2 (master)
$ vagrant ssh
Last login: Fri Jan 17 20:11:35 2020 from 10.0.2.2
[vagrant@ol7-121-dg2 ~]$ sudo su - oracle
Last login: Fri Jan 17 20:11:44 UTC 2020 on pts/0
[oracle@ol7-121-dg2 ~]$ . oraenv <<< cdb1
ORACLE_SID = [cdb1] ? The Oracle base remains unchanged with value /u01/app/oracle
[oracle@ol7-121-dg2 ~]$ lsnrctl start

LSNRCTL for Linux: Version 12.1.0.2.0 - Production on 17-JAN-2020 20:53:20

Copyright (c) 1991, 2014, Oracle.  All rights reserved.

Starting /u01/app/oracle/product/12.1.0.2/dbhome_1/bin/tnslsnr: please wait...

TNSLSNR for Linux: Version 12.1.0.2.0 - Production
System parameter file is /u01/app/oracle/product/12.1.0.2/dbhome_1/network/admin/listener.ora
Log messages written to /u01/app/oracle/diag/tnslsnr/ol7-121-dg2/listener/alert/log.xml
Listening on: (DESCRIPTION=(ADDRESS=(PROTOCOL=tcp)(HOST=ol7-121-dg2.localdomain)(PORT=1521)))
Listening on: (DESCRIPTION=(ADDRESS=(PROTOCOL=ipc)(KEY=EXTPROC1521)))

Connecting to (DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=ol7-121-dg2)(PORT=1521)))
STATUS of the LISTENER
------------------------
Alias                     LISTENER
Version                   TNSLSNR for Linux: Version 12.1.0.2.0 - Production
Start Date                17-JAN-2020 20:53:22
Uptime                    0 days 0 hr. 0 min. 0 sec
Trace Level               off
Security                  ON: Local OS Authentication
SNMP                      OFF
Listener Parameter File   /u01/app/oracle/product/12.1.0.2/dbhome_1/network/admin/listener.ora
Listener Log File         /u01/app/oracle/diag/tnslsnr/ol7-121-dg2/listener/alert/log.xml
Listening Endpoints Summary...
  (DESCRIPTION=(ADDRESS=(PROTOCOL=tcp)(HOST=ol7-121-dg2.localdomain)(PORT=1521)))
  (DESCRIPTION=(ADDRESS=(PROTOCOL=ipc)(KEY=EXTPROC1521)))
Services Summary...
Service "cdb1_stby_DGMGRL" has 1 instance(s).
  Instance "cdb1", status UNKNOWN, has 1 handler(s) for this service...
The command completed successfully
[oracle@ol7-121-dg2 ~]$ cd /sf_working/sql
[oracle@ol7-121-dg2 sql]$ sqlplus / as sysdba

SQL*Plus: Release 12.1.0.2.0 Production on Fri Jan 17 20:53:38 2020

Copyright (c) 1982, 2014, Oracle.  All rights reserved.

Connected to an idle instance.

SYS@cdb1> startup mount;
ORACLE instance started.

Total System Global Area 1610612736 bytes
Fixed Size                  2924928 bytes
Variable Size             520097408 bytes
Database Buffers         1073741824 bytes
Redo Buffers               13848576 bytes
Database mounted.
SYS@cdb1> @stby.sql

Session altered.

*** v$database ***

DB          OPEN                   DATABASE           REMOTE     SWITCHOVER      DATAGUARD  PRIMARY_DB
UNIQUE_NAME MODE                   ROLE               ARCHIVE    STATUS          BROKER     UNIQUE_NAME
----------- ---------------------- ------------------ ---------- --------------- ---------- ---------------
cdb1_stby   MOUNTED                PHYSICAL STANDBY   ENABLED    NOT ALLOWED     ENABLED    cdb1

*** gv$archive_dest ***

                                                                                              MOUNT
 THREAD#  DEST_ID DESTINATION               STATUS       TARGET           SCHEDULE PROCESS       ID
-------- -------- ------------------------- ------------ ---------------- -------- ---------- -----
       1        1 USE_DB_RECOVERY_FILE_DEST VALID        LOCAL            ACTIVE   ARCH           0
       1       32 USE_DB_RECOVERY_FILE_DEST VALID        LOCAL            ACTIVE   RFS            0

*** gv$archive_dest_status ***

                               DATABASE        RECOVERY
 INST_ID  DEST_ID STATUS       MODE            MODE                    GAP_STATUS      ERROR
-------- -------- ------------ --------------- ----------------------- --------------- --------------------------------------------------
       1        1 VALID        MOUNTED-STANDBY IDLE                                    NONE
       1       32 VALID        UNKNOWN         IDLE                                    NONE

*** v$thread ***

 THREAD# CURRENT LOG SEQUENCE STATUS
-------- -------------------- ------------
       1                   26 OPEN

*** gv$archived_log ***

 DEST_ID  THREAD# APPLIED    MAX_SEQ MAX_TIME             DELTA_SEQ DETA_MIN
-------- -------- --------- -------- -------------------- --------- --------
       1        1 NO              25 17-JAN-2020 20:53:53         2 41.68333
       1        1 YES             23 17-JAN-2020 20:12:12

*** v$archive_gap ***

no rows selected

*** GAP can also be verified using RMAN from STANDBY ***

RMAN1
------------------------------------------------------------
list archivelog from sequence 24 thread 1;

*** v$dataguard_stats ***

NAME                      VALUE              UNIT
------------------------- ------------------ ------------------------------
transport lag             +00 00:00:00       day(2) to second(0) interval
apply lag                                    day(2) to second(0) interval

*** gv$managed_standby ***

no rows selected

SYS@cdb1> exit
Disconnected from Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options
[oracle@ol7-121-dg2 sql]$
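The RMAN check suggested in the script output above can be run from the standby host as well. A minimal sketch, reusing the sequence and thread values from the listing (output will vary by environment):

[oracle@ol7-121-dg2 sql]$ rman target /

RMAN> list archivelog from sequence 24 thread 1;

Any sequence shown here but not yet applied corresponds to the same gap information reported by v$archive_gap.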
Screen output from observer:
DGMGRL> start observer
Observer started
[W000 01/17 20:48:58.27] The primary database has requested a transition to the UNSYNC/LAGGING state.
[W000 01/17 20:48:58.28] Permission granted to the primary database to transition to UNSYNC/LAGGING state.
[W000 01/17 20:50:01.29] The primary database has been in UNSYNC/LAGGING state for 63 seconds.
[W000 01/17 20:51:04.31] The primary database has been in UNSYNC/LAGGING state for 126 seconds.
[W000 01/17 20:52:07.33] The primary database has been in UNSYNC/LAGGING state for 189 seconds.
[W000 01/17 20:53:10.36] The primary database has been in UNSYNC/LAGGING state for 252 seconds.
[W000 01/17 20:54:13.39] The primary database has been in UNSYNC/LAGGING state for 315 seconds.
[W000 01/17 20:54:16.39] The primary database returned to SYNC/NOT LAGGING state.
Validate Data Guard configuration:
[oracle@ol7-121-dg2 sql]$ dgmgrl /
DGMGRL for Linux: Version 12.1.0.2.0 - 64bit Production

Copyright (c) 2000, 2013, Oracle. All rights reserved.

Welcome to DGMGRL, type "help" for information.
Connected as SYSDG.
DGMGRL> show configuration

Configuration - my_dg_config

  Protection Mode: MaxPerformance
  Members:
  cdb1      - Primary database
    cdb1_stby - (*) Physical standby database

Fast-Start Failover: ENABLED

Configuration Status:
SUCCESS   (status updated 10 seconds ago)

DGMGRL> validate database cdb1

  Database Role:    Primary database

  Ready for Switchover:  Yes

DGMGRL> validate database cdb1_stby

  Database Role:     Physical standby database
  Primary Database:  cdb1

  Ready for Switchover:  Yes
  Ready for Failover:    Yes (Primary Running)

DGMGRL> exit
[oracle@ol7-121-dg2 sql]$
Open database read only:

This is required because the database is not registered as a cluster resource, so it is not opened automatically.
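Alternatively, the standby could be registered with Oracle Restart so that it is brought up with the desired start option automatically. A hedged sketch only; the Oracle home path and options below are assumptions, not taken from this environment:

[oracle@ol7-121-dg2 sql]$ srvctl add database -db cdb1_stby -oraclehome $ORACLE_HOME -role PHYSICAL_STANDBY -startoption "READ ONLY"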

[oracle@ol7-121-dg2 sql]$ sqlplus / as sysdba

SQL*Plus: Release 12.1.0.2.0 Production on Fri Jan 17 21:33:07 2020

Copyright (c) 1982, 2014, Oracle.  All rights reserved.


Connected to:
Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options

OL7-121-DG2:(SYS@cdb1):PHYSICAL STANDBY> alter database open read only;

Database altered.

OL7-121-DG2:(SYS@cdb1):PHYSICAL STANDBY> exit
Disconnected from Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options

[oracle@ol7-121-dg2 sql]$ dgmgrl /
DGMGRL for Linux: Version 12.1.0.2.0 - 64bit Production

Copyright (c) 2000, 2013, Oracle. All rights reserved.

Welcome to DGMGRL, type "help" for information.
Connected as SYSDG.
DGMGRL> show database cdb1_stby

Database - cdb1_stby

  Role:               PHYSICAL STANDBY
  Intended State:     APPLY-ON
  Transport Lag:      0 seconds (computed 0 seconds ago)
  Apply Lag:          0 seconds (computed 0 seconds ago)
  Average Apply Rate: 1.00 KByte/s
  Real Time Query:    ON
  Instance(s):
    cdb1

Database Status:
SUCCESS

DGMGRL> exit
[oracle@ol7-121-dg2 sql]$
