CA Workload Automation AE (AutoSys Edition) r11.3 Performance Summary

Table of Contents

1. Definitions, Acronyms, and Abbreviations
2. Overview
3. Performance and Stability Testing
   3.1 Performance Testing: Combination environment with High Availability/Dual Event Servers, Single Scheduler/Dual Event Servers, and Single Scheduler/Single Event Server
   3.2 Performance Testing: Single Scheduler/Single Event Server
   3.3 Performance Testing: Stability Lag Time (HA with DUAL ES Sybase)
   3.4 Performance Testing: Stability Lag Time (HA with Oracle RAC)

1. Definitions, Acronyms, and Abbreviations

WAAE: CA Workload Automation AE (AutoSys Edition)
ES: Event Server
Lag time: Time between when an event was placed in the Event Server and when it was processed by the Scheduler (see the worked example below).
OS: Operating System
r11.3: CA Workload Automation AE (AutoSys Edition) r11.3, build 142
EEM: CA Embedded Entitlements Manager
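As a worked illustration of the lag-time definition (the timestamps below are invented for the example, not taken from the test results), the lag for a single event is the difference between the time the Scheduler processed the event and the time the event was placed in the Event Server:

    from datetime import datetime

    # Hypothetical timestamps for one STARTING event (illustration only).
    placed_in_event_server = datetime(2011, 3, 14, 12, 0, 0, 250000)
    processed_by_scheduler = datetime(2011, 3, 14, 12, 0, 1, 750000)

    lag_seconds = (processed_by_scheduler - placed_in_event_server).total_seconds()
    print(f"lag time: {lag_seconds:.1f} seconds")  # prints: lag time: 1.5 seconds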

2. Overview

This document gathers performance and stability numbers for CA Workload Automation AE (AutoSys Edition) r11.3, including results for processing job streams, lag times between events, and memory/CPU consumption.

3. Performance and Stability Testing

3.1 Performance Testing: Combination environment with High Availability/Dual Event Servers, Single Scheduler/Dual Event Servers, and Single Scheduler/Single Event Server

Test Environment
- 5 Managers in the following configurations:
  - AIX/Sybase running with Single Scheduler/Dual Event Servers
  - HP-UX/Oracle running with Single Scheduler/Single Event Server
  - Linux/Sybase running with High Availability/Dual Event Servers
  - Solaris/Oracle running with High Availability/Dual Event Servers
  - Windows/MSSQL running with High Availability/Dual Event Servers
- EEM security is enabled
- Autotrack level=1
- Each Manager submits continuous workload to 15 Agents over 5+ days

Test Objectives
- Calculate the average lag time between STARTING, RUNNING, and SUCCESS events over 5+ days
- Gather the total number of STARTING, RUNNING, and SUCCESS events over 5+ days
- Monitor memory and CPU usage in the test environment

Job Streams
- 1 BOX with 20 CMD jobs that run "true" to each Agent. These boxes run recursively (a sketch of one such box follows this list).
- 1 BOX with 5 cross-instance CMD jobs that run to each of the 5 instances in the environment. This box runs recursively.
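To make the workload concrete, the sketch below generates the JIL (AE's job definition language) for one of the recursive boxes described above: a BOX job containing 20 CMD jobs that each run "true" on a single Agent. This is only an illustration of what such a definition could look like; the box, job, and machine names are hypothetical, and the mechanism that re-starts the box after each run is not shown.

    # Hypothetical helper that emits JIL for one "20 CMD jobs that run true" box.
    # Job, box, and machine names are made up; only standard JIL keywords
    # (insert_job, job_type, box_name, machine, command) are assumed.
    def jil_for_true_box(box_name: str, machine: str, job_count: int = 20) -> str:
        lines = [f"insert_job: {box_name}   job_type: BOX", ""]
        for i in range(1, job_count + 1):
            lines += [
                f"insert_job: {box_name}_CMD_{i:02d}   job_type: CMD",
                f"box_name: {box_name}",
                f"machine: {machine}",
                "command: true",
                "",
            ]
        return "\n".join(lines)

    # Example: JIL for the box that targets one hypothetical agent host.
    print(jil_for_true_box("PERF_TRUE_BOX_AGENT01", "agent01"))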


3.1.1 Hardware and Software Details – AIX

OS | System | Hard Drive | RAM | CPU | Database | WAAE/Common Components
AIX 6.1 (64bit) | IBM,8204E8A | 40GB | 2GB | 2x4204MHz PowerPC_POWER6 | Sybase 15.0.1 | Scheduler, Application Server, Agent, Event Server A
AIX 6.1 (64bit) | IBM,8204E8A | 40GB | 2GB | 2x4204MHz PowerPC_POWER6 | Sybase 15.0.1 | Agent, Event Server B

3.1.2 Hardware and Software Details – HP-UX

OS | System | Hard Drive | RAM | CPU | Database | WAAE/Common Components
HPUX 11.23 (64bit) | HP 9000/800/rp3410 | 40GB | 6GB | 2x800MHz PA-RISC | Oracle 10gr2 | Scheduler, Application Server, Agent, Event Server

3.1.3 Hardware and Software Details – Linux

OS | System | Hard Drive | RAM | CPU | Database | WAAE/Common Components
RHEL 5.3 (32bit) | VMware, Inc. VMware Virtual Platform | 43GB | 2GB | 1x2.33GHz Intel(R) Xeon(R) CPU 5410 | Sybase 15.0.1 | Primary Scheduler, Application Server, Agent, Event Server A
RHEL 5.3 (32bit) | VMware, Inc. VMware Virtual Platform | 43GB | 2GB | 1x2.33GHz Intel(R) Xeon(R) CPU 5410 | Sybase 15.0.1 | Shadow Scheduler, Application Server, Agent, Event Server B
RHEL 5.3 (32bit) | VMware, Inc. VMware Virtual Platform | 43GB | 2GB | 1x2.33GHz Intel(R) Xeon(R) CPU 5410 | Sybase 15.0.1 (Client) | Tiebreaker Scheduler, Application Server, Agent
RHEL 5.3 (32bit) | VMware, Inc. VMware Virtual Platform | 43GB | 2GB | 1x2.33GHz Intel(R) Xeon(R) CPU 5410 | | Agent, EEM 8.4.405

3.1.4 Hardware and Software Details – Solaris

OS | System | Hard Drive | RAM | CPU | Database | WAAE/Common Components
Solaris Sparc 10 (64bit) | Sun Microsystems sun4v T5240 | 30GB | 2GB | 2x1165MHz SUNW,UltraSPARCT2+ | Oracle 10gr2 (Client) | Primary Scheduler, Application Server, Agent
Solaris Sparc 10 (64bit) | Sun Microsystems sun4v T5240 | 30GB | 2GB | 2x1165MHz SUNW,UltraSPARCT2+ | Oracle 10gr2 | Shadow Scheduler, Application Server, Agent, Event Server B
Solaris Sparc 10 (64bit) | Sun Microsystems sun4v T5240 | 30GB | 2GB | 2x1165MHz SUNW,UltraSPARCT2+ | Oracle 10gr2 | Tiebreaker Scheduler, Application Server, Agent, Event Server A
Solaris Sparc 10 (64bit) | Sun Microsystems sun4v T5240 | 30GB | 2GB | 2x1165MHz SUNW,UltraSPARCT2+ | | Agent

3.1.5 Hardware and Software Details – Windows

OS | System | Hard Drive | RAM | CPU | Database | WAAE/Common Components
Windows 2008 EE SP2 (32bit) | VMware, Inc. VMware Virtual Platform | 40GB | 2GB | 2x2.33GHz Intel(R) Xeon(R) L5410 | MSSQL 2005 SP3 | Primary Scheduler, Application Server, Agent, Event Server A
Windows 2008 EE SP2 (32bit) | VMware, Inc. VMware Virtual Platform | 50GB | 2GB | 2x2.33GHz Intel(R) Xeon(R) L5410 | MSSQL 2005 SP3 | Shadow Scheduler, Application Server, Agent, Event Server B
Windows 2008 EE SP2 (32bit) | VMware, Inc. VMware Virtual Platform | 40GB | 2GB | 2x2.33GHz Intel(R) Xeon(R) L5410 | MSSQL 2005 SP3 (Client) | Tiebreaker Scheduler, Application Server, Agent
Windows 2008 EE SP2 (32bit) | VMware, Inc. VMware Virtual Platform | 40GB | 2GB | 2x2.33GHz Intel(R) Xeon(R) L5410 | | Agent

3.1.6 Results – Lag Times

[Figure: AIX/Sybase Average Lag Time - Daily. X axis: Date/Time; Y axis: Seconds; series: SUCCESS, RUNNING, STARTING.]

[Figure: HP-UX/Oracle Average Lag Time - Daily. X axis: Date/Time; Y axis: Seconds; series: SUCCESS, RUNNING, STARTING.]

[Figure: Linux/Sybase Average Lag Time - Daily. X axis: Date/Time; Y axis: Seconds; series: SUCCESS, RUNNING, STARTING.]

[Figure: Solaris/Oracle Average Lag Time - Daily. X axis: Date/Time; Y axis: Seconds; series: SUCCESS, RUNNING, STARTING.]

[Figure: Windows/MSSQL Average Lag Time - Daily. X axis: Date/Time; Y axis: Seconds; series: SUCCESS, RUNNING, STARTING.]

3.1.7 Results – Events Processed

[Figure: AIX/Sybase Events Processed - Daily. X axis: Date/Time; Y axis: Number of Events; series: SUCCESS, RUNNING, STARTING.]

[Figure: HP-UX/Oracle Events Processed - Daily. X axis: Date/Time; Y axis: Number of Events; series: SUCCESS, RUNNING, STARTING.]

[Figure: Linux/Sybase Events Processed - Daily. X axis: Date/Time; Y axis: Number of Events; series: SUCCESS, RUNNING, STARTING.]

[Figure: Solaris/Oracle Events Processed - Daily. X axis: Date/Time; Y axis: Number of Events; series: SUCCESS, RUNNING, STARTING.]

[Figure: Windows/MSSQL Events Processed - Daily. X axis: Date/Time; Y axis: Number of Events; series: SUCCESS, RUNNING, STARTING.]

3.1.8 Results – Memory Usage

[Figure: HP-UX/Oracle event_demon Memory Usage. X axis: Date/Time; Y axis: resident memory (Kb).]

[Figure: HP-UX/Oracle as_server Memory Usage. X axis: Date/Time; Y axis: resident memory (Kb).]

[Figure: Linux/Sybase event_demon Memory Usage. X axis: Date/Time; Y axis: resident memory (Kb).]

[Figure: Linux/Sybase as_server Memory Usage. X axis: Date/Time; Y axis: resident memory (Kb).]

[Figure: Solaris/Oracle event_demon Memory Usage. X axis: Date/Time; Y axis: resident memory (Kb).]

[Figure: Solaris/Oracle as_server Memory Usage. X axis: Date/Time; Y axis: resident memory (Kb).]

3.1.9 Results – CPU Usage

[Figure: HP-UX/Oracle event_demon CPU Usage. X axis: Date/Time; Y axis: Percent.]

[Figure: HP-UX/Oracle as_server CPU Usage. X axis: Date/Time; Y axis: Percent.]

[Figure: Linux/Sybase event_demon CPU Usage. X axis: Date/Time; Y axis: Percent.]

[Figure: Linux/Sybase as_server CPU Usage. X axis: Date/Time; Y axis: Percent.]

[Figure: Solaris/Oracle event_demon CPU Usage. X axis: Date/Time; Y axis: Percent.]

[Figure: Solaris/Oracle as_server CPU Usage. X axis: Date/Time; Y axis: Percent.]

3.1.10 Summary

The variances in job throughput depend on the hardware of the machines, the software releases, and the configurations of the environments. Results could also be better or worse after tuning operating system and software variables.
- Windows/MSSQL had the lowest average lag times.
- Linux/Sybase, Solaris/Oracle, and HP-UX/Oracle had lower average lag times than AIX/Sybase.
- Windows/MSSQL processed the most events.
- HP-UX/Oracle, Linux/Sybase, and Solaris/Oracle processed more events than AIX/Sybase.
- Memory and CPU consumption was stable throughout the test.

3.2 Performance Testing: Single Scheduler/Single Event Server

Test Environment
- 3 Managers in the following configurations:
  - Linux/Oracle running with Single Scheduler/Single Event Server
  - Solaris/Sybase running with Single Scheduler/Single Event Server
  - Windows/MSSQL Server running with Single Scheduler/Single Event Server
- EEM security is enabled
- Each Manager submits continuous workload to 11 Agents and 1 Cross Platform Agent over 5+ days

Test Objectives
- Calculate the average lag time between STARTING, RUNNING, and SUCCESS events over 5+ days (a small aggregation sketch follows the Job Streams list below)
- Gather the total number of STARTING, RUNNING, and SUCCESS events over 5+ days
- Monitor memory and CPU usage in the test environment

Job Streams
- 1 BOX with 20 CMD jobs (18 jobs that run "true", 1 job that runs "autorep -J ALL", 1 job that runs "sendevent -E COMMENT -c "EEM comment"") to the local Agent. This box runs recursively.
- 1 BOX with 20 CMD jobs that run "true" to each Agent. These boxes run recursively.
- 1 BOX of 56 jobs of varying types (BOX, CMD, FT, FTP, FW, OMCPU, OMD, OMIP, OMP, OMTF, SCP) to each Agent on AIX/HP-UX/Linux/Solaris. These boxes run recursively.
- 1 BOX of 66 jobs of varying types (BOX, CMD, FT, FTP, FW, OMCPU, OMD, OMEL, OMIP, OMP, OMS, OMTF, SCP) to each Agent on Windows. These boxes run recursively.
- 1 CMD job that runs "autoaggr -h" to the local Agent. This job runs once per hour.
- 1 CMD job that runs an API request to get a list of all jobs to the local Agent. This job runs 3 times per hour.
- 1 BOX with 20 CMD jobs that run "cau9test T=1" to a JMO 11 Cross Platform Agent. This box runs 2 times per hour.
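The daily averages reported in the results sections are a straightforward aggregation of per-event lag over (event type, day). A minimal Python sketch of that aggregation, assuming lag samples have already been collected as (event_type, date, lag_seconds) records (the record layout and values are hypothetical, not the report's actual tooling):

    from collections import defaultdict

    # Hypothetical lag samples: (event_type, date, lag_seconds).
    samples = [
        ("STARTING", "2011-04-04", 0.4),
        ("RUNNING", "2011-04-04", 0.9),
        ("SUCCESS", "2011-04-04", 1.1),
        ("SUCCESS", "2011-04-05", 1.3),
    ]

    totals = defaultdict(lambda: [0.0, 0])  # (event_type, date) -> [lag sum, event count]
    for event_type, day, lag in samples:
        totals[(event_type, day)][0] += lag
        totals[(event_type, day)][1] += 1

    for (event_type, day), (lag_sum, count) in sorted(totals.items()):
        print(f"{day} {event_type}: average lag {lag_sum / count:.2f} s over {count} events")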

3.2.1 Hardware and Software Details – AIX

OS | System | Hard Drive | RAM | CPU | Database | WAAE/Common Components
AIX 6.1 (64bit) | IBM,8204E8A | 30GB | 4GB | 2x4204MHz PowerPC_POWER6 | N/A | Agent (SSA Sockets), Agent (Plain Sockets)
AIX 6.1 (64bit) | IBM,8203E4A | 18GB | 3GB | 2x4204MHz PowerPC_POWER6 | N/A | Agent (SSA Sockets)

3.2.2 Hardware and Software Details – HP-UX

OS | System | Hard Drive | RAM | CPU | Database | WAAE/Common Components
HPUX 11.23 (64bit) | HP 9000/800/A4006X | 40GB | 2GB | 1x650MHz PA-RISC | N/A | Agent (SSA Sockets), Agent (Plain Sockets)

3.2.3 Hardware and Software Details – Linux

OS | System | Hard Drive | RAM | CPU | Database | WAAE/Common Components
RHEL 4.0 (32bit) | VMware, Inc. VMware Virtual Platform | 23GB | 1GB | 1x2.33GHz Intel(R) Xeon(R) CPU 5140 | Oracle 10.2 | Scheduler, Application Server, Agent (SSA Sockets), CCI, Event Server
SLES 10 (32bit) | Microsoft Corporation Virtual Machine | 23GB | 2GB | 1x2.33GHz Intel(R) Xeon(R) CPU L5410 | N/A | Agent (Plain Sockets)

3.2.4 Hardware and Software Details – Solaris

OS | System | Hard Drive | RAM | CPU | Database | WAAE/Common Components
Solaris Sparc 10 (64bit) | Sun Microsystems sun4v T5240 | 130GB | 4GB | 4x1165MHz SUNW,UltraSPARCT2+ | Sybase 15.0.2 | Scheduler, Application Server, Agent (SSA Sockets), CCI, Event Server
Solaris Sparc 10 (64bit) | Sun Microsystems sun4v T5240 | 35GB | 2GB | 2x1165MHz SUNW,UltraSPARCT2+ | N/A | Agent (Plain Sockets)
Solaris Sparc 10 (64bit) | Sun Microsystems sun4v T5240 | 35GB | 2GB | 2x1165MHz SUNW,UltraSPARCT2+ | N/A | EEM 8.4.405

3.2.5 Hardware and Software Details – Windows

OS | System | Hard Drive | RAM | CPU | Database | WAAE/Common Components
Windows 2003 Server EE SP2 (32bit) | Microsoft Corporation Virtual Machine | 40GB | 2GB | 1x2.33GHz Intel(R) Xeon(R) L5410 | MSSQL 2005 SP3 | Scheduler, Application Server, Agent (Plain Sockets), CCI, Event Server
Windows XP Professional (32bit) | VMware, Inc. VMware Virtual Platform | 20GB | 1.5GB | 1x2.33GHz Intel(R) Xeon(R) L5410 | N/A | Agent (Plain Sockets)
Windows 2003 Server EE SP2 (32bit) | VMware, Inc. VMware Virtual Platform | 25GB | 1.5GB | 1x2.33GHz Intel(R) Xeon(R) L5410 | N/A | JMO 11 Agent

3.2.6 Results – Lag Times


3.2.7 Results – Events Processed


3.2.8 Results – Memory Usage


3.2.9 Results – CPU Usage


3.2.10 Summary

The variances in job throughput depend on the hardware of the machines, the software releases, and the configurations of the environments. Results could also be better or worse after tuning operating system and software variables.
- Windows/MSSQL had the lowest average lag times.
- Solaris/Sybase had lower average lag times than Linux/Oracle.
- Linux/Oracle processed the most events.
- Windows/MSSQL processed more events than Solaris/Sybase.
- Memory and CPU consumption was stable throughout the test.

3.3 Performance Testing: Stability Lag Time (HA with DUAL ES Sybase)

Test Environment
- 1 Manager in the following configuration: Solaris/Sybase running with High Availability/Dual Event Servers
- 6 Agents on Solaris
- 6 Agents on Linux
- EEM security is enabled
- Autotrack level=1
- Workload is submitted to 11 Virtual Machines in the environment over 5+ days

Test Objectives
- Calculate the average lag time between STARTING, RUNNING, and SUCCESS events over 5+ days
- Gather the total number of STARTING, RUNNING, and SUCCESS events over 5+ days
- Monitor memory and CPU usage in the test environment

Job Streams
- A real-world customer job stream that runs 100k+ jobs per day.

3.3.1 Hardware and Software Details – Solaris

OS | System | Hard Drive | RAM | CPU | Database | WAAE/Common Components
Solaris Sparc 10 (64bit) | Sun Microsystems sun4v T5240 | 131GB | 8GB | 8x1165MHz SUNW,UltraSPARCT2+ | Sybase 15.0 | Primary Scheduler, Application Server, Agent, Event Server A
Solaris Sparc 10 (64bit) | Sun Microsystems sun4u Sun Fire V215 | 59GB | 8GB | 2x1504MHz SUNW,UltraSPARCIIIi | Sybase 15.0 | Shadow Scheduler, Application Server, Agent, Event Server B
Solaris Sparc 10 (64bit) | Sun Microsystems sun4v Sun Fire T200 | 63GB | 4GB | 8x1000MHz SUNW,UltraSPARCT1 | Sybase 15.0 (Client) | Tiebreaker Scheduler, Application Server, Agent
Solaris Sparc 10 (64bit) | Sun Microsystems sun4v T5240 | 131GB | 8GB | 8x1165MHz SUNW,UltraSPARCT2+ | N/A | Agent
Solaris Sparc 10 (64bit) | Sun Microsystems sun4v T5240 | 131GB | 8GB | 8x1165MHz SUNW,UltraSPARCT2+ | N/A | Agent
Solaris Sparc 10 (64bit) | Sun Microsystems sun4v T5240 | 131GB | 8GB | 8x1165MHz SUNW,UltraSPARCT2+ | N/A | Agent

3.3.2 Hardware and Software Details – Linux

OS | System | Hard Drive | RAM | CPU | Database | WAAE/Common Components
SLES 10.2 (32bit) | Microsoft Corporation Virtual Machine | 23GB | 2GB | 1x2.33GHz Intel(R) Xeon(R) CPU 5410 | N/A | Agent, EEM 8.4.405
SLES 10.2 (32bit) | Microsoft Corporation Virtual Machine | 23GB | 2GB | 1x2.33GHz Intel(R) Xeon(R) CPU 5410 | N/A | Agent
SLES 10.2 (32bit) | Microsoft Corporation Virtual Machine | 23GB | 2GB | 1x2.33GHz Intel(R) Xeon(R) CPU 5410 | N/A | Agent
SLES 10.2 (32bit) | Microsoft Corporation Virtual Machine | 23GB | 2GB | 1x2.33GHz Intel(R) Xeon(R) CPU 5410 | N/A | Agent

3.3.3 Results – Lag Times

[Figure: Solaris/Sybase Average Lag Time - Daily. X axis: Date/Time; Y axis: Seconds; series: SUCCESS, RUNNING, STARTING.]

3.3.4 Results – Events Processed

[Figure: Solaris/Sybase Events Processed - Daily. X axis: Date/Time; Y axis: Number of Events; series: SUCCESS, RUNNING, STARTING.]

3.3.5 Results – Memory Usage

[Figure: Solaris/Sybase event_demon Memory Usage. X axis: Date/Time; Y axis: resident memory (Kb).]

[Figure: Solaris/Sybase as_server Memory Usage. X axis: Date/Time; Y axis: resident memory (Kb).]

3.3.6 Results – CPU Usage

[Figure: Solaris/Sybase event_demon CPU Usage. X axis: Date/Time; Y axis: Percent.]

[Figure: Solaris/Sybase as_server CPU Usage. X axis: Date/Time; Y axis: Percent.]

3.3.7 Summary

Event throughput was lower over weekends because the job streams ran less often. The variances in job throughput depend on the hardware of the machines, the software releases, and the configurations of the environments. Results could also be better or worse after tuning operating system and software variables.
- Daily average lag times were < 14 seconds.
- Event throughput averaged ~280k events per day, consistent with a 100k+ jobs-per-day stream generating STARTING, RUNNING, and SUCCESS events for each job.
- Memory and CPU consumption was stable throughout the test.

3.4 Performance Testing: Stability Lag Time (HA with Oracle RAC)

Test Environment
- 1 Manager in the following configuration: Linux/Oracle RAC running in High Availability
- 4 Agents on Solaris
- 6 Agents on Linux
- EEM security is enabled
- Autotrack level=1
- Workload is submitted to 10 Virtual Machines in the environment over 5+ days

Test Objectives
- Calculate the average lag time between STARTING, RUNNING, and SUCCESS events over 5+ days
- Gather the total number of STARTING, RUNNING, and SUCCESS events over 5+ days
- Monitor memory and CPU usage in the test environment

Job Streams
- A real-world customer job stream that runs 100k+ jobs per day.

3.4.1 Hardware and Software Details – Linux

OS | System | Hard Drive | RAM | CPU | Database | WAAE/Common Components
RHEL 5.5 (64-bit) | Dell Inc. PowerEdge R610 | 476GB | 62GB | 16 x 2.67GHz | Oracle RAC 11.2.0 | Primary Scheduler, Application Server, Agent
RHEL 5.5 (64-bit) | Dell Inc. PowerEdge R610 | 476GB | 62GB | 16 x 2.67GHz | Oracle RAC 11.2.0 | Primary Scheduler, Application Server, Agent
SLES 10.2 (32bit) | Microsoft Corporation Virtual Machine | 23GB | 2GB | 1x2.33GHz Intel(R) Xeon(R) CPU 5410 | N/A | Agent
SLES 10.2 (32bit) | Microsoft Corporation Virtual Machine | 23GB | 2GB | 1x2.33GHz Intel(R) Xeon(R) CPU 5410 | N/A | Agent
SLES 10.2 (32bit) | Microsoft Corporation Virtual Machine | 23GB | 2GB | 1x2.33GHz Intel(R) Xeon(R) CPU 5410 | N/A | Agent
SLES 10.2 (32bit) | Microsoft Corporation Virtual Machine | 23GB | 2GB | 1x2.33GHz Intel(R) Xeon(R) CPU 5410 | N/A | Agent

3.4.2 Hardware and Software Details – Solaris

OS | System | Hard Drive | RAM | CPU | Database | WAAE/Common Components
Solaris Sparc 10 (64bit) | Sun Microsystems sun4v Sun Fire T200 | 63GB | 4GB | 8x1000MHz SUNW,UltraSPARCT1 | N/A | Agent
Solaris Sparc 10 (64bit) | Sun Microsystems sun4v T5240 | 131GB | 8GB | 8x1165MHz SUNW,UltraSPARCT2+ | N/A | Agent
Solaris Sparc 10 (64bit) | Sun Microsystems sun4v T5240 | 131GB | 8GB | 8x1165MHz SUNW,UltraSPARCT2+ | N/A | Agent
Solaris Sparc 10 (64bit) | Sun Microsystems sun4v T5240 | 131GB | 8GB | 8x1165MHz SUNW,UltraSPARCT2+ | N/A | Agent, EEM 8.4.405

3.4.3 Results – Lag Times

[Figure: Linux/Oracle Average Lag Time - Daily. X axis: Date/Time; Y axis: Seconds; series: SUCCESS, RUNNING, STARTING.]

3.4.4 Results – Events Processed

[Figure: Linux/Oracle Events Processed - Daily. X axis: Date/Time; Y axis: Number of Events; series: SUCCESS, RUNNING, STARTING.]

3.4.5 Results – Memory Usage

[Figure: Linux/Oracle event_demon Memory Usage. X axis: Date/Time; Y axis: resident memory (Kb).]

[Figure: Linux/Oracle as_server Memory Usage. X axis: Date/Time; Y axis: resident memory (Kb).]

3.4.6 Results – CPU Usage

[Figure: Linux/Oracle event_demon CPU Usage. X axis: Date/Time; Y axis: Percent.]

[Figure: Linux/Oracle as_server CPU Usage. X axis: Date/Time; Y axis: Percent.]

3.4.7 Summary

Event throughput was lower over weekends because the job streams ran less often. The variances in job throughput depend on the hardware of the machines, the software releases, and the configurations of the environments. Results could also be better or worse after tuning operating system and software variables.
- Daily average lag times were < 1 second.
- Event throughput averaged ~232k events per day.
- Memory and CPU consumption was stable throughout the test.
