
Home Page for the Information Technology (IT) Discipline

"System Performance Test Management"


Table of Contents

Introduction: Introduction to System Performance Test Management
Framework: Using This Artifact as a "System Performance Test Management Framework"
Key Terms: Key Terms for System Performance Test Management
Glossary: The "System Performance Test Management Glossary"
Capabilities: System Performance Test Management as an Enterprise Capability
Ownership: Clearly Defined System Performance Test Management Ownership is Critical for Success
Verbs and Actions: Understanding Why Verbs and Actions are Important to System Performance Test Management
Roles: Key Verb and Action Driven Roles For System Performance Test Management
Taxonomy: Understanding System Performance Test Management Classifications or Categorizations
Ontology: System Performance Test Management Ontology as a Means for Language Standardization
Life Cycle (Lifecycle): Lifecycle Phases for System Performance Test Management
Inventories: System Performance Test Management Inventories
Environments: System Performance Test Management Environments
Metrics: System Performance Test Management Metrics
Services: System Performance Test Management as a Set of Services (a.k.a. System Performance Test Management Services)
Service Paradigms: Centralized System Performance Test Management vs. Federated System Performance Test Management
Principles & Best Practices: Common Principles and Best Practices for System Performance Test Management
Further Reading and Reference Material for System Performance Test Management


Introduction: Introduction to System Performance Test Management

This document represents an aggregated, ordered and contextualized view of the material we've been able to compile and publish that is related to the topic of "System Performance Test Management." The goal is to make this page a landing and launch point for all things related to this topic. As our content becomes more complete and more accurate, this page should become a very useful and powerful knowledge base for this topic and all parties interested in it.

You'll find that the content for this document is consistent with that of other discipline related documents. This is intentional. The consistency is based on a knowledge pattern that helps individuals learn more about different topics, quicker and more efficiently. We hope you find the material useful and easy to learn.

It's important to realize that the content in this document and any related sub-documents is constantly evolving. Therefore, we recommend you check for updates, regularly, to keep up with the latest material.

The Foundation always welcomes your feedback and suggestions for improvement, as we're always looking for ways to improve our solutions and offerings to the general community.

All solutions published by the Foundation are subject to the terms and conditions of the Foundation's Master Agreement.


Framework: Using This Artifact as a "System Performance Test Management Framework"

This document or artifact, along with everything in it, is intended to act as a "Framework" that addresses various aspects of System Performance Test Management.

The readers will notice that most sections in the Table of Contents (TOC) use a format where the TOC entry is prefixed with a topic name, followed by a short descriptive title (i.e. "TOPIC_NAME: TOPIC_RELATED_SECTION_TITLE"). This is intentional and represents a format by which the Foundation may achieve things like the identification of appropriate topic areas, the segregation of distinct topic areas from each other, the appropriate ordering of topic areas, and the maintenance of consistency, both, within and across different IT Disciplines.

To elaborate, this artifact is intended to:

  1. Organize different areas of the discipline known as System Performance Test Management into clear and compartmentalized areas that allow the Foundation to more effectively and productively collect, document and publish information that pertains to this discipline.
  2. Decompose each area of System Performance Test Management into smaller and, therefore, more digestible units for more efficient learning and understanding.
  3. Document common industry wisdom about each area, piece or subcomponent of System Performance Test Management.
  4. Act as a set of System Performance Test Management related best practices and guidelines that have been collected, documented, and published for the benefit of IT Professionals, regardless of their specific industry, line of business, or area of expertise.
  5. Act as a consistent and repeatable pattern for documenting, publishing and learning, both, within this Discipline and across "all" Disciplines.

From the Foundation's perspective, if done correctly, all of the above will allow the Foundation to properly decompose, document and publish content related to each sub-area or sub-topic for each IT Discipline, including this specific discipline (i.e. "System Performance Test Management").

From the reader's perspective, if done correctly, all of the above will allow him or her to easily find and learn about specific areas of interest associated with this and all other IT Disciplines in a manner where the reader may effectively consume and digest material in small atomic segments that act as repeatable and more effective learning units.

As this artifact evolves and progresses, the reader will see it address key areas of the professional IT Discipline "System Performance Test Management" that range from its detailed definition through closely related terms, phrases and their definitions, to its detailed specification of System Performance Test Management Capabilities, and all the way through to defining, delivering, operating and supporting System Performance Test Management Services.

As mentioned previously, this document will continue to evolve and the Foundation recommends the reader check back, regularly, to stay abreast of modifications and new developments. It is also important to understand that the structure of this artifact may change to meet the needs of such evolution.


Key Terms for System Performance Test Management

Before moving on to learn more about the rest of the System Performance Test Management framework, we suggest that you take some time to familiarize yourself with the following very basic term(s)...

System Performance Test:

"1. A Test, manual or automated, that is intended to exercise and measure the time based performance of a System or a controlled subset of a System."

System Performance Test Management:

"1. The professional discipline that involves working with, in or on any aspect of planning, delivering, operating or supporting for one or more System Performance Test Items or any and all solutions put in place to deal with such Items.

2. The solution set that a person or organization puts in place to manage one or more System Performance Test Items.

3. The process or processes put in place by a person or organization to assist in the management, coordination, control, delivery, or support of one or more System Performance Test Items.

4. The Enterprise Capability that represents the general ability or functional capacity for a Resource or Organization to deal with or handle one or more System Performance Test Items. Such a term is often used by Information Technology (IT) Architects when performing or engaging in the activities associated with general Capability Modeling."

In addition to the above basic term(s), you can also learn a great deal about System Performance Test Management by familiarizing yourself with the broader spectrum of terms that make up the System Performance Test Management Glossary...
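To ground the base term, below is a minimal sketch (in Python) of what a System Performance Test does in practice: it exercises a System operation and measures its time-based performance against a threshold. The workload, threshold, and function names are illustrative assumptions, not Foundation definitions.

```python
import time


def run_system_performance_test(operation, threshold_seconds, iterations=10):
    """A minimal System Performance Test: exercise a System operation
    and measure its time-based performance against a threshold."""
    timings = []
    for _ in range(iterations):
        start = time.perf_counter()
        operation()  # the System, or a controlled subset of it, under test
        timings.append(time.perf_counter() - start)
    worst = max(timings)
    average = sum(timings) / len(timings)
    return {"average_s": average, "worst_s": worst, "passed": worst <= threshold_seconds}


# Hypothetical usage with a stand-in workload.
result = run_system_performance_test(lambda: sum(range(100_000)), threshold_seconds=0.05)
print(result)
```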


Glossary: The "System Performance Test Management Glossary"

IT Glossary

Language between IT professionals and the businesses we serve is often a significant barrier to success, as we often spend countless hours trying to interpret each other's meanings. This is often also true between IT professionals who are taught to use certain terms and definitions as part of the organizations and industries they serve. It's when you start to jump from organization to organization, from enterprise to enterprise, and from industry to industry that you realize how much time and effort is wasted on just getting language and meanings correct. For these reasons, the Foundation puts a great deal of focus on terms and phrases, as well as their corresponding definitions. We highly recommend you spend time learning and understanding all of the related terms and phrases, along with their meanings, for all areas of "System Performance Test Management."

System Performance Test Management Glossary
Centralized System Performance Test Management | System Performance Test Management Principle
Decentralized System Performance Test Management | System Performance Test Management Procedure
Enterprise System Performance Test Management | System Performance Test Management Process
Federated System Performance Test Management | System Performance Test Management Professional
Regional System Performance Test Management | System Performance Test Management Program
System Performance Test | System Performance Test Management Project
System Performance Test Automation | System Performance Test Management Reference Architecture
System Performance Test Capacity Management | System Performance Test Management Release
System Performance Test Catalog | System Performance Test Management Report
System Performance Test Catalogue | System Performance Test Management Reporting
System Performance Test Configuration | System Performance Test Management Roadmap
System Performance Test Configuration Item | System Performance Test Management Role
System Performance Test Configuration Management | System Performance Test Management Rule
System Performance Test Cost | System Performance Test Management Schedule
System Performance Test Data Entity | System Performance Test Management Security
System Performance Test Database | System Performance Test Management Service
System Performance Test Decommission | System Performance Test Management Service Assurance
System Performance Test Delivery | System Performance Test Management Service Contract
System Performance Test Dependency | System Performance Test Management Service Level Agreement (SLA)
System Performance Test Deployment | System Performance Test Management Service Level Objective (SLO)
System Performance Test Document | System Performance Test Management Service Level Requirement (SLR)
System Performance Test Document Management | System Performance Test Management Service Level Target (SLT)
System Performance Test File Plan | System Performance Test Management Service Provider
System Performance Test Framework | System Performance Test Management Service Request
System Performance Test Governance | System Performance Test Management Software
System Performance Test History | System Performance Test Management Solution
System Performance Test Identifier | System Performance Test Management Stakeholder
System Performance Test Inventory | System Performance Test Management Standard
System Performance Test Item | System Performance Test Management Strategy
System Performance Test Lifecycle | System Performance Test Management Supply
System Performance Test Lifecycle Management | System Performance Test Management Support
System Performance Test Management | System Performance Test Management System
System Performance Test Management Application | System Performance Test Management Theory
System Performance Test Management Best Practice | System Performance Test Management Training
System Performance Test Management Blog | System Performance Test Management Vision
System Performance Test Management Capability | System Performance Test Management Wiki
System Performance Test Management Center of Excellence | System Performance Test Management Workflow
System Performance Test Management Certification | System Performance Test Metadata
System Performance Test Management Class | System Performance Test Migration
System Performance Test Management Community of Practice (CoP) | System Performance Test Plan
System Performance Test Management Course | System Performance Test Portfolio
System Performance Test Management Data | System Performance Test Portfolio Management
System Performance Test Management Data Dictionary | System Performance Test Processing
System Performance Test Management Database | System Performance Test Record
System Performance Test Management Demand | System Performance Test Records Management
System Performance Test Management Dependency | System Performance Test Repository
System Performance Test Management Discussion Forum | System Performance Test Reuse
System Performance Test Management Document | System Performance Test Review
System Performance Test Management Documentation | System Performance Test Schedule
System Performance Test Management File Plan | System Performance Test Schematic (Schema)
System Performance Test Management Form | System Performance Test Security
System Performance Test Management Framework | System Performance Test Software
System Performance Test Management Governance | System Performance Test Strategy
System Performance Test Management Knowledge | System Performance Test Support
System Performance Test Management Lessons Learned | System Performance Test Taxonomy
System Performance Test Management Metric | System Performance Test Termination
System Performance Test Management Operating Model | System Performance Test Tracking
System Performance Test Management Organization | System Performance Test Tracking Software
System Performance Test Management Plan | System Performance Test Transaction
System Performance Test Management Platform | System Performance Test Unique Identifier
System Performance Test Management Policy | System Performance Test Verification
System Performance Test Management Portfolio | System Performance Test Version
System Performance Test Management Principle | System Performance Test Workflow

Please refer to the IT Glossary for other terms and phrases that may be relevant to this professional discipline.

Readers may also refer to the Taxonomy of Glossaries for terms and phrases that are semantically grouped according to IT Disciplines or enterprise domains.

This System Performance Test Management Glossary is a contextual subset of the master IF4IT Glossary of Terms and Phrases. The master glossary can be used by you and your enterprise as a foundation for broader understanding of Information Technology and can be used as a teaching and learning tool for those you work with, helping to ensure a common and more standard language.


Capabilities: System Performance Test Management as an Enterprise Capability

A Capability, as it pertains to Information Technology (IT) or to an enterprise that an IT Organization serves, is defined to be "A manageable feature, faculty, function, process, service or discipline that represents an ability to perform something which yields an expected set of results and is capable of further advancement or development." In other words, a Capability is nothing more than "the ability to do something" or, quite simply, a Feature or Function. Therefore, when applied to an enterprise, a Capability represents a critical Enterprise Feature or Enterprise Function.

When it comes to Capabilities, there are multiple types that an enterprise needs to be aware of. Examples include but are not limited to Capabilities that are associated with Resources, with Organizations, and with Assets such as Systems. All are important to an enterprise.

In the case of this IT Discipline (i.e. System Performance Test Management), we use the word Capability in the context of an Enterprise Capability or an IT Capability, which are both equivalent to Enterprise Disciplines or IT Disciplines, respectively. In short, the Capability of System Performance Test Management represents the ability to deal with any and all System Performance Test Items and anything relevant that is related to or associated with any System Performance Test Items.

If you think about it, a capability is really nothing more than a "verb" or "action" that represents "the ability to do something." Understanding this allows us to derive a consistent and highly repeatable set of sub-capabilities for any Noun we're dealing with. For example, pairing generic Verbs such as Create, Modify, Archive, and Delete with a given Noun yields a corresponding set of Sub-Capabilities for that Noun.

In summary, the implication is that the Enterprise Capability or Enterprise Discipline known as System Performance Test Management is the superset of all such Sub-Capabilities, as they pertain to or are applied to the discipline-specific Noun: "System Performance Test." This now translates more specifically to Sub-Capabilities such as "Create System Performance Test," "Modify System Performance Test," "Archive System Performance Test," and "Delete System Performance Test."

For a more complete list of very specific Capabilities/Disciplines, refer to the Foundation's Master Inventory of IT Disciplines. It is important to note that this inventory is in a flat or non-hierarchical form, specifically because "hierarchy" is almost always a matter of personal preference or context (what hierarchy is important to one Resource or Organization may be unimportant to another's needs or requirements). Therefore, the Foundation has published its inventory of Capabilities in a non-hierarchical, flat form.

This now brings us to a very obvious problem that surrounds Capabilities, which is the fact that there are simply too many "granular" or "specific" Capabilities to document and publish in any single Capability Model. The end result is that a Capability Model may become unwieldy because of trying to incorporate so many different specific Capabilities. Also, Capability Modeling "Purists," who all have their own (and very differing) opinions about how Capability Models should or should not be represented, almost always refuse to get into the details. To address this, we recommend using a generic set of Capabilities that map to and are driven by the Systems Development Life Cycle, for example: the abilities to Plan, Design, Build, Test, Deploy, Operate and Support System Performance Tests.

As you can see from the above, we now have a very limited, controlled and manageable set of Discipline-specific Capabilities for the Discipline System Performance Test Management.

As a reminder, the above Capability representations are "suggestions" for baselining or initializing your own Enterprise Capability Model (ECM). It's recommended that you take the time to work with your enterprise stakeholders to improve upon and/or customize your own ECM so that you can help meet their needs. However, with that being said, it's always a better idea to go in with a baseline that you can modify rather than building your own solution from scratch, especially if your goals are to standardize, not reinvent the wheel, and not deviate too far from what other enterprises are doing to model their own environments. This is especially true if you've never had any experience building ECMs that have gained and maintained full adoption.

Why do enterprises perform Capability Modeling? Enterprises most often build Capability Models that are associated with System Performance Test Management for the following reasons...

Capability Modeling Recommendations: Some things to consider and keep in mind when working on or creating your System Performance Test Management and Enterprise Capability Models...

Learn More About Capability Models: Taking the time to learn about and understand Capability Models, what they're for, and how they're used may help you learn how System Performance Test Management better fits into the broader enterprise. Therefore, we suggest you spend some time reviewing and understanding the IF4IT Enterprise Capability Model...

Enterprise Capability Model

Ownership: Clearly Defined System Performance Test Management Ownership is Critical for Success

IT Discipline Ownership

Here's a very simple fact... If an enterprise does not establish and enforce clearly defined Ownership (i.e. a Resource and his or her Organization are assigned accountable Ownership) for System Performance Test Management, the enterprise has automatically set itself up for failure in its implementation of that discipline. Therefore, if you and your enterprise want to implement and maintain a successful solution for System Performance Test Management, there must be a clearly defined Owner that can and will be held accountable for getting work done, providing transparency, helping with strategy setting, and coordinating implementation of System Performance Test Management as a fully functional and mature enterprise Service.

Having clearly defined Ownership should not be confused with having fully dedicated Resources that spend one hundred percent of their time working on System Performance Test Management. In fact, smaller enterprises can rarely afford to dedicate full time Resources, like larger enterprises can, to all enterprise IT Disciplines. This being the case, all IT Disciplines, including System Performance Test Management, should "always" have clearly defined Owners so that there is always a clear point of accountability and contact for any issues or work that need to be addressed.

In addition to the common best practice of having clearly assigned Ownership for System Performance Test Management, it is also considered a best practice to clearly publish and socialize System Performance Test Management Ownership details to a centralized location (often referred to as a "Service Catalog" or an "Enterprise Service Catalog"), along with Ownership details for all other IT Disciplines, so that the entire enterprise has constant access to it.

Canonical Ownership of an Enterprise Capability

Figure: How Ownership of the Capability System Performance Test Management fits into the Canonical Model for IT

The above figure helps us understand how Capability or Discipline Ownership fits into the Canonical Model for Information Technology (IT) (i.e. "Think," "Deliver," and "Operate"). Owners are assigned to individual Disciplines or Capabilities, such as System Performance Test Management, and are instantly made accountable to the enterprise for the results of all System Performance Test Management Thinking activities (i.e. Strategy, Research, Planning and Design), all System Performance Test Management Delivery activities (i.e. Construction, Deployment and Quality Assurance), and all System Performance Test Management Operations activities (i.e. Use, Maintenance and Support). Done correctly, System Performance Test Management Ownership is constant and ongoing. It's important to understand that such assigned Ownership should "never" end so that there is clear and constant accountability and transparency for all aspects of the Canonical Model to the enterprise.

Not having clear Ownership for System Performance Test Management means that there is no clear understanding of who is accountable for it, who can provide understanding of what's going on within it, who can help the enterprise provide short term and long term descriptions of work being performed within the Discipline area to improve it over time for its customers, and who can help with getting work done that's associated with it. It means your or your enterprise's implementation for System Performance Test Management will be highly incomplete and erratic because no one is constantly (or even partially) watching over the Discipline and its needs for maintenance and evolution. Not having clear System Performance Test Management Ownership is a recipe for confusion and, sometimes, even chaos.

In summary, if you and your enterprise truly want to be successful with your implementation of System Performance Test Management, ensure that a clear and highly accountable owner is identified and assigned to the Discipline. Publish those ownership details, preferably in an enterprise's Service Catalog, and socialize it so everyone knows whom to go to for answers and for help with System Performance Test Management related work. In other words, if you want to implement System Performance Test Management as an enterprise Service, then you absolutely must start with clearly defined, published and socialized Ownership.


Verbs and Actions: Understanding Why Verbs and Actions are Important to System Performance Test Management

Throughout the Foundation's documentation, you will continuously run into references to "Nouns and Verbs." These concepts are key to consistency and standardization, throughout the IT Industry, down to each and every IT Discipline. Given that we've discussed the impact of "Nouns" on the discipline of "System Performance Test Management," this section will start to discuss the importance of "Verbs" or "Actions" that can be performed with or against the key Noun or Nouns associated with this Discipline. To reiterate, Verbs or Actions allow us to clearly understand what can be performed on or with the Noun in question. As will be discussed in the next section, Verbs or Actions will also help us clearly identify who it is (i.e. the "who" or, more specifically, the Roles) that performs or executes such Verbs or Actions against a Discipline and its associated Noun or Nouns. As will be discussed later, Verbs or Actions will also help identify key Attributes (i.e. Field Names) that are necessary for the very data definition of the Noun or Nouns for this Discipline and will even help identify which Verbs or Actions can be automated for this Discipline.

As a reminder, the base Noun for the discipline known as System Performance Test Management is: "System Performance Test," which is sometimes referred to as the Noun: "System Performance Test Item."

By now, it should be becoming apparent that verbs represent a baseline for defining solid functional requirements and sub-capabilities for what would be a part of any good System Performance Test Management System or Service. What this means is that if you and/or your Organization are looking for a solution in this space (e.g. the purchasing or building of a software solution or the implementation of a Service to address the needs of System Performance Test Management), you could use discipline-related verbs to drive the foundation of what the solution should or shouldn't do, as mapped to specific stakeholders that will use or provide the solution.

Examples of the types of Verbs or Actions that are important to this Discipline include but are not limited to: Create, Modify, Deploy, Report, Archive, and Delete.

The above list represents a very small subset of all Verbs or Actions that are relevant for this Discipline. The more complete set can be found in the Roles section of this document, where readers can see the direct correlation of Verb to Noun and to, both, Generic Role and Discipline Specific Role.


Roles: Key Verb and Action Driven Roles For System Performance Test Management

An "action" or a "verb" is something that can be performed on or with a specific "noun." The reason it is important to itemize all relevant verbs is because we can now start to determine what we can or cannot do with the noun in question, where in this case the noun is "System Performance Test."

Actions/Verbs | Example as Applied to "System Performance Test" | Generic Roles | Discipline-Specific Roles
Administrate | Administrate System Performance Test | Administrator | System Performance Test Administrator
Approve | Approve System Performance Test | Approver | System Performance Test Approver
Architect | Architect System Performance Test | Architector | System Performance Test Architector
Archive | Archive System Performance Test | Archiver | System Performance Test Archiver
Audit | Audit System Performance Test | Auditor | System Performance Test Auditor
Bundle | Bundle System Performance Test | Bundler | System Performance Test Bundler
Clone | Clone System Performance Test | Cloner | System Performance Test Cloner
Code | Code System Performance Test | Coder | System Performance Test Coder
Configure | Configure System Performance Test | Configurer | System Performance Test Configurer
Copy | Copy System Performance Test | Copier | System Performance Test Copier
Create | Create System Performance Test | Creator | System Performance Test Creator
Decommission | Decommission System Performance Test | Decommissioner | System Performance Test Decommissioner
Delete | Delete System Performance Test | Deletor | System Performance Test Deletor
Deploy | Deploy System Performance Test | Deployer | System Performance Test Deployer
Deprecate | Deprecate System Performance Test | Deprecator | System Performance Test Deprecator
Design | Design System Performance Test | Designer | System Performance Test Designer
Destroy | Destroy System Performance Test | Destroyer | System Performance Test Destroyer
Develop | Develop System Performance Test | Developer | System Performance Test Developer
Distribute | Distribute System Performance Test | Distributor | System Performance Test Distributor
Download | Download System Performance Test | Downloader | System Performance Test Downloader
Edit | Edit System Performance Test | Editor | System Performance Test Editor
Educate | Educate System Performance Test | Educator | System Performance Test Educator
Export | Export System Performance Test | Exporter | System Performance Test Exporter
Govern | Govern System Performance Test | Governor | System Performance Test Governor
Import | Import System Performance Test | Importer | System Performance Test Importer
Initialize | Initialize System Performance Test | Initializer | System Performance Test Initializer
Install | Install System Performance Test | Installer | System Performance Test Installer
Instantiate | Instantiate System Performance Test | Instantiator | System Performance Test Instantiator
Integrate | Integrate System Performance Test | Integrator | System Performance Test Integrator
Manage | Manage System Performance Test | Manager | System Performance Test Manager
Merge | Merge System Performance Test | Merger | System Performance Test Merger
Modify | Modify System Performance Test | Modifier | System Performance Test Modifier
Move | Move System Performance Test | Mover | System Performance Test Mover
Own | Own System Performance Test | Owner | System Performance Test Owner
Package | Package System Performance Test | Packager | System Performance Test Packager
Persist | Persist System Performance Test | Persister | System Performance Test Persister
Plan | Plan System Performance Test | Planner | System Performance Test Planner
Purge | Purge System Performance Test | Purger | System Performance Test Purger
Receive | Receive System Performance Test | Receiver | System Performance Test Receiver
Record | Record System Performance Test | Recorder | System Performance Test Recorder
Recover | Recover System Performance Test | Recoverer | System Performance Test Recoverer
Register | Register System Performance Test | Registrar | System Performance Test Registrar
Relocate | Relocate System Performance Test | Relocator | System Performance Test Relocator
Reject | Reject System Performance Test | Rejecter | System Performance Test Rejecter
Remove | Remove System Performance Test | Remover | System Performance Test Remover
Replicate | Replicate System Performance Test | Replicator | System Performance Test Replicator
Report | Report System Performance Test | Reporter | System Performance Test Reporter
Request | Request System Performance Test | Requestor | System Performance Test Requestor
Restore | Restore System Performance Test | Restorer | System Performance Test Restorer
Review | Review System Performance Test | Reviewer | System Performance Test Reviewer
Save | Save System Performance Test | Saver | System Performance Test Saver
Search | Search System Performance Test | Searcher | System Performance Test Searcher
Split | Split System Performance Test | Splitter | System Performance Test Splitter
Sponsor | Sponsor System Performance Test | Sponsor | System Performance Test Sponsor
Store | Store System Performance Test | Storer | System Performance Test Storer
Strategize | Strategize System Performance Test (or Set System Performance Test Strategy) | Strategizer (or Strategy Setter) | System Performance Test Strategizer (or System Performance Test Strategy Setter)
Support | Support System Performance Test | Supporter | System Performance Test Supporter
Test | Test System Performance Test | Tester | System Performance Test Tester
Train | Train System Performance Test | Trainer | System Performance Test Trainer
Upgrade | Upgrade System Performance Test | Upgrader | System Performance Test Upgrader
Upload | Upload System Performance Test | Uploader | System Performance Test Uploader
Verify | Verify System Performance Test | Verifier | System Performance Test Verifier
Version | Version System Performance Test | Versioner | System Performance Test Versioner
View | View System Performance Test | Viewer | System Performance Test Viewer

At a minimum, the above list of Verbs can be used to help identify, track, and manage the basic "Features" required by and associated with System Performance Test Management, even if your enterprise doesn't maintain a Capability Model that lists specific System Performance Test Management Capabilities. Application designers, developers, and architects often find such Verb Lists or Feature Inventories to be invaluable.
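As an illustration of that point, the following sketch pairs a handful of the Verbs above with the Discipline's Noun to generate feature names and Discipline-Specific Roles. The chosen subset of Verbs and the output format are assumptions for demonstration only.

```python
NOUN = "System Performance Test"

# A small subset of the Verb-to-Generic-Role pairs from the table above.
VERBS = {"Create": "Creator", "Modify": "Modifier", "Archive": "Archiver",
         "Deploy": "Deployer", "Report": "Reporter"}


def derive_features_and_roles(noun, verbs):
    """Pair each Verb with the Noun to yield a feature (capability)
    name and the corresponding discipline-specific role."""
    return [
        {
            "feature": f"{verb} {noun}",                  # e.g. "Create System Performance Test"
            "generic_role": generic_role,                 # e.g. "Creator"
            "discipline_role": f"{noun} {generic_role}",  # e.g. "System Performance Test Creator"
        }
        for verb, generic_role in verbs.items()
    ]


for row in derive_features_and_roles(NOUN, VERBS):
    print(row["feature"], "->", row["discipline_role"])
```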


Taxonomy: Understanding System Performance Test Management Classifications or Categorizations

IF4IT Taxonomies

A Taxonomy, in its noun form, is defined as:

...a documented and orderly set of types, classifications, categorizations and/or principles that are often achieved through mechanisms including but not limited to naming, defining and/or the grouping of attributes, and which ultimately help to describe, differentiate, identify, arrange and provide contextual relationships between the entities for which the Taxonomy exists.

From this general definition, we can derive that the definition for a System Performance Test Management Taxonomy is:

...a documented and orderly set of types, classifications, categorizations and/or principles that are often achieved through mechanisms including but not limited to naming, defining and/or the grouping of attributes, and which ultimately help to describe, differentiate, identify, arrange and provide contextual relationships between System Performance Test Items, Entities or Types.

In short, what this all means is that a Taxonomy is nothing more than a classification or typing mechanism and that a System Performance Test Taxonomy is nothing more than a classification or typing mechanism that helps people and systems distinguish between different System Performance Test Items, Entities, Types, Records or any other System Performance Test Management element you can think of.
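A minimal sketch may make the point concrete: a Taxonomy is simply a named set of classifications under which Items are grouped and looked up. The classification names below (Load, Stress, Endurance) are hypothetical examples, not a Foundation standard.

```python
# A System Performance Test Taxonomy as a plain classification mechanism:
# classification name -> the System Performance Test Items grouped under it.
taxonomy = {
    "Load Test": ["Checkout Peak Load Test"],
    "Stress Test": ["Database Failover Stress Test"],
    "Endurance Test": ["Overnight Batch Endurance Test"],
}


def classify(taxonomy, category, item_name):
    """Register an Item under a classification, the core act a Taxonomy supports."""
    taxonomy.setdefault(category, []).append(item_name)


classify(taxonomy, "Stress Test", "API Spike Test")
print(taxonomy["Stress Test"])  # both Stress Test Items are now distinguishable
```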

It's important to understand that Taxonomies can be as simple as a list of relevant terms or phrases with respective meanings or definitions or they can take on more complex forms, such as hierarchical and graphical model structures that can be homogeneous and heterogeneous in nature. More complex Taxonomies include examples such as "Visual Taxonomies" and "Audible Taxonomies" but, except in the case of very special technologies, are typically out of scope for general Information Technology (IT) Operations.

The Foundation directs readers to its ever-evolving Inventory of Taxonomies for Standard Taxonomy suggestions. Specifically, readers may want to start with the Taxonomy of Taxonomies, which helps make it clear that the IT Industry is composed of many hundreds if not thousands of Taxonomies, Classifications, Categorizations or Types.


Ontology: System Performance Test Management Ontology as a Means for Language Standardization

While Taxonomies represent organized classifications or types, you can think of Ontologies as the design and representation of entire languages, with the specific intent to control things like structure, behavior, representation, and meaning. Without getting into a theoretical conversation about Ontologies, you can view this entire article as a foundation for the ontology of System Performance Test Management. Or, in other words, a System Performance Test Management Ontology.

Throughout this artifact/framework, you will find things like System Performance Test Management related terms, phrases, definitions, roles, responsibilities, nouns, verbs, classifications, and so on, all as a means of defining a standard representation for and interpretation of the language of System Performance Test Management.

It is only through the definition, communication, and establishment of such Ontologies that we can standardize language and communication associated with System Performance Test Management, whether it be between humans and/or systems.


Life Cycle (Lifecycle): Lifecycle Phases for System Performance Test Management

When we talk about Life Cycle (or lifecycle) for System Performance Test Management, it's important to keep in mind that there are two different types of Life Cycles that apply. The first is a Data Life Cycle, which addresses System Performance Test Management data or entities, and the second is associated with delivering System Performance Test Management Assets like Systems or Software solutions.

System Performance Test Management Data Life Cycle Phases:

Data Lifecycle (or Life Cycle) for any and all data is the period from the "inception" of data through to its ultimately being "purged" from existence. This is no different for System Performance Test Management related data.

Like the data associated with any other professional IT Discipline, System Performance Test Management related data adheres to the following common Data Lifecycle Phases:

Data Lifecycle Phases

Figure: System Performance Test Management Lifecycle Phases

  1. Inception: Data is in its raw, idea-like form and is not ready for consumption by the general population because it has not been documented or registered, anywhere, in a formal manner.
  2. Creation and Registration: Data is formally put into existence for day-to-day use by appropriate stakeholders.
  3. Iterative Maintenance: Data is in a mode of constant use and is updated and modified, as needed, to meet the needs of daily use by various stakeholders.
  4. Decommission and Deletion: Data is prepared for deletion and eventually deleted from daily operational use but still exists for administrative or organizational purposes, such as historical auditing. It can be restored to any one of its relevant last states and, therefore, can be brought back into existence for day-to-day use.
  5. Purged From Existence: Data is completely removed from an environment with no means to restore or reconstruct it, without recreating it from scratch and with no guarantees that it will match its previous state.

The above Life Cycle Phases represent the high level transitions that occur from the inception of System Performance Test Items or Entities all the way through to their complete elimination from existence. A more detailed breakdown of these transitions or phases represents what are referred to as "System Performance Test Management States."
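As a minimal sketch, the five phases above can be modeled as a simple state machine that enforces legal transitions, including the restore path from Decommission and Deletion back into day-to-day use described in phase 4. The phase names follow the list above; everything else is an assumption for illustration.

```python
from enum import Enum


class DataLifecyclePhase(Enum):
    INCEPTION = 1
    CREATION_AND_REGISTRATION = 2
    ITERATIVE_MAINTENANCE = 3
    DECOMMISSION_AND_DELETION = 4
    PURGED_FROM_EXISTENCE = 5


# Legal phase transitions, per the phase descriptions above.
ALLOWED = {
    DataLifecyclePhase.INCEPTION: {DataLifecyclePhase.CREATION_AND_REGISTRATION},
    DataLifecyclePhase.CREATION_AND_REGISTRATION: {DataLifecyclePhase.ITERATIVE_MAINTENANCE},
    DataLifecyclePhase.ITERATIVE_MAINTENANCE: {DataLifecyclePhase.DECOMMISSION_AND_DELETION},
    DataLifecyclePhase.DECOMMISSION_AND_DELETION: {
        DataLifecyclePhase.ITERATIVE_MAINTENANCE,   # restore to day-to-day use
        DataLifecyclePhase.PURGED_FROM_EXISTENCE,   # no way back after this
    },
    DataLifecyclePhase.PURGED_FROM_EXISTENCE: set(),
}


def transition(current, target):
    """Move to the target phase only if the transition is legal."""
    if target not in ALLOWED[current]:
        raise ValueError(f"Illegal transition: {current.name} -> {target.name}")
    return target


phase = transition(DataLifecyclePhase.INCEPTION, DataLifecyclePhase.CREATION_AND_REGISTRATION)
```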

System Performance Test Management Systems Development Life Cycle (SDLC) Phases or System Performance Test Management Software Development Life Cycle (SDLC) Phases:

The SDLC is a means for facilitating and controlling how IT Professionals deliver Assets, such as System Performance Test Management Systems and Software. In this case, you should default to the master SDLC, which is used to deliver any Asset of any type, including those associated with the System Performance Test Management discipline.

System Performance Test Management SDLC Diagram

Inventories: System Performance Test Management Inventories

There are probably no greater or more important tools for providing System Performance Test Management transparency and direction than the collection, ordering, categorizing, grouping, and maintenance of all related System Performance Test Items. In other words, System Performance Test Management Inventories.

In short, an Inventory represents a list of individual things or instances of things that are typically all of the same Noun Type or Data Type, where these instances are described and detailed by their Attributes, along with the Data and Information that act as values for such Attributes.
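As a minimal sketch of that definition, the following models one instance in a System Performance Test Inventory as a record described by its Attributes. The specific Attribute names are illustrative assumptions, not a prescribed schema.

```python
from dataclasses import dataclass, field


@dataclass
class SystemPerformanceTestItem:
    """One instance in a System Performance Test Inventory, described
    and detailed by its Attributes and their values."""
    unique_identifier: str
    name: str
    owner: str
    environment: str
    lifecycle_phase: str
    attributes: dict = field(default_factory=dict)  # further Attribute/value pairs


inventory = [
    SystemPerformanceTestItem(
        unique_identifier="SPT-0001",
        name="Checkout Latency Test",
        owner="Performance Engineering",
        environment="UAT",
        lifecycle_phase="Iterative Maintenance",
    ),
]
print(len(inventory), "Item(s) tracked")
```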

At a minimum, System Performance Test Management Inventories are used for the establishment of solid System Performance Test Configuration Management practices, as the System Performance Test Instances tracked within such System Performance Test Inventories act as Configuration Items (in Target and/or Dependency form) for key Configurations (System Performance Test Management Configurations or otherwise).

Inventories are also used for solid decision making. Good decisions, either strategic or tactical, are made based on having good Data and Information. And, good Data and Information only come from taking the time to follow best practices associated with Inventory Management. It's only through building such Inventories that an enterprise can achieve solid System Performance Test Management Business Intelligence and Reporting.

Also, it's these very same Inventories that act as the foundation for understanding and managing Total Cost of Ownership (a.k.a. "TCO") for System Performance Test Management. Without such Inventories, trying to understand your costs can be nothing more than uneducated guessing.

The obvious place to start is with System Performance Test Inventories and then move on to surrounding Inventories that are directly and indirectly related to System Performance Test Management.

Additionally, there are many other types of Inventories that are common and important to System Performance Test Management, which include but are not limited to examples such as:

  1. People and Organizations related to System Performance Test Management
  2. Roles, Responsibilities, and Skills related to System Performance Test Management
  3. Products and Services related to System Performance Test Management
  4. Capabilities related to System Performance Test Management
  5. Contracts, Agreements, and Licenses related to System Performance Test Management
  6. Processes related to System Performance Test Management
  7. Tools and Technologies (e.g. Systems/Applications/Software/Computers) related to System Performance Test Management
  8. Data Types and Instances related to System Performance Test Management
  9. Data Interfaces related to System Performance Test Management
  10. Environments related to System Performance Test Management
  11. Facilities and Locations related to System Performance Test Management

If you and/or your enterprise are not collecting and maintaining such Inventories, you're probably considered to be very low on the efficiency and effectiveness maturity scale.

It's important to keep in mind that collecting and managing System Performance Test Management Inventories is something that should be performed across all phases of System Performance Test Management Lifecycle and across all Environments (i.e. System Performance Test Management Environments). Both are considered to be very important Best Practices. For example, you and/or your enterprise cannot get a complete understanding of System Performance Test Management costs or impacts without knowing all related Inventory Items in all environments. And, tracking across all lifecycle phases gives a temporal perspective that is important for things like problem analysis, historical reporting, and the reconstruction of state (i.e. Configuration Management).

NOTE: System Performance Test Management Inventories are also important for other enterprise functions, such as Architecture and Design. Such Inventories represent the foundation for understanding an enterprise's Current State and are critical for planning Future State and any related strategies, roadmaps, and transition plans for facilitating change.


Environments: System Performance Test Management Environments

Building environments that are specific to and for the discipline known as System Performance Test Management is no different than doing so for any other discipline area. The reader should, therefore, refer to the IT Environment Framework to understand such environments.

IT Environment Framework for System Performance Test Management

Metrics: System Performance Test Management Metrics

As with any professional Discipline, the place to start with when dealing with System Performance Test Management specific metrics is with standard metrics categorizations. Standard Metrics Categorizations, or what are commonly referred to as "SMCs," include but are not limited to...

System Performance Test Management Quantitative Metrics: Quantitative metrics for System Performance Test Management often revolve around the "counting" of key constructs that are associated with the Discipline. For example, the number of System Performance Test Items or Entities that have been Created, Edited or Modified, Copied or Cloned, Destroyed, Archived, Restored, etc. (Note the correlations to key System Performance Test Management Verbs!). Also, the counts for things like the number of System Performance Test Management Stakeholders, such as but not limited to Paying Customers, End Users, Employees, Consultants, etc. are also very useful.

System Performance Test Management Qualitative Metrics: Qualitative metrics for System Performance Test Management often revolve around concepts such as System Performance Test Management Defects, Failures, Problems, Incidents, and/or Issues. So, for example, if we were to capture the number of System Performance Test Management Defects (i.e. their counts) over time, we could do things like see if Defect quantities are going up or down, over time, allowing us to explore that area for things like correlating Causes and Effects.

System Performance Test Management Time Metrics: When dealing with System Performance Test Management Time Metrics, there are usually two forms. The first was introduced in the previous paragraph, which has to do with capturing and measuring things like Quantitative or Qualitative Metrics, over time. In this case, we capture other metric categories, over time, with the intent to see how they change and perform, based on modifications to the System Performance Test Management Operating Environment. The second form of Time related metrics has to do with system or operational performance, such as in the case of how long it takes to process a System Performance Test Management Request, from the time it is created to the time the Requester gets a satisfactory deliverable that allows him or her to move on with his or her work.

System Performance Test Management Utilization Metrics: Utilization Metrics specifically have to do with the consumption of System Performance Test Management specific solutions or deliverables. For example, tracking the number of System Performance Test Management Service Requests, over periods of time, along with their corresponding System Performance Test Management Deliverables, allows one to measure how active System Performance Test Management Services are against other Services that may exist within the Enterprise.

System Performance Test Management Financial Metrics: As is always the case for any single Discipline, Financial Metrics for System Performance Test Management always revolve around things like revenue, expenses, and profits, both, for operators of the Service or Services and for consumers of the Service or Services. For example, if a System Performance Test Management Request is invoked by a System Performance Test Management Customer (acting as the "Requester"), it becomes important to be able to identify and understand what the cost is to that Customer who is invoking the Request, and it also becomes important to understand why that cost is what it is. In the case of Services that do not yield revenue or profits, measuring costs is a strong way to, at the very least, help understand the costs associated with each Service being performed by, within, external to, and for the Enterprise and its Customers.

Note: It's important to understand that, when it comes to metrics, enterprises should take a "Crawl," "Walk," "Run" approach to collecting, working with, and understanding them. That is, you cannot get to complex metrics collection, dissection, analysis, and understanding until you start with basic metrics and slowly work your way to more complex metrics representations.
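As a minimal "Crawl"-stage sketch, the following counts basic Quantitative Metrics (Verb-driven event counts) and tracks one Qualitative Metric (Defect counts) over time. The event log is entirely hypothetical.

```python
from collections import Counter

# Hypothetical event log: (month, event) pairs for System Performance Test Items.
events = [
    ("2024-01", "Create"), ("2024-01", "Modify"), ("2024-02", "Create"),
    ("2024-02", "Defect"), ("2024-03", "Defect"), ("2024-03", "Defect"),
]

# Crawl: simple counts per Verb (Quantitative Metrics).
print(Counter(event for _, event in events))

# Walk: Defect counts per month (a Qualitative Metric observed over time).
defects_per_month = Counter(month for month, event in events if event == "Defect")
for month in sorted(defects_per_month):
    print(month, defects_per_month[month])  # is the Defect trend rising or falling?
```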


Services: System Performance Test Management as a Set of Services (a.k.a. System Performance Test Management Services)

One of the most important concepts you will learn about System Performance Test Management (or any Discipline, for that matter) is the notion of implementing the Discipline as an accountable, planned, controlled, transparent, and managed "Service."

In short, Services represent logically "bounded" and repeatable sets of work types, activities or tasks that are performed by humans and/or machines, with the specific intent to provide outputs or deliverables, in the form of solutions for the requesting Stakeholders who are commonly considered the customers of such Services. In other words, we perform and/or provide a Service to deliver very specific solutions to very specific Stakeholders who are looking for a means to solve a certain problem they have.

A System Performance Test Management Service is defined as:

"1. A set of solutions, either transactional (i.e. Transactional System Performance Test Management Services) or dial-tone (i.e. Dial-Tone System Performance Test Management Services), that are being or have been put in place to yield an intended, controlled, expected, repeatable and measurable set of results or deliverables for System Performance Test Management specific Customers, Consumers or Clients.

NOTE: System Performance Test Management Service Consumers or Clients can be either Human Resources or Systems."

All Services, including System Performance Test Management Services, can be performed manually (i.e. by people), automatically (i.e. by machines such as Computers), or by a combination of the two (i.e. a hybrid that is both manual and automated).

Also, all Services, including System Performance Test Management Services, can be either transactional or dial tone, in nature.

In the case of Transactional Services for System Performance Test Management, a Service Request is submitted and that Request is fulfilled as part of a process that is either manual, automated, or a hybrid of both (e.g. a Service to perform maintenance on your System Performance Test Management System).

In the case of Dial Tone Services for System Performance Test Management, a Service is expected to be up, running, available, and accessible to an End User so that he/she/it may perform some controlled and highly repeatable function (e.g. a "System Performance Test Management System" that is up and running all the time).

System Performance Test Management Service Components: The successful implementation of System Performance Test Management as a set of Services for your enterprise usually implies that a number of key components have been established to support it. These components are:

  1. A clearly documented and socialized System Performance Test Management Service Owner that is held accountable for Service performance, quality, and cost.
  2. A clearly documented and socialized System Performance Test Management Service Provider, Organization or Group who is performing the Service or work.
  3. A clearly documented and socialized inventory of all System Performance Test Management Service Inputs, including System Performance Test Management Service Requests and any artifacts necessary to support such Requests so that consumers of the Service know how to engage and request or take advantage of them.
  4. For every System Performance Test Management Service Input, a clearly documented and socialized inventory of System Performance Test Management Service Outputs, making it clear to consumers what they can expect to receive as a result of a successful Service Request.
  5. For every System Performance Test Management Service Input, a clearly documented and socialized inventory of the work being performed by the Service Provider to achieve such Outputs or Deliverables.
  6. For every System Performance Test Management Service Input, a clearly documented and socialized inventory of Service Level Agreements (e.g. Service Availability, Service Duration, Service Guarantees, etc.) that can be used to set expectations for, and measure actuals against, said Service Outputs.
  7. Clearly specified System Performance Test Management Service Costs that help set expectations for Service Requesters (i.e. the cost of a request) and that provide clear transparency to the organizations that fund and sponsor such Services (i.e. the Total Cost of Ownership (TCO) of your Service(s)).
  8. System Performance Test Management Service Request Patterns (e.g. Estimation, Creation, Modification, Decommission, Support/Incidents, Complaints, etc.) in order to create intuitive and repeatable user experiences across different Service Types.
  9. Clearly understand what System Performance Test Management Service Resources are required, human or otherwise, to create and deliver your System Performance Test Management Service Deliverables, in a repeatable, cost-efficient, timely, and high quality manner.
  10. For every System Performance Test Management Service Request, understand the chargeback mechanism, in order to recoup your Service Costs.
  11. For every System Performance Test Management Service, it's important to understand the skills that are required, will need to be developed, and will need to be maintained by Service Resources, in order to deliver each Service Deliverable.
  12. It's important to understand who your System Performance Test Management Service Stakeholders are (this includes but is not limited to your Customers, Consumers, Clients, Sponsors, etc.), as well as the types of problems they're trying to solve and the interests they will have in your Services.

System Performance Test Management Ownership: The most important thing to understand about a System Performance Test Management Service is that, in order for such a Service to be successful, there must be a clear and accountable Owner for it. That is, there needs to be a very clear and accountable named person or organization that owns and is fully responsible for the Service, all of its sub-Services and, most importantly, all of the Service's "Outcomes." Without clear ownership, Services are almost never successful. And, for those few occasions where Services are successful without clear ownership, you can assume that they're successful because the people working in those Service areas are acting as heroes, or... those Services are just plain lucky (that kind of luck doesn't last for long).

System Performance Test Management Service Inputs: There are typically two types of inputs to any System Performance Test Management Service. The first is what is known as a "System Performance Test Management Service Request" and the second really represents any and all supporting artifacts that are necessary to support such requests, including but not limited to Data and Information in the form of Documents, either electronic or paper in form. Many would argue that the "money" to pay for the Service execution of the Request would be the third but, for now, we will assume that payment is controlled through the Data and Information provided to the Service Operators, in support of the Request.

System Performance Test Management Service Outputs: The outputs of any Service are often referred to as the Service's Deliverables. Therefore, the readers should be aware that the terms "System Performance Test Management Outputs" and "System Performance Test Management Deliverables" are synonymous and interchangeable. All work performed in any enterprise is, by default, a Service that is being performed for someone else and, therefore, all work or Services yield results. These results are the Service's Outputs or Deliverables and a good Service ensures that such Outputs are appropriately documented to the consumers of said Service. This means that for any given System Performance Test Management Service Request Type or Category there will be one or more clearly defined and documented Outputs or Deliverables, making it clear to the consumer what he, she, or they will get in response to their Request. This can be as simple as an answer to a question or as complex as the Merger of two enterprises.

System Performance Test Management Service Levels: Service Levels represent "performance agreements," contractual or otherwise, that dictate how well a System Performance Test Management Service should perform, most often keeping the Customers, Consumers, Clients or End Users of the Service in mind. System Performance Test Management Service Levels can come in many forms and are often worked out by the Customers paying for the Services and the Service Providers who sell or provide the Services. In many cases, Service Levels are also self-imposed by the Service Providers performing the Services as a means to set expectations for Service Customers. In short, System Performance Test Management Service Levels are constraints, limitations, and/or expectations that are tied directly to System Performance Test Management Service Deliverables. They represent measures for things like quality, efficiency, and cost against said Deliverables or Outputs that allow the consumer of such Services to measure what they actually get against what they expected to get.
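As a minimal sketch of measuring actuals against such agreements, the following times a Transactional Service Request from creation to fulfillment and compares the result to an agreed target. The 48-hour target and the field names are hypothetical assumptions.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class ServiceRequest:
    request_id: str
    created_at: datetime
    fulfilled_at: datetime


# Hypothetical Service Level Target: fulfill each Request within 48 hours.
SLA_FULFILLMENT_TARGET = timedelta(hours=48)


def met_sla(request):
    """Measure the actual fulfillment duration against the agreed target."""
    return (request.fulfilled_at - request.created_at) <= SLA_FULFILLMENT_TARGET


req = ServiceRequest(
    request_id="SPTM-REQ-101",
    created_at=datetime(2024, 5, 1, 9, 0),
    fulfilled_at=datetime(2024, 5, 2, 16, 30),
)
print(met_sla(req))  # True: fulfilled within the 48-hour target
```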


Service Paradigms: Centralized System Performance Test Management vs. Federated System Performance Test Management

Assuming an enterprise pursues the establishment of System Performance Test Management as a set of controlled Services, there are three common paradigms for doing so. These include:

  1. A "Centralized System Performance Test Management" implementation paradigm
  2. A "Federated System Performance Test Management" implementation paradigm
  3. A "Hybrid System Performance Test Management" implementation paradigm

Centralized System Performance Test Management is defined as:

"1. The term or phrase that implies establishing and/or practicing the Discipline known as System Performance Test Management as a concentric and singular set of organizations and services, usually in order to serve an entire enterprise, regardless of geographic location, further implying full centralization and no federation of any and all System Performance Test Management associated Work, Activities, Actions, Tasks, Capabilities and/or Services."

Federated System Performance Test Management, which is also referred to as Decentralized System Performance Test Management, is defined as:

"1. The term or phrase that implies establishing and/or practicing the Discipline known as System Performance Test Management in multiple pockets, communities, or organizations, further implying no centralization in the implementation and execution of System Performance Test Management associated Work, Activities, Actions, Tasks, Capabilities and/or Services."

There are clear tradeoffs to each of the two models. For example, in a Centralized paradigm, it's normally easier to coordinate work and provide broad, consistent coverage across many areas of the enterprise and its relevant stakeholders. However, it becomes far more difficult for a centralized organization to properly fund and staff the resources and services needed to perform all required work across all stakeholders in a much larger enterprise. Conversely, a Federated paradigm makes local funding, staffing, and responsiveness easier but gives up enterprise-wide coordination, consistency, and coverage.

It's also important to note that a third paradigm exists as an option. This is known as a Hybrid System Performance Test Management paradigm or model. In this case, there is a centralized System Performance Test Management organization that is often responsible for things like centralized governance, command, control, and communications, while federated staff and services deal with localized forms of System Performance Test Management. In this type of paradigm, federated staff and services usually report directly into their local management but may have matrix reporting or responsibilities into the Centralized System Performance Test Management organization.


Principles & Best Practices: Common Principles and Best Practices for System Performance Test Management

A "Principle" is defined as being: "A professed assumption, basis, tenet, doctrine, plan of action or code of conduct for activities, work or behavior." Therefore, we can deduce the definition of "a System Performance Test Management Principle" to be:

System Performance Test Management Principle: "1. A professed assumption, basis, tenet, doctrine, plan of action or code of conduct for any activities, work or behavior associated with the Discipline known as System Performance Test Management."

A "Best Practice" is defined as being: "One or more Activities, Actions, Tasks or Functions that often do not conform with strict Standards and that have evolved, over time, to be considered as conventional wisdom for consistently and repeated achieving Outcomes or Results that can be measured as being equal to or above acceptable norms." Therefore, we can deduce the definition of "a System Performance Test Management Best Practice" to be:

System Performance Test Management Best Practice: "1. One or more System Performance Test Management related Activities, Actions, Tasks or Functions that often do not conform with strict standards and that have evolved, over time, to be considered as conventional wisdom for consistently and repeatedly achieving Outcomes or Results that can be measured as being equal to or above acceptable norms."

The plural form of this term would be "System Performance Test Management Best Practices."

Common System Performance Test Management related principles and best practices exist to help achieve higher than average expectations of quality and to ease the implementation, support, operation, and future change associated with the solutions industry professionals put in place to address the needs of this Discipline and all its related stakeholders.

While this entire document is meant to represent and serve as a set of common principles and best practices for System Performance Test Management, the following list represents a summary of some very basic examples of what implementers, supporters, and operators of System Performance Test Management should constantly be working toward:

Establish and always have very clear Ownership for System Performance Test Management. Establishing, publishing and socializing clear Ownership for System Performance Test Management allows an enterprise and all its Resources, regardless of their geographic location, to assign accountability for all aspects of the Discipline. It also ensures that there's always at least one person that everyone can go to for transparency into the Discipline as well as for handling work that is associated with the Discipline.
Define, Collect, and Manage Relevant System Performance Test Management Inventories. As an IT professional, there are probably few things that are as important as knowing what is or is not in your portfolio, as well as understanding key traits about your portfolio. You cannot achieve this without the transparency provided by your inventories. Therefore, it is critical that you clearly define, collect, manage, and govern any and all relevant System Performance Test Management inventories. Lack of System Performance Test Management Inventories means no transparency, a chaotic and immature environment, and (even worse) the implication that you don't know how to do your job.
Always use standard terminology for System Performance Test Management, in order to standardize communications between stakeholders. It is often argued that the biggest mistake you can make is to create your own words and/or your own definitions when communicating with others. There is no place where this is more accurate than in the field of Information Technology. IT Stakeholders make up their own words and definitions far too often, or let their business constituents do so. When you make up words or definitions, or you let others do so, you're doing your organization a grave injustice. Self-invented terminology and grammar often lead to poor communication, which in turn leads to redundancy of solutions, higher complexity of environments, slower delivery times, and much higher costs. Therefore, the IF4IT always recommends that you leverage standard terminology for System Performance Test Management, whenever possible.
Centralization of System Performance Test related data. While it is often impossible to centralize and collocate all System Performance Test related data and information, especially in a geographically dispersed environment, System Performance Test Management related stakeholders should always strive to centralize all data and information. The goals are to eliminate data fragmentation, strengthen the source of truth for data, reduce the number of systems needed to support stakeholders, reduce the complexity of solutions, improve usability, and ultimately reduce the costs associated with System Performance Test Management.
Clearly define, implement, track, and analyze System Performance Test Management Metrics. In order to successfully set up the discipline of System Performance Test Management and its related Services, it is critical to clearly define, track, and constantly analyze System Performance Test Management metrics. Such metrics include but are not limited to Supply and Demand Metrics (i.e. Operational Metrics), Performance Metrics, Quality Metrics, and Financial Metrics (a simple illustrative sketch follows this list).
Transparency of System Performance Test related data. Stakeholders should always strive to make any and all System Performance Test Management data transparent to all other appropriate stakeholders, at a minimum, and often to the entire enterprise. The exception is when private user data must be protected. Many stakeholders make the mistake of treating internal operational data as private or protected. This often creates a data silo and will often lead to internally siloed organizations that revolve around such data silos.
Do not let "perfection" of System Performance Test Management solutions stand in the way of "good enough solutions". Often, System Performance Test Management stakeholders "overthink" solutions, leading to the impression that best-of-breed or perfect solutions are more effective than "good enough" solutions. Experience tells us that "good enough" is, almost always, the better path to follow. We live in an age where technologies grow old in the blink of an eye. Even the implementation of something that looks perfect, today, will look antiquated, tomorrow. This is especially true if your enterprise doesn't have a long term funding plan and commitment to improvements and upgrades of the solution(s) put in place.
Follow industry Standards, Best Practices, and Guiding Principles for System Performance Test Management, whenever possible. One of the most common errors many enterprises make is to create solutions from scratch, without the guidance, assistance and/or experience of others who have created such solutions before them. Whenever possible, the IF4IT recommends that you research existing Standards, Best Practices, and Guiding Principles to avoid the mistakes of others, while also gaining from their successes. Remember, we live in a vast world. Chances are very high that someone else has already experienced the pain you're about to create for yourself. Wise people will always look to learn from such people's experiences before they go down the road of implementing their own solutions.
Work toward and maintain a Single Source of Truth (SSoT), whenever possible. While it may be impossible to truly maintain a Single Source of Truth (SSoT) for all data items at all times, especially in the case where the same data entity or instance enters an enterprise through unique data channels, it is an accepted, industry-wide best practice to always work toward such a goal.
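
For the metrics-related principle above, the following sketch shows one hypothetical way to derive simple Supply and Demand (Operational) and Quality measures from a Service's event log. The event names and figures are illustrative assumptions, not prescribed metrics.

    from collections import Counter

    # Hypothetical event log for a System Performance Test Management Service.
    events = [
        ("request_received", "Load Test"),
        ("request_received", "Stress Test"),
        ("request_completed", "Load Test"),
        ("defect_found", "Load Test"),
        ("request_completed", "Stress Test"),
    ]

    counts = Counter(kind for kind, _ in events)
    demand = counts["request_received"]                    # Demand (Operational) Metric
    supply = counts["request_completed"]                   # Supply (Operational) Metric
    defect_rate = counts["defect_found"] / max(supply, 1)  # simple Quality Metric

    print(f"Demand={demand}, Supply={supply}, "
          f"Defects per completed test={defect_rate:.2f}")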

Further Reading and Reference Material for System Performance Test Management

The Information Technology (IT) Learning Framework. A tutorial that helps readers understand Information Technology and how disciplines, such as this one, fit into the bigger picture of IT Operations.

Copyright 2009 - Present by The International Foundation for Information Technology (IF4IT) : Privacy Policy and Terms of Use