This document represents an aggregated, ordered and contextualized view of the material we've been able to compile and publish that is related to the topic of "System Integration Test Management." The goal is to make this page a landing and launch point for all things related to this topic. As our content becomes more complete and more accurate, this page should become a very useful and powerful knowledge base for this topic and all parties interested in it.
You'll find that the content of this document is consistent with that of other discipline-related documents. This is intentional. The consistency is based on a knowledge pattern that helps individuals learn about different topics more quickly and efficiently. We hope you find the material useful and easy to learn.
It's important to realize that the content in this document and any related sub-documents is constantly evolving. Therefore, we recommend you check for updates regularly to keep up with the latest material.
The Foundation always welcomes your feedback and suggestions for improvement, as we're always looking for ways to improve our solutions and offerings to the general community.
All solutions published by the Foundation are subject to the terms and conditions of the Foundation's Master Agreement.
This document or artifact, along with everything in it, is intended to act as a "Framework" that addresses various aspects of System Integration Test Management.
Readers will notice that most sections in the Table of Contents (TOC) use a format where the TOC entry is prefixed with a topic name, followed by a short descriptive title (i.e. "TOPIC_NAME: TOPIC_RELATED_SECTION_TITLE"). This is intentional and represents a format that allows the Foundation to identify appropriate topic areas, keep distinct topic areas separate from one another, order topic areas appropriately, and maintain consistency both within and across different IT Disciplines.
To elaborate, this artifact is intended to:
From the Foundation's perspective, if done correctly, all of the above will allow the Foundation to properly decompose, document and publish content related to each sub-area or sub-topic for each IT Discipline, including this specific discipline (i.e. "System Integration Test Management").
From the reader's perspective, if done correctly, all of the above will allow him or her to easily find and learn about specific areas of interest associated with this and all other IT Disciplines in a manner where the reader may effectively consume and digest material in small atomic segments that act as repeatable and more effective learning units.
As this artifact evolves and progresses, the reader will see it address key areas of the professional IT Discipline "System Integration Test Management" that range from its detailed definition through closely related terms, phrases and their definitions, to its detailed specification of System Integration Test Management Capabilities, and all the way through to defining, delivering, operating and supporting System Integration Test Management Services.
As mentioned previously, this document will continue to evolve and the Foundation recommends the reader check back, regularly, to stay abreast of modifications and new developments. It is also important to understand that the structure of this artifact may change to meet the needs of such evolution.
Before moving on to learn more about the rest of the System Integration Test Management framework, we suggest that you take some time to familiarize yourself with the following very basic term(s)...
System Integration Test:
"1. An evaluation, examination or test of System Integration expectations, of or for one or more Assets, Systems, or Solutions."
System Integration Test Management:
"1. The professional discipline that involves working with, in or on any aspect of planning, delivering, operating or supporting one or more System Integration Test Items, or any and all solutions put in place to deal with such Items.
2. The solution set that a person or organization puts in place to manage one or more System Integration Test Items.
3. The process or processes put in place by a person or organization to assist in the management, coordination, control, delivery, or support of one or more System Integration Test Items.
4. The Enterprise Capability that represents the general ability or functional capacity for a Resource or Organization to deal with or handle one or more System Integration Test Items. Such a term is often used by Information Technology (IT) Architects when performing or engaging in the activities associated with general Capability Modeling."
In addition to the above basic term(s), you can also learn a great deal about System Integration Test Management by familiarizing yourself with the broader spectrum of terms that make up the System Integration Test Management Glossary...
Language between IT professionals and the businesses we serve is often a significant barrier to success, as we often spend countless hours trying to interpret each other's meanings. This is often also true between IT professionals who are taught to use certain terms and definitions as part of the organizations and industries they serve. It's when you start to jump from organization to organization, from enterprise to enterprise, and from industry to industry that you realize how much time and effort is wasted on just getting language and meanings correct. For these reasons, the Foundation puts a great deal of focus on terms and phrases, as well as their corresponding definitions. We highly recommend you spend time learning and understanding all of the related terms and phrases, along with their meanings, for all areas of "System Integration Test Management."
|System Integration Test Management Glossary|
|Centralized System Integration Test Management||System Integration Test Management Principle|
|Decentralized System Integration Test Management||System Integration Test Management Procedure|
|Enterprise System Integration Test Management||System Integration Test Management Process|
|Federated System Integration Test Management||System Integration Test Management Professional|
|Regional System Integration Test Management||System Integration Test Management Program|
|System Integration Test||System Integration Test Management Project|
|System Integration Test Automation||System Integration Test Management Reference Architecture|
|System Integration Test Capacity Management||System Integration Test Management Release|
|System Integration Test Catalog||System Integration Test Management Report|
|System Integration Test Catalogue||System Integration Test Management Reporting|
|System Integration Test Configuration||System Integration Test Management Roadmap|
|System Integration Test Configuration Item||System Integration Test Management Role|
|System Integration Test Configuration Management||System Integration Test Management Rule|
|System Integration Test Cost||System Integration Test Management Schedule|
|System Integration Test Data Entity||System Integration Test Management Security|
|System Integration Test Database||System Integration Test Management Service|
|System Integration Test Decommission||System Integration Test Management Service Assurance|
|System Integration Test Delivery||System Integration Test Management Service Contract|
|System Integration Test Dependency||System Integration Test Management Service Level Agreement (SLA)|
|System Integration Test Deployment||System Integration Test Management Service Level Objective (SLO)|
|System Integration Test Document||System Integration Test Management Service Level Requirement (SLR)|
|System Integration Test Document Management||System Integration Test Management Service Level Target (SLT)|
|System Integration Test File Plan||System Integration Test Management Service Provider|
|System Integration Test Framework||System Integration Test Management Service Request|
|System Integration Test Governance||System Integration Test Management Software|
|System Integration Test History||System Integration Test Management Solution|
|System Integration Test Identifier||System Integration Test Management Stakeholder|
|System Integration Test Inventory||System Integration Test Management Standard|
|System Integration Test Item||System Integration Test Management Strategy|
|System Integration Test Lifecycle||System Integration Test Management Supply|
|System Integration Test Lifecycle Management||System Integration Test Management Support|
|System Integration Test Management||System Integration Test Management System|
|System Integration Test Management Application||System Integration Test Management Theory|
|System Integration Test Management Best Practice||System Integration Test Management Training|
|System Integration Test Management Blog||System Integration Test Management Vision|
|System Integration Test Management Capability||System Integration Test Management Wiki|
|System Integration Test Management Center of Excellence||System Integration Test Management Workflow|
|System Integration Test Management Certification||System Integration Test Metadata|
|System Integration Test Management Class||System Integration Test Migration|
|System Integration Test Management Community of Practice (CoP)||System Integration Test Plan|
|System Integration Test Management Course||System Integration Test Portfolio|
|System Integration Test Management Data||System Integration Test Portfolio Management|
|System Integration Test Management Data Dictionary||System Integration Test Processing|
|System Integration Test Management Database||System Integration Test Record|
|System Integration Test Management Demand||System Integration Test Records Management|
|System Integration Test Management Dependency||System Integration Test Repository|
|System Integration Test Management Discussion Forum||System Integration Test Reuse|
|System Integration Test Management Document||System Integration Test Review|
|System Integration Test Management Documentation||System Integration Test Schedule|
|System Integration Test Management File Plan||System Integration Test Schematic (Schema)|
|System Integration Test Management Form||System Integration Test Security|
|System Integration Test Management Framework||System Integration Test Software|
|System Integration Test Management Governance||System Integration Test Strategy|
|System Integration Test Management Knowledge||System Integration Test Support|
|System Integration Test Management Lessons Learned||System Integration Test Taxonomy|
|System Integration Test Management Metric||System Integration Test Termination|
|System Integration Test Management Operating Model||System Integration Test Tracking|
|System Integration Test Management Organization||System Integration Test Tracking Software|
|System Integration Test Management Plan||System Integration Test Transaction|
|System Integration Test Management Platform||System Integration Test Unique Identifier|
|System Integration Test Management Policy||System Integration Test Verification|
|System Integration Test Management Portfolio||System Integration Test Version|
|System Integration Test Management Principle||System Integration Test Workflow|
Please refer to the IT Glossary for other terms and phrases that may be relevant to this professional discipline.
Readers may also refer to the Taxonomy of Glossaries for terms and phrases that are semantically grouped according to IT Disciplines or enterprise domains.
This System Integration Test Management Glossary is a contextual subset of the master IF4IT Glossary of Terms and Phrases. The master glossary can be used by you and your enterprise as a foundation for broader understanding of Information Technology and can be used as a teaching and learning tool for those you work with, helping to ensure a common and more standard language.
A Capability, as it pertains to Information Technology (IT) or to an enterprise that an IT Organization serves, is defined as "a manageable feature, faculty, function, process, service or discipline that represents an ability to perform something which yields an expected set of results and is capable of further advancement or development." In other words, a Capability is nothing more than "the ability to do something" or, quite simply, a Feature or Function. Therefore, when applied to an enterprise, a Capability represents a critical Enterprise Feature or Enterprise Function.
When it comes to Capabilities, there are multiple types that an enterprise needs to be aware of. Examples include but are not limited to:
As can be seen above, there are Capabilities that are associated with Resources, Organizations, and Assets such as Systems. All are important to an enterprise.
In the case of this IT Discipline (i.e. System Integration Test Management), we use the word Capability in the context of an Enterprise Capability or an IT Capability, which are both equivalent to Enterprise Disciplines or IT Disciplines, respectively. In short, the Capability of System Integration Test Management represents the ability to deal with any and all System Integration Test Items and anything relevant that is related to or associated with any System Integration Test Items.
If you think about it, a capability is really nothing more than a "verb" or "action" that represents "the ability to do something." Understanding this allows us to derive a consistent and highly repeatable set of sub-capabilities for any Noun we're dealing with. For example:
In summary, the implication is that the Enterprise Capability or Enterprise Discipline known as System Integration Test Management is the superset of all the above Sub-Capabilities, as they pertain to or are applied to the discipline-specific Noun: "System Integration Test." This now translates more specifically to:
For a more complete list of very specific Capabilities/Disciplines, refer to the Foundation's Master Inventory of IT Disciplines. It is important to note that this inventory is published in a flat, non-hierarchical form, specifically because "hierarchy" is almost always a matter of personal preference or context: the hierarchy that is important to one Resource or Organization may be unimportant to another's needs or requirements.
This now brings us to a very obvious problem that surrounds Capabilities: there are simply too many "granular" or "specific" Capabilities to document and publish in any single Capability Model. The end result is that a Capability Model may become unwieldy as it tries to incorporate so many different specific Capabilities. Also, Capability Modeling "Purists," who all have their own (and very differing) opinions about how Capability Models should or should not be represented, almost always refuse to get into the details. To address this, we recommend using a generic set of Capabilities that map to and are driven by the Systems Development Life Cycle. For example:
As you can see from the above, we now have a very limited, controlled and manageable set of Discipline-specific Capabilities for the Discipline System Integration Test Management.
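The verb-noun composition described above can be sketched in a few lines of code: pairing generic lifecycle verbs with the Discipline's base Noun yields a small, controlled set of Discipline-specific Capability names. The verb list below is an illustrative assumption loosely mapped to Systems Development Life Cycle phases, not the Foundation's official list.

```python
# Derive Discipline-specific Capabilities by composing "Verb + Noun".
NOUN = "System Integration Test"

# Assumed generic lifecycle verbs (illustrative, loosely SDLC-aligned).
SDLC_VERBS = ["Plan", "Design", "Develop", "Test", "Deploy", "Operate", "Support"]

def derive_capabilities(noun, verbs):
    """Compose 'Verb + Noun' Capability names, e.g. 'Plan System Integration Test'."""
    return [f"{verb} {noun}" for verb in verbs]

for capability in derive_capabilities(NOUN, SDLC_VERBS):
    print(capability)
```

Because the composition is purely mechanical, the same function works for any other Discipline's base Noun, which is exactly what keeps the resulting Capability set consistent across Disciplines.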
As a reminder, the above Capability representations are "suggestions" for baselining or initializing your own Enterprise Capability Model (ECM). It's recommended that you take the time to work with your enterprise stakeholders to improve upon and/or customize your own ECM so that you can help meet their needs. However, with that being said, it's always a better idea to go in with a baseline that you can modify rather than building your own solution from scratch, especially if your goals are to standardize, not reinvent the wheel, and not deviate too far from what other enterprises are doing to model their own environments. This is especially true if you've never had any experience building ECMs that have gained and maintained full adoption.
Why do enterprises perform Capability Modeling? Enterprises most often build Capability Models that are associated with System Integration Test Management for the following reasons...
Capability Modeling Recommendations: Some things to consider and keep in mind when working on or creating your System Integration Test Management and Enterprise Capability Models...
Learn More About Capability Models: Taking the time to learn about and understand Capability Models, what they're for, and how they're used may help you learn how System Integration Test Management better fits into the broader enterprise. Therefore, we suggest you spend some time reviewing and understanding the IF4IT Enterprise Capability Model...
Here's a very simple fact... If an enterprise does not establish and enforce clearly defined Ownership (i.e. a Resource and his or her Organization are assigned accountable ownership) for System Integration Test Management, the enterprise has automatically set itself up for failure in its implementation of that discipline. Therefore, if you and your enterprise want to implement and maintain a successful solution for System Integration Test Management, there must be a clearly defined Owner who can and will be held accountable for getting work done, providing transparency, helping with strategy setting, and coordinating implementation of System Integration Test Management as a fully functional and mature enterprise Service.
Having clearly defined Ownership should not be confused with having fully dedicated Resources that spend one hundred percent of their time working on System Integration Test Management. In fact, smaller enterprises, unlike larger ones, can rarely afford to dedicate full-time Resources to every enterprise IT Discipline. This being the case, all IT Disciplines, including System Integration Test Management, should "always" have clearly defined Owners so that there is always a clear point of accountability and contact for any issues or work that need to be addressed.
In addition to the common best practice of having clearly assigned Ownership for System Integration Test Management, it is also considered a best practice to clearly publish and socialize System Integration Test Management Ownership details to a centralized location (often referred to as a "Service Catalog" or an "Enterprise Service Catalog"), along with Ownership details for all other IT Disciplines, so that the entire enterprise has constant access to it.
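As a concrete illustration of publishing Ownership details to a centralized Service Catalog, the following sketch models a single catalog entry as a record. The field names and all example values are hypothetical assumptions, not a standard Service Catalog schema.

```python
# A minimal Service Catalog entry record for Discipline Ownership details.
from dataclasses import dataclass, asdict

@dataclass
class ServiceCatalogEntry:
    discipline: str           # the owned Discipline/Capability
    owner: str                # the accountable Resource
    owning_organization: str  # the accountable Organization
    contact: str              # whom the enterprise contacts for help

entry = ServiceCatalogEntry(
    discipline="System Integration Test Management",
    owner="Jane Doe",                               # hypothetical Resource
    owning_organization="Quality & Test Services",  # hypothetical Organization
    contact="sit-owner@example.com",                # hypothetical contact point
)
print(asdict(entry))
```

Keeping one such entry per IT Discipline in a single, enterprise-visible location is what gives the whole enterprise constant access to its points of accountability.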
Figure: How Ownership of the Capability System Integration Test Management fits into the Canonical Model for IT
The above figure helps us understand how Capability or Discipline Ownership fits into the Canonical Model for Information Technology (IT) (i.e. "Think," "Deliver," and "Operate"). Owners are assigned to individual Disciplines or Capabilities, such as System Integration Test Management, and are instantly made accountable to the enterprise for the results of all System Integration Test Management Thinking activities (i.e. Strategy, Research, Planning and Design), all System Integration Test Management Delivery activities (i.e. Construction, Deployment and Quality Assurance), and all System Integration Test Management Operations activities (i.e. Use, Maintenance and Support). Done correctly, System Integration Test Management Ownership is constant and ongoing. It's important to understand that such assigned Ownership should "never" end so that there is clear and constant accountability and transparency for all aspects of the Canonical Model to the enterprise.
Not having clear Ownership for System Integration Test Management means that there is no clear understanding of who is accountable for it, who can explain what's going on within it, who can describe the short-term and long-term work being performed within the Discipline area to improve it over time for its customers, and who can help with getting related work done. It means your enterprise's implementation of System Integration Test Management will be highly incomplete and erratic, because no one is constantly (or even partially) watching over the Discipline and its needs for maintenance and evolution. Not having clear System Integration Test Management Ownership is a recipe for confusion and, sometimes, even chaos.
In summary, if you and your enterprise truly want to be successful with your implementation of System Integration Test Management, ensure that a clear and highly accountable Owner is identified and assigned to the Discipline. Publish those Ownership details, preferably in an enterprise Service Catalog, and socialize them so everyone knows whom to go to for answers and for help with System Integration Test Management related work. In other words, if you want to implement System Integration Test Management as an enterprise Service, then you absolutely must start with clearly defined, published and socialized Ownership.
Throughout the Foundation's documentation, you will continuously run into references to "Nouns and Verbs." These concepts are key to consistency and standardization throughout the IT Industry, down to each and every IT Discipline. Given that we've discussed the impact of "Nouns" on the discipline of "System Integration Test Management," this section starts to discuss the importance of "Verbs" or "Actions" that can be performed with or against the key Noun or Nouns associated with this Discipline. To reiterate, Verbs or Actions allow us to clearly understand what can be performed on or with the Noun in question. As will be discussed in the next section, Verbs or Actions also help us clearly identify who it is (i.e. the "who" or, more specifically, the Roles) that performs or executes such Verbs or Actions against a Discipline and its associated Noun or Nouns. As will be discussed later, Verbs or Actions will also help identify key Attributes (i.e. Field Names) that are necessary for the data definition of the Noun or Nouns for this Discipline, and will even help identify which Verbs or Actions can be automated for this Discipline.
As a reminder, the base Noun for the discipline known as System Integration Test Management is "System Integration Test," which is sometimes referred to as the Noun "System Integration Test Item."
By now, it should be becoming apparent that verbs represent a baseline for defining solid functional requirements and sub-capabilities for any good System Integration Test Management System or Service. What this means is that if you and/or your Organization are looking for a solution in this space (e.g. purchasing or building a software solution, or implementing a Service to address the needs of System Integration Test Management), you can use discipline-related verbs to drive the foundation of what the solution should or shouldn't do, as mapped to the specific stakeholders that will use or provide the solution.
Examples of the types of Verbs or Actions that are important to this Discipline include but are not limited to:
The above list represents a very small subset of all Verbs or Actions that are relevant to this Discipline. The more complete set can be found in the Roles section of this document, where readers can see the direct correlation of Verb to Noun and to both Generic Role and Discipline-Specific Role.
An "action" or a "verb" is something that can be performed on or with a specific "noun." The reason it is important to itemize all relevant verbs is because we can now start to determine what we can or cannot do with the noun in question, where in this case the noun is "System Integration Test."
|Actions/Verbs||Example as Applied to "System Integration Test"||Generic Roles||Discipline-Specific Roles|
|Administrate||Administrate System Integration Test||Administrator||System Integration Test Administrator|
|Approve||Approve System Integration Test||Approver||System Integration Test Approver|
|Architect||Architect System Integration Test||Architect||System Integration Test Architect|
|Archive||Archive System Integration Test||Archiver||System Integration Test Archiver|
|Audit||Audit System Integration Test||Auditor||System Integration Test Auditor|
|Bundle||Bundle System Integration Test||Bundler||System Integration Test Bundler|
|Clone||Clone System Integration Test||Cloner||System Integration Test Cloner|
|Code||Code System Integration Test||Coder||System Integration Test Coder|
|Configure||Configure System Integration Test||Configurer||System Integration Test Configurer|
|Copy||Copy System Integration Test||Copier||System Integration Test Copier|
|Create||Create System Integration Test||Creator||System Integration Test Creator|
|Decommission||Decommission System Integration Test||Decommissioner||System Integration Test Decommissioner|
|Delete||Delete System Integration Test||Deleter||System Integration Test Deleter|
|Deploy||Deploy System Integration Test||Deployer||System Integration Test Deployer|
|Deprecate||Deprecate System Integration Test||Deprecator||System Integration Test Deprecator|
|Design||Design System Integration Test||Designer||System Integration Test Designer|
|Destroy||Destroy System Integration Test||Destroyer||System Integration Test Destroyer|
|Develop||Develop System Integration Test||Developer||System Integration Test Developer|
|Distribute||Distribute System Integration Test||Distributor||System Integration Test Distributor|
|Download||Download System Integration Test||Downloader||System Integration Test Downloader|
|Edit||Edit System Integration Test||Editor||System Integration Test Editor|
|Educate||Educate System Integration Test||Educator||System Integration Test Educator|
|Export||Export System Integration Test||Exporter||System Integration Test Exporter|
|Govern||Govern System Integration Test||Governor||System Integration Test Governor|
|Import||Import System Integration Test||Importer||System Integration Test Importer|
|Initialize||Initialize System Integration Test||Initializer||System Integration Test Initializer|
|Install||Install System Integration Test||Installer||System Integration Test Installer|
|Instantiate||Instantiate System Integration Test||Instantiator||System Integration Test Instantiator|
|Integrate||Integrate System Integration Test||Integrator||System Integration Test Integrator|
|Manage||Manage System Integration Test||Manager||System Integration Test Manager|
|Merge||Merge System Integration Test||Merger||System Integration Test Merger|
|Modify||Modify System Integration Test||Modifier||System Integration Test Modifier|
|Move||Move System Integration Test||Mover||System Integration Test Mover|
|Own||Own System Integration Test||Owner||System Integration Test Owner|
|Package||Package System Integration Test||Packager||System Integration Test Packager|
|Persist||Persist System Integration Test||Persister||System Integration Test Persister|
|Plan||Plan System Integration Test||Planner||System Integration Test Planner|
|Purge||Purge System Integration Test||Purger||System Integration Test Purger|
|Receive||Receive System Integration Test||Receiver||System Integration Test Receiver|
|Record||Record System Integration Test||Recorder||System Integration Test Recorder|
|Recover||Recover System Integration Test||Recoverer||System Integration Test Recoverer|
|Register||Register System Integration Test||Registrar||System Integration Test Registrar|
|Reject||Reject System Integration Test||Rejecter||System Integration Test Rejecter|
|Relocate||Relocate System Integration Test||Relocator||System Integration Test Relocator|
|Remove||Remove System Integration Test||Remover||System Integration Test Remover|
|Replicate||Replicate System Integration Test||Replicator||System Integration Test Replicator|
|Report||Report System Integration Test||Reporter||System Integration Test Reporter|
|Request||Request System Integration Test||Requestor||System Integration Test Requestor|
|Restore||Restore System Integration Test||Restorer||System Integration Test Restorer|
|Review||Review System Integration Test||Reviewer||System Integration Test Reviewer|
|Save||Save System Integration Test||Saver||System Integration Test Saver|
|Search||Search System Integration Test||Searcher||System Integration Test Searcher|
|Split||Split System Integration Test||Splitter||System Integration Test Splitter|
|Sponsor||Sponsor System Integration Test||Sponsor||System Integration Test Sponsor|
|Store||Store System Integration Test||Storer||System Integration Test Storer|
|Strategize||Strategize System Integration Test (or Set System Integration Test Strategy)||Strategizer (or Strategy Setter)||System Integration Test Strategizer (or System Integration Test Strategy Setter)|
|Support||Support System Integration Test||Supporter||System Integration Test Supporter|
|Test||Test System Integration Test||Tester||System Integration Test Tester|
|Train||Train System Integration Test||Trainer||System Integration Test Trainer|
|Upgrade||Upgrade System Integration Test||Upgrader||System Integration Test Upgrader|
|Upload||Upload System Integration Test||Uploader||System Integration Test Uploader|
|Verify||Verify System Integration Test||Verifier||System Integration Test Verifier|
|Version||Version System Integration Test||Versioner||System Integration Test Versioner|
|View||View System Integration Test||Viewer||System Integration Test Viewer|
At a minimum, the above list of Verbs can be used to help identify, track, and manage the basic "Features" required by and associated with System Integration Test Management, even if your enterprise doesn't maintain a Capability Model that lists specific System Integration Test Management Capabilities. Application designers, developers, and architects often find such Verb Lists or Feature Inventories to be invaluable.
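The Verb-to-Role pattern in the table above is mechanical enough to automate. The sketch below derives Generic and Discipline-Specific Role names from a Verb; the suffix rules are rough English heuristics for illustration (they will not cover every verb correctly), with irregular agent nouns handled through an explicit override map.

```python
# Generate (Generic Role, Discipline-Specific Role) pairs from a Verb list.
NOUN = "System Integration Test"
VOWELS = "aeiou"

# Overrides for verbs whose English agent noun is irregular.
IRREGULAR = {"Register": "Registrar", "Sponsor": "Sponsor",
             "Audit": "Auditor", "Govern": "Governor", "Plan": "Planner"}

def agent_noun(verb):
    """Derive a rough agent noun: 'Deploy' -> 'Deployer', 'Create' -> 'Creator'."""
    if verb in IRREGULAR:
        return IRREGULAR[verb]
    if verb.endswith("ate"):
        return verb[:-1] + "or"       # Administrate -> Administrator
    if verb.endswith("y") and verb[-2].lower() not in VOWELS:
        return verb[:-1] + "ier"      # Copy -> Copier, Modify -> Modifier
    if verb.endswith("e"):
        return verb + "r"             # Archive -> Archiver
    return verb + "er"                # Deploy -> Deployer

def roles(verb):
    """Return the (Generic Role, Discipline-Specific Role) pair for a Verb."""
    generic = agent_noun(verb)
    return generic, f"{NOUN} {generic}"

print(roles("Deploy"))  # ('Deployer', 'System Integration Test Deployer')
```

Generating the role inventory this way keeps the Verb list as the single source of truth, so adding one Verb automatically yields its matching Feature and Role entries.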
A Taxonomy, in its noun form, is defined as:
...a documented and orderly set of types, classifications, categorizations and/or principles that are often achieved through mechanisms including but not limited to naming, defining and/or the grouping of attributes, and which ultimately help to describe, differentiate, identify, arrange and provide contextual relationships between the entities for which the Taxonomy exists.
From this general definition, we can derive that the definition for a System Integration Test Management Taxonomy is:
...a documented and orderly set of types, classifications, categorizations and/or principles that are often achieved through mechanisms including but not limited to naming, defining and/or the grouping of attributes, and which ultimately help to describe, differentiate, identify, arrange and provide contextual relationships between System Integration Test Items, Entities or Types.
In short, what this all means is that a Taxonomy is nothing more than a classification or typing mechanism, and that a System Integration Test Taxonomy is nothing more than a classification or typing mechanism that helps people and systems distinguish between different System Integration Test Items, Entities, Types, Records or any other System Integration Test Management element you can think of.
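A classification mechanism of this kind can be sketched as a nested mapping from categories to sub-categories and leaf terms. The category and leaf names below are illustrative assumptions, not an official taxonomy.

```python
# A minimal Taxonomy sketch: nested categories flattened to classification paths.
SIT_TAXONOMY = {
    "System Integration Test Item": {
        "By Lifecycle Phase": ["Planned", "In Progress", "Complete"],
        "By Artifact Type": ["Test Plan", "Test Record", "Test Report"],
    },
}

def classification_paths(tree, prefix=()):
    """Flatten a taxonomy tree into full classification paths (tuples)."""
    paths = []
    for category, children in tree.items():
        node = prefix + (category,)
        if isinstance(children, dict):
            paths.extend(classification_paths(children, node))
        else:  # a list of leaf terms
            paths.extend(node + (leaf,) for leaf in children)
    return paths

for path in classification_paths(SIT_TAXONOMY):
    print(" > ".join(path))
```

Each emitted path is one way of distinguishing a System Integration Test Item from its peers, which is all a Taxonomy, at its simplest, needs to do.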
It's important to understand that Taxonomies can be as simple as a list of relevant terms or phrases with their respective meanings or definitions, or they can take on more complex forms, such as hierarchical and graphical model structures that can be homogeneous or heterogeneous in nature. More complex Taxonomies include examples such as "Visual Taxonomies" and "Audible Taxonomies" but, except in the case of very special technologies, these are typically out of scope for general Information Technology (IT) Operations.
The Foundation directs readers to its ever-evolving Inventory of Taxonomies for Standard Taxonomy suggestions. Specifically, readers may want to start with the Taxonomy of Taxonomies, which helps make it clear that the IT Industry is composed of many hundreds if not thousands of Taxonomies, Classifications, Categorizations or Types.
While Taxonomies represent organized classifications or types, you can think of Ontologies as the design and representation of entire languages, with the specific intent to control things like structure, behavior, representation, and meaning. Without getting into a theoretical conversation about Ontologies, you can view this entire article as a foundation for the ontology of System Integration Test Management. Or, in other words, a System Integration Test Management Ontology.
Throughout this artifact/framework, you will find things like System Integration Test Management related terms, phrases, definitions, roles, responsibilities, nouns, verbs, classifications, and so on, all as a means of defining a standard representation for and interpretation of the language of System Integration Test Management.
It is only through the definition, communication, and establishment of such Ontologies that we can standardize language and communication associated with System Integration Test Management, whether it be between humans and/or systems.
When we talk about Life Cycle (or lifecycle) for System Integration Test Management, it's important to keep in mind that there are two different types of Life Cycles that apply. The first is a Data Life Cycle, which addresses System Integration Test Management data or entities, and the second is associated with delivering System Integration Test Management Assets like Systems or Software solutions.
System Integration Test Management Data Life Cycle Phases:
Data Lifecycle (or Life Cycle) for any and all data is the period from the "inception" of data through to its ultimately being "purged" from existence. This is no different for System Integration Test Management related data.
Like the data associated with any other professional IT Discipline, System Integration Test Management related data adheres to the following common Data Lifecycle Phases:
Figure: System Integration Test Management Lifecycle Phases
The above Life Cycle Phases represent the high level transitions that occur from the inception of System Integration Test Items or Entities all the way through to their complete elimination from existence. A more detailed breakdown of these transitions or phases represents what are referred to as "System Integration Test Management States."
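The progression from inception through elimination can be sketched in code as a set of phases with allowed transitions between them. Note that the phase names below are assumptions made for illustration; the canonical set of phases is the one defined by the figure above:

```python
from enum import Enum

class LifecyclePhase(Enum):
    """Illustrative Data Lifecycle Phases; names are assumptions, not a standard."""
    CREATED = "Created"
    ACTIVE = "Active"
    ARCHIVED = "Archived"
    PURGED = "Purged"

# Allowed high-level transitions, from inception through elimination.
ALLOWED_TRANSITIONS = {
    LifecyclePhase.CREATED: {LifecyclePhase.ACTIVE, LifecyclePhase.PURGED},
    LifecyclePhase.ACTIVE: {LifecyclePhase.ARCHIVED, LifecyclePhase.PURGED},
    LifecyclePhase.ARCHIVED: {LifecyclePhase.ACTIVE, LifecyclePhase.PURGED},
    LifecyclePhase.PURGED: set(),  # Purged data is gone; no further transitions.
}

def can_transition(current, target):
    """Return True if a Test Item may move from `current` to `target`."""
    return target in ALLOWED_TRANSITIONS[current]
```

A finer-grained breakdown of these transitions is what the text above refers to as "System Integration Test Management States."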
System Integration Test Management Systems Development Life Cycle (SDLC) Phases or System Integration Test Management Software Development Life Cycle (SDLC) Phases:
The SDLC is a means for facilitating and controlling how IT Professionals deliver Assets, such as System Integration Test Management Systems and Software. In this case, you should default to the master SDLC, which is used to deliver any Asset of any type, including those associated with the System Integration Test Management discipline.
There are probably no greater or more important tools for providing System Integration Test Management transparency and direction than the collection, ordering, categorizing, grouping, and maintenance of all related System Integration Test Items. In other words, System Integration Test Management Inventories.
In short, an Inventory represents a list of individual things or instances of things that are typically all of the same Noun Type or Data Type, where these instances are described and detailed by their Attributes, along with the Data and Information that act as values for such Attributes.
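In code, an Inventory of this kind reduces to a list of instances of one Noun Type, each described by Attribute name/value pairs. The following is a minimal sketch; the attribute names and sample records are invented for illustration, not a published schema:

```python
# A minimal Inventory: instances of one Noun Type ("System Integration Test"),
# each described by Attributes and the Data that serves as attribute values.
# (Attribute names and records are illustrative, not a published schema.)
inventory = [
    {"id": "SIT-001", "name": "Order-to-Billing Interface Test", "environment": "QA", "status": "Active"},
    {"id": "SIT-002", "name": "Inventory Sync Test", "environment": "Production", "status": "Archived"},
    {"id": "SIT-003", "name": "Payment Gateway Test", "environment": "QA", "status": "Active"},
]

def filter_inventory(items, **attributes):
    """Return all instances whose attribute values match the given criteria."""
    return [
        item for item in items
        if all(item.get(k) == v for k, v in attributes.items())
    ]
```

A query such as `filter_inventory(inventory, environment="QA", status="Active")` is the kind of basic lookup that underpins the Configuration Management, reporting, and TCO uses discussed below.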
At a minimum, System Integration Test Management Inventories are used for the establishment of solid System Integration Test Configuration Management practices, as the System Integration Test Instances tracked within such System Integration Test Inventories act as Configuration Items (in Target and/or Dependency form) for key Configurations (System Integration Test Management Configurations or otherwise).
Inventories are also used for solid decision making. Good decisions, either strategic or tactical, are made based on having good Data and Information. And, good Data and Information only come from taking the time to follow best practices associated with Inventory Management. It's only through building such Inventories that an enterprise can achieve solid System Integration Test Management Business Intelligence and Reporting.
Also, it's these very same Inventories that act as the foundation for understanding and managing Total Cost of Ownership (a.k.a. "TCO") for System Integration Test Management. Without such Inventories, trying to understand your costs can be nothing more than uneducated guessing.
The obvious place to start is with System Integration Test Inventories and then move on to surrounding Inventories that are directly and indirectly related to System Integration Test Management.
Additionally, there are many other types of Inventories that are common and important to System Integration Test Management, which include but are not limited to examples such as:
If you and/or your enterprise are not collecting and maintaining such Inventories, you are probably sitting very low on the efficiency and effectiveness maturity scale.
It's important to keep in mind that collecting and managing System Integration Test Management Inventories is something that should be performed across all phases of the System Integration Test Management Lifecycle and across all Environments (i.e. System Integration Test Management Environments). Both are considered to be very important Best Practices. For example, you and/or your enterprise cannot get a complete understanding of System Integration Test Management costs or impacts without knowing all related Inventory Items in all environments. And, tracking across all lifecycle phases gives a temporal perspective that is important for things like problem analysis, historical reporting, and the reconstruction of state (i.e. Configuration Management).
NOTE: System Integration Test Management Inventories are also important for other enterprise functions, such as Architecture and Design. Such Inventories represent the foundation for understanding an enterprise's Current State and are critical for planning Future State and any related strategies, roadmaps, and transition plans for facilitating change.
Building environments that are specific to and for the discipline known as System Integration Test Management is no different than doing so for any other discipline area. The reader should, therefore, refer to the IT Environment Framework to understand such environments.
As with any professional Discipline, the place to start when dealing with System Integration Test Management specific metrics is standard metrics categorizations. Standard Metrics Categorizations, or what are commonly referred to as "SMCs," include but are not limited to...
System Integration Test Management Quantitative Metrics: Quantitative metrics for System Integration Test Management often revolve around the "counting" of key constructs that are associated with the Discipline. For example, the number of System Integration Test Items or Entities that have been Created, Edited or Modified, Copied or Cloned, Destroyed, Archived, Restored, etc. (Note the correlations to key System Integration Test Management Verbs!). Also, the counts for things like the number of System Integration Test Management Stakeholders, such as but not limited to Paying Customers, End Users, Employees, Consultants, etc. are also very useful.
System Integration Test Management Qualitative Metrics: Qualitative metrics for System Integration Test Management often revolve around concepts such as System Integration Test Management Defects, Failures, Problems, Incidents, and/or Issues. So, for example, if we were to capture the number of System Integration Test Management Defects (i.e. their counts) over time, we could do things like see if Defect quantities are going up or down, over time, allowing us to explore that area for things like correlating Causes and Effects.
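The defect-trend analysis described above can be sketched as a simple period-over-period comparison. The counts below are fabricated sample data, used purely to illustrate the computation:

```python
# Defect counts per reporting period (fabricated sample data).
defect_counts = {
    "2023-Q1": 42,
    "2023-Q2": 37,
    "2023-Q3": 29,
    "2023-Q4": 31,
}

def defect_trend(counts):
    """Return the period-over-period change in defect counts, in period order."""
    periods = list(counts)
    return {
        periods[i]: counts[periods[i]] - counts[periods[i - 1]]
        for i in range(1, len(periods))
    }
```

A negative value indicates Defect quantities are falling for that period; a sustained rise is the signal that invites the Cause-and-Effect exploration mentioned above.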
System Integration Test Management Time Metrics: When dealing with System Integration Test Management Time Metrics, there are usually two forms. The first was introduced in the previous paragraph, which has to do with capturing and measuring things like Quantitative or Qualitative Metrics, over time. In this case, we capture other metric categories, over time, with the intent to see how they change and perform, based on modifications to the System Integration Test Management Operating Environment. The second form of Time related metrics has to do with system or operational performance, such as in the case of how long it takes to process a System Integration Test Management Request, from the time it is created to the time the Requester gets a satisfactory deliverable that allows him or her to move on with his or her work.
System Integration Test Management Utilization Metrics: Utilization Metrics specifically have to do with the consumption of System Integration Test Management specific solutions or deliverables. For example, tracking the number of System Integration Test Management Service Requests, over periods of time, along with their corresponding System Integration Test Management Deliverables, allows one to measure how active System Integration Test Management Services are against other Services that may exist within the Enterprise.
System Integration Test Management Financial Metrics: As is always the case for any single Discipline, Financial Metrics for System Integration Test Management always revolve around things like revenue, expenses, and profits, both, for operators of the Service or Services and for consumers of the Service or Services. For example, if a System Integration Test Management Request is invoked by a System Integration Test Management Customer (acting as the "Requester"), it becomes important to be able to identify and understand what the cost is to that Customer who is invoking the Request, and it also becomes important to understand why that cost is what it is. In the case of Services that do not yield revenue or profits, measuring costs is a strong way to, at very least, help understand the costs associated with each Service being performed by, within, external to, and for the Enterprise and its Customers.
Note: It's important to understand that, when it comes to metrics, enterprises should take a "Crawl," "Walk," "Run" approach to collecting, working with, and understanding them. That is, you cannot get to complex metrics collection, dissection, analysis, and understanding until you start with basic metrics and slowly work your way to more complex metrics representations.
One of the most important concepts you will learn about System Integration Test Management (or any Discipline, for that matter) is the notion of implementing the Discipline as an accountable, planned, controlled, transparent, and managed "Service."
In short, Services represent logically "bounded" and repeatable sets of work types, activities or tasks that are performed by humans and/or machines, with the specific intent to provide outputs or deliverables, in the form of solutions for the requesting Stakeholders who are commonly considered the customers of such Services. In other words, we perform and/or provide a Service to deliver very specific solutions to very specific Stakeholders who are looking for a means to solve a certain problem they have.
A System Integration Test Management Service is defined as:
"1. A set of solutions, either transactional (i.e. Transactional System Integration Test Management Services) or dial-tone (i.e. Dial-Tone System Integration Test Management Services), that are being or have been put in place to yield an intended, controlled, expected, repeatable and measurable set of results or deliverables for System Integration Test Management specific Customers, Consumers or Clients.
NOTE: System Integration Test Management Service Consumers or Clients can be either Human Resources or Systems."
All Services, including System Integration Test Management Services, can be performed manually (i.e. by people), automatically (i.e. by machines such as Computers), or by a combination of the two (i.e. a hybrid that is both manual and automated).
Also, all Services, including System Integration Test Management Services, can be either transactional or dial-tone in nature.
In the case of Transactional Services for System Integration Test Management, a Service Request is submitted and that Request is fulfilled as part of a process that is either manual, automated, or a hybrid of both (e.g. a Service to perform maintenance on your System Integration Test Management System).
In the case of Dial Tone Services for System Integration Test Management, a Service is expected to be up, running, available, and accessible to an End User so that he/she/it may perform some controlled and highly repeatable function (e.g. a "System Integration Test Management System" that is up and running all the time).
System Integration Test Management Service Components: The successful implementation of System Integration Test Management as a set of Services for your enterprise usually implies that a number of key components have been established to support it. These components are:
System Integration Test Management Ownership: The most important thing to understand about a System Integration Test Management Service is that, in order for such a Service to be successful, there must be a clear and accountable Owner for it. That is, there needs to be a very clear and accountable named person or organization that owns and is fully responsible for the Service, all of its sub-Services and, most importantly, all of the Service's "Outcomes." Without clear ownership, Services are almost never successful. And, for those few occasions where Services are successful without clear ownership, you can assume that they're successful because the people working in those Service areas are acting as heroes, or... those Services are just plain lucky (that kind of luck doesn't last for long).
System Integration Test Management Service Inputs: There are typically two types of inputs to any System Integration Test Management Service. The first is what is known as a "System Integration Test Management Service Request" and the second really represents any and all supporting artifacts that are necessary to support such requests, including but not limited to Data and Information in the form of Documents, either electronic or paper in form. Many would argue that the "money" to pay for the Service execution of the Request would be the third but, for now, we will assume that payment is controlled through the Data and Information provided to the Service Operators, in support of the Request.
System Integration Test Management Service Outputs: The outputs of any Service are often referred to as the Service's Deliverables. Therefore, the readers should be aware that the terms "System Integration Test Management Outputs" and "System Integration Test Management Deliverables" are synonymous and interchangeable. All work performed in any enterprise is, by default, a Service that is being performed for someone else and, therefore, all work or Services yield results. These results are the Service's Outputs or Deliverables and a good Service ensures that such Outputs are appropriately documented to the consumers of said Service. This means that for any given System Integration Test Management Service Request Type or Category there will be one or more clearly defined and documented Outputs or Deliverables, making it clear to the consumer what he, she, or they will get in response to their Request. This can be as simple as an answer to a question or as complex as the Merger of two enterprises.
System Integration Test Management Service Levels: Service Levels represent "performance agreements," contractual or otherwise, that dictate how well a System Integration Test Management Service should perform, most often keeping the Customers, Consumers, Clients or End Users of the Service in mind. System Integration Test Management Service Levels can come in many forms and are often worked out by the Customers paying for the Services and the Service Providers who sell or provide the Services. In many cases, Service Levels are also self-imposed by the Service Providers performing the Services as a means to set expectations for Service Customers. In short, System Integration Test Management Service Levels are constraints, limitations, and/or expectations that are tied directly to System Integration Test Management Service Deliverables. They represent measures for things like quality, efficiency, and cost against said Deliverables or Outputs that allow the consumer of such Services to measure what they actually get against what they expected to get.
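As a sketch, measuring Deliverables against a Service Level reduces to comparing actual turnaround times with an agreed threshold. The four-hour target and the sample Requests below are assumptions made for illustration:

```python
from datetime import datetime, timedelta

# Agreed Service Level: a Request should be fulfilled within 4 hours.
# (The threshold and sample Requests are illustrative assumptions.)
SERVICE_LEVEL = timedelta(hours=4)

requests = [
    {"id": "REQ-1", "created": datetime(2024, 1, 5, 9, 0),
     "delivered": datetime(2024, 1, 5, 11, 30)},
    {"id": "REQ-2", "created": datetime(2024, 1, 5, 10, 0),
     "delivered": datetime(2024, 1, 5, 16, 0)},
]

def service_level_report(items, threshold):
    """Flag each fulfilled Request as within (True) or beyond (False) the Service Level."""
    return {
        item["id"]: (item["delivered"] - item["created"]) <= threshold
        for item in items
    }
```

Reports like this give both the Service Provider and the Service Customer an objective measure of what was actually delivered against what was expected.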
A "Principle" is defined as being: "A professed assumption, basis, tenet, doctrine, plan of action or code of conduct for activities, work or behavior." Therefore, we can deduce the definition of "a System Integration Test Management Principle" to be:
System Integration Test Management Principle: "1. A professed assumption, basis, tenet, doctrine, plan of action or code of conduct for any activities, work or behavior associated with the Discipline known as System Integration Test Management."
A "Best Practice" is defined as being: "One or more Activities, Actions, Tasks or Functions that often do not conform with strict Standards and that have evolved, over time, to be considered as conventional wisdom for consistently and repeated achieving Outcomes or Results that can be measured as being equal to or above acceptable norms." Therefore, we can deduce the definition of "a System Integration Test Management Best Practice" to be:
System Integration Test Management Best Practice: "1. One or more System Integration Test Management related Activities, Actions, Tasks or Functions that often do not conform with strict standards and that have evolved, over time, to be considered as conventional wisdom for consistently and repeatedly achieving Outcomes or Results that can be measured as being equal to or above acceptable norms."
The plural form of this term would be "System Integration Test Management Best Practices."
Common System Integration Test Management related principles and best practices exist to help achieve higher than average expectations of quality and to ease in the implementation, support, operations, and future change associated with the solutions industry professionals put in place to address the needs of this Discipline and all its related stakeholders.
While this entire document is meant to represent and serve as a set of common principles and best practices for System Integration Test Management, the following list represents a summary of some very basic examples of what implementers, supporters, and operators of System Integration Test Management should constantly be working toward:
|Principle or Best Practice||Description|
|Establish and always have very clear Ownership for System Integration Test Management.||Establishing, publishing and socializing clear Ownership for System Integration Test Management allows an enterprise and all its Resources, regardless of their geographic location, to assign accountability for all aspects of the Discipline. It also ensures that there's always at least one person that everyone can go to for transparency into the Discipline as well as for handling work that is associated with the Discipline.|
|Always use standard terminology for System Integration Test Management, in order to standardize communications between stakeholders.||It is often argued that the biggest mistake you can make is to create your own words and/or your own definitions when communicating with others. There is no place where this is more accurate than in the field of Information Technology. IT Stakeholders make up their own words and definitions far too often, or let their business constituents do so. When you make up words or definitions, or you let others do so, you're doing your organization a grave injustice. Self-invented terminology and grammar often lead to poor communication, which in turn leads to redundancy of solutions, higher complexity of environments, slower delivery times, and much higher costs. Therefore, the IF4IT always recommends that you leverage standard terminology for System Integration Test Management, whenever possible.|
|Centralization of System Integration Test related data.||While it is often impossible to centralize and collocate all System Integration Test related data and information, especially in a geographically dispersed environment, System Integration Test Management related stakeholders should always strive to centralize all data and information. The goals are to eliminate data fragmentation, strengthen the source of truth for data, reduce the number of systems needed to support stakeholders, reduce the complexity of solutions, improve usability, and ultimately reduce the costs associated with System Integration Test Management.|
|Clearly define, implement, track, and analyze System Integration Test Management Metrics.||In order to successfully set up the discipline of System Integration Test Management and its related Services, it is critical to clearly define, track, and constantly analyze System Integration Test Management metrics. Such metrics include but are not limited to Supply and Demand Metrics (i.e. Operational Metrics), Performance Metrics, Quality Metrics, and Financial Metrics.|
|Transparency of System Integration Test related data.||Stakeholders should always strive to make any and all System Integration Test Management data transparent to all other appropriate stakeholders, at a minimum, and often to the entire enterprise. The exception is when private user data must be protected. Many stakeholders make the mistake of treating internal operational data as private or protected. This often creates a data silo and will often lead to internally silo-ed organizations that revolve around such data silos.|
|Do not let "perfection" of System Integration Test Management solutions stand in the way of "good enough solutions".||Often, System Integration Test Management stakeholders "overthink" solutions, leading to the impression that best-of-breed or perfect solutions are more effective than "good enough" solutions. Experience tells us that "good enough" is, almost always, the better path to follow. We live in an age where technologies grow old in the blink of an eye. Even the implementation of something that looks perfect, today, will look antiquated, tomorrow. This is especially true if your enterprise doesn't have a long term funding plan and commitment to improvements and upgrades of the solution(s) put in place.|
|Follow industry Standards, Best Practices, and Guiding Principles for System Integration Test Management, whenever possible.||One of the most common errors many enterprises make is to create solutions from scratch or without the guidance, assistance and/or experience of others who have created such solutions before them. Whenever possible, the IF4IT recommends that you research existing Standards, Best Practices, and Guiding Principles to avoid the mistakes of others, while also gaining from their successes. Remember, we live in a vast world. Chances are very high that someone else has already experienced the pain you're about to create for yourself. Wise people will always look to learn from such people's experiences before they go down the road of implementing their own solutions.|
|Work toward and maintain a Single Source of Truth (SSoT), whenever possible.||While it may be impossible to truly maintain a Single Source of Truth (SSoT) for all data items at all times, especially in the case where the same data entity or instance enters an enterprise through unique data channels, it is an accepted, industry-wide best practice to always work toward such a goal.|
The Information Technology (IT) Learning Framework. A tutorial that helps readers understand Information Technology and how disciplines, such as this one, fit into the bigger picture of IT Operations.