
December 13, 2023

Reinventing Software Asset Inventory

By Stephen Magill, Michael Edenzon, Rakesh Bantu, Charles Betz

This post is an adapted excerpt from the DevOps Enterprise Forum paper “Reinventing Software Asset Inventory: A Modern Approach to Maintaining Evidence and Relationships between Software Assets.”


As digital systems expand in scope, mission-criticality, and reach, organizations find themselves challenged to understand and account for them. The digital estate presents an information management problem that has increased dramatically in scale and complexity since the early days of computing. Digital systems support financial, supply chain, and even healthcare outcomes, among many other uses. Understanding these resources, their configurations, and their interplay is essential to sustaining their smooth and secure operation.

Yet too many organizations struggle with manual, ad hoc procedures that yield inconsistent and sometimes inaccurate information, lacking the precision needed to support operational objectives. Imprecise, inconsistent, and inaccurate inventory data (sometimes called “metadata” in the context of digital systems, as this is “data about the data processing infrastructure”) will stunt the efficacy of advancements such as automated governance and impede an organization’s ability to respond to threats such as Log4j. Inventory must be driven by objective evidence, captured through repeatable procedures, and maintained in real time.

Drawing from the principles of automated governance, we propose a methodology and approach to maintaining software asset inventory that allows granular, reproducible metadata to be captured and updated in real time throughout the software life cycle.

Goals

This paper will outline a methodology for achieving an asset inventory that supports current and future generations of software architecture and development practices and is capable of scaling to meet the requirements of large organizations. Furthermore, we will present solutions that leverage existing systems of record, providing a smooth path to a more mature inventory practice.

As digital transformation intensifies, the challenge of managing information in large-scale digital organizations is increasingly critical. Historically, organizations have grappled with information management since the onset of widespread computer usage—from mainframe computing to the present era of digital proliferation. The Configuration Management Database (CMDB) was conceived as a solution to this issue, offering a consolidated “database” for tracking and managing IT resources, but it has suffered many failures and currently has a troubled reputation.

However, the CMDB is not the issue in and of itself. It is a manifestation of the broader underlying problem: the need for efficient, effective, and scalable information management in the complex digital landscapes of modern enterprises.

Today’s challenge is twofold:

  1. the number of resources in an organization’s digital estate has multiplied exponentially, and
  2. simultaneously, the economic and social dependencies on those resources and the services they make up, and therefore the associated risk, have grown substantially.

What’s needed is not an evolution of the CMDB, nor another “silver bullet” to replace it, but a new approach to managing software asset inventory, the relationships between software assets, and the evidence upon which these relationships are based. This new approach prioritizes information management best practices, including transparency, objectivity, and repeatability to create a rich ledger of evidence and metadata surrounding the assets in a software ecosystem. It provides sufficient insight into an organization’s asset inventory to embrace automated governance and ensure regulatory compliance.

Problems/Challenges to Solve

Currently, large organizations use various repositories to manage their digital estate (software, hardware, and higher-level groupings generally termed products, applications, or services). One of the most important has been the CMDB. The CMDB and its associated practices face several challenges:

  • Data accuracy and collection methods
  • Visibility
  • Precision of policy

Minimum Viable Solution

We claim that a proper methodology can address the lion’s share of the three underlying problems of current asset inventory. The following principles address the fundamental requirements of a minimum viable solution while providing a foundation for future enhancements and the progression toward more advanced inventory management.

1) Transparency and Repeatability

An acceptable methodology for evidence collection must be transparent and repeatable. It is important to note that automation, while highly desirable, is not a requirement.

2) Event-Driven (or Strive to Be)

Inventory should change when a change occurs: if an asset is created or its properties change, the inventory should be updated accordingly. The legacy method of periodically attesting to the accuracy of information cannot keep up with the pace of change in the modern software landscape. An asset should be tracked in inventory based on its activity, not manual input; activities such as the creation of a new repository or the compilation and publishing of a binary warrant additions or changes to the inventory.

Adherence to this principle ensures that any changes to an asset’s properties are promptly and accurately reflected in the inventory. This approach allows for real-time visibility and monitoring of software assets, enabling organizations to make informed decisions based on the most current information available.

Transgression of this principle could result in outdated or inaccurate information remaining in the system, leading to incorrect policy enforcement, increased risk exposure, and potential non-compliance with regulatory requirements. The lack of real-time updates also hampers the organization’s ability to respond to emerging threats and vulnerabilities in a timely manner.
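
To illustrate how an event-driven inventory update might look in practice, consider the minimal sketch below. It is not part of the paper’s methodology; the event names (`repository.created`, `artifact.published`), the `handle_event` function, and the in-memory store are hypothetical stand-ins for whatever event sources and system of record an organization already has.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical in-memory stand-in for a real inventory system of record:
# asset_id -> inventory record.
INVENTORY = {}


@dataclass
class AssetEvent:
    """An observed activity that warrants an inventory change."""
    kind: str       # e.g. "repository.created", "artifact.published"
    asset_id: str   # stable identifier for the asset
    properties: dict = field(default_factory=dict)
    observed_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )


def handle_event(event: AssetEvent) -> None:
    """Update the inventory the moment an activity is observed,
    rather than waiting for a periodic attestation cycle."""
    record = INVENTORY.setdefault(event.asset_id, {"history": []})
    record.update(event.properties)
    # Tie every change to the evidence (the event) that caused it.
    record["history"].append({"event": event.kind, "at": event.observed_at})


# Example: a CI pipeline publishes a binary, so the inventory changes immediately.
handle_event(AssetEvent(
    kind="artifact.published",
    asset_id="payments-service",
    properties={"version": "2.4.1", "registry": "internal"},
))
```

In this sketch, the inventory is never edited by hand; it changes only as a side effect of observed activity, which is the heart of the event-driven principle.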

3) Objective Observations, Objective Conclusions

Properties that define an asset should be grounded in objectivity, deriving from empirical observations that, given similar conditions, can be replicated. This objectivity must be reflected not only in the property values but also in the conclusions drawn from those values. Adherence to this principle ensures that the information used to describe and understand an asset is continually up-to-date, reproducible, and open to full auditability. It fosters a high level of trust in the data presented and the decisions made based on that data.

Transgression of this principle, however, can introduce inaccuracies and inconsistencies in the recorded properties of an asset, undermining confidence in the inventory information and potentially leading to suboptimal decisions and policy enforcement. Furthermore, it can make the inventory susceptible to challenges from audit or regulatory bodies due to the lack of a reproducible, objective trail of evidence.
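
As one hedged illustration of an objectively derived property, the sketch below records a binary’s cryptographic digest and size rather than a manually entered description: any observer of the same file, under the same conditions, will reproduce the same values. The function name, property keys, and example path are illustrative assumptions, not something prescribed by the paper.

```python
import hashlib
from pathlib import Path


def observe_artifact(path: str) -> dict:
    """Derive asset properties from an empirical, repeatable observation.

    Given the same file, any party re-running this procedure reaches the
    same conclusion, keeping the evidence trail reproducible and auditable.
    """
    data = Path(path).read_bytes()
    return {
        "sha256": hashlib.sha256(data).hexdigest(),  # reproducible identity
        "size_bytes": len(data),                     # directly observable fact
    }


# Hypothetical usage: two independent observers of the same artifact
# record identical properties.
# properties = observe_artifact("dist/payments-service-2.4.1.jar")
```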

4) Subjective Application Boundaries

Whereas evidence and conclusions warrant objectivity, there is an essential subjectivity to setting application boundaries; as Martin Fowler notes, “applications are social constructions.”

5) Dynamic and Real Time

The solution should focus on providing dynamic, real-time insights into the organization’s software assets, enabling rapid response to changes in the software ecosystem. This involves adopting an event-driven architecture that updates the software asset inventory in real time based on observed changes, ensuring that the inventory remains accurate and up to date at all times.
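
A minimal sketch of one way to support this, under assumed names rather than any reference design, is an append-only ledger of observed events from which the current state of an asset is derived on demand; the `EvidenceLedger` class below is purely illustrative.

```python
from collections import defaultdict


class EvidenceLedger:
    """Append-only evidence; current state is derived, never hand-edited."""

    def __init__(self):
        # asset_id -> ordered list of observed events
        self._events = defaultdict(list)

    def append(self, asset_id: str, event: dict) -> None:
        # Evidence is only ever added, preserving a complete audit trail.
        self._events[asset_id].append(event)

    def current_state(self, asset_id: str) -> dict:
        # Replay events in order so the view always reflects the latest
        # observed change rather than a periodic attestation.
        state = {}
        for event in self._events[asset_id]:
            state.update(event.get("properties", {}))
        return state


ledger = EvidenceLedger()
ledger.append("payments-service",
              {"kind": "repository.created", "properties": {"scm": "git"}})
ledger.append("payments-service",
              {"kind": "artifact.published", "properties": {"version": "2.4.1"}})
print(ledger.current_state("payments-service"))
# -> {'scm': 'git', 'version': '2.4.1'}
```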


Read the full paper in the Fall 2023 DevOps Enterprise Journal.

About the Authors

Stephen Magill

Vice President, Product Innovation at Sonatype


Michael Edenzon

Michael Edenzon is a senior IT leader and engineer who modernizes and disrupts the technical landscape for highly regulated organizations. Michael provides technical design, decisioning, and solutioning across complex verticals and leverages continuous learning practices to drive organizational change. He is a fervent advocate for the developer experience and believes that enablement-focused automation is the key to building compliant software at scale.


Charles Betz

Research director, analyst, architect, author. I talk to a lot of people about how digital and IT organizations operate at scale.

