December 13, 2023
This post is an adapted excerpt from the DevOps Enterprise Forum paper “Reinventing Software Asset Inventory: A Modern Approach to Maintaining Evidence and Relationships between Software Assets.”
As digital systems expand in scope, mission-criticality, and reach, organizations find themselves challenged to understand and account for them. The digital estate presents an information management problem that has increased dramatically in scale and complexity since the early days of computing. Digital systems support financial, supply chain, and even healthcare outcomes, among many other uses. Understanding the resources and their configurations and interplay is essential to sustaining their smooth and secure operation.
Yet too many organizations struggle with manual, ad hoc procedures that yield inconsistent and sometimes inaccurate information that cannot achieve the precision necessary to support operational objectives. Imprecise, inconsistent, and inaccurate inventory data (sometimes called “metadata” in the context of digital systems, as this is “data about the data processing infrastructure”) will stunt the efficacy of advancements such as automated governance and impede an organization’s ability to respond to threats such as Log4j. Inventory must be driven by objective evidence, captured through repeatable procedures, and maintained in real time.
Drawing from the principles of automated governance, we propose a methodology and approach to maintaining software asset inventory that allows for granular and reproducible metadata to be captured and updated in real time throughout the software life cycle.
This paper will outline a methodology for achieving an asset inventory that supports current and future generations of software architecture and development practices and is capable of scaling to meet the requirements of large organizations. Furthermore, we will present solutions that leverage existing systems of record, providing a smooth path to a more mature inventory practice.
As digital transformation intensifies, the challenge of managing information in large-scale digital organizations is increasingly critical. Historically, organizations have grappled with information management since the onset of widespread computer usage—from mainframe computing to the present era of digital proliferation. The Configuration Management Database (CMDB) was conceived as a solution to this issue, offering a consolidated “database” for tracking and managing IT resources, but it has suffered many failures and currently has a troubled reputation.
However, the CMDB is not the issue in and of itself. It is a manifestation of the broader underlying problem: the need for efficient, effective, and scalable information management in the complex digital landscapes of modern enterprises.
Today’s challenge is twofold:
What’s needed is not an evolution of the CMDB, nor another “silver bullet” to replace it, but a new approach to managing software asset inventory, the relationships between software assets, and the evidence upon which these relationships are based. This new approach prioritizes information management best practices, including transparency, objectivity, and repeatability, to create a rich ledger of evidence and metadata surrounding the assets in a software ecosystem. It provides sufficient insight into an organization’s asset inventory to embrace automated governance and ensure regulatory compliance.
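To make the shape of such a ledger concrete, here is a minimal sketch in Python. The type and field names (`Asset`, `EvidenceRecord`, `collected_by`, and so on) are illustrative assumptions, not a prescribed schema:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass(frozen=True)
class EvidenceRecord:
    """One objective, reproducible observation about an asset."""
    property_name: str    # e.g., "artifact_digest"
    property_value: str   # e.g., a SHA-256 hex digest
    collected_by: str     # the procedure or tool that produced the value
    collected_at: datetime

@dataclass
class Asset:
    """An inventory entry backed by an append-only evidence ledger."""
    asset_id: str
    asset_type: str       # e.g., "repository", "binary", "service"
    evidence: list[EvidenceRecord] = field(default_factory=list)

    def record(self, entry: EvidenceRecord) -> None:
        # Evidence is only ever appended, preserving a full audit trail.
        self.evidence.append(entry)

    def current(self, property_name: str) -> str | None:
        # The latest observation wins; the history remains for auditors.
        matches = [e for e in self.evidence if e.property_name == property_name]
        return matches[-1].property_value if matches else None
```

The design choice that matters here is the append-only ledger: overwriting property values in place would satisfy day-to-day queries but would leave automated governance and audit functions with nothing to verify.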
Currently, large organizations use various repositories to manage their digital estate (software, hardware, and higher-level groupings generally termed products, applications, or services). One of the most important has been the CMDB. The CMDB and its associated practices have various challenges:
We claim that a proper methodology can address the lion’s share of the three underlying problems of current asset inventory. The following principles address the fundamental requirements of a minimum viable solution and simultaneously provide a foundation for future enhancements and the progression toward more advanced inventory management.
An acceptable methodology for evidence collection must be transparent and repeatable. It is important to note that automation, while most desirable, is not a requirement.
Inventory should change when a change occurs. When an asset is created or its properties change, the inventory should be updated accordingly. The legacy method of periodically attesting to the accuracy of information is insufficient and cannot keep up with the pace of change in the modern software landscape. An asset should be tracked in inventory based on its activity, not manual input: activities such as the creation of a new repository or the compilation and publishing of a binary warrant additions or changes to the inventory.
Adherence to this principle ensures that any changes to an asset’s properties are promptly and accurately reflected in the inventory. This approach allows for real-time visibility and monitoring of software assets, enabling organizations to make informed decisions based on the most current information available.
Transgression of this principle could result in outdated or inaccurate information remaining in the system, leading to incorrect policy enforcement, increased risk exposure, and potential non-compliance with regulatory requirements. The lack of real-time updates also hampers the organization’s ability to respond to emerging threats and vulnerabilities in a timely manner.
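As an illustration of activity-based tracking, the sketch below translates hypothetical lifecycle events into inventory updates, reusing the `Asset` and `EvidenceRecord` types sketched above. The event names and payload fields (`repository.created`, `binary.published`, `digest`, `pipeline`) are assumptions standing in for whatever your source-control and CI/CD platforms actually emit:

```python
from datetime import datetime, timezone

def apply_event(inventory: dict[str, Asset], event: dict) -> None:
    """Translate an observed lifecycle activity into an inventory change."""
    asset_id = event["asset_id"]
    if event["type"] == "repository.created":
        # A new asset enters inventory because it began to exist,
        # not because someone filled in a form.
        inventory[asset_id] = Asset(asset_id=asset_id, asset_type="repository")
    elif event["type"] == "binary.published":
        asset = inventory.setdefault(asset_id, Asset(asset_id, "binary"))
        asset.record(EvidenceRecord(
            property_name="artifact_digest",
            property_value=event["digest"],
            collected_by=event["pipeline"],  # the pipeline that built it
            collected_at=datetime.now(timezone.utc),
        ))
    # Unrecognized event types are ignored here; a real system would log them.
```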
Properties that define an asset should be grounded in objectivity, deriving from empirical observations that, given similar conditions, can be replicated. This objectivity must be reflected not only in the property values but also in the conclusions drawn from those values. Adherence to this principle ensures that the information used to describe and understand an asset is continually up to date, reproducible, and open to full auditability. It fosters a high level of trust in the data presented and the decisions made based on that data.
Transgression of this principle, however, can introduce inaccuracies and inconsistencies in the recorded properties of an asset, undermining confidence in the inventory information and potentially leading to suboptimal decisions and policy enforcement. Furthermore, it can make the inventory susceptible to challenges from audit or regulatory bodies due to the lack of a reproducible, objective trail of evidence.
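One way to ground a property in empirical observation is to derive it directly from the artifact and record the procedure alongside the value, so that anyone repeating the procedure can reproduce both. A minimal sketch, again reusing the types above (the `collected_by` label is an assumption):

```python
import hashlib
from datetime import datetime, timezone
from pathlib import Path

def observe_digest(asset: Asset, artifact_path: Path) -> EvidenceRecord:
    """Capture an objective property: re-running this procedure on the
    same artifact bytes always reproduces the same value."""
    digest = hashlib.sha256(artifact_path.read_bytes()).hexdigest()
    entry = EvidenceRecord(
        property_name="artifact_digest",
        property_value=digest,
        collected_by="sha256(artifact-bytes)",  # the repeatable procedure
        collected_at=datetime.now(timezone.utc),
    )
    asset.record(entry)
    return entry
```

Recording how the value was obtained, not just the value itself, is what keeps the conclusions drawn from it open to replication and audit.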
Whereas evidence and conclusions warrant objectivity, there is an essential subjectivity to setting application boundaries; as Martin Fowler notes, “applications are social constructions.”
The solution should focus on providing dynamic, real-time insights into the organization’s software assets, enabling rapid response to changes in the software ecosystem. This involves adopting an event-driven architecture that updates the software asset inventory in real time based on observed changes, ensuring that the inventory remains accurate and up to date at all times.
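In practice, the event-driven posture can be as modest as a consumer that drains the events your delivery platform already emits. A minimal sketch using an in-process queue as a stand-in for a real message bus such as Kafka or SNS (the wiring is an assumption; `apply_event` is the handler sketched earlier):

```python
import json
import queue

def run_inventory_consumer(events: queue.Queue, inventory: dict[str, Asset]) -> None:
    """Keep the inventory no more than one event behind reality."""
    while True:
        message = events.get()  # blocks until the next lifecycle event arrives
        apply_event(inventory, json.loads(message))
        events.task_done()
```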
Read the full paper in the Fall 2023 DevOps Enterprise Journal.
Vice President, Product Innovation at Sonatype
Michael Edenzon is a senior IT leader and engineer who modernizes and disrupts the technical landscape for highly regulated organizations. Michael provides technical design, decisioning, and solutioning across complex verticals and leverages continuous learning practices to drive organizational change. He is a fervent advocate for the developer experience and believes that enablement-focused automation is the key to building compliant software at scale.
Research director, analyst, architect, author. I talk to a lot of people about how digital and IT organizations operate at scale.