Tokenize the World: A Framework for Digital Assets
The Digital Imperative
It is clear today, certainly more so than even six months ago, that organizations need to grasp the generational urgency of digitization. The playing field for commercial success has shifted: entities must exploit new digital models to create value, and they must adopt digital business strategies to survive and to have the opportunity to compete.
E-commerce is arguably the most disruptive digital transformation in history. While the digitization of retail was already on the executive agenda, implementation has been uneven across sectors, entities, and geographies; it is now very apparent that those who were leading are exponentially ahead, dominating the market and defining the landscape.
The Digital Asset sector is at a similar juncture; entities that capture the opportunity today will define the rules for tomorrow.
A Meta-Model for Digital Assets
The potential to effectively tokenize real-world assets on a digital system and to efficiently transact is creating new asset classes and entirely new markets for hitherto illiquid assets; and it is reducing risk and cost in clearing, settlement, custody, collateralization and more. Tokens can represent any asset or agreement across multiple parties, and in any sector or industry.
Given the broad diversity in the nature and type of tokens, a repeatable design approach, and notably one that leads to predictable results, needs to be anchored in a “meta-model”: an ontology that defines an underlying, standards-based language for digital assets. For context, the Periodic Table serves as the meta-model for chemistry.
The Token Taxonomy Framework is one such meta-model for digital assets. It serves to define a common set of concepts and terms that may be used by commercial and regulatory participants to enable compliance and governance; and to generate precise specifications for developers to follow, for tools to generate from, and for standards organizations to validate.
A Taxonomy for Tokens: Creating, Classifying and Reusing Digital Assets
A standards-based taxonomy for tokens is crucial for world markets and for global policymakers because a token's class or type determines its attributes and behaviors, and with them the rules and facilities that apply to it (e.g. securities rules, issuance facilities). As the number and types of digital assets grow, regulators and investors increasingly demand that token types be classified and categorized consistently.
In addition to enabling the core meta-model, the Token Taxonomy Framework (TTF) also provides a platform for the creation, classification, and reuse of token definitions.
A token definition is a template used to create and instantiate a specific tokenized asset. In the TTF, for instance, a token template is composed of three atomic elements: the Base Type (e.g. fungible); Behaviors, i.e. capabilities that are present (e.g. transferable); and Properties, i.e. values that may be queried and/or updated (e.g. SKU).
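To make this concrete, here is a minimal sketch of a token template as a TypeScript data structure. The three elements and the example values (fungible, transferable, SKU) come from the description above; the type and field names are illustrative and are not part of the TTF specification.

```typescript
// Minimal sketch of the three atomic elements of a TTF-style token template.
// Type and field names are illustrative assumptions, not TTF-defined names.
type BaseType = "fungible" | "non-fungible";

interface Behavior {
  name: string;          // a capability the token exhibits, e.g. "transferable"
}

interface Property {
  name: string;          // a value that may be queried and/or updated, e.g. "SKU"
  value?: string;
}

interface TokenTemplate {
  base: BaseType;
  behaviors: Behavior[];
  properties: Property[];
}

// Example: an inventory token that is fungible, transferable, and carries a SKU.
const inventoryToken: TokenTemplate = {
  base: "fungible",
  behaviors: [{ name: "transferable" }],
  properties: [{ name: "SKU", value: "ABC-123" }],
};
```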
The taxonomy classifies token types by the capabilities they possess (a sketch in code follows the list):
- Token Type: fungible or non-fungible
- Token Unit: fractional, whole or singleton, indicating whether a token can be divided into smaller fractions, or if there can be a quantity greater than 1
- Value Type: intrinsic or reference, indicating whether the token itself is a value, like a cryptocurrency, or if it references a value elsewhere, like a property title
- Representation Type: common or unique, indicating whether each token instance has its own identity and properties (unique) or instances are interchangeable and tracked collectively (common)
- Supply: fixed, capped, variable, gated or infinite, indicating how many token instances a digital asset can have during its lifetime, and
- Template Type: single or hybrid, indicating the existence of parent/child relationships or dependencies across tokens.
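For readers who prefer code, the sketch below expresses these classification facets as TypeScript union types. The facet names and their allowed values are taken from the list above; the field names and the sample classification of a cryptocurrency-like token are assumptions for illustration.

```typescript
// Classification facets from the list above, expressed as union types.
// Field names and the example classification are illustrative assumptions.
type TokenType = "fungible" | "non-fungible";
type TokenUnit = "fractional" | "whole" | "singleton";
type ValueType = "intrinsic" | "reference";
type RepresentationType = "common" | "unique";
type Supply = "fixed" | "capped" | "variable" | "gated" | "infinite";
type TemplateType = "single" | "hybrid";

interface Classification {
  tokenType: TokenType;
  tokenUnit: TokenUnit;
  valueType: ValueType;
  representationType: RepresentationType;
  supply: Supply;
  templateType: TemplateType;
}

// Example: a cryptocurrency-like token -- fungible, divisible into fractions,
// carrying its own value, with a capped supply and no parent/child relationships.
const cryptocurrencyClass: Classification = {
  tokenType: "fungible",
  tokenUnit: "fractional",
  valueType: "intrinsic",
  representationType: "common",
  supply: "capped",
  templateType: "single",
};
```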
While demands from regulators, policymakers, and the global markets are a powerful motive for categorization, classification metadata is also valuable for visualization: tools built on this metadata accelerate learning about, and reuse of, existing token templates. Reuse ensures the long-term sustainability of the underlying repository and of the asset community at large.
In practical terms, the TTF is a composition framework that breaks token definitions down into basic reusable components – base token types, behaviors, and properties. These components are categorized by type and support grouping and classification. Composing them generates new token definitions, which can in turn be composed further.
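As a rough sketch, and assuming an illustrative `compose` helper and component names that are not part of any TTF API, composition and reuse can be pictured like this: the same base type, behaviors, and properties yield different token definitions.

```typescript
// Hypothetical sketch of composition: reusable components combined into
// distinct token definitions. Names are assumptions for illustration only.
type BaseType = "fungible" | "non-fungible";

interface TokenDefinition {
  name: string;
  base: BaseType;
  behaviors: string[];
  properties: string[];
}

// Reusable building blocks.
const transferable = "transferable";
const divisible = "divisible";
const sku = "SKU";

// Compose reusable components into a new definition.
function compose(
  name: string,
  base: BaseType,
  behaviors: string[],
  properties: string[]
): TokenDefinition {
  return { name, base, behaviors, properties };
}

// Two different definitions reusing the same components.
const loyaltyPoint = compose("LoyaltyPoint", "fungible", [transferable, divisible], []);
const inventoryItem = compose("InventoryItem", "fungible", [transferable], [sku]);
```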
A Design Approach: Where to Start
The taxonomy establishes the definitions and the categories, and it is complemented by proven industry practices that lead to predictable business results. Looking across sectors, industries, and geographies, three activities are key:
- Business Case and Economic Model for the digital assets: The economic model analyzes the economic structures and elements that participate in the creation of value across the ecosystem. The business case delineates the economic forces that catalyze the start-up and continued viability of the participants in the system, and beyond.
- Business specification of the targeted digital asset(s): A business specification of the digital asset, including a comprehensive illustration of the lifecycle of the asset, involved parties, and relevant multi-party transactions. The business specification should use language that serves the needs of regulatory and compliance bodies.
- Technical definition of the token: A technical specification of the token, including the base token type(s), behaviors, and properties. The meta-model provides the notation (“token formulae”) to describe the token precisely, in a form that is both implementation-neutral and developer-friendly (a rough sketch follows this list). The taxonomy also serves to compare and contrast the new token with existing token definitions and to classify it.
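For illustration, a token formula can be pictured as a compact string derived from the technical definition. The tF/tN base symbols and the braces loosely follow the TTF's published notation, but the behavior symbols used below (d for divisible, t for transferable) are assumptions for this sketch; the authoritative symbols are defined in the framework itself.

```typescript
// Hypothetical rendering of a technical definition as a formula-like string.
// Symbols are illustrative approximations of the TTF notation, not normative.
interface TechnicalDefinition {
  base: "fungible" | "non-fungible";
  behaviorSymbols: string[];   // e.g. ["d", "t"] for divisible, transferable
}

function toFormula(def: TechnicalDefinition): string {
  const base = def.base === "fungible" ? "tF" : "tN";
  return `${base}{${def.behaviorSymbols.join(",")}}`;
}

// Example: a fungible, divisible, transferable token -> "tF{d,t}"
console.log(toFormula({ base: "fungible", behaviorSymbols: ["d", "t"] }));
```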
The meta-model, the taxonomy, and the approach are all implementation-neutral and platform- (chain-) agnostic; the resulting token definitions may then be fed into one or more code-generation tools to instantiate them on a blockchain, or even in a centralized database, as appropriate.
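As a purely hypothetical sketch of that last step, the same platform-neutral definition might be rendered either as a smart-contract stub or as a database schema. The generator functions below are illustrative assumptions and do not correspond to any real TTF tool; they only show that the definition, not the platform, is the source of truth.

```typescript
// Hypothetical code-generation step: one neutral definition, two targets.
interface TokenDefinition {
  name: string;
  behaviors: string[];
  properties: string[];
}

// Render the definition as a smart-contract skeleton (blockchain target).
function toContractStub(def: TokenDefinition): string {
  const methods = def.behaviors.map((b) => `  function ${b}() { /* ... */ }`);
  return `contract ${def.name} {\n${methods.join("\n")}\n}`;
}

// Render the same definition as a table schema (centralized-database target).
function toTableSchema(def: TokenDefinition): string {
  const columns = def.properties.map((p) => `, ${p} TEXT`).join("");
  return `CREATE TABLE ${def.name} (id TEXT PRIMARY KEY${columns});`;
}

const loyaltyPoint: TokenDefinition = {
  name: "LoyaltyPoint",
  behaviors: ["transfer", "mint"],
  properties: ["owner", "balance"],
};

console.log(toContractStub(loyaltyPoint)); // contract stub from the definition
console.log(toTableSchema(loyaltyPoint));  // table schema from the same definition
```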
In summary, a structured design approach for digital asset definition needs to be based on an underlying meta-model (i.e. a language) for digital assets, and this meta-model needs to enable an ontology (i.e. a taxonomy) for the creation, classification, and reuse of asset definitions and templates. The rest is implementation detail, as they say.
John deVadoss is a founding Director of the InterWork Alliance and co-chairs the Token Taxonomy Framework Working Group. He leads NGD Enterprise, based in Seattle, Washington, building developer tools for Neo. Previously, he built and successfully exited two machine learning start-ups. Earlier in his career at Microsoft, John incubated and built Microsoft Digital from zero to $0.5B in revenue; he led architecture, product, and developer experience for the .NET platform v1 and v2; and he was instrumental in creating Microsoft’s Enterprise Strategy.