Emerging Productivity Leap: Economies of Network ‘Coopetition’

Davinder Jawanda
Oct 16, 2020

Contents

  1. Bold Prediction Review
  2. Emerging Productivity Leap
  3. Ethereum and the FAT DApp Layer
  4. Bandwidth for Scale
  5. Summary
  6. Timelines

Bold Prediction Review

In my original article I made some very specific macro predictions for this decade. This article provides my reasoning in support of those predictions, particularly with regard to #3. Because the predictions are interrelated, #2 and #4 are also referenced. An earlier article detailed #1 and #2 in depth. For convenience, I’ve again outlined the five predictions from the original article here:

1. Political System ~2021 CCP collapse and China reunification

With regard to order and timing, my spidey sense tells me the above will happen in the order indicated. I do feel that before Bitcoin’s adoption as the GRC (first in some failing or developing nation like Venezuela or Pakistan, and then quickly in the rest of the world, including the USA), China will be reunified (before even Korea). However, rather than the mainland Chinese Communist Party (CCP) absorbing Taiwan and Hong Kong, it will be Taiwan and to some extent Hong Kong displacing the CCP — not through military means but through the economic collapse of the mainland due to a looming financial contagion.

2. Financial System ~2024 BTC adoption as GRC

Bitcoin as the global reserve currency removes the compromised political influence over central bank monetary policy. Central planning no longer has a place in monetary policy — a system that should otherwise be a market based system for market based economies. Market based monetary policy would mean no fractional reserve banking, quantitative easing or consumerism. DeFi together with Bitcoin provide enough flexibility to avoid the moral hazard of the monetary policy system.

3. Business Productivity ~2027 Mass Personalization at scale

A convergence of three primary acronyms: DLT + AI + IoT. The complementarity effect from the maturing of this convergence will organically provide equal access to opportunity (and thus more even wealth distribution). AI needs greater data, i.e., IoT, to achieve new heights (the singularity). However, IoT first needs the automated security of DLT (a.k.a. blockchain, which is coming) before it can provide open access to AI’s powerful distributed intelligent agents (IAs). This convergence will allow automated organizations and smart contracts to deliver on mass personalization at scale, resulting in a huge leap in productivity (just like the two industrial revolutions before this one).

4. Human Consciousness ~2030 Collective Consciousness at scale

The third industrial revolution will quite naturally beget a fully automated economy. Traditional manual intervention will be a friction point rather than a value add. So where does that leave the people, the masses, the next generation? We will have to find something more valuable to contribute than the labour of our bodies and our brains, which will be too weak and slow compared to AI software and robots. But what machines will not have over us is consciousness. Consciousness will become a tangible factor of economic production as our reconciliation journey matures. This collective consciousness will then be minable via neural links (something that’s already possible to some degree with Elon Musk’s Neuralink). Thus consciousness, as an economic good that only humans can contribute, will be the solution to the automation problem.

5. Political System II ~2030 USA deunification

The USA doesn’t come out of this paradigm shift unscathed. There will be new victims and victors. Fractional reserve banking, quantitative easing, and consumerism are interrelated tools of the last leg of the current capitalistic era that have come to the end of their useful life. The current political system will be unable to dispose of these outdated tools. Every new administration simply maintains the status quo for the most part, each time punting the problem to the next administration. The change will come through revolution, yes, but not a societal revolution (despite what we are currently witnessing in the streets of American metropolises) — rather, a technological revolution.

Emerging Productivity Leap: Economies of Network ‘Coopetition’

Supply and demand are the yin and yang of economics: more productive capital allocation comes from more efficient and precise markets where supply and demand meet. The feedback loop between supply and demand is the result of their complementary nature. This complementarity is known as network effects — the powerful dynamic that drives the disruption of incumbent businesses by new emergent business paradigms.

The latest paradigm is the liberation of data from centralized databases, providing greater access. This very important change is showing signs of network effects, meaning a disruption is forming. This disruption is essentially lowering transaction costs enough that the fabric of today’s economy will operate under new laws.

Scaling in a competitive market based on controlling centralized data will give way to collaborating with liberated data. The change is so profound that the resulting precision in price signals and capital allocation will power a productivity revolution (the Third Industrial Revolution, a.k.a. 3IR, or alternatively Web 3.0), transcending what has until now been a very stubborn tradeoff: that between mass production and personalization.

Transaction Costs and Network Effects

Lower transaction costs and superior intra-firm network effects (including lower communication costs, less duplication of effort, less office politics, no top-heaviness) give smaller, networked firms a competitive advantage over larger, hierarchical firms.

Large firms will experience lower average returns in an era of lower transaction costs, causing their per-unit costs to rise until they eventually exceed those of a non-hierarchical, unbloated, networked firm.

A hierarchical centralized firm produces internal communication network effects that positively impact production. The amount of such production is dependent on the number of value producing connections facilitated by the firm’s organizational structure (communications technology, facilities and hierarchical structure). However, centralized organizations reach a point of diminishing returns from adding additional human capital, eventually neutering productivity gains from the internal network effects.

At that point any additional employee is no longer productively connected. This increases the marginal costs of human capital, which negatively impacts internal network effects. The result is lower average returns causing the firm’s per unit cost to rise, making it less productive and by extension, less competitive.

In the industrial economy, the diminishing returns from muted internal network effects was overshadowed by the more impressive returns from the supply-side economies of scale by deploying ever larger amounts of capital to dominate the supply chain. Decreasing transaction costs both fundamentally change those competitive economics by 1) deemphasizing hierarchical scale in the supply chain, and 2) emphasizing network effects within the firm, which favors networked scale within the supply chain.

These new economics provide a tangible competitive edge to firms that are networked internally. Optimally sized and automated networked firms avoid four specific issues with hierarchical firms:

Network effects within the firm from networked human capital grow dramatically once these four obstacles are removed. Note how the number of communication channels (edges) grows quadratically as each networked worker (node) is added to a non-hierarchical networked firm:

Source: Wikipedia

Here’s what this quadratic growth pattern looks like visually:

Source: Wikipedia

These graphs are in essence a 2D visualization of network effects. Now try to imagine them in 3D and compare that in your mind to a traditional 2D model of a hierarchical firm. Once crystallized in your mind’s eye, you start to sense the power of networks. By extension, you begin to understand the power of social networks like Facebook.
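The edge-count claim can be verified directly: a fully connected network of n nodes has n(n−1)/2 communication channels, while a strict hierarchy of n nodes has only n−1 reporting lines. A minimal Python sketch:

```python
def networked_edges(n: int) -> int:
    """Channels in a fully connected network of n nodes: n choose 2."""
    return n * (n - 1) // 2

def hierarchical_edges(n: int) -> int:
    """Channels in a strict hierarchy (a tree): every node except the
    root has exactly one reporting line."""
    return max(n - 1, 0)

# The gap widens quadratically as workers are added.
for n in (2, 5, 10, 50):
    print(n, networked_edges(n), hierarchical_edges(n))
```

At 50 workers the networked firm has 1,225 potential channels against the hierarchy’s 49, which is the disparity the graphs above are visualizing.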

Despite its impressive accomplishments, though, Facebook is just scratching the surface of networks. Yes, its customer-facing component is a 3D networked marketplace for content, but its own internal structure is a 2D hierarchy. Until now, it has had no choice. Transaction costs being what they were, its ability to network internally was limited, as is the case with all established firms, including the rest of Big Tech (Microsoft, Apple, Google, Amazon, etc).

These hierarchical firms that greatly benefit from demand-side economies of scale (network effects), will suffer the greatest losses from diseconomies of scale on the supply-side due to lower transaction costs. The networked ecosystems of firms will benefit from network effects on both the supply and demand side.

It will be virtually impossible for entrenched firms to transform themselves into networked firms in time to effectively compete with the new, nimble ecosystems of firms. As a result, it is just as unlikely that these firms will maintain their current dominant market-share positions.

Conversely, the networked ecosystems of firms, with their dual network effects infrastructure (decreasing supply-side average costs while at the same time increasing demand-side utility and user experience), will have dramatically superior economics, allowing them to usher in the era of mass personalization at scale, i.e., the Third Industrial Revolution, which includes Web 3.0.

Winner-take-all Economics

The Third Industrial Revolution (a.k.a, 3IR or Web 3.0) innovations (smart contracts, smart currencies and smart agents) shift power away from a concentrated center (through a lowering of transaction costs) to the diversified edges based on two design features:

  1. a flip in the economic paradigm from one based on closed systems to a replacement based on open systems.
  2. advances in storing data, particularly user data and transactional data, in a distributed fashion versus centrally.

The dynamics of the 2.2IR (the digitization of 2IR centralized architecture) retains power at the center. Therefore you can say that 2.2IR is still centripetal in nature, whereas 3IR is centrifugal, with power being pushed to the edges, for more even distribution.

The 3IR mode more closely matches the nature of the Internet — an open global network. 2.2IR is a transitional state in which, due to technological limitations, a layer of multiple closed global networks leverages the openness of Internet Economics while limiting its potential: throwing up gates and locks that suppress market-based price signals, leading to inefficient capital allocation and temporarily thwarting optimal innovation and productivity.

Internet Economics (IE) include:

  1. inherent infinite market reach and
  2. low barriers to entry,

which, for sustainable scale in the 2.2IR era, required TCP based business models to include:

  1. path dependence and
  2. system closing mechanisms.

Internet Economics (IE) allows for winner-take-all scenarios that gain unstoppable momentum and leverage through free services, funded through data ownership monetization. Thus, IE under a centralized regime is like the Pareto Principle on Steroids, further compounding the centripetal nature of 2IR.
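The “Pareto Principle on Steroids” dynamic can be illustrated with a quick simulation; the tail parameter below is an assumption chosen to approximate the classic 80/20 split, not a measurement of any real market:

```python
import random

random.seed(7)  # fixed seed so the illustration is reproducible

# Draw 10,000 hypothetical "firm values" from a Pareto distribution.
# A shape parameter near 1.16 yields roughly the classic 80/20 split;
# heavier tails (smaller alpha) concentrate value even further.
values = sorted((random.paretovariate(1.16) for _ in range(10_000)),
                reverse=True)

top_20pct = sum(values[: len(values) // 5])  # value held by the top fifth
share = top_20pct / sum(values)
print(f"top 20% of firms capture {share:.0%} of total value")
```

Under a centralized Web 2.0 regime, network effects push the effective tail parameter down, so the winning platform’s share climbs well past the 80% baseline.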

From an economic perspective, digital platforms are fundamentally different from bricks-and-mortar-only platforms in one very important way. Industrial-era platforms like the railroads had high upfront fixed costs but were able to recoup them, and much more, because ongoing operating costs were very low. Thus, railroad barons were able to reduce costs (eliminating competitors and creating monopolies) while maintaining healthy margins. These physical platforms drove down marginal costs dramatically, creating massive economies of scale and a positive feedback loop on the supply side of the business.

Digital platforms create economies-of-scale feedback loops not only on the supply side but also on the demand side. This is because of network effects, which provide more utility to the users of the platform as more users come on board. This dual capacity means TCPs can not only lower marginal costs but also increase marginal revenues dramatically, which shifts the demand curve outward. Non-digital platforms can only reduce marginal costs — the supply side of the value equation. Centralized regimes atop the network-effects-enabling open Internet facilitate the Pareto-Principle-on-Steroids effect. (More about the Pareto Principle impact here.)

Blockchain is emerging as a 3IR model that can break the closed systems’ hold over the open Internet, because it facilitates optimal market economies, leading to higher productivity and ushering in an age of mass customization — the wealth of which is distributed more evenly because the critical factor of production in IE, namely data, is no longer controlled by a minority of powerful players.

This is a winner-takes-all scenario, leaving the majority of firms in the dust. Thus, the unfortunate side effect of digital platforms is a high degree of concentration of wealth and power. The Pareto Principle in a digital Web 2.0 world provides a huge upside for the winning platform owner. Apple, Google, Microsoft, Salesforce, AirBnB, and Uber are all examples of successful digital platform businesses. Generally speaking, all other competitors, partners, suppliers and customers become subservient.

TCPs create path dependence and system-closing mechanisms on both the user and vendor side. On the user side, profiles, reputation scores and transaction history are all stored and controlled in proprietary centralized databases, creating a lock-in situation. On the vendor side, products purchased on the platform are often only consumable on the platform or on authorized components. Even when products are usable outside the platform, the transaction happens on the platform, allowing the platform owner to take a share of the revenue that it can often dictate.

These aspects give TCPs that winner-takes-all potential. Facebook derives its power from its content platform, ie the social network, which is essentially a marketplace monetized currently through advertisement sales. Other Big Tech firms like Microsoft, Apple, Google, and Amazon also dominate due to the demand-side economies of scale (network effects) available to them as platform firms.

But because it is a winner-takes-all scenario, the initial hill to climb is steep. Some 2.2IR tricks of the trade are still relevant in a 3IR world.

Platform Design

The book Platform Revolution summarizes the what, why and how of platform design in three groups of 3 elements:

A. 3 Types of Platform Transactions (The What) — three things that are going to be exchanged on a platform:

  1. Information — the information is always on the platform.
  2. Goods & Services — sometimes the goods are consumed off platform but the information about those goods is only found on the platform.
  3. Perceived Currency — doesn’t have to be fiat currency, can be attention, likes, follows, shares, reward points, tokens, etc, but the transaction is ideally or exclusively on-platform.

B. 3 Elements of Platform Core Interaction (The Why) — the reason for the interaction on the platform is going to have these three elements:

  1. Participants — there are always two sides to a transaction represented by a producer and a consumer. Users can switch spots but there is always one of each in each transaction.
  2. Value Unit — the producer creates the value unit (post, tweet, video, listing, offer, ad spot, etc) which is consumed by the consumer
  3. Filter — filtering the exchange of information is critical to personalizing the user experience for both the consumer and producer by helping serve up the most relevant value units. For instance, Facebook’s news feed is a personalizing filter.

C. 3 Elements of Platform Competitiveness (The How) — to make the climb up the adoption curve, platforms need to consider these three specific competitive strategies:

  1. Pull — there’s both the initial traction and also the retention aspect:

1.1 Adoption Dilemma — nine strategies that successful platforms have utilized to seed the platform for accelerated adoption:
a. The Follow-the-Rabbit Strategy — hit product focus first
I. Staging value creation
II. Designing the platform to attract one set of users
III. Simultaneous on-boarding

b. The Piggyback Strategy — 3IR examples to follow
c. The Seeding Strategy — 3IR examples to follow
d. The Marquee Strategy — big name developer or founder
e. The Single-side Strategy — platform issuer assumes the other side of the market
f. The Producer Evangelism Strategy — producer has baked-in followers (teachers bring students)
g. The Big-Bang Adoption Strategy — event-based application of solution for max exposure
h. The Micromarket Strategy — niche market dominance first, like Facebook’s school subnetworks
i. Viral Growth: The User-to-User Launch Mechanism — consumers incented to lead the piggyback strategy
I. The Sender
II. The Value Unit
III. The External Network
IV. The Recipient

1.2 Retention — successful platforms are able to generate two types of feedback loops:
a. Single-user — the platform uses a core algorithm to learn about users and serve up the most relevant value units. The more users use the platform, the smarter the algorithm gets at figuring out relevance. This creates a powerful personalization feedback loop.
b. Multi-user — a producer activity is served to relevant consumers, whose activity in turn is delivered to the producer, creating a virtuous cycle of greater value on either side, enhancing network effects. The Facebook News Feed is a great example.

2. Facilitate — platforms are able to lower or raise barriers in order to grow and protect their network-effects-producing feedback loops.

3. Match — platforms enhance network effects by matching the most relevant users with one another. This maximizes profitability, since personalization ensures an optimal balance between transaction volumes and user experience. Data about producers, consumers, value units, and goods and services is absolutely critical to relevance and personalization.
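The matching logic above can be sketched as a toy relevance filter. The Jaccard-similarity scoring and all field names here are illustrative assumptions, not any real platform’s algorithm:

```python
def relevance(unit_tags: set, interest_tags: set) -> float:
    """Jaccard similarity between a value unit's tags and a user's interests."""
    if not unit_tags or not interest_tags:
        return 0.0
    return len(unit_tags & interest_tags) / len(unit_tags | interest_tags)

def match(units: list, interests: set, top_k: int = 2) -> list:
    """Return the ids of the top_k most relevant value units for a consumer."""
    ranked = sorted(units, key=lambda u: relevance(u["tags"], interests),
                    reverse=True)
    return [u["id"] for u in ranked[:top_k]]

# Hypothetical value units produced on the platform.
units = [
    {"id": "post-1", "tags": {"defi", "bitcoin"}},
    {"id": "post-2", "tags": {"cooking"}},
    {"id": "post-3", "tags": {"bitcoin", "iot", "ai"}},
]
print(match(units, {"bitcoin", "ai"}))  # most relevant units first
```

Real platforms use far richer signals (behavioral history, collaborative filtering), but the core interaction is the same: score every value unit against the user and serve the best matches.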

Modularity

New market opportunities require the vendors pursuing them to take on a wide array of activities as part of provisioning an offering. In essence, this is complex demand being filled by complex supply. Digital platforms take complex supply to another level by leveraging a network of suppliers all tied to the platform’s standard.

Thus, modularity is a critical design component of successful platforms experiencing network effects. Modularity is an efficient way of organizing complex value provision through both proprietary core components (the low-variety platform) and complementary peripheral components (the high-variety ecosystem).

To provision modularity, digital platforms essentially hybridize the two fundamental models of business operations: complex-systems model and volume operations model. The platform itself is by definition a complex system that tends to do one thing very well. By provisioning an ecosystem of third-party service providers, variation is added to the offering by these volume operators.

The platform sets the modular standard through which the ecosystem participates. Style, speed and variety are pushed to the peripheral volume operators by the core complex system. That modular standard partitions information into self-contained designs where the rules are visible while the parameters are hidden. These visible rules impact subsequent design decisions.

Demand-side Economies of Scale

Complex-systems-model platforms in the bricks-and-mortar-only space, despite having high upfront fixed costs, have very low ongoing operational costs. Thus, these physical platforms dramatically drive down marginal costs, creating massive economies of scale and a positive feedback loop on the supply side of the business. An example of such a platform is the railroads. In the early days, rail barons further reduced costs by eliminating competitors and creating monopolies.

Digital platforms create economies-of-scale feedback loops not only on the supply side but also on the demand side. This is because of network effects, which provide more utility to the users of the platform as more users come on board. This dual capacity means TCPs can not only lower marginal costs but also increase marginal revenues dramatically, which shifts the demand curve outward. Non-digital platforms can only reduce marginal costs — the supply side of the value equation.
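These two feedback loops can be made concrete with stylized numbers (all figures hypothetical): the supply-side average cost falls as fixed costs are spread over more units, while a Metcalfe-style proxy for demand-side utility grows roughly with the square of the user base:

```python
# Illustrative numbers only: neither figure describes any real platform.
FIXED_COST = 1_000_000   # hypothetical upfront platform build-out
MARGINAL_COST = 0.01     # near-zero cost per digital unit served

def average_cost(units: int) -> float:
    """Supply side: average cost per unit falls as volume grows."""
    return FIXED_COST / units + MARGINAL_COST

def network_utility(users: int) -> int:
    """Demand side: a Metcalfe-style proxy, counting potential user-to-user
    connections, which grows roughly with the square of the user base."""
    return users * (users - 1) // 2

for n in (1_000, 10_000, 100_000):
    print(n, round(average_cost(n), 2), network_utility(n))
```

Each tenfold increase in volume cuts average cost by roughly a factor of ten while multiplying the connection count a hundredfold, which is the dual feedback loop in miniature.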

Microservices

The vehicle for digital platforms to deploy a modular strategy is microservices. Essentially, a microservice is a modularized subcomponent of the overall platform. Without modularization, the application is a monolith. Monoliths are easier to build and deploy than a modularized microservice architecture. However, in a dynamic environment, monoliths can quickly become inefficient. Any adaptation in response to environmental change becomes a very risky proposition, as it requires changes to the monolith’s source code. Adaptation is more efficient when the change can be made with precision while containing the possible ramifications to the relevant, isolated subcomponent.


Additionally, when opening up the platform to a third-party ecosystem, providing access to subcomponents of the application is ideal from a security perspective as well as from the perspective of resource management. A third-party application looking for access to a certain portion of the application or application database frees the rest of the platform from having to dedicate resources to fulfill the request. And in a dynamic digital environment, the volume of those requests can be fast and furious. For these reasons, a self-contained piece of business functionality, the microservice, is the ideal architecture for digital platforms.

A microservice is not a layer within a monolithic application (which, for example, has a web controller layer or a backend-for-frontend layer). The microservice may, though, through its own internal components, implement a layered architecture.
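To make the idea concrete, here is a minimal, self-contained microservice sketch using only Python’s standard library. The service name (“pricing”), endpoint and payload are purely illustrative assumptions, not any real platform’s API:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class PricingService(BaseHTTPRequestHandler):
    """One self-contained piece of business functionality: quoting a price."""

    def do_GET(self):
        body = json.dumps({"service": "pricing", "quote": 42.0}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

# Port 0 lets the OS pick a free port; the service runs in its own thread,
# independent of any other subcomponent of the platform.
server = HTTPServer(("127.0.0.1", 0), PricingService)
threading.Thread(target=server.serve_forever, daemon=True).start()

with urlopen(f"http://127.0.0.1:{server.server_port}/quote") as resp:
    result = json.loads(resp.read())
print(result)

server.shutdown()
```

A platform is then a composition of many such independently deployable services, each owning one narrow capability behind a stable interface.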

Application Programming Interfaces (APIs)

Microservices with clear and standardized interfaces ensure the ecosystem of third party applications can conveniently tap into the platform application. Platform interfaces are known as application programming interfaces (APIs). A useful visual is seeing microservices as wrapped in APIs.

A vibrant growing ecosystem ensures maximum end user value. The ecosystem provides the variety so that the platform producer can focus on the core competency. The end result is an ideal user experience that includes reliability and variety, often an unbreakable tradeoff in a non-modular world.

As a result, among value chains with dominant centralized database platforms, demand continues to swell for microservices via APIs. APIs for Google Maps, the New York Stock Exchange, Salesforce, Thomson Reuters Eikon, Twitter, Facebook, etc, are how their ecosystem participants interact with the main platform to provide the variety to their users. This allows platforms to service in real-time ever more niche demand including from unarticulated mass markets.

Platform strategy has traditionally been focused on gaining critical mass and then opening up the product to third-party contributors to generate an ecosystem around a proprietary standard. Strictly enforcing this proprietary standard is the key competitive strategy, as it locks out competing platforms (system-closing mechanisms). In a Web 2.0 world, platforms create a market for prosumers (vs developers in the Web 1.0 era).

As in Web 2.0, the world of Web 3.0 will be dominated by market-facilitating platforms. But this time the networked ecosystems of firms, with their dual network effects architecture (decreasing supply-side average costs while increasing demand-side utility and user experience), will have dramatically superior economics, allowing them to usher in the era of mass personalization at scale.

The open data interface standards of Web 3.0 DLT platforms will ensure Web 2.0 TCPs will be unable to employ system-closing mechanisms. Thus, system-closing and vendor-lock-in mechanisms will give way to integrating force strategies.

Web 3.0 Nano-services

In a Web 3.0 world, network effects and the Pareto Principle will favour those platforms that provide the best user experience for the masses. Thus, platforms will still continue to scale through ecosystems of third-party contributors, which provide the variety or wider scope desired by varied user bases.

Existing companies will become networked, automated and modularized at their atomic level, freeing trapped latent value to be monetized. Organizational “excess capacity” will modularize and become available for lease/sale to third parties in need of such an offering.

The increasingly granular performance improvements in the pursuit of mass personalization frees trapped value at the fundamental level in larger volumes, resulting in more precise subcomponents and recombination possibilities. Both of these are key ingredients in emergent feedback loops.

This foundation of dual network effects (decreasing supply-side average costs, while at the same time increasing demand-side utility and user experience), will eventually produce a more profound synergistic emergence of enveloping-experience level of personalization. It’s analogous to the fundamental quantum layer of existence, which happens to be completely unrecognizable from everyday life, being capable of producing such a brilliant experience for living beings at our material layer of existence.

The open API standards of Web 3.0 platforms will ensure that sufficient profits are accessible to ecosystem players providing new layers of increasing granularity. It’s at these more granular layers of value that platforms and their ecosystems can provision enveloping user experiences, a key value proposition of mass personalization at scale.

Smart Contracts and Smart Tokens

The current level of granularity is limited by the economics of the current architecture (TCP) and the profitability of future levels of granularity is limited by the advancements in distributed data technology (DLT). More efficient monetization requires a new financial layer that lowers transaction costs. The financial layer that is arising is powered by smart contracts (automated executing digital agreements) and their smart tokens (i.e., utility digital tokens or cryptocurrencies), enabling profitability for additionally granular value layers.

Further innovations in smart contract and smart tokens will enable better performance of automated transactions and incentivisation systems, giving profitable rise to ever smaller niches of granularity. DLT platforms facilitate transactions and integration of modular value, rewarding both third party providers and users.

Smart contracts can provide realtime valuations of modular assets at the granular level, creating optimal supply and demand liquidity to parties transacting in markets of granular value. Smart tokens can ensure the transactions costs are as low as possible. This granular value tokenization is the Web 3.0 equivalent of the securitization of otherwise illiquid assets (a process known as financialization).
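The automated-execution logic can be illustrated with a toy, in-memory model. Real smart contracts run on a blockchain (e.g. Ethereum, typically written in Solidity); this Python sketch only mimics the mechanics of a token-settled escrow for a granular data purchase, with all names and amounts invented for illustration:

```python
class TokenLedger:
    """Toy stand-in for a smart-token balance sheet."""

    def __init__(self, balances):
        self.balances = dict(balances)

    def transfer(self, src, dst, amount):
        if self.balances.get(src, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[src] -= amount
        self.balances[dst] = self.balances.get(dst, 0) + amount

class DataEscrow:
    """Toy smart contract: locks the buyer's tokens on creation and
    releases them to the seller automatically once delivery is confirmed."""

    def __init__(self, ledger, buyer, seller, price):
        self.ledger, self.seller, self.price = ledger, seller, price
        self.ledger.transfer(buyer, "escrow", price)  # lock funds

    def confirm_delivery(self):
        self.ledger.transfer("escrow", self.seller, self.price)

ledger = TokenLedger({"alice": 100, "bob": 0})
deal = DataEscrow(ledger, buyer="alice", seller="bob", price=5)
deal.confirm_delivery()
print(ledger.balances)  # alice down 5 tokens, bob up 5, escrow emptied
```

The point is that no intermediary touches the funds: the contract’s code is the settlement layer, which is what makes very small, very numerous transactions of granular value economical.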

This automated financial layer will serve as a foundation for a realtime API economy, enabling AI software agents to transact with one another on behalf of networked ecosystem participants in delivering mass personalization to users. Smart contracts, smart currencies and smart agents, converging, co-creating and self-organizing to generate trillions of dollars’ worth of ad hoc transactions of granular value, will beget an intelligence productivity leap. The greater scope of this intelligence leap ensures economical precision in the generation of predictive and prescriptive services, which is at the heart of producing emergent, dynamic, realtime wrap-around user experiences (i.e., mass personalization at scale).

Smart Agents

The next level of platforms will provision this data exchange between automated artificial intelligence (AI) agents, otherwise known as intelligent agents (IAs). These smart bots will be able to manage the entire search, negotiation and purchase process on their own, without any manual intervention. This includes using and acquiring data to power their own decisions in favour of those they represent.

These future fully automated marketplaces of IAs, filling both supply and demand roles for granular data purchases, will require scalable fractional currency and scalable data security protocols to do their job productively. Once so enabled, they will produce tremendous productivity — enough that mass personalization of services will become a realistic possibility, something that has been out of reach in the industrial age.

IAs produce the most value in dynamic coordinated multi-agent clusters (versus monolith AI machines). Again blockchain and smart contracts are designed to produce dynamic multi-agent automation without the need for intermediaries or central authorities.
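A toy sketch of what such agent-to-agent negotiation could look like: a buyer agent with a price ceiling and a seller agent with a price floor converge on a deal without human intervention. The concession schedule and all parameters are invented purely for illustration:

```python
def negotiate(buyer_max: float, seller_min: float, rounds: int = 20):
    """Each round the seller agent concedes downward and the buyer agent
    concedes upward; they settle at the midpoint once the bids cross."""
    offer = seller_min * 2    # seller opens high
    bid = buyer_max / 2       # buyer opens low
    for _ in range(rounds):
        if bid >= offer:                      # agreement reached
            return round((bid + offer) / 2, 2)
        offer = max(seller_min, offer * 0.9)  # seller concedes 10%
        bid = min(buyer_max, bid * 1.1)       # buyer concedes 10%
    return None                               # no deal within budget

print(negotiate(buyer_max=100, seller_min=60))
```

In a multi-agent cluster, many such negotiations run concurrently across many counterparties, with smart contracts (as sketched earlier) handling settlement once a price is struck.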

PwC’s intelligence division says the following about the AI-and-blockchain tandem:

“AI and blockchains in that sense are complementary and synergistic. They are two ends of a continuum chock full of powerful emerging tech included in PwC’s Essential Eight. AI, for its part, can add intelligence and insight to the decision making process. Blockchain, in its role, adds integrity, assurance and decentralization to the core transactional environment and can help enormously in process improvement.”

Source: PWC Blog

Multi-agent deployment within complex environments is more effective than utilizing a monolith for the same reason that complex decisions are best taken after thorough discourse with someone of an alternative perspective. The back and forth on related yet differentiated ideas generates deeper insights. These deeper insights can be considered a valuable byproduct, or a complementarity effect. In cases like these, 1 + 1 = 3, not 2. It’s for this reason that network platforms facilitating supply and demand exchange between micro agents will have superior economics.

The increase in transactions among automated agents will enable a release in trapped value comparable to a new industrial revolution. The complementarity network effects from all of the granular supply finding granular demand will produce productivity gains from which a new reality for people will emerge.

Market-based Wealth Redistribution

The rise in TCP mega firms is the result of data becoming a factor of production in the 2IR era. Firms that were able to aggregate the largest troves of user data had inherent superior monetization potential, and thus were able to grow their market capitalizations the most.

This tendency for digital value-creation systems to follow typical power laws like the Pareto Principle is no trivial matter. Left unchecked, income inequality eventually destabilizes the whole system of value creation. At the same time, attempts to address inequality with cures worse than the problem are a legitimate concern. When the extreme alternative to capitalism is communism, evolving capitalism at the systemic level is of paramount importance in order to maintain our meritocratic system. The key happens to be data control.

This convergence will create unstoppable momentum towards information openness and organizational automation. Transaction costs will fall, facilitating the dual benefit of indirect network effects and a more egalitarian distribution of wealth. The gig economy of both human and AI agents will obviate the need for large companies (which grow large because of high transaction costs).

Fortunately, the success of the Web 1.0 & 2.0 winners will eventually become their weakness. As new Web 3.0 (a.k.a. 3IR) innovations start to take hold, TCP will take a textbook Disruption Theory path.

The newly unleashed economics of network collaboration will trigger the disintegration of organizations built on the economics of mass production. The disintegration of large entities critical to the current era's factors of production, into smaller entities better suited to the future economy's factors of production, will ensure an organic redistribution of wealth (without the need for government regulation or intervention).

TCP retaining control over data, as decentralized organizations become more competitive, is the equivalent of a ceiling on societal wealth, and a delay of the mass personalization era and its associated 3IR productivity leap. That is why it's in our collective best interest to see through the 4th Turning, or the transition from 2.2IR (a.k.a. a digitized yet still centralized 2IR) to decentralized 3IR.

The productivity leap will create huge gains in efficiency and effectiveness, meaning a tremendous reduction in greenhouse gas emissions. It will also mean the nature of national governance will change and war, including civil war, will become unproductive.

The path to this productivity leap could be a bumpy one as entities with power today will begin to lose their grip on that power. Some of these entities have access to weapons of mass destruction, including of the nuclear variety. Obviously a nuclear war would be a wrench in the gears of the productivity leap. Dismantling the centralized powers that control nuclear weapons is a topic that requires more discussion.

Economies of Network Coopetition

Mass personalization is made possible by innovations that unleash Economies of Network ‘Coopetition’, with low transaction costs on both the supply and demand side, leading to high transaction volumes by ever-granular DLT entities.

Antifragility is an emergent quality of increasing volumes and granularities of the dynamic complex units in question. Emergence is the result of an increase in opponent processing (point of reference processing) opportunities in complex environments.

Renowned economist and futurist Jeremy Rifkin helps explain productivity leaps using the formula of aggregate efficiency: the ratio of the useful work that actually gets embedded into a product or service to the potential work input. Productivity is, after all, about improving the ratio between input and output volumes. The higher the aggregate efficiency of a good or service, the less waste is produced in every single conversion in its journey across the value chain; the lower the aggregate efficiency, the more waste.

It stands to reason, then, that the higher the overall aggregate efficiency, the closer we get to near-zero marginal cost (the change in total cost that arises when the quantity produced is incremented by one unit). That is indeed the trend. Aggregate efficiency rose dramatically from 8% in the 1910s (USA) to 20% in the 1990s (Japan). Rifkin estimates 80% (globally) during the current 2.2IR era. In the 3IR, I project aggregate efficiency to approach 99%, thus economically enabling mass personalization (something that would otherwise be impossible).
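As a rough illustration (with invented per-step numbers, not Rifkin's actual data), aggregate efficiency can be modeled as the product of the conversion efficiencies at every step of a value chain — each step wastes a fraction of the potential work it receives:

```python
from functools import reduce

def aggregate_efficiency(step_efficiencies):
    """Overall useful-work ratio across a value chain: each
    conversion step passes on only a fraction of the potential
    work it receives, so efficiencies multiply."""
    return reduce(lambda acc, e: acc * e, step_efficiencies, 1.0)

# Hypothetical three-step value chains (illustrative numbers only)
era_1910s = [0.4, 0.5, 0.4]          # multiplies out to ~8%
era_1990s = [0.8, 0.5, 0.5]          # ~20%
era_3ir   = [0.999, 0.995, 0.996]    # approaching 99%

for label, chain in [("1910s", era_1910s), ("1990s", era_1990s), ("3IR", era_3ir)]:
    print(f"{label}: {aggregate_efficiency(chain):.1%} aggregate efficiency")
```

Note how the 3IR chain only reaches ~99% overall if every single conversion is nearly lossless — which is why the projection depends on automation across the whole chain, not just one link.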

As mass personalization becomes possible, the proliferation of blockchain technology, with its distributed data architecture, will begin to displace centralized database infrastructure. Open source data networking business models and scalable DLT unleash the trapped productivity value enabling mass personalization.

Once data is no longer centrally controlled, TCP firms will no longer be able to monetize exclusive troves of user data. This will act as a natural economic ceiling on the optimal scale of firms. The economic incentive to consolidate for scale will dissipate.

A positive externality or byproduct is organic wealth redistribution in the Web 3.0 era — a tide that still raises all boats. As a result, the current concentration of wealth will loosen over time. Note how the Concentration of Wealth curve flattens in the era-by-era charts below, to a more acceptable level by the time of Web 3.0:

  • Economies of Scale — based on the classic factors of production (Land, Labour, and Capital), making possible Mass Production, which incentivizes Consumerism. The digitization era (2.2IR or Web 2.0) of the second industrial revolution (2IR) has begotten dominant Big Tech TCP firms which harness economies of scale on both the supply side and the demand side (network effects), causing an even higher concentration of wealth.
  • Economies of Network ‘Coopetition’ — based on lower transaction costs, which enable open data (the new factor of production), and on granular modular integration through smart contracts, smart currencies and smart agents, making possible Mass Personalization (3IR or Web 3.0) and resulting in eventual diseconomies of scale in TCP firms. The level of Economies of Network Coopetition will quite naturally be directly proportional to the Diseconomies of Scale (cost disadvantages due to increases in organizational size) in TCP — in other words, inversely proportional to TCP economies of scale.

As a result, the happiness index will rise among the masses. Opportunities for wealth creation will become more accessible as the economic incentive for centralization evaporates, giving way to a more profitable middle-income living in the Web 3.0 era. Relative social status “scoring” will improve among the masses.

The current problems are the unintended consequences of success. Most of our problems today result from the unintended consequences of the industrial revolutions to date:

  • The First Industrial Revolution brought us dynastic empires, with the unintended consequence of slavery.
  • The Second Industrial Revolution — Phase 1 brought big scale manufacturing, with the unintended consequence of environmental degradation.
  • The Second Industrial Revolution — Phase 2 brought digitization to the centralized business model, resulting in loads of smartphones loaded with a vast variety of free software applications, with the unintended consequence of extreme wealth concentration.

The economies of scale on both the supply side (near-zero marginal cost) and the demand side (network effects) have resulted in the Pareto Principle on steroids during this 2.2IR. The key factor of production, data, is centralized and controlled by a select few mega winners. This extreme concentration of wealth is creating political instability in democracies and sowing the seeds of civil war and revolution.

In addition, the concentration of data is not ideal for end user value creation. Sure, we have access to cheap smartphones with a wide array of free software apps. But the vast majority of those apps do not work together to provide a seamless, wrap-around personalized experience, because there is no easily accessible economic incentive or mechanism. Smart contracts and native tokens will change that.

The unintended consequence will be accessible and secure databases. The smart contract automation and more freely available information will allow for ever smaller entities to be economically viable. The granularization of value will result in mass personalization.

  • The Third Industrial Revolution will bring us mass personalization, and organic wealth redistribution (with the unintended consequence seemingly being mass unemployment … but that’s where the Fifth Industrial Revolution comes in … more on that in a future article).

Fortunately capitalism, like nature, is designed to fix itself: the nature of competition drives the growth imperative, and the largest of its drivers is the ability to economically transcend tradeoffs. The tradeoff closest to becoming low-hanging fruit is the one between mass production and personalization. This is the next biggest revenue opportunity, since it delivers maximum end user value at the lowest possible price. Mass personalization will become accessible through 3IR innovations.

The unintended consequence is that the innovation that will facilitate mass personalization will also unlock information from the grip of the few, thus flattening the wealth distribution curve from its current extreme high concentration point.

The 2.2IR winners aren't going to give up easily. There will be a war over market share between traditional cloud platforms (TCP) and blockchain platforms (DLT). TCP is the incumbent; DLT is the challenger. The vertical seeing initial traction for DLT is the financial sector. In 2020 we've witnessed the mushrooming of DeFi. That will grow to include more verticals, as previously discussed.

Ever more granular services will continue to find profitable demand, which in turn further increases the value of the open standard, attracting ever more supply and demand. That growth in granular supply and demand will create more and more supply and demand for mass personalization services. This network effect feedback loop can be visualized as a tide that raises all boats, both buyers (users) and sellers (providers).

Granularizing value facilitates more precise allocation of capital (and of data, now a key factor of production in the digital age). A prerequisite for such a market is efficient fractionalized monetary exchange, which facilitates accurate price signals. We are talking about enabling a market where micro supply is able to efficiently find micro demand.
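A toy sketch (hypothetical quantities and prices, plain Python rather than any real exchange) of what it means for fractional micro supply to clear against fractional micro demand once value is divisible enough for prices to signal accurately:

```python
def match_micro_market(asks, bids):
    """Greedily match fractional asks (supply) with bids (demand).
    Each side is a list of (quantity, unit_price); a trade clears
    whenever the best bid price meets or exceeds the best ask price."""
    asks = sorted(asks, key=lambda a: a[1])                 # cheapest supply first
    bids = sorted(bids, key=lambda b: b[1], reverse=True)   # highest demand first
    trades = []
    while asks and bids and bids[0][1] >= asks[0][1]:
        qty = min(asks[0][0], bids[0][0])                   # fractional fill
        trades.append((qty, asks[0][1]))
        asks[0] = (asks[0][0] - qty, asks[0][1])
        bids[0] = (bids[0][0] - qty, bids[0][1])
        if asks[0][0] == 0: asks.pop(0)
        if bids[0][0] == 0: bids.pop(0)
    return trades

# Fractional quantities a coarse-grained market would leave unserved
trades = match_micro_market(
    asks=[(0.25, 1.0), (0.10, 1.2)],
    bids=[(0.30, 1.5), (0.05, 1.1)],
)
```

The point of the sketch is only that smaller tradable units let more of the latent supply and demand overlap and clear; in a DLT setting the fills and settlement would be executed by smart contracts rather than a central matching engine.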

Mass Personalization

Granularization of value allows entrepreneurs to play at the foundational layer, enabling richer finished products or even subcomponents. An economic leap from value granularization, made possible by open data platforms, will give rise to mass personalization.

Mass personalization will both be made possible by and contribute to the profitability of niche service providers: entrepreneurial ventures which otherwise would not survive economically. These niche players will be critical to producing mass personalization due to their ability to collaborate with other niche (and larger) players to serve niche needs.

This Web 3.0 era will be able to transcend the value-price tradeoff in a big way. As modularization reaches deeper and deeper into existing assets, smaller and smaller chunks of value will become available for trade. Smaller pieces of granular value facilitate the building of more precise customized offerings. This means a huge leap for customer value propositions to the point of true custom experience offerings provided at scale.

  1. Web 1.0 of 1990 to 2005 provided richer standardized offerings at a lower cost
  2. Web 2.0 of 2005 to present provided more variety of experiences at a lower cost
  3. Web 3.0 of 2020 onwards is starting to provide more customized experiences at a lower cost

Capitalistic entities pursue profit. Optimal profit margins are most accessible on the supply of products and services in high demand by the marketplace. Having a sustainable competitive differentiator is the key to capturing the bulk of those profits; otherwise profit margins will be squeezed by copycat competitors driving down prices.

To be effective, the competitive differentiator should be able to deliver superior value or experience (or an optimal balance of the two) to the customer. The customer typically evaluates for superiority based on which solution best fills the need, meets the requirements or gets the job done. All needs are different to some extent, and so optimal solution customization or personalization relative to price tends to win over the customer in the competitive battle.

All things being equal, personalization tends to add costs to the provision of the solution, and thus either squeezes the vendor’s profit margins or increases the price to the customer. Thus, the ability to break the trade-off between optimal profit margin attainment and optimal personalization delivery would be a game changing innovation. The vendor would be able to increase demand without increasing price, or in other words shifting the demand curve to the right.

The key to shifting the demand curve to the right in a digital economy is access to data. Data is the newest factor of production — fuel for the digital economy. The insights derived from data analytics enable ever greater precision in price signals, the deployment of capital, and production of goods and services. Greater access to data means better insights, resulting in greater productivity — ie, a better ratio between results and inputs.

That’s also the foundation of mass personalization which is greater personalization (greater results) for the same inputs (same price). Mass personalization is greater productivity. If data is the key to mass personalization and greater productivity, then greater access to more data is the key ingredient.

Traditional cloud applications have a vested interest in guarding their vast treasure chests of data. And because the Internet is universal, the owners of the traditional cloud application protocols are typically winner-take-all success stories, meaning they have monopolistic power over that data. This data access bottleneck serves as a ceiling on productivity or our economic progress.

Data Liberalization and Security

Blockchain is a new way to manage data. Rather than being stored centrally under lock and key by one entity, blockchain distributes the data among many nodes, which both compete and collaborate. They compete to add the latest information to the database to earn the token rewards, while also collaborating to keep each copy of the distributed database synchronized across the entire network of nodes.

The fact that there are so many synchronized copies of the data makes the database more antifragile. It’s no longer up to one entity to defend the database. A system that provides efficient data redundancy is inherently safer and more secure.

Now consider that the most popular of these blockchains are open to the public, meaning anyone can access the data. And recall how important data access is to growing productivity. What blockchain manages to do is break the tradeoff between security and access. Typically, cordoning off access provides greater security, as in centralized database management. In the case of blockchain, greater access also increases security.

Bottom line, then: there is no longer a tradeoff between security and productivity. In the blockchain world, greater data access means more security and more productivity.
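A minimal sketch (plain Python, not a real blockchain client) of why many synchronized copies make tampering detectable: each block commits to its predecessor's hash, so any honest node can recompute the links and spot a copy whose history has been rewritten:

```python
import hashlib

def make_block(prev_hash, data):
    """A block commits to its own data and to the previous block's hash."""
    block_hash = hashlib.sha256((prev_hash + data).encode()).hexdigest()
    return {"prev": prev_hash, "data": data, "hash": block_hash}

def build_chain(entries):
    chain, prev = [], "genesis"
    for data in entries:
        block = make_block(prev, data)
        chain.append(block)
        prev = block["hash"]
    return chain

def chain_valid(chain):
    """Recompute every hash; editing any block breaks every link after it."""
    prev = "genesis"
    for block in chain:
        if block["prev"] != prev:
            return False
        if hashlib.sha256((prev + block["data"]).encode()).hexdigest() != block["hash"]:
            return False
        prev = block["hash"]
    return True

# Many nodes hold the same chain; one node tampers with its copy of history
honest = build_chain(["alice->bob:5", "bob->carol:2"])
tampered = [dict(b) for b in honest]
tampered[0]["data"] = "alice->mallory:5"   # rewrite history on one copy only
```

Real networks add consensus rules and token rewards on top, but the core point above stands: redundancy plus hash-linking is what turns open access into a security feature rather than a liability.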

Data Liberalization and Atomization

Blockchain at scale is a productivity enabler because smart contracts and tokenization liberate data through lower transaction costs. That liberation facilitates greater knowledge, more accurate price signals and more precise resource allocation: the foundation of productivity.

This is data atomization, which also facilitates product atomization, for information products and non-information products alike. Non-info products can be more profitably built in a personalized and distributed manner (think 3D printing) when resources are optimally allocated. Cost-effective atomization allows for more granular rebundling and integration, enabling greater personalization at scale.

Greater access to data (analytics) and automation are the foundation of optimal resource allocation. Lower transaction costs are the foundation of greater data access. Smart contracts and tokens are the foundation of lower transaction costs. And blockchain is the foundation of smart contracts and tokenization.

Content is data, and the same atomization applies: being able to purchase fractional content, like one song from an album, one chapter from a book, or one scene from a movie. What makes it more economical is automated monetization, possible through fractional tokens and smart contracts, which allow precise, granular, atomized compensation to even the smallest contributor or contribution. This extends the long-tail business model and value proposition.
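What that automated, atomized compensation could look like, sketched in plain Python (hypothetical contributors, shares and prices; a real implementation would be a smart contract splitting token payments on-chain):

```python
def split_micro_payment(price, shares):
    """Distribute one micro-purchase (e.g. a single song or chapter)
    among contributors in proportion to their shares, the way a
    token-aware smart contract could do automatically per sale."""
    total = sum(shares.values())
    return {who: price * s / total for who, s in shares.items()}

# Hypothetical: one $0.99 song split among writer, performer, producer
payout = split_micro_payment(0.99, {"writer": 50, "performer": 40, "producer": 10})
```

Because the split is computed per transaction with no human in the loop, even a contributor owed a fraction of a cent per sale can be compensated profitably — which is exactly what makes the long-tail fractional-content model economical.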

Money is data. DeFi is about atomization and lower transaction costs. Tokens are mixed, matched and bundled together like Lego pieces to provide more personalized offerings to investors and consumers. Investors and consumers are also able to purchase fractional units of these tokens, creating greater liquidity and thus releasing greater value via network effects. DeFi smart contracts can also facilitate niche investment vehicles and bundles of niche vehicles. The smallest contributors and contributions to those vehicles can be profitably rewarded as a result of tokenization and smart contract automation.

There is a rub though. The first iterations of the most popular public blockchains were difficult to scale up because anyone could be a database node. Anyone being a node is a system feature, as it decentralizes data management. Nodes are rewarded for providing this service with cryptocurrency or crypto tokens.

However, the more nodes there are, the more nodes need to agree on the validity of the shared database and any changes made to it. As transactions on the blockchain increase, the process of recording them to the shared database creates bottlenecks, causing demand to outstrip supply and processing fees to rise to the point where the blockchain can become uneconomical for what would otherwise be valid use cases. These I refer to as supply-side transaction costs.

On the other hand, because sophisticated blockchains like Ethereum enable these transactions to be undertaken by the buyer and seller in a hands-off manner through automated computer code, otherwise known as “smart contracts,” transaction costs on the buyer's side are also lowered. These I refer to as demand-side transaction costs.

Transaction Costs Taxonomy

Transaction costs come in many forms, including supply-side and demand-side, internal and external. Which category a given transaction cost fits into is a matter of perspective, however: it depends on which link in the supply chain one is viewing from.

For our purposes, we’ll take on the perspective of the consensus mechanism provider (the section to the extreme right in the table above). The consensus mechanism powers its ecosystem of DApp providers (the middle section in the above table).

From the consensus mechanism’s perspective, only the provider’s own costs (internal) or that of its supply chain (external) are supply-side transaction costs. All other transaction costs are then demand-side transaction costs.

From the perspective of its customers, the DApp ecosystem, supply-side transaction costs are likewise its own costs (internal) and those of its supply chain (external, which includes the consensus mechanism's transaction costs), while demand-side costs are those incurred in selling to the DApp's end users.
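The perspective-dependence of this taxonomy can be made concrete with a small sketch (hypothetical labels for the three links in the chain; the classification rule follows the two paragraphs above):

```python
def classify_cost(cost_owner, viewpoint, supply_chain):
    """Classify a transaction cost relative to a viewpoint:
    internal supply-side if it is the viewpoint's own cost,
    external supply-side if it sits in the viewpoint's supply
    chain, and demand-side otherwise."""
    if cost_owner == viewpoint:
        return "supply-side (internal)"
    if cost_owner in supply_chain.get(viewpoint, []):
        return "supply-side (external)"
    return "demand-side"

# Hypothetical chain: consensus provider -> DApp -> end user.
# Each key maps a link to the links in its supply chain.
chain = {"dapp": ["consensus"], "consensus": []}

# The same cost classifies differently depending on who is looking:
view_of_consensus = classify_cost("dapp", "consensus", chain)   # DApp costs are demand-side to the consensus provider
view_of_dapp = classify_cost("consensus", "dapp", chain)        # consensus costs are supply-side to the DApp
```

The two results differ even though the underlying cost is the same, which is the whole point of the taxonomy: supply-side and demand-side are roles, not fixed properties of a cost.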

Transaction Costs and Automation

It gets a bit confusing when demand-side transaction costs are conflated with DApp end user service prices denominated in DApp tokens (as opposed to consensus computing tokens). DApp service prices are also demand-side transactions, and thus higher prices for them can easily be confused with smart contract transaction costs. The distinction is demand-side service costs versus demand-side smart contract costs.

The DApp service transactions can perhaps be considered an extension of the consensus computing transaction. The consensus operations required to manage the DApp token are the same consensus operations of the smart contract services. Thus, both are concerned with demand-side transactions powered by the supply-side transactions of the consensus computing network.

The new factors of production are reducing transaction costs. An economy based largely on data is much more productive because the marginal cost of producing digital goods is near zero. Traditionally, production relied on people and machines working together. When an increasing share of consumption consists of digital goods, data replaces physical goods in the production process, so the transaction costs of new digital goods are much lower than those of traditional production goods. Digital goods are also virtual, so their replication costs are virtually nothing. Add both together and we have near-zero marginal costs, which shifts the demand curve outward for the new paradigm versus the old.
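The near-zero marginal cost claim can be seen in a back-of-envelope comparison (all cost figures invented for illustration) of how average cost per unit behaves as volume grows:

```python
def average_cost(fixed_cost, marginal_cost, units):
    """Average cost per unit: fixed cost amortized over volume,
    plus the per-unit (marginal) cost of producing one more copy."""
    return fixed_cost / units + marginal_cost

# Hypothetical: same upfront investment, physical vs digital good
physical = average_cost(fixed_cost=100_000, marginal_cost=5.00, units=1_000_000)
digital  = average_cost(fixed_cost=100_000, marginal_cost=0.001, units=1_000_000)
```

At high volume the fixed cost washes out and average cost converges to marginal cost, so the digital good's price floor sits orders of magnitude below the physical good's, which is what shifts the demand curve outward.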

Positive changes to transaction costs and factors of production are thus changing the economics of markets and organizations, making them more productive. And it's these superior economics that act like a magnet for entrepreneurs — the irresistible force sidestepping what would otherwise be an immovable object.

  • Decentralized organization suffers high organization and market transaction costs relative to centralized organization (the cause of centralization); lowering them requires organizational innovation (DLT, or Distributed Ledger Technology).
  • The current innovations in decentralized organizational technologies (DLT) are lowering organization market transaction costs which in turn will soon cause a commoditization in centralized business models including TCP (Traditional Cloud Platforms).

A decentralized architecture for an ecosystem of applications is possible because the software platform enables a token-incentivized network of suppliers providing distributed yet synchronized records management and storage services. Transacting end users make up the market tied to the software platform not through vendor-locked-in user accounts but through the native token used to execute the transactions.

When transacting end users self-organize in pursuit of some mutual interest, the underlying economics of their organizational architecture, called a DAO (Decentralized Autonomous Organization), tend to be superior to those of any entrenched centralized competition pursuing similar goals. Smart contracts automatically tie the distributed players together, enabling low overhead for DAOs and thus superior underlying economics. The lower transaction costs obviate the need for a rigid employee operating model, something naturally less efficient than a network of market-based labour.

DeepDAO has launched a new interface for examining the health and wealth of the top decentralized organizations in crypto.

https://cointelegraph.com/news/the-number-of-active-daos-is-up-660-since-2019

As these DLT innovations lower organization and market transaction costs, there will be diseconomies of scale for centralized organizations and, at the same time, economies of scale for decentralized organizations, whose new architecture lowers transaction costs.

Smart Contracts thus blur the lines between internal (intra-organization) and external (supply chain) transaction costs. Smart contracts lower external transaction costs to such a degree that DAOs will eventually begin disrupting traditional firms with their inherent superior economics.

Though DAOs are disruptive to incumbents due to lower external transaction costs, currently that's only true up to a point. The greater demand for DAOs puts a drain on the supply of smart contracts, which are strained by the bandwidth of the consensus mechanism. From the perspective of the consensus mechanism provider, lower demand-side transaction costs tend to increase supply-side transaction costs.

Because growing demand-side volumes tend to cause rising supply-side transaction costs, the benefits of consensus computing networks tend to be asymptotic (tapers off over time due to diminishing capacity).

Sustaining innovations in consensus operations can increase capacity without overly sacrificing the decentralized nature that underlies the consensus computing network. However, only a disruptive innovation can increase capacity without sacrificing decentralization at all.

Thus, established consensus computing networks like Ethereum need to be wary of new protocols based on quantum networks. That yet-to-arrive protocol is all but inevitable, and so DApp providers and investors are also best served by keeping an eye out.

In terms of step by step progression, this is what I anticipate:

  1. Transaction cost reducing innovations on the supply side open access to additional dimensions of competition on the demand side of DLT platforms in their emerging battle with TCP for market share.
  2. DLT innovations are facilitating deeper penetration of the two existing verticals in which DLT has wedge-like traction, and will provide access to additional verticals for disruption of the TCP incumbents.
  3. Though there are multiple competitive dimensions and verticals in which DLT platforms could scale, these only become available for wedge entry incrementally, simply because the current supply-side innovations are incremental in nature.
  4. Once there is a paradigm shifting innovation (even from a convergence of multiple innovations) on the supply side, a demand side paradigm shift will correspondingly emerge.
  5. The momentum of these greater blockchain transaction volumes powers diseconomies of scale in large non-networked organizations, while also powering economies of scale in networked operations.
  6. TCP firms like Facebook, with its own forthcoming cryptocurrency, will benefit from this transitional phase, as it will be able to deploy a smart contract protocol enabling faster DAO formation by Facebook accounts working together to produce high-margin services for the Facebook community, monetized through Facebook Shops or some Libra/Novi-specific offshoot.
  7. A paradigm of granular collaboration will be made possible by open data platforms. Ethereum is the current dominant platform. It has its own economy powered by its Ether gas token and utility token standard ERC-20. It powers all sorts of dapps for all sorts of industries.

Ethereum and the FAT DApp Layer

With demand-side transaction costs lowering, DAOs have a superior economic competitive advantage as DApp providers. As a result, the application protocol layer within the blockchain technology stack, due to its decentralized nature, will enable the application layer (DApps) rather than overshadow it.

This is in stark contrast to the application protocol layer in the Traditional Cloud Applications (TCA) stack, where the Traditional Cloud Platforms dominate, capturing the bulk of the value, leading to Big Tech (Apple, Microsoft, Google, Amazon, Facebook and Salesforce).

The blockchain Application Protocol Layer (APL) and its DApps are able to financialize and fractionalize existing and new asset classes and create interoperable markets for each of them. This will allow for a much more nimble and varied supply chain of goods and services, powering a new end user economy that enables mass personalization at scale.

DeFi Fattening

The first vertical to demonstrate network effects style scaling is financial services. Here the DAOs deployed atop the Ethereum blockchain are using smart contracts to enable non-custodial marketplaces meaning the buyers and sellers are transacting without giving up control of their money to a third party.

Decentralized finance has taken the crypto world by storm, reshaping the space.

DeFi is an early example of this recomposition model: cryptocurrency trapped in crypto wallets is freed, via wrapped token smart contracts, to contribute as locked atomized value to liquidity pools in decentralized lending exchanges. And the programmability of smart contracts allows for all sorts of variations in the offerings for investors, lenders and borrowers, including mashups of services spanning industries, from optimized insurance rates to interest rates, in a one-stop-shop smart contract ready for execution with zero third-party human intervention. This is the type of zero-marginal-cost economics that can facilitate that elusive mass personalization business model.

The Ethereum DeFi Standard

The difference between Web 2.0 TCP and Ethereum's Web 3.0 DApps is the atomization of value creation and of value itself. This is accomplished through greater divisibility of money (cryptocurrency) and automated contribution tracking (smart contracts). On the Ethereum platform the two are almost one and the same.

Ethereum is an open source computing platform for smart contract based applications and native tokens. Together these facilitate lower demand-side transaction costs, which give DAOs a competitive advantage, ie, superior economics.

Bitcoin does not have such a dynamic because it is not a computing platform; it is secure digital money. Historically, early network effects by one protocol are a reliable indicator of future dominance. With that in mind, Ethereum is well positioned, particularly with Ethereum 2.0 on its way. It would require a new disruptive innovation to unseat it from its current DeFi protocol dominance. This is clearly evident when comparing the Bitcoin-based DeFi ecosystem (or lack thereof) to Ethereum DeFi.

Source: Link

According to Dan Morehead, CEO of Pantera Capital, DeFi is more likely to grow 100x in the next five years than Bitcoin is. Perhaps an early indicator of his prediction: the adjusted weekly value of Ethereum transactions recently surpassed that of Bitcoin for the first time since early 2018. Ethereum-based DeFi has driven increased use of stablecoins, particularly Tether (USDT), helping Ethereum pass Bitcoin as the most used blockchain:

While USDT was first issued on the Bitcoin blockchain, only 13.2% of its supply currently resides on BTC, while the Ethereum chain holds 59.8% of the USDT supply. As most of the USDT balance is held on Ethereum, USDT is also the biggest spender of gas in the network, according to data from ETH Gas Station.

Furthermore, this DeFi-related “flippening” also reflects a growing trend within the Ethereum ecosystem where most of the growth is in trade volumes on decentralized exchanges versus centralized exchanges. For instance, the Uniswap DAO surpassed the market leader Coinbase, the largest US-based centralized cryptocurrency exchange, in trading volume.

Because Ethereum-based DAOs are all software, with much of their operations automated via smart contracts, the marginal cost of personalizing offerings through further atomization, variation and combination is near zero. Also, the Ethereum ecosystem's DAOs and DApps are like Lego pieces that can be combined and recombined in an endless variety of ways to produce increasing value for users. Hence, an economy of mass personalization has been initiated through the near-zero marginal cost of DeFi composability.

As long as each Lego piece has the same interlocking mechanism, any piece can be combined with any other piece to form a whole that is greater than the sum of its parts. The Ethereum protocol serves as the common denominator, allowing each ecosystem to be entirely compatible with one another.
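Composability can be sketched as plain function composition (the building blocks below are hypothetical and the fee/interest numbers invented; real DeFi steps would be on-chain contract calls): as long as every piece shares the same interface, any piece can feed any other.

```python
def wrap(amount):
    """e.g. BTC wrapped into an ERC-20-style token."""
    return {"token": "wrapped", "amount": amount}

def pool(position):
    """Deposit into a liquidity pool for a share (0.1% fee assumed)."""
    return {"token": "pool-share", "amount": position["amount"] * 0.999}

def lend(position):
    """Lend the share out at an assumed 5% return."""
    return {"token": "loan-receipt", "amount": position["amount"] * 1.05}

def compose(*steps):
    """Snap the blocks together like Lego: the output of each step
    is the input of the next, because all share one interface."""
    def pipeline(x):
        for step in steps:
            x = step(x)
        return x
    return pipeline

# One 'personalized' financial product assembled from three generic parts
strategy = compose(wrap, pool, lend)
result = strategy(100.0)
```

Swapping, reordering or adding steps costs nothing but a recomposition, which is the near-zero marginal cost of personalization the paragraph above describes; the shared interface plays the role the Ethereum protocol plays for real DApps.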

Aggregation Layer

With trapped value being freed from inflexible TCP entities, the marketplace becomes more important than the firm in finding buyers for that atomized value. And beyond the marketplace, the buying and selling part, the additional need is for a layer that enables the recomposition of those pieces of atomized value into full personalized realtime services for users.

An aggregation layer above DApps becomes necessary at this point. Aggregation equates to a superior experience due to one-stop-shop convenience, and this convenience feeds into user interface elegance.

From a tech stack perspective, this means AI agent layers that enable hyper-atomized value marketplaces, facilitating the coopetition needed to recompose atomized value into finished, personalized, real-time services. This layer needs to be discrete from the DApp layer. In fact, some DApps could themselves be disrupted by these mashup marketplaces.

Feedback Loops

The currency components, the ETH coin and the ERC-20 token protocol, make Ethereum a double threat; the ecosystem-enabling aspect makes it a triple threat. From this combination, a positive feedback loop is emerging.

DeFi is capable of disrupting the financial services back office, where the settlement issue has its roots. You start with the highest-frequency transactions (those users transact most often): payments, savings, borrowing and lending, investing, and then insurance.

DeFi on Ethereum has the dynamics to gain a foothold in nascent marketplaces within verticals where non-blockchain centralized marketplaces already provide dizzying execution speeds — enough of a foothold that the new entrants have a chance to exploit the centralized incumbents' critical weaknesses: slow settlement and expensive personalized services.

DeFi Ecosystem

DeFi is freeing value locked in non-composable entities and locking it in composable entities that have the flexibility and automation to provide mass personalization at scale.

Source: Youtube.com

MakerDAO was an early piece in the DeFi ecosystem, providing a decentralized lending DAO atop Ethereum's smart contract platform. MakerDAO initially limited collateral to ETH. Compound then came along to expand the range of acceptable collateral to any ERC-20 token, including DAI, ETH, and any currency (like Bitcoin or USD) wrapped in an ERC-20 token, such as WBTC and USDC.

One of the dominant players in the expanding decentralized finance (DeFi) ecosystem today is MakerDAO. With its Maker governance token and its DAI stablecoin, an ERC-20 token pegged to the USD, it facilitates smart contract loans that materially reduce the cost of lending.

MakerDAO has locked in its smart contracts, at any one time, about 10% of ETH's market cap, making it the most successful decentralized app (DApp) to date. Kyber Network is a more recent success story, emerging to fulfill the need for on-chain atomic swaps (seamless online token exchanges), attracting the most users of any ERC-20 DeFi solution as of the end of 2019 (see chart from a Binance report).

With Bitcoin still the best store of value among cryptocurrencies, even in comparison to Ethereum, BTC is assured of long-term relevance. Private tokens can interact directly with BTC, serving as a second layer that accommodates everyday transactions. The WBTC token is an early example of this. Based on Ethereum's ERC-20 token standard, the token is backed by BTC. WBTC (short for Wrapped Bitcoin) uses the Ethereum blockchain to trade BTC by locking the BTC in a virtual vault managed by an Ethereum smart contract with automated if-this-then-that instructions.
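A minimal sketch of the wrap/unwrap invariant, in Python for illustration; it models only the 1:1 backing logic and ignores the custodian and merchant roles of the real WBTC system:

```python
# Sketch of the WBTC wrap/unwrap invariant (1:1 backing). Hypothetical,
# ignoring the custodian/merchant roles of the real WBTC system.
class WrappedBTC:
    def __init__(self):
        self.locked_btc = 0.0    # BTC held in the vault
        self.balances = {}       # ERC-20-style WBTC balances on Ethereum

    def wrap(self, user, btc_amount):
        # if-this-then-that: BTC arrives in the vault -> mint equal WBTC
        self.locked_btc += btc_amount
        self.balances[user] = self.balances.get(user, 0.0) + btc_amount

    def unwrap(self, user, wbtc_amount):
        assert self.balances.get(user, 0.0) >= wbtc_amount
        self.balances[user] -= wbtc_amount   # burn the WBTC...
        self.locked_btc -= wbtc_amount       # ...and release the backing BTC
        return wbtc_amount

vault = WrappedBTC()
vault.wrap("alice", 2.0)
assert vault.locked_btc == vault.balances["alice"]  # always fully backed
vault.unwrap("alice", 0.5)
print(vault.locked_btc, vault.balances["alice"])    # 1.5 1.5
```

The invariant (locked BTC equals outstanding WBTC) is the whole trick: it lets BTC value circulate on Ethereum without the supply of either asset changing.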

Compound further extends WBTC, deepening the integration between Bitcoin and Ethereum through its cWBTC token. Similarly, Ren and Synthetix are other ERC-20 DApps that act as non-custodial virtual vaults, via Ethereum smart contract technology, for BTC (and other cryptocurrencies) with their renBTC and sBTC tokens respectively. Currently only about 0.05% of BTC is on Ethereum, but powerful incentives and automation will see that grow dramatically.

Compound further incentivized borrowers and lenders through creative use of its native ERC-20 token, COMP, rewarding not only lenders but borrowers as well. The Compound DAO's governance token essentially became a utility token. Its intrinsic value put it in demand, allowing Compound to use COMP to subsidize borrowers paying high interest rates (like credit card reward points). That meant more revenue to go around, so lenders could earn not only COMP for lending but higher rates of actual interest. COMP thus generated a feedback loop of supply and demand, i.e., network effects (a.k.a. demand-side economies of scale), resulting in the type of healthy liquidity essential for success in financial market platforms.
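The reward mechanics can be sketched as follows; splitting a fixed emission between the supply side and the borrow side mirrors Compound's published design, but the function names and figures here are purely illustrative:

```python
# Illustrative COMP-style emission: a fixed reward split 50/50 between the
# supply side and the borrow side, pro-rata within each. Numbers hypothetical.
def distribute_rewards(supplied, borrowed, emission):
    rewards = {}
    for side in (supplied, borrowed):
        total = sum(side.values())
        pool = emission / 2                  # half of each period's emission
        for user, amount in side.items():
            rewards[user] = rewards.get(user, 0.0) + pool * amount / total
    return rewards

r = distribute_rewards(
    supplied={"lender_a": 300, "lender_b": 100},
    borrowed={"borrower_c": 200},
    emission=10.0,
)
print(r)  # {'lender_a': 3.75, 'lender_b': 1.25, 'borrower_c': 5.0}
```

Note that the borrower earns rewards too, which is the unusual part: the token subsidy offsets the interest the borrower pays, pulling both sides of the market onto the platform at once.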

Market makers (liquidity providers) ensure healthy price discovery; without them, poor capital allocation would result, upending supply and demand in the marketplace's services. What's driving liquidity growth is high returns and governance/reward tokens (with the right balance of liberal distribution and dilution). High returns are possible because the DApp platform subsidizes borrowers of high-interest loans with reward tokens. These reward tokens remain popular only if the platform can simultaneously grow its user base on both the supply and demand sides. It's a fine balance.

These are early days for DeFi, and DeFi is just the first sector. With these foundational pieces providing relatively scalable returns, the groundwork has been laid for additional financial services, like derivatives, exchanges and insurance providers, to round out the ecosystem.

This mass personalization trend through re-composability and automation will not only further penetrate finance but also spread to many other sectors. This includes non-digital sectors, through non-fungible token technology, another Ethereum smart contract-enabled token protocol.

Because the money-lego aspect of DeFi allows for mashups of all sorts of value assets, smart contract offerings with an almost infinite mix of DApps, cryptocurrencies, non-fungible tokens, and AI are possible. This means an aggregation marketplace layer is entirely necessary to accurately represent the Ethereum tech stack.

Yearn Finance, as a one-stop-shop roboadvisor, can grow past automated rebalancing for optimal yield farming and provide automated personalized advice in directly related or even unrelated areas as it gains scale. It's just a matter of Yearn's AI and algorithms processing additional categories of data to increase the scope of its one-stop shop. And that could be the beginning of a data market on the blockchain for AI consumption.

Source: The Defiant

This intersection of the two largest blockchains (Bitcoin and Ethereum) expands the decentralized finance ecosystem, making the Ethereum protocol, to some extent, a second layer in the Bitcoin ecosystem, possibly displacing the fledgling Lightning Network from that position. The Ethereum-based DeFi network has an important advantage: continued network effects scaling in the token incentive marketplaces of the ERC-20 smart contract platforms.

These incentive schemes, known as yield farming, are possible because of the Ethereum Virtual Machine (EVM), its ERC-20 protocol (enabling native tokens) and its smart contract platform. Combined, these make Ethereum a disruptive technology with the kind of gravitational pull on the ecosystem to lure developers and entrepreneurs away from the Lightning Network. At its current growth rate, ERC-20-based WBTC will surpass the Lightning Network's capacity within months, not years.

Source: Link

As of late August 2020, DeFi has $9.02 billion in "locked-in value". The largest DeFi project is Aave ($1.71B), followed by MakerDAO ($1.43B), Balancer ($1.36B), Curve ($1.26B), Yearn ($953.6M) and Synthetix ($871M).

Source: Link

Adoption Accelerating Strategies

Ethereum is as much a software platform as it is a currency (Ether). Ecosystem participants can build their own currencies or tokens using the ERC-20 protocol. This gives DLT entities an incredible adoption and scaling advantage over their TCP rivals. TCPs could also deploy blockchain tokens (something Facebook is currently building). However, TCPs still lack a networked operations core, which will eventually prove an insurmountable disadvantage.

Platform operators can be described as marketplace developers: a marketplace needs both buyers and sellers in order to be of value to either side. This puts platform operators in a chicken-or-egg dilemma: without buyers you can't attract sellers, and you can't get buyers without sellers on board. So which comes first, the chicken or the egg?

As mentioned earlier, “The Platform Revolution” outlines over a half dozen strategies to overcome this adoption dilemma. Three of those are particularly relevant to DeFi:

  1. The Piggyback Strategy — wrapped token strategy
  2. The Micromarket Strategy — DeFi lego pieces are starting with niche markets
  3. The Seeding Strategy — governance token grants strategy

Let’s take a closer look at these three:

1. Piggyback Strategy: this is where a platform hitches its wagon to a more established platform to ride on that platform's network effects. A classic example is Instagram driving adoption through deeper integration with Facebook, turning Facebook account owners into Instagram users.

In DeFi, several DAOs are filling their liquidity pools by allowing platform users to contribute Bitcoin as collateral, tokenizing it by "wrapping" (storing) it in smart contracts tied to the DeFi platform. In this case, Bitcoin is the established platform and the DeFi DAO is going along for a piggyback ride.

2. Micromarket Strategy: this is also a useful strategy in the DLT world. Like Facebook, which started in universities before opening up to all, DLT platforms can limit usage to a particular segment and then use the ensuing scale to add an adjacent market or vertical.

The Single-side Strategy is also applicable: the platform itself acts as the seller or buyer, eliminating the chicken-or-egg dilemma. The Marquee Strategy was used by many ERC-20 token projects in 2017 to draw attention, with celebrities and influencers hawking coins on social media. Without substance, these coins often failed despite the initial momentum.

3. Seeding Strategy: soon after PayPal launched in the late '90s, it too used the seeding strategy, literally paying buyers to make their payments through the platform. Buyers demanding that sellers accept PayPal drove adoption, especially on eBay.

Similarly, DeFi DAOs are paying lenders to liquidity pools with governance tokens, thus attracting both lenders and borrowers. Governance tokens are being used as a powerful incentive to power adoption of DAO services.

Not all DeFi projects deploying this method will last. Some will, though, and those that stick around will have matched value provision with a tremendous adoption mechanism.

Some DeFi DAOs counterbalance governance token payments by later burning tokens, offsetting the inflationary, token-debasing aspect of seeding. Others do not: a recipe for FOMO (fear-of-missing-out) bubbles.
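The mint-then-burn counterbalance reduces to simple supply accounting, sketched here with hypothetical figures:

```python
# Hypothetical supply accounting: seeding mints (inflationary), burns offset.
class GovernanceToken:
    def __init__(self, initial_supply):
        self.supply = float(initial_supply)

    def seed(self, amount):
        self.supply += amount                    # reward liquidity providers

    def burn(self, amount):
        self.supply -= min(amount, self.supply)  # deflationary offset

tok = GovernanceToken(1_000_000)
tok.seed(50_000)    # 5% dilution from seeding rewards
tok.burn(50_000)    # a later burn restores the original supply
print(tok.supply)   # 1000000.0
```

A project that only ever calls `seed()` dilutes every existing holder, which is exactly the debasement dynamic described above.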

Incidentally, the Federal Reserve's quantitative easing has a similar effect on US financial assets, as investors use the cheap credit to buy premium companies in the stock market, leading to FOMO bubbles, including the one that has been building during this pandemic-caused recession.

Rehypothecation Risk: CDP or CDO

Effective seeding adoption strategies can often mask the underlying risk of the investment asset as FOMO blinds the larger market to the risks. Investors are advised to factor these risks into DeFi related investments.

Rehypothecation, the use of the same collateral multiple times through securitization (CDOs), is a very dangerous activity for the health of financial markets, as the subprime-mortgage-driven Great Recession of the late 2000s demonstrated. Through its variety of ERC-20 tokens, DeFi is delving into the world of collateral securitization and multiple resales for the purpose of creating super-liquid pools of funds (liquidity pools) that provide investors and borrowers with a viable decentralized money market.

One could argue that DeFi's CDPs (collateralized debt positions) have superior price signals to CDOs, helping investors make more prudent decisions, thanks to DLT's inherent transparency. Nevertheless, because the current DeFi ecosystem is almost entirely predicated on the price of Ether (ETH) maintaining or increasing its value, a sudden and prolonged decrease in ETH's value would sink the CDP market, resulting in the collapse of DeFi.
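A toy illustration of why the CDP market hinges on the ETH price; the 150% liquidation ratio used here is illustrative (MakerDAO's actual parameters vary by collateral type):

```python
# Toy CDP health check. The 150% liquidation ratio is illustrative only;
# Maker's actual parameters vary by collateral type.
def cdp_health(collateral_eth, eth_price_usd, debt_dai, liquidation_ratio=1.5):
    ratio = collateral_eth * eth_price_usd / debt_dai
    return ratio, ratio >= liquidation_ratio

print(cdp_health(10, 400, 2000))  # (2.0, True): 200% collateralized, safe
print(cdp_health(10, 250, 2000))  # (1.25, False): ETH drop -> liquidatable
```

The same vault, with no change in debt, flips from safe to liquidatable purely because the collateral's dollar price fell, which is the systemic exposure the paragraph above describes.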

What could cause a prolonged low ETH price?

  • Ethereum blockchain insecurity? Would have to be catastrophic and permanent failure. Unlikely to be permanent.
  • Ethereum 2.0 failure? Would have to be catastrophic and permanent failure. Unlikely to be permanent.
  • Overall world market turmoil? Temporary. We saw a temporary collapse during the March Covid-19 related drop in world financial markets. Central banks stepped in to provide liquidity. Some portion of that liquidity undoubtedly found its way into ETH and ERC-20 tokens as well spurring a mini-bull market over the summer of 2020.

Other risks include poor smart contract design. The DeFi project called Yam was an unfortunate example of this issue. (The trend of naming DeFi projects after foods, by the way, comes from the farming notion in yield farming.) The list below details additional risks.

Source: Link

We are in the wild west of the DeFi era. It's an exciting roller coaster requiring a strong stomach. Without investing prudence, the risk can turn quickly against investors, flipping the overly exposed from happy campers into one-and-done victims. DeFi does not have the safety nets of CeFi (centralized finance, i.e., the incumbent). In the bigger picture, that's a good thing, as DeFi looks to displace CeFi.

CeFi is fully backed by depository insurance and, in particular, the central bank's quantitative easing (QE) powers. The resulting moral hazard (MH) of these federal safety nets is designed to delay the painful consequences. Instead, the MH-QE feedback loop continues to inflate a larger bubble, sustained by the USD's status as the global reserve currency (GRC), which ensures demand for the dollar. The Fed continues to print new money and sell federal bonds to fund the bailouts that the moral hazard makes necessary.

The consequences of these maneuvers cannot be delayed forever. The US will fight to maintain GRC status for the USD, and any contender to the GRC throne, including the dual threat of DeFi and BTC, will eventually become a target of regulatory entities. But by the time hyperinflation, skyrocketing interest rates, and an unmanageable national debt arrive, it will be too late for the Fed to save CeFi from displacement.

This CeFi displacement by a decentralized alternative is just another face of the Fourth Industrial Revolution, which dictates that innovations shift power away from a concentrated center to the diversified edges.

The Hedonic Treadmill

In the book “The Happiness Hypothesis,” Jonathan Haidt concluded that happiness comes from both within and without. The Yoni Linga or Yin Yang philosophy is a useful iconic representation of this type of balance.

The enemy of happiness is the trap of relative social status comparisons, and the internal and external peer pressure or FOMO (fear-of-missing-out) that results. Modern networking technology, specifically social networks like Facebook and Instagram, exacerbate the relative social status problem. The end result is a high degree of populace unhappiness, with most users heavily promoting their best PR faces, generating a heavy dose of envy among consumers of that content.

Happiness is relative, and so is unhappiness. Thus, despite living standards materially increasing decade over decade, recency bias causes our happiness sensor, the mind, to ignore the facts and rely on immediate gratification.

“As nations grow wealthier, they must produce an ever-increasing amount of goods and services to maintain the same degree of satisfaction among citizens.”

~William Bernstein, The Birth of Plenty, p. 333

As a result, the populace continues to be bombarded with status and peer pressure tactics in the form of social media feeds, traditional media content and advertising. This commercial attempt of providing happiness to the populace through external means can be aptly labeled as Consumerism, ie, the consumption of status enhancing material products or services.

In the industrial era these products have been mass produced because the lower marginal costs of high-volume production increase profitability as sales increase. The downside is that if the product doesn't sell as expected, the resulting large inventory takes up costly warehouse space, cutting into profits.

Due to the scale of the operation, mass produced goods are the realm of public companies, which have the means to raise the necessary capital to establish the large scale manufacturing operation. Economies of scale favour mass production in the industrial era.

Publicly listed enterprises are required to report earnings quarterly. Management and sales bonuses are tied to quarterly and annual earnings performance. Wall Street is watching closely in anticipation, making recommendations to investors which will drive the stock price up or down.

Public companies with mass production operations are thus inherently incentivized to influence the populace for short-term gain, moving inventory to secure the quarterly management bonuses. The easiest way to influence the populace is by exploiting its subconscious, which, thanks to recency bias, is riddled with fears of inadequacy rooted in relative social status.

Social media serves as a platform for publicly listed companies to scale consumerism up to another level, contributing to a national crisis of depression. These goods include pharmaceuticals, which further compound the mental health crisis. This inadequacy sentiment is at the heart of the socialist movement in academia, among both faculty and millennial students.

Compounding the problem further, those same companies are incentivized by competition to lower production costs even more. As China's Guangdong manufacturing hub scaled up, US midwest factories began shutting down. These job losses, in the rust belt particularly, made it even more difficult for those families to keep up with the Joneses, especially relative to their social media friends on both coasts: the Wall Street highfliers and political elites in the east, and the Silicon Valley and Kardashian-wannabe families in the west.

The threat of high interest rates makes it impossible for politicians to tackle the high national debt, which ensures cheap credit keeps finding its way into financial assets (public company stocks in particular), making them more inflated, expensive and risky for the average person, who has far more to lose, relative to their ability to recover, when the bubble bursts.

Rising stock prices mean public company management bonuses only get bigger, while the average person's job keeps getting threatened. Fractional reserve banking and money supply expansion policies are contributors to the problem. The envy grievance is a natural reaction to this downward spiral and lies at the heart of the populism movement.

So you have two reactionary movements, socialism and populism, tearing at the fabric of society. Neither solves the problem. Yes, the problem is commercial in nature, but the answer is also commercial in nature, not political. The next phase of the technological evolution of cloud computing, decentralized database infrastructure, will produce just-in-time production of personalized goods at scale.

Economies of scale power mass production, which results in suboptimal products relative to consumer needs at the margins. With on-demand personalized goods, and near-zero ad hoc customization costs to serve each new consumer personally, stimulating demand for suboptimal goods through liquidity will become unnecessary.

Mass inventory will eventually be uneconomical due to poor consumer demand. It will give way to mass personalization. The masses will no longer be attempting to keep up with the Kardashians through inferior mass produced crap — but rather they’ll be collaborating with one another designing and producing their own personalized goods and services, creating both income and lower prices for themselves and one another — the essence of meaningful work. A new path for happiness will be paved leading to more fulfilling social networking.

DeFi vs CBDC

With the era of consumerism ending, the era of manual central banking will, by extension, also end, giving way to automated decentralized banking. In this era, CBDCs (central bank digital currencies) will have a hard time competing against DeFi. In fact, CBDCs may become irrelevant soon after they become prevalent.

DeFi has an edge over CBDCs, as the latter are hindered by the need to protect their private bank partners. The finance DApps of DeFi, on the other hand, will not hesitate to disintermediate private banks in their pursuit of greater market share through enhanced customer experience at lower prices, i.e., mass personalization.

It's still early days, and the total DeFi dollar figures are minuscule in comparison to mainstream finance. DeFi is currently valued at $4.47B, a rounding error next to the traditional stock market alone, which is worth some $75T. Despite this, the savings provided to early adopters are undeniable.

In the decentralized tech stack, DeFi can be viewed as the value layer, and a future rival of the traditional financial services vertical. DeFi’s impressive growth rate and superior economics suggest a battle of a worthy contender in the not too distant future.

Network effects are inherently superlinear (versus linear); Metcalfe's law, for instance, values a network in proportion to the square of its users. And as discussed earlier, DeFi has the inherent traits to accelerate those network effects. The resulting momentum means the point of critical mass, at which a disruptive technology can successfully compete against entrenched incumbents, arrives much earlier than today's marketshare numbers might otherwise indicate.

Fractionalization vs Central Banks (Money Supply Expansion and Fractional Reserve Banking)

  • Money supply expansion is the creation of new units on top of the existing money supply.
  • Fractional reserve banking is the move from a 1:1 ratio of underlying assets to lent funds to a ratio where reserves are only a fraction of the funds lent, putting more money in the hands of borrowers.
  • Fractionalization is the incremental division of the existing economic units that make up the money supply.
  • Fractionalization of an expanded money supply has an exponential impact on credit availability, resulting in the debasement of the currency and its holders' net worth.
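The distinction between expansion and fractionalization can be made concrete with the classic money multiplier (1 divided by the reserve ratio), sketched here with hypothetical figures:

```python
# Money supply expansion vs fractionalization, with hypothetical figures.
def money_multiplier(reserve_ratio):
    """Classic fractional-reserve multiplier: the geometric series of
    re-deposited loans sums to 1 / reserve_ratio."""
    return 1.0 / reserve_ratio

base_money = 1_000.0
print(base_money * money_multiplier(1.0))  # full (1:1) reserves: no expansion
print(base_money * money_multiplier(0.1))  # 10% reserves: 10x the credit

# Fractionalization merely divides existing units more finely; the supply
# itself is unchanged (1 BTC = 100,000,000 satoshi, still 1 BTC of value).
print(1 * 100_000_000)
```

Expansion multiplies claims against the same base; fractionalization only changes the granularity of the units, which is why Bitcoin can stay capped at 21 million coins while still serving arbitrarily small transactions.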

The proliferation of DeFi governance tokens is the equivalent of traditional money supply expansion by central banks. This ability to create more credit requires interoperability in order for the tokens to feel like one economic unit of measure, as the US dollar does. With interoperability in place, the market deciding which tokens to back will expand the money supply more organically and automatically than politically exposed central bankers manually intervening to supply liquidity through their private bank partners.

This feature of interoperable, varied DeFi tokens also mitigates the concern that Bitcoin is a deflationary asset, owing to Bitcoin's limited expansion abilities (notwithstanding Bitcoin's fractionalization capabilities). Regardless of whether capped Bitcoin is viewed as a positive or a negative, automated interoperable DeFi acts as a healthy inflationary complement, thanks to the free market's influence over digital currency supply expansion.

Domestic Currency Debt Denomination

As consumerism declines, the need for central bank intervention also declines, incenting the world to move away from the Fed controlled USD as the global reserve currency, and onto a more neutral currency with an automated and decentralized management ecosystem. The natural candidate is Bitcoin as the new global reserve currency due to its built-in protection against destructive credit expansion. DeFi acts as an automated Money Supply Expansion and Fractional Reserve Banking complement to Bitcoin — thus, together serving as the basis for a new automated and decentralized banking order.

This will also create a much more even playing field for all the nations of the world. The US will lose its asymmetric benefit as issuer of the GRC, which includes the ability to withstand large debt loads much better than the rest of the world. Unlike the rest of the world, US debt is denominated in its own currency, whereas other countries must acquire USD to pay off debt, keeping demand for the USD artificially high despite its debasement through money supply expansion.

Raoul Pal: the USA is 6% of the world's population and 25% of global GDP, yet 79% of global transactions are done in USD (as agreed at Bretton Woods after WWII); denominating high national debt in USD rather than in domestic fiat currency is unsustainable, causing inflation through domestic money printing.

Price Signal

Credit expansion policy of fractional reserve banking distorts the price signal. Inaccurate price signals increase transaction costs. Higher transaction costs suppress innovation. Innovation is the basis of productivity growth.

Lowering transaction costs by ensuring accurate price signals allows the market and its agents to make better buy and sell decisions that induce value, which in turn unleashes productivity growth.

Money Supply Expansion and Military Expansion

The US has benefited from the USD's global reserve currency status, which enabled it to fund a military and economy that defeated communism. But due to changing economics, the US will be forced to withdraw from its expensive role as the world's policeman. Before that happens, the Chinese Communist Party will fall.

The rest of the world will quite naturally coalesce around an independent global reserve currency in order to level the playing field in time for the next phase of the digital information revolution. It’s good for the US that this happens as the USD’s GRC status has made the US economy addicted to financialization as a driver of GDP growth, hastening its wealth gap, and causing a devolution of its society.

Dilution and Tragedy of the Commons — The Solution: AI Smart Contracts

The Tragedy of the Commons is a specific form of scarcity-based negative externality. Lack of ownership or regulation, and of the conservation and stewardship either would provide, causes competing forces to hasten the exhaustion of a resource, despite the mutually assured destruction awaiting those competing forces.

In a digital world, resources are rarely scarce. And so the typical network effect is not asymptotic. However, in the real world, resources do grow scarce over time, and thus negative externalities are quite common within networks. Productivity growth in a scarce resource world requires a prime focus on competition. Whereas in an unlimited resource world, like digital, a prime focus on collaboration (such as platform ecosystems) is more productive.

The vast majority of today’s applications do not work together to provide a seamless wrap around personalized experience for users because there is no easily accessible economic incentive or mechanism to collaborate. Smart contracts and native tokens will change that. Collaboration will thus emerge as a superior form of organizing in business and in politics.

The unintended consequence will be accessible and secure databases. The smart contract automation and more freely available information will allow for ever smaller entities to be economically viable.

In business, the granularization of value will result in mass personalization. In governance, it will help us better manage our physical environments of scarce resources without the need for an increase in proprietary ownership or direct government management of common goods. Cheap smart contracts will do it at a fraction of the cost, and with no conflict of interests, enabling more productive and fair usage.

Thus in the era of the 4IR, averting Tragedy of the Commons scenarios will be commonplace. This includes the tragedy of fiat currency, which is a national public good. Harmful fiat debasement, dilution, and rehypothecation can also be prevented with smart contracts and token incentivization. This will be the next phase of capitalism and politics … and it's arriving much sooner than we might think.

Beyond DeFi DApps

The reduced costs of computing and communications are allowing the best ideas to rise and scale cheaply and effortlessly. In a Web 3.0 world, dramatically reduced timeframes and costs between invention and productivity will become a reality, allowing almost anyone with skills to find demand and anyone with any need to find a supplier.

That certainly pertains to verticals beyond just finance. The blockchain technology stack is in theory applicable to all verticals, and in many cases brings verticals together to facilitate integrated, enveloping personalized services for users. Layers 4 and 5 represent these verticals in the blockchain technology stack.

There are three enablers of Mass Personalization that represent the pre-revolution — the transition zone now coming into view:

Initial Traction

  1. DeFi (as reviewed above)
  2. Media and Gaming (Brave Browser, Steemit, CryptoKitties via NFTs)
  3. Prediction markets (Augur)
  4. Supply Chain Management by disintermediating blockchains (Vechain)
  5. Smart Contract Service Providers (ChainLink)

  6. Physical goods (Tinlake via NFTs)

  1. Decentralized Autonomous Organizations (DAOs)
  2. Smart Contracts on Ethereum enabling Decentralized Autonomous Organizations utilizing the ERC-20 token protocol
  3. Smart Contracts on Facebook’s Libra enabling Decentralized Autonomous Organizations on Facebook selling through Facebook Shops (more on this below)

Convergence of IOT, AI, DLT

  1. DLT mediated Data Markets (Ocean Protocol)
  2. AI Agents and Tokenization together facilitating a microtransactions economy

Ethereum has such a variety of DAOs and DApps that sometimes it's just a matter of building a few additional DApps that together kick-start collaborative and complementary offerings, resulting in the emergence of network-effects-style growth. This model is a precursor to a larger blockchain-based global token economy.

Let’s take a closer look at some of the possibilities below.

AI-IOT-DLT Convergence

The 4IR will eventually trigger a massive complementarity network effect as it facilitates the convergence of three key technologies: AI, IoT and DLT. AI needs information, IoT generates information, and DLT provides tradeoff-transcending access to information (the tradeoff between security and transparency).

At scale, this convergence will process incomprehensible amounts of data and produce unfathomable end user value. Once we are there, the leap in productivity will counterbalance and even reverse much of the negative impact of the unintended consequences of the last couple industrial revolutions.

High transaction-volume capacity is absolutely necessary for blockchain-mediated data exchanges. Blockchain still requires further computer science breakthroughs (e.g., sharding) to scale to the necessary transaction volumes and velocity. Tools for sharing information without violating privacy are also emerging; blockchain and homomorphic encryption combined become quite relevant here.
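To illustrate the idea of computing on data without decrypting it, here is a toy (and deliberately insecure) demonstration of multiplicative homomorphism using textbook RSA with tiny parameters; real privacy-preserving data markets would use proper schemes such as Paillier or lattice-based fully homomorphic encryption:

```python
# Toy (insecure!) demo of multiplicative homomorphism via textbook RSA.
# p=61, q=53 -> n=3233; e=17, d=413 satisfy e*d = 1 mod lcm(60, 52) = 780.
n, e, d = 3233, 17, 413

def enc(m):
    return pow(m, e, n)   # encrypt: m^e mod n

def dec(c):
    return pow(c, d, n)   # decrypt: c^d mod n

a, b = 6, 7
c = (enc(a) * enc(b)) % n  # multiply the CIPHERTEXTS only...
print(dec(c))              # 42: equals a*b, computed without seeing a or b
```

The party doing the multiplication never sees `a` or `b` in the clear, which is the property that would let an AI agent run computations over data it is not authorized to read directly.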

At this future point, Web 2.0 platforms like Salesforce, Amazon and Facebook will become quite vulnerable and quite quickly, despite their current state of near complete market domination. They will no longer own the key factor of production: data. Instead an open standard will mediate a universal data market.

Universal Data Market

IoT is essentially a very large and rapidly growing ocean of data, and a growing consumer of that data will be AI. Smart contract-enabled agents could acquire data directly from centralized aggregators. However, once data generators begin to realize the value of their data, decentralized data markets will become a lot more attractive due to their increased efficiencies, provisioning more attractive profit margins.

A blockchain project specifically designed to enable a universal data market is Ocean Protocol:

“Ocean Protocol is an L2 solution. It rides on top of Ethereum and other chains. Think of it as an operating system on top of L1 chains, which is geared towards universal decentralized access control. Developers and companies can build their marketplaces and services on top of Ocean. These services can be B2C, B2B, or C2C — whatever tool or application that helps people to share data securely.”

~Bruce Pon, Founder, Ocean Protocol, Aug 13, 2020 AMA Q&A

Ocean Protocol wants to facilitate this data market by modelling itself after the token economy:

Source: Ocean Protocol Blog

Let’s go through the layers one by one:

Base layer (bottom row):

  1. a reserve currency and store of value
  2. a data/asset platform and unit of exchange
  3. a data/asset funding platform

“Utility last mile” (middle row) layer:

  1. data science
  2. AI applications (which consume the data)

Economic last mile (top row):

  1. data custody and data management
  2. data marketplaces and data DeFi apps
  3. storage / compute service networks

Data tokens (the feedback loop) flow through Ocean’s envisioned data economy.

Ocean Protocol wants to serve as the base-layer substrate for this data economy. Its architecture includes the following sublayers:

  1. Access controls to data, and data services (read, write, or perform execution on the data)
  2. Metadata (discoverability via browsing, searching, and filtering)
  3. Data apps (data marketplaces, data commons, and AI / data science tools)

Ocean will incentivize data markets through data tokens for the access controls. A crypto wallet holds the access rights to data similar to how non-fungible tokens (NFTs) are held in wallets, representing the rights to real world assets. These access rights can then be transferred to others just like cryptocurrency and tokens are exchanged between wallets.
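The transfer mechanics described above can be sketched as a minimal token ledger. This is a hypothetical illustration of the concept (wallet names and dataset IDs are invented), not Ocean Protocol's actual contract code:

```python
# Minimal sketch of wallet-held, transferable data-access tokens.
# Hypothetical illustration only -- not Ocean Protocol's actual contracts.

class AccessTokenLedger:
    def __init__(self):
        # balances[(wallet, dataset_id)] -> number of access tokens held
        self.balances = {}

    def mint(self, wallet, dataset_id, amount):
        key = (wallet, dataset_id)
        self.balances[key] = self.balances.get(key, 0) + amount

    def transfer(self, sender, receiver, dataset_id, amount):
        # Access rights move between wallets just like any other token.
        src = (sender, dataset_id)
        if self.balances.get(src, 0) < amount:
            raise ValueError("insufficient access tokens")
        self.balances[src] -= amount
        dst = (receiver, dataset_id)
        self.balances[dst] = self.balances.get(dst, 0) + amount

    def can_access(self, wallet, dataset_id):
        # Holding at least one token grants access to the dataset.
        return self.balances.get((wallet, dataset_id), 0) > 0

ledger = AccessTokenLedger()
ledger.mint("alice", "weather-data", 2)
ledger.transfer("alice", "bob", "weather-data", 1)
```

The point of the sketch is that access rights behave exactly like balances: once minted, they can circulate between wallets with no custodian in the middle.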

Ocean is designed to alleviate valid concerns over data security and privacy, which have so far hindered non-blockchain attempts at establishing liquid data markets. Ocean serves as a non-custodial marketplace, allowing for peer-to-peer exchange of data rights in the form of tokens. Privacy is ensured since access is limited to only authorized AI agents.

Source: Ocean Protocol Blog

Ocean also integrates with DeFi players like Uniswap and Balancer to facilitate automation in data exchange, thus making low-demand data profitable. Bundling disparate sources (to increase scope) through data-set tokenization automates optimal price setting and the choice of selling mode (buy or rent), increasing the value of data that may otherwise be uneconomical.
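Uniswap-style automated market makers price assets with a constant-product rule (x · y = k). A simplified sketch of how a pooled data token could be priced automatically (reserve sizes are invented, and the fee that real pools charge is omitted):

```python
# Simplified constant-product AMM (x * y = k), the pricing rule popularized
# by Uniswap. Illustrative sketch only; real pools also charge a swap fee.

class ConstantProductPool:
    def __init__(self, datatoken_reserve, quote_reserve):
        self.x = datatoken_reserve   # data tokens in the pool
        self.y = quote_reserve       # quote tokens (e.g. OCEAN or DAI)

    def spot_price(self):
        # Marginal price of one data token in quote tokens.
        return self.y / self.x

    def buy_datatokens(self, quote_in):
        # Swap quote tokens for data tokens, keeping x * y constant.
        k = self.x * self.y
        new_y = self.y + quote_in
        new_x = k / new_y
        out = self.x - new_x
        self.x, self.y = new_x, new_y
        return out

pool = ConstantProductPool(datatoken_reserve=100.0, quote_reserve=100.0)
bought = pool.buy_datatokens(10.0)   # price rises as the pool is drained
```

Because price emerges from the reserve ratio, nobody has to set it by hand: the "optimal price setting" described above is automated by arbitrage against the pool.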

Chainlink is a decentralized data provision service that fulfills the oracle component (a quasi third-party real-time "news feed" serving as the source of truth) for smart contract settlement. Its LINK token is an ERC-20 token with ERC-223 transfer-and-call functionality. LINK is relevant not only to DeFi as a blockchain-mediated currency market enabler, but also to a blockchain-mediated data market. Data is, after all, the newest factor of production. Data and currency are intertwined because both are required for effective price signals, which are required for capital allocation, which is essential to productivity.

Chainlink essentially allows data feeds to monetize their contributions to blockchains through the LINK token. Off-chain nodes get paid in LINK tokens. These nodes allow for integration with external adapters, which perform subtasks on external nodes, making data collection more efficient.

On-chain, oracles process user requests, routing them to the appropriate smart contract and matching them with the correct off-chain data. There are three contract types for this matching:

  1. Aggregating Contracts — gathers data from oracles
  2. Order matching Contracts — matches the requestor to the corresponding oracle based on the data request relevance
  3. Reputation Contracts — evaluates oracle reputability
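The three roles above can be sketched as plain functions. This is a toy illustration of the matching logic (oracle names, feeds and reputation scores are all invented), not Chainlink's actual contract code:

```python
# Toy sketch of the three matching roles described above (reputation,
# order matching, aggregation). Hypothetical; not Chainlink's actual code.

from statistics import median

ORACLES = {
    "oracle-a": {"feeds": {"ETH/USD"}, "reputation": 0.95},
    "oracle-b": {"feeds": {"ETH/USD"}, "reputation": 0.90},
    "oracle-c": {"feeds": {"BTC/USD"}, "reputation": 0.99},
}

def reputation_ok(name, minimum=0.8):
    # Reputation contract: filter out unreliable oracles.
    return ORACLES[name]["reputation"] >= minimum

def match_order(feed):
    # Order-matching contract: pick oracles relevant to the request.
    return [n for n, o in ORACLES.items()
            if feed in o["feeds"] and reputation_ok(n)]

def aggregate(responses):
    # Aggregating contract: combine answers (median resists outliers).
    return median(responses)

matched = match_order("ETH/USD")
answer = aggregate([1850.0, 1852.0, 1851.0])
```

Using a median rather than a mean in the aggregation step is a common design choice because a single dishonest oracle reporting an extreme value cannot drag the final answer far from the truth.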

LINK, by then the established data marketplace token, will be a natural fit to help monetize data provisioning by distributed data aggregators. Smart contracts will power smart agents, which will serve other automated AI (powering IOT and DAOs) and people. All of these players on the blockchain will also need data. Chainlink's sophisticated off-chain/on-chain data exchange protocol will have the critical scope to fulfill this demand.

Universal Media Market

There will be mass personalization in media, with multiple media platforms being leveraged by subgroup-forming networks. Content from Youtube, Reddit and Medium will all be mixed, matched and integrated in real time to produce superior personalized on-demand media for both entertainment and education purposes. Collaborating contributors (including people and smart agents) will be compensated in smart tokens, with revenue shares distributed according to the predefined rules in smart contracts. This will obviate the need for centralized platforms to host and organize these ad hoc subnetworks.

The increase in automated smart contract transactions using smart tokens by automated smart agents will enable a release of trapped value in media comparable to the revolution initiated in finance by DeFi. The complementarity network effects from all of the granular supply finding granular demand will produce productivity gains from which a new reality for people will emerge — including a new virtual reality.

Early versions of such media ecosystems include Steemit, among others. Critical mass for this model will have to wait until transaction costs fall enough to negatively impact the hold TCP social media platforms like Youtube, Facebook and Twitter currently have on API protocols via their proprietary standards.

An Ethereum-based supply chain of media products, for instance, monetized and atomized using the ERC-20 protocol, can produce an automated collaboration and integration marketplace for content. The result will be personalized media offerings thanks to near-zero marginal costs for recombination. This marketplace will find early adoption when ecosystem participants like new media DAOs develop clever ways to incentivize contributors through the use of governance tokens turned utility tokens, much like Compound Finance in DeFi. The Brave Browser and its BAT token are just the tip of the iceberg. Facebook-based token projects could be a competing ecosystem powered by future Libra smart contract infrastructure. Much more on Facebook, Twitter and their blockchain plans below.

Physical Goods

Supply chain distribution costs are decreasing due to 3D printing technology. Many products and parts will no longer need to be shipped across the world. Instead they can be printed locally, removing the costs of customized production and distribution while increasing speed to market. Add to that smart contract and smart token technology (which produce lower transaction costs, powerful scaling-incentivization capabilities, automated processes and the disintermediation of costly middlemen) and you can see the foundations for mass personalization in the physical world also being laid.

In the B2C space, ad hoc collaboration is much easier in the digital realm than the physical because of the lower reproduction and distribution costs of digital products and services. Distributed production via 3D printers allows physical goods, including finished products and parts, to be made locally. It is a key piece of technology in the new economy of Manufacturing 2.0, where mass personalization business models are beginning to reveal themselves.

Then there is nano printing, or being able to print customized miniature products or components on demand in an additive manufacturing economy powered by smart agents. This brings the type of liquidity to markets of material things that we currently find only in the markets of financial assets.

Distributed production technologies will also negate the need for large factories which also reduces the need for large capital projects. Additionally, as we find ways to leverage distributed energy (individuals storing energy from solar in batteries on their own for current or later use) vs relying on centrally aggregated energy via large utility companies, the need for large capital projects will be mitigated.

Small is a superior form factor because it can behave in both big and small ways through modularization. Small modular robots working together can use this advantage to further increase productivity in chaotic and complex environments due to their inherent antifragility, in the pursuit of mass personalization in physical goods. For instance, MAVs (micro aerial vehicles) do this in agriculture to manage crops. Some MAVs specialize in certain tasks. When these differentiations are taken together, a complementarity network effect can result, similar to digital network effects or that of sports teams or music bands. When small agents have the ability to work together without the overhead of central control, their greater numbers can act as a single large agent to capitalize on the unique aspects of larger scales.

Sensor-generated data from physical assets such as delivery vehicles, containers, and warehouses is producing rich operational data from across the logistics value chain. Having that voluminous data (IoT) on the blockchain brings increased transparency without any sacrifice in security, an otherwise common tradeoff in centralized-database supply chains. This paradigm shift disintermediates certain irrelevant data-hoarding players, while also holding the rest of the supply chain members to a new, higher standard, with the end result being lower costs.

Legacy corporations tend to prefer private blockchains due to the need for speed and data security. Compare VeChain (a public blockchain) with TradeLens (a private blockchain). VeChain has two associated cryptocurrencies (for governance/rewards and for pegged fiat digitization) bundled with its platform. TradeLens (a Maersk and IBM initiative) does not have a cryptocurrency associated with its platform. The superior economics of cryptocurrency over fiat favour public IoT blockchains over private IoT blockchains in the long run, once Layer 2 solutions increase consensus-based token transaction processing speeds. VeChain in particular could see big growth with its ToolChain solution and dual tokens.

Many of these companies within the supply chain have massive fixed assets like ships, planes, trains, trucks, power generation plants, warehouses, global offices, and land, all of which will need to be repurposed to some extent due to 3D printing. TCPs have done a nice job in the B2C space of de-linking ownership of the physical asset from the value it creates, with the likes of AirBnB and Uber. However, doing so in the corporate space is a little trickier due to the greater risk to security relative to the low monetization upside.

On the blockchain, these corporations with massive fixed assets can unlock trapped value by not having their data stored in centralized servers that rely on the security practices of third parties. Additionally, the cryptocurrency monetization process is more economical due to the automation provisioned by smart contracts and to greater fractional units of value. The former lowers the cost of sales, making smaller transactions more profitable.

Non-fungible tokens (NFTs) are unique digital certificates that can represent physical goods in the digital realm. NFT technology first became popular through digital collectibles like CryptoKitties. NFTs are different from mass-produced assets like money, which is fungible, i.e., perfectly interchangeable. They achieve this uniqueness through three inherent properties:

  1. cannot be replicated
  2. cannot be counterfeited
  3. cannot be inflated (reproduced on demand)

The Ethereum blockchain has a dedicated protocol for NFTs. Where ERC-20 is for fungible tokens like Maker, COMP, and BAT, the ERC-721 protocol is for non-fungible tokens. And then there is the ERC-1155 standard, which covers both fungible and non-fungible tokens and is the basis of new NFT projects that cross over with DeFi.
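The fungible/non-fungible distinction can be sketched in a few lines: fungible tokens are just balances, while each NFT is a unique ID with its own owner. A minimal illustration in the spirit of ERC-20 and ERC-721 (names and token IDs invented; not the actual Solidity standards):

```python
# Sketch of the distinction: fungible balances (ERC-20-style) versus
# unique token IDs with individual owners (ERC-721-style). Illustrative only.

class FungibleToken:
    """Interchangeable units: only the balance matters."""
    def __init__(self):
        self.balances = {}
    def mint(self, owner, amount):
        self.balances[owner] = self.balances.get(owner, 0) + amount
    def transfer(self, sender, receiver, amount):
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount

class NonFungibleToken:
    """Each token ID is unique and owned individually."""
    def __init__(self):
        self.owner_of = {}
    def mint(self, owner, token_id):
        if token_id in self.owner_of:
            raise ValueError("token already exists")  # cannot be replicated
        self.owner_of[token_id] = owner
    def transfer(self, sender, receiver, token_id):
        if self.owner_of.get(token_id) != sender:
            raise ValueError("not the owner")
        self.owner_of[token_id] = receiver

ft = FungibleToken()
ft.mint("alice", 10)
ft.transfer("alice", "bob", 4)      # any 4 units are as good as any other

nft = NonFungibleToken()
nft.mint("alice", "kitty-1")
nft.transfer("alice", "bob", "kitty-1")  # this specific token changes hands
```

Note how the NFT ledger refuses to mint the same ID twice: that single check is what gives NFTs their cannot-be-replicated, cannot-be-inflated character.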

Tinlake offers smart contracts representing non-fungible assets and a native fungible token (TIN) to serve as a marketplace, enabling borrowing against tangible assets by pooling them together and offering them to lenders (or investors). Tinlake is built on the Centrifuge Protocol.

Tinlake is bringing real world assets to the DeFi space. MakerDAO’s DAI stable coin is what borrowers get when they put up their real world assets as virtualized collateral via NFT smart contracts. Lenders or investors receive fungible TIN tokens as yield or interest for lending their stable coins.

The collateralized asset pools can be ring fenced to represent personalized (by way of individualized interest rates and individualized collateralization rates) fractional proceeds of the asset pool via the Tinlake open source smart contract and the TIN fungible native token. Not only can TIN be transferred to the investors to draw the DAI stable coin funding, but TIN can also be locked into other crypto protocol asset pools to draw funding from their lenders/investors (money Lego).
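The pool mechanics above can be reduced to two pieces of arithmetic: how much stable coin a pool of appraised assets can back at a given collateralization ratio, and what yield lenders earn on the drawn funds. A simplified sketch (the asset values, 1.25 ratio and 8% rate are invented for illustration, not Tinlake's actual parameters):

```python
# Simplified collateralized-pool math: NFTs of appraised real-world assets
# back a stable-coin loan at a collateralization ratio. Illustrative only;
# the ratio, rate and asset values are hypothetical, not Tinlake's.

def max_borrow(asset_values, collateralization_ratio):
    # E.g. a ratio of 1.25 means $125 of collateral per $100 borrowed.
    pool_value = sum(asset_values)
    return pool_value / collateralization_ratio

def interest_due(principal, annual_rate, days):
    # Simple (non-compounding) interest, for the illustration.
    return principal * annual_rate * days / 365

invoices = [40_000, 25_000, 35_000]        # appraised values of pooled assets
loan = max_borrow(invoices, 1.25)          # DAI the borrower could draw
yield_owed = interest_due(loan, 0.08, 90)  # lender yield after 90 days
```

Ring-fencing a pool then just means applying a different (personalized) ratio and rate to a different subset of the assets.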

Tinlake is moving to a decentralized exchange architecture (DEX), to become fully interoperable with other blockchains and stable coins similar to DeFi player Kava. As a result, NFTs representing real world assets can serve as collateral in DeFi.

Also, NFT’s are a path to more flexible digital property rights by virtualizing, atomizing and personalizing ownership of real world assets. This increases buyer-seller marketplace liquidity, creating even more powerful network effects further enabling the personalization economy.

Corporations with large fixed assets can use NFTs to collateralize their loans, helping further reduce borrowing costs. Additionally, NFTs can represent components within large assets, allowing owners to be more precise in their collateralization and borrowing. Thus, blockchain is further helping facilitate the granularization of trapped value, a prerequisite to mass personalization.

When technologies like 3D printing, MAVs, nanobots, IoT, smart supply chains, smart energy, smart software agents, smart contracts, smart currencies, and NFTs are taken as a whole, one can begin to see how a world of mass personalization in the physical realm will take shape.

Interoperability Solutions

Polkadot is a very compelling and ambitious blockchain founded by Ethereum co-founder, Dr. Gavin Wood. Polkadot uses what it calls “parachains”, a sharded blockchain model conceptually similar to the design of Ethereum 2.0. Where they differ is that the Ethereum 2.0 shards are uniform by design, while the Polkadot shards can be specialized for different use cases, giving developers on Polkadot more flexibility. These parachains operate as shards of Polkadot’s relay chain (the equivalent to Ethereum 2.0’s beacon chain).

Source: Polkadot Wiki

From a value proposition perspective, sharding provides similar benefits to the modular design of microservices whereby the shards are separate yet also part of the greater whole. This network of chains model is superior to that of non-sharded monolith blockchains, which have significantly less processing power. Sharded blockchains like Polkadot can process many transactions on multiple parallel chains, eliminating the bottlenecks of Ethereum and other blockchains that process transactions one at a time.
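The throughput argument above is a matter of simple arithmetic: parallel shards multiply capacity, minus some coordination overhead. A back-of-envelope sketch (the per-shard rate, shard count and overhead factor are invented for illustration, not benchmarks of Ethereum or Polkadot):

```python
# Back-of-envelope sketch: a monolithic chain processes transactions
# serially, while a sharded chain processes shards in parallel.
# All numbers are illustrative assumptions, not measured benchmarks.

def monolith_throughput(tx_per_sec):
    return tx_per_sec

def sharded_throughput(tx_per_sec_per_shard, num_shards, overhead=0.9):
    # Cross-shard coordination costs some efficiency (overhead < 1).
    return tx_per_sec_per_shard * num_shards * overhead

base = monolith_throughput(15)        # single serial chain
sharded = sharded_throughput(15, 64)  # 64 parallel shards
speedup = sharded / base
```

Even with a coordination penalty, the parallel design yields an order-of-magnitude-plus gain, which is the whole case for sharding over monolith blockchains.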

Source: Polkadot Blog

This parachain design of Polkadot also makes it a compelling Layer 2 solution for Ethereum and other blockchains, with inherent interoperability features, potentially connecting otherwise incompatible blockchains, including those that are private.

Like Ethereum, Polkadot is also programmable, giving DApp developers the tools to build their own smart contracts. Its open-source framework, Substrate, enables developers to efficiently build entire, configurable blockchains and applications.

Its DOT token, the equivalent of Ethereum's Ether, is used by the Polkadot network for its governance and consensus mechanisms. In 2021, Polkadot plans to release a token minting system similar to Ethereum's ERC-20 standard, called "Polimec".

Despite Polkadot's apparent technological superiority, its challenge in fulfilling its potential as an Ethereum-killer will be Ethereum's head start and current dominance in DeFi. Hence, Polkadot's initial value proposition may be in complementing Ethereum as a scaling layer and, for Ethereum 2.0, as an interoperability bridge.

With regard to inter-ecosystem or inter-platform collaboration, those centralized platforms that don't develop a token create an opportunity for a third-party token designed specifically for interoperability and for bundling services into one seamless mass personalization experience.

Blockchain interoperability solutions go beyond Polkadot of course:

“…within the cross-ledger interoperability of Ripple Labs Interledger project, Cosmos’ “Internet of Blockchains,” or Polkadot’s “Parachain,” solutions are emerging that drive the process away from the “maximalist” notion that all economic activity must gravitate to a dominant blockchain.”

~Alex Tapscott, Financial Services Revolution

In the DeFi space specifically, Kyber Network is seeing early adoption as an interoperability leader in tokens, facilitating exchanges between token holders of all varieties. Because Kyber does not take custody of the tokens during the exchange, it is fundamentally different from more established cryptocurrency exchange platforms like Kraken, Coinbase or Binance, to name a few. Though Kyber is based on the Ethereum ERC-20 token standard, the Ether token itself is not always the ideal intermediary facilitating the exchange. Instead, Kyber facilitates transactions in a peer-to-peer fashion between the token holders.

As discussed earlier, legacy corporations tend to prefer private blockchains due to the need for speed and the perceived need for data control. Logistics giant Maersk, for instance, selected IBM's Hyperledger-based platform for its blockchain development.

JP Morgan Chase's Quorum blockchain platform had also been gaining traction within the financial industry until it was recently sold to ConsenSys, a leader in Ethereum private blockchain development and deployment. JPM took a stake in ConsenSys while contributing the Quorum IP as part of the deal. What the two have in common is utilization of the Ethereum tech stack, specifically the variant designed for the private blockchain market.

ConsenSys built Hyperledger Besu to provision compatibility between Hyperledger and Ethereum. Now it can do the same with Quorum, by building compatibility with Hyperledger via consensus mechanisms, API interfaces and privacy tools. The ConsenSys’ enterprise Ethereum stack offering, from now on to be referred to as ConsenSys Quorum, will come in two varieties: the Go Quorum–based version and the Hyperledger Besu–based version.

“It answers concerns about commercial support for enterprises [that] have deployed Quorum and, more importantly, it helps efforts to bring greater interoperability and code re-use between Quorum and Hyperledger Besu.” … “This is textbook open-source ‘co-opetition’ at its finest, where competitors can realize they’re actually stronger working together than trying to divide a market.” … “We look forward to helping ConsenSys and JPMorgan (who are both Hyperledger Premier Members) and other ecosystem members drive adoption.”

~Brian Behlendorf, executive director of Hyperledger via CoinDesk

This gives Quorum a more neutral standing which should help its adoption rate among JPM competitors, and it puts the onus on ConsenSys to proliferate Quorum. ConsenSys has now become the leader in private blockchain interoperability while maintaining its fundamental link to the most popular Turing complete public blockchain, Ethereum.

This is important as the economics around public Ethereum, once Ethereum 2.0 and Layer 2 scaling come online, will become so vastly superior to private blockchains, that ConsenSys can play a critical role in converting and bridging Ethereum private blockchains to the Ethereum public blockchain.

Bandwidth for Scale: Permissionless, Lower Supply-side Transaction Costs

Ethereum's challenges with transaction bandwidth are the result of its success, ie, rapid growth. Ethereum's solution for bandwidth is Second Layer solutions plus a redesign of its nodal database management and consensus models, together referred to as Ethereum 2.0. Ethereum's share of the decentralized computing platform market is already so advanced that other Ethereum-like platforms won't have any killer competitive advantages once Ethereum 2.0, which will dramatically expand transaction processing bandwidth, is successfully launched.

Today's inflexible supply chain economy is dominated by centralized TCPs that are unable and unwilling to fractionalize and financialize (as it would cannibalize their current TCP cash cow). As a result, a tremendous amount of potential value is trapped in rigid business models incented to hoard data. A mass personalization economy will free this trapped value, resulting in a productivity leap so impressive that we'll look back at this era as the dawn of the Third Industrial Revolution (3IR).

The Tale of the Tape

It is still early days, and further innovations are required in DLT before full disruptions of TCP are possible. Regardless, a classic disruption theory trajectory is emerging. A central tenet of disruption theory is the notion of a ‘good-enough’ value proposition (ie, compelling value-to-low price ratio) for a job-to-be-done. You can also think of it as a “thin edge of the wedge” or “tip of the spear” competitive strategy.

One common strategy for establishing this initial relationship is what is sometimes known as the “thin edge of the wedge” strategy (aka the “tip of the spear” strategy). This strategy is analogous to the bowling pin strategy: both are about attacking a smaller problem first and then expanding out. The difference is that the wedge strategy is about product tactics while the bowling pin strategy is about marketing tactics. (Link)

Classic disruption theory is only possible if the disrupting innovation has inherently superior economics facilitating a lower price relative to the incumbent, and not just the predatory pricing strategy of a conventional competitor or a sustaining innovation.

Asymptotic Network Effects and Bottlenecks

More users of blockchain platforms that facilitate marketplaces produce greater benefits for all existing users of those marketplaces, since supply is able to find demand more efficiently, and demand more efficiently finds supply. This is known as a network effect.

Currently, these network effects are asymptotic, in that the benefits diminish over time as greater transaction volumes between the supply and the demand agents produce processing bottlenecks.

It's somewhat akin to the freeway-lanes dilemma. Add more lanes and the current congestion clears, but the added capacity increases the popularity of moving to the suburbs, causing a new wave of traffic congestion.
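The asymptotic effect can be captured in a toy model: per-user benefit grows with the network, while congestion cost grows much faster once volume exceeds processing capacity. The functional forms and numbers below are purely illustrative assumptions:

```python
# Toy model of asymptotic network effects: value grows with users,
# but congestion costs rise once volume exceeds processing capacity.
# The functional forms and parameters are illustrative assumptions.

import math

def network_benefit(users):
    # Diminishing-returns variant of Metcalfe-style value growth.
    return users * math.log(1 + users)

def congestion_cost(users, capacity, fee_slope=0.5):
    # Below capacity, no congestion; above it, costs rise quadratically
    # (rising consensus fees, delayed confirmations).
    excess = max(0, users - capacity)
    return fee_slope * excess ** 2

def net_value(users, capacity):
    return network_benefit(users) - congestion_cost(users, capacity)

# Net value rises with adoption until congestion overwhelms the
# network effect, mirroring the freeway-lanes dilemma above.
values = [net_value(u, capacity=1000) for u in (100, 500, 1000, 1100, 1500)]
```

The crossover point is exactly where raising supply-side capacity (more lanes, more transaction bandwidth) restarts the growth curve, until the next wave of congestion.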

In the case of blockchain networks, the processing bottlenecks are more easily reached than in TCPs due to the additional infrastructural burden of the decentralized network nodes needing to reach consensus before recording a transaction to the blockchain.

It is foolish then for blockchain based applications to compete with traditional applications on the dimension of speed. A certain good-enough level of speed is necessary, however, blockchain must be able to deliver a big benefit in some other dimension of value.

Second Dimension of Competition: Anonymity

DLT inherently has superior economics to TCP in one competitive dimension, namely security, or more specifically anonymity. DLT is adding innovations that broaden the scope of this initial economic advantage, opening access to further dimensions of competition.

Without further innovation then, DLT platform operators are faced with a dilemma: increase service fees and lose superior economics relative to TCP, or maintain transaction fees and limit the disruptive scope of DLT due to bottlenecks.

Inverse Correlation: Supply side and Demand side Transaction Costs

Fortunately, DLT supply-side innovations are being developed. As DLT platform operators implement these sustaining (incremental) innovations, transaction costs both decrease and increase:

  • supply side transaction (consensus node processing) costs increase, while
  • demand side transaction (non-node processing) costs decrease.

Reduced transaction costs on the demand side (users able to transact more cheaply) generate more transaction volume on the demand side for DLT platform operators, which naturally increases transaction volume on the supply side (blockchain network node suppliers), eventually leading to increased supply-side transaction costs in order to free the network of traffic jams.

Thus, transaction costs on both the supply and demand sides are important to understand, since they determine in which dimensions of competition, and in which verticals, business models and solutions achieve scale relative to the entrenched competition.

The Red Pill Scale: From Blue Pill to Purple Pill to Red Pill

The dilemma for blockchain platforms, then, is how much of their decentralized nature to sacrifice toward a hybridized decentralized-centralized organizational structure in order to achieve greater transaction speeds. That's the Purple Pill option: reduce transaction costs on both the supply side and demand side of the equation until a true Red Pill option becomes available.

Blockchain technology's evolution is driven by solving its inherent trilemma, as Vitalik Buterin put it: you can have at most two of the three dimensions in demand, being Good, Fast and Cheap. There are over 7K full nodes across the Ethereum network, which need to reach consensus before transactions can be immutably written to the blockchain.

So the cheap, fast and good trilemma is really one of entrenched fast & centralized vs disruptive personalization & decentralized. Cheap and good are relative to what users value. Really fast at scale (only possible with a centralized architecture) doesn't produce the best experience, but it does produce good-enough experiences cheaply. Personalization at scale (only possible with decentralization at scale) produces great experiences cheaply. Decentralization at scale, though, requires new innovations that reduce transaction processing costs.

The initial superior competitive dimension of DLT, anonymity of transacting agents, is possible due to the inherent core competency of blockchain, transcending the tradeoff between transparency and security. This means the technology facilitates enough transparency whereby the identity of the transacting parties need not be revealed. That’s a big security benefit for those who prefer or require anonymity.

Third Dimension of Competition: Disintermediation

The benefit of anonymity, even after the drawback of processing speed limitations was accounted for, created momentum for blockchain that eventually revealed a secondary benefit in the battle for marketshare with TCP, namely, disintermediation.

Each successive benefit is gradually revealed through a sustaining innovation in blockchain. For instance, Bitcoin’s pure blockchain (proof of work consensus protocol) produced the aspect of anonymity for transacting parties. Ethereum’s smart contract innovation supercharged the disintermediation possibilities of blockchain.

Outside the firm, disintermediation targets the classic "middleman", obviating the need for their irrelevant services. Inside the firm, disintermediation targets employee bloat, both managers and labour. Both of these examples of external and internal disintermediation are the result of automation: in these cases, the automation of authority and trust through smart contracts, obviating the need for human intervention. This means the scale incumbent organizations built on large labour pools moves from an advantage to a disadvantage: essentially a diseconomies-of-scale scenario.

We are already seeing disintermediation within Ethereum's DeFi sector, as DAO fintechs obviate the need for third-party investor custodianship. The initial scaling of DeFi is already generating so many transactions that supply-side transaction costs are rising dramatically.

Ethereum 2.0

Ethereum 2.0 will mean fewer nodes will need to be involved in the tasks of data retrieval, transmission and storage. It’s the equivalent of the evolution of cloud computing where monolith applications moved to a model of modular microservices via APIs, thus increasing transaction speeds, availability, and security.

Ethereum 2.0 will be utilizing sharding (partitioning of the distributed databases enabling more scalable storage and search of the blockchain), in order to reduce latency and increase scalability. This means each consensus node will no longer need to manage the entire database but rather maintain pieces or shards of the larger database.
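The partitioning idea can be sketched with a deterministic hash-based assignment: every node agrees on which shard holds a given account without any coordination, so each node only stores and validates its slice of the ledger. A minimal illustration (the 64-shard count and account strings are invented; this is the general technique, not Ethereum 2.0's actual assignment scheme):

```python
# Sketch of how sharding partitions the ledger: each account key is
# deterministically assigned to one shard, so a node only stores and
# validates its shard rather than the whole database. Illustrative only;
# the shard count and keys are hypothetical.

import hashlib

NUM_SHARDS = 64

def shard_for(key: str) -> int:
    # Deterministic assignment: hash the key, reduce modulo shard count.
    digest = hashlib.sha256(key.encode()).digest()
    return int.from_bytes(digest[:8], "big") % NUM_SHARDS

# Every node computes the same mapping independently.
accounts = ["0xabc...", "0xdef...", "0x123..."]
placement = {a: shard_for(a) for a in accounts}
```

Because the mapping is a pure function of the key, there is no central registry to consult: the database shrinks per node while the network as a whole still covers every account.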

Ethereum 2.0 will also be moving to a proof of stake consensus protocol and away from the energy-intensive and time-consuming proof of work protocol. This will shift Ethereum away from the decentralized pole to some degree and toward the centralized pole on that continuum. Node operators will now be required to put up 32 ether (about $12,000 USD, depending on the exchange rate). This means not just anyone will be able to act as a node, reducing the number of nodes required to reach consensus and thus speeding up the process.

These trade-offs are designed to alleviate congestion on the road to consensus by removing some of the burden from the node operators. Currently the traffic jam is causing consensus or supply-side transaction fees to rise dramatically threatening the disintermediation trend. Keeping supply-side transaction costs low will allow the inherent competitive advantages of this dimension of competition (external and internal disintermediation) to start to tip the economic balance in favour of Ethereum 2.0 applications in their fight for marketshare with TCP.

This will trigger the exponential acceleration of the mass personalization trend. At some point however, the increasing popularity will again jam up the network, particularly as more and more AI agents join the economy to get access to blockchain mediated data markets.

Fourth Dimension of Competition: Mass Personalization

Ethereum Layer 2 Scaling Solutions

Whether it’s the side-chain, child-chain or para-chain model, Layer 2 solutions take some of the redundant transaction record keeping and consensus off the hands of the primary blockchain nodes. Smart contracts provide the integration bridge synching up at the beginning and at the end of the multi-step transaction. The intermediate transactions are performed off chain on the side-chain or child-chain.

Side-chains and child-chains are known as Second Layer Solutions because they don’t impact the base protocol, but are off to the side, acting as an additional layer to expand the processing capacity. The trade-off is that these side-chains are not as decentralized as the primary blockchain.

These off-chains can require third party custodian services to serve as the authority that holds a nonced (ordered) ledger copy in order to validate the final state of the transaction in question, before it is submitted to the primary blockchain. In this way, these side-chains or child-chains can be considered “state channels”.

An analogy is going to a casino, using your credit card to buy some poker chips, playing a few rounds and then cashing out at the end of your visit by converting your remaining chips into fiat money back on your credit card. The credit card transactions at the beginning and end can be considered on-chain while all of the in-game transactions can be seen as off-chain on a side-chain (or in a side economy). Imagine using your credit card at the table — not much fun and inefficient. Similarly, not all steps in a larger transaction need to be directly processed by Ethereum nodes, which would unnecessarily bog down the marketplace and increase supply-side transaction costs for all transactions.
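The casino analogy maps onto what state channels actually do. A minimal sketch of a two-party payment channel, assuming illustrative class and method names (not any real Layer 2 API): only opening and settling would touch the chain, while every transfer in between stays off-chain, ordered by a nonce.

```python
# Toy model of a two-party payment channel ("state channel").
# Only __init__ (open) and settle() represent on-chain transactions;
# every transfer() is an off-chain balance update tracked by a nonce,
# mirroring buying chips once and cashing out once at the casino.

class PaymentChannel:
    def __init__(self, deposit_a: int, deposit_b: int):
        # On-chain step 1: both parties lock up their deposits.
        self.balances = {"A": deposit_a, "B": deposit_b}
        self.nonce = 0          # orders off-chain states, like the "nonced" ledger copy
        self.open = True

    def transfer(self, sender: str, receiver: str, amount: int):
        # Off-chain step: instant, fee-free balance update agreed by both parties.
        if not self.open or self.balances[sender] < amount:
            raise ValueError("invalid off-chain transfer")
        self.balances[sender] -= amount
        self.balances[receiver] += amount
        self.nonce += 1         # the highest-nonce state wins at settlement

    def settle(self):
        # On-chain step 2: submit only the final state to the base chain.
        self.open = False
        return dict(self.balances), self.nonce


channel = PaymentChannel(deposit_a=100, deposit_b=100)
channel.transfer("A", "B", 30)   # many rounds of poker...
channel.transfer("B", "A", 10)
final, nonce = channel.settle()
print(final, nonce)              # two on-chain transactions, however many rounds were played
```

However many intermediate transfers occur, the base chain only ever sees two transactions, which is the whole cost-saving argument.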

Matic, a sidechain solution for Ethereum, has demonstrated impressive results, processing 260,000 transactions per second (TPS) per sidechain. According to Vitalik Buterin, some of these second-layer solutions, including Starkware, are getting close to going live alongside Ethereum. As he put it in a June 1, 2020 tweet:

“While everyone wasn’t looking, the initial deployment of Ethereum’s layer 2 scaling strategy has *basically* succeeded. What’s left is refinement and deployment.”

Polkadot, which recently launched its DOT token, is described as a blockchain interoperability network. It has terrific potential as a Layer 2 solution for Ethereum, helping take some of the load, including from the quickly mushrooming DeFi sector.

Institutional investors and corporate partners believe in Ethereum’s market leadership position so strongly that a firm like JPMorgan Chase has divested its proprietary blockchain venture Quorum, selling it to Ethereum ecosystem firm ConsenSys and adopting Ethereum as the platform on which to build its private blockchain solutions and cryptocurrencies.

As these solutions ramp up, their impact on reducing supply-side transaction costs will take the fourth industrial revolution from neutral into drive.

Facebook, Libra and Novi

Ethereum 2.0 and Layer 2 Solutions will result in a classic disruption pattern in their battle with traditional cloud platforms. This includes decentralized applications (DApps) versus centralized applications (TCP Apps). Some of the most entrenched cloud applications, though they are unlikely to maintain their current market dominance, still have time to pivot to ensure survival. This includes Facebook, which is hedging its bets by introducing Libra.

Because Libra is to be a private consensus blockchain with a small number of nodes represented by large companies, the consensus burden is lower than for Ethereum. And because traditional Facebook already monetizes through advertising, the costs of blockchain computation and storage are subsidized. The consensus nodes are large entities staking millions of dollars to be part of the network. Their return on this investment will be in validation fees as well as influence within the ecosystem.

Facebook will make up the cost of subsidizing computation and storage not just through the likely dominant position of its Calibra wallet (now rebranded Novi by Facebook), but by controlling the reserve currency that it hopes will power an international ecommerce economy that could one day rival Amazon, and perhaps become a critical player in global trade (including as an IoT, payments and settlement platform).

Though Facebook is a walled garden platform, Facebook APIs allow third parties to integrate their services providing users the opportunity to receive a more personalized experience. For example, Facebook partnered with Shopify to power a small business focused ecommerce storefront platform called Facebook Shops for its Facebook for Business Marketplace.

Currently Facebook does not collect commissions on purchases by users from vendors on Facebook Shops. However, not only will that eventually change, but Facebook will likely use the platform to drive adoption of Libra (a piggyback strategy), perhaps by incenting buyers through subsidies to purchase in Libra versus fiat (a seeding strategy). Facebook also has the opportunity to sell ads to vendors competing with other vendors for buyer eyeballs, which Facebook can discount if the ads are paid for in Libra.

Facebook’s Instagram and eventually WhatsApp divisions will also factor into an integrated ecommerce and advertising offering denominated in Libra cryptocurrency. Facebook can deploy their crypto wallet Novi, formerly known as Calibra, not only on Facebook and Instagram but also on WhatsApp, serving as a mobile bank account that can be used to pay for products and services within Facebook and outside of its walled garden.

Facebook Shops with Libra (and Novi) have the potential to be a supercharged version of the SecondLife virtual economy. That’s particularly compelling when the platform begins to facilitate collaboration between Facebook vendors and users in producing more personalized offerings. This subgroup network (SGN) functionality is already a foundation of Facebook, with almost two billion Facebook Groups on the platform today. Adding commercial collaboration possibilities to Facebook Groups and integrating them with Facebook Shops, underwritten by Libra, has the potential to position Facebook as a global supply chain mediator.

The first step is allowing Facebook users to collaborate ad hoc via automated smart contracts, and being paid for their contributions in Libra. Will Libra have smart contracts? The answer is yes according to the Libra white paper:

“The Association is committed to implementing appropriate review and risk controls for smart contracts. At first, only Association-approved and -published smart contracts will be able to interact directly with the Libra payment system. Over time, the Association will explore appropriate controls to allow third-party publishing of smart contracts.”

When smart contract-enabled Facebook subgroups begin to demonstrate an ability to deliver a certain level of personalization in goods sold through Facebook Shops, the organizational economics (lower costs and better offerings) will become superior to traditional organizational economics. This will produce demand for non-Facebook entities that are better positioned to produce subcomponents at a lower cost than a proprietary Facebook tool or Facebook user groups can.

This is where open source blockchains like Ethereum and its DApps begin to contribute to the Facebook economy by producing superior subcomponents at lower costs. This will be the beginning of the crossover — where a proprietary blockchain shows its limits and an open source blockchain begins to disrupt the Libra economy, replacing it with a new open source blockchain economy.

The Ethereum DeFi space will be the first to provide subcomponents or subservices to the Facebook Libra subnetwork personalization economy. For instance, digital wallet maker MetaMask has already launched a widget “allowing Twitter users to trade on Uniswap without leaving the social platform.” It hopes to expand the service to Facebook.

In the future, it would make sense for a DeFi project to enable users with the ability to swap cryptocurrencies to facilitate greater Facebook Shop transactions. Buyers may need Libra to buy in Facebook Shops and sellers will need to exchange Libra for a CBDC to pay their taxes in their home countries.

Twitter, Blue Sky and Square

Speaking of Twitter, CEO Jack Dorsey also has a strategy for blockchain for both of his big TCP companies, the other being Square. Almost a year ago, he announced “Blue Sky”:

Twitter@jack 6:13 AM · Dec 11, 2019:

Twitter is funding a small independent team of up to five open source architects, engineers, and designers to develop an open and decentralized standard for social media. The goal is for Twitter to ultimately be a client of this standard.

Twitter was so open early on that many saw its potential to be a decentralized internet standard, like SMTP (email protocol). For a variety of reasons, all reasonable at the time, we took a different path and increasingly centralized Twitter. But a lot’s changed over the years…

First, we’re facing entirely new challenges centralized solutions are struggling to meet. For instance, centralized enforcement of global policy to address abuse and misleading information is unlikely to scale over the long-term without placing far too much burden on people.

Second, the value of social media is shifting away from content hosting and removal, and towards recommendation algorithms directing one’s attention. Unfortunately, these algorithms are typically proprietary, and one can’t choose or build alternatives. Yet.

Third, existing social media incentives frequently lead to attention being focused on content and conversation that sparks controversy and outrage, rather than conversation which informs and promotes health.

Finally, new technologies have emerged to make a decentralized approach more viable. Blockchain points to a series of decentralized solutions for open and durable hosting, governance, and even monetization. Much work to be done, but the fundamentals are there.

Some of these issues were emphasized by @stephen_wolfram in a blog post following his Senate hearing titled “Optimizing for Engagement: Understanding the Use of Persuasive Technology on Internet Platforms”.

Recently we came across @mmasnick’s article “Protocols, Not Platforms” which captures a number of the challenges and solutions. But more importantly, it reminded us of a credible path forward: hire folks to develop a standard in the open.

Square is doing exactly this for bitcoin with @SqCrypto. For social media, we’d like this team to either find an existing decentralized standard they can help move forward, or failing that, create one from scratch.

That’s the only direction we at Twitter, Inc. will provide.

Why is this good for Twitter? It will allow us to access and contribute to a much larger corpus of public conversation, focus our efforts on building open recommendation algorithms which promote healthy conversation, and will force us to be far more innovative than in the past.

There are MANY challenges to make this work that Twitter would feel right becoming a client of this standard. Which is why the work must be done transparently in the open, not owned by any single private corporation, furthering the open & decentralized principles of the internet.

We’d expect this team not only to develop a decentralized standard for social media, but to also build open community around it, inclusive of companies & organizations, researchers, civil society leaders, all who are thinking deeply about the consequences, positive and negative.

This isn’t going to happen overnight. It will take many years to develop a sound, scalable, and usable decentralized standard for social media that paves the path to solving the challenges listed above. Our commitment is to fund this work to that point and beyond.

We’re calling this team @bluesky. Our CTO @ParagA will be running point to find a lead, who will then hire and direct the rest of the team. Please follow or DM @bluesky if you’re interested in learning more or joining!

Twitter will fund the work but leave it up to Blue Sky to determine whether to build, buy, partner, or leverage with regard to that final decentralized social media platform, ie, Twitter’s future home for hosting its content. That’s a lot for Blue Sky to take on. It’s no wonder that more than half a year later, they are still looking for a leader. Dorsey revealed the following at the virtual Oslo Freedom Forum 2020, speaking with Human Rights Foundation president Thor Halvorssen:

So right now we’re in the phase of finding a leader for it but this is a completely separate non-profit from the company. This group will be tasked with building a protocol that we can use, but everyone else can use. And then we’ll really focus on becoming a client of it so that we can build a compelling service and business on top of a much larger corpus of conversation that anyone can access and anyone can contribute to.

Dorsey, who earlier this month said bitcoin is “probably the best” native currency of the internet, has previously gone as far as saying bitcoin has the potential to be the world’s sole currency by 2030.

It will be interesting to see how existing blockchain social media platforms like Steemit and others factor into the eventual build or buy decision. Regardless, it would make sense for Blue Sky to include smart contracts and smart tokens if the intention is to be a decentralized social media protocol.

Blockchain without smart contracts and smart tokens cannot be granularly competitive or collaborative, since there is no real-time incentive system for third party contributions, which would render mass personalization inert. These smart contracts and smart tokens essentially are there to economically incentivize ever-smaller entities to collaborate to produce a personalized user experience. I call this the economies of network coopetition, where ecosystem participants simultaneously compete and collaborate governed by an open platform standard.

As for Square, Dorsey seems to be suggesting it can serve as the intuitive broker and wallet layer for Bitcoin, which he considers the frontrunner to becoming the Internet’s native currency. From a DeFi perspective, that vision sounds similar to Yearn Finance, ie, a one-stop-shop roboadvisor. Just this September, Dorsey told Reuters in a video interview:

Twitter@Reuters Sep 10:

Twitter CEO @jack tells @Reuters #bitcoin is still the most viable internet-native currency:

“I think the internet wants a currency and wants a native currency and I think Bitcoin is probably the best manifestation of that thus far. and I can’t see that changing given all the people who want the same thing and want to build it without control. And our job as a toolmaker is to make that easy for people to access, make it easy for people to understand, and most importantly utilize. And in that we learn, and we learn how to make it accessible to more people — where it’s useful and where it’s not useful.

Two big things, one is just transaction times and efficiency. So making it cost effective and making it time effective. And second, that it be intuitive to people, that they understand why they might use it, they understand wearables, and they can access it in a way that feels similar to just handing over paper cash. That is fundamentally the easiest that every single one has experienced. And as you add technologies like cards or contact lists you see the power in it being even faster and more convenient.

So we have to build bitcoin in such a way that it is as intuitive, it’s as fast, and it’s as efficient as what exists today.”

As DeFi is showing, native tokens and Bitcoin can not only coexist but also commingle. Hopefully Dorsey’s fondness for Bitcoin as the Internet’s ultimate native currency doesn’t blind him to the power of platform- or application-specific currencies to serve as a more programmable way of facilitating productivity and value creation.

Convergence

TCPs will have a place in the mass personalization era, particularly in the first half. After that, their inferior economics will cause diseconomies of scale so severe that DAOs will prevail and dominate. In the meantime, however, there will be an opportunity to profit immensely, especially for those centralized platforms that introduce their own token system facilitating both intra-ecosystem and inter-ecosystem collaboration.

Twitter, as explained earlier, has announced plans to move away from centralization entirely, while Facebook is the first example of a centralized platform planning to introduce a token. Libra will serve as both a utility and an app token. Its use is intended to go beyond just the Facebook platform, giving it inter-ecosystem potential. Its deal with Shopify gives it a natural commerce component that could help grease the skids for Libra-denominated transactions. Using Facebook to seed Libra as a reserve currency of sorts is somewhat akin to what the Saudis did for the Americans to establish the USD as the Petrodollar.

The Facebook-Shopify relationship potentially expanding to include Libra could kickstart an economic trend in favor of openness not only on Facebook but in ecommerce in general. An ecosystem of smaller more niche vendors integrating with Facebook underwritten by Calibra opens up all sorts of possibilities especially as Libra launches smart contracts. The most economically compelling of these possibilities is the opportunity to personalize the user experience. True mass personalization, the most profitable type, is economically only possible through network collaboration.

Thus, with its moves into ecommerce, cryptocurrency and smart contracts, Facebook is setting itself up to cannibalize its walled user data garden model over time, facilitating its own disintegration into a smaller open platform competing with others rather than hoarding user data as its power position. Smart contract technology is inherently more economical as an intermediary than a physical organization, thus triggering diseconomies of scale in existing large organizations.

The decentralized applications, with their superiority in the competitive dimensions of security, disintermediation and diseconomies of scale, are well positioned to displace the entrenched traditional cloud applications in the competition to produce mass personalization. Granularity is a prerequisite of personalization. And network effects are a prerequisite for scalable profitability. Only a decentralized economy can produce profitable competitive collaboration at an ever more granular level.

As the mass personalization momentum grows, eventually, only open source blockchain platforms (with decentralized ad hoc organizational models) will be able to deliver ever greater personalization at scale — the type that can produce higher revenues, lower costs, or both.

This assumes low supply-side transaction costs, which Ethereum 2.0 innovations and Layer 2 scaling solutions will facilitate. These do have trade-offs in the form of growing centralization in the consensus mechanisms. Thus, the pressure will always be present to come up with a way to mitigate the trade-off. There is such a way.

Quantum Blockchain: Leveling the Speed Dimension

So far this has been the story of DLT getting to “good-enough” in transaction speeds (the base dimension of competition) as compared to TCP. Each incremental improvement opens up an additional dimension of competition, placing another set of verticals in the TCP disruption crosshairs.

Full leveraging of Economies of Network Coopetition, the type that produces full mass personalization, will require another dramatic leap in transaction speeds. Ultimately, what’s necessary is a new paradigm that transcends the dilemma/trilemma trade-offs, and that type of increase may only be possible from a paradigm shift in computing.

Quantum computers can process multiple complex queries simultaneously due to the superposition nature of qubits (the ability to be in two states at the same time — yes, like magically transcending gravity!). What superposition provides over classical computing is not speed in straight-line races. It provides unbeatable speed in complex scenarios, ones with many possibilities, like a maze. That’s exactly what blockchains are: complex, especially relative to centralized data centers.

The only thing we hear about quantum computing and blockchain today is how quantum computing will break RSA cryptography — another complex technology. By that same logic, however, quantum can also strengthen cryptography (for example, through quantum-resistant schemes), making it more antifragile. Moreover, quantum computing can enhance many areas of blockchain operations, including:

  • robust large-scale consensus
  • efficient on-chain data searching
  • private record validation
  • high-speed smart contract processing
  • interoperability between blockchain networks.

In fact, integrating quantum computing within blockchain, ie, the quantum blockchain, potentially offers a no-trade-offs scenario relative to classical computing and traditional applications.

Quantum computing and quantum networks will allow Bitcoin’s Proof of Work consensus protocol to function fast enough for Bitcoin alone to represent a much larger portion of the global financial technology stack, eliminating the need for second layer solutions for off-chain payment processing.

The Bitcoin mining process is one of applying brute-force computing resources to solve the cryptographic puzzle, trying all possible input values one at a time in order to find the winning match. Quantum computing’s specialty is solving probabilistic problems (as opposed to deterministic problems, for which AI is used). A cryptographic puzzle with an almost infinite number of possibilities can be solved faster through quantum computing’s superior handling of probabilities.

The Proof-of-Work consensus model works as follows:

  • A narrow output (consensus among multiple nodes) for
  • a narrow input (a ledger of a consistent number of transactions), with
  • a very wide range of possible input combinations (proof of work is about trying all possible combinations to solve a cryptographic hash puzzle), requiring
  • a large amount of processing to solve the cryptographic hash puzzle.

Quantum Computing is well suited for proof-of-work puzzles (a small number of inputs and outputs with an almost infinite number of possibilities) as it can quickly produce iterative computations.
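The brute-force search described above can be sketched in a few lines. This is a toy illustration of hash-puzzle mining, not real Bitcoin code: difficulty here is expressed as leading zero bits, and a Grover-style quantum search could, in principle, cut the expected number of trials to roughly the square root of the classical count.

```python
import hashlib

def mine(block_data: str, difficulty_bits: int) -> tuple[int, str]:
    """Classical proof-of-work: try nonces one at a time until the
    SHA-256 hash of (data + nonce) has `difficulty_bits` leading zero bits."""
    target = 2 ** (256 - difficulty_bits)   # smaller target = harder puzzle
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if int(digest, 16) < target:        # narrow output: a winning hash
            return nonce, digest
        nonce += 1                          # wide input space: every nonce tried in turn

nonce, digest = mine("block #1: Alice pays Bob 2 BTC", difficulty_bits=16)
print(nonce, digest)
```

At 16 bits of difficulty the loop runs around 65,000 hashes on average; real Bitcoin difficulty is vastly higher, which is exactly why the speedup from quantum search matters.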

These kinds of gains from super fast quantum networks could also positively impact inter-blockchain interoperability, increasing collaboration including for improved transaction processing via load balancing. Growing the number of viable qubits on a chip will facilitate longer states of superposition giving algorithms enough time to process large data sets.

The demand for mass personalization and the rewards awaiting those who can deliver on its promise, will ensure a quicker path to quantum computing at scale. The demand for faster transaction speeds among open source DLT platforms like Ethereum will naturally drive investments in quantum computing thus accelerating its milestones. The current projection for quantum computer commercial availability is as follows:

  • 2026 = 14% probability
  • 2031 = 50% probability

China is spending at least $10 billion over the next three years on quantum technology including computing. The US government is providing $1.2 billion to fund activities promoting quantum research over an initial five-year period. The US figure however doesn’t include private investments in quantum computing.

US companies including Google, IBM, Microsoft and Amazon have their own quantum computing projects underway. US government agencies, meanwhile, are focused on quantizing the internet:

The Department of Energy’s five quantum computing centers, housed at US national laboratories, are funded by a five year, $625 million project bolstered by $340 million worth of help from companies including IBM, Microsoft, Intel, Applied Materials and Lockheed Martin. The funds came from the $1.2 billion allocated by the National Quantum Initiative Act, which President Donald Trump signed in 2018, but the private sector contributions add some new clout.

The idea is to link government, private and university research to accelerate key areas in the US. It’s the same recipe used for earlier US technology triumphs like the Manhattan Project to build the atomic bomb in World War II, the Apollo program to send humans to the moon and the military-funded effort to establish what became the internet.

The five quantum computing centers will be located at Argonne, Brookhaven, Fermi, Oak Ridge and Lawrence Berkeley national laboratories. Areas of research include materials science, quantum networking and quantum sensor networks.

A quantum internet would speed up node-to-node propagation, allowing for larger block sizes, which is key to fitting in additional transactions without requiring more time (ie, higher transactions per second, or TPS).
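The relationship between block size, block interval and throughput is simple arithmetic: throughput is bounded by transactions per block divided by seconds per block, and the block interval is in turn constrained by how fast blocks propagate to peers. A back-of-the-envelope sketch with roughly Bitcoin-like (and entirely illustrative) numbers:

```python
def tps(block_size_bytes: int, avg_tx_bytes: int, block_interval_s: float) -> float:
    """Upper bound on throughput: transactions per block / seconds per block."""
    return (block_size_bytes / avg_tx_bytes) / block_interval_s

# Roughly Bitcoin-like parameters: 1 MB blocks, ~500-byte transactions, 10-minute blocks.
print(round(tps(1_000_000, 500, 600), 1))    # 3.3 TPS

# Faster propagation lets blocks grow without longer intervals: 8 MB, same 10 minutes.
print(round(tps(8_000_000, 500, 600), 1))    # 26.7 TPS
```

The sketch shows why propagation speed is the binding constraint: growing the block without shrinking propagation time raises the risk of forks, which is the bottleneck a quantum internet would relieve.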

The Energy Department and its 17 national labs will form the backbone of the project. … Initial users of a quantum Internet could include national security agencies, financial institutions and health-care companies seeking to send data more securely, researchers said.

The networks promise to be more secure — some even say unhackable — because of the nature of photons and other quantum bits, known as qubits. Any attempt to observe or disrupt these particles would automatically alter their state and destroy the information being transmitted, scientists say.

A quantum Internet could also be used to connect various quantum computers with one another, helping boost their total computing power. Quantum computers are still at an early stage of development and not yet as powerful as classical computers, but connecting them via an Internet could help accelerate their use for solving complex problems like finding new pharmaceuticals or new high-tech materials (according to David Awschalom, a professor at the University of Chicago’s Pritzker School of Molecular Engineering and senior scientist at Argonne National Laboratory).

On March 26, 2020, the Defense Advanced Research Projects Agency (DARPA) announced that they had awarded $8.6 million in grants to Rigetti Computing:

The grant is part of the DARPA ONISQ (Optimization with Noisy Intermediate-Scale Quantum) program. The goal of the program is to establish that quantum information processing using NISQ devices has a quantitative advantage for solving real-world combinatorial optimization problems as compared with the best known classical methods.

In addition to Rigetti Computing, other quantum computer makers include IonQ, Quantum Circuits and D-Wave (Canada). Many of the big tech firms are partnering with these companies to commercialize quantum computing through cloud services:

  • IBM already offers a quantum computing cloud service called IBM Quantum Experience. IBM’s goal is to double “quantum volume”, a measurement combining qubit count with coherence among other metrics.
  • Microsoft is focused on developing not only quantum computing hardware but also software for quantum computing and quantum networks. Through its Azure Quantum cloud service, Microsoft is developing a quantum computing cloud offering in partnership with Honeywell, IonQ and QCI.
  • Amazon is looking to collaborate with quantum computer makers like D-Wave, IonQ and Rigetti to make their quantum computers available through Amazon Web Services, a service they’ve branded Amazon Braket. Amazon also launched The Center for Quantum Computing which is focused on offering mass-produced quantum computers.
  • Google declared Quantum Supremacy in 2019 using a 53-qubit quantum computer. Google is targeting encryption services with its quantum technology, servicing critical communication and payment system providers. It plans to bring this to market as a cloud service.

Other contributors to the ecosystem include:

  • Honeywell: a leader in environmental controls, Honeywell is using its core competency to become a leader in quantum coherence (which requires delicately calibrating severely low temperatures). Without coherence, the quantum state falls apart rendering the qubits useless for computing. Honeywell declared in March 2020 that it intends to increase its quantum volume by a factor of 10 each year. This would accelerate traditional quantum timelines by five years.
  • Intel: “Intel’s quantum computing research spans the complete stack — from qubits and algorithms research to control electronics and interconnects — required to make practical quantum computers for real-world applications a reality.”
  • Applied Materials: “Applied Materials looks forward to bringing our capabilities in materials engineering and high-volume manufacturing to help accelerate the quantum future — from materials to systems.”

Apple has yet to make its quantum computing plans public.

Venture capital firms are also getting into the act investing in quantum computing startups, many of which were spun out of research teams at universities in 2017 and 2018. By 2019, private investors had backed at least 52 quantum technology companies around the world since 2012, according to an analysis by Nature. These companies received at least $450 million in private funding — more than four times the funding from the previous two years.

Summary

Early digital platforms were locally installed software, like Microsoft DOS and Windows. In successive waves, new innovations brought new arrivals:

  • open source local software platforms like Linux; then
  • smartphone platforms like iOS App Store; then
  • enterprise cloud platforms like Salesforce AppExchange; then
  • prosumer cloud platforms like AirBnB and Uber, where the consumers were the producers.

Blockchain is not analogous to the Linux-based Android OS, as Android is still controlled by one entity even though it is made available to other participants. Thus Android is a supplementary curve to the iOS proprietary network platform curve. Windows lacks a dominant network platform play; however, it is still making hay with its simpler Windows MS Office platform innovation of 30 years ago.

The next phase is decentralized smart contract and smart token-enabled platforms, both enterprise and prosumer, producing more profound dual network effects: lower transaction costs on the supply side and a greater user experience on the demand side. From the resulting economies of network ‘coopetition’ will emerge an era of mass personalization at scale.

Blockchain without tokens cannot be granularly competitive or collaborative, because there is no real-time incentive system for third party contributions, which would render mass personalization inert. These smart contracts and smart tokens essentially are there to incentivize ever-smaller entities to compete and collaborate to produce a personalized user experience. I call this the economies of network coopetition, where ecosystem participants simultaneously compete and collaborate while being governed by an open platform protocol.

There is a convergence between the growth in microservices and the growth in distributed data storage. Microservices will grow into nanoservices and distributed data storage will transition from private permissioned to public permissionless. A catalyst for both will be AI applications requiring greater access to data and greater granularity in service provision. Nanoservices and distributed data are better suited to fulfill the demand from end users seeking perpetually greater personalized services.

Critical Mass Timing

Currently, public blockchain applications are inferior to private applications because the consensus process required for public blockchains is an additional layer that produces a negative externality. This negative externality is strong enough to prevent mass adoption. However, the economics will flip quickly in favor of public blockchain applications when:

  1. decreasing supply-side average costs, ie, lower transaction costs, to the point that the negative externalities of the consensus process are sufficiently mitigated by
  2. increasing demand-side value, ie, the positive externalities of greater utility and user experience.

This dual network effect will usher in the era of mass personalization at scale. As stated previously, my estimation for this economic based flipping point is 2027.
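The flipping conditions above amount to a simple inequality: adoption tips to public blockchains once the demand-side value a transaction gains exceeds the supply-side cost the consensus layer adds. A stylized sketch, with all function names and numbers hypothetical:

```python
def preferred_platform(private_value: float, private_cost: float,
                       public_value: float, public_cost: float) -> str:
    """Choose the platform with the higher net value per transaction.
    Public carries a consensus cost (negative externality) but can add
    personalization value (positive externality)."""
    if public_value - public_cost > private_value - private_cost:
        return "public"
    return "private"

# Hypothetical 2020-style numbers: high gas fees swamp the extra personalization value.
print(preferred_platform(private_value=1.0, private_cost=0.10,
                         public_value=1.5, public_cost=5.00))   # private

# Hypothetical post-scaling numbers: fees fall while coopetition value compounds.
print(preferred_platform(private_value=1.0, private_cost=0.10,
                         public_value=3.0, public_cost=0.05))   # public
```

The flip is driven from both sides at once, which is why the two numbered conditions are listed together: falling supply-side cost and rising demand-side value each move the inequality toward "public".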

Ethereum 2.0 and Ethereum Layer 2 solutions are both at least a year away from full production. Either or both of those innovations at scale will lower supply-side transaction costs immediately. If that happens by 2022, by my calculation, there will be a five-year transitional period before a permanent flip to humanity’s new decentralized economic value creation technology regime.

The resulting quantum leap in blockchain capability will beget a productivity leap. Our individual and collective willingness to push forward with the technological forces already at play will help ensure that more of us are beneficiaries of this Third Industrial Revolution (3IR), rather than its victims.

3IR is an actualization tool for humanity to tap into our collective consciousness potential, which will more fully bloom as the Fourth Industrial Revolution. Our dedication to the reconciliation journey will ensure we are good custodians of this powerful tool, ensuring an equitable distribution of opportunity from which to profit.

Strategy Transformation

Web 2.0 World:

Opportunity/Problem:

  • Infinite market reach (upside) and infinite competition (downside), driven by low demand-side transaction costs (a low barrier to entry for competitors).

Ideal Strategy:

  • A revenue-share marketplace business model with a proprietary standard, to capitalize on the market size potential while eliminating the competition.
  • The value of the marketplace to buyers and sellers should be designed to produce positive externalities, which enable network effects (demand-side scaling).
  • The demand-side scale of the marketplace entrenches the proprietary standard, leading to a winner-take-all outcome (a centralized entity).
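The winner-take-all dynamic of demand-side scaling can be sketched with a toy Metcalfe-style simulation. The user counts, round structure, and value-proportional-to-n² assumption are purely illustrative:

```python
# Toy winner-take-all dynamic under Metcalfe-style network effects.
# Starting user counts and growth rates are illustrative assumptions.

def metcalfe_value(users):
    """Marketplace value assumed proportional to users^2 (Metcalfe's law)."""
    return users ** 2

def simulate(n_a=110, n_b=100, new_users_per_round=50, rounds=20):
    """Each round, all new users join whichever marketplace currently
    offers the higher network value."""
    for _ in range(rounds):
        if metcalfe_value(n_a) >= metcalfe_value(n_b):
            n_a += new_users_per_round
        else:
            n_b += new_users_per_round
    return n_a, n_b

a, b = simulate()
print(a, b)  # the marketplace with the small initial lead absorbs every new user
```

Even a 10% initial lead is decisive here: because value grows superlinearly with users, the leader's advantage compounds every round while the laggard stays frozen.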

Use cases:

  • This strategy was pioneered by Napster with music, commercialized by Apple's iTunes marketplace and iPod tandem, and then perfected with Apple's App Store and iPhone.

Tradeoff:

  • The centralized nature of the successful entity has a vested economic interest in keeping its system closed (to avoid competition) resulting in higher profits but also suboptimal mass market products.

Revolution Potential:

  • Open data is a prerequisite to the component granularization that enables mass personalization. However, TCP firms have a strong vested short-term economic interest in maintaining the status quo.
  • Further innovation in decentralized consensus mechanisms is also required to lower transaction costs, which currently remain too high to make centralization uneconomical.

Web 3.0 World:

Opportunity/Problem:

  • Blockchain innovation reduces transaction costs on the supply side.
  • This enables mass personalization for the same worldwide marketplace, with the low demand-side transaction costs that the internet originally facilitated.
  • Mass personalization at scale is inherently in greater demand than mass-market products, but it requires a decentralized ecosystem of open data.

Ideal Strategy:

  • Select an at-scale decentralized smart-contract supply-side token platform with an open-source demand-side token protocol that is innovating to sufficiently reduce its supply-side transaction costs, enabling supply-side scaling.
  • Adopt a revenue-share marketplace business model with a proprietary token (tied to the native platform) and a granulated offering that sufficiently incentivizes an ecosystem to provide the components that generate mass personalization.
  • Capitalize on the market size potential while ensuring profitability through ecosystem partners, including AI-enabled automated agents, that enhance mass-personalization value.
  • The value of the marketplace to buyers and sellers should be designed to produce positive externalities, which enable network effects (demand-side scaling).
  • The scale of the marketplace entrenches the decentralized ecosystem based on the native token.
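The revenue-share mechanics underlying such a token marketplace can be sketched minimally. The contributor names, weights, and protocol fee below are hypothetical and do not come from any real protocol:

```python
# Toy revenue-share split for a token marketplace.
# Contributor roles, weights, and the fee rate are hypothetical.
from decimal import Decimal

def distribute_revenue(sale_price, protocol_fee_pct, contributors):
    """Split a sale between the platform protocol and component
    contributors, pro rata to each contributor's weight in the
    personalized offering."""
    price = Decimal(str(sale_price))
    fee = price * Decimal(str(protocol_fee_pct))   # protocol's cut
    pool = price - fee                             # remainder shared out
    total_weight = sum(contributors.values())
    payouts = {name: pool * Decimal(w) / Decimal(total_weight)
               for name, w in contributors.items()}
    return fee, payouts

# A $100 sale, a 2.5% protocol fee, and three hypothetical contributors.
fee, payouts = distribute_revenue(
    100, "0.025",
    {"data_provider": 2, "ai_agent": 3, "ui_builder": 5},
)
```

The essential property is that payouts settle automatically and pro rata at the moment of sale, which is what lets arbitrarily small contributors participate without negotiating bilateral contracts.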

Use cases:

  • Ethereum ERC-20 DApps and DAOs that issue native tokens. The first vertical to scale is decentralized finance (DeFi). As supply-side transaction costs come down through consensus protocol innovations (Ethereum 2.0 and Layer 2 scaling solutions), DApps and DAOs will scale in additional verticals.
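For readers unfamiliar with what an ERC-20 token contract actually tracks, here is a minimal Python sketch of its core semantics. Real implementations are Solidity contracts deployed on Ethereum; the class, addresses, and amounts below are illustrative only:

```python
# Minimal sketch of ERC-20 token semantics (balances and allowances).
# Illustrative only; real tokens are Solidity contracts on Ethereum.
class Token:
    def __init__(self, supply, owner):
        self.balances = {owner: supply}   # address -> token balance
        self.allowances = {}              # (owner, spender) -> approved amount

    def transfer(self, sender, to, amount):
        """Move tokens directly from sender to recipient."""
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[to] = self.balances.get(to, 0) + amount

    def approve(self, owner, spender, amount):
        """Let a third party (e.g. a DApp) spend up to `amount` of owner's tokens."""
        self.allowances[(owner, spender)] = amount

    def transfer_from(self, spender, owner, to, amount):
        """Spend from an owner's balance on their behalf, within the allowance."""
        if self.allowances.get((owner, spender), 0) < amount:
            raise ValueError("insufficient allowance")
        self.allowances[(owner, spender)] -= amount
        self.transfer(owner, to, amount)

# Hypothetical usage: "alice" issues a token, "dex" spends on "bob"'s behalf.
t = Token(1_000_000, "alice")
t.transfer("alice", "bob", 250)
t.approve("bob", "dex", 100)
t.transfer_from("dex", "bob", "carol", 40)
```

The approve/transfer_from pair is what makes composability possible: it is how a DeFi contract moves a user's tokens without holding their keys.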

Tradeoff:

  • A richer user experience increases supply-side transaction costs (gas paid in Ether).

Revolution Potential:

  • Mass Personalization at Scale.

Timeline

Second Industrial Revolution, Phase 1: Electrification (2IR)

1838 telegraph
1876 Bell telephone
1880 Edison light bulb
1894 Marconi radio
1927 television
1945 ENIAC (first Turing-complete electronic computer)
1948 Information Theory (Claude Shannon paper “A Mathematical Theory of Communication”)
1950 Diners’ Club credit card
1954 Devol files patent for the Unimate (first programmable industrial robot)
1957 Sony TR-63 “pocketable” transistor radio
1958 American Express travel and entertainment credit card

Second Industrial Revolution, Phase 2: Digitization (2.2IR)

1969 first ARPANET message (precursor to the internet), Compuserve, CCD, UNIX, videocassette
1970 optical fiber
1971 Kenbak-1 first personal computer, ATM machine, NASDAQ electronic exchange, Fed paperless money deposit and clearance system
1972 Magnavox Odyssey
1972 Atari founded (Pong)
1973 Motorola mobile phone
1975 Microsoft founded, digital camera
1976 Apple founded, 5.25-inch floppy disk, VHS video player format, Mattel Auto Race
1977 SWIFT central bank money transfer messaging system
1979 Sony Walkman (22 years after Sony TR-63), WordPerfect
1981 IBM PC, Microsoft MS-DOS, 3.5-inch floppy disk, Compuserve Forums, Bloomberg Terminal
1982 Commodore 64, compact disc (CD)
1983 Lotus 1-2-3, Delphi internet forums, Compuserve cloud storage
1984 9.6 kbit/s modem, AOL
1985 Windows 1.0
1986 DSLR camera
1989 World Wide Web
1990 Berners-Lee web browser, Logitech Fotoman
1991 Linux, GSM
1992 Windows 3.1
1994 Amazon, MP3 format
1995 Yahoo, mIRC, DVD
1996 33.6 kbit/s modem, ICQ, USB, Dolly (first mammal cloned from an adult somatic cell)
1997 Netflix mail-order
1998 56 kbit/s modem, ADSL, Wi-Fi, Google, Nick Szabo bit gold smart contract mechanism
1999 Salesforce, Blackberry 850 email pager, Napster
2000 Nokia 3310
2001 iPod, Wikipedia, UMTS, Keyhole EarthViewer (later Google Earth)
2003 iTunes, Skype, Nokia 1100, MySpace, Second Life, LCD monitor sales surpassed CRT sales
2004 Facebook, Flickr, mirrorless camera, World of Warcraft
2005 Motorola Rokr E1 with iTunes preloaded, Google Maps, YouTube, 16% of global population have internet
2006 Twitter, Amazon AWS S3 cloud storage service introduced
2007 iPhone, Amazon Kindle, Netflix streaming, Google Maps Street View
2008 iPhone SDK & App Store, Android, AirBnB, Tesla Roadster
2009 Bitcoin, WhatsApp
2010 Instagram, 30% of global population have internet, IoT emerged with the things/people ratio growing from 0.08 in 2003 to 1.84 in 2010 according to Cisco
2011 Snapchat, Uber, Zoom, Apple Siri, IBM Watson AI computer defeated Jeopardy! champs
2012 Google Now, iPad Mini
2013 Google Glass, Slack
2014 Amazon Alexa
2015 Ethereum Virtual Machine blockchain, Windows 10, Brainet (monkey brain to brain hive mind experiment)
2016 47% of global population have internet

Third Industrial Revolution: Mass Personalization (3IR)

2018 Genome-edited human babies
2019 Google AI Quantum Supremacy
2020 DeFi adoption acceleration
2021 Ethereum 2.0
