The inevitability of tokenized data


We are approaching the endgame of an inevitable showdown between big technology companies and regulators, and it revolves around consumer data. In many ways, the fact that we have reached this point at all reflects a market that has yet to develop alternatives to today's dominant data paradigms: Google and Facebook as the primary collectors and sellers of data, and Amazon as its host.

The tokenization and decentralization of data provides such an alternative. While the first generation of "utility" tokens was backed by nothing but dreams, a new generation of tokens will be explicitly tied to the value of data.

The conversation around data has reached a new inflection point.

Presidential candidate Senator Elizabeth Warren has called for the breakup of technology giants including Amazon and Facebook. In many ways, the move feels like the inevitable culmination of the past few years, during which public sentiment toward the technology industry has shifted from broadly positive to considerably more skeptical.

Part of this growing skepticism simply reflects an era of ascendant populism in which every institution of power is subject to greater scrutiny. When you drill into the details, however, it is clear that the fundamental issue eroding trust in technology companies is data: what gets collected, how it is used, and who profits from it.

The watershed moment in this loss of trust was Facebook's Cambridge Analytica scandal, in which vast quantities of user data were exploited to help political actors sow discord and aid Trump's election in 2016, followed by Facebook CEO Mark Zuckerberg's testimony before Congress.

Those who point out that almost no one actually left the platform over the episode fail to recognize what its real impact was always more likely to be: political cover for breaking up a company that could afford to shrug off its angry consumers.

Image courtesy of Bryce Durbin

Of course, not every 2020 Democratic presidential candidate agrees with Warren's call. Andrew Yang, an upstart candidate focused on Universal Basic Income whose profile rose after an appearance on Joe Rogan's popular podcast, responded that he agrees there are fundamental problems with big tech, but that we need to expand our toolset: for example, letting people share in the profits from the use of their data, and building a new kind of legal regime, an independent consumer counterweight to the trusts, rather than relying on blunt regulation alone.

While some might assume that Yang, coming as he does from the tech world, is biased toward the industry, he has been more outspoken than any other candidate about the coming escalation of automation. His notion of a different economic arrangement around data, one negotiated between the people who produce it and the platforms that use it (and sell advertising against it), is worth considering.

In fact, one might argue that this heavy-handed regulatory approach to data feels inevitable only because of a fundamental market failure in how the economics of data are organized.

Image: the interior of a modern server room in a data center.

Data, it is said, is the new oil. In this analogy, data is the fuel on which the attention economy runs: without data there is no advertising, and without advertising there are none of the free services that dominate our social lives.

Of course, there is another side to the data market: where all of that data lives. Investor (and former Facebook growth executive) Chamath Palihapitiya has pointed out that 16 percent of the money he invests in companies goes directly to Amazon for data hosting.

This fact suggests that while regulators, and even more so presidential candidates looking to score populist points, may treat big tech as a monolith aligned around the status quo, there are in reality many competing financial incentives at play.

Enter decentralization.

In his seminal essay "Why Decentralization Matters," a16z investor Chris Dixon explained how incentives diverge within networks. Early in a network's life, the network owner and its participants share the same incentive: growing the number of nodes in the network. Inevitably, however, a threshold is reached at which growth from new participants alone is no longer achievable, and the network owner must instead extract more from the participants it already has.

In Dixon's estimation, decentralization offers another option. In short, tokenization allows all users to share in the economic upside of the network, effectively dissolving the distinction between network owners and network users. When there is no distinct ownership class, no one has the need (or the power) to extract.

The essay is a wonderful articulation of an idealized end state (one reflected in its more than 50,000 claps). In the ICO boom, however, things did not quite live up to Dixon's vision.

The fundamental question is what the tokens actually were. In almost every case, the "utility token" was simply a payment token: an alternative currency for the service in question. Their value rested on speculation, either that they could command some monetary premium above the utility of the network, or that the network would grow so large that the value would hold up for a while.

It is not hard to understand why things were designed this way. For network builders, payment tokens allowed immediate, global capitalization in a completely non-dilutive form. For retail buyers, they offered a chance to participate in venture-style investing that accreditation laws would otherwise deny them.

At the end of the day, however, the simple fact is that these tokens had no backing beyond dreams.

When the market for these dream coins eventually collapsed, many decided to throw out the token baby with the ICO bathwater.

But this raises a question: if the problem with the tokens in decentralized networks was that they were backed by nothing but dreams, what if they were backed by data instead? What if, rather than dream coins, we had data coins?

Data really is the oil of the new economy. In the context of any given digital application, data is where the value lies: for the advertisers who pay for it, for the platforms that sell advertising against it, and for the users who effectively trade it in exchange for cheaper services.

In other words, data is an asset. Like other assets, it can be tokenized and decentralized onto public blockchains. It is not hard to imagine a future in which every meaningful piece of data in the world is represented by a private key. Explicitly binding tokens to data creates a whole new world of options for how applications get built.
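To make the idea concrete, here is a minimal sketch in Python of what "binding a token to data" could mean: the token's identity is derived from a content hash of the data, and control of the token rests with a key held by the data's owner. All names and the key handling are illustrative assumptions, not a description of any particular protocol.

```python
# A minimal sketch (not a production design) of binding a token to a piece of
# data: the token's identity is the hash of the data itself, and control of
# the token is tied to a key held by the data's owner.
import hashlib
import secrets
from dataclasses import dataclass

@dataclass
class DataToken:
    token_id: str   # content hash: the token is explicitly bound to the data
    owner_key: str  # identifier of the key that controls the token

def mint_data_token(data: bytes, owner_key: str) -> DataToken:
    """Derive a token whose ID is the content hash of the underlying data."""
    token_id = hashlib.sha256(data).hexdigest()
    return DataToken(token_id=token_id, owner_key=owner_key)

# Hypothetical usage: a user's browsing record becomes a keyed, tradable asset.
owner_key = secrets.token_hex(32)  # stand-in for a real public key
record = b'{"user": "alice", "page_views": 42}'
token = mint_data_token(record, owner_key)
print(token.token_id[:16], "owned by", token.owner_key[:8])
```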

First, data tokenization creates an opportunity for the nodes in a decentralized hosting network (a decentralized alternative to AWS) to effectively speculate on the future value of the data in the applications they host, a financial incentive beyond simple service fees. When a third party such as Google wants to crawl, query, or otherwise access the data, it pays in the tokens representing that data, with the payment flowing both to the miners who secure and store the data and to the developers who acquired, structured, and labeled it in ways valuable to third parties, especially organizations driven by machine learning and artificial intelligence.
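As a toy illustration of that value flow, the sketch below splits a single query fee between the storage miners and the data's developers. The 70/30 split and all names are hypothetical assumptions; the mechanism is not specified here.

```python
# A minimal sketch, with assumed parameters, of how an access fee paid in
# data tokens might be split between the miners who store and secure the data
# and the developers who acquired and labeled it.
from collections import defaultdict

MINER_SHARE = 0.7      # assumed share for storage and security work
DEVELOPER_SHARE = 0.3  # assumed share for acquiring and labeling the data

def settle_query_fee(fee_tokens: float, miners: list[str],
                     developers: list[str]) -> dict[str, float]:
    """Split one query's fee evenly within each group."""
    payouts: dict[str, float] = defaultdict(float)
    for miner in miners:
        payouts[miner] += fee_tokens * MINER_SHARE / len(miners)
    for dev in developers:
        payouts[dev] += fee_tokens * DEVELOPER_SHARE / len(developers)
    return dict(payouts)

# A third party (say, an ML organization) pays 100 tokens to query a dataset.
print(settle_query_fee(100.0, miners=["m1", "m2"], developers=["d1"]))
```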

Second, application builders can not only take advantage of more fluid capitalization through tokens, but can also experiment easily with new ways of arranging value flows, such as cutting users in on the value of their own data and the profit it generates.

Third, users can begin to develop a tangible (and traceable) sense of the value of their data, exerting market pressure on platforms to cut them in on that value, and exercising more control over where and how their data is used.
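A sketch of the "traceable" half of that claim, under the same hypothetical setup as above: an append-only ledger that lets a user total up exactly what their data has earned across applications. Names and amounts are invented for illustration.

```python
# A minimal sketch of traceable data earnings: an append-only ledger that
# records each payout a user's data generates, so the value is tangible.
from dataclasses import dataclass, field

@dataclass
class DataEarningsLedger:
    entries: list[tuple[str, str, float]] = field(default_factory=list)

    def record(self, user: str, source: str, tokens: float) -> None:
        """Append one payout event; nothing is ever overwritten."""
        self.entries.append((user, source, tokens))

    def total_for(self, user: str) -> float:
        """The tangible number a user can point to when weighing a platform."""
        return sum(t for u, _, t in self.entries if u == user)

ledger = DataEarningsLedger()
ledger.record("alice", "ad-query:google", 0.8)     # hypothetical ad query fee
ledger.record("alice", "ml-training:acme-ai", 2.5)  # hypothetical ML licensing
print(ledger.total_for("alice"))
```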

In other words, tokenizing data could create a market mechanism for rebalancing power in technology networks without resorting to ham-fisted (however well-intentioned) regulation like GDPR, or, worse still, the breakup Warren has proposed.

Even after the ICO bust, many, like Fred Wilson, continue to believe that a blockchain-facilitated shift toward user control of data is not just possible but inevitable.

Historically, technology has swung from closed to open and back again. We are now in a closed phase, with centralized applications and services owning and controlling the vast majority of data and access to it. Decentralized peer-to-peer databases, that is, public blockchains, will open up and tokenize data in disruptive ways, changing how value is captured and created on the Internet.

Simply put, tokenized, open data can limit the power of data-controlling monopolies to constrain future innovation, while ushering in a new era of computing.

Information can finally be set free.
