



October 31, 2018

Once upon a time in America (circa 1995)....  

There was a User, and there was the Internet.

The Internet was open—decentralized, peer-to-peer, with intelligence residing at either end.  Unlike the virtual “walled gardens” that preceded it, the Net provided people with choices, and innovations, and opportunities.  Indeed, some of its champions promised to eliminate unneeded, rent-seeking middlemen, calling it “disintermediation.” Instead, power would shift from network core to the edges of the Net—and the ordinary user.

The Net was young and chaotic and messy, but it also held tremendous potential for human autonomy and freedom.  Quickly, the Internet became the most successful technology platform in history, hosting in particular the World Wide Web.


Some twenty-five years later….  

We still have Users, and the Internet, and the Web—and now, we have the Platforms.

Platform companies can be large or small, but their most basic characteristic is that they are multi-sided, with money flowing in several directions (typically among users, partners, vendors, retailers, developers, advertisers, and data brokers).  Today, we are most familiar with these companies as Facebook, Google, Amazon, and many others.

Over the years, the Platforms have profited enormously from three interrelated benefits derived from the open Internet: Internet inputs, Internet outputs, and platform dynamics.  And these factors in turn all stem from the Web Users themselves:

  • Internet inputs: data, access, devices, software, and content;

  • Internet outputs: network effects, positive externalities, scale and scope impacts, feedback loops, tipping points, and winner-take-all outcomes; and

  • Platform dynamics: revenue and profit incentives are tied to advertisers and developers (as the customers), utilizing data from Web Users (as the objects).

Few would question the Platforms’ right to tap into these User-generated benefits, and by doing so they have produced considerable economic and social value.  A legitimate question, however, is whether there are other potentially viable business models beyond the Platforms’ current “Ads+Data World” paradigm.


To this point, many have lived with the uneasy online social compact of “trading my data in return for some stuff.”  The existing financial model has entailed “giving away” one’s data from interactions, often based on vague or confusing terms of service and shrink-wrap agreements, in exchange for useful services and goods.  Now, even third parties with whom one has no prior relationship can gain access to and utilize one’s data, and not be required to provide anything in return.

The process of living involves countless mediations, with filters existing between ourselves and the world.  Some are chosen by us voluntarily (schooling), some selected for us by nature (cognition, memory, sensory systems), or assigned via nurture (family, culture).  Over time technologies have proven to be effective mediators, expanding the scope of our abilities (the wheel, electricity) and our sense of reality (from the micro of particle accelerators, to the macro of gamma ray telescopes).


But technology generally has served humanity as a useful tool, not as an ultimate end unto itself.  After all, social systems (tribes) existed for millions of years before Facebook. Search networks (libraries) existed for thousands of years before Google.  Retail platforms (bazaars) existed as far back as bartering and currency. And the advertising of wares and services was a staple of market economies going back many centuries, and across many cultures.


Disintermediation of unwanted “middlemen” was supposed to reduce the frictions and costs of online transactions.  In some ways that has been the case. Ironically, though, what is a Platform but another form of intermediation? And in this case, corporate accountability has been lacking.  In legal terms, these entities often lack privity with Web Users, and thus any resulting fiduciary obligations to treat them as real customers in Ads+Data World. This leaves us to ask: is it time to remedy the loss of trust and accountability on the Web?



It’s all about the data.

At the outset, even calling them “users” and “data” concedes much to their techno-market conception, rather than their ordinary human roots.  For now, at least, we will stick with convention.

It seems a misnomer to ascribe such considerable power and influence to the 1s and 0s meant to define us. Data is supposed to represent points of reality—personal, institutional, environmental. When correctly assigned, collected, and analyzed, and placed in proper context, data points may be stitched together to create a higher-level construct: information.  In turn, when added to a narrative and further context, information can become knowledge.

However, there are inherent limits to this conception of Ads+Data World.  As Claude Shannon discovered, information is composed of signal (meaning) and noise (irrelevant background), in ever-shifting ratios.  It turns out that the more data we collect, the more noise accompanies the signal. Normally, the signal-to-noise (S/N) ratio of “data” is quite low.  Even as the algorithmic systems get better at interpretation, the mountains of data grow larger. With the move to “information,” however, the signal-to-noise ratio improves substantially.  And finally, at “knowledge,” the signal is operating at optimal strength.
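The data-to-information-to-knowledge progression can be made concrete with a toy calculation. This is only an illustrative sketch: the stage names follow the text above, but the power figures are invented for the example, not drawn from any real measurement.

```python
# Toy illustration of the signal-to-noise (S/N) progression described
# above: raw "data" is mostly noise, refined "information" improves the
# ratio, and "knowledge" operates at near-optimal signal strength.

def snr(signal_power: float, noise_power: float) -> float:
    """Return the dimensionless signal-to-noise ratio."""
    return signal_power / noise_power

# Hypothetical power levels at each stage of refinement (invented numbers).
stages = {
    "data":        snr(signal_power=1.0, noise_power=10.0),  # mostly noise
    "information": snr(signal_power=5.0, noise_power=5.0),   # half and half
    "knowledge":   snr(signal_power=9.0, noise_power=1.0),   # mostly signal
}

for stage, ratio in stages.items():
    print(f"{stage}: S/N = {ratio:.1f}")
```

The point of the sketch is simply that the same underlying bits yield very different ratios of meaning to background, depending on how much refining context has been applied.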



And relevant data is all about the context.

Data is highly contextual, and by definition context is largely granular and local. So it depends on the circumstances of its collection and measurement as to whether data can be interpreted correctly as meaningful signal, or mere background noise.  That noise can include a myriad of elements, such as human bias, algorithmic bias, incomplete data gathering, inaccurate measurements, wrong assessments, unfounded inferences, and bad conclusions.  Not to mention numerous missteps in transmitting, storing, protecting, and preserving those same bits of data. Without adequate—and correct—context, data is noise masquerading as meaningful signal.

Data collected from Web Users also tends to relate to something that has happened in the past—a certain retail store purchase, for example.  Theoretically, a string of purchases can convey basic information to third parties. With algorithmic computations and machine learning systems, we may be able to ascertain certain otherwise unseen patterns in the data, so that the information conveys some richness and depth.


But normally that is as far as it can go.  The data points tend to remain trapped in the past.  As our financial institutions repeatedly warn us, past performance does not guarantee future results.  Further, because it is often collected based on Web Users’ assumed, confused, or reluctant consent, the data may be gathered surreptitiously, clumsily, or incompletely.



“Spooky action at a distance.”

Today, personal data can be seen as operating at a remove from the context of the living, ever-changing individual, both geographically and conceptually.  Such distancing creates an unhealthy dynamic, where data is decontextualized and disembodied. The simple fact is that the actual human being can become lost in the numeric haze.

There is also a decided lack of symmetry—in power, leverage, influence, and transparency—between Web Users and the rest of the online environment.  The Web can be an overwhelming place, full of both wonderment and peril. Few of us are equipped to fully handle its complexity, let alone the emerging version now upon us.  Indeed, per Brett Frischmann, online technologies may be programming us to become passive recipients of external “buy” signals, rather than engaged agents. Why, for example, should our automobiles possess more autonomy than many of us ordinarily retain online?


As technology recedes from ready view behind ever-present and all-but-invisible sensors and cameras and airwaves, informed consent has become a non sequitur.  And when the inevitable tracking, hacking, and breaches do occur, there seems to be no recourse.


GLIAnet Part 2: The Need For Trust.



