Facebook

  • December 5, 2019
  • I wanted to like Kara Swisher’s piece in the New York Times about Facebook’s attempt to wrestle with its demons, but I can’t because it feels too much like self-delusion. To cut to the chase, Facebook announced it was forming an oversight board that will eventually have about 40 members with responsibility for policing its domain and reducing or even eliminating the fake news and propagandistic uses the service has been subjected to at least since the 2016 election cycle.

    I am not impressed.

    I am not swayed because it appears to me that the company can’t or won’t come to terms with a valid definition of the problem. Facebook has a quality control problem, and building structures that chase after bad actors and their content once they’ve had a chance to pollute social media doesn’t work. If it were a manufacturer with defective products, we’d all quickly conclude that the way to improve product quality is to build it in, not to attempt to add it after the fact, which is what the oversight board would do.

    The auto industry of the 1970s provides all the case study material you need. American car companies tried to improve the quality of their products after they were built, shunting aside cars that needed rework. The Japanese, in contrast, broke down the assembly process and tried to improve every aspect of it to drive error out and build quality in. During the 1970s and 1980s Detroit lost roughly half of its market share to foreign competition that simply built better cars, and it has not recovered.

    Social media is a bit different. Bad actors are building defects into social media so that any policing strategy or oversight board will always be a day late and a dollar short. Social media, and Facebook in particular, need to face the fact that products once intended for private individuals to communicate with have been adopted by government and industry for other purposes because they represent a commoditization of other modes of communication. They’re cheap and effective and nothing attracts government and business like cheap and effective.

    Right now, you could call yourself Neil Armstrong, launch a page that said the moon landing was faked, and you’d be off to the races. Social media companies want to be able to wash their hands of responsibility for the misinformation, except that they also want to capture revenue from it. This is a two-tier business model masquerading as one: they are both platforms and apps. Solving the social media problem requires a model that separates ownership and requires commercial and government users to demonstrate a fundamental understanding of the tools and resources, including their proper use. Penalties for misusing the services would be a big plus.

    Lots of people will call this draconian and a violation of imagined rights, but it is the way we’ve regulated other businesses for a long time. Plumbers, electricians, beauticians, doctors, dentists, lawyers, and members of many other professions have to sit for licensing exams before they’re allowed to practice on the public. This also sets up a reasonable malpractice regime. Social media use by commercial and government entities should face the same regulation.

    Regulating at the user level would do a lot to reduce or even eliminate bad actors and misinformation; it would effectively build quality into an industry that was once a good idea but is increasingly becoming a cesspool with geopolitical ramifications.

    So, I am not a fan of an oversight board for Facebook and, with due respect, Kara, you should know better.


    Published: 4 years ago


    This week the New York Times reported that Mark Zuckerberg and Sheryl Sandberg of Facebook demonstrated poor judgment, ineptitude, and a stunning lack of empathy over the last two years as they tried to play catch up to the unfolding Russian election scandal. It’s time for appropriate regulation that builds safeguards into social media while enabling it to continue its mission.

     

    The reputations of Facebook founder and CEO Mark Zuckerberg and COO Sheryl Sandberg are in tatters today after a long exposé in the New York Times that examined how the pair dealt with the rolling crisis that engulfed the company during and after the 2016 election, which saw Donald Trump elected under a cloud of suspicion that he had help from Russia.

    The story has been picked up on cable news and it paints a picture of a company led by two executives more interested in company growth than anything else. The story includes examples of attempts to pass the blame on to others, withholding useful information from investigators and repeated denials of culpability when the known facts inside the company said otherwise.

    Most significantly, it shows a company in constant reactive mode in part because no one seemed to have a moral center, a clear sense of right and wrong, and the fortitude to take the right actions for customers and the country regardless of how those actions might hurt the executives or damage the company’s reputation.

    A day earlier, the Times also published a three-part video, “Operation Infektion,” describing a decades-long effort by the Kremlin to spread fake news (active measures) about the West and the US especially in an effort to weaken its adversaries. The overlap between the stories hasn’t received as much attention in the media, which is a shame because social media became the accelerant in an act of political arson.

    There’s a lot of information already available on the debacle so let’s skip ahead to Infektion to get a big picture view of how Russia’s use of Facebook damaged society in the West and how its repercussions will play out for a long time.

    The most relevant part of the Times video series comes in part 2 dealing with the seven commandments of active measures, a term that encompasses Russia’s approach to spreading fake information to the detriment of the West. The seven commandments are,

    1. Find the cracks—social, economic, linguistic, religious, or ethnic issues that can be exploited—and wedge them open. This can include almost anything from gender issues, to religion, to immigration and abortion. You get the picture. Russia started out picking sides but grew to realize it could manipulate both sides of any issue to manufacture discord. There are examples of confrontations during the 2016 election in which both sides were goaded into action by Russian efforts, often on Facebook.
    2. Create a big bold lie—something that is so outrageous no one would believe it was made up. Example, the AIDS virus was manufactured by the US to hurt minorities and escaped from a lab at Fort Detrick, MD, where it was supposedly made. 
    3. A kernel of truth—provide a speck of facts to make the lie more believable. The US does have labs that work on viruses and ways to combat them in war. Fort Detrick is one place where this research goes on. The kernel of truth in this case is the name of the lab assigned blame for the fictional virus release.
    4. Conceal your hand, make it seem like the story came from elsewhere. The first mention of the AIDS story came from a small paper in India and it took years for it to percolate through the journalism community in the 1980s. One weak spot exploited by this approach is that fact checking didn’t go all the way to establishing primary sources. News stories up to and including some in America only used other stories as their sources. Thus, the people relying on the transitive property of truth were severely exploited.
    5. Find a useful idiot, someone who would unwittingly promote the fake news story as real. The emphasis in that phrase is evenly distributed. A useful idiot can be anyone who unwittingly (the idiot) takes the pseudo-information at face value and passes it on (the useful bit), often amplifying it. In the case where a news organization propagates an untrue story, it is serving as a useful idiot, even if it attributes the story to another news outlet in another country.
    6. Deny everything when the truth squad shows up. We’ve seen way too much of this lately. When the truth squad tried unraveling the AIDS scare, it had to go through many layers of news outlets and reporters to find Russians who denied everything. Or consider “no collusion.” Collusion isn’t a crime in the US but conspiracy is. So in this two-word phrase you have a useful idiot spreading a big lie with a kernel of truth in an effort to conceal his hand. It’s brilliant.
    7. Play a long game. Repeat, repeat, repeat. Regardless of costs, keep your eyes on the prize and understand that losses and setbacks are temporary when you play a long game. It’s a very Zen idea. Consider the birther movement. The only way to silence it was to meet its terms by producing President Obama’s birth certificate. The choice was continued low-grade carping with erosion of public trust or swallowing a larger amount of humiliation all at once.

    In nearly all these commandments you can see how Facebook was taken advantage of during the election cycle thus playing the useful idiot. But also, you see the tactics the company tried to use to deflect attention from itself during investigations—keep in mind the Times’ story headline is “Delay, Deny and Deflect: How Facebook’s Leaders Fought Through Crisis.”

    In the process, Facebook has become a useful idiot on steroids, thanks to the Internet and social media’s reach. In the age before the Internet and social media these tactics might have caused some trouble, but the disturbance was usually self-contained because it couldn’t spread as well. The truth squad eventually rode in and set things right. Today it’s much harder. For example, in the pre-Internet era, the fake story about the AIDS virus took six years to spin up. The story about a child sex ring run by the Clinton campaign and operating out of a pizza restaurant in Washington, DC took only six months to bear “fruit,” if that’s even the right term for the shooting that followed.

    What to do?

    There is a nascent movement in Congress to develop some form of regulation over Facebook and the other social media giants, which is reasonable but not very welcome. It never is. The result of the tech revolution is that we can no longer function very well as a society without information, a dependence that is still deepening. It makes equal and transparent access to information something that must be spread throughout society.

    At the same time, social media and Facebook can no longer be regarded as neutral platforms that foster free speech. They have become publishers bringing eyeballs to advertisers, and they have responsibilities that go with this status as well as First Amendment rights. Sen. Mark Warner (D-VA) already has a draft working paper circulating that suggests some components of regulation. Michelle Goldberg, an opinion columnist at the Times, wrote that,

    Among them are amending the Communications Decency Act to open platforms up to defamation and invasion of privacy lawsuits, mandating more transparency in the algorithms that decide what content we see, and giving consumers ownership rights over the data that platforms collect from them.

    My two bits

    Facebook, Google, Amazon, and the many other platforms that use a social media model to capture consumer data and remarket it to advertisers are now utilities. My definition of a utility is something that started as a disruptive innovation and proved so useful that it has become indispensable to modern life. Modern transportation beginning with the railroads, electricity, telephone, and cable have all trod the same path. At the moment cable, especially as a gateway to the Internet, is the outlier thanks to decisions by the Trump-era FCC. By rolling back net neutrality rules set by the Obama administration, the FCC negated the principle of the common carrier that overarches other utilities. We can and should hope this is a temporary aberration.

    But back to data and its collectors and merchants. The consumer and the nation at large have a right to expect that the purveyors of modern life’s essentials will abide by the essence of the Hippocratic Oath and first do no harm. Enacting a common sense set of regulations to make this so should not be a heavy lift. 

    A century ago President Theodore Roosevelt pushed legislation enacting standards for food and drugs ushering in the Food and Drug Administration (FDA). A generation later, after Wall Street shenanigans brought on the Great Depression, his cousin, Franklin Roosevelt, brought forth the Securities and Exchange Commission (SEC). We’re at a similar crossroads today. The path forward might be shrouded in mist but the way backward is completely unacceptable.

    Like so many things riling society today, from global warming to immigration, fixing the problem is not the hard part. Getting the various sides to agree there is a problem that needs addressing and then negotiating a solution is. Doing nothing is never a solution because letting a situation fester only makes it worse. The recent revelations of Facebook’s failures have demonstrated that putting its house in order is more than it can do internally. Perhaps a reconfigured political landscape in the US and a new year will bring solutions into focus.

    Published: 5 years ago


    Adam Smith famously referred to “the invisible hand” of the free market in his landmark book “The Wealth of Nations” and with that made himself one of the very first political economists. Smith’s observation was so on point that today most of us assume markets run through the agency of individuals pursuing their enlightened self-interests. A lot of this drove the evolution of CRM as a tool for tracking customers.

    If you pay attention today you can notice a not-so-invisible hand functioning in multiple areas. For instance, if you’ve been following the aftermath of the school shooting in Florida, you know that an activated group of kids and adults nationwide has begun a movement to get something done about gun safety. The #MeToo movement in which women are banding together to change the workplace by eliminating sexual harassment is another, and so is the Black Lives Matter movement. But you can argue that all of them are free market responses in that they arose from the grass roots without much prompting from elite power centers.

    What each has in common is the initiative by engaged individuals to cause change in what are essentially marketplaces in the broadest conception of that term. Much closer to home, even in the technology world we’re seeing the stirrings of user dissatisfaction with social media and it’s not clear where this will go. Its impact on CRM could be big because social has become one of the key channels linking vendors and customers.

    A recent article in Wired, “Facebook Doesn’t Know How Many People Followed Russians on Instagram” by Issie Lapowsky, documents Facebook’s foot dragging on producing information for the various inquiries surrounding the 2016 American election. Jonathan Albright of Columbia University’s Tow Center for Digital Journalism, has been looking at the details and producing information that’s uncomfortable to Facebook. He’s been quoted in Wired, The New York Times and elsewhere.

    Albright’s work has uncovered many things concerning Facebook’s approach to the investigation which you might call passive aggressive. For instance, when asked why it had not produced information about how many people had seen Instagram information produced by Russian operated troll accounts, a spokesperson for Facebook, which owns Instagram, said, “We have not been asked to provide that information.” So little curiosity…

    It’s not necessary to repeat the article here; it’s worth reading but that’s your call. It documents how Facebook has so far assisted investigators but only if they ask the right questions. The final paragraph summarizes this point,

    Facebook has shown consistent reticence in detailing how these trolls infiltrated its platform and who that propaganda reached. They’ve repeatedly had to correct prior statements about the reach of these ads and accounts. By working with outside researchers like Albright, the company might be able to paint a more complete picture, but Facebook has been unwilling to open its data up to researchers.

    It’s not necessary to re-examine every time Facebook denied its involvement or disputed findings that upwards of 150 million people saw content from the Russians, or that all the US intelligence services agree that the Russians did indeed hack the election. That’s all very interesting from another journalistic angle but not this one.

    The totality of Facebook’s unwitting involvement in the hack plus its efforts to downplay their importance brings up a bigger issue for Facebook, and by extension all social media: How useful are Facebook and social media generally considering the Russian hacks?

    A glib answer might be that it doesn’t have to be terribly useful because it’s free and users get whatever utility they can from using it. But that misses the point. If Facebook’s utility is small or especially if it’s disputable, its business model would be in serious trouble.

    Social media’s primary product has always been the user. It is valuable to each of us when we use it to gather information about our personal graphs, and we knowingly pay an in-kind fee by letting social sites collect data about us, which they can then sell to advertisers. It’s a classic network effect—the greater the audience, the more valuable its data becomes.
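    The network effect described above is often sketched with Metcalfe’s law, which holds that a network’s value grows roughly with the number of possible connections among its members. A toy illustration in Python (the quadratic growth, not any particular dollar figure, is the point; the member counts are made up for the example):

```python
def possible_connections(n: int) -> int:
    """Number of distinct pairs among n members: n choose 2."""
    return n * (n - 1) // 2

# Doubling the audience roughly quadruples the pairwise connections,
# which is why audience size matters so much to an ad-driven model.
for members in (1_000, 2_000, 4_000):
    print(members, possible_connections(members))
```

    This is why even a small erosion of the audience, or of its trust, hits the business model disproportionately hard: value falls faster than headcount.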

    But what happens if the veracity of information on social media is in doubt? Social media’s value is directly proportional to its veracity and if one can doubt that veracity it might be prudent to seek alternatives. People and corporations that invest heavily in using social media’s information might begin doubting if their investments deliver value.

    My thoughts

    So far Facebook’s approach to the hacking scandal has been to deny and ignore it, only admitting something when there’s no other choice. This presents another problem associated with stonewalling—dissipating trust. However unpleasant the facts, the more a party tries to ignore or hide them, the lower the market’s trust in that entity and the greater the opening for a disruptor.

    The truth value of what people put up on the networks and what they believe about the truthfulness of others’ posts makes social media’s world go around. That truth is what makes some people spend hours a day surfing the sites and it’s what makes advertisers purchase ads. Once that trust begins to erode, even a little, the business model can begin to unravel.

    Whoever is advising Facebook on its strategy should reconsider. It’s human nature not to enjoy dealing with criticism and serious accusations. But impeding the free flow of information won’t solve the problem. Free markets depend on transparency, and Facebook is a free market of ideas. If it stops being that, or even if people merely stop believing it, there’s no reason for them to continue using it.

    Published: 6 years ago


    Ok, this is kind of long. Go get a cup of coffee.

    Amid the anxiety and revelations of the Russia scandal, including the Cambridge Analytica story that showed how easy it was to harvest 50 million Facebook user profiles, it’s easy to mix up cause and effect. Importantly, Facebook wasn’t hacked or broken into; it was used exactly as it was designed to be used.

    This has led some to question whether Facebook as such can exist at all in our pluralistic society while others believe the problem of surreptitious psychographic profiling will blow over once everyone plays by the same rules. After all, others have argued, other entities do the same thing. They point to Google, Amazon and even the traditional print industry as culprits for gathering personal data for analysis and, it should be said, weaponization.

    Of course, the issue is manipulating and weaponizing the data. If we can’t trust the data, then we are disassembling one of the pillars of democracy, the acceptance of scientific rationalism. Boiled down, it means facts are facts even if you don’t like them.

    If you remember a time before social media, when identities were not so readily stolen, and you think that reality was good, you might recoil at the thought that those good old days are gone and that things are now permanently different. There is a third option, though, and probably many, that seek to balance the benefits of new technology with the protections we’ve grown accustomed to.

    This article can’t be all things to all those people, but it attempts to find safe harbor in a storm and therefore makes accommodations. If we can’t live with the compromises, perhaps it can at least point out some of the major obstacles to be overcome.

    Business model

    It is an article of faith that Facebook’s business model, as well as those of other social networks and search engines, is selling advertising. But it is my contention that this model has run its course. It was effective when the companies were smaller, when their consumers were more innocent to the ways technology can be used for both good and ill purposes. The advertising model was even necessary in a time when the Internet was new and finding people and things was strange.

    The advertising business model was a default that data aggregators took on the way to phenomenal profits and who could blame them. The tech sector has a habit of minting money and the founders of social media and search engines were merely the latest in a long line of prolific brainiacs who struck gold. It is hard to believe that any human in a similar situation would act much differently.

    The latest dustup that dragged social media into the political spotlight now presents two choices to these businesses. They can hobble their products, which could reduce the amount of data they collect making them less interesting to advertisers, or they can change their business models slightly to prevent unethical use of their networks.

    Disruptive innovation

    Anytime a new technology reaches market, it has the possibility that it will disrupt the existing order of things. Disruptive innovations have coexisted with Capitalism since its origins in the Industrial Revolution. Disruptive innovation means making thread and then cloth with high-speed mechanical means, making a steam engine powerful and small enough to be mobile, or making a computer that could fit on a sliver of silicon about the size of your thumbnail.

    The world changed with each of these disruptive innovations and others, because they immediately made an old order irrelevant and they organized whole economies and even civilizations around new driving forces. The Internet and its children are the latest innovations that have rocked the world. In each, humanity has had to grapple with both the benefits and the deficits of the innovations.

    So far, we’ve benefitted enormously from these innovations but recently we discovered their less sanguine side. If history is a guide then regulation in some form is a likely next step. Some leaders in congress have already broached the idea on several occasions but it’s important to get the idea right before pulling the trigger, which is why we need to discuss business models.

    Regulation?

    Regulation could happen in social media and search but there’s much that the technology companies can do to either avert it or ensure that its mandate is as light and congruent with company interests as possible starting with the prevailing business model.

    Although the advertising business model has served many companies well, they’ve morphed into data companies with big responsibilities for safeguarding the data they collect and that’s not something they’re eager for.

    The big data-gathering companies like Facebook, Amazon, and Google and their competitors have become data companies first and advertising vendors second, and if this understanding had been realized sooner, many data breaches would in all likelihood have been thwarted. Rule one of business is never give away your product; it’s what you charge for because it pays the bills. Applying the rule here should be as obvious as encrypting user data. Additionally, no would-be user of the data should be able to access it in its unencrypted form without, of course, paying, but more importantly without presenting valid credentials and stating a beneficial and productive purpose for the use.

    I’ve written before about credentialing and how it’s actually harder to pull permits to remodel your kitchen than it is to advertise any message you want on social media, so I won’t perseverate. Let’s turn to encryption.

    Security as a business model

    Social and search’s business model must turn from advertising to data management, curation, and selling access to it, and we live at precisely the moment when these activities are possible at very large scale. This includes encryption and the same form of certification that applies to other professionals, from doctors to beauticians and plumbers.

    Encryption and decryption take time and require compute and storage resources, which has often cut discussions of them short on cost grounds. But new, shall we say disruptive, innovations in computer hardware and software are reigniting the discussion.

    On the hardware side, data storage was long the province of hard drives. Data enters and leaves disk storage on millisecond time scales, which sounds fast. However, computer CPUs and memory operate about a million times faster, at nanosecond speeds, so CPU chips spend a lot of time waiting for data to become available even when, as in most modern systems, frequently used data is cached in memory.
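    The millisecond-versus-nanosecond gap is easy to check with back-of-the-envelope arithmetic. The figures below are illustrative round numbers, not measurements of any particular device:

```python
# Illustrative, order-of-magnitude latencies (not measured values).
disk_access_s = 1e-3   # ~1 millisecond for a mechanical disk access
cpu_cycle_s = 1e-9     # ~1 nanosecond per cycle at ~1 GHz

# A CPU waiting on a single disk access idles for about a million cycles.
cycles_waiting = disk_access_s / cpu_cycle_s
print(f"cycles spent waiting per disk access: {cycles_waiting:,.0f}")
```

    A million idle cycles per disk access is the headroom the next paragraph proposes to spend.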

    Innovative hardware designs now offer solid-state memory devices that replace disks. This memory operates at nanosecond intervals and eliminates the lag of the older mechanical systems. What should we do with all of this newfound speed? One possibility is to dedicate a small portion of it to encryption. Typical encryption schemes on the market right now could be broken, but doing so would take so many years that the resulting data, when finally available, would be useless, and encryption keeps getting better.
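    To make the encrypt-at-rest idea concrete, here is a deliberately minimal sketch in Python. It uses a one-time pad built from the standard library’s `secrets` module purely to show the pattern of storing ciphertext while holding the key elsewhere; a real system would use a vetted cipher such as AES-GCM from an audited library, never hand-rolled code, and the sample record is invented for the example.

```python
import secrets

def encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """One-time pad: XOR the data with a random key of equal length.
    Returns (ciphertext, key). The key must be stored separately from
    the ciphertext and never reused, or all security is lost."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR with the same key recovers the original bytes.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

record = b"user:123|likes:gardening"   # hypothetical user record
ct, key = encrypt(record)
assert decrypt(ct, key) == record      # access requires the key
```

    The business-model point is in the last line: whoever controls the key controls access to the data, which is exactly the gate where credentials and payment can be checked.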

    Encryption would be a good thing but it wouldn’t solve every problem, and securing our information infrastructure so that it operates closer to utility grade requires other changes. Bad software, malware, viruses, Trojan horses, and the like may still get into systems.

    Mark meet Larry

    As luck would have it free markets generate inventions faster than they can be adopted. Often a disruptive innovation exists at the nexus of several disruptions that just need one more critical piece. That’s the case with many of the system level inventions that Oracle has brought to market over the last several years. They’ve pioneered important developments in solid state storage, encryption, chip sets that weed out intrusive malware, and a self-patching autonomous database that just hit the market.

    All of these things turn out to be essential to safeguarding data which will enable the information revolution to continue burrowing its way into our lives and enriching society. They are also the underpinnings of a new business model that turns big data companies into ethical data providers. They might also continue being social media companies but the data tail would now be wagging the dog.

    My two cents

    What do I know? I just read and write a lot. But what I see is an industry about to be regulated and, in my mind, the smart play is for the social media companies to lead the charge to ensure they arrive at something they can live with instead of remaining aloof and having some regulations imposed on them.

    There’s a wild west mentality in Silicon Valley in which what isn’t proscribed is encouraged. But we should keep in mind that the west only remained wild until the pioneers arrived and established towns with roads, schools, and churches. The wild bunch might have disliked the idea of settlement, they might have opposed it, but they were quickly in the minority and civilization won. That’s what’s happening in tech today and we all need to seize the moment.


    Published: 6 years ago


    The multiple issues/scandals/problems facing companies in Silicon Valley could drive you to ask if the wheels are falling off the collective tech wagon. A recent article in the DealBook section of the New York Times asks the question,

    Are we witnessing the end of a mania?

    Investors, always willing to believe in technology companies, spent the last three years piling into the shares of companies like Facebook, Amazon and Netflix with special abandon. Now the intellectual underpinnings of the tech rally are being seriously tested.

    But maybe this is just the way disruptive innovations evolve. To be sure, if it takes 50 to 60 years for a disruption to work its way through society and the economy, as I have documented in discussions of long economic waves called K-waves, then it’s quite likely we’ve seen this kind of thing before, say in the 1960s. You remember the ’60s, right? That’s the point.

    Because there’s little living memory of the last time, today’s happenings look unfamiliar. A disruption grows and grows, consuming everything in its path until there’s nothing left to consume; at its peak the disruptors become the disrupted and accommodations must be made. Often those accommodations are initiated by standards and even regulation—admittedly a dirty word in Silicon Valley and environs, but one that must be said.

    In the more genteel days of the 19th century the disruption of the day, electricity, had its own peaking moment, pitting uber-genius Thomas Edison, a proponent of direct current (DC), against the lesser-known geniuses Nikola Tesla and George Westinghouse, who favored alternating current (AC).

    Edison was not above a smear to defend his viewpoint. When a condemned man was electrocuted using alternating current, Edison got his publicists to influence a headline saying the man had been “Westinghoused.” Those were the days. Nonetheless, facts are stubborn things, and AC won the day because it was simply a better standard for long-distance transmission, something the emerging electricity industry had to have.

    Through standards setting and regulation, the electric industry became a ubiquitous and standardized service that is accessible to all and this set the stage for further growth of the industry and the economy. Ironically, the standards and regulations were nothing that individual business people would have readily agreed to.

    The Times article asks in a roundabout way if the tech sector is in a similar moment. While the sector comprises businesses as different as Facebook, Apple, and Google, to name a few, Facebook and social networks are reaping a healthy share of skepticism, and for good reason, so let’s concentrate there. Another Times article may have pinpointed the peak moment when social media went from the disruptor to the disrupted,

    Shortly before the election, a senior official with the Trump campaign bragged to the Bloomberg reporters Joshua Green and Sasha Issenberg, “We have three major voter suppression operations underway,” which the article described as targeting “idealistic white liberals, young women, and African-Americans.” Brad Parscale, who ran the Trump campaign’s digital advertising, is quoted in the same piece discussing his plan to use dark ad posts of an animation of Hillary Clinton referring in 1996 to some African-Americans as “super predators.” Parscale suggested that the campaign would use this image to discourage a demographic category described by the reporters as infrequent black voters in Florida. “Only the people we want to see it, see it,” he explained. “It will dramatically affect her ability to turn these people out.”

    Dark ads display once to a specific audience and then disappear, though they’re still in the vault. But guess what: the article goes on to say that,

    the dark ads have disappeared and Facebook won’t release them, citing the privacy of its advertisers.

    Zuckerberg might say that trust is important and he might spend a few bucks on grandiose full-page newspaper mea culpas but he and his company are still remarkably tone deaf. Did the suppression effort work? You be the judge,

    The election of 2016, the first after Barack Obama’s presidency, was notable for a seven-percentage-point decrease in African-American turnout, from 66.6 percent in 2012 to 59.6 percent, according to the Pew Research Center.

    The first decline in 20 years and the largest decline on record.

    This isn’t an American phenomenon. The news shows that the effort in 2016 was international with companies like Cambridge Analytica employing many non-Americans to sort data, create psychographic profiles, and generally influence the US election and possibly other efforts like the Brexit vote.

    My take

    We’re at the moment when attention turns to regulation in social networking and elsewhere. Facebook’s focus on the individual’s rights (privacy of its advertisers!) rather than the potential harm it can cause to a whole society represents a blind spot that will interfere with any solo attempt to rectify the situation. That’s why self-regulation rarely works without a mandate from government.

    It’s possible to regulate social networks through a system of encryption, certification, and a modicum of tracking. Doctors, lawyers, plumbers, beauticians and many other professionals all submit to systems of practitioner licensing and professional standards and utilities are regulated. That’s a workable model for social networks.

    It’s currently easier to attempt to influence millions of people about consequential issues than it is to pull the permits to make an addition to one’s home. It shouldn’t be that way.

    The momentum in the halls of legislatures around the world right now is toward regulation. The social networks should welcome it and work with governments to reach a workable compromise which includes a standard set of regulations that apply over broad swaths of the planet. Regulation will do for the industry what it can’t do for itself and that’s exactly what the society needs.

    Published: 6 years ago