Digital Abuse by Big Tech now Under Scrutiny -- about time!
by tom Saturday, Jan 20 2018, 8:30pm
international / prose / post

The algorithms and other manipulative strategies Big Tech has utilised for years are clearly harmful to developing youth and to young adults raised on digital devices; no argument. Highly placed Facebook insiders have revealed on video how the company intentionally manipulates the minds and emotions of users for gain, without any regard whatsoever for the detrimental impact on users and society, an impact that is now proven.


On a personal level, I once confiscated my children's smartphones, with which they were obsessed. Within a matter of hours I watched my own children undergo withdrawal symptoms usually reserved for heroin and other drug addicts.

Years have passed since then, and now, finally, the harms associated with these UNNECESSARY, evil algorithms are under scrutiny. It is hoped these mind-manipulating methods will be banned, as they are clearly destroying not only the fabric of society but also the minds of users, which, according to neuroscientists, show detectable brain 'wiring' anomalies. Google and Facebook as they stand today are indeed EVIL.


Important article from Alternet follows:

Big Tech's algorithms have turned billions of device users into addicts whose every whim is tracked and baited.
by Steven Rosenfeld

Everywhere you look, high tech is in somebody’s bullseye. Take Apple. On the inside, top investors are worried about its products’ effects on children. On the outside, liberal activists are grousing about the offshore billions it can now bring home under the new GOP tax "reform." Even the usually anti-regulation conservatives at the National Review are asking why Big Tech isn’t regulated like Big Oil or Big Tobacco.

These examples, all recently in the news, confirm the trend but skim the surface. New national polling has found public opinion is shifting from a warm embrace to growing skepticism. It’s not just the way so-called fake news on social media had a role in recent elections in the U.S. and led to congressional inquiries. And it's not just calls for federal anti-trust actions aimed at the most popular information curators, Facebook and Google.

Beyond these dots that attest to a backlash is understanding what’s really going on below the screens and in the minds of Facebook’s 2 billion users and Google-owned YouTube’s 1.5 billion users. There is a new phrase describing this sphere of human activity, the technology behind it and its effects. What’s being called the attention economy is coming under new scrutiny because it's seen as undermining the journalism profession as well as trust in public institutions and democracy.

“We come here in friendship,” said Anthony Marx, president of the New York Public Library and co-chair of the Knight Commission on Trust, Media and Democracy, at Stanford University this week. The panel was created last fall to try to fix the attention economy’s biggest problems, which also include the way Google search and Facebook have demoted the visibility of independent media under the guise of fighting fake news.

Marx’s comments elicited nervous laughter, because he had just presided over a panel that laid out in vivid and disturbing detail how Silicon Valley’s best minds have created brain-tracking, brain-mimicking and brain-triggering computational formulas. These algorithms have turned billions of digital device users into information addicts—and when put at the service of supercomputers, targeting online advertising or content placement, they have fractured society as never before.

“We’re at Stanford; the belly of the beast. This is where it all started. Which is why we’re here, why we need to understand what you all are thinking,” Marx said, speaking to panelists, an audience filled with tech executives and commission members culled from some of the monopolistic companies under attack.

“Let’s be very clear about two things,” he said, proceeding diplomatically. “One, you all have created this amazing tool. If you had said to me as a child that I would have something in my pocket that could connect me to all the information in the world, potentially, I wouldn’t have believed you. That’s astonishing. So thank you. That’s the good news.”

“The less good news is this is not taking us to a good place,” he continued. “It has not taken us to a good place. And I’m not supposed to say any of this, but your industry, the industry that you all are a part of, I think the world has changed its view of the industry in the last year, in a way that I have never seen the likes of before. Meaning what was thank you, is now uh-oh. Bad things are coming of this, and that puts us all on the spot, which is why we are here. We want to understand what is possible—what can be better.”

The Attention Economy

Those outside Silicon Valley’s innermost circles cannot access or evaluate the algorithms powering Facebook’s news feeds and advertising-driven content placement, or YouTube’s engine that recommends other videos individual users might like. However, panelists at the Commission’s Stanford University session were exceptionally articulate and forthcoming about the nature and goals of the algorithms, better described as brain-mimicking artificial intelligence.

One of the most outspoken explainers and critics was Tristan Harris, a former “design ethicist” at Google (his company was acquired by Google in 2011) who now runs a non-profit, Time Well Spent, which seeks to improve Big Tech’s impact on society. What this under-40 ex-CEO said was as stunning as the seemingly blasé reaction from his industry colleagues.

Harris said the attention economy, or the media on everyone’s smartphones and computers, is not just the endless marketing we all see. There’s a deeper reason why many established news sources can be supplanted by shadowy propaganda on major platforms, why facts can be outrun by opinions and lies, and why narrower tribal loyalties can usurp democratic institutions.

Harris pointed the finger of blame at the heart and circulatory system of Silicon Valley. Its artificial intelligence algorithms are designed to trigger brain responses and to be addictive, he said. They power a business model loosely called online advertising, but that model is really a superstructure that cashes in by targeting and provoking shared interests via curated content, and in doing so it separates society into disconnected spheres.

“There’s the public rhetoric about what [information] technology ought to do and what the positive intentions are. But then there’s the reality, if you actually go inside the companies, and hear the engineers and designers talk about their daily objectives, everything comes down to what’s going to hook people to staying on the screen,” Harris said. “No matter what the positive intentions are, 2 billion people do wake up in the morning right now and they have one of these things in their pockets, and they do use one of a handful of services. As my colleague Roger McNamee, who is [Facebook founder Mark] Zuckerberg’s mentor likes to say, there’s 2 billion people who use Facebook, that’s more than the number of followers of Christianity; 1.5 billion people use YouTube, that’s about the number of followers of Islam. These products have more influence over our daily thoughts than many religions and certainly more than any government.”

When John Lennon said the Beatles were bigger than Jesus in 1966, he created an international uproar. The band received death threats and had to stop touring. But when Harris said Facebook was more popular than Jesus and YouTube served more people than entire continents, these breathtaking assertions barely raised eyebrows. That scale underscores why high tech’s biggest successes are facing a reckoning, from congressional inquiries on Russian meddling in the 2016 presidential election, to new calls to break up tech monopolies under anti-trust laws, to solution-seeking forums like the Knight Commission.

Harris studied how brains make choices at Stanford and went on to create technologies and a company that tapped the “invisible influences that hijack human thinking,” as his bio puts it. But that technology now poses an existential threat to humanity, he said, because it is growing beyond the ability of any one company, or any handful of attention-economy monopolies, to control.

“We’re a species that…can study our own ability to be manipulated,” he said. “We have to talk about the advertising-based business model, which, paired with artificial intelligence, poses an existential threat. We have to get really serious about this. If you think about where are the most powerful AIs in the world located right now? Arguably, at two companies: Google and Facebook. The most powerful AIs in the world.

“Instead of pointing them at a challenge like climate change, and saying, let’s solve that, or pointing it at drug discovery for cancer, and saying, let’s solve that, we have pointed the most powerful AI supercomputers in the world at your brain. And we basically said, play chess against this brain and figure out what will engage it the best. And so every time we open up a news feed, we’re playing chess against a supercomputer that’s designed to see 50 million steps ahead on the chessboard of your mind, and figure out what will perfectly engage you.”
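Stripped of the chess metaphor, what Harris describes is an optimization loop: predict which item will engage a given brain, serve it, observe the response, update, repeat. A minimal sketch of that loop in Python follows; the category names, click probabilities and the epsilon-greedy strategy are illustrative assumptions, not any platform's actual system.

import random

# Toy engagement-maximizing feed: an epsilon-greedy bandit.
# All category names and probabilities are invented for illustration.
CATEGORIES = ["outrage", "cute_animals", "news", "conspiracy"]
shows = {c: 0 for c in CATEGORIES}   # times each category was served
clicks = {c: 0 for c in CATEGORIES}  # engagements observed per category

def choose(epsilon=0.1):
    """Mostly exploit the best-performing category; occasionally explore."""
    if random.random() < epsilon:
        return random.choice(CATEGORIES)
    return max(CATEGORIES,
               key=lambda c: clicks[c] / shows[c] if shows[c] else 0.0)

def feed_step(click_prob):
    """Serve one item and record whether the simulated user engaged."""
    c = choose()
    shows[c] += 1
    if random.random() < click_prob[c]:
        clicks[c] += 1

# A simulated user who engages most readily with outrage content.
user = {"outrage": 0.5, "cute_animals": 0.3, "news": 0.1, "conspiracy": 0.4}
for _ in range(10000):
    feed_step(user)

print({c: round(shows[c] / 10000, 3) for c in CATEGORIES})
# The loop settles on serving mostly whatever this user clicks on;
# nothing in the objective represents the user's wellbeing.

The toy makes the design choice visible: the only quantity the loop optimizes is the engagement count, which is precisely Harris's complaint.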

The results are not always pretty, he said, a remark others echoed.

“When you think about the global consequences of this—the fact that this supercomputer is doing this in languages, and in countries, the engineers and the companies don’t even speak [or live in], which is how you get the Rohingya genocide in Burma. And how you get some fake news post creating certain deaths in India, in South Sudan. The engineers can’t put this thing back in the bag. We have created exponential impact without exponential sensitivity.”

It’s even worse than that, explained panelist Gina Bianchini, the founder and CEO of Mighty Networks, which specializes in niche social networks. She repeatedly said there is a race in Silicon Valley to break Facebook and Google’s information monopolies, one that involves algorithms teaching—programming—themselves to execute a range of tasks, including bringing content to people who will become hubs in their own information networks. (Silicon Valley calls this "machine learning.")
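That parenthetical is worth unpacking, because "machine learning" here means nothing more exotic than a program that adjusts its own parameters from feedback rather than following hand-written rules. A minimal sketch, assuming an online logistic-regression click predictor updated one impression at a time (the features and click history are invented for illustration):

import math

# Toy self-updating click predictor: online logistic regression.
weights = [0.0, 0.0, 0.0]  # one weight per feature

def predict(features):
    """Predicted probability of a click for this item."""
    z = sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

def update(features, clicked, lr=0.1):
    """One gradient step on logistic loss: the 'teaching itself' part."""
    error = predict(features) - (1.0 if clicked else 0.0)
    for i, x in enumerate(features):
        weights[i] -= lr * error * x

# Invented feature vectors: [contains_outrage_word, has_image, from_friend]
history = [([1, 0, 0], True), ([0, 1, 1], False),
           ([1, 1, 0], True), ([0, 0, 1], False)] * 500

for features, clicked in history:
    update(features, clicked)

print([round(w, 2) for w in weights])
# The weight on the first feature grows large. No engineer wrote a rule
# saying "favour outrage"; the model inferred it from the click stream.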

While she lauded the virtues of more competition, she and others described artificial intelligence as approaching a threshold beyond which getting the biggest players into a room to agree on solutions will no longer be possible. That is because artificial intelligence is becoming so decentralized that the ethical problems highlighted by Harris will be beyond anyone’s ability to rein in; Silicon Valley and Big Tech are not a monolithic entity.

“There’s actually a scarier thing happening, which is today you can talk to two companies. Somebody shows up from Google. Somebody shows up from Facebook and wants to have the conversation, because they have the monopoly over attention today and over the advertising revenue,” she said. “The natural progression of software and where technology goes is it bends toward decentralization. It bends toward distributed technologies. Who do you talk to at that point?”

Bianchini gave an example that underscored why traditional anti-trust laws and government regulation are hopelessly outdated and ill-equipped to deal with the attention economy’s dark side. She cited Napster, which allowed music lovers to share audio files, so the recording industry sued and shut it down.

“We were able to shut down Napster, and the next thing that happened was BitTorrent [software], where there was nothing to shut down. That is where the world is going.”

What Would George Orwell Do?

These criticisms and explanations were not entirely rejected by their targets in the room. But as is often the case in high-stakes hearings, core issues were sidetracked by expanding the focus rather than sticking with key questions, such as whether the attention economy’s biggest players would change what powers their addictive algorithms and micro-targeted advertising.

Take commission member Richard Gingras, vice president of news for Google. Before asking questions, he said there are two historic developments to keep in mind. First, the internet put the “means of communication, the printing press, in everyone’s hands.” That has history-making benefits and challenges.

“We have diversity of information like we have never seen before. Some of that diversity is troubling. That’s par for the course, what freedom of the press is all about. Such that I’ve often posed the question of, is the true challenge to democracy the fact that we have unfettered free expression… that’s one component,” said Gingras.

“The second is I think the points about [ad and content] targeting are fair in the sense that we do have—and by we, it’s permeated throughout the ecosystem—companies have the ability to target or leverage targeting beyond the dreams of any direct marketer or in the history of politics. Here, again, it's not that the behaviors are necessarily different, it’s just that they are more efficient,” he said. “These seem to me to be the key changes.”

Big technological changes always have intended and unintended consequences, Gingras said. History is filled with examples from professions that had to adapt, he said, adding that's what the media and political culture need to do.

“It’s not sufficient to simply talk about this through the lens of technology,” Gingras said. “It is incumbent on the rest of society and its institutions to think about and address it as well. When you look at [an] environment where people are consuming information in different ways, forming opinions in different ways, this seems to me to suggest that we should rethink the mechanisms of journalism.”

“How we interact with our audiences,” he continued. “How we formulate content. The content models we use. Even business models we use to get there… How do these other institutions have to change? How do our basic cultural approaches to transparency and trust need to change to help folks understand why they are seeing what they are seeing.”


Gingras did not make this comment in a vacuum. He co-founded an initiative called the Trust Project, based at the Santa Clara University Journalism School, which urges news organizations to better label their online content and revise their websites so that search algorithms can elevate more authoritative content. That will help media stand out in the attention economy. Of course, it also helps Google deliver better searches; because Google search, unlike Facebook, directs users away from its own site, better results will fortify its search monopoly.
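The mechanism behind that pitch is straightforward to sketch: publishers attach machine-readable trust indicators to their pages, and a ranking function gives labeled pages a boost. The toy scorer below illustrates the idea only; the field names, weights and URLs are assumptions, not the Trust Project's actual indicator schema or Google's ranking.

# Toy search ranking with a boost for trust-indicator metadata.
# Field names, weights and URLs are invented for illustration.
TRUST_BOOST = 0.25

pages = [
    {"url": "https://example.com/a", "relevance": 0.80, "trust_labels": {}},
    {"url": "https://example.com/b", "relevance": 0.72,
     "trust_labels": {"author_bio": True, "corrections_policy": True}},
]

def score(page):
    """Base relevance plus a flat boost when any trust labels are present."""
    boost = TRUST_BOOST if page["trust_labels"] else 0.0
    return page["relevance"] + boost

for page in sorted(pages, key=score, reverse=True):
    print(round(score(page), 2), page["url"])
# The labeled page now outranks the slightly more "relevant" unlabeled one,
# which is exactly the visibility effect described above.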

As the Knight Commission’s public sessions came to a close, the issue Silicon Valley opposes most (second only to revealing its secretive computational formulas) was raised: what would be the result of government regulation, including the possibility of anti-trust actions breaking up the attention economy monopolies?

That question prompted one of the fiercest exchanges, and while unresolved, it suggests that Facebook and Google are going to have to become more transparent or face even greater backlash.

Gina Bianchini: “I have a very low confidence that the solutions are going to come from regulation. The solutions are going to come from the fact that we are building a grassroots mass motivation to move around centralization, which is going to be a whole different conversation."

Richard Gingras: “I find this thread a little bit puzzling. If I had heard the discussion about possible solutions to the problem, absent any knowledge of problem, I would have thought that we were talking about the fact that we actually have a problem with monolithic information in a society which is over-guided and controlled in one direction. Right? But of course that’s actually not the problem we’re facing. In fact, the problem we’re facing is one that is completely the opposite. We have tremendous diversity and points of views, silos of thought, reinforced silos of thought, from one end of the spectrum to the other and around and back again. So when I look at that problem, I wonder what problem are we really trying to solve, and how? I’m failing to see the dots connected on this.”

Gina Bianchini: “From a monolithic perspective, who’s controlling that algorithm?"

Richard Gingras: “But the algorithm…”

Gina Bianchini: “It’s two companies [Facebook and Google].”

Richard Gingras: “This putative control isn’t controlling people’s points of view. If anything, it’s recommending various points of view beyond their own level of comfort.”

Ethan Zuckerman, director of the Center for Civic Media at MIT, and commission consultant: “It’s impossible to know that from the outside world. It’s literally impossible.”

Richard Gingras: “Outside world. It’s not hard looking at our world today to say we have a society that’s less unified than ever before.”

Ethan Zuckerman: “And you can ask a question… about whether this information environment, around Facebook and Google, took a very extreme part of that and made it much, much more powerful. But we have a very, very hard time auditing that... All I am trying to say is that one thing short of regulation, and actually breaking up these entities, would be paths to a great deal more transparency, so we can ask these hard questions about how these platforms are shaping the information and knowledge that we are getting.”

The Knight Commission will continue meeting through 2018 before issuing a report and recommendations next fall. But in several brief hours at Stanford University’s alumni center, it laid out the issues, challenges and stakes of an attention economy in which psychological manipulation and micro-targeting are used by the top information curators.

Notably, late Friday, Mark Zuckerberg announced that Facebook would soon ask its 2 billion users to rate the trustworthiness of the media in their news feeds. That may help identify more and less trustworthy news sources according to each user’s values. But it won’t get at the “invisible influences that hijack human thinking,” as Tristan Harris put it. Nor will it address the societal segmentation accelerated by online ad technology, which Gingras acknowledged. Nor would it add the transparency to the algorithms powering these information monopolies that MIT’s Zuckerman called for.
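Mechanically, a rating scheme like the one announced reduces to aggregating users' scores per source and weighting each user's feed accordingly. A minimal sketch under that assumption (all names and numbers invented) also shows the limitation noted above: when the personal term dominates, each feed simply mirrors what its user already trusts.

from collections import defaultdict

# Toy trustworthiness aggregation; all ratings are invented.
ratings = [  # (user, source, rating on a 1-5 scale)
    ("alice", "DailyNews", 5), ("alice", "RumorMill", 1),
    ("bob",   "DailyNews", 2), ("bob",   "RumorMill", 5),
]

def community_trust():
    """Global average rating per source."""
    totals, counts = defaultdict(float), defaultdict(int)
    for _, source, r in ratings:
        totals[source] += r
        counts[source] += 1
    return {s: totals[s] / counts[s] for s in totals}

def feed_weight(user, source, personal_weight=0.7):
    """Blend the user's own rating with the community average."""
    personal = {s: r for u, s, r in ratings if u == user}
    community = community_trust()
    return (personal_weight * personal.get(source, 3.0)
            + (1 - personal_weight) * community[source])

print(feed_weight("alice", "RumorMill"))  # 1.6: demoted in alice's feed
print(feed_weight("bob", "RumorMill"))    # 4.4: promoted in bob's feed
# Same source, opposite treatment: the scheme reflects each user's values
# rather than any external standard of trustworthiness.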

Indeed, as New York Times tech columnist Farhad Manjoo noted this week, in a piece pondering whether Apple might save the day by adding elegant product features to blunt the excesses of the digital ad business: “I’m skeptical they’ll [the leaders of the attention economy] be able to suppress their economic interests.”
Copyright applies.

