ANTI-SOCIAL MEDIA




MEDIA FOR MORONS

Hello, Meta. CEO Mark Zuckerberg announced Thursday that Facebook, the company, is changing its name to Meta to reflect its growing focus on the metaverse. "From now on, we're going to be the Metaverse first, not Facebook first," Zuckerberg said at the company's annual Connect Conference Thursday.


OUR BATTLE CRY IS HIDE THE MESS... ER, WE'LL CALL IT META


Facebook (FB) CEO Mark Zuckerberg has drawn plenty of fire for staying conspicuously silent since news broke this week that as many as 50 million Facebook users may have had their data gathered and used improperly by a political research organization. But any calls to admonish or even replace the 33-year-old chief executive would be extremely difficult to act on, analysts and experts say.

That's because of Facebook's controversial dual-class stock structure, which gives Zuckerberg majority voting power. According to The Economist, Zuckerberg owns only about 16% of Facebook's stock but controls roughly 60% of the voting rights, since the Class B shares he owns carry 10 times the votes of the Class A shares.
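As a rough illustration of how that math works, here is a minimal sketch in Python, assuming made-up round share counts (these are not Facebook's actual figures, and it treats all non-founder shares as one-vote Class A stock):

    # Hypothetical dual-class voting math; share counts are illustrative only.
    CLASS_A_VOTES = 1     # one vote per Class A share
    CLASS_B_VOTES = 10    # ten votes per Class B share, as described above

    total_shares = 1_000
    founder_b_shares = 160                             # a 16% equity stake, all Class B
    public_a_shares = total_shares - founder_b_shares  # remaining 84%, all Class A

    founder_votes = founder_b_shares * CLASS_B_VOTES   # 1,600 votes
    public_votes = public_a_shares * CLASS_A_VOTES     #   840 votes

    print(f"equity stake: {founder_b_shares / total_shares:.0%}")                 # 16%
    print(f"voting power: {founder_votes / (founder_votes + public_votes):.0%}")  # 66%

With these toy numbers the founder ends up with about 66% of the votes; the real-world split lands closer to the 60% cited above because actual holdings are divided differently, but the mechanism is the same: the 10x Class B multiplier turns a minority equity stake into voting control.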

On Tuesday, Facebook's stock fell nearly 3% after falling almost 7% on Monday, and pressure has been growing on Zuckerberg and other top executives to publicly address the issue. The Federal Trade Commission is reportedly investigating whether Facebook mishandled user data, and British and American lawmakers have called on Zuckerberg and other executives at Facebook to answer questions about what happened. Meanwhile, The Verge reported that Facebook planned to hold an emergency internal meeting on Tuesday morning to discuss the situation with employees.

Like fellow tech companies Alphabet Inc. (GOOGL) and Snap Inc. (SNAP), Facebook has a dual-class stock structure that vests a disproportionate amount of voting power in the founder-CEO, leaving him far less beholden to the board than would otherwise be the case.

"On one extreme, a company's board holds management accountable; on the other extreme, it advises management on issues," said A.T. Kearney lead partner of retail practice Greg Portell. "What's unique about Facebook's board is that management is the primary shareholder... advising and oversight being one and the same." 

At the same time, however, Portell noted that Facebook's board is not simply a rubber-stamp organization composed of friends and family.

New York (CNN Business)  The identity of the Facebook whistleblower who released tens of thousands of pages of internal research and documents — leading to a firestorm for the social media company in recent weeks — was revealed on "60 Minutes" Sunday night as Frances Haugen.

The 37-year-old former Facebook product manager who worked on civic integrity issues at the company says the documents show that Facebook knows its platforms are used to spread hate, violence and misinformation, and that the company has tried to hide that evidence. 

"The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook, and Facebook over and over again chose to optimize for its own interests, like making more money," Haugen told "60 Minutes." 

"60 Minutes" correspondent Scott Pelly quoted one internal Facebook (FB) document as saying: "We have evidence from a variety of sources that hate speech, divisive political speech and misinformation on Facebook and the family of apps are affecting societies around the world.


SIDEBAR:  Facebook Executive Says It's 'Ludicrous' To Blame Jan. 6 On Social Media — You did not light the fire, you just supplied the fuel and the matches, you asshole.


About a month ago, Haugen filed at least eight complaints with the Securities and Exchange Commission alleging that the company is hiding research about its shortcomings from investors and the public. She also shared the documents with the Wall Street Journal, which published a multi-part investigation showing that Facebook was aware of problems with its apps, including the negative effects of misinformation and the harm caused, especially to young girls, by Instagram. 

Haugen, who started at Facebook in 2019 after previously working for other tech giants like Google (GOOGL) and Pinterest (PINS), is set to testify on Tuesday before the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security.

"I've seen a bunch of social networks, and it was substantially worse at Facebook than anything I've seen before," Haugen said. "At some point in 2021, I realized I'm going to have to do this in a systemic way, that I'm going to have to get out enough [documents] that no one can question that this is real." 

Facebook has aggressively pushed back against the reports, calling many of the claims "misleading" and arguing that its apps do more good than harm.

"Every day our teams have to balance protecting the ability of billions of people to express themselves openly with the need to keep our platform a safe and positive place," Facebook spokesperson Lena Pietsch said in a statement to CNN Business immediately following the "60 Minutes" interview. "We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true." 

Several hours after the interview aired, Pietsch released a statement of more than 700 words laying out what the company called "missing facts" from the segment and saying the interview "used select company materials to tell a misleading story about the research we do to improve our products."

A spokesperson for "60 Minutes" did not immediately respond to a request for comment from CNN Business on Facebook's claims.

On Sunday morning, ahead of the "60 Minutes" interview, Facebook Vice President of Global Affairs Nick Clegg told CNN's Brian Stelter that "there is no perfection on social media as much as in any other walk of life."

"We do a huge amount of research, we share it with external researchers as much as we can, but do remember there is ... a world of difference between doing a peer-reviewed exercise in cooperation with other academics and preparing papers internally to provoke and inform internal discussion," Clegg said.

Haugen said she believes Facebook founder and CEO Mark Zuckerberg "never set out to make a hateful platform, but he has allowed choices to be made where the side effects of those choices are that hateful and polarizing content gets more distribution and more reach."

SIDEBAR:  Solution — It's time to break up Facebook, even dismantle it, and bring some sense to the monster; if that means hanging a few execs out in the wilderness with a rope around their necks, so be it. The harm has been done.



A Very Brave Woman  —  Frances Haugen, a former Facebook product manager, was revealed on Sunday as the whistleblower who leaked thousands of pages of internal research and documents that have created a firestorm for the social media company. 

Whistleblower Revealed  —  Haugen said she was recruited by Facebook in 2019 and took the job to work on addressing misinformation. But after the company decided to dissolve its civic integrity team shortly after the 2020 Presidential Election, her feelings about the company started to change. 

She suggested that this decision — and moves by the company to turn off other election protection measures such as misinformation prevention tools — allowed the platform to be used to help organize the January 6 riot on Capitol Hill. 

"They basically said, 'Oh good, we made it through the election, there weren't riots, we can get rid of civic integrity now,'" she said. "Fast forward a couple of months, and we had the Insurrection. When they got rid of civic integrity, it was the moment where I was like, 'I don't trust that they're willing to actually invest what needs to be invested to keep Facebook from being dangerous.'" 

Facebook says the civic integrity team's work was distributed to other units when it was dissolved. Facebook Vice President of Integrity Guy Rosen said on Twitter Sunday night that the group was integrated into other teams so the "work pioneered for elections could be applied even further."

The social media company's algorithm that's designed to show users content that they're most likely to engage with is responsible for many of its problems, Haugen said. 


Facebook Grilled By Senate Over Company's Impact On Kids — "One of the consequences of how Facebook is picking out that content today is that it is optimizing for content that gets engagement, a reaction, but its own research is showing that content that is hateful, that is divisive, that is polarizing, it's easier to inspire people to anger than it is to other emotions," she said. She added that the company recognizes that "if they change the algorithm to be safer, people will spend less time on the site, they'll click on less ads," and, as we always say, follow the money:

"THEY'LL MAKE LESS MONEY."


Facebook's Pietsch said in her Sunday night statement that the platform depends on "being used in ways that bring people closer together" to attract advertisers, adding, "protecting our community is more important than maximizing our profits."

In an internal memo obtained by the New York Times earlier Sunday, Clegg disputed claims that Facebook contributed to the January 6 riot.

"Social media has had a big impact on society in recent years, and Facebook is often a place where much of this debate plays out," Clegg said in the memo. "So it's natural for people to ask whether it is part of the problem. But the idea that Facebook is the chief cause of polarization isn't supported by the facts."

Haugen said that while "no one at Facebook is malevolent ... the incentives are misaligned."

"Facebook makes more money when you consume more content. People enjoy engaging with things that elicit an emotional reaction," she said. "And the more anger that they get exposed to, the more they interact and the more they consume.


Dangerous Turf For Kids — A Facebook executive was grilled by senators on Thursday about the impact the company's apps have on younger users, two weeks after an explosive report indicated the company was aware that Facebook-owned Instagram could have a "toxic" effect on teen girls.

The hearing, featuring Facebook's global head of safety, Antigone Davis, is the first of two that the Senate Commerce Committee is holding on how Facebook approaches its younger users. Next week, the committee is expected to receive testimony from a Facebook whistleblower. 

"We now know that Facebook routinely puts profits ahead of kids' online safety. We know it chooses the growth of its products over the well-being of our children," Democratic Sen. Richard Blumenthal said in opening remarks at the hearing. "And we now know it is indefensibly delinquent in acting to protect them."

"The question that haunts me," Blumenthal added, "is how can we, or parents, or anyone, trust Facebook?"

In a sign of the bipartisan pressure on this issue, Republican Sen. Marsha Blackburn echoed Blumenthal in her opening remarks directed at Facebook. "We do not trust you with influencing our children," she said.


Facebook Is Hitting The Brakes On Instagram For Kids  —  The Wall Street Journal reported earlier this month that researchers at Facebook have been conducting studies for the past three years into how Instagram, which it owns, affects its millions of young users. The research shows the platform can damage mental health and body image, especially among teenaged girls. 

Blumenthal said his office created an Instagram account identifying as a 13-year-old girl. It followed some easily discoverable accounts associated with extreme dieting and eating disorders. Within a day, he said, the Instagram recommendations were "exclusively filled" with other accounts that promoted self-harm and eating disorders. (Davis said those accounts would be in violation of Instagram's policies that crack down on content promoting self-harm.)

Following the Journal report, Instagram said it was looking at new ways to discourage users from focusing on their physical appearance. The company also said that while Instagram can be a place where people have "negative experiences," the app also gives a voice to marginalized people and helps friends and family stay connected.

"What's been lost in this report is that in fact with this research, we've found that more teen girls actually find Instagram helpful -- teen girls who are suffering from these issues find Instagram helpful than not," Davis said Thursday. "Now that doesn't mean that the ones that aren't, aren't important to us. In fact, that's why we do this research."


Davis, who identified herself as a mother and former teacher, also pushed back on the idea that the report was a "bombshell" and did not commit to releasing a full research report, noting potential "privacy considerations." She said Facebook is "looking for ways to release more research."

SIDEBAR:  TRANSLATION — This statement means she needs more time to make up better lies. Listening to her was like listening to Ilse Koch, "The Bitch of Buchenwald." A lying turnoff.

"If you need to do more research on this, you should fire everyone you paid to do research," replied Sen. Ed Markey. "IG stands for Instagram, but it also stands for Insta-greed."

“If Facebook has taught us anything, it’s that self-regulation is not an option,” added Markey, comparing Facebook to big tobacco companies that pushed deadly products on children and teens. “Instagram is that first childhood cigarette.”

The report, and the renewed pressure from lawmakers in its aftermath, also appeared to force Instagram to rethink its plans to introduce a version of its service for kids under 13. Days before the hearing this week, Instagram said it would press pause on the project.

"While we stand by the need to develop this experience, we've decided to pause this project," Adam Mosseri, head of Instagram, wrote in a blog post published Monday. "This will give us time to work with parents, experts, policymakers and regulators, to listen to their concerns, and to demonstrate the value and importance of this project for younger teens online today."

In the blog post Monday, Mosseri acknowledged that the Journal's reporting "has raised a lot of questions for people." Those questions may only persist after the hearing.

"For 2.5 hours, Ms. Davis offered evasions and misdirections, but refused to commit to a single substantive change or even greater transparency," Josh Golin, executive director at Fairplay, a child advocacy group formerly known as the Campaign for a Commercial-Free Childhood, said in a statement Thursday. "She also continued to push the fiction that Facebook's interest in Instagram Kids is driven by concern for children's safety when the company's own leaked documents make clear it's part of a larger strategy for growth and to compete with TikTok and Snap for young users.

“While Facebook publicly denies that Facebook is harmful for teens, privately Facebook experts and researchers have been ringing the alarm for years,” said Sen. Richard Blumenthal.

Shortly afterward, the Journal published a more comprehensive series of slides than those that were released by Facebook, including a slideshow called “Teen Girls Body Image and Social Comparison on Instagram.” 

That report included a study showing that 66 percent of teen girls and 40 percent of teen boys on Instagram “experience negative social comparison.” When teen girls felt bad about their bodies, 32 percent said Instagram made them feel worse, according to the slides shared by the Journal. 

Davis said Facebook was trying to release more of its Instagram studies but did not provide a concrete commitment or timeline. 

SIDEBAR:  They pay this lying bitch?

12/02/2021   aljacobsladder.com