Thursday 22nd of March 2018

people will believe the worst...


Mark Zuckerberg

As the world becomes more complex and governments everywhere struggle, trust in the internet is more important today than ever.

The internet is our shared space. It helps us connect. It spreads opportunity. It enables us to learn. It gives us a voice. It makes us stronger and safer together.

To keep the internet strong, we need to keep it secure. That's why at Facebook we spend a lot of our energy making our services and the whole internet safer and more secure. We encrypt communications, we use secure protocols for traffic, we encourage people to use multiple factors for authentication and we go out of our way to help fix issues we find in other people's services.
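Of the measures listed above, "multiple factors for authentication" is the easiest to make concrete. As a purely illustrative sketch (not Facebook's actual implementation), here is a minimal time-based one-time password (TOTP) generator of the kind most "second factor" apps use, following RFC 6238:

```python
import hashlib
import hmac
import struct


def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226)."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation: last nibble picks the offset
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


def totp(secret: bytes, unix_time: float, step: int = 30, digits: int = 6) -> str:
    """Time-based OTP (RFC 6238): HOTP over a 30-second time counter."""
    return hotp(secret, int(unix_time) // step, digits)
```

The server and the user's device share the secret; both derive the same short-lived code from the current time, so a stolen password alone is not enough to log in.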

The internet works because most people and companies do the same. We work together to create this secure environment and make our shared space even better for the world.

This is why I've been so confused and frustrated by the repeated reports of the behavior of the US government. When our engineers work tirelessly to improve security, we imagine we're protecting you against criminals, not our own government.

The US government should be the champion for the internet, not a threat. They need to be much more transparent about what they're doing, or otherwise people will believe the worst.

I've called President Obama to express my frustration over the damage the government is creating for all of our future. Unfortunately, it seems like it will take a very long time for true full reform.

So it's up to us -- all of us -- to build the internet we want. Together, we can build a space that is greater and a more important part of the world than anything we have today, but is also safe and secure. I'm committed to seeing this happen, and you can count on Facebook to do our part.


a social weak link...

Facebook has received criticism on a wide range of issues, including its treatment of its users, online privacy, child safety, hate speech, and the inability to terminate accounts without first manually deleting the content. In 2008, many companies removed their advertising from the site because it was being displayed on the pages of individuals and groups they found controversial. The content of some user pages, groups, blogs, and forums has been criticized for promoting or dwelling upon controversial and often divisive topics (e.g., politics, religion, sex). There have been several censorship issues, both on and off the site.

In the lifespan of its service, Facebook has made many changes that directly impact its users, and these changes have often drawn criticism. Of particular note are the new user interface format launched in 2008, and the changes in Facebook's Terms of Use, which removed the clause detailing automatic expiry of deleted content. Facebook has also been sued several times.[1]

On August 19, 2013, it was reported that Khalil Shreateh, a Facebook user from the Palestinian territories, had found a bug that allowed him to post material to other users' Facebook Walls. Users are not supposed to be able to post to another user's Wall unless they are that user's approved friend. To prove the bug was real, Shreateh posted material to the Wall of Sarah Goodin, a friend of Facebook CEO Mark Zuckerberg.

Shreateh then contacted Facebook's security team with the proof, explaining in detail what was going on. Facebook runs a bounty program that pays a fee of $500 or more for reporting bugs rather than exploiting them or selling them on the black market. However, instead of fixing the bug and paying Shreateh the fee, Facebook originally told him that "this was not a bug" and dismissed him. Shreateh tried a second time to inform Facebook, and was dismissed again. On the third try, he used the bug to post a message to Mark Zuckerberg's own Wall, stating "Sorry for breaking your privacy ... but a couple of days ago, I found a serious Facebook exploit" and that Facebook's security team was not taking him seriously. Within minutes, a security engineer contacted Shreateh, questioned him on how he had done it, and ultimately acknowledged that it was a bug in the system.

Facebook temporarily suspended Shreateh's account and fixed the bug after several days. However, in a move met with much public criticism and disapproval, Facebook refused to pay out the $500+ fee; it responded that by posting to Zuckerberg's account, Shreateh had violated its terms of service and therefore "could not be paid." The Facebook team also strongly censured Shreateh over his manner of resolving the matter. In closing, they asked that Shreateh continue to help them find bugs.
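For readers unfamiliar with the jargon: the rule Shreateh bypassed is an ordinary server-side authorisation check. A hypothetical sketch (the names and data structures are invented for illustration, not Facebook's code) of "only approved friends may post to a user's Wall" might look like this:

```python
def can_post_to_wall(poster: str, wall_owner: str,
                     friendships: set) -> bool:
    """Return True if `poster` may write to `wall_owner`'s Wall.

    Everyone may post to their own Wall; otherwise the pair must be
    approved friends. The bug Shreateh found let posts through even
    where a check like this should have refused them.
    """
    if poster == wall_owner:
        return True
    return frozenset((poster, wall_owner)) in friendships


# Example state: Shreateh and Goodin are friends; neither is a friend
# of Zuckerberg in this toy data.
friends = {frozenset(("shreateh", "goodin"))}
```

The point of the story is that the bug sat server-side: no amount of client behaviour should have been able to make this function's answer irrelevant.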

limited intelligence...


United States president Barack Obama has met with bosses from Facebook, Google and other internet giants to discuss plans to overhaul the surveillance practices of America's spy agencies.

Those attending included Google's executive chairman Eric Schmidt and Facebook founder Mark Zuckerberg, who said he had called the president personally last week to express frustration with the vast online intelligence dragnets.

A White House official says the meeting is part of Mr Obama's continuing dialogue on the issues of privacy, technology and intelligence.

"The president reiterated his administration's commitment to taking steps that can give people greater confidence that their rights are being protected, while preserving important tools that keep us safe," the White House said.

But Mr Zuckerberg, a public critic of government data gathering practices, says more needs to be done.

"While the US government has taken helpful steps to reform its surveillance practices, these are simply not enough," he said through a spokesperson.

"People around the globe deserve to know that their information is secure and Facebook will keep urging the US government to be more transparent about its practices and more protective of civil liberties," he said.

Some of the largest US technology companies, including Google, its rival Yahoo, social networking site Twitter and others, have been pushing for more transparency, oversight and restrictions to the US government's gathering of intelligence.


losing business...

IBM is spending more than a billion dollars to build data centers overseas to reassure foreign customers that their information is safe from prying eyes in the United States government.

And tech companies abroad, from Europe to South America, say they are gaining customers that are shunning United States providers, suspicious because of the revelations by Edward J. Snowden that tied these providers to the National Security Agency’s vast surveillance program.

Even as Washington grapples with the diplomatic and political fallout of Mr. Snowden’s leaks, the more urgent issue, companies and analysts say, is economic. Technology executives, including Mark Zuckerberg of Facebook, raised the issue when they went to the White House on Friday for a meeting with President Obama.

It is impossible now to see the full economic ramifications of the spying disclosures, in part because most companies are locked into multiyear contracts, but the pieces are beginning to add up as businesses question the trustworthiness of American technology products.


Facebook founder Mark Zuckerberg earned $3.3bn (£1.9bn) on the sale of share options in 2013, a new regulatory filing has revealed.

Mr Zuckerberg has now exhausted his supply of stock options as a result of Facebook's public offering.

He was given 60 million shares to help him with his tax bill.

His base salary for 2013 fell to $1, like other tech leaders such as Google's Larry Page and former Apple boss Steve Jobs.

However, his total compensation for the year was $653,165, down from $1.99m in 2012.

Facebook said the majority of that was to pay for flights on private jets, which are seen as necessary for security reasons.

Mr Zuckerberg still owns 426.3 million Facebook shares, which are worth around $25.7bn.

Shares in the social networking giant have more than doubled in value over the past year, as Facebook has reported better than expected earnings due to its strong mobile ad sales.

no idea what all this switcheroo means...


In a global rollout from today, Facebook will start removing the message function from its mobile app for iOS and Android and instead require users to install its standalone Messenger app, which, it says, is "fast and reliable."

Facebook is about to eliminate the message feature of its mobile app, pushing its users to install the company’s standalone app Messenger instead, TechCrunch reports.

The company has begun sending out notifications to users in Europe saying that the message service will disappear from Facebook’s main mobile app for iOS and Android in about two weeks.

“We have built a fast and reliable messaging experience through Messenger and now it makes sense for us to focus all our energy and resources on that experience,” the company said in a statement Wednesday, Reuters reports.

Users in a handful of European countries, including England and France, will be the first required to download the Messenger app, but eventually the message service will disappear from the main app for users in all countries, spokesman Derick Mains told Reuters.

read more:


emotional contagion is not new...

There are two interesting lessons to be drawn from the row about Facebook's "emotional contagion" study. The first is what it tells us about Facebook's users. The second is what it tells us about corporations such as Facebook.

In case you missed it, here's the gist of the story. The first thing users of Facebook see when they log in is their news feed, a list of status updates, messages and photographs posted by friends. The list that is displayed to each individual user is not comprehensive (it doesn't include all the possibly relevant information from all of that person's friends). But nor is it random: Facebook's proprietary algorithms choose which items to display in a process that is sometimes called "curation". Nobody knows the criteria used by the algorithms – that's as much of a trade secret as those used by Google's page-ranking algorithm. All we know is that an algorithm decides what Facebook users see in their news feeds.

So far so obvious. What triggered the controversy was the discovery, via the publication of a research paper in the prestigious Proceedings of the National Academy of Sciences, that for one week in January 2012, Facebook researchers deliberately skewed what 689,003 Facebook users saw when they logged in. Some people saw content with a preponderance of positive and happy words, while others were shown content with more negative or sadder sentiments. The study showed that, when the experimental week was over, the unwitting guinea-pigs were more likely to post status updates and messages that were (respectively) positive or negative in tone.
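Mechanically, the manipulation is easy to picture. A toy sketch (the word lists and filtering rule here are invented for illustration; the study reportedly scored posts with the LIWC word-count tool, and Facebook's real feed ranking is proprietary):

```python
# Invented word lists for illustration only.
POSITIVE = {"happy", "great", "love", "wonderful", "good"}
NEGATIVE = {"sad", "awful", "hate", "terrible", "bad"}


def sentiment_score(post: str) -> int:
    """Positive-word count minus negative-word count."""
    words = post.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)


def skew_feed(posts: list, mode: str = "positive") -> list:
    """Drop posts whose sentiment opposes `mode`; neutral posts pass through.

    This is the shape of the intervention: the experiment withheld a
    fraction of emotionally loaded posts from each treatment group's feed.
    """
    if mode == "positive":
        return [p for p in posts if sentiment_score(p) >= 0]
    return [p for p in posts if sentiment_score(p) <= 0]
```

Nothing in a skewed feed is fabricated; the user simply never sees the posts the filter withheld, which is what made the experiment invisible to its subjects.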

Statistically, the effect on users was relatively small, but the implications were obvious: Facebook had shown that it could manipulate people's emotions! And at this point the ordure hit the fan. Shock! Horror! Words such as "spooky" and "terrifying" were bandied about. There were arguments about whether the experiment was unethical and/or illegal, in the sense of violating the terms and conditions that Facebook's hapless users have to accept. The answers, respectively, are yes and no because corporations don't do ethics and Facebook's T&Cs require users to accept that their data may be used for "data analysis, testing, research".

Facebook's spin-doctors seem to have been caught off-guard, causing the company's chief operating officer, Sheryl Sandberg, to fume that the problem with the study was that it had been "poorly communicated". She was doubtless referring to the company's claim that the experiment had been conducted "to improve our services and to make the content people see on Facebook as relevant and engaging as possible".

read more:

tempora spies on your facebook...

A tribunal is to hear a legal challenge by civil liberty groups against the alleged use of mass surveillance programmes by UK intelligence services.

Privacy International and Liberty are among those to challenge the legality of alleged "interception, collection and use of communications" by agencies.

It follows revelations by the former US intelligence analyst Edward Snowden about UK and US surveillance practices.

The UK government says interception is subject to strict controls.

The case - also brought by Amnesty International and the American Civil Liberties Union and other groups - centres on the alleged use by UK intelligence and security agencies of a mass surveillance operation called Tempora.

The UK government has neither confirmed nor denied the existence of the operation.

But documents leaked by whistleblower Mr Snowden and published in the Guardian newspaper claimed the existence of Tempora, which the paper said allowed access to the recordings of phone calls, the content of email messages and entries on Facebook.


See toon at top and article below it...

real spooks on facebook...

Facebook, the world’s top social media platform, is reportedly seeking to hire hundreds of employees with US national security clearances, purportedly with the aim of weeding out “fake news” and “foreign meddling” in elections.

If that plan, reported by Bloomberg, sounds sinister, that’s because it is. For what it means is that people who share the same worldview as US intelligence agencies, the agencies who formulate classified information, will have a direct bearing on what millions of consumers on Facebook are permitted to access.

It’s as close to outright US government censorship on the internet as one can dare to imagine, and this on a nominally independent global communication network. Your fun-loving place “where friends meet.”

Welcome to Facespook!

As Bloomberg reports: “Workers with such [national security] clearances can access information classified by the US government. Facebook plans to use these people – and their ability to receive government information about potential threats – in the company’s attempt to search more proactively for questionable social media campaigns ahead of elections.”

A Facebook spokesman declined to comment, but the report sounds credible, especially given the context of anti-Russia hysteria.

Over the past year, since the election of Donald Trump as US president, the political discourse has been dominated by “Russia-gate” – the notion that somehow Kremlin-controlled hackers and news media meddled in the election. The media angst in the US is comparable to the Red Scare paranoia of the 1950s during the Cold War.

Facebook and other US internet companies have been hauled in front of Congressional committees to declare what they know about alleged “Russian influence campaigns.” Chief executives of Facebook, Google, and Twitter, are due to be questioned again next month by the same panels.

read more:

that unsocial shit on facebook is you...

Palihapitiya’s comments last month were made one day after Facebook’s founding president, Sean Parker, criticized the way that the company “exploit[s] a vulnerability in human psychology” by creating a “social-validation feedback loop” during an interview at an Axios event.

Parker had said that he was “something of a conscientious objector” to using social media, a stance echoed by Palihapitiya, who said that he was now hoping to use the money he made at Facebook to do good in the world.

“I can’t control them,” Palihapitiya said of his former employer. “I can control my decision, which is that I don’t use that shit. I can control my kids’ decisions, which is that they’re not allowed to use that shit.”

He also called on his audience to “soul search” about their own relationship to social media. “Your behaviors, you don’t realize it, but you are being programmed,” he said. “It was unintentional, but now you gotta decide how much you’re going to give up, how much of your intellectual independence.”

read more:

facebook: we can’t promise we won’t destroy democracy...

SAN FRANCISCO — Facebook Inc. warned Monday it could offer no assurance that social media was on balance good for democracy, but the company said it was doing what it could to stop alleged meddling in elections by Russia or anyone else.

The sharing of false or misleading headlines on social media has become a global issue, after accusations that Russia tried to influence votes in the United States, Britain and France. Moscow denies the allegations.

Facebook, the largest social network with more than 2 billion users, addressed social media’s role in democracy in blog posts from a Harvard University professor, Cass Sunstein, and from an employee working on the subject.

“I wish I could guarantee that the positives are destined to outweigh the negatives, but I can‘t,” Samidh Chakrabarti, a Facebook product manager, wrote in his post.

Facebook, he added, has a “moral duty to understand how these technologies are being used and what can be done to make communities like Facebook as representative, civil and trustworthy as possible.”

Contrite Facebook executives were already fanning out across Europe this week to address the company’s slow response to abuses on its platform, such as hate speech and foreign influence campaigns.

Read more:



Read from top...


The UK's Information Commissioner says she will seek a warrant to look at the databases and servers used by British firm Cambridge Analytica.

The company is accused of using the personal data of 50 million Facebook members to influence the US presidential election in 2016.

Its executives have also been filmed by Channel 4 News suggesting it could use honey traps and potentially bribery to discredit politicians.

The company denies any wrongdoing.

Fresh allegations

On Monday, Channel 4 News broadcast hidden camera footage in which Cambridge Analytica chief executive Alexander Nix appears to suggest tactics his company could use to discredit politicians online. 

In the footage, asked what "deep digging" could be done, Mr Nix told an undercover reporter: "Oh, we do a lot more than that."

He suggested one way to target an individual was to "offer them a deal that's too good to be true and make sure that's video recorded".

He also said he could "send some girls around to the candidate's house..." adding that Ukrainian girls "are very beautiful, I find that works very well".

Mr Nix continued: "I'm just giving you examples of what can be done and what has been done."

Channel 4 News said its reporter had posed as a fixer for a wealthy client hoping to get a political candidate elected in Sri Lanka.

However, Cambridge Analytica said the report had "grossly misrepresented" the conversations caught on camera.

"In playing along with this line of conversation, and partly to spare our 'client' from embarrassment, we entertained a series of ludicrous hypothetical scenarios," the company said in a statement.

"Cambridge Analytica does not condone or engage in entrapment, bribes or so-called 'honeytraps'," it said.

Mr Nix told the BBC's Newsnight programme that he regarded the report as a "misrepresentation of the facts" and said he felt the firm had been "deliberately entrapped".


Read more:


Read from top

no influence whatsoever on the morons...

Mark Zuckerberg has admitted Facebook "made mistakes" in protecting users' data and has announced a suite of changes to the social network in the wake of the Cambridge Analytica scandal.

Key points:
  • Mark Zuckerberg says Facebook will do more to restrict developers who have misused data
  • Statement comes as company is in crisis mode amid a #DeleteFacebook movement 
  • Man who created the app that was used to mine data says he's been made a "scapegoat"


He says Facebook will now impose stricter rules on developers of third-party apps that collect your data, and will create a new section in your News Feed where you can review those that you use.

"There's more to do, and we need to step up and do it," he said in his first statement on the matter this morning.

Facebook suspended the London-based political research company Cambridge Analytica last week over allegations that it kept improperly obtained user data after telling the social media giant it had been deleted.

The data was reportedly collected by University of Cambridge psychology academic Aleksandr Kogan via a survey app on Facebook years ago, and then passed on to Cambridge Analytica, which used it to target people with political advertising during the 2016 US election campaign.


Read more: