Tuesday, November 27, 2018

News on YouTube Nov 28 2018

Hi, I'm Lauren, a human just like you, and there are a few things essential to surviving as a human: we must eat, we must drink, we must sleep, and we must use the bathroom. And that's not some bad, nasty thing; it's just something that must happen. But that's not so easy for everyone. People dealing with certain disabilities and certain physical ailments who require assistance do not have the setup they need to use the bathroom while in public, and there's a very simple solution for this: adult-size changing tables are the simple solution that would fix a huge problem. No person deserves to be laid on the floor to be changed, and everyone deserves a chance to spend time out with their family. There are many different types of adult changing tables. They range from simple to complex, and some are very easy to install, giving businesses many different options.

Now let's meet Heather, a mom and an advocate for the cause, and let's find out what these tables would mean for her and her family.

So my son Teddy is 7 years old. He's 50 pounds now, and he's really long, so that makes it tough when we go places, because we don't have the option to just, you know, bring him to the bathroom and put him on the accessible toilet. Our options are, you know, changing him in the car, or on the floor, or waiting until we get home, and that makes it very tough and kind of takes away our freedom completely. The whole issue with laying him on the floor is that the floor has proven very gross. People don't even want to put their backs on the floor, but yet we are to put our children, with compromised immune systems, on the floor. It's very troubling.

So changing tables in big venues, movie theaters, things like that, places that people are going to be staying for more than an hour, would make the biggest difference. We're not asking for just the gas station down the street or the Wendy's around the corner; it's places that people are going to be for a long time and that they're going to be trying to enjoy in the community. That kind of thing would change our lives. We would have so much more freedom to be able to go and do the things we want to do, and have our child around, doing the things that other people can do as well. And this doesn't just go for my child; this goes for adults with disabilities and children with disabilities too. Changing Spaces is just trying to bring inclusion and dignity to those who do not have it right now. And it's not that people have just forgotten them or didn't care; it's that we didn't realize it was needed. Back in the day, you know, a lot of people with disabilities were kept in their homes; they weren't really brought out. So now that technology has improved and people's lifespans have increased, things are needing to change, and it's making great progress. They're putting them in airports, they're putting them in hospitals. I ask for all of your support: join the Facebook group Changing Spaces North Carolina and start the hashtag challenge.

For more information >> Public service announcement by Lauren Place, ELC-381 - Duration: 4:33.

-------------------------------------------

Public Service Commission candidates face off in debate during runoff election - Duration: 1:46.

-------------------------------------------

This Is What You Call a Crowd - Congress I Big Stage And Public Gathering As Compare To Bjp Viral Video - Duration: 1:25.

in haryana

-------------------------------------------

Public hearing in Albany on student debt - Duration: 0:30.

-------------------------------------------

Amanda Bynes' Comeback: Paper Mag Reflects On The Actress' Public Breakdown & Journey Back | Access - Duration: 6:15.

-------------------------------------------

Georgia Department of Public Safety Gives Back with 'Toys for Tots' - Duration: 3:31.

-------------------------------------------

Public Hearing Held On Plan To Require Sprinklers In High-Rise Buildings - Duration: 2:19.

-------------------------------------------

NOAA Workshops help keep the public and marine life safe - Duration: 5:10.

-------------------------------------------

City Council Business Meeting & Public Forum - November 26, 2018 - Duration: 2:13:43.

-------------------------------------------

KPOP IN PUBLIC CHALLENGE EXO TEMPO DANCE IN PUBLIC - Duration: 5:30.

-------------------------------------------

FUCK, MARRY, KILL | Public Interview! - Duration: 8:15.

OUUUU RANSI IS FINALLY GETTING SOME

Ayo PrettyBooyz

Today we are doing a video called F*CK, MARRY, KILL

You have three choices. From the three of us you will kill someone, chill with someone, or you will marry someone

Let's see how the people answered us!!

Who is this... The girl next to us is?

Melina

She is called Melina and the question of the day is

Hello! The people close to me are? Saara

Sara

Saara and Sara

And today's question is

And the girl next to us today is? Melina! Okay, Melina

I have a question for you

Hi I'm Hulo. Hi I'm Klaara

Hi I'm Inka

And They are Kalid and Ransi

Wassup 2x and these are Daniela and

Veera.

Danie... Daniela and Veera

There are 3 questions

Today's question is: which one of us would you...

Fuck, Marry or Kill

And Which one of us would you Fuck, Marry or kill

Who is going to go first?

And don't do anything like "Klaara said that" and stuff

Or "Inka said that"

You both have your own opinions. Go with that.

Wait I need a moment to think!

Doesn't matter. Take your time

Wait let me think

Is it like bad to kill

Is there a reason for me to kill. Wait I don't want to kill anybody

No! You have to kill one of us. It's Fuck, Marry, Kill

One of us dies

One One... My talking skills...

One of us dies One of us you will marry and one of us you will

SMASH

Okay, I'm going to chill

Okay okay

I don't know, sorry. Who goes first? Inka first

Okay okay, umm

Okay

Umm

Hulo, umm

Marry

Fuck

And Kill sorry

Wow, she gave you an L

So with Kalid you are going to chill. Yeah

Me! I would

Fuck

Marry and kill

Take the L part 2

Next clip thank you!

Ransi. I feel like... Hulo we are chilling

My guy!

Ransi can chill in his...

WHAT?

Ransi is going to chill in his GRAVE

WOUU

OUU

She said that to you and you're quiet

Thanks to both of you and next clip

You have three choices

And they are about us. Are you ready?

Fuck, Marry or KILL

One of us

Which one of us are you going to chill with, marry or

KILL

UMm

Don't take it personally then!

MARRY

FUCK

KILL

Alright thank you

There are three of us

There's Hulo, me, and Ransi is on the phone

You have three questions. Fuck Marry or KILL

Who are you going to fuck

And who are you going to marry. OOHH Ransi came back!!

He came

What picture?? Ransi is right here

Let's go!

Three choices

Which one of us are you going to fuck, marry or kill

I don't want to kill anybody

You have to kill

One of us is going to bang

Alright

Ransi just came, if you didn't get it. I'm going to be honest alright

KILL HULO

FUCK Kalid and Marry Him

FIRST L

Wallah first L

I was honest

NEXT CLIP

Or I don't know what it was but

Three choices

FUCK, MARRY, KILL. Who are you going to kill

Who are you going to chill with or marry

Who goes first. Sara can go

What am I going to answer? Three boys in front of you

3 Boys and who are you going to chill with

Marry or Kill

It's not bad. Come Come

Well

Umm, I'm chilling with

Ransi

and I'm killing Hulo and I'm going to marry...

And you are going to marry Kalid, alright

Your turn.

Fuck Kalze

Marry Hulo and KILL RANSI

YES I DIDN'T TAKE AN L

Alright then. Thank you and let's go to the next clip

Three guys are in front of you. Hulo

Ransi and Me. Who are you going to fuck.

Who are you going to marry with

FUCK, MARRY KILL

Who goes first

You take care of it

Why am I starting? Let me think

You I'm going to fuck, and you I'm going to marry because you're nice

NO IM NOT, well yes you are sometimes

SOMETIMES

ALRIGHT

UMM

I would marry him

umm

I don't want to be a husband. I want to be a daddy

You i would kill

And you I would Fuck

OUU

Ransi is finally getting some

But why are you killing Kalid???

What is it about?

TOO BIG!!

I'm getting killed haha

Hi and we have here

Mimmi

Mimmi

Well now

FUCK MARRY KILL

Who are you going to kill, fuck or marry

NOOO!

It's not bad really! Just kill and you know the rest

I would marry you

You I would kill and the rest you know already!

What do girls want from me?

I don't want to be a husband. I want to be a daddy

Thank you for watching. Remember to Subscribe

I don't know man but I chilled well.

KILL, KILL, KILL...

Thank you for watching until now

SHARE, LIKE

And give us new topics on the comments below

REMEMBER TO LIKE

LIKE if you liked it and dislike if you did not.

And if you are a new guy on the channel

SUBSCRIBE NOW!!

-------------------------------------------

2018 'Or 'Emet Lecture "Digital Journalism and the New Public Square" Jameel Jaffer Oct 18, 2018 - Duration: 1:17:27.

Good afternoon my name is François Tanguay-Renaud I'm a professor here at

Osgoode Hall Law School I'm also the co-director of the Jack and Mae

Nathanson Center on transnational human rights crime and security and it's my

pleasure this afternoon to welcome you to the 2018 'Or 'Emet lecture a few words

about the lecture the 'Or 'Emet fund was established a long time ago now in 1976

to promote the study of law in the broadest sense the fund in itself seeks

to promote through public discussion research and scholarly writing public

and professional appreciation of the significance of things like religion

ethics culture and history in the development of the legal system 'Or 'Emet

what does this mean well it means the light of truth so the lecture really

seeks to shed light on matters that perhaps deserve a bit more truth or

truth telling as it were in 2010 the Nathanson center of which

I'm co-director decided to pool its resources with the 'Or 'Emet fund to ensure

that the lecture would be delivered on an annual basis so as a result the

themes that are explored by the lecture tend to relate at least

tangentially to the mandate of the Center now this year we're delighted to

have with us Jameel Jaffer who will already be known

most likely to many of you Jameel is the inaugural director of the Knight First

Amendment Institute at Columbia University but he also previously served

as deputy legal director at the American Civil Liberties Union the ACLU where he

oversaw that organizations work on free speech privacy technology and national

security as well as international human rights

Jameel has argued civil liberties cases in multiple appeals courts in the

US as well as in the US Supreme Court and he's testified many times before the

US Congress his cases include some of the most significant post 9/11 cases

relating to national security as well civil liberties including cases

concerning detention interrogation surveillance targeted killing and

government secrecy he actually co-led the litigation that resulted in the

publication of the Bush administration's torture memos which is a lawsuit that

the New York Times described as among the most successful in the history of

public disclosure in the United States and Jameel also led the ACLU litigation

that resulted in the publication of the Obama administration's drone memos and

most recently with the Knight's First Amendment Institute he's also litigated

groundbreaking freedom of speech cases his recent writing has appeared in

multiple publications including the New York Times the Los Angeles Times The

Guardian The Nation and the Yale Law Journal Forum he's an executive director

of Just Security which is a national security blog and he has two

best-selling books under his belt his first book administration of torture

from Washington to Abu Ghraib and beyond

was co-authored with Amrit Singh and was published by Columbia University

Press in 2007 and more recently the drone memos published by the new press

in the fall of 2016 Jameel is a graduate of Williams College

the University of Cambridge and Harvard Law School he has served as a law clerk

to the Honorable Amalya Kearse of the US Court of Appeals for the

Second Circuit and he's also been a law clerk to the Right Honourable

Beverley McLachlin former Chief Justice of the Supreme

Court of Canada so I should emphasize here that despite the fact that Jameel

has built most of his career on the other side of the border in the United States he's still

very much a Canadian and thankfully he told me yesterday that he

still identifies as such and he's actually been involved with

many organizations on this side of the border including the Canadian Civil

Liberties Association he's actually a Distinguished Fellow of the Munk school

of global affairs and public policy and many other organizations so we're really

delighted to welcome Jameel here at Osgoode Hall Law School York University

today to deliver the 2018 'Or 'Emet lecture on the topic of digital

journalism and the new public square now just a few words about format

Jameel tells me that he's going to speak for about 30 to 40 minutes and then

we're going to open to questions from the floor for an approximate 30 to 40

minutes once again so without further ado please join me in welcoming Jameel

Jaffer thank you Thank You Professor Tanguay-Renaud

and thank you all of you for being here so I haven't lived in Toronto

for more than a quarter century which is difficult for me to say let alone

believe but I'm always delighted to be invited to speak here because I still

have many friends and family in and around Toronto it's also an honor to be

asked to deliver this particular lecture though as you can probably imagine it's

a little bit intimidating to have to deliver a lecture whose name promises

the light of truth so in the spirit of managing expectations let me just say

right up front that I can aspire to at most a faint flicker of truth through a

heavy fog that's what I'll try for a few months ago The

Guardian published a remarkable story revealing that a Cambridge University

researcher had harvested as many as 50 million Facebook profiles for Cambridge

Analytica which is a data analytics firm that was headed at the time by Steve

Bannon one of Donald Trump's key advisors and the researcher Aleksandr

Kogan collected the profiles with an app called This Is Your Digital Life and

through the app he paid Facebook users small amounts of money to take

personality tests and to consent to the collection of

their data for academic purposes and then Kogan turned the profiles over to

Cambridge Analytica which used them The Guardian said to predict and influence

American voters choices at the ballot box

The Guardian story relied in significant part on the account of Christopher Wiley

a Canadian researcher who had helped establish Cambridge Analytica and who

later worked with Kogan on the Facebook project Wiley told the Guardian quote we

exploited Facebook to harvest millions of people's profiles and we built models

to exploit what we knew about them and target their inner demons that was what

Cambridge Analytica was built on he said most of you probably remember that

story you may not be familiar though with what happened the day before it was

published as the guardians editors were readying their story for print the

lawyers received a letter from Facebook and the letter threatened a lawsuit if

the Guardian went forward with the story Facebook knew that the story would

provoke disbelief and outrage and perhaps even a regulatory response by

the US Congress so it tried to quash the story with the threat of a lawsuit now

there's nothing unusual unfortunately about powerful actors threatening

litigation to preempt unflattering news coverage intimidating letters of the

kind that Facebook's lawyer sent to the Guardian are a staple of the media

sphere in the United States and in the United Kingdom and they're probably a

staple of the media sphere here in Canada too but Facebook isn't just any

powerful actor it's one of the largest corporations in the world it has more

than 2 billion users more than 200 million of them in the United States and

through its human and algorithmic decision making it has immense influence

on how its users engage with one another and with the communities around them

it's no accident that stories about political polarization and filter

bubbles and election integrity the spread of disinformation online voter

suppression in Brazil mob violence in Sri Lanka and even ethnic cleansing in

Myanmar have Facebook as their common denominator Facebook has an invisible

but powerful influence on what we would once have called the public square and

through that influence Facebook affects societies all over the world so what are

the mechanics of that influence in a new article the legal scholar Kate Klonick

argues that the social media platform should be thought of as quote systems of

governance because they're now the principal regulators of speech that

takes place online through their control of the new public square the platform's

are exercising power we ordinarily associate with state actors one facet of

that power the facet that Klonick explores in her article is sometimes

called content moderation by which we usually mean the determination of which

speech should be permitted on these privately controlled platforms and which

shouldn't be Facebook for example routinely removes User Content that

shows graphic violence or nudity as well as hate speech as Facebook defines it

and speech glorifying terrorism again as Facebook defines it the other major

platforms have similar policies there's an ongoing debate about the ways in

which the companies are exercising that power whether they're taking down too

much speech or too little speech or perhaps even both in an essay published

over the summer by the Hoover Institution

Daphne Keller who's a researcher at Stanford observes that people from

across the political spectrum are convinced that the platforms are

silencing speech for the wrong reasons this debate is important because

increasingly it's the platform's rather than governments that delineate the

outer limits of public discourse and also because the platform's power to

censor isn't subject to constitutional or regulatory constraint

but content moderation at least in the narrow sense of that phrase hardly

begins to describe the platform's influence the social media company's

more fundamental power over public discourse is reflected not principally

in their decisions about which speech to exclude from their platforms but in

their decisions about how to organize and structure the speech that remains

they dictate which kinds of interactions are possible and which aren't

which kinds of speech will be made more prominent and which will be suppressed

which communities will be facilitated and which will be foreclosed if the new

public square were an ocean Facebook would control not only which fish got to

swim in it but also the temperature and salinity of the water the force and

direction of the currents and the ebb and flow of the tides

but despite the role that Facebook and other platforms now play our collective

understanding of them is very limited and we sometimes struggle even to

describe what they are when is a platform a publisher when is it a common

carrier for the past year the Knight Institute the Institute that I direct

has been litigating a First Amendment challenge a free speech challenge the

president Trump's practice of blocking critics from his Twitter account the

case turns on the question of whether the president's account at real Donald

Trump is a public forum the litigation has been a competition of analogies is

the president's notorious Twitter account like a town hall or a park or is

it more like a radio station or a telegraph Heather Whitney a philosopher

and legal theorist has written a fascinating paper about whether the

social media companies are properly thought of as editors her paper is

further evidence that we're still trying to figure out what the platforms are and

which legal labels to attach to them last year the US Supreme Court decided

an important case involving the First Amendment right of access to social

media the decision was unanimous but Justice Kennedy who wrote the courts

opinion and Justice Alito who concurred in it disagreed about whether the platforms

in their entirety should be characterized as public forums at some

level both justices seem to understand that the conceptual vocabulary available

to them was inadequate Justice Kennedy wrote quote while we now may be coming

to the realization that the cyber age is a revolution of historic proportions

we can't appreciate yet its full dimensions and vast potential to alter

how we think express ourselves and define who we want to be he continued

the forces and directions of the Internet are so new so protean and so

far-reaching that courts must be conscious that what they say today might

be obsolete tomorrow if our collective understanding of the platforms is

limited it's in large part because the social media companies have guarded so

jealously the information that would help us

understand them over the last few years they've begun to share information about

instances in which governments compel them to remove user-generated content or

compel them to turn over user sensitive data they've begun to share information

about the limits they themselves impose on the kinds of information and the

kinds of content the users can post on their platforms just a few months ago

Facebook released the internal guidelines it uses to enforce its

community standards a release that marked with the Electronic Frontier

Foundation described as a sea change in Facebook's reporting practices these

disclosures are commendable but they're also overdue and incomplete as the

Electronic Frontier Foundation also observed perhaps under new pressure from

the public and from regulators around the world the companies can be compelled

to reveal more let's also recognize though that one reason the companies

aren't more transparent is that they themselves don't fully understand the

decisions that they're making sometimes this is by choice The Guardian

journalist who broke the Cambridge analytica story spoke to an engineer

who had once been in charge of policing third party app developers

at Facebook the engineer said he'd tried to warn Facebook about the growing black

market for its user data Facebook wasn't interested in hearing about it he said

he told the Guardian quote Facebook was in a stronger legal position if it

didn't know about the abuse that was happening they felt it was better not to

know it would be a mistake to think that

public ignorance in this context is simply a result of the company's refusal

to release information the information we most need the companies don't have

the companies are engaged in a massive social experiment with no precedent in

human history they rely increasingly on continuously

evolving machine-generated algorithms whose complexity defies understanding

they don't understand their own platforms and still less do they

understand their platforms broader social and political implications when

Facebook was asked about the reach of disinformation in the months preceding

the 2016 presidential election in the United States it first said that ten

million people had seen political ads posted by accounts linked to the Russian

Internet Research Agency Jonathan Albright who's a researcher at

Columbia's journalism school then reported in the Washington Post that six

Russia-linked pages alone had reached 19 million people earlier this year

Facebook conceded that content posted by Russia-linked accounts had reached as

many as 126 million people if Mark Zuckerberg's recent congressional

testimony showed anything it's that Facebook is a black box even to Facebook

Zuckerberg himself has sometimes acknowledged this in a recent interview

he conceded it's tough to be transparent when we don't first have a full

understanding of the state of some of our systems against that background and

now I'm finally getting to my point we should recognize that independent

journalism and research about the social media platforms is extraordinarily

valuable and that obstacles to this kind of journalism and research are

especially problematic to the extent these obstacles

impede us from understanding the forces that invisibly shape public discourse

they're best thought of as obstacles to self-government we should view them I

think in much the same way we view laws that prevented us from reading federal

statutes or attending congressional hearings or accessing judicial opinions

they impede us from understanding the forces that govern us last-ditch legal

efforts to block stories before they go to print like the letter that Facebook's

lawyers sent to the Guardian aren't by any means the only obstacles worth

worrying about in this context perhaps the most

significant impediments to journalism and research about the social media

platforms arise from the company's Terms of Service which bar journalists and

researchers from using digital tools that are indispensable to the study of

the platforms at scale most significantly all the major social media

companies bar users from collecting information from their platforms through

automated means journalists and researchers can collect this information

manually but most of the platforms forbid them from collecting it using

computer code the prohibition is significant because it's impossible to

study trends and patterns and information flows without collecting

information at scale and it's practically impossible to collect

information at scale without collecting it digitally the effect of the

prohibition is to make some of the most important journalism and research

off-limits some platforms including Facebook also prohibit the use of

temporary research accounts the kinds of accounts that could allow journalists

and researchers to probe the platform's algorithms and to better study issues of

discrimination and disinformation online this prohibition too prevents

journalists and researchers from doing work we need them to do these

impediments are substantial in themselves but in the United States

they're made even more so by a federal statute called

the Computer Fraud and Abuse Act the US Justice Department understands that

statute to impose civil and criminal penalties on those who violate the

social media platforms Terms of Service on the Justice Department's reading the

law makes it a crime for a journalist or researcher to study the platforms using

basic digital tools the very existence of the law discourages journalists and

researchers from undertaking projects that are manifestly in the public

interest projects focused on for example the spread of disinformation and junk

news political polarization and unlawful discrimination in advertising anyone who

undertakes a project like that does it under the threat of legal action by the

Justice Department and the social media platforms when journalists and

researchers do undertake the projects they often modify their investigations

to avoid violating Terms of Service even if doing so makes their work less

valuable to the public in some cases the fear of liability leads them to abandon

projects altogether now I want to acknowledge right away that the social

media companies may have good reasons for generally prohibiting the use of

these digital tools on their platforms Facebook's prohibition against automated

collection is presumably meant at least in part to impede the ability of

commercial competitors data aggregators and others to collect use and misuse the

data that Facebook's users post publicly Facebook's prohibition against the use

of fake accounts reflects in part an effort to ensure that users can feel

confident that other users they interact with are real people intentionally or

not though Facebook is also impeding journalists and researchers ability to

study and understand and report about the platform it's difficult to study a

digital machine like Facebook without the use of digital tools trying to study

Facebook without the use of these tools is like trying to study the

ocean without leaving the shore I want to return

to a point I made earlier that we should think of independent journalism and

research about the platforms as especially valuable and that we should

think of obstacles to them as especially problematic half a

century ago the US Supreme Court decided New York Times versus Sullivan which is

the landmark case that established crucial free speech protections that

American publishers rely on even today the case as some of you may know arose out

of an advertisement in the New York Times that solicited contributions

for the committee to defend Martin Luther King and the struggle for freedom

in the south and the ad accused certain public officials in the American South

of harassing civil rights activists and using violence to suppress peaceful

protests L.B. Sullivan the public safety commissioner of Montgomery Alabama filed

a libel suit against four of the clergymen who had signed the ad and also

against the New York Times which had published it the case was tried in

Alabama and the jury awarded five hundred thousand dollars in damages but

the US Supreme Court reversed in his opinion for the unanimous Court Justice

Brennan explained that the First Amendment was intended first and

foremost to ensure the freedom of public debate on what he called public

questions quoting an earlier case he wrote the maintenance of the opportunity

for free political discussion to the end that the government may be responsive to

the will of the people is a fundamental principle of our constitutional system

and in his opinions most celebrated passage he invoked the United States

quote profound national commitment to the principle that debate on public

issues should be uninhibited robust and wide open

justice Brennan was focused in particular on speech critical of

government officials because the case before him involved exactly that kind of

speech but the key insight behind his opinion has broader implications the

insight is that the First Amendment was intended most of all to protect the

speech necessary to self-government or as Harry Kalven Jr. put it in a now

famous essay published only months after the New York Times case was decided the

central meaning of the First Amendment is quote to protect the speech without

which democracy cannot function now it hardly needs to be said that there is a

large distance between the question that was presented to the court in the New

York Times case and the questions I started off with today but it seems to

me that in an era in which the social media companies control the public

square, in an era in which these companies are in a very real sense our governors, journalism and research focused on these companies implicates

the very core of the First Amendment journalism and research that help us

better understand the forces that shape public discourse and that help us hold

accountable the powerful actors that control those forces is speech essential

to self-government it's what the First Amendment is for so

I'd like to offer just a few preliminary and tentative thoughts about what the

companies the courts and legislators could do to better protect the kind of

journalism and research I've been describing and more generally to ensure

that the public has the information it needs in order to understand the new

public square and to hold accountable the companies that control it if we were

committed to these goals what kinds of policy reforms might we ask for well we

might ask first that the companies be more transparent about the ways in which

they're shaping public discourse David Kaye who's the United Nations Special

Rapporteur for free expression issued a report several months ago urging the

companies to disclose quote radically more information about the nature of

their rulemaking and enforcement concerning expression on their platforms

in a report filed with the UN Human Rights Council, Kaye

recommended that companies issue public opinions when they remove content from

their platforms so that users and others can better understand why the content is

being taken down and so that they can challenge the company's decisions when

they believe the company's decisions are unjustified those seem like good ideas

to me. As Kaye says, quote, in regulation of speech we expect some form of

accountability when authorities get it wrong we should see that happening

online too but we need the platforms to be transparent not only about what

speech they're taking down but about how they're shaping the speech they're not

taking down. What kinds of speech and associations are they privileging, and

what forms does this privilege take what kinds of speech and associations are

they marginalizing, and what forms does that marginalization take? Kaye's report

had a different focus but he highlighted the need for transparency about a

specific form of content curation quote if companies are ranking content on

social media feeds based on interactions between users, they should explain

the data collected about such interactions and how this informs the

ranking criteria the more general point is that the companies should be more

transparent about how they're shaping public discourse they should be more

forthcoming about the forces at work in the new public square. Here's a second

possible Avenue for reform we could ask the social media companies not to

enforce their Terms of Service against those who use digital tools on their

platforms in the service of bona fide journalism and research. Again, many of the questions that are most urgent right now are questions the companies themselves aren't able to answer. The companies

should be facilitating not obstructing the work of journalists and researchers

and others who might be able to provide answers where the companies can't even

if the companies have legitimate commercial privacy or security reasons

for generally prohibiting the use of certain digital investigative tools it

should be possible for them to create a kind of carve out a safe harbor that

expands the space for these activities while protecting the privacy of their

users and the integrity of their platforms

incidentally the data privacy regulation that went into effect in Europe over the

summer supplies a conceptual framework for exactly that kind of safe harbor as

a general matter the new regulation places significant restrictions on the

use of digital tools to collect and analyze and disseminate information

obtained from social media platforms but the regulation also encourages

individual countries to exempt journalism and research from those

general restrictions a handful of countries including the UK have already

recognized exemptions of this kind European privacy legislation in other

words distinguishes between those who use digital tools for private or

nefarious purposes and journalists who use those tools to inform the public

about matters of public concern the company's terms of service should be

similarly discerning the companies can protect their users privacy and their

platforms integrity without categorically prohibiting digital

journalism and research that is overwhelmingly in the public interest a

third possibility we could ask the courts to refuse to enforce the

company's Terms of Service against those who responsibly use digital tools in the

service of bona fide journalistic and research projects as a general matter

courts enforce contracts as written, and as a general matter they should enforce terms of service too, but there are contexts in which courts decline to enforce contractual provisions that conflict

with public policy often these cases involve contractual terms whose

enforcement would disable democratic institutions or processes for example

where a contract would prohibit a person from running for office or from

criticizing public officials or from disclosing information of overriding

public importance those cases are surely relevant here for reasons I've already

explained, journalism and research focused on the platforms is of special

democratic importance because of the unique role that these companies play in

shaping public discourse. Terms of service that impede that kind of journalism and research are in tension with our commitment to self-government, because

again they impede us from understanding the forces that profoundly shape our

interactions and our communities and our democracy here finally is a fourth

possibility we could encourage the US Congress to amend the Computer Fraud and

Abuse Act so that digital journalists and researchers can do their work

without the risk of incurring civil and criminal penalties journalists and

researchers who are investigating questions that implicate the very core

of the First Amendment's concern shouldn't have to operate under the

threat of legal sanctions let me just close by bringing us back to the

Guardian story I started with. That story about Cambridge Analytica is in part a

reminder that the platforms are entirely justified in worrying about the ways in

which their users data can be exploited by third parties it should also be a

reminder though of how reliant we are on the work of independent journalists and

researchers. If we want to understand the platforms, if we want to understand the new public square, journalism and research about the platforms is crucial

and it deserves special solicitude under our law whatever free speech means in

the digital age it should encompass robust protections I think for this kind

of public interest journalism and research thank you

Well, thank you very much, Jameel, for this very thought-provoking lecture. So we have plenty of time actually for questions now. This is being video recorded and we have one microphone, so

if you have a question please wait until I reach you so you can speak into the

microphone

We'll start us off, Professor Priya. Thank you, this is really, really interesting. So

I have a question. So you tied the proposals that you make to the U.S.

approach to free speech and New York Times but could it be that that approach

nowadays is part of the problem rather than part of the solution so so

underlying this approach is one that proclaims those three famous words of

Brennan's, what is it, robust, uninhibited, wide-open, exactly, but in fact you can

look at it in a different way and say it's a form of privatization so it

privatizes the regulation of speech and interestingly people who in other

contexts are very very favorable to the idea that private business should be

regulated by government in this context are saying we should not regulate speech

and in fact nowadays we shouldn't even regulate those who regulate speech the

private companies that regulate speech now this may have been great in the 60s

but now leaving this domain completely to Facebook, Google and so on, completely

unregulated is is perhaps not sufficient to address the kind of problems that

that we have. And perhaps, and actually I never thought about this until now, the basis for a distinction is something like speech versus information. So you can be as free with your speech, but these companies trade in information, and that's something else where government may want to be more heavily involved with regulating it. So I generally agree with you that the approach that we have

taken until now is not up to the task. I don't see the problem as a problem with New York Times versus Sullivan. You know, New York Times versus Sullivan,

to the extent it just means that we want to make sure that debate about what

Brennan called public issues or public questions is uninhibited and robust and

wide open I'm all for that I don't see any problem with that I think that the

problem is a doctrinal one that is more recent and there has been a shift over

the last 50 years towards a much narrower conception of free speech which

focuses myopically on speakers' rights and is indifferent to everyone else. And you know, to people who are not in this debate it seems obvious that free speech should

focus on speakers but it's actually not so obvious right because there are other

participants in the marketplace of ideas there are speakers but there are

listeners; there are also, when it comes to social media platforms, users; there's society more generally, which is supposed to be the beneficiary of this marketplace of ideas. And so I think that the solution here is to reverse

that doctrinal trend. And many of the most important free speech questions

right now that are coming before the US Supreme Court at least are questions in

which there are at least arguably free speech interests on both sides of the V

right so if you think of these social media cases Facebook or Twitter they

argue that they have the First Amendment right to create the expressive

communities they want to create, that they are editors or they're publishers and they should be entitled to the same First Amendment

rights as any other editor or publisher and to me that's not an implausible

argument. The problem is that their First Amendment interests

aren't the only ones at stake in these you know in in these contexts they have

interest in creating the communities they want to create but users also have

free speech interests in being able to participate fully in

public discourse society has an interest in ensuring that these platforms don't

undermine other important societal interests including the interest in fair

elections right and so I don't I don't like viewing these First Amendment

questions or these free speech questions through a very narrow lens focused on

speakers alone and I think that we are going to have to think about how to

reconcile or balance the rights again which don't seem to me trivial I think

Facebook does possibly have the right to create its own expressive community but

we need to balance that against other societal interests and the doctrine we

have in the United States right now doesn't do that very well, and you see

that not only in cases involving new communications platforms but you see it

in cases involving campaign finance or cases involving commercial speech and

commercial speakers where there is this very narrow focus on the rights of the

speaker and in my view insufficient attention given to the free speech

implications of those decisions for other participants in what is sometimes

called the marketplace of ideas

Thanks. I'm a visiting fellow at the school, based at a university in Belfast, and I wonder whether you think there's a need

to go beyond the sort of shield argument. Much of the case you've laid out, which is a very compelling case, is predicated on the technological ability

of the journalist or the researcher to work out what's going on which might

be presumptively unlawful but then trumped by one of the illustrations

you've laid out um should we also consider or expect that the platforms in

question might seek to frustrate either by not recognizing that they're

frustrating an investigation through their own data management and systems

practices or as we've sometimes seen with some of the sharing economy

companies deliberately seeking to interfere with investigation we've seen

for instance Uber try to develop tools that are both clever and wicked in

recognizing when a regulator is logging on and giving them a different version

of the service, and it wouldn't be surprising to see similar happening with

a journalistic investigation. So whereas, say, the GDPR in Europe does give a broad protection for researchers, it doesn't do as much as some hint at, such as requiring disclosure of how an algorithm operates or providing access to data as

we would for instance with classic public authorities where there may be an

access to information right or some way of requiring an explanation so does the

argument you outline lend itself to law, maybe, but maybe more of the positive obligation school rather than the shield school? Yeah, so this is, you

know in terms of US constitutional doctrine my argument is quite ambitious

in terms of in any other terms it's a very modest argument right my argument

is ambitious only because US constitutional doctrine is so

constraining but I will not pretend that what I'm proposing is a complete

solution to the problems that I've you know described I I think that we can ask

the social media companies to be more transparent and we should

and journalism and research focused on them can create a kind of pressure to

you know, a pressure that they will feel, to be more transparent. I

don't think we should rule out the possibility of regulation that requires

them to disclose information it's a complicated question what that

disclosure would look like when it comes to so you can distinguish between two

forms of censorship that platforms engage in I don't mean to you know I'm

using censorship in a value neutral way here the platforms decide who can be on

their platforms right the companies decide who can be on the platforms and

what can be said right by setting up the community standards so that's kind of

threshold censorship: who gets in in the first place, right.

and then there's a kind of environmental or interstitial censorship that the

platforms engage in much less visible and probably more consequential shaping

the speech by deciding which speech goes viral and which speech disappears into

the void. So transparency as to the first form of censorship is relatively

straightforward, right. You could demand, as David Kaye has and many others have,

you could demand that the social media companies, and you could require them to do this, disclose what speech they have taken down, what speakers they have taken

down you could require them to issue public opinions explaining their

decisions although the scale presents a you know a challenge because they make

these decisions thousands of times a day, but that's relatively

straightforward what transparency looks like with respect to environmental or

interstitial censorship is much more complicated. And some people have called for algorithmic transparency, meaning, you know, requiring the platforms to

disclose the algorithms that underlie those speech environments but again you

know as I said, the platforms themselves don't fully understand

the algorithms the algorithms are continuously evolving they're based on

machine learning you know machines are adapting to a continuously changing

environment and if they were to disclose their algorithms nobody else would

understand them either. And so, you know, you can require them to disclose

major inputs that would be one possibility and some of the platforms

have made a move in that direction already in you know in blog posts they

will discuss how certain things end up at the top of your newsfeed and how

others get, you know, relegated to the bottom, but that is a coarse form of transparency. I think of journalism and research as output transparency rather than input transparency, right; the journalism and research isn't telling us what the algorithms are, it's telling us what the results of the algorithms are. And many journalists and researchers think that that form of transparency is ultimately more important, because of what you don't get from input transparency: even if Facebook showed us its algorithms and we fully understood its algorithms, arguably more important than the algorithms is the data that Facebook is feeding into the algorithms, and you don't get that from algorithmic transparency. The only way you understand the interaction between the algorithm and the data is by studying the output, which is what journalists and researchers do. So you

know that's just a long way of saying I think that the solution has to involve

some combination of input transparency and output transparency meaning some

combination of requiring the companies to disclose more and allowing

journalists and researchers to explore more freely and I think in the long run

this set of solutions here probably has to include a combination of

self-regulation on the part of the platforms and regulation imposed from

the outside by

by government. My proposals are also modest in another sense, in that I'm

focused entirely on transparency right now, maybe not the narrowest sense of transparency, but nonetheless transparency. But even if you had all the transparency that I've asked for

there's still the question of you know we would know what the platforms are

doing but what should we think of what the platforms are doing that's a

separate set of questions which is much more complicated than the set of

questions that I've tackled. In my view, what Facebook and Twitter and

Instagram and YouTube are doing would be not worth losing any sleep over if these

companies were smaller than they are. Everything, all the problems, turns on scale; they stem from scale, and the magnitude of their societal implications is a result of their scale. And if you

think if you agree with me that the problem you know the central problem is

a problem of centralization of power over the speech environment then it may

be that the most direct solution isn't a kind of free speech regulation but

rather antitrust or anti-monopoly regulation where you you know break up

the companies in some way or somehow lower the barriers to entry into the you

know, marketplace in the real sense, into this sector of social media. And there are various ways you could do that: you could do that again by breaking up

the companies but you could also do it through data portability requirements

you know just making it easier for people to move from Facebook to some new

platform right those forms of regulation might be a more direct way of addressing

some of the problems certainly the centralization problem and in some ways

they're less

they raise fewer difficult questions because you know regulation of speech

which is the other option would run into immediate doctrinal barriers in the

United States, and I think would run into immediate value barriers, not just in the

United States but in most democracies you know people don't like the idea of

governments dictating to private entities what content you know what

viewpoints they can carry and what viewpoints they can't, and antitrust or

anti-monopoly options don't raise those concerns at least not to the same not to

the same extent. Sorry, it's a very long answer; it was probably a simple question. Professor Bandopadhyay. Thank you, to me this was great. I mean, I think your paper does a lot already in the sense that it moves beyond gatekeeping regulation to

look at the environmental modification the sort of expansive effects of that on

both users and producers of content. But my question is that, from what I

understand as you said this is a transparency argument but it seems a

little asymmetric to me, in that it's missing a dimension. So for instance in

the classical understanding of free speech of the marketplace of ideas

it was ideas for their own sake and it was the more speech the better right and

you want to encourage that marketplace of ideas, and that is the classical

American understanding of it but of course we now know and as you

acknowledge that there is an economic dimension to speech the reason these

companies are powerful is because they monetize speech and they're able to do

things with speech and people listening to each other are able to monetize

information, right. So to the extent that your solutions, if I have them right, are, you know, either that Facebook should agree not to enforce the Terms of Service in cases of research, or courts should not enforce them in cases of research, or the legislature should amend the statute, to my mind they run

into not the free speech, sort of the old-timey free speech understanding, but they run into an

economics problem, which is that the Terms of Service exist and the courts exist

and this particular legislation exists because as a matter of value people see

the economic benefits the economic side of free speech is as important and the

sort of private corporation side of free speech is as important as free speech

for its own sake, and this is the kind of thing, corporations have speech rights to do with

campaign finance and things like that there is now an equation between these

two values of free speech for its own sake and free speech as an economic

value to society so they're both in the public interest my question is that

without measuring the economic side of this, to what extent would

you say it's reasonable to expect the courts or the corporation or the

legislature to want to undo or unravel the system at all

aside from the, you know, ideological commitment there's also an ideological

commitment on the economy side so could you say a little more about how you

think that side would play into your argument. Thank you. That's a very hard

question so I I don't think that there is a strong argument that categorically

prohibiting these forms of journalism and research is necessary to serve

economic interests you know I think that a safe harbor for journalism and

research would, you know, just to think of it in a very sort of crass

ledger you know sort of pros and cons way you know the pros would be the

public would have more information about the platforms and the cons would be very

limited, even if you take economic costs into account. These rules were put in place long before anyone had thought about social media platforms; the Computer Fraud and Abuse Act is a 1986 statute, and the concern was hacking. It's just that

the law was drafted so broadly that when the social media companies came along

they saw in the law the possibility of

you know strengthening the prohibition in their terms of service against this

form of digital journalism and research so I you know they've taken advantage of

the breadth of the law but the law doesn't reflect a congressional judgment

that banning this form of journalism is necessary or that it serves some

economic interest. And the law was drafted before anyone, you know, before Facebook was a gleam in Mark Zuckerberg's eye. So the hope now is that we can go back to the legislature and say, draft a narrower law that preserves

Facebook's economic viability that preserves a Facebook's ability to

protect the integrity of its platform preserves Facebook's ability to protect

its users privacy I don't like saying Facebook's protecting its users privacy

because it's ridiculous, you know, but that's the argument that they make

is that they use privacy in a very stylized way what they mean is protect

our own ability to exploit our users data but prevent anyone else from

exploiting it. Anyway, we can go back to Congress and say, now that we know how the law is being used today, draft a

narrower law. And with respect to, so I didn't mention it, but this safe harbor

idea this idea of the social media platforms creating a safe harbor

voluntarily for this kind of journalism and research: you know, voluntariness is a spectrum, right, and Facebook is responding to public pressure right now

Facebook's under a lot of public pressure to respond to the major news

stories that have exposed the various pathologies on Facebook's platform that

lead to some of the things I was talking about earlier you know disinformation or

you know, the undermining of elections. And my Institute sent a letter to

Facebook maybe six weeks ago proposing that they adopt a safe harbor for these forms of journalism and research, and we spelled out in significant detail

what we meant by public interest journalistic and research projects and

we provided a draft of an amendment to their Terms of Service and I will not

say that they jumped at the opportunity to create that safe harbor, but they responded graciously and invited us to come in and talk with them about it

and we've now had a couple of conversations with them most recently

just a few days ago where we spent two hours talking to Facebook's lawyers and

engineers about whether something like this would be workable. And I am not

suggesting that you know I'm confident that this is going to happen I'm

definitely not but we would not be getting this kind of response from

Facebook if they didn't feel some public pressure now to answer the kinds of

concerns that we're that we're raising so that's just to say that we're in an

environment in which I think these kinds of reforms are at least on the table

which is more than we could have said two years ago and I think it's

worthwhile to think about how to exploit this moment especially since we don't

know how long this moment will last

I'd like to address this from the digital tools available to analyze algorithms, for transparency reasons. I come from a small consultancy, the group that I had, and I've worked for banks, for the corporate world, for government. The way they are using analysis, it's very methodical, focused work, and maybe we could borrow from their expertise. Let's take an example; let's look at the algorithm, and if I don't make sense please interrupt me here. You talk about input to the algorithm and you talk about data coming out of the algorithm. The algorithm, the mathematical formulas pieced together, I would believe as a professional, wouldn't give the entire story. What will tell the story is if

Facebook would show us their use cases. What does a use case mean? I want to add a new customer: I have a registration and I end up with a profile, and I want to know what goes in my profile, and if you have a flag that flagged me as a security risk, I want to know how you do follow-up on me. Yeah, I agree with

you. So I mentioned this only briefly in my prepared remarks, but Facebook's

Terms of Service prohibit not just scraping, not just collecting data through computer code; they also prohibit what they call fake accounts. And in most contexts, you know, what counts as a fake account

is obvious you know if you create an account that's not associated with your

real name it's a fake account but researchers and journalists sometimes

want to use what Facebook would call fake accounts

and what researchers and journalists would call

temporary research accounts to probe the platform for precisely this reason so

you create an account in which you, you know, post that you're a member

of the Democratic Party and you create another account that in which you say

you're a member of the Republican Party and you can see then you know how does

Facebook respond to you, how different are the responses. And this is

a very important thing, because the algorithms don't tell you everything; you need to know how the platforms respond to certain prompts, right. And

this is you know in the offline world there's a whole line of cases at least

in the u.s. that protects the ability of civil rights researchers to go to

landlords with testers, and you send one white couple to

the landlord the next day you send a black couple to the landlord and you see

how does the landlord respond differently to people who are otherwise

similar but differ in this one key quality and journalists and civil rights

researchers want to do the same thing on Facebook's platform they want to find

out well if you're black on Facebook what is the experience of being black on

Facebook and how does it differ from the experience of being white on Facebook or

you know what what is the experience of being from California on Facebook and

how does it differ from the experience of being from Toronto on Facebook

and it's impossible to do that kind of research without violating Facebook's

Terms of Service so the safe harbor that we proposed and the safe harbor that I

referenced earlier would protect not only the ability of a researcher to

scrape the platform; it would also protect the ability of a researcher to create what Facebook would call fake accounts to do that kind of

work now Facebook has said we want to create a community in which people can

feel confident that the people they're interacting with are real people and not

you know, not bots or machines or fake people. And so what we've proposed

is that they allow researchers and journalists in certain contexts

to create these temporary research accounts so long as the accounts

themselves identify themselves as fake accounts, right. But there

are many contexts in which what you're trying to study is not the activity of

other users what you're trying to study is the activity of the platform and so

you're not really interacting with other users and to the extent you are you

would disclose to those users that this is not a real person this is the New

York Times doing a study of Facebook. And, you know, we'll see how Facebook responds

to this suggestion I mean they again they did not jump at it but nor have

they said that they're closed to it. Good question. Let me now ask a question, and

we have had the chance to speak about this so so your discussion is very much

couched in an American milieu, right. So Facebook is an American company, you're

from an American organization and you're having a dialogue within a legal

framework that is American, and that makes sense, right, to the extent that you're approaching Facebook on those terms and they'd be willing to, say, amend their Terms of Service or their approach in a self-regulated way. Insofar as that works, fantastic; the sense is that might not be the end of the story

and so the worry here in terms of strategy is having that discussion in a

context like the US which has a very specific not to say neo syncretic

understanding of freedom of speech which is as you said very user driven very Pro

corporate speech campaign-finance Hobby Lobby Citizens United cases we're all

familiar about is the worry that it the kinds of outcomes with that discussion

say whether it ends up in the court or even at the level of Congress and

tranches further some of these idiosyncrasies makes me think that it

might be worth thinking about approaching other avenues or other

venues from a transnational perspective going to the EU coming to other

jurisdictions have that discussion in order to be able

to try to have it only in any less idiosyncratic way right so we've seen

Google having trouble in the EU and my terms of services here in Canada changed

as a resolved right so so to what extent do you think it's it's it's prudent I

want to say that cab in the approach to the US and to what extent do you think

it would be worth engaging other yeah that's a good question I and I mean

you're absolutely right that I've approached this through a kind of American lens. I would say that the extent to which it would be productive to approach it through a different lens depends a little bit on which reform we're talking about. If we're talking about pressuring the companies to be more transparent, there's no reason why that kind of campaign should be an American campaign. There are users of these platforms all over the world, and we're not relying on law in any direct way in pushing for this kind of reform; it's a softer ask, and it would make sense to have a broader coalition. That's something we should definitely explore.

When it comes to persuading the courts not to enforce Terms of Service, it's hard to get out of a single jurisdiction. The cases we'd have to rely on in asking American courts not to enforce Terms of Service are American cases, and we're stuck with the idiosyncrasies of American law in that context. There is a sense in which we could put together something broader: we could have groups in multiple countries going to their own courts and making the same arguments. But they would have to make the arguments in very different ways, and even whether the argument could be made at all might turn on what jurisdiction you're in. This is not an easy argument in the United States; it might be an easier one in countries where the public-interest override of contractual agreements is stronger. In the United States, as you can imagine, there is very high deference to private contractual agreements.

And then there's the argument that the CFAA, the Computer Fraud and Abuse Act, violates the First Amendment to the extent it prevents this kind of journalism and research. There we're addressing an American problem, because it's an American statute. I don't know whether there are similar things in other countries; maybe there are, but I don't know, and we have no choice but to use American tools to get at it. I think the problem is especially American in that the CFAA reflects this use of government power to protect private economic interests, and that's the kind of thing you see more in the United States than you do elsewhere. But as for the solution, our argument that this statute violates the First Amendment, or violates free speech rights, to the extent that it prevents this kind of journalism or research: my sense is that we have stronger tools at our disposal in the United States to fight a problem like this than you do in most other places in the world, because free speech doctrine is very well developed in the United States, for better and worse. It's an entirely plausible argument that this statute could be struck down, as applied to journalists and researchers, on First Amendment grounds.

So I think this set of problems would definitely benefit from the thinking and resources and experience of a broader, international coalition or group, but the value of that might be different depending on which reform we were talking about.

Thank you. My question was about

jurisdiction too, but instead, can you take us back to the original story and the Guardian, and take us to the offline legal scene? What happened with that threat to sue the Guardian?

The short answer is

they backed down; Facebook backed down. So the Guardian's hook was that it described this as a data breach, and the hook for Facebook's letter to the Guardian was that it was libelous or defamatory to describe this Cambridge Analytica episode as a breach. It's not that Facebook had data on some hard drive somewhere and somebody hacked into it and exposed it; that's not what happened here. This was a combination of data that individual users had already made public in some sense, because they had posted it publicly, and other information that users had voluntarily given to this researcher in return for small sums of money. And so Facebook said: you absolutely cannot publish that; your story is completely wrong, the framing is all wrong, this is a much more minor problem, and to call it a breach is not just wrong but actionable. Now, I don't think Facebook was really concerned about that narrow point. What Facebook was concerned about was the story more generally. The Guardian, to its credit, just published the story, and the aftermath was that Facebook came under fire from many different places. Facebook predictably decided that it was probably not in its interest to be seen to be going after the Guardian over the story, and so it sort of changed its attitude towards the story after it was published. But you see that, you know, now we represent these journalists and researchers who are trying to study the platforms, and those kinds of conversations happen all the time behind the scenes, around the Terms of Service and the Computer Fraud and Abuse Act.

For example, we represent one journalist who came up with a tool that allowed people to study why Facebook was recommending certain people as friends in the "People You May Know" feature. If you use Facebook, there's a feature called People You May Know, and it recommends people to you. This journalist, Kashmir Hill at Gizmodo, was studying this particular feature and was interested to find out why certain people had been recommended to her. One of the people Facebook recommended to her turned out to be a long-lost relative of hers. She didn't even know this was a relative, but she did some research and found out that Facebook somehow knew this was her relative even though she didn't. She thought, well, this is interesting, and so she invented a browser plugin that individual users could install to track the people Facebook was recommending to them. Facebook contacted her offline and said: we think this tool violates the Terms of Service, and we think you might want to take it down. She got into an exchange with them: can you explain to me how this violates the Terms of Service? This is a browser plugin; it's not obviously in violation of the prohibition against scraping, and it doesn't involve a fake account, so what is the term that it's violating? I don't think she ever got a satisfactory answer to that question, but they tried to bully her into abandoning the project, and to her credit she didn't. Many journalists, though, abandon projects in the face of the veiled, or not-so-veiled, threat of legal action. So the same kind of thing you see going on with the Guardian goes on every day behind the scenes, and there's no easy way to track those kinds of conversations. We have no idea how many projects have been abandoned because Facebook or some other platform said, hey, there's this federal statute called the Computer Fraud and Abuse Act. And, as they're doing with Kashmir Hill, they may be invoking the Terms of Service or the Computer Fraud and Abuse Act even when those things don't actually apply to the project the journalist or researcher is engaged in. So that's part of the reason why we thought this safe harbor idea was worth pursuing.

Yeah, last question.

Hi, my name is Luke, and I'm a first-year student here at Osgoode. My question was related to the EU; you mentioned them briefly in your speech. I just wanted to know your assessment of how they're handling this, because I see that they're taking broader steps than other countries, or than North America and Canada, in terms of curtailing it, maybe specifically the copyright directive that they've recently passed, because there's been some public backlash to that. So I'm not sure how you assess their strategy on the digital single market.

So I don't know a lot

about the copyright directive. My general sense is that the EU has been more thoughtful than the United States when it comes to privacy issues. I'm worried a little bit about their response to what they see as dangerous speech. In Germany, for example, there's this new law that requires the platforms to take down speech related to terrorism within, I think it's 24 hours. The way these categories of speech are defined is often unsatisfying, and the platforms have every incentive to over-censor, to take down much more than might fall into those categories, because the liability is significant and there's no penalty for taking down more speech than you have to. Some of the speech that's taken down, everybody would agree, is inciting people to terrorism or inciting people to imminent violence. But a lot of it isn't like that. A lot of it is stuff that's controversial, or involves groups that are disfavored, but doesn't involve the kind of speech that most people would agree governments have a legitimate interest in taking down. And the costs of that kind of over-censorship often fall hardest on marginalized groups: marginalized because they're politically marginalized, or religious minorities, or racial minorities.

There are trade-offs here. In the United States, there's a good argument to be made about Twitter, for example, which used to call itself "the free speech wing of the free speech party" because it generally didn't take things down. One consequence of not taking things down is that a lot of harassment stayed up on the platform, and the groups that pay the most significant price for that kind of harassment are women, or minorities of one kind or another, and it would be hard to call that a free speech success. On the other hand, you don't want the platforms to take down so much speech that you end up with these carefully curated spaces where controversial ideas are marginalized. So you have to find some middle road on the speech issues, and I'm not convinced that we've found it.

On privacy, I think the EU seems to be a good step ahead of the United States. And on antitrust issues too, the EU has been much more aggressive with the big technology companies. Now you see, for good reasons and bad reasons, the US, under the Trump administration, becoming more interested in anti-monopoly litigation or regulation of the tech companies, but we're definitely behind.

All right. So Jameel started his lecture by confessing doubts as to whether it would live up to the title of the 'Or 'Emet lecture, that is, "the light of truth." After listening to this lecture, I think you'll agree with me that he has really made a valiant effort to shed light on an issue that warrants more inquiry, and that the lecture really champions eliciting facts, access to facts, greater access to truth, in an area where even those who are putting out the truth, companies like Facebook generating algorithms, do not know what that truth is. So it's hard for me to think of a topic for an 'Or 'Emet lecture that was more appropriate. Please join me in thanking Jameel Jaffer for what was really a great lecture.

For more infomation >> 2018 'Or 'Emet Lecture "Digital Journalism and the New Public Square" Jameel Jaffer Oct 18, 2018 - Duration: 1:17:27.

-------------------------------------------

Public Hearing Held On Proposal To Require Sprinklers In High-Rise Buildings - Duration: 2:09.


-------------------------------------------

Haslam named in 'Public Officials of the Year' by Governing Magazine - Duration: 0:25.


-------------------------------------------

Immigrants in Greece face winter crisis after public sector cuts World news - Duration: 4:22.


UN envoy says EU policy during debt crisis had unintended consequences


Greece's asylum system is hamstrung by public sector cuts imposed during the country's EU bailouts, a UN envoy has said, as campaigners warned of a looming winter crisis for refugees and migrants.

MEPs blame Europe's asylum system for humanitarian conditions in Greece, where thousands are stranded in squalid camps that are a danger to physical and mental health.

Philippe Leclerc, the UN refugee agency's representative in Athens, said EU policy on Greece during the debt crisis was "totally legitimate", but pointed to unintended consequences for migration.

"It is a state that is affected by the consequences of the financial crisis and public control spending measures … [so] you have an emergency situation on the islands and the mainland, where the state is not fully equipped to respond."

He was speaking to the Guardian days after the UNHCR called on Greece to take "urgent steps" to improve conditions for 11,000 people in dirty and unsafe camps on the islands of Samos and Lesbos.

Senior European sources are appalled by the camps, especially Samos, where 4,000 people are living in wretched conditions at the Vathy reception centre, six times above capacity.

New arrivals are pitching flimsy tents on steep slopes around the camps and have no access to electricity, running water or lavatories. Inside the camps, broken toilets and showers mean that people live next to raw sewage. Camp dwellers also have to contend with snakes, and rats feeding on uncollected waste. "This is supposed to be the richest and most civilised continent in the world," said the Dutch liberal MEP Sophie in 't Veld. "This is happening under our noses."

EU officials think that Greek ministries are unable to coordinate or spend EU funds to help asylum seekers: the EU has allocated €1.6bn (about £1.4bn) since 2015, but at least €554m has not been spent by Greek authorities.

Brussels is suspicious that the defence ministry, which is led by Panos Kammenos of the nationalist right party Independent Greeks, is not prioritising the humanitarian needs of refugees. The ministry, one of the key government departments overseeing refugee camps, has been at the centre of allegations of misspending EU funds, claims that it has rejected as "fake news". The EU's anti-fraud agency, Olaf, confirmed that it has "opened an investigation into alleged irregularities concerning the provisions of EU funded food for refugees", but declined to comment further.

The EU is increasingly worried about a rise in arrivals in Greece, which is piling pressure on a system beset by delays. Greece is sheltering 67,100 refugees and migrants and has seen an abrupt surge in numbers crossing the land border with Turkey. The number of detections of illegal crossings at the northern land border has tripled, said Krzysztof Borowski, at the European Border and Coast Guard Agency (Frontex). "This is adding to pressure on Greece."

Most making the treacherous journey across the fast-flowing Evros River separating the two nations are Turks fleeing political persecution. By October, registrations of Turkish nationals had jumped from 6,500 last year to 18,700, according to Frontex. Increasingly, Syrians and Iraqis have also joined the flows as word has spread that the land frontier is easier to cross.

As winter approaches, NGOs are warning of a crisis in the making. "There are around 400 people in the north still living in tents," said Ruben Cano, who heads the Athens branch of the International Federation of Red Cross and Red Crescent Societies. "The reception system in Greece is overwhelmed partly because the country is being made to shoulder too great a burden."

The influx has put increasing strain on an asylum service labouring under severe fiscal constraint.

"People are working under adverse conditions. There is clearly an issue of staff capacity that we need to resolve," said Markos Karavias, director of the Greek asylum service.

Deep public sector cuts have contributed to Greece's notoriously slow asylum procedures. Greece does not have enough judges to hear appeals, nor doctors and psychologists to carry out assessments of vulnerable claimants.

One consequence is that few asylum seekers and migrants are being returned to Turkey, a possibility created by a controversial EU deal with Ankara in 2016.

Brussels thinks that people-smugglers, aware of administrative shortcomings, are stepping up efforts to get people to Greece. "By not returning [from the islands to Turkey] you create a powerful marketing model [for smugglers]," said an EU source.

MEPs say it is wrong to blame Greece. The Dutch MEP in 't Veld blamed "the dysfunction of the council" – EU leaders and home affairs ministers – for the asylum system's inadequacies.

EU member states remain deadlocked over a permanent system of refugee quotas, a dispute that threatens to delay a broader overhaul of the European asylum system, comprising seven laws.

Péter Niedermüller, a Hungarian centre-left MEP, thinks it unlikely that the legislation will be agreed before European elections in 2019.

"The whole refugee issue has been captured by the far-right political movements in Europe," said Niedermüller, who led a European parliament delegation to Greece last year. "Italy, Austria, Poland and Hungary have nationalist anti-immigrant parties in government," he added. "Many of the member states are not really ready to join a common European asylum policy and I think this is the problem."
