ACTION ALERT: Google Employees Resign in Protest Against Pentagon Contract

May 17, 2018

Kate Conger / Gizmodo & Win Without War

https://gizmodo.com/google-employees-resign-in-protest-against-pentagon-con-1825729300

Google Employees Resign
In Protest Against Pentagon Contract

Kate Conger / Gizmodo

(May 14, 2018) — It’s been nearly three months since many Google employees — and the public — learned about the company’s decision to provide artificial intelligence to a controversial military pilot program known as Project Maven, which aims to speed up analysis of drone footage by automatically classifying images of objects and people. Now, about a dozen Google employees are resigning in protest over the company’s continued involvement in Maven.

The resigning employees’ frustrations range from particular ethical concerns over the use of artificial intelligence in drone warfare to broader worries about Google’s political decisions — and the erosion of user trust that could result from these actions. Many of them have written accounts of their decisions to leave the company, and their stories have been gathered and shared in an internal document, the contents of which multiple sources have described to Gizmodo.

The employees who are resigning in protest, several of whom discussed their decision to leave with Gizmodo, say that executives have become less transparent with their workforce about controversial business decisions and seem less interested in listening to workers’ objections than they once did.

In the case of Maven, Google is helping the Defense Department implement machine learning to classify images gathered by drones. But some employees believe humans, not algorithms, should be responsible for this sensitive and potentially lethal work — and that Google shouldn’t be involved in military work at all.
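To make concrete what "machine learning to classify images gathered by drones" means in practice, here is a minimal, purely illustrative sketch of how an open-source image classifier can flag video frames for human review. The library choice (TensorFlow/Keras), the pretrained model, the labels, and the confidence threshold are assumptions made for the example; this is not Google's actual Maven code or pipeline.

```python
# Illustrative only: a generic open-source image classifier used to flag
# frames for human review. This is NOT Project Maven code; the model,
# labels, and threshold are assumptions made for this example.
import numpy as np
import tensorflow as tf

# A small, general-purpose classifier pretrained on ImageNet.
model = tf.keras.applications.MobileNetV2(weights="imagenet")
preprocess = tf.keras.applications.mobilenet_v2.preprocess_input
decode = tf.keras.applications.mobilenet_v2.decode_predictions


def flag_for_review(frame_rgb: np.ndarray, threshold: float = 0.5):
    """Return (label, confidence, needs_review) for one RGB video frame."""
    img = tf.image.resize(frame_rgb, (224, 224))        # model's expected input size
    batch = preprocess(img[tf.newaxis, ...])             # shape (1, 224, 224, 3)
    probs = model.predict(batch, verbose=0)              # shape (1, 1000)
    _, label, confidence = decode(probs, top=1)[0][0]    # best-scoring ImageNet class
    return label, float(confidence), float(confidence) >= threshold
```

In a workflow like the one described in the reporting, something along these lines would pre-screen footage so that human analysts review only the flagged frames; the ethical objections center on who bears responsibility for what the classifier surfaces or misses.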

Historically, Google has promoted an open culture that encourages employees to challenge and debate product decisions. But some employees feel that their leadership is no longer as attentive to their concerns, leaving them to face the fallout. “Over the last couple of months, I’ve been less and less impressed with the response and the way people’s concerns are being treated and listened to,” one employee who resigned said.

There’s precedent for employee pushback resulting in product changes — in 2015, employees and users successfully challenged Google’s ban on sexually explicit content posted to Blogger. But these are the first known mass resignations at Google in protest against one of the company’s business decisions, and they speak to the strongly felt ethical concerns of the employees who are departing.

In addition to the resignations, nearly 4,000 Google employees have voiced their opposition to Project Maven in an internal petition that asks Google to immediately cancel the contract and institute a policy against taking on future military work.

However, the mounting pressure from employees seems to have done little to sway Google’s decision — the company has defended its work on Maven and is thought to be one of the lead contenders for another major Pentagon cloud computing contract, the Joint Enterprise Defense Infrastructure, better known as JEDI, that is currently up for bids.

Employees’ demands that Google end its Pentagon contract are also complicated by the fact that Google claims it is only providing open-source software to Project Maven, which means the military would still be able to use the technology even if Google didn’t accept payment or offer technical assistance.

Still, the resigning employees believe that Google’s work on Maven is fundamentally at odds with the company’s do-gooder principles. “It’s not like Google is this little machine-learning startup that’s trying to find clients in different industries,” a resigning employee said. “It just seems like it makes sense for Google and Google’s reputation to stay out of that.”

Many Google employees first learned the company was working on Maven when word of the controversial project began to spread internally in late February. At the time, a Google spokesperson told Gizmodo that the company was in the process of drafting “policies and safeguards” around its use of machine learning, but that policy document has yet to materialize, sources said.

One employee explained that Google staffers were promised an update on the ethics policy within a few weeks, but that progress appeared to be locked in a holding pattern. The ethical concerns “should have been addressed before we entered this contract,” the employee said.

Google has emphasized that its AI is not being used to kill, but the use of artificial intelligence in the Pentagon’s drone program still raises complex ethical and moral issues for tech workers and for academics who study the field of machine learning.

In addition to the petition circulating inside Google, the Tech Workers Coalition launched a petition in April demanding that Google abandon its work on Maven and that other major tech companies, including IBM and Amazon, refuse to work with the US Defense Department.

“We can no longer ignore our industry’s and our technologies’ harmful biases, large-scale breaches of trust, and lack of ethical safeguards,” the petition reads. “These are life and death stakes.”

More than 90 academics in artificial intelligence, ethics, and computer science released an open letter today that calls on Google to end its work on Project Maven and to support an international treaty prohibiting autonomous weapons systems. [See the complete letter below. — EAW.]

Peter Asaro and Lucy Suchman, two of the authors of the letter, have testified before the United Nations about autonomous weapons; a third author, Lilly Irani, is a professor of science studies and a former Google employee.

Google’s contributions to Project Maven could accelerate the development of fully autonomous weapons, Suchman told Gizmodo. Although Google is based in the US, it has an obligation to protect its global user base that outweighs its alignment with any single nation’s military, she said.

“If ethical action on the part of tech companies requires consideration of who might benefit from a technology and who might be harmed, then we can say with certainty that no topic deserves more sober reflection — no technology has higher stakes — than algorithms meant to target and kill at a distance and without public accountability,” the letter states.

“Google has moved into military work without subjecting itself to public debate or deliberation, either domestically or internationally. While Google regularly decides the future of technology without democratic public engagement, its entry into military technologies casts the problems of private control of information infrastructure into high relief.”

Executives at Google have made efforts to defend Project Maven to employees. At a meeting shortly after the project became public, Google Cloud CEO Diane Greene spoke in support of Project Maven, multiple sources told Gizmodo.

More recently, Greene and other employees have hosted several sessions to debate and discuss the project. These sessions featured speakers who supported and opposed Maven and stressed the difficulty of drafting policy about the ethical use of machine learning, an attendee explained.

There are other reputational concerns factoring into employees’ decisions to leave Google. The company’s recent political fumbles, like its sponsorship of the Conservative Political Action Conference and its struggle to address internal diversity concerns, have also played a role.

“At some point, I realized I could not in good faith recommend anyone join Google, knowing what I knew. I realized if I can’t recommend people join here, then why am I still here?” a resigning Google employee said.

“I tried to remind myself that Google’s decisions are not my decisions. I’m not personally responsible for everything they do. But I do feel a responsibility, when I see something, to escalate it,” another added.

“An important part of our culture is having employees who are actively engaged in the work that we do. We know that there are many open questions involved in the use of new technologies, so these conversations — with employees and outside experts — are hugely important and beneficial,” a Google spokesperson said in an April statement. “The technology is used to flag images for human review and is intended to save lives and save people from having to do highly tedious work. Any military use of machine learning naturally raises valid concerns. We’re actively engaged across the company in a comprehensive discussion of this important topic and also with outside experts, as we continue to develop our policies around the development and use of our machine learning technologies.”

A Google spokesperson did not immediately respond to a request for comment about the resignations. But employees want to see action from the company, in the form of an ethics policy, a canceled contract, or both.

“Actions speak louder than words, and that’s a standard I hold myself to as well,” a resigning employee said. “I wasn’t happy just voicing my concerns internally. The strongest possible statement I could take against this was to leave.”


Researchers in Support of Google Employees:
Google should withdraw from Project Maven
and commit to not weaponizing its technology

An Open Letter To:
Larry Page,
CEO of Alphabet;
Sundar Pichai, CEO of Google;
Diane Greene, CEO of Google Cloud;
Fei-Fei Li, Chief Scientist of AI/ML and Vice President, Google Cloud.

As scholars, academics, and researchers who study, teach about, and develop information technology, we write in solidarity with the 3100+ Google employees, joined by other technology workers, who oppose Google’s participation in Project Maven.

We wholeheartedly support their demand that Google terminate its contract with the DoD, and that Google and its parent company Alphabet commit not to develop military technologies and not to use the personal data that they collect for military purposes.

The extent to which military funding has been a driver of research and development in computing historically should not determine the field’s path going forward. We also urge Google and Alphabet’s executives to join other AI and robotics researchers and technology executives in calling for an international treaty to prohibit autonomous weapon systems.

Google has long sought to organize and enhance the usefulness of the world’s information. Beyond searching for relevant webpages on the internet, Google has become responsible for compiling our email, videos, calendars, and photographs, and guiding us to physical destinations.

Like many other digital technology companies, Google has collected vast amounts of data on the behaviors, activities and interests of their users. The private data collected by Google comes with a responsibility not only to use that data to improve its own technologies and expand its business, but also to benefit society. The company’s motto “Don’t Be Evil” famously embraces this responsibility.

Project Maven is a United States military program aimed at using machine learning to analyze massive amounts of drone surveillance footage and to label objects of interest for human analysts. Google is supplying not only the open source ‘deep learning’ technology, but also engineering expertise and assistance to the Department of Defense.

According to Defense One, Joint Special Operations Forces “in the Middle East” have conducted initial trials using video footage from a small ScanEagle surveillance drone. The project is slated to expand “to larger, medium-altitude Predator and Reaper drones by next summer” and eventually to Gorgon Stare, “a sophisticated, high-tech series of cameras . . . that can view entire towns.”

With Project Maven, Google becomes implicated in the questionable practice of targeted killings. These include so-called signature strikes and pattern-of-life strikes that target people based not on known activities but on probabilities drawn from long-range surveillance footage. The legality of these operations has come into question under international and US law.

These operations also have raised significant questions of racial and gender bias (most notoriously, the blanket categorization of adult males as militants) in target identification and strike analysis. These problems cannot be reduced to the accuracy of image analysis algorithms, but can only be addressed through greater accountability to international institutions and deeper understanding of geopolitical situations on the ground.

While the reports on Project Maven currently emphasize the role of human analysts, these technologies are poised to become a basis for automated target recognition and autonomous weapon systems. As military commanders come to see the object recognition algorithms as reliable, it will be tempting to attenuate or even remove human review and oversight for these systems.

According to Defense One, the DoD already plans to install image analysis technologies on-board the drones themselves, including armed drones. We are then just a short step away from authorizing autonomous drones to kill automatically, without human supervision or meaningful human control.

If ethical action on the part of tech companies requires consideration of who might benefit from a technology and who might be harmed, then we can say with certainty that no topic deserves more sober reflection – no technology has higher stakes – than algorithms meant to target and kill at a distance and without public accountability.

We are also deeply concerned about the possible integration of Google’s data on people’s everyday lives with military surveillance data, and its combined application to targeted killing. Google has moved into military work without subjecting itself to public debate or deliberation, either domestically or internationally.

While Google regularly decides the future of technology without democratic public engagement, its entry into military technologies casts the problems of private control of information infrastructure into high relief.

Should Google decide to use global internet users’ personal data for military purposes, it would violate the public trust that is fundamental to its business by putting its users’ lives and human rights in jeopardy. The responsibilities of global companies like Google must be commensurate with the transnational makeup of their users.

The DoD contracts under consideration by Google, and similar contracts already in place at Microsoft and Amazon, signal a dangerous alliance between the private tech industry, currently in possession of vast quantities of sensitive personal data collected from people across the globe, and one country’s military. They also signal a failure to engage with global civil society and diplomatic institutions that have already highlighted the ethical stakes of these technologies.

We are at a critical moment. The Cambridge Analytica scandal demonstrates growing public concern over allowing the tech industries to wield so much power. This has shone only one spotlight on the increasingly high stakes of information technology infrastructures, and the inadequacy of current national and international governance frameworks to safeguard public trust. Nowhere is this more true than in the case of systems engaged in adjudicating who lives and who dies.

We thus ask Google, and its parent company Alphabet, to:
* Terminate its Project Maven contract with the DoD.

* Commit not to develop military technologies, nor to allow the personal data it has collected to be used for military operations.

* Pledge to neither participate in nor support the development, manufacture, trade or use of autonomous weapons; and to support efforts to ban autonomous weapons.

Posted in accordance with Title 17, Section 107, US Code, for noncommercial, educational purposes.


ACTION ALERT: Project Maven
Win Without War

A dozen Google employees are quitting to protest Google’s plans to help the Pentagon automate drone targeting.

This is huge. The Pentagon’s drone warfare program is already notorious for killing civilians. The new Google-Pentagon plan, called Project Maven, would automate remote assassination even further by using artificial intelligence to identify human targets.

What’s more, the Trump administration blatantly ignored its most recent deadline to report drone casualties. Our chances of learning the names of civilians killed by secret Google drones — or even the number — are nearly zero.

With Project Maven, Google can claim to keep its hands clean while automating the business of death.

That’s why it’s such a big deal that these brave Google employees stepped up to tell another story. But they’re up against a tech giant and the powerful Pentagon. They need to know you have their back.

ACTION: Add your name to stand with the Google employees who quit and demand Google pull out of Project Maven.