Here’s a partial transcript to shed some light on the situation.
The panel discussed the issue of valid data and how companies have become more concerned with whether they can sell an algorithm-driven software product, and less concerned with whether the damned thing works and who its results might affect, should they be inaccurate.
Lisa Rein: Okay, first question: Are the software companies who are making these algorithm-based products just selling them to whomever they can, for whatever they can apply them to? And if so, how can we stop this from happening?
These companies are powerful and getting larger and more powerful all the time. Yet no one seems to care, not even the companies buying the snake oil products, as long as they can resell the services somehow and the money keeps coming in. What can we do?
Chelsea Manning: Me personally, I think that we in technology have a responsibility to make our own decisions in the workplace – wherever that might be. And to communicate with each other, share notes, talk to each other, and really think – take a moment – and think about what you are doing. What are you doing? Are you helping? Are you harming things? Is it worth it? Is this really what you want to be doing? Are deadlines being prioritized over – good results? Should we do something? I certainly made a decision in my own life to do something. It’s going to be different for every person. But you really need to make your own decision as to what to do, and you don’t have to act individually. We can work as a community. We can work as a collective entity. People in technology, we’re not going to be able to explain to people – all this stuff. People know that everything’s messed up, and they know that things are messed up because of the algorithms that we have. We’ve educated them on that. They understand that. They understand that viscerally, because they see the consequences and the results of these things that are happening every day.
The problem is that people in technology aren’t paying attention, and even some of us who are paying attention, aren’t doing anything about it. We’re waiting for somebody to do something about it. Somebody else isn’t going to do it. We’re going to have to do it ourselves.
Caroline Sinders: Even if you feel like a cog in the machine, as a technologist, you aren’t. There are a lot of people like you trying to protest the systems you’re in. Especially in the past year, we’ve heard rumors of widespread groups and meetings of people inside of Facebook, inside of Google, really talking about the ramifications of the U.S. Presidential election, of questioning, “how did this happen inside these platforms?” – of wanting there even to be accountability inside of their own companies. I think it’s really important for us to think about that for a second. That that’s happening right now. That people are starting to organize. That they are starting to ask questions.
I think especially looking at where we are right now in San Francisco – inside the hub of Silicon Valley – we are in a space that is very amenable to protest and very amenable to supporting ethical technology. How do we build more support for other people? Is it going to spaces we’re not usually in? Is it going to other tech meetups? Maybe. Is it having hard conversations with other technologists? Probably. How do we push the politics of our community into the wider community? We have to go and actually evangelize that, I think.
Is the company even using the algorithm in the way it was intended to be used? Often a company purchases an algorithm that is made for one kind of analytics and it gets used for a completely different thing, and then you get these really skewed results.
Lisa Rein: Wait a minute. I can’t believe I’m asking this, but, are you saying that, as long as they like the results, nobody cares if the results are accurate?
Caroline Sinders: ‘How sure are we that it’s true?’ is not the question that I’m hearing in the conference room. It’s more like ‘we’ve gotten these results and these people have purchased it’ or ‘it’s selling really well.’ ’Cause we are in the age of people building software as a product, capabilities as a product, APIs as a product. (Meaning that you buy access to an API that’s like a pipeline.) And if it’s returning certain results that a company can then use and put in a portfolio to sell to other kinds of clients, it doesn’t actually matter how much it works, if it has the appearance of working; if it’s pumping out ‘results.’ So, I can’t speak to the academic verifiability of different kinds of APIs. I can speak to the fact that I have not ever heard people really talk about that.
Chelsea Manning: Yeah. I’ve had experience with this in particular… For verifiability of data, it’s purely academic. That’s what I’ve found. Whether you are working in a corporate or business setting, or in a government setting – military context or whatever – it’s ‘results, results, results.’ Nobody cares how verifiable the data is. Everybody’s cutting corners. Everybody’s trying to meet deadlines. That’s why we – people in technology – need to be thinking more ethically. We need to be very cognizant of the systems that we’re building, and not just sitting there, continually meeting deadlines and priorities that are set by our leadership, or by clients, or by senior corporate, ya know, C-suite people.
You really need to think about what you’re doing, and what the consequences of what you’re doing are. Because these questions are not being asked, and they should be. In many cases, for some of these systems, the question of whether we should be building a system like this at all should be asked – at least asked – in some of these rooms. It’s not, and it’s not going to be.
Caroline Sinders: I think there is a big push though, if you work in industry software, to really understand the ethical ramifications of the products you’re using, or the software that you’re using, and how it affects your users. And how this affects even unintended bystanders – people who have not opted in to the system, or into the product, right? And that’s where you get into different surveillance systems, or systems in the whole vein of the Internet of Things, right? How many people are “accidentally” part of a data set that they didn’t get to opt in to?
Kristian Lum: And in some cases, remember, you yourself can act as a sponge for accountability. Because now, let’s say you have a system that’s been purchased, that’s been created by “peer reviewed science” or very expensive technology, and it’s saying to do the thing that your organization kind of wants to do anyway. Well, maybe do some research, show it to the people you’re working with, and say, “hey, we may be over-policing this community.” Because, otherwise, it’s “Hey, this software we spent all this money on is telling us to do it,” which gives them justification to do what they wanted to do anyway. So, try to act like a buffer between these viewpoints, by being able to ask, and question, ‘why are you doing that?’
Lisa Rein: Would opening up the “black box” solve everything?
Chelsea Manning: It’s not just that it’s a black box. Even when the code is available to you, how it’s actually coming up with its predictions (apart from doing the pure math) can escape you when you’re trying to produce an explanation that humans can understand. So I think that’s one of the dangers of depending on showing the entire algorithm. We have to fully understand these algorithms, not just see how they work from a code perspective or an algorithmic perspective.
What scares me is some of these algorithms being used in, like, bail hearings… You’re literally changing this person’s life, because they are going to stay in jail, because they are in a bail hearing where an algorithm — made by some company — decided that they’re more likely to be re-arrested. It’s not evidentiary in any way, but it’s being used in an evidentiary manner. It’s just a mathematical prediction based on false data — or poor data — and it’s actually tearing people’s lives apart. And it’s also feeding into this feedback loop, because they’re seen as being re-arrestable. Therefore, it reinforces the data set.
Kristian Lum: There are a lot of models now predicting whether an individual will be re-arrested in the future. Here’s a question: What counts as a “re-arrest”? Say someone fails to appear for court and a bench warrant is issued, and then they are arrested. Should that count? I don’t see a whole lot of conversation about this data munging.
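The feedback loop the panelists describe above can be made concrete with a tiny simulation. The sketch below is purely illustrative — it models no real product, and every number in it is made up: two neighborhoods with identical underlying offense rates, where a historical patrol imbalance feeds a “predictive” allocation rule, which then regenerates the very arrest records it was trained on.

```python
# Toy illustration (assumption-laden, not any real system) of the
# predictive-policing feedback loop: arrests happen where police patrol,
# the "model" learns from those arrest records, and then sends patrols
# back to the same place, reinforcing the data set.
import random

def simulate(years=15, seed=0):
    rng = random.Random(seed)
    true_rate = {"A": 0.05, "B": 0.05}  # identical real-world behavior
    arrests = {"A": 60, "B": 40}        # historical patrol bias baked in
    patrols = {}
    for _ in range(years):
        # "Predictive" allocation: the neighborhood with more recorded
        # arrests gets 8 of the 10 patrols, the other gets 2.
        riskier = max(arrests, key=arrests.get)
        patrols = {h: (8 if h == riskier else 2) for h in arrests}
        # Each patrol makes 20 stops; offenses are observed at the
        # true (equal) rate, so more patrols means more recorded arrests.
        for h in arrests:
            for _ in range(patrols[h] * 20):
                if rng.random() < true_rate[h]:
                    arrests[h] += 1
    return arrests, patrols

arrests, patrols = simulate()
print(arrests)  # the initial 60/40 gap widens, despite equal true rates
```

The point of the sketch is that nothing about neighborhood A is actually riskier; the only thing the model ever measures is where it already sent the patrols.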
Caroline Sinders: I think some of the investigative reporting that ProPublica has done specifically on this is really worth highlighting.
(Editor’s Note: Parts of this partial transcript were rearranged slightly for flow and readability.)
This time last year, my friends Chelsea Manning and Heather Dewey-Hagborg were still depending on me to communicate effectively, so they could collaborate on their art and research projects together.
Heather and Chelsea’s collaborations started way back in 2015, when some folks at Paper magazine orchestrated their first collaboration. Using only the good old U.S. mail, Chelsea sent Heather swabs of her DNA and answered a number of powerful questions to start a discussion about DNA and privacy that continues to this day. Boy was it cool being in the middle of that conversation. 🙂
Heather had developed a method for creating portraits of strangers based on DNA, and was speaking around the world, explaining both the wonders of forensic phenotyping, and how the technology is inherently problematic.
Meanwhile, in 2013, Chelsea, in prison, was not allowed to be photographed or recorded. (After a charismatic Daniel Ellsberg won over the hearts of millions in the 1970s, the Feds sure weren’t going to make that mistake again.)
The way Chelsea’s voice was being silenced angered and, ultimately, intrigued Heather. Perhaps she could turn it around into something anti-oppressive, and utilize this technology to give Chelsea the public face she had been denied, by creating portraits of her, from her DNA.
By the time I came along, they had already completed Stranger Visions and Radical Love – so I had a lot of catching up to do, at first, just to understand Chelsea’s artistic preferences. Chelsea would often have ideas that she had written up, and I would take notes and read them back to Chelsea exactly, so I could convey the information accurately to Heather. Then Heather would write back with her ideas, and I would have to make sure I understood those well enough to explain them to Chelsea the next time we spoke.
On November 23, 2016, Heather sent me an excited email with a great idea for “taking some of the writing I have been doing and working with an illustrator to make a comic book or animation bringing things to life.” This would become the Suppressed Images comic book, which tells the story about their friendship and artistic collaborations, and specifically how Chelsea learned it’s important to “Never Shut Up” when someone tries to chill your speech.
Although both Chelsea and I loved the idea, we didn’t think there was enough time to complete the project, and didn’t want to pressure them. But Heather and illustrator Shoili Kanungo worked very very hard to meet their own intense deadlines, in order to finish it in time to be published during President Obama’s last week in office. (When, historically, commutations happen.) As it turned out, it was published in the morning on the same day her commutation was announced. (As the White House announced Chelsea’s commutation in the early afternoon, east coast time.)
This comic book had become our attempt to visualize a reality where Chelsea was commuted – in a world where everyone else had told us that it was impossible. (Now, amazingly, in that same world, everyone acts like it was inevitable 🙂)
I was asking different people all over the world to visualize Chelsea out in the regular world with them. Playwrights visualized Chelsea sitting in the audience at their plays. Band members pictured her rocking out at their shows. DJs pictured her dancing to their beats. And now, this comic book literally provided illustrated pictures of the possibilities.
Just one month after the comic book was released — and, of course, one month after we had received the good news about Chelsea’s upcoming release — in February 2017, I got to do it again.
This time last year, it really did still feel like a dream. Chelsea was still in prison, but we had begun working on the expanded installation – our collaboration on Probably Chelsea – that would mark and celebrate her release. I shared some initial rough ideas with the audience at the end of the talk, although they grew and changed in important ways over the next months.
I also described to the audience our graphic short story, which Chelsea and I wrote together with illustrator Shoili Kanungo and which advocated for and envisioned her commutation, and told them the amazing and miraculous story of publishing the comic on the very morning that Obama actually granted her clemency.
Now I am so incredibly honored, humbled, inspired, and filled with gratitude to be able to stand on the stage together with Chelsea in person to discuss art, technology, and politics — like it’s just totally normal to be here together.
LR: I remember the morning we heard the news, on January 17, 2017.
HDH: For me, on the east coast, it was in the early afternoon 🙂 The comic went live early morning my time, and I heard about the commutation that afternoon. The announcement of Chelsea’s commutation was one of the most jubilant and overwhelmingly emotional moments of my life, and certainly of my artistic career. It was incredibly meaningful to me.
LR: It also felt important the whole time you guys were working on that comic book too. It was an incredible experience for me, as a historian and archivist, to not just meet or read about, but actually be the conduit that worked between you guys on those projects (Suppressed Images and Probably Chelsea).
HDH: Looking back, it was, and is, such a dark time. After Trump in the U.S. and Brexit in Europe, Chelsea’s commutation was like a beacon of hope; a way of showing us how important it is to really incant the future you want to see, and how the power of words can be used to make the changes you want.
Of course her release was the result of a lot of different things coming together — and the comic we wrote, anticipating, asking for her release, felt like this little sprinkle of magic potion that catalyzed this reaction. To revisit that today is such a powerful reminder that positive change is really possible.
John Perry Barlow was a close collaborator and dear friend, since the first day I met him, in 2002. He was extremely encouraging, spoke at many of the events I organized, and was there for me generally, as I followed the breadcrumbs of my archival adventures. First as Dr. Timothy Leary’s Digital Librarian, next as the co-founder of Aaron Swartz Day, and most recently, as Chelsea Manning’s Archivist.
Barlow was a very exciting person to work with. In the beginning, there wasn’t much pressure during our meetings, while he answered questions about Dr. Timothy Leary, who he had a close and very interesting — albeit sometimes strained — relationship with. He was helping me fill in the little details between the overlapping stories I had heard from others. I often showed up with a box of artifacts from whichever specific time period I wanted to discuss that day. Although the lives of psychedelic folks in the 1960s and 1970s are often portrayed as footloose and fancy free, their real lives weren’t really like that most of the time (except when they really really were :).
I would often ask him to confirm specific facts for me, and would end up hearing completely different stories that took place around the time period in question. Being Dr. Leary’s Digital Librarian, I liked to know the story behind every artifact. Since I wasn’t alive yet, much less there, when a lot of things took place, my job, most of the time, amounted to collecting and comparing notes from everyone I could find who was there.
I would often get conflicting stories about how certain events played out, and I was usually hoping that John Perry’s account of events could break the tie. Unfortunately, his accounts did no such thing. More often than not, he would say that something different altogether had taken place, making it so the only thing I knew for sure was that the “official” story was wonky. Nevertheless, it was always quite amusing hearing his take on famous figures — Dr. Timothy Leary, Ralph Metzner, Baba Ram Dass, Bobby Weir, Jerry Garcia — or hearing him tell (and re-tell) the story of forming the Electronic Frontier Foundation with Mitch Kapor and John Gilmore.
These last few years, as things became more intense around my work, if I stopped by for anything, he would literally drop everything (give or take an hour 🙂 to see me and help me figure out what I needed, and quickly. Often, it seemed as if he was dealing with two or three other critical situations at the same time, and I felt quite honored to be included in his circle of intensity.
In 2009, I worked with the Leary Estate to put on a family reunion and party for close friends. John Perry Barlow was there, and said a few words:
From that talk:
And, this archive…”will attempt to tell a complex story from a number of different points of view”… and will attempt to encapsulate Timothy Leary, who was truly the most paradoxical and vexing and inspiring and maddening of human beings. He probably had more to do with introducing people to the spiritual matter than practically anybody who wasn’t born in the desert someplace, and yet he was, for much of his life, a profoundly anti-spiritual man.
He was a very loving man, who introduced a lot of people to a greater depth of love…
I think it’s really important to decode this guy. He was one of my friends. I knew him from ’65 until he died, pretty continuously, and I loved him dearly. But it’s important to take him apart, and figure out who he really was, ’cause you can learn a lot about America, from learning a lot about Timothy Leary.”
By Lisa Rein
I’m here to tell you about this weekend’s hackathon and celebratory festivities, and also explain a few things about how these things all weave in and out of our existing MONDO-world. It’s a TRIP.
I co-founded this event with Brewster Kahle, after Aaron’s death, in 2013. The Aaron Swartz Day and International Hackathon is an annual event that encompasses an entire weekend — celebrating Aaron’s life and providing yearly updates for many of Aaron’s collaborative projects that are still thriving today.
Who was Aaron Swartz? Well, the Aaron Swartz that I knew really well was just a 15-year-old kid who helped me do my job better at Creative Commons, when I was its Technical Architect, working with Lawrence Lessig in 2001-2002. We were using RSS news feeds to describe copyright licenses.
Yeah. It’s as boring as it sounds, and that’s why people don’t think about it unless they have to. Our job was to make it easy for them to insert some information about their Creative Commons license in the existing places — metadata fields in a .jpg file, or an mp3 file, etc. Aaron and Matt Haughey came up with the idea of asking a series of questions to help people determine what license they want, which turned out to be the hard part for artists. (Here’s a table I have, actually, that makes that choice a bit easier.)
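As a rough illustration of what “using RSS news feeds to describe copyright licenses” can look like, here is a minimal sketch that attaches a license URL to a feed item through a namespaced element. The namespace, element names, and feed contents here are assumptions for illustration; the actual Creative Commons metadata work used RDF-based vocabularies, and this is not the exact markup that shipped.

```python
# A minimal sketch (illustrative, not the historical markup) of embedding
# a Creative Commons license in an RSS item as machine-readable metadata.
import xml.etree.ElementTree as ET

# Namespace assumed for illustration; register a readable "cc" prefix.
CC_NS = "http://web.resource.org/cc/"
ET.register_namespace("cc", CC_NS)

item = ET.Element("item")
ET.SubElement(item, "title").text = "My Photo Essay"
ET.SubElement(item, "link").text = "https://example.com/essay"

# The license itself is just a URL that software can follow and parse.
lic = ET.SubElement(item, "{%s}license" % CC_NS)
lic.text = "https://creativecommons.org/licenses/by-sa/4.0/"

xml = ET.tostring(item, encoding="unicode")
print(xml)
```

The appeal of the approach is exactly its boringness: once the license lives in a predictable metadata field, any feed reader or search engine can discover it without human help.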
But I digress…
Aaron allowed me to be successful in my Creative Commons “mission” from Lawrence Lessig. We used RSS to describe copyright law, and, as it happened, so much more. It happened. Perfectly. Because Aaron knew just how to do it, and Lawrence and I let him, even though he was 15 years old.
I’ve also worked with Brewster digitizing some of the Timothy Leary Archives, since I am Timothy’s Digital Librarian, and now, also, Chelsea Manning’s Archivist. (Not to be confused with Michael Horowitz, who is Timothy Leary’s Archivist. Michael and I collaborate on the Timothy Leary Archives and Michael’s own archives from that time period.) Over these last two years, since I’ve been Chelsea’s Archivist, he’s given me oodles of excellent advice.
The Open Library, which is one of the projects people can hack on at the hackathon this year, started out small, although its goals were quite large: aspiring to create “a web page for every book.” Now, just over ten years later (Aaron started it circa 2007), Open Library is the world’s free digital library, with over 2M public domain books and another 500k+ books available to be borrowed and read in the browser. Even when the Open Library itself doesn’t have a digital copy, it can connect readers to libraries that do. So far, Open Library has collected information about over 25M book records.
After the Open Library, Aaron went to Stanford for a semester, dropped out, and founded a Y Combinator startup that was later merged into Reddit. Reddit was bought by Condé Nast, which wasn’t quite Aaron’s style, so he left. He was an Ethics Fellow at Harvard when the now-famous events took place.
So, what did Aaron do to get in so much trouble? Well, you’re not going to believe this:
Aaron downloaded a bunch of journal articles over an open network at MIT.
No, seriously. That’s what he did.
The actions that the U.S. government took against Aaron (making up hacking charges, stressing him out with surveillance, and raising the fear that those he loved would be interrogated as witnesses in his case) seem to have made him feel like his life, and his entire future, was somehow ruined.
He was kind of a genius and had a lot of projects that are still going. The Aaron Swartz Day community just worked hard to secure Chelsea Manning’s release — and she is our guest speaker.
TICKETS (Use the promotional code “MONDO” and save $35.)
How Aaron Swartz Day started:
It was on the eve of the San Francisco memorial for Aaron that Brewster, myself, and several others all had the same idea: let’s keep up the momentum from all of this inspired action with some kind of event every year. So, for five years running now, we gather in November for an entire weekend of events on what would have been his birthday weekend. There are two goals. One is raising awareness about what happened to him, in order to protect other innovative students from government overprosecution, along with future “hackers” who exemplify the true nature of curiosity and improvement. The other is to draw attention to his projects that are still going strong, such as SecureDrop and the Open Library.
At the same time, in the months that followed, memorial hackathons started popping up all over the world. We approached Yan Zhu, a friend of Aaron’s who was organizing them, about combining forces in November, and she agreed.
As Brewster and I began to create the first event (2013), many people had the same requirement: that the event be forward-thinking and uplifting; it should not be sad or pessimistic, or dwell on what we would have done had we known, except to the extent that doing so might help us protect others in the future.
After a few years of these events, we decided to step it up a notch, and try to think of ways that we could really use our event to make a difference. So, Brewster and I decided we would reach out to Chelsea, see if we could archive her writings or letters or something, if she’d be up for it, and just basically try to find different creative ways to try to make Chelsea Manning’s life in prison a little more livable.
Both Chelsea and Aaron stood up for the ideals of transparency and accountability; ideals that Brewster and I had taught them were so important. Yet, when Chelsea and Aaron stood up for these ideals, they were crushed by the full weight of the government.
There’s more to this than first meets the eye. Our community has always felt bad about not being able to do more to help Aaron. We wish we would have pressed him further about his case, when he was reluctant to discuss it. We wish we would have done this… We wish we would have tried that. We all drive ourselves crazy thinking these thoughts, still, to this day.
All of us that knew Aaron told each other privately that we would have done anything to help him, had we realized the severity of the situation. When I heard Chelsea’s voice over the phone, I realized it was happening again. Except we had a chance this time; Chelsea was still alive, and we could still save her.
The question was, what could we really do? We didn’t know yet – but I knew that if I could find out what she needed, our entire community was ready and willing to help her. So, we decided that we would start by writing her and asking if she’d like to prepare a statement for Aaron Swartz Day. She accepted. (2015 Statement) (2016 Statement).
The rest, as they say, is history.
That’s why this year’s event is especially incredible: because Chelsea Manning is attending in person, after only being able to send us statements from afar, in prison, for two years running. Her speaking to us in person, as a free woman, is definitely nothing less than a dream come true.
Evening Program of Speakers with special guest Chelsea Manning
Saturday, after the San Francisco hackathon, at 6pm, there will be a reception, and we will toast to our community’s accomplishments this year! The program upstairs will begin promptly at 7:30 pm. I’ve just added 50 tickets just for you, Mondo 2000 readers! When you go to buy tickets, enter the promotional code “MONDO” to get a $35 discount off of the $75 ticket price 🙂
Each of this year’s evening event speakers was asked to attend for a very specific reason. Some speakers knew Aaron and worked with him directly, others were inspired by him, or were working on projects inspired by him (such as Barrett Brown’s Pursuance Project). Barrett Brown is fresh out of prison and ready to stir up more folks to become aware of their surroundings.
Other speakers, such as Chelsea Manning, we know Aaron “gushed about” and thought were “so cool.” Jason Leopold is going to teach us about FOIA (the Freedom of Information Act) and about the FOIA requests that Aaron submitted. Also, Jason just got a new dump of files from the Secret Service that look interesting. It’s almost as if we were given a present before the event. Daniel Rigmaiden, who exposed the Stingray from prison in the course of representing himself (once he was able to determine that the Feds had used a Stingray on him illegally in order to determine his location), will also be there.
Here is the complete line-up of speakers with their bios:
Chelsea Manning – Network Security Expert, Transparency Advocate
Chelsea E. Manning is a network security expert, whistleblower, and former U.S. Army intelligence analyst. While serving 7 years of an unprecedented 35-year sentence for a high-profile leak of government documents, she became a prominent and vocal advocate for government transparency and transgender rights, both on Twitter and through her op-ed columns for The Guardian and The New York Times. She currently lives in the Washington, D.C. area, where she writes about technology, artificial intelligence, and human rights.
Lisa Rein – Chelsea Manning’s Archivist, Co-founder, Aaron Swartz Day & Creative Commons
Daniel Rigmaiden became a government transparency advocate after U.S. law enforcement used a secret cell phone surveillance device to locate him inside his home. The device, often called a “Stingray,” simulates a cell tower and tricks cell phones into connecting to a law enforcement controlled cellular network used to identify, locate, and sometimes collect the communications content of cell phone users. Before Rigmaiden brought Stingrays into the public spotlight in 2011, law enforcement concealed use of the device from judges, defense attorneys and defendants, and would typically not obtain a proper warrant before deploying the device.
Barrett Brown – Journalist, Activist, and Founder of the Pursuance Project
Barrett Brown is a writer and anarchist activist. His work has appeared in Vanity Fair, the Guardian, The Intercept, Huffington Post, New York Press, Skeptic, The Daily Beast, al-Jazeera, and dozens of other outlets. In 2009 he founded Project PM, a distributed think-tank, which was later re-purposed to oversee a crowd-sourced investigation into the private espionage industry and the intelligence community at large via e-mails stolen from federal contractors and other sources. In 2011 and 2012 he worked with Anonymous on campaigns involving the Tunisian revolution, government misconduct, and other issues. In mid-2012 he was arrested and later sentenced to four years in federal prison on charges stemming from his investigations and work with Anonymous. While imprisoned, he won the National Magazine Award for his column, The Barrett Brown Review of Arts and Letters and Prison. Upon his release, in late 2016, he began work on the Pursuance System, a platform for mass civic engagement and coordinated opposition. His third book, a memoir/manifesto, will be released in 2018 by Farrar, Straus and Giroux.
Jason Leopold, Senior Investigative Reporter, Buzzfeed News
Jason Leopold is an Emmy-nominated investigative reporter on the BuzzFeed News Investigative Team. Leopold’s reporting and aggressive use of the Freedom of Information Act has been profiled by dozens of media outlets, including a 2015 front-page story in The New York Times. Politico referred to Leopold in 2015 as “perhaps the most prolific Freedom of Information requester.” That year, Leopold, dubbed a ‘FOIA terrorist’ by the US government, testified before Congress about FOIA (PDF) (Video). In 2016, Leopold was awarded the FOI award from Investigative Reporters & Editors and was inducted into the National Freedom of Information Hall of Fame by the Newseum Institute and the First Amendment Center.
Jennifer Helsby, Lead Developer, SecureDrop (Freedom of the Press Foundation)
Jennifer is Lead Developer of SecureDrop. Prior to joining FPF, she was a postdoctoral researcher at the Center for Data Science and Public Policy at the University of Chicago, where she worked on applying machine learning methods to problems in public policy. Jennifer is also the CTO and co-founder of Lucy Parsons Labs, a non-profit that focuses on police accountability and surveillance oversight. In a former life, she studied the large scale structure of the universe, and received her Ph.D. in astrophysics from the University of Chicago in 2015.
Gabriella (Biella) Coleman holds the Wolfe Chair in Scientific and Technological Literacy at McGill University. Trained as an anthropologist, her scholarship explores the politics and cultures of hacking, with a focus on the sociopolitical implications of the free software movement and the digital protest ensemble Anonymous. She has authored two books, Coding Freedom: The Ethics and Aesthetics of Hacking (Princeton University Press, 2012) and Hacker, Hoaxer, Whistleblower, Spy: The Many Faces of Anonymous (Verso, 2014).
Caroline Sinders – Researcher/Designer, Wikimedia Foundation
Caroline Sinders is a machine learning designer, user researcher, and artist. For the past few years, she has been focusing on the intersections of natural language processing, artificial intelligence, abuse, online harassment, and politics in digital, conversational spaces. Caroline is a designer and researcher at the Wikimedia Foundation, and a Creative Dissent fellow with YBCA. She holds a master’s degree from New York University’s Interactive Telecommunications Program.
Brewster Kahle, Founder & Digital Librarian, Internet Archive
Brewster Kahle has spent his career intent on a singular focus: providing Universal Access to All Knowledge. He is the founder and Digital Librarian of the Internet Archive, which now preserves 20 petabytes of data – the books, Web pages, music, television, and software of our cultural heritage, working with more than 400 library and university partners to create a digital library, accessible to all.
Steve Phillips, Project Manager, Pursuance Project
Steve Phillips is a programmer, philosopher, and cypherpunk, and is currently the Project Manager of Barrett Brown’s Pursuance Project. In 2010, after double-majoring in mathematics and philosophy at UC Santa Barbara, Steve co-founded Santa Barbara Hackerspace. In 2012, in response to his concerns over rumored mass surveillance, he created his first secure application, Cloakcast. And in 2015, he spoke at the DEF CON hacker conference, where he presented CrypTag. Steve has written over 1,000,000 words of philosophy, culminating in a new philosophical methodology, Executable Philosophy.
Mek Karpeles, Citizen of the World, Internet Archive
Mek is a citizen of the world at the Internet Archive. His life mission is to organize a living map of the world’s knowledge. With it, he aspires to empower every person to overcome oppression, find and create opportunity, and reach their fullest potential to do good. Mek’s favorite media includes non-fiction books and academic journals — tools to educate the future — which he proudly helps make available through his work on Open Library.
The San Francisco Hackathon is leading the way for hackathons around the world. This year, we are integrating remote hackers from all over the world into our projects, and we are going to stay organized so we can keep hacking on them in the days and weeks to come.
SecureDrop is an open-source whistleblower submission system managed by Freedom of the Press Foundation and originally created by Kevin Poulsen and Aaron Swartz. The goal of SecureDrop is to help media organizations simplify the process of securely accepting documents from anonymous sources. Dozens of news organizations, including The New York Times, The Washington Post, The Associated Press, Vice, The Guardian, The Intercept, BuzzFeed, and Forbes, are now running SecureDrop servers to communicate securely with sources.
The Pursuance System software enables you to create a pursuance (a sort of organization), invite people to that pursuance (with the level of permissions and privileges that you choose), assign those people tasks (manually, or automatically based on their skill set!), and brainstorm and discuss what needs to be done.
From there, you can rapidly record exciting ideas and strategies in an actionable format (namely, as tasks), share files and documents, be notified when relevant events occur (e.g., when you are assigned a task or mentioned), and effectively get help from others. Here’s an interview with Barrett Brown and Steve Phillips explaining Pursuance in more detail.
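To make the idea of skill-based task assignment concrete, here is a minimal sketch in Python. The data model (Pursuance, Task, member skill sets) and all names are assumptions for illustration only; they are not the actual Pursuance System schema or API.

```python
# Hypothetical sketch of Pursuance-style organizations with automatic,
# skill-based task assignment. All names and structures are illustrative
# assumptions, not the real Pursuance System implementation.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class Task:
    description: str
    required_skills: set          # skills needed to complete the task
    assignee: Optional[str] = None


@dataclass
class Pursuance:
    name: str
    members: dict = field(default_factory=dict)   # member name -> set of skills
    tasks: list = field(default_factory=list)

    def invite(self, member: str, skills: set) -> None:
        """Invite a member, recording the skills they bring."""
        self.members[member] = skills

    def auto_assign(self, task: Task) -> Optional[str]:
        """Assign the task to the first member whose skills cover it;
        return None to fall back to manual assignment."""
        self.tasks.append(task)
        for member, skills in self.members.items():
            if task.required_skills <= skills:
                task.assignee = member
                return member
        return None


p = Pursuance("FOIA research")
p.invite("alice", {"python", "foia"})
p.invite("bob", {"design"})
scrape = Task("Scrape agency reading rooms", {"python"})
p.auto_assign(scrape)
```

A real system would also enforce the per-member permission levels the description mentions; this sketch shows only the matching step.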
OpenArchive is a free, open-source application for Android, available on the Google Play Store, that enables you to send your mobile media directly to the Internet Archive over Tor (Orbot) and choose what metadata and Creative Commons license to include with it. The primary goal of the app is to empower the user to easily archive photos, video, and audio from their mobile device to a secure, trustworthy, and remote storage service.
Come join members of the Open Library team on Sunday, November 5th, and work directly with them to turn your ideas and suggestions into empowerment for an international audience.
Open Library is the world’s free digital library with over 2M public domain books and another 500k+ books available to be borrowed and read in the browser. Started circa 2007 by Aaron, the vision of Open Library is to be an open wiki catalog of every work ever published. So far, Open Library has collected information about over 25M book records, empowering readers with data to locate books even when Open Library doesn’t have a digital copy. Over 100,000 readers borrow books on Open Library each month, but there’s a lot we aspire to do to make our library experience more accessible and useful to readers world-wide.
Right now, citizens have to play a guessing game with law enforcement in their town. Police departments are not required to have a policy on the purchase and use of surveillance equipment unless there is public outcry for them to do so. At Aaron Swartz Day this year, we aim to provide a public outcry model: automating the process of filing multiple public records requests that ask about every known variation of surveillance equipment, providing a template for those requests, and providing another template to demand that your city government implement a policy on how surveillance is used on the residents of any given town. Then, we’re going to split up into “follow-up groups,” whose job it is to keep making calls and sending emails until the local governments take action.
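The batch-filing step described above could be sketched as follows. This is a minimal illustration only: the template wording, the equipment list, and the function names are hypothetical placeholders, not the actual Aaron Swartz Day materials or any official request language.

```python
# Hypothetical sketch: generate one public records request per known
# variation of surveillance equipment. Template text and equipment list
# are illustrative assumptions, not official materials.

TEMPLATE = (
    "To the {department},\n\n"
    "Under applicable public records law, I request all records concerning "
    "the purchase, deployment, and usage policy of {equipment}.\n"
)

# A few commonly discussed equipment categories, as placeholders.
EQUIPMENT = [
    "cell-site simulators (Stingrays)",
    "automated license plate readers",
    "unmanned aerial vehicles (drones)",
]


def draft_requests(department: str) -> list:
    """Return one ready-to-file request letter per equipment variation."""
    return [
        TEMPLATE.format(department=department, equipment=item)
        for item in EQUIPMENT
    ]


letters = draft_requests("Oakland Police Department")
```

Each letter can then be filed with the relevant department, and a follow-up group can track which requests have received responses.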
Efforts are in the final stages in both Oakland and Berkeley, and both cities should have laws in place by the end of the year. We’re going to use them as examples for the rest of the country.