Episode 4: Richard M. Stallman Transcript

Chris: Richard Stallman, welcome to Making Better!

Richard: Thanks for asking me on.

Chris: You’re most well-known for being the father of the free software movement, and I don’t believe most of our listeners know anything about free software. So can you start, maybe even by the most fundamental of definitions, by telling us the difference between “free” as in freedom and “free” as in gratis…

Richard: Well, I don’t use the word “free” to refer to price, I avoid that for the sake of clarity. Free software means software that respects the freedom and community of the program’s users. So it’s a matter of freedom, we’re not talking about price at all. In fact, when I want to say that something is available for zero price, I call it “gratis,” because that’s an unambiguous, clear word. And every time I say “free,” it refers to freedom, never to price. But why is it important for software to respect your freedom and your community? Well, with any program there are two possibilities: either the users control the program, or the program controls the users. You’ve probably heard about the issues that happen when companies control seeds, or companies control medicines, or companies control how you can watch TV or connect to the internet. These companies use their power to mistreat people. Well, it’s just the same with software companies. Nowadays, if you’re running in your computer a program that is not free software, whatever company owns that program has power over you, because it controls the program that tells your computer what to do. And the company’s programmers, they know that, and they are probably designing that program to give the company an advantage over you, to put you at a disadvantage. The companies do this by putting in malicious functionalities, like they make the program spy on its users, or they design it to refuse to do the things that the users will want, ‘cause it’s to their advantage not to let users do what they want. They may have it depend on a server, which might get switched off; they could make it censor users, it can have a backdoor which can be used to do whatever they want to do to users whenever they want to do it. The Amazon e-book reader—we call it the Amazon Swindle—has a backdoor to erase books. 
And Amazon used this backdoor to send a command to erase thousands of copies of a particular book one day, and can you guess what book it was? It was 1984 by George Orwell. I’m not making this up! If I were writing fiction, that would seem just too unbelievable, I wouldn’t dare pick that example. But that’s what really happened!

Chris: 1984 meets Fahrenheit 451.

Richard: Well, in 1984 the government burned books, too, you know.

Chris: Yes. So you laid out the four fundamental freedoms of software freedom—could you go through them briefly, the four freedoms?

Richard: Sure, but I want to explain how they figure in. I say that in order for the program to respect users’ freedom, the users have to have control over it. They need to have control separately, but also collectively. Now, there are four particular freedoms that the users need in order to have control, so these four freedoms—we call them the four essential freedoms—actually make a practical definition of free software. So, freedom zero is the freedom to run the program any way you wish, for any purpose. Freedom one is the freedom to study the program’s source code (those are the plans that a programmer can understand) and then change it so that it does things the way you wish. In other words, so that it works differently, any way you like. These two give each user separately control over the program, but in order to exercise freedom one, you need to be a programmer. Most users are not programmers, but they still deserve control over their computing, which requires control over the programs they run. How does a non-programmer get to have control? Through collective control, which is the freedom for a group of users to work together to exercise control over what that program does. So maybe you and five other users decide to make certain changes, and then some of you, who know how to program, write those changes, and then you all use the result, including those of you who don’t know how to program. So this is the way non-programmers can participate in getting programs changed the way they wish. Collective control requires two additional essential freedoms, which adds up to the four essential freedoms. So, freedom number two is to make exact copies and redistribute them to others, either give them or sell them, when you wish. And freedom three is to make copies of your modified versions and give or sell them to others when you wish. So with this, the group’s members can cooperate.
If one member of the group makes a modified version, then with freedom three, that person can make copies of it and distribute them to others in the group. Then they, with freedom two, can make exact copies of that and distribute them to others in the group, and this way everyone in the group can get a copy of this new, modified version. So this is why those four freedoms are all essential: so that users have individual, separate control and collective control.

Chris: The comment I was about to make was about the second freedom. I was going to say that the Free Software Foundation website has a list of other volunteer opportunities, for people who are not programmers, to help free software.

Richard: That’s right, there’s a lot of work that this movement needs which is not programming, some of it which is not particularly technical. It’s like any other political movement, we need speakers, we need people to organize local activist groups and look for opportunities to try to move things to freedom. So you don’t need to be a programmer. Take a look at GNU.org/help and you’ll see our list of many different kinds of work we need.

Francis: If the free software movement succeeds, how will that improve the average person’s life?

Richard: Well, it means that you won’t be mistreated by the software in your computer. You know, right now many programs are spying on people. We recently found that programs sent sensitive personal information to Facebook, because the developers of those programs, their mobile apps, use the Facebook library that sends data to Facebook. But you’ve got to expect programs to be spying on you if they’re non-free; that’s why I don’t use any of them. I just reject them all. If there’s something that would be convenient but it means using a non-free program, I won’t do it. I defend my freedom. I won’t sacrifice my freedom for convenience; the first step in having any freedom is not trading it for convenience. But the thing is, if you make that trade, eventually you’ll be at such a big disadvantage vis-a-vis companies that you’ll probably end up being lured into throwing your money away and being mistreated constantly by companies that profile you. Your wallet will be emptied, your democracy will be emptied, your elections will be manipulated. And if the government knows everything about everyone (and of course whatever the companies collect in the US, the US government gets to take from them), then heroic whistleblowers like Edward Snowden will be impossible, because they’d be caught right away. If the government knows who goes where, what each one does and who talks with whom, there is no privacy left. You can see what that looks like in China, and that’s where the United States is heading. It’s time now to organize and fight and say, “stop talking about regulating the use of our data, and stop collecting it at all.”

Francis: We live in a generation of kids right now who haven’t been brought up to appreciate privacy as a right, even.

Richard: That’s true, it’s very threatening. But there’s no use giving up. You see, I don’t need to be an optimist to keep fighting. I know that it’s better to fight than to surrender. If you surrender you’re immediately defeated. What good is that?

Francis: I agree completely. So what do you do about all this information that’s been garnered about everyone that’s on Facebook and all these other sites that have been tracking people—is that information just out there ready to be exploited?

Richard: Well, it is. Theoretically we could pass laws requiring them to delete it, but even if we don’t delete the old information, if we stop the collection of more information, that will gradually solve the problem of the old information. It will become less applicable and less complete, and eventually it won’t be enough to be a basis for tyranny by itself.

Chris: I’ve heard you talking, and you use a phrase that I really like: that if you use Facebook you’re not actually a “user,” you’re a “used.”

Richard: Correct. Facebook uses people; people don’t use Facebook. Facebook uses people to collect data about them and about other people. If you talk with people through Facebook, then those people are pressured by you to be used by Facebook, to give data to Facebook. And when they do so, they’re also pressuring others. So they may be pressuring you to communicate through Facebook, while you’re pressuring them to communicate through Facebook. People are pressuring each other. And my conclusion from that is, it’s really our duty to refuse to be used by Facebook. But it’s important to note that Facebook collects data about people in a number of other ways, even if they don’t have Facebook accounts. For instance, there is a period tracking app for women which was sending data to Facebook, and these women didn’t necessarily have Facebook accounts. It was still getting data about them, so of course it can find out when they’re pregnant, and not only that. Lots of programs have been sending data to Facebook in that way. And websites also send data to Facebook: if you look at a website and you see a “like” button, the Facebook server knows that your machine visited that page.
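The mechanism Richard describes can be sketched concretely. When a browser renders a page with an embedded third-party button, it makes a request directly to that third party, and the request carries headers that identify both the page being read and, via a cookie, the visitor. This is an illustrative sketch only; the host, path, and cookie names here are hypothetical, not Facebook’s real endpoints:

```python
# Illustrative model of the third-party request a browser makes when it
# renders an embedded "like" button. The host, path, and cookie values
# are hypothetical, not Facebook's actual endpoints.

def third_party_request(page_url, tracker_host, cookie):
    """Model the HTTP request the browser sends to the tracker's server."""
    return {
        "host": tracker_host,         # the tracker is contacted directly
        "path": "/plugins/like-button",
        "headers": {
            "Referer": page_url,      # tells the tracker which page you visited
            "Cookie": cookie,         # a persistent ID ties your visits together
        },
    }

req = third_party_request(
    page_url="https://example.org/some-article",
    tracker_host="buttons.tracker.example",
    cookie="visitor_id=12345",
)

# The tracker's server log now links visitor 12345 to that article,
# whether or not the button is ever clicked.
print(req["headers"]["Referer"])
```

The same pattern applies to any embedded third-party widget, which is why Richard notes below that similar buttons from other companies do the same kind of spying.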

Chris: And we actually, after you and I went back and forth in email a little bit, Richard, decided not to have any “like” buttons on our website, because Amanda, our web person (who you met in Cambridge once), looked at them and they all execute JavaScript…

Richard: Right…but the point is, it’s not just executing JavaScript, it’s malicious JavaScript. It’s a non-free program which is malicious, as many non-free programs nowadays are. It’s the usual case. That may sound like a shockingly strong statement, but people have done studies of, I think, the thousand most popular Android apps at the time, and found that the majority of them were spying in one particular way that was easy for the researcher to detect. Of course the rest could be spying in other ways, but without knowing that, the researcher had already proved that most of them were spying. And by the way, there are a lot of other things similar to the “like” button, but from other companies, and they’re all doing the same kind of spying. But we have a browser which blocks all of that; it’s called IceCat. Obviously, it’s a modified version of Firefox, and we’ve put in various features to protect users’ privacy.

Chris: And there are some social media outlets that are ethical…I’m thinking of GNU social, and Megadon perhaps?

Richard: You mean Mastodon?

Chris: Mastodon.

Richard: Yeah. Mastodon is actually a modified version of GNU social. Yes, they’re ethical in two ways. One is that the client software you need is all free software, and the other is that the networks are run to some extent by the community, and therefore they’re not constantly being designed to get people to give more data. Of course, you still have to think about what you’re going to say to people. What you say in any social network, no matter how ethically run it might be, is going to be visible to a lot of people. So you should think twice before saying things. You know, don’t post photos of other people, because publishing a photo with other people in it is helping to track them. You should ask them whether they really want to be tracked. Don’t post photos of minors, because those photos will be part of their permanently available data, available to whoever it might be, depending on how you posted it. The point is, that’s someone who hasn’t had a chance to think about whether that’s a good idea, and the photo could conceivably be used to hurt that person, or even you.

Chris: So it brings us to a question of consent.

Richard: Well, a baby can’t consent to anything. So, people don’t realize how dangerous massive surveillance is. But if you want to understand that, read about China’s social credit system, which is designed to keep track of people’s behavior in many dimensions, and then punish or reward people, depending on how much the state likes what people are doing.

Francis: What do you think, say, 50 years from now, would be the ultimate landscape for how software is offered, the nature of software? Say, like 50 years from now.

Richard: I can’t tell you. Nobody can predict the future 50 years from now; it’s fatuous to try. Humans and technology may not both exist: technology might exist and no humans, or humans might exist but no technology. Robots powerful enough might consider humans to be a dispensable toy, or an inconvenience. On the other hand, a global heating disaster could destroy globalized manufacturing and leave humans without the ability to make high-tech goods; some would still survive, but nowhere near as many as today.

Chris: What do you consider are the most successful pieces of free software out there today? I mean, in my mind, the internet wouldn’t work without free software, so…

Richard: I don’t know if that’s true. I can’t say that I’m sure the internet would not work without free software; free software has, for much of the internet’s existence, played an important role in its use. But there were other programs to do those things, and so there could still have been an internet without them, it just wouldn’t have respected our freedom at all. But the most important free software is the GNU/Linux operating system; that’s what makes it possible to run a PC with only free software. And then Android has a relationship with free software. It’s not a simple relationship: a large piece of Android is released by Google as free software, but it’s not everything you need to actually run a device, so you’ve got to either pick your device carefully and put in some free replacements, or run non-free programs along with it. And then many of the original apps of Android were non-free, and over the years some apps that were free have been replaced with non-free programs by Google. And the next thing is, if you get an Android device, it’s not certain that the programs distributed in that device are free—they could be non-free modified versions of the free programs, or they could be other programs that are non-free. It’s something that can only be checked on a case-by-case basis. There is a free variant of Android called Replicant; it works on certain models, and it’s entirely free software.

Chris: Does Replicant also cut down on the surveillance as well?

Richard: Well, to some extent. There are various technical methods to do surveillance. Many non-free programs spy on the user and send personal data to somebody. Well, if you don’t run those programs, if you run only free software, you’re pretty much safe from that. It’s unusual for free programs to spy on the user, because the users can fix that. That means that the users, if they care enough, can make sure that the program doesn’t spy on them, and this is a powerful deterrent against anyone putting surveillance functionalities into free software. It’s not guaranteed; you may have to check which version you’re going to use to avoid surveillance. But at least it’s possible to make a modified version which doesn’t spy. In any case, though, that’s not the only form of surveillance. For instance, there’s an increasing danger from cameras on the street that recognize license plates and recognize faces. Now, that doesn’t work through our computers; we can’t block it by changing the software in our computers. We have to organize and get laws to say that face recognition systems may not recognize any face except somebody put on the list by a court order.

Francis: I’m imagining that if this face recognition, the computer could look at you, your laptop could look at you, figure out who you are and then market…

Richard: No, no, no, no. Your laptop knows who you are. If it’s running non-free software, it’s very likely spying on you also, but if it’s running free software and you use a system distro that’s developed by people who care about protecting privacy, it probably isn’t spying on you. No, I’m talking about these cameras that are placed in the street; they’re not on your laptop. They watch everybody who passes by. And Amazon is now recruiting people to put video systems, you know, at their doors to watch everybody who comes up, and the video, I believe, just gets stored by Amazon, and so the FBI, I believe, could collect it all with a National Security Letter. But in addition, there are programs to get people to sign up to send some of this information to cops. I know in Detroit, there’s a program where the cops tell businesses, if you want us to show up quickly for a 911 call, you’d better have a camera that we can watch through all the time. Well, cameras that they can watch through all the time should be flat-out illegal!—unless a court says put a camera right there for the next five months. These strong measures are what it will take to give us actual protection from total government tracking of what we do, and anything less won’t do the job.

Francis: I think a big hit on freedom was the Patriot Act and how that undid a lot of these freedoms.

Richard: I like to call it the “Pat-riot Act,” ‘cause in a country based on an idea of freedom, there is nothing less patriotic than that law. The idea of the US is that the government shouldn’t have total control over what people do. It shouldn’t know where everyone goes. It shouldn’t know everyone you talk with.

Francis: Well, how do we get back to that sort of awareness in people?

Richard: We have to change laws. I don’t have a magic recipe for how to win this campaign, but I can point out what we need. Let’s think of the Green New Deal. For many years, most of us—except denialists—have acknowledged that global heating is very dangerous and that it needs to be stopped. But you’ve seen, over and over, inadequate proposals, proposals that wouldn’t do the job. And now finally you see Extinction Rebellion and the students’ climate strike and the Green New Deal, where people are demanding that solutions be considered that would really solve the problem, that are not obviously insufficient. Well, I’m saying the same kind of thing about how to avoid having a surveillance state that represses dissent. We have to put an end to the collection of so much data that it could repress us.

Francis: Are any bills in Congress along these lines?

Richard: Not that I know of. But in Massachusetts, a bill has been submitted to stop the state from doing face recognition. It’s presented as a moratorium until proper regulations can be adopted. But in the meantime, it says none at all. There are very limited exceptions, which I believe depend on a court order. So that’s the kind of solution that’s needed, except it has to apply to private companies as well.

Francis: Maybe we need a Free New Deal, like getting back the freedoms we gave up because of the 9-11 hysteria back in the Patriot Act days. I think a lot of what was promoted at that time was supposed to be temporary, like emergency measures, too, not like a permanent …

Richard: Exactly. That’s what typically happens. That law was initially temporary, the Pat-riot Act, and it got renewed a few times, and I think ultimately made permanent, if my memory serves. But it shouldn’t have been passed at all. It was obvious at the time that this was a worse attack than the September 11th attacks themselves. In fact, on that day I started to write an article telling people what the next target would be: our freedom would be the next target. In that article, I urged Americans, in their frustration about being unable to fight back against the bombers, not to start destroying their own country’s freedom instead.

Francis: And Obama was such a disappointment. I mean, the way Obama treated whistleblowers, instead of maybe seeing Snowden as a hero with what he did…I mean, that to me was very telling about who Obama really was.

Richard: Well, I could tell that before the election. His slogans were “Hope” and “Change”—could you get vaguer than that? It was clear that he wasn’t the kind of person who would really try to solve the country’s big problems. I wasn’t knowledgeable enough to foresee the specifics of his disappointing actions, his inability to push hard for anything, and his subservience to the banksters, when the central problem of the United States is plutocracy. You can look at so many different areas, whether it’s surveillance, or pollution, or impoverishment of most people, or imprisonment, or the failure to curb global heating, and what you see is, plutocracy’s involved. I couldn’t see that much detail about him, but I did see that he was not proposing firm and major moves to address these problems.

Chris: I got excited about President Obama when he was still a candidate, because he was the first major Presidential candidate I’d ever heard who included disability in his speeches, and then he turned out to be an absolute disaster for people with disabilities. His Justice Department and his Federal Communications Commission were just an absolute failure for us. It was just a phenomenal disappointment.

Richard: Yeah. Well, it’s one more disappointment among many others from Obama.

Francis: He didn’t actually prosecute all the people that were doing the, uh, stuff the Bush team was doing that was obviously illegal, or the bankers, the banksters. His basic idea was, “we can’t look back, we have to look forward.” I was thinking, you know, if any lawyer tried that in a court, like “Judge, yeah, he killed 50 people, but we can’t look back, we have to look forward”—what kind of justification is that anyway? But somehow he pulled it off.

Richard: Yeah, they stole a million Americans’ houses with fraudulent foreclosures; let’s not look back at that… But it was Obama, and it was Holder, I think was his name, the Attorney General, who let the banksters off the hook and protected them, and when states were going to prosecute them for this fraud, it was Obama and Holder who pressured them to let it drop.

Chris: But if you compare what our Attorney General did in the banking crisis to Iceland, where 35 bankers in Iceland all went to jail—little Iceland actually prosecutes criminals while big United States gives criminals handouts.

Richard: Yeah. Well, part of that is because the companies are so big that it’s hard to fight them. Well, I have a proposal for how to make all these companies get smaller. The proposal is a tax. You could think of it as a progressive sales tax: the bigger the total company is, worldwide, the higher the tax rate. This way, it pressures companies to split themselves up into independent businesses, because then each one will pay less tax.
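The incentive Richard describes can be shown with a toy calculation. The rate schedule here (a base rate plus a surcharge that grows with worldwide revenue, capped) is entirely made up for illustration; the proposal itself names no particular numbers:

```python
# Toy model of a progressive sales tax whose rate rises with a company's
# total worldwide revenue. All of the numbers are made up for illustration.

def tax_rate(revenue_billions):
    """5% base rate plus one percentage point per $1B of revenue, capped at 50%."""
    return min(0.05 + 0.01 * revenue_billions, 0.50)

def tax_owed(revenue_billions):
    return revenue_billions * tax_rate(revenue_billions)

# One giant company with $20B in worldwide sales:
giant = tax_owed(20)        # taxed at a 25% rate: $5B

# The same business split into four independent $5B companies:
split = 4 * tax_owed(5)     # each taxed at a 10% rate: $2B combined

# Splitting up cuts the combined tax bill; that difference is the
# pressure the proposal relies on.
print(giant, split)
```

Whether that pressure outweighs economies of scale depends on how the schedule is tuned, which is exactly the open economic question Richard acknowledges.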

Francis: Whoa. [claps] I love that idea! It’s so simple, and it has to work.

Richard: Well, it has to work if it’s tuned right. You know, there are economic questions in there; I can’t be sure it would actually work, but I’m sure it’s worth studying to see if you can tune it so you’ll get good results. It’s much better, though, than existing almost-nugatory US antitrust law, which requires suing the company and proving it did something in a very small range of prohibited kinds of actions, and only then can it be split up—split up in a way chosen by the government, which again is not limited by certain criteria…that’s not the way to do it. This tax would pressure companies to find the way to split themselves up, to find an efficient way, and that would still give you n smaller companies instead of one giant company.

Francis: Well, you know, part of how we got here was this kind of scam that I think Reagan really started, where the free market became sort of a religion, and it was the answer to everything is just free market, free market, no regulations, and that sort of thing.

Richard: Right. It’s the cult of the invisible hand.

Francis: Yeah. And even with regards to free software, how do you think the free market system has backfired in terms of the common interest people have?

Richard: I wouldn’t say that. You know, free software can exist in a free market; in fact, it does. Of course, every market has regulations. Some markets have arbitrary regulations that limit participation to certain entities; that’s usually what’s meant by “not a free market.” But there are always regulations on any market. If you tried not to have them, everybody might start cheating, unless they know each other well enough that they won’t. And markets are unstable in many ways that economists know about, though in neoclassical economics they tend to close their eyes to this fact. The reason that we have antitrust laws is that, 100 or so years ago, Americans realized that the market was unstable against mergers and trusts that would conspire to fix prices. So laws were needed, and were adopted, to stop that. Well, when the market hits an instability, if you don’t want to let society fall into that instability, you need laws to stop it. You need regulations of the market. So I wouldn’t say that free software is in any way contradictory to the free market. But there are certain practices that are harmful and shouldn’t be done. Our society, our legal system and the ambient philosophy promote the practice of non-free software, which is basically subjugating people through their computers. If you want software to treat you ethically, the way to get that in practice is to insist on free software, and that way the user community of each program can make sure it isn’t nasty.

Francis: Can you give an example of what that would look like to people who aren’t into programming and that sort of thing?

Richard: Well, suppose that there’s a certain program that is inconvenient to use ‘cause the designers made a certain choice about how you would tell it what to do, what its interface would look like. And suppose that it doesn’t cater to people with vision problems. People with vision problems can work together to add a screen reader into that program, or make it talk properly to a screen reader. You don’t have to wait and try to convince the program’s principal developers that it’s worth paying attention to this. You can just ignore them; you can fix it yourself. You personally could fix it yourself if you want to, and this kind of thing has happened, you know. Do you know Krishna Kahnt?

Chris: I do.

Richard: Yeah. Well, Krishna Kahnt is a great example. He came to one of my talks, I suppose it was in Mumbai, and he said, “I’m a blind user of GNU/Linux and I find that the screen reader is not working very well. What should I do?” And I said, “work on improving it,” and he did. And he made important contributions to the free software screen reader that we have.

Chris: They were tremendously important additions, and he works a lot on the project now.

Richard: Yeah. Oh, that’s good news, ‘cause he’s so effective. The point is, if you said that about Windows, it would be useless; it would just be pissing and moaning. It’s unpleasant that the screen reader in Windows is inconvenient in these ways, but what can I do? The answer is, nothing. ‘Cause Microsoft controls it, and it controls you. But when the program you’re complaining about is free software, then you can improve it yourself. Or you, being a group of blind users who don’t know how to program but have money to contribute, could pool your money to hire somebody to make whatever improvements you ask for.

Chris: There is actually a group out of Australia called NV Access, who make a screenreader called NVDA for Windows that’s GPL.

Richard: Yeah, I know. It’s sad that it’s for Windows; I wish they would work on making it run on GNU. So this freedom for the users is what distinguishes free software. It means that we can fix things that are broken, it means we can add improvements, but in addition, it means that we can fix and also deter anything malicious. When a company controls the program and controls what it does, that company faces temptation; the situation gives that company power over the users, which adds up to temptation. And you can’t expect companies nowadays to resist the temptation to exploit people. Exploiting people is what most of them think they’re there for. So they’re going to put in malicious functionalities: like stopping you from doing certain things. Like spying on you. Like the backdoor that allowed Amazon to erase copies of 1984, and at other times to erase other books. And then there’s the censorship control that Apple has, which allows it to prevent people from installing apps unless Apple approves of them. And Apple was exercising this censorship power arbitrarily until 2017, at which point China told Apple, “from now on you’re going to censor for us”—and Apple said, “yes sir.”

Francis: Was jailbreaking iPhones kind of an attempt at some sort of…?

Richard: Yes, the purpose of “jailbreaking” them was to be able to install programs Apple didn’t approve of. But I gather that it’s pretty hard to jailbreak them now, and it’s not done very much. So basically Apple won that battle. We can’t assume that we will always defeat malicious technology because we’re so clever. We have to fight against it on ethical grounds, politically. We have to say, don’t permit computers to censor the installation of applications or other software. Why should we allow such products to be sold?

Francis: I agree 100%. I think most people would agree with that.

Richard: Well, why should we allow programs to spy on people? Why should we allow programs to be designed with DRM so that they restrict what users can do with the data they have? Why should we allow programs to have backdoors? People, though (people other than me, and I hope other than you), have basically granted legitimacy to software developers having power over users. There are many people that complain when they see companies use this power in ways that cause pain, but they mostly don’t say that the companies shouldn’t have power over people in the first place. That’s what the free software movement says.

Chris: When Tim Cook came out with his criticism of Facebook, and came very close to your “user vs. used” statement, all I could think was, you know, Richard Stallman’s been saying this for ten years. Tim Cook is certainly smart enough to have known it ten years earlier, so why wasn’t he saying it then?

Richard: Well, first of all, “smart enough to” is not the operative question when it comes to taking a political position. You need more than just reasoning ability to reach political conclusions, because these conclusions are based on events in the world, and if you don’t see the harm, then you won’t reach the conclusion. So that’s not where my criticism of Tim Cook would be. My criticism is that he presides over malicious software, too. Apple is, I think, the nastiest company in regard to malicious software. Everything about Apple machines is designed to be malicious. In fact, I read that as of September or so, it became impossible to install GNU/Linux on a Mac, that the Macs that came out after that were configured in such a way that they wouldn’t permit it. It would be easy…

Chris: I have Ubuntu installed in a virtual machine on my Macintosh and use it, so…

Richard: Sorry, that’s not the same thing. You’re still running Apple’s non-free software, which I wouldn’t tolerate in a computer of mine. And if Apple has set it up so that you can only run free software by…run a free system in a virtual machine, and can’t make that the native system of the machine, that’s oppression. That’s one of the things that I’d propose should be illegal.

Chris: Apple does ship some free software on Macintosh, I mean Emacs is there…

Richard: Some free software doesn’t matter. The question is, can you live in the free world, can you have freedom? It’s like saying, well, gee, there’s no chain on my left hand, there’s only a chain on each of my feet and on my right hand and on my neck, but on my left hand there’s no chain! Isn’t that great! It’s a small step towards where things should be. You see my point? The goal of the free software movement is not that we have a few programs that are free. No, it’s to do our computing in freedom, which means no non-free software. That was the goal from the very beginning. That’s the goal that I announced in 1983, to develop a sufficient body of free software that we could get along without any software that is not free.

Chris: I’m going to ask you now for an entertaining Richard Stallman story, just because I’ve heard it told to me in about 30 different versions. You were coming home from your birthday party and were told that Symbolics was no longer sharing its software with MIT. Now the version of the story I heard most often is that…

Richard: Let me tell you what actually happened. It’s better than recounting a possibly incorrect story. First of all, I don’t know whether I had a birthday party that day. Most years I have not had one. But it was on my birthday that I was informed that Symbolics had effectively issued an ultimatum to everybody at the AI Lab. But perhaps I should explain the background for this, because most people listening don’t know what Symbolics was or what the dispute was about. Well, a group of us at the MIT Artificial Intelligence Lab had developed the LISP machine. LISP is a very elegant programming language that was often used for Artificial Intelligence, as well as for some other things. So this was a computer designed to run LISP programs faster and do so with full runtime error checking. If anything turned out to be the wrong data type and you tried to operate on it, the machine would show you that there was an error and where, and you could do whatever was useful to do. It wouldn’t go blindly on, treating the data as if it were the other type and do total nonsense. So the LISP machine was a big advance, and we wanted to give other people—other labs, not individuals, really, at the time—the chance to have that. So people started to create a company to manufacture LISP machines. By the way, LISP is short for LISt Processing. Anyway, there was a disagreement, and the group split, and two different companies were started by different parts of the group. The person who had started the project, Greenblatt, wanted to make a hacker company—we all called ourselves hackers, and “hacking” just meant playful cleverness, such as developing the LISP machine. That was hacking. We had a lot of fun being playfully clever, developing that software. So his idea was that his company would not hire away everybody from the AI Lab, and thus not destroy our community, and in other ways be less ruthless than typical companies funded by venture capital.
The other people decided to do it the usual way, with venture capital, and make something that would be ruthless. MIT set up an agreement between the two companies for them to use the LISP machine software system, which entailed, by the way, making that system, which I had helped develop parts of, into non-free software. And that hurt my conscience. What could I do? And things just went along, but basically the agreement was that both of those companies could use and distribute MIT’s LISP machine operating system, and they both had to give their changes back to MIT. But the MIT lawyers forgot to include a clause saying that MIT could redistribute those changes. They sort of took it for granted that that would be permitted. Symbolics was one of the companies, it was the one with the venture capital, the one that was going to be ruthless, and it was. And on my birthday, it said that from then on, it would provide its version of the system to MIT for people at MIT to use, but MIT would not be allowed to put those changes and improvements into the MIT LISP machine operating system. Symbolics’ idea was that since it had hired most of the programmers away from the AI Lab, and had destroyed our community, and the other company, Greenblatt’s company, since it didn’t have all that money, didn’t have programmers working for it at the time, Symbolics thought that it would destroy Greenblatt’s company this way. It thought that everyone at MIT would use the Symbolics version of the system because it would have improvements, and that the MIT version would languish, it would not be maintained, and therefore LISP Machine Incorporated, Greenblatt’s company, would fail. And I interpreted this as an ultimatum that required everybody at MIT to choose a side between the two companies. I didn’t want to choose a side, I had been neutral. I had stayed working at MIT rather than go into either company, because I didn’t want to take a side in that conflict.
Although before, it wasn’t a…burning conflict, it was just a potential conflict. It was definitely contention between the two, and I just wanted to stay at MIT doing what I’d done before. But I couldn’t do that anymore, Symbolics had forced me to take a side. Therefore, I took the side against the attacker, Symbolics. That’s what you do when you’re neutral and you’re attacked by one side. So, I said that I would develop replacements for all of Symbolics’ improvements and put them into MIT’s version of the system, and thus keep it maintained, keep it improving, and also enable LISP Machine Incorporated to have a chance to succeed, stopping Symbolics’ attempt to destroy it. Symbolics didn’t like this, but I did this for almost two years, and that was to punish Symbolics for imposing an ultimatum that it thought would compel me and everyone else at MIT to be a supporter of Symbolics. I don’t like it when anyone tries to force me to support an injustice against someone else.
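[Editor’s note: the full runtime error checking Richard describes, where operating on wrong-typed data signals an error at that point rather than blindly producing nonsense, is similar in spirit to what dynamically typed languages do today. A minimal, purely illustrative sketch in Python (not part of the interview, and `checked_add` is a hypothetical function invented for this example):]

```python
# Illustration of runtime type checking: an operation on wrong-typed
# data signals an error immediately, showing what went wrong and where,
# rather than blindly treating the bits as another type.
def checked_add(a, b):
    # Verify operand types at runtime, in the spirit of the LISP
    # machine's full runtime error checking.
    if not isinstance(a, (int, float)) or not isinstance(b, (int, float)):
        raise TypeError(
            f"cannot add {type(a).__name__} and {type(b).__name__}"
        )
    return a + b

print(checked_add(2, 3))         # prints 5
try:
    checked_add(2, "three")      # wrong type: an error is signaled here
except TypeError as err:
    print("error caught:", err)  # ...and the caller can recover
```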

Chris: So effectively in two years you did all the work of Symbolics’ team, that was heavily funded by venture capitalists?

Richard: Yes. I wrote different but equivalent software to compete with all the people who had left the AI Lab, who used to be part of the same group. I worked terribly hard. But it was my ability to succeed in doing that which showed me that I could entertain the idea of developing a free operating system, which was an even bigger job.

Chris: And that’s what led to Gnu/Linux.

Richard: Well, Gnu is the operating system I developed. In 1992, when Torvalds made Linux free software, it could be used as the kernel for the Gnu system; that filled the last gap in the Gnu system, which was almost complete at the time—complete in the sense of a minimal operating capability, a system you could use for its own continued development. It didn’t have everything you’d want, but it was a system that could be run and developed. And that is the result of my decision that I was going to develop this system no matter what it took, because it had to be developed; it was the only way that people would ever be able to use a computer in freedom. That is, without any software that puts you under the power of somebody else.

Chris: And you’ve grown that community from basically yourself to tens if not hundreds of thousands of people around the world who are hacking on free software every day…

Richard: Yes. Although sad to say, most of them don’t support the free software movement. In 1998, people who disagreed with our philosophy decided to try to co-opt our work and substitute a different philosophy, a business-friendly philosophy, one that didn’t rock the boat. They called that “open source,” and the idea was they would not talk about freedom, they would never talk about it as a matter of right and wrong. “Open source” wasn’t just their substitute name for “free”; it stood for a different idea. It was a different name for a different meaning. Whereas we say it’s wrong if a program doesn’t respect your freedom, they would never say that it’s wrong if a program is closed source. That was exactly what they wanted to avoid even hinting at. Well, that led to a lot of companies, including even IBM, boosting the term “open source” and never raising the issue in terms of right and wrong. And that did lead to more development of programs that are free, but it led to our community mostly forgetting about the issue of freedom, which is the reason this is so important. So, what that means is, our community is much wider now, but it doesn’t have the same deep roots.

Chris: How would you seek to promote the philosophy moving forward?

Richard: Well, we can talk about it, which is what I’m doing now, and we can show that we treat this as a principle. So I won’t have a non-free program on my computer. I don’t say I use free software “whenever possible,” meaning whenever it’s not too hard. No, I say I will not use non-free software. I won’t have it on my computer at all, and if there’s something I can’t do with free software, well, then I’m not going to do it, I guess, until there is free software to do it.

Francis: What about having class action suits against companies to prevent the spying aspects…..?

Richard: I don’t know whether you have a chance of winning. In principle, if you’ve got a chance of winning, I’d say go to it. But I don’t think that there are very good laws against it. The most there is, as far as I know, is that in some cases they are required to ask permission. Now that’s a very weak protection, because the computing industry—the software industry and the digital dis-service industry—has become adept at the manufacture of consent. They can get you to say “yes” with a combination of several techniques. One is writing carefully so that it’s not clear to you just what you’re saying “yes” to; another is putting the consent in something very long that would take you hours to understand, that you couldn’t understand anyway because it’s written in a way that is above your reading level, and that you’re not even going to look at.

Chris: And also typically you’ll find the “agree” button is before the text, so most users out there just hit the “agree” button so they can use the software.

Richard: I’m not sure exactly what you mean by “before”—I don’t know if it’s legally valid, legally binding if you can’t see the agreement before you agree to it…

Chris: They just put the button at the top, with the text below it, so…

Richard: They know they can manipulate people, because people’s consciousness has not been raised about how they’re going to be mistreated. But then the final pressure is, you want to use this dis-service, and you feel it’s very important to you, so even if you saw that they were going to do nasty things to you, you think that your choice is either to accept it or not. ’Cause in the short term, your choice is either use it or don’t. But in the long term, society’s choice is either to have one that respects our freedom and privacy and community, or have one that trashes our freedom and privacy and community. Because if we tolerate the unjust one, it will fill the niche and it will be harder for us to replace it with the just one. So my feeling is, it’s my responsibility as a citizen to reject the unjust one.

Francis: Seems to me that, with other technologies that came before computing, the same type of behavior would have had people up in arms. Like if the phone company was listening to your phone calls and then sending that information to…marketers and stuff, I mean people…

Richard: Well, but that’s happening now.

Francis: It’s happening on phones?

Richard: Well, I know the phone companies are recording who you talk to, and some of them are telling the US government. One thing we learned in 2013—I think we learned this from Snowden—is that AT&T had been telling the US government, I think specifically the FBI, about every long-distance phone call its customers had made since 1988 or so.

Chris: And the government can collect all these phone numbers and know everybody who’s talked to everybody…

Richard: Exactly. And that’s exactly what we can’t allow the state to know about people other than court-designated suspects. That’s the system that we developed to protect people from being tracked in all their activities by the state.

Chris: We have Jim Fruchterman coming on as a guest, and he’s built, at Benetech, a communications system that’s being used by more and more NGOs and nonprofits doing work in countries that…may have governments unfriendly to their work. It goes a step beyond TOR, even, to make their communications private.

Richard: Does it support non-immediate messages, in other words something comparable to email?

Chris: I am not certain, I don’t know the details.

Richard: That would be important. You see, if it supports something comparable to email, something that people could have drop into their mailboxes, then I could use it. But if it required that both people be online at the same time, it’s impossible for me. Totally impossible.

Chris: It’s pretty much designed for human rights organizations to communicate with each other.

Richard: It may be a very good thing, I don’t know enough about it. I’d be interested in seeing a summary of its features, maybe we could use it. Is it all free software? I’m hoping it is.

Chris: Yes, it is. Because of some mistakes they made early on, not everything Benetech does is free software, but they’re replacing everything they have with free software now.

Richard: I’m glad they’ve got on the right path.

Chris: They were doing some things in Windows. If Windows were free software, this wouldn’t have been required, but they had to use real—almost black-hat hacker techniques to get information out of some things in Windows.

Richard: Yes, that’s one of the problems you get when you’re trying to build on top of a non-free program, which is the developers tell you about how to do certain things, and they don’t tell you how to do other things. Non-free programs often have secret functionalities that other products from the same company can use, but your products can’t. Basically any chance they get to manipulate others to their own advantage, they don’t hesitate. Because there’s nothing to check their nastiness anymore. Windows has a universal back door. What this means is that Microsoft can forcibly change the software in a Windows machine in any way it likes. It can put any nasty thing into a Windows system.

Chris: And I’m sure there are many nasty things already there.

Richard: There are. We have a catalog of known malicious functionalities, hundreds of them. It’s at gnu.org/malware. Malware means a program designed to run in a way that hurts its users.

Chris: You also have a list of all of the crimes Facebook committed, don’t you?

Richard: Well, I have a list of reasons to refuse to be used by Facebook, and that’s at stallman.org/facebook.html

Francis: How do they make all that data available? Like, how do they sell it, and who’s buying it?

Richard: It’s complicated and I don’t know all the info. I don’t actually care very much how they use it, the point is, those are details that don’t affect my judgment of anything.

Francis: I’m just wondering if perhaps it could be made illegal to use that kind of information.

Richard: Well, it could be. I think it’s probably just as easy to forbid the things they do to collect the information as it is to limit the harmful things they do with it. And one thing that we’re not likely to see laws to forbid them to do, is give the data to the FBI when the FBI asks for it. And that’s, of course, what the Pat-riot Act requires.

Chris: So if they’re not collecting the data at all, Frank, they can’t sell it, they can’t give it to the government, they can’t do anything. So Richard’s point is, pass laws that keep them from collecting it at all.

Richard: Right.

Francis: Well, I’m just being selfish, ‘cause I’m assuming there’s all this information about everything I’ve googled and everything I’ve bought and everyone I’ve talked to for the last bunch of years…

Richard: Well actually, if Google knows who you are, when you do a search, which it probably does more or less if you haven’t come through TOR using a browser designed to hide your identity…

Chris: I believe Google blocks TOR now…

Richard: Yes, it does. That’s my experience. Although…actually, it doesn’t block TOR, exactly. What happened last time I tried was, it sent me a captcha, but the captcha required running non-free software, and I wouldn’t run that. I don’t know what other nasty things the code of that captcha involves. But we’re changing the subject…the point is, Google, if it knows who you are, keeps track of what searches you do, uses that to profile you, and will give it to the FBI if ordered to.

Francis: I remember there was that time when everybody was worried about lists they were on, and how, like, if there’s some sort of other huge event like 9-11, people would just be herded into stadiums and stuff—people who are considered…potential threats to whatever they

Richard: If anything, we’re closer to that now. I wouldn’t be surprised if the supporters of the bullshitter do a thing like that. I used to call him “the Troll,” but that’s when…the main thing he was doing was saying outrageous things, in other words trolling. But even what he says is worse than just trolling; it’s making up total nonsense that’s different every day from what it was the previous day. I also refer to him as the Bully, the Cheater, the War-lover, the Saboteur-in-Chief—whatever fits whatever act I’m talking about. The point is that they have such contempt for every actual meaning of human rights and the Constitution, that I wouldn’t be surprised at all if they put thousands of Americans into prison for political reasons.

Francis: During a crisis, they could easily pull that off.

Richard: Well, they have lots of cops that are right-wing. And so if they wanted to put leftists in prison, I’m sure they wouldn’t get much resistance from most cops. It’s not as if they would need to be careful about deciding who to imprison, people like that are not tremendously concerned about doing justice, even in terms of their own ideas of right and wrong.

Francis: I’m not sure if right and wrong really enters into their thought process. More like win or lose…

Richard: Well, in a way it does. They certainly do think that various groups of people are bad and are doing things that are wrong. The point is that checking carefully what each individual has or hasn’t done is not part of their thinking about it. They like saying “lock her up”; proving that she did something that is a crime is not something that they think is even pertinent.

Francis: Well, we’re having to re-fight all these old battles. I think eventually where we’re headed now, we’re going to have to start arguing over whether democracy itself is valid.

Richard: Well, we know the answer to that. It’s the worst of all systems of government, except the others that have been tried from time to time.

Chris: So we’re bumping up against our hour—Richard, is there anything that you’d especially like to add?

Richard: For more information about free software and Gnu, look at gnu.org. If you’re interested in philosophical questions, look at gnu.org/philosophy. If you’re interested in free software licenses, what they mean and which ones to use, look at gnu.org/licenses. If you’d like to see why schools have the moral obligation to teach exclusively free software, look at gnu.org/education. If you want to see why governments should move to free software in all their activities, look at gnu.org/government. And if you’re interested in why we should prohibit systems that collect personal data, to make sure they’re designed in a way such that they don’t collect data about who does what and who goes where and who talks with whom, look at gnu.org/philosophy/surveillance-vs-democracy.html; the title of the article is “How Much Surveillance Can Democracy Withstand?” There’s also the Free Software Foundation, whose website is fsf.org. If you look at fsf.org/resources/hw you can find out about machines that can run with free software, and there are other useful resources and campaigns on that site as well. You can also donate or become an associate member of the Free Software Foundation; donations and memberships are our main source of funds. There’s also gnu.org/help, which lists various kinds of volunteer work we’re looking for, and most of them do not involve programming. You don’t need to be a technical person to do them.

Chris: And that’s something that I constantly tell people I know who use Gnu software—that they can help out in one way or another—whenever I talk to somebody who says “oh, Gnu—I use Gnu paint,” or whatever Gnu program they use.

Richard: Thank you for doing that. And Happy Hacking!

Chris: Thank you so much for coming on Making Better, and we hope you’ll come back maybe in a year, give us an update.

Richard: Sure.

—[end]
