 
# Computing Without Compromise: Love Letters to Open Source

  1.  1 Preface 
  2.  2 KDE4 Memoir 
    1.  2.1 Pros and Cons 
    2.  2.2 KDE 4 Release 
    3.  2.3 The Future 
    4.  2.4 The Here and Now (Then) 
      1.  2.4.1 No Desktop 
    5.  2.5 Qt 
    6.  2.6 Lunch 
    7.  2.7 Coverage 
    8.  2.8 Fallout 
    9.  2.9 Half Life 
  3.  3 How I Love Linux 
    1.  3.1 Smart applications. 
    2.  3.2 Customization. 
    3.  3.3 Efficiency. 
    4.  3.4 Configurability. 
    5.  3.5 Control. 
    6.  3.6 Unlimited Everything. 
    7.  3.7 Network Transparency. 
    8.  3.8 Independence. 
    9.  3.9 Portability. 
    10.  3.10 Constructive. 
    11.  3.11 Educational. 
  4.  4 Environment Variables 
    1.  4.1 Environment Variables Explained to an 8 Year Old 
      1.  4.1.1 Try It Yourself 
  5.  5 My Box 
    1.  5.1 Historical Significance 
    2.  5.2 Gamepad 
  6.  6 Why Software Needs to be Open 
    1.  6.1 What is "Open Source" and How is it "Free"? 
    2.  6.2 Every Kilobyte is Sacred 
    3.  6.3 Art and Sharing 
    4.  6.4 Lingua Franca 
    5.  6.5 Capitalism is Not the Answer 
    6.  6.6 Code is Cheap. Support, On the Other Hand... 
    7.  6.7 People Understand Money 
    8.  6.8 De-Centralization 
    9.  6.9 Diversity 
    10.  6.10 Open 
  7.  7 @font-face 
    1.  7.1 A Note About Fonts and the Web 
  8.  8 Corporate Recovery 
    1.  8.1 Hardware Problem 
    2.  8.2 Software Problem 
    3.  8.3 The Price of Quitting 
    4.  8.4 The Solution 
  9.  9 Type Special Characters 
    1.  9.1 Set Up a Compose Key 
      1.  9.1.1 KDE 
      2.  9.1.2 Gnome 
      3.  9.1.3 Other Desktops 
    2.  9.2 Using the Compose Key 
  10.  10 Be Dirt Poor and Happy 
  11.  11 Experience 
  12.  12 User Betrayal 
    1.  12.1 Is it Worth it? 
    2.  12.2 The Open Specification Alternative 
    3.  12.3 The Open Source Alternative 
    4.  12.4 Open Source Guarantee 
  13.  13 On Changing the Channel 
  14.  14 Use dd 
    1.  14.1 Imaging 
    2.  14.2 Restoring 
    3.  14.3 Compression 
  15.  15 What is "Intuitive Design"? 
    1.  15.1 Expectation 
    2.  15.2 Exploration 
    3.  15.3 Context 
    4.  15.4 Now try it with Plain Text 
    5.  15.5 Intuitive Design 
  16.  16 State of Independence 
    1.  16.1 Independence. 
    2.  16.2 Control. 
    3.  16.3 Ecologically Responsible. 
  17.  17 So You Want to be a Programmer 
    1.  17.1 Systems 
    2.  17.2 Parsing 
    3.  17.3 Logic 
    4.  17.4 Maths 
    5.  17.5 Collaborate 
    6.  17.6 Computers 
    7.  17.7 Learning 
    8.  17.8 Programme 
  18.  18 Abstraction the File Chooser 
    1.  18.1 Proposed Idea for a Solution 
  19.  19 Howto Write a Howto 
    1.  19.1 Proof of Concept EARLY 
    2.  19.2 Separate Instructions from Explanations 
    3.  19.3 Tell the user Why they are doing something 
    4.  19.4 People are there for the Information 
    5.  19.5 Update the article 
      1.  19.5.1 Comments are not Edits 
    6.  19.6 Provide a Definitive End 
  20.  20 The AT&T Guide to UNIX 
  21.  21 Float vs Inline-Block 
    1.  21.1 What Does Float Do? 
    2.  21.2 display: inline-block; 
  22.  22 Petty Computing Issues 
    1.  22.1 Moral 
    2.  22.2 But Wait, There's More 
    3.  22.3 Solution 
  23.  23 Race to Better Security 
    1.  23.1 Bottom line 
  24.  24 Security and Upper Management 
  25.  25 Cross Compile 
    1.  25.1 Setting Up Your Dev Environment 
    2.  25.2 Caveats 
  26.  26 GNU Linker 
    1.  26.1 Do it Right 
      1.  26.1.1 FLTK Standard Build 
    2.  26.2 SFGUI with -rpath 
      1.  26.2.1 LD_LIBRARY_PATH Example 
  27.  27 Override Runtime Libraries with Env 
    1.  27.1 The Problem 
    2.  27.2 The Solution 
    3.  27.3 [Non] Caveats 
  28.  28 Just a Mechanic 
    1.  28.1 Do as we Do, And as we Say 
    2.  28.2 That's MISTER Computer-Repair-Guy to You 
  29.  29 Brand 
  30.  30 LAMP 
    1.  30.1 Installing LAMP Components 
  31.  31 Affordable Computing 
  32.  32 Apologies 
    1.  32.1 Knowledge as a Threat 
    2.  32.2 Not Suitable for Normal Users 
    3.  32.3 A New Normal 
    4.  32.4 Certification 
    5.  32.5 Choose Independence 
  33.  33 Rsync and Rsync Daemon 
    1.  33.1 How to use rsync 
    2.  33.2 Automate rsync 
      1.  33.2.1 Exceptions to the Rule 
    3.  33.3 Rsync Server 
    4.  33.4 Rsyncing over the Network 
    5.  33.5 What rsync alone does not do 
  34.  34 Destructive Habits in Approaching Open Source 
    1.  34.1 The Real Thing 
    2.  34.2 So, what is Open Source all about? 
  35.  35 Using 'su' and 'su -' 
  36.  36 Free Gaming 
    1.  36.1 Games as Culture 
    2.  36.2 Open for Culture 
    3.  36.3 Books and Movies and Music 
  37.  37 GNU tar 
    1.  37.1 creating tarballs 
    2.  37.2 Add a file or dir to an existing tarball 
    3.  37.3 View a list of files within a tarball 
    4.  37.4 Extract just one file 
    5.  37.5 Extract a directory 
    6.  37.6 Extract multiple directories 
    7.  37.7 Extract files using regex 
    8.  37.8 Extract a tarball 
    9.  37.9 Extract a tarball to another directory 
  38.  38 Marketing Exclusivity 
  39.  39 Inclusive Technology 
  40.  40 Setup VNC on Linux 
    1.  40.1 Platform Notes 
    2.  40.2 Start the VNC Server 
    3.  40.3 Make the Connection 
      1.  40.3.1 Having Trouble Connecting? 
  41.  41 What is a "Hack"? 
  42.  42 The Promise of Technology 
  43.  43 Makers 
    1.  43.1 The Maker Burden of Proof 
    2.  43.2 The [Actual] Maker's Burden of Proof 
  44.  44 Life Lessons for New Sys Admins 
    1.  44.1 Linux Linux Linux 
    2.  44.2 Turnkey 
    3.  44.3 Not Your Repairman 
    4.  44.4 Illegal Software is Illegal. 
    5.  44.5 You are Your Own Wing Man 
    6.  44.6 Contribute Back 
  45.  45 The Trouble with Lifejackets 
    1.  45.1 Preamble 
    2.  45.2 Insurance 
    3.  45.3 Personal Detachment 
    4.  45.4 Maintenance Mode 
    5.  45.5 Get It 
  46.  46 Learning 
    1.  46.1 Knowledge Over Skills 
    2.  46.2 Lingua Digitum 
    3.  46.3 Open Source 
    4.  46.4 But Wait! There's More! 
  47.  47 Ebook Formats 
    1.  47.1 Ebook or Information? 
    2.  47.2 The Bad 
    3.  47.3 The Ugly 
      1.  47.3.1 PDF 
    4.  47.4 The Good 
      1.  47.4.1 Epub 
      2.  47.4.2 cbz 
    5.  47.5 But Wait There's More 
  48.  48 Pay or Don't Pay 
    1.  48.1 They Fear Your Silence 
    2.  48.2 Capitalism in Action 
    3.  48.3 Money 
    4.  48.4 Honesty 
  49.  49 Ownership 
  50.  50 Late Blooming Geek 
    1.  50.1 Approved Method of Acquisition 
    2.  50.2 A Brush with the Alternative 
    3.  50.3 Man Behind the Curtain 
    4.  50.4 Unix 
    5.  50.5 Hardware Hacking 
    6.  50.6 GNU's Not Unix 
    7.  50.7 Linux 
    8.  50.8 Open Source 
  51.  51 Colophon 
  52.  52 License 
    1.  52.1 Creative Commons Attribution-ShareAlike 4.0 International Public License

# 1 Preface

I like books.

They're entertaining and informative and, especially if you grew up with them as a part of your life, downright comforting.

And _comfortable_. There's just something about sitting down with a book, in a comfortable chair (or the floor, if you prefer), and getting lost in the pages.

The book, as a format, is a medium that I enjoy.

And don't get me wrong: I'm actually not one of those purists who believes books must be on actual paper. I'm quite happy with an e-book. What I am saying is that what I enjoy over other formats is the sense of focus you get when you declare to yourself and others, "Don't bother me, I'm going to go read a book; I'll be back in a few weeks, although it'll only seem like a few hours to you."

It's like time travel, or going off to Narnia. It's a one-on-one, intensely private occurrence: the whole of your thoughts and mental energy and a book.

You see, I also enjoy computers, and technology, and other topics that are often called "geeky" or "nerdy".

So, what do you do if you like both technology and the decidedly non-technical format of books (besides reading an e-book to rankle your book-purist friends)?

That was the question that faced me over the past year, after I'd read all the sci fi paperbacks I could possibly find at my local op shop. I wanted to relax with a good book, but I wanted the book to be about something that I enjoyed.

Because I do enjoy Unix, and Linux, and technology. I like thinking about those topics. I talk about them whenever I can. I dwell on them because they are topics that, to no greater surprise than my own, affected and changed my life.

But nobody writes books about technology any more; technology, as everyone knows, moves too fast to be committed to woefully stagnant written words. We've all moved on; we have dynamic and responsive websites now, we have pages and pages of comment systems, and social networks, and online courseware, and virtual reality. Why would we need books?

Of course, there are tech books out there, if you look. But I think you'll find the bulk of what's out there is, well, _about_ something. Even my own tech books, Slackermedia and Programming Book, pride themselves on being task-oriented. You don't read them for fun, you read them to learn something. They are instruction manuals, user guides. And usually that's a good thing.

It occurred to me, though, that sometimes I just wanted to read about nerdy topics without necessarily looking to learn a specific skill, but also without going all the way over into the realm of speculative fiction.

To that end, I found a great book, The Charm of Linux by Hazel Russman, and a few geeky books on Smashwords, and a few WikiBooks, and some stuff online, but all in all I felt there needed to be more.

The book you are about to read is not about any specific topic. It's not going to teach you much, if anything. It's a collection of musings, thoughts, ideas, notions, reminders, and love letters to and about Unix, GNU, Linux, computing, technology, and all things geeky.

So thanks for picking up this book, and please do enjoy it. It's meant for that: enjoyment. It's not a lesson, it's not a lecture; it's just a fun discussion, usually positive but sometimes maybe a bit cranky; it's a chat with a friend over a cuppa.

So sit back, relax, and read.

# 2 KDE4 Memoir

I am slowly coming to realise that a historical perspective on small things, from a single individual, is actually kind of important. If people don't write little notes about what they witnessed "that one time" when "that thing happened", then memories get muddled, achievements get lost, and newcomers are unable to at least attempt to understand why something is important, and exciting. So I want to make a quick memoir about KDE 4.x, because although I'm a relative newcomer to Linux (didn't start using it until 2006 or so), I was very much on the scene for the birth of KDE 4.x.

When I started with Linux, I had the usual murky understanding of how a desktop was different from the operating system, and with all the different desktops to choose from and familiarity with _none_ of them, my early KDE experiences are pretty vague. I remember not loving KDE, because it did look a little too much like what I'd seen of Windows, and it had lots of tooltips. But I did use it, especially on Slax and Slackware.

Not too long after I had switched to Linux full time, I heard that KDE version 4 was scheduled to be released soon. Apparently it had been scheduled for quite a long time; I remember an early KDE 4 preview bundled in with a **Linux Format** magazine; it was absolutely nothing like the modern KDE 4 but I guess the framework itself must have been there.

## 2.1 Pros and Cons

At the time, the main complaints about KDE in general seemed to be:

  * Too complex: too many buttons, too many contextual menus, too many ways to customise the desktop. Users get lost in all the configurability.

  * Application names too often start with the letter "K", or use "K" in place of a hard-consonant.

  * Needs a make-over.

Praises for it often were exactly the opposite of the complaints. Traditionalist users loved its look, loved all the options, and loved the branding.

In addition to that, though, a big deal was made about its own internal integration; all "k apps" did things basically the same way as one another: there were frequently tabs down the left side of the window, or panels; there were similar keyboard shortcuts, a similar look and feel; all widgets worked the same; all file chooser dialogue boxes were the same; the same notification system got used by everything; and so on. If you ran a computer using primarily a KDE desktop and K applications, then it all felt very unified, very much like, well, an "operating system" in the way that most users think of an operating system.

It was, in a way, the "KDE OS". Sure, you could add applications outside of the KDE group, but those would feel like the "open source" add-ons that you might install on top of any other OS you buy from a store; they work great, they just happen to have some unique conventions, some buttons that look a little different, and so on.

This, by the way, is still very much the feel that you get when using KDE. It is very unified, both in the way it works and in its developer community. If I was going to go into a store and buy a pre-configured Linux PC, I would buy one that was branded KDE, and then add the applications that I need in addition. Come to think of it, the OS that I use (Slackware) _is_ a KDE OS in a sense, so I guess I already do that.

## 2.2 KDE 4 Release

Anyway, when KDE 4 was announced as imminent, there was a lot of excitement, presumably because people had been waiting for it for a long time, but also because everyone beta testing it was posting screenshots and it looked, basically, amazing.

I was on a few KDE mailing lists at the time (or maybe I just frequented their forums, I don't recall exactly), and an open invite was issued to anyone who wanted to attend the special KDE Event where KDE 4 would be unveiled. The event was free, but you had to get yourself there and pay for lodging and food (aside from a lunch that was provided at the event).

I had recently started podcasting, and I had also recently moved to California for a job, so I figured it might be a good idea to attend. I signed up to go, rented a car, and drove an hour or two over to Mountain View, to Google.

The event was not huge in attendance, but it wasn't small either; I mean, it wasn't held in an auditorium or anything, but it was in a large meeting space with several rows of chairs set up and a podium at the front. The crowd was pretty large, and I have never really been one for mingling or socialising, so I kept mostly to myself.

The event started with, I think, Aaron Seigo introducing KDE 4, including its new components (Plasma, Frameworks, Solid, Phonon, Nepomuk), as well as its philosophy of integration, unified design, and task-awareness.

## 2.3 The Future

The theories of integration and unified design I have already mentioned. The idea of KDE as a task-aware environment was something that sounded really exciting, and still, more or less, does, but it appears to remain out of reach. The idea was that KDE would know when you were at work and when you were at home, so when you went to work, your laptop would log on to the correct wi-fi network, and present you with your "work" activity set. When you were at home, your laptop would switch over to your home network, and present your home activity.

An "activity" was, as I understood it at the time, an instance of your desktop specifically geared toward some set of associated tasks. In theory, I guess, you could create an activity for audio editing. You could switch to that activity and find your DAW running, some synths, and so on. When you're finished with that, you could switch to your banking activity and find a web browser open to your bank website, your budget spreadsheet, and a calculator. And so on.

The implementation of this continues to completely elude possibly all KDE users on the planet.

Another thing talked about a lot was desktop indexing. Nepomuk was going to index all the files on your computer for you so that you could find your files quickly and easily.

It was a really big deal at the time (and maybe still is as I write this, I'm not sure) that you could wield meta data in exciting new ways. Meta data: data about data!! It's so cool, right?

Right??

Well, I didn't think it was that exciting, personally, but at the same time, I could see, and I can still see, why having these abilities was important in order to progress toward greater things later on. I don't think all of these fancy "the computer knows all!" technologies are useful in real life right now, but it is useful for big science, and it's a good thing to be able to say "oh, yeah, we know how to do that; that's already written into the framework" when, in the future, we actually do have a need for it.

In other words, KDE as a project was looking ahead.

Far ahead.

## 2.4 The Here and Now (Then)

More importantly, of course, KDE had changed everything. I mean, the whole desktop had been overhauled, and it was beautiful and exciting. It was like we were getting tomorrow's tech today. The desktop menus were smoky-black, the kicker was glossy, there were plasmoids that floated on the desktop to provide quick access to commonly used tools.

There was a brand new, simplified file manager called Dolphin.

There was a new multimedia backend with the express purpose of taking whatever media you threw at it and finding something to play it for you. There was a hardware layer that abstracted all manner of hardware away from you; like the multimedia backend, this layer took hardware and used whatever was necessary to make it visible to you.

### 2.4.1 No Desktop

Another cool thing was that the desktop itself did not allow icons to be placed onto it. I used to joke to people that I judged how savvy a computer user was by how organised or disorganised their desktop was (only, I wasn't joking). In fact I illustrated an example years ago, based on a real computer that someone brought to me for repair (don't worry, their privacy was retained).

So for KDE to not even bother with implementing icons on the desktop was a brilliant move toward consistency in the paradigm.

 **Think about it.**

You have a computer, and in the computer you have files. Where are the files kept? In a home directory. That home directory is available through a file manager. It makes sense so far. You can even liken it to the real world: you have a file cabinet in your office, and you can access those files by opening the cabinet and opening up the folders inside.

But with the desktop thrown in, you suddenly have a file manager containing all your files, but one of those folders is scattered across your screen. Why are those files _there_ when all the other folders and files are inside of the file manager? And how can just _those_ files be _both_ inside the file manager and scattered all over my desktop? The real-world equivalent makes just as much sense: your desk is a magical realm where you can access the folders and files of one particular folder inside your file cabinet, and even though those files are on your desktop, they are also inside your filing cabinet, along with Santa Claus and the Easter Bunny.

And what's the point of the desktop, anyway? You have all of these things scattered across your workspace, and the moment you open an application to get some work done, it's all covered up by your application.

And why are we encouraging the desktop-as-a-work-directory, since moving any file there necessarily moves it out of its natural organisational structure? No wonder people need desktop indexing; UI designers for decades have been training them to throw everything on the desktop and let God sort it out later.

Getting rid of the desktop as a _place_ makes good sense, and I was shocked that no other desktop thought to do it sooner. Sure, Fluxbox, which was the other system GUI that I used on a regular basis, and some others had done this for years, but no other desktop environment had ever thought to take such a bold and definitive step.

And on a purely visual level, the desktop and applications just looked fresh and new, clean and simple. It was like all the dust and detritus had been cleared away for a fresh start. It felt clean.

And it felt like this KDE was, literally, from the future. Sure, some of the things they were talking about were well-known buzzwordy concerns _du jour_ but a lot of what they were aiming for was utterly unique. They wanted your computer to be in sync with your life. They wanted your computer to be working _for you_.

## 2.5 Qt

After Aaron Seigo, as I recall, some of the KDE teams reported in and gave status updates on where they were in becoming fully KDE 4 compliant. Amarok pledged its allegiance, and the porting teams gave updates on how they were doing in their efforts to make KDE work on non-Linux platforms. All that was good, but the other heavyweight, aside from Aaron, was the co-founder and current manager of the Qt framework. Unfortunately, I don't remember which of the two founders it was.

He spoke about the history of Qt, basing his talk around one of the default wallpapers from OS X, of all things. Specifically, it was a clownfish among some green bulbous plant life. There was some obscure fact about clownfish, as I recall, that he equated to Qt libraries; I don't remember much of his talk, except some of the historical stuff (like how Qt got started and what it offers), but he eventually got round to revealing that Qt, from that point on, would be moving to a GPLv3 license.

That got a standing ovation.

And that moment, when we all stood and applauded for the further opening of a great, die-hard cross-platform and unifying open source project, captured the overall sense of that event, at least for me. We were there to celebrate a technology that was based upon and extends the ideas that all platforms should be treated equally, that they should all run the same code, and that they should serve the user.

It's sad that in 2008, that's what we felt we had to fight hard to promote. It certainly feels like that goal should have been conquered by 2008, and it certainly feels that way now, but sadly it's still a battle technologists are fighting to win.

## 2.6 Lunch

A lunch was provided, which not only gave me a chance to eat for free, but it also forced me to mingle with other attendees, at least a little. I wish I'd done more of it, because I later found out that several people I more-or-less knew over the internet were also in attendance; notably, I know that Patrick Volkerding of Slackware was there, as was Chess Griffin from the Linux Reality podcast.

The people I did happen to meet were either journalists or programmers. The former group mostly ignored me either because they were also shy or because they felt like I wasn't a real journalist (and I wasn't), and the latter group turned out to be stereotypically boring to talk to. But what did come across for me was the impact that KDE and Qt actually had on the real world. Sure, at home nobody knew what "Linux" was, much less what "Kay Dee Eee" meant, but here, in this world, it was a major force in people's lives. Qt, which I'd vaguely heard of prior to this event, turned out to be a very important component in commercial software, from Skype to everything Autodesk released. It employed people who, in fact, seemed to have no real interest in anything technology-related aside from the knowledge that being able to write code happened to pay their bills. KDE was in use by companies that weren't even aware that there were users out there using it _for fun_ , because _we wanted to_.

In short, this Linux and Open Source thing was a lot more diverse and multi-faceted than I had realised.

## 2.7 Coverage

I documented being at the event in a podcast of mine. My show had barely 30 episodes at the time, 19 of which had been dedicated not to Linux at all, but to how to use Apple computers as "Unix" (quotes because, regardless of what they tell you, Apple Unix just isn't like the real thing) machines. But this episode was exciting for me. I'd finally attended a public tech event, I had gotten an incredible scoop (well, I thought it was a scoop, forgetting that journalists from major publications were there) about Qt going GPLv3, and I'd seen a desktop that blew my mind with its beauty, functionality, its goals, its ideas, and the passion of its developers.

## 2.8 Fallout

After KDE was released, a few things happened. I can sum it up with two real-world examples:

  1. Slackware's next version did not ship with the KDE 4 series desktop, retaining KDE 3 for its maturity and stability.
  2. Everyone else (that's an exaggeration) shipped KDE 4.

In software development, there are three distinct stages of life:

  1.  **Alpha**, during which only developers get to see and use the product;
  2.  **Beta**, in which quality assurance monkeys test the product, and developers fix any problems found (or kill the monkeys to keep them silent);
  3.  **Release**, in which the product is sent into the wild.

In open source, these stages are left up to the discretion of the user. After all, the source is _open_ , so you can go get it, build it, and use it, at any time. But if you are a user who values a computer that works reliably, then you are supposed to refrain from doing that. If you are a user who just wants to be a beta tester and help out programmers, then you should go for it.

In theory, a software "distribution" can help users understand the difference between something that is in alpha or beta and something that has been released.

I feel that in open source, to this day, these lines are blurred by everyone involved. And frankly, it annoys me. Every one of us needs to do better:

  * Programmers need to label their code. If you don't want people stumbling on it and trying to use it as something reliable, then make that known.
  * On the other hand, if your code is at a reasonable stage and should be used as a finished product, then you should tell people that, too. Set expectations, let people know just how buggy they should expect this to be; it will help them know what to bug you about, and it should keep out most of the riff raff who would otherwise report things like "does not have an install script" or "does not compile without debug flags on" or whatever.
  * Distributions need to protect their users from alpha and beta software. People are using a distribution because they don't know how to assemble the stuff on their own, so you can and should assume that your users expect things to work. People expect that what is on the label is what they are getting, so don't lie to them.
  * Is your audience specifically _not_ the unwashed masses? Is the point of your distribution to ship alpha and beta software? Then you should make that clear! Don't pass yourself off as an OS for computer users when you're an OS for computer programmers.
  * Users need to learn to tell the difference between something that is in a "release" stage of life and something that is not. Unfortunately, this is often difficult since programmers and distributions tend to muddle it up so badly, but here's a hint: the latest release of any software is probably _not_ the most stable. If you don't want to deal with bugs, then you do _not_ want to be an early adopter.

I say all of this because I feel like of all the open source releases that I have witnessed, the KDE 4 release exemplifies everything that could possibly go wrong. The event was great, and it was clear there that KDE 4 was not yet complete. It was still a work in progress. I know this because I said as much in the episode of my podcast that I made the evening of the event.

KDE was very clear that KDE 3 was still the stable branch, it was still the desktop to use, and it would continue to be supported for years to come.

What everyone in the world apparently heard was: "start using KDE 4 right now". I feel like the best defense users have over blind media hype are distributions, and I have to say that I feel very strongly that several major distributions utterly failed their users when KDE 4 hit. It got picked up by distributions _way_ too early, and it stunned the users in how drastically simplified the desktop was, how all the cool features they'd come to love were now missing, how different everything was. People hated it.

And I'd have hated it, too, if I'd jumped into it not understanding that this was not a finished product. How unfinished was it? Oh, it was unfinished. The system tray crashed on a regular basis; basically, any change to an applet in the system tray caused the tray to crash in some small way, so if you plugged in a USB thumbdrive, your system tray would end up misaligned with the rest of the kicker. I know this, because I wrote a blog post with a work-around (make the system tray a hovering widget). The Activities function (the one that would help you move seamlessly from work to home) was a joke. The kicker panel could not be resized, and it was quite large. And so on, and so on.

The KDE devs tried to do damage control, telling everyone not to try KDE until the 4.2 release, but it was mostly too late. Distributions had picked up KDE 4, and they were forcing normal users to be beta testers.

This isn't my memory playing tricks on me; the breakdown in communication was severe. I still blame a few select distributions, including one that had positioned itself as a sort of Linux for "normal" people and yet consistently seemed to ship beta software, but I think a lot of people blamed KDE. The schism got bad enough for a serious fork to happen; almost 8 years later, the [Trinity Desktop Environment](https://www.trinitydesktop.org/), which took KDE 3 and ported it to Qt 4 with almost no UI change, is still going strong.

I, myself, was intentionally using KDE4 at the time, on Fedora, because I wanted to help beta test. So I got to see it go through all of its growing pains, but I didn't mind, because I'd signed up for it. When Slackware got KDE4 by default, I switched back to Slack as my main OS, and it's been smooth sailing ever since.

## 2.9 Half Life

As I write this, we are on the cusp of KDE5. KDE devs swore that the transition from KDE 4 to 5 was going to be easy and gentle. The desktop was not being re-invented, just updated. Initial reports from beta testers (appropriately using KDE in distributions that advertise themselves as cutting-edge) seem to mostly support this, although some functions missing from Qt bleed over into missing functions in KDE. Most importantly, the rampant confusion of why something totally different and new is being forced upon users does not appear to exist. Some distributions still, in my opinion, have a severe identity crisis, but that's got nothing to do with KDE.

So what ever came of KDE 4?

Well, some things about KDE flew and others sank, and still others ended up taking a different form.

The Trinity Desktop Environment fork notwithstanding, the new KDE interface became an accepted way of working. It got all of its config options back, eventually, and KDE is now as easily customised as ever. Few of my KDE desktops ever look the same (by choice!) and they all work exactly the way I want them to work. I would be lying if I said that KDE left me wanting for _anything_. It is exactly what I want, no matter what I happen to want on any given day.

Plasmoids (desktop widgets) turned out to be, as I figured they would, just a trendy myth of the time; there weren't actually must-have tools that should hide in plain view on your desktop so that you could spend five clicks re-positioning all of your windows to get at a tool, instead of the two clicks it takes to go to a menu and start the application directly. No one wanted them, no one used them, and no one developed truly useful ones. I say this broadly and unfairly; plasmoids are fine, the concept is benign, they can be useful for some things, but they were overrated.

Nepomuk was the bane of the typical KDE user for years, and caused some contention. I was among the users who deactivated it with every new release after giving it a fighting chance for a month or two. Nepomuk did not work, simple as that. I don't know why, because I hardly know anything about its code or methodology, but I do know that it did not work.

But I also do not know what was learned from Nepomuk. Either way, it was eventually replaced by Baloo, which works marvelously. Maybe Nepomuk was a stepping stone, or maybe it was the basis for a better version of itself. Should it have been included with KDE upon every release? I'm not sure. Was it painful? Yes, but mostly just socially. I mean, you could turn it off and forget it existed, so who really cares?

The mysterious metadata thing turned out not to be a myth, but it was mischaracterised. People didn't want to take vacation photos in Fiji, then go home and search their geotags (metadata recording the location where each photo was taken) for all the photos from Fiji, open up a 3D model of the Earth, and put little cartoon pins linking to their vacation photos in all the places they had been.

But it turns out that big science loves subtle correlation of obvious data, and this kind of meta data geekery has thrived in that space.

So KDE 4, and everything around KDE 4, was a success in a lot of different ways.

The release of KDE4 captured a zeitgeist of excitement over open source, and it continues to be a shining example of open source being progressive rather than playing catch-up. KDE is a great desktop and a great user experience: it's integrated, it's intuitive design driving intuitive learning, it's advanced but simple, it's beautiful but geeky. It tried some new ideas, it resurrected old ones, and darn it, it inspired at least one young man who wanted to become a part of a grass-roots effort to combine independent art and independent technology.

# 3 How I Love Linux

I hate it when someone asks me "Why do you like Linux?" and all I can think to say is "I dunno, just do," because there are some very good reasons that I really do passionately love Linux, GNU, UNIX, and free/open source software. These are all, of course, distinct things, but since they all party together, I will take them as a whole.

But the problem with the question " _why_ do you like Linux" is that it often is countered (even when I hadn't realised there was a debate) with reasons that whatever I have just listed is not unique to Linux.

It's a weird argument to have at all, because pretty much any computer can do pretty much anything. I mean, they are computers. It is not at all surprising that they all do a lot of similar things.

So instead of listing _why_ I love Linux and most everything surrounding it, I am going to list _how_ I love it. You will probably find shining examples of a lot of these things that also occur in another OS or application set. That's OK. I am just expressing to you the little things in daily use of open source that keep me coming back time and time again, and loving every minute of it:

## 3.1 Smart applications.

There are exceptions, but so many open source applications are just so darned pragmatic. It feels like someone designed them to be used, not to just be looked at, or just for a quick demo at a tech conference. These are applications meant for _daily_ use.

Take Gwenview, for instance; when you open it, it doesn't just open to a blank screen, it has a little file navigator right there in its main window, so you can go, in app, to the folder you want to view. And if you do open Gwenview to look at just one photo and then decide you really want to see all other photos in a directory, no worries; Gwenview has anticipated that and provides access to the rest of the directory (or filesystem for that matter). It's not like many other image viewers I've seen, many of which are dumb, interactionless apps.

That's just one small example, and I could go on and on. But giving specific examples is difficult because for every feature I mention here, a non-Linux app could be found with the same feature. But it's not the specific features that impress me; it's the tendency to make applications actually usable and useful, not just pretty to look at, and certainly not only if you agree to use it in one specific way.

## 3.2 Customization.

Not everyone uses their computers in the same way. In fact, I myself don't even use each of my computers the same way; my work desktop and my personal laptop both run Slackware with a KDE desktop, but you wouldn't necessarily know that by looking at them, because they are completely unique. The nice thing is that on my old OS, I used to spend hours trying to customize the way I interacted with the computer and never really did succeed beyond skinning and theming. On Linux, I spend half that amount of time and end up with a UX designed by me, perfectly suited to my individual way of using that machine.

## 3.3 Efficiency.

Simple things like **emacs ~/docs/foo.txt** , which opens emacs, establishes a new document called **foo.txt** , and places that document in **~/docs** just like that, in an instant.

Compare this to finding some substandard GUI text editor, starting to type, then going to save it, navigating through a **Save Dialog** so you can save the document where you want it, naming it, clicking save, and then fighting the application over whether you REALLY want it to be saved as plain text or not...and so on. It reads like one of those bad infomercials, but it's true, especially when you have to go through those steps multiple times a day.

Add customized commands and shell scripts into the mix, and it's just embarrassing. Simple examples would be resizing images and id3 tagging or organizing music or bulk renaming files. Those kinds of tasks used to be full days of work for me, even with some of the rudimentary scripting tools that my old commercial OS and applications provided.

On Linux, these are such trivial tasks that they aren't even on my radar any more. From full day of work to routine tasks that don't even make it onto my TODO list. That's huge for me.
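To give a concrete sense of what those routine tasks look like, here is a minimal bulk-rename sketch in plain bash. The scratch directory and file names are invented purely for the demonstration:

```shell
# A minimal bulk-rename sketch: change every .jpeg extension to .jpg.
# The directory and file names are made up just for this demo.
rm -rf /tmp/rename-demo && mkdir -p /tmp/rename-demo
cd /tmp/rename-demo
touch holiday.jpeg cat.jpeg
for f in *.jpeg; do
    mv -- "$f" "${f%.jpeg}.jpg"   # strip the old suffix, add the new one
done
ls
```

The same loop pattern handles image resizing or id3 tagging just by swapping **mv** for the relevant tool.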

## 3.4 Configurability.

I build and work on computers often, whether it's for the random friend who is curious to try Linux, or a new file server or dev machine at work, or a personal server, or a multimedia workstation for a client, or repairing a spare laptop that I've rescued from a dumpster. I used to do the same thing with the commercial OS I used to use, and for each computer, I'd have to change all the settings (as much as they permitted, anyway) and load all the custom applications I used, create users, open and close ports, and so on.

It was an all-day event to get the thing where I wanted it to be, and there was A LOT of clicking and entering of serial numbers and agreeing to licenses and trying to remember which setting I'd forgotten to set.

And then. Suddenly. I found Linux.

I cannot think of anything on my Linux systems that cannot be set with a plain text configuration file, nor anything that cannot be installed with a shell script. Setting up computers now is simply a matter of copying over a folder of configuration files, and I'm done. If it's a specialized build, then it may still be an all-day event (depending on how much I personally choose to install from source, etc) but it's an automated process; while the computer sets itself up, I'm off doing other things.
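As a toy illustration of that idea (the paths and file contents here are invented, not a real recipe), setting up a machine can amount to copying a folder of plain text configuration files into place:

```shell
# A toy sketch of config-file-driven setup: copy a folder of dotfiles
# into a stand-in home directory. All paths and contents are invented.
rm -rf /tmp/setup-demo
mkdir -p /tmp/setup-demo/dotfiles /tmp/setup-demo/home
printf 'EDITOR=emacs\nexport EDITOR\n' > /tmp/setup-demo/dotfiles/.profile
cp -r /tmp/setup-demo/dotfiles/. /tmp/setup-demo/home/
ls -A /tmp/setup-demo/home
```

On a real system the target would be your actual home directory, and the same approach extends to a scripted list of packages to install.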

## 3.5 Control.

When I experienced the sheer power and efficiency of controlling everything on my computer via a UNIX shell, mine eyes were opened and my life changed.

The sad thing is, I guess, that when you first realize you have this level of control, you don't yet know how to wield it, so a lot of its significance falls flat until you learn how to use it. But if you look hard, you'll see the potential and you will be very excited, because probably you were _looking_ for that level of control all along, you just didn't know how to ask for it.

And you're not wrong; the level of control over your system is unequaled, and once you learn what to do with your newfound power, you'll never see computers in the same way.

## 3.6 Unlimited Everything.

On one server platform I heard about, the threads of an SQL database are limited unless you pay more money. I am not making this up; you can ask an admin of a [non-free] database.

On a desktop platform that I heard about, the kinds of video you are allowed to play in your video player is limited because the proprietor has decided to disallow those codecs from playing.

And so on, and so on. The really fun examples are the ones you can't think of until they happen. The ones that catch you when you are in a hurry, and the task just seemed _so simple_ when you thought to take it on, and the next thing you know it's 8 hours later and you're still trying to work around an arbitrary roadblock that some random marketing person decided to have written into the OS for whatever reason.

You just don't get those limitations on Linux. If there's a limitation on Linux, it's because you don't know how yet. And that's it.

## 3.7 Network Transparency.

Opening Kate and going to **Open Recent** , and having it open a text file that exists on the _internet_ , not on your own computer. And then working on it, and saving it back to your server. Speechlessness. Inability to form complete sentences. The technicalities of this aside (local copy in **/tmp** , and so on), this is just plain cool.

More than a few of my friends have seen this as a killer feature. Like so many other specific examples I give, it's not entirely unique to Linux; the idea has been around for a long time and some other OS's try to implement it here and there. But on Linux, it works the way it ought to, regardless of protocol, regardless of whose service you signed up for.

I mean, Linux does network transparency sublimely. So much so that I've witnessed the gleam in the eye and the look of realization when my friends suddenly, in a flash, realize that the network truly is their computer.

## 3.8 Independence.

Frankly, one of the reasons I love GNU and Linux so much is its independence and neutrality. It is not commercial, so it's not driven by capitalistic goals (which I don't agree with anyway). It has no ulterior motive; it makes no attempt to lock users into doing things in one way or another, or using services or servers that they don't want to use.

The interesting thing about computers is that so many of us take them for granted, so we don't even notice when commercial software does this to us. Or if we do notice, we for some reason chalk it up to a necessary expense associated with the privilege of getting to use the software at all. Why do we do this?

I believe it's because all of the software involved has a price tag on it and it's superficially kept just out of reach; we have to pay or "steal" in order to obtain it. How exciting! And how fortunate that stealing it is _just_ easy enough for the average Joe to figure out. Compare that to software that is freely available for anyone, but that then requires you to actually _learn_ something new, like a new OS, or even that there is such a thing as an OS, and some new applications, new keyboard shortcuts, and so on. Lazy humans go for the paying/stealing so that they are not taken too far from their comfort zone. When they get tired of stealing software, they purchase a legit copy. And commercialism wins again.

I choose to extricate myself from this cycle. I use free software, and I assign no value to commercial software. It is not, in fact, even valuable enough for me to bother stealing.

## 3.9 Portability.

One of the UNIX ideals from long ago has been the concept of portability. This is a funny term because there are levels of portability, but the idea is that whatever you create on one system should be able to be taken over to another system and work there, too. In computers, that only works to a point; different operating systems, different CPU architectures, and things like that, do restrict the magickfulness of the portability-ness of code. However, there are a few things that help keep the magic possible:

  1. Standards. Yes, there are lots of them, but when standards are good and well-respected, they can be used to ensure portability.
  2. Open source. When you can download the code and change it, you can make the changes necessary for an application to run on your platform.
  3. Good code. When programmers do their best to make their software portable, it ends up either being portable or being easily modified to become portable.
  4. Frameworks. There are a ton of really great frameworks that are portable, and by using these, applications are written that are in turn portable.

## 3.10 Constructive.

Have you ever had a friend who lends you something that seems really great at first, until they give you a thirty page list of all the conditions that come along with their "gift"? You may not have; I don't know how universal that experience is, but I have, several times; the long and short of it is that the gift is not worth it.

That's how I feel about commercial software. There are some really cool applications out there, and a lot of very talented programmers have spent a lot of time making them work. But the gift of this software (and I use the term "gift" in the sense that it is a product that you purchase) is finite. The model of commercial software does not allow for expansion or exploration. It's not an open world. If you want to explore, you can only explore within the confines of what has been created for you.

I'm not saying that this is an ultimate evil or a human rights violation, but I am saying that I am not interested in this model. I'm interested in creating new things, in collaboration, in sharing, in exploration and invention. I want to be able to use the tools I'm given in the ways that the author intended PLUS in ways that the author never imagined, and then I want to take the tool apart and look at how it was built so I can build another version of it or improve it or whatever. I want to be in an environment that encourages this sort of thing _explicitly_.

The GPL does a lot to foster this environment. It reminds us users to share and share alike, and forces the big businesses and corporations to do the same. I'm excited for a day when the businesses are removed from the equation entirely, but for now it's nice to at least be able to keep some of their greed at bay.

## 3.11 Educational.

I never really considered myself a geek, and then I found out about Linux and started using it. Months later, with a few installs under my belt and even some rudimentary shell scripts (and a kernel compile, but that's another story), I am really starting to _understand_ this stuff. I mean, really understand it.

Linux doesn't insist that you become a geek, but it does pull back the curtain and let you see what really goes on inside of that magickal box you sit in front of every day.

And frankly, I like a system that is engineered in a way that it can be understood, and in a way that it allows people to teach themselves new things, and advance their comprehension of their world.

# 4 Environment Variables

When you first come to Linux, you typically learn the obligatory all-in-caps magick words that apparently control important things. You know, things like PATH, and EDITOR, and possibly you ran into DISPLAY when you tried that fancy X-forwarding trick, and so on. They're called "environment variables" and every time you have to set one, you have to go look up how to do it, and you still don't quite understand what makes them persistent, much less why they don't just get set right in the first place.

Well, settle in, because I am going to explain that, and so much more.

## 4.1 Environment Variables Explained to an 8 Year Old

When your mum goes to the store, she takes a shopping list so she doesn't forget why she is there.

Some things, she just knows you all need. Can you think of some items that you get every week? Maybe your mum gets bread and butter and milk every single week. If so, she probably doesn't have to write those down on her list.

But other items, she doesn't remember, so she has to stop and look at her shopping list.

 **Your computer has to do the same thing.**

The things your mum gets all the time from the shoppe, every week? That's the computer's environment. You can see this by typing:

    $ env

This displays a list of all the normal things that the computer has to remember about you on a daily basis. These are things like **$DISPLAY** (where to display your graphical interface) and **$USER** (what your name is) and **$PATH** (where to find instructions on what to do when you type in a command or run an application from a menu or an icon).

The things that you don't remember and have to look at the shopping list for? Those are things that _you_ add to your environment.

There are two ways to add things to your own environment:

  1. You can add something temporarily, if it's just a one-time convenience thing. I do this pretty frequently for some specialised applications that I run; specifically, the application needs a date for its timestamp and normally assumes that the timestamp I want to use is the current date, but sometimes I want to use a date in the future. The application is programmed so that I can override its default date settings with an environment variable, which I do.

But as soon as I close the terminal window that I am working in, that override goes away. That's because I added a _temporary_ environment variable:

    $ export MYDAY=19.09.01

That's all there is to creating a temporary environment variable.

  2. You can add something permanently, if it's something you want as a permanent part of your working environment. For instance, I use Emacs for text editing, so when another application has to launch a text editor for me to use, I want it to default to Emacs. I can set this preference by adding the environment variable **EDITOR** (that's a pre-determined environment variable known to any POSIX system) in whatever login script I use. Usually, your login script is **.bash_profile** or **.bashrc** (it would only be something else if you are not using the BASH shell).

        $ grep -B1 EDITOR ~/.bash_profile
    EDITOR=emacs
    export EDITOR

Since the environment variable has been added to my login script, it persists as a default setting; it's like inking in an item onto your shopping list above all the pencilled-in temporary stuff.

That's all there is to it, really.

So "Environment variables" are just a fancy term for your "user preferences" or settings.

You might be wondering how you are supposed to know what environment variables to set. I mean, can you just make something up and go around setting stuff like **BROWSER=firefox** and **DROPBOX=owncloud** or is there a list of the ones that are recognised?

Well, environment variables only exist because there is one or more applications that look for them. So things like **EDITOR** and **DISPLAY** and **PATH** exist because BASH and Xorg and git and other applications, at some point, pause to look that information up.

So yes, you could make up random env settings, but no application would know to use them. That might be OK if you intend to write your own application that WILL use them, but otherwise it's fairly pointless.

You can find out what environment variables an application looks for by reading the application's user manual or man page. Of course, you only need to set an environment variable if you want to override the defaults; an application will usually fall back to its own default if no overriding environment variable is found.

### 4.1.1 Try It Yourself

You can sort of try an experiment with environment variables yourself.

First, we need an application that will look for an environment variable and act upon what it finds. There are obviously many applications, but it's trivial to make our own just for proof-of-concept and testing:

    $ mkdir ~/bin
    $ echo '#!/bin/bash' > ~/bin/envy.sh
    $ echo '$ENVY' >> ~/bin/envy.sh
    $ chmod +x ~/bin/envy.sh

OK, done! Told you it was easy.

Let's see it fail first. Launch your new application:

    $ envy.sh
    bash: envy.sh: command not found

Oops, well heck, that's not failure, that's not even getting launched. Why not? Because when you type in a command to launch an application (even if it's something simple like **ls** or **cd** ), your terminal looks in some specific places (called the **PATH** ). The folder **~/bin** , where we put our sample app, is not in the **PATH** by default, but we can temporarily put it there:

    $ export PATH=$PATH:$HOME/bin

OK, now launch our sample application:

    $ envy.sh

I think you'll find that nothing happens. You don't get an error, but also nothing actually _happens_. That's because in our sample application, we are attempting to use an environment variable **ENVY** which does not exist.

So let's set one:

    $ export ENVY=xeyes

Check to make sure your environment variable is listed:

    $ env

And then launch your app:

    $ envy.sh

And now you get a response. Because you launched the application from the same terminal window in which you created the ENVY environment variable, that application inherits the environment and its variables. That's why it was able to look at its "shopping list" and be reminded what **$ENVY** stood for.

And that's how it's done.

But watch this. You can override even your own defaults:

    $ ENVY=xcalc envy.sh

Instead of getting the GUI eyes on your screen, you get a calculator. You didn't change the application; you simply prefaced the command to start your application with an on-the-fly environment variable that takes precedence over the standard one.

So far, all the variables we have set have just been the temporary kind. If you close your terminal window and then open it back up, you will find that the $PATH variable has returned to whatever it was before, and the $ENVY variable is gone altogether.

As I have said, if you wanted $ENVY to be permanent, you could add it to your **~/.bash_profile** or **~/.bashrc** files.
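Here is what that would look like, sketched against a scratch file instead of your real **~/.bash_profile** so you can try it harmlessly:

```shell
# Sketch of making $ENVY permanent: append the export to a login script.
# A scratch file stands in for the real ~/.bash_profile here.
profile=/tmp/demo_profile
rm -f "$profile"
echo 'export ENVY=xeyes' >> "$profile"
# Every new login shell would read that file; we can simulate it now:
. "$profile"
echo "$ENVY"
```

With the line appended to your actual login script, every new terminal would start with **$ENVY** already on its "shopping list".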

# 5 My Box

I build my own workstations whenever possible. By the time anyone reads this, the specs surely will be outdated, but I am making note of it for mostly historical purposes. I'll get to what the historical purpose is, aside from "some day we'll all look back at this and laugh that I only had six cores", in a bit.

For now, here are the specs of the computer I built in 2014:

  * $0 (Used): Case [CoolerMaster CM 690] _Plug n Play_

  * $25 NZ (Used): PSU [CoolerMaster Thermal Blaster ATX 350w] _Plug n Play_

  * $155 NZ: Mainboard [Asus M5A97 (R2.0) AMD 970 ATX/DDR3/USB3.0] _Plug n Play_

  * $109 NZ: CPU [AMD FX-6300 Vishera 3.5GHz 6-core Unlocked, 4.1GHz] _Plug n Play_

  * $99 NZ: Drive [2TB Western Digital] _Plug n Play_

  * $109 NZ: GPU [NVidia GeForce GT 630 (2GB/64-bit/GDDR3/PCIe 2.0)] _Plug n Play with nouveau; or a driver download from Nvidia_

  * $105 NZ: RAM [G.Skill F3-1333C9D-8GNS PC3-10600 1333MHz 8GB] _Plug n Play_

  * $35 NZ: NIC [TP-Link TL-WN881ND 300Mbps Wireless N] _Plug n Play_

  * $49 US: OS [Slackware Linux] _Install_

  * $15 NZ: Keyboard [Generic] _Plug n Play_

  * $0 (Used): Mouse [Logitech] _Plug n Play_

  * $25 (Used): Display [Acer] _Plug n Play_

  * $39 NZ: Gamepad [Logitech F310] _Activate Xboxdrv driver_

$765 NZ Total

## 5.1 Historical Significance

The interesting thing about building your own computer is how it compares in features and specs with a computer that you did not build.

I don't know if or how the computer industry will change, but a comparable system from a local shoppe would have set me back about $1800 NZD. So, even if I hadn't re-used a case and other parts, I'm nowhere near that cost. As it stands, my build was about 57% cheaper, give or take. I'm willing to settle on 50% for the record, since I have purchased a new keyboard for $10 and a monitor for $25 after writing the prices up initially. Originally, the keyboards were re-used from a dumpster, but they eventually died (both my primary keyboard and my backup keyboard, believe it or not; that never happens!).

Comparing my build to an Apple computer is difficult, since Apple doesn't offer any real customisation. But it's the comparison that matters the most to me, because it's the one that still hurts. To think about the years I struggled to get money together for an Apple, because the only other choice was Windows, all because I didn't know yet about Linux - well, it really does hurt.

You may have heard computer "enthusiasts" (I never understood why anyone would be an enthusiast about hardware over which they basically have no control; even my hardware is just purchased off a shelf and stuffed into a case) wax nostalgic about old computers. There's not a bit of that in me. I don't have nostalgia for these things. I actually probably resent them a little.

On one end of the Apple spectrum, they offer (at the time of this writing, anyway) a Mac Mini "server" with a similar CPU, half the RAM, and an insufficient GPU for $1000. So, I obviously beat that one.

In the middle range of their offering is an iMac with a similar CPU, same amount of DDR3 RAM, comparable GPU for $2000. Once again, beat that pretty solidly.

Their low-end Mac Pro, at $3000, has the closest match for the CPU, more RAM at a faster speed, and the exact same storage. So, _comme ci, comme ça_ but for the GPU; the Mac Pro has two quite powerful GPUs. My mainboard can link two cards together, so that's an option in the future, but just not something I need right now. And for the money I would spend on GPUs, I could match the cost but beat the firepower.

So I guess we could call my build a top-of-the-line iMac, meaning that my build is 80% cheaper, give or take.

In terms of booting the OS, I subscribe to Slackware Linux, for which I pay a mere $49 per release. The install is trivial. My extra software installs are scripted, so those ran in the background while I did a few customisations here and there for comfort, and the box was ready to go.

## 5.2 Gamepad

Originally, I had picked up a $6 generic PS3-style gamepad. It worked fine, but its analogue stick died pretty quickly after a year. I am not a hardcore gamer, but having an analogue stick that, well, sticks, makes gaming not very fun, so I upgraded to a **Logitech F310** , which I can now highly recommend.

Getting the F310 to work was simple; I disabled the **xpad** driver and initiated the **Xboxdrv** driver. Steam sees the gamepad as an Xbox controller.

For games without gamepad support (and there are _many_ ; apparently most games released for the PC assume you are a PC gamer in love with WASD+mouse, but I am not ashamed to admit that I am a console gamer at heart), the really great application Antimicro works wonders. Map controls on your gamepad to keypresses, and the game is none the wiser.

# 6 Why Software Needs to be Open

I use nothing but open source software, from the operating system itself (Linux) on up to the applications. People have asked me why I believe all software should be open source and free of corporate control.

There is a very well-known hacker named Richard Stallman who has, in addition to basically starting the whole free software movement, written many very good essays explaining very eloquently the myriad reasons software ought to be liberated from various kinds of third-party control. These essays are collected in a handy book which you can purchase and read on the bus or in your living room or whatever.

I have my own reasons, as well. Some of my reasons intersect with Mr. Stallman's, some differ.

These are the reasons that I think all software should be open source...or at least, the reasons that come to mind.

## 6.1 What is "Open Source" and How is it "Free"?

First of all, I should make clear that when most people say "free software" they do _not_ actually mean "freeware" or "shareware". They mean _independent and open source_. In fact, many people mean _independent, with a liberal Creative-Commons style copyright, and open source_.

When I say "free software", I generally mean everything. I want software to be liberated from corporate or political ownership and control, I want software to be open source.

And here is why.

## 6.2 Every Kilobyte is Sacred

As Monty Python sang (er, more or less) every kilobyte is sacred. Especially when it is _your_ kilobyte. In a very pragmatic way, I believe that if you create data on a computer, you should be guaranteed access to that data _no matter what_.

Forever.

Full stop.

If you change operating systems, if you stop using the software that enabled you to create the data, if the software that created the data no longer exists. No matter what, the data is a product of your hard work, it belongs to you, and you should be able to get to it.

I cannot count on just one hand how many times I have revisited files created with proprietary software (ie, before I switched to GNU) and found that they were simply inaccessible to me. Luckily, I later became street-smart and started exporting my data into generic, open formats, just for backup. It turns out that those just-for-backup copies have in some cases saved my data. In other cases, the data could be rescued if I gain access to a proprietary system, but in still other cases, the data is locked away forever.

Free software disallows that, because its source code is available to everyone. Therefore, if I create data with a software application, then I can download the source code and literally store the source code along with the data if I am so inclined. But usually such extreme measures are not necessary; free software, more often than not, respects the user's data enough to ensure that your files will open in new versions of the application, forever, and probably will even open in other competing applications. And if all else fails, your data is usually convertible so that it will open in some other application. And if all ELSE fails, you have the source code, so even if you have to hire someone to compile the application for you and extract your data, it can be done. Which is saying a lot more than some long-dead proprietary software offers you.

## 6.3 Art and Sharing

People need food, shelter, love, and art. I think in that order. The thing about art is that it boosts morale. It brightens our lives. It makes us happy, and it inspires us to do cool things. In my ideal world, we would all work for these things.

My definition of "art" is very broad, so it includes things like storytelling, painting, healthy uses of religion, modding bicycles, tattoos, funky hairstyles, making yarn from cat fur, and, yes, computers.

It is built into each of us to be productive, and when we produce something neat, we want to share it. Things that we create on computers are not exceptions. And things that we _learn_ on computers are also not exceptions. So if I make a really neat digital painting in GIMP or Mypaint and a friend wants to learn how to do the same thing, then I should be able to hand my friend a copy of the software. To make that simple act of comradeship illegal is, in my opinion, a crime.

## 6.4 Lingua Franca

Computers are the pens and pencils and paper of modern society. We largely expect people to use computing devices on a day-to-day basis. If someone does not know how to use a computer, then they are at a disadvantage when looking for work.

More importantly, a computer can be a fantastic learning engine. Computers, and the desire to know how to operate one, can teach math, logic, literacy, design, and a heck of a lot more. It can elevate the user's very thought process, it can influence how someone sees the world, how one tackles real-world problems, and more.

The point is, if we have these wonderful tools called computers, and we want the world to be a more technological and improved place, then these computers must not be exclusive.

I know that there are many cheap computers, and I know that there are many cast-off computers. However, the drive in the marketplace is to continually upgrade, buy new software that in turn demands new hardware, and get new hardware so you can stay relevant. That is, simply put, evil. For so many reasons (not the least of which is ecological).

Free software will run on old computers (I know, because I am writing this post on a 12-year-old computer running the latest version of Debian) and yet stays relevant. Free software would not be exclusive, because it is free and even enables people to run it on old hardware, which can often be obtained for free.

## 6.5 Capitalism is Not the Answer

I do not believe in capitalism. At best, modern society has outgrown capitalism; maybe it worked once, but we are too many people with too much diversity to be served well by capitalism. At this point, the corporate culture of dollar-driven religion is out of control, and it is destroying the spirits of many people, the environment of the planet, and it's enabling a select few to prosper at the expense of many others. It is simple math; there are only so many dollars to be gained, and there are a lot more people.

Free software has shown that under the right conditions, people can and will work together to create an entire system that works best for everyone developing it. As a bonus, a lot of other people _not_ developing it benefit. And the system is self-sustaining because they built a system upon which they could build new systems. Frankly I see the entire endeavour as a blueprint for the way we could all live. Maybe not overnight, but I think we could get there.

## 6.6 Code is Cheap. Support, On the Other Hand...

Anyone using a computer knows that if they want to find a clone of some application that is too expensive, a quick internet search reveals a cheap knock-off that does something similar. Because of rampant cracking and illegal downloading, a lot of people are picky even about which similar application they will bother using, but the point is that code, these days, is cheap. A lot of people code because a lot of people want to.

The strange thing about the cheap knock-offs versus the "real" applications is that they both have bugs, they both have annoyances, they both get updates with improvements and new features. In other words, both readily admit that they are, as yet, and for the foreseeable future, imperfect.

So why, exactly, are we paying money for an incomplete product that needs fixing?

I think many large free software products have proven at this point that there is real money to be made in giving away the code whilst charging for support. This, frankly, is a lot more sensible, and it even allows for the deadbeats who are going to just grab the code and not pay for it anyway. Doesn't matter! They're using your code, so it's not so much theft as it is a potential future [paid] support call.

## 6.7 People Understand Money

I think digital downloads of music and movies have shown that people actually do understand that, in the current state of affairs, artists and producers-of-things need money.

Yes, yes, there are those who will download music and movies and software and never pay, but that's just human nature. There will always be that demographic; some will not pay because they do not want to, others will not pay because they really are too poor right now to pay but still have those basic human needs of food/shelter/love/art. But there is also the demographic of people who _will_ pay, because they know that if they take and take and take and never give back, then the software project (or band or director or whatever) that they really love will not be able to continue producing.

Corporations that try to extort money from their audience are doing us all a disservice by assuming we are all crooks. They also, as a bonus for themselves, get to decide exactly how much their product is worth. Even when we, their customers, disagree. Kinda nice for them.

By contrast, I have paid quite a lot of money for projects upon which I rely. Even though they are free. And by contrast to that contrast, back when I was out of work, I was able to take advantage of a cheap computer and free software in order to learn skills that eventually got me the series of jobs that then permitted me to pay for free stuff.

## 6.8 De-Centralization

Free software is developed by lots of people, all over the world. A lot of proprietary software applications are products of one state in one country. I am not necessarily saying that this is inherently bad, but I do believe that in today's world, a little more diversity is a healthy thing.

I also shy away from too much centralization. Having things centralized and controlled by a few entities, all, in this case, under one specific government, can feel a little claustrophobic.

## 6.9 Diversity

Too many people believe that to succeed at _whatever_, they need to know _whichever_ software. Frankly, that's a myth, and even if it isn't, do you really want to join a society that provides people with only _one_ point-of-entry? I do not.

This is the same reason I dislike certifications in the tech industry: people can't get past the paperwork and just look at someone's skill. If you are going to judge me by whether or not I know this brand of software or that brand of software, then you're missing out on the talent I could otherwise bring to your project.

It may sound like I am saying that I trust independent developers more than I trust corporations. And since corporations are not people, and therefore have no sense of morality, responsibility, or allegiance beyond their sole purpose of making money, to a large degree that's very true. However, this isn't a perfect world, so sometimes independent developers have to walk away from their code due to monetary or time constraints, a lack of interest, health problems, or _whatever_, so it's not a magic bullet that a developer is independent rather than a faceless, mindless corporation.

Strictly speaking, I only trust independent developers insofar as anything in life is guaranteed. The real reason to invest my emotional, financial, and creative energies into open source code is because the code is available. Once I download it, that copy of it literally **belongs** to me. If the company or the developer in charge of the codebase has to stop development for any reason at all, whether it's through some fault of their own or not, I can, in theory, make that code live on. Sure, if I don't know a lick of programming then I may have to hire someone to do it, but if it's something that important to me, then at least I have that option.

But I'm getting ahead of myself.

## 6.10 Open

You are using software for stuff that is important to you. So you should own that software. If anyone other than you controls the direction or fate of that software, then your data is suddenly at some third party's mercy.

In order for you to control your software, you must have its source code. This is why free software is often said to be "open source", because the source code is "open" and available to everyone.

Being open about something is important, especially when we are talking about computers and the things they do. The sheer amount of energy and creativity we commit to computers is reason enough to demand openness about how our data is processed, stored, and used. _But_ now we also put everything else into computers, too: health information, banking information, addresses, personal history, family information, and all kinds of stuff. We have a right to know and understand how computers are processing and handling this information, especially since a lot of it is not something we voluntarily commit.

# 7 @font-face

My friend (and a real live professional web designer) Alexandra Kanik showed me this neat trick that has frankly made me look like a pretty skilled web designer when in fact I'm a complete hack. But it's a little trick that you can do in css, and it goes a little something like this...

Well, first you should grab a font file from some free downloadable pack of fonts. One such freely downloadable pack might be the Great Linux Multimedia Sprints, which contains about 2455 free fonts, including one called **League Gothic**.

Let's assume we want to use League Gothic on our webpage.

  1. Upload the LeagueGothic.otf file to our webserver:

            $ scp -P 2292 LeagueGothic.otf \
        klaatu@straightedgelinux.com:~/www/fonts

If you do not know how to upload files to a server, or need a quick refresher because it's not something you do every day, take a look at FileZilla; or, just as likely, your file manager (in Linux or BSD, anyway) handles it perfectly well.

  2. In our css file, add:

            @font-face { 
          font-family: "titlefont"; 
          src: url("./fonts/LeagueGothic.otf"); 
        }

The **font-family:** value is something you make up: it's a human-readable, friendly name for whatever this font face represents. I am using "titlefont" because it's being used as my main titles. You could just as easily use "officialfont" or "foobar".

The **src:** value is the path to the font file itself. Obviously, the path to the font should be appropriate for your server; in my example, I have the fonts dir existing in the same folder as my css and html files. You may or may not have your site structured that way, so tweak the paths as needed, remembering that a single dot means "this folder" and two dots mean "a folder back".

  3. Now that you've defined the font face name and the location (src), you can call it for any given css class or id you desire. For example, if we want **h1** headings to appear in League Gothic, then:

            h1 { font-family: "titlefont"; }

And that's all there is to it. For what it's worth, all League of Moveable Type fonts are @font-face ready, so that's a super simple quickstop for a nice webfont.

On this site (the one you are reading right now), you might notice that the text in the code blocks is different from the rest of the text. That's because I am using this **@font-face** trick to make my code samples appear in IBM_Nouveau.ttf, and everything else in Droid or Sans or something like that.
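As a sketch of that setup (the filename here matches my font; yours will differ, and "codefont" is just a made-up family name):

    @font-face {
      font-family: "codefont";
      src: url("./fonts/IBM_Nouveau.ttf");
    }
    pre, code {
      font-family: "codefont", monospace; /* monospace as a fallback */
    }

The fallback after the comma matters: if the font file fails to load, the reader still gets a generic monospace face instead of the browser's default proportional font.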

## 7.1 A Note About Fonts and the Web

Traditionally, a lot of books and professors have told people learning web design to specify fonts as "Arial" or "Helvetica" or whatever.

This is wrong. Incorrect. Bad habit. Don't do that.

Why?

Arial and Helvetica and all the other fonts we all grew up with actually cost money (not to you, but to the people putting together your operating system), so not all devices have them installed (especially things made by grassroots organisations).

There are fallback fonts, of course, but if you are designing a site and want to guarantee what its fonts will look like, embed the darned things with **@font-face** (in which case you should not use Adobe or Microsoft fonts, since you may not be legally permitted to do that; free font collections, obviously, do not have any such restriction).

If you don't care so much what the fonts are doing, then just define fonts by generic classification (sans, serif, mono) and let the user's system pick whatever default sans or serif or whatever font it has installed. Trust me, the results are much more attractive on all systems.
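In css, that generic approach is as simple as:

    /* No specific typeface requested: each reader's system substitutes
       its own default serif, sans-serif, and monospace fonts. */
    body      { font-family: sans-serif; }
    h1, h2    { font-family: serif; }
    pre, code { font-family: monospace; }

No font files to embed, no licensing to worry about, and the page renders with fonts the reader's system already knows how to display.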

# 8 Corporate Recovery

People often don't understand when I tell them that I do not like Apple products, because they honestly cannot fathom _not_ loving Apple products. I understand, because it took me years to get over the comfort of Apple hardware and past the obligatory Apple-as-a-revolution trope, and almost a decade to fully rescue my data from Apple formats and conventions.

## 8.1 Hardware Problem

The first is the easiest answer, and one I could go on and on about, but will not, because it's actually pretty obvious:

  1. A computer is not its OS.
  2. You can leave Apple without going to Windows.
  3. Apple makes it difficult to load an independent OS on their hardware.

Would I get a Mac if I could easily and reliably run Linux on it?

Well, actually no, because of the hardware geek answer:

  1. The swiss army knife exists for a reason.
  2. Computers are tools.
  3. Computers should be like a swiss army knife.

Either way you look at it, Apple hardware is really well-made, but at the end of the day, it's less effective at what it strives to do than other computer brands. It makes it tough to run the OS of my choice, and it does not provide the variety or number of ports that I expect. They set out to make a computer, but in the end they qualify only by a technicality.

In short:

  * I will never pay three times for half the power and a third of the functionality.

  * I will never be burnt by not having the right dongle or the right port again.

  * I will never be forced to take a computer to an "authorised repair center".

  * I will never fight to load a non-corporate OS and install firmware on hardware just because a collective of vendors, to whom I have forked over a fair amount of cash, decided to make it "impossible" for me to do so.

I have better things to do with my time.

## 8.2 Software Problem

I used to use a little application called, if I'm remembering correctly, SoundJam on my old OS (the one I used prior to switching to a non-corporate, independent OS) to play music files. When I heard about iTunes, I gave it a go, and then I resisted it for a long time, because it seemed to put up a block between me and my music. I stuck with SoundJam for as long as I could, because managing my music as the files that they were, rather than as entries in a database that I did not directly control just seemed to make more sense to me.

SoundJam, ironically, was actually the basis for iTunes, and eventually it just stopped being supported. Instead of adopting VLC or resisting in some other way, I moved to iTunes, thinking that maybe the world really was just moving to database-driven file management whether I liked it or not.

Turns out I was wrong to give in, and I luckily found that out not many years later when I found and moved to Linux.

But by that time, the damage had been done. I had let Mac OS touch and control my data. To this day I still come across whole directories of music with mangled names, no indication of what artist or album they are from, no indication of what order the tracks appear in the album, and so on. And every time I see it, I know that I have found a directory that is a remnant of my old iTunes mistake. (I know you're thinking that iTunes doesn't do that, or that iTunes has gotten better. But trust me, old iTunes did horrible things to data, especially if you exported playlists in an attempt to back up your data. And no, it has not gotten better.)

More examples abound. Appleworks, Clarisworks, Stuffit, Photoshop, Logic, Final Cut, FileMaker. It almost hurts to drudge up the old names from my memory. Understand: I was a computer addict. I was not a _knowledgeable_ one, but I lived my life on the computer. I did my art on the computer, I socialised on the computer, I played games on the computer. Looking back at my life from where I am now, much of it involves computers.

Which makes it pretty frustrating to think about how little of that I controlled then. And the logical extension of that is the realisation that there's not much of it that I have access to _now_.

It has, in fact, taken years for me to extricate my data from those locked-down, secret formats of old. Sometimes this is literal; getting my data out of old "self-extracting" Stuffit archives was pretty miserable. Disentangling it from old binary document formats was basically an exercise in manual transcription. Some files are not rescuable at all.

Adding insult to injury, some stuff is just flat out annoying, like the things that I actually had the good sense to at least _attempt_ to get into nice, open or at least open-spec file formats. I mean, to this day I sometimes open a directory of old music to find horribly mangled filenames and track ordering, just because the export function in some stupid version of iTunes refused to give me back my files in the same condition that I put them in.

## 8.3 The Price of Quitting

At the end of the day, it's fine if you want to commit your data into someone else's control, but I believe that a vendor owes its users the assurance that if you ever decide to leave, then there will be no hard feelings and no sabotage; you can take everything that you brought into the relationship, just as they were when you brought them in.

I realise that that is not possible. I mean, if _you_ the user load plain text into a fancy office app, and then throw away the plain text, then you can't expect your plain text back when you go. Right?

Well...let's look at that. What if applications did this:

  * Warn the user when there is going to be an irreversible loss of original formatting (conversion).
  * Make sure that the user _always_ has the ability to export their data into generic formats.
  * At the very least, release specifications of file formats so that others may implement them _without_ having to run your proprietary or expensive or platform-specific application.

Suddenly it's not so impossible any more, is it?

And you know what, it's not that hard. All I'm asking companies to do is to:

  * Respect my independence as a consumer. When I buy a piece of hardware, that's my hardware. I understand that warranties (for what they are worth, which lately seems to be not a whole lot) will be voided if I do something unanticipated; that's ok. Show me the dotted line to forsake the warranty and I'll do it.

  * Respect my data. This is my life's work, here (for whatever that's worth). It may look like silly vacation photos and a bad taste in music to you, but to me that's what every day is about. I want a place to store it, and I appreciate the technology that allows me to do that, but the price is too high if it means you're going to mangle what I put in. And it's downright criminal to do that to someone who doesn't know better.

## 8.4 The Solution

Once a user understands what is at risk, the burden really is on the user to protect themselves from corporate apathy. If you want to maintain control over the things you create and curate in your digital life, you need to get away from the corporations who see you as a product rather than a person. Switch to an OS that is made by other (real) _people_. Switch to Linux.

# 9 Type Special Characters

On my previous OS, the one I used before I switched to Linux fulltime, I could get specialised characters like a • bullet, or fancy foreign characters like è or € or even ™, and so on, by hitting option and some other key.

On Linux, the same characters are available and the method to get them is similar but not exactly the same. Technically, there are about three different ways to do it, but this, the so-called **Compose Key** or **Multi Key** method, is the way I do it.

With a **Compose Key**, you define some key on your keyboard to be a sort of trigger; first you press the **Compose Key**, and then you press two keys to choose the special character you want to create. Usually there is some logic to it; for instance, to create an é character, you would hit the **Compose Key**, then ' and then e, because you want to, essentially, place a **'** over an **e**. Makes sense.

It does involve a few more key strokes than what I was used to, but it does provide a _substantially_ greater number of possible letters and characters; for instance, you can even do fancy currency symbols, fractions, and a few icons like happy faces and hearts.

A word of warning: test your newfound **Compose Key** abilities in a robust GUI application. I spent two hours trying to figure out what I was doing wrong before I finally realized that testing for special characters in a _plain text_ application was just plain stupid. So use an application with nice fonts that expects to receive special characters. Probably the best and most generic test would be in LibreOffice with a Liberation Sans or Serif font, but Abiword or any word processor is probably fine. GIMP or Inkscape would work too.

Just keep in mind that the font that you are using must _have_ the character you are trying to create in order for it to work. If a font designer did not design a glyph for the character, then no amount of typing is going to make it appear. So if this does not work for you straight away, try changing fonts or applications and try again.

Enough pre-amble, here's how to make it work.

## 9.1 Set Up a Compose Key

First, you must set a **Compose Key** on your system. This cannot be pre-set like it can be on That Other OS, because not all Linux keyboards are necessarily the same. Personally, I use the Menu key on the right-hand side of my keyboard. On some laptops, there is no such key, so I just use the right-hand Alt key. If you don't use your Caps Lock key (and let's be honest, who does?) then you can use that.

To set a **Compose** key, open System Settings.

### 9.1.1 KDE

In KDE, locate the Input Devices pane and click on the Advanced tab. In this panel, define the **Compose Key** position. Use some key on your keyboard that you would otherwise never use for anything else.

### 9.1.2 Gnome

On Gnome, MATE, and Cinnamon, open System Settings and enter the Keyboard panel. Locate the Options button and click it.

In the Options panel, set the position of the **Compose Key**.

### 9.1.3 Other Desktops

If you are using a desktop that does not have a control panel for mapping keys, you can map your keys straight through X11.

  1. Find the key that you want to set as **Compose**. Use the xev application to learn the keycode representing the key that you want to use.

  2. Create a file called **.Xmodmap** in your home directory.

In that file, use the keycode you just got and set it:

        keycode 135 = Multi_key

If you are using the Caps Lock key, then you also need to clear the **Lock** function in addition to using the keycode of the Caps Lock:

        clear Lock
    keycode 66 = Multi_key

  3. Load your Xmodmap.

        xmodmap ~/.Xmodmap
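Note that a keymap loaded this way lasts only for the current session. As a minimal sketch of making it stick (the exact startup file varies by distribution and display manager; **~/.xinitrc** works for startx-style logins), run the xmodmap command at login:

```shell
# Append the xmodmap call to the X startup script so the Compose
# key survives a reboot. ~/.xinitrc is read by startx/xinit; other
# login setups may use a different autostart file.
echo 'xmodmap ~/.Xmodmap' >> ~/.xinitrc
```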

## 9.2 Using the Compose Key

This used to confuse me all the time, because I was really used to the way The Other OS implemented special characters. Let me be clear: on Linux, you _do not press_ the Compose Key and a letter at the same time. It is purely _sequential_.

For instance:

  * For a bullet point, you press Compose, and then . and then =
  * For a ¼, you press Compose, and then 1 and then 4
  * For a © symbol, you press Compose, and then o and then c

And so on. That's three key presses in each case; one press to get into **Compose** mode, and then two presses for whatever combination of letters or keys that create the special character. (In the case of a capital letter, use the shift key as usual; that is, if you are creating a capital-e with an accent over it, you would press Compose, and then apostrophe, and then shift-e.)

For a full list of all possible combos, view the file **/usr/share/X11/locale/en_US.UTF-8/Compose** or poke around online.
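The Compose table itself is plain text, one sequence per line. To get a feel for the format without digging through the real file, here is a hypothetical two-entry sample in the same syntax, searched with grep the way you might search the system file:

```shell
# Two sample entries in Compose-file syntax (a tiny hypothetical
# subset of the real table), written to a temporary file.
cat > /tmp/compose-sample <<'EOF'
<Multi_key> <o> <c> : "©" copyright # COPYRIGHT SIGN
<Multi_key> <1> <4> : "¼" onequarter # VULGAR FRACTION ONE QUARTER
EOF

# Search it just like the real table:
grep 'copyright' /tmp/compose-sample
```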

I hope that helps!

# 10 Be Dirt Poor and Happy

I have been uncomfortably poor for some fairly long stretches of my life. Interestingly, I think that I would be considered poor at the time of this writing, at least judging by what kind of money my friends make in a year, but due to the way I manage my money, my point of reference has shifted such that I really cannot think of myself as poor any longer. That is what this article is about: being poor in such a way that you are no longer poor. It can be done.

First, a few disclaimers.

By "poor", I mean to say that your income (regular or not) places you below your local government's definition of poverty.

Secondly, my point of reference is that of an [USA] American, so my "poor" is a heck of a lot easier than the poor of, say, Mexico or the Congo. Or at least, I imagine it is. Certainly, my only experience of "poor" has been in a land of, in all respects, plenty. I (thankfully) cannot speak to being poor in a place where there simply is not enough to go around.

Thirdly, I am speaking as a single person with no dependents. I do not envy people who find themselves struggling with poverty whilst raising children or caring for someone who is ill. It can be done, but it's beyond my experience, so I don't pretend to have any words of wisdom in that regard.

And finally, I understand that people's abilities and situations are different, and I do not intend to belittle or blissfully ignore other people's hardships; I am only trying to pass on whatever small bits of wisdom that I've gathered after many years of poverty in the USA.

  1. Get out of the bad neighbourhood. The ghetto has zero opportunity to earn money and lots of distractions. That's why it's called a ghetto. Get out of it. It is better to be homeless in a good neighbourhood than housed in a bad one. For more information on being voluntarily homeless, go to Hacker Public Radio and listen to the series on "Urban Camping" (a polite euphemism for "homeless").

  2. Own no more than what you can carry. It is counter-intuitive but most poor people I know actually own a lot of useless junk. It clutters up their lives, it ties them down, it costs them time and it's dead-weight that they could sell for money. Get rid of the junk in your life so you can focus on what's really important.

  3. Use free software. This is a no-brainer; if you are paying for software, you're doing it wrong. And free or dirt-cheap computers can be gotten by dumpster diving, or from thrift stores, or from craigslist, or whatever.

  4. Get rid of your vehicle, unless you live in it. I have owned a vehicle a few times in my life and they are nothing but money pits. The only exception to this rule was the vehicle in which I lived; it was a money pit but since the money going into it was also my rent, it made good sense.

Otherwise, a monthly bus pass is _far cheaper_ than pouring money into up-keeping a vehicle, insurance, gas, and so on.

Obviously if the city you live in has no bus system, this tip does you no good. Unless it does; sometimes staying mobile by urban camping so that you can dwell around your work rather than driving to work every day is a good work-around.

  5. Live on no more than what you make. This is kind of the ultimate anti-poor advice. If you do not spend more than what you make, then you are no longer poor. Prove me wrong.

I have found that the problem with most people (myself included, early on) is that we do not understand what "essential" means. The Internet, for instance, is not essential. Cell phones are not essential. TV, fancy clothes, cameras, and so on, are not things you should be paying for if you are claiming to be poor. But a lot of people do, and they just can't understand where their money goes each month.

I think part of this is the mentality that makes you think things like "well, that rich person over there has **X**, so I should have **X** too". While there's some truth to that, I think it's usually a trap. To envy someone who lives a life of excess and enjoys none of it is kind of like envying a sick person because of all the sympathy cards they get. Yes, balloons are fun, but not if you have to stay in bed all day while all the other kids get to go outside and play.

Cut this stuff out of your expenses and suddenly your expenses drop down to food and possibly shelter (unless you're doing the urban camping thing, then it's just food).

Believe it or not, there are staggering numbers of social programmes in the US and many other countries that freely provide the things you consider essential. People don't believe me when I tell them that I haven't paid for internet in years, because I _live_ on the Internet. How can I not have paid for it? Well, one city I lived in had free public internet access; it was dial-up and slow, but it worked when you needed it. Other cities have coffee shops that offer free internet. Other places, I just used the Internet at work. Of course, libraries are great resources for free entertainment and education. And so on.

OK, so now you have cut out all excess from your life, leaving nothing but food and shelter as possible expenses. The shelter issue can be solved with urban camping, and the food issue can mostly be solved by using food banks (most churches have one), urban gardens, dumpster diving, or working at a restaurant that will feed you as part of the job.

At that point, literally all you need to do is get some silly job like serving bagels at a cafe (been there) and you're good to go. Find a place to sleep at night, hang out in public spaces during the day, go to work when scheduled, and you're living a normal-ish and happy life.

While living this normal life, be sure to teach yourself some new skill set that interests you and that is more employable than whatever useless degree you got (or not) in college. That useless degree may be why you are poor, so it's time to start working on getting some skills that will finally translate into getting money. For me, the answer was free software; learning computers is how I make my money today. Sometimes I make a little, sometimes I make a whole lot, depending on the job. But it has gotten me out of the urban camping lifestyle and into a low-cost apartment, and it affords me little luxuries, like splurging and having a cup of coffee from a nice cafe twice a week. The point is, develop your skills, find some work, and spend less in a week than what you make each week. Done.

The important thing is to keep positive, and do not judge yourself based upon how other people are living. Being poor or homeless, or whatever, is not a bad thing, and in fact it's not so uncommon in the greater scheme of things. It's just another way of _being_ , so enjoy it as much as you enjoy anything else or maybe more-so. Take time to stop and smell the proverbial flowers. Enjoy your freedom. Learn to hate money and the useless junk it usually encourages us to buy. Get out of the habit of living for money, and you will become impervious to concepts of poverty and wealth, and see that all people are equal and that their labour is equal in spite of what capitalism says.

# 11 Experience

Linux can be tough to get into at first, mainly because most of us, at least for now, were taught as children on proprietary systems. Possibly we got some basic training on proprietary systems for a job. Probably we didn't get much exposure, and almost no meaningful instruction aside from "click there, then there, then there", but it has been constant, steady exposure to a very specific operating environment, and it's taken for granted that we all had this training and know enough of it to get by.

So when one arrives at Linux, one is amazed at how little one understands about computers after all. It gets frustrating, because you felt so smart before, and now something that took you ten seconds takes you a minute, and the lost time compounds with everything you try to do.

But some people power through that because they want to learn. They want to get better at computing, not just at being a user.

Yes, we all go through this sort of thing. That's why if you say "Does anyone remember how they first got into Linux?" in a group of Linux geeks, you will get 8 hours of long, epic tales of how each person found Linux, and their early hardships, and so on. It's all very special (and by "special" I affectionately mean tedious after the first hour), and full of emotion. Think tears and laughter and group hugs. Not really my thing, but it speaks volumes.

I would rather like to see what would happen if we got some wee children and isolated them all in comfortable Tralfamadorean labs, away from computers. Then when they reach 18, we let them each have a blank computer, and we give one a Mac install disc, one a Windows install disc, and the other a Linux install disc. And we say, ok, go for it. Install your OS and learn how to use it. Good luck.

I suspect at the end of it all, they would each have stories of sadness, of despair, and then of triumph, resolution, and ultimately, joy. It would all feel like how it feels for Linux-noobs because there is no comparison, no prior experience to un-learn. It's all painfully new.

But you know what? When we then tell them that their licenses have all run out and, also, that all the software they were using is obsolete now and none of their files will work on the computers in the outside world, then you will know the true victor by the confident smile upon that one user's face, that subtle look of true understanding and comprehension. That user, I am sure, will be the Linux user.

# 12 User Betrayal

In the world of computing, we tend to imagine that there's an agreement between the user and programmer that the data being created by the user will always be available.

If I create a document, I assume that it'll be available today, tomorrow, a year from now, 5 years from now. For some, that's the farthest ahead considered, so "5 years" is basically synonymous with _eternity_. But some of us think about that intangible thing we call "the future" and expect our data to be available to us even in 6, or 11, or 15 years.

The problem is, this is just an implied agreement. Look at any agreement, far and wide, and you see:

  * no clause in which a software vendor guarantees against the data and file formats of today being deprecated and abandoned in software updates of tomorrow
  * no guarantee that data formats will now or ever be exportable to another format
  * no agreement that data and file formats will even be _readable_ by other applications

This is _bad_ , but unfortunately it often only begins to resonate with people once they have been burned by it. People who have only recently started producing serious data that they have a real need to keep safe see a lot of these blatant omissions from a software's guarantee as minor technicalities. So what if a file won't open 7 years from now? Who even has a computer that long? Wait, you mean you can transfer files from your old computer to your new one??

But people, often freelancers and artists, who work on things that they themselves must maintain, or companies that spend lots of money on producing files and view those files as investments, such as movie studios and record labels, expect longevity. And frankly, so should we all. I speak as someone who has sifted through the mangled data of reverse-engineered search-and-rescue operations, and as someone who has tried to assist others in recovering their hard work from decades past.

Computing should not be, and does not need to be, this hard.

Here are some examples of problems I've encountered:

 **Cathy**

Cathy creates a document in **ExampleWriter 3.0** and it saves her file into a **.exw** format. Next year, **Example Ltd.** gets bought by **MegaCorp Inc** , who promptly drops **ExampleWriter** without warning.

Cathy isn't even aware of this until she gets a new computer. The new computer won't run her old copy of **ExampleWriter** so she looks to get an update, only to find that it no longer exists.

Desperately, Cathy tries to find some application, even open source, that can open **.exw** files, but nothing does, because **Example Ltd.** never released the source code for it, or even the specification of how it is designed, and **MegaCorp** , in spite of several online petitions, will not integrate the format or release its specs.

Too bad for Cathy.

 **Emma**

Emma's go-to app for everything is **Awesome App**. Emma has an important **.awe** file that she did for a client. She exported it into a common file format because she knows that not everyone is as awesome as her, so she probably won't have access to **Awesome App** on site.

She gets to the location and finds out that much of the pertinent data has changed, making parts of her presentation obsolete. No worries, she's no dummy; she brought the source file with her on a thumb-drive. All she has to do is open it up, update the charts and graphs, re-export it, and go.

Except, she can't, because nobody in the building has a copy of **Awesome App**. Emma goes online and desperately seeks a solution, and eventually she finds a trial version of **Awesome App**. She downloads and installs, opens the file, makes the correction, and finds to her dismay that the demonstration version does not allow exports because it's a demo, and she doesn't have permission to install **Awesome App** onto the presentation machine.

 **Tara**

One more, slightly more optimistic: Tara worked on her art project in **Domo Media** for years. She's finally ready to publish, so she sends the relevant files to her distributor. Turns out the distributor has a different version of **Domo Media** , which changed formats; besides, for final output they require a different format entirely.

 **Domo Media** is too expensive for Tara to buy, and they have no trial version for her to "borrow". She searches forums for three days and finally uncovers a small open source application whose programmer spent three months reverse-engineering the old **Domo** format to rescue his own data from it, but who warns that there was some code he just could not figure out, so it may not work 100% of the time.

Tara downloads the application, ignores the $10 donation being asked, and converts her files. The good news is that most of the conversion works, but sadly some of the bits and pieces don't translate exactly as expected.

Tara spends three weeks going through all of her semi-rescued files to finish up the job. She sends an angry email to **Domo** but only gets an auto response offering her a discount on an upgrade. Still fuming, she sends an angry email to the open source developer for not converting the files correctly.

## 12.1 Is it Worth it?

The possibilities are endless, really, but those examples touch on some of the things I have witnessed or experienced in my life of being "the computer guy" for people, both professionally and just by way of proximity.

The point is, companies that let you create data in data formats that they keep secret are flat-out dangerous. The assumption that there is an un-spoken agreement between you and the companies you buy software from is even _more_ dangerous.

I am tempted to say that most of the big software companies that most people use, in practice, do not treat user data lightly, and do _generally_ make an effort to support past data and allow for export to other formats, and so on. The problem is, that's a blanket statement that might be true for _you_ but is certainly not true for everyone. And the point I'm trying to make is that just because a company is more or less responsible _now_ doesn't mean they will be responsible later. And it's the _later_ part that matters. Not to you, but to the future you.

Sadly, very few companies release the specifications for the formats that they use. They see user data as part of their product. It's an investment: if you create data in their format, then you'll keep buying their wares, because that's the only thing that your data can use. In other words, you're being subtly bullied to stay with a vendor.

## 12.2 The Open Specification Alternative

So let's compare this conundrum with what I learnt was a pretty fair alternative to all this. There are actually two solutions, one technically better than the other, but in terms of data sanctity either will work:

The first is open specifications for ALL data formats. And by "all", I mean all. If a user is generating personal data (and if the user is generating it, then it is by definition personal), then the way that the data is saved should be accessible to any other application. Programming is a wonderfully modular thing, so this does not require the vendor to write any code; all they have to do is release a document telling other programmers how to READ that file format.

For bonus points, telling programmers how to WRITE back to that file format would be nice, but it's not a requirement; a company can still maintain control over its users by shrouding that part of the equation in mystery.

The important thing is that other programmers are able to read user data, regardless of format. It might mean a lot of work for a programmer to integrate a format into their code; but that's up to the other programmers, not to the original vendor.

No one loses and everyone wins.

Will vendors do this? Well, some will, and some do. Sometimes they do it because, apparently, someone in the company actually harbours some form of respect for users, and other times they do it because a court of law tells them they must.

Do vendors like this? No, not usually, because they are, as I said, jealous lovers. They see their proprietary formats as gateway drugs and trapdoors into their endless release cycle.

## 12.3 The Open Source Alternative

The second solution, and by far the farther-reaching, more complete option of the two, is open source software.

This means that the code of the programme generating the data in the first place is open source, so when you choose an application to generate data with, you not only get the file you create, but you inherit the entire application itself. As long as you keep the source code of the application stashed someplace (oftentimes, the Internet does that for you, but you're free to make a personal copy as well) then you will always be able to open the files created in it. Yes, even if the application is abandoned by its creators. Even if the creators sell the application to another company.

Seriously.

Let's look at the plights of Cathy, Emma, and Tara again, this time with open source software. For variety, let's turn up the difficulty level to **11**.

 **Cathy**

Cathy creates a complex document in **Open Sauce App 0.89.2** and it saves her file into an **.osa** format. Next year, **Open Sauce App** is purchased by **MegaCorp Inc**. The website, once a rich happy place of forums and an extensive knowledge base, changes to a single page that assures users that nothing will change, and that life will continue as always, only better because now MegaCorp is in charge.

After months of silence, MegaCorp quietly shuts the site down and announces that **Open Sauce App** is dead.

Cathy shrugs it off; she's been using **OSA** in the meantime, and now that it is officially dead and buried, she finds three other open source applications that use the **.osa** format.

But Cathy really liked the application, so she looks around online and asks around among her nerdy friends, and eventually a few people decide to take the source code of the original **OSA** from Cathy's backup drive, and create a fork called **Super Open Sauce App** which is literally the same programme, just with different branding on it. If Cathy is particularly tech savvy, she could have performed that fork on her own, as well.

 **Emma**

A longtime user of **Open Sauce App** and later **Super Open Sauce App** , Emma created an important **.osa** file for a client presentation.

She exported it into a common file format. She gets to the location and learns that some data has changed, making her presentation out of date. No worries, she's no dummy; she brought the source file with her on a thumb-drive. She can open it up, update the charts, re-export, and go.

Except that she can't, because nobody in the building has a copy of **SOSA**. Emma goes online, downloads it, installs it, opens her file, makes the change, and exports it again. Problem solved.

Unless the location does not have Internet.

In that case, Emma can use her cell phone to download the app and then transfer it to a computer.

Or maybe Emma is extra smart and brought a copy of the application on her thumb-drive, since it's not only free and legal, but actively encouraged, for her to share the app with others.

Again, problem solved.

Unless for some reason Emma cannot get a copy of it. Luckily, it being an open format, several other open source apps and even a few proprietary apps play nice with the open file format.

 **Tara**

Let's really put Tara through the wringer: Tara has completed her amazing artwork with **Open Media App**. She sends it to her distributor. They tell her that the whole **Open Media App** thing went out of fashion years ago. Everyone has migrated to a whole new paradigm, blah blah blah. They can't open her files.

Tara updates to the latest version of **Open Media App** and updates her project.

But let's say that the developers of **OMA** got shuffled around and let some things slip through the cracks, and there's no way to get her data from **OMA** format to what the distributor is asking for.

She obviously needs an intermediate format into which she can export her data, then import into something else and save in the correct format.

So she looks online but finds that no one cares about **OMA** any more. It's well and truly forgotten. So Tara uses **OMA** to export to generic formats, and reassembles her project in a new application, and sends the results to her distributor. It takes some extra time and effort, but there is no loss of data and no compromise to her vision.

## 12.4 Open Source Guarantee

To sum up: it's really difficult to imagine a scenario in which a user's data is jeopardised within an open source context. I can make it inconvenient, and I can demand technical skill that not everyone will have at their fingertips, but ultimately the user remains in control, and they never have to compromise what they want to do for what their masters want them to do.

Why? Because at the end of the day, the licenses that are honest enough to explicitly make no guarantee to their users are the licenses that, at the same time, make sure that the user has all the information, from the file format specifications down to the very source code of the executable itself. The user never has to pay for access to their own data, never has to scramble to rescue their files from out-of-fashion formats, and never has to walk away from their own data again.

# 13 On Changing the Channel

I abandoned commercial software years ago, but I have been in computing for a long time now (counting a close childhood relationship with home computers) and I have lived through many major software hemorrhages, and a few hardware ones, in the industry. I have witnessed the fallout. I have personally helped survivors recover from data disasters.

You would think people would learn their lessons, but still it continues.

Just two examples: I left my old job to get away from commercial software and hardware that was holding back video editors. Some major changes in some major companies upset every user at my old workplace and people announced loudly that they were not going to stand for that kind of treatment any more. They were going to drop the software companies involved, and possibly the hardware vendor, and use someone else's product.

In some related events, I recently came across some angry posts on a few different Big Companies' user forums, wherein users were publicly announcing, on the company's own website, that they hated that company and would no longer give the company their business. They were switching to some other solution, someone else's product.

And what happened?

Well, in both cases, the customers, of course, acquiesced. They settled for what the companies had done to them, their livelihood, their workflows, their trust, their data. And as the old definition of insanity goes, not only did they do that, but they turned right around and gave the companies their money, their trust, and their data again. As if from now on, it would somehow be different.

You know what I think? I think corporations and their customers are in a very unhealthy relationship.

These events got me thinking, and I realised two things. First, I got angry that so many people believe the lie that in order to be productive, creative, and trendy, they have to use the product that has the flashiest ad. That in order to be taken seriously in the world, they have to have certain visual qualifiers, like a certain corporate logo, or a given set of window decorations on their screenshots, or a buzzword-worthy application name.

That mindset pains me, because on the other hand there is so much perceived praise in pop culture of self-made men and women. People who do their own thing, on their own terms, using their own brains and tools that they themselves created, building their own reality from scratch, from the ground up. And yet, conversely, so many people seem to judge their own creative and productive worth off of corporate logos and buzzwords.

The frightening thing to me is that 95% of the people vociferously complaining about how their beloved companies betrayed them are not actually ever going to leave. They are just going to hang around, continue using their cracked copies of the software, and continue to endure the absurd licensing schemes, proprietary data formats, untraceable and unfixable bugs, and the constant threat of having their workflows and very livelihoods completely pulled out from under them. I assume there is a fancy psychological term for this, and it seems a little ill to me.

I want to encourage anyone considering looking into other options to consider going open source from the ground up. Boot into an open source operating system. Pay for phone support if you must pay someone something, but boot into an open source OS like Ubuntu Linux, and start learning the free software that is available for it. Separate yourself from the corporate structure, and create your own reality. Do not just sit on the sofa watching the tv commercials and bad programming. Get up. Change the channel.

# 14 Use dd

First, just to make sure we're on the same page: **dd** is the UNIX command to copy, byte-for-byte, a disk or file. It has no interest in knowing filesystems or anything; it just copies from the beginning of a drive to the end. Period.

This is particularly useful when you need to do a full backup of your harddrive, or make a master image of one computer that then must be copied to a hundred others, or make an image of a thumbdrive and replicate it to lots of other thumbdrives.

## 14.1 Imaging

To make a pristine clone of a drive, re-boot into a Linux live distribution, have an external drive handy, and do something like:

    dd if=/dev/sdX of=/mnt/sdY1/image.img

Where:

  *  **dd** == the command

  *  **if** == the input file

  *  **/dev/sdX** == the drive you are cloning; the exact name varies by system. In this example we are assuming a SATA drive (if it's IDE then you'd use hdX), so the first drive on the SATA bus would be **/dev/sda**.

  *  **of** == output file

  *  **/mnt/sdY1/image.img** == here we are assuming you've plugged in a USB external drive and mounted it at **/mnt/sdY1** , and that you are creating a destination file called simply **image.img**
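
One hedged addition to the recipe above: **dd** will overwrite whatever **of=** points at without asking, so it pays to confirm device names before touching anything. A minimal sketch (the names printed will differ on your system, and the mount line is commented out because **sdY1** is only a placeholder):

```shell
# List every block device with size and mount point, to confirm which
# disk is the source and which partition belongs to the external drive
lsblk -o NAME,SIZE,TYPE,MOUNTPOINT

# The external partition must be mounted before dd can write an image
# file onto it (sdY1 is a placeholder; substitute your real partition)
# mount /dev/sdY1 /mnt/sdY1
```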

## 14.2 Restoring

The basic command to restore from a clone of a drive would be this:

    dd if=/mnt/sdY1/image.img of=/dev/sdX bs=2M

Where:

  *  **dd** == the command

  *  **if** == the input file

  *  **/mnt/sdY1/image.img** == the image file

  *  **of** == the output destination

  *  **/dev/sdX** == the destination drive being restored (the first SATA harddrive, in this example)

  *  **bs=2M** == the block size; copying in 2 MB chunks is much faster than dd's tiny default block size
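
Since a mistyped **of=** in the restore direction is catastrophic, a sensible habit is to rehearse the whole round trip on plain files first; the same commands, minus the danger (every path below is a throwaway example):

```shell
# Fake a 2 MB "disk" out of random bytes
dd if=/dev/urandom of=/tmp/source.disk bs=1M count=2 2>/dev/null

# "Image" it, then "restore" the image onto a second fake disk
dd if=/tmp/source.disk of=/tmp/backup.img bs=1M 2>/dev/null
dd if=/tmp/backup.img of=/tmp/target.disk bs=1M 2>/dev/null

# Prove the restore is byte-for-byte identical, then flush write caches
cmp /tmp/source.disk /tmp/target.disk && echo "byte-for-byte identical"
sync
```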

## 14.3 Compression

What's kind of neat is this idea:

You can compress the image as you clone it:

    dd if=/dev/sdX bs=1M | gzip -9 > /mnt/sdY1/image.img.gz

And decompress it as you re-image:

    gzip -cd image.img.gz > /dev/sdX
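
Either way, it's cheap insurance to checksum the image at creation time, so a restore can be verified before it is trusted. A self-contained sketch, using a plain file in place of a real device:

```shell
# A small stand-in for a real disk image
dd if=/dev/urandom of=/tmp/fakedisk.img bs=1M count=2 2>/dev/null

# Compress on the way out, exactly as during cloning
gzip -9 -c /tmp/fakedisk.img > /tmp/fakedisk.img.gz

# Decompress on the way back, exactly as during re-imaging
gzip -cd /tmp/fakedisk.img.gz > /tmp/restored.img

# The checksums must match, or the backup cannot be trusted
sha256sum /tmp/fakedisk.img /tmp/restored.img
cmp /tmp/fakedisk.img /tmp/restored.img
```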

# 15 What is "Intuitive Design"?

What is _intuitive_ , anyway?

A lot of us seem to have some idea of what an intuitive interface is... but really, so much of what we call "intuitive" ends up translating to "this is like something I have used before".

## 15.1 Expectation

The most basic example I can think of: "trashing" a file. Is it intuitive, or are we all just really used to the idea of having a **Trash Can** icon into which we can drag a file? Because we are all so used to that concept, it IS intuitive. But what if we were all used to having an icon in a panel or on the desktop that meant "zero out the space occupied on the harddrive by the currently selected file"? What if that was what we had been conditioned with, from the beginning of desktop computers?

We would not say things like "I want to trash that file" or "I'm going to throw away that file"; we would say "I'm going to zilch that file" or "I'll just zero out this document". That's how ingrained it would be. The icon, instead of a trash can, would be a magnet (because the floppy diskettes that were around when computers first got into homes were infamously scrambled if they got too close to magnets; that's actually real, I'm not making it up for this imagined alternate reality). No matter what computer you sat down at, whether at school or work or at home, when you wanted to remove a file from your computer, your eyes would look for a magnet icon.

If that's how things had developed, the first thing we would all look for on an unfamiliar system is a magnet icon, or maybe the keywords "zero" or "dump", or some terminology that expressed to us that the computer was going to take all the bits assigned to a file and make them void, ready for new data.

Imagine putting a computer from our world in front of someone raised with that as their expectation. A trash can icon on the desktop would go completely unnoticed as they looked for the usual magnet. They would look through menus and they would right-click, hoping in vain to find something about zeroing bits, or clearing data. You might even give them a hint, mentioning the trash can as a possible solution, and they would ignore you because, as everyone knows, you don't put digital bits into a trash can. If you put the bits into a trash can, they would go away forever, as trash does. All we are looking to do is overwrite the bits, and in order to do that, you zero them out with the magnet icon. Duh.

## 15.2 Exploration

For me, "intuitive"[ness] is a process, not a thing that happens once in the design studio and gets pushed out to the eagerly awaiting masses. Sure, intuitive interface design is the collection of a series of consistently logical steps. That's what makes it seem so obvious that when you want to delete a file, you put it in the Trash, or even that those little boxes we draw with pixels are "buttons" at all.

But it is also the collection of all the different applications on a system. It is a compounding phenomenon. The components of a system all work in similar ways, so the user can learn a few basic concepts and suddenly inherits a whole variety of more complex skills that will use those concepts in new combinations.

For instance, if **Backspace** (the key on the keyboard) is used to left-delete characters, then it should not be used as the hotkey to navigate **back** in a web browser or a file manager, because that does not involve the removal of data. But it could sensibly be used as the hotkey to delete a file in a file manager, since that is similar to removing a character from a string in a text editor.

Likewise, all the bits and pieces that a user learns initially should probably have a consistent logic that encourages the user to explore and learn more. For example, if we establish that right-clicking a file and selecting **Properties** shows us all the data about that file, then it is logical that right-clicking on a file is a good first step if we are asked to discover how to apply a custom icon to that file, or to obtain its file size, and so on. That makes sense, and it encourages the user to venture forth and try new things, because the logic of their actions is never betrayed. It would be silly, and irritating, to make it so that **Properties** shows an icon and allows the user to change it, but to disallow the user from changing, say, the file's name. Why one and not the other? Not only does that sort of illogic irritate users because they feel they've discovered a neat new way of doing something only to find it turns out to be wrong, it confuses them: what exactly is **Properties** for, then? What's its mission statement?

## 15.3 Context

Let's say I have a window with a red button and a blue button.

Is that intuitive? It is, to a degree; you're supposed to click one of the buttons. But what do they do? It's not so intuitive any more, is it? Now it's counter-intuitive. If, however, I tell you what you are doing, the context shifts, and suddenly it does become intuitive.

Even if we change the buttons from red and blue to red and green, which have implied meaning in most modern cultures, without contextual knowledge of what we are doing when we press the button, the workflow might be intuitive, but the task remains puzzling. If you use primarily GUI interfaces, you might think that a little bit of clicking and exploration might solve this puzzle pretty easily; click the green button, and see what happens. Click the red button, see what stops happening. Problem solved. But what if the buttons start or stop a daemon process that you don't see?

So intuitiveness is not just simplicity; it is simplicity plus context.

## 15.4 Now try it with Plain Text

Let's look at something that is frequently considered _not_ intuitive: a shell.

If I sit someone in front of a black screen with some green text on it and say, "OK, find the file called **needle.txt** ", they are not going to know what to do. It is not intuitive. They might hazard a guess, and they may or may not get it right. If they are adventurous enough to try, then they might type something like:

    $ find needle.txt

And that, as it turns out, would reveal **needle.txt** as long as **needle.txt** happened to be in the current directory:

    $ find needle.txt
    needle.txt

If not, then it would render this error:

    $ find needle.txt
    find: 'needle.txt': No such file or directory

At which point an adventurous user might also try:

    $ help find
    help: Command not found.
    $ find help
    find: `help': No such file or directory
    $ commands
    commands: Command not found.
    $ help
    help: Command not found.
    $ ?
    ?: No match.

Pretty dismal from an exploratory standpoint, but not all that different to anything else we have looked at. The interface is intuitive; the user understands that they are meant to type words at the computer. The user learnt through exploration that the words are called "commands", and the user understands that several words put together create more complex commands.

The _workflow_ , however, is clearly not intuitive, because no matter how the user might try, getting help from the computer is just not happening without knowing the secret of dashes (such as **--help** or **-h** ) or the **man** or **info** commands (such as **man find** ).
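
Once that secret is known, the original task becomes easy; for instance (the directory and file below are throwaway examples, safe to run anywhere):

```shell
# Build a little haystack so the example is self-contained
mkdir -p /tmp/haystack/deeper
touch /tmp/haystack/deeper/needle.txt

# man find reveals the part the error messages never explained:
# tell find WHERE to look, and WHAT name to match
find /tmp/haystack -name needle.txt
```

Which dutifully prints the path to **needle.txt** , subdirectory and all.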

> Sidenote: In my very humble opinion, **help** should be aliased to **man** or **info** until we all finally decide to get around to implementing proper cheatsheets.
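
For what it's worth, that aliasing is nearly a one-liner. A BASH sketch (a function rather than a plain alias, since aliases don't take effect in scripts; this is my suggestion, not anything standard):

```shell
# In ~/.bashrc: let "help <topic>" fall through to man whenever the
# BASH builtin has nothing to say about the topic
help() {
    builtin help "$@" 2>/dev/null || man "$@"
}
```

With that in place, **help cd** behaves as before, while **help find** lands you in the manual page.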

The workflow, of course, is familiar to a hardcore geek. Even if I sit a geek down in front of a different Unix than what that geek is used to, the familiar workflow of _how_ a shell works would be enough to render the shell's unfamiliar interface fairly intuitive. The **find** example is too simple, so instead, let's imagine we put a geek who is used to BASH in front of a **tcsh** shell with the goal of setting a variable:

    % foo=1
    foo=1: Command not found.
    % let foo=1
    let: Command not found.
    % env
    % set env
    % set foo=1
    % echo $foo
    1

Or something like that, and the problem will be solved. The way it has been solved is expectation (the geek knew the keywords to investigate), plus experience (the geek knew how to use and explore the shell, and gets around to the solution with a little bit of exploration), and context (the geek knew what the goal was, and some specifics about the goal, like when we say "set a variable", we are asking for an "environment variable" and we are also, sneakily, providing the **set** keyword to help jog the memory).
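
For reference, here is the difference the geek was groping for, side by side (a sketch runnable in BASH; the **tcsh** half is shown in comments because its syntax is not valid BASH):

```shell
# BASH and other POSIX-style shells: a bare name=value assignment
foo=1
echo "$foo"

# tcsh and the C-shell family want the "set" keyword instead:
#   % set foo=1
#   % echo $foo
# and use "setenv FOO 1" where BASH would say "export FOO=1"
```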

## 15.5 Intuitive Design

Intuitive design is a collection of many things:

  *  **Expectation** (or "pre-conditioning", if you like). Expectation is the sum of past experience, and it is through that lens that a user approaches an interface.

  *  **Experience** (nurtured by consistency) is the _process_ of intuiting how an interface will work based on how it has been working so far. It is, if you like, the hypothesis that the user makes about how something works.

Expectation ensures that when a user's hand automatically goes to click an **OK** or **Save** button, that button is where the user has learned, from experience, it should be, regardless of what application the user is operating: the lower right corner.

  *  **Exploration** is the user testing their hypothesis, formed from experience and expectation, on how an interface works. An interface should _encourage_ exploration by using reasonable internal logic.

When a button is _not_ where the user expects, then the user should feel confident and capable enough to explore the interface and learn its unique traits. This is a designer's fallback, but the lack of consistency might irritate a good user, and confound a poor user.

  *  **Context** is whether or not the user understands the task and goal at hand. If the user has no context, then the design may be intuitive but the workflow remains a mystery.

A single button labelled "start web server" makes starting a web server easy, but the user does not walk away a sys admin.

Anything we as programmers can do to bolster the consistency and cohesiveness within our targeted platforms is generally a _good thing_.

And anything we as users can do to stop buying into the idea that one vendor's "simplicity" automatically means "more intuitive" would probably also be a _good thing_.

# 16 State of Independence

Here are three reasons why independent, open source software is important to me.

## 16.1 Independence.

Mega corporations run everything. I would rather use Linux, which is programmed by independent programmers dispersed across the globe.

## 16.2 Control.

Every time someone has to shrug their shoulders in resignation because their computer vendor has imposed something on them, a fairy dies. I have never killed a fairy by using Linux.

## 16.3 Ecologically Responsible.

Modern operating systems force you to buy a new computer if you value your online security or want new features. Linux is designed to work on old and new computers alike. It _never_ encourages you to throw out a computer. Possibly an extreme case in point: my laptop is from 2004, and it runs the latest release of Debian Linux.

It won't even load the installer of its "native" OS.

Another extreme example: the website you are browsing right now is on a server that uses 10 watts of power. Less than most household lightbulbs (even the fancy fluorescent ones)!

# 17 So You Want to be a Programmer

Due largely to the success of video games and the Internet in popular culture, I have met quite a few people who tell me that they want to become programmers. The problem is, not everyone understands exactly what it means to "be a programmer".

Some years ago, I started teaching multimedia classes at a local film co-op. Gradually, I came to realise that the question of "what do you want to do when you grow up?" follows a pretty reliable trend. Right now, computers are sexy and the jobs in the computer field sound exciting. That's great; similarly, ten years ago, becoming a filmmaker was sexy, and the number of students in my classes whose life goal was to emulate [ _name of Hollywood director_ ] was overwhelming. That is, until they learnt that making a movie is not glamourous at all.

And before that, astronauts were really amazing and everyone wanted to be one...until you found out that you have to learn science and biology and stuff.

I do not want to discourage people from becoming programmers. In fact, quite the opposite; I sometimes think that learning even basic programming skills ought to be required before you are allowed to purchase a computer because if it were, people wouldn't settle for the mindless slop that Windows and Macs try to pass off as operating systems and application sets. However, I would hate for people to spend a lot of effort pursuing something that turns out to be far less exciting than they thought, or to get to the brink of achieving their goal only to find that they spent all their time playing games instead of learning how to make them.

So what exactly _is_ a programmer? I mean, what do programmers do all day whilst staring at screenfuls of code? And, likewise, if you intend to seriously investigate becoming a computer programmer, what concepts do you need to start playing around with in your spare time?

My friend, you have come to the right place. Read on!

## 17.1 Systems

I have said it before, and I am saying it again: the best way to get into programming is to start using Linux. Yes, even if you are absolutely, positively going to programme for other platforms, it all (not literally _all_ , but pragmatically) starts with C and the POSIX filesystem. You have to understand how parts of a complex system fit together, and POSIX excels at that.

Get used to it now, and take advantage of an open and fully-hackable codebase.
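As a small, hedged taste of what "parts fitting together" means: even from Python you can poke at the POSIX filesystem and see that everything, a directory included, is just a file with metadata. This is only a sketch, and it assumes you are running it on a POSIX system.

```python
import os
import stat

# On POSIX, nearly everything is a file; stat() exposes the shared metadata.
info = os.stat("/")

print(stat.S_ISDIR(info.st_mode))        # the root "/" is a directory
print(oct(stat.S_IMODE(info.st_mode)))   # its permission bits, in octal
```

The same `os.stat()` call works on regular files, directories, and device nodes alike, which is exactly the kind of uniformity POSIX is good at.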

## 17.2 Parsing

Incredible as it may seem, much of programming is just pushing a bunch of letters and numbers around within RAM. It might be hard to comprehend, but all those villains and all that loot, or all those web forms and buttons, they are all represented by simple variables in a sometimes not-so-simple array or table, and it's your job to move them to where they need to be pending user interaction. It's a lot like using a spreadsheet, only without all those pretty icons to click.
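To put that in concrete terms, here is a minimal sketch (every name in it is invented for illustration) of a game "world" that really is just a table of positions and variables being pushed around:

```python
# The "villains and loot" are just entries in a table keyed by position.
world = {(0, 0): "hero", (2, 1): "villain", (1, 2): "loot"}

def move(table, entity, new_pos):
    """Shuffle an entity from its old slot in the table to a new one."""
    old_pos = next(pos for pos, name in table.items() if name == entity)
    del table[old_pos]
    table[new_pos] = entity

# "User interaction": the hero steps one square down.
move(world, "hero", (0, 1))
```

Everything a player sees on screen is drawn from a structure like this; the programmer's job is mostly keeping the table correct as events come in.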

## 17.3 Logic

A lot of time in programming is spent working out how to make certain things happen only under a small set of very specific conditions, and under no circumstances under any other conditions. But how does one define those conditions? And what about that one condition that could possibly happen even though it's never supposed to happen?

Yes, that's quite often what a programmer ponders for hours upon hours, trying out different for-loops nested in while-statements nested in a few if statements. And just when you think you have it right, you find one notable exception that throws everything into complete ruin and you have to start over. And then you start pondering regex.
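Here is a hedged sketch of that kind of condition-juggling; the door scenario and every name in it are invented purely for illustration:

```python
def first_openable(doors, keys):
    """Return the first door that opens under exactly the right conditions,
    including handling the jammed door that is 'never supposed to happen'."""
    for door in doors:
        if door["jammed"]:          # the impossible-but-possible case
            continue
        if not door["locked"]:      # unlocked and not jammed: open it
            return door["name"]
        if door["lock_id"] in keys: # locked, but we hold the right key
            return door["name"]
    return None                     # no door satisfies the conditions
```

Notice how much of the code is just fencing off cases; that ratio is fairly typical.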

## 17.4 Maths

I use both more and less math than I imagined back when I knew nothing about programming. I use math a lot, and mathematical principles even more, but I would not say that I am good at math.

The important thing about math and most programming is that you understand key math principles. If you are making a game or a graphical plugin or whatever, you ought to know at least how the cartesian coordinate system works. If you are doing anything that requires iterations (which is nearly everything in programming) then you should know how to make that sort of thing work with comparators and things like that. If you are doing a lot with pixel values, understanding hexadecimal is handy. Binary is always helpful.

And that sort of thing. Unless you're doing hardcore crypto, low-level VFX, or OpenGL/Vulkan programming and other really low-level stuff, you probably won't be doing non-stop math, but in programming, math pops up basically all the time. Little maths, not big maths; just little ones that will either give your brain a bit of a workout, or bring your work to a grinding halt if you don't have a good understanding of the basics. So learn at least the basics.
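As one small example of the hexadecimal point above, here is a sketch of splitting a hex pixel value into its decimal colour channels:

```python
def hex_to_rgb(pixel):
    """Turn a hex colour like '#ff8000' into decimal (red, green, blue)."""
    digits = pixel.lstrip("#")
    # Each channel is two base-16 digits; int(..., 16) converts them to decimal.
    return tuple(int(digits[i:i + 2], 16) for i in (0, 2, 4))

print(hex_to_rgb("#ff8000"))  # (255, 128, 0)
```

That is "little maths": no calculus in sight, but if base-16 is a mystery, this one-liner stops you dead.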

## 17.5 Collaborate

Whilst programmers often have a reputation for being anti-social hermits, which, admittedly, is _sometimes_ justified, there is actually quite a lot of collaboration that must be done. The myth of the lone programmer sitting in a dark corner creating the latest killer app is, like the myth of the lone director making the latest blockbuster ALL BY HIMSELF, a little skewed toward reclusive romanticism. No matter how much of a one-person show you are, there's always some library or module or problem that you need help with.

Or else, you're part of a team and it's part of your job to work with people. This means Agile scrums or Kanban group hugs (or whatever it is that Kanban does), e-mails that neither flame nor troll, maybe even a phone call or face-to-face meeting sometimes. You don't have to love it or be terribly great at it, but you must not fear it, either.

## 17.6 Computers

Sounds crazy, but a lot of what a programmer does is _Know Computers_. That's another reason I urge people to get into Linux sooner than later, because if there's one thing that Linux empowers you to do, it's to understand how computers work from hardware through to the kernel on up to the software, and all those little stacks in between. When you are programming, stuff goes wrong. A lot.

Often.

Frequently.

So understanding how computers work helps you know where to start looking for the problem, and from there you can probably fix it, or you'll know who to call and when to call them.

## 17.7 Learning

The good news is that once you learn one or two programming languages, you can learn any new language pretty much on an as-needed basis. That is, if you've learnt Python and C++, then if someone offers you a really great job in a far-off land developing with Java, you could comfortably take the job after a little practice with Java. In other words, you don't start over, generally; a new language is just a variation on a theme. Obviously there are exceptions, and you wouldn't want to take a job as an EXPERT programmer in some language you've never actually used, but the point is that once you get up to speed on the practice of programming, you have a toolbox that is pretty easily expandable.

The bad news, then, is that there is always something new to learn. Wait, that's not actually bad news. That's good news! Well, unless you dislike learning new things.

The thing is, whether it's a new language or just a new library for the language you know and love, programming means constantly learning how to use new tools. Which means you are always learning new tricks. Which is cool! As long as you are into that sort of thing.

## 17.8 Programme

Yes, programmers programme. All day long, and often for the better part of the night. If you want to get a job doing programming, you need to love the act of programming. Get involved with an open source project now; start surrounding yourself with programmers and with the culture and discipline of software development (for instance, if you didn't know what "agile" was when I slipped a mention of it in earlier, then you are not yet immersed). You might think "How can I do that when I don't yet know how to programme?" but believe me, you can start learning. Put the game controller down, or close your web browser, or stop whatever you're doing that is _not_ programming, boot into Linux, and start learning to programme. Start out slow, with simple, even stupid, stuff. But do it until you get addicted, and then you're ready.

If you don't find yourself addicted (and I pretty much mean literally addicted) to programming, then it _may_ be that you're not a programmer. Could be that you are and just haven't found the language or angle that compels you to do it compulsively, or it could be that you are just a computer nerd and would be great at doing something else in a related field (QA testing, project management, UX design, or whatever else). OR it could be that you would not be happy as a programmer, and all you really want to do is play games or browse the web for fun, and do something else that you really love for income. That's not a bad thing, but it is something you want to learn now and not later.

So that's what programmers do, more or less. What are you still doing reading this? Go programme something!

# 18 Abstracting the File Chooser

User experience is important, and every OS is always striving to make it better. One thing that Linux has in its favour, in terms of The Big Picture, is modularity. Modularity is great, but sometimes we let it get out of hand, such as in the concept of File Choosers.

A "file chooser" is what I'm calling that dialogue window that appears whenever you Open or Save a file in any given application. On Linux, I can count, so far, a staggering eight or nine different file choosers that a user might encounter on any given day. Sometimes it confuses me and I've been using Linux for a long time. Imagine what it does to a new user's brain.

  * There is the standard GTK file chooser that a user sees in GIMP and GPodder and other such applications.

  * And then there's the fairly standard KDE file chooser that you'll see in Krita and Kate and Kwrite and so on.

  * Except when it's not so standard, such as with Cervisia.

  * Similarly, there is the file chooser that comes along with the Qt toolkit without the KDE special sauce.

  * And the presumably Java-based file chooser that ships with Libre and Open Offices.

  * And then there are any number of random file choosers from other toolkits, such as one I saw in Fluidsynth-DSSI.

  * An interesting one that I saw with geeqie.

  * There's an old one from XMMS.

  * And still another from xpdf.

Frankly, there are probably more, but you get the idea.

It is dishonest for me to claim I see all of these every day. I really only encounter, on a daily basis, the choosers for Scribus, GIMP/Firefox, various KDE apps, XMMS, Qtractor, Fluidsynth, and xpdf.

But that's still seven different file choosers to deal with every day. I make a shortcut in the side panel of one, and it's anybody's guess as to what other file chooser it will show up in.

## 18.1 Proposed Idea for a Solution

We _could_ create one file chooser and declare it the One True Linux File Chooser, but that's not the GNU way. And it shouldn't be. If you hated the file chooser "we" chose (I'm imagining that there is a "we", even though that's far too collective a term), you'd have no recourse.

On the other hand, the advantage is clear: you would only have to learn one file chooser. This one file chooser would be more integrated and it would feel more "polished" over-all, and certainly it would be less confusing, because it would make the interactions with your computer more transparent, and that's a good thing.

But we use Linux! We subscribe to the Unix principles of modularity!

Modularity provides the solution: if we abstract the file chooser by way of an **xdg** (or similar) variable, we can make it such that when an application calls the FILECHOOSER function, the user's preferences are pinged, and the appropriate file chooser from the user's favourite toolkit is displayed. Simple as that.

I imagine this would best be done via the Free Desktop specification, such that there is some universally-respected variable deciding what file chooser skin the user prefers to see on their system. There would probably be some **dbus** magic that would need to happen, too, but it's all possible.
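No such variable or spec actually exists yet, but purely as a sketch of the dispatch idea, here is what the lookup might look like; the `XDG_FILE_CHOOSER` name and the chooser table are invented for this illustration:

```python
import os

# Imaginary table mapping a user preference to a toolkit's chooser back-end.
CHOOSERS = {
    "gtk": "GTK file chooser",
    "kde": "KDE file chooser",
    "qt": "plain Qt file chooser",
}

def pick_file_chooser(default="gtk"):
    """Ping the user's (hypothetical) preference, falling back to a default."""
    preference = os.environ.get("XDG_FILE_CHOOSER", default)
    return CHOOSERS.get(preference, CHOOSERS[default])
```

An application would call something like this instead of hard-coding its toolkit's dialogue, and the user's one setting would follow them everywhere.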

Even if it's not a trivial change, it _can_ be done; it's all free software, after all, and we as a Linux community did conquer the old issue of unified notifications (let it be known that we did it before any other OS did, a fact that will probably fade in time), which is another thing that I and others have written about in years past.

The Free Desktop specs are a powerful system, helping us unify our open source desktop experience. And let's face it: three file choosers is too many, but nine is just plain offensive.

# 19 Howto Write a Howto

There are lots of "how to" articles and tutorials out there. Some ship with a product, others are written by helpful volunteers online, others appear in user forums. This seems like a great thing, until you go searching for something that seems so simple, and you end up with five different how-to articles on the same topic, none of which actually work.

I've written a few howto articles, I've been paid to write user manuals for software products, and I have been paid to teach people about computers and multimedia, so I'm going to throw out my thoughts on how one might construct a good "how to" document, and maybe it will help someone help someone [sic].

## 19.1 Proof of Concept EARLY

The first and most immediate goal of a how to article should be to get a person up and running. This obviously depends a little on what you're teaching, but generally speaking, the sooner the user can achieve their goal, the better.

Why? Because any sensible reader doesn't trust you until you've proven that your article really is going to work. I mean, anyone who's been on the internet looking for tips and tutorials has already learnt that most articles are not going to work. So if you start with a preamble on video frame rates and codecs or network protocols and fancy command lines and all this other (possibly really important) stuff, then that reader will (or should) abandon your article for another one that meets their goals and expectations from the start and proves to them that it's worth reading from start to finish.

This also tells them that their search is over. You wanted to edit video in Blender? Hey, you've just imported a clip and can play it back; you're doing it! Now all you have to do is learn the finer points of the process, but the thing you were seeking to learn is now done. Stand down from Red Alert.

## 19.2 Separate Instructions from Explanations

> Calm down, I know what you're thinking. I'm not saying what you think I'm saying.

By this, I mean that if I'm explaining to someone how they can compile source code, then I am going to give them the series of commands they must issue with minimal explanation of why they are doing it.

That's literally just so that the information doesn't get lost in paragraphs and paragraphs of long explanations about concepts they don't even understand yet.

Let me be clear: I'm NOT saying (to reiterate: _not saying_ ) that you shouldn't tell people why they are doing something or how something works. I believe that giving people the "why" is hugely important, and it helps them learn, more...better. Just make sure they know _what_ to do, too.

## 19.3 Tell the user Why they are doing something

Told you I believed in this. I really do; if I tell you to issue a random command such as **lsmod | grep -i b43** then what have you learned? Well, if you came to the article to learn how to get your wireless card working, it has taught you that that's a command that will help you get wifi working, but without any explanation, you have no idea that **lsmod** would also be useful when you then turn your attention to getting your kerflutzer working, or...whatever. So in the end, you have learned very little.

Explanations both prevent and encourage bug reports about your article. A good explanation prevents people from reporting bugs in your article because it enables a user to figure stuff out on their own...because the user actually has some understanding of what the goal is. Maybe they're not on the same Linux distribution as your article was written on; without any explanation of the steps you are telling them to take, they would be emailing you for specific instructions on how to do the same steps on their distribution. But with clear explanations of the concepts, they might be able to figure out that while your article references **/usr/local/bin** , they need to use **/usr/bin** as their path. No useless bug report to you, and the user still obtains their goal.

On the other hand, if you are making too many assumptions or have forgotten a step or something has changed in the code since you wrote your article, then users should report these as bugs. But bugs are hard to find when everything in the article is just a series of mysterious magical commands, whereas if my article explains clearly what I believe the user will see and why they should see it, and it clearly is not working, then the user will feel confident that this is worthy of being reported.

And those are the bug reports you want, if you care at all about your docs.

## 19.4 People are there for the Information

Personality and clever asides are nice, but, as they say, _I didn't ask for your life story_. If you must, provide a little bit of background on why you are writing the article, or what qualifies you to write the article, or give some caveats so that you are or are not seen as the definitive source on the topic, and then get on with the reason people are reading your article: The Information.

I am not saying there is no time or place for personality, and I am sure there are some good howto articles out there that do exist in the midst of a long narrative about the author's life, but they tend to be more difficult to parse and, for the author, more difficult to maintain.

## 19.5 Update the article

This isn't art, it's technical writing (not that there isn't art to technical writing, but go with me on this). If you got something wrong, or if you were unclear on some point, and then someone emailed or commented at you and gave you more and better information, then update your article with that. Don't try to preserve the original artifact for posterity and add addendums and corrections later in the article; no one cares, and in the end it just makes people puzzle-piece your article together to make any sense of it.

### 19.5.1 Comments are not Edits

If someone comments on an article and provides useful corrections or additions, integrate them into the article. Comment back and thank them, and make it clear that the changes have been integrated into the text. If you're especially nice, maybe give them a quick "thanks" in the text of the article so that their contribution is recorded even when the text of the article (I assume you are publishing under the GNU Free Documentation License or Creative Commons) is separated from the comments.

## 19.6 Provide a Definitive End

People often come to your HOWTO confused, lost, at their wit's end. Be nice to them. Guide them through the tutorial, and at the end, tell them it's the end and tell them where they should or can go from there. Or, if there's nowhere to go from there and the end really _is_ the end, then tell them so, and wish them well. But end the article. Like this:

# 20 The AT&T Guide to UNIX

I was at a used bookstore the other day looking for old sci-fi paperbacks, and on a whim I thought I should look through the computer section as well. So I asked the shop proprietor if he had a computer book shelf, and he said well, yes he did, but it was all very out of date, and I said "Good!"...because the very reason I wanted to look through the computer book section was to see if I could find any old gems from the yesteryears of computing.

So I looked through the obligatory "Word 95 for Dummies" and "Visual Basic Programming Guide" books, and finally started coming across a few old Mac-related books, and then onto some really big "Fortran for VAX" tomes, and then finally, tucked between a COBOL and an Electrical Engineering book, I found a small-ish yellow wirebound book with the AT&T logo on it, and the title?

 **UNIX System User's Handbook**

And sure enough, it's a handbook on how to use the [already] decade-old AT&T Operating System known as UNIX.

Obviously I purchased it.

It's pretty cool; it's got a lot of commands in it that are still relevant today (in fact I've learned a few nifty new tricks), and some that were apparently specific enough to AT&T not to have survived past whenever AT&T finally gave UNIX away to Berkeley and sold it to somebody else and...well, whatever the heck happened.

Amazingly, the original price for this spiral-bound manual was $18.95. Not that I consider that a lot, given that learning Unix is itself priceless, but I'd never seen a spiral bound book sold for that kind of money before, much less one from three decades ago.

Almost as neat is the fact that there is a Xerox'd copy of a **Vi Cheat Sheet** in the front flap of the handbook, probably left there by Christopher Aiken (the former owner of the book, which I know from an inscription on the inside flap). The cool thing about this, aside from being a slice of history in itself, is that **vi** , of course, lives on as **Vim** ( **vi** _improved_ ) and happens to be a very popular text editor on Linux even today. So it's kind of neat to see a cheat sheet for its former incarnation.

In the back is a pamphlet, published by some place called **SSC** , titled the "UNIX System Command Summary for Berkeley 4.2 & 4.3 BSD". Obviously the handbook itself is for System V UNIX.

So, basically, it's a nice curiosity item, as well as being a pretty darned helpful review of essential UNIX concepts. The handbook itself dates back to 1982 so it's not like it's ancient (I guess by ancient I mean "1970") but it feels old. Certainly it represents an older *nix, so it's cool to have.

# 21 Float vs Inline-Block

I had some really good teachers when I was learning CSS, most notably SnackMachine[B] and akanik, both of whom basically taught me everything I know about HTML and CSS. But as many of us will, I also looked around at random websites for lessons and tips, and sometimes, I'm afraid, those lessons are poorly structured.

To wit, when people (and by "people", I mean me) are learning CSS, they often get very confused about how to get certain elements to move left or right on the page. Then they learn about **float** , which usually solves their problems...at first. And then they use float, which by definition removes an object from the CSS flow of the document, for _everything_ , and wonder why CSS appears to make no sense.

The problem with **float** is that it gets over-used. But most people don't understand what it is doing, so they use it for _one_ of its effects and then once they start doing more "serious" web coding, all of its other inherent attributes screw them over.

## 21.1 What Does Float Do?

The float attribute, by design, removes an element from the natural flow of your CSS, meaning that all the logic you have learned about CSS and all the techniques and tricks you have developed to manipulate the positioning of your elements are rendered utterly useless. Nothing works any more; you try to use the width of a div so you can center something, but the float'd element isn't _in_ that div any more, so it can't constrain itself to the div's width. Or you try to move other elements around the float, but the float moves around in funny ways, because _it's not in_ the flow of the CSS. That's what float does: it takes its element out of the CSS flow and makes it a free-floating radical.

Sometimes you do want that effect, but oftentimes we use float only because it's what we first learned, and it does _appear_ to get the job done. But in fact, many of us use float when **inline-block** is far more appropriate.

> To be clear: I am not saying there is no place for **float** , I am only saying that it gets over-used. Do not feel compelled to tell me all the places that **float** should be used. Believe me, I have used **float** and loved it.

## 21.2 display: inline-block;

In case you were not aware, CSS sees some code elements as "block" items, and block items _always_ get a new line. In other words, if you had a bunch of **<p>** tags in a document, you would expect CSS to place each paragraph at the beginning of a new line, yes? Well, that's what it would do. And it will also do that for **div** tags and **h1** tags, lists and list elements like **ul** , **ol** , and **li** , tables, and probably a few others. Point is, when you use a block element, it _inherently_ gets placed on its own new line. This is why it feels so difficult to achieve two columns in a website layout; you want to use a **div** but you cannot get two divs to co-exist on the same line. So you float one, right? Wrong, and here's why.

Take a look at this simple webpage. It consists of two div's, each one given a different colour.

    <!DOCTYPE html> 
      <html lang="en">
      <head><title>Inline Exercise</title>
      <style>
      body {background-color: #000; color: cyan;}
      #red {
       background-color: red;
      }
      #blue {
       background-color: blue;
      }
      </style>
      </head>
      <body>
       <div id="red">
       Hi some text.
       </div>

      <div id="blue">
       Second div.
      </div>
      </body>
    </html>

Sure enough, each div gets its own line, and is 100% wide, whether we intended it or not:

Hi some text.

Second div.

How to make them sit side by side? Well, if you want them on the same line, you must convert these block elements to **inline-block** elements. They'll still be boxes like we generally think of div's being, but they will be boxes that are able to sit side by side. Well actually that's not entirely true yet; this step will _appear_ to not work, but bear with me.

    <!DOCTYPE html> 
      <html lang="en">
       <head><title>Inline Exercise</title>
        <style>
        body {background-color: #000; color: cyan;}

        #red {
        background-color: red;
        display: inline-block;
        }

        #blue {
        background-color: blue;
        display: inline-block;
        }
        </style>
       </head>
      <body>
      <div id="red">
       Hi some text.
      </div>
      <div id="blue">
       Second div.
      </div>
     </body>
     </html>

Which renders this:

Hi some text.

Second div.

Why has nothing changed? Well, since they are boxes that we want to sit side by side, we probably also want them to be less than 100% wide, or else they won't fit on the page (because the page itself is only 100% wide, so how could two things, each measuring 100%, possibly fit side by side?), so we'll make them 45% wide. Watch.

    <!DOCTYPE html> 
      <html lang="en">
       <head><title>Inline Exercise</title>
        <style>
        body {background-color: #000; color: cyan;}

        #red {
        background-color: red;
        width: 45%;
        display: inline-block;
        }

        #blue {
        background-color: blue;
        width: 45%;
        display: inline-block;
        }
       </style>
       </head>
      <body>
      <div id="red">
       Hi some text.
      </div>
      <div id="blue">
        Second div.
      </div>
     </body>
     </html>

And sure enough, we now have divs that sit nicely side by side, each 45% wide.

Obviously we could then manipulate margins and padding and other things to position the divs where we want them to go. And best of all, the divs will obey, because they are still bound by the laws of CSS, rather than left floating about with no point of reference as to what everything else around them is doing.

Bottom Line: Before you think of using float, consider **display: inline-block;**

# 22 Petty Computing Issues

People ask me why I dislike computer corporations. It is easy to tell someone, and it's even easy to show them, because people get screwed over by these companies on a daily basis in very real ways. The problem is that it doesn't affect everyone at the same time; if it did, maybe a collective awareness would arise and people would (maybe?) put a stop to it. The thing that concerns me is that when it does happen to individuals or to small groups, most people, Stockholm-syndrome style, make excuses for the corporations.

We do this a lot, in many ways:

  * For example, if you say to someone "Hard drives formatted for Apple computers use a secret format of journalling so Linux cannot write to them", what most people hear is "Linux is too primitive to write to Apple drives", when what's actually being said is "Apple is holding computers back from realising their full potential by refusing to play nice with others".

  * If you say "Linux-formatted hard drives are nice, because they use open source formats that anyone is free to include in their OS, for free", then when people go to read an EXT4 drive and their OS refuses to read it, it again sounds like Linux is at fault. After all, we're using hard drives that no one else can read.

The problem is, it doesn't have to be this way. It's only this way because the computer corporations believe that it's better to block inter-operability in what is colloquially called a "jerk move" rather than share even the barest minimum of information such that computers can actually talk to one another.

In fact, the corporations could even get away with _not_ sharing _any_ information if they would at the very least accept formats from other sources than their own dev team.

Corporations won't let indie developers read and write to their drives, and won't bundle indie software with their OS so that people can read and write to their drives. They are both refusing to provide information and declining to accept it; it's the shortest path to an impasse.

To make matters even worse, if you look at the letter of the law, independent developers cannot even be sure that they have implied permission to use formats considered to be proprietary-but-ubiquitous. For instance, Microsoft sued the TomTom GPS company for using FAT, a decades-old file system that no one should be using, but everyone does because it's one of the file systems that so many people have reverse-engineered that it ends up working for pretty much anything (I use "work" lightly).

It's the same sort of issue that artists have with that eternally gray area of "fair use". It's all well and good until someone gets sued.

This is all rather shocking when you think about the fact that the computer vendors create file systems to store and transfer data, and yet by their own actions they cause their technology to fail at one half of its very purpose.

## 22.1 Moral

So what we are dealing with, as a technologically progressive society, are corporations that:

  * will not tell you how their filesystem works

  * will not include free filesystems in their products so that you can use your own filesystems on their systems

  * threaten litigation if you use their file system and they are in the mood to sue

In other words, we aren't being held back because of some physical or scientific barrier, we are being held back because of greed.

## 22.2 But Wait, There's More

Hard drives are pretty essential to computing, but other examples abound; the same conversation occurs when speaking of the codecs necessary to play back or encode video or audio, image formats, office documents, and on and on.

The sad thing about this is that it's the corporations who are making these calls and their own users who suffer. If you are one of the thousands of people who have been burnt by the HFS+ filesystem, or one of the millions burnt by FAT, it's not as if you are really given valid alternatives should you decide you want to try a different solution. You can't use an EXT or JFS or XFS filesystem; there's just no real support (yes, I know drivers blah blah; I'm talking about pragmatic, real world support to make an alternative file system the thing you use for all your data).

Same goes for file formats. You can't import FLAC instead of WAV, or XVID instead of H264, because the applications you are sold won't accept them.

What have you done to upset these software vendors? You purchased their goods and their software, you have invested time and effort into learning their systems, and in the end you are punished for it.

## 22.3 Solution

It's not right, and that's why users of independent operating systems stand against these companies. I do not buy Apple products, and I don't buy computers that ship with Microsoft Windows as their inbuilt OS. You can find computers with independent operating systems available from System 76 and ZaReason.

# 23 Race to Better Security

As ever, there has recently been a smattering of security vulnerabilities found in this library and that application, and it so happened that vulnerabilities were discovered in two similar libraries, one from the proprietary world and one from the free software world. Since the internet is famously louder than it is intelligent, comments about these discoveries were pretty much a logical Möbius loop.

Here are some common complaints and criticisms, and why I think they miss the point.

 **Open Source has security vulnerabilities too, so what good is it to be open?**

First of all, all publicly-declared security vulnerabilities are, by nature, open source.

It's not always open source _code_ (although it may be an open source sequence of steps detailing how to exploit the vulnerability) but it is an open source of information.

This is what people don't quite get about the open source movement; it's about a whole heck of a lot more than just literal software code.

Make no mistake: it's by hook or by crook that some vulnerabilities are exposed; non-open software vendors **do not want** their users to be aware of flaws in their blackbox'd code. They are not the ones releasing information about this stuff.

In fact, closed source companies almost never willingly announce a vulnerability themselves (they often acknowledge it only once they have been outed by security researchers). The proprietary "solution" to security is to keep the users in the dark, and then casually push a patch out to their system under some obscure heading like "System Update for 0087-24-1335 02/03/2019".

In some cases, there is an accompanying URL that especially inquisitive users may click to find out more information, but generally it is purposefully obfuscated.

Security researchers, on the other hand, believe that _open information_ leads to _better security_.

Why?

  * If users know that a vulnerability exists, then they can make intelligent choices about how to work around that vulnerability until it is patched

  * Users will know to obtain or allow a patch when a fix is available for something that is broken.

Simple as that.

In other words, security alerts are a form of open source.

 **Open source software is going to have it patched overnight. It'll take days for $COMPANY to patch theirs.**

This may or may not be true, depending on the details of the exploit. Obviously it's an overly-broad statement displaying blind faith in open source software.

While it's often true that as a part of the published exploit, a solution is also provided, that doesn't necessarily mean that the open source developers are going to be able to get the fix done overnight (although in mission-critical instances, you could do it yourself).

But that assumes the fix is just handed over along with the exploit reveal, and that's not always the case. If the project developers have to figure it out on their own, it might take a while, or if they introduce another bug whilst fixing the first, that could complicate things.

It's silly to just assume that open source devs will be able to fix any bug overnight while closed source cannot. It's not a sound assumption, even if it is true that open source usually does beat everyone to the punch. Don't count on it, and don't pass it off as some great truism. It's a goal, not a fact of life.

 **Proprietary software has more money and professional programmers, so it is just better.**

Security flaws, like diseases, are pretty agnostic to profession, comp sci degrees, money, and so on. Security flaws are _flaws_ ; they happen to everyone for lots of different reasons. It isn't simply that fake programmers are doing open source projects and _real_ programmers magically get all the paid jobs at big companies. In fact, there are PhD-level programmers working on open source, some of whom are paid for their work and some of whom do it as a labour of love, and there are hobbyist-level programmers working on open source, and there are PhD-level programmers at big companies, and there are hobbyist-level programmers at big software companies.

Each of these programmers will encounter a security flaw at some point in their career. It's just how this stuff works.

## 23.1 Bottom line

Security vulnerabilities are bad. It is not appropriate or constructive to take the opportunity of a security vulnerability to point fingers and blame people or programmers.

The things to focus on when a security vulnerability is discovered are:

  * How will it affect users?
  * How can users work around the vulnerability until it is patched?
  * How can a user obtain a patch once one is available?

And that is all. Leave all arguments about software design and ideology at the door until the problem is fixed. Then take the debate to a forum where it is useful, rather than spamming the internet with so much fanboy noise that users can't differentiate between an actual security threat and a flamewar.

Security flaws will never be eradicated, any more than everyday mistakes in any other area of life will be. But at least we can all strive to educate users so that they understand why security vulnerabilities happen, how they can work around them, and how they can compute securely on an everyday basis.

This isn't a religious war, it's education.

# 24 Security and Upper Management

# 25 Cross Compile

If you run 64bit Linux but want to compile an application and send it to a computer running 32bit Linux, then you need to cross compile. You might do this because all of your 32bit machines are slow and you would rather compile quickly on your development machine, or because your 32bit machines do not have a compiler installed and you do not want one installed, or to accommodate users who do not know how to compile their own version of your app. Whatever the reason, cross compiling is a neat trick to know.

> This article is about compiling for one architecture whilst on a system of a different architecture. It does not cover how (at least, not exactly) to compile for a different OS entirely. The principles are the same, but I have never had occasion to compile for any OS other than the exact one I am compiling on, so I cannot write a good article stepping you through how it's done. However, reading this might give you a good head-start in figuring it out on your own.

Let's say that we have my simple dice rolling example programme from my Programming Book, but written in C++ so that it will not run without being compiled:

    #include <iostream>
    #include <cstdlib>

    using namespace std;

    void lose (int c);
    void win (int c);
    void draw ();

    int main() {
        int i;
        do {
            cout << "Pick a number 1 to 20: \n";
            cin >> i;
            int c = rand() % 21;
            if (i > 20) lose(c);
            else if (i < c) lose(c);
            else if (i > c) win(c);
            else draw();
        } while (1==1);
    }

    void lose (int c) {
        cout << "You lose! Computer rolled " << c << "\n";
    }

    void win (int c) {
        cout << "You win!! Computer rolled " << c << "\n";
    }

    void draw () {
        cout << "Try again! \n";
    }

(Admittedly, that version could use a few enhancements, but in the interest of keeping it simple, we will leave it as is.)

To compile it on your system, you could use **g++** directly:

    $ g++ dice.cpp -o dice

The compile happens, and you can run it with

    $ ./dice

We can see what kind of binary we have produced:

    $ file ./dice
    dice: ELF 64-bit LSB executable, x86-64, version 1 (SYSV), dynamically
    linked (uses shared libs), for GNU/Linux 3.10.17, not stripped

And just as importantly, what libraries it has had to link to:

    $ ldd dice

    linux-vdso.so.1 => (0x00007ffe0d1dc000)
    libstdc++.so.6 => /usr/lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007fce8410e000)
    libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6
    libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6
    libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1
    /lib64/ld-linux-x86-64.so.2

Two obvious things we get from this information: the binary that you just ran is 64bit, and it is linked to 64bit libraries.

That is what we would expect, so that's good.

That means that in order to cross compile, you need to tell **g++** to...

  1. Produce a 32bit binary, which
  2. links to 32bit libraries instead of your default 64bit libs

## 25.1 Setting Up Your Dev Environment

To compile to 32bit, then, you need to have 32bit libraries and headers installed on your system. If you run a "pure" 64bit system, then you will have no 32bit libs or headers and will need to go get some. On Slackware, you can do this by installing a multilib hack provided by AlienBOB. On other systems, there may be meta packages available, or you can just hand-pick the ones you need.

Ultimately, no matter what system you are using, you will probably end up hand-picking 32bit libs depending on your project. Whatever your code links to, you must have as 32bit libs on your computer.

Once that's taken care of, the compile itself is fairly simple in theory:

    $ g++ -m32 dice.cpp -o dice32 -L /usr/lib -march=i586

Notice that you set a **-m32** flag to compile in 32bit mode. I use **-march=i586** to further define what kind of optimisations should be made. The other flag that you set is the path to your libraries. This is usually **/usr/lib** although depending on how your system is set up, it could be **/usr/lib32** or, heck, even **/opt/usr/lib** if you felt like it, so do a proper **ls -l /** to see how things are laid out.

After the compile finishes, see proof of your build:

    $ file ./dice32
    dice32: ELF 32-bit LSB executable, Intel 80386, version 1 (SYSV),
    dynamically linked (uses shared libs), for GNU/Linux 3.10.17, not
    stripped

And of course **ldd ./dice32** points to your 32bit libs.

And since you were able to compile it, it should also, then, run on your system.

## 25.2 Caveats

If your application is more complex than the little dice app we just made and actually produces more than just a single binary, then you do need to tell **gcc** what to call those files and where to save them.

Also, if the application you are compiling uses a Makefile, such that you are not invoking the compiler directly, then you will need to add the flags to the Makefile, or else to the configure options.

All that is the same as my example, only with more legwork.
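Those flags can usually be passed through the standard build variables rather than by editing every rule; here is a minimal sketch, assuming the Makefile (or the generated configure script) honours **CXXFLAGS** and **LDFLAGS** , as most do:

```shell
# hand the 32bit flags to an existing Makefile
make CXXFLAGS="-m32 -march=i586" LDFLAGS="-m32"

# or, for a configure-based project, set the flags before configuring
CXXFLAGS="-m32 -march=i586" LDFLAGS="-m32" ./configure --host=i586-pc-linux-gnu
make
```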

That's all there is to it. Give it a go!

# 26 GNU Linker

Most programs rely on at least one other chunk of software to run; even the simplest GNU or BSD util blissfully assumes that you have C libraries installed.

How do applications know where to find and use the libraries they need in order to run? Well, they are linked to those libraries at compile time. As you may know, there are several ways to link software, and there are the usual trade-offs, depending on which method you use.

Briefly, dynamic linking of software and libraries, when it works well:

  * Works really well, especially with open source software, because an application can just be recompiled as needed (if the underlying libs change) and everything works as expected.

  * Keeps download sizes small, because you don't ship third-party libraries with your code.

  * Is pretty flexible; programmers can blindly release the source code and let the distribution packagers wrap it up for their userbase.

  * Provides better security, since everything remains modular so that security patches can be applied easily.

But it's not all fluffy bunnies and marshmallows (or robots and laser guns; whatever you're into). There can be drawbacks to dynamic linking:

  * If users don't have the library you think they have, then your link fails.

  * If users don't have the right _version_ of a library, then the link fails. Demanding they update their library could result in apps that link to the old version failing. (To be fair, a symlink spoofing a library version often works, but do you really want to count on "spoofing" as your solution?)

  * What if the library you need in turn needs a library itself? And what if that library needs a library? All the problems above increase by orders of magnitude.

And that's just with the open source stuff. You don't even want to think about the issues a closed source application would have when dealing with the fast-paced dev cycles of Linux.

> For the record, I have no idea why any closed source application would ever dynamically link to Linux targets (and in fact, many of the good ones do _not_ ). Static linking is the only sensible solution for closed source delivery; and yes, you can tell your project manager I said so.

Possibly the most frustrating thing in the world is seeing a proprietary game company bundle something up for Linux, _clearly_ not understanding how anything but you-know-what OS works; they ship dynamically linked games, which break within a year, and then they post somewhere on their site that maintaining games for Linux is just too much work so they are dropping support. I just want to call them up and shout **Hire me**.

## 26.1 Do it Right

Besides just throwing it all away and using static linking, there is a good solution to all this: it's the **-rpath** option from GNU ld, which we can use directly from within GCC.

Like anything else on a POSIX system, an application has a certain path that it will follow when attempting to load a library. That search path is built from the paths baked into the binary itself, the system linker's cache, and the environment variable **LD_LIBRARY_PATH** , which lets you prepend directories of your own.

That variable is a little hard to demonstrate, because first you need a binary that has lost track of the path to its libraries, but I'll help you produce such a condition later.

First, let's look at how to use **-rpath**.

You use **-rpath** to tell the GNU linker (the **ld** command) at compile time where certain libraries, required by your code, are located.

The basic syntax is pretty simple:

    $ gcc main.c -I include -L lib -Wl,-rpath,lib -lexamplelib -o dice

Or you can use an environment variable setting (in BASH) on GCC itself:

    $ echo $SHELL
    bash
    $ LD_RUN_PATH=lib gcc main.c -I include -L lib -lexamplelib -o dice

Here are some example scenarios. If you want to follow along, you will likely need two computers (or a computer and a virtual one) so that you can see things succeed and fail.

### 26.1.1 FLTK Standard Build

FLTK is pretty common, so let's build a simple demo with a standard compile.

 **Expected Results:**

  * Success on the machine that is building it.

  * Success on target machines which have FLTK installed.

 **Build It:**

If you don't have FLTK, then install it first. You should install it on both machines, because we want to see how building for a standard base works.

Drum up a quick demo app:

    #include <FL/Fl.H>
    #include <FL/Fl_Window.H>
    #include <FL/Fl_Box.H>

    int main(int argc, char **argv) {
        Fl_Window *window = new Fl_Window(300,180);
        Fl_Box *box = new Fl_Box(20,40,260,100,"Hello, World!");

        box->box(FL_UP_BOX);
        box->labelsize(36);
        box->labelfont(FL_BOLD+FL_ITALIC);
        box->labeltype(FL_SHADOW_LABEL);

        window->end();
        window->show(argc, argv);
        return Fl::run();
    }

Simple little "hello world" GUI. Compile it as usual first. You may need to adjust the paths to your includes and libs, depending on your system architecture.

    $ g++ -I /usr/include/FL -L/usr/lib64 -lfltk \
    -lXext -lX11 -lm helloFLTK.cpp -o helloFLTK.bin

You can view what libraries this got linked to with:

    $ ldd helloFLTK.bin

If you move the resulting application, **helloFLTK.bin** , to another Linux machine of the same architecture which has FLTK installed, then launching the application should work.

For added fun, try installing an older or newer version of FLTK on your target machine. See if you can fix the issue with some clever symlinking.

As you can see, though, this method is basically sound. You know your target and what you can reasonably expect them to have installed (or to be easily installable), and you plan accordingly.

## 26.2 SFGUI with -rpath

You probably do not have SFGUI installed, so download it from sfgui.sfml-dev.de/download but don't install the libraries. Just keep them in your code folder.

Speaking of code, here's a copy-paste of a Hello World from SFGUI's github:

    #include <SFGUI/SFGUI.hpp>
    #include <SFGUI/Widgets.hpp>

    #include <SFML/Graphics.hpp>

    const int SCREEN_WIDTH = 800;
    const int SCREEN_HEIGHT = 600;

    class HelloWorld {
        public:
            void OnButtonClick();
            void Run();
        private:
            sfg::SFGUI m_sfgui;
            sfg::Label::Ptr m_label;
    };

    void HelloWorld::OnButtonClick() {
        m_label->SetText( "Hello SFGUI" );
    }

    void HelloWorld::Run() {
        sf::RenderWindow render_window(
        sf::VideoMode( SCREEN_WIDTH,
        SCREEN_HEIGHT ), "Hello world" );
        m_label = sfg::Label::Create( "Hello world" );
        auto button = sfg::Button::Create( "Greet SFGUI!" );
        button->GetSignal(
        sfg::Widget::OnLeftClick ).Connect(
        std::bind( &HelloWorld::OnButtonClick, this ) );
        auto box = sfg::Box::Create(
        sfg::Box::Orientation::VERTICAL, 5.0f );
        box->Pack( m_label );
        box->Pack( button, false );
        auto window = sfg::Window::Create();
        window->SetTitle( "Hello world!" );
        window->Add( box );
        sfg::Desktop desktop;
        desktop.Add( window );
        render_window.resetGLStates();

        sf::Event event;
        sf::Clock clock;
        while( render_window.isOpen() ) {
            while( render_window.pollEvent( event ) ) {
                desktop.HandleEvent( event );
                if( event.type == sf::Event::Closed ) {
                    render_window.close();
                }
            }

            desktop.Update( clock.restart().asSeconds() );
            render_window.clear();
            m_sfgui.Display( render_window );
            render_window.display();
        }
    }

    int main() {
        HelloWorld hello_world;
        hello_world.Run();

        return 0;
    }

Now compile it, explicitly telling the linker where to find the SFGUI libraries.

> I should point out, unrelated to the linking process, that this does require C++11, so if you are on an old compiler you may have issues.

Note that we provide direct call-outs to the libraries to include ( **-lsfgui** and so on).

Also note that there are **no spaces** anywhere within the **-Wl,-rpath,'blah'** flag.

    $ g++ -std=gnu++0x -I `pwd`/SFML-2.1/include/ \
    -I `pwd`/SFGUI-0.2.3/include/ -L `pwd`/libs \
    -lsfml-graphics -lsfml-system \
    -lsfml-window -lsfgui \
    -Wl,-rpath,'${ORIGIN}/libs' \
    helloSFGUI.cpp -o helloSFGUI.bin

As you can see, we use the special **${ORIGIN}** marker to tell our application where our libraries are in relation to where _it_ is. That's important.

To test this application out, send the directory containing the binary **helloSFGUI.bin** and the SFGUI libraries that it needs to your target computer. As long as these things remain bundled together, you should be able to launch your demo app _without SFGUI being installed_. Pretty nice!

### 26.2.1 LD_LIBRARY_PATH Example

For additional fun, try moving the SFGUI libs to some other location. Now your binary won't find the libs it expected to find, but you can point it at those libraries with **LD_LIBRARY_PATH** :

    $ LD_LIBRARY_PATH=~/path/to/SFGUI ./helloSFGUI.bin

Even though you have moved the libraries away from the expected path, you re-define that path at runtime, and everything still works. (You can do that even without **-rpath** having been used; LD_LIBRARY_PATH adds a search path to any binary. Some industries use it quite regularly to control which version of a library an application loads.)

# 27 Override Runtime Libraries with Env

A client asked me if it was possible to have different versions of some key libraries installed on Linux, because they were running a closed source application that expected one version of a library but they were not prepared to upgrade their entire stack to that version of the library.

In fact, this is something that can be done fairly easily. I qualify it as "fairly" easy because of course there are ways to make it complex, such as when the version of the library you need was compiled on top of a completely different stack than what you are running (different C lib, different foo, different bar, and so on), but in my experience it is pretty straightforward.

## 27.1 The Problem

Let's say your workflow absolutely depends on **OpenFoo** , which depends on **libfoo.so.2**. But then you bring in another app called **Hijinx** , which depends on **libfoo.so.4**.

And let's stretch a little bit and say that you try to compile OpenFoo with libfoo.so.4 but it fails. You file a bug, but the developer is on a job at the moment and the fix will just have to wait. Hijinx isn't open source, so there's no working around it: it must have libfoo.so.4 or it's not going to launch.

Currently, **libfoo.so.2** (for OpenFoo) is in **/usr/lib64**. Hijinx puts **libfoo.so.4** into the same location, and symlinks **libfoo.so** to it. In other words:

    $ ls -l /usr/lib64/libfoo* | rev | cut -f1-3 -d" " | rev
    libfoo.so -> libfoo.so.4
    libfoo.so.2
    libfoo.so.4

You try to launch Hijinx and it works, but OpenFoo fails.

So you re-symlink **libfoo.so** to **libfoo.so.2** ; as a result, OpenFoo launches but Hijinx fails.

See the problem?

## 27.2 The Solution

There are a few ways to fix this, not the least of which is to actually change the ELF binary's link information so that it knows to look elsewhere. Assuming we don't want to get into anything quite so technical, the solution is just plain easy.

  1. What I do is move either the new or the old library (usually the new, because presumably the old one is in use by all your existing applications) to some other location. Choose some location that makes sense to you; **/opt** is a good contender, or maybe **/usr/local/lib**. Whatever you want to use is fine, just obviously be consistent about it.

For this example, let's assume you put it in **/opt** because it's short.

  2. Then I modify the launcher for my new application; the one that is demanding the new library. The mod is simple; the easiest way to do it is to edit the **.desktop** file (located in **/usr/share/applications** ). Specifically, you're looking for the **Exec=** line:

       $ fgrep Exec /usr/share/applications/hijinx.desktop
       Exec=hijinx %F

By default, hijinx (the binary, as delivered from its distributor or vendor) will look in /usr/lib64 (or wherever it was programmed to look; in this example, I'm calling it /usr/lib64), but we can redirect it at runtime. Because **Exec=** lines are not run through a shell, we use **env** to set the variable:

       $ sed -i 's%Exec=hijinx%Exec=env LD_LIBRARY_PATH=/opt hijinx%' \
       /usr/share/applications/hijinx.desktop

  3. Now launch Hijinx from any launcher (menu, icon, whatever) and it will look in **/opt** for **libfoo.so.4** , launch, and work as expected. Incidentally, OpenFoo also launches and works as expected. In fact, they will both work as expected, simultaneously or individually. There's no conflict, no interference, no danger of whatever single-minded package manager you use coming along and nuking Hijinx's special library version.

That, in short, is the simplest solution to potentially conflicted library versioning. It's maybe not the _most_ elegant (if we assume that "most elegant" would be a perfectly harmonious system) but there is a definite beauty to the flexibility and robustness of POSIX.
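If you want to rehearse the **.desktop** edit before touching the real file, run it against a scratch copy. The file contents here are hypothetical (matching the hijinx example above), and **env** is used because Exec lines are not interpreted by a shell, so a bare variable assignment would not take effect:

```shell
# make a scratch launcher file and rewrite its Exec line
tmp=$(mktemp -d)
cat > "$tmp/hijinx.desktop" <<'EOF'
[Desktop Entry]
Name=Hijinx
Exec=hijinx %F
Type=Application
EOF

# rewrite the Exec line to set the library path at launch time
sed -i 's%Exec=hijinx%Exec=env LD_LIBRARY_PATH=/opt hijinx%' "$tmp/hijinx.desktop"

grep '^Exec=' "$tmp/hijinx.desktop"
```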

## 27.3 [Non] Caveats

This can, in theory, get more complex. For my real-life clients, this is not an issue because they are either using Slackermedia, which avoids the issue by avoiding automated package management systems, or they are not using automated package management from their vendor, opting to roll their own packages.

But I could see this getting messy if you are relying half on auto-updates from your distro, and half on special applications that don't necessarily play nice with your distro. Library sets like MLT or FFMPEG or even pandoc can get really really complex; but it's not insurmountable. It just means that you have to build little mini-stacks of the versions you need; and usually I find that the dependencies don't often drill down more than two levels. Sure, foo might need a specific version of bar, but bar doesn't care about the version of baz, so you're really only re-building bar and foo on your existing install of baz. Trust me, it's easier than it sounds.

I call this footnote **Caveats** but really they're non-caveats, because the point remains: isolated libraries and binaries and paths and environments are really easy to manage on POSIX systems. If you need them, it doesn't take much to implement them. And honestly, you usually don't really need them; it's very often down to bad packaging in a closed source application, or else extenuating circumstances.

Enjoy!

# 28 Just a Mechanic

There's a phenomenon known, I would imagine, only to computer repairmen, car mechanics, and doctors: the persistent belief that knowing how to _fix_ a problem has absolutely no relationship to knowing how to prevent the problem in the first place.

I've fixed computers as my primary source of income, I've fixed computers as side gigs, as personal favours, in production emergencies on jobs even though it wasn't my job to do the fixing, and as a personal hobby for my own enjoyment. I see it all the time: someone brings in a computer, _begs_ me to fix a problem because they can't afford a new computer, or they can't afford to have lost such-and-such a file, or they have no idea how that virus got on their box, and on and on. And I'm just skimming the surface here. Having a borked computer brings out the very worst in people; grown men weep, respectable girls flirt, bro's tell you you're cool, investment bankers invite you to lavish parties. Everyone suddenly turns into a junkie, and the only thing that can satisfy is the warm glow of their computer screen.

So I fix it, and then I tell them some things they should do to prevent such a thing from happening again, like maybe establishing a regular backup routine (and yes, I show them how), or maybe not becoming so reliant upon such-and-such a software or saving into proprietary formats that are more difficult to rescue, and so on.

And every time I do it, I turn to them to see if they are at least pretending to listen, only to find that they left my office five minutes ago.

## 28.1 Do as we Do, And as we Say

So the idea that someone can be a "computer expert" and yet also, at the same time, know _nothing_ about computers, is actually quite familiar to us techie people. We get it all the time from friends, family, random people at the cafe who see us using a laptop with ease. It's not just "common", it's the _norm_.

Here is a small sample of the problems and the solutions, and as a bonus, what most people opt to do regardless.

  * Corrupted files

 **Preventative Medicine:**

This does happen sometimes; it happens to the best of us. But using a good journalled file system helps, and so does using open file formats, which, even if they do become corrupt, we can sometimes open, read, and rescue useful data from.

In other words, don't use FAT drives, and instead of, for example, .docx files, save as .rtf, or better yet use OpenOffice and its .odt file format.

Export to generic file formats when possible. For instance, a corrupted .fcp file is useless while a corrupted .xml file could possibly contain extractable data.

 **What they do instead:**

Them: "I have binary blobs that will not open in the proprietary software I created them in. Fix it for me. Again."

Me: "I can't because they're blobs, because you are not using open formats as I told you to use. Did you make backups?"

Them: "...Backup? First I've heard of it."

  * Files unable to open in new version of software; no access to old software

 **Preventative Medicine:**

Hey, upgrades of software and hardware are not as trivial as the corporate marketing engines would have you believe.

If you have files that require certain versions of software which may themselves only run on certain versions of hardware, then you need to either update those files to work with your new upgrades or keep a legacy system around to deal with those files.

Or I guess you can petition the software vendor to be responsible and continue to support their old formats...and if that works out for you, then you should fly to Vegas because you will win.

 **What they do instead:**

Continues to blindly update because TV and internet ads tell them it will give them shiny new things if they do.

  * Found an open source solution that they don't understand. Gets angry that it's too hard to use and not as good as their old non-open software.

 **Preventative Medicine:**

Read the documentation, do some tutorials. We didn't learn Word and Photoshop in a day, did we? Then we mustn't expect to just be able to stumble into expertise on other big applications.

 **What they do instead:**

Grumbles about how the software is poorly designed because it isn't an exact clone of what they are used to. Complains it's too hard to use or find support for. Never reads manual, refuses to do online tutorials.

  * Doesn't believe open source software is actually used by anyone for real work.

 **Preventative Medicine:**

Walk into any professional effects house in Hollywood, Wellington, London, or similar, and get a job. You will be using Linux on the desktop, plus open source applications for half your content creation.

Walk into software development companies, software vendors (even the proprietary ones), even a number of small publishing houses, and basically any internet content providers. The list goes on and on. And yes, real people are the people using this stuff.

 **What they do instead:**

Them: "Yeah but...my school/workplace/friends say it's not that common. So anyway, can you fix my computer?"

Me: "Yes, yes I can. Using open source software on my open source operating system."

  * Expired or Lost software license, or not enough licensing for number of users.

 **Preventative Medicine:**

If an application requires a licensing code to use it, do not use it.

A "license" is industryspeak for "we will hold your data for ransom as soon as your license expires, and we will do so without remorse".

Typically, you have licensing problems 5 minutes before a deadline, or at any such time when it really really matters that you have access to the software that you purchased. If you are a budget-conscious person or organisation, then you will require more licenses when your bank balance is at its lowest.

Typically the psychosis runs like this:

The user buys or steals a license, and continues using it until an upgrade breaks something, or they are forced to update to the new version.

Now they've got no license and cannot seem to find a stolen one to use.

Panic.

Begrudgingly use a free solution, hating every minute of it because they irrationally believe they actually want the "real thing" but the "real thing" is out of reach. So they are "settling" for a stand-in.

The moment they can jump back into captivity with another stolen or discounted license, they do. Because why bother getting over the learning curve of a new application, getting really good at it, and living independent of licensing, when one can just slip right back into bondage?

  * Files will not open due to deprecated software; user has no discs and no way to purchase new copies.

 **Preventative Medicine:**

Stop using software that disappears over time.

The term "planned obsolescence" was not invented for fun. I used to keep copies of proprietary software for years, just in case I needed to re-install. The problem was that licensing servers would disappear, companies would go under, or an OS upgrade would break my old copies of the apps.

You can try this method if you are dedicated to using proprietary software above all else, but otherwise use software that gives you a copy of its code along with the application itself. This future-proofs you, especially if you are doing this from the bottom of the stack (i.e., you can install a historic version of the OS plus a historic version of the application).

In practice, this is not necessary, since open source software famously supports formats and files that are, literally, 40 years old. But that's not to discourage you from keeping well-structured legacy archives.

 **What they do instead:**

Continue using corporate, proprietary software. Scramble to beg, borrow, or steal what you need when it all breaks down on you.

  * Software vendor has changed features, upsetting user's workflow

 **Preventative Medicine:**

Use open source software, where developers are typically available for feedback and feature requests. In the rare event that they are not responsive, since you own a copy of the source code, you can vote with your money and hire a developer to add the feature you want.

With the big corporations, you can also vote with your money by not purchasing the new software that you dislike, but then you are left with nothing, and your problem has not been solved.

 **What they do instead:**

Grumbles and complains publicly about how evil the software vendor is, and refuses to change. Goes onto the company forums and threatens to stop using their software, reminding them that they have lost a loyal customer. Looks for alternatives to buy into, but finds that switching means learning something new. Eventually stops complaining and continues to pay for and use the software.

  * Software X is not compatible with Hardware Y or Codec Z (or similar)

 **Preventative Medicine:**

Use open source solutions with a vested interest in cross-compatibility rather than vendor lock-in.

 **What they do instead:**

Submits to vendor lock-in, needlessly suffering through arbitrary incompatibilities. Grumbles about it daily for the next 23 years.

## 28.2 That's MISTER Computer-Repair-Guy to You

Bottom line is that while people seem to be able to accept a tech person's knowledge of disaster recovery, they seem to fight tooth-and-nail to stay in their comfortable, although horribly dangerous, little dugouts.

So here's a little reversal that all tech people, at some point, find themselves doing:

  * "I am going to keep using bad technology and being irresponsible with my own data. Each time disaster strikes, I want you to rescue me.

"I will not pay you for your time if I have less than 6 degrees of separation from you, and even in that case I will try to talk your price down, or try to influence you into feeling guilty about charging for your time and knowledge.

"I will also insist that my very life depends on your help, and, if necessary, I will openly weep until you assist me."

 **How you should respond:**

I can bail you out this time, and the next time, and the time after that. But at some point, to save my own sanity, I am going to back away and stop trying to help you.

I will feel bad about it for weeks because I am compelled to help people, but at some point you have to choose between listening to my advice, and trying to take advantage of my goodwill to get you out of the problems you create for yourself by not listening to my advice.

Use open source technology, learn how to use the tools you depend on, and use them responsibly.

# 29 Brand

A brand is not a product.

Do you want to play Minecraft or do you want to play a pixellated sandbox game with a fun community of players?

Do you want to use Microsoft Word or do you want to use a word processor?

Do you want to use Linux or do you want to use a modular open source OS?

A brand is not a product.

# 30 LAMP

If you're learning web design or development, you have probably already heard about the "LAMP stack". It's at the heart of, well, much of the internet itself, as it drives most of the web servers in existence. And if you're doing web design or development on Linux, you'll be pleased to know that you probably already have a full LAMP stack on your computer, or at the very least can get a full LAMP stack with just a few quick installs.

On other operating systems, installing this stack is convoluted at best, and you never really achieve a full LAMP stack anyway (you are by definition missing the "L"). Whether you have to splice on an environment that simply does not exist on your OS, or whether you are overriding the pre-existing *AMP that shipped with your OS, there are usually lots of hoops to jump through. As is often the case, third parties rise to the occasion to deliver and/or sell easier solutions. Do not be distracted by these if you are running Linux already! Getting up and running with LAMP is easier than you think.

## 30.1 Installing LAMP Components

As I said, the "L" in LAMP you get for free because you are running Linux. Installing the "AMP" part of LAMP is pretty easy. Observe:

> LAMP, as you may know, is Linux, Apache, MySQL or MariaDB, and PHP. There are lots of valid alternatives here (BSD or Solaris-based Unix instead of Linux, Nginx instead of Apache, Postgres instead of MySQL, and so on) but that's advanced stuff that you don't need to worry about yet. So, we're going to just install a normal, everyday AMP stack on top of your existing Linux computer. In the future, if you need something else, know that the LAMP stack is pretty flexible and can be turned into much less pronounceable acronyms.

  1. On Ubuntu, Mint, Debian, or similar:

        sudo apt-get install apache2 mysql-server php5

On Fedora, CentOS, Red Hat, Scientific, or similar:

        su -c 'yum install httpd mysql mysql-server php php-mysql'

There are also sometimes shortcuts to installing the normal, everyday LAMP install. On Fedora and Red Hat you can do

        su -c "yum groupinstall 'Web Server'"

and on Ubuntu and Debian you can use:

        sudo apt-get install tasksel ; sudo tasksel install lamp-server

On Slackware, of course, it's already installed.

Depending on your distribution, the names of the packages may be slightly different, and the command might be different. Some distributions, especially ones geared toward servers, will not require these installs at all. Be flexible, be prepared to read the official docs if you need to, but you get the idea.

> Much of the world is transitioning from MySQL to MariaDB. They are basically two names for the same thing: either way, you end up with a MySQL database. The name of the package itself is just changing for legal reasons. So if your distribution offers MariaDB, then use it. If not, use MySQL. There is basically no difference from a web design point-of-view.

  2. Next, you need to start (or launch, if you prefer) all the stuff you just installed. In the command below, use either apache2 or httpd, depending on what your distribution calls it. If you are unsure, just use both in curly braces.

The command used to start services varies depending on how your distribution is set up. You can find out by reading the docs or by looking online for help; the usual stuff.

If your distribution uses **systemd** (many do, now), then the command is:

        $ sudo systemctl start {apache2,httpd}
        $ sudo systemctl start {mysql,mysqld,mariadb}

If your distribution does not use **systemd** , then it uses some other application to initialise daemons. For instance, on Slackware:

        $ su -c '/etc/rc.d/rc.httpd start'
        $ su -c '/etc/rc.d/rc.mysqld start'

However you launch it, your init application (and/or process manager) starts the Apache server and the MySQL server. When we say that we are starting a "server", we are of course _not_ speaking of an actual metal server that sits in a data centre somewhere. We are speaking of a software server, which simply means _a software application that runs, usually in the background, and waits for another computer to make a request_.

In the case of a web server, the Apache application runs in the background and waits for another computer to contact it and request access to a web page. In the case of MySQL, it runs in the background and waits for some computer to request access to some database entry.

You can test out the Apache service by opening your web browser and navigating to **localhost** (literally type in the word "localhost" or the number "127.0.0.1" in your browser's URL bar), which should render a web page, running on _your_ machine, telling you that Apache is working.
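If you prefer the terminal, a quick way to confirm that something is answering on the web port is bash's built-in **/dev/tcp** pseudo-device; a minimal sketch, assuming nothing beyond bash itself:

```shell
# Probe port 80 on this machine; /dev/tcp needs no extra tools installed.
if (exec 3<>/dev/tcp/localhost/80) 2>/dev/null; then
    echo "something is listening on port 80"
else
    echo "nothing is listening on port 80 yet"
fi
```

This only tells you that *some* service owns port 80; the browser test above confirms it is actually Apache serving pages.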

Generally speaking, default LAMP installs are configured to "serve" files from **/var/www**. If you look in that path, you should see a file such as index.html which contains the "It works" message. Some distributions handle all of this a bit differently, so if it's not totally obvious right away, just read up on where your distribution puts its web files by default.

And that's it. You have just installed the AMP portion of your LAMP stack. No need for third party "easy" installs like XAMPP and MAMP and WAMP, just a plain old, industry-grade LAMP install in two simple commands. Have fun!

# 31 Affordable Computing

I was looking for a reasonably priced laptop (six of them, actually) for an organisation. I found a model that I thought seemed appropriate for what they needed, and once I inquired about them at the store I found that that model was out of stock. So I kept looking, and found a laptop by another brand with similar specs for, actually, less money. The org made the purchase, I installed Linux, and the deal was done.

It reminded me of the time before I switched to Linux, when I worked for a "non-profit" org that was too blind to see that a brand of computers that ensures it has no competitors is not a company you want to depend on.

The sum total for the 6 laptops was literally less than the cost of 3 of those exclusive models, and all the software was free.

Just wanted to make note of that experience.

# 32 Apologies

There's a trend happening in programming and IT circles: people are apologising for using open source and independent software, presumably because it means that they are using something that "normal" people cannot achieve.

Normal people cannot achieve open source. That's what's being assumed.

Everyone knows that anyone can acquire open source software; the problem is that not just anyone can understand its vast complexity.

I don't believe for one minute that this is the natural popular opinion. I blame two groups of people for this:

  * Computer corporations, who fail to see that they could have everything to gain from individuals picking up computers and actually understanding them. So their marketing strategy is to tell users that no knowledge is required to use their software; everything just happens magically for you with one click of a button.

  * Tech journalists, who are mostly just working indirectly or directly for the computer companies. They are a thinly disguised marketing toolkit for the computer corporations, and market themselves to the same denigrated userbase, assuring users that no tool is yet easy enough for them to use, and that things must drastically improve before anyone should ever go near a computer (but go buy one anyway).

## 32.1 Knowledge as a Threat

I myself ran into this when I told someone that I did a podcast and released the episodes in Ogg Vorbis and Speex formats, neither of which was accepted by iPods, but both of which could be downloaded and used on any computer for free. She called me a name which I will not transcribe here, but which indicated that I was elitist. She seemed to honestly believe that I was _trying_ to keep my knowledge out of reach from the common man.

Similarly, I have seen lately on more than just a few forum posts or tutorial sites, people themselves assuming that an apology for using open source is necessary. It's not a literal apology, but it's built-in to their statements. Phrasing like "I use OpenFoobar. Not that I have a problem with ClosedFoobar; in fact, I like it. But for me, right now, OpenFoobar is working best...".

In both cases, there's a sense that using something that you did not have to pay for, and that is not "popular" enough, is a problem for everyone else around you. Like the vegetarian who holds up the line for an extra minute by asking for a hamburger without the burger. Is it really such a major inconvenience for everyone that profuse apologies must be issued to each and every person?

News flash: it's not a liability that I use file formats that are freely available. It may be different than what you use, and your thing might be more popular, but there's an important difference:

  * Closed Source: Not available to everyone

  * Open Source: Available to everyone

I get that the more ubiquitous something is, the "easier" to get it seems to be. For instance, I realise that generating an MP3 seems really easy because, of the 4 sound applications that exist in your world, all of them export to MP3. But that does not alter the fact that the last time someone in _my_ world tried shipping an MP3 encoder and decoder in one of their sound applications, they got sued.

So yes, someone is being elitist, and someone is keeping knowledge to themselves, but it's not open source that's doing that. It's the litigious and vindictive computer and software corporations, whose indemnity you may have purchased with your computer and all the software bundled with it, but you have not purchased anything more than insurance against being sued for using the products. The stuff I am using, I actually own, lock, stock, and barrel, and I'm happy to share it with you.

It just requires an additional install.

## 32.2 Not Suitable for Normal Users

The sad thing about the closed source software racket is that it demands a ceiling.

In closed source software, there is a distinction created between those who are able to understand "complex" concepts, and users who cannot. "Normal users" are not capable of understanding certain things, and so they are sold simple, limited versions of the available tools. There is often a sense that "normal users" should be protected from advanced features because "they don't need that" or "it would only confuse them". These are actual phrases uttered by actual salespeople at tech "press events" (marketing events staged in order to provide tech journalists something to write about, since apparently technology is such an otherwise limited and shallow topic).

Fact is, "normal users" may or may not want certain features, and it makes little sense to _remove_ the features from the "normal" edition. Are you that afraid of injuring your normal users? Then make a Normal and an Advanced mode, so the users have a choice. Software doesn't ship by weight; it costs a corporation nothing extra to leave code in, and in fact it costs manhours to take code out.

The actual goal, though, is to force a distinction between "normal users" who you market products to, and "advanced" users who you market pricier products to (because if they pay more, they get to wear the "pro" badge). After this, you still have another group of people who **really** understand stuff, so you market some of the tools that will enable them to build upon what you sell. These are the "developers", and they form that all-important cottage industry groveling at your feet as the One True mega corporation which ultimately decides their value as a tax payer.

This is a class system, exploited from the natural fact that we all do, indeed, have our natural aptitudes and not everyone's interest lies in learning the nuts-and-bolts about computers. But is that an excuse to cut off people's access to knowledge? Is it any excuse to promote the idea that certain people just won't ever understand certain concepts?

"Open source is fine, but it's written and designed by geeks, and it usually _shows_ ," people say loudly and obligatorily. I've seen this in action in several places; I've seen developers themselves say it, when they are working on closed source software. They utterly ignore that even there, in the high-paid fast-paced world of Silicon Valley, the myth of "UX Design" is a flimsy, ad hoc art project done by a team of people who learned layout first and function last. They say it for the same reason that the masterminds behind [the Creative Commons licensed] **Cards Against Humanity** sold $180,000 worth of feces. I've also heard it at companies that use open source daily; people open an application they don't know all that well, because they only ever bother using it at work, and they falter while someone is watching, so they bring out the old "open source is hard to use" horse and beat it to death.

## 32.3 A New Normal

Since the very origins of UNIX, there has always been this principle of design: normal users should be empowered to customise their workflow. The UNIX pipe system was developed and wielded for this purpose. In fact, one of the early UNIX videos from AT&T expressly highlights this feature.

It's fine that not all users want to become computer experts. But the deprecating myth perpetuated by corporations and journalists, that "normal users" of computers should never be expected to understand or learn complex principles in computing, basically gives a free pass to a broken education system, and to a spineless technology infrastructure unable to leverage the power of its own population, choosing instead to pay to maintain an exclusive class of "advanced" users and developers.

The amazing thing to me is that it's not just the USA that does this. It's every country in the world. No country has yet taken a step back from the rush of technological fervor and realised that basing the information systems of entire nations on software being provided by the USA is a liability.

## 32.4 Certification

It's sometimes subtle, and I am not entirely sure where it starts, but there's a reward system in place that results in a few very loud people getting recognition for achieving "success" in computing. Of course, "success" usually is measured in dollars, and "computing" refers to sales rather than actual computation.

This happens on a very blatant level with CEOs. If you're a CEO of a tech company and you make lots of money, then you are portrayed as someone so very geeky and so very advanced that you understand both computers and their users. It never occurs to anyone to question whether you might not understand one, the other, or both; that you might just be a ruthless business person who knows, essentially, nothing about technology or its impact on the world.

On a smaller scale, we are generally told that buying the right tools will qualify us as Real Geeks™. I think this grows from the false distinction that has been created between "normal" and "advanced" users; the normal users have _this_ brand and _this_ set of tools. But the advanced? We have a special set, designed for and marketed and sold especially to us.

You go out and get the right computer, you join the right sites, you use the right version control site, you get the latest and trendiest text editor, you use the latest and coolest language, side with all the right "hacktivists" or align yourself with the right "social enterprise" company, and you're in. That's all you need; just the right gear, and a public self-proclamation, and you're the real thing.

The inverse is problematic; after all, there are developers out there who fail to conform and yet are writing some of the most amazing code available.

Turns out you can be a programmer or developer without buying into the Geek™ scene, but you are one of those disconnected, lofty geeks. You are not, somehow, a geek of the people (which, with the most twisted logic, requires an admission fee to join). You are an ivory tower geek who talks a language that "real geeks" do not understand, and should not be expected to learn. Languages like **BASH** , **C** or **C++** , **Perl** , **sed** , **awk** , and all that stuff that keeps itself removed from normal people by doing things like posting free tutorials online on how to use them, and being distributed free of charge.

## 32.5 Choose Independence

Don't be fooled, kind reader. There are no required buy-ins to be a geek. There is no ceiling on being a computer user. In fact, traditionally, that's been one of the central points of being a hacker or a geek or a nerd. Anyone can learn this stuff, anyone can join in. It doesn't matter how rich you are, what kinds of clothes you wear, what music you listen to, whether you're male or female, brown, olive, or pink, tall or short, or whatever. If you want to do this, do it.

Look, it can be deceptive. A lot of people look at me and they think I'm really smart (others look at me and laugh; what can you do?). But if I'm honest, I'm not a geek by birth. I wasn't a whiz kid, no teacher ever phoned my parents and told them to move me into the advanced classes. I am not good at math, and I am not an amazing programmer. I dropped out of high school, I dropped out of college. But darn it, I "got" open source, I "got" Unix, and today I help build (and that doesn't always mean actually _programming_ , so if that's not what you're interested in, don't be fooled by that misconception either) tools that my friends in the Linux community use on a daily basis.

What I'm saying here is the proverbial "if I can do it, anyone can do it". It's over-used, and too broad in scope, but it's mostly true. You do not need to be a genius to learn new stuff, and that's all open source is. It's new stuff that you didn't get taught in school because school didn't learn you nothing no how. Big deal, get over it and learn something new.

Learn to be a geek on your own terms.

# 33 Rsync and Rsync Daemon

 **rsync**. It's a small, effective, and really easy program that comes with every *nix system.

The best back up plan is:

  1. simple
  2. quick
  3. painless

To make it simple, get ONE big cheap USB harddrive (USB so it will be cheaper) and back up all your stuff to it.

To make it quick, keep the harddrive close at hand, so that you can plug it in and let it do its thing.

To make it painless, establish a cron job that will automate the **rsync** backup. If not cron, then at least a shell script so all you have to do is type in ./backup and watch it do its thing.

How to do all this? Buy the harddrive: buy.com or tigerdirect.com or newegg.com or whatever.

Keep it close at hand.

## 33.1 How to use rsync

Rsync can be as simple as this:

    $ mkdir /mnt/backupdrive/laptop
    $ rsync -av /home/yourname/ /mnt/backupdrive/laptop/

This copies everything _inside_ your home folder to a directory called laptop on the backupdrive. It's that easy. Obviously it'll be fairly slow the first time you do this because EVERYTHING is getting copied. From then on, only the new stuff will be copied. Make it a cron job: the most straightforward way of doing this is to use the **crontab** command.

Take special note of the slashes in your command. The slash that really matters is the one on the source: with a trailing slash, rsync copies the _contents_ of the directory into the target; without one, rsync copies the directory itself into the target. The simplest habit is consistency: if you give the path of the source with a trailing slash, then give the path of the target with a trailing slash too. **Do not mix and match.**

## 33.2 Automate rsync

Automate an rsync command with cron:

    $ crontab -e

..which means something like "crontab edit". Another cool command is **crontab -l** which means "crontab list". Those are the two I use.

 **crontab -e** opens a text editor showing either an empty page or a page with some explanatory comments. Add this kind of text:

    45 2 * * 0 rsync -av /home/yourname/ /mnt/backupdrive/laptop/

...and hit **control-o** to save it and **control-x** to exit (assuming your default editor is **nano** ; in **vi** , save and quit with **:wq** instead). What you've done is set up an automatic script to run that rsync command every Sunday (day 0) at 02:45 (45 minutes, 2 hours) regardless of what week or month it happens to be (the asterisks for month and day).

To spell it out again:

  * Minute (0-59)

  * Hour (0-23)

  * Day of Month (1-31)

  * Month (1-12)

  * Day of Week (0-6 with 0 being Sunday)

  * command to run
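Reading a few more entries helps the format sink in; these are hypothetical jobs, not anything you need to install:

```shell
# minute hour day-of-month month day-of-week  command
30 4 * * *   /usr/local/bin/backup.sh    # every day at 04:30
0 */2 * * *  /usr/local/bin/sync.sh      # every 2 hours, on the hour
45 2 1 * *   /usr/local/bin/report.sh    # the 1st of each month at 02:45
```

The **\*/2** in the second entry is cron's step syntax, meaning "every second hour".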

### 33.2.1 Exceptions to the Rule

That's probably all one needs to know... except that crontab isn't always the approved way to do it. Some Unices use a middle step in accomplishing this, and fill their default crontab with commands to run cron.hourly, cron.daily, cron.weekly, and cron.monthly jobs.

What's all that? Have a look in **/etc** and you'll see directories called cron.daily and so on; inside these there may or may not be scripts with actions to be run daily or weekly or whatever. So, if you want to play nice with the distro creators (and why not? they were smart enough to make an entire distro that you use daily, so they must know something, right?), you can create a shell script, make it executable, and place it in the appropriate cron.* directory.

It doesn't have to be a complex shell script. It can just be as simple as:

    #!/bin/bash

    rsync -av /home/yourname/ /mnt/backupdrive/laptop/

And that's it. Now make sure your computer is on every Sunday night at 02:45, and that your backup drive is plugged in and mounted, and you should be good to go.

## 33.3 Rsync Server

If you have a spare box lying around, you could use it as an rsync server to which your main computer(s) back up. To do this, you'll need to run **rsync** on the server as a daemon, so that it sits running idly, listening for any remote client knocking to sign in.

To run **rsync** as a daemon, you first need to establish an **/etc/rsyncd.conf** file, which goes a little something like this:

    motd file=/etc/motd
    #log file=/var/log/rsyncd
    pid file=/var/run/rsyncd.pid

    # MODULE OPTIONS

    [syncserv]
    comment = its_a_backup_server
    path = /home/klaatu
    use chroot = yes
    max connections=1
    lock file = /var/lock/rsyncd
    read only = no
    list = yes
    uid = klaatu
    gid = nogroup
    secrets file = /etc/rsyncd.scrt
    strict modes = yes
    ignore errors = no
    ignore nonreadable = yes
    transfer logging = no
    timeout = 600
    refuse options = checksum dry-run
    dont compress = *.gz *.tgz *.zip *.z *.rpm *.deb *.iso *.bz *.tbz *.dmg

Essentially, you are establishing some server (or "global") options such as the location of the log file and pid file and motd (Message Of The Day)...not a big deal, and usually the defaults are fine.

After that, you are creating a module. In this case, mine is called **syncserv** but you can call it anything. We provide a human-readable comment, a path to a place on the drive where we can do our backups, a secrets file that defines the password that will let our user into the box (we'll look at that in a moment), and some information about the user. In the example conf files I saw, the uid was set to **nobody** , which suggests that it should literally be set to **nobody** , but what it REALLY means is that you should set it to the _username you wish to be running the process_. In this case, I have entered **klaatu** because it is Klaatu who will be rsync'ing to the server.

> Note: the user you define must exist on the box. That is, you should remember to create a user named klaatu. Sometimes I'll be working as my admin user and I'll forget to actually add the users to the server, so I'm trying to get them to sign into a box to rsync when, as far as the server knows, they don't even exist.

It will also be easier, I think, if the user has a home directory. Technically it's not necessary; you can have them back up to any folder (just set the path in the conf file to that folder), but in that case you need to make sure that the user has permission to actually write files into that folder. Personally? I just give them a home folder when I create the user with **useradd**.
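As a sketch, creating that user on the server might look like this (the name klaatu matches the uid in the example conf, and **-m** creates the home folder mentioned above):

```shell
# Create the backup user with a home directory, if it does not already exist.
id klaatu >/dev/null 2>&1 || useradd -m klaatu
```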

The other options are, I think, fairly intuitive from what they say; **don't compress** defines what file types rsync shouldn't bother compressing, **timeout** is timeout...and so on.

We need to make another file now; it's the file we've defined as the **secrets** file, or in this case **/etc/rsyncd.scrt** (although you can name it anything you want, so long as you tell **rsyncd.conf** about it). The format for **rsyncd.scrt** is simple:

    klaatu : myverysecretrandomstringpassphrase

Yes, that's the username, a colon, and the passphrase. I use a random string for this because the way I do it, the user doesn't really have to use this directly.
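One gotcha worth knowing: because the example conf sets **strict modes = yes** , the daemon will refuse a secrets file that other users can read, so lock its permissions down after creating it:

```shell
# Create the secrets file if needed, then make it owner-only readable;
# with "strict modes = yes", rsync rejects looser permissions.
touch /etc/rsyncd.scrt
chmod 600 /etc/rsyncd.scrt
```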

So how do I do it? Well, first we need to start the rsync server:

    # rsync --daemon

And now we need to set up a cron job on the user's computers so that they actually backup on a regular basis.

Here's what I do:

  * Create an **excludes.txt** and place it in **/usr/share/excludes.txt**
  * Create a **backup.rsync** file with the passphrase in it, and place this in **/usr/share/backup.rsync**
  * chmod 600 **/usr/share/backup.rsync**
  * Create a shell script called **backup.sh** and place it in **/usr/local/bin/backup.sh**
  * Create a cron job to run the **backup.sh** script every night.

All of that is self-explanatory except the shell script to run the backup process, and the **excludes.txt** file. The shell script to backup could be something like this:

    #!/bin/bash

    # Run the backup, and log a success line only if rsync exits cleanly.
    rsync -avzrpog /Users/$USER --password-file=/usr/share/backup.rsync \
        --exclude-from '/usr/share/excludes.txt' \
        klaatu@192.168.100.4::syncserv \
        && echo "successful backup by $USER on $(date)" >> /Users/$USER/backup.log

The excludes file is straightforward; it's a list of files you do not wish rsync to back up:

    Downloads/*
    *.mp3
    *.wav
    *.flac
    *.ogg
    *.avi
    *.mp4
    *.mov
    *.mkv
    *.ogv
    Vids/*
    Cinema/*
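Before trusting an excludes list with a real backup, you can preview its effect with **--dry-run** , which lists what would transfer without copying anything. This self-contained sketch uses throwaway paths under /tmp:

```shell
# Build a tiny tree with one file the pattern should skip.
mkdir -p /tmp/xdemo/src
touch /tmp/xdemo/src/keep.txt /tmp/xdemo/src/skip.mp3
printf '*.mp3\n' > /tmp/xdemo/excludes.txt

# keep.txt appears in the dry-run output; skip.mp3 does not.
rsync -av --dry-run --exclude-from /tmp/xdemo/excludes.txt \
    /tmp/xdemo/src/ /tmp/xdemo/dest/
```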

Set it up as a cron job, and enjoy.

## 33.4 Rsyncing over the Network

As usual with a POSIX system, network transparency is pretty much a given. In other words, you can send files to an external drive, even if that drive happens to be another computer on your network, or on the internet.

Modern implementations of rsync use ssh by default, so all you need do is give rsync a remote host as either source or destination.

    $ rsync -av ./stuff/ klaatu@example.com:/home/klaatu/stuff/

If your server does not use the default port for ssh, the syntax to specify a different port is:

    $ rsync -av -e 'ssh -p2299' ./stuff/ klaatu@example.com:/home/klaatu/stuff/

## 33.5 What rsync alone does not do

This is the sledgehammer approach to backing up; it's not the graceful or precise or elegant Stealth class, it's the big dumb Berserker who breaks down the door, backs everything up, and lets the cleanup crew deal with the mess.

In other words, **rsync** doesn't give you daily snapshots, it won't remove old data that clearly does not matter to you, it does not know what an "incremental backup" is. For that, you probably want rdiff-backup, which has a pretty good write-up over at <http://slackermedia.info/book/doku.php?id=backup>.

> Of course, rsync isn't just for backing up data; it's a great substitute for **cp** (especially given that it is networked), so don't throw it aside just because you are thinking of switching to **rdiff-backup**.

# 34 Destructive Habits in Approaching Open Source

Many of us grew up using closed source software.

As a result, I think that many of us were implicitly trained to view open source software as **The Alternative** to "the real thing". If you didn't have the money for something, and couldn't figure out how to steal it, then you resorted to using an open source "replacement". But it was usually a temporary fix; something to help you along just until you could get "the real thing".

And we get that reinforced a lot. Our professors tell us we need to get the same software, but in a pinch an open source alternative will do _until you can afford "the real thing"_.

Or else open source was a quick fix for some closed source bug. That happens _a lot_ , even when we don't admit it (and we never admit it). Our expensive closed source application refuses to play ball with our other expensive closed source application, but luckily there is this open source app that just coincidentally happens to have reverse engineered _both_ , thereby bridging them perfectly together.

Yes, open source is the "forgotten feature" of some closed source app. You paid hundreds of dollars (or else some developer did, because you're stealing it) for the software and yet it wouldn't export to the format you need, or it wouldn't import it, or it locked you out of your data because the trial edition expired on you. So you looked around, found an open source tool to bail you out of this pickle, and then you moved on with your life.

Or else open source was a novelty item. Maybe you were bored one day and decided to download a free application. It occupied you for an afternoon, and then you forgot about it.

Whatever the circumstance, most of us have been there once or twice, and these instances gave us the impression that open source wasn't the real thing, but an imitation of some larger, more official reality.

We were brought up, mostly, to think of open source not as the way to compute, but as our backup plan when things went wrong, or our alternative when what we pay for arbitrarily fails us and we have no insurance or guarantee to invoke to make it start working again, or a fun thing to try out now and again. (Except, of course, when we do see it as the real thing, as we do with Firefox and Apache and VLC and far too many things to list here, some of which we use knowingly and others that linger in the background).

## 34.1 The Real Thing

At some point, many people start looking at open source with real interest. Maybe they start noticing how geeks around them are getting a lot of interesting things done using nothing but free and open source software, or maybe they heard that popular mobile and computing operating systems are closed systems skillfully placed atop an open source foundation, or maybe they are just curious to explore computing without arbitrary limitations placed upon them.

Well, I have certainly been in that place myself, and mostly it was a great place to be. It's exciting, it's new, and that's something you really don't get much of in the closed source world. Sure, you look forward to getting some new feature or a new wallpaper and slightly refined icons, but generally after all is said and done, you start to get that empty post-shopping depression, because that's what closed source computing is: the perpetual shopping trip.

Open source is a lot different.

Maybe a little _too_ different, if you're not prepared.

People coming to open source software sometimes approach it with what I'll call a "destructive" attitude. You are free to do that, but if that is your goal then you're wasting your time; it's more efficient to skip open source and stay with the closed stuff where you can be unhappy in familiar territory.

Otherwise, if you want to like open source but aren't sure what it's all about, then here are a few great ways to kill open source for yourself. In other words, these are myths that you may have had embedded long ago without even realising it that are best left behind:

 **"Open Source applications are free versions of closed source apps."**

I hate verbal shortcuts, but we all make them.

"What's Inkscape?"

"It's like Illustrator, but free."

No. That's wrong.

I promise, if a programmer intends to make a "free version" of something, it will be advertised as such. It will look exactly like that application, and people will still say it's not as good.

Programmers are people, believe it or not, with their own opinions, their own ideas, and their own interests. When a programmer sits down to create, say, a drawing application, he doesn't sit down to re-create the flaws (as he sees them) of some other big famous application that, believe it or not, some people loathe. Developers sit down to solve problems; for instance, an application is needed to create vector illustrations.

Once the problem is posed, we could imagine some questions are asked: should I copy the application that everyone knows, even though it is big and bloated and cumbersome and difficult to learn? or should I create a unique interface that is efficient, intuitive, and which makes sense to me? Should I seek to emulate the clunky file formats of the popular commercial application? or should I invent or leverage a new one that is elegant, cross-platform, and open so that everyone can use it to convey data?

Well, imagine an artist sitting down at a piano to compose something. And you go up to that artist and say "I want you to compose me a piece that is EXACTLY like Beethoven's Moonlight Sonata."

It kinda deflates the enthusiasm. It _can_ be done, but it's just not inspiring.

Open source applications, even ones that appear to do _really_ similar tasks as popular closed source apps, have their own ways of accomplishing things. These ways are not wrong. They are different.

As a new user of open source, the burden to learn is on _you_. Have patience, view it as a chance to discover something new rather than a barrier between you and Productivity, and I swear to you, as someone who was a new user himself, it can be done, and it can be fun.

 **Spending 4 minutes with an app to decide you're switching to it forever.**

This stems, I think, from not understanding that open source applications are not "free versions" of closed source ones, but I also see this _a lot_ in those dime-a-dozen "10 Free Apps to Replace Foo!" articles online. Here's how it goes: you stumble across an open source application online, its screenshots look pretty good, it claims to do all the stuff you are looking to do, so you install it. You launch it, you poke around, it appears to have all the right buttons and icons, so you try one of its basic features, it doesn't crash, so you close it and announce to the world that you have found a GREAT replacement for your old closed source albatross and it feels swell and Open Source rocks!

This sounds good, right? sounds like open source speed dating doing what it does best. Except when you're ready to sit down and actually get some real work done. Two things happen:

  1. It turns out that the application is missing key features, whether it's a function that you took for granted or just basic stability, that you didn't think to check for in your 4 minute trial run.

  2. The application does do what you need, but you don't know how to use it yet and it's different from the closed source application that you were used to.

These are both deal breakers, and the time to find them out is _not_ the moment you want to sit down and settle in to a nice relaxing evening of Taking Care of Business. Yes, open source is great and yes, people make amazing stuff with it, but you wouldn't go out and buy a new software package without putting it through its paces, would you? well maybe you would, but you wouldn't structure your day around suddenly being able to use it without spending some time with it first, would you?

OK, sure, people _do_ that actually, but that ends exactly in the same way. So my point is, put an application through its paces before attempting to adopt it as the lynchpin of your productive day.

It could be that the application you stumbled across is over-selling itself, or maybe it's just not right for you. Find several options, audition them, learn them, and _then_ get to work. Doing this in any other order (for open or closed source apps) is going to lead to frustration.

 **Spending 4 minutes with an app to decide that it's an open source frankenstein that should be killed.**

Yes, it's the mirror-universe version of the previous habit: you find an open source application, you open it up, you look at it for a few minutes, and one of two things happens:

  1. It doesn't work the way you expected; it must be from the dark ages.

  2. It doesn't look like what you are used to; it must be badly designed.

Look, most of us are encouraged to _not_ shop around when it comes to software. That doesn't mean it's the right thing to do, but that's what we learn. Do you want to create a database to keep track of important data at work? how about using a spreadsheet application instead? close enough! Do you want to do some page layout? how about using your word processor or a photo manipulation application? Want to share confidential client data with your co-worker sitting one meter away? post it to your cloud drive.

These are all real-life examples; we've all seen it, and possibly we are guilty of a few ourselves. But it's time to strive toward working smarter.

If you pick up an open source application that you have determined does tasks that you are seeking to do, you need to understand that it's going to be different _by design_ , and it may look different because (news flash) not all applications look the same. Get over it, invest some time, do some research, and learn.

 **I am too {old, poor, rich, busy, dumb, stubborn, confused, smart, brilliant} to change.**

It's easy to make up excuses to resist work. Believe me, I know. But if you believe something should be done, then you may as well face yourself in the mirror, acknowledge that you have looked into the abyss, and get down to grappling with all the barriers between you and whatever form of low-level enlightenment you are after. I'm talking about a lot more than software, here; maybe you have seen that capitalism is not working, so you want to strive toward supporting the economy of re-use, maybe you have seen that animal slaughter is excessive and want to embrace vegetarianism, maybe you see that globalism is not working well and want to embrace your local culture and businesses, maybe you want to start gardening, or maybe all you're doing is getting into open source software.

Whatever it is, it's obviously possible, and it's obviously something that you can handle. It's just going to take a little, or a lot, of extra work on your part. The good news is that, in spite of what the tech world tells you, there's no race happening. You can learn and adopt at your own pace.

I'd be lying if I said that I quit closed source software cold turkey. It was _almost_ cold turkey, but when there were deadlines that had to be met for fear of failing a class or losing a job, I sometimes had to fall back on familiar tools. I did make it a point, when the deadlines were less urgent or absent, to do the task on open source because, even though it was painful, it taught me the lessons I needed to become a pro.

My philosophy is often to "do it the hard way". If you keep trying to find the most inconvenient, most difficult way to do something, you eventually run out of difficult options because you eventually get really good at all the things you used to see as being insurmountable. Again, sometimes you have to fall back on what you know, but you have to be unafraid of learning new things, because your goal is to get better at what you do.

And in the end, you'll reach your goal.

 **Nobody uses open source, or nobody uses it for REAL work.**

There is a very real temptation, especially when something is particularly frustrating, to suspect that nobody actually uses open source, or that nobody uses it the way _you_ are trying to use it. Sure, an application's developer and the other people harping on about the application "use" it, but are they really _using_ it or do they hack on some code, launch their application, see that what they just did does work for their one-time not-real-world test, and then close it? Do they live in this application for days at a time?

I don't want to turn this into a competition, but dear reader, as an open source video editor, I know this feeling maybe far better than you will ever understand. To this day, I still question whether the developers of a few major video editing applications have ever spent more than an hour a day (if that!) in their own product. It's a valid question, especially when you can't find even one example of an edit with L-cuts and effects and a runtime of something greater than 15 minutes from any of the devs working on the editing apps.

The situation can feel even more frustrating when you go to the trouble of reporting a bug only to get the response "you're running an old version, update to the latest git commit and try again".

The truth of the matter is, however, that not everyone works the same as you or me. People _do_ use the applications, but not everyone uses them the same way or to the same degree. It took me a long time to understand this, myself. It helps if you append the phrase "for me" after all of your bewildered criticisms:

"This stupid application doesn't work," you say, "for me."

"This application is so bad," you say, "for me."

And so on.

I'm not saying there aren't applications out there that are purely exercises for a bored programmer. I do wish programmers would make it clear on the application's download page that they don't actually expect normal users to try to use it for anything, but if you look at it from a programmer's point-of-view, it actually makes sense to just post something on the internet. If someone happens to find it useful, then that's cool!

It is up to us, therefore, to shop around. Find applications that work. For you.

 **Only highly specialised users use open source, and they use it for big important stuff, not for everyday tasks.**

While it's true that Linux and open source are used in highly specialised industries (networking, movie special effects houses, animation studios, custom robotics, scientific research facilities, simulations, video game studios, and much more), it's not true that that's the only place open source gets used on a daily basis.

Normal people, just like you and me, use Linux and open source tools for boring, everyday tasks, all the time. Intentionally. If all you look at online are websites that serve advertisers and echo commercial marketing rhetoric, then it's true that you don't hear about it all that much, but the communities I am in consist of people who use open source every day, all day, for everything they do at work and at home!

 **I have always worked THIS way. I refuse to accept that there is any other way to work.**

For years, I championed the Mac "global menu bar", in which the menu bar of any application was dynamically positioned at the top of the screen depending on what application was in focus. Why? because it was logical and efficient; you can't use a menu bar unless the app is in focus, so why take up the screen real estate by putting a menu in _each_ application window? I swore publicly, loudly, that I would never change my mind about this.

Years later, I don't know what I would do with a global menu, except maybe gnaw off my own hand. You don't realise it on apps you use every day, because for those you have memorised all the important menu functions as keyboard shortcuts (things like Save, Open, New, Print, and so on), but have you any idea how tedious it is to have to move your mouse _all the way_ back up to the top left of your bloody screen every time you need to access the application's menu?? My gosh, the global menu made sense when the screens were 12" and singular, but we have BIG screens now, and lots of them. The global menu is a curse! (Incidentally, sometimes you want to _see_ the menu options available to you without having the application in focus. It doesn't happen often, but constantly switching back to an application every time I need to write out an instruction for a user does get tedious.)

That, of course, is just an example. My point is that sometimes even the rules of computing that are written in stone tablets, in your mind, are...well, wrong.

There's a lot of comfort in familiarity, but be patient. You'll be surprised at what gets to be "normal" for you. And don't stop at just being patient; explore your options. Find new tricks that your old platform didn't even have (a clipboard manager!? multiple virtual desktops!? super key global key bindings!). Give it a year or two and you'll be amazed at how quaint and sad your old platform - yes, the one that just couldn't be beat - looks to you then.

 **Open source software has no support.**

This is a funny one, because it's both as true as closed source software and also as completely untrue as closed source software.

Here's what I mean.

First of all, closed source software does not really have support. If something goes wrong, you can make some calls or shoot off some emails, but realistically these are placebos. It's not like you have an emergency number to call that will result in an elite force of software experts descending on your office, taking seats at your computers, and making everything work like you want it to work.

There is often a feeling of support, though, because if you do a search on the internet for how to do something in some closed source but popular app, you're sure to get results. So you think there is support.

Well, the same is actually true for open source; do the same search, but instead of a blind search for "how do I do blah blah", try a slightly more specific search, like "how do I do blah blah on Open Foo". Support!

Open source has, actually, even more options; you can file bugs that the developers _actually see_. And of course, you can post in forums just like with closed source software (although you probably want to hunt down forums specific to your application), and you're actually very likely to get a response from either the developer or other users. They may not come to your house to hold your hand through your difficulty, but the help is usually there.

But let's assume the worst; let's say you have no network connection, or you do but no one is helping you. So you want to learn the software. You go to a bookstore and scan the shelves and find nothing. You panic.

Don't panic! there is plenty of information on the software you are using; you have only to seek it out. It may be free online, or in books you can purchase from an online store. Or the software itself may have a manual. Trust me, the information is out there, you just have to look for it and not panic when it appears that the one shelf your local library or bookstore dedicates to "computer books" (you know, the For-Dummies books about Microsoft Word) does not bother mentioning the software you use.

 **This open source application has cool screenshots and claims to be worth a million dollars. It must be great!**

I feel like it's time to temper all the praise I've heaped on open source and acknowledge that sometimes open source developers get prematurely over-enthusiastic about their own product.

I can't count the times I've seen a developer claim that his app runs on every platform; it turns out to be technically true, but functionally the thing is useless on anything but the one platform it was actually developed on. And there's the temporal-displacement issue, in which the dev presents the great amazing app his software will surely become sometime in the future as what it is right now...when actually, right now, the app barely stays open for 5 minutes before crashing.

This is why you, the user, absolutely need to audition apps, test them, use them, and decide for yourself if they will really meet your needs. Don't listen to the developer, and don't listen to its users who talk loudly about how much they love the app for what they do, because what they do may not be the same as what you do.

Once again, I have to point out that this is also true for closed source apps. The claims made by entire platforms are blatantly false, and completely beside the point. Just because I buy your computer brand or I buy into your OS, it doesn't mean (as it is often implied) that my personal relationships are going to be stronger, my life is going to be easier, my house will become beachside property, the world will become greener, the sky will brighten, my art will become more widely accepted, and I'll become richer. This is marketing and has nothing to do with computers or software. If I were to make decisions based on these marketing ploys, I'd be...well, I'd be along with status quo, I guess, but that's not really my goal.

 **This doesn't have 100% feature parity with your old app, there's no WAY you can use this.**

Sure, there are differences to applications, and maybe you can get over that, but sometimes there's the other issue of scope.

By "scope", I mean that a closed source application you used before might have done Foo, Bar, and Baz. All in the same application. So it seems strange, or even backwards, that the open source app you use now does only Foo and Bar, or only Foo and Baz. Or whatever.

Once again, sometimes you just have to get used to the idea that application design and workflows are not the same everywhere. It might seem obvious to you that an application should do some set of tasks, and it might seem painful that some "obvious" task is left out, but to somebody else the opposite might be true. For instance, I have long said that audio editing should **not** be included in a video editing application, because it leads to bad sound _every time_. But everyone wants and expects audio in their video editor. Yes, people see the "obvious" very differently.

The solution is to look for some other application that does provide your missing feature. Don't insist on having an all-in-one application if that's not how open source devs appear to be doing things. Let your workflow be flexible, and you might find that you actually prefer the change!

 **Open source software just feels different.**

Open source does feel different, but so does closed source application X when you got trained on application Y. And, believe it or not, closed source apps feel different to me when I have to use them, due to the fact that I have used open source applications for so many years.

I think that's one of the funniest things about the "closed source" vs. Open Source debate; people who get spoonfed their computing environment just assume, for whatever reason, that the app they get from their corporate caretaker is the right and normal and obvious and best way to do something. So it's open source that is different.

But to an open source user, the opposite is true. I like the way things are done on my OS, in my callection of applications, and the fact that closed source apps can't figure out how to emulate those annoys me when I have to use a closed source environment.

Either way, we humans are more adaptable than we realise. If I wanted to trade my independent thought for corporate control, I could, and I could adapt, and I'd probably learn to love it eventually. Just as I learned to love open source. (To be fair, I do think it's probably more difficult to go from open to closed.)

 **You can buy and sell open source applications.**

You _can_ buy and sell open source applications, so this is actually a true statement. However, I do believe that it largely misses the point of the open source model. I have no issue with open source being bought and sold, and in fact I have paid for open source more frequently than I've paid for closed source.

But the power play is different.

In closed source, a corporation pays developers to realise the company's vision for a product. Because the price is right, the developers emulate whatever level of personal passion is required to sit at a keyboard all day and write code that meets their employer's requirements.

The company starts a marketing campaign to ensure that people are primed to "want" what the company will soon sell them. It's not just ads, it's articles in magazines, and demos at trade shows, and making sure that early copies get into the hands of people producing content that people are going to see so they say "I want to make _that_ happen too!".

Then the company sells the software as an executable application, and that's all they sell. They put hard limits on what their audience can do with that application because to permit more than what they have accounted for would:

  * Potentially rob them of an opportunity to up-sell something as a new feature later.

  * Present a variable in their ability to "support" their own application; from a development standpoint, this has a lot of merit, at least when you, as these companies do, see support as a liability rather than the main money-making asset.

Open source does, well, _none_ of that. Open source projects see development as a collaboration between developers and users, so the software gets developed as a platform. Users are given _everything_ , and are able to do anything they want. The support "costs" go up, but since the support _is_ the development, that's not really a problem. Every problem is turned into an opportunity to improve. Every time a user abuses the application, there's a potential new feature being discovered.

So, yes, you can buy and sell open source software, but not in the neo-feudalistic sense that closed source uses.

 **You can buy and sell open source developers.**

Yes, you can hire open source developers the same as closed source developers. There are entire cottage industries built around the fact that anyone providing a service to someone can hire a developer, pay the developer to write code, and then release that code to the public. That's how commerce, whether it's bartering or feudalism or capitalism, works.

On the other hand, no one can "buy out" a license, and that's where the insurance inherent in open source really makes a lasting difference. Two developers come to mind as examples. The Common Unix Printing System (CUPS) was developed for years before Apple Inc. bought it up.

Quote from the creator and lead developer of CUPS, Michael Sweet:

 _In February of 2007, Apple Inc. acquired ownership of the CUPS source code and hired me (Michael R Sweet), the creator of CUPS. CUPS will still be released under the existing GPL2/LGPL2 licensing terms, and I will continue to develop and support CUPS at Apple._

The developer was "bought" but CUPS itself persisted in spite of the famously proprietary Apple Inc. Could it have gone badly, had the developer "sold out" and agreed to a non-open license? Well, not really; a code snapshot of the last public version of the code would have been forked and developed as freeCUPS (or whatever they would have called themselves) and printing on Linux and BSD and others would have continued.

Similar circumstances happened with the mySQL database; Oracle bought mySQL, its lead developer presumably got paid, and then he himself took the last public commit and created mariaDB in what is probably the slickest business deal I've heard of in a long time.

So yes, you can buy and sell developers, but only if they choose to be bought.

## 34.2 So, what is Open Source all about?

Open source is about a lot of things.

It's about developers; some are looking to learn new code tricks, others are seeking to make a name for themselves, some are looking to make an application to fill a need they have, and others are making an application to fill someone else's needs.

It's about users; some are curious hobbyists looking for new fun apps to learn, others are starving artists who need a cheap but good toolset, some are big companies looking for a malleable platform, and others are hobbyist developers helping applications grow.

It's about security; the security inherited from several developers looking at code to make sure it has been written in a sensible and secure way, and the security of knowing that the application that created your data can always be saved along with that data, so that a user is never left locked out of their own work. That is a guarantee every user should get when choosing software, but it only ever happens with open source, because open source is the only place you actually get the source code.

It's about independence and choice, the opportunity to shop around for what works for you; it's about efficiency and working smarter rather than harder. It's about unleashing the potential of these fancy boxes sitting on our desks that everyone has always told us are really powerful and yet which traditionally only seem to create more work for us.

And it's about passion; a passion for learning, for exploring, for discovering, and for sharing.

But if you grew up with closed source applications, it's also about change. Happily, you're smart and adaptable, and you can do it. It will be frustrating sometimes, but in the end it will be hugely rewarding. Decide now to see it as an exciting and positive move, and you'll actually enjoy it, between the moments of frustration. And honestly, between you and me, there really is nothing quite like that feeling of mastering something new, and realising that now that you have figured this out (whatever "this" is in any case), you have the power to do _anything_.

# 35 Using 'su' and 'su -'

Everyone knows that **su foo** switches a user over to, in this case, a user named **foo**.

However, not everyone realises that this doesn't bring along with it foo's _user environment_. For example, if user klaatu does not have **/opt** in his $PATH, but user **foo** does, then if I su over to **foo** , I would expect to be able to launch an application that lives in **/opt** , right?

Wrong. Because

    $ su foo

does not inherit environment variables.

Of course user foo can still launch applications from **/opt** by providing the full path, but if you want to switch over to foo _plus_ foo's world (meaning foo's user environment with all of its variable settings and customizations), you must:

    $ su - foo

Note the **dash** between the **su** and the foo.

Now you can do everything the user foo would normally be able to do without any unexpected PATH problems. (I have to thank Popey for that tip; he saved my life with it while I was flirting with postfix. I was amazed I'd never encountered that seemingly basic yet vital distinction in all the beginning UNIX books and courses I've taken.)
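You don't need a second user account to see the mechanism at work. A plain child shell inherits the caller's environment, while a login shell (what the dash gives you) rebuilds it from scratch. A rough simulation using `env -i` as a stand-in for the scrubbing that `su -` performs (DEMO_VAR is just a made-up variable for the demonstration):

    # A variable set in the calling shell...
    export DEMO_VAR="from-caller"

    # ...survives a plain child shell, the way plain `su foo` would pass it along:
    sh -c 'echo "plain: ${DEMO_VAR:-unset}"'

    # ...but vanishes in a shell started with a scrubbed environment,
    # roughly what `su - foo` does before sourcing foo's login files:
    env -i sh -c 'echo "login-style: ${DEMO_VAR:-unset}"'

The first command prints the variable; the second reports it unset, which is exactly why foo's own $PATH only shows up when you remember the dash.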

So, that's all you ever wanted to know about **su** and **su -**.

# 36 Free Gaming

When I game, I play both open source and closed source video games.

I grew up watching my friends playing video games on their consoles, and playing along whenever they would let me. It was fun, I loved it, but my parents wouldn't get us a console and the household OS wasn't exactly known for its game library.

When I realised at last that I was an adult and could go and buy a game console for myself, I resisted for a really long time, partly for fear of becoming a video game addict and partly because I wasn't sure how I felt about games. After all, the gaming industry was extremely proprietary, to the point, in fact, that some games were exclusive to one console and never got ported.

I just wasn't sure that I wanted to contribute to that kind of marketplace.

I thought on this for years, and did a few podcasts on the subject (I think only two of those aired). Eventually, I came to this conclusion:

  * In an ideal world, video games would be open source, because there is probably some graphics-related code that might be useful to others, and besides, sharing is nice.

  * In the real world, video games do not enable me to generate data that I care about, and so if I lose access to the game, I have not lost anything except a somewhat pleasant way to pass an afternoon or a late night.

These two points are true for me. They may vary for others; certainly the last point. Some people do care about the games that they play, the characters they create, levels they design, and so on. If that's the case, then they should at least have the ability to retain that data regardless of what happens to the game.

And the last point also ignores the fact that someone did create the game itself, but since they did so for hire, they don't own their work. That seems pretty dismal to me, too.

## 36.1 Games as Culture

Of course, this plight sounds really familiar to any working artist. You work for hire, you create amazing things, but you have no ownership over that thing.

It seems that in art, society has traditionally dictated a limitation to any legal claim of ownership after a while. You could argue that Rembrandt would have never wanted his stuff displayed at the City Gallery, certainly not right around the corner from a hack like Gauguin, but we ignore all that for the sake of preserving culture.

To a degree, we already see that happening today with computers, programmes, and video games. Eventually, it seems, things fall out of total ownership and get put on display. I don't know how or why it happens, I don't know the legal implications, and I dislike that it is so informal. I do not believe we should be relying on the shrug of someone's shoulder to suddenly decide when something is both important and old enough to warrant Emergency Preservation. I'd rather _not_ rely on that for preservation of the art of video games.

Calling video games "art" does understate the issue a little, though. Video games are obviously art, because lots of things are art. People made the thing, so it's art. It's worth preserving to someone. But video games are more than that; they are part of a culture. And it seems sad to me that companies are happy to squander artifacts of a culture for fear of the competition.

The same argument could be made of books and movies and music, though. There's the art itself: the literal item that you can pick up and hold (or hear or watch), and then there's whatever group it was made for, or whatever group picked it up and championed it as their own. While I respect that, I don't believe it's exactly the place of the artist or the corporate entity that commissioned the art to preserve the culture it is serving. That's theoretically up to the people within that culture, right? well, it is, up to the point at which the art itself can no longer be preserved by them because the game is inaccessible due to arcane copyright law.

## 36.2 Open for Culture

In short, I believe video games should be open source for the sake of cultural preservation first, and not really for the code itself.

I mean, I am sure that the code is worth looking at, and I'm sure people could learn something from it, but honestly I've seen game code and I've taught programming through video games, and it just is not that different, one from the other. Game engines are cool and they can make possible amazing visual things, but you don't really need to see the code to understand what's going on.

Now, that's a dangerous can of worms to crack open, because surely we can say the same of nearly everything. And actually, more or less, yes that's true; usually in open source arguments, it's _not_ the actual line-by-line ascii code people are after, it's the ability to use a file or a piece of hardware with or _without_ that code. I honestly believe that if corporations would just provide specs for file formats and hardware, the open source movement would get a lot quieter, because they'd be busy writing _better_ code for those devices and files. It's not the privatization of the code people care so much about as it is the blockade erected between me and the piece of hardware I paid good money for.

It seems to me that all video games do basically the same thing. They are all for entertainment, and they have minimal user data being generated. I cannot bring myself, personally, to really care about the source code, because I have a hard time convincing myself that liberating video game code would help advance society in any meaningful way.

In other words, I can't stop looking at video games as appliances. I am willing to be talked out of this view, and in fact I have discussed the point with anyone interested in both video games and free software, but so far I have not really found anyone so passionate about both that they attempt to change my mind. If you're on this site, you can email me your thoughts; use my first name and the domain name.

My point is that it's not the code that the gaming culture wants to preserve, it's the _thing_ that is _the video game_. Now, it may take access to the code (or at least, again, the specifications on how to make that code work, for instance, on a specific emulated platform) to preserve the thing, but the code itself is not what most people are after when they say "we really should preserve this game so that my children's children can see what we used to play".

So for the sake of a culture, I do strongly believe that video game companies should at least follow the path of Id Software, which pretty reliably releases old code into the wild when they're finished with it. I think that's a pretty darned good policy, and not just because it preserves the game; I like the idea because it's efficient. The game code has been "used" (in the dirty-capitalist sense), so hand it over to any member of your culture who cares about it enough to maintain it. And that's exactly what happens.

## 36.3 Books and Movies and Music

Video games are still relatively new, and they exist in a different world than the world that "classic" artworks existed in. I mean, we really treasure old paintings and books and musical scores because, relatively speaking, we have so little from the past that we can look at to get a real idea of what life was like back then. For all I know, humanity with all its fancy digital gadgetry has entered a new hoarding stage of life in which every little thing will be preserved, and future generations will curse us for littering them with every single thing we ever did.

Then again, maybe not.

Maybe future generations will be curious about what we got up to all day, what we were interested in, what we dreamt about, what we saw as our greatest obstacles and threats, what we sought to solve, and how we thought it could be solved. Video games, and the source code that makes them run, are pieces of that puzzle. It'd be nice, I think, to make sure that gets out to the people who come after us.

# 37 GNU tar

Here are a bunch of uses for GNU tar, sometimes also called gtar and mostly just called tar.

## 37.1 Creating tarballs

To gather up a group of files into one archive, give the tar command a _destination tarball_ (which you'll be making up, so call it whatever you want) followed by the _source_ (the files you want to tar together):

    $ tar -cvf foo.tar foo

Just tarring a group of files does not compress them in any way, it just makes them easier to move around as one blob. For compression, you can have tar also gzip or bzip the archive:

    $ tar -cjf foo.tbz foo  
    $ tar -czf foo.tar.gz foo

and so on. Common extensions are .tar.gz and .tgz for a gzipped tar file, and .tbz and .tar.bz2 for a bzipped tar file.
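GNU tar can also pick the compression programme from the file name you give it, via **-a** (**--auto-compress**). A small throwaway demo (the directory and file names here are made up for the example):

```shell
# -a / --auto-compress chooses gzip, bzip2, xz, etc. based on the extension.
dir=$(mktemp -d)
mkdir "$dir/foo"
echo "hello" > "$dir/foo/bar.txt"

tar -C "$dir" -caf "$dir/foo.tar.gz" foo   # gzip, inferred from .tar.gz
tar -tf "$dir/foo.tar.gz"                  # list what went in
```

Modern GNU tar also auto-detects the compression when reading, which is why a plain **-xf** works on .tar.bz2 files later in this chapter.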

## 37.2 Add a file or dir to an existing tarball

 _This will not work with a compressed tarball_

    $ tar -rvf foo.tar blah  
    $ tar -rvf foo.tar llamas/

## 37.3 View a list of files within a tarball

    $ tar -tvf foo.tar.bz2  
    foo/bar.info  
    foo/bar.SlackBuild  
    foo/baz  
    foo/baz/baz.a  
    foo/baz/baz.o  
    foo/llamas  
    foo/llamas/purple.txt  
    foo/llamas/red.txt  
    foo/llamas/teal.txt

## 37.4 Extract just one file

    $ tar -xvf foo.tar.bz2 foo/bar.info  
    foo/bar.info

This dumps bar.info into a new directory called ./foo

## 37.5 Extract a directory

    $ tar -xvf foo.tar.bz2 foo/baz

## 37.6 Extract multiple directories

    $ tar -xvf foo.tar.bz2 foo/baz foo/llamas

## 37.7 Extract files using wildcards

    $ tar -xvf foo.tar --wildcards '*.o'

## 37.8 Extract a tarball

    $ tar -xf foo.tar.bz2

## 37.9 Extract a tarball to another directory

    $ tar -xf foo.tar.bz2 -C ~/blah
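Putting a few of the commands above together, here's a complete round trip you can paste into a terminal. It uses throwaway directories from mktemp, and the file names are just for the demo:

```shell
# Create a scratch tree, archive it, then extract the copy somewhere else.
src=$(mktemp -d)
dest=$(mktemp -d)
mkdir -p "$src/foo/llamas"
echo "purple" > "$src/foo/llamas/purple.txt"

tar -C "$src" -czf "$src/foo.tar.gz" foo   # create
tar -tf "$src/foo.tar.gz"                  # list the contents
tar -xf "$src/foo.tar.gz" -C "$dest"       # extract to another directory

diff -r "$src/foo" "$dest/foo" && echo "round trip OK"
```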

# 38 Marketing Exclusivity

Informing people of the existence of something is one thing. Marketing is quite another. The former is, well, informational; its purpose is to _inform_.

I do not generally believe in the myth of "unbiased" opinions or analyses. What usually happens when an unbiased opinion is given is that you try really hard to be fair to all of the things being considered, and so you end up at an 18% gray spot where everything is both as good and as bad as everything else. And so the inevitable follow-up question is "OK, so what do YOU suggest/like best/use?" and then it comes out: your biased opinion, based on your life experience.

There are exceptions, but in general I think that even information intended as an unbiased look at a collection of things is, at some level, as biased as anything else.

Marketing, on the other hand, swings so far in the other direction that the word "bias" loses its very meaning. It's not just _biased_, it is manipulative. And that's dangerous stuff.

Marketing, which really is just a corporate term for "propaganda", can be so amazingly effective that it really does qualify, at least in my mind, as a religious phenomenon. People become so convinced that marketing is sincere that they become irrationally dedicated to the message. I don't know enough about psychology and all that fancy stuff to know _why_ that happens, but it does, and in a big way.

I realise that everyone is a bundle of a wide variety of contradictions, but marketing really brings it out in us. Avowed vegetarians will wear leather, dedicated "greens" will purchase cell phones and computers they know will be forced into deprecation in a year, feminists will wear makeup, polite society will read a kinky novel, and so on.

One of the most shameful examples of this phenomenon is Apple Inc., which, aside from taking obvious inspiration and technique from Scientology, I believe may have truly perfected it for socially-acceptable capitalism.

In today's world of marketing, stores and brands, like Apple, entice you to buy your way in. That's not new, but once you are in, you get congratulated by fellow customers and staff as if you actually had to work to get the thing you just bought. They're not congratulating you on learning something new, or achieving anything; they're (sort of) congratulating you for sacrificing things from your life so that you can afford to buy your way into that brand status.

For this to work, Apple (and others) need their products to remain expensive enough to demand sacrifice. If they lower prices, then just anyone could buy their way in, and the exclusivity is gone. Apple needs people to see that Apple products are not obtainable to everyone. Some people cannot get into Apple, because some people are unable to make the necessary sacrifices.

And how much is too much? depends on who you are. It's a sliding scale, ensuring that someone will always be without the status symbol. Hey, it gives them something to work toward.

"But hold on," you may say, "this is classism! This is, like, the thing we were all supposed to have left behind decades ago."

Ah, but if you make that sort of protestation, you're called out for being anti-capitalist (usually not using that term). I mean, after all, you have to give a little to get a little, right? you have to pull yourself up to the level of Apple, or whatever, to earn its grace. You _can_ do it, but you have to get there somehow. If that "somehow" involves leaning on donors, or parents, and letting yourself be "sponsored" up to the top, that's OK, as long as you keep it quiet.

Could have a point, could be right; I don't think so. I do know that the opportunities in my life have all come from open information and open exchange of ideas and resources. My first Linux computer was a $25 laptop off of Craigslist, and I've been rescuing computers from dumpsters ever since. This didn't just enable a starving artist (me) to learn about computers to the point that I started a successful [enough] career as an IT guy, but it enabled me to teach others, many of whom could not afford "fancy" computers (I'm not even talking the Apple-level of fancy here).

I am not naive. I know better than to think that making everything suddenly free for everyone is going to suddenly mean that every lazy bum craves knowledge and productivity. But at the same time, I don't believe in the caste system, either. So let's do away with that.

# 39 Inclusive Technology

I read a lot of geek-centric articles and books, and I have for my entire adult life. In my foolish youth, I would purchase tech books which usually included a 30-day trial version of the non-opensource software being taught. The inevitable problem was that you then had only thirty days to learn as much as you possibly could about that software, and then both the software and the book became useless. The implied solution was probably what you are already thinking: either uninstall (taking care that you find every single cleverly hidden and obfuscated file that will betray that you once had the software installed) and then re-install, or more likely, find an illegal copy of the software to prolong the habit.

These days I use only open source software, so the tech articles that I read generally do not suffer from that problem. Sure, they might become less useful as software grows and develops, but there is no inbuilt deprecation besides the ever-onward marching of Time, to which we are all a slave.

The other day, I was reading an article that arbitrarily decided to use a specific Chrome plugin for a very generic task. It was, basically, a poorly written article, because it taught one very specific way of accomplishing a generic goal; if the article had been called "How to do _Foo_ with Chrome Plugin _Bar_ " then that tactic would have been fair play, but it was not. So the fact that it was forcing Chrome upon you for this task was a little annoying, but hardly a deal breaker.

And that's when it hit me: it wasn't a deal breaker because Chrome is a $0 browser that can run on basically any computer OS (and if not, then Chromium probably will). There is no innate barrier to entry here. If I want to learn how to do this task in Chrome, I can. If I am running a PC with an out-of-date and unsupported OS, I can choose instead a free and current OS, install Chrome, and I can _still_ learn the lesson I am seeking to learn. If I am running a 10-year-old RISC computer, I can compile Chromium and _still_ learn the lesson.

This might not seem like a big deal to you, unless you have been a starving artist, or a starving college student (no, I mean the kind _without_ a trust fund), or a geek, or, well, any person who has had a barrier placed in front of them entirely for arbitrary reasons. I have been there. I have been in the position where I did not have the money or resources to acquire a software title, much less a new computer that would run the thing, making it impossible for me to learn something.

And that, my friends, is called Exclusivity.

The thing that makes me angry about this is that I was there, wanting knowledge, being told by professors and salespeople that I _needed_ that knowledge, without even the slightest alternative offered. The only "alternative" was to use a trial version of something (and this assumes you have a computer powerful enough to run it); so, in other words, enslave yourself today for free so you can pay for the abuse later in life.

But all along, there were $0 opportunities, had I only known.

 _Exclusive_ is the opposite of _inclusive_. It is something that is created, and in fact specially designed, to keep people away from something unless they meet certain requirements.

I understand that, the world being what it is, not everyone can have and do the exact same thing, all the time. I realise that brain scans that require a super computer, or a telescope that requires millions of dollars worth of lens design, is going to be innately exclusive. I even understand that not everyone can be allowed to do whatever they want whenever they want, whether the restriction is because there are crazy nutters out there who are unsafe to be around, or whether it's because there really is only so much of that thing to go round.

But that's not what we are talking about here; I am talking about things that anyone can learn, and by learning anyone can improve their options for future careers, or improve their ability to innovate in ways that no one else is innovating.

For instance, learning how to use a programming language should not require a 30-day trial. Full stop.

Learning the basic concepts of compositing images, or retouching photos, or digitally painting, should not require 30-day trials. These are things that modern technology can do for free, so if you are trying to teach me how to do these things, do not give me instructions using applications that require me to meet a class-ist requirement.

Give me the knowledge for free, on a free platform and let me, if I really want to screw myself over that badly, go pay to learn the same procedure on a restricted, pay-to-play application.

Are you afraid of over-saturating the market? are you afraid that you'll let someone in who is not, in fact, passionate about the process after all? Those are dangers, admittedly; I would argue that film is pretty saturated at this point, and I'd argue that music is saturated, and everyone can retouch any one of the 80,000 photos they take on their mobiles.

That's what "progress" looks like, though. Stuff that was once mysterious becomes clear, and then the human race moves on to tackle the next piece of this great puzzle we call passing-the-time.

In other words, teachers, both formal and informal, of the world: default to Free.

# 40 Setup VNC on Linux

VNC is an open protocol that allows a user to log in to a computer remotely, and most notably, _graphically_. It's available on any UNIX and UNIX-like system, Mac OS (yes, I know), and Windows.

If you're a Linux user, you are probably thinking that you already have this capability with **ssh -X** and you'd be right...in a way.

If you create an SSH tunnel and use X11 Forwarding through the tunnel, you get a similar effect, but X11 forwarding only carries individual application windows. Many applications leverage SSH (rsync, git, and others), but ultimately you are just running applications remotely. With VNC, you are logging into the remote computer such that the entire session is delivered to you. You see the screen of the remote machine exactly as you left it (or exactly as the user is seeing it, in the case of support calls).

In other words, it's full-session screen sharing.

## 40.1 Platform Notes

On Linux and BSD, X11 is sort of the default graphical remote login method, but few people actually use it for that, and its native remote display protocol is unencrypted, so VNC (ideally tunnelled over SSH, since plain VNC traffic is not encrypted either) has become the de facto default.

On Mac OS, a builtin method was implemented through iChat and then re-branded as "Back to my Mac" and then I think it changed again after that. Also, an "Apple Remote Desktop" application existed, which could be purchased separately. As is often the case with Apple technology, the longevity of these is sort of a joke, and it's so extremely proprietary that you literally must be on a Mac to access another Mac. I think it's safe to say that no one "in real life" messes around with the Apple implementations and just uses VNC, which is now itself builtin through the **Screen Sharing** option in the **System Preferences** > **Sharing** panel (last time I looked).

On Windows, RDP (Remote Desktop) is the default method of screen sharing. Microsoft publishes the protocol specification these days, and clients exist for other systems, so it is usable from them. However, there's no harm in running VNC as an alternative.

## 40.2 Start the VNC Server

First thing is to start up a VNC server on the _target computer_. You do this so that the computer you want to log onto is listening for incoming connections. The easiest one I have found is **x11vnc**.

  1. If **x11vnc** is not installed on your system, install it with your package manager (apt-get or yum or zypper or whatever you use).

  2. Next, create a password file so that not just anyone can VNC into your computer, and make it readable only by you.

        $ mkdir ~/.vnc  
        $ echo "myReallyGoodPassphrase" > ~/.vnc/auth  
        $ chmod 600 ~/.vnc/auth

  3. Finally, start the server.

        $ x11vnc -passwdfile ~/.vnc/auth -forever

Now your computer is listening on port 5900 for any authenticated request to connect via VNC.
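If you want to sanity-check that the server really is listening, this little probe (my own convenience snippet, not part of x11vnc; it uses bash's /dev/tcp feature, so it needs bash rather than plain sh) reports whether anything answers on the VNC port:

```shell
# Try to open a TCP connection to the default VNC port on this machine.
if timeout 1 bash -c 'exec 3<>/dev/tcp/127.0.0.1/5900' 2>/dev/null; then
    echo "port 5900: open"
else
    echo "port 5900: closed"
fi
```

Run it on the target machine itself; "closed" there means the server never started, whereas "closed" from another machine points at a firewall (see below).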

## 40.3 Make the Connection

On another computer, get a VNC client. Because I'm frequently on KDE, I use KRDC, although I've used TigerVNC as well, which is cross-platform so you can use it on any computer you happen to be using.

They all work basically the same way. You start the VNC client, you type in the username and IP address for the computer you want to connect to, and a new window opens with that computer's desktop in it. Done.

Broadly speaking, there are two ways to connect; you are either connecting to a target machine inside your network (maybe the computer in your bedroom, while you are sat on the couch in the lounge, or from your office to another user's computer in another office at work), or you are connecting to a computer on another network (support calls from your grandmother, or support calls from the regional office).

If connecting from within the same network, it's pretty trivial to get the IP address; just type

    $ ip addr show

into a terminal, and the IP address of the machine you are on is listed for you.
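If you'd rather have just the addresses and none of the surrounding detail, a one-liner like this works on most Linux systems shipping iproute2 (the awk/cut combination is only one of many ways to slice the output):

```shell
# Print each interface's IPv4 address, stripped of the /prefix-length.
ip -4 -o addr show | awk '{print $4}' | cut -d/ -f1
```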

Getting to a computer outside your own network is a little trickier because there are actually two IP addresses involved: there's the IP address that the world sees, and there's the IP address living behind the firewalled router.

Any user can see their worldwide address by going to the website icanhazip.com. In a terminal it's really easy:

    $ curl icanhazip.com

The resulting IP address is the one that you want to point your VNC client at.

The firewall/router may not be configured to accept incoming requests for VNC sessions, so you might have to read the next section.

### 40.3.1 Having Trouble Connecting?

If you are not able to get into the remote machine, then you are probably encountering a firewall.

Broadly speaking, there are two kinds of firewalls: software firewalls running on your computer, and dedicated firewalls running on routers.

If you are having a problem getting into a computer in your own network, then _probably_ the firewall you are getting caught in is a firewall in the target computer. Go to that computer and put its security checks in standby (SELinux and firewall, as needed), and try again. If you get in, then go back to the computer and configure it so that security measures are on, with an exception for VNC (port 5900, by default) traffic. Since there are so many different firewalls in use, I'll leave it to you to poke around in official documentation on how that's done on whatever computer you are having problems with.

If you are having trouble getting to a computer outside your network, then you are probably encountering an embedded firewall in the routers or switches between you and that computer.

The fix is easy as long as you have permission to log into the world-facing router's admin interface and forward **port 5900** to the target computer (the one you want to log in to). This makes it so that whenever the router gets traffic tagged for port 5900, it forwards that traffic to the target computer. Provided the correct authentication, a VNC session is then started.

Unless there is a software firewall on the target computer, in which case you need to open a VNC exception there, as well.

You don't have to use port 5900 as your port for VNC traffic. As long as the router knows to forward traffic tagged for port $FOO to the target's IP address, and you set your VNC client software to connect on port $FOO, then everything matches up and all the traffic gets to where it needs to go.

Another potential snag is that if you're not physically behind the firewall in order to log in to the router (or the firewall config panel of the computer), it's difficult to talk a client through the process.

Also, if you're trying to log in to your home computer and failed to poke the 5900 hole in your target computer's builtin firewall, it's probably impossible to do it remotely (unless you can ssh in, do some firewall configuration, and then VNC).

So make sure you configure the firewall before you go out with expectations of logging in from the outside world.

Hey, VNC is easy and convenient. If you're _the computer person_ for your family and friends, make sure you get it onto every computer you touch so that when they call you for help, you can get on their box without having to be there physically. Before I moved to New Zealand, I did this for my grandmother, and it's saved the day on several occasions.

# 41 What is a "Hack"?

The word _hack_ is an oft-misunderstood term. Historically, a _hacker_ was, very specifically, someone who writes code.

The media later abused the term for people who committed crimes by way of electronics; I guess because in order to commit a digital crime, you had to (back in those days) be a real pro, or "hacker", at programming.

Both of these meanings still persist today, and yet the media has again adopted and re-invented it to mean pretty much anything you do that is not already written down somewhere by the manufacturer of whatever product you are using. For instance, are you using tinned pineapple as book ends for your Jerome K. Jerome collection? it's a hack! Are you using butterfly clips to keep your laptop cable from sliding off your desk and onto the floor? that's a hack too! And what about taking vitamins? it's a bio hack!

The funky thing about terminology is that it's free, in the sense of being a free-for-all. No group can claim to own a word, possibly ever, but certainly when they didn't make it up. Now, I could argue that a term like "grok", for instance, may indeed belong to Heinlein and to sci fi fans who keep it sacred, but you might argue that if someone really wants to destroy its meaning badly enough, and that someone had enough influence to convince the pinks of a new meaning, then there's not much we could do about it. Hackers did not invent the term "hack"; they re-purposed the word, probably angering a large number of proud and noble men and women who hack through the bush with machete knives on a regular basis. So if the media outlets that assault us on a daily basis decides that "hack" means any old clever solution to any given problem, that's what they'll do.

But not without a rebuttal.

The term "hack" means, at least to most hackers (I mean that in the traditional sense, but not so traditional as to involve machete knives), writing code, possibly inclusive of config files, depending on the problem and the solution one finds to it. Hacks can be beautiful, or they can be ugly. Most often, they are ugly. That is why they are called hacks.

The most important thing about hacking, though, is _the process_. A hack is something you achieve after trial, error, experimentation, failure, frustration, research, practise, desperation, hatred, and yes, passion. You do not just sit down, look up a tip online, make the change to your system, and call it hacking, because that's not your hack. That might be someone else's hack, and you can and should use it if it works, but you cannot claim it as a hack. You have hacked nothing in that scenario. You may have _used_ a hack, but you have not hacked.

Don't get me wrong: there's no shame in this; using other people's hacks teaches you how to invent hacks of your own, so you are on the way. You are making progress. You should still be proud (or ashamed, depending on just how ugly it is, but still be proud!) of what you have done, but don't go sew your hacker badge on your jean jacket just yet.

The reason this is important is that in the modern world, there's an emphasis on the idea that everything well-designed must also therefore require no effort from the user. The user should never have to learn anything new; everything should be just one button away. And there should only be one or two buttons, because too many buttons means that the user must make a decision about something.

This is not why technology is important.

Technology existing only to make our lives _easier_ is about as important as an easy chair compared to a house. One, you need for shelter. The other you just want so you can be cozier when you zone out in front of the TV.

Technology, in order for it to be worth all the evil it does, should make life _better_ (I don't mean more pleasant, I mean better), and enable us to do amazing things that will help humans across this planet live happy and healthy lives. This is not something you achieve without effort and hard work. There is no "make everything better" button, and no tech company, regardless of how many cool gadgets they sell to make sure that you are entertained whilst doing yoga, is going to be the realisation of that.

Technology works because hackers work on technology. Hackers sit down with technology for hours and hours at a time, which stretch into days and weeks, just to solve **one** problem. And that hack gets rolled into the next iteration, and technology is better for it. That's what hacking is, and that is what it needs to be in order to inspire and drive people to constantly improve. Some people are motivated by money, others by religion, others by attention; hackers are driven by the process of finding solutions.

Calling yourself a hacker, or laying claim to a hack, is not really as personal and precious as my over-dramatisation makes it out to be, but the historical meaning of the term is significant. Abusing it into a pop culture euphemism for simple day-to-day household tips only erodes the work that has been done and is being done by hackers. And it creates a false expectation for people who believe they want to "become a hacker", because when they find out that hacking is not some simple act achieved with the red "hack" button, they get angry and stop trying.

So stand up for hacking, get into hacking, suffer a little. Do some hard work. Man- or Woman- up and make cool stuff happen. It might take a while, and most people will not understand or want to come anywhere near your obsession, but in the end you'll have achieved something so ugly, something so strange and incomprehensible, something so marvelously bland to the average civilian, that you will know, by the look of sheer boredom on your audience's faces, that you have truly achieved a verifiable hack.

# 42 The Promise of Technology

Technology is inherently destructive. By this I mean that there is a cost to technology, especially modern technology, which typically requires us to mine for minerals, produce noxious byproducts, pollution, you name it. So straight away, modern technology has a debt to pay back to us. The longstanding defense against that is modern medical advancement; if it weren't for computers, we wouldn't be doing life-saving brain surgery and so on. That's a very valid argument, but the problem is that not all of our technology at this point has anything to do with brain surgery. In fact, most of it has nothing to do with medical and obviously beneficial research, and does not even bother to help fund it. In other words, the fact that we have cell phones, tablets, and portable media players is pure anti-karma; we consume these things, usually at an embarrassing rate (they didn't invent the phrase "they don't make 'em like they used to" for nothing) but they are purely vanity items, doing not one ounce of Good in the world.

OK, so technology has the burden of making good on its promise to make our lives better. That is, after all, what technologists have claimed they were after, both in sci fi and in marketing campaigns. I mean, my gosh, one famously shameless tech company adopted famous humanitarians, people who have saved lives and who have made astonishing advancements in the world, and made a commercial out of them on the premise that they just wanted to honour those people (certainly there was no intention of equating the company with those great men and women). Another tech company has as its logo an admonishment to do no evil, failing to see that ethically its very existence requires justification. So where's our technological utopia? Where is this promised land of health, equality, and provisions enough for all? Where is the breaking down of barriers and the laying-to-waste of bigotry, war, hunger, and sickness? How much more do we have to buy before we get those promises?

Well, of course there is no product out there poised to deliver to us these promises, because the idea that these things can be purchased is a lie. You can't purchase good will and social responsibility. Want proof? Just look at the pay cheques of the CEOs that America venerates, compared to the funding of important programmes like curing blindness, solving deafness, curing cancer, and so on. If my tech purchases actually contributed in a real sense to meaningful advancement, then I might be a lot more eager to consume the crumbs that these tech "giants" try to spoon feed us, but as it is, I see behind the curtain. Tech giants are not changing the world; they are doing no good. The technology we have now is officially Good Enough. It's time to move on toward social improvement, toward the promises that technology needs to make good on in order for us to continue, with moral justification, to produce it.

The kind of technology at this point that can be justified is open source, and non-consumer-oriented tools like 3D printers. These tools alone seek to remove the centripetal power from tech corporations and hand it back to normal, everyday people, so that those people can take back the ability to create, maintain, and repair the technology in their lives. What some people are starting to realise, in spite of the tech companies' marketing schemes, is that the technology we have in our lives _right now_ is officially Good Enough. I daresay it has been good enough for roughly a decade now. We can do everything we need to do, and if not one more computer were produced, if Intel and AMD shut their doors today, we would all be perfectly fine for at least another decade. Proof of concept: I still use a computer from 2001; with a new third-party battery, the thing runs forever.

We are developing ourselves right out of self-sustainability. I can do remarkable repairs to computers, but there's no chance I'll ever repair a bootROM without a circuit printer of my own. We are breeding, and becoming ourselves, a society of Interior Decorators who can do amazing things with a living room, but cannot, of all things, build a house.

It's time to work on the foundation. We need to stop the fruitless innovations into entertainment, and start building tools to help us build tools. We need to stop being mindless consumers of the products that are geared only toward making someone else money, and start making our goals loftier: build the utopia that technology keeps claiming it is making possible. You want to change the world? It's not done by buying the latest phone, red or otherwise, nor is it done by making a cool new video, even if it is an edgy documentary, and it's not done by starting a band or following a band, and it's not done by activism or "hacktivism". Figure out what you are doing in your own life to make this world a better place. Do not invest in things that do less.

Avoiding "bad" technology is not actually that hard, because there is a whole ecosystem built up around just that idea. There is old hardware out there, and amazingly it is seen as having no value by most people. You can literally get it for free. Sometimes you can even get paid to take it off of someone's hands. That's how much people value old technology.

Open source software is designed to be extremely backwards compatible. This software runs on anything and everything that it possibly can run on. It is designed to keep technology alive. This empowers the user to use old technology, to maintain it and keep it in good repair, and most importantly, to innovate in small (or big) ways, independent of any tech corporation spoon feeding users (usually whether the user likes it or not) ready-made products and software.

So step away from the corporate push to new "exciting" ways of "changing the world" and enter a world that has already changed: the world of independence, hobbyists, real people working face-to-face on problems and life issues that actually matter to them, and not on abstract ideas of "changing the world" through pointless greed. Go open source, become a hacker, a maker, an independent. No more illusions, no more compromises.

# 43 Makers

At some point in the past decade, a "maker movement" got ~~started~~ branded. As with any quote-movement-unquote, it's anyone's guess as to what that actually means, despite our desire to imagine that it is a coordinated effort toward some goal. Every person involved, naturally, has their own personal interpretation of what "making" is all about. Some common examples:

  * A hearkening back to the early 1900s, when things you purchased could be repaired and maintained by you, the consumer. And you could make things from parts to avoid having to purchase something (i.e., you could build your own lamp, or wire your garage, and so on). Formerly called being a "handyman" or a "tinkerer".
  * In order to put your money where everyone else's mouth is about being eco-friendly, a hearkening back to the early 1900s, when things you purchased could be repaired and maintained by you, the consumer. See first point.
  * An intellectual and physical learning exercise in which participants teach themselves to build things.
  * A fun weekend hobby.
  * An educational resource, formerly known as "busy work", to break up a day full of busy work.
  * A business opportunity in which you sell cheap 3d printers from China at a markup to "makers".

And so on. The list is basically endless, because every person who lays claim to the "maker" title has their own reason for wanting to make things. I got interested in "maker culture" first and foremost because of the initial reason: I don't like that I purchase something but am robbed of the right to repair it. I happen to know that we have progressed, technologically speaking, far enough that I should not have to go buy an entire new kitchen blender if the knob on mine breaks, or a new computer just because the face plate of the optical drive fell off, or a new phone because my battery no longer holds a charge. This is unacceptable at any time, but it's downright criminal to have gone from a time when modularity and maintenance prevented this wastefulness to what we have now.

That's not progress, that's a special kind of forked regression that takes us backwards to something worse than what we had before.

So yes, "making" is important.

It's important for the empowerment of the human mind and humanity's collective potential, it's important to our very intelligence, it's important for our environment, and it also happens to be fun, and educational, and yes, it even has its own economy. This "maker movement" is a very rich and diverse thing, and it is important for it to thrive.

## 43.1 The Maker Burden of Proof

I think part of the problem the "makers" of today face is the need to prove that self-reliance is, in fact, progress. I have just briefly explained why I think it is (although you could write a book on the nuances of one first world nation flooding out the need for maintenance by the sheer power of cheap labour and mass production of disposable junk) but to the innocent bystander, the makers of the world look a lot like a bunch of jigsaw puzzle enthusiasts.

By that, I mean a lot of "making" at this point is done, literally or nearly, with kits. It's akin to building a model airplane. You put your Maker-branded hat on, grab the latest issue of Maker Magazine™, and you go to the store and buy all the oddly specific, mass-produced parts, take them home, and put it all together. The difference between what you make and what your neighbour has purchased is that it took you an afternoon to build the thing.

Is one better than the other? Well, yes, probably. Your kit, by virtue of the fact that you put it together, is modular. If you put it together, you can take it apart as needed. You can swap out at least some of the parts. You can fix pieces of the whole when something breaks, rather than throwing the entire thing out. That's a significant improvement, and it's one I'd take (and one I do leverage in many of the things I do buy, or ~~make~~ build).

What has not improved here is the source of the parts. Sure, it's cool that the product is modular but if you're just subjecting yourself to kit manufacturers, then you're a builder, not a maker. _Nothing is wrong with that_ , and I'm not critical of it (I do not make my own computers, for instance, but I do build them) but it's important to acknowledge the difference.

So let's start over:

## 43.2 The [Actual] Maker's Burden of Proof

If you are a maker in our revised sense of the word, then we have, as a "movement", a serious burden of proof beyond even the idea that modularity and maintenance are important. We must show the world that making is sustainable and reliable. You see, in your mind, _making_ makes sense. It's the answer. It's the way the world should work. Things should be produced locally, they should be modular, raw materials should be available to everyone and they should be as close to nature as possible with minimal conditioning before they hit the shelves. Technology should be exposed, standards should be open, instructions should be available. Everyone should be able to learn how to do everything. The knowledge and power of the entire population should be able to advance by each individual making a conscious decision to learn something, to practise it to perfection, and to implement it as a service within their real-world communities.

I am already in a community that has faced this challenge, and it's an ongoing struggle. How do you convince people that you are not "just a hobbyist"? Worse yet, how do you convince people that what you are making can do what they _already have_ , except it can do it differently and eventually in a paradigm that is better for everyone?

Speaking pragmatically, it's almost a stupid and counter-productive argument. It sounds like you are asking everyone to take a giant step backwards, on the promise that the road forward from there will be better. After all, what we have right now has gotten us this far, and it certainly does "work", at least when we measure it by immediate gratification. So why should I throw out the current technology and replace it with something that looks different, that might work differently, that might require work that I haven't got the time for or any interest in doing? How can that possibly be advantageous?

The way to show people that making is a better system than blind consumerism is to make stuff, and to use what we have made. In other words, it goes back to that old anarchist admonition of _direct action_ , or the old adage _talk is cheap_. You can mass produce magazines, you can hold faires and conferences, you can print out a million plastic trinkets in fancy plastic-printing machines, you can start a whole cottage industry of people proudly calling themselves makers, and that's very nice and it will make a lot of people a lot of money, but it's an industry you're creating in the same model as the one you are supposedly moving against. You have your consumers, and they have theirs.

A true [maker or otherwise] movement must be born of people. "Makers" must be making things, and more importantly, using the things that they each create. You may have no interest in making a chair, but someone around you does, so buy a chair or barter for it with whatever you make. You use your friend's chair, and your friend uses the motion-sensing light that you made, and maybe you each have suggestions for one another on how it might be improved. Maybe your motion-sensing light automatically shuts off during daylight hours, and the chair becomes a folding chair to maximise space. The possibilities are endless, but the important thing is that a part of making is using. If you have no users, then you really are just a hobbyist...which is OK, but that's not a "movement". We need people to make stuff, and those makers to also be users. Once again, an old adage: "Our dogfood is so good, we eat it ourselves" (according to legend, a prominent dogfood manufacturer said that in order to assure his customers that he was selling a quality product).

So if you're a maker in any way, take a look at your toolchain and start looking at what you can, when it comes time to do so, replace with something that a fellow maker has created. Be a maker through and through; it's a great way to support your fellow makers, and it's the only way to show the world that yes, there really is a better way.

Support local artisans and artists. Support your local hackers.

If you didn't build something, learn how you can maintain and repair it. If you can't, when it's time to replace it, replace it with something that gives you more freedom.

Don't let "making" be a movement or a brand. Let it be a way of life; one of self-reliance and independence.

# 44 Life Lessons for New Sys Admins

When I got my first job as a sys admin, I had no practical experience with real live users. I had administered a network of a hundred clients, but they were all test clients running automated unit tests with an occasional manual test thrown in for fun. As a result, I made a lot of mistakes early on, some of which I could recover from and others that were a little tough to rectify painlessly.

So if you are getting into the sys admin racket, and have little or no sagely guidance from a wiser, older admin, then allow me to chime in instead, with some war stories so that you may learn from my mistakes.

## 44.1 Linux Linux Linux

It may seem like a no-brainer, but Linux. Linux, Linux, Linux. And also BSD. If you do not know it as well as you should...well, first of all, learn it now while you can, before you start the job! And then live in it, get to know it like the back of your hand. For your first year, insist upon support directly from Red Hat so you have a support staff of sorts. Try to get it longer, but at least get it for the initial setup stage.

The reason this item is on this list at all is mostly because you _will_ experience self-doubt on the job. Some loud people with lots of letters after their name and certs on their business cards will try to convince you that "Cisco is just a MUST for real networks" or "Microsoft is just a given in a business environment" or even the loudmouth know-nothing "We don't even need a sys admin; we should just use Apple server, because they're just so easy!" and so on.

Look, I have used Linux in unbelievable settings to do unbelievable things. Do not let anyone tell you that open source POSIX is _not_ always the answer, because it is. It might not be tailored for what you are doing, and it may require extra work from you in some cases, but if it's worth doing, then do it right.

I have been glad-handed by these know-it-alls who want to "help" you with your systems, and it always ( _always_ ) ends up turning into someone else putting something on _your_ network that only _they_ can service. I don't care if it's Cisco, Oracle, MS, Apple, or whoever; they are all about restrictions, and they couldn't care less that you are the poor sap who has to deal with it on a daily basis. Avoid it and find custom-designed alternative solutions from open vendors like iXsystems, Clear, Digium, and CodeWeavers.

To be clear: I am not saying that you should build everything from scratch yourself. I am saying that if you are going to purchase a Solution, then purchase open source. At least open source leaves you empowered to maintain, service, and learn. Closed source "solutions" are always just band-aids on a problem, and they can only be applied by an authorised band-aid serviceman. **Avoid.**

## 44.2 Turnkey

Sometimes things do need to happen _now_ , and you just will not have the time to sit down and become an expert on OpenSwan or FreeIPA or whatever. In these cases, use and abuse TurnKey Linux without shame.

TurnKey Linux, if you have not yet heard of it, is a site that posts ready-made virtual images of Linux servers dedicated to either a specific task (a DNS server) or a group of tasks (a small business suite). You download the image, load it into a VM, do whatever site-specific configuration is required, and put it on your network. DONE.

Why is it OK to use TurnKey Linux but not other ready-made products? Well, lots of reasons, including transparency, but in terms of you becoming a better sys admin so that you can do bigger and better things, you can cheat with TurnKey and _still_ look at the config files and see how it is all working underneath the hood. This is a huge advantage over letting other people design and admin your system, so use it when necessary!
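The whole workflow is short enough to sketch as a checklist. Everything below is illustrative, not official TurnKey instructions: the image filename is a placeholder, the download URL is deliberately left as a variable (get the real one from turnkeylinux.org), and the QEMU invocation is just one of many ways to boot a disk image.

```shell
#!/bin/sh
# A hedged sketch of the TurnKey workflow described above. The image
# filename and $APPLIANCE_URL are hypothetical; the heavy commands are
# commented out so this reads as a checklist rather than a recipe.
IMAGE="turnkey-appliance.qcow2"   # hypothetical local filename

# 1. Download the ready-made appliance image from the TurnKey site.
# wget -O "$IMAGE" "$APPLIANCE_URL"

# 2. Boot it in a local VM and do the site-specific configuration
#    (root password, hostname, networking) at the console.
# qemu-system-x86_64 -m 1024 -drive file="$IMAGE",format=qcow2 -net nic -net user

# 3. Once configured, move the VM onto your production host. DONE.
echo "sketch: download -> boot in VM -> configure -> deploy"
```

Any hypervisor you already run (KVM, VirtualBox, a bare-metal install from ISO) works just as well; the point is that each step is visible and inspectable.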

Use it _especially_ when your boss wanders stupidly into your office and says he or she is thinking about hiring such-and-such a company to install a file server because the one you have now is "too slow and too hard to learn". The only response, at least in my experience, to that is to install something before they can get back to their office to make the vendor call. Turnkey a solution and get your boss distracted on lolcats, and then move on.

## 44.3 Not Your Repairman

You are not a starving geek any more.

Once you get a job in an IT department, you do not have to repair people's personal computers (BYOD policies notwithstanding), or help them sync their phones, or chat with them about the new printer they got at home and cannot figure out how to set up.

Most of us geeks have the compulsion to help anyone and everyone with a tech problem, partly because we're nice, but partly, I think, because we were trained as we were growing up that technology was our superpower: it was our tech knowledge that made people like us, that got us exempted from gym class in a pinch, that made us popular.

Well, my child, that's all over now. Stop feeling obligated to help everyone who utters the phrase "hey, you're good with computers..."

I say this not because I'm mean or want you to be mean, I am saying it because it ultimately goes horribly wrong. If you fix someone's personal gear at work or after work or whatever, then 9 times out of 10 you are now their official support staff whether you like it or not. And because it's on the job, it all feels very official and above-board. You are no longer doing them a little favour, you have made a contract between the company and them; not legally speaking, but that's the secret impression that you both will get. It's very insidious, but you are sending the wrong message about the scope of your obligations, and you both end up falling for it.

Think of it like dating someone from work (which is supposed to be bad, according to cliches), or all the urban legends about the bystander who helps an injured person only later to be sued for having interfered. It seems harmless at the time, but ultimately it leads to unpredictable complications.

So separate yourself from your profession, even if you're faking it. Draw clear boundaries and make sure you don't get taken advantage of or create false expectations among colleagues.

## 44.4 Illegal Software is Illegal.

Illegal software is illegal. It happens to every IT person; someone comes in and casually hints at, or flat out requests, a copy of Microsoft Office or Adobe Photoshop or any other paid software title. Let me be absolutely clear on this point, because my stance used to be terrifyingly neutral; I figured I wasn't getting paid by Adobe or Microsoft or Apple to protect their licenses, so who cares if someone borrows a software installer for a day? It's not like I am installing the software for them. And if my boss came in and told me to install more copies of a software than the organisation actually had paid for, big deal, right? It's not my concern.

Wrong, wrong, and wrong.

First of all, legal stuff. If a software company does find out about illegal software at your organisation, it's your head on the chopping block. Either the software vendor is going to come after you directly because...you're the IT person...or your CEO is going to be given the option of either paying a hefty fine or firing you as the scapegoat. Either way, you lose. It. Is not. Worth it.

I know it's not fair, but this is the way it is. It should NOT be your job to make sure that people are respecting licenses. That should be the software vendor's job; they programmed the thing, so they should have a licensing scheme designed to protect what they are selling. The music software industry has been quite good about this for ages; software dongles and apps that phone home are pretty standard in that sector, and as a result there is not nearly as much illegal usage out there. Other vendors have been infamously lackadaisical on the enforcement of licenses, so much so that it has been implicit that a little stealing of software never hurt nobody. I could forgive this misleading message if it had only lasted for, say, the first couple of years of a vendor's life, before they realised that their wares were going to become coveted items. Yet the most egregious offenders have been around for **30 years**. They should have learnt their lessons two times over, and yet they are only just now starting to implement "cloud"-based licensing schemes to actually put a stop to illegal use.

But whatever we might think of the software vendors for simultaneously condemning and turning a blind eye to rampant illegal use of software, and also for foisting the responsibility of policing their licenses onto us IT people, the people who really get screwed in this equation are the developers. If I go to work every day and hack on code for a closed source application, I expect a pay cheque in the end. A company pays its programmers by looking at the market and estimating what the demand is for their product, and they ~~inflate~~ set the price accordingly. Based on that projection, they hire developers and set the scope of the project. Months pass. The software gets completed and now it needs to be paid for. If the software goes out into the world and a third of the market illegally acquires it, that's money that was not received by the company; it's money that they had projected having but do not actually receive.

Guess what happens then? Lay-offs! And don't think it doesn't happen; it happens all the time, even to the big companies. And sadly, it's not us (the IT people) or the developers who are to blame, but the company, for not enforcing their licenses. But the CEO isn't going to take a pay cut to prevent layoffs, so developers get axed. Too bad.

I know, I know, "Copying isn't stealing". It isn't, as long as the producer of the product hasn't already paid bills on the credit of that thing being sold. It's their product; if they don't want to let you use it without you paying them for it, then that's the rule. If you don't like it, go find something that does the same thing without the license fee attached.

It's an ugly and dirty business, so don't go near it. If you have a hard time saying "no" to people, make up a story about how you get audited by licensing companies and cannot risk "lending" software installers, and then tell them about free alternatives (or don't; it's up to you), or tell them flat out that what they are asking for is illegal. Because it is. They wouldn't go to a theatre manager and ask for a copy of the latest blockbuster off the DCP server, so they shouldn't be asking you for a copy of Office or whatever.

My policy eventually became to have a 64 GB thumbdrive full of all the open source alternatives (even an ISO of Linux, although I'm pretty sure no one ever bothered using that one). And if someone asked, I would explain to them that it was illegal, and why, but here is a thumbdrive they could borrow for free alternatives. They always took it, and I do know that some people used some of the free software. Others probably did not. That's not my problem. Not being a part of illegal software distribution, that was my problem, and it was solved very easily: by not doing it.

If I come across as belabouring this point, I have very good reasons for it. I left a company over some disagreements in business policies; forcing me to have a cell phone and therefore basically putting me permanently on a sort of passive on-call status (but only paying for regular business hours, of course) was one thing, but the blatant refusal to pay for software licenses, paired with the insistence that I continue installing the software as part of my job, was the other. How bad was it? Well, six months after I had left the company, I got an email from my old boss asking me for instructions, via email, on how he could secretly install, illegally, production-dependent software without licenses, because my replacement was refusing to do so.

I refused to assist, obviously, but trust me, you do not want to go down that road.

(As a side note to that story, I very recently discovered that the place finally got found out; they apparently were contacted by lawyers and had to pay a hefty fine, or so rumour has it; the official story is that there is no comment.)

## 44.5 You are Your Own Wing Man

I used to imagine that IT policies were structured by IT, and then enforced by department heads, and everyone duly played along nicely the way everyone played along with all other company policies, like coming in at 9:00 and leaving at 17:00 and only taking an hour lunch break and things like that.

Turns out that IT policy is usually not exactly like all the other company policies. People resist your policies, people ignore the things you ask them to do, they use services like Dropbox and Google Docs to share private data, they use different web browsers, they download stupid stuff on their computers, they outright refuse to do stuff. And for some reason, upper management very frequently not only allows this, but is guilty of it as well.

In other words, do not expect to be backed by management. You have to be your own policy-maker and enforcer, like Judge Dredd. It's tough and not very pleasant, because it feels like you are becoming your own worst enemy: you are restricting people! But you're all about open source, and freedom, and choice; how can you be making rules and laws, and why are you blocking things?

Well, two reasons.

  1. First of all, you have a responsibility at the end of the day. Upper management might not enforce your decisions for you, but you can be darn sure they still expect to see the results. It's like this:

You: "We must not use Dropbox for sensitive information. Use our private install of OwnCloud and access it over VPN."

Boss: "No, don't worry everyone, it's totally OK to use Dropbox. I use it for all of your employee data and it's fine!"

You: "We just got a report that all personal records have been leaked."

Boss: "WHY DIDN'T YOU HARDEN OUR SECURITY?"

  2. Secondly, your job is to support things running on computers. This is exactly why closed source vendors make their products so limited: if you limit what is possible, then you limit the possible errors.

It might hurt to say "We support Firefox, and only Firefox," but trust me, that's _one_ method of accessing the web, and it works, and you are familiar with it, and it's only one set of solutions you need to write up, and one set of instructions on the intranet, and so on. I know, it _feels_ like you should support two options, and it's not that much work to do so, right? No, you're wrong. It's not that much work _right now_ , but parallel options do not scale well. They mean at least twice the amount of upkeep, twice the amount of troubleshooting, and potentially a lot more. Keep it simple.

See, the IT department is the singular model user for a company. You perfect things for the theoretical user, you formalise that, and then you publish. You have proven a workflow _once_ , you maintain it in _one_ place, and you support it and only it. The great open plain of choices has been utilised. Once. By the IT person in charge of choosing what software the User will use. And then that User gets replicated to all users within the company.

It's restrictive, yes, but only in the sense that any group of people getting together to perform some task is restrictive. They have to agree on certain standards because if they don't, then they aren't a group of people working on a task, they are a group of individuals working on several different tasks.

So make policies. Enforce them. That's part of the job, whether upper management acknowledges that or not.

## 44.6 Contribute Back

You are not the only new sys admin in town. Find others online and in your local area, talk to them, share ideas. When you come up with a brilliant way to do something, write it up and publish it somewhere. Let other people benefit from your effort, and seek to benefit from others.

This is the way progress happens. It's why there are so many truly great technologies out there helping you with your job. Give back to the larger community and help make efficient, smart, and open computing happen.

# 45 The Trouble with Lifejackets

Lifejackets are great. They will save your life (in certain situations). But two things about lifejackets:

  1. They don't do a bit of good if they're not around when you need them.
  2. They work better if you don't fall into the water in the first place.

As you might suspect, I am using the term "lifejacket" as an analogy here, and am in fact talking about computers, data, and productivity. In order to make sense of my analogy, we need a preamble, only two paragraphs late:

## 45.1 Preamble

In real life, computers are tools and at the end of the day, that's how we all (even us geeks) treat them. I might love to play around with computers all night and all day but if something needs getting done, you can bet I'm not going to fire up some alpha software to do it. I'm going to reach for what I know and trust, and get the job done.

On the other hand, computers are unlike the tools out in your shed. Hammers and drills, and even cars or lawn mowers, do not retain the thing you have been working on. I may go out to the garage to build a shelf, but when I'm done, I take the shelf with me and no longer need the hammer. I don't even need the garage. I have my shelf; the job is done.

Computers are different. If you build something on a computer, you have built that thing, essentially, _into_ the computer. In order to retain what you have built, you have to keep that computer (or hard drive or whatever) around, and look after its health. Now, that's a huge responsibility, and it's a huge burden of trust placed upon the people who made that computer in the first place, because if they screw something up, your work is threatened. In terms of my second analogy, you have built a shelf into your tool shed; it cannot be removed, so you now have to take care of that shed. The question is, would you do all that work...

...in a shed...

...on someone else's property?

After all, if the property owner locks their gates and goes off on holiday for a month, you lose access to your shed and shelf.

My personal reaction to such a scenario is violently and profoundly negative. I cannot imagine investing effort and passion into something that relies on the trust of a third party to keep doing exactly what they have been doing. No contract has been signed, and computer vendors don't even bother verbalising assurances that they aren't going to screw you over. In fact, most computer vendors make a habit of doing exactly that: screwing over their customers via planned obsolescence, deprecation of file formats, arbitrary changes to UI and important libraries, and actually charging money for buggy software with no support options.

But of course, there is an implied contract at work, which states, more or less, that the computer vendors will continue screwing you over in small, inconvenient ways, and you'll adapt because you don't understand technology but everyone insists that this is all for the better, and besides there's no way around it anyway.

After all, proclaiming independence from computer vendors only goes so far. You might run Linux and use nothing but reliable open source applications, but you are still running the software on chips that only Intel, AMD, and a handful of ARM and RISC manufacturers can possibly create. It's a little like denouncing factory farming and resolving to grow your own food, only to plant your garden on land you are only renting.

(Analogies were on sale this week; I stocked up.)

So nothing in life is guaranteed. Hardware manufacturers do have a different agenda than software vendors, and have historically been far slower to change (the x86 instruction set has been around since 1978, and, believe it or not, a trimmed-down Linux kernel can still run on the earliest x86 processors). The hardware industry is, to a large degree, a black box to most of us. If Intel, AMD, and all manufacturing plants of RISC processors went down, we'd all be basically out of luck. So, again, nothing in life is guaranteed.

## 45.2 Insurance

That's where the lifejacket analogy comes back into play. Assuming that we are alright, for now, with using the magical components that the gods provide to us, what can we do to best protect and insure the product of our labour: our data (and by extension, the programmes that we are, essentially, building "into" our data)?

Well, one good "lifejacket" is the knowledge that Linux and open source exist. If all else fails, open source will generally save the day. I am not saying that restricted software cannot save the day, but because restricted software is limited by licensing and availability, it is sometimes very difficult to get it to do the saving. As I mentioned: lifejackets don't do a bit of good if they're not around when you need them.

I can't count how many times I, as a computer technician, have been enlisted (be it by money, guilt trip, or sheer piteous desperation) to rescue users from some situation involving mission-critical, cannot-live-without, needed-it-five-minutes-ago data that suddenly, for whatever reason, could not be accessed. Whether the application or OS required to access the data had ceased working and could not be obtained in a timely manner, or a file system had gone wrong, or a preference file had become corrupt, or a user had just done something stupid, the allegorical lifejacket was just not there when it was most needed.

Restricted operating systems and applications are designed to keep users **out**. I don't mean that figuratively; they are literally designed that way from the ground up. That is why they are restricted. To get into them, you (or the computer manufacturer pre-loading all the software and passing the cost on to you) have to spend money. They are a service that you can use, assuming that you have passed a number of arbitrary checkpoints. They are not designed to be there for you No Matter What; they are designed to be there for you as long as you have paid your license fee and gone through the "correct channels" in order to obtain them.

Open source, by contrast, is very much designed to be there whenever you need it. And that's no exaggeration. Do you need an OS to get a computer up and running? Download a free Linux ISO designed for everyday use, or for data rescue, depending on your requirements. Need a server in a pinch? Download a ready-made Linux server ISO. Need these things on a slow connection? There are fully-functional Linux distributions in around 200 MB, some as small as 50 MB. Need applications to read an important file? Open source applications are all freely downloadable with no licensing restrictions, no cracks required, and since the code is freely licensed the minute you download it, you can make local copies entirely legally.

## 45.3 Personal Detachment

That's nice, and it makes Linux and open source out to be a great safety net that we should all at least spiritually support. We should say things like "Open source is important" and "I love open source!" and we should, in theory, admire the selflessness of the open source developers. In this sense, there is no need to become involved with open source ourselves; we can leave it lying around as a fallback for when we happen to need it. We can use VLC when a restricted player fails us, and use Firefox because pop-geek media tells us it's the best (until Google tells us otherwise), and we can blindly use Linux servers when we build our websites. But there's no need to get personally invested in it.

It's just a safety net.

The second tenet of my lifejacket analogy addresses this attitude: lifejackets or safety nets work best when you don't fall in the first place. Since we know that disaster strikes in real life no matter what we do, we can assume that at some point we will each face some kind of computer emergency, and usually it happens at the least convenient of times. The thing that sets a computer tech like me apart from the people who come begging me to rescue their systems is that I already know the tools needed to set everything straight. I'm not talking about the cases when I actually have to disassemble computers and replace drives or cables.

I'm talking about the times that a computer just needs a once-over to fix some random software issue. Now, admittedly, I might be able to fix the issue far more _elegantly_ than an untrained user, but that also takes time, and my availability. If you haven't got those two things, then why not just grab a Linux disc, re-install (or just boot temporarily if you prefer) and get back to work? That sure beats sitting around for a day or seven, not getting anything done and not having access to your own data, because you have to sort out how to get a copy of some OS or application and either find your license or purchase a new one, and so on and so on.

My point here is that using and getting comfortable with an open source environment _now_ makes falling back onto it a lot smoother than if you literally treat it just as a safety net. If you have a lifejacket and you're 300 km from any sign of land and your boat has sailed away without you, you're still basically screwed.

Make open source the thing you use and rely upon, and when things go wrong, there will be no transition; you just grab the stuff you need, and go on as if nothing ever happened. Trust me; it's happened in my presence and I've seen the results. They _are_ pretty. Everything works out in the end; it's a happy ending, where the data is safe, the show goes on, and there is no loss of blood, sweat, or tears.

It's a lot easier to adapt to something new when there is no deadline looming over your head. Of course, that usually means that you have to create your own motivation, and if the millions of computer users _still using XP_ (and I'm writing this in 2015) tell us anything, it's that computer users are bad at motivating themselves to learn anything new.

## 45.4 Maintenance Mode

I'm getting a little tired of all these analogies but, darn it, they're really working for me. So here's another one: you have a fire extinguisher. There's a tag on it. The tag tells you when it was last tested. That's important, because the last thing you want is to be drowning and have your fire extinguisher fail.

Oops, analogies got mixed up.

Point is, when you are relying on something in the "well, we'll always have that thing just in case we need it" sense, it really pays for someone to be maintaining it.

Open source is very much alive and well. People are using it, people are creating it. That's great. But the _more_ people using it and the _more_ people who create it, the stronger it becomes. Believe me, I'm not one of those people who thinks that everyone needs to use open source for the universe to be set right. I believe everyone would benefit from open source, but whether that guy over there uses open source or not has no effect on me, personally.

However, if you are interested in getting into open source, then believe me when I tell you that open source will be stronger and better with you involved. I don't know what you will contribute; maybe you won't contribute anything directly, but you'll be using it, and you'll give others tips on cool things you have done, inspire ideas with what you create, prompt improvements with your complaints, or lend confidence by the very fact that you use it on a daily basis. Whatever you bring to open source, it's only there because you brought it.

## 45.5 Get It

So let me simply encourage you, gently but firmly, to start your journey down the Open road now, while it's the least important thing in the world for you to do. Because when your computer life is brought down by that one unfortunate coffee spill, or that one unexpected power surge, or an expired license, or the discontinuation of that application you depend on, or that one broken power plug, or whatever might happen, you will be really, really glad that you did.

# 46 Learning

As a dropout of both high school and university, and having, strangely, worked at a college-level institute, I have more than just a few opinions on the subject of institutionalised education in the USA. Not many of those are good. For now, though, I will just focus on one area, and that is the idea of preparing students for "the real world".

Most schools claim to want to prepare students for the "real world", and many, I think, [claim to] design their courses toward that goal. The problem with this begins with the idea that there is one "real world". Sure, there is a world outside the school walls and yes, there are some great big generalities that you can make about it that will hold true, like the fact that people need to eat and have shelter and so on. There is no way, however, for a teacher or a faculty at large to be able to anticipate what the world will be like by the time a student gets out into it, nor every way each student will experience this world on their path toward becoming a productive member of society.

On the surface, the usual tactic of 80/20 appears logical: teach the skills and theories that 80% of the students will need and you do pretty well. Certainly well enough to continue convincing people to pay for your lessons. The other 20% are either losers who aren't going to succeed at life much less finish school, or they are edge cases for whom traditional education just doesn't work for whatever reason. Maybe they are geniuses or maybe they are jacks-of-some-trades-master-of-none who can hop from one thing to another and make a living off of it.

What I have noticed is more like an 8/92 split in different categories: the 8% of people who get by using the exact knowledge set and the exact tools, in the exact way, that the school taught, and the 92% left scratching their heads wondering why they paid for an entire semester of one thing or another only to find that every job they are getting after school is telling them that no one does it that way.

## 46.1 Knowledge Over Skills

The answer, I feel, is knowledge. You know, the thing that schools traditionally have been all about?

There are many examples of this, but interestingly one of the best is language, and even reading. Observe:

Let's start with reading. As children, we have books read to us and we get lessons on what combinations of letters make what sound, and so on, and we learn to "sound out" a word, until we get pretty good at reading. But the amazing thing is that we have not just learnt one set of words and no more; we can learn new words, and up to a point, we can even glean the meaning of some words just from context.

So we have not learnt to read, but we have learnt how to learn how to read. We have the ability to read new words that we have never encountered before. What a powerful system. Surely there is no way to improve upon this old classic.

Actually, there arguably is a way to improve upon it. It turns out that teaching children about phonemes and spelling and grammar is also pretty helpful. It is not, strictly speaking, essential, but it does help a lot. You see, children who get lessons about the _why_ behind words and sounds and sentences become truly excellent readers. They end up the sort of people who love books, who go to libraries, who can rattle off the correct spelling of complex words like 'occur' and 'maneuver' without ever checking a dictionary, who know the difference between 'loose' and 'lose', and all that advanced stuff. It can even be argued that they learn a bunch of other useful things as a side benefit, but I'm not going to get on my whole education-accumulates-logarithmically soapbox right now, so let's speak no more of that.

Then there's language. This is an easy target because the USA teaches it even less effectively than it teaches reading, believe it or not. The USA theory is that you should teach students one language.

That's it.

That's all they do. They require a foreign language course or three in later years, but as with many classes, students scrape by just to make the grade and then forget everything (no surprise, since the grammar they are being taught in their foreign language class is probably the first grammar they have been taught at all). The results speak for themselves: most USA citizens speak one language only.

Many other countries introduce students to foreign languages early on, and it shows. Many very successful non-Americans speak more than one language. In fact, many speak multiple languages because, you see, education accumulates logarithmically. When taught to be a _learner_ , you don't just learn what you learn, you learn to learn more. So you have people who are taught two languages (their native language and, maybe, the second-most-popular language in their region), and now they know inherently how to learn a new language. It's a skill they have, and they can apply it to some other language. So it works poorly if you teach only one language, and it works really well when you teach two. Even if the second is arbitrary, you get better results. You cannot anticipate what language each of your students is going to need to know later in life, but by teaching them some other language, you give them experience with learning a new language so that when they suddenly decide to move to Rwanda and learn Swahili, they can do that with relative confidence.

This brings me to the topic of technology.

## 46.2 Lingua Digitum

Not everyone is going to become a maths professor, or a writer, but we all get maths and writing lessons in school. Similarly, not everyone needs to learn to programme, but since technology has become such an integral part of our lives in modern society, it is important that everyone has experienced at least an introduction to programming.

A lot of people seem to think that an intro to computers means that students should get a quick overview of one OS's interface, and that's pretty much it. This teaches the students about as much about computing as a photo gallery of book covers teaches them about reading and writing.

The thing about a user interface approach to computing is that the interfaces are constantly changing, and even without constant re-design, the interfaces in themselves are limited in what they can do.

Sadly, it seems to be the goal of most commercial technology companies to limit what a user can do to program their working environment. Some companies provide users with token scripting abilities so that users can write some high-level applications, but for the most part the system is controlled by the vendor rather than the user.

## 46.3 Open Source

Linux and Python and other open source technologies don't need to limit a user. They open up, partly from a philosophical standpoint (see the FSF) and partly because of the licensing (see the Open Source Initiative), free and powerful programming for _anyone_ who wants to learn it. Learning to code in an environment that you control is a very powerful and empowering experience, and one that can lead to building up your own style of working, achieving your goals, and truly taking ownership of the technology in your life (whether you want it there or not).

And the programming logic, if not the actual syntax and structure of the language, is a universal and mostly unchanging expression of how computing is done. Learning the basics of a UI (which is sure to have changed by the time you're out of school) teaches you how to click a button, or drag a file to a folder. Understanding a filesystem, directory structure, and basic computer operations, unravels all interfaces and lets a user, starting from the underlying tasks, discover the implementation on their own.

This isn't just a random theory people came up with as a good way to promote open source, because strictly speaking it doesn't really _require_ open source to work. You can learn any system from the bottom up (although how far down you can go in the stack does lessen with closed source); it just happens that open source lends itself better to unraveling mysteries, since every bit of information can be exposed as needed.

The thing is, people who understand the lower levels of computers are able to teach themselves other systems far more easily than people who have no concept of what is really happening underneath one level of an interface. Granted, not everyone has the patience to teach themselves new things, and granted, "low level" is defined differently by different people, but by teaching computational concepts, you open up the lowest level to all, and build up from there; people can settle at whatever layer they are most comfortable with, but at least they have been exposed to the basics.

Just like math, just like science, and reading and writing, and all those other topics that no one really wants to learn but ends up having to learn anyway.

## 46.4 But Wait! There's More!

As a side benefit, programming, or at least an above-average understanding of how computers are programmed and how they function, can also lead to gainful employment (again, whether you really want it or not; it does tend to be fairly important). It takes skilled labour to programme, so the demand for people who can code is still very great. It's still a new market in many ways, with even medium- and small-sized businesses starting to recognise that having a programmer on staff means that their individual and unique needs as an organisation can be met with custom-tailored code. The myth of off-the-shelf software solving everyone's needs is slowly but surely being eroded, so the people who know how to code are in great demand.

And they should be, because computers and programming them are of no use if they do not serve the needs of the users, and since computers can be re-programmed, a user should never have to settle for a solution that does not meet their exact requirements (in spite of what the big computer companies say in marketing campaigns, where all users' needs are lumped into three different needs, depending on what the buzzwords _du jour_ happen to be).

The danger of never having programmed is that computers seem magical when you don't understand how inherently stupid they actually are.

We already see this today among people who have ostensibly been brought up with a computer education in school and at home; people still think that computer operating systems are just "part of the computer", not seeing an OS as just another bit of software that can be installed or thrown out if it fails to meet their needs. People don't understand how computers work, or how data is stored, or how gentrified computers are becoming. They don't back up their data, they are tricked into installing viruses on their own machines, they believe that computers just "slow down" after a while and resort to purchasing a new one instead of learning why their computer is not working like it did when they first purchased it. Computers and phones, for as prevalent as they are today, are still very much mysteries to most people.

If we do not educate ourselves about programming, we ensure a segment of mystified, helpless luddites who will be at the financial and intellectual mercy of those that do know computers.

And that's not what technology is supposed to be about.

# 47 Ebook Formats

This is an admonishment disguised as a friendly, informative article.

Ebooks are great. They really lighten the load for people who read a lot. I would love to have an extensive library of physical books, but that doesn't really suit my lifestyle, so ebooks are marvelous inventions which, honestly, I'd been waiting on for a decade. I still remember staying up late to scan in paperbacks in hopes of somehow turning them into digitised books; I lacked the knowledge of OCR or even how to bundle them together in a usable form, but I sure did try.

And then ebooks happened, and they are amazing, but the very term is broad and overly vague: electronic books. What does that mean? What formats are there? How does one make them? What does one need in order to read them? Are they convenient?

So many questions. Here are lots of answers.

## 47.1 Ebook or Information?

OK, stop. The first thing you have to ask yourself when creating an ebook, or choosing which format of an ebook to download, is whether you are looking for an **ebook** or the _information_ the book contains.

For some things, you want an ebook: you want a perfect replica of the book itself, with all of its full page illustrations or photographs, or fancy handwritten text. You want a facsimile of a published work.

Other times, you don't actually care about the "book", you care about the text and the information or story it promises to impart.

For example, a comic book by indie artist Jim Munroe arguably makes sense as, say, a .cbz or PDF because it is necessarily a series of images, while the heady material of indie author Seth Kenlon makes more sense as an .epub or HTML, since it's all just text.

But there are hybrids, too. After all, you might be really interested in the _ideas_ that famous anarchist Emma Goldman wrote about, but maybe you are also interested in preserving the way the magazines of the early 20th century looked; so you do want the images, even though your primary interest is in the words. Or maybe you are reading CS Lewis's classic Narnia books, containing mostly text with just a few illustrations at each chapter break or so; not enough to warrant a scan of each page, but it would be nice to have the illustrations scanned in and included inline.

So that's the setup. Let's explore our options: the good, the bad, and the ugly. Not in that order.

## 47.2 The Bad

Bad ebook formats are the ones that are closed. I could go on my usual diatribe about why closed source technology is bad for you and for your audience, but I'm not going to; I'm just going to use common sense:

There's no reason that something so universal and so simple as a _book_, something that humans have been producing in one form or another for over two millennia, should be wrapped up in a closed source format.

That's just common sense; I don't care what the marketing machines tell you, the truth is the truth, and 2000+ years of producing the written word doesn't lie.

There is nothing that a closed format could possibly offer you that an open source or open standards format does not offer you.

So seriously, avoid formats like the old Kindle .azw, Apple .ibook, Microsoft .lit, and others. If you must publish in those formats (and that happens; some e-readers may only read closed formats, so people using those devices _have to_ use closed formats), at least offer an open standard version alongside.

This makes good sense for everyone. You may not think your work is in demand, but maybe you have quiet fans somewhere in the world, wanting open format versions of your work. Or maybe you don't, but in the future, you might. Locking your work up in a format that the world may well forget when Apple suddenly decides to drop .ibook, or Microsoft finally goes under, or whatever, is just making it harder to preserve your contribution to our culture.

## 47.3 The Ugly

### 47.3.1 PDF

PDFs call themselves the "portable document format". To some degree, that's true, but it's a little presumptuous. Plain old text is a "portable document format"; I could read text on an LCD clock screen wired in as the monitor for a serial port on a computer 25 years old. Now that's portable.

The good things about PDF are:

  1. PDFs are the professional printers' standard format. Unless the printer doesn't know their own business (there are those that don't), if you send a PDF spread to a professional printing facility, then what you send as a PDF is exactly what you will get back on paper.

There's a minor technicality about Postscript that we can ignore, so we'll just agree that yes, PDF is the correct format choice when sending work to a professional printer.

  2. The PDF file format is an open standard, which means anyone who wants to write a reader for it can do so. And they do. Since PDF was, for such a long time, the only reliable cross-platform document format, most everyone has a PDF reader installed.

In other words, PDF is ubiquitous. It's a safe _highest common denominator_ if you are looking for a format that you can confidently send to people and rest easy knowing that they will be able to open and read it.

Those are the good points about PDF. There are things to be aware of, too.

First, PDFs are big and bloated. Do you care? You don't think you care, but you do.

Look, a 95,723-word novel by Jim Munroe weighs in at 283 KB as an EPUB. The same book in PDF: 1.1 MB. Not a big deal at those sizes, but it shows that just for plain text (and one cover photo roughly 50 KB in size) it's _four times_ as heavy in PDF as in EPUB, and you're not getting anything extra. Scale that up (it does not scale linearly) and you might have documents that are 100 MB as an EPUB taking up 200 or 300 MB as a PDF. That makes a huge difference to the hosting bandwidth as well as the client bandwidth, SD card storage space, and impact on a reader's RAM.

Not only are you not gaining features with PDF, you're missing out on features. Users can't control the style of a PDF; this may not matter to you personally, but it makes a world of difference to the colour blind or partially blind, or to picky users who just want to use the features that technology is supposed to provide.

While PDFs can do text re-flow to fit a page better on screen, they rarely do, and when they do, they don't do it reliably (it often depends on the reader software being used).

In short, PDF is not a replacement for any electronic format. It's an electronic replacement for hard copy print-outs. If it's dynamic viewing you want, then don't use a format designed to emulate the immutable qualities of _paper_.

If your workflow is digital to analog, then PDF makes sense. If it's digital to digital, it makes very little sense, in most cases.

## 47.4 The Good

### 47.4.1 Epub

The EPUB format is an open source ebook specification and format; that means that anyone can create or read an .epub file using open source (or closed source that writes to open formats) software. That alone is argument enough to use it, but there are many more reasons.

Most of the benefits of EPUB are summed up in the fact that there is no EPUB; an .epub file is just a collection of individual files zipped up in a specific order and called a dot-epub instead of a dot-zip. Literally.

This means that without any special software (aside from a text editor and something that creates zip files), you can generate a .epub file that anyone in the world can read on their mobile devices, ereaders, and computers.

Perhaps more importantly, you can _get to the data_ contained in an .epub file without any special software. Sure, having an epub reader is nice, and I would much rather view an EPUB in an EPUB Reader, but the point is that it doesn't require one. In a pinch, or in the distant future when the ebook fad has blown over in favour of tangible holograms or whatever, the data inside an EPUB is always accessible.

EPUB can also be as lossy or as lossless as you want. To some degree, PDFs have this ability, but there's a serious loss of formatting even in a lossless PDF if you ever need to extract what has been bundled into it. Not so with EPUB! What you get out of an EPUB is _exactly_ what you put in because, as I say, it's a ZIP file.

EPUBs are streamlined, especially compared to PDFs. The burden to open an EPUB in an appealing and user-friendly way is on the application (the ebook reader, whether it's on a mobile or your computer) rather than the file itself. So there's no real bloat to an EPUB; it's mostly the data you want the user to see, along with one or two index files for the table of contents, and maybe a stylesheet if you are fancy.

The usual content of an EPUB is HTML and CSS data. That means that your .epub file will be as dynamic and "responsive" as the modern HTML5-based web. Content re-sizes and shifts according to the device that the user is viewing it on. And not only that, but the user maintains control over what they see; most good ebook reader apps have user stylesheets that can override the stylesheet (or lack of one) that you bundle with your .epub. Graphic designers and artists sometimes cringe at that, but you have to face facts: not everyone can read your small typeface, or your medium-gray-on-white text, or whatever. There are people who are colour blind, there are people with low vision; if you want them to enjoy your content, then giving them the ability to view your content in any way they need to view it is a powerful and empowering thing.

And don't under-estimate the power of good HTML and CSS design; you can make those cool-looking RPG rulebooks, with the medieval stone background and fancy Old English fonts; you just use a background image in your stylesheet, and include (and use, with @font-face) the font files in the EPUB container. You and your audience get to have your cake and eat it too; you get your fancy design, but your user can override everything as needed.

#### 47.4.1.1 So what is EPUB _not_ good for?

Well, about the only thing EPUB is not good for is print pre-flight. That is, not co-incidentally, the very reason PDF was developed in the first place. So if you are preparing a book layout for printing, don't save it to an electronic book format; export it to a PDF and use that as your master print source. Save the EPUBs for _e_ books.

#### 47.4.1.2 Good for Comics and Photo Books?

EPUBs do, technically, assume that you are using them for books that consist mostly of text. If your book is just a collection of photographs, or just a collection of images (like a comic book), then an EPUB may not be the best choice. That's not to say it can't be the best choice, because even for photo books or comics, you may want to include some front matter or a dynamic table of contents allowing users to skip straight to the sections they are most interested in; in that case, EPUB still might be the way to go; you just need to make a bunch of pages that consist of nothing but a photo or image.

You can do that in HTML and CSS, or you can do it in markdown and then use pandoc to convert it for you. The process is basically the same either way, but markdown is probably the smarter and more minimalistic option. And when I say minimalistic, I mean it; a book containing a bunch of images only requires one markdown file containing an image link to each file. Convert that to EPUB and you have a book of images. This can be done in literally two commands.

### 47.4.2 cbz

The .cbz format, or the **Comic Book archive** format, is barely a format and has all the same benefits as an EPUB. It's streamlined, simple, and leaves all the work to the application reading it.

A .cbz file is just a zip file named with a .cbz extension. This gets used by a comic book reading application (yes, there are applications dedicated to reading electronic editions of comic books), which presents the images within the zip file to the user in the order they appear in the zip container. That's pretty much it. It doesn't get much simpler.

The .cbz format is great, obviously, because it's just as open and just as accessible as EPUBs, but it trims the spec down so that all you need are sequentially named image files and something to create a zip container, and you're done.
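A minimal sketch of that process, assuming your pages are already sequentially named image files (the filenames below are stand-ins). Any zip tool works; Python's built-in `zipfile` module is used here only because it is likely already installed:

```shell
# Stand-in pages; in real use these are your sequentially named scans.
touch page-001.jpg page-002.jpg page-003.jpg

# A .cbz is just a zip archive with a different extension.
python3 -m zipfile -c comic.cbz page-001.jpg page-002.jpg page-003.jpg

# List the archive; comic readers display the pages in this order.
python3 -m zipfile -l comic.cbz
```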

That means that if you really do have a photo book, or a comic book, or liner notes for an album, the .cbz format is perfect for you.

The one drawback of a .cbz file is that it is a little obscure. I didn't even know it existed until I happened to download one being offered for free by an indie author, and even then I wasn't aware that it could be opened by my desktop's default document reader, Okular, until I accidentally clicked on it while I was looking around online to find out which comic reader I should install.

The nice thing about such "problems" is that they are pretty easily solved the same way that PDFs were pushed from an obscure print-industry format into becoming a household term: communication. Tell users _what_ you are giving them, and link them to free and open source applications that can be used to read the file.

## 47.5 But Wait There's More

There are several other perfectly fine open ebook formats, including .djvu, .inf, Plucker's .pdb, and Open XPS. They don't have quite the same level of support across devices, however, so I'm not extolling them as ebook formats, as such.

The important thing to remember is that there really is a time and a place for certain formats, and the more generic you can make your data, the better. It's better for your current audience (because you ensure anyone can get to your content), it's better for you in the event that you need to reverse engineer your own content, and it's better for the future in the event that current tech fads fall flat later on (as they often do). The best formats available do not require you to re-package your data; they allow you to convert it losslessly into another format, and let the applications bear the burden of rendering.

The ideal workflow, in the interest of flexibility and continued maintenance, is to keep your data in its native formats, and then produce the packaged versions of that data with automatable format converters. Be consistent in how you store your data, and you'll be able to reproduce your published versions with one command, making corrections, updates, and revisions something that doesn't even feel like work.
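One way to sketch that one-command rebuild is a tiny script kept next to the canonical source; every name here (files, titles) is hypothetical, and pandoc is assumed:

```shell
# Write a rebuild script next to the canonical markdown source.
cat > publish.sh <<'EOF'
#!/bin/sh
set -e
# Regenerate every published format from the one canonical source.
pandoc book.md --metadata title="My Book" -o book.epub
pandoc book.md -o book.pdf
EOF
chmod +x publish.sh
```

After any correction to `book.md`, running `./publish.sh` reproduces every published version identically.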

There's sometimes a strange fear of using formats that you feel are obscure; we sometimes feel like we are imposing file formats on our audience by giving them an EPUB or CBZ when all they really want is a PDF. But in fact, using the best format is actually something that everyone benefits from. I have been downloading EPUBs from Project Gutenberg ever since they started offering them, and every time I am on some ancient computer or low-spec mobile device, I go back in time and thank myself profoundly for my good foresight. Start using these formats, and your future self and your audience will thank you later.

# 48 Pay or Don't Pay

People are sometimes surprised when I reveal that I disapprove of people using unlicensed copies of software. After all, aren't I the local free software guy, always talking about open source and Free Software? Aren't I the local anarchist who believes that capitalism is an inherently evil motivator substituting for the loss of humanity's...well, humanity?

Well, yes. Yes I am.

The thing is, though, that closed source software comes at a real cost. Not as much as they claim, but the companies producing the software employ people. Many of these people are friends of mine. Some of them work on free software in their spare time at home. Basically, all of them do normal things that you would expect them to do, like pay rent, buy food, and so on.

Before releasing a non-open piece of software, a company draws up a budget. They plan to allot a few million to their greedy overlords (the people you think you are stealing from), a few million to split between their many more plebeian developers, some for marketing, some for backroom dealings to ensure institutional support of their wares, and so on. In short, it's an ugly business, but it is a business. People earn their livings on it, and people do need to earn some kind of living. You probably are aware of this, unless you live in a van and dumpster dive for food, clothes, and computers (although it still could be argued that you are surviving by this system even so, because others are earning enough of a living for themselves and you). Whatever your philosophy, humans have the burden of pulling their own weight in the world, in one way or another, so we must recognise that as long as money is the bartering chip that keeps us alive, some people have to go to jobs and earn money.

So if a company has declared (justly or unjustly; I am not arguing that their accounting is fair) that they expect the profit from producing **PhotoSlam**, a fictional high-end software package, to be 20 million dollars, then they expect to see 20 million dollars. If they do not see 20 million dollars, they do not simply shrug and resolve to try harder next time; they take it out on the workers, because that's what capitalism, love it or hate it, is all about.

What I am saying is that when a company fails to sell copies of their software, they start sacking employees.

I used to think, as many people do, that when you use unlicensed software, you are striking out at the big software companies. You are flying a Jolly Roger flag proudly and you are defiantly taking what belongs to the people. You are a Robin Hood, stealing from the rich and giving to yourself because, presumably, you are poor, or you would be if you had to pay for the software that you want.

Only that's not what's happening.

A better analogy would be that you are a Robin Hood who got a little turned around on your way to the castle and accidentally pillaged the locals. You might even turn around and share your loot with others, but all you have done is take labour from your fellow poor folk and distributed it to other poor folk.

But even this analogy is incomplete. To make it more accurate, let's further suppose that all the loot you got from your fellow plebeians was, as it turns out, infected with The Plague, so not only did you steal from them and redistribute to their neighbours, but you are also spreading a disease.

OK, "plague" is extreme; we could swap that out with skunk spray, but this is what I am saying: when you take software without paying for it, the programmers of that software are the ones who suffer, because you can bet your life that upper management is not going to allow themselves to lose on the deal (the house always wins, and all that). But to make matters worse, you are also strengthening the stranglehold that the software has on its users.

Yes, you got the application for free, but you have no assurance that the company will not screw you over with sudden format changes or deprecation of code or UI redesigns, so you are still subject to their every whim. Worse still, you are investing your time and effort into learning something that you do not control; you are being the model consumer in every way but money; you are submitting yourself to the software company, and you are furthering their cause by using their software, bolstering the cottage industries around it, and helping its ubiquity to spread by using its file formats and implying that everyone should have a copy of this software, that computing just isn't computing without it.

At the very least, you are _not_ showing the world that there are alternatives.

So you are not being Robin Hood or Guy Fawkes or Ned Kelly. You are being that guy that you passed on your way out of the subway station this morning, or the guy sleeping on the park bench. You know the guy? The guy with the cardboard sign asking for handouts. That guy.

So, wait a minute, aren't I saying on one hand that you should not use closed source software, and yet on the other that you should pay for closed source software? Or at the very least, aren't I telling you that your unlicensed copy of **PhotoSlam** is putting my friends out of work, so you should have paid for it, and yet, at the same time, telling you not to submit to closed source software?

In a word: no.

## 48.1 They Fear Your Silence

What I am saying is that it's a free country, so you can use whatever you like. If you are ok with using closed source software, that's fine. But if you do that, you need to respect that that software was written by someone a lot like you and me. It did not just appear out of the head of a so-called great CEO. It got worked on, a lot, by people who go to work on someone else's stupid application for 10 hours a day, five days a week, and need to be paid for that work. You withholding money from the company is not getting back to the CEO, I guarantee it. It does get back to the employees, I guarantee that.

Again, this is not a good thing. It's ugly and unfair, and if you do not like it then you **should** do something about it. But using the software without paying for it is _not_ the thing to be done. If anything, that just confirms that the over-priced product with way too much of the profit going to the CEO is indeed in demand. In fact, it is the start of this beautifully vicious cycle: Universities have to teach PhotoSlam to students because Company XYZ uses it, but Company XYZ only uses it because everyone coming out of uni has been taught it. Don't contribute to this. Show the companies and the colleges and your colleagues that you do not approve of unfair recompensation for work by _not using the things being produced by a corrupt system_.

Don't encourage them; ignore them.

What's the one thing worse than people stealing your stuff? People not accepting your stuff for _free_ when you offer it to them. That, my friends, is the total de-valuation of a product, and the sterilisation of a corporation.

## 48.2 Capitalism in Action

But all of that really is an aside for those who have problems with capitalism. If you love capitalism and you love closed source applications, then you had darned well better be paying for that software, because otherwise you are lying to yourself. The conditions of the game you want to play are that you have to pay. And if you do not, then you are screwing over someone who works very hard at what they do.

## 48.3 Money

This all might seem very smug coming from someone who generally only uses free software. It might sound like I have nothing at stake here, because I don't pay for my software anyway, right? And I claim to be an "anarchist", so I'm probably sitting back watching the world stew in its own little paradox that, in the end, probably just sort of works its way out according to the law of averages; some people pay, some don't, the companies account for it and take the losses, and everything ends up being just fine.

But actually, no, I am not being smug or academic. I am not trying to point out logic errors just in hopes of short circuiting people's faith in modern economics. This logic is the same logic that I expect myself, and other users of free software, to uphold.

You may or may not know that Free Software (note the significantly capitalised **F** and **S** ) is sometimes not, in fact, priced at 0 dollars. Yes, it is true. The big famous example of this is Red Hat Linux: a totally open source platform used by a lot of very big companies, and yet Red Hat charges a lot of money for support contracts, access to immediate and convenient updates, and so on. It is not in any way cost-free, and people who use it have to buy licenses.

Same goes for Slackware, the distribution of Linux that I use. A subscription to Slackware costs about $50 each time they send a new release out.

But wait, there's more: **Ardour**, one of the best DAWs on the market, has subscription and buy-out plans available. Blender has a cloud service and subscription plan. Synfig developers have a Patreon account. LibreOffice asks for donations each time you download the software. Slax sells pre-loaded bootable thumbdrives.

The list goes on and on. And when I come across a project that asks for money, I pay.

I didn't always pay. Back when I was a fresh drop-out from college, trying to find work in film, I literally did not have a spare ten dollars to give over for software. That was one of the reasons I switched to Linux. But things are more stable now, and so I pay whenever a project asks for money. Admittedly, sometimes I don't pay very much; after all, if I never use an application but once a year, I probably am not going to invest a whole lot of money into it. But if it's software that I use in real life, I give generously, because that's the rule that the developers have laid out. The guarantee that the application source code will never go away, and that my data will never be locked out of my grasp, is icing on the cake, and in this case, the cake is not a lie.

In fact, for years I had asked how I could give money to my favourite application developers. Heck, even if they didn't need the money (many devs host the code on a free code site and hack purely for the fun of it), I wanted to buy them a coffee. When an application makes a difference in my life, I want to reach out and thank someone.

As an extra aside, I should point out that this illustrates the difference between the closed source model and the open source one. The payment, while it should be respected in both cases, is truly and acceptably optional in open source. For all the projects I mentioned, there are always alternatives to paying for the product, or the payment amount is user-definable, such that if you go to the Ardour site to download, you can choose to pay $1 (even at my lowest point of post-uni freelance struggle, I could afford $1). Or, really, you can download the source code and build it yourself. And these are perfectly acceptable options in open source.

It isn't optional in closed source, because it cannot be. You can't budget your multi-million dollar monolith based on an optional payment plan. It just doesn't work that way, because you aren't selling the product that you programmed (the fact that there is industry-changing $0 software proves that); you are marketing the idea that your software is essential for emotional and professional success.

Open source sells you the technology, part and parcel.

## 48.4 Honesty

My point is that if you truly believe that closed source software is as great as it markets itself to be, then you should be buying into the product entirely. It's not part of the deal that you get to use the product un-hindered, for $0. You might get a reduced deal on it, you might get a non-commercial license, but part of the game is that for full rights to use the software, payment must change hands.

If you believe that demanding money for a product is unethical, or that insisting upon license re-newal amounts to extortion, or that a product is just plain over-priced, then you do not believe in the closed source philosophy.

The way to tell companies that you want independence is to not pay to be let into their prison.

# 49 Ownership

The concept of ownership is something we hold very dear in our current social system. Generally, "ownership", the idea of having a certain number of things (clothes, eating utensils, books, a computer, a media player, and so on), must eventually be boiled down to how we as a society accept claims that are made to things being produced on this planet.

Logically, one would lay claim to owning some object as long as one literally possessed that object. That is, if I am carrying in a few backpacks my clothes, my computer, my books, and so on, then I claim ownership on those things and it should not be acceptable for someone to come to me and take by force the things I claim as my own.

Most people agree to this concept of personal ownership, and few people argue that there should be no ownership whatsoever, and that anyone should be able to go to anyone else and take anything they so please, from the food on their plate to the clothes on their back.

This becomes more complex when it's acknowledged that not everything humans wish to own will fit in a backpack. This introduces real estate, or land ownership, wherein a person ropes off some portion of the very Earth and claims to own it.

Some power greater than the people claiming this ownership is usually given the authority to reinforce this claim, such that if anyone's land is threatened in any way, the greater force of a government steps in and protects the property.

In any case, the participants in the concept of ownership often invite some third party into the equation in order to have some greater power to which everyone may defer should an argument over ownership arise. The assumption is that without this third party to pose an equal threat to both participants, then hand-to-hand bloody combat will result over every item any two people happen to want to own at the same time.

This is, really, the basis for most ideas of government, and it is one of the most significant pragmatic questions wielded at anarchists; without a governing all-powerful ruler, how can mere individuals ever agree on who owns any given resource?

Such a question, however, may be flawed.

Quite possibly, the question should not be "without a government, how can individuals agree on who owns what?" but instead it may be "why would people ever argue over who owns something?"

Human beings, stripped of the invented concerns that we have created for ourselves, are simple creatures in need of shelter, food, and companionship. These alone can make a human truly happy, and without these a human will become desperate to acquire them. In fact, a human will do nearly anything to acquire these things if deprived of them.

Conversely, if these things are made available, there is little cause for serious arguments or dispute, and any concerns above these things are superficial. Greed has been introduced to the human equation by capitalism, in any of its many flavours.

Currently, a kind of ownership making the rounds in popular ideology involves people owning buildings, and in those buildings they own redundant items, and those items they offer to sell to other people. These property owners do not actually intend to use these items, and in fact they would rather destroy the items than let anyone else have them.

This is a layer of abstraction far beyond mere ownership; this is producing excess, and then withholding it from people until a ransom is paid. To remove it even farther from pragmatism, the only ransom accepted for these items is money. Few, if any, stores today would let someone barter for goods with either labour or unused items of their own. That kind of trade has been relegated to a completely different model of "economy", it has been marked as less efficient and less flexible than a money-based model.

Interestingly, an often-ignored problem with the money-based system is that the value of any given activity or product is completely arbitrary. Since money theoretically reflects amounts of labour performed, then one person might think that an hour of labour certainly would earn them, for instance, a tshirt that they need to protect themselves from the sun, especially when there seems to be an over-abundance of tshirts in most department stores. So in theory, someone should be able to go to a store and trade an hour of labour for a tshirt. But for some reason, most stores will not accept that as a valid trade, and demand instead four hours of labour for a tshirt.

To confuse matters further, if one person's labour is deemed more valuable than someone else's labour, then one person might be able to trade a mere hour (or a fraction of an hour) for a tshirt, because not all labour is equal.

There is an arbitrary value for the things that we want or need, while the work that we do that is supposed to pay for those goods is also valued arbitrarily. This creates a flimsy foundation for the idea of commerce.

If we look at the idea of a Marketplace, we have this at its root:

Proprietor (for lack of a better term) has made a widget, so this proprietor owns the widget and no one should be able to take that away without permission.

Customer (again, lacking a better equivalent term) comes to the market and wants to buy a widget, so the proprietor and customer talk about a fair trade or fair method of recompensation, and the "sale" is made.

This becomes impractical when it scales; a single proprietor cannot single-handedly make a million widgets, but if a million people want widgets, what can the proprietor do? The way it happens now is that a proprietor steps in and decides to have lots of people make widgets, and he pays them as little as he can get away with in order to do this, and then sells those widgets for quite a bit more.

The profits from this combined effort are not shared equally, because the proprietor feels that his role in the effort deserves a greater reward than everyone else's role. He feels that his labour (such as it is) is more valuable than the labour of his workers.

To add insult to injury, the workers making the widgets usually don't even get one of the widgets they themselves have helped make. So not only is their labour worth less than their employer's, it is also worth less than the very product they created.

And the customers must pay whatever price the proprietor demands, or they will get no widget either.

The way it could work (and does, in some communities) is that a group of people get together and decide that they want widgets. Because some organization is required to produce widgets, this group of people decide to appoint the proprietor as the leader of the effort. We won't call the position a leadership role, we'll call it a co-ordinator role. This person's job is to co-ordinate the effort, but the labour required to do that isn't valued any higher than anyone else's contribution.

Once organized, they begin making the widgets, and in the end they have plenty of widgets. In fact, they probably could produce more than they actually need. So they could take the widgets and re-"sell" them for fair prices, without any want of profit; they simply trade their surplus for other people's surplus. Or maybe they do sell it, and then they split the profits.

Either way, everyone wins, and it scales, because groups of people can get things done, and different groups can specialize in different things and can trade one another for the products of their expertise.

This is not something that works only in theory and on paper; it actually happens every day in worldwide communities like the free operating system Linux, the GNU utilities, BSD, in independent online craft stores, within the artistic community via Creative Commons, and in organized trading events such as with nomoola.com and other such groups.

Note that it is less effective to fight to commandeer an "owned" system than it is to ignore it and develop a working alternative. There are probably exceptions, but all the independent successes that I know of and use personally have not been corporations that have been over-run by public opinion, but non-corporate alternatives. This spans the gamut from open source software to co-op and farmers markets, from craft markets to communes. Stealing from a corporation or persuading a corporation to change in order to better suit your needs is not an alternative; it's adjusting the way in which you yourself are commoditised; by stealing, you become marketing, and by opening a dialogue, you become market research.

These are new commerces without the abstraction of surplus ownership, money, and arbitrarily valued labour. This new commerceless system is often seen as a fairy tale and communist fantasy, and the current system is typically conceded to be sometimes harsh and unfair but one that works overall. It is an anarchist's belief, however, that the current system does not work, not in a broad general sense nor on a specific individual level. Rather, the current system produces junk that people do not need, makes a few people very wealthy and ensures others are very poor, and causes more waste than we know what to do with now and in the future. It's not a working system at all, it is simply the predominant system in motion, but hardly something that cannot be turned off and replaced.

Seek it out in your community. Support it, partake in it. If it doesn't exist yet, start it.

# 50 Late Blooming Geek

I was not exactly born a computer geek. I was pretty nerdy as a kid, and I did use computers a lot, and I liked really nerdy things, like science fiction and fantasy, and computers and Lego. I was not at all popular and in fact most of my school memories are of bullies. But I was not the kid who had the smarts to just hide out in the computer lab all day and refine his programming skills. I did not attend uni for comp sci, I did not understand even HTML until HTML5, and I did not build my first PC until my first or second "real job".

By contrast, I did happen to be good enough at computers that I was generally considered the Computer Guy among all my friends and family (whether I liked it or not), and I did love learning new applications and tricks. So when I started looking into trying to get into doing something computer-y for a living, I tried my best to learn all the requisite applications.

Being a book nerd already, I was not afraid to read. In fact, I purchased a number of very thick technical books and read them from cover to cover. They were all very specific to very expensive proprietary software, although there was some general information as well (like explanations of basic concepts like compositing, colour theory, and so on). The problem was that all the software I was trying to learn cost a lot of money. Usually the books came with a 30 day trial of the software, so you could go through the book and actually use the software of which it spake, but after 30 days it stopped working and so everything that you just learned was left to stagnate from a lack of practice.

## 50.1 Approved Method of Acquisition

Sometimes I would find an old abandoned copy of a software that would still run, and a few times I even resorted to "obtaining" the software by other means. The problem here was that old versions were, well, old, and "obtained" versions often would be broken in small, semi-suspicious ways and you could never quite tell if it was the copy of the software that you had "found" or if it was a bug in the software.

This also created a cycle of desperation and fear; I was always on the look-out for the latest software, because I was told by marketing and friends that I simply had to have the latest or I was useless to the world. And since I started creating files with the software, if I ever wanted to open them again, I would _always_ have to make sure I could get a copy of the software. The minute the software slipped away from me, it meant that I had lost access to my stuff, and I didn't like that feeling.

But it was good. I was the guy who had all the apps. People could ask me if I had a copy of something, and I often did. I was, more or less, living up to the reputation that people had assigned to me; I was The Computer Guy, as long as no one asked for technical details on how stuff worked, and as long as someone on the internet could deliver the software for me to obtain.

Pretty quickly, a new problem became evident. It seemed like with every new release, the software was bigger, bloatier, and hungrier. It was like a drug. Not the software itself, I had no love for that, I just knew that in order to make cool stuff and hopefully eventually get paid for it, I needed this software. But the software itself was very demanding, perpetually needing a bigger and better computer. Since the platform I'd been brought up on was The Singular Most Expensive Platform on the market (to this day, I have no idea why that, of all possible choices, had been the OS that my middle-class parents had chosen to raise their children on), I was really finding it difficult to keep up.

The funny thing about this syndrome is that it's a socially-acceptable sickness. I liken it to alcohol and prescription drugs; in most circles, these are very acceptable vices, even though they are regulated and considered harmful. You can drink alcohol, and you can drink alcohol and drive, but if you do it "too much" then you get into trouble. You can take prescription drugs, and you can hoard prescription drugs for "recreational" use, but if you do it too much or start selling your stash, then you get into trouble.

It's very much a "play by my rules or go home" approach, as it leaves no valid alternative that enables you to participate. You either play the game by someone else's rules, or you do not play. If you cannot afford the cost-of-entry, then you are permitted to "cheat" in only the approved ways, as long as you don't talk about it (or if you do talk about it, you assume a demeanour that suggests that you are ashamed of it). If you cheat too much, you get into trouble.

The companies selling the software have everything to gain from their software being perceived as valuable enough to be stolen. And certainly they have everything to gain from becoming the de facto choice for a given task. There's a big cottage industry around software, and there's a lot of money to be made from big companies who, whether they like it or not, have to use the software because everyone who comes in for a job knows that software, and only that software. It's a beautiful, sick, self-perpetuating cycle.

## 50.2 A Brush with the Alternative

I resorted to some pretty drastic measures back in those days. I got a job at a computer store partly so I could get employee discounts (such as they were). I really got lucky (financially speaking) by accidentally purchasing one of the few towers that ended up being able to have its CPU upgraded by a third-party hardware hack some years later. I bought the CPU and a better graphics card and increased the clock speed so I could buy an upgraded OS and continue to serve the abusive overlord that was proprietary software.

You might think that all of that was bad enough to convince me that computers were basically evil, and indeed it was beginning to cause me to question my faith. For instance, I had never looked at lesser brands of computers before. When I got the job at the computer store and saw that the proverbial "PC" (that's what I called non-Macs: a PC, which confusingly, to my mind, meant it ran Windows) boxes were a _third of the price_ compared to what my brand was selling, my resolve was shaken a little. And later, when I was particularly desperate for an upgrade, I did seriously consider switching over to the enemy's side.

I am ashamed to say, also, that at the same time, I started hearing murmurs about a trio of products called **Red Hat**, **Yellow Dog**, and **Suse**. I had no idea what it was all about, but they were located in the PC side of the store, so I knew it had to be something for Windows, and so obviously unimportant.

I want to point out, in my own defense, that this was somewhat a valid reaction. I am not a fan of Mac at all (any more), but I don't think that the answer to Mac is Windows. Of course, that's how both Windows and Apple position it, having become pretty friendly compatriots who support one another by pretending to compete. But for someone who literally does not understand what Linux _is_ , what options are there? I wasn't going to give up Mac for Windows, because Windows is dismal. So, no, the answer is not Windows, it's PC...but at the time, I literally did not understand that a "PC" was just a lump of hardware.

Scary to think that I was considered "the geek" among my friends.

Sadly, most people get so blindly patriotic about the brand of their computer or operating system that, even if they do understand that an OS is separate from the hardware, they never actually _express_ that. They just keep spreading discontent, meaningless competition, and disinformation.

At one point, I got a phone call from a customer asking if we carried Red Hat, and I told her that we did, but that I knew nothing about it. She told me that we should pay attention to that stuff, because it was The Future.

I remember her sounding a little desperate and angry. I did not think she was right, but then again, by that time I had learnt that most customers were criminally psychotic, so I ignored pretty much all of them. Even so, I often think back on this at-the-time confusing conversation and marvel at how close I was to getting the answer to all of my problems.

## 50.3 Man Behind the Curtain

Eventually, I did the unthinkable and got a credit card (I don't touch the things nowadays; proud to be credit-free) and purchased the latest and greatest name-brand computer and continued to run the proprietary software that I was convinced was going to land me a sweet job doing computery artistic things.

I was still paying the computer off when its hard drive died. I took it to the specialty computer geniuses (they were certified as such, it said so on their t-shirts) and was told that one year is the expected life span of a hard drive.

I am not making this up.

For the first time in my life, I suspected that I was being ripped off. I didn't blame the hardware failure on the software or even the computer brand that I had so loved. No, I had, instead, finally gotten a glance behind the curtain. The store I was in was glitzy and bright and high tech, and everyone there seemed so smart. But underneath it all, they were lying.

Around the same time, I noticed that a lot of my old files were not opening any more. Some of the software that I had used for the past few years had gone out of business. In the past, when that happened, another company always seemed to pick up where one left off, just sort of magically inheriting file formats. I never really gave it a second thought; it was just something that happened naturally. But lately it seemed that companies were disappearing and taking with them the ability to open the files that their software had produced. So if you had been sucker enough to use this music application or that word processor or that compression algorithm, and you hadn't thought to save your data to something new (if it could even be saved to another format; specialised project files were a little difficult to just save into some other application), then you were quite possibly out of luck.

And unfortunately, my year-old computer had just died, taking all of those legacy applications with it.

That was probably the deepest scar: being locked out of my data in the sense that quite probably I would never see it again. And worse yet, I'd _paid_ for the privilege of having my hard work locked up.

## 50.4 Unix

I was on a subway, reading a film trade magazine, and there was an interview with a compositor, and he was talking about some software that the studio used. He said that it was great how the software was on Unix, so that for a lot of his work, he didn't even have to launch the user interface. He could just run it from a command line. This was Greek to me, but it sounded amazing. I did not understand how such a thing was possible; how was it possible to run an application without _launching_ the application?

It got me very interested in the idea that there were methods of computing out there that I had never heard about.

It also triggered a memory in me: long ago, when I was a mere lad, my father had told me that the purest, most powerful computers, were UNIX computers. These computers were not located in people's homes, but they had always been around, and they powered everything.

I ended up freelancing, oddly enough getting several jobs training the employees and customers of the store where they had told me that hard drives were only expected to work for a year. Luckily, right across the street from that store was a bookstore, where I would hang out after a teaching gig. It was there that I found the _Visual Quickstart Guide to UNIX_. I still remember walking into the store one day with that book tucked under my arm, and one of the salespeople laughing at its title. A visual guide to a text-based system? Pretty funny. Shows what he knows; the book is great, and gave me a solid understanding of how Unix worked at its lowest user-facing level.

## 50.5 Hardware Hacking

Around the same time as all this, I was really getting the urge to do something creative with a computer. Deep down, I wanted to design my own interface, because I knew that the interface of OS X was not efficient for me any longer, and hadn't been for a very long time. So I did as much as I could, starting with little mods to change the icon theme, and then add-on software that provided alternatives to the Finder, including one called Quicksilver, which, I'll bet you did not know, later got ripped off by Apple in the form of Spotlight.

Unable to truly customise my computer's interface, I started getting curious about modding the computer itself. I discovered what was then being called "IP TV" (as in "television over IP"). We call them video-casts, or video-podcasts, or "vlogs" now, but this was before YouTube was really a thing, and right at the beginning of when the term "podcasting" was catching on, so it was "IP TV" and, in my mind, it was destined to replace traditional TV. I still wish it would, but I digress.

I found a show about hardware mods, and it was amazing. This guy took computers apart, cut designs in cases, re-painted them, re-housed them; he did amazing mods and I wanted to do something like it. So I started collecting computers from customers; someone would bring me a computer so I could transfer their data to their new one, and I'd do the job for free in exchange for the old computer that they were getting rid of anyway.

The more I watched the show about hardware mods, the more I realised that it took a different set of skills than anything I had. I'd never taken so much as basic carpentry in school, much less did I know what a Dremel was or how to use one.

So my mind started wandering back to the software side of things.

After some time, I managed to get hold of two "broken" laptops. I took them both apart and swapped out parts (yes, of laptops) until I ended up with one working laptop. Based on what I learnt from the UNIX book, I started messing around with this laptop's UNIX interface (not Linux, but OS X, because I still had not yet learnt about Linux, believe it or not). The terminal proved absolutely addictive to me, and I soon found the Fink Project, a **dpkg** and **apt-get** system for Mac OS X. I didn't know what any of that meant at the time; I just knew that UNIX applications could be obtained and run, for free, from the terminal.
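Fink's workflow, for anyone curious, went roughly like this (the package name here is just an example, not necessarily something I installed):

```shell
# Ask Fink to build and install a package from source,
# compiling its dependencies as needed
fink install lynx

# Fink also shipped Debian's packaging tools, so pre-built
# binary packages could be fetched the apt way
sudo apt-get update
sudo apt-get install lynx
```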

I cannot describe the sense of overwhelming enlightenment this caused me. I had discovered, largely by my own research, an entirely _new_ desktop called **e16** (that's Enlightenment, of course). I found a photo re-touching application, a word processor, games, other terminals, and so much more.

I was not using the internet for most of this research, because it never occurred to me that other people might be doing this sort of thing; and to a degree, I was right, although an internet search or two might have led me to Linux a lot sooner! But I'd found all of this on my own, from books and by reading files on the computer, and I was enormously excited.

When something from Fink failed to compile, I would download the source code, read the docs, and attempt to make changes as needed so that the application would compile. Sometimes that worked, but mostly it failed. By the end of my grand experiment, though, I had a fully functional UNIX operating environment that ran on top of OS X. I intentionally broke as much of OS X as possible in order to make this happen (if I didn't do that, then the OS X dock and menu bar kept getting in my way). I moved configuration files, I blew away libraries and executables, I absolutely destroyed the system until I'd stripped it down as far as it would go (which was not very far), and I even scripted it such that a UNIX desktop (Window Maker, specifically) would start after the boot sequence. The effect was, more or less, that I was able to boot to a Unix desktop without ever "seeing" OS X.

My first big public UNIX success story came about from chatting with some of the tech people at work, because it had come out that I was interested in UNIX. So we were talking, and I was asking some questions about a subtlety of networking or PIDs, and I accidentally realised, as we talked, that I secretly and politely knew more about UNIX than the entry level IT guys who were on the track to becoming sys admins of a major UNIX network. In fact, I was over at one of their houses once, and he asked me to figure out how to get his computer to print over his home network, because I knew stuff about UNIX. To my own amazement, I configured his computer, via CUPS, to print to his networked printer, when nothing he was able to do via OS X would allow it.
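For the record, pointing CUPS at a network printer from the command line goes something like this (the queue name and printer address below are made up for illustration):

```shell
# Create a print queue called "homeprinter" aimed at a networked
# printer; -E enables the queue, -v sets the device URI
lpadmin -p homeprinter -E -v ipp://192.168.1.50/ipp/print -m everywhere

# Send a file to the new queue to test it
lpr -P homeprinter test.txt
```

(The **-m everywhere** driverless model is a modern convenience; back then you pointed **lpadmin** at a PPD file for your particular printer.)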

I was more or less having to come to terms with the idea that I might actually know a thing or two about this UNIX stuff, even though I felt like a complete idiot. (And of course, I was; there was a lot to learn yet, but I was clearly doing a fair job of learning it).

## 50.6 GNU's Not Unix

Somewhere in the middle of all these UNIX experiments, I learned the old chestnut about how you could play Tetris "in the terminal". It was wrong, of course, because it was actually referring to playing Tetris in Emacs, but since Emacs on OS X launched only in the terminal (via the **-nw** flag), that basically meant the same thing to most "tech journalists".

I started playing Tetris "in the terminal" in my spare moments at work, but whenever the boss would come round, I had to quickly switch over to something that was not a game. So I had to do a little research into how I could do that most efficiently, and it was from there that I learnt that I was not playing Tetris "in the terminal", but from inside of an application called Emacs.
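If you want to try the same trick yourself, it still works anywhere GNU Emacs is installed:

```shell
# Launch Emacs in the terminal, with no graphical window
emacs -nw

# Then, inside Emacs:
#   M-x tetris     starts the game
#   C-x C-c        quits Emacs (handy when the boss comes round)
```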

So I started looking into this Emacs application, and found tucked away in **/usr/share/docs/emacs** (or thereabouts) a document called **COPYING**. I read the entire thing, from start to finish, and I was enraptured until the last letter. I think there was another document in there, because I do recall some footnotes or commentary about the true meaning of "free"; maybe it was a README or something.

Point is, from these files, tucked into the hidden UNIX file system lurking beneath all of the OS X blivet, I learnt about GNU and Linux. Most importantly, I learnt _why_ they existed, and why they absolutely needed to exist.

This was a big deal for me, because although I subscribed to anarchism, and had for years, I was still struggling for a meaningful way to put that into action. The fact that a world of independent computing existed, free of corporate control or influence, and that it was all based on UNIX, opened up a whole new world of potential for me. The thing that I thought was just a silly waste of time was, in fact, a very real form of resistance. It was practical anarchism, and it was working!

## 50.7 Linux

Armed with my bastardised UNIX laptop, I was compulsively spending my lunch hours and evenings in the book store, poring over magazines and books about Linux. It turned out there was a rich ecosystem all around Linux; funny how you don't notice two entire bookshelves about nothing BUT Linux until you start looking for them.

I would read through book after book, trying to figure out which Linux distribution was right for me. The one I remember most was a book that talked a lot about Mandriva, and in fact it came with a Mandriva DVD installer in the back. It _claimed_ that you could just pop in the disc, reboot, and run the computer off the disc. I'd done something like this before, ages ago, with some kind of system rescue disk, so the idea was not foreign to me, but it did seem a little too good to be true.

I didn't have much money, but I wanted to try Linux, so I got a non-Mac laptop from someone on Craigslist for $25 (should have been free), and took it home. I immediately booted into Mandriva Linux. The laptop didn't really work (the hardware, I mean; like, it was just flat out broken) so I didn't spend much time in Mandriva, but I'd booted into it. I had seen Linux, I saw that it booted and provided a desktop, and that it was something I could probably learn.

It was all downhill from there. I got hold of a PowerPC Ubuntu installer and put that on an old iMac, but a lot of that hardware didn't work due to driver issues, and I eventually realised that if I wanted to get into Linux in earnest, I needed to get a "PC".

At first, I felt like a traitor to Mac, because I was using a non-Mac system. I told my friends that I wasn't leaving Mac, I was just also using Linux, which basically is the same thing as a Mac, if you think about it. They did not agree.

But I was too far gone to care about peer pressure, and I was determined to get a PC that I could run Linux on.

It must have been fate, because as soon as I said that I wanted to get a laptop, an online scam popped up saying that if I signed up for promotional deals and trial memberships for a bunch of different services, then I could earn points toward getting a laptop for free. A friend of mine swore on his life that it had worked for him, and insisted I sign up and do it. I figured that if it really did work (and my friend had an iPod as a prize to prove it), it wasn't going to last long. So against my better judgement, but driven by poverty and an addiction to Linux, I signed up.

I could either get one of those fancy new Macbooks that had just come out (Intel) or I could get a Sony Vaio. First of all, I couldn't believe I was trying to call the scam's bluff by actually signing up for these online offers, and second of all, I couldn't believe I was going to choose a Sony over a Mac. It felt dirty.

But the scam turned out not to be a scam. They actually did send me a laptop, and the moment I got the laptop in the mail, I cancelled all my trial memberships and my account with the online service (which, obviously, folded two months later anyway).

I didn't even boot into Windows, obviously; I popped in a distribution, booted, and installed Linux on my very first dedicated Linux PC.

## 50.8 Open Source

My first impression of Linux was total and unabashed excitement, awe, and reverence. I felt like I'd finally found my place in the world. It was perfect.

Strangely, though, I, like many new Linux users, looked at everything through the foggy filter of the current tech industry competitors. It was not possible to see an operating system as something that could exist without also competing for market leadership. It just didn't even occur to me to think of it in any other way.

In addition to that, the way I judged Linux, and sought to understand it, was from an entirely Mac-centric viewpoint. I mean, if ever you want to understand how ancient civilisations could have believed in witches and a flat Earth, just switch to Linux and _relish_ your brain's reaction to something that simply. does not. fit. into the way you were taught the world works.

My initial decisions about which distributions to try were invariably based upon how Mac-like they were, and every computing concept presented to me was equated to the nearest Mac convention I could find. Things as basic as a hard drive partition were profoundly confusing to me: the fact that the desktop was a separate entity from the file manager, the fact that there was hardware inside the computer case that the OS had to talk to with drivers, the fact that there was choice in how I could compute. It was all so foreign to me that my brain had no option but to reduce everything down to Mac terminology.

It took me a long time before I really understood that Linux was not a brand. It wasn't a publicly traded company that was out to put Microsoft and Apple out of business. It wasn't in business at all. Heck, there wasn't even an "us" for me to grab hold of when going up against "them". Linux, which is what I was seeing as the embodiment of Open Source, is not a group of people who meet and plan strategies and run a co-op. Linux and open source are a landscape, in which tribes and loners roam, and sometimes they get along, and sometimes they don't, and sometimes they double up on effort and repeat one another's successes or mistakes, and oh-my-gosh I was there too. I was one of those loners, wandering the land with my computer and a few modest shell scripts (I had a script to configure a network card on Slackware because I didn't know about **/etc/rc.d/rc.inet** or **wicd** yet).
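That Slackware script was nothing sophisticated; a minimal hand-rolled network-up script of that era looked something like this (the interface name and addresses are placeholders, not my actual configuration):

```shell
#!/bin/sh
# Bring up the wired interface with a static address
ifconfig eth0 192.168.1.42 netmask 255.255.255.0 up
# Point all outbound traffic at the router
route add default gw 192.168.1.1
# Tell the resolver which DNS server to ask
echo "nameserver 192.168.1.1" > /etc/resolv.conf
```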

I guess what I'm trying to say is that learning Unix and Linux was the easy part. Reversing a lifetime of capitalist fundamentalism was quite another. The depth at which ideas take root in people's minds is, I think, under-estimated by most of us. I mean, take your own beliefs as an example. Were you raised to believe there is a heaven and a hell? Or were you raised to believe there absolutely is no heaven or hell? Or that there is such a thing as right and wrong? Or that monogamy is an innate natural law for humans? That there is life on other planets? That light speed cannot be broken? That mathematics is true?

Take your pick; these ideas are so deep that they don't even seem like ideas; we call them "facts", and they shape how we interpret and react to the world around us. Some of them we accept as being debatable, some we admit we do not fully understand yet, and still others we would fight tooth and nail to defend.

And that's what open source feels like: the realisation that all the facts we'd been taught or we've gleaned may not, actually, be the way the world works.

That's a good thing, but it's also a lot to process. But once you do process it, it's an insanely powerful force. I didn't set out to become a Linux geek; I wanted to make weird art and get paid for it. Instead, I figured out a way to get paid for using systems that enable me to make art without paying.

# 51 Colophon

Unsurprisingly, nothing but open source software was used to produce this book. There is a lot of software that I enjoy using, and probably I'll leave some out accidentally, but these are some of them:

  * Slackware Linux, the self-proclaimed "most Unix-like of all Linuxes". It is, incidentally, the oldest Linux distribution still in regular production. I have not used it nearly as long as it's been around, having found it only at version 12 or thereabouts. I'll admit to using some Debian PowerPC here and there, because my laptop is a PowerPC machine, but generally I'd rather use Slack.

  * The ultimate text editor, GNU Emacs, has been around forever and is truly as great as its reputation. It took me a long while to get my foot in the door with Emacs; it is a truly complex application, but once everything falls into place in your mind and your fingers, you realise why everyone loves it so much. It is to text editors what Linux is to operating systems; total control, total customisation, and ultimate power.

What I think a lot of people don't understand about Emacs and, indeed, the Unix way of doing things in general, is the profound power it gives you when you can log into a server a continent away and get a full day of work done using _nothing but a text interface_. That's a big deal; the text interface keeps things fast, but the tools are powerful enough that the minimal interface doesn't hurt (in fact, some might argue that it helps).

  * I generally end up using KDE as my desktop on my main workstation, although I'll freely admit to using Fluxbox as well. Both environments promote customisation, control, and efficiency, and since I usually end up running K apps on Fluxbox, it all feels the same to me.

  * The, um, cartoons (such as they are) were drawn with MyPaint. I am obviously not a professional illustrator but MyPaint is pretty cool.

  * The cover of the book was designed in Inkscape, and it uses artwork from the Creative Commons site openclipart.org; specifically, a laptop uploaded by user Johnny Automatic and a penguin sitting at a computer by Moini.

  * All fonts used are also open source. Specifically, I use Kabel (the old official KDE font), Comfortaa, and Liberation Sans (a free Helvetica-compatible font). There are probably some other fonts that snuck in here and there, but mostly those are what I used.

The essays (if I may call them that?) in this book were written largely in Markdown. I'm a big fan of plain text. One issue with truly _plain_ plain text, though, is that there is no structure to it, so parsing it can be tricky. The advantage of Markdown is that it preserves the plain-textedness of the source material while providing a subtle structure that you can latch onto and parse when needed.

...which coincidentally brings us to the amazing parser, Pandoc. Pandoc is utterly remarkable; it takes text, parses it, converts it to nearly any format you want, and even renders and packages it up as EPUB or similar. Aside from having written a paragraph singing its praises, I am left speechless. This is one of those tools that you use, and then later sit back and think about, and find yourself shaking your head and muttering, "This should just _not_ be free!"
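To give you an idea of how little effort is involved: assuming the whole book lives in one Markdown file (the file names here are hypothetical), producing the EPUB is a single command.

```shell
# Convert Markdown to EPUB; --metadata supplies the book title
pandoc book.md --metadata title="Computing Without Compromise" -o book.epub

# The same source becomes standalone HTML just by changing the output
pandoc book.md --standalone -o book.html
```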

But of course, it is. And that's the beauty of this whole culture, isn't it? It's about sharing information, labour, and resources. Small wonder that I'd write in excess of 40,000 words marveling at it.

Thanks for reading.

# 52 License

This book is licensed under the Creative Commons BY-SA License.

## 52.1 Creative Commons Attribution-ShareAlike 4.0 International Public License

By exercising the Licensed Rights (defined below), You accept and agree to be bound by the terms and conditions of this Creative Commons Attribution-ShareAlike 4.0 International Public License ("Public License"). To the extent this Public License may be interpreted as a contract, You are granted the Licensed Rights in consideration of Your acceptance of these terms and conditions, and the Licensor grants You such rights in consideration of benefits the Licensor receives from making the Licensed Material available under these terms and conditions.

 **Section 1 – Definitions.**

  1.  **Adapted Material** means material subject to Copyright and Similar Rights that is derived from or based upon the Licensed Material and in which the Licensed Material is translated, altered, arranged, transformed, or otherwise modified in a manner requiring permission under the Copyright and Similar Rights held by the Licensor. For purposes of this Public License, where the Licensed Material is a musical work, performance, or sound recording, Adapted Material is always produced where the Licensed Material is synched in timed relation with a moving image.
  2.  **Adapter's License** means the license You apply to Your Copyright and Similar Rights in Your contributions to Adapted Material in accordance with the terms and conditions of this Public License.
  3.  **BY-SA Compatible License** means a license listed at creativecommons.org/compatiblelicenses, approved by Creative Commons as essentially the equivalent of this Public License.
  4.  **Copyright and Similar Rights** means copyright and/or similar rights closely related to copyright including, without limitation, performance, broadcast, sound recording, and Sui Generis Database Rights, without regard to how the rights are labeled or categorized. For purposes of this Public License, the rights specified in Section 2(b)(1)-(2) are not Copyright and Similar Rights.
  5.  **Effective Technological Measures** means those measures that, in the absence of proper authority, may not be circumvented under laws fulfilling obligations under Article 11 of the WIPO Copyright Treaty adopted on December 20, 1996, and/or similar international agreements.
  6.  **Exceptions and Limitations** means fair use, fair dealing, and/or any other exception or limitation to Copyright and Similar Rights that applies to Your use of the Licensed Material.
  7.  **License Elements** means the license attributes listed in the name of a Creative Commons Public License. The License Elements of this Public License are Attribution and ShareAlike.
  8.  **Licensed Material** means the artistic or literary work, database, or other material to which the Licensor applied this Public License.
  9.  **Licensed Rights** means the rights granted to You subject to the terms and conditions of this Public License, which are limited to all Copyright and Similar Rights that apply to Your use of the Licensed Material and that the Licensor has authority to license.
  10.  **Licensor** means the individual(s) or entity(ies) granting rights under this Public License.
  11.  **Share** means to provide material to the public by any means or process that requires permission under the Licensed Rights, such as reproduction, public display, public performance, distribution, dissemination, communication, or importation, and to make material available to the public including in ways that members of the public may access the material from a place and at a time individually chosen by them.
  12.  **Sui Generis Database Rights** means rights other than copyright resulting from Directive 96/9/EC of the European Parliament and of the Council of 11 March 1996 on the legal protection of databases, as amended and/or succeeded, as well as other essentially equivalent rights anywhere in the world.
  13.  **You** means the individual or entity exercising the Licensed Rights under this Public License. **Your** has a corresponding meaning.

 **Section 2 – Scope.**

  1.  **License grant**.
    1. Subject to the terms and conditions of this Public License, the Licensor hereby grants You a worldwide, royalty-free, non-sublicensable, non-exclusive, irrevocable license to exercise the Licensed Rights in the Licensed Material to:
      1. reproduce and Share the Licensed Material, in whole or in part; and
      2. produce, reproduce, and Share Adapted Material.
    2. Exceptions and Limitations. For the avoidance of doubt, where Exceptions and Limitations apply to Your use, this Public License does not apply, and You do not need to comply with its terms and conditions.
    3. Term. The term of this Public License is specified in Section 6(a).
    4. Media and formats; technical modifications allowed. The Licensor authorizes You to exercise the Licensed Rights in all media and formats whether now known or hereafter created, and to make technical modifications necessary to do so. The Licensor waives and/or agrees not to assert any right or authority to forbid You from making technical modifications necessary to exercise the Licensed Rights, including technical modifications necessary to circumvent Effective Technological Measures. For purposes of this Public License, simply making modifications authorized by this Section 2(a)(4) never produces Adapted Material.
    5. Downstream recipients.
      1. Offer from the Licensor – Licensed Material. Every recipient of the Licensed Material automatically receives an offer from the Licensor to exercise the Licensed Rights under the terms and conditions of this Public License.
      2. Additional offer from the Licensor – Adapted Material. Every recipient of Adapted Material from You automatically receives an offer from the Licensor to exercise the Licensed Rights in the Adapted Material under the conditions of the Adapter's License You apply.
      3. No downstream restrictions. You may not offer or impose any additional or different terms or conditions on, or apply any Effective Technological Measures to, the Licensed Material if doing so restricts exercise of the Licensed Rights by any recipient of the Licensed Material.
    6. No endorsement. Nothing in this Public License constitutes or may be construed as permission to assert or imply that You are, or that Your use of the Licensed Material is, connected with, or sponsored, endorsed, or granted official status by, the Licensor or others designated to receive attribution as provided in Section 3(a)(1)(A)(i).
  2.  **Other rights**.

    1. Moral rights, such as the right of integrity, are not licensed under this Public License, nor are publicity, privacy, and/or other similar personality rights; however, to the extent possible, the Licensor waives and/or agrees not to assert any such rights held by the Licensor to the limited extent necessary to allow You to exercise the Licensed Rights, but not otherwise.
    2. Patent and trademark rights are not licensed under this Public License.
    3. To the extent possible, the Licensor waives any right to collect royalties from You for the exercise of the Licensed Rights, whether directly or through a collecting society under any voluntary or waivable statutory or compulsory licensing scheme. In all other cases the Licensor expressly reserves any right to collect such royalties.

 **Section 3 – License Conditions.**

Your exercise of the Licensed Rights is expressly made subject to the following conditions.

  1.  **Attribution**.

    1. If You Share the Licensed Material (including in modified form), You must:

      1. retain the following if it is supplied by the Licensor with the Licensed Material:
        1. identification of the creator(s) of the Licensed Material and any others designated to receive attribution, in any reasonable manner requested by the Licensor (including by pseudonym if designated);
        2. a copyright notice;
        3. a notice that refers to this Public License;
        4. a notice that refers to the disclaimer of warranties;
        5. a URI or hyperlink to the Licensed Material to the extent reasonably practicable;
      2. indicate if You modified the Licensed Material and retain an indication of any previous modifications; and
      3. indicate the Licensed Material is licensed under this Public License, and include the text of, or the URI or hyperlink to, this Public License.
    2. You may satisfy the conditions in Section 3(a)(1) in any reasonable manner based on the medium, means, and context in which You Share the Licensed Material. For example, it may be reasonable to satisfy the conditions by providing a URI or hyperlink to a resource that includes the required information.
    3. If requested by the Licensor, You must remove any of the information required by Section 3(a)(1)(A) to the extent reasonably practicable.

  2.  **ShareAlike**.

In addition to the conditions in Section 3(a), if You Share Adapted Material You produce, the following conditions also apply.

    1. The Adapter's License You apply must be a Creative Commons license with the same License Elements, this version or later, or a BY-SA Compatible License.
    2. You must include the text of, or the URI or hyperlink to, the Adapter's License You apply. You may satisfy this condition in any reasonable manner based on the medium, means, and context in which You Share Adapted Material.
    3. You may not offer or impose any additional or different terms or conditions on, or apply any Effective Technological Measures to, Adapted Material that restrict exercise of the rights granted under the Adapter's License You apply.

 **Section 4 – Sui Generis Database Rights.**

Where the Licensed Rights include Sui Generis Database Rights that apply to Your use of the Licensed Material:

  1. for the avoidance of doubt, Section 2(a)(1) grants You the right to extract, reuse, reproduce, and Share all or a substantial portion of the contents of the database;
  2. if You include all or a substantial portion of the database contents in a database in which You have Sui Generis Database Rights, then the database in which You have Sui Generis Database Rights (but not its individual contents) is Adapted Material, including for purposes of Section 3(b); and
  3. You must comply with the conditions in Section 3(a) if You Share all or a substantial portion of the contents of the database.

For the avoidance of doubt, this Section 4 supplements and does not replace Your obligations under this Public License where the Licensed Rights include other Copyright and Similar Rights.

 **Section 5 – Disclaimer of Warranties and Limitation of Liability.**

  1.  **Unless otherwise separately undertaken by the Licensor, to the extent possible, the Licensor offers the Licensed Material as-is and as-available, and makes no representations or warranties of any kind concerning the Licensed Material, whether express, implied, statutory, or other. This includes, without limitation, warranties of title, merchantability, fitness for a particular purpose, non-infringement, absence of latent or other defects, accuracy, or the presence or absence of errors, whether or not known or discoverable. Where disclaimers of warranties are not allowed in full or in part, this disclaimer may not apply to You.**
  2.  **To the extent possible, in no event will the Licensor be liable to You on any legal theory (including, without limitation, negligence) or otherwise for any direct, special, indirect, incidental, consequential, punitive, exemplary, or other losses, costs, expenses, or damages arising out of this Public License or use of the Licensed Material, even if the Licensor has been advised of the possibility of such losses, costs, expenses, or damages. Where a limitation of liability is not allowed in full or in part, this limitation may not apply to You.**
  3. The disclaimer of warranties and limitation of liability provided above shall be interpreted in a manner that, to the extent possible, most closely approximates an absolute disclaimer and waiver of all liability.

 **Section 6 – Term and Termination.**

  1. This Public License applies for the term of the Copyright and Similar Rights licensed here. However, if You fail to comply with this Public License, then Your rights under this Public License terminate automatically.
  2. Where Your right to use the Licensed Material has terminated under Section 6(a), it reinstates:

    1. automatically as of the date the violation is cured, provided it is cured within 30 days of Your discovery of the violation; or
    2. upon express reinstatement by the Licensor.

      For the avoidance of doubt, this Section 6(b) does not affect any right the Licensor may have to seek remedies for Your violations of this Public License.

  3. For the avoidance of doubt, the Licensor may also offer the Licensed Material under separate terms or conditions or stop distributing the Licensed Material at any time; however, doing so will not terminate this Public License.
  4. Sections 1, 5, 6, 7, and 8 survive termination of this Public License.

 **Section 7 – Other Terms and Conditions.**

  1. The Licensor shall not be bound by any additional or different terms or conditions communicated by You unless expressly agreed.
  2. Any arrangements, understandings, or agreements regarding the Licensed Material not stated herein are separate from and independent of the terms and conditions of this Public License.

 **Section 8 – Interpretation.**

  1. For the avoidance of doubt, this Public License does not, and shall not be interpreted to, reduce, limit, restrict, or impose conditions on any use of the Licensed Material that could lawfully be made without permission under this Public License.
  2. To the extent possible, if any provision of this Public License is deemed unenforceable, it shall be automatically reformed to the minimum extent necessary to make it enforceable. If the provision cannot be reformed, it shall be severed from this Public License without affecting the enforceability of the remaining terms and conditions.
  3. No term or condition of this Public License will be waived and no failure to comply consented to unless expressly agreed to by the Licensor.
  4. Nothing in this Public License constitutes or may be interpreted as a limitation upon, or waiver of, any privileges and immunities that apply to the Licensor or You, including from the legal processes of any jurisdiction or authority.

Creative Commons is not a party to its public licenses. Notwithstanding, Creative Commons may elect to apply one of its public licenses to material it publishes and in those instances will be considered the "Licensor." The text of the Creative Commons public licenses is dedicated to the public domain under the CC0 Public Domain Dedication. Except for the limited purpose of indicating that material is shared under a Creative Commons public license or as otherwise permitted by the Creative Commons policies published at creativecommons.org/policies, Creative Commons does not authorize the use of the trademark "Creative Commons" or any other trademark or logo of Creative Commons without its prior written consent including, without limitation, in connection with any unauthorized modifications to any of its public licenses or any other arrangements, understandings, or agreements concerning use of licensed material. For the avoidance of doubt, this paragraph does not form part of the public licenses.

Creative Commons may be contacted at creativecommons.org.

