
Writing a Book Using Linux Tools

emacs screenshot

When I began writing The Dictator's Handbook, it seemed clear the project would take place on my Linux computer. And it did. We've got a Mac in the house too, but I'm in front of the Linux box more often than not, and in fact my Linux desktop gave me some tricks and tools that made the process of writing and researching a snap. Here they are.

This is part I of the Dictator's Handbook Colophon. (Part II: Making an Epub document from LaTeX).

Writing and Editing

LaTeX: The book consists of 14 chapters with footnotes, a 500-entry bibliography, and a structured organization. I know and love LaTeX, and this was clearly a job at which LaTeX would excel. I also wanted the printed version of the manuscript to look spectacular. I wanted initial lettrines (the large capital letter at the start of a chapter), I wanted epigraphs (the quotations that start a chapter), and I wanted total, relentless consistency of styles. I also wanted working on and with the bibliography to be effortless. LaTeX allowed me to do all that. Needless to say, the easiest way to work in LaTeX was with Emacs and AUCTeX, and that's the software I used to write the book. The decision to use LaTeX brought a few other advantages, too:

  1. Easy to divide the manuscript into individual .tex files, one for each chapter, and use one master .tex file to bind them all together, like this:
    10-international.tex      6-politics.tex
    11-finance.tex            7-military.tex
    12-elections.tex          8-unrest.tex
    13-endgame.tex            9-press.tex
    1-gettingtopower.tex      back.tex
    2-inimitableyou.tex       cover.tex
    3-buildinggovernment.tex  dictator.tex
    4-runningnation.tex       front.tex
    5-culturefear.tex         dictatorbiblio.bib
  2. You can regenerate the final PDF using just the .tex files and the .bib bibliography file, and because each of those files is plain text, they zip up really easily to a compact size for backup.
  3. You can also totally ignore spelling until you're ready to deal with it, and then spellcheck all your files at once using aspell.
  4. Lastly (and this is the subject of my next article), it's not too hard to go from LaTeX to Epub formats, meaning you can produce a gorgeous printed book and a gorgeous ebook from the same source, and without too much hassle. Stay tuned.
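The master file mentioned in item 1 can be a short skeleton that pulls in each chapter with \include, which also lets you compile a single chapter at a time via \includeonly. A minimal sketch (the file names come from the listing above, but the document class options and packages are illustrative, not the book's actual preamble):

```latex
% dictator.tex -- master file binding the chapters together.
% Preamble details here are illustrative examples only.
\documentclass[11pt]{book}
\usepackage{lettrine}   % the large initial capitals
\usepackage{epigraph}   % the chapter-opening quotations

\begin{document}
\include{front}
\include{1-gettingtopower}
\include{2-inimitableyou}
% ... chapters 3 through 12 ...
\include{13-endgame}
\include{back}

\bibliography{dictatorbiblio}
\end{document}
```

Running latex (or pdflatex) against this one file rebuilds the whole book, which is what makes the plain-text backup story below so painless.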

Writing Software

Emacs and AUCTeX: Emacs and LaTeX are a match made in heaven. I needed no special modifications, but I did add one keybinding to my Emacs init file to permit pasting from the browser into my buffer. This is it:

(global-set-key (kbd "M-V") 'x-clipboard-yank)

I also found it a lot easier on the eyes to work using a dark blue background and a light yellow foreground, instead of the glaring black-on-white of most word processors. Emacs accommodated that with the commands set-background-color and set-foreground-color.
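Those two commands can go straight into the init file alongside the keybinding above, so the color scheme survives restarts. A sketch, where the exact color names are stand-ins for whatever shades you prefer:

```
;; Dark background, light foreground -- easier on the eyes
;; for long writing sessions.  Color names are illustrative.
(set-background-color "dark blue")
(set-foreground-color "light yellow")
```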

Desktop Environment (KDE3 and Windowmaker)

KDE3 and Windowmaker Desktops: I really appreciated working on KDE3 and Windowmaker (I'm not a Gnome fan and like it less as time goes on). Advantages of both:

  1. Multiple desktops. For example, I had my browser and the bibliography on one desktop, the manuscript on another, and my email client on a third. That kept me focused on researching, writing, and communicating respectively, with no chance of distraction. As a writer, distraction is the biggest enemy.
  2. Custom keyboard shortcuts. On both I had custom keystrokes to switch between desktops, maximize, minimize, shade, maximize vertically, or hide application windows. So easy, so convenient. I never had to take my hands off the keyboard, which is the way I like it.
  3. Focus follows mouse. Windows and Mac users have no idea how useful this can be, but once you discover it, it's hard to go back. No more clicking between different programs: each one simply rises to meet your cursor as you work. A nice complement to the custom keyboard shortcuts.

Web Browser

I did tons of online research, and Opera remains my browser of choice after ten years of finding everything else not-quite-to-my-taste. Opera offers tabbed browsing of course, but also multiple search engines and even custom searches (any web page with a "search" button on it can be turned into something you can search from Opera directly), plus lots more. People are very particular about their browsers, and Opera is for me.

Backup Strategy

As mentioned above, the .tex and .bib files zip up to a very small size. That made it easy to write a little script that backs up all the manuscript files and dates the zip package. As you work and create backups, you develop a folder of dated archives, making disaster recovery pretty easy. Knowing that's been taken care of allows you to relax and write, which is essential. Besides the .tex and .bib files, the manuscript required 14 graphics files, but unlike commercial word processors, which embed them in every single version, LaTeX makes it easy to save a backup of the graphics files just once. Then you only need to back up the text files, which are what change over time. This is a big deal! Instead of multi-megabyte backup files for every single iteration, the Dictator's Handbook was only several hundred kilobytes per backup, making it easy to keep a full set of backups on a little USB key.

Here is the script:

#!/bin/bash
# Script to back up the Dictator's Handbook manuscript files,
# zip them up, put one copy of the zip file on a flashdrive
# at a known location, and email another one to myself.
# Foolproof, even for a dork like me.

bkupdate="$(date +%F)"
# Adjust these to taste -- the values below are examples.
bkupdir="$HOME/dictator"                  # where the .tex and .bib files live
bkupfile="dictator-backup-$bkupdate.zip"
towhom="me@example.com"                   # backup recipient

echo "Where is the USB key? (eg. /media/RWOOD-PEN)"
echo "Omit the trailing slash, please."
read bkupkeydir
echo "Backing up to $bkupkeydir"
echo "Current date is: $bkupdate"

# Begin the backup: all .tex and .bib files
echo "Zipping $bkupdir ..."
cd "$bkupdir"
zip -rv "$bkupfile" *.tex dictatorbiblio.bib
mv "$bkupfile" ~
cd ~

echo "Send to flash drive? (y/n)"
read answer
if [ "$answer" = "y" ]; then
    # Now put a copy on the flash drive
    echo "Checking for flash drive."
    if [ -d "$bkupkeydir" ]; then
        echo "Flashdrive found.  Copying $bkupfile to $bkupkeydir"
        cp "$bkupfile" "$bkupkeydir"
    else
        echo "Error.  Couldn't find drive.  Care to specify it?"
        read specifydrive
        if [ -n "$specifydrive" ]; then
            cp "$bkupfile" "$specifydrive/DictatorBackup/"
        fi
    fi
fi

echo "Webdav to offsite storage? (y/n)"
read answer
if [ "$answer" = "y" ]; then
    # Upload the file to a webdav account using curl, like so:
    # curl -u USER@ACCOUNT:PASSWORD -T filename https://DAVADDRESS
    echo "Uploading.  Hold onto your pants."
    curl -u NAME@ACCOUNT:PASSWORD -T "$bkupfile" https://PROVIDER
fi

echo "Send to email address? (y/n)"
read answer
if [ "$answer" = "y" ]; then
    # Mail a copy to myself using Kmail.  Body text is a fun fortune.
    echo "Mailing zip file to $towhom"
    kmail -s "DH Backup $bkupdate" --body "`fortune -o`" \
          --attach ~/"$bkupfile" "$towhom" 2>/dev/null
    echo "Mailing complete."
fi

# Badda boom, badda bing. We're done.
echo "Done."
exit 0

It's easy enough to run the script from a terminal, but KDE3 allows you to create custom buttons on your task bar (kicker). So it's no sweat to "Add non-KDE Action" as an applet. Choose the script as your executable, choose a happy icon, and you've got a lovely, clickable, custom backup mechanism. Fantastic.

The Website, Forum, and Blog


Not much to describe here, other than to point out that for a simple, five-page HTML site it was smarter to hand-code the HTML than to deal with some enormous content management system. I used minimal JavaScript to keep it light and fast: one script to swap images on the "Preview" and "Purchase" buttons, and another to provide the slideshow of five images. I found both online at Javascriptkit. I eschewed Google Analytics (which I've found to be slow elsewhere) for the simple and pleasant Statcounter, which has more than met my needs. That call to Statcounter is the only external call on the site, meaning no page is heavier than a kilobyte or two. Compare that to almost any modern, Web 2.0 site, whose pages now clock in at several hundred kilobytes with calls to several, or even many, external sources.

For kicks, I redid each page in a mobile version, keeping graphics limited and small. I think it works pretty well. Check it at


One novel way of organizing the mass of incoming news articles was to post them somewhere public, and that led to the development of a forum (a newsgroup, actually) for comments and discussion. This isn't the most efficient way of organizing notes, but it worked for me, and since I knew this book would lead to a forum anyway, it had to happen one way or another.

The forum is actually an INN news server, operating a set of newsgroups that don't replicate across Usenet. They're text-only, completely anonymous, and posts expire after 90 days. Perfect. Learning INN was no piece of cake, but I love newsreaders, the offline post-writing discipline, and the threaded discussions of Usenet. Web forums don't cut it for me. The remaining 99% of the world wants to access the Internet through a browser, though, so the forum needed a web interface.
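The 90-day expiry is the sort of thing INN handles in its expire.ctl file. A hypothetical line for a set of local groups (the group pattern is invented; the three numeric fields are keep, default, and purge times in days, per INN's expire.ctl documentation):

```
## Expire posts in the local groups after 90 days.
## (Hypothetical group pattern; "A" applies the rule to all articles.)
dictator.*:A:90:90:90
```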

The bulk of the news posting happens via an account named NewsFeeder. It runs the Newsbeuter RSS aggregator and a custom Perl script that feeds interesting RSS items to INN. Newsbeuter is a great text-only RSS aggregator; any interesting articles get piped to the custom script, which posts them to INN. Consider it an RSS-NNTP gateway with a wetware interface that hand-selects and curates the articles.
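I can't reproduce that Perl script here, but the gateway idea can be sketched in shell: wrap an RSS item's title, link, and body in a Usenet article and hand it to INN's inews program for posting. Everything in this sketch is invented for illustration: the function name, the group name, and the From: address are not from the real setup.

```shell
#!/bin/sh
# Hypothetical sketch of an RSS-to-NNTP gateway step.
# build_article wraps an RSS item in minimal article headers;
# the group name and From: address are made-up examples.
build_article() {   # $1 = subject, $2 = link; body text on stdin
    printf 'From: NewsFeeder <newsfeeder@example.invalid>\n'
    printf 'Newsgroups: dictator.news\n'
    printf 'Subject: %s\n\n' "$1"
    cat                       # the item body
    printf '\n%s\n' "$2"      # append the source link
}

# Posting step (commented out; it needs a running INN server):
# build_article "Some headline" "https://example.com/story" < body.txt | inews -h
```

The real script no doubt does more (deduplication, encoding cleanup), but the shape is the same: build an article, pipe it to inews.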

The NNTP-only interface was a dream for this nerd alone, and even semi-technical friends complained there was no web interface. I experimented with a project called Papercut, which accesses the MySQL database of a Phorum forum and allows posting and reading via NNTP and a newsreader. That sounded perfect for a low-volume forum like mine, and I liked Phorum's threaded view very much. But Phorum has been updated while Papercut has not, and the combination proved unworkable. I went instead with Newsportal, by Florian Amrhein. It's a brilliant PHP interface that simply spools from and posts to an INN server. It has some protections built in, probably wouldn't handle a high-volume server, and is very simple. But simple is what I wanted, and it's been great. Spectacular, in fact: I'm indebted to Mr. Amrhein.

Don't take my word for it; have a look at it right here.


The blog is running on Serendipity (S9Y). I first discovered it powering a PostgreSQL blog. I run Joomla elsewhere and thought it would be too heavy for this purpose, and I desperately wanted to avoid WordPress, both because I'm tired of seeing WordPress sites and because WordPress installations tend to accumulate security vulnerabilities. The more I checked into S9Y, the more impressed I was. It's lightweight, simple to manage and configure, quite snappy, and its modularity makes it easy to run the bare minimum of modules without bloating the system or introducing more vulnerabilities.

Lastly, the entire system is running on FreeBSD 10. What a lovely OS. I'd been building openSUSE VPS systems until then, but got stung when one OS hit end-of-life and the package repositories were taken offline. My server wasn't complete yet and I could download no more software. Fair enough: that's the deal when you choose a community distro; SUSE itself would have been supported. But I'd learned enough about FreeBSD by then to recognize its stunning security profile, and that appealed to me. I found a VPS provider – RockVPS – that offered FreeBSD VPSes, and I've never been happier.

