Optimizing for Offline Use
Draft Article - This article is still a work in progress
- Bringing it all Together
- News Feeds
So much of the modern Internet assumes we have ever-present, stable connectivity. But what if you don't? That question has driven me to optimize my life for offline use. Wherever possible, I try to find tools and strategies for working offline. In doing so I've found that I'm more productive and more able to work from anywhere I want, instead of being tied to wherever I have a connection. This article details some of my tools, use cases, and strategies for working online, even when I'm off.
One of the main things I look for in my tools is the ability to asynchronously pull data from the Internet to my computer for later use. This lets me use data connections opportunistically when they become available. For example, I have several cronjobs configured on my laptop that run at fairly frequent intervals to push and pull data from remote servers. The jobs are small scripts I have written. Each script does only one thing and is written to be tolerant of connection faults: either retrying a few times before giving up or "crashing" gracefully so that the next run can try again without issues.
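As a sketch of that fault-tolerance pattern, here is the kind of retry wrapper such scripts can be built around (the function and the retry count are illustrative, not my actual code):

```shell
#!/bin/sh
# Minimal retry wrapper: run a command, retrying up to $1 times before
# giving up, so a failed run leaves nothing broken for the next cron run.
retry() {
    max=$1; shift
    n=1
    while ! "$@"; do
        if [ "$n" -ge "$max" ]; then
            echo "giving up after $n attempts: $*" >&2
            return 1
        fi
        n=$((n + 1))
        sleep 1    # brief pause between attempts
    done
}

# A cronjob's network step might then be wrapped as, e.g.:
#   retry 3 offlineimap -o
```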
This is probably a good time to mention that when I say I use connections opportunistically, I mean I am always on the lookout for free public wireless networks. When doing this, it's important to have a secure way to connect to your remote services. Many of the services I use support TLS encryption (e.g. email over IMAPS). Some, however, do not (e.g. Usenet). To prevent someone from snooping on my connection I use a VPN tunnel to my house, where I have OpenVPN running on my router. This ensures that all traffic from my laptop is encrypted until it reaches my house; only then is it decrypted and sent in the clear over a "trusted" wired Internet connection. To make this process seamless and error-proof, I use an ifup.d script to establish the VPN connection as soon as a network connection comes up. This is not difficult to do and does not require running your own VPN server; there are many relatively inexpensive VPN providers (~$5/month). The details of that configuration are a subject unto themselves, though, and best saved for their own article. Feel free to email me if you want details before then.
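For illustration, the core of such a hook can be quite small. This sketch assumes a Debian-style ifupdown setup, which exports $IFACE to scripts in /etc/network/if-up.d/; the OpenVPN profile name "home.conf" is a placeholder, not my real configuration:

```shell
#!/bin/sh
# Sketch of an /etc/network/if-up.d/ VPN hook. Skip loopback and the
# tunnel itself, or the hook would fire on its own tunnel coming up.
vpn_wanted() {
    case "$1" in
        lo|tun*|tap*) return 1 ;;
        *)            return 0 ;;
    esac
}

if [ -n "$IFACE" ] && vpn_wanted "$IFACE"; then
    # Start OpenVPN only if it is not already running.
    pgrep -x openvpn >/dev/null 2>&1 || \
        openvpn --config /etc/openvpn/home.conf --daemon
fi
```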
Email is a huge part of my life. I probably write more email than anything else on a daily basis. Aside from IRC, email is my single most important source of new information. I subscribe to a multitude of mailing lists and newsletters, correspond with friends and family, and discuss a myriad of topics ranging from tech policy to what my wife and I want to do for dinner. Without email, my life would come to a screeching halt. For this reason, it's critical that I be able to work on my email at any time of day, regardless of my ability to connect to the Internet. This is why it was one of the first things I optimized for offline use.
Email is thankfully very simple to adapt to offline use. It's an old technology that predates the near-ubiquitous connectivity we enjoy today, so there are plenty of tools available and lots of documentation to help get things set up. I settled on four separate tools for my offline email experience: OfflineIMAP, IMAPFilter, msmtp-offline, and Mutt.
The Internet Message Access Protocol (IMAP) is a wonderful piece of work. It lets us interact with remote mailboxes via local mail clients, securely, without needing to log in to the remote system hosting the mail directly. One caveat for our setup, though, is that the typical IMAP use case assumes you have connectivity to the mail server whenever you want to interact with your mail. But what if I want to view my email while I'm offline? There's the Post Office Protocol (POP), but what if I want to use multiple folders and have those folders exist on both the remote and local sides? This is where OfflineIMAP comes in.
OfflineIMAP speaks IMAP just like a normal mail client would, but copies the messages to a local Maildir on your machine. Folders created within the Maildir are synchronized to the server and vice versa. You can then point your mail client at the local Maildir, which not only speeds up the client in some cases (you're not waiting on a remote connection before an action completes) but also lets you work while offline: the network connection is used opportunistically, when it's available, to pull mail locally for later reading. OfflineIMAP can't do everything, though. If you have folders you want to preserve, you probably also have filters that automatically sort incoming mail into them. For that we need another tool.
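For reference, a minimal ~/.offlineimaprc can look like the sketch below; the account name, host, and paths are placeholders for illustration, not my real configuration:

```
[general]
accounts = personal

[Account personal]
localrepository = personal-local
remoterepository = personal-remote

[Repository personal-local]
type = Maildir
localfolders = ~/Mail/personal

[Repository personal-remote]
type = IMAP
remotehost = mail.example.com
remoteuser = user@example.com
ssl = yes
```

With this in place, `offlineimap -o` performs a single two-way sync between the server and ~/Mail/personal.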
IMAPFilter is a utility for sorting mail based on rules written in the Lua programming language. It connects to a remote system via IMAP, the same way OfflineIMAP does, and filters the mail on the remote server according to rules in the IMAPFilter config file, which resides on your system. Originally I used procmail for this. It worked fine and was relatively simple to use, but its rules had to be stored on the remote server, which meant I needed a network connection in order to edit or add filter rules. Unfortunately, the moment I realized I needed a new rule was not always a moment I had a connection, and I'd forget about the rule by the time I was online again. So I went searching for an alternative: some way to keep rules locally on my system, much like mail clients such as Mozilla Thunderbird do, but in a more portable format, ideally a flat file I could easily back up (Thunderbird's rules must be exported from some kind of internal format in order to be backed up). My only complaint about IMAPFilter is that you need to write the rules in Lua. I've never bothered to learn much about the language, so the learning curve for the first few rules was steep. Once I had written one or two, however, I could reuse them and it became a matter of copy/paste. I wish there were a tool that used standard procmail syntax. I've been using IMAPFilter for about a year now and have generally been very happy with it.
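To show what one of those copy/paste-able rules looks like, here is a sketch of a ~/.imapfilter/config.lua entry; the server, username, and list address are placeholders, and if no password is given imapfilter will prompt for one:

```lua
-- Connect to the account that holds the mail to be filtered.
account = IMAP {
    server = 'mail.example.com',
    username = 'user@example.com',
    ssl = 'auto',
}

-- File mailing-list traffic into a "lists/debian" folder on the server.
messages = account.INBOX:contain_to('debian-user@lists.debian.org')
messages:move_messages(account['lists/debian'])
```

Each additional rule is just another `contain_*`/`move_messages` pair, which is why reuse quickly becomes copy/paste.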
Mutt is a command-line mail reader. It's my reader of choice for a number of reasons, among them its speed and configurability. I'm a heavy vim user and appreciate any application I can configure with keybindings that mimic vim's; Mutt allows me to do this. It also handles message threading better than any other mail client I've used. And it's lightweight: I don't have to wait for it to open, it doesn't occupy much of my system's RAM, and it can share screen space with the other applications in my workflow, within the same tmux session.
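A few ~/.muttrc lines show how this fits together; the Maildir path is a placeholder, and the bindings are just a taste of the vim-style setup:

```
# Read from the local Maildir that OfflineIMAP maintains.
set mbox_type = Maildir
set folder    = ~/Mail/personal
set spoolfile = +INBOX

# Thread-aware reading, vim-ish movement in the pager.
set sort = threads
bind pager j next-line
bind pager k previous-line
```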
Sending email while offline is relatively straightforward; there are many ways to do it. I chose msmtp-offline. It uses two executables, msmtp-offline and msmtp-queue, to do the work. msmtp-offline is a wrapper for msmtp which attempts to send the email; unlike msmtp, however, if the send fails it adds the message to a queue for sending later. That's where msmtp-queue comes in: it flushes the queue for the accounts defined in your msmtp configuration. It can flush all or some of the mail depending on how it is configured. I only have one SMTP server configured, so I have it flush the entire queue, which keeps things simple.
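To make the behavior concrete, here is a much-simplified sketch of the queue-on-failure pattern; this is an illustration, not the real msmtp-offline script, and the queue path and file naming are made up:

```shell
#!/bin/sh
# Try to hand a message to the send command; if that fails (e.g. we are
# offline), stash it in a queue directory for a later flush.
QUEUE_DIR="${QUEUE_DIR:-$HOME/.msmtp.queue}"

queue_or_send() {
    # "$@" is the send command (msmtp plus its flags); message on stdin.
    msg=$(cat)
    if printf '%s\n' "$msg" | "$@"; then
        return 0                     # delivered immediately
    fi
    mkdir -p "$QUEUE_DIR"            # offline: queue for the next flush
    printf '%s\n' "$msg" > "$QUEUE_DIR/$(date +%s).$$.mail"
}
```

A queue flush then just replays each queued file through msmtp and removes it on success, which is essentially what `msmtp-queue -f` handles for you.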
The fun part of all this is setting it up to take advantage of an Internet connection the moment one becomes available. You can configure msmtp-queue to run on a schedule, but you'll want it to run frequently enough that you don't miss a window of connectivity, and not so frequently that you waste CPU cycles. This is why I configured mine to run as an ifup action: when a network connection comes up, the ifup scripts run. I had previously configured this to automatically connect to my VPN server at all times. Using that experience, I added another entry to run `msmtp-queue -f`, attempting to flush the mail queue any time a network connection is detected. So far this has worked very well.
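Such a hook can be a one-liner. The sketch below assumes a Debian-style /etc/network/if-up.d/ layout, and the username "myuser" is a placeholder, not my actual setup:

```shell
#!/bin/sh
# /etc/network/if-up.d/flushmail: try to flush queued mail whenever an
# interface comes up.
[ "$IFACE" = lo ] && exit 0     # loopback is not real connectivity
# These hooks run as root; flush the queue as the user who owns it.
su - myuser -c 'msmtp-queue -f' || true
```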
Bringing it all Together
These tools are all neat by themselves, but together they really shine. I've combined them into a script that runs on my laptop every five minutes. The script first runs IMAPFilter to sort the mailboxes, then runs OfflineIMAP to pull my mail down to the Maildir on my laptop. This lets me configure Mutt to read mail out of my local Maildir any time I want, without worrying about whether I have a data connection.
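The whole script can be a sketch like this; the lock path is illustrative, and the `command -v` guards exist only so the sketch runs even where the tools aren't installed:

```shell
#!/bin/sh
# Filter on the server, then mirror the result locally. A mkdir-based
# lock keeps a slow run from overlapping the next cron run.
LOCK="${LOCK:-$HOME/.mailsync.lock}"

if ! mkdir "$LOCK" 2>/dev/null; then
    exit 0                           # previous run still going
fi
trap 'rmdir "$LOCK"' EXIT INT TERM

if command -v imapfilter >/dev/null 2>&1; then imapfilter; fi
if command -v offlineimap >/dev/null 2>&1; then offlineimap -o; fi
```

The crontab entry is then just something like `*/5 * * * * $HOME/bin/mailsync` (path illustrative).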
If you're interested in a selection of the configuration files I'm using for these, check out my Pub Directory.
News Feeds
For news I looked to the past. Well, Usenet is still active, just not as much as it once was. Usenet is a decentralized network of systems distributed across the Internet. Usenet messages, like email, can be opportunistically fetched for later use, and the client side of Usenet also fits within my workflow. In addition to Usenet I use RSS, which is great for the same reasons; instead of being decentralized, though, your client polls multiple remote systems for their feeds, which are then stored locally. Generally speaking, RSS works great for offline use.
The primary difference between Usenet and RSS from a client perspective is that with Usenet you generally get the entire message (unless the client is configured to pull message headers only), whereas some RSS feeds do not provide entire articles and require that you visit a remote website to read them. This is a problem I've yet to solve completely. For the time being, I've been able to find a selection of feeds which provide entire articles directly, and that gives me enough of what I need. If you know of good methods for fetching the linked articles in RSS feeds as well, I'd be interested in hearing about them.