Social Distancing Part 7 - Escape Hatch
It all started because of a failed Python upgrade. I began to wonder if I would be forced into an OS upgrade just to get Homebrew to behave - on a six-year-old machine - and that isn’t anyone’s idea of a good time.1
Now, I’ll admit, I have a slightly irrational dislike for all things Python, even as it has grown in popularity through being the tool of choice in fields like data science. The primary reason has almost everything to do with the environment. Compared with the Ruby equivalent - rbenv - the Python equivalents are still too fiddly for my taste. Although progress has been made over several years by Kenneth Reitz and others working on the language proper, in my experience it still2 requires some manual setup to get working right.
As the best way to figure all of this out is to try it, I took another look at making a dotfiles repo to contain some of this chaos. Like many people in a similar situation, I started on Bash and moved to Zsh right before getting Mojave up and running, which will make the transition to Catalina - and whatever comes after it - easier. It’s made from scratch, without frameworks like ohmyzsh or prezto, and is fairly straightforward.3
To my surprise, I found by testing it that the dependencies in my Brewfile had been upgraded to the point that the then-current Python was installed as well.
It should come in handy when I get my next machine. I originally planned on doing this as soon as the new 13” MacBook Pro was released, but COVID-19 scuttled those plans, so I’m waiting until the shipping times are back to something approaching normal - whenever that is - before making a purchase.
I did eventually upgrade to Mojave, but when the time came to do so, it was motivated by larger concerns - e.g., the potential loss of security updates if I stayed on High Sierra. ↩
The appearance of pipenv in official documentation is a really good sign…until you scroll down and see that virtualenv is still hanging on, a vestige of a different age. ↩
A Brewfile, a global .hushlogin, and - last but not least - my .zshrc, which contains aliases and a couple of functions alongside my configuration. ↩
Social Distancing Part 6 - (Not Quite) Language-Agnostic Interfaces for Words & Code
Inspired by Parker’s post on the same subject, I decided to write my own now that it’s been up and running for a few days. I say ‘not quite’ in the title because - as will become apparent - I only use a few languages to do most of the work.1 At the end of the process, I had entirely reorganized my Jekyll Scripts repository - now called Rake Jekyll - from a series of shell scripts into a single Rakefile. I also migrated my Reverse Job Posting to use Rake. As I worked, it got me thinking about how I approached the same task in other projects.2
Make for the printed word
Makefiles are used to generate output from a number of TeX sources (Redirection FTW). These are simpler and only run the input through a converter - usually pdfTeX or XeTeX - which creates the final PDF. I occasionally run the source text through an app like Pages, which has real grammar and spellcheck functions, just in case aspell, Gabe’s language module or Matt Might’s scripts miss something.
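A rule of that shape is only a couple of lines. This is a sketch with placeholder file names, not my actual sources:

```make
# Hypothetical rule: rebuild the PDF whenever the TeX source changes,
# redirecting the converter's chatty output to a log file.
# (Recipe lines must begin with a tab.)
paper.pdf: paper.tex
	xelatex paper.tex > build.log
```

Running make then does nothing when the PDF is already newer than its source, which is the whole point.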
My personal dictionary is pretty well customized and its mode support for TeX is smart enough to figure out what to match on, a handy feature I learned about from Dan Bader. I know there are nearly endless other ways of accomplishing this task, but it’s the approach I’ve found that works the best for me.
Rake on the web
Rakefiles are just containers for shell sequences - usually with a few modifications - and are easy to stick together. For the blog, I can now run rake draft to create a draft post on the fly.3
In RJP, the default task starts a web server so I can view the output. I tried to do the same thing for the blog, but invoking bundle exec in this way returned an error. So, for now, this task triggers an incremental build which lands in the _site directory.
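A draft task along these lines only takes a few lines of Rake. This is a hypothetical sketch - the paths and front matter are my assumptions, not the actual Rakefile:

```ruby
require "rake"
require "date"
require "fileutils"
include Rake::DSL

# Hypothetical `draft` task: drop a dated stub into Jekyll's _drafts folder.
task :draft do
  FileUtils.mkdir_p("_drafts")
  path = File.join("_drafts", "#{Date.today}-untitled.md")
  File.write(path, "---\nlayout: post\ntitle: Untitled\n---\n")
  puts "Created #{path}"
end

Rake::Task[:draft].invoke
```

In a real Rakefile the last line goes away, since rake draft handles the invocation.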
Reductio ad Xkcdum
As with any kind of automation, the inevitable question is - usually - ‘Did it save time?’
Well…maybe. After all, I wrote this post about it while using the tools to get it done. In the end, that’s what the site is for.
For a while, the working title for this post was ‘Glue Code’, a not-so-subtle reference to the small pieces that do most of the heavy lifting for me. ↩
If I were more versed in Regular Expression syntax, I could try to pass the title argument into the filename, but I haven’t gotten that far yet. ↩
Social Distancing Part 5 - IndieWeb, Inside Out
Kind of stuck at this point, I started thinking, ‘there’s got to be an easier way’. It turns out1 I already knew what needed to be done. However, as Jekyll is kind of the OG of static site generators, getting this kind of IndieWeb functionality (specifically, POSSE) into the site required a bit of additional finesse.
Why Don’t We Do It (on the Server)?
At first, I tried IFTTT. This was by far the easiest to set up. Just provide the URL of a feed to an applet and it will tweet new items on your behalf as they get added to the site. It does the job, and the results are nicely formatted, but it’s the equivalent of a black box. If IFTTT changes how applets talk to services or the developer who made it loses interest, there’s no guarantee it will continue working.2
Enter Bridgy, a service I remember reading about a while back but hadn’t looked at in a serious way for some time. From the site’s FAQ:
Bridgy periodically checks social networks for responses to your posts and links to your web site and sends them back to your site as webmentions. Bridgy also translates the responses to microformats so that your web site can fetch and parse them. Bridgy can also publish posts, comments, likes, and retweets from your web site to social networks. When you ask Bridgy to publish a post, it analyzes its microformats data and publishes it using the social network APIs.
This means I can add a webmention3 endpoint to my post template and when it is published, Bridgy will hit the Twitter API and send a tweet for me. I had previously set up - but never actually used - a separate Twitter account for this purpose - an idea I got from Casey Liss’ engine, Camel. However, I’ve found that implementing micropub style posting (think Twitter, or for the more technically inclined, Mastodon or Micro.blog) is easier if it all uses the same account.
While this is a good start, it’s not finished yet. The web interface to Bridgy gives you a POST request that will actually send the webmention, but for some reason, although I included the Twitter URL - from Bridgy - in my template, a tweet containing a link back to my post wasn’t showing up when I first tried to use this setup. As a result, there’s a tiny script - called webmention.sh - which uses curl to talk to Bridgy, which goes and pings Twitter.
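The script boils down to a single POST. Sketched in Ruby rather than curl - and with a made-up post URL, though the endpoint shown is Bridgy’s documented publish endpoint - it looks roughly like this:

```ruby
require "net/http"
require "uri"

# Bridgy's publish endpoint; "source" is the post to publish, "target" tells
# Bridgy which silo to publish to. The post URL below is a placeholder.
BRIDGY = URI("https://brid.gy/publish/webmention")

def webmention_params(post_url)
  { "source" => post_url, "target" => "https://brid.gy/publish/twitter" }
end

# Uncomment to actually send the webmention:
# Net::HTTP.post_form(BRIDGY, webmention_params("https://example.com/my-post"))
```

The curl version is the same two form fields in a single command.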
Ideally, I would like to automate this step, which would allow me to re-enable autopublishing in Netlify. At that point, all that would be required to tweet new items from the feed would be a git push. Webmention.app seems like the best option for my purposes, and I’ve done some testing with it for a few days. It integrates with Netlify’s webhooks feature, so I can send a webmention automatically when each deploy succeeds.4
It would also mean I don’t need to hand-edit my script to add the URL of each new post. I just need to figure out how the POST request should be formatted and it should be ready to go.
Another argument in favor of, ‘If you want something done, do it yourself.’ ↩
Remy Sharp, who runs the service, along with users in the Netlify Community forum, has been remarkably patient in answering my silly questions as they come up, which has been helpful as I try to get this working right. ↩
Social Distancing Part 4 - When Government (Almost) Gets It Right
So it has come down to this. A topic I thought I would never, ever write about here. But, through a series of mostly COVID-19 related events, here we are.
Politics. Specifically, the United States Census.
It happens every 10 years, but honestly I’ve never given it much thought. As with voting, time has a way of grabbing your attention at moments that used to just pass by.1 Because I had some extra time on my hands, I even applied to work at a local office in my district. For whatever reason, this exercise didn’t pan out. Also, this was before coronavirus, when you could do amazing things like actually work with other people in the same place without social distancing.
It started on YouTube. I don’t know if they’ve done it in the past, but the people at Sesame Workshop put together a short video2 to encourage everyone to respond to the Census. Popular culture tends to depict the process as kind of a drag and an artifact of a bygone era, so getting everyone on board requires getting a little creative.
The result is brilliant. Short, to the point, free of political bias, and presented in a way that makes even little kids feel like they’re part of the process.3
With “stay-at-home” orders in place the past few weeks, it became clear that people wouldn’t be canvassing the neighborhood anytime soon. That raises the question, ‘could the government actually pull this off online?’
Surprisingly, the answer seems to be yes - with the exception of one quirk I’ll get to in a moment. Just go to the website and click “Start Questionnaire”. Like they say, the process literally takes 10 minutes and anyone should be able to get through it.
Looking at the site’s design, it seems to be the work of the USDS, who, along with 18F are trying to keep the Federal government’s technology infrastructure in running order. In some ways, I’m reminded of the UK’s equivalent, GOV.UK, although the US hasn’t yet put all of its sites under one roof. When so much of government feels like it’s in disarray, it’s good to know there are at least a few experts left, even if their work goes unnoticed most of the time.
That is, until I clicked submit — whereupon the site deleted my responses and made me start all over again.
Although YouTube was around in 2010, I suspect it - and social media as a whole - hadn’t yet become an integral part of how brands reach their target market. ↩
Social Distancing Part 3 - A Better Linkblog
For as long as I’ve had a blog of any kind, the idea of quick, short-form writing - sometimes called linkblogging - has been an area of interest. I first remember seeing this concept sometime back in 2006, when I found my way to John Gruber’s long running site, Daring Fireball. His being the gold standard for this kind of thing, I found myself wanting to do likewise. For interesting articles that I find through RSS or just browsing around the web, it’s convenient to have an outlet for the stray thought that passes through my mind as I’m reading.
My first attempt at this idea involved using a link from within the text of a post to the article I was commenting on. Easy enough, but that was different from most other blogs of this style that I read, which used the title to fill that role. Furthermore, Jekyll doesn’t easily lend itself to that use case without a number of - what I consider to be ugly - modifications to your templates.
Later on, I had a layout where the anchor containing the page.title variable pointed to the link destination, but this was not how the people I was reading built their sites, and more importantly, it was error prone. For a little while, I tried using Jekyll’s support for collections to get the job done, but this turned out to be no better than juggling destination links and permalinks. Collections are really meant for similar, related items, and other than the fact that these were all links, they didn’t fit the bill.
This brief trip through history brings me to today and what I hope is a better long term solution. Jekyll supports the concept of a _data directory where you can define custom variables and data that can be accessed through Liquid templates. Now, I have a YAML1 file - naturally, it’s called links - that can simply be added to as I come across interesting things I want to write about but that don’t merit a full post. Each time the site is updated, Jekyll reads the contents of this file and populates the Links page with its contents.
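As a sketch, an entry in that file might look like this - the field names are my guesses, not necessarily the ones in the real links file:

```yaml
# _data/links.yml (hypothetical entry)
- title: An Interesting Article
  url: https://example.com/article
  comment: The stray thought that passed through my mind while reading.
```

A Liquid template can then loop over it with {% for link in site.data.links %} and render each title as an anchor pointing at link.url.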
Other formats include JSON, CSV, and TSV. ↩
Social Distancing Part 2 - Adventures in Continuous Integration
It all sounded so simple.1 At least this is what I told myself when I decided to move off of my previous host and on to Netlify. It was going to be great - almost all of the capabilities of a VPS for free, but I didn’t have to manage it.
This kind of setup is more common these days and works well if you want to get a site up and running quickly without having to fiddle with configuring a server.2 In my case, I had this blog on DreamHost and an existing microsite (my Reverse Job Posting) over on GitHub Pages but wanted more control over things like HTTP headers and redirects.
Around that time, I watched John Wilander’s presentation from WWDC 18 and Scott Helme’s from dotSecurity 2016 and thought, ‘I can do this!’
Netlify even has a sandbox environment, called the “Playground” where you can test out rules to make sure they work. Take those rules, export them in a TOML file - Netlify’s format of choice for configuration data - upload it to a domain, and that should be it…
Not quite. Enter Content Security Policy.
In theory, the pitch for CSP is awesome: security configuration changes that are really only a few key-value pairs. You could argue that this is overkill - that for a single page application (no, not that kind, although I do have a Service Worker) it’s needlessly complicated - but I was too far into “learning mode” to think about that, so I pressed on. The problem is, if you don’t do it correctly, parts of your site will disappear3 and you might not know until it’s live on the Internet.
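For a sense of scale, a minimal policy really is just a header of key-value pairs. This one is illustrative, not my actual policy:

```
Content-Security-Policy: default-src 'self'; style-src 'self'; img-src 'self' https:
```

Each directive lists the origins a resource type may load from, and anything not matching gets blocked - which is exactly where the disappearing act comes from.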
In the past, this wasn’t an issue for me. I ran rake locally, then uploaded the files, and that was it. Well, in a CI scenario, that’s a problem. If, for example, I forgot to run the build task or didn’t set CSP to report-only, raw Markdown might get displayed, things wouldn’t show up, or the site would refuse to accept changes in a stylesheet.
It was infuriating.
After many false starts (my repo’s history for this time period is a mess), and sleeping on the problem for a few days, I finally got this working. Long term, this will allow me to investigate how I can use Netlify’s AWS Lambda integration to send off a tweet whenever a new post is published.4
Ambitious to be sure, especially given how much fumbling about I did at the beginning - but now I’m halfway there. That’s a win.
Famous last words. ↩
If you have external scripts from somewhere else but specify just the default source - e.g., ‘self’, meaning only load scripts from my domain - those other resources will refuse to load no matter what you do. It is possible to list a resource as ‘none’, but not loading any URLs of a given type isn’t all that common. ↩
From what I can tell, deploying via the CI method is a requirement if you want to use functions. ↩
Social Distancing Part 1 - Hello, Zsh!
This is what (I hope) will be a series of posts about what I’m doing to keep my mind occupied while practicing COVID-19 induced social distancing. When you work for a public library (but are closed to the public), just about everything slows to a crawl.
The writing is on the wall for Bash in macOS. It’s really old, and probably won’t be updated because of terms set forth in the GPL Version 3.1
If you are sufficiently determined, you can upgrade to Bash 5 - but you’re on your own should you decide to go that route. The default shell in Catalina is now Zsh, and if you launch Bash on a Mac running 10.15, you’ll get this friendly message2:
The default interactive shell is now zsh. To update your account to use zsh, please run `chsh -s /bin/zsh`. For more details, please visit https://support.apple.com/kb/HT208050.
In my case, a combination of Apple’s instructions, Armin’s book and a handful of Twitter conversations was enough to make it through this transition. My setup isn’t that complicated and making the move now means some of the privacy related changes in macOS will be easier to digest before I even encounter Catalina.3
Having my configuration backed up doesn’t hurt either.
It’s licensed under the GPL Version 2, but the program itself is now at version 3.2. That’s not confusing at all. ↩
Feel free to suppress this warning with export BASH_SILENCE_DEPRECATION_WARNING=1, but honestly, that feels like fighting the old war. ↩
Drafts in Six Lines of Code
A few years ago, I came across a presentation1 that Brian Kernighan gave back in 2009 called The Elements of Programming Style, a title borrowed from his book of the same name. In it, he made one point which I think applies equally well whether you are writing words or code.
So, there’s a balance that you have to find between too clever and not clever enough - being kind of dim about what you’re doing
With this in mind, I took another look at the “drafts” script I mentioned in my last post. Thinking about the problem a bit more, I’ve found that Kernighan’s observation was precisely the problem I was having all along. Although I could read the script, and understand what it was doing, I didn’t really grasp why.
My knowledge of Make is not very deep and AWK even less so. As I hadn’t thought about this for some time and it’s usually easier to deal with the devil you know than the one you don’t, I ultimately decided to rewrite it in Ruby.
5 minutes of work, 6 lines of code, and (hopefully) the incentive to write more often.
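For the curious, six lines of Ruby is about what such a script needs. This is a reconstruction under my own assumptions about paths and front matter, not the actual script:

```ruby
require "fileutils"

# Hypothetical reconstruction of a "draft" script: take a title from the
# command line, slugify it, and write a stub post into _drafts.
title = ARGV.fetch(0, "untitled")
slug = title.downcase.strip.gsub(/\s+/, "-")
FileUtils.mkdir_p("_drafts")
path = "_drafts/#{slug}.md"
File.write(path, "---\ntitle: #{title}\n---\n")
puts path
```

No Make, no AWK - just strings and a file write, which is the kind of code I can still read six months later.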
I am always impressed by how “down to Earth” Kernighan is every time I hear him speak. ↩
With the exception of my “draft” script, I’ve moved all my shell scripts for this site into a Makefile. Originally, I used gulp-shell, but was quickly reminded of the sheer number of packages you need to do even the simplest operations in Node. At that point, I decided to find an alternative solution.
Although I hadn’t used it before, I found the process to be pretty smooth - aided by this Hacker News comment where I asked a question about it. In simple terms, a Makefile typically contains a set of “rules” in a particular format, and shell commands can be mapped to each of them. These rules perform actions on files or directories (a.k.a. their “targets”).
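The anatomy of a rule is easiest to see in a sketch. The target and command here are placeholders, not lifted from my Makefile:

```make
# target: prerequisites
#     (tab-indented shell commands)
build: _posts
	bundle exec jekyll build
```

When the target is out of date relative to its prerequisites, make runs the commands beneath it; otherwise it does nothing.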
Converting the “draft” script will take a bit longer, as it’s made up of multiple lines and has variables, redirection, and a sprinkling of AWK all thrown together. Most of the examples I’ve found are far simpler, which is helpful, but they are usually written with a language like C++ in mind. As a result, I’m kind of in the dark about how all of the pieces fit together.
In short, it’s the one drawback of being something of an accidental programmer. I suspect there’s more to write on this particular topic, but that’s an exercise I’ll save for another day.
Audible, Goodreads, and a Very Long List
Because all the cool kids are doing it, I’ve compiled a list of all the books I’ve read in the past year. You might think, ‘Who reads anymore?’, or ‘Who has the time?’ Surprisingly, this was easier than I thought and only required three things:
An Audible subscription
Audiobooks have come a long way from the books on tape I remember listening to growing up. Long books felt shorter, and because there were no visuals involved, I found it easier to retain information, especially when it came to more complicated topics. Still, this approach is not without its drawbacks: the basic account only gives you 1 credit per month, and I’ve never really been tempted by the Audible Originals. However, the selection is really good, and since they’re an Amazon company, the service is always up.
Well worth the $15.
A Goodreads account
I have a note in iCloud (and a Reminders list) to keep track of things, but for now, the canonical record is located here. The website may be kind of ugly, but it does a decent job keeping track of what I’ve read and what my next pick should be.
Time to listen
This is a big one — having a walkable commute in addition to a working environment where I can listen while doing other things has been a great aid to this process.
With that said, here’s what I read in 2018.
Young Money: Inside the Hidden World of Wall Street’s Post-Crash Recruits by Kevin Roose
The Big Short: Inside the Doomsday Machine by Michael Lewis
The New CSS Layout by Rachel Andrew
Going Offline by Jeremy Keith
Too Big to Fail: The Inside Story of How Wall Street and Washington Fought to Save the Financial System from Crisis – and Themselves by Andrew Ross Sorkin
Accessibility for Everyone by Laura Kalbag
Flash Boys: A Wall Street Revolt by Michael Lewis
Becoming Steve Jobs: The Evolution of a Reckless Upstart into a Visionary Leader by Brent Schlender and Rick Tetzeli
A Colossal Failure of Common Sense: The Inside Story of the Collapse of Lehman Brothers by Lawrence G. McDonald
Straight to Hell: True Tales of Deviance, Debauchery, and Billion-Dollar Deals by John LeFevre
The Accidental Billionaires: The Founding of Facebook: A Tale of Sex, Money, Genius, and Betrayal by Ben Mezrich
Steve Jobs by Walter Isaacson
Into Thin Air: A Personal Account of the Mount Everest Disaster by Jon Krakauer
What Happened by Hillary Clinton
The Fifth Risk by Michael Lewis
The Coming Storm by Michael Lewis
A Brief Reprieve
I don’t know how long it will remain, but the potential endgame for Safari extensions described in my last post may be further off than originally thought. After all, Apple isn’t Bare Bones, who extensively document how to restore prior behaviors in their apps - usually by editing a property list. As time goes on, I would expect the number of hurdles one must jump through in order to continue using legacy extensions to increase, but the inclusion of the “unsigned” option in the Develop menu gives me hope.
For now, the search for Chrome versions of all my extensions continues apace, as I don’t know how Safari 12’s Intelligent Tracking Prevention might reduce the need for some of the customizations I’ve made.
Maybe I’m just out of touch or I’m in too deep…
The Future of Safari Extensions
In most technology circles, hearing somebody say “the Future of X” when referring to a given technology is a bad sign. It’s the death knell; it usually means time is running out and a big change is on the horizon. With this in mind, the Safari team announced at WWDC that legacy .safariextz extensions distributed outside of the Extensions Gallery will not load in Safari 12. While they will accept new submissions until the end of this year, it’s clear that App Extensions and Content Blockers are the future of extensibility in Safari.
This wouldn’t be so bad if the app install/uninstall experience on the Mac weren’t terrible - one thing that I actually miss from the days when I used Windows regularly. Microsoft gets flak for using a centralized database (the Registry) to handle this, but for the most part it does the job.
Of course, there’s a reason why I don’t run as many extensions in Chrome: The web store is a mess. Google doesn’t curate anything and it shows.
The 60% Keyboard
Unless there’s a change coming next week, I might need to consider something like this - or bite the bullet and dive headfirst into the Wirecutter’s keyboard reviews. My current MacBook Pro will be 5 in December, so it’s getting to the point where I should start thinking about what comes next.
With all of the hand-wringing over the reliability of Apple’s current keyboard designs, it seems appropriate to take stock of where things stand today in our house. The keyboard on the 2017 MacBook has held up well so far, but it is currently shared by my parents and doesn’t really get all that much use. However, my brother is looking to replace a truly awful HP Envy and is considering his first Mac.
I suspect that telling him a separate keyboard purchase might be something he should spend money on will not go over well.
I can almost hear the yelling now…all the way from Charlotte.
Where We’re Going, We Don’t Need Subdomains
Subdomains are out and collections are in, which means a proper linkblog can live at the same domain as the rest of the site. It took some time - reading docs, asking questions on the Jekyll Talk forum, and watching a CloudCannon tutorial - but now, interesting links have their own index (naturally, it’s /links) and associated URLs.
This also solves a problem I was never able to find a good solution for the first time around: linking to external sites from the title of a post. Octopress, a Jekyll fork by Brandon Mathis, implemented this feature in the form of a plugin called Linkblog, but unfortunately it hasn’t seen an update since 2015, when Brandon announced he was embarking on a ground-up rewrite of the code.
Maybe, just maybe, this will be the push I need to whittle down the Siracusian number of tabs I have open on my phone.
At last count, it was 106.
SSH for Humans
Let’s Encrypt should really think about getting into the business of managing SSH keys - it would save everyone (i.e., me) the headache of looking up credentials which haven’t been used since…
Actually, I have no idea. Whoops.
While it would be nice if DreamHost supported Ed25519, 4096 bits will have to do for now. Transmit 5 can use them, or even generate keys on its own - thanks to the instructions in this video from @antichrista - but for whatever reason it won’t import my public key.
I don’t directly SCP or SFTP all that often, so it isn’t a dealbreaker. Gulp handles my deploy script just fine.
Sites on the Cheap
As this site’s content has become a bit of a running commentary regarding its own creation, let’s continue with that theme.
I really like what @parkr did with his Stuff repo - just a simple microsite for collecting links to interesting things he finds - so I decided to do the same. I called it Links (because why not), just to avoid any namespace conflicts with other things already in my Home directory. Of course, it needs to live somewhere, and since I like to keep things as simple as possible, I decided to use a subdomain (links.chrisfinazzo.com).
I seem to remember @imathis built a plugin for doing linkblog style entries, but I don’t know if it was ever ported to Jekyll 3. Parker created a bunch of tasks for doing these kinds of things in a Rakefile, so I’ll see if that works out better. It might even give me some ideas for how to build out my Jekyll Scripts repo, which itself could use an update one of these days.
My Service Worker ended up being a few lines shorter than I thought, but it’s now live and caching a few pages on this site. Kudos to Jeremy Keith for writing a book that’s approachable, understandable, and a quick read that lays out the basics of how this all works.
Security by Deletion
I guess there’s a first time for everything, so imagine my surprise when an email from GitHub about a security warning showed up in my Inbox this past week.
Granted, the Node ecosystem has a large number of modules, each with their own dependencies which may have security vulnerabilities from time to time. Nothing really surprising about that…with one exception.
The specific vulnerability is in a sub-dependency of a module I don’t control, and the supposed fix is to bump the version number in package-lock.json. However, this file is automatically generated by npm install and is a way to ensure you install the correct version of a module. The prevailing opinion, based on what I’ve read, is that tweaking the file on your own is not recommended unless you are really confident in what you’re doing.
My fix? Remove Workbox and rewrite my Service Worker by hand. In total, it’s 63 lines of code, most of which I’d already written for Minima.
Note to self: Having invalid markup in a Liquid template makes HTTP caches go crazy. Also, service workers are still weird (and awesome).
Counting down the hours until @adactio’s book comes out tomorrow.
A Desk and a Chair
I need a new desk and a new chair. The desk I’m sitting at is at least 20 years old and the chair - which originally belonged to a child’s school desk - is even older than that.
Can it really be that hard to find a suitable replacement for each of these items? I don’t think so and I’m not even especially picky about it. At a minimum the desk should be:
a) Less than 50” wide. My current desk (a cheap, completely unremarkable Staples thing with a crack running along the underside of its surface) is 48” and in the corner of the room. I don’t have the space for something bigger.
b) Have at least one drawer. Somewhere. I don’t know who designs this kind of office furniture, but at some point in the last 20 years, somebody got it into their head that all people really need or want is just a big, wide area to put stuff on. I have 4 external hard drives, my headphones, and a bunch of other small items spread across a two level cabinet in the current desk. Do they all need to be on top of my desk right this second?
Other than that, just about everything else is negotiable. The Wirecutter’s pick of the Jarvis Bamboo is nice, and I’ve never tried a standing desk before, but I don’t know if I want to spend that much. “Writing” desks are awkward and I would like to stay away from that if I possibly can, but they’re cheaper than most other options I’ve seen.
The chair part of the equation is easier (mostly because of cost). Arms, along with reasonable back support, and that’s really all that I think I need. The HON Exposure (another Wirecutter selection - for no particular reason, that’s just where I started my search) would probably do the job, but I wonder about its durability if I’m sitting in it for long stretches of time. Although anything would likely be an improvement over the tiny cushion that is on my current chair. I don’t know the exact price, but I do know it was cheap.
Lumbar pillows exist, and might fix some of the back issues I’ve dealt with from sitting in such an awful chair, but it just feels like a Band-Aid and another excuse to not address the problem in its entirety. I’d rather pay slightly more for a chair that can be adjusted until I feel comfortable in it.
Drunk with Power Management
Just when you get to the point where you think everything is ready to go, with the “i’s” dotted and “t’s” crossed, nothing can possibly go wrong and…no.
Go home computer, today is not your day.
Adventures in rsync
As I was saying, wait…what was I saying? Oh yes, something about removing obstacles to writing.
It helps if you rsync files to the correct web root and haven’t borked secured hosting in the process. Of course it’s Friday the 13th.
Minimum Viable (Writing) Product
You say you want to write for a living, just do it.
I had a phone call recently with someone I met through networking, and he said these words to me at the end of our conversation. For the longest time, I spent most of my energy trying to get all of the technical parts of a site together before actually putting pen to paper - or, in this case, fingers to keyboard.
This got me thinking about my Reverse Job Posting and what went into building that site. Is it the most attractive site possible? No. Far from it. However, it is the smallest possible thing I could think of in order to give people a jumping off point to find programming projects and writing samples I’ve worked on. Altogether, it took me a week from start to finish.
Without realizing it, trying to come up with the most technically interesting solution was a barrier that kept me from writing. Playing around with raw HTML or alternative blogging engines was an interesting exercise, but what it comes down to is that I was just “spinning my wheels”.
With that lesson in mind, I’m back to using Jekyll with a couple of scripts to make the writing process easier.
Let’s do this thing.