- Django Ditto and archiving your stuff
I recently made a collection of apps for the Django web framework, called Django Ditto. It’s for grabbing your photos from Flickr, your tweets from Twitter and your bookmarks from Pinboard, and mirroring them all on your own website.
It can save data about:
- Flickr photos/videos
- Flickr photosets
- Pinboard bookmarks
- Twitter tweets
- Twitter favorites
Hopefully it’ll do more in the future.
It can optionally archive copies of your original Flickr photo and video files, and images attached to tweets. It can’t save videos from Twitter as the API no longer allows access to original video files, only a stream.
It can do this for multiple Flickr, Twitter and Pinboard accounts. Private photos, tweets and bookmarks are saved, but Ditto’s views and templates only display public ones; private items are only visible in the Django Admin screens.
You can go and look at the code on GitHub, the documentation on Read the Docs, and the demonstration site.
This is only of use to people who are developers, and who want to use Django. It’d be nice to make something like this that’s easily usable by non-coders. Maybe one day.
Why did I make this?
I’ve been intending to rebuild my own website for years but the site’s become so large and complicated that this is a daunting feat. When planning to re-make the whole thing in Django, rather than the current nest of Movable Type and PHP, it made sense to split the task into smaller chunks of work.
The first was to combine the aggregation of Flickr photos, tweets, etc into one system, rather than the differing ways they’re sort-of captured at the moment. I found an existing Django app that did some of this, django-syncr, and started updating it in 2011, but soon ground to a halt through a combination of work and boredom.
The following year I started again, with a new project, django-archivr, using the bits of django-syncr that I liked but, again, I couldn’t sustain the momentum for more than a couple of months.
Last year I started for a third time, with Django Ditto, and through a combination of greater free time and greater bloody-mindedness, I’ve got somewhere. There’s plenty I want to add, a prospect that doesn’t fill me with joy at the moment, as it’s been a grind, but this is a good start.
As with so many of my projects (like that Ansible stuff, or Twelescreen, or the Mappiness chart), I’ve overdone it. Ditto is probably way over-engineered, and I certainly spent longer than necessary refactoring various bits (which might have improved them).
For example, unlike the previous two abandoned projects, Django Ditto can archive multiple Flickr, Twitter and Pinboard accounts. I don’t need this functionality myself, but it seemed like it could be useful, and so I did the extra work. It’s the kind of feature that’s easier to build in from the start rather than retroactively. If you need it. Maybe someone will.
As ever, I feel the code is terrible, an embarrassment. I’m too much of a self-taught, self-doubting, solo programmer to have much confidence in my skills. But I’ve learned a lot, and some of that purposely; I want my personal projects to stretch me, and give me a chance to learn things that might be useful for paying work. This is the first project on which:
- I’ve learned how to package a Python module for PyPI (which will mean being stricter with myself about release and version numbering).
- I’ve written more tests than I ever have before (I’m sure some are awful, but still).
- I’ve learned how to use tox to run the tests on multiple versions of Python and Django.
- I’ve used Travis to run the tests when new code is pushed to GitHub.
- I’ve written documentation using Sphinx to go on Read the Docs, rather than relying on a single overly-long README.
- I’ve made a demonstration site using the latest release.
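To give an idea of what the tox part involves, the config is roughly like this (a minimal sketch, not Ditto’s actual setup; the environment list, version pins and the runtests.py script are all placeholders):

```ini
# A minimal sketch, not Ditto's real config; versions and the
# runtests.py script are placeholders.
[tox]
envlist = py27-django{18,19}, py35-django{18,19}

[testenv]
deps =
    django18: Django >= 1.8, < 1.9
    django19: Django >= 1.9, < 1.10
commands = python runtests.py
```

Running `tox` then builds a separate virtualenv for each Python/Django combination and runs the tests in every one.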
None of which is groundbreaking, but it’s mostly new to me. It’s also often quite dull, more dull than it seems a personal, free-time project should be. But trying to improve one’s professional skills is sometimes dull, or frustrating — if it was all easy and fun I’m not sure I’d be learning as much. At least, that’s how I’m justifying spending days writing documentation for something no one else might ever use.
There’s another “Why?” question: “Why does anyone need to mirror all their tweets and photos etc on their own site?” Maybe if one’s Flickr photos only exist on Flickr, it makes sense to make copies of those precious memories. But does anyone need to copy all of their Tweets? Or bookmarks? Or YouTube favourites? Or Foursquare check-ins? Who’s going to look at or care about any of that? Maybe one’s “digital wake” or “digital exhaust” should remain as ephemeral as those metaphors suggest.
I’ve long believed that we should have control over our own digital “stuff”, no matter which commercial service we post it on. In my ideal world, I’ve often thought, everyone would have their own website that would contain copies of everything they posted elsewhere. You should own the photos and jokes and thoughts and videos and events you post onto platforms controlled by companies over which you have no control, that might suddenly vanish. These platforms provide great services, with network effects that achieve so much more than posting things solely on your own website could. But it feels wrong to me to give it all away, sending it into the corporate-controlled ether, without maintaining your own copy.
At least, that’s what I believed, 100%, a few years ago when I first started on all this. Since then, part of me has become less sure. Is it essential to have copies of all this stuff? And if so, does it need to be hosted online, on your website? If you can download a copy of your material, maybe that’s enough (e.g., Twitter’s downloadable archive of your tweets is pretty good).
I still firmly believe that all this stuff needs to be archived permanently somewhere, in a browsable way that’s as close to the original experience as possible, even if that’s difficult. Tweets may seem like ephemeral nonsense, but some of the most fascinating ancient discoveries are “unimportant” things that, at the time, no one would have thought worth keeping.
But is it important to archive all your own ephemeral digital stuff, and do so publicly? Running your own website is often a pain in the arse, and a website that relies on many external APIs and services even more so. Plenty of friends who also work in this field, perfectly capable of building their own website, have no desire to do so, or if they do, keep it as simple as possible.
But I’ve tried not to think too much about these doubts recently. Yes, it’s important to own your material, your self-expression. That’s what I keep telling myself, or I’d never have got this far. Stay on target. Stay on the bus. Archive everything. Publish it. Worry about whether it’s worthwhile later.
- The Shaolin Film Club
Back in the mid-90s, when I used to go out, I went to an evening called The Shaolin Film Club two or three times. Twenty years on and it barely exists online, even as a phrase. So I thought I’d aggregate what I have so that this ephemeral thing lives a little longer.
Because I’m a selective hoarder I discovered I still have my membership card, which I think got me £1 off subsequent admissions.
I also found an email from a friend describing one of the nights in 1996:
OCTOBER 11, FRIDAY
8-9 happy hour drink special half price beer
9 pm One Armed Boxer…the sequel
11 pm a jackie chan flick
dj’s spinning hip hop, breakbeats, and funk… playstation/ saturn games, and the retro arcade (including pong, atari, nes, jaguar, and intelivision)
148 Charing Cross Road above the book case. Tottenham Ct Tube it’s right on the corner, the building with the huge book mural on it.
doors open at 8pm addmission £4 or £5
0171 209 3549
I’m not sure if that’s written by my friend or whoever ran the club. But it gives an idea of the flavour. I must have gone at least twice as I also remember seeing a Lone Wolf and Cub film there. I should add that it’s not like I’m a huge fan of these films, and probably haven’t seen any apart from those I saw at the Shaolin Film Club. But it was an interesting, friendly and fun place to go.
I can only find one person on the web describing going, AshRa posting in a Kung-Fu Films thread at Dissensus on 13 June 2005:
Did anybody used to go to the Shaolin film club in London around ‘95 - 2000…? The main venue was upstairs in a disused building at the top of Charing Cross Road (seem to remember it was pretty near Silverfish but i’ve not been back to London for years and my memory is failing!) They would show a shaolin double bill interspersed with Tekken tournaments plus music & food (CUP NOODLES!) in another room.
It was my favourite night out for ages when I lived down there but it disappeared for a while and came back to club 333 for a couple of nights, but the magic just wasn’t there any more!
And, for completeness, here’s Matthew Presley in the rec.music.hip-hop Usenet newsgroup asking about it on 18 September 1998:
This is for Uk headz…..sorry everyone else..
Anyone in London know about Shaolin Film Club.
Like a Hip-hop club with Martial arts flicks etc…
148 Charing Cross Road no longer exists — it was flattened to make way for Crossrail. It used to be the northern-most shop on the eastern side before you got to Centre Point. The closest photo to 1996 I could find was from 2008’s Google Street View sweep, when “the book case” had made way for Mr Topper’s haircuts and, downstairs, Orbital Comics. Although both may have left the building by 2008.
The Shaolin nights were on the first floor of the Victorian(?) building. The front room, overlooking Charing Cross Road, had the “retro arcade” set up around the edges, a DJ, and cans of drink and noodles for sale. The back room, which I remember being larger, had the films projected towards the back wall, with chairs set up for the audience. Between and/or before the films, Tekken (according to AshRa) was projected instead.
The only other mentions I can find now include a record of the Shaolin Film Club being awarded a grant of £4,050 by Arts Council England on 19 March 1997.
And then these mentions of whatever the club became after moving to the 333 in Shoreditch:
‘Chosen Few: clubs’, Independent, 23 November 1997:
Clubocular @ The Blue Note N1 22.00 - 05.00 pounds 10/pounds 8 concs
A night exploring the crossover between film, music and graphics with influential film scores set against a diverse backdrop of contemporary soundscapes. Contributors to the night include Faze Action’s Simon Lee, David Arnold and The Shaolin Film Club. This night runs in conjunction with Ocularis, a multi-media experience at The Lux (box office 0171 684 0201).
‘The 333’ at Thee Chronicles ov Jstevekane, 4 March 2010:
…in the early days off the 333 i attended on a regular basis at the time there was a regular Hip Hop breaks night at this club which i used to attend which had a DJ named 33 1/3 other nights i attended included “OMSK” and the “Shaolin film club” at which my Kung fu teacher did a Kung Fu demostration.
If you have any other information or memories about the club, let me know.
- Everything had to be new
I enjoyed Luc Sante’s ‘The Birth of Bohemia in Paris’ in the New York Review of Books from 22 October 2015. The way 19th century Parisian artists began creating fleeting, odd fashions sounds quite familiar to modern ears.
[Artistic bohemia’s] earliest manifestation was hatched around 1818 by the students of Guillaume Guillon-Lethière, a painter…
They … launched, perpetrated, and squelched dozens of fads, in a way that appears to have had few precedents. There was first a medieval fad, countering the prevailing obsession with Greece and Rome, which began with them reading cheap romances and soon saw them wearing satin jerkins and gigot sleeves, carrying around lyres and short swords, and speaking in affected medievalese. They even changed their names: every Jean became a Jehan, every Pierre a Petrus, every Louis a Loÿs. Then they were onto the Scots (via Sir Walter Scott), the modern Greeks (thanks to Byron), the Turks (by way of Lamartine’s Méditations and Hugo’s Orientales). They alternately grew their hair to their shoulders, after the English Cavaliers, and shaved it down to a stubble, after the Roundheads. At the theatre they made a great show of yawning at tragedies and laughing at melodramas. “A great anxiety haunted them: everything had to be new at all costs.”
By 1830 they’d split into different camps with different aesthetics and interests, then:
All of them faded away around 1838, leaving only a joint hatred of the bourgeoisie, whom they called “grocers”. The bourgeoisie, however, converted those fads into consumable objects, which were still turning up at flea markets a century and a half later:
Clocks in the shape of cathedrals, gothic bindings, letter-openers in the form of daggers, inkwells and night-lights and innumerable other objects made to look like dungeons or medieval castles with drawbridges, posterns, brattices, machicolations, watchtowers, allures…
It sounds familiar, the pattern of young artists combining clothes and ideas from earlier periods, desperate to make something “new” out of the old, all of it ultimately commercialised and sold to more conservative markets keen for something “arty” or “different”. But all of this in the early 19th century, decades before even the early example of the Creative Class — Picasso, Apollinaire, Modigliani, et al — helped transform the quiet village of Montmartre into something rather more louche, around the turn of the century:
The Montmartre bohemians had no choice but to get their clothes from the flea market, and weird clothes were cheaper because they were less in demand. They wore Rembrandt hats, cavalry trousers, sailors’ jerseys, Spanish capes, coachmen’s capes, hooded cloaks, mechanics’ jumpsuits, dusters, priests’ hats, jockeys’ caps; the women sometimes unearthed elaborate ballgowns or the short eighteenth-century jackets called pet en l’air (fart in the open) — it was as if they were replaying all the fads of [Guillon-Lethière’s crowd] at once. Since oddball health regimes were also, almost inevitably, in effect, people went barefoot for reasons of “circulation” and wore colourful turbans that allegedly relieved headaches. Their parties were as loud and disruptive as things could get before the advent of amplified music: firecrackers, animal noises, breaking bottles, obscene songs, target practice with revolvers.
- WordPress.com, and user testing, by phone
Recently, I’ve been helping someone use WordPress.com remotely, over the phone, which has been more difficult than necessary because WordPress.com has two interfaces.
As a combination they’re a little confusing to use at times. There’s the original “WP Admin” and the newer, “improved”, Editor. Here are a couple of screenshots:
Both interfaces have pros and cons for different people and different situations, and I’m not going to discuss each in detail. I can see why there was a need to create the second, new, Editor — over the years the original WP Admin has become bigger and more complex, and increasingly daunting. And I can see why they don’t want to get rid of the original WP Admin — radically changing the interface that thousands or millions of people use every day would be a customer support nightmare.
But, if we make the assumption that the only current option is to have both interfaces operational, for different people or uses, here are a couple of things that create confusion:
When looking at the new Editor (which I think is the default), there’s a link in the left-hand menu to the old WP Admin. But there’s no obvious way to get back again, from the old WP Admin to the new Editor. And that link to the old WP Admin opens in a new browser window or tab. It’s very easy to end up with multiple tabs open without realising it, all containing different versions of Posts, Pages, etc.
When editing a Post in the old WP Admin there’s a banner suggesting you “Switch to the improved editor.” This changes the view to the new Editor (in the same tab/window this time), but there’s no way to switch back to the old WP Admin (other than your browser Back button). The left-hand menu does have a “BACK” link, but this doesn’t take you back… it takes you to the list of Posts, still in the new Editor.
While each interface gives you a sense of where you are within it, switching between the two makes me feel lost. I have no sense of where I am in the system as a whole, or the two interfaces’ relation to each other. I’m not sure what the new Editor is even called (I’ve named it “the new Editor” consistently here to try and avoid some confusion).
I think I want more of a sense that I’m switching between two parallel interfaces, and that I can switch back and forth between equivalent screens. Personally, I’d like an always-visible form of navigation that names the two interfaces consistently, so I can always go directly from one to the other.
You might say “well, don’t switch between them then!” But there are, I think, some things which can only be done in the old WP Admin. And also, if you’re finding your way around, it’s very easy to switch from one to the other and have no idea how to return to the slightly more familiar version.
All of these confusions are amplified when trying to help someone use WordPress.com over the phone. It’s difficult enough to help someone use software when you can’t see what they’re seeing, but when you don’t even know which interface they’re using it’s even worse.
Is it blue and white? Or does it have a black menu on the left-hand side? Unless you’ve changed the colour scheme under “Users” and then “Personal Settings” in which case the menu could be any colour… OK, what shade of blue is it?
Without any sense of where you are, or which interface you’re using, and with new browser tabs sometimes opening as you switch between them, it only makes helping someone more difficult.
I’ve seen other friends recently talk about the difficulties of remote support, including Dan Hon in a recent newsletter. It makes me wonder if anyone who does user testing for websites (as it’s websites I’m most interested in) tests for the ease of remote support.
I mean, usually when doing user testing, you might give someone a task and sit them in front of your website and watch how they do. Perhaps, as well, you could give someone a task and sit them in front of your website, but they have to talk a second person through how to complete the task. The second person is in another room, on the phone, and is using a different web browser, with a different monitor resolution, and is starting from a different web page. I’m sure this set-up could be improved and made more rigorous — I’m not a user testing expert — but you get the idea. I suspect it would highlight things which are inconsistent or hard to describe.
- Folklore universe
In 2001 a friend shared a link to this paper, ‘Folk Computing: Revisiting Oral Tradition as a Scaffold for Co-Present Communities’. There’s one part of it that really struck me then and I’ve thought about many times since (and it’s more interesting than the academic title sounds).
The paper’s authors created some devices which they gave to 350 children and staff at a “K-8 school”. Here’s a summary of the device and software:
The name “i-ball” is short for “ball of information.” I-balls are simple software folk objects that have some toy- and game-like qualities. People can design their own i-balls and then share with other members of community. In our prototype, i-balls exist on key-chain-sized video game devices made by SEGA and sold as part of their DreamCast video game system, as shown in Figure 1. We wrote our own software for this commercial device, and renamed it the “i-socket,” to distinguish its capabilities from those of the original SEGA “Visual Memory Unit (VMU)”– designed to let kids store and recall their state in a particular DreamCast game.
People design their i-balls on a PC using a prototype graphical programming tool we developed. The most basic form of i-ball consists of a single “animation” programming block. This block allows kids to “decorate” their i-ball by composing an animation out of 128 different letters and icons in a simple “flip-book” style animation editor. Figure 2 [above] shows the first frame of an animation a child created that depicts a face that blinks and says “hi”. I-balls created on a PC can then be downloaded to an i-socket via a small “docking station”.
Like the play objects they are named after, i-balls can be passed between people. Participants can give a copy of one of their i-balls to someone else, or, using “jump” blocks and “rule” blocks in authoring environment, i-balls can be programmed to “bounce” from one person’s i-socket to another’s, based on user-defined rules.
There’s a lot more stuff about the kinds of things that were created and how they were shared. The authors, and the kids, could look at how the i-balls spread around the school. And this is my favourite bit:
One of the most interesting “ahas” came after a third grade class viewed a visualization of how one of their favorite i-balls had spread. Someone in the class had made an i-ball version of the class mascot: a bunny named Shadow. They felt this i-ball was very popular and asked several times for a poster-sized printout of how it traveled through the school. When we brought it to their class, however, they were disappointed.
The visualization revealed that while most of the third-grade had gotten a copy of the Shadow i-ball, it had not spread much beyond that. After a long discussion, it became increasingly clear that the third grade had come to believe that because everyone in the class had seen Shadow, many in the school probably had as well. This led to some interesting conversation about the limits of generalizing what is true about your close-knit group to what is true about the larger population.
Ordinarily, of course, the mistaken beliefs of the third graders are self-sealing. People’s folklore universe is determined by whom they interact with, and there is no way of getting outside this universe to test the limits of it. Phenomena like insularity are notoriously hard to see from the inside.
When I first read this I’d already read a fair bit about social networks and the diffusion of ideas through them, but this bit was my own “aha” moment — I don’t think I’d seen such a clear example of a group of people assuming their entire “universe” thought like they did. When everyone in your social group is aware of many of the same things it’s really hard to see the broader world objectively. You assume what’s common knowledge for your peers is common knowledge for everyone.
I’ve thought of that class so many times over the years because we see similar behaviour over and over again: People being surprised at how the wider world doesn’t know what they know, and often laughing at someone in the wider world for their apparent lack of “common” knowledge.
- On spending too long setting up one webserver
Over the past couple of months I’ve spent most of my free time setting up a webserver. One webserver. One. This is ridiculous and has made me question my profession and my sanity. This is my attempt to justify myself.
First, let’s step back a bit.
After running The Diary of Samuel Pepys for ten years I had the bright idea of rewriting the site and doing it again. So in 2013 the new version, written using Django, appeared. The website was hosted on Heroku, one of whose benefits is simplifying the task of running a webserver. It can become expensive but for my simple needs it cost US$9 per month (for a database).
Last year Heroku re-jigged their pricing, understandably, and so I’d now have to pay not only for the database but also for serving the site itself. The monthly cost would rise to $16. It’s not a huge amount but it was a nudge and I wondered about alternatives.
As a comparison all of my other self-hosted sites are hosted on a Webfaction “shared hosting” account for a total of $10 per month. I could probably have moved Pepys there but this seemed like a good chance to learn something new.
I decided to try setting up a virtual server at DigitalOcean. This is a bit more like a “real” webserver, without Heroku hiding the grubby bits beneath the hood. Consequently it requires more work to get going. But it’s cheaper: the smallest option at DigitalOcean only costs $5 per month. It’s more time consuming and there’s more scope for things to go wrong. But this was a chance to learn more about part of the “full stack” of web development that I’ve usually avoided.
Thankfully, there are lots of instructions for doing this kind of thing, and I was able to follow these to set up a server and get my Django site working. It took about a day of carefully following along and making notes but, almost uniquely in my experience, the process worked.
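Those instructions generally end up with Nginx passing requests through to a Gunicorn socket, something like this (a generic sketch with placeholder names and paths, not my actual config):

```nginx
server {
    listen 80;
    server_name www.example.com;

    # Serve static files directly, rather than through Django.
    location /static/ {
        alias /srv/mysite/static/;
    }

    # Everything else goes to the Gunicorn process via a Unix socket.
    location / {
        proxy_pass http://unix:/srv/mysite/gunicorn.sock;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

Add a process manager to keep Gunicorn running, a database, and the various Django settings, and that’s most of the grubby bits Heroku was hiding.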
I was amazed. I’d learned useful new things. I’d saved $11 per month. Imagine the celebrations.
While I had a new, live, production website, I also needed to write new code on a different version of the site. Django makes it easy to run a local version of your code for development, which is fine but not ideal — my Mac is a different environment to a DigitalOcean virtual server. It’d probably be OK for my own decreasingly popular personal projects but I like to do things “properly”.
Them: You have an interesting CV, Phil. Now then, what’s your biggest weakness?
Me: Well… I like to do things properly.
I always like to try and find the very best, the most “proper” way to do something.
When I’m doing work for clients this tendency meets its natural enemy: deadlines. Working out how to do a new thing the best way can take a while. And while I, of course, like to do client work well, the pressures of time always mean balancing “properly” with “good enough”. This is the real world and the reason anything gets done.
When I’m doing work for myself I am not in the real world. I rarely have set deadlines — I’ll never finish all of my personal projects anyway, so I may as well take the time to find the best way to do things. I may as well try and do things properly!
This is satisfying, and involves learning a lot more, but it takes longer and the end, visible, result is often little different to a “good enough”, “it seems to work” solution.
For a development version of Pepys I, of course, wanted to do things properly — I wanted to mimic the live server as closely as possible. I’d used Vagrant before — it lets you run virtual computers on your own computer — so I just needed to use it to set up a copy of my live server. Which would mean manually going through the same instructions that took a day to set up the live server. And I’d have to repeat this every time I needed a new server for the same or a different project. That seemed like it could be a lot of boring, repetitive days.
Of course, people write code to avoid boring, repetitive days, and there are many ways to automate the setting up of servers. Chef, Puppet, Ansible, SaltStack… all with their own metaphors. I liked the sound of Ansible — it didn’t sound too complicated, and I liked that you run its scripts on one server (such as your laptop) to configure a remote server.
I decided to try using Ansible to set up a Vagrant server, for local development, and to replace the new live server with a new one set up using the same script. I could even make a staging server on DigitalOcean too. Soon, with a single command I could set up as many identical servers as I liked!
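In principle Vagrant makes this combination pleasantly simple: the Vagrantfile can point at the same Ansible playbook that configures the live server. A sketch, with the box name, IP address and playbook path as examples rather than my real setup:

```ruby
# A sketch of a Vagrantfile that provisions with Ansible; box name,
# IP address and playbook path are examples, not my real setup.
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/trusty64"
  config.vm.network "private_network", ip: "192.168.33.10"

  # Run the same Ansible playbook that sets up the live server.
  config.vm.provision "ansible" do |ansible|
    ansible.playbook = "playbook.yml"
  end
end
```

Then `vagrant up` creates the virtual machine and runs the playbook against it, just as Ansible would run it against a remote server.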
Well, not that soon.
I like to do things properly.
Although DigitalOcean isn’t terribly expensive, it seemed like a waste to have an entire virtual server for Pepys. It rarely uses more than 10% of the CPU. And I imagined I might build Django sites in the future that would be smaller and even less worth their own server.
Ansible, and its ilk, are usually used to set up a single website spread over several different servers (to handle the load of a lot of traffic). I wanted to go the other way — set up several websites on a single server. This can’t be that hard, I thought. This must be possible.
I can confirm that it’s possible, and my new DigitalOcean server is live, and I have a matching Vagrant virtual machine. It’s easy to create new ones. It’s great! But, somehow, it’s now the end of March and this is almost all I’ve achieved so far this year.
This is “doing things properly”.
If I’d wanted to use Ansible to set up a server with a single Django app, that’s been done. I would, being me, have built up my own version, copying bits from existing solutions, which would have taken a while, but not too long. It would have been closer to the “it’ll probably take a couple of weeks to get to grips with Ansible” scenario that I initially imagined.
This has taken me far, far too long. Simply to replicate one webserver. If it was going to take me a whole day to manually create each new server I could have done that for many future servers and still saved time. I’ve spent several weeks inching forward, getting despondent, producing nothing visible. But, now, if I make a new Django app it can go on the same server as Pepys. So long as it’s not really popular. By spending weeks typing and swearing I’ve saved myself a hypothetical $5 per month in the future! More celebrations!
Here is my Ansible playbook, which I don’t really expect anyone else to use. It has a lot of documentation because I quickly forget how to do things, so I always write them down in detail.
Why was this so difficult?
Part of the problem was that I’m not that familiar with servers. So I was learning how to automate something that I know little about. I’d end up spending a lot of time puzzling over basic server administration issues. I am, now, a bit more familiar with this stuff, so that’s good.
A lot of my time was spent trying to make things work for multiple Django websites. Lots of tasks that are quite simple to automate with Ansible become trickier when you want them to work for any number of websites. Little was impossible, but a lot was harder. A lot.
More time was taken up with making the playbook easy to reuse. More of that “doing things properly”. I wanted it to be easy to add a new Django website to the server, simply by adding a few configuration variables. Which it is, mostly. But making things easily reusable — smoothing out rough edges, thinking through how to structure things — takes a while.
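The general shape of that configuration is a list of site definitions that the playbook’s tasks loop over. Something like this, with hypothetical variable names rather than my playbook’s real ones:

```yaml
# group_vars/webservers.yml: hypothetical names, not my real variables.
django_sites:
  - name: pepys
    domain: www.example.com
    python_version: "3.5.1"
    repo: "https://github.com/example/pepys.git"
  - name: anothersite
    domain: another.example.com
    python_version: "3.5.1"
    repo: "https://github.com/example/anothersite.git"
```

Each task then does its work once per site, looping with something like `with_items: "{{ django_sites }}"`, which is where a lot of the extra trickiness crept in.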
A good few days early on were lost to figuring out the differences between setting up a “real” webserver (like on DigitalOcean) versus a Vagrant one. Although Vagrant mostly behaves like an actual server, there are enough differences that some Ansible things that work fine for proper servers don’t work the same, if at all, on Vagrant.
There were probably a few days sunk solely into how best to handle different Python environments. I can’t quite believe this is so difficult, but there we go. I wanted it to be possible for each Django site to have its own version of Python, its own Python modules and its own environment variables. This is not a solved problem.
(To get a little more technical for a couple of paragraphs… I’ve ended up using: pyenv to manage different python versions; virtualenv and pyenv-virtualenv to manage the discrete environments for installing modules; and autoenv for handling environment variables. It works, but it feels like a mess.
Autoenv sets environment variables OK, but doesn’t have a way of unsetting them, which makes me uneasy. I also looked at direnv and python-dotenv which had their own problems. I spent a long time struggling to get virtualenvwrapper and pyenv-virtualenvwrapper working but failed, not helped by confusing documentation. I feel like I’m missing something.)
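For what it’s worth, the basic flow of that pyenv and pyenv-virtualenv combination looks something like this (a sketch with a made-up site name, assuming both tools are already installed):

```shell
# Install a specific Python, then create a named virtualenv using it.
pyenv install 3.5.1
pyenv virtualenv 3.5.1 mysite

# Writing .python-version in the site's directory makes pyenv
# activate the "mysite" env automatically whenever you cd there.
cd /srv/mysite
pyenv local mysite

# Now pip installs into that site's own environment.
pip install -r requirements.txt
```

Getting Ansible to do all of that unattended, for each site, was a large part of the struggle.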
Whenever I’m learning some new technical thing it’s a case of running blindly from one seemingly insoluble problem to another. This was no exception, probably because it involves so many different moving parts. There’s Ansible, and there’s knowing general UNIX-y stuff. And then there are the wrinkles of every piece of software that’s being installed, configured and used. Git, Nginx, Gunicorn, Supervisor, PostgreSQL, Memcached, Django, all that Python gubbins… I can’t pretend to have mastered that lot.
My struggles with learning any new development things are chronicled in my desperate questions on StackOverflow. I usually spend a day or so exhausting all my Googling and experimentation before I give in and compose a question in the hope an American will come along overnight and provide the answer. About 50% of the time I end up answering my own questions eventually, as if I’m continually breaking new and exciting ground like a trailblazing scientist, rather than an increasingly despairing web developer. This project has resulted in these “questions”:
- Using pyenv in Ansible
- Using a variable from one Ansible var file in a second var file
- In Ansible, how to combine variables from separate files into one array?
- Using Ansible set_fact to create a dictionary from register results
- User’s home directory owned by root when syncing a folder with Vagrant and Ansible
- In Ansible, run only one item in a loop based on a variable
There we go. I’ve no idea how much time I’ve actually spent on this. If it was full-time, five-days-a-week it must be over a month. Six weeks? Whatever, it seems ridiculous. It’s been a grind, without even the usual, eventual, reward of being able to say “TA-DA! Look at this beautiful and/or useful thing what I made!” I guess this blog post is the replacement. “TA-DA! Look at what I’ve been through, even if everything looks exactly like it did before!”
On the bright side, I’ve learned a lot! I’ve filled in some gaps in my knowledge and have a bit more idea about how to serve the websites I might build. My sysadmin skills have probably advanced from the “completely useless” level to the “almost knows enough to be dangerous” level. Which is progress.
On the other hand — and I always have a pessimistic third hand ready — I can’t even keep up with front-end development these days, never mind starting to keep up with a completely new part of the stack. I already had spinning plates crashing down around me and now I’ve found a completely new set of crockery wobbling, wobbling, wobbling away.
- A thought on job ads
If I ever post a handful of Tweets on one topic I feel they should have been a blog post. You remember blog posts. So, a thought on job ads.
I occasionally receive emails from recruiters who have “viewed your page on Github”, or “reviewed your LinkedIn profile”, or whatever. The jobs always sound very dull. Most jobs probably are dull. But these supposedly enticing descriptions are unnecessarily dull. They’re always blandly generic.
This is mainly because recruiters anonymise the company they’re recruiting for, withholding any identifying information. Presumably because they don’t want you applying directly to the company and denying the recruiter their percentage. But making something sound as un-special as possible seems a terrible way to sell it.
(An aside: I’m very picky. There are, of course, many people who will apply for any job that sounds remotely like they have a chance of getting it. And the exact language used in the ad probably isn’t high on their list of priorities. But, while I’ve been sort-of-looking for a proper job for a long time, years of freelancing tells me I can make an interesting living without one. And I can barely imagine working in one company for years any more. So I’m not desperate for any job right now. I’m job-curious.)
Occasionally I go and look through job ads on the web, and the most appealing are those that appear to be written by someone interested in finding the right person for their own team. They don’t go over the top with buzzwords or try to make the place sound AMAZING. They sound human. They try to convey a sense of what the company is like as a place to work, and what the role would be like, both initially and in the longer term.
This is important to me because, once past the bullet-points of required and desirable skills, I want to imagine whether I’ll fit in, and whether I’ll like being there and doing the job a couple of years in the future. Does this sound like someone I want to work with/for? Beyond the technologies, does it sound like the kind of work I can do and will want to do every day? Is the company my ideal balance of informal but without the worst excesses of start-up-world? Is it exactly the right place for this special snowflake?
Writing appealing job ads must be difficult. And that’s if you’re part of the company and trying to attract people to you. If you’re a third-party recruiter who’s dashing out loads of these things for clients, and anonymising them, it’s easy to see why the emails aren’t appealing. How can I possibly get a feel for this “innovative Fintech start up” or that “multi award winning agency” when I have nothing specific to go on?
I guess there’s a level of specialised, high-level recruiter that doesn’t work like this. (“Headhunters”? I guess that’s the thing?) And that I shouldn’t be surprised these out-of-the-blue emails are as unappealing as generic spam. But it seems like a waste of everyone’s time.
(This blog post is based on a Twitter conversation with @ClareBurr.)