Fixing Vagrant VMs after a VirtualBox upgrade

I use virtual machines to split up my web development projects. This lets me run a small Linux server inside my Mac, so the operating system matches what the code will run on when it goes to a public server, without going all the way to running Linux all the time. As I want an easy life when it comes to setting these up, I use Vagrant to make and manage the virtual machines within VirtualBox, which runs them for me.

Recently I set up an extra Vagrant virtual machine (VM) to hold two projects for a client and keep them separate from some of my own projects, as they needed particular Linux settings and a different database server from the ones my own projects use.

I used a Homestead VM, as I knew the settings were good. Homestead is made for Laravel PHP sites, but is perfectly valid for lots of other PHP projects. My client uses various versions of the Symfony framework, and they work fine within Homestead-built boxes.

The new VM worked fine but the existing VM stopped working. I could start it, I could SSH into it, but I couldn’t get any websites from it to show in the browser. Which is a big problem, as that’s all it is there for.

After much faff and investigation, I discovered that the problem was a change of behaviour in the current version of VirtualBox. I had updated it while setting up the new VM, but unlike previous upgrades, this one caused a problem. VirtualBox now needs VMs to use IP addresses within the 192.168.56.0/21 range, or the “bridge” between MacOS and the virtual machine doesn’t work, so no web pages show.

The IP of the old Homestead box my own projects were in was 192.168.10.10, which was the default when I installed it. That used to work, but now does not. The new VM uses 192.168.56.4, which is within the allowed range, so it worked fine.
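
As an aside, the VirtualBox documentation says the allowed ranges can be widened by creating an /etc/vbox/networks.conf file on the host – I didn’t go that way myself (I changed the VM’s IP instead, as below), but for reference a line like this should permit the whole 192.168.x.x space:

* 192.168.0.0/16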

To fix the old VM:

I had to edit the Homestead.yaml file. At the top where it says:

ip: "192.168.10.10"

I changed it to:

ip: "192.168.56.10"

I then ran:

vagrant reload --provision

This gets Vagrant to reconfigure the existing Homestead box to use the changed IP address, without deleting anything inside the VM.
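
If you want to confirm the box has picked up the new address before going further, you can ask it from the host (the 192.168.56.x address should appear against one of the interfaces):

vagrant ssh -c "ip addr"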

And finally I edited the entries in my hosts file (which is at /etc/hosts for me on MacOS) for the websites in the VM, changing their IP from 192.168.10.10 to 192.168.56.10.
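
The entries ended up looking something like this (the hostnames here are only examples – yours will be whatever sites your Homestead.yaml maps):

192.168.56.10  homestead.test
192.168.56.10  project-one.test
192.168.56.10  project-two.test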

Once all that was done, the websites in the VM started working again.

This was a very, very frustrating problem. I spent a long time investigating what was happening inside the VM as I presumed that’s where the problem was, eventually stumbling on the solution after searching for wider and wider versions of the problem. Thanks to Tom Westrick whose post on GitHub got me to the solution.

Keeping on top of spam in a smallish sub-Reddit

I moderate a small part of Reddit for UK freelancers. As we grow in size, more spam and other rule-breaking posts get submitted to the group. Spam is annoying in general, but I particularly didn’t want it hanging around in the sub-Reddit: spam breeds more spam as others realise their posts will stick around, and I didn’t want people checking the group, seeing the spam and deciding either that it’s dead or that we just don’t care about it. A spam post will disappear if enough different members flag it, but in a small group there aren’t enough people being that diligent, so as the person running the group it falls to me to remove the posts and potentially ban whoever sent them in from posting to the group again.

I had been checking the group once a day, but sometimes forgot, or if I was unlucky, a spam post would be hanging around for almost a day until I caught it. I wanted to keep on top of the problem, and wanted a warning when a new post was added so I could catch it if it was spam, or help out if it was legitimate and something I could answer.

Each sub-Reddit has an RSS feed of the new posts that come in – for /r/freelanceuk it is https://www.reddit.com/r/freelanceuk/.rss – so I knew I could have something read that and send me a warning of new posts. It would be easy to get an email when a new post happens, as I already have code that can read a feed and copy it into an email. But I tend to avoid my email outside of work hours to let my brain relax and not think about work. So, a notification to my phone would be a better alternative.
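
As a hedged illustration of the reading part (not the code I actually use, and the script name is made up), a few lines of PHP can pull the titles out of that feed – Reddit serves it as Atom and is fussy about requests without a User-Agent:

<?php
// check-feed.php – rough sketch: fetch the sub-Reddit feed and print the post titles.
$feedUrl = 'https://www.reddit.com/r/freelanceuk/.rss';

// Reddit tends to reject anonymous requests, so send a User-Agent header.
$context = stream_context_create([
    'http' => ['header' => "User-Agent: freelanceuk-feed-check/1.0\r\n"],
]);
$feed = simplexml_load_string(file_get_contents($feedUrl, false, $context));

// The feed is Atom, so the posts are <entry> elements in the Atom namespace.
$atom = 'http://www.w3.org/2005/Atom';
foreach ($feed->children($atom)->entry as $entry) {
    $fields = $entry->children($atom);
    echo $fields->title, ' - ', $fields->link['href'], "\n";
}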

I tried a phone app (name forgotten, sorry) which said it could do this – read a feed and notify me when it changed – but I never received any notifications. So, being a programmer, I looked at writing my own, using a service like Truepath to do the notifying part. While I got notifications working briefly, I couldn’t get it working longer term, maybe because it was set up to trigger to a browser and I wasn’t opening that enough to keep it active? Anyway, I was back to square one.

I asked on the Farm mailing list whether anyone knew of an Android app which would do what I wanted. Julian suggested using an RSS reader like Feedly, and Mike suggested IFTTT, which he uses to do other things with RSS feeds but which supports notifications too.

I’ve used IFTTT in the past and found it a bit awkward and hit and miss, and by now I was getting fed up with thinking about this problem and just wished Reddit would let a moderator get a notification of new posts within their sub-Reddit natively. So, I tried a couple of RSS readers instead, looking for small and light ones that didn’t require subscriptions.

RSS Reader listing several stories from freelanceUK sub-Reddit

The one that has worked well is the bluntly titled RSS Reader, previously “Simple RSS Reader.” I have it set to check for updates every half an hour and it has worked great. Now, soon after a post is made I get a notification on my phone that one is waiting; pressing that opens RSS Reader and from there I can press through to the website, or just open BaconReader, my preferred app for reading Reddit, and go to that section as normal.

I’ve got the notification vibration turned off, as I don’t want to be disturbed when I’m concentrating on work, but I still check my phone often enough that I’m catching any spam pretty quickly. Also, I’m helping people more quickly too as I’m seeing their posts while they’re still fresh.

So, thumbs up to RSS Reader and its creator, Svyatoslav Vasilev!

Fixing a download link downloading the same file when it should be dynamic

One of my clients, the Database of Pollinator Interactions, allows researchers to easily search which insects and mammals pollinate which plants. To make it simple for researchers to gather the references to the papers they need to look up, their website allows downloading of search results as a CSV file.

This is powered by a little bit of AJAX: when a searcher clicks the download button, JavaScript calls a PHP script which reads the search filters out of a cookie, compiles the results into a comma-separated file, and lets you download it.
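
As a rough sketch of that end of things (the script name, cookie name, columns and filter handling here are all made up for illustration – the real script builds the rows from the actual search):

<?php
// download.php – illustrative sketch of the CSV endpoint the JavaScript calls.
// Read the current search filters back out of the cookie the search page set.
$filters = json_decode($_COOKIE['search_filters'] ?? '{}', true);

// ...run the search with $filters here; these rows are placeholders only...
$rows = [
    ['Example pollinator', 'Example plant', 'Example reference'],
];

// Tell the browser this is a CSV attachment to download.
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="search-results.csv"');

$out = fopen('php://output', 'w');
fputcsv($out, ['Pollinator', 'Plant', 'Reference']);
foreach ($rows as $row) {
    fputcsv($out, $row);
}
fclose($out);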

However, the site had a bug (no pun intended). If you ran a search and downloaded the results, then ran a different search and downloaded the results, you got the same file of results, even if the search was completely different and the results shown on the page were correct for the search.

This turned out to be because the live server was set up to cache pages where possible, whereas my development server was not. The call to the script that made the file was on a URL that did not change, as it read what it required from a cookie. So the browser thought it was hitting the same URL each time the download button was pressed and, to help speed things up, served the file for download from its cache rather than requesting a new one from the website.

The fix for this was quite straightforward. In the PHP script that received the call from Javascript, I added these headers at the top of the code:

header("Cache-Control: no-store, no-cache, must-revalidate, max-age=0");
header("Cache-Control: post-check=0, pre-check=0", false);
header("Pragma: no-cache");

Having them all is probably overkill and I should go back and find which one really does the job for them.

This makes the script send headers telling the browser not to cache what is sent back, so the browser requests it fresh each time the URL is called.

Now, when the button is pressed, a new CSV is always requested rather than the browser providing the one it has already received. Problem solved.

Using Microsites for SEO purposes

A ‘microsite’ is a small website. A business can use them for a lot of different purposes: promoting a particular event like a competition or conference, targeting a particular type of business by having a site just for them, or highlighting a service that customers might be interested in but which is not the main focus of the company.

How I’ve used microsites successfully is as a way to experiment with Search Engine Optimisation (SEO).

SEO has a lot of rules, some clear, others rather vague. Usually, I err on the side of being very safe when optimising for a client. I fix any technical problems, and work with copywriters like David Rosam who can write copy that both gets found in search and convinces a visitor to become a customer. No dodgy linking. No weird technical tricks. Straight down the line works over the long term.

However, there are times when I want to try something in the grey area where the rules are vague.

One of these situations was targeting various searches and locations. My client wanted to be found for various searches which were a combination of <service> and <location>, such as cleaning flats brighton. We could see these getting clicks and conversions from Google Ads searches, so we wanted to see if we could get found for them in normal, ‘organic’ search results as well.

So the goal is: get found for various services in a wide range of locations.

The client had already registered a wide variety of domain names, some of which contained the service (keyword) being targeted, e.g. cleaningflats.co.uk or apartment-cleaners.net. They weren’t in use, so I borrowed four to start our experiment. (To be clear: the searches and domains are just examples, my client is not a cleaning company.)

A copywriter the client already used wrote four pages of copy centred around the service and a single location, and four smaller amounts of intro copy we could use on the home page of each microsite.

I created four small websites using the same branding and navigation we used on the client’s main site. As well as the intro copy from the copywriter, each homepage listed the main locations being targeted by the client. The locations linked through to individual pages, where code and a database re-wrote the text so the location mentioned matched the location in the link. The pages also showed the nearest office to that location rather than just the HQ.
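
The re-writing itself is nothing clever – a hedged sketch of the idea, with made-up function, table, column and placeholder names, looks something like this:

<?php
// Illustrative sketch only – the table, columns and {location} placeholder are invented.
function renderLocationPage(PDO $pdo, string $slug, string $baseCopy): string
{
    // Look up the location named in the link the visitor followed,
    // along with the office nearest to it.
    $stmt = $pdo->prepare('SELECT name, nearest_office FROM locations WHERE slug = ?');
    $stmt->execute([$slug]);
    $location = $stmt->fetch(PDO::FETCH_ASSOC);

    // Swap the placeholder in the copywriter's text for the real place name.
    $copy = str_replace('{location}', $location['name'], $baseCopy);

    return $copy . "\n<p>Nearest office: " . htmlspecialchars($location['nearest_office']) . '</p>';
}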

Now, strictly speaking, this is against the guidelines of the big search engines. They do not like text re-written in this way as it is too repetitive across all the pages. But, they were useful pages for the people who were searching for the service in a particular location. And… they worked. Google and Bing did show the pages to searchers looking for the particular service in a particular area covered.

These are not high traffic sites. They are small – hence ‘microsite’ – and tailored to a very particular audience. Maybe this is why the search engines were happy to show them in the results.

Once a few had shown their use, the client was happy to roll out several more to target all of their top services. I added some refinements to help the re-written pages be more useful to the people finding them, and to the sites as a whole to show Google we weren’t running a network of spam sites to try to influence their ranking of the client’s main website – it would have been a terrible result to have some small sites doing well in search but to lose the whole main site from being found.

Over time, the microsites have brought thousands of extra enquiries in to the client, so have been excellent value compared to the cost of creation and maintenance.

An interesting side effect of having a main website and range of microsites is how they are affected when Google changes their ranking systems. When a big change comes through, if conversions through the main site start dropping, we often see a boost in enquiries from the microsites. By following a different path in optimising the microsites, we’ve gained a bit of resilience in riding out the big changes.

If you have an SEO experiment you wish to run, a microsite can be a great way of doing it. Re-using existing branding and navigation saves a lot of time, so you can concentrate on the content and some promotion to get the site found. You don’t even need the technology behind the microsites to be the same as your main site. Use a complicated, bespoke CMS behind the main site but want to use something simpler for your experiment? Feel free to use WordPress, or some simple static HTML pages. Anything that lets you get an experiment up and running quickly is fine.

Interested in looking at how microsites might help your business? Get in touch, I’m happy to chat about your needs.

Weeknotes 29: 22 July – 28 July 2019

Client work

Not a lot to report, I was doing a lot of updates and bug fixes while catching up from the time lost last week due to a migraine.

For one client I had a Google Ads account review with their new “Account Strategist.” This is basically going through your account with someone at Google who is very good at Google Ads (Adwords) and them showing you things that need to be fixed or can be optimised more. It was a semi-useful experience this time as a few things I’d missed were brought up, and I have been neglecting the account we were looking at while working more on the client’s SEO recently and having limited time to do both.

Unfortunately, the new Account Strategist is one of the rather pushy ones, and here we hit a problem with the way Google runs their publicly facing staff at Google Ads. Bluntly, money talks when it comes to Google Ads. Spending a lot in their system? Get senior staff who are very meticulous. Pay less? Get a more junior, often pushy person who’s more interested in you turning on a feature that helps Google more than it helps you. Spend little? Basically get a full on pushy salesman who will try to make you turn on YouTube ads all the time, or at worst someone you can barely understand who appears to have never seen their interface before.

I was feeling rough on the day of the review and when I pushed back on doing things like writing a new advert and putting it live in the middle of the phone call (which he was late for, as is often the case for the lower down staff) he acted incredulous, and the same for me not wanting to immediately trigger changes which I know will cost the client more money for little return, as we’ve tried the same feature in the past.

I’ve been handling Google Ads for clients for a lot of years now and I’ve learnt that putting things live during a call generally does not work out well for me – they are changes I haven’t properly thought through, as I’m in the call rather than considering all the implications of my actions. It is always worth pushing back in these circumstances.

The attitude of their contact people reminds me a lot of the recruiters I’ve worked with – the better ones tend to listen and understand when you don’t want to be rushed and they tend to end up running high value accounts, the ones who top out in the middle of the ranks are always pushy. Fortunately Google tends to rotate all their contact staff every 6-12 months, so you don’t have to put up with the ones you don’t like for long. They rarely seem to make notes though, so you always have the same initial conversations whenever you change Strategist. I’d have thought Google would have a cracking CRM tool for the part of the business driving most of their billions in revenue, but apparently not.

My Projects

Very little on my projects as I was catching up with work that did not get done last week, and trying to not fill every minute of the day as I didn’t want to trigger another migraine.

Productivity/Health

Two things hit my productivity a lot – heat and politics. The heat I partly managed to solve by working a few days at The Skiff, where the air conditioning is brilliant and meant I could work properly. Boris Johnson becoming the new Prime Minister in waiting was a big distraction, and I’ve become so fed up with everything he says being leapt on by people on Twitter that I’m trying muting several phrases for a week. Twitter will not show any tweets containing the words you have muted, and lets you mute them for a week or more – very handy for trying things out.

I have muted “no deal”, “johnson”, “boris” and “brexit” for a week to see how it goes. So far, my timeline is a lot shorter, but also less stressful. I’m not ignoring the news, I listen to a couple of political podcasts and check news sites, but I’m doing it more when I want rather than it dominating social media when I’m interested in what my friends have been doing. I muted “farage” and “trump” a long time ago and feel I’m much better off for it.

Learning

Again, no progress on 30×500 apart from a little work on content for the Farm site.

New meet up

I went to the inaugural Brighton Indie Hackers Coffee Meet, run by Rosie Sherry. I both like Rosie a lot and have a hell of a lot of respect for her. She’s built her success through an enormous amount of effort, building a community where testers can get the respect they deserve.

The best bit of the meet was hearing what she’s been doing recently, which sadly has included burning out on her own project and taking a step back. I hope she can get over that quickly and find something new that she’s as happy with as she has been with MoT. One of the things she’s doing now is being community manager for Indie Hackers, and they couldn’t have made a better choice for that.

The meet up has got me to try out the Indie Hackers podcast and check the site out. I hadn’t realised how big Indie Hackers had got – I was only aware of it as a site which seemed to be content marketing aimed at the Hacker News crowd, and I kind of forgot about it when my interest in HN petered out a couple of years ago. So, Rosie’s community management is going well on a very local level!

The Farm

Once again held at the Caxton Arms. Only 12 people along but summer can be very up and down as people do things with their families. My notes were…

  • Live mobile app work
  • Airflow – Apache’s batch system
  • Trying to get people on big teams to understand the consequences of actions on other projects
  • Building a local setup to match a client’s weird setup
  • Dealing with the heat
  • Creating Downs-land in town
  • Wildflowers
  • Boris Johnson
  • Affluent reactionaries
  • Mosaic demographic: low horizons
  • Will there be a general election soon?
  • SD card aging
  • Long lost relatives
  • Photography with film vs digital
  • Star Trek: Picard
  • DNA testing
  • Ancestry.com and the Church of the Latter Day Saints
  • Sockmaggedon

Family

During the weekend I took Tom out to Worthing, where he enjoyed a playground and playing some Pokemon Go, and I enjoyed an American car show – lots of massive old cars I usually only see on Roadkill episodes.

Tom and old Chevy pickups

Reading: Continued Titan by John Varley. Also started Super Thinking: The Big Book of Mental Models by Gabriel Weinberg and Lauren McCann. I’m reading Super Thinking on the train to and from The Skiff and so far I’m only a little way in and it’s… a bit disappointing. A lot of stuff I already know and a lot of repeating that the book is going to change the way I think. I’d be much happier if it got on with that changing rather than telling me it was going to do it, but I guess that is a business book kind of thing.

Writing with: a Uniball Eye Needle Fine, a good rollerball I’ve been using for my Farm notes as it’s good for using in a small notebook. Good for a big notebook too!

Writing in: a Beechmore Books A5 dotted notebook. I got this at a good price from Amazon; it has nice, smooth paper which is coated so fountain pen ink works well on it, although it takes a while to dry. Part of the notebook has a slight binding problem so the paper isn’t completely flat – I’ll have to wait and see if that’s a problem or whether using the earlier part will flatten it out. At £4-5 cheaper than a Leuchtturm1917 notebook, it is a very good choice.