How I built and launched PropertyGuessr

You can play the game at: propertyguessr.fun.

The price of property is interesting to many because, in our modern society, it’s inescapable. Regardless of what you want, the price of property is going to have an effect on your life. There are people like me who are looking to buy property; we’re interested in the price because we need to know how much to save for a down payment and whether we’re getting a good deal. There are people who already own property; they’re interested because they’re invested in the market and want to know how that investment is performing. Then there are those who are renting: their rent payments are driven by the mortgages their landlords are paying, which are a direct result of the price of those properties.

four-bedroom-house

The idea for this project first came to me while I was searching for a flat to buy in Edinburgh. I found myself perpetually glued to property portals such as Rightmove and Zoopla, determined to find a good deal. The only problem was that I had no idea what a two-bedroom flat in Edinburgh typically went for. The average sale price for the property type didn’t give me enough context to know whether I was getting a bargain or being swindled. I wound up scrolling through the sold prices of properties on Rightmove, but what I really wanted was a gamified way to quickly develop an intuition for what property in my area was worth. So that’s what I set out to build.

Finding the data

To make any of this a reality, I was going to need data: the prices and listings of recently sold properties. I started searching for APIs that would give me this information, but I quickly ran into a problem: they all wanted money, and I had none to offer them. Then I found a ray of hope: Zoopla, one of the UK’s largest property portals, had a free public API. I started reading through the documentation, and it had everything I needed. Just before I went to sign up, I spotted that the copyright notice hadn’t been updated since 2017, which struck me as a bit odd. I managed to sign up, but when I tried to make a request, I realised that I hadn’t been given an API key. I read through the sign-up instructions again, I read through a blog post on how to get started with the Zoopla API, and I even created a new account to see if I had just blitzed past it, but I still couldn’t find it. After some more googling, I found out the truth: it was all a mirage. Zoopla had stopped supporting this API years ago but had left all the documentation up, without so much as a warning, in an effort to cripple developer productivity worldwide! I was just the latest in a long line of victims.

couple-crying-zoopla

I gave up on my dreams of finding a free API and decided to just scrape the data myself. Given my newfound hatred for Zoopla, I chose instead to scrape their direct competitor, Rightmove. Rightmove has a sold prices page that lets you search by postcode or by location names like “Edinburgh”. The page shows a list of all recently sold homes along with their prices, and you can also access each listing. Unfortunately, all of this content is rendered with JavaScript, which makes scraping more difficult: when we make a plain web request to the page from a scraping script, we won’t get this content back because the JavaScript hasn’t rendered it yet. Typically, this is where people resort to tools such as Selenium or Puppeteer that render the JavaScript before you scrape the page. In this case, there is a far simpler solution: Rightmove passes all of this data in a variable inside a script tag, which means we can use some clever trickery to extract it with just a web request.
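
To illustrate the trick, here is a minimal sketch that checks the data is reachable without rendering any JavaScript. The URL pattern and the variable name are placeholders, not necessarily what Rightmove actually uses.

```typescript
// A plain request already returns everything we need; the listing markup is
// missing, but the raw data sits in an inline <script> as a JS variable.
const res = await fetch("https://www.rightmove.co.uk/house-prices/edinburgh.html", {
  headers: { "User-Agent": "Mozilla/5.0" }, // a browser-like UA helps avoid being blocked outright
});
const html = await res.text();

console.log(html.includes("__PRELOADED_STATE__")); // placeholder variable name; true means no Selenium/Puppeteer needed
```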

rightmove-property-page-edinbrugh

I manually created a list of locations based on Wikipedia’s list of cities in the UK; each had a key, a title, and a URL pointing to its Rightmove sold prices page. I then started writing a script that, for each of these locations, would scrape the requested number of eligible properties.
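
The list looked something like this. It’s a sketch of the shape only; the field names and URL pattern are illustrative, and the real file may differ.

```typescript
// One entry per city on Wikipedia's list of UK cities.
interface ScrapeLocation {
  key: string;   // e.g. "edinburgh"
  title: string; // e.g. "Edinburgh"
  url: string;   // the location's Rightmove sold prices page
}

const locations: ScrapeLocation[] = [
  {
    key: "edinburgh",
    title: "Edinburgh",
    url: "https://www.rightmove.co.uk/house-prices/edinburgh.html", // illustrative URL pattern
  },
  // ...
];
```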

After making the request for a location’s list of properties, I ran into my first issue. I was using cheerio to extract the script tag containing the variable with the data I was looking for, but I wasn’t able to parse the data in that variable as JSON. It quickly became obvious why: it wasn’t JSON, it was JavaScript. After some research, I opted to parse it as JSON5, an extension of JSON intended for writing configuration files which, crucially for my use case, doesn’t require quotes around the keys.
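
Roughly what that extraction looks like with cheerio and JSON5. This is a sketch, not the actual scraper: the variable name and the way the assignment is stripped are assumptions about the page’s structure.

```typescript
import * as cheerio from "cheerio";
import JSON5 from "json5";

function extractEmbeddedData(html: string): unknown {
  const $ = cheerio.load(html);

  // Find the script tag whose body contains the data variable.
  const script = $("script")
    .toArray()
    .map((el) => $(el).html() ?? "")
    .find((body) => body.includes("__PRELOADED_STATE__")); // placeholder name
  if (!script) throw new Error("Data script tag not found");

  // Keep everything after the first "=", i.e. the object literal itself.
  const literal = script.slice(script.indexOf("=") + 1).trim().replace(/;$/, "");

  // JSON.parse(literal) throws on unquoted keys like { results: [...] },
  // but JSON5 parses them happily.
  return JSON5.parse(literal);
}
```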

Building the game

Armed with the data I had scraped from Rightmove, I set about creating a wireframe in Figma. I then tried creating some higher-fidelity mockups but abandoned those in favour of just winging it with Tailwind.

Next, I went back and added coordinates to each location in the list. On the location picker screen, I’m simply plotting those locations on a map with Leaflet. Leaflet proved to be a complete pain in the ass to work with because I wasn’t using a fixed-height container, but I got there eventually.
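
A minimal sketch of the picker, assuming the locations list now carries lat/lng fields (the element id and function names are made up). The important detail is that Leaflet’s container needs a resolved height, or the map renders at zero pixels.

```typescript
import * as L from "leaflet";
import "leaflet/dist/leaflet.css";

interface MapLocation {
  key: string;
  title: string;
  lat: number;
  lng: number;
}

function renderLocationPicker(locations: MapLocation[], onPick: (key: string) => void): void {
  // #location-map needs a resolved CSS height (e.g. height: 60vh),
  // otherwise Leaflet renders a 0px-tall map.
  const map = L.map("location-map").setView([54.5, -3.0], 6); // roughly centred on the UK

  L.tileLayer("https://tile.openstreetmap.org/{z}/{x}/{y}.png", {
    attribution: "&copy; OpenStreetMap contributors",
  }).addTo(map);

  for (const loc of locations) {
    L.marker([loc.lat, loc.lng])
      .addTo(map)
      .bindTooltip(loc.title)
      .on("click", () => onPick(loc.key));
  }

  // If the container is sized by flexbox/grid rather than a fixed height,
  // recalculate the map size once the layout has settled.
  requestAnimationFrame(() => map.invalidateSize());
}
```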

The actual game screen closely resembles the design. Swiper is used to display the images in a carousel. The key information is shown on the card below the carousel, and everything else is delivered via a system of modals.
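
The carousel is just standard Swiper. A rough sketch of the setup is below; the module import path varies between Swiper versions, and the markup uses Swiper’s standard classes.

```typescript
import Swiper from "swiper";
import { Pagination } from "swiper/modules"; // older versions export modules from "swiper" directly
import "swiper/css";
import "swiper/css/pagination";

// Expected markup:
// <div class="swiper">
//   <div class="swiper-wrapper">
//     <div class="swiper-slide"><img src="..." /></div>
//   </div>
//   <div class="swiper-pagination"></div>
// </div>
const carousel = new Swiper(".swiper", {
  modules: [Pagination],
  loop: true,
  pagination: { el: ".swiper-pagination", clickable: true },
});
```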

game-screen

You won’t deliver your idea, and that’s okay

When I initially imagined the game, each location showed you five properties, and what you shared with your friends was your average amount off for that location. I pictured people using the game to quickly become accustomed to the value of properties in their area. Unfortunately, this was not a sustainable model. I ran some analysis on the data I had scraped, and it was obvious that at a rate of five properties a day we wouldn’t be guessing the price of recently sold properties for very long. This is partly because in some locations there aren’t five eligible properties sold every day, and partly because the Rightmove sold prices page isn’t updated every day; there can be periods of months when no properties are added. I made the decision to reduce the number of properties per day to just one per location. This was initially pretty demotivating, but you have to accept that you can’t always deliver the idea that’s in your head; that idea is unencumbered by reality. If you become too attached to your initial idea, you will never deliver.

Launching the game

Due to the UK-specific nature of this project, I didn’t think it would perform well on the traditional maker sites like Product Hunt or Hacker News. I chose instead to launch via a few UK-specific subreddits: r/HousingUK, r/UnitedKingdom and r/CasualUK. To my surprise, the posts immediately started picking up traction. At this point, I was so sick of thinking about house prices that I didn’t believe anyone would be interested in playing the game.

There was a lot of feedback on the posts. Some users reported a desktop issue that prevented them from guessing the price of the property. This took me a while to figure out because it was being reported across a wide range of browsers and systems. I eventually tracked it down to a screen-height issue: my development had been so focused on mobile that I had forgotten to make sure the game worked on smaller laptop screens.

Another thing people pointed out was that when they shared their guess for a property with their friends, the share text showed not only how far off they were but also the actual price of the property, which embarrassingly defeated the purpose of the share feature. Luckily, it was a quick fix.
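
The fix amounted to building the share text from the size of the error only. A hypothetical sketch follows; the names and wording are made up, not the game’s actual code.

```typescript
// Share how far off the guess was, never the sold price itself, so the
// property is still worth guessing for whoever receives the share.
function buildShareText(locationTitle: string, guess: number, actualPrice: number): string {
  const amountOff = Math.abs(guess - actualPrice);
  return (
    `PropertyGuessr (${locationTitle}): I was £${amountOff.toLocaleString("en-GB")} off ` +
    `today's property. Can you beat me? https://propertyguessr.fun`
  );
}

// buildShareText("Edinburgh", 250_000, 238_500)
// → "PropertyGuessr (Edinburgh): I was £11,500 off today's property. ..."
```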

In the first 48 hours, the site had 318k page views and 32.7k visitors. The Reddit posts collectively received over 2,500 upvotes. The statistics showed that, on average, each player played nine locations. The way people were playing the game was totally different from how I had imagined it: it became less about knowing your own location and more about discovering how prices differed across the UK.

umami-analytics-screenshot

Automating from behind

When I first released the game, updating the properties was a manual process that had to be completed at 12 a.m. each night. I would run the scraper, move the files into the client, and then push up the changes to trigger a deployment. This wasn’t sustainable and, quite frankly, was very annoying. After some investigating, I realised I could automate my existing manual process using GitHub Actions.

The first action runs earlier in the night and does most of the heavy lifting:

  • Creates a new branch.
  • Runs the scraper.
  • Copies the results to the client and an archive.
  • Commits the changes and pushes up the new branch.
  • Creates a pull request for the new branch.

The second action is triggered at 12 a.m. and merges the open pull request into master, which triggers a deployment. There are some advantages to doing all of this in GitHub Actions instead of running a separate server application:

  1. It’s free; for private repos, you get 2000 minutes per month free per account. My workflows only use 3 minutes per day.
  2. The calls to Rightmove are being made by different workers, which should make the scraping harder to detect and prevent.

The only drawback, as far as I can tell, is that a scheduled GitHub Actions workflow only runs once a runner becomes available. This means that although the action that merges the pull request is scheduled for 12 a.m., it can sometimes run up to an hour later due to the high volume of other workflows scheduled around that time.
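
For reference, the first workflow might look roughly like this. It’s a hypothetical sketch rather than the exact config: the names, paths and npm script are illustrative, and the create-pull-request action stands in for the branch, commit and PR steps, which could equally be scripted with git directly.

```yaml
name: Nightly scrape

on:
  schedule:
    - cron: "0 22 * * *" # "earlier in the night" (times are UTC)

permissions:
  contents: write
  pull-requests: write

jobs:
  scrape:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      - run: npm run scrape # assumed script: writes results into the client and an archive
      - name: Open a pull request with the new data
        uses: peter-evans/create-pull-request@v6
        with:
          branch: data/nightly-update
          title: Nightly property data update
          commit-message: Add scraped property data
```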

Marketing strategy

Shortly after the launch, a journalist from a UK media group reached out to me. She was interested in writing a light-hearted article about the game. I did a short interview with her, explaining the motivation behind the project and how I had built it. The way this media group works is that they write articles and distribute them to various news outlets, which pick and choose stories to publish. A week passed, and I assumed that no outlet had picked up the article, so I began contacting journalists myself. I started by targeting property blogs but found many of them were inactive or were SEO exercises run by real estate agents. I opted to cast my net wider to include viral news outlets, and I also contacted some property-focused TikTok influencers.

My proactive marketing attempt was a complete failure. I reached out to 31 journalists and 10 TikTok accounts. I received two responses, but no one wrote or posted about the game. Shortly after my proactive attempt, the article written by the journalist I talked to earlier was published on EdinburghLive. The article did not generate a lot of interest in the game, but it was still a nice win. In retrospect, I think if I had been more patient, I could have leveraged this article as proof of concept when reaching out to others.

article-in-edinburghlive

My key takeaways

Overall, this project was my most successful to date and has been a huge learning experience for me. Here are some of my key takeaways:

  • In a project like this, code quality does not matter. This was all a huge mess of spaghetti code, and at times it felt like I was following the opposite of the DRY principle.
  • The appeal of the idea is what carried this project. I saw someone describe it as “a great idea but terribly executed”, and I agree, but in the end it didn’t matter; people still played it.
  • You’ll never build the exact idea in your head. It will need to change, and when it does, you will likely overestimate how much that change matters.
  • Zoopla sucks.
  • Even if you don’t think anyone is going to use it, you should still release it.
  • You can do a lot of unconventional stuff in GitHub Actions.