Local development with Valet

I’ve used MAMP Pro with my team for local development for many years. It’s a convenient tool, but it’s often slow (especially the CLI) and it often hangs or crashes. I’ve been looking for alternatives for quite some time, so after it was recommended by Zuzana I thought I’d take a look at Laravel’s local dev environment, Valet.

Installing Valet

Valet installs a lightweight set of tools to run websites on your Mac via Homebrew. This feels like a good approach to me: I’m not sure we really need the complexity of virtual machines, since the code we write tends to work fine on a Mac. Having tools locally installed via Homebrew is fast and convenient.

I started by updating Homebrew to make sure my local packages are up to date:

brew update

Then I installed Valet globally via Composer and ran its installer:

composer global require laravel/valet
valet install
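One thing worth checking before running valet install: the valet command needs to be on your PATH, which means including Composer’s global bin directory if it isn’t already there. A minimal sketch for your shell profile, assuming the default Composer home directory (yours may differ):

# Add Composer's global bin directory to the PATH (default location assumed)
export PATH="$HOME/.composer/vendor/bin:$PATH"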

MySQL

Next, I needed to install MySQL, since this will no longer be available via MAMP:

brew install mysql@5.7
brew services start mysql@5.7
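A quick sanity check that the MySQL service actually started:

# List services managed by Homebrew and their status
brew services list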

With MySQL installed, I now want to secure the local root password (by default it is empty):

/usr/local/Cellar/mysql@5.7/5.7.25/bin/mysql_secure_installation

(Please note the path to mysql_secure_installation may change depending on your version of MySQL.)

Next, I want to connect to the database server to import a copy of my own blog’s WordPress database for local testing. We use the excellent SequelPro for managing MySQL.

It’s easy enough to connect to localhost using the details:

  • MySQL Host: 127.0.0.1
  • Username: root
  • Password: (the secure password I set above)

This worked fine: I created a new database for my local test site and imported a recent SQL backup.
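For anyone who prefers the command line, the same steps can be done with the mysql client. A quick sketch, where the database name and dump file are placeholders:

# Create an empty database for the local test site
mysql -u root -p -e "CREATE DATABASE simonrjones_local"

# Import a recent SQL backup into it
mysql -u root -p simonrjones_local < backup.sql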

Serving a site

The default method to serve sites in Valet is valet park, which serves every sub-folder in ~/Sites as a website via a URL in the format folder-name.test.

For example, http://wordpress.test/ would serve a website with the document root of ~/Sites/wordpress
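Parking is a one-off command, run from the folder you want Valet to serve sites from:

cd ~/Sites
valet park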

This isn’t really how we work: most projects have a “web” sub-folder inside the project to allow for files outside of the document root. So to set up local sites I’ll need to use the valet link command to register each site manually.

To create a link from ~/Sites/simonrjones.net/web for the host simonrjones.test it’s simple enough:

cd ~/Sites/simonrjones.net/web
valet link simonrjones

I can verify the sites set up in Valet via:

valet links

The final step is to ensure WordPress knows about this new local test URL, otherwise WordPress has a habit of redirecting to what it thinks is the correct blog URL.

I use the multi-environment config on my personal blog, so this is easily achieved by changing the ‘domain’ value for the ‘development’ environment in the file wp-config.env.php:

'development' => [
    'domain' => 'simonrjones.test',
    // … other environment settings
],

I can test this now via the URL http://simonrjones.test/ – which worked first time!

Testing with a more complex setup

Next, I tried this with one of our clients’ WordPress multi-site installs, which is a little more complex. The site is hosted at WP Engine so it uses the standard wp-config.php setup (not multi-environment config).

cd’ing into the project folder and running valet link is enough to serve the site from a local *.test URL. However, accessing the site doesn’t work: WordPress multi-site redirects the request to what it thinks is the correct URL.

I have WP CLI installed, so I tried to use this to update old site URLs to the new *.test ones, but I couldn’t see a way to reliably do this for multi-sites. So I went back to straightforward SQL!

-- Main site: update the site and home URLs
UPDATE wp_options SET option_value='http://clientdomain.test' WHERE option_name='siteurl' OR option_name='home';

-- Network: update the primary network domain
UPDATE wp_site SET domain='clientdomain.test' WHERE id=1;

-- Network settings: update the stored site URL
UPDATE wp_sitemeta SET meta_value='http://clientdomain.test/' WHERE meta_key='siteurl';

-- Each site in the network: update its domain
UPDATE wp_blogs SET domain='clientdomain.test' WHERE blog_id=1;
UPDATE wp_blogs SET domain='site1.clientdomain.test' WHERE blog_id=2;
UPDATE wp_blogs SET domain='site2.clientdomain.test' WHERE blog_id=3;
UPDATE wp_blogs SET domain='site3.clientdomain.test' WHERE blog_id=4;
UPDATE wp_blogs SET domain='site4.clientdomain.test' WHERE blog_id=5;

-- Each sub-site's options table: update its site and home URLs
UPDATE wp_2_options SET option_value='http://site1.clientdomain.test' WHERE option_name='siteurl' OR option_name='home';
UPDATE wp_3_options SET option_value='http://site2.clientdomain.test' WHERE option_name='siteurl' OR option_name='home';
UPDATE wp_4_options SET option_value='http://site3.clientdomain.test' WHERE option_name='siteurl' OR option_name='home';
UPDATE wp_5_options SET option_value='http://site4.clientdomain.test' WHERE option_name='siteurl' OR option_name='home';

In addition, I had to update the wp-config.php setting for the default site URL:

define('DOMAIN_CURRENT_SITE', 'clientdomain.test');

Finally, I had to set up additional URLs in Valet to serve the multi-site hosts:

valet link site1.clientdomain.test
valet link site2.clientdomain.test
valet link site3.clientdomain.test
valet link site4.clientdomain.test

The above appeared to serve the multi-site install from the different URLs. However, when I attempted to log in to WordPress I was presented with a white screen of death.

I tried valet log and viewed the Nginx error log. Nothing. I refreshed the page a few times and WordPress came up. Navigating around most WordPress admin pages seemed to work, but I got the occasional slow page load or white page. On the front end, pages in sub-folders often failed on the first request. This is a little disconcerting. It may be caused by the complex WordPress setup, one of the installed plugins, or simply the fact it’s using Nginx and we’ve built these sites to work on Apache. It’s something I’ll have to look into.

Next steps

Valet certainly seems to be a useful tool and one that is very quick to set up. Things I’d like to look at next:

  • Review the secure HTTPS option in Valet
  • Take a look at Valet+, though it looks a little out of date compared with Valet.
  • Can we customise the webserver used so it more closely matches our production environment (we use Apache instead of Nginx FastCGI)?
  • Is there a WP CLI plugin to help change multi-site URLs? If not, we should write one!

Being able to use a lightweight dev setup such as Valet would be nice, though as with everything in tech it’s clearly not completely plain sailing! MAMP Pro is certainly easier to set up and use, but it’s the speed (of web requests and the CLI) that I want improved more than MAMP appears able to offer.

Creativity, Playdate and making things

It was with some delight that I opened this month’s Edge magazine, which I found on my table this evening. This month they have an exclusive on Playdate, a nifty new handheld console built by software developers Panic.

From the article, Playdate looks awesome. It’s a cute yellow handheld with a high-quality LCD screen, simple controls and an intriguing crank, designed for fun gaming experiences written by indie developers. It looks like nothing I’ve played before. The concept seems pretty crazy, but that seems to be the point.

Edge #333 - Playdate handheld console

With a series of fun, creative, offbeat games released every Monday over wifi once you boot the Playdate up, the concept seems genuinely original, born out of a desire to just have fun and make stuff. I can’t wait to get my hands on one!

If you’re a fan of gaming, the Edge article is well worth the price of the magazine. Reading through, a few things stuck out for me.

I’ve been aware of Panic for many years. We use their excellent file transfer software at Studio 24, and I’ve always been struck by their attention to detail and quality of design. I eagerly played through Firewatch when it was released on the PlayStation 4. A fantastic game: beautifully designed, full of atmosphere and good storytelling. The sort of game I really enjoy.

From the Edge article, Cabel Sasser (co-founder) explains what he believes was the origin moment for the project. He talks about Panic being a 20-person company with revenue around the two million dollar mark. He woke up one morning with “a bit of an existential crisis”. He had a profitable, independent company without external investors, with a team that could put their hand to a range of things – not just the same sort of work they’ve been focussed on for so long.

I realised we don’t have to keep doing the exact same thing that we’re always doing – this ceaseless development-and-support cycle. We can do some weird things too, as long as we’re not betting the farm. If we have this chance, we should probably start doing some things that take us to new places. Maybe they’ll work, maybe they won’t. But if we’re not doing that, we’re just wasting our lives.

Cabel Sasser, Co-founder, Panic

This kinda resonates with me. I run a digital agency, not far off the same size and revenue as Panic. I started as a creator too, hacking together web pages and making software. Making things has always been part of my make-up. But when you run an agency that often takes a back seat, and spare time pretty much disappears if you’re not careful. It’s fantastic to see other companies of a similar size spin out internal projects into something so impressive.

Another lovely quote is:

Running this company now, I feel almost like a different person. I feel like a huge part of that is finding again how important it is to me and for everyone here to just make things, and be proud and excited about it.

Cabel Sasser, Co-founder, Panic

It’s a bold and exciting move for Panic, but one that I’m sure will give them new opportunities and rewards. There seems to be a real movement for more interesting, playful gaming at present. I hope Playdate does really well.

I’ve seen this in other companies recently too. Only this week I read that the excellent WordPress agency Human Made released Altis, their own “next generation” CMS platform built on top of WordPress. From my brief review it looks like a really interesting set of content and development tools to make creating engaging websites way easier. It sounds like an interesting venture for Human Made.

It also reminds me of Brendan Dawes’s talk at New Adventures about creating things and hacking technology together to make something new. (If you’re not aware, New Adventures is a superb conference on digital creativity, ethics, inclusion and other essential topics.)

Digital is so powerful, and teams that work in this industry end up with such a variety of skills that can be put to great use. Studio 24 is twenty years old this year and I hope to be able to make more time for creating things with my fantastic team. I’ve started already with a foray into building sites with Headless CMSs, and some tools I hope we can spin out into a viable open source project in the near future (which I also hope will spark off some interesting talks I can do at user meetups & conferences).

I’m also trying to blog more these days. Blogging on your own site seems to be coming back into vogue; I think it’s simply a nice way to note down your thoughts to help inspire and motivate. As Field Notes neatly puts it: “I’m not writing it down to remember it later, I’m writing it down to remember it now.”

Excellence in entrepreneurship: an audience with Google

I’ve just returned from an excellent talk organised by Cambridge Judge Business School’s Entrepreneurship Centre. Simon Hall hosted a fascinating discussion with Google leaders Jonathan Rosenberg, Senior VP of Alphabet and former Senior VP of Google Products, and Alan Eagle, Director of Executive Communications at Google.

The talk was primarily about the value of coaching, and was celebrating the life of Silicon Valley luminary Bill Campbell (who I admit I did not know), business coach to Steve Jobs, Eric Schmidt and others.

The event was primarily targeted at CJBS students, though there was a fair crowd of entrepreneurs and local business types; I chatted to a few after the talk.

Two things happened in the talk that I wasn’t expecting. First, quite how entertaining Jonathan and Alan were together. They were clearly good friends and had a hilarious double act going on between them. Alan’s description of how Bill Campbell grilled him in a job interview was brilliant: he made Simon stand in for himself while he channelled an intimidating Bill (there’s a good photo of this moment here).

Secondly, everything Alan and Jonathan said was focussed around human values: how important empathy and developing your team are to business success, and how coaching is a great way to achieve that. I wasn’t expecting the talk to focus so much on how important human-centred values are to business, but I was very glad to find myself agreeing with so much of what the two Silicon Valley business leaders said.

Alan retelling the story of how he was interviewed by Bill Campbell

A few notes from the talk…  

Interview as much as you can – it’s how you learn to tell good people.
A good question to ask about experience: “What did you learn from this?”

If you have to let someone go do it with dignity. Why? Because it’s the right thing to do. Because it will affect the rest of your team if you don’t. Because it’s a small world, people who you let go may be future opportunities / they may talk about your business.

5 elements of a successful team:

  • Safety
  • Clarity of goals
  • Respect
  • Big mission that matters
  • A meaningful role

Find out more at Project Oxygen  

An effective manager needs to marry the principles of coaching with management.  

Important future tech skills (Alan repeated this a lot):

  • Computer science
  • Machine learning
  • Data

Soft skills:

  • Passion, interest in learning
  • Smart creative dedicated to learning
  • Good communicator, concisely make your point, speak with passion

Important for Google’s success: Speed & simplicity  
All sorts of latency exist, so speed really matters. Fast results make people come back.

The concept of no managers didn’t really work for Google. They tried it for 18 months or so; when asked, the team wanted someone to mentor them & take decisions.

Engineers need a career ladder to rise to the highest level in a company without having to be managers. If you’re the most senior engineer the impact you can have on a tech company is profound & you should be paid the same / more than managers. Not many companies do this.  

Guide & lead, give people freedom. Don’t micromanage.  

There was a question about whether too much growth is bad. Seems not. You can hire brilliant people off the back of fast growth. With an internet business you can often support fast growth. Don’t worry about getting everything right / perfect.
Great quote: “If everything’s going right you’re not going fast enough”    

You can read more in their latest book, Trillion Dollar Coach, available from all good bookstores. I’ll be enjoying the copy I bought this evening!

Adding a Staging environment to Symfony 4

Environments in Symfony

We use Symfony a lot at Studio 24 for building modern web applications. Our normal setup is to have a local development environment, a staging environment for clients to test code, and a production live site.

By default Symfony supports the following environments:

  • dev – (development) intended for development purposes, can be used locally or on a hosting environment to test your application
  • test – (automated tests) intended for use when running automated tests (e.g. phpunit)
  • prod – (production) intended for the live web application

Ideally we would have a third web environment to represent staging, which is what we use to preview functionality before go-live. So that’s what I set out to do.

Adding a custom environment

I want to call my new environment stage to represent staging, since Symfony already uses shortened versions for other environments.

It turns out you can add any environment name you like and Symfony recognises it. So setting the new environment locally is really only a matter of updating your local environment settings file .env.local (you can also set this via actual server environment variables):

# Website environment
APP_ENV=stage
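To confirm the application has picked up the new environment, Symfony’s console can report it (look for the Environment row in the output):

php bin/console about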

Environment configuration

Symfony loads environment variables from .env files. It uses the .env file for default values, then overrides these with .env.local for machine-specific and sensitive variables (e.g. API keys or database credentials – this file should not be committed to version control). Finally, it loads the .env.{environment} and .env.{environment}.local files for environment-specific settings.

To help keep track of my staging environment variables I created a file at .env.stage to store these.
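As a sketch of the kind of thing that file might contain – the values below are illustrative placeholders, and the DATABASE_URL entry assumes you use Doctrine:

# .env.stage – example values only
APP_DEBUG=0
DATABASE_URL=mysql://db_user:db_password@127.0.0.1:3306/myapp_stage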

Package configuration

Different packages use YAML config files in the config/packages folder. I created the folder config/packages/stage/ which is used to store package configuration for the stage environment. It’s possible to inherit values from another environment via the imports key, which is really handy. Here I’m importing the prod settings for the stage environment.

# config/packages/stage/monolog.yaml
imports:
    - { resource: '../prod/' }

Composer packages

One gotcha is that your PHP code may depend on a library that Composer installs locally in your dev environment (via require-dev) but does not install on stage or prod.

When I first tested the above code it crashed, since Monolog (which is used for logging) was not found. It turns out Monolog was loaded in my local dev environment via symfony/debug-pack, which is set up to only install on require-dev in my composer file.
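To illustrate, the relevant section of my composer file looked something like this (an illustrative excerpt; the version constraint is a placeholder):

{
    "require-dev": {
        "symfony/debug-pack": "*"
    }
}

Anything under require-dev is skipped when you deploy with composer install --no-dev, which is typical for stage and prod environments.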

This simple composer require command quickly fixed it!

composer require symfony/monolog-bundle

Debug mode

Debug is enabled by default for dev and test environments, and disabled for prod and any new environments.

Normally I’d recommend not enabling debug mode for a staging site, since it is supposed to be an environment to preview the site and should work in the same way as production.

However, you can enable debug and the Symfony debug bar for your new stage environment. First set APP_DEBUG in your .env.local file:

APP_DEBUG=true

Next, ensure the debug bundles are enabled. Edit config/bundles.php and ensure the WebProfilerBundle and DebugBundle are both enabled for the new stage environment.

// config/bundles.php
Symfony\Bundle\WebProfilerBundle\WebProfilerBundle::class => ['dev' => true, 'test' => true, 'stage' => true],
Symfony\Bundle\DebugBundle\DebugBundle::class => ['dev' => true, 'test' => true, 'stage' => true],

Finally, create the following config files:

# config/packages/stage/debug.yaml
imports:
    - { resource: '../dev/' }

# config/packages/stage/web_profiler.yaml
imports:
    - { resource: '../dev/' }

# config/routes/stage/web_profiler.yaml
web_profiler_wdt:
    resource: '@WebProfilerBundle/Resources/config/routing/wdt.xml'
    prefix: /_wdt

web_profiler_profiler:
    resource: '@WebProfilerBundle/Resources/config/routing/profiler.xml'
    prefix: /_profiler
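To double-check the profiler routes are registered in the new environment, the router debug command can list them – a quick check, assuming the route names above:

php bin/console debug:router --env=stage | grep _profiler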

Summary

That’s it! It turns out it’s very easy to set up new environments in Symfony 4; most of the work is in enabling any bundles you require and ensuring the right config files are in place for your new environment.

The front-end of Headless CMS

I wrote a blog post on what Headless CMS is all about on my agency site back in December. Partly to help explain to our clients what this technology is about, partly to help crystallise some of my thoughts on the subject.

I’m currently embarking on an interesting project to build a set of tools for creating front-end websites in PHP based on Headless CMS technology. My aim is to open source this front-end code, though we need to prove it first on a few projects.

I thought I’d blog about my experience and findings here; this post is a short, technically focussed intro to all this.

OpenTech 2017

I made my first trip to OpenTech yesterday, hosted at University College London. I didn’t really know what to expect; I’d spotted the conference on my Twitter feed and understood it to be a day full of discussions on open data, technology and how they contribute to society.

I was impressed. It was a busy and passionate conference, full of people who work with tech trying to make a difference to society, making it more open and fair, against a challenging and often unhelpful world.

My day started with Hadley Beeman, a member of the W3C Technical Architecture Group. Hadley’s talk was on “Standards for Private Browsing”: she explained how users’ expectations of private browsing differ from how browsers actually implement it. Some US research found the most popular reason to use private browsing mode is to hide embarrassing searches, yet only Safari hides recent searches. Not helpful for users.

The concept of private browsing needs standardisation, not only to set users’ expectations about how their data is stored but also to help people build technology and be confident about how private mode will work. With the rise of Web Payments this is only going to become a larger issue. Hadley said more user research is needed to help in this area.

Rachel Coldicutt followed on with a passionate, excellent talk about Doteveryone, the think tank that is “fighting for a fairer internet.” Rachel gave a good overview of how Doteveryone is trying to improve digital understanding for everyone by focussing on education, standards for responsible technology, and stimulating new value models.

She talked about the rise in power of the big four “GAFA” (Google, Apple, Facebook, Amazon) and how these companies wield much unaccountable power on the internet today. With a government, if you disagree with how things are run, you can revolt; not so with Facebook. She revealed 7 developers are responsible for the Facebook timeline algorithm (just 7!), a technology that is becoming bigger news given its perceived influence on recent political decisions. She also raised an interesting idea around a “fair trade” mark on the internet and how that could work.

The next session was by Anna Powell-Smith who talked about an offshore property ownership project she worked on for Private Eye. She worked on pulling data sources together to build a map of properties in England and Wales owned by offshore companies. Offshore ownership is problematic because it’s used for tax avoidance by those with often dubious means of making money. Anna told an interesting story of how she matched FOI requested data up with the INSPIRE dataset (important, but restricted, geo-spatial data on properties), a process that seemed pretty convoluted and difficult but was successful. The Private Eye report was discussed in parliament and it looks like the government are starting to make some positive movement in making this data more available.

However, Ordnance Survey are legally obliged to make money from their data, so they are not willing to make it completely open. The critical component Anna used in her research, matching INSPIRE IDs to title IDs, is no longer available without paying £3 per property, which makes it cost-prohibitive.

The government has put this requirement on Ordnance Survey to sell their data rather than make it open. Anna made a call for any economists to help make the case for why this data should be free and how that would have a positive economic impact in the UK. If you can help, contact Anna at https://anna.ps/

The next speaker was ill, so John Sheridan helped out with an impromptu talk on his work at the National Archives. This was fascinating, touching on the different challenges between physical and digital archives, how context is important in archived data, how copying is a core part of digital archiving (“there is no long term storage solution for digital”), how this also requires validating the data you have stored is still the same (they use hashing methods to help with this), and how you need to understand the data you store so you can also provide a means to view it. The general message was data encoded in open formats is easier to archive, and to make available in the future.

John also touched on the UK Web Archive project, run by the British Library, who have a digital archive of around 3 petabytes, most of which is not published online, largely for copyright reasons. While the US-based Internet Archive has a policy to publish first and take down content on request, as UK public institutions the British Library and National Archives have a lower appetite for the risk of potential legal action, and therefore only publish when they have permission to do so.

I chatted to John in the bar after the event and he explained that the National Archives takes responsibility for archiving all government digital content, taking snapshots every 3 months or so. The Web Archive project deals with UK websites. I asked him where a past project we worked on would be archived, the Armada Tapestries site for the House of Lords. Apparently this is taken care of by Parliament itself in the Parliamentary Archive. Lots of people archiving things!

After lunch I joined the Post Fact / Future News panel which turned out to be a real highlight of the day.

James, Wendy, Becky and Gavin

The speakers were James Ball, Wendy Grossman and Gavin Starks and the panel was hosted by Becky Hogge.

James started proceedings and talked eloquently and in detail, explaining the difference between fake news (an outright lie, not so common in the UK) and post-truth bullshit (manipulation of an almost-truth) – basically where we find ourselves today. James talked at speed and with confidence, and painted a fascinating, dark picture of how news is being manipulated for political ends at present, and how a good narrative can often trump a complicated truth that is difficult to explain to the general public.

James made a great point that you “can’t use technology to solve cultural issues” and that “fake news is not an internet problem.” He highlighted that the problem is in society already, embodied in figures such as Boris Johnson who have a long history of manipulating the truth for a political agenda. He’s written a book on this topic, so go buy it: Post-Truth: How Bullshit Conquered the World!

He also noted we need to “think about the business of the internet”. The idea of business and value models cropped up a few times during the day: a lot of the issues we associate with the internet are exacerbated by how the web makes money, and alternative models need to be found to help improve the current state of affairs.

A very funny Wendy on what today’s nine year olds may think about future society

Wendy then moved on to future news. She talked about predictions she made in 1997 and how many of these hold some truth today. She went on to explore what younger generations will think about technology and society, and what future headlines are likely to be. Wendy’s talk was fabulous fun.

Gavin began his slot by reading out a written statement from Bill Thompson, who was due to speak but was otherwise waylaid at the Venice Biennale! Bill’s short piece on the rotten state of the net at present made for a sobering interlude to the discussion.

Gavin then moved on to talk about the work he’s been involved in to make the internet more open: the Open Banking Standard, an anti-slavery corporate statement registry, and tracking the origin of products through the supply chain.

He talked about how we now need to up our game: the community thought the case for open data was won, but that is not currently so.

Gavin is currently interested in creating impact@web-scale, trying to tackle solvable problems in the UK between policy and technology, bringing the public and private sectors together. He’s looking for people to help; you can sign up at http://www.dgen.net/ or find out more on his blog.

I’ve probably written too much already, but the rest of the afternoon was also enjoyable, peppered with public interest technology, Ada Lovelace Day (celebrating women in STEM), using climate change data to make a symphony, electrocution for fun and profit (and education!), using neural networks to help map happy places, what the Open Data Institute is up to, and a few beers in the union bar.

By the end of the day my head was full of ideas, problems and a better understanding of what people are doing in the area of open tech. I learnt a bunch of useful things I can take away for my day-to-day work, things that will get me thinking about ways I can help make a difference and contribute to better, more open and responsible technology.

Finally, a shout out to Kevin Marks who as well as live tweeting most of OpenTech also wrote a whole bunch of interesting notes.