RAJAR Q1 2018

RAJAR
As ever, this post is brought to you in association with RALF from DP Software and Services. I’ve used RALF for the past 9 years, and it’s my favourite RAJAR analysis tool. So I am delighted that I continue to be able to bring you this RAJAR analysis in association with RALF. For more details on the product, contact Deryck Pritchard via this link or phone 07545 425677.

50.9%.

UK radio is now listened to more via digital platforms than via analogue ones. The rise has been steady over a number of years, but as the chart below shows, we’ve finally seen digital’s share of all radio listening breach 50% this quarter.

As I said previously, while this milestone theoretically kick-starts the process for a digital switchover, I don’t actually foresee anything major happening at this point.

That’s not to say nothing will come of it, but I don’t expect a great deal to happen very quickly now that the 50% mark has been breached. While it theoretically allows the process of an analogue to digital radio switchover to begin, I just don’t see that happening very soon. Generally speaking, other things are using up lots of Parliamentary time at the moment. Similarly, I suspect that the recently announced radio deregulation will take longer than many might hope, because there just isn’t time to fit in the primary legislation required to do anything.

Ofcom published a good primer on the subject last year:

In July 2010 the Government launched its Digital Radio Action Plan. As part of this, it was requested that Ofcom produce an annual review of the digital radio market.

The Action Plan was launched to ensure that if and when digital switchover occurs in radio, it can be delivered at a time when the market is ready and in a way that protects the needs of listeners.

The Government stated that a decision on whether to set a date for digital radio switchover would be considered when the following criteria are met:

  • when 50% of all radio listening is via digital platforms; and
  • when national DAB coverage is comparable to FM, and local DAB reaches 90% of the population and all major roads.

The Action Plan was finalised in November 2013, and on 16 December 2013 DCMS announced that while there had been steady growth in digital listening, it was not yet the time to commit to a switchover. The last version of the Digital Radio Action Plan was published in January 2014.

And of course the one outstanding key challenge is in-car listening. At this point 33.4% of in-car listening is digital. That’s good, and the vast majority of new cars come with DAB as standard. But there are a lot of other cars on the road.

Elsewhere, it’s also worth noting that Q1 each year usually sees a bump in listenership because of devices sold over the Christmas period. This year, an awful lot of Amazon Alexa and Google Assistant devices were sold. But stalwart DAB radios always do well at this time of year too. Combined, they mean that post-Christmas, people change the way they listen to the radio.

Radio Listening

Reach is up to 49.2m people a week, or 90% of the population. But average hours per listener have fallen below 21 for the first time, down to 20.8 hours a week. Inevitably that’s a consequence of other things eating into overall radio listenership.

I hate to keep labouring the same point every quarter, but this is being driven to a significant extent by younger listeners. 15-24s now only listen for an average of 12.7 hours a week, which is a whole hour lower than the previous lowest figure. To put this in context, five years ago this group listened for 15.8 hours a week.

The one thing I would take away from this is that any formats or licences that target listeners by age group – particularly younger groups – are on a hiding to nothing. For example, Radio 1’s average age is 35 (down from 36 last quarter), and at this point, it’s essentially impossible to lower its average age.

National and Digital

It has been a decent quarter for Commercial Radio, with reach up 1.5% on last quarter and 4.2% on last year, just putting it ahead of BBC Radio in overall terms.

BBC Radio still has more listening overall, despite seeing hours fall 3.2% on last quarter and 1.5% on last year.

The BBC national radio networks have all seen some disappointing numbers this quarter. Five Live is perhaps the most disappointing, with a fall in reach of 5.7% on the quarter to 5.1m (down 3.7% on the year). Listening hours are worse, down 9.2% on the quarter and 13.0% on the year.

Such are the declines that I’d probably wait another quarter to be certain that they’ve not just had a bad RAJAR. While the Premier League hasn’t been the most exciting this year, there was plenty of football on during this period, and it was a generally busy time for both news and sport.

Perhaps all the listeners have gone to 6 Music, because they’ve had another superb set of results, with record reach and hours. Reach is up 8.0% on the quarter (up 8.0% on the year), to 2.5m. Hours are up a whopping 12.3% on the quarter (and up a more modest 3.2% on the year).

The interesting thing here is that 6 Music listeners might be considered to be the kind of people more likely to have Spotify or Apple Music (RAJAR doesn’t measure that), so the audience is rising at the same time as more of its audience has access to more music. Indeed, as with younger demos, 35-44s are seeing a gradual decline in time spent listening, which somehow 6 Music is overcoming. That said, the average age of a 6 Music listener is 43, and that has crept up from 38 over time.

There’s probably an interesting question to be asked around the musical breadth of knowledge of a 6 Music listener – or at least their desire to have one – and the need for guiding voices in the station’s presenters. On the other hand, a station that plays a much tighter playlist might have less demanding listeners, and therefore find itself more susceptible to listeners switching to playlists on Spotify et al. That said, listeners to those stations are probably less likely to spend £9.99 a month on music.

But I’m hypothesising wildly here. Let’s get back to the numbers.

Radio 1 will be disappointed with its fall this quarter after a decent set of results last time. It’s down 3.8% in reach on last quarter, although it’s up 4.0% on last year. Hours are also down, falling 7.7% on last quarter but only 0.5% on last year. More worrying is that the average listener spends just 6.0 hours a week with the station.

Radio 2 sees small falls too, with reach down a fractional 0.5% on the quarter while being up 2.6% on the year. Hours are down 5.1% on the quarter however, and down 2.5% on the year.

The station has just made some of the biggest changes to its weekday schedule that it has made in years, but it’s going to be another couple of quarters before we can see the first results of that. And even then, the most notable change in peak is a slight shift in the hours of Simon Mayo’s show and the introduction of Jo Whiley to the mix.

Radio 3 is down 0.9% in reach on the quarter, but up 2.6% on the year. Hours are somewhat better as it jumps 5.6% on the quarter and 2.7% on the year.

Radio 4 ducks just below 11m in reach with a fall of 3.0% on the quarter (down 1.8% on the year). Hours are up 0.9% on the quarter, but down 4.0% on the year. It’s not as though there’s a shortage of news, but one suspects there’s only so much Brexit/Trump that some listeners can take, hence the slight dip in reach after a strong run of results in recent quarters.

Radio 4 Extra has had a disappointing quarter with reach down 8.1% on the quarter, although up 3.1% on the year – which if nothing else shows that smaller stations can see their numbers bounce around. Perhaps more concerning is the 15.6% fall in hours on the quarter (and an 8.0% fall on the year).

The World Service remains fairly consistent with 1.4m listeners down 5.1% on the quarter, but up 7.4% on the year. Hours are up slightly with 3.4% growth on the quarter and 2.3% growth on the year.

Classic FM has had a solid set of results with reach down a little to 5.6m – down 1.7% on the quarter, but up 4.0% on the year. Hours are a little more mixed falling 4.1% on the quarter yet rising 10.2% on the year.

Talksport has had some of its best numbers for a while, and has risen back above 3m again to 3.1m reach – an 8.9% rise on the quarter and a massive 14.3% rise on the year. Meanwhile hours are back over 20m and are up a massive 25.4% on the quarter and 13.5% on the year. The station continues to receive newspaper marketing support from its parent company News UK, and they again seem to be more active in the sports rights market. Although not in this RAJAR period, they have recently bought some England Test cricket rights for upcoming overseas tours to Sri Lanka and the West Indies, while they also had exclusive radio commentary of the recent Anthony Joshua fight.

Digital sibling Talksport 2 has some positive numbers, with reach up 1.0% on the quarter and up 15.9% on the year. More importantly, hours are up 37.2% on the quarter and 49.9% on the year. Perhaps their EFL rights, which largely sit on Talksport 2, are beginning to pay off?

Good news for Talksport 2 listeners and others on the SDL multiplex is that owner Arqiva announced on Tuesday that it will be extending the reach of the multiplex by a further 4m people, with 19 new transmitters due to come on board.

That will also be useful for TalkRadio, which had some positive numbers as well, with reach up 30.6% on the quarter (32.8% on the year) and hours massively increasing, up 55.7% on the quarter (up 155.7% on the year). While these are good numbers, there’s no doubt that the format is expensive, and the station needs to see more growth to get it from 316,000 reach closer to somewhere around 1m.

Absolute Radio had some great results last quarter, but slipped back to 2.4m this quarter, down 7.3% in reach, although still up 11.4% on the year. In hours terms they were flat – really flat. 18,517,000 last quarter v 18,514,000 this quarter. And they were up 6.4% versus last year.

Christian O’Connell leaves Absolute Radio tomorrow, before he relocates to Australia to present the breakfast show on Gold FM in Melbourne. These therefore aren’t quite the final set of results for his tenure at the station.

The wider Absolute Radio Network has fallen a little, down 3.2% on the quarter, although still up 7.2% on the year in reach. Hours fell 4.4% on the quarter and were down 2.0% on the year.

Absolute 80s, however, did better this quarter, growing 5.8% on the quarter and up 14.8% on the year in reach. It also rose 13.5% in hours on the quarter, but fell 11.1% on the year.

Recall that Absolute 80s has a new competitor on the block in the form of Heart 80s, and the newcomer has better coverage, being on D1 rather than SDL, to which Absolute 80s moved (again, the increase in coverage of the SDL mux should benefit Absolute 80s in due course).

Heart 80s also grew, rising 20% on the quarter (it’s too new for year on year figures), while hours dipped 5.5%.

For those keeping score, Absolute 80s, with 1.560m listeners, is 161,000 ahead of Heart 80s. Although as an aside, it’s clear that the two stations, whilst both featuring music from the 80s, are actually quite different. Read this excellent and enlightening Twitter thread from Nik Goodman to get a better understanding of the differences.

Partly as a result of the success of Heart 80s, the Heart Brand (including all the local Heart stations, Heart 80s and Heart Extra) overall has had some good results. Reach is up 3.6% on the quarter and up 6.1% on the year, while hours are up 1.9% on the quarter, although down 1.9% on the year.

Sister network, Capital Brand, fared less well with reach down slightly – down 0.7% on the quarter and down 0.8% on the year. Hours fared slightly worse, perhaps reflecting wider listening behaviours in their target age group, with a fall of 7.1% on the quarter and a fall of 7.8% on the year.

The Kiss Network targets a similar age group, and saw falls on the quarter, although better results compared with this time last year. Reach was down 0.8% on the quarter but up 9.0% on the year, while hours fell 12.2% on the quarter but were up 3.2% on the year.

The Magic Network didn’t have a great quarter with reach down 3.4% on the quarter, although up 5.8% on the year. Hours are down 3.7% on the quarter and down 2.7% on the year. None of its digital sister stations – Magic Chilled, Magic Soul and Mellow Magic – is doing enormously well, with only Magic Soul seeing an increase this quarter. Mellow Magic is the biggest of the three with a reach of 432,000 and 1.7m hours.

LBC is one of the better performers this time around, and whatever you think of it, their mix of politically charged presenters and the various politicians (and ex-politicians) that they get in for phone-ins seems to work well for them.

Reach is up 7.1% on the quarter and 21.5% on the year to 2.2m. That’s their biggest ever audience under the current methodology (you’d probably have to go back to the 70s or 80s to find a bigger audience for its FM service in London, and at that time there were only two commercial stations in the capital).

Hours aren’t quite a record, but they’re up 0.3% on the quarter and 5.7% on the year.

Jazz FM isn’t a station I mention too often, but I probably should. Their reach is up 16.1% this quarter (and up 22.4% on the year), to 591,000. Hours slipped to 1.7m – down 18.7% on the quarter, although up 7.6% on the year. I mention this particularly to put their numbers in perspective with some of the other newer, but smaller digital stations.

London

The London radio market is always worth looking at – if only for signs of things to come. The average Londoner listens to 19.4 hours of radio a week – so a bit less than the UK average. In part, that will be due to fewer people driving in London, but it might also be down to things like propensity to subscribe to other audio services.

19.4 hours isn’t the lowest we’ve seen – that was 19.1 hours a week back in Q2 2017. But it’s definitely part of a trend: the last time the average Londoner listened to more than 20 hours of radio a week was back in the middle of 2016.

I will also dutifully point out that the most listened to radio station in London is, as always, Radio 4 with 2.7m listeners. That’s followed by Radio 2 with 2.1m, itself very closely followed by Capital London, also with 2.1m (I’m rounding here for simplicity).

So Capital is the reach leader commercially (Radio 1 has a reach of 1.6m). The station is up in reach on the quarter (up 1.4%), but down on the year (down 4.6%). In hours terms, it’s not so good, with a 7.9% fall on the quarter to 9.0m hours and a 16.6% fall on the year.

Heart London is the commercial music leader in terms of hours with 10.1m, up 11.6% on the quarter and up 8.8% on the year. Reach is down 4.2% on the quarter but up 7.0% on the year.

Another figure to mull over when comparing the two Global stations is their respective average hours. For Heart it’s 6.7 hours a week, but for Capital it’s just 4.2 hours a week. That feels very low for a market leader. Just a year ago, it was 4.8 hours a week.
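(As an aside, for anyone wondering where those average hours figures come from: it’s simply total weekly hours divided by weekly reach. A quick sketch using the rounded headline numbers above – so the result only approximates RAJAR’s published figure:)

    # Average weekly hours per listener = total weekly hours / weekly reach.
    # The headline figures quoted above are rounded, so this recomputation
    # only approximates the published average.
    def average_hours(total_hours_millions, reach_millions):
        return total_hours_millions / reach_millions

    print(round(average_hours(9.0, 2.1), 1))  # Capital London: ~4.3 (published as 4.2)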

Kiss is a close competitor to these two services, with 1.9m reach (down 3.7% on the quarter and up 8.3% on the year) and 9.6m hours (down 10.1% on the quarter and up 14.4% on the year). It has 4.9 hours per week average listening.

But the actual commercial hours leader in London is of course LBC, which has grown in London as it has done nationally. Reach is up 3.3% on the quarter and 17.1% on the year to 1.3m, while hours are basically flat at 11.2m (down 0.1% on the quarter and down 3.0% on the year). Its listeners spend 8.9 hours a week with it. And interestingly, their average age has just fallen to 49. LBC is perhaps younger than you think…

Magic has not had a great set of results this quarter in London, falling 12.3% in reach on the quarter and down 5.7% on the year. In hours, they’re down 11.6% on the quarter and down 4.6% on the year.

A couple of other Global services with good figures are Radio X and Smooth. Very different, but both showing positive moves.

Radio X has seen its best reach since its rebrand from Xfm, and indeed even if you include Xfm’s numbers, its best figures since 2013. Its reach is up 4.3% on the quarter and a remarkable 40.5% on the year, to 531,000.

In terms of hours, it’s an even better story, with hours up 14.3% on the quarter and up 81.9% on the year to 3.7m. That’s an average of 7.0 hours a listener per week, and the best hours the station has had since it was Xfm in 2004! Global has spent a lot over time marketing the service, and that investment may be coming to fruition.

Smooth said goodbye to Russ Williams on breakfast, but he left as the station put on 13.3% reach in London on the quarter (and up 6.6% on the year), while hours were up 6.8% on the quarter and up 0.3% on the year.

BBC London’s numbers have been a little all over the place of late. Last quarter they had some incredibly good, record-breaking numbers, and things have, perhaps, "normalised" a little this quarter. Reach is down 20.9% on the quarter, but still up 38.0% on the year to 454,000. Meanwhile hours are down 50.5% on the quarter, but up 59.6% on the year to 2.1m. The station’s numbers are, frankly, bouncing ridiculously. 50% swings between quarters shouldn’t really happen, and it suggests that measuring the station’s audience is hard.

BBC London aside, it feels like RAJAR in London isn’t swinging around as wildly as it had in the past, which is much better for the currency.

MIDAS

RAJAR’s MIDAS survey isn’t actually part of the regular RAJAR release and was published last week. But I thought that there were a few things that were worth mentioning here.

11% of the UK population listen to a podcast in any given week – that’s 6.0m people (down very slightly from last time around, although the trend remains upwards).

Radio’s share of all audio is at 75% which is the same as last time around.

But if there’s a theme, it’s that live radio’s share of all audio for 15-24s has fallen below 50% for the first time. In the Winter 2017 survey it was at 50% for this demo, and 63% for 25-34s. However, in this new release, the share amongst 15-24s has fallen to 46%, while that among 25-34s is the same as before. On the other hand, on demand music services (e.g. Spotify) have grown from 28% to 34% for the younger demo.

This rate of change is fast, and it’s entirely conceivable that within a year, radio will have fallen below on demand music services for 15-24s.

At the moment this is a youth oriented issue. Among 35-54s, only 6% of audio is on demand music, and it drops to 1% for 55+. That offers some comfort to radio, but it will need to adapt to match the growth of these new services.

The full MIDAS release is here.

Further Reading

For more RAJAR analysis, I’d recommend the following sites:

The official RAJAR site and their infographic
Radio Today for a digest of all the main news
Go to Media.Info for lots of numbers and charts
Mediatel’s Newsline will have lots of figures and analysis
Paul Easton for lots more analysis including London charts
Matt Deegan will have some great analysis
The BBC Mediacentre for BBC Radio stats and findings
Bauer Media’s corporate site
Global Radio’s corporate site

All my previous RAJAR analyses are here.


Source: RAJAR/Ipsos MORI/RSMB, period ending 1 April 2018, Adults 15+.

Disclaimer: These are my views alone and do not represent those of anyone else, including my employer. Any errors (I hope there aren’t any!) are mine alone. Drop me a note if you want clarifications on anything. Access to the RAJAR data is via RALF from DP Software as mentioned at the top of this post.

Three New Exhibitions

There are some really good exhibitions on in London at the moment. Actually, there are always really good exhibitions on. But over the weekend I went to three new ones, and all three were well worth visiting in their own right.

I spent a May Sunday visiting the three and using a Boris Bike to travel between them.

My first stop was the Victoria and Albert Museum where they have just opened The Future Starts Here which aims to show “100 projects shaping the world of tomorrow.” That could make it sound a little dry, but there are some real things of substance in here. From food to society and democracy, everything is covered.

The exhibition explores electronics that are there to help us – from a robot that will seemingly do the laundry for you (the first thing you see) to exosuits that could help those who require extra support or strength. Sometimes the projects are relatively simple – reusing old smartphones to do other tasks around the home.

Other times, these are much bigger projects – underwater drones, or 3D-printed buildings to live in on Mars.

The exhibition asks questions of the future of democracy. They even have an exhibit which shows Alexander Nix of Cambridge Analytica famously explaining what his company claimed it was capable of, speaking at a conference. I laughed out loud when I saw they’d included that!

The exhibition is there to challenge us, and ask us questions. What is the future going to mean for us?

It runs until 4th November 2018.

From there it was a ride through Hyde Park around Buckingham Palace, through Westminster and along the South Bank to Tate Modern. They’ve just opened a new exhibition – Shape of Light: 100 Years of Photography and Abstract Art.

This is an exhibition to be experienced rather than described. The range of images – mostly photographs – is broad, and they are arranged thematically by subject. The tale is told of abstract movement and photography moving in parallel as artists began to understand what was achievable. Sometimes they utilised nature; other times very close-up imagery, presenting us with things we mightn’t understand.

I went away quite enthused and keen to explore some of the themes in some of my own work.

Shape of Light runs until 14th October 2018.

Finally it was over the bridge and into the City to the Museum of London, somewhere I’ve not been for a while. They have a new photographic exhibition called London Nights. It displays an enormous range of often extraordinary photos taken over the last hundred years or more. While today we expect our smartphones to be able to take halfway decent photos in the lowest of light, it’s worth noting that photographers in the past had to go to great lengths to take photos in such conditions. Some of the earliest pictures, showing London’s fog-filled streets, are therefore remarkable.

The real fun can come from seeing everyday shots of London from the past, particularly in familiar settings. Trafalgar Square, Leicester Square and Piccadilly Circus appear repeatedly, with the people and the signs being fascinating.

The exhibition is thematically structured, and reaches right up to some very contemporary photographs. But sometimes a photographer like Bill Brandt will have photos in a variety of sections, seemingly able to cover it all.

Often it’s the very ordinary that becomes extraordinary. There is a series of perhaps a couple of hundred contact prints taken in the fifties, and even though the images are "just" of people, you can’t help staring into the lives of those captured at that moment in time.

The exhibition catalogue is particularly good and worth mentioning, being published by Hoxton Mini Press, who publish some excellent photographic books. Furthermore, compared with many equivalent exhibition catalogues, it’s really good value at just £14.95 for a hardbound copy (for exhibition ticket holders).

London Nights runs until the 11th November 2018 and is well worth a visit.

Gursky

In 2011 a record price was set for the sale of a photograph: Rhein II by Andreas Gursky was sold at Christie’s for $4.3m. It was the then highest price paid for a photograph (and likely remains so). Compared to the Leonardo da Vinci Salvator Mundi painting that was sold for $450m last year, that’s a relatively modest price. But photo sales are more interesting.

First of all, there’s the fact that they’re largely reproducible. While a painting is one of a kind, a photographer can make as many, or as few, prints as they choose. A photo might be sold in editions of as few as 1 or as many as several thousand.

Gursky reportedly sells in editions of six, with two artist proofs. With no attribution displayed, I assume that the exhibits in the Gursky exhibition currently nearing the end of its run at the reopened Hayward Gallery are all artist proofs.

I went along to the exhibition because, frankly, I’ve never really got Andreas Gursky. What I mean by that is that while I appreciate his skill as a photographer, and the grandiosity of his works’ scale, I have never seen him as an artist far and away ahead of other photographers, as the prices of his pictures tend to suggest.

I wanted to see if my eyes would be opened by this exhibition. Was I missing something? Why are some of his photos traded for millions of dollars?

Reader, I still don’t really understand.

Gursky absolutely makes powerful pictures, often detailing man’s impact on the landscape. And the scale of many of his photos is really important. They are often more than 2 metres wide or tall, some bigger than that. And Gursky’s style is to have enormous depth of field – everything should be in focus. Furthermore, and importantly, many of his images are composite photos made up from several images, with a significant amount of post-processing in software like Photoshop. Gursky is clearly a master at this kind of thing, because as he flattens out perspectives, you can’t see the joins.

I think the most obvious photomontage for me was a picture entitled Tour de France, which purported to show the race heading up a mountain. Except that somewhere in the lower portion of the image you could see the King of the Mountains banner – which would almost always be somewhere near the top. The images were probably taken from a helicopter, and I’m not certain they were shot on a single mountain. You can’t see enough detail, but some of the “leading” cyclists don’t seem to be accompanied by camera bikes which would ordinarily be the case, while groups further down the mountain do.

And Gursky also makes his colours pop quite a lot, often adding an almost ethereal glow to the pictures.

So these are heavily manipulated images. But they don’t pretend to be anything but that. And so I’m not sure.

Some of the images are not even taken by him. The exhibition features a satellite photo of Antarctica. I suspect that it’s a heavily manipulated collage of many satellite images, and it’s possible that Gursky commissioned his own images from a satellite photography provider. A second image of the North Atlantic claims to have had much of the clear water created with software. So perhaps Gursky was using imagery from a platform like ESA’s Earth from Space.

In another photo entitled Supernova we see a relatively decent example of astro-photography, but nothing especially impressive.

Returning to that record-selling Rhein II, which is displayed here: what’s most remarkable about it is how unremarkable it is. Another photomontage, it sees Gursky remove a power station to leave nothing but the grassy banks, the river, and the sky.

I’m probably being unfair. Gursky’s images are impressive, and he does have something to say. But I can’t claim to have been converted by this exhibition. I would put him in a similar category to Damien Hirst, in that I can see the talent, but I don’t really understand the appeal, and certainly don’t understand the prices that are achieved by his works.

Anyway, if you’ve not seen the exhibition, it’s too late, as it closed a couple of weeks ago.

Garmin Varia RTL510

I seem to have a constant battle with rear lights on my bikes. The main problem is that I use a saddlebag on my full-size bike, and attaching a bike light to it seems a simple task but tends not to work brilliantly.

If you have enough seat-post showing, then placing the light below the saddlebag in such a way that it’s still visible to traffic, is probably the preferred option. But in my case, there isn’t really enough seat-post showing.

Topeak seem to have the popular saddlebag market sewn up, and I have owned several of their models. However, in many instances, when you hook a light through the slot made for it, the light hangs backwards and downwards, meaning that it isn’t as effective. Remember, a rear light is basically only there for you to be seen!

My preferred rear lights, for compactness, have been Lezyne’s Zecto Drive range. But they suffer this problem.

My recent solution has been to switch to a Topeak Wedge Sidekick saddlebag. I have the smaller of the two sizes. That’s enough for a tube, a couple of CO2 canisters, a large multi-tool, tyre levers and patches. Importantly, it’s firmer than other Topeak models, so hooking a light on the rear keeps the light pointing higher rather than lower. I’ve been happy so far.

All of which brings us to Garmin’s new Radar Light. Now why might I want a radar light? Is that strictly necessary? The answer is clearly not, but it has immediately proved itself useful.

The light fixes to your bike via a regular Garmin quarter-turn connection. The box includes mounts for a seat-post, but as mentioned above, I don’t have room to place it on a seat-post. Fortunately, creative people who design stuff to be 3D printed have solutions for you. I bought a Varia Saddle Bag Clip via Shapeways, who 3D print, to order, things that creators have uploaded. It’s an extra cost, and it’d be nice if Garmin packaged one in their box, but it does the trick. Alongside the Topeak Wedge Sidekick, the light stays firmly pointed in the correct direction.

The light itself is relatively basic. There is a single LED and it has four modes – solid on, night flash mode, day flash mode and standby mode (as far as I can see, standby mode is a bit useless since it doesn’t have traffic detection). The battery is recharged via micro USB and the battery life seems decent, with 6 hours in solid mode and 15 hours in day flash mode. Fine for most rides, but you’ll probably need a backup light if you do, say, the Dunwich Dynamo.

So how does it work in practice? While a standalone device is available (RTL511), it’s perhaps most useful when paired with a compatible Garmin bike computer. In my case I paired it with my Garmin Edge 1000, which was as simple as adding a new sensor. In the top right-hand corner you get an indicator that there’s a connection, and you’re ready to go.

It works by detecting larger objects that are moving at a different speed to you. When it sees one, it gives you an alert and small dots appear on the side of your Garmin bike computer’s screen (the right-hand side by default). The device can track several vehicles at once, and you’ll see a series of dots. The closer the dots get to the top of the screen, the closer the vehicles are to you. If a car approaches particularly fast, the screen goes red, but if it’s slower then you get green. The unit will also beep to alert you to this traffic.
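If it helps to picture the logic, here’s a purely hypothetical sketch of that kind of alerting – this is not Garmin’s actual implementation, and the names and thresholds are my own invention:

    # Hypothetical sketch of the alert behaviour described above – not
    # Garmin's implementation; names and thresholds are invented.
    from dataclasses import dataclass

    @dataclass
    class Target:
        distance_m: float          # how far behind you the vehicle is
        closing_speed_kmh: float   # how much faster than you it is travelling

    def alert_colour(targets, fast_threshold_kmh=30.0):
        """Green for ordinary overtakes, red if anything is closing quickly."""
        if not targets:
            return "clear"
        if any(t.closing_speed_kmh >= fast_threshold_kmh for t in targets):
            return "red"
        return "green"

    # Two cars behind: one ambling past, one closing very fast.
    print(alert_colour([Target(120, 15), Target(80, 45)]))  # -> red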

I must say that in practice, it worked very well. You do get the occasional false positive, and if a car stays behind you, matching your speed, perhaps up a slow winding hill with few overtaking opportunities, it may lose the vehicle for a while. Other cyclists tend not to show up, but in general I really like it. Note too that it obviously only detects traffic behind you and coming towards you. You shouldn’t see dots tailing off towards the bottom of the screen!

The radar has a 40-degree beam width, which covers a decent chunk of the road. It also means it continues to work going around corners, for example. Garmin says that it can detect vehicles up to 140m away, and I’ve no reason to doubt that in my usage.

And when the vehicle gets very close, the blinking on your light increases in frequency to make sure that the driver has seen you!

The only real downside is the impact on battery life of your bike computer. The Edge 1000 I use has never had amazing battery life, but I got the low battery warning after a 70km ride last weekend which is a bit early. Obviously, the number of sensors you’re using will impact on that, as will things like screen brightness and me using maps (which I was). But while the light itself will probably last well, you’ll need to keep your bike computer’s battery topped up.

I’ve not tried the light in the city centre, and I understand that it can be less useful there – probably too much other traffic to cope with. In any case, you nearly always have cars behind you, so there’s little added value. It’s best for those places where it feels like cars sneak up on you.

Even with only a couple of rides under my belt, I’m already a fan.

For a much better and more detailed review, DC Rainmaker is obviously the place to go.

The Redundancy of Imploring Me To Change Things

I regularly receive emails of the following type:

Hi, My name is XXXX and I’m writing to you on behalf of YYYY.

We have noticed that you wrote about [topic] on your page [URL].

We have a new [infographic]* that would be helpful to your readers.

We think you’re doing a wonderful job and everything you publish is excellent.

Can you make the [requested change] and let us know when you have done so?

Yours,

Invariably I simply ignore the email, and then I get several follow-ups over the course of the following days, weeks and months.

A couple of things.

This is a blog. The only time I make changes is if there’s something inaccurate, wrong, or there’s a worthwhile update that readers should be aware of.

I assume that these pages are targeted because if you enter the right search combination into Google, a page from my blog ends up somewhere vaguely towards the top of search results.

I literally have no interest in updating old pages. While this blog isn’t some kind of journalistic record, it does represent my thoughts and views at the time the entries were written. And just because you’ve got a better place for me to link to now – well, it’s kind of your fault if my original link wasn’t great.

Of course, I completely realise that these emails are automated. But that knowledge makes me even less likely to make changes. Do not underestimate my intransigence!

But I do hope the companies that are employing these agencies to drive more traffic to things that they want to promote are completely wasting their money.

* When I say “infographic”, I mean some kind of feeble unsubstantiated graphic that has the most dubious information imaginable, and is basically an advert.

Netflix, Independent Cinema, and Hollywood’s New Business Model

The other day The Ringer published a piece about Netflix and their original movie strategy. The piece, entitled Netflix and Shrill, listed the original movies that Netflix has already released in 2018 and challenged readers to see how many they recognised. For most people, the most familiar title will have been The Cloverfield Paradox. This was an $XXm space horror film that became part of the Cloverfield franchise. However the studio that made it, Paramount, got cold feet and decided to sell the thing to Netflix lock, stock and barrel. Netflix promptly gave it a surprise release right after the Super Bowl, during which, of course, they promoted it.

But what about the rest of the titles in Sean Fennessey’s piece? Well, only three others on the list actually resonate with me at all – Mute, Kodachrome and Mercury 13. The first because it’s a Duncan Jones film, and the other two because I just added both to my Netflix list.

Netflix gets films in a few different ways. It sometimes licenses big-name studio films either directly from the studios or via third-party rights packages. That’s the way most of those familiar titles end up on the service. However, those titles are probably only licensed for a specific period of time. That’s why you get lists of movies that are coming off the service.

Then there are those it acquires at film festivals. The model for smaller independent titles has often been to scrape together funding from wherever, then pitch up somewhere like the Sundance Festival and try to get a distributor to take on the picture, getting it into theatres and, importantly, marketing it. The latter is expensive, and it’s the reason why titles sometimes end up unseen even though funding had been found to actually make them. Netflix’s preferred model is to buy the global rights and buy out the film in perpetuity. But sometimes that’s not possible, because different territories’ rights may have been given up as part of the funding model. Furthermore, residual rights for home release like Blu-ray or iTunes may reside with someone else.

Finally, there are Netflix original productions – those that are put together on paper and then shot specifically for Netflix. These are labelled "Netflix Originals," although confusingly, so are those acquired at places like Sundance. When Netflix owns the film in totality, they get to release it globally and own it in perpetuity on every platform. They control whether you can ever even see the film somewhere like iTunes.

What all this means is that the list at the top of The Ringer article only completely applies to the US. That said, when I checked, all but one of the films was also available in the UK.

I recently read a really good new book called The Big Picture by Wall Street Journal reporter Ben Fritz, who has long covered the entertainment beat. The book digs deep into the current Hollywood business model, because it has changed fundamentally inside the last ten years. You only have to look at the table in The Ringer piece.

Fennessey notes that the six major Hollywood studios have released a total of 25 films in the first 16 weeks of 2018. During that same period, Netflix has also released 25 films!

But there’s a reason for that. Hollywood has just dropped out of the middle market – those films with production budgets of $30m-$80m or more that weren’t based on franchises, relying instead on audiences turning out to see stars. They included thrillers, romantic comedies and more serious fare. Fritz’s book takes a really good look at the model that used to hold up Hollywood, because some of those titles in the past might have lost money, but others would have made decent cash.

However, in the scheme of things, Hollywood was only making around 10%, and now, for a studio like Disney, it’s closer to 30%. That’s because they don’t these days make films that aren’t based on franchises or other known intellectual property.

Most famously Disney has Marvel. But they’ve also got Star Wars, their own animated back catalogue now being remade in live action, Pixar (who are perhaps the only real originators of new stories at the moment, even if they themselves are relying more than ever on franchises. Did we really need another Toy Story, or did the trilogy end perfectly before?), and coming soon Indiana Jones.

Fritz’s book looks closely at the travails of Sony, in part because they were the studio considered the most talent-friendly in the past. Amy Pascal, who led the studio, had great rapport with the talent, and as a result Sony was home to lots of those kinds of mid-budget films, while only really having Spider-Man as a top-tier franchise.

The other reason the book uses Sony as a case study is because of the massive email hack. All those communications ended up online and viewable to all. They caused Sony enormous damage at the time, not least when studio heads bad-mouthed people in some of those emails. But Fritz uses them to illustrate some of the inside thinking at Sony as they realised that they desperately needed franchises, and at the same time were struggling with their most valuable asset in Spider-Man. As long as they kept making new Spider-Man movies on a semi-regular basis, Marvel wasn’t able to grab back arguably their biggest property.

This is all important in light of The Ringer piece because it explains why the number of studio releases this year equals the number released by Netflix. If it wasn’t for Netflix, it’s not clear how those movies would get released at all!

I’m not saying that some of them wouldn’t make it to our screens. In the US, Alex Garland’s highly regarded recent release, Annihilation, based on the Jeff VanderMeer novel, got a theatrical release. But the studio who made it – Paramount again – got slightly cold feet and sold the rights for the rest of the world to Netflix. So a film that was visually spectacular ended up going to a screen no bigger than our televisions, and no doubt for many people, no bigger than their phones. However, that’s another discussion for another day.

Had Netflix not existed, then yes, I suspect some kind of theatrical release would have happened for Annihilation – certainly in the UK. But I can’t see studios like Paramount continuing with this kind of strategy for long. Nor can I see Netflix wandering around picking up an endless succession of studio releases that the studios have suddenly got concerned about. While Annihilation is excellent, the same can’t be said of The Cloverfield Paradox, which is decidedly the weakest in the somewhat contrived franchise.

The risk is that Netflix is perceived as the dumping ground for movies that have tested badly with the distributors. Of course Paramount and their ilk manage to avoid having a flop on their hands, and come out cash neutral, or perhaps with a small upside.

Meanwhile, I completely understand that filmmakers must be frustrated. They made these films to be shown on the big screen – that’s how they’re conceived and shot. You frame things differently for television. On the other hand, it has long been the case that far larger audiences will see films on television than on the big screen.

More and more, then, it’s going to continue to be Netflix and Amazon that become the homes of these medium and smaller films. What they perhaps struggle to do is sufficiently market those films.

A lot is made of Netflix’s algorithms supposedly surfacing films that viewers will want to see with incredible accuracy. I don’t agree. I’ve long felt that Netflix (and Amazon) are woefully bad at surfacing their own titles. They think they know me, but they really don’t.

When Netflix emails me to alert me to a new Adam Sandler release, Netflix being the exclusive home of new Sandler releases these days (Fritz’s book details this deal), then it has failed to grasp even the most basic understanding of my interests. Of course they only know what they know. They don’t know that I enjoy Westworld on Sky Atlantic; The City and the City and Howards End on the BBC; Endeavour on ITV. They don’t know that I saw nearly all the Oscar Best Picture shortlist at the cinema this year.

Furthermore, when big releases like Annihilation or that recent flawed Duncan Jones title, Mute, come out, I have to really go searching to find them. Did either Kodachrome or Mercury 13 show up on the Netflix home page? No – I had to do a search.

Now these are titles that I’m actively aware of. What about others that I suspect I’d like if they were marketed properly? Well, those are the titles that are disappearing into the depths of the platform.

It still seems remarkable to me that neither Netflix nor Amazon are able to replicate what a good physical store is able to do in showing me new titles. If I visit a branch of Fopp (about the only significant retailer of physical discs in the UK right now), I might browse a display of films from the Criterion Collection, the BFI or Second Sight. In some instances, I simply won’t have heard of some of the titles, but I’ll still pick up the discs and look at them. I may actually buy them. The same is true in a good bookshop where, as well as the latest bestsellers, the bookseller has perhaps contrived to display some thematically interesting books together on a table somewhere.

A properly released mid-budget or indie film will have press ads, posters, bus sides, and importantly, reviews. The latter is an area that Netflix and others need to work hard at. Most of the broadsheets have full-time film reviewers, but in the main they don’t review streaming titles very well. The release medium seems to dictate what gets reviewed. In the past studios would "game" this: a release that was really "direct to DVD" would get a brief cinema release over a weekend just so it got a bit of notability (and some reviews) before you spotted the title in the DVD aisle of Sainsbury’s the following week.

Somehow a movie poster can tell me more about a film than a small box with barely even a one-line description of the title. Netflix has some incredible algorithms to test multiple images to find just the right one to appeal to me. Am I a fan of a particular actor? Then I see that actor in the image on the platform. You see something different to illustrate the same title. But beyond that, they need to work harder. Choosing to start a stream is a much more proactive choice than flicking through the channels on a remote control before settling on something.
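(As an illustration of what "testing multiple images" might look like, here’s a toy sketch of my own – emphatically not Netflix’s actual system – in which a simple bandit gradually favours whichever artwork gets clicked most often:)

    # Toy illustration of testing multiple artwork images – not Netflix's
    # actual system. An epsilon-greedy bandit that mostly shows the image
    # with the best click rate, but occasionally tries the others.
    import random

    class ArtworkPicker:
        def __init__(self, images, epsilon=0.1):
            self.epsilon = epsilon
            self.shows = {img: 0 for img in images}
            self.clicks = {img: 0 for img in images}

        def choose(self):
            if random.random() < self.epsilon:
                img = random.choice(list(self.shows))
            else:
                img = max(self.shows,
                          key=lambda i: self.clicks[i] / self.shows[i] if self.shows[i] else 0.0)
            self.shows[img] += 1
            return img

        def record_click(self, img):
            self.clicks[img] += 1

    picker = ArtworkPicker(["lead-actor.jpg", "landscape.jpg", "villain.jpg"])
    chosen = picker.choose()
    picker.record_click(chosen)  # feed back a click if the viewer plays the title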

So that’s the real reason why those movies have disappeared without me being aware of them. That said, if you gave me a list of everything released at the cinema in the first few months of this year, many of them would be unfamiliar too. There are a lot of films craving attention, and only so much attention that they can be given.

I’m not going to criticise Netflix for their release strategy – but they do need to work harder on marketing of titles. Otherwise, yes, it can feel as though these films didn’t exist at all. An unfamiliar movie title in a long list remains just that. A consumer gets more excited when they see a known property than an unknown one.

The Ringer piece notes forthcoming films from Paul Greengrass and Alfonso Cuarón, both of which I’m excited to see. Netflix will also be bringing Andrew Niccol’s new SF film, Anon (it’ll air on Sky Cinema in the UK). I’m always keen to see a new film from the man who brought us Gattaca. As long as Netflix does enough to raise the profile of these films, rather than them at best appearing as meaningless titles that tell us nothing, then I’m excited for their future.

The studios, however, I’m more worried about. Their strategy of shifting to fewer and bigger films runs all kinds of risks in the longer term. The words ‘eggs’ and ‘baskets’ spring to mind.

Marvel may be unassailable at the moment, but it only takes one or two duff movies, and that success can begin to slip. In his book Fritz notes that the reduced number of releases affords movie executives more time to spend on the titles that they are releasing. They can give them the time that they need, delaying releases if necessary. That’s great in theory, but even Marvel films have dates to meet, particularly if the outcome of one film leads into the next Avengers title or whatever.

The Marvel Cinematic Universe is, as he says, the world’s highest budget TV series. Audiences go and see the new Marvel films regardless of the hero, a bit like watching your favourite TV shows week in and week out. Marvel tries to structure the films a little like a TV procedural. You can basically watch each as a standalone, but of course there’s a larger story arc underlying the series. But as we know, even the biggest TV series juggernaut eventually falls from grace.

And will audiences continue to actually go to cinemas? They’re fighting the battle by laying on bigger and better seats that can sometimes be more akin to a business class seat on a long-distance flight. They’re offering in-chair food and drinks service, and we’re seeing new formats like IMAX 3D and 4DX. Yet cinema ticket prices continue to rise ahead of inflation, and cinemas become ever more hostile environments when they don’t ensure that patrons keep their phones switched off, for example.

Disney’s answer to this potential uncertainty is to get skin in the streaming game as well. With its Disney Life app in the UK, and the bigger offering that is coming in the US, they get to do their version of Netflix. Star Wars and Disney titles will soon disappear from Netflix as a previous deal expires. Don’t expect to see further expansions of the Netflix Marvel TV series featuring the likes of Jessica Jones and Daredevil, although I suspect the existing titles will continue, with the former having just been renewed for a third season.

Disney is claiming back its catalogue, and will no doubt look towards making its own Marvel TV series, and almost certainly, a live action Star Wars universe series. Who would bet against a reboot of the Young Indy series in the future too?

Will audiences get bored of superheroes? Are there enough franchises out there? How often can the same series be “rebooted”?

Who knows. But Hollywood is betting big time on them not running out any time soon.

Winding Down Local TV

In the dim and distant past of 2011, Jeremy Hunt, then Culture Secretary, kicked off “Local TV.”

“For consumers, what this will mean is a new channel dedicated to the provision of local news and content,” he said.

In due course, he saw through the legislation to create a series of local TV services. This included the requirement for channels to be carried on all the main broadcast platforms. Furthermore, the new services would find positions fairly high up their respective EPGs. Broadly speaking, the higher a channel appears in the EPG, the more viewing it is able to capture.

The BBC’s then licence fee settlement included funding that it had to pay to the new services, both to build out a transmitter network and to provide funding for each channel over the first three years. In theory, programming might find its way back to the BBC.

Famously Hunt said, “Birmingham Alabama … has eight local TV stations – despite being a quarter the size of our Birmingham that, again, doesn’t even have one.”

But the idea was flawed from the outset.

First of all, equating UK and US TV stations was an irrelevance. US TV networks don’t exist in the same way. They are networks of largely independently owned stations, each of which affiliates itself to a major network in a given market. Sometimes there are big operators who own multiple stations; their station in one city might be an affiliate of CBS, while in another it is an affiliate of NBC. Mornings and evenings are filled with network programming, while afternoons are filled with nationally syndicated programming (Judge Judy, Ellen, etc).

Those “eight” local TV services in Birmingham, Alabama are basically ABC, NBC, CBS and so on, with local news bulletins scattered throughout the day when they’re not playing syndicated or network programming. Pretty much the same as watching BBC1 or ITV from a viewer’s perspective then. There are barely any true local services that operate around the clock. Sure, an affiliate might break away from the main network to cover a major breaking news story in its area, or more likely a car chase live from a helicopter, but they’re not truly “local” beyond news programmes and advertising sales.

Nonetheless, a variety of people applied for the early local TV licences advertised by Ofcom and they were duly handed out via a "beauty parade." In other words, to win a licence, you had to promise the best programming. That was another flawed part of the process, since it meant applicants tended to over-promise.

The new owners of the licences varied. In London, it was the group that owned both the London Evening Standard and The Independent that got the licence. In Glasgow and later Edinburgh, Aberdeen, Ayr and Dundee, the local ITV franchise STV (which is still independent of the rest of ITV) won the licences. They were more successful than some others because they had a large news operation anyway across Scotland, as well as a healthy library of STV owned catalogue programming. However even these channels, collectively known as STV2, are facing a review over their future.

In Norwich, Mustard TV was operated by newspaper publisher Archant. They published the major daily and weekly newspaper titles in the area, and as in London, would be able to share resources with their stablemates.

But a couple of groups emerged to run what were effectively wider groups: “Made In…” and “That’s…” Your local service might be That’s Oxfordshire or That’s Lancashire, Made In North Wales or Made In Tyne and Wear.

Made Television has six stations while its larger rival That’s TV has fourteen.

Sometimes these groups won the licences from the outset; other times, they took over failing local stations (including the aforementioned Mustard TV).

But all the time, while these quasi-groups were being built, something else was happening. A series of "change requests" was going through. Each of these would see a reduction in the number of hours of new original programming each station had to broadcast. The initial bidders had been wildly optimistic about the volume of new television they could make – indeed the "beauty parade" aspect of licensing actively encouraged them to make these promises. But it’s hard to make good television. It’s also hard to make cheap television. And it’s very hard to make good, cheap television.

Every so often, the stations went back to the regulator and asked to be relieved of some of the promises they’d made. That mostly meant reductions in locally made programming.

If they weren’t making original local programming, then how did they fill their schedules? Well, they could license old programming from various parties and save costs.

London Live, for example, licenses large swathes of Channel 4 programming, and fits that between cheap Danny Dyer films that it has also licensed.

Made Television is able to license episodes of Judge Judy, It Takes a Killer and Medical Detectives – all cheap syndicated fare.

Any channel could license a lot of this programming, but not every channel gets a prime EPG slot the way the local services do. Discovery or UKTV would kill to get Freeview channels 7 or 8. Viewers find those channels much more easily.

On the plus side, local TV has probably been a boon for Talking Pictures TV, the classic film channel. It has had an agreement in place that saw carriage of its channel across a lot of That’s TV stations.

Which all brings us to today’s news that Ofcom wants to end the rollout of new local services. This hasn’t come a moment too soon, because the economics just don’t work.

If a group wants to start a TV channel in 2018, then they should be perfectly able to do it on their own without any government assistance. And building expensive broadcast infrastructure really doesn’t feel the way to go. While there are definitely advantages to broadcast versus IP delivery, building a community TV channel on, say, YouTube would be a perfectly sensible and viable thing to do. Committed volunteers using cheap cameras and open source software can produce decent quality video – tens of thousands of YouTubers show what’s possible.

Indeed, the idea that a small TV channel is capable of filling 24 hours a day is laughable, so concentrating on a decent quality single programme that can be watched at the viewer’s convenience is definitely the most pragmatic solution.

TV is not easy, and most of the groups that started out with local TV services have struggled. Viewing figures are low – indeed for the most part they’re not collected by the ratings body BARB (Be very dubious of claims of viewership that come from other surveys).

The only real good I can see coming out of this experiment was as a training opportunity for people new to the television industry, which is far too London-centric. But even then, I’d love to know whether everyone is being paid or not.

The idea was flawed from the outset, and while the channels that remain will probably be able to struggle by for a while, they simply can’t afford to make quality local TV programming – especially news. While some UK TV regions are large, meaning that viewers feel distant even from their local BBC or ITV news programmes, we shouldn’t underestimate how expensive that programming is for both the BBC and ITV to make. It’s hard to see how a cost-effective local TV service can truly fill that void.

And anyone who’s spent any time actually watching local TV bulletins in the US will know that for the most part, they’re not high quality, often concentrating on stories that make good pictures (car crashes, fires, the aftermath of murders), and filling their bulletins with syndicated material of often dubious quality. (See, for example, the scandal surrounding Sinclair Broadcasting recently.)

The whole plan was wrongheaded from the outset, taking up resource at the regulator, and costing licence fee payers money that doesn’t end up on the services that they’re actually paying for. Even in 2011, the future of hyper-local services was clearly the internet, and the US TV model was both irrelevant to the UK market and, in any case, not very good.

I would never want to see services closed down, but this is an experiment that has completely failed.

Gaming Google

It’s widely understood that news organisations can find the going quite precarious in this digital age, with a reluctance on the part of consumers to pay for news, and advertising alone not bringing in enough revenue. So it’s perhaps not surprising that they should take whatever advantages they can, and some of these seem to involve "gaming" Google.

I’ll highlight a couple of things that do irk me a little. But it’s worth noting that while these work for news organisations, they probably won’t work for anyone else. That’s because Google tends to prioritise news outlets in the results of searches it considers newsworthy.

Generally speaking, if your search is purely factual and not newsworthy, then unless a Google "snippet" appears, the top results will be relevant sites, quite often including results from places like Wikipedia or Quora.

However, if the search is about current events, then Google throws recently updated news sites into the mix, and these will find themselves in a prestigious position near the top of the page. Most of the time, that’s because it’s relevant. Someone searching on a current event probably does want a news site at the top of the list of results, rather than some dated article that contains the same keywords.

But that means that news organisations can game the system a little, and here are two examples.

1. The Google Doodle

As anyone who ever uses Google knows, Google loves to replace its regular logo with doodles on its home page. These celebrate all sorts of things from anniversaries of famous people to major events that are happening. Sometimes the doodles are localised to specific countries or regions, and other times they run globally.

Occasionally there’ll be a really ornate interactive one that offers something like a game or even a musical instrument!

But what happens when you see a doodle that you perhaps don’t understand or that intrigues you?

You click it.

And therein lies an opportunity. Because what that actually does is perform a Google search on whatever the subject matter is.
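To make that concrete, a doodle is effectively just a link to an ordinary results page for its subject. A rough sketch (illustrative only – the real link carries extra parameters):

    # A doodle click is essentially a link to a normal Google search results
    # page for the doodle's subject. Illustrative only.
    from urllib.parse import quote_plus

    def doodle_link(subject):
        return "https://www.google.com/search?q=" + quote_plus(subject)

    print(doodle_link("John Harrison"))
    # https://www.google.com/search?q=John+Harrison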

If you’re a news outlet, you swiftly write a piece on the subject of the doodle, noting that Google is celebrating said subject, and you get it published post-haste.

The result is that when users click on the doodle, they get a page of results on, say, clockmaker John Harrison. But near the top of the screen are some links to news sites’ "Top Stories" about the very same.

Sure, the Wikipedia piece is there, but the other stories are hacked-together pieces written in the full knowledge that they will generate page views as a result of Google’s doodle.

There’s nothing particularly wrong here, but it does push other relevant search results further down the page.

2. When’s It On?

Another type of gaming that goes on is also based on anticipating what people are Googling. Often these will be based around sports events or TV series.

There’s a big fight this weekend, or a big game in the Champions’ League. Perhaps a really popular TV drama is returning to our screens.

In any of these cases, some people will Google something along the lines of, “When is the Joshua fight?”

Now there is some semblance of information being asked for. They do want a date or a time. Perhaps they want to know what channel it’ll be on, or how they go about getting access to that channel.

Into that void rush news outlets. They quickly author pieces providing that information, but usually padding it out beyond briefly stating the date, time and channel. If I were being cynical, I’d suspect that Google’s algorithms downgrade stories that are too short. So they get bulked out. You try writing 500 words on when a football match starts!

To put this into perspective, a search for “What time does the super bowl start” – in quotes – returns 15,400 pages.

Yes, these are questions that people want answers to. But do we really need dozens of “news” stories on them?

Of course, Google can sort of kill this by providing the information itself. In some cases it does that, but it doesn’t stop the news sites offering their own pages.

I probably find the first of these two things more irritating than the latter, but you still have to recognise these articles for what they are – cheap traffic drivers that don’t really offer a great deal.