Nationalise Openreach?

Disclaimer: I am Chief Engineer for Gigaclear Ltd, a rural-focused fibre-to-the-home operator with a footprint in excess of 100,000 homes in the south of the UK. So I have a slight interest in this, but also know a bit about the UK market. The thoughts here are my own, though, and don’t in the least represent company policy or direction.

Labour has recently proposed, as an election pledge, to nationalise Openreach and make them the national monopoly operator for broadband, and to give everyone in the UK free internet by 2030.

The UK telecoms market today is quite fragmented and complex, and so this is not the obvious win that it might otherwise appear to be.

In lots of European markets there’s a franchising model, and we do this in other utility markets – power being an excellent example. National Grid is a private company that runs the transmission networks, and Distribution Network Operators (DNOs) like SSE, Western Power, etc. run the distribution networks in their regions. All are private companies with no shares held by government – but the market is heavily regulated, and things like 100% coverage at reasonable cost are built in.

The ideal outcome for the UK telecoms market would clearly have been for BT (as it was then) never to have been privatised, and for the government to simply decide on a 100% fibre-to-the-home coverage model. This nearly happened, and that it didn’t is one of the great tragedies in the story of modern Britain; if it had, we’d be up at the top of the leaderboard on European FTTH coverage. As it is, we only just made it onto the leaderboard this year.

But that didn’t happen – Thatcher privatised it, and regulation was quite light-touch. The government didn’t retain majority control, and BT’s shareholders decided to sweat the asset they had, investing strategically in R&D to extend its life, along with some national network build-out. FTTC/VDSL2 was the last sticking plaster that made economic sense for copper after ADSL2+; LR-VDSL and friends might have given them some more time if the end of copper were still tied to performance.

As it is, enough people have been demonstrating the value of FTTH for long enough now that the focus has successfully shifted from “fast enough” to “long-term enough”. New copper technologies won’t last the next decade, and have huge reliability issues. Fibre to the home is the only long-term option to meaningfully improve performance, coverage, etc, especially in rural areas.

So how do we go about fixing the last 5%?

First, just so we’re clear, there are layers to the UK telecoms market – you have infrastructure owners who build and operate the fibre or copper. You have wholesale operators who provide managed services like Ethernet across infrastructure – people like BT Wholesale. Then you have retail operators who provide an internet connection – these are companies like BT Retail, Plusnet, TalkTalk, Zen, Andrews & Arnold, Sky, and so on. To take one example, Zen buy wholesale services from BT Wholesale to get bits from an Openreach-provided line back to their internet edge site. Sometimes Zen might go build their own network to an Openreach exchange so they effectively do the wholesale bit themselves, too, but it’s the same basic layers. We’re largely talking about the infrastructure owners below.

The issue is always that, commercially, the last 5-10% of the network – the hardest-to-reach places – will never make sense to build, because it’s really expensive. Gigaclear’s model and approach is entirely designed around that last 5%, so we can make it work, but it takes a long-term view to do it. The hard-to-reach is, after all, hard to reach.

But let’s say we just nationalise Openreach. Now Openreach, in order to reach the hardest-to-reach, will need to overbuild everyone else. That includes live state-aid funded projects. While it’s nonsense to suggest that state aid is a reason why you couldn’t buy Openreach, it is a reason why you couldn’t get Openreach to go overbuild altnets in receipt of state aid. It’d also be a huge waste of money – billions already spent would simply be spent again to achieve the same outcome. Not good for anyone.

So let’s also say you nationalise everyone else, too – buy Virgin Media, Gigaclear, KCOM, Jersey Telecom, CityFibre, B4RN, TalkTalk’s fibre bits, Hyperoptic, and every startup telecom operator that’s built fibre to the home in new build housing estates, done their own wireless ISP, or in any other way provides an access technology to end users.

Now you get to try and make a network out of that mess. That is, frankly, a recipe for catastrophe. BT and Virgin alone have incredibly different networks in topology, design, and overall approach. Throw in a dozen altnets, each of whom is innovating by doing things differently to how BT do it, and you’ve got a dozen different networks that are diametrically opposed in approach, at both a physical and logical level. You’re going to have no network, just a bunch of islands. Those islands will likely fall into internal process black holes and be expensive to operate, because they won’t look like the other 90% of the new operator’s infrastructure (i.e. Openreach’s network), and so will require special consideration or major work to make them consistent.

A more sensible approach is the one taken in some European countries – introduce a heavily regulated franchising market. Carve the market up to enable effective competition in services. Don’t encourage competition on territory so much – take that out of the equation by protecting altnets from the national operator where they’re best placed to provide services, and by making it clear where the national operator will go. Mandate 100% coverage within those franchise areas, and provide government support to achieve that goal (the current Universal Service Obligation model goes some way towards this). Heavier regulation of franchise operators would be required, but this is already largely accounted for under Significant Market Power regulations.

Nationalising Openreach within that framework would make some sense. It’d enable some competition in the markets, which would be a good thing, and it’d ensure that there is a national operator who would go and build the networks nobody could do on even a subsidised commercial basis. That framework would also make state aid easier to provide to all operators, which would further help. Arguably, though, you don’t need to nationalise Openreach – just tighten up regulation and consider more subsidies.

This sort of approach was costed in the same report that Labour appear to be using, which Frontier Economics produced as part of the Future Telecoms Infrastructure Review. It came out broadly equivalent in cost and outcomes.

But I do want free broadband…

So that brings us to the actual pledge, which was free broadband for everyone. The “for everyone” bit is what we’ve just talked about.

If you’ve got that franchise model then that’s quite a nice approach to enable this sort of thing, because the government can run its own ISP – with its own internet edge, peering, etc – and simply hook up to all the franchise operators and altnets. Those operators would still charge for the service, with government footing the bill (in the case of the state operator, the government just pays itself – no money actually changes hands). The government just doesn’t pass the bill on to end-users. You’d probably put that service in as a “basic superfast access” service around 30Mbps (symmetrical if the infrastructure supports it).

This is a really good model for retail ISPs because it means that infrastructure owners can compete on price and quality (of service and delivery) but are otherwise equivalent to use and would use a unified technical layer to deliver services. The connection between ISPs and operators would still have to be managed and maintained – that backhaul link wouldn’t come from nowhere – but this can be solved. Most large ISPs already do this or buy services from Openreach et al, and this could continue.

There’d still be a place for altnets amidst franchise operators, but they’d be specialised and narrow, not targeting 100% coverage; a model where there is equal competition for network operators would be beneficial to this and help to encourage further innovation in services and delivery. You’d still get people like Hyperoptic doing tower blocks, business-focused unbundlers going after business parks with ultrafast services, and so on. By having a central clearing house for ISPs, those infrastructure projects would suddenly be able to provide services to BT Retail, Zen, TalkTalk, and so on – widening the customer base and driving all the marketing that BT Retail and others do into commercial use of the best infrastructure for the end-user and retailer. This would be a drastic shake-up of the wholesale market.

Whether or not ISPs could effectively compete with a 30Mbps free service is, I think, a valid concern. It might be better to drop that free service down to 10Mbps – still enough for everyone to access digital services and to enable digital inclusion, but slow enough to give heavier users a reason to pay for a service and so support the infrastructure. That, or the government would have to pay the equivalent of a higher service tier (or more subsidy) to keep the market viable for ISPs.

I think that – or some variant thereof – is the only practical way to have a good outcome from nationalising or part-nationalising the current telecoms market. Buying Openreach and every other network and smashing them together in the hopes of making a coherent network that would deliver good services would be mad.

What about free WiFi?

Sure, because that’s a sensible large-scale infrastructure solution. WiFi is just another bearer at some level, and you can make the argument that free internet while you’re out and about should be no different to free internet at home.

The way most “WiFi as a service” is delivered is through a “guest WiFi” type arrangement on home routers, with priority given to the customer’s traffic, so you can’t sit outside on a BTWiFi-with-FON access point and stream Netflix to the detriment of the customer whose line you’re using. Unless you nationalised the ISPs too, I can’t see this working effectively.

Free WiFi in town centres, village halls, and that sort of thing is obviously a good thing, but it still works in the franchise model.

How about Singapore-on-Thames?

Well, Singapore opted to do full fibre back in 2007 and were done by about 2012 – but they are a much smaller nation with no “hard to reach” parts. Even the most difficult, remote areas of Singapore are areas any network operator would pounce on.

But they do follow a very similar model, except for the “free access” bit. The state operator (NetLink Trust) runs the physical network, but there are lots of ISPs who compete freely (Starhub, M1, Singtel, etc). They run all the active equipment in areas they want to operate in, and use NetLink’s fibre to reach the home. Competition shifts from the ability to deploy the last mile up to the service layer. This does mean you end up with much more in the way of triple/quad-play competition, though, since you need to compete on something when services are broadly equivalent.

It’s a good example of how the market can work, but it isn’t very relevant to the UK market as it stands today.

Privacy and security concerns

One other thing I’ve heard people talk about today is the concern around having a government-run ISP, given the UK government’s record (Labour and Tory) of quite aggressively nasty interference with telecoms, indiscriminate data collection, and other things that China and others have cribbed off us and used to help justify human rights abuses.

Realistically – any ISP in the UK is subject to this already. Having the government run an ISP does mean that – depending on how it actually gets set up – it might be easier for them to do some of this stuff without necessarily needing legislation to compel compliance. But the message has been clear for the last 5-10 years: if you care about privacy or security, your ISP must not be a trusted party in your threat model.

So this doesn’t really change a thing – keep encrypting everything end-to-end and promote technologies that feature privacy by design.

Is it needed? Is it desirable?

Everyone should have internet access. That’s why I keep turning up to work. It’s an absolute no-brainer for productivity (which we need to fix, as a country), and some estimates from BT put the value of universal broadband in the order of £80bn.

Do we need to shake up the market right now? BT are doing about 350k homes a quarter right now and are speeding up, so if you left them to their own devices they’d be done in, at worst, about 16-20 years (roughly 30 million UK premises at 1.4 million homes a year, less what’s already built, and accelerating). Clearly they’re aiming for 2030 or sooner anyway and are trying to scale up to that. However, that is almost all in urban areas.

Altnets and others are also making good progress and that tends to be focused on the harder-to-reach or semi-rural areas like market towns.

I think that it’s unlikely that nationalising Openreach or others and radically changing how the market works is something you’d want to do in a hurry. Moving to a better model for inter-operator competition and increasing regulation to mandate open access across all operators would clearly help the market, but it has to be done smartly.

There are other things that would help radically in deploying new networks – fixing wayleave rules is one. Major changes to help on this front have been waiting in the “when Parliament is done with Brexit” queue for over a year now.

There is still a question about how you force Openreach, or enable the market, to reach the really hard-to-reach last mile, and that’s where that £20bn number starts looking a bit thin. While the FTIR report from Frontier Economics isn’t mad, it does make the point that reaching the really hard to reach would probably blow their £20bn estimate. I think you’d easily add another £10-20bn to come to a sensible number for 100% coverage in practice, given the UK market as it is.

Openreach spend £2.1bn/yr on investment in their network, and have operating costs of £2.5bn/yr. At current run-rate that means you’d be looking at ~£70bn, not £20bn, to buy, operate and build that network using Openreach in its current form. Labour have said £230m/yr – that looks a bit short, too.
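To show roughly where a number of that order comes from, here’s a back-of-envelope sketch – the ten-year horizon, and the assumption that the £20bn build estimate sits on top of the current run-rate, are mine rather than anyone’s official costings:

```python
# Back-of-envelope only: Openreach's current run-rate carried to ~2030,
# plus an FTIR-style GBP 20bn build estimate. All assumptions are mine.
capex_per_year = 2.1e9   # Openreach annual network investment (GBP)
opex_per_year = 2.5e9    # Openreach annual operating costs (GBP)
years = 10               # roughly now until a 2030 target

run_rate_cost = (capex_per_year + opex_per_year) * years  # GBP 46bn
build_cost = 20e9                                         # FTIR estimate

print(f"~GBP {(run_rate_cost + build_cost) / 1e9:.0f}bn")  # ~GBP 66bn
```

And that’s before paying a penny to actually buy Openreach from BT’s shareholders.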

(Since I wrote this, various industry people have chimed in with numbers between £50bn and £100bn, so that range seems consistent – the £230m/yr appears to include capital discounting, so £700m+/yr looks closer.)

The real challenge in doing at-scale fibre rollout, though, is in people. Education (particularly adult education and skills development) is lacking, and for the civil engineering side of things there has historically been a reliance on workforces drawn from across the continent as well as local workforces. Brexit isn’t going to make that easier, however soft it is.

We also don’t make fibre in the UK any more. I’ve stood at the base of dusty, long-abandoned fibre draw towers in England, now replaced by more modern systems in Europe to meet the growing demand there as it dwindled here. Almost every single piece of network infrastructure being built in the UK has come from Europe, and for at least a decade now, every single hair-thick strand of glass at the heart of modern networks of the UK has been drawn like honey from a preform in a factory in continental Europe. We may weave it into a British-made cable and blow that through British-made plastic piping, but fibre networking is an industry that relies heavily on close ties with Europe for both labour and goods (and services, but that’s another post).

Labour’s best move for the telecoms market, in my view, would be to increase regulation, increase subsidy to enable operators to go after the hardest-to-reach, and altogether ditch Brexit. Providing a free ISP on top of a working and functional telecoms market is pretty straightforward once you enable the current telecoms market to go after everyone.

An evening in the hobby

I’ve gotten into quite a good routine, sequence, whatever you might call it, for my hobby. While it’s an excellent hobby when it comes to complex things to fiddle around with, once you actually get some dark, clear skies, you don’t want to waste a minute, particularly in the UK.

Not having an observatory means a lot of my focus is on a quick setup, but it also means I’ve gotten completely remote operation (on a budget) down pretty well.

I took the decision to leave my setup outdoors some time ago, and bought a good quality cover from Telegizmos rated for 365-days-of-the-year protection. So far it’s survived, despite abuse from cats and birds. The telescope, with all its imaging gear (most of the time), sits underneath on its stock tripod, on some anti-vibration pads from Celestron. I also got some specialist insurance and set up a camera nearby – it’s pretty well out of the way and behind a bit of security anyway, but it doesn’t hurt to be careful. Setting up outside has been the best thing I’ve done so far, and is further evidence in support of building an observatory!

The telescope, illuminated by an oversized flat-frame generator, after a night of imaging.

Keeping the camera mounted means I can re-use flat frames between nights, though occasionally I will take it out to re-collimate if it’s been a while. The computer that connects to all the hardware remains, too – a Raspberry Pi 4 mounted in a Maplin project case on the telescope tube.

This means everything stays connected and all I have to do is walk out, plug a mains extension cable in, bring out a 12V power supply, and plug in two cables – one for the mount, and one for the rest. Some simple snap-fit connector blocks distribute the 12V and 5V supplies around the various bits of equipment on the telescope.

That makes for quite a calm setup, which I can do hours in advance of darkness in these early-season nights. The telescope’s already cooled down to ambient, so there’s no delay there, either. I’ve already taken steps to better seal up my telescope tube against stray light, which also helps keep any would-be house guests out.

My latest addition to the setup is an old IP camera so I can remotely look at the telescope position. This eliminates the need for me to take my laptop outside whenever the telescope is moving – I can confirm the position of the telescope and hit the “oh no please stop” button if anything looks amiss, like the telescope swinging towards a tripod leg.

I use the KStars/Ekos ecosystem for telescope control and imaging, so this all runs on a Linux laptop which I usually VNC into from my desktop. This means I can pull data off the laptop as I go and work on e.g. calibration of data on the desktop.

A normal evening – PixInsight, in this case looking at some integration approaches for dark frames, and VNC into KStars/Ekos, with PHD2 guiding, and a webcam view of the telescope

So other than 10 minutes at the start and 10 minutes in the early hours of the following morning my observing nights are mostly spent indoors sat in front of a computer. That makes for a fairly poor hobby in terms of getting out of my seat and moving around, but a really good hobby in terms of staying warm!

I do often wander out for half an hour or so and try to get some visual observation in, using a handheld Opticron monocular. Honestly, the monocular isn’t much use – it’s hard to hold steady enough, and the magnification is low. Just standing out under the stars and trying to spot some constellations and major stars is satisfying, but I’d quite like to get a visual telescope I can leave set up and use while the imaging rig is doing its thing. That’s a fair bit of time and money away, though, and I’d prefer to get the observatory built first. On a dark night, lying down and staring up at the Milky Way is quite enough to be getting on with.

A typical night, though, involves sitting indoors with the telescope under its cover, and yelling at clouds or the moon (which throws out enough light to ruin contrast on deep space objects).

On that basis I’ve been thinking about other ways to enjoy the hobby that don’t involve dark, clear nights. Some good narrowband filters would let me image on moonlit nights, but they run into the many hundreds of pounds, putting a set of Ha/OIII/SII filters at around £1k.

Narrowband image, shot in the hydrogen alpha emission line using a Baader 7nm filter – cheap but cheerful – of some of the Elephant’s Trunk Nebula; ~7.5 hours of capture

Making my own telescope, though, struck me as a fun project. It’s something quite frequently done, but the bit that most interested me is mirror making. That’s quite a cheap project (£100 or so) to get started on and should take a few months of evenings, so it ought to keep me busy for a while – that’s the next thing to do, then. I’ve decided to start with an 8″ f/5 mirror – not only is it quite a small and simple mirror, I could drop it straight into my existing telescope without having to spend any more money. I’ve been doing lots of research, reading books on the topic and watching videos from other mirror-makers.
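As a flavour of the sums involved, here’s a quick sketch using the standard mirror-maker’s approximations (the only inputs are my 8″ f/5 target; s ≈ r²/2R is the usual sagitta formula):

```python
# Rough figuring for an 8-inch f/5 mirror, using standard approximations.
diameter_mm = 203.2                            # 8 inches of glass
focal_length_mm = 5 * diameter_mm              # f/5 -> ~1016 mm focal length
radius_of_curvature_mm = 2 * focal_length_mm   # R = 2f for a spherical mirror

# Sagitta: the depth of curve to grind into the centre of the blank.
r = diameter_mm / 2
sagitta_mm = r ** 2 / (2 * radius_of_curvature_mm)

print(f"Focal length ~{focal_length_mm:.0f} mm, sagitta ~{sagitta_mm:.2f} mm")
# -> Focal length ~1016 mm, sagitta ~2.54 mm of glass to hog out
```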

And that is definitely one of the recurring themes throughout the hobby – there’s always something to improve on, and nothing is trivially grasped. Everything takes a bit of commitment and thought. I think that’s one of the reasons I enjoy it so much.

A focused approach

In my last post on astrophotography I wrote about planning for dark skies and about my plans to build an observatory. Well, finances haven’t permitted the observatory – this year, at least – so this month I opted to get my existing telescope and mount working at their theoretical best.

This mostly boiled down to:

  • Improving the ability to achieve and hold accurate focus
  • Getting the mount running as smoothly and accurately as possible
  • Making small improvements to the way I set things up
  • Getting the optics as precisely collimated as possible

If I do all those things then the limiting factors should be the intrinsic limits of the kit I have and of the environment, and I should be able to produce some great pictures! So I started, knowing the focuser was mechanically weak and a real problem in operating the scope, by replacing it.

Focusers and motors

This started out as an “undo 4 bolts, replace 4 bolts” project and turned into a bit more work. It also required me to remove the secondary mirror for the first time, which meant tooling up to collimate that properly – my laser isn’t enough to set the position completely.

The new, on the left, and the old. Note the chunkier construction, much bigger bearings, and the larger motors. The new one’s also larger, and internally baffled to help cut down on stray light.

The holes in the tube didn’t quite fit the new plate, so I had to drill new holes – I measured very carefully, several times, with different measurement approaches (not wishing to recreate the Hubble problem of relying on a single instrument). The position isn’t critical but it makes life easier if it’s in the right place.

Nervous drilling

The focuser I went for was a Baader Steeltrack Diamond. To summarise the choice: there are only a few major groups of manufacturers. Moonlite and friends sit at the “fine for visual, okay for imaging” end of things with traditional Crayford designs. Then you’ve got people like Baader and JTW who are a bit more serious about focuser slop and rigidity. Then there’s Feather-Touch. FT appear to be held in messianic regard by literally everyone, which I can only assume is for good reason. They’re also two to three times the price. Which rules them out. Baader’s Diamond NT focuser appeared to be very well regarded mechanically, and having bought a number of parts from them I knew they were of good quality.

It didn’t disappoint – it’s very well made, and manual movement of the focuser when it arrived was buttery. I popped the fine focus knob off and prepared it for the addition of the focus motor.

If you’ve not done imaging before you might think that motorising a focuser is a bit excessive – and indeed, when I started out, I just focused manually once at the start of the evening and then left the camera to it. But to get the most out of a telescope, frequent or constant refocusing is needed to compensate for expansion and contraction of the telescope and optics as the temperature changes. It’s also useful to be able to let the computer focus for you, to achieve the most precise focus.

Again, there are many options here. I opted for a lower-cost option which was fairly well reviewed, the PrimaLuce Lab Sesto Senso focus motor. This despite it missing a key feature: temperature compensation, which automatically moves the motor based on a temperature reading rather than having the computer do it for you. However, most capture software supports doing this itself. Sadly, KStars/Ekos does not – yet.
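Conceptually, temperature compensation is very simple – here’s a minimal sketch of the idea in Python, with a made-up coefficient (in practice you’d measure it for your own telescope by refocusing at several temperatures and fitting a line):

```python
# Minimal temperature-compensation sketch. The coefficient below is
# illustrative only; a real value would be measured for a given telescope.
STEPS_PER_DEG_C = -25.0  # focuser steps of drift per degree C (hypothetical)

def compensated_position(ref_position: int, ref_temp_c: float,
                         current_temp_c: float) -> int:
    """Return the focuser position corrected for temperature drift."""
    delta_t = current_temp_c - ref_temp_c
    return round(ref_position + STEPS_PER_DEG_C * delta_t)

# e.g. focused at 12000 steps at 15C, and the tube has since cooled to 8C:
print(compensated_position(12000, 15.0, 8.0))  # -> 12175
```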

The new focuser and motor installed on the tube

Spot the difference

After installing the focuser and motor I had to re-install the secondary and collimate it – this was actually pretty straightforward. However, I also wanted to replace the centre spot on my telescope with a “hotspot” to make barlowed laser and autocollimator checks easier, so the primary mirror came out too. Both mirrors got a very gentle soak and rinse with no agitation, and then the old primary spot was removed with some isopropyl alcohol.

The old spot and mirror in its cell
The mirror, spotless!
Spotting the mirror using a Catseye template, weighted down with cotton wool. There was a lot of careful staring at this before I affixed the spot.
The completed install.

After this there was just a lot of very time-consuming adjustment to get everything set up as well as possible. This mostly involved staring down Cheshire eyepieces and then moving things very slowly with an Allen key until it all looked like it should.

A quick barlowed laser check as part of reassembly, looking down the tube – you can see the reflection of the centre trefoil in the middle, which is actually a reflection off a piece of paper in the bottom of the barlow in the focuser.

I still need to add an autocollimator to my toolbox, but the Catseye ones are quite dear, so that’s a “next month” purchase. That will however be the last tool I need to add there, I think!

Mount problems

I had been seeing issues with my tracking over the last few attempts I made to set up, so I wanted to verify the mount was mechanically sound. This mostly involved adjusting the worm carrier blocks – large metal blocks which form both part of the housing and the mechanism by which the worm meshing is adjusted. This, again, involved a lot of slackening off one thing, tightening another, then rotating the whole axis through 360 degrees to make sure nothing bound or stuck.

Dismantling an axis driver to check everything is okay – the worm carrier block is the lower bit of metal, where the big gear sits. Behind this is the worm gear shaft.

After a lot of measurement, trying to work out what was going on, I realised it was the obvious thing – polar alignment. My Polemaster – a camera that sits on the mount to do a polar alignment – wasn’t getting good enough results, and that was all I was using. I used a method called drift alignment and improved from ~15 arcminutes accuracy down to about 2 arcminutes. This has radically improved my guiding, which is now down at around 1 arcsecond – where it should be! The adjustment knobs on the EQ6-R Pro are the limiting factor now – it’s just not possible to get the alignment much better.
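If you’re wondering how declination drift translates into polar alignment error: for a star near the meridian and the celestial equator, the drift rate maps almost directly onto the azimuth misalignment. A quick sketch of that standard small-angle relationship (the arrangement of it is mine):

```python
import math

# Sidereal rate in arcseconds of hour angle per minute of time (~902).
SIDEREAL_ARCSEC_PER_MIN = 360 * 3600 / (23 * 60 + 56.07)

def polar_error_arcmin(dec_drift_arcsec: float, minutes: float) -> float:
    """Azimuth misalignment implied by the Dec drift of a star near the
    meridian and celestial equator (small-angle approximation)."""
    drift_rate = dec_drift_arcsec / minutes        # arcsec per minute
    error_rad = drift_rate / SIDEREAL_ARCSEC_PER_MIN
    return math.degrees(error_rad) * 60            # radians -> arcminutes

# e.g. 4 arcsec of Dec drift over a minute implies ~15 arcmin of error:
print(round(polar_error_arcmin(4.0, 1.0), 1))  # -> 15.2
```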

Balancing the mount more carefully has helped, too, and I’ve rotated the telescope tube in its rings so the focuser points towards the RA axis. This means that as the axis rotates, the weight distribution remains constant. It also means I can’t really use the telescope for visual observation, but I’ve not done that in a long while!

I’ve also added some Celestron anti-vibration pads to the tripod. While a cubic metre or two of concrete would be better, these should help isolate vibration from the ground and also help with oscillation in the tripod itself as a result of mount movement.

To help minimise the number of cables coming off the mount I’ve also put my INDI server on the tube itself, by mounting a Raspberry Pi, a 12V-to-5V step-down converter, and a USB hub. This also helps to counterbalance the focuser around the Dec axis. There are now only three cables to the mount – 12V, Ethernet, and the mount control cable.
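For the curious, the Pi’s only job is to run indiserver and expose the hardware over Ethernet. A sketch of what a boot-time launcher might look like – the driver names are plausible examples for this sort of kit (EQMod for the mount, a ZWO camera), not necessarily what I actually run:

```python
#!/usr/bin/env python3
# Sketch: launch indiserver on the telescope-mounted Pi at boot.
import subprocess

DRIVERS = [
    "indi_eqmod_telescope",  # EQMod driver for an EQ6-R class mount (assumed)
    "indi_asi_ccd",          # example imaging camera driver (assumed)
]

# indiserver listens on TCP 7624 by default; Ekos on the indoor machine
# connects to <pi-hostname>:7624 over the single Ethernet cable.
subprocess.run(["indiserver", "-v", *DRIVERS], check=True)
```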

The other major upgrades I’ve made lately have been on guidescope mounting – I now have some very solid aluminium guidescope brackets that a colleague at work milled for me. This does appear to have solved the differential flexure problem. I still want to upgrade the camera and explore off-axis guiding, but it’s a great improvement.

Worth it?

It’s too early to say, really, but the indication is that, taken together, this has all produced a much-improved system for astrophotography for not much (in AP terms) money. This image of M101, the Pinwheel Galaxy, I produced last night with less than 2 hours of light:

Precise focus has helped massively, though temperature compensation and per-filter focus offset automation would be very welcome additions to Ekos – it might even be enough to push me back to Sequence Generator Pro, though I’m very much enjoying the INDI/Linux approach so far (bugs that require me to completely shut down KStars mid-session aside). The mount guiding is definitely a big upgrade over where it was – I think I had broadly been getting lucky with this over winter, though I suspect the colder atmosphere might’ve helped the Polemaster.

All in all it’s a good step forward – now I just need some really cold clear skies!