Mapping Electromagnetic Field

This is part blog post, part prelude and part documentation.

At Electromagnetic Field (EMFCamp, being held later this month) I will be giving a talk on mobile mapping technologies: what the current state of the art looks like, precise location, and some open source tools. We use mobile mapping and some of the tools I’ll discuss at my work, Gigaclear, to survey large areas of rural UK for our fibre-to-the-home network build, which is how I’ve been able to wrangle a quick drive around the EMFCamp site at Eastnor from the survey vehicle.

That vehicle is equipped with fairly standard mobile mapping hardware: a Ladybug5 camera for panoramic 30MP images (which I can’t distribute for privacy reasons) and a Riegl VUX-1HA scanner for LiDAR scanning. The Riegl captures 1 million points per second while rotating its scan head 250 times per second – roughly 4,000 points per rotation.


Words of caution and apology

LiDAR data is sometimes a pain to work with. Even with the best kit in the world and plenty of processing time, without control points and lots of manual marrying-up of points across overlapping passes of the scanner, there’s noise and variation in the output. This isn’t a project that Gigaclear have done in our usual manner – I prepared it in my evenings, without that kind of time, so this is presented as a “best effort” dataset, likely riddled with all sorts of errors and inaccuracies that we wouldn’t usually accept and which professional users will, rightly, sneer at!

In absolute terms the x/y accuracy of this dataset is pretty good: an upper bound of 5cm RMS error from OSGB36 (the British National Grid) can be expected throughout most of the scan. Within the scanner output the relative accuracy is around 3mm between points – but only within the same pass. This dataset contains multiple overlapping, automatically aligned passes (recorded as the point source ID in the LAS file), so there are some errors and anomalies. On top of this, the colour in this dataset comes from overlaying images onto the points using a calibration file and alignment – and I know the alignment I used wasn’t great. Finally, the drivers didn’t go down the middle of the campsite, so there’s a bit of a void there. So, expectations set!


Sensible scale

Often, very dense point clouds can be counterproductive. Our initial dataset contained over 1 billion points. Most of the subsequent processing was done on that dataset thinned to a 5mm grid (still about a billion points), which weighs in at about 32 gigabytes and is a real pain to work with.

[Image: Intensity view – the infrared brightness of the reflection from the laser]

What I’m publishing here is therefore a reduced dataset: the same data, thinned using simple decimation (keeping 1 in every 10 points), making it about 3.2 gigabytes in size and 92 million points – something that will fit in RAM on most modern PCs. In terms of detail it’s still pretty fantastic for many uses. It’s a LAS 1.4 file, georeferenced to the UK National Grid (OSTN15 flavour, for those who care), with some fairly imprecise classifications plus raw intensity and RGB data per point.
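
If you’re curious how a 1-in-10 decimation like this can be reproduced, here’s a minimal sketch using PDAL’s Python bindings – the filenames are placeholders, and the step value matches the 1:10 thinning described above:

```python
import pdal

# A minimal PDAL pipeline: read a LAS file, keep 1 in every 10 points,
# and write the result back out as LAS 1.4. Filenames are placeholders.
pipeline_json = """
{
  "pipeline": [
    "eastnor_full.las",
    {
      "type": "filters.decimation",
      "step": 10
    },
    {
      "type": "writers.las",
      "filename": "eastnor_decimated.las",
      "minor_version": 4
    }
  ]
}
"""

pipeline = pdal.Pipeline(pipeline_json)
count = pipeline.execute()  # returns the number of points written
print(f"Wrote {count} points")
```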

[Image: RGB colours – taking photo data and laying it onto the point cloud]

This data can be post-processed for your own needs, desires and interests. If you’ve never worked with LiDAR data before, CloudCompare is a great tool to start with – you’ll need the alpha version for LAS 1.4 support. If you fancy generating rasters or filtered versions of the data (or writing your own Python code to work with it), then PDAL is a great tool.
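
As a starting point for the “write your own Python code” route, here’s a rough sketch using PDAL’s Python bindings to pull the points into numpy – the filename is a placeholder, and the ground-class filter is just an illustration:

```python
import pdal

# Read the point cloud and expose it as a numpy structured array.
# "eastnor_decimated.las" is a placeholder for the downloaded file.
pipeline = pdal.Pipeline('{"pipeline": ["eastnor_decimated.las"]}')
pipeline.execute()

points = pipeline.arrays[0]  # one field per LAS dimension
print(points.dtype.names)    # e.g. X, Y, Z, Intensity, Classification, ...

# Example: select only points classified as ground (ASPRS class 2)
ground = points[points["Classification"] == 2]
print(f"{len(ground)} ground points of {len(points)} total")
```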

[Image: Hillshade map – easily produced by asking PDAL to write a GeoTIFF from the Z dimension]
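
The caption above hints at the workflow: rasterise the Z dimension into a GeoTIFF DEM, then shade it. Here’s a rough sketch of that sort of PDAL pipeline – the resolution and filenames are illustrative guesses, not what I actually used:

```python
import pdal

# Rasterise the mean Z value per cell into a GeoTIFF DEM.
# Resolution (in metres) and filenames are placeholders.
pipeline_json = """
{
  "pipeline": [
    "eastnor_decimated.las",
    {
      "type": "writers.gdal",
      "filename": "eastnor_dem.tif",
      "dimension": "Z",
      "output_type": "mean",
      "resolution": 0.5
    }
  ]
}
"""

pdal.Pipeline(pipeline_json).execute()

# The resulting DEM can then be shaded with GDAL, e.g.:
#   gdaldem hillshade eastnor_dem.tif eastnor_hillshade.tif
```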

… interesting stuff, right?

If you do think this sort of stuff is downright fascinating from a technology standpoint, I’ll be doing a talk on the underlying technology at EMFCamp, whenever the schedule computer deems it so. Come along and find out more!

I’m personally really excited to see what comes of giving a gathering like EMFCamp this sort of data, and I’ve already heard some great ideas – let me know what you make with it!

And if you fancy a job working on software that works with this sort of stuff, and solving similar interesting problems in the geospatial world, drop me a line or check our website.

The Data!

Eastnor Deer Park – LAS 1.4 – Version 1, 1:10 Decimated – 3.2GB – Download here

This dataset is also available for online consumption here, but if you’re going to do anything interesting or serve it to many people, please don’t do it off this server. The online version was produced with PotreeConverter and uses the excellent Potree web-based renderer.

As the creator of this dataset, I license it under a Creative Commons BY-SA license. The dataset may be used for any purpose, so long as it is attributed in some way and any derivative works are shared alike.

Eastnor Park LiDAR Survey is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

The Investigatory Powers Bill for architects and administrators

OK, it’s not the end of the world. But it does change things radically, should it pass third reading in its current form. There is, right now, an opportunity to effect some change to the bill at committee stage, and I urge you to read it, along with the excellent briefings from Liberty, the Open Rights Group and others, and to write to your MP.

Anyway. What does this change in our threat models and security assessments? What aspects of security validation and testing do we need to take more seriously? I’m writing this from a small-ISP systems perspective, and these are my personal views, not those of my employer, yada yada.

Continue reading The Investigatory Powers Bill for architects and administrators

The Dark Web: Guidance for journalists

We saw a lot of coverage of “the dark web” during the latest Ashley Madison leak. Because a link to a torrent was being shared via a Tor page (well, nearly – most people were actually passing around the Tor2Web link), journalists were falling over themselves to highlight the connection to the “dark web”, that murky and shady part of the internet that probably adds another few % to your click-through rates.

So many outlets and journalists – even big outfits like BBC News and The Guardian – got their terminology terribly wrong on this stuff, so I thought I’d slap together some guidance, being somewhat au fait with the technology involved. In fact, journalists are a big part of the reason these sorts of tools exist in the first place – if that surprises you, read on…

Continue reading The Dark Web: Guidance for journalists