Attackpoint - performance and training tools for orienteering athletes

Discussion: Lidar / Karttapullautin newbie question

in: Orienteering; General

Aug 27, 2019 5:13 PM # 
Hammer:
I’m about to purchase lidar for my research needs and was thinking I could also make some O maps with Karttapullautin (or hire Runner99 to do it for me) at the same time. ;-)

Can somebody let me know if the raw LAS point cloud files are what is used in Karttapullautin? This is what the company stated in their quote:

“For this costing your team would be getting the Raw LAS point cloud files at the minimum 8 point per meter (PPM) noted. This data set will be unclassified and will have a relative positional accuracy of 15cm (xyz). If a classified and/or a controlled LAS data set is required, these can be provided at additional costs.”
Aug 27, 2019 5:32 PM # 
JLaughlin:
KP (and now OCAD Dem) takes the raw point cloud (LAS or LAZ) files. Based on what you have in the quote, you would be able to use KP to generate an orienteering map.

You may want to specify the reference system used (e.g. UTM).
Aug 27, 2019 7:44 PM # 
cedarcreek:
You'll need to classify at least the ground points, but that's fairly easy using LAStools, probably lasground_new.exe.

I can help with command lines to get you going, or anything else you need.

I always recommend that people use laz format, but for raw data, I would make an exception: las is probably better because of some obscure technical issues. I'd output in laz format for everything lastools does (that is, for all the intermediate and final steps). It's 1/7th the size of las format.
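For the las-to-laz conversion itself, a minimal sketch using laszip from LAStools (assuming the tool is on your PATH; it compresses without touching the point data):

```
laszip -i raw\*.las -odir raw_laz -olaz
```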
Aug 27, 2019 8:13 PM # 
hughmac4:
Hi Hammer! LiDAR point cloud files do need to be classified to work with KP, as cedarcreek says.

Algorithmic classification can be done with 'lasclassify' from LASTools, and should be good enough for O maps. A good walkthrough of a pipeline with nice verbiage can be found here, including sample data to try out the methods. Note that LASTools is no longer free, so depending on your data size you might need to go to a paid toolkit. Exercise those grant $$. :)

You might ask the vendor about their classification method. If it's just running it through an algorithm, it's likely that the raw, unclassified data is fine, and you can use LASTools. If they do a lot of sophisticated massaging (interns? :)), it might be worth the cost, particularly if you're looking for more detailed canopy evaluation for your research.

Output from the demo files in the rapidlasso demo I linked above ... not the best O example, but:

RAW KP Output


CLASSIFIED KP Output
Aug 27, 2019 9:57 PM # 
cedarcreek:
Hammer, the one thing I'd ask about is issues related to a "ringing sensor" in the lidar instrument. In the RapidLasso/LASTools videos I've watched, Martin Isenburg shows how to detect a ringing sensor that gives a "false last return" about 2m below the ground surface.

If they're guaranteeing 15cm accuracy, my assumption would be that they've handled that ringing issue. (But I would ask them and look for it anyway.) Martin explains how to fix it, but it's much more difficult than the other classification steps, which are quite easy.
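One way to hunt for those false last returns yourself (this is my assumption of how you'd script it; check the lasheight readme for the exact flag syntax) is to compute each point's height above the ground TIN and classify anything well below it as noise (class 7):

```
lasheight -i ground\*.laz -classify_below -1.5 7 -olaz -odir denoised
```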

This is my basic workflow. I just keep nesting directories inside each other. If you were doing this with a batch file, you'd probably find a better way. But this works, it's convenient because I don't classify lidar very much, and I could see myself spending a lot of time spinning my wheels trying to understand the batch file and command line syntax. Make sure you go in afterwards and delete the intermediate step files or you'll waste a lot of space.

lasground_new -i *.laz -olaz -odir ground
cd ground
lasheight -i *.laz -olaz -odir height
cd height
lasclassify -i *.laz -olaz -odir classify

Definitely read the readme files. I think lasground_new assumes a 25m step size ("-town" setting). If you have very few buildings, you can possibly use a smaller step size. I attempted to improve the classification of the Big Basin Redwoods lidar, with some success, but I had to use the -metro (50m step) setting, which is intended to handle things like factory buildings. There were so few ground points in the Redwoods areas, I believe the -metro switch gave the best results---which is very counterintuitive. The -woods? (5m?) and -wilderness? (3m?) settings were completely unusable.
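If none of the preset switches fit your terrain, lasground_new also accepts an explicit step size (at least in the versions I've used; verify against your readme), so you can experiment with intermediate values directly:

```
lasground_new -i *.laz -step 8 -olaz -odir ground_step8
```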

The only other thing you might ask for is data in several tile sizes. Here in the US, one "normal" tile size is 5000 ft, which is exactly 1524m (assuming international feet and not survey feet). 1000m would probably be a convenient tile size (and I'd recommend that). But if you can, you should ask for (additionally) tiles at 250m with no bounding box, and 250m with a 32m bounding box. I'm assuming this is something the company could provide trivially, with only the total file sizes being a problem.

The reason I'd ask for those 250m tiles, with and without bounding boxes, is because of the limitations of the unregistered (free) version of lastools. A lot of times you can avoid the stripping of return number, intensity, and a bunch of other desirable data by tiling small. Even in a large batch operation, lastools uses the number of points in the tile to determine if the data needs to be stripped. I've had issues with 250m tiles, but only when it contained many overlapping swathes of data. Does anyone know if lastile strips data from a very large single lidar file? For some reason, I'm thinking lasmerge is "free" but lastile isn't (that is, lastile strips data above some arbitrary input file size). (It might be also trivial to strip the bounding box tiles to create 250m with no bb tiles yourself. So maybe just ask for 250m with 32m bb?)
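If you do end up retiling yourself instead of asking the vendor, the lastile calls would look roughly like this (a sketch; the -buffer flag adds the bounding box overlap as I understand it):

```
lastile -i raw\*.laz -tile_size 250 -buffer 32 -olaz -odir tiles_250_bb32
lastile -i raw\*.laz -tile_size 250 -olaz -odir tiles_250_nobb
```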

I haven't checked this, but I think it's accurate. If you're classifying a large tiled set, you probably will need to set up the bounding box tiles, process it through the ground, height, and classify steps, and then strip away the bounding boxes. At that point you might merge the small tiles, but perhaps not.

I'm pretty sure the ground and height steps do not strip data in the unregistered versions, but I believe classify does. And small tiles with bounding boxes probably will prevent that.
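For stripping the bounding boxes back off after the classify step, lastile has a -remove_buffer option (again from memory; check the readme):

```
lastile -i classify\*.laz -remove_buffer -olaz -odir final
```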
Aug 27, 2019 10:11 PM # 
cedarcreek:
If I'm having trouble with KP or OL-Laser, I'll run this command to see if the lidar is classified. Usually you just need category 2 (ground) classified. Everything else can be undefined. Check the lasclassify readme---I think it classifies low and high vegetation. There might be additional lastools steps that classify buildings and other things, but I haven't tried that yet.

lasinfo -i filename.laz -otxt

This creates a text report you can open to see the xyz coordinates and what point classifications exist in the lidar file.
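Once you have that text report, a quick way to pull the ground-point count out of the classification histogram. The report excerpt below is made up for illustration; real lasinfo output varies by version, so adjust the pattern to match yours:

```shell
# fake lasinfo-style report for illustration; real output differs by version
cat > report.txt <<'EOF'
histogram of classification of points:
  1234567  ground (2)
    89012  unclassified (1)
EOF
# grab the point count on the line for class 2 (ground)
awk '/ground \(2\)/ {print $1}' report.txt
```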
Aug 28, 2019 12:44 AM # 
Hammer:
Thanks everyone! I've got some questions to ask the company. Greatly appreciated.
Aug 28, 2019 7:40 AM # 
Terje Mathisen:
cedarcreek has a lot of good tips here, but the tile issue is actually wrong:

As long as you can get the raw lidar point cloud you should be fine!

I always start my pipeline with the following steps (this is from my lidar-adaptive.bat batch file):

echo Indexing input LAZ files
lasindex -i %1 -cores %TILECORES%
echo Tiling single input dir
lastile -i %1 -cores %TILECORES% -flag_as_withheld -tile_size 256 -buffer 32 -full_bb -refine 1490000 -o adaptive\tile -olaz 2>NUL
:tiled
echo "Splitting too large tiles into 128x128 sub-tiles"
lastile -cores %CORES% -i adaptive\*.laz -refine_tiles 1490000 -olaz 2>NUL
echo "Splitting too large tiles into 64x64 sub-sub-tiles"
lastile -cores %CORES% -i adaptive\*.laz -refine_tiles 1490000 -olaz 2>NUL

I.e. I start by generating 256x256 tiles with 32m buffers, then I go through all the tiles two more times, splitting them into 128x128 and then 64x64, but only for the tiles that would break the limit of 1.49M points.
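The refinement logic can be sketched numerically: each halving of the tile edge roughly quarters the point count (assuming points are spread evenly), so a hypothetical 5M-point 256m tile needs just one split to get under the 1.49M limit:

```shell
LIMIT=1490000
size=256
points=5000000   # hypothetical point count for one 256 m tile
# halve the edge (quartering the count) until the tile fits under the limit
while [ "$points" -gt "$LIMIT" ]; do
  size=$((size / 2))
  points=$((points / 4))
done
echo "${size} m tiles, ~${points} points each"
```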
Aug 28, 2019 9:51 AM # 
Jagge:
Correct me if I got this wrong, but I'm under the impression KP does not use the information the unlicensed lasground drops. And the return number cedarcreek mentioned should stay there. So these workarounds to preserve some info are really not needed for KP (but other apps may need or use them).

Like Terje wrote, as long as you can get the raw lidar point cloud you should be fine. And you'll get good support here - people coming up with piles of shell commands even without being asked is a good sign of that :)
Aug 28, 2019 12:31 PM # 
hughmac4:
Off topic, but: I just got an e-mail from a trusted member of the AP community, who mentioned he couldn't see my images in Safari. I load from Dropbox, and apparently the proper "use Dropbox to host files for web consumption" URL parameter is 'raw=1', not 'dl=1'. FYI, in case anyone else uses Dropbox as their image repo.

And now I have like 100 image posts to find and edit. Doh!
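For anyone else doing the same cleanup, the swap is a one-liner (the URL below is made up; only the dl=1 to raw=1 parameter change matters):

```shell
url='https://www.dropbox.com/s/abc123/map.png?dl=1'   # hypothetical share link
echo "$url" | sed 's/dl=1/raw=1/'
# -> https://www.dropbox.com/s/abc123/map.png?raw=1
```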
Aug 28, 2019 2:49 PM # 
jjcote:
Meanwhile, I can't see part of that message because it's apparently red and I'm colorblind.
Aug 28, 2019 3:03 PM # 
cedarcreek:
Terje---thanks. I was typing from memory, and I really don't classify files that much. I have found that reclassifying data is useful sometimes. What I think you are saying is that you can detect when the point count in a tile is too large, and you can go to 128 or 64m tiles to get below that limit. Does lastile add the noise and strip data, or is it other steps like lasclassify and las2dem?

I should probably go for the 2^n size tiles (like 256 versus 250), but I like convenient tile coordinates.

Jagge---Yes, definitely for other programs. The pro mapper I work with the most asks for intensity images from ground returns and "all last returns". They're like low-res aerials that strip away the high vegetation---sometimes very helpful. LASTools definitely strips intensity when you breach that point count limit.
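For those intensity images, a rough lasgrid sketch (flag names from memory, particularly -intensity_average; double-check the readme before relying on them):

```
lasgrid -i final\*.laz -keep_class 2 -step 1 -intensity_average -o ground_intensity.png
lasgrid -i final\*.laz -keep_last -step 1 -intensity_average -o last_return_intensity.png
```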
Aug 28, 2019 4:09 PM # 
JanetT:
jjcote:
Meanwhile, I can't see part of that message because it's apparently red and I'm colorblind


It says to use raw=1 instead of dl=1
Aug 29, 2019 11:26 AM # 
hughmac4:
Learning so many things ... fixed with yellow, jjcote?
Aug 30, 2019 12:36 AM # 
jjcote:
Looks great now!

This discussion thread is closed.