Generating Potree point clouds

Potree Examples

http://cals.arizona.edu/~tswetnam/potree_test/opuntia_srer.html

http://cals.arizona.edu/~tswetnam/potree_test/big_mesquite_srer.html

http://cals.arizona.edu/~tswetnam/potree_test/wgew_colored.html

Workflow

After finishing the WGEW lidar processing with PDAL and LAStools, I wanted to generate a viewable dataset that could be hosted online for users.

Potree is a nice viewer for examining an entire point cloud at once - it is capable of displaying billions of points through an octree-based point decimation scheme.

In order to get Potree running I first had to install Visual Studio 2015 and Apache/XAMPP - I followed the Getting Started guide on GitHub for the XAMPP installation. You must have the web service running in the background to view your output Potree point clouds in your browser via localhost.

First, I needed to set the document root directory in XAMPP so that I could view the point clouds in my browser.

The httpd.conf file is located in C:\xampp\apache\conf\

Scroll down to line 247 and reset the DocumentRoot and Directory (I've ##'d out the original folder paths in my code below):

## original: DocumentRoot "C:/xampp/htdocs"
## original: <Directory "C:/xampp/htdocs">
DocumentRoot "F:/xampp"
<Directory "F:/xampp">
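Only the path changes on these two lines; the access directives that already sit inside the block in a stock XAMPP install stay put, so the edited section ends up looking roughly like this (the directive lines shown are the XAMPP defaults, not something I added):

DocumentRoot "F:/xampp"
<Directory "F:/xampp">
    Options Indexes FollowSymLinks Includes ExecCGI
    AllowOverride All
    Require all granted
</Directory>

Restart Apache from the XAMPP Control Panel so the new document root takes effect.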

Next, run PotreeConverter in a terminal window (I'm using the OSGeo4W Shell in this instance):

c:
cd Potree
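For reference, the flags in the converter call below mean roughly the following (summarized from the PotreeConverter 1.x help text; the -q selection-quality option only appears in some releases, so treat this as a sketch rather than authoritative documentation):

  --overwrite               overwrite any existing converted output
  -p                        name of the generated viewer HTML page
  -o                        output directory (here, inside the XAMPP document root)
  -d                        diagonal fraction; point spacing is computed as bounding-box diagonal / d
  --output-format           format of the octree node files (LAS here)
  -a                        attributes to carry through to the output (RGB here)
  -q                        point selection quality (NICE here)
  --edl-enabled             turn on Eye-Dome Lighting in the viewer
  --show-skybox             show the skybox background
  --title, --description    text shown on the viewer page
  --source                  input LAS/LAZ file(s)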

The full command and its output:

C:\Potree>PotreeConverter.exe --overwrite -p big_mesquite_srer -o F:\xampp\potree_test -d 400 --output-format LAS -a RGB -q NICE --edl-enabled --show-skybox --title big_mesquite --description "SfM of mature mesquite on the Santa Rita Experimental Range, Arizona. (~47 mil points)" --source  F:\sfm\6_8_2016\big_mesquite\big_mesquite_cleaned_up.las
== params ==
source[0]:              F:\sfm\6_8_2016\big_mesquite\big_mesquite_cleaned_up.las
outdir:                 F:\xampp\potree_test
spacing:                0
diagonal-fraction:      400
levels:                 -1
format:
scale:                  0
pageName:               big_mesquite_srer
output-format:          LAS
projection:
AABB:
min: [-7.12259, -10.4547, 8.44801]
max: [14.1254, 8.30878, 15.8762]
size: [21.248, 18.7635, 7.42819]
cubic AABB:
min: [-7.12259, -10.4547, 8.44801]
max: [14.1254, 10.7933, 29.696]
size: [21.248, 21.248, 21.248]
spacing calculated from diagonal: 0.0920065
READING:  F:\sfm\6_8_2016\big_mesquite\big_mesquite_cleaned_up.las
INDEXING: 1,000,000 points processed; 1,000,000 points written; 4.161 seconds passed
INDEXING: 2,000,000 points processed; 2,000,000 points written; 5.183 seconds passed
INDEXING: 3,000,000 points processed; 3,000,000 points written; 6.172 seconds passed
INDEXING: 4,000,000 points processed; 4,000,000 points written; 7.248 seconds passed
INDEXING: 5,000,000 points processed; 5,000,000 points written; 8.305 seconds passed
INDEXING: 6,000,000 points processed; 6,000,000 points written; 9.395 seconds passed
INDEXING: 7,000,000 points processed; 7,000,000 points written; 10.463 seconds passed
INDEXING: 8,000,000 points processed; 8,000,000 points written; 11.559 seconds passed
INDEXING: 9,000,000 points processed; 9,000,000 points written; 12.712 seconds passed
INDEXING: 10,000,000 points processed; 10,000,000 points written; 13.766 seconds passed
FLUSHING: 5.796s
INDEXING: 11,000,000 points processed; 11,000,000 points written; 20.633 seconds passed
INDEXING: 12,000,000 points processed; 12,000,000 points written; 21.704 seconds passed
INDEXING: 13,000,000 points processed; 13,000,000 points written; 22.786 seconds passed
INDEXING: 14,000,000 points processed; 14,000,000 points written; 23.808 seconds passed
INDEXING: 15,000,000 points processed; 15,000,000 points written; 24.898 seconds passed
INDEXING: 16,000,000 points processed; 16,000,000 points written; 25.974 seconds passed
INDEXING: 17,000,000 points processed; 17,000,000 points written; 27.037 seconds passed
INDEXING: 18,000,000 points processed; 18,000,000 points written; 28.09 seconds passed
INDEXING: 19,000,000 points processed; 19,000,000 points written; 29.224 seconds passed
INDEXING: 20,000,000 points processed; 20,000,000 points written; 30.478 seconds passed
FLUSHING: 7.313s
INDEXING: 21,000,000 points processed; 21,000,000 points written; 38.977 seconds passed
INDEXING: 22,000,000 points processed; 22,000,000 points written; 40.112 seconds passed
INDEXING: 23,000,000 points processed; 23,000,000 points written; 41.248 seconds passed
INDEXING: 24,000,000 points processed; 24,000,000 points written; 42.425 seconds passed
INDEXING: 25,000,000 points processed; 25,000,000 points written; 43.554 seconds passed
INDEXING: 26,000,000 points processed; 26,000,000 points written; 44.649 seconds passed
INDEXING: 27,000,000 points processed; 27,000,000 points written; 45.763 seconds passed
INDEXING: 28,000,000 points processed; 28,000,000 points written; 46.887 seconds passed
INDEXING: 29,000,000 points processed; 29,000,000 points written; 47.962 seconds passed
INDEXING: 30,000,000 points processed; 30,000,000 points written; 49.081 seconds passed
FLUSHING: 6.604s
INDEXING: 31,000,000 points processed; 31,000,000 points written; 56.827 seconds passed
INDEXING: 32,000,000 points processed; 32,000,000 points written; 58.08 seconds passed
INDEXING: 33,000,000 points processed; 33,000,000 points written; 59.284 seconds passed
INDEXING: 34,000,000 points processed; 34,000,000 points written; 60.406 seconds passed
INDEXING: 35,000,000 points processed; 35,000,000 points written; 61.561 seconds passed
INDEXING: 36,000,000 points processed; 36,000,000 points written; 62.604 seconds passed
INDEXING: 37,000,000 points processed; 37,000,000 points written; 63.712 seconds passed
INDEXING: 38,000,000 points processed; 38,000,000 points written; 64.831 seconds passed
INDEXING: 39,000,000 points processed; 39,000,000 points written; 65.722 seconds passed
INDEXING: 40,000,000 points processed; 40,000,000 points written; 66.836 seconds passed
FLUSHING: 8.136s
INDEXING: 41,000,000 points processed; 41,000,000 points written; 76.128 seconds passed
INDEXING: 42,000,000 points processed; 42,000,000 points written; 77.314 seconds passed
INDEXING: 43,000,000 points processed; 43,000,000 points written; 78.555 seconds passed
INDEXING: 44,000,000 points processed; 44,000,000 points written; 79.737 seconds passed
INDEXING: 45,000,000 points processed; 45,000,000 points written; 80.955 seconds passed
INDEXING: 46,000,000 points processed; 46,000,000 points written; 82.249 seconds passed
closing writer
conversion finished
46,843,930 points were processed and 46,843,930 points ( 100% ) were written to the output.
duration: 90.248s
C:\Potree>


Open Chrome and type 'localhost' into the address bar.
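With DocumentRoot set to F:/xampp and the converter output written to F:\xampp\potree_test, the generated page should be reachable at a URL along the lines of http://localhost/potree_test/big_mesquite_srer.html (the exact file name depends on the page name passed with -p).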

 

For the next step, I was interested in coloring the LAS/LAZ data. The most recent aerial orthophotography over Walnut Gulch is the 2015 NAIP.

Downloading NAIP imagery

Follow these instructions:

  1. Go to http://datagateway.nrcs.usda.gov/
    1. Note the System Status to determine whether the NAIP imagery is presently online or offline.
  2. On the home page, click the green Get Data button
  3. Input your state and county of interest and click Submit Selected Counties.
  4. In the next window, scroll down until you reach the heading of Ortho Imagery
  5. Place a check next to the year you want, and then press Continue.
    1. Compressed County Mosaics (CCMs) over 8 gigabytes in size cannot be downloaded from the Data Gateway site.
  6. Read the information; FTP Download is selected for you. Press Continue.
  7. Enter contact information and then press Continue.
  8. Review your order and press the Place Order button.
  9. Within a few hours, you will receive an email with your FTP download link.

Clipping the NAIP Image

The NAIP imagery (.sid) for all of Cochise County is a very large file, even compressed.

I opened the file in QGIS and used gdal_translate to clip it, first to the square boundary area around Walnut Gulch and then to the shapefile perimeter of the lidar flight (see the command sketch below).
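From the command line the two clips look roughly like the following; the extents and input/intermediate file names are placeholders rather than my actual values, and the cutline step uses gdalwarp, which is the usual GDAL tool for clipping to a shapefile outline. -projwin takes the upper-left and lower-right corner coordinates of the rectangle in the raster's coordinate system.

gdal_translate -projwin <ulx> <uly> <lrx> <lry> cochise_naip_2015.sid wgew_naip_square.tif
gdalwarp -cutline wgew_flight_boundary.shp -crop_to_cutline wgew_naip_square.tif wgew_rendered.tif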

First, I tried using libLAS las2las to color the full WGEW.laz file:

las2las -i F:\Woolpert\laz\wgew.laz -o F:\Woolpert\laz\wgew_colored.laz --color-source F:\Walnut_Gulch\NAIP\WGEW_NAIP_2015.tif --color-source-bands 1 2 3

This was rejected because of the file format version - libLAS could not handle the conversion from LAS v1.4.

Then I tried LAStools - this was also problematic, and lascolor is part of the proprietary (licensed) toolset.

Using PDAL to color the cloud with the NAIP image

C:\OSGeo4W64>pdal translate -i F:\Woolpert\las\wgew.las -o F:\Woolpert\las_colored\wgew_colored.las -f filters.colorization --filters.colorization.raster=F:\Walnut_Gulch\NAIP\wgew_rendered.tif
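The same colorization can also be written as a PDAL pipeline JSON, which makes it easier to add more stages later. A minimal sketch, assuming a pipeline file named wgew_color_pipeline.json (a name I made up) and the same paths as above; filters.colorization maps raster bands 1-3 to Red, Green, and Blue by default:

{
    "pipeline": [
        "F:/Woolpert/las/wgew.las",
        {
            "type": "filters.colorization",
            "raster": "F:/Walnut_Gulch/NAIP/wgew_rendered.tif"
        },
        {
            "type": "writers.las",
            "filename": "F:/Woolpert/las_colored/wgew_colored.las"
        }
    ]
}

Run it with:

pdal pipeline wgew_color_pipeline.json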