I am following this tutorial to load LiDAR data into a PostGIS/PostgreSQL database.
The order is:
Create extensions:
CREATE EXTENSION postgis;
CREATE EXTENSION pointcloud;
CREATE EXTENSION pointcloud_postgis;
Look at the metadata:
pdal info --input 20090429_42122c8225_ld_p23.laz --schema
pdal info --input 20090429_42122c8225_ld_p23.laz --metadata --xml
Run the data loading:
pdal pipeline laz2pg.xml
At this step I get an error message:
PDAL: laz2pg.xml: JSON pipeline: Unable to parse pipeline:
* Line 1, Column 1
Syntax error: value, object or array expected.
Does anyone know how to solve this problem?
Please always include errors as text rather than a picture so that it is available for future searches. – PolyGeo ♦ Commented Nov 19, 2017 at 11:16
The Boundless tutorial needs to be caught up with PDAL's JSON pipeline syntax. XML support was dropped for the PDAL 1.6 release. PDAL 1.5 might still work for you, but it should be straightforward to convert the pipelines. Sorry for the churn. – Howard Butler Commented Nov 21, 2017 at 14:02
1 Answer
As Howard Butler stated, you need to use the JSON pipeline syntax with any PDAL version over 1.5. I am using PDAL version 1.7.2 and PostgreSQL version 9.6.
1) Create a text file using Notepad as described below. This particular JSON script reads from a standard .LAS file and writes to a PostgreSQL database table. The PostgreSQL database has the pointcloud and postgis extensions enabled. The lidar records go in as patches (groups) of 600 points per database record. I don't know why they call them "patches", but that's what they decided to name them. Note the double backslashes (\\) in the filename path: JSON doesn't allow single backslashes, so you need to escape them. Save the file as theNameOfYourJsonFile.txt. Obviously you will change the SRID to whatever is appropriate for what you are working on.
{
  "pipeline": [
    {
      "type": "readers.las",
      "filename": "C:\\foldername\\foldername\\foldername\\nameOfLasFile.las",
      "spatialreference": "EPSG:26916"
    },
    {
      "type": "filters.chipper",
      "capacity": 600
    },
    {
      "type": "writers.pgpointcloud",
      "connection": "host='localhost' dbname='nameOfYourDb' user='yourUserName' password='yourPassword'",
      "table": "pcpatches",
      "compression": "dimensional",
      "srid": "26916"
    }
  ]
}
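If PDAL again reports "Unable to parse pipeline", the file usually isn't valid JSON, and unescaped Windows backslashes are a common cause. As a quick sanity check you can parse the file with Python's standard json module before handing it to PDAL. The sketch below (the path and file names are just illustrative placeholders, not from the tutorial) shows why the escaping matters:

```python
import json

# Correctly escaped Windows path: parses as valid JSON.
good = '{"pipeline": [{"type": "readers.las", "filename": "C:\\\\data\\\\lidar.las"}]}'
pipeline = json.loads(good)
print(pipeline["pipeline"][0]["filename"])  # prints C:\data\lidar.las

# Unescaped backslashes: \d is not a valid JSON escape, so parsing fails,
# which is the same class of error PDAL reports as "Unable to parse pipeline".
bad = '{"pipeline": [{"type": "readers.las", "filename": "C:\\data\\lidar.las"}]}'
try:
    json.loads(bad)
except json.JSONDecodeError as exc:
    print("Invalid JSON:", exc.msg)
```

The same check works on your saved pipeline file with json.load(open(path)) before you run pdal pipeline.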
2) I am using the Windows version of PDAL that came with OSGeo4W, so in my case I then open the command window to run the next step. If you are using a different operating system you may need to do this step differently. At the C:\ prompt in the command window, type the following:
pdal pipeline -i C:\foldername\theNameOfYourJsonFile.txt
The pdal pipeline command will run using the JSON from the text file you made. JUST LET IT RUN! It might take anywhere from a few minutes to 10 or 15 minutes; remember, point clouds are HUGE! When it completes you will have a PostgreSQL table with your lidar records imported. Each record will contain a patch of 600 points that can be extracted using SQL. Here is one way to view the data; there are many others:
SELECT PC_AsText(
PC_Explode(nameOfFieldThatContainsYourPatches))
FROM tableName;