I am trying to rasterize a shapefile, and write values from a specific column of the shapefile into the resulting GTiff.
Here is what I've done so far, but that only creates a GTiff of zeros.
from osgeo import gdal
from osgeo import ogr
from osgeo import gdalconst
ndsm = 'base.tif'
shp = 'polygon.shp'
data = gdal.Open(ndsm, gdalconst.GA_ReadOnly)
geo_transform = data.GetGeoTransform()
source_layer = data.GetLayer()
x_min = geo_transform[0]
y_max = geo_transform[3]
x_max = x_min + geo_transform[1] * data.RasterXSize
y_min = y_max + geo_transform[5] * data.RasterYSize
x_res = data.RasterXSize
y_res = data.RasterYSize
mb_v = ogr.Open(shp)
mb_l = mb_v.GetLayer()
pixel_width = geo_transform[1]
target_ds = gdal.GetDriverByName('GTiff').Create('my.tiff', x_res, y_res, 1, gdal.GDT_Byte)
target_ds.SetGeoTransform((x_min, pixel_width, 0, y_min, 0, pixel_width))
band = target_ds.GetRasterBand(1)
NoData_value = -999999
band.SetNoDataValue(NoData_value)
band.FlushCache()
gdal.RasterizeLayer(target_ds, [1], mb_l, options=["ATTRIBUTE=hedgerow"])
How do I get the values of column 'hedgerow' into the GTiff?
In addition, I have to use the option 'ALL_TOUCHED=TRUE' with RasterizeLayer. What does the gdal.RasterizeLayer call have to look like with both options in it?
Edit: using gdal.RasterizeLayer(target_ds, [1], mb_l, options=["ATTRIBUTE=hedgerow", "ALL_TOUCHED=TRUE"])
runs without any problem, yet I only get zeros.
3 Answers
I tried out your code, slightly modified to adapt it to my data (adding your 'hedgerow' field to my shapefile), and it ran well after adding just one line to your code: target_ds = None
from osgeo import gdal
from osgeo import ogr
from osgeo import gdalconst

# Base raster that supplies the extent, resolution and geotransform
ndsm = '/home/zeito/pyqgis_data/utah_demUTM2.tif'
# Shapefile whose 'hedgerow' attribute will be burned into the output
shp = '/home/zeito/pyqgis_data/polygon8.shp'

data = gdal.Open(ndsm, gdalconst.GA_ReadOnly)
geo_transform = data.GetGeoTransform()
#source_layer = data.GetLayer()
x_min = geo_transform[0]
y_max = geo_transform[3]
x_max = x_min + geo_transform[1] * data.RasterXSize
y_min = y_max + geo_transform[5] * data.RasterYSize
x_res = data.RasterXSize
y_res = data.RasterYSize

mb_v = ogr.Open(shp)
mb_l = mb_v.GetLayer()
pixel_width = geo_transform[1]

output = '/home/zeito/pyqgis_data/my.tif'
target_ds = gdal.GetDriverByName('GTiff').Create(output, x_res, y_res, 1, gdal.GDT_Byte)
target_ds.SetGeoTransform((x_min, pixel_width, 0, y_min, 0, pixel_width))
band = target_ds.GetRasterBand(1)
NoData_value = -999999
band.SetNoDataValue(NoData_value)
band.FlushCache()

# Burn the 'hedgerow' attribute values into band 1
gdal.RasterizeLayer(target_ds, [1], mb_l, options=["ATTRIBUTE=hedgerow"])

# Closing the dataset flushes it to disk
target_ds = None
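For reference, the extent arithmetic in the code above can be checked in isolation. The geotransform and raster size below are made-up values standing in for the base raster's metadata:

```python
# Hypothetical values standing in for the base raster's metadata
geo_transform = (430000.0, 30.0, 0.0, 4520000.0, 0.0, -30.0)
x_size, y_size = 200, 150  # RasterXSize, RasterYSize

x_min = geo_transform[0]                   # west edge
y_max = geo_transform[3]                   # north edge (origin is top-left)
x_max = x_min + geo_transform[1] * x_size  # east edge
y_min = y_max + geo_transform[5] * y_size  # south edge (pixel height is negative)

print(x_max, y_min)  # 436000.0 4515500.0
```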
This is the condition before running the code:
After running the code I got:
The shapefile was correctly rasterized by the 'hedgerow' field.
Sorry, adding target_ds = None doesn't change my result. I'm not sure whether the extent is the problem: the shapefile has a much bigger spatial extent than the GTiff I take the extent from, yet the GTiff created with gdal.RasterizeLayer has the right extent. I also checked the layer retrieved from the shapefile after opening it, and there are more than 300,000 features in it; I also checked that the column 'hedgerow' is actually there, and it is. Is there a specific format the 'hedgerow' column has to have? – Thomas Becker, Oct 3, 2016
I saw that your raster data type is Byte; my 'hedgerow' column was integer. – xunilk, Oct 3, 2016
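The data type matters here because a GDT_Byte band can only store 0–255; values outside that range (including the NoData value of -999999 set in the code above) wrap around. A quick NumPy sketch of the effect, with made-up sample values:

```python
import numpy as np

# Made-up attribute values, including the NoData value from the code above
values = np.array([-999999, 300, 42], dtype=np.int64)

# Casting to uint8 mimics what storing these in a GDT_Byte band does:
# out-of-range values wrap modulo 256
as_byte = values.astype(np.uint8)
print(as_byte)  # [193  44  42]
```

So if the 'hedgerow' values exceed 255, or the NoData value is relied on, a wider type such as gdal.GDT_Float32 is worth considering.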
I followed the code and implemented it in my project, and got the following error: AttributeError: 'Dataset' object has no attribute 'SetGeotransform' – Sai Krishna, Mar 25, 2021
This answer was produced four and a half years ago (using QGIS 2); several things could have changed. I will try it in QGIS 3 to evaluate which issues are at play. – xunilk, Mar 25, 2021
It's my first time writing here, but I hope I can give you some insight. I used the code from your post, and with some minor changes it worked... kind of. I compared correlation matrices computed from some of my classification data against both data rasterized with the QGIS tool and the raster generated by this code: the raster from this code had a kappa coefficient near 0.4, while the QGIS one was near 0.9. I think the problem lies in
target_ds.SetGeoTransform((x_min, pixel_width, 0, y_min, 0, pixel_width))
Using the base raster's .GetGeoTransform() solved the problem for me (I got the same correlation matrices as from the QGIS data):
target_ds.SetGeoTransform(data.GetGeoTransform())
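The underlying issue is that a GDAL geotransform's origin is the top-left corner: the fourth element must be y_max and the pixel height (sixth element) is negative, whereas the hand-built tuple uses y_min and a positive height. A pure-Python sketch with made-up numbers:

```python
# Hypothetical source geotransform: (x_min, pixel_w, 0, y_max, 0, -pixel_h)
geo_transform = (500000.0, 10.0, 0.0, 4600000.0, 0.0, -10.0)
rows = 100  # hypothetical RasterYSize

x_min, pixel_width = geo_transform[0], geo_transform[1]
y_max = geo_transform[3]
y_min = y_max + geo_transform[5] * rows

# The tuple built in the original code: wrong origin row, positive height
hand_built = (x_min, pixel_width, 0, y_min, 0, pixel_width)

# What the output raster actually needs -- identical to the source
correct = (x_min, pixel_width, 0, y_max, 0, geo_transform[5])

assert correct == geo_transform
assert hand_built != geo_transform
```

This is why simply reusing data.GetGeoTransform() produces a correctly aligned raster.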
Adding to xunilk's answer: to georeference the output, also copy the projection from the base raster:
target_ds.SetProjection(data.GetProjection())
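The two calls can be wrapped in a small helper (the name georeference_like is my own, not part of GDAL) so the output always inherits the base raster's georeferencing:

```python
def georeference_like(src_ds, dst_ds):
    """Copy geotransform and projection from src_ds to dst_ds so the
    output raster lines up exactly with the base raster."""
    dst_ds.SetGeoTransform(src_ds.GetGeoTransform())
    dst_ds.SetProjection(src_ds.GetProjection())
```

With the code above, this would be called as georeference_like(data, target_ds) right after Create().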