Spatial data and analysis in R using terra

Spatial analysis and the handling of spatial data have long been powerful capabilities in R. Recently, however, the dominant libraries providing these capabilities (including sp, raster, rgdal, ...) have been retired, and some of them have been removed from CRAN. New replacement packages include terra and sf. This move has advantages: it replaces a mix of incompatible tools with a more standardized set and improves the ability to work with larger, modern datasets. It does mean, however, that many of the tutorials and much of the documentation about spatial data analysis in R, on the Internet and in books, are now also out of date.
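If terra and sf are not already on your system, both are available from CRAN and can be installed in the usual way. A minimal sketch (installation only needs to be done once; loading the library is done once per session):

 # Install the replacement packages from CRAN (only needed once),
 # then load terra, which this tutorial uses throughout.
 install.packages(c("terra", "sf"))
 library(terra)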

This tutorial aims to help fix that situation. In the spirit of most of the tutorials on this site, it uses only open data, and while the data are centred on Carleton University, the same methods should work on similar data from anywhere else in the world. It was developed using R-4.3.1 running within the RStudio environment, which will appear in the screenshots, and a few of the setup instructions are specific to RStudio. Beyond that, the actual processing is demonstrated using console-based R commands only, so it should work in any R environment.

To start with, we need some data. First, we will use some Landsat imagery from the open archive provided by the USGS. To speed things up, I have searched for an image free of cloud cover and clipped it to an area surrounding Carleton University. This file contains that image in GeoTIFF format. Download it into a folder of your choice.
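Once the download finishes, the image can be read into R with terra's rast() function. The file name below is only a placeholder; substitute the actual name of the GeoTIFF you saved, and make sure your working directory is set to the folder containing it (for example with setwd()):

 library(terra)
 # Read the clipped Landsat scene into a SpatRaster object.
 # "landsat_carleton.tif" is a placeholder name; use the file you downloaded.
 landsat <- rast("landsat_carleton.tif")
 landsat        # prints dimensions, resolution, extent, CRS, and layer names
 plot(landsat)  # quick visual check of the bands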

Next, we will get some vector data from the City of Ottawa Open Data portal; there is a lot there you can explore, but to get started, download the Parks and Greenspace data into the same folder you used above.
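Vector layers are handled in terra by the vect() function, which reads any format supported by GDAL (shapefile, GeoJSON, GeoPackage, and so on). The file name below is again a placeholder; adjust it to whatever name and format the portal delivered:

 library(terra)
 # Read the parks polygons into a SpatVector object.
 # "Parks_and_Greenspace.shp" is a placeholder; match it to your download.
 parks <- vect("Parks_and_Greenspace.shp")
 parks        # prints geometry type, extent, CRS, and attribute fields
 plot(parks)  # quick look at the park boundaries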