# Download a zip from sourceurl and unzip it into datadir/dirname,
# skipping the download if that directory already exists.
zip_load <- function(dirname, datadir, sourceurl,
                     existing_dirs = list.files(datadir)) {
  # Show what is already in datadir
  print(existing_dirs)
  if (!(dirname %in% existing_dirs)) {
    zippath <- file.path(datadir, paste0(dirname, '.zip'))
    download.file(sourceurl, destfile = zippath)
    unzip(zippath, exdir = file.path(datadir, dirname))
    # Clean up the zip once it has been extracted
    file.remove(zippath)
  }
}
Download Zip helper
The issue
When we download files from the internet, we often feed in a URL and get back a zip archive, which we then need to unzip before we can use it. The base functions make that fairly simple, but it's worth writing a quick function that does the download, the unzip, and the cleanup of the directory afterwards in one step.
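For a single file, the manual version of those steps looks roughly like the sketch below; the URL and paths here are placeholders, not from the post.

# Minimal one-off sketch of the same steps in base R (placeholder URL and paths)
zippath <- file.path('data', 'example.zip')
download.file('https://example.com/some_data.zip', destfile = zippath)
unzip(zippath, exdir = file.path('data', 'example'))
file.remove(zippath)  # tidy up the zip once extracted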
The function
We want to give it the dirname for the file(s), the datadir that contains our data, and the source URL. The function checks whether that directory already exists in datadir; if it doesn't, it downloads the zip, unzips it into datadir/dirname, and deletes the zip to clean up.
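As a rough illustration of that check (the directory names here are made up), the default existing_dirs is just whatever list.files() finds in datadir, and the download only happens when dirname isn't already among them:

existing_dirs <- list.files('data')   # e.g. c("geofabric", "mdb_boundary")
'mdb_boundary' %in% existing_dirs     # TRUE here, so zip_load() would skip the download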
An example
# Get the Murray-Darling basin boundary
zip_load('mdb_boundary', 'data', "https://data.gov.au/data/dataset/4ede9aed-5620-47db-a72b-0b3aa0a3ced0/resource/8a6d889d-723b-492d-8c12-b8b0d1ba4b5a/download/sworkingadhocjobsj4430dataoutputsmdb_boundarymdb_boundary.zip")

 [1] "42343_shp"
 [2] "ANAE_Rivers_v3_23mar2021"
 [3] "ANAE_Wetlands_v3_24mar2021"
 [4] "DATA_362071"
 [5] "geofabric"
 [6] "MDB_ANAE_Aug2017"
 [7] "mdb_boundary"
 [8] "negbin_testing"
 [9] "pix4dout"
[10] "Surface Water Water Resource Plan Areas"
I’ve saved this in functions/ so I have easy access to it everywhere.
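To use it from another script, something like the following works; the filename is my assumption, so adjust it to match wherever the function is actually saved.

# Assumed filename inside functions/; change to match your own
source(file.path('functions', 'zip_load.R'))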