6 Processing many polar volumes
6.1 Accessing radar data
- US NEXRAD polar volume data can be accessed in the Amazon cloud. Use function download_pvolfiles() for local downloads.
- European radar data can be accessed at https://aloftdata.eu. These are processed vertical profiles; the full polar volume data are not openly available for most countries. Use function download_vpfiles() for local downloads (an example call is sketched below).
The names of the radars in the networks can be found here:
- US: NEXRAD network
- Europe: OPERA database
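For European data, a download call could look like the following minimal sketch; the radar code bejab (Jabbeke, Belgium) and the date range are illustrative assumptions, not values from this tutorial:
# a sketch: download processed vertical profiles for one European radar
# (radar code "bejab" and the dates are illustrative assumptions)
download_vpfiles(date_min = "2017-05-04", date_max = "2017-05-05", radars = "bejab", directory = "./data_vp_eu")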
Useful sites for inspecting pre-made movies of the US composite are https://birdcast.info/migration-tools/live-migration-maps/ and https://www.pauljhurtado.com/US_Composite_Radar/.
6.2 Processing multiple polar volumes
This section contains an example of processing a directory of polar volumes into profiles.
First we download more files and prepare an output directory for storing the processed profiles:
# First we download more data, for a total of one additional hour for the same radar:
download_pvolfiles(date_min = as.POSIXct("2017-05-04 01:40:00"), date_max = as.POSIXct("2017-05-04 02:40:00"), radar = "KHGX", directory = "./data_pvol")
# We will process all the polar volume files downloaded so far:
my_pvolfiles <- list.files("./data_pvol", recursive = TRUE, full.names = TRUE, pattern = "KHGX")
my_pvolfiles
# create output directory for processed profiles
outputdir <- "./data_vp"
dir.create(outputdir)
We will use the following custom settings for processing:
- We will use MistNet to screen out precipitation
- We will calculate profile layers of 50 meter width
- We will calculate 60 profile layers, so up to 50 * 60 = 3000 meter altitude
Note that we enclose the calculate_vp() function in tryCatch() to keep going after errors with specific files, as in the sketch below.
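The processing loop itself can be written as a minimal sketch like the following, assuming the my_pvolfiles and outputdir objects created above (it mirrors the process_file() function defined in the parallel-processing section, but with MistNet enabled):
for (file_in in my_pvolfiles) {
  # construct output filename from input filename
  file_out <- paste(outputdir, "/", basename(file_in), "_vp.h5", sep = "")
  # run calculate_vp() with the settings above; tryCatch() returns NULL
  # instead of stopping the loop when a file fails to process
  vp <- tryCatch(calculate_vp(file_in, file_out, mistnet = TRUE, h_layer = 50, n_layer = 60), error = function(e) NULL)
}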
Having generated the profiles, we can read them into R:
# we assume outputdir contains the path to the directory with processed profiles
my_vpfiles <- list.files(outputdir, full.names = TRUE, pattern = "KHGX")
# print them
my_vpfiles
# read them
my_vplist <- read_vpfiles(my_vpfiles)
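As a quick optional check (not part of the original walk-through), you can print and plot a single profile from the list:
# inspect the first profile in the list
my_vplist[[1]]
plot(my_vplist[[1]])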
You can now continue with visualizing and post-processing as we did earlier:
# make a time series of profiles:
my_vpts <- bind_into_vpts(my_vplist)
# plot them between 0 - 3 km altitude:
plot(my_vpts, ylim = c(0, 3000))
# because of the rain moving in, our ability to estimate bird profiles slowly drops:
# let's visualize rain by plotting all aerial reflectivity:
plot(my_vpts, ylim = c(0, 3000), quantity = "DBZH")
6.3 Parallel processing
We may use one of the parallelization packages in R to further speed up our processing. We will use mclapply() from the package parallel. First we wrap the processing statements in a function, so we can process each file with a single call. We will disable MistNet, as this deep-learning model does not parallelize well on a CPU machine.
process_file <- function(file_in) {
  # construct output filename from input filename
  file_out <- paste(outputdir, "/", basename(file_in), "_vp.h5", sep = "")
  # run calculate_vp()
  vp <- tryCatch(calculate_vp(file_in, file_out, mistnet = FALSE, h_layer = 50, n_layer = 60), error = function(e) NULL)
  if (is.vp(vp)) {
    # return TRUE if we calculated a profile
    return(TRUE)
  } else {
    # return FALSE if we did not
    return(FALSE)
  }
}
# To process a file, we can now run
process_file(my_pvolfiles[1])
Next, we use mclapply() to do the parallel processing for all files:
# load the parallel library
library(parallel)
# detect how many cores we can use. We will keep 2 cores for other tasks and use the rest for processing.
number_of_cores <- detectCores() - 2
# Available cores:
number_of_cores
# let's loop over the files and generate profiles
start <- Sys.time()
mclapply(my_pvolfiles, process_file, mc.cores = number_of_cores)
end <- Sys.time()
# calculate the processing time that has passed:
end - start
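Note that mclapply() parallelizes by forking the R process, which is not available on Windows (there, mc.cores must be 1). A minimal sketch of an alternative using a socket cluster, reusing the process_file() function and objects defined above (this variant is an assumption, not part of the original tutorial):
# on Windows, use a socket cluster instead of forking
cl <- makeCluster(number_of_cores)
# workers start as fresh R sessions: load bioRad and export the objects process_file() needs
clusterEvalQ(cl, library(bioRad))
clusterExport(cl, c("process_file", "outputdir"))
parLapply(cl, my_pvolfiles, process_file)
stopCluster(cl)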