I am extremely new to building functions and loops. I have looked at previous questions that are similar to my issue but I can't seem to find the solution for my problem. My goal is to extract climate data from a webpage like this:
where I will use this data to calculate growing degree days for a crop growth model. I have had success pulling data using a for loop.
uticaNE <- "https://mesonet.agron.iastate.edu/cgi-bin/request/coop.py?network=NECLIMATE&stations=NE8745&year1=2020&month1=1&day1=1&year2=2020&month2=12&day2=31&vars%5B%5D=gdd_50_86&model=apsim&what=view&delim=comma&gis=no&scenario_year=2019"
friendNE <- "https://mesonet.agron.iastate.edu/cgi-bin/request/coop.py?network=NECLIMATE&stations=NE3065&year1=2020&month1=1&day1=1&year2=2020&month2=12&day2=31&vars%5B%5D=gdd_50_86&model=apsim&what=view&delim=comma&gis=no&scenario_year=2019"
location.urls <- c(uticaNE, friendNE)
location.meso.files <- c("uticaNe.txt", "friendNE.txt")
for(i in seq_along(location.urls)){
download.file(location.urls[i], location.meso.files[i], method="libcurl")
}
I will have around 20 locations I will be pulling data in daily. What I want to do is apply a task where I calculate fahrenheit, GDD, etc. to each file and save the output of each file separately.
This is the following code I have currently.
files <- list.files(pattern="*.txt", full.names=TRUE, recursive=FALSE)
func <- for (i in 1:length(files)){
df <- read.table(files[i], skip=10, stringsAsFactors =
FALSE)
colnames(df) <- c("year", "day", "solrad", "maxC",
"minC", "precipmm")
df$year <- as.f(df$year)
df$day <- as.factor(df$day)
df$maxF <- (df$maxC * (9/5) + 32)
df$minF <- (df$minC * (9/5) + 32)
df$GDD <- (((df$maxF + df$minF)/2)-50)
df$GDD[df$GDD <= 0] <- 0
df$GDD.cumulateive <- cumsum(df$GDD)
df$precipmm.cumulative <- cumsum(df$precipmm)
return(df)
write.table(df, path="./output", quote=FALSE,
row.names=FALSE, col.names=TRUE)
}
data <- apply(files, func)
Any help would be greatly appreciated.
-ML