
I have written code to filter, group, and sort my large data files. I have multiple text files to analyze. I know I can copy the code and run it with new data, but I was wondering whether there is a way to put this in a for loop that opens the text files one by one, runs the analysis, and stores the results. I use the following to load all my text files; in the next steps I select columns and filter them to find the desired values. At the moment it only reads one file, but I want to obtain results from all the data files.

Samples <- Sys.glob("*.csv")
for (filename in Samples) {
try <- read.csv(filename, sep = ",", header = FALSE)
shear <- data.frame(Load = try[,5], Girder = try[,8], Shear = try[,12])
lane <- shear[which(shear$Load == "LL-1"),]
Ext <- subset(lane, Girder %in% c("Left Ext","Right Ext"))
Max.Ext <- max(Ext$Shear)
}
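One possible shape for such a loop, sketched with invented example files (the column positions and filter values are taken from the code above), keeping one maximum per file in a named vector:

```r
# Invented example files in a temporary directory, 12 columns each,
# matching the layout above: Load in col 5, Girder in col 8, Shear in col 12.
dir.create(workdir <- tempfile()); oldwd <- setwd(workdir)
for (f in c("run1.csv", "run2.csv")) {
  m <- as.data.frame(matrix("x", nrow = 2, ncol = 12))
  m[, 5]  <- "LL-1"
  m[, 8]  <- c("Left Ext", "Right Ext")
  m[, 12] <- if (f == "run1.csv") c(1.0, 2.0) else c(5.0, 3.0)
  write.table(m, f, sep = ",", row.names = FALSE, col.names = FALSE)
}

Samples <- Sys.glob("*.csv")
Max.Ext <- setNames(numeric(length(Samples)), Samples)  # pre-allocate results

for (filename in Samples) {
  dat   <- read.csv(filename, header = FALSE)
  # name the columns so they can be referenced by $Load / $Girder / $Shear
  shear <- data.frame(Load = dat[, 5], Girder = dat[, 8], Shear = dat[, 12])
  lane  <- shear[shear$Load == "LL-1", ]
  Ext   <- subset(lane, Girder %in% c("Left Ext", "Right Ext"))
  Max.Ext[filename] <- max(Ext$Shear, na.rm = TRUE)
}
setwd(oldwd)
Max.Ext  # one maximum per file, named by file
```

Indexing the pre-allocated vector by `filename` keeps each file's result instead of overwriting `Max.Ext` on every iteration.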
Maral Dorri
  • Use `lapply`. Start with `alldat – r2evans Mar 14 '20 at 04:34
  • I just updated the question with a part of my code to clarify. My goal is to run multiple steps to get specific values out of my data files. I want to add this to a for loop so I do not have to run the code for each data file separately. – Maral Dorri Mar 14 '20 at 04:56
  • Are you sure you want `header=FALSE`? – Edward Mar 14 '20 at 05:16

1 Answer


You can put everything that you want to apply to each file in a function:

apply_fun <- function(filename) {

  dat <- read.csv(filename, header = FALSE)  # sep = "," is read.csv's default
  # name the columns so they can be referenced by $Load / $Girder / $Shear below
  shear <- data.frame(Load = dat[, 5], Girder = dat[, 8], Shear = dat[, 12])
  lane <- shear[shear$Load == "LL-1", ]
  Ext <- subset(lane, Girder %in% c("Left Ext", "Right Ext"))
  return(max(Ext$Shear, na.rm = TRUE))
}

Since here we want only one number (the maximum) from each file, we can use `sapply` to apply the function to every file and get back a named vector:

Samples <- Sys.glob("*.csv")
sapply(Samples, apply_fun)
Ronak Shah