I know next to nothing about R or any type of coding. I am taking a class that requires analysis of data using R. My final project is to download and partition accelerometer data from smart watches, which I have done successfully. Then I have to run four models: decision tree, random forest, multinomial logistic regression, and SVM. So far I have only gotten the decision tree to work.
The file is so large (3.5 million observations) that I had to select a tiny percentage of the data to get it to run without timing out. My data frame is train1. The variable 'gt' can be walk, sit, stand, stairsup, stairsdown, Null, and bike. I converted it to a factor and called it 'gtF'. I really only wanted to model it against the variables 'x', 'y', and 'z', but it won't run that way, so instead I am trying what you see below, just removing 'Index', 'Model', and 'Device' from the formula. The error is below each block of code.
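Here is roughly how I prepared the data, in case that matters (a minimal sketch; the name fulldata for the full file and the sample size are just placeholders for what I actually did, but the column names gt, x, y, z are from the accelerometer file):

```r
# Take a small random sample so the models finish without timing out
set.seed(123)
train1 <- fulldata[sample(nrow(fulldata), 35000), ]

# Convert the activity label to a factor
train1$gtF <- as.factor(train1$gt)

# What I originally wanted: predict gtF from just x, y, and z
# rf <- randomForest(gtF ~ x + y + z, data = train1)
```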
Can anyone offer suggestions on what I am doing wrong? Please explain it as if you were explaining it to a child because I am a very basic beginner with this.
I've also included my code and errors for SVM and Multinomial Logistic Regression.
My instructor said that some of the error messages indicate there is missing data. I ran mice and got a message saying all data was there.
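In case it helps, this is the kind of check I ran for missing values (a minimal sketch using base R, alongside mice, which reported no missing data):

```r
# Count NA (and NaN) values in every column of train1
colSums(is.na(train1))

# is.na() does not flag Inf, so also check the numeric columns
# for any non-finite values
sapply(Filter(is.numeric, train1), function(v) sum(!is.finite(v)))
```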
Random Forest
library(randomForest)
rf <- randomForest(gtF ~ . - Index - Model - Device - gt,
                   data = train1,
                   ntree = 300,
                   mtry = 8,
                   importance = TRUE,
                   proximity = TRUE)
print(rf)
attributes(rf)
Error in randomForest.default(m, y, ...) : NA/NaN/Inf in foreign function call (arg 1)
Support Vector Machine
library(e1071)
mymodel <- svm(gtF ~ . - Index - Model - Device - gt, data = train1)
summary(mymodel)
plot(mymodel, data = train1, gt ~ x)
Error in `contrasts<-`(`*tmp*`, value = contr.funs[1 + isOF[nn]]) : contrasts can be applied only to factors with 2 or more levels
Multinomial Logistic Regression
library(nnet)
mymodel <- multinom(out ~ . - Index - Model - Device - gt - gtF, data = train1)
summary(mymodel)
Error in `contrasts<-`(`*tmp*`, value = contr.funs[1 + isOF[nn]]) : contrasts can be applied only to factors with 2 or more levels
Thank you!