I have numeric vectors of different lengths and want to "standardize" them to a common length of, say, 100 so they are comparable, while also transforming the raw signal into a smoother version.
I do need control over the length and 'smoothing' parameters, so I used the get_dct_transform function from the syuzhet package.
However, this is not super efficient with larger raw vectors:
set.seed(123)
sample_space = seq(-2, 2, .01)
a = sample(sample_space, 1000, replace = TRUE)
b = sample(sample_space, 10000, replace = TRUE)
c = sample(sample_space, 100000, replace = TRUE)
system.time(syuzhet::get_dct_transform(raw_values = a, low_pass_size = 5))
# user system elapsed
# 0.020 0.000 0.021
system.time(syuzhet::get_dct_transform(raw_values = b, low_pass_size = 5))
# user system elapsed
# 1.562 0.370 1.937
system.time(syuzhet::get_dct_transform(raw_values = c, low_pass_size = 5))
# user system elapsed
# 152.802 27.118 180.223
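One observation that suggests a much faster route: a low-pass DCT transform only keeps the first low_pass_size coefficients, so there is no need to compute the full O(n²) transform at all. You can compute just those K coefficients directly (O(n·K)) and then evaluate the inverse DCT at the 100 output points (O(100·K)). Below is a minimal sketch of that idea; it assumes an unnormalized DCT-II forward / DCT-III inverse pair, and the final scaling is an assumption that may need adjusting to match get_dct_transform's output (which also rescales its range) exactly:

```r
# Sketch: low-pass DCT smoothing without the full O(n^2) transform.
# Assumes an unnormalized DCT-II forward / DCT-III inverse pair; the
# final scaling here may differ from syuzhet::get_dct_transform's.
fast_dct_transform <- function(raw_values, low_pass_size = 5, out_len = 100) {
  n  <- length(raw_values)
  j  <- seq_len(n) - 1
  ks <- seq_len(low_pass_size) - 1
  # Forward pass: only the first `low_pass_size` DCT-II coefficients, O(n * K)
  coefs <- vapply(ks, function(k) {
    sum(raw_values * cos(pi * k * (j + 0.5) / n))
  }, numeric(1))
  # Inverse pass: evaluate a DCT-III with those coefficients at `out_len` points
  i <- seq_len(out_len) - 1
  out <- vapply(i, function(ii) {
    coefs[1] / 2 + sum(coefs[-1] * cos(pi * ks[-1] * (ii + 0.5) / out_len))
  }, numeric(1))
  out * 2 / n  # normalization chosen so a constant input is reproduced
}
```

Since the cost is linear in the input length, the 100,000-point case above should take a fraction of a second rather than minutes.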
Are there faster ways of doing these kinds of transformations (DCT or FT) in R?
Another SO answer suggested the fft function, but I'd need direct control over the length and filter parameters. The same issue applies to emuR::dct.
I'm not married to the DCT: other methods that transform a vector of length v into a 'smoothed' vector of length n would also be fine.
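Since other smoothing methods are acceptable, one simple alternative is a smoothing spline followed by resampling, which gives direct control over both the output length and the degree of smoothing. This is a sketch rather than a drop-in replacement for get_dct_transform (the `spar` parameter controls smoothness differently than low_pass_size does):

```r
# Sketch: smooth with a cubic smoothing spline, then resample to a fixed length.
# `spar` (typically in (0, 1]) plays the role of the smoothing parameter;
# `out_len` sets the standardized output length.
smooth_resample <- function(x, out_len = 100, spar = 0.7) {
  fit <- smooth.spline(seq_along(x), x, spar = spar)
  predict(fit, x = seq(1, length(x), length.out = out_len))$y
}
```

smooth.spline is in base R's stats package and should stay fast even on the 100,000-point vector, since it fits on a reduced set of knots by default.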