Fuzzy Model Optimization Using the Amplitude Scale Factor

Semantic Scholar (2020)

Abstract
Classical time series modeling is often used by practitioners for prediction. Many researchers have already developed new methods for time series prediction, one of which is the fuzzy method. Fuzzy models constructed from data commonly do not yet have an optimal Mean Absolute Error (MAE), meaning that the MAE of the model can still be reduced. This paper describes how the MAE of a fuzzy model is reduced by applying an amplitude scale factor. The reduction in MAE is demonstrated mathematically in the proof.
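The abstract does not give the paper's exact construction, but the core idea of reducing MAE with an amplitude scale factor can be illustrated with a minimal sketch: rescale the model's predicted deviations around their mean by a factor, then pick the factor that minimizes MAE. The synthetic series, the stand-in fuzzy model output, and the helper names (`apply_amplitude_scale`, `best_amplitude_scale`) are illustrative assumptions, not the authors' method.

```python
import numpy as np

def mae(actual, predicted):
    """Mean Absolute Error between two equal-length series."""
    return np.mean(np.abs(np.asarray(actual) - np.asarray(predicted)))

def apply_amplitude_scale(predicted, factor):
    """Rescale a prediction's amplitude around its own mean.

    A factor of 1.0 leaves the prediction unchanged; other factors
    stretch or shrink the deviations from the mean.
    """
    predicted = np.asarray(predicted, dtype=float)
    center = predicted.mean()
    return center + factor * (predicted - center)

def best_amplitude_scale(actual, predicted, factors=np.linspace(0.5, 1.5, 101)):
    """Grid-search the amplitude scale factor that minimizes MAE.

    The grid includes 1.0, so the tuned MAE can never exceed the baseline.
    """
    scores = [mae(actual, apply_amplitude_scale(predicted, f)) for f in factors]
    i = int(np.argmin(scores))
    return factors[i], scores[i]

# Example: a model whose output tracks the data but with damped amplitude,
# standing in for an untuned fuzzy model.
rng = np.random.default_rng(0)
t = np.arange(100)
actual = np.sin(0.2 * t) + 0.1 * rng.standard_normal(100)
fuzzy_pred = 0.7 * np.sin(0.2 * t)

base = mae(actual, fuzzy_pred)
factor, tuned = best_amplitude_scale(actual, fuzzy_pred)
print(f"MAE before scaling: {base:.4f}")
print(f"MAE after scaling by {factor:.2f}: {tuned:.4f}")
```

In this toy setting the untuned model under-predicts the amplitude of the series, so a scale factor above 1.0 lowers the MAE, mirroring the improvement the paper claims to prove mathematically.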