Performance comparison of wavelet neural network and adaptive neuro-fuzzy inference system with small data sets

J Mol Graph Model. 2020 Nov:100:107698. doi: 10.1016/j.jmgm.2020.107698. Epub 2020 Jul 24.

Abstract

In this work, the performance of wavelet neural network (WNN) and adaptive neuro-fuzzy inference system (ANFIS) models was compared on small data sets using several criteria: the second-order corrected Akaike information criterion (AICc), Bayesian information criterion (BIC), root mean squared error (RMSE), mean absolute relative error (MARE), coefficient of determination (R2), external Q2 function, and concordance correlation coefficient (CCC). Over-fitting was also assessed. Ten data sets were selected from the literature and each was divided into training, test, and validation sets. Network parameters were optimized for the WNN and ANFIS models, and the architecture with the lowest error was selected for each data set. A careful survey of the number of permitted adjustable parameters (NPAP) and the total number of adjustable parameters (TNAP) in the WNN and ANFIS models showed that 60% of the ANFIS models and 30% of the WNN models were over-fitted. As a rule of thumb, to avoid over-fitting it is suggested that the ratio of the number of observations in the training set to the number of input neurons be greater than 10 for WNN and 20 for ANFIS. The smaller ratio required for WNN indicates its greater flexibility relative to ANFIS, which stems from differences in the structure and connections of the two networks.
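For reference, the performance criteria listed above have standard textbook forms. The sketch below (not taken from the paper) shows one way they might be computed for a fitted model with k adjustable parameters; the least-squares forms of AICc and BIC and Lin's form of the CCC are assumptions here, and the exact formulations used by the authors may differ.

    import numpy as np

    def fit_criteria(y, y_hat, k):
        """Standard goodness-of-fit and model-selection criteria.

        y     : observed values
        y_hat : model predictions
        k     : number of adjustable model parameters
        """
        y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
        n = y.size
        resid = y - y_hat
        rss = np.sum(resid ** 2)

        rmse = np.sqrt(rss / n)                          # root mean squared error
        mare = np.mean(np.abs(resid / y))                # mean absolute relative error
        r2 = 1.0 - rss / np.sum((y - y.mean()) ** 2)     # coefficient of determination

        # Least-squares forms of AIC/BIC (an assumption; the paper may use another form)
        aic = n * np.log(rss / n) + 2 * k
        aicc = aic + 2 * k * (k + 1) / (n - k - 1)       # second-order corrected AIC
        bic = n * np.log(rss / n) + k * np.log(n)

        # Lin's concordance correlation coefficient
        ccc = (2 * np.cov(y, y_hat, bias=True)[0, 1]
               / (y.var() + y_hat.var() + (y.mean() - y_hat.mean()) ** 2))

        return {"RMSE": rmse, "MARE": mare, "R2": r2,
                "AICc": aicc, "BIC": bic, "CCC": ccc}

Note that AICc and BIC penalize the number of adjustable parameters relative to the number of observations, which is why they are useful for detecting over-fitting on small data sets.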

Keywords: Adaptive neuro-fuzzy inference system; Akaike information criterion; Bayesian information criterion; Over-fitting; Wavelet neural network.

MeSH terms

  • Bayes Theorem
  • Fuzzy Logic*
  • Neural Networks, Computer*