TY - JOUR
AU - Singh, Manu Pratap
AU - Rashmi, Pratibha
PY - 2023
TI - Convolution Neural Networks of Dynamically Sized Filters with Modified Stochastic Gradient Descent Optimizer for Sound Classification
JF - Journal of Computer Science
VL - 20
IS - 1
DO - 10.3844/jcssp.2024.69.87
UR - https://thescipub.com/abstract/jcssp.2024.69.87
AB - Deep Neural Networks (DNNs), and specifically Convolutional Neural Networks (CNNs), are well suited to sound classification because of their ability to capture patterns in both the time and frequency domains. Convolutional neural networks are mostly trained and tested on time-frequency patches of sound samples represented as 2D pattern vectors, and existing pre-trained models generally use static-sized filters in all convolution layers. In this work, we consider three convolutional neural network architectures with variable-size filters. The training-set pattern vectors, spanning the time and frequency dimensions, are constructed from spectrograms of the input samples. In the proposed architectures, the kernel sizes and the number of kernels vary across layers rather than using fixed-size filters and a static number of channels. The paper further presents a reformulation of the mini-batch stochastic gradient descent optimizer with adaptive learning-rate parameters tailored to the proposed architectures. Experimental results on an existing dataset of sound samples show that the proposed convolutional neural network architectures outperform existing pre-trained networks on the same dataset.