Emotional Speech Recognition for Marathi Language


  • Bharati Borade, Dr. Babasaheb Ambedkar Marathwada University, Aurangabad.
  • Dr. R. R. Deshmukh, Dr. Babasaheb Ambedkar Marathwada University, Aurangabad.


Non-Native Speech Database, Emotional Speech Database, MFCC, LPC, Emotion Recognition


Emotion is a spontaneous mental state that does not result from deliberate effort, and speech conveys many different kinds of emotion. Automatic emotion recognition from human speech is becoming more common because it enhances interaction between people and technology. Several temporal and spectral features can be extracted from human speech, and several methods can be used to categorise pitch-related features, Mel Frequency Cepstral Coefficients (MFCCs), and speech formants. This study examines statistical features, including MFCCs, and uses Linear Discriminant Analysis (LDA) to classify them. The article also describes a database of artificially emotionalised Marathi speech. The data samples were collected from male and female speakers who simulated the target emotions, producing Marathi utterances that could be used in everyday conversation and are interpretable in all of the analysed emotions. Three essential categories (happy, sad, and angry) were used to label the data samples. For MFCC, the training and testing accuracies are 98% and 82%; for LPC, 85% and 82%.
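The feature pipeline the abstract describes (MFCC extraction followed by classification) can be sketched in plain NumPy. The sample rate, frame size, hop, and filter counts below are illustrative assumptions, not the parameters used in this study:

```python
import numpy as np

def mfcc(signal, sr=16000, n_fft=512, hop=160, n_mels=26, n_ceps=13):
    """Return an (n_frames, n_ceps) matrix of MFCCs for a mono signal."""
    # 1. Pre-emphasis boosts high-frequency content.
    sig = np.append(signal[0], signal[1:] - 0.97 * signal[:-1])
    # 2. Split into overlapping frames and apply a Hamming window.
    n_frames = 1 + (len(sig) - n_fft) // hop
    idx = np.arange(n_fft)[None, :] + hop * np.arange(n_frames)[:, None]
    frames = sig[idx] * np.hamming(n_fft)
    # 3. Per-frame power spectrum.
    power = np.abs(np.fft.rfft(frames, n_fft)) ** 2 / n_fft
    # 4. Triangular mel-scale filterbank (mel = 2595 * log10(1 + f/700)).
    mel_pts = np.linspace(0, 2595 * np.log10(1 + (sr / 2) / 700), n_mels + 2)
    hz_pts = 700 * (10 ** (mel_pts / 2595) - 1)
    bins = np.floor((n_fft + 1) * hz_pts / sr).astype(int)
    fbank = np.zeros((n_mels, n_fft // 2 + 1))
    for m in range(1, n_mels + 1):
        left, centre, right = bins[m - 1], bins[m], bins[m + 1]
        fbank[m - 1, left:centre] = (np.arange(left, centre) - left) / max(centre - left, 1)
        fbank[m - 1, centre:right] = (right - np.arange(centre, right)) / max(right - centre, 1)
    # 5. Log filterbank energies, then a DCT-II to decorrelate them.
    feats = np.log(power @ fbank.T + 1e-10)
    n = np.arange(n_mels)
    dct = np.cos(np.pi * np.outer(np.arange(n_ceps), 2 * n + 1) / (2 * n_mels))
    return feats @ dct.T
```

With these settings a one-second 16 kHz utterance yields 97 frames of 13 coefficients each; per-utterance statistics over such a matrix would then be fed to a classifier such as LDA.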


Shrishrimal PP, Deshmukh RR. Design and Development of Spoken Marathi Isolated Words Database for Agriculture Purpose and its Analysis. M. Phil. Computer Science Thesis. 2013 May.

Tiwari V. MFCC and its applications in speaker recognition. International Journal on Emerging Technologies. 2010 Feb;1(1):19-22.

Davis S, Mermelstein P. Comparison of parametric representations for monosyllabic word recognition in continuously spoken sentences. IEEE Transactions on Acoustics, Speech, and Signal Processing. 1980 Aug;28(4):357-66.

Young SJ, Odell J, Ollason D, Valtchev V, Woodland P. The HTK Book, Version 2.1. Department of Engineering, Cambridge University, UK; 1995.

The NIST Year 2001 Speaker Recognition Evaluation Plan. NIST, USA; 2001. Available: http://www.nist.gov/speech/tests/spk/2001/doc/2001-spkrec-evalplan-v05.9.pdf

Skowronski MD, Harris JG. Exploiting independent filter bandwidth of human factor cepstral coefficients in automatic speech recognition. The Journal of the Acoustical Society of America. 2004 Sep;116(3):1774-80.

Mao X, Chen L, Zhang B. Mandarin speech emotion recognition based on a hybrid of HMM/ANN. International Journal of Computers. 2007;1(4):321-4.

Zhou Y, Sun Y, Yang L, Yan Y. Applying articulatory features to speech emotion recognition. In: 2009 International Conference on Research Challenges in Computer Science; 2009 Dec 28. pp. 73-76. IEEE.

Waghmare VB, Deshmukh RR, Shrishrimal PP, Janvale GB, Ambedkar B. Emotion recognition system from artificial Marathi speech using MFCC and LDA techniques. In: Fifth International Conference on Advances in Communication, Network, and Computing (CNC); 2014.

Drgas S, Dabrowski A. Speaker recognition based on multilevel speech signal analysis on Polish corpus. Multimedia Tools and Applications. 2015 Jun;74:4195-211.

Gevaert W, Tsenov G, Mladenov V. Neural networks used for speech recognition. Journal of Automatic control. 2010;20(1):1-7.

Reddy VY, Kumar MP, Sushma M, Kiran G, Gurrala VK. Speech Based Emotion Detection System using MFCC. International Research Journal of Engineering and Technology (IRJET). 2020;7(5):4329-32.

Kolita S, Acharjee PB. Speech Emotion Recognition using Non-linear Classifier-A Review.

Babitha MM, Sushma C. Trends of Artificial Intelligence for online exams in education. International journal of Early Childhood special Education. 2022;14(01):2457-63.

Devi JS, Sreedhar MB, Arulprakash P, Kazi K, Radhakrishnan R. A path towards child-centric Artificial Intelligence based Education. International Journal of Early Childhood. 2022;14(03):2022.

Sreenivasulu D, Sirishadevi J, et al. Implementation of Latest Machine Learning Approaches for Students' Grade Prediction. International Journal of Early Childhood Special Education. 2022 Jun;14(03):9887-94.

Kazi KS, Shirgan SS. Face Recognition based on Principal Component Analysis and Feed Forward Neural Network. In: National Conference on Emerging Trends in Engineering, Technology, Architecture; 2010 Dec. pp. 250-253.

Liyakat KK, Mane VA, Paradeshi KP, Kadam DB, Pandyaji KK. Development of Pose Invariant Face Recognition Method Based on PCA and Artificial Neural Network. Journal of Algebraic Statistics. 2022 Jul 31;13(3):3676-84.

Aavula R, Deshmukh AB, Mane VA, Chavhan GH, Liyakat KK. Design and Implementation of sensor and IoT based Remembrance system for closed one. Telematique. 2022 Aug 15:2769-78.

Nikita S, Sanika C, Nagveni K, Sakshi K, Kazi KS. Announcement system in Bus. Journal of Image Processing and Intelligent Remote Sensing (JIPIRS) ISSN 2815-0953. 2022 Oct 18;2(06):1-0.

Vaijnath SP, Siddheshwar MP, Pradip MS, Liayakat D, Sayyad KK. Smart Safety Device for Women. International Journal of Aquatic Science. 2022 Jan 1;13(1):556-60.

Tadlgi PM, et al. Depression Detection. Journal of Mental Health Issues and Behavior (JHMIB). 2022;2(6):1-7.

Kazi Kutubuddin, et al. Pattern Recognition: An Approach towards Machine Learning. Lambert Publications; 2022. ISBN 978-93-91265-58-8.

Waghmare M, et al. Smart Watch System. International Journal of Information Technology and Computer Engineering (IJITC). 2022;2(6):1-9.

Swami DV, Thamake SS, Ubale NS, Lokhande PV, Liyakat KK. Sending Notification to Someone Missing you Through Smart Watch. International Journal of Information Technology & Computer Engineering (IJITC) ISSN: 2455-5290. 2022 Sep 29;2(05):19-24.

Liyakat KK. Predict the Severity of Diabetes Cases, Using K-Means and Decision Tree Approach.



How to Cite

Bharati Borade, & Dr. R. R. Deshmukh. (2024). Emotional Speech Recognition for Marathi Language. JOURNAL OF ADVANCED APPLIED SCIENTIFIC RESEARCH, 6(3). Retrieved from http://joaasr.com/index.php/joaasr/article/view/936