Applications of End to End Automatic Speech Recognition

by Shivam Kashyap

This project investigates applications of end-to-end automatic speech recognition (ASR), comparing a Transformer model against a hybrid of RNNs and CNNs trained with CTC loss, for the English language. The primary goal is to evaluate how these architectures perform on sequence-to-sequence tasks that demand accurate temporal alignment and robust handling of variable-length input sequences, specifically speech recognition. Both models were trained and evaluated on the LJ Speech English dataset.

The RNN-CNN model combines the strengths of CNNs for feature extraction and RNNs for sequential processing, with CTC loss enabling alignment-free training. The CNN component encodes local time-frequency features, while the RNN component captures temporal dependencies; together they improve recognition accuracy. The second model is a Transformer, which uses self-attention to capture long-range dependencies without recurrent connections. This design addresses the limitations of RNNs on long sequences and in parallel processing, offering potentially faster training and inference.

Experiments on LJ Speech indicate significant performance differences between the two models. The Transformer also scales more efficiently to large datasets. Comparing word error rate (WER) and computation time, the Transformer achieved a WER 3% to 4% better than the RNN-CNN model.
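The CNN-plus-RNN-plus-CTC pipeline described above can be sketched as follows. This is a minimal PyTorch illustration, not the project's actual code: the layer sizes, vocabulary size, and input shapes are all assumptions chosen for demonstration.

```python
import torch
import torch.nn as nn

class CnnRnnCtc(nn.Module):
    """Hypothetical CNN+RNN acoustic model trained with CTC loss.
    Input: log-mel spectrograms of shape (batch, 1, n_mels, time)."""
    def __init__(self, n_mels=80, vocab_size=29, hidden=256):
        super().__init__()
        # CNN front end encodes local time-frequency features
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, stride=(2, 1), padding=1),
            nn.ReLU(),
        )
        # Bidirectional RNN captures temporal dependencies across frames
        self.rnn = nn.GRU(32 * (n_mels // 2), hidden,
                          batch_first=True, bidirectional=True)
        # Projects to characters; index 0 is reserved for the CTC blank
        self.fc = nn.Linear(2 * hidden, vocab_size)

    def forward(self, x):
        f = self.cnn(x)                          # (B, 32, n_mels/2, T)
        b, c, m, t = f.shape
        f = f.permute(0, 3, 1, 2).reshape(b, t, c * m)
        out, _ = self.rnn(f)
        return self.fc(out).log_softmax(-1)      # (B, T, vocab)

model = CnnRnnCtc()
ctc = nn.CTCLoss(blank=0)
x = torch.randn(2, 1, 80, 100)                   # two dummy utterances
log_probs = model(x).transpose(0, 1)             # CTC expects (T, B, vocab)
targets = torch.randint(1, 29, (2, 20))          # dummy character labels
loss = ctc(log_probs, targets,
           torch.full((2,), 100), torch.full((2,), 20))
```

Because CTC marginalizes over all alignments between the frame-level outputs and the label sequence, no frame-by-frame transcription alignment is needed during training, which is the alignment-free property the abstract refers to.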
Additionally, the Transformer-based model was found to be about five times faster per training epoch, although it required more epochs to converge. The results indicate that RNN-CNN models are effective for tasks dominated by local dependencies, whereas Transformers offer notable benefits in computational efficiency and in handling long-range dependencies. This makes Transformers a compelling option for large-scale English language processing applications.
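WER, the comparison metric used above, is the word-level edit distance between the reference transcript and the model's hypothesis, divided by the number of reference words. A minimal self-contained computation (the sentences are illustrative, not from the dataset):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: Levenshtein distance over reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit-distance table
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution
    return d[len(ref)][len(hyp)] / len(ref)

# One substitution ("sat" -> "sit") and one deletion ("the") in 6 words
print(wer("the cat sat on the mat", "the cat sit on mat"))  # 2/6 ≈ 0.333
```

A 3% to 4% absolute improvement in this ratio, as reported for the Transformer, means roughly 3 to 4 fewer word errors per 100 reference words.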


© 2024 All Rights Reserved Engineer’s Planet
