Project: Pepperdine

Goal: To create a simple voice assistant using ML5.js. The assistant can understand commands related to local weather conditions and the user's calendar.

How it works
Speech recognition and synthesis are provided by the browser's Web Speech API. Audio from the microphone is processed by the SpeechRecognition interface and transcribed to a string. That string is then passed into a charRNN model, where it is compared against a dataset of "command : action" pairs,
e.g. "do i need a jacket : weather"
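The "command : action" pairs can be stored as plain text, one pair per line. A minimal sketch of parsing that format into usable objects (the ` : ` separator is inferred from the example above; the function name is an assumption, not the project's actual API):

```javascript
// Parse training lines of the form "command : action" into an
// array of { command, action } pairs. Lines that lack the
// " : " separator are skipped.
function parseCommandDataset(text) {
  return text
    .split("\n")
    .map((line) => line.trim())
    .filter((line) => line.includes(" : "))
    .map((line) => {
      const [command, action] = line.split(" : ");
      return { command: command.trim(), action: action.trim() };
    });
}

const sample = [
  "do i need a jacket : weather",
  "what is on my schedule today : calendar",
].join("\n");

console.log(parseCommandDataset(sample));
```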

The charRNN model returns a command string (in the previous example, "weather"). This string is then processed further to determine the proper resolution of the user's initial input. In this case, the application directs the user to the appropriate handler.
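The resolution step above can be sketched as a lookup from the model's returned action string to a handler function. The handler names and fallback message here are illustrative assumptions, not the project's actual API:

```javascript
// Hypothetical handlers -- stand-ins for whatever the real
// application does for each recognized action.
const handlers = {
  weather: () => "showing local weather",
  calendar: () => "opening your calendar",
};

// Route the action string returned by the charRNN model to the
// matching handler; fall back to a help message when the action
// is not recognized.
function resolveAction(action) {
  const handler = handlers[action.trim().toLowerCase()];
  return handler ? handler() : "sorry, I didn't understand that";
}

console.log(resolveAction("weather"));   // "showing local weather"
console.log(resolveAction("gibberish")); // "sorry, I didn't understand that"
```

Keeping the handlers in a plain object makes it easy to add new actions later without touching the routing logic.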

Limitations
- The application has an admittedly poor dataset, and results are sometimes incorrect.
- There is currently no way to retrain the application when it gives an incorrect answer. A Pepperdine Training API in future versions?

Educational Value for Students
- Teaches students, in general terms, how their voice assistants work.
- Offers a great final/midterm project for students learning about machine learning with ML5.
- May require students to compile their own models and assemble their own datasets.
- The project offers a lot of room for enhancements; there are tons of creative avenues to explore!