I would like to share my AI and IoT project at Harvard, built to help blind people navigate the world. I used Edge Impulse to build a sound classification model that distinguishes between environmental noise, motor vehicles, and the alert sound from my ultrasonic distance-sensing device.
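For anyone curious how a three-way result like this might be consumed on-device, here is a minimal, self-contained Python sketch. The label names match the project, but the `classify` function is a placeholder standing in for the actual Edge Impulse inference call, and the confidence threshold is an assumed value:

```python
# Hypothetical post-processing of a 3-class sound classifier's output.
# In the real project the per-label probabilities would come from a
# deployed Edge Impulse model; `classify` here is only a stand-in.

LABELS = ["environmental_noise", "motor_vehicle", "alert_sound"]

def classify(audio_window):
    # Placeholder: a real model would score this audio window.
    return {"environmental_noise": 0.1, "motor_vehicle": 0.2, "alert_sound": 0.7}

def top_prediction(scores, threshold=0.6):
    """Return the most likely label, or None if confidence is too low."""
    label = max(scores, key=scores.get)
    return label if scores[label] >= threshold else None

scores = classify(audio_window=None)
print(top_prediction(scores))  # -> alert_sound
```

A threshold like this is one common way to avoid false alerts when no class is a clear winner.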
Ambient Sound Classification with Edge Impulse, part of my AI project at Harvard to help blind people see through the lens of AI
This is fantastic, great work! If it’s okay with you, I’d love to share this with our community—is there a Twitter handle or a paper you’d like us to include?
By the way, I built the ambient sound classification by following your Water Faucet tutorial. Thanks a lot for this great AI platform; I have a lot of use cases for sound classification.
Kudos to the Edge Impulse team!
Thanks @janjongboom for this amazing platform. Do you have computer vision on your roadmap?
Regards and more power to the Edge Impulse Team!
Yes, we do, but only for a few classes and on still images for now! Expect some interesting news in the coming months!
Awesome, I’ve shared your video on Twitter! Thanks for sharing your work with our community; I’m excited to see what you continue to build.
Here’s the background story of Optisapiens: https://www.braincubation.com/post/how-ford-and-edison-inspired-daughter-and-father-to-build-optisapiens-ai