Introducing the Breathe Happy Platform
Movement and wellbeing analysis for the modern yogi
We can be blind to the obvious, and we are also blind to our blindness.
- Daniel Kahneman in Thinking, Fast and Slow
When the noted psychologist Kahneman wrote about being "blind to our blindness", he probably wasn't thinking of yoga practitioners in 2021 attempting headstands in their bedrooms, with no way of knowing whether they were following their laptop's instructions correctly.
Having felt the struggles of practitioners over thousands of hours of online classes, our team of technology-minded yogis set out to use computer vision and machine learning to help yoga practitioners at home see these blind spots: helping teachers provide the most useful instruction remotely, and nurturing each student's physical and mental wellbeing.
What does the Platform do?
Breathe Happy is a movement analysis platform that brings cutting-edge video, audio and computer vision technologies to power the future of wellbeing-at-home.
Using the camera on a laptop or smartphone, our proprietary AI algorithms quantify relative changes in balance, breathing and effort, recharging your practice with data-backed insights.
Our custom video player is purpose-built for live, teacher-led yoga classes, not business meetings.
What do you get from the Platform?
Built for Yoga-at-home
We don't just replicate the in-person studio experience, we take it up a notch with features like private 1:1 conversations within a group class. Our intuitive interface lets practitioners navigate by voice command, so they never have to leave their mat mid-class.
Practice Insights backed by Data
We use computer vision to assess the movement, balance and breathing of students remotely. Pose estimation, body segmentation and yoga asana detection are performed by our own custom-built convolutional neural network models, trained on real yoga-class videos covering a wide range of student ages, body shapes and abilities.
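Our models themselves are proprietary, but the general idea can be illustrated with a minimal sketch: a pose-estimation model emits per-frame keypoints, and downstream metrics are computed from them. Here the keypoint names, coordinate convention and the "hip-sway" balance proxy are all illustrative assumptions, not our production pipeline.

```python
# Hypothetical sketch: turn per-frame pose keypoints into a simple
# balance metric (horizontal sway of the hip midpoint across frames).
# Keypoints are assumed to be dicts of joint name -> (x, y) in
# normalised image coordinates, as a pose model might produce.
from statistics import pstdev


def hip_midpoint_x(keypoints):
    """X position of the midpoint between the two hip keypoints."""
    lx, _ = keypoints["left_hip"]
    rx, _ = keypoints["right_hip"]
    return (lx + rx) / 2


def sway_score(frames):
    """Population std dev of the hip midpoint's x position over frames.

    Lower values suggest steadier balance; units are fractions of
    image width. This is an illustrative proxy only.
    """
    xs = [hip_midpoint_x(kp) for kp in frames]
    return pstdev(xs)


# Toy example: a practitioner drifting slightly left and right.
frames = [
    {"left_hip": (0.45, 0.6), "right_hip": (0.55, 0.6)},
    {"left_hip": (0.46, 0.6), "right_hip": (0.56, 0.6)},
    {"left_hip": (0.44, 0.6), "right_hip": (0.54, 0.6)},
]
print(round(sway_score(frames), 4))
```

In practice a balance signal like this would be smoothed over time and normalised per student before it reaches a teacher's dashboard.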
Safer Classes via Machine Learning
A combination of algorithms built on geometric algebra and models trained on annotated videos identifies strain and unsafe joint positions. Teachers are alerted to which student is most at risk of injury, and to where in that student's body the risk lies.
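The geometric core of such a check can be sketched in a few lines. This is a simplified vector-geometry illustration, not our geometric-algebra formulation: the joint names and the 30-degree safe-minimum knee angle are hypothetical assumptions chosen for the example.

```python
# Hypothetical sketch: compute a joint angle from three pose keypoints
# and flag it against an assumed safe range. Coordinates are (x, y)
# pairs in normalised image space.
import math


def joint_angle(a, b, c):
    """Angle at point b, in degrees, between segments b->a and b->c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    # Clamp to guard against floating-point drift outside [-1, 1].
    cos_theta = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_theta))


def knee_alert(hip, knee, ankle, safe_min=30.0):
    """Flag a knee angle closing beyond a hypothetical safe minimum."""
    angle = joint_angle(hip, knee, ankle)
    return angle < safe_min, angle


# Deep squat pose: the knee angle here is 90 degrees, within range.
flagged, angle = knee_alert(hip=(0.5, 0.4), knee=(0.5, 0.6), ankle=(0.7, 0.6))
print(flagged, round(angle, 1))
```

A production system would track angles over time and combine them with learned strain signals before raising an alert, rather than thresholding a single frame.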
Practice Insights can then be used, by students directly or by their teacher, to build a personalised exercise plan that strengthens where a student is weak and lengthens where they are tight.