I feel like the primary use case for such a technology is manipulating and profiling people over video chat, maybe even autonomously. Hiring managers, HR, landlords, and police are obvious customers.
This tech (detecting pulse from regular video) has been around for almost 20 years now, and this doesn't seem to have happened yet.
You see this type of thing in spy movies, but I'm not sure it's that useful in real life. You're basically taking one piece of data a polygraph uses, but without the most important component (skin conductance). Polygraph accuracy isn't that great to begin with. You can profile and manipulate people more effectively based on their reactions and behaviour, and their pulse will be much harder to interpret.
I don't know any commercial uses of such tech today. I'm not saying they don't exist. I just don't know of them.
I had said I don't think it's very useful for "manipulating and profiling people over video chat", so I wouldn't really expect there to be a commercial product for that. Probably it's used in fitness or heart rate monitoring apps for people who don't have a fitness tracker device and prefer not to count their pulse manually.
The core algorithm is really simple. You find a patch of skin and take the average color of the pixels in that patch. The color becomes slightly more reddish with each pulse. Do an FFT and take the strongest peak in the plausible heart rate range. You could prototype this in a few hundred lines of Python.
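A minimal sketch of those steps, assuming you've already extracted a per-frame mean color (here just the green channel, which typically carries the strongest pulse signal) from a skin patch. The function name and synthetic input are illustrative, not from any real app:

```python
import numpy as np

def estimate_bpm(green_means, fps):
    """Estimate heart rate from per-frame mean green values of a skin patch."""
    x = np.asarray(green_means, dtype=float)
    x = x - x.mean()                       # remove the DC component
    spectrum = np.abs(np.fft.rfft(x))      # magnitude spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    # restrict to a plausible heart rate band: 40-240 bpm
    band = (freqs >= 40 / 60) & (freqs <= 240 / 60)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0

# Synthetic check: a noisy 72 bpm "pulse" sampled at 30 fps for 10 seconds
fps = 30
t = np.arange(0, 10, 1 / fps)
signal = 0.5 * np.sin(2 * np.pi * (72 / 60) * t) \
    + np.random.default_rng(0).normal(0, 0.1, t.size)
print(round(estimate_bpm(signal, fps)))  # prints 72
```

Real video adds the hard parts this sketch skips: finding and tracking the skin patch, motion artifacts, and lighting changes.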
If this were useful for police or hiring managers, someone could have used the tech to make an app for them at any point in the past 19 years.
Of course, companies have a history of trying to market a lot of BS metrics (e.g. graphology, MBTI) to hiring managers, so I wouldn't be that surprised to see a company claim they can predict employee success using pulse. Whether it works is another story.
Don't get distracted, sit down and read it in full.
Don't get distracted, think about what he wrote.
If you still don't get it, take a step back. Think. Process. Then take a break and read it again tomorrow.
Slow down. Don't get distracted. You don't need to respond so fast. Take your time. There is no rush. There is no shortcut. Read it in full and you'll understand this comment says much more.
Probably because I repeated "don't get distracted". But if you read the article then I think it'll take on a different context, as I'm mimicking the author, including their short paragraph style.
It’s not very accurate, maybe because of camera fidelity. It was about 10 bpm lower than actual for me. It seems to operate off subtle motions caused by the pulse. It was even worse at detecting breathing.
I made my own heart rate app (using Gemini at first and then Claude for lots of further edits): https://xosh.org/heart-rate/ (source: https://github.com/SMUsamaShah/heart-rate). It's all offline. The UI needs more work, but my wife and I are the only users, so it doesn't matter that much.
At one point I added the same EVM-based (Eulerian Video Magnification) heart rate detection, but that requires sitting very still. I mainly use it on my phone, so the common finger-over-the-camera method is the easiest to use.
I would prefer some kind of privacy statement or even some kind of explanation about what is going on before I just randomly turn my webcam on. This might be great and I’m proud of you for launching but I don’t do things like that. Heck, videos can make a person’s heart race - I had my first attack at 39 and that’s a hell of a lot of risk.
I haven't dug deeper due to limited time, but in the same spirit of privacy, I've found:
1. `/api/event` endpoint mentioned in the `/stats/script.js` file;
2. There's `/parties/lobby/main/telemetry` in a minified JavaScript chunk asset;
3. There's VitalLens mentioned, and there's an error string in the same asset: "A valid API key or proxy URL is required to use VitalLens. If you signed up recently, please try again in a minute to allow your API key to become active. Otherwise, head to https://www.rouast.com/api to get a free API key."
The response I anticipate will be "But this will help doctors over telehealth and stuff!" - Please see https://calebhearth.com/dont-get-distracted
Can you cite any commercially available uses of such tech?
Here is the tech demonstrated in 2007: https://pubmed.ncbi.nlm.nih.gov/17074525/
You mean Claude can one-shot this.
1: https://pmc.ncbi.nlm.nih.gov/articles/PMC2717852/
He even shows pulse detection (around 8:30).
It's super cool. Thanks for sharing. I want to build a biofeedback app for meditation and this looks like a good platform to use.
macOS 15.7.1 (24G231) Brave 1.87.186 (Official Build) (arm64) Chromium: 145.0.7632.45
```
try {
  const l = await navigator.mediaDevices.getUserMedia({
    audio: false,
    video: { facingMode: "user" },
  });
  /* ... */
} catch {
  this.showError("Could not access webcam. Please check permissions.");
}
```
There are alternative ways to verify mediaDevices support, e.g. https://addpipe.com/getusermedia-examples/