hi, I need help — I need to interface the camera module of a Raspberry Pi and watch real-time video in the Blynk app via the video streaming widget, using Node-Red
There is really only one part of your project that has any direct reference to Blynk, and that is the Video Widget.
However, first you need to Google for the various ways to stream video from an RPi using different types of programming languages and cameras.
Once you have the URL for the stream, you can then enter it directly into the widget, or possibly send it via Node-Red/Blynk integration… but again you probably need to learn that on Node-Red’s website, and by reading this link…
I don’t think Node-Red plays any part in this.
You’ll need to run a script on the Pi which creates an HTTP video stream that is in a format the Blynk video streaming widget can understand and point the video streaming widget at that URL.
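As a starting point, here is a minimal sketch of such a script, assuming the widget accepts a standard `multipart/x-mixed-replace` MJPEG stream (a common format for this kind of widget; check the widget docs for what it actually supports). The frame source is a placeholder — on a real Pi you would capture JPEG frames from the camera (e.g. with the picamera2 library) instead.

```python
# Sketch of an MJPEG-over-HTTP streamer. ASSUMPTIONS: the Blynk video
# widget understands multipart MJPEG, and get_jpeg_frame() is a stand-in
# for real camera capture on the Pi.
import threading
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

BOUNDARY = b"frame"

def get_jpeg_frame() -> bytes:
    # Placeholder frame source: on the Pi, return one JPEG-encoded
    # frame from the camera here. This fake payload just keeps the
    # sketch runnable anywhere.
    return b"\xff\xd8\xff\xdb fake-jpeg-data \xff\xd9"

class StreamHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/stream.mjpg":
            self.send_error(404)
            return
        self.send_response(200)
        self.send_header(
            "Content-Type",
            "multipart/x-mixed-replace; boundary=" + BOUNDARY.decode())
        self.end_headers()
        try:
            while True:
                frame = get_jpeg_frame()
                # One multipart part per frame: boundary, headers, JPEG bytes.
                self.wfile.write(b"--" + BOUNDARY + b"\r\n")
                self.wfile.write(b"Content-Type: image/jpeg\r\n")
                self.wfile.write(b"Content-Length: %d\r\n\r\n" % len(frame))
                self.wfile.write(frame + b"\r\n")
        except (BrokenPipeError, ConnectionResetError):
            pass  # viewer disconnected

    def log_message(self, *args):
        pass  # keep the console quiet

def serve(port=8080):
    # Runs the streamer in a background thread; returns the server object.
    server = ThreadingHTTPServer(("", port), StreamHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

With this running on the Pi, the URL you would point the widget at looks something like `http://<pi-ip>:8080/stream.mjpg`.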
Posts like this one might help:
I am using the camerapi node. I just want a picture as output, but it is creating a 0-byte file
So you’ve changed from wanting real time video in the Blynk video streaming widget to wanting still images.
Do you plan to view these via the new Image widget?
I just want this image first. I am a newbie, so I'm doing it step by step, and I am not getting the image. Here is my flow
But my point is that if your end goal is wanting streaming video then this is just wasted effort. It won’t give you what you need.
If you want streaming video then Node-Red doesn’t play a part in that process; the video stream goes directly from your Pi (acting as a web server) to your mobile device running Blynk.
The video streaming widget doesn’t use virtual pins, it uses a URL. Node-Red doesn’t have a Blynk video streaming widget node for this reason.
I don’t know much about this, but I got the RPi camera module and want to interface it for video streaming and also for face recognition, so this node plays a role — but the basics are not working
So have you seen a project that uses still images to do facial recognition within Node-Red?
I've gone through some of the blogs, but I'm not sure how to do it yet
But the node has to be used, so I just want the images first; after that I will move on to the other parts
I’m virtually a 100% Node-Red user, and love it, but Node-Red has no role in the video streaming process.
My advice would be to get streaming from the Pi to the App working correctly first. The solution you use to get that working will make what you’re currently doing with still images redundant, as the input into your Take Photo node will be the video stream URL, not wherever you’re getting it from at the moment.
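To illustrate that last point, here is a rough sketch of how a still could be cut out of the live MJPEG stream itself, rather than captured by a separate camera node. The stream URL is an assumption — substitute whatever URL your Pi streamer actually serves.

```python
# Sketch: grab one still image from a running MJPEG stream.
# ASSUMPTION: the stream is multipart MJPEG and the example URL below
# is hypothetical -- use your own Pi's stream address.
import urllib.request

def first_jpeg_frame(stream_bytes: bytes) -> bytes:
    """Extract the first complete JPEG frame (SOI..EOI markers) from
    raw multipart MJPEG bytes."""
    start = stream_bytes.find(b"\xff\xd8")       # JPEG start-of-image
    end = stream_bytes.find(b"\xff\xd9", start)  # JPEG end-of-image
    if start == -1 or end == -1:
        raise ValueError("no complete JPEG frame found")
    return stream_bytes[start:end + 2]

def snapshot(url="http://raspberrypi.local:8080/stream.mjpg",
             max_bytes=1 << 20) -> bytes:
    # Read a slice of the live stream and cut out the first frame.
    with urllib.request.urlopen(url) as resp:
        return first_jpeg_frame(resp.read(max_bytes))
```

In a Node-Red flow the same idea would be an http request node pointed at the stream URL, feeding whatever does the image processing.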
I personally doubt very much that you’ll do any facial recognition without pushing the images out to a 3rd party service, and the latency involved with that would probably make the software unusable.
I’d love it if you were able to prove me wrong though.
Sir, I first need to use the camerapi node, but it's not working
I am following this blog too, but it's not working. Please help me with this and then I will move on to the other parts
This is not the “Getting cameras to work on the RPi” forum…
We will gladly assist you with learning about Blynk, and help you with the Video Widget once you get to that point (as in, you have a valid URL for your streaming video)
Until then I recommend you do some Googling for the rest.