I see that you’re using a custom pipeline to reduce latency. The pipeline looks fine, so perhaps the RTSP stream is outputting a format that BrainFrame doesn’t expect. Does this stream connect properly without a custom pipeline?
To diagnose the custom pipeline problem, we’ll need more logs. Please create a new file at /usr/local/share/brainframe/.env (assuming you used the BrainFrame CLI to install) with the following contents:
Then, try creating the stream with your custom pipeline again and send me the new logs. Hopefully this will provide us with enough information to diagnose the problem.
BrainFrame (Server) doesn’t have any buffers, but the BrainFrame Client does. The Client has to buffer each frame until it receives the corresponding inference results from the server, so that it can pair frames with their bounding boxes. It does this even when no capsules are loaded on the server.
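To illustrate, the pairing behavior described above can be sketched roughly like this. This is a hypothetical, simplified sketch, not BrainFrame’s actual implementation; the class and method names (`FramePairingBuffer`, `add_frame`, `add_result`) are invented for illustration:

```python
from collections import OrderedDict

class FramePairingBuffer:
    """Holds frames until the inference result with the same timestamp
    arrives, then releases the frame and its detections together.
    (Illustrative sketch only, not BrainFrame's real API.)"""

    def __init__(self):
        # timestamp -> frame, kept in arrival order
        self._frames = OrderedDict()

    def add_frame(self, timestamp, frame):
        # Buffer the frame until its inference result comes back.
        self._frames[timestamp] = frame

    def add_result(self, timestamp, detections):
        # Pair the result with its buffered frame. Returns None if no
        # frame with that timestamp is buffered.
        frame = self._frames.pop(timestamp, None)
        if frame is None:
            return None
        return frame, detections

buf = FramePairingBuffer()
buf.add_frame(0.033, "frame-1")
buf.add_frame(0.066, "frame-2")
paired = buf.add_result(0.033, ["person @ (10, 20, 50, 80)"])
# paired == ("frame-1", ["person @ (10, 20, 50, 80)"])
```

The key point is that display is gated on inference: a frame can’t be drawn with its bounding boxes until the server’s results for that exact frame have arrived, which is why the Client buffers even when no capsules are loaded.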
Did you know that the BrainFrame Client is actually open source? The source code can be found here. All of the logic for displaying video and bounding boxes lives in that repository, and it’s easy to build and package as a snap if you want to distribute a modified version.
Feel free to ask any questions if this is something you are interested in.