USB/CSI Camera or a Webcam on Stream
For Windows
You can make a USB camera or webcam available to a Stream container by creating an RTSP server on your host Windows OS. Just follow these steps:
- Create a folder on your host machine where the RTSP server files will reside.
- Go to the mediamtx releases repository and download the zip file.
- Once you have it in your local folder, extract it.
- Open the mediamtx.yml file in a text editor and make the following changes in the paths section. Replace:
paths:
  # example:
  # my_camera:
  #   source: rtsp://my_camera
with:
paths:
  camera01:
    runOnInit: ffmpeg -f dshow -video_size 1280x720 -framerate 30 -i video="<YOUR_VIDEO_CAMERA_NAME>" -pix_fmt yuv420p -c:v libx264 -preset ultrafast -b:v 600k -f rtsp rtsp://localhost:$RTSP_PORT/$MTX_PATH
    runOnInitRestart: yes
Dshow parameters used:
- -video_size: By default, ffmpeg receives input from the camera at a 640x480 resolution; with this parameter you can select the desired one. This example sets it to 1280x720.
- -framerate: Frame rate (FPS). This example sets it to 30.
Replace <YOUR_VIDEO_CAMERA_NAME> with the video camera name you want to use.
If you don't know the name of your camera, you can list all your audio/video devices using this command:
ffmpeg -list_devices true -f dshow -i dummy
From the results list, get the one you need to use, e.g.:
...
[dshow @ 000001c636eadcc0] "c922 Pro Stream Webcam" (video)
...
Copy the descriptive name shown in quotes (e.g. c922 Pro Stream Webcam) and paste it into the required field in the mediamtx.yml file.
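Optionally, you can verify that ffmpeg is able to open the camera at the chosen resolution and framerate before editing mediamtx.yml. This is only a quick sanity check using the example camera name above (substitute your own); -t 5 captures for five seconds and -f null - discards the output:
ffmpeg -f dshow -video_size 1280x720 -framerate 30 -i video="c922 Pro Stream Webcam" -t 5 -f null -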
- Save the changes you made in the file.
- Execute the mediamtx.exe program. You should be able to see a Windows cmd output like this:
2024/02/06 23:55:52 INF MediaMTX v1.5.1
2024/02/06 23:55:52 INF configuration loaded from C:\Users\User\Desktop\mediamtx\mediamtx.yml
2024/02/06 23:55:52 INF [path camera01] runOnInit command started
2024/02/06 23:55:52 INF [RTSP] listener opened on :8554 (TCP), :8000 (UDP/RTP), :8001 (UDP/RTCP)
2024/02/06 23:55:52 INF [RTMP] listener opened on :1935
2024/02/06 23:55:52 INF [HLS] listener opened on :8888
2024/02/06 23:55:52 INF [WebRTC] listener opened on :8889 (HTTP), :8189 (ICE/UDP)
2024/02/06 23:55:52 INF [SRT] listener opened on :8890 (UDP)
ffmpeg version 6.1.1-essentials_build-www.gyan.dev Copyright (c) 2000-2023 the FFmpeg developers
built with gcc 12.2.0 (Rev10, Built by MSYS2 project)
configuration: --enable-gpl --enable-version3 --enable-static --pkg-config=pkgconf --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-zlib --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-sdl2 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxvid --enable-libaom --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-libfreetype --enable-libfribidi --enable-libharfbuzz --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec --enable-nvenc --enable-dxva2 --enable-d3d11va --enable-libvpl --enable-libgme --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libtheora --enable-libvo-amrwbenc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-librubberband
libavutil 58. 29.100 / 58. 29.100
libavcodec 60. 31.102 / 60. 31.102
libavformat 60. 16.100 / 60. 16.100
libavdevice 60. 3.100 / 60. 3.100
libavfilter 9. 12.100 / 9. 12.100
libswscale 7. 5.100 / 7. 5.100
libswresample 4. 12.100 / 4. 12.100
libpostproc 57. 3.100 / 57. 3.100
Input #0, dshow, from 'video=c922 Pro Stream Webcam':
Duration: N/A, start: 184827.343686, bitrate: N/A
Stream #0:0: Video: mjpeg (Baseline) (MJPG / 0x47504A4D), yuvj422p(pc, bt470bg/bt709/unknown), 1280x720, 30 fps, 30 tbr, 10000k tbn
Stream mapping:
Stream #0:0 -> #0:0 (mjpeg (native) -> h264 (libx264))
Press [q] to stop, [?] for help
[swscaler @ 0000016617efca00] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 000001661b3a7040] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 0000016617efca00] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 000001661b3a6340] deprecated pixel format used, make sure you did set range correctly
[libx264 @ 00000166154bef80] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
[libx264 @ 00000166154bef80] profile Constrained Baseline, level 3.1, 4:2:0, 8-bit
[libx264 @ 00000166154bef80] 264 - core 164 r3172 c1c9931 - H.264/MPEG-4 AVC codec - Copyleft 2003-2023 - http://www.videolan.org/x264.html - options: cabac=0 ref=1 deblock=0:0:0 analyse=0:0 me=dia subme=0 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=18 lookahead_threads=3 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=250 keyint_min=25 scenecut=0 intra_refresh=0 rc=abr mbtree=0 bitrate=600 ratetol=1.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=0
2024/02/06 23:55:52 INF [RTSP] [conn [::1]:52234] opened
2024/02/06 23:55:52 INF [RTSP] [session bbd94c07] created by [::1]:52234
2024/02/06 23:55:52 INF [RTSP] [session bbd94c07] is publishing to path 'camera01', 1 track (H264)
Output #0, rtsp, to 'rtsp://localhost:8554/camera01':
Metadata:
encoder : Lavf60.16.100
Stream #0:0: Video: h264, yuv420p(tv, bt470bg/bt709/unknown, progressive), 1280x720, q=2-31, 600 kb/s, 30 fps, 90k tbn
Metadata:
encoder : Lavc60.31.102 libx264
Side data:
cpb: bitrate max/min/avg: 0/0/600000 buffer size: 0 vbv_delay: N/A
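Optionally, you can confirm that the stream is being published before starting the Stream container. Assuming your ffmpeg download also includes ffplay (the essentials build shown above does), run:
ffplay rtsp://localhost:8554/camera01
A window with the live camera feed should open.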
Now that the RTSP server is running, you need to add the --network=host parameter to run the Stream container:
docker run... --network=host platerecognizer/alpr-stream:latest
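A complete command might look like the following sketch, which reuses the volume and environment variables from the Linux example later in this document (adjust the stream directory, License Key and Token for your setup):
docker run -t --network=host --name stream --restart="unless-stopped" -v /path/to/stream_dir:/user-data -e LICENSE_KEY=XXXXX -e TOKEN=YYYYY platerecognizer/alpr-stream:latest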
Once the Stream container is running, open the Stream config.ini file and add the URL of the RTSP server running on the host machine to the url field in the camera section. More information about the URL configuration here.
Make sure to use the correct URL.
Format: rtsp://<WINDOWS_HOST_IP_ADDRESS>:<RTSP_PORT>/<CAMERA_SERVER_PATH>
Example: rtsp://192.168.0.59:8554/camera01
- rtsp is the service protocol.
- 192.168.0.59 is the Windows host machine IP address.
- 8554 is the default RTSP port.
- camera01 is the camera path on the server.
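For example, with the values above, the url line in the camera section of config.ini would look like this (192.168.0.59 is just the example address; use your host's actual IP):
url = rtsp://192.168.0.59:8554/camera01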
USB Camera For Linux
- Find the id of the camera. For example, /dev/video0.
ls -ltrh /dev/video*
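If several /dev/video* nodes are listed, you can optionally map device names to nodes with v4l2-ctl from the v4l-utils package (the same tool used in the Raspberry Pi section below):
v4l2-ctl --list-devices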
- Update the url parameter in your config.ini. Use the id of the camera from the previous command. For example, /dev/video0.
url = '/dev/video0'
- Start Stream with the docker run command below. Use the camera id from the previous command, for example /dev/video0, and set your own LICENSE_KEY, TOKEN and volume for /path/to/stream_dir. Remove the --runtime nvidia line if Docker complains about it.
docker run \
-t \
--runtime nvidia \
--privileged \
--name stream \
--restart="unless-stopped" \
--user `id -u`:`id -g` \
--group-add video \
-v /path/to/stream_dir:/user-data \
-e OPENCV_API_PREFERENCE=200 \
-e LICENSE_KEY=XXXXX \
-e TOKEN=YYYYY \
platerecognizer/alpr-stream:<your-tag>
To overwrite camera settings, you can use the OpenCV CAP_ properties listed here and include them in the run command as -e flags.
docker run ... -e CAP_PROP_FRAME_WIDTH=1280 -e CAP_PROP_FRAME_HEIGHT=720 platerecognizer/alpr-stream
Make sure that the selected CAP_ property is supported by the camera hardware.
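One way to check which resolutions and framerates the camera actually supports is a host-side query with v4l2-ctl (assuming the camera is /dev/video0, as in the example above):
v4l2-ctl -d /dev/video0 --list-formats-ext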
CSI Camera For Raspberry Pi: Camera Module 2/3/4
We recommend updating your OS and packages before starting the process.
- Choose a format supported by the camera hardware (use the command v4l2-ctl --list-formats-ext from the v4l-utils apt package). For example: width=1280, height=1080, framerate=15/1, format=UYVY.
- Set the camera url in config.ini to:
url = "libcamerasrc ! video/x-raw, width=AAAAA, height=BBBBB, framerate=CCCCC/1, format=DDDDD ! videoconvert ! video/x-raw,format=(string)BGR ! appsink max-buffers=5"
- Start Stream with the docker run command below. Set your own LICENSE_KEY, TOKEN and volume for /path/to/stream_dir. Notice the new environment variables, volumes and flags.
docker run \
-t \
--privileged \
--name stream \
--restart="unless-stopped" \
--user `id -u`:`id -g` \
--group-add video \
-v /run/udev:/run/udev \
-v /path/to/stream_dir:/user-data \
-e OPENCV_API_PREFERENCE=1800 \
-e LICENSE_KEY=XXXXX \
-e TOKEN=YYYYY \
platerecognizer/alpr-stream:arm
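If the container cannot open the camera, a useful first check is whether libcamera detects the module on the host. On Raspberry Pi OS the tool is called libcamera-hello (renamed rpicam-hello in newer releases); this check is independent of Stream:
libcamera-hello --list-cameras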
Read more about GStreamer and libcamera if you need to toggle specific camera options or switch to another device.
Jetson USB/CSI Camera
To use onboard cameras, a few additional configurations are necessary.
- First, get your available camera resolutions:
gst-launch-1.0 nvarguscamerasrc sensor-id=0
- Set the camera url in config.ini to:
url = "nvarguscamerasrc ! video/x-raw(memory:NVMM), width=AAAAA, height=BBBBB, framerate=CCCCC/1, format=NV12 ! nvvidconv flip-method=0 ! video/x-raw,width=960, height=616 ! nvvidconv ! video/x-raw ! videoconvert ! video/x-raw,format=(string)BGR ! videoconvert ! appsink max-buffers=5"
Change AAAAA, BBBBB, and CCCCC to values available for your camera. The framerate (FPS) can be any value up to the maximum available for the specified resolution.
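For instance, assuming a hypothetical sensor mode of 1920x1080 at 30 FPS (check the output of the gst-launch-1.0 command above for the modes your camera actually reports), the line would become:
url = "nvarguscamerasrc ! video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1, format=NV12 ! nvvidconv flip-method=0 ! video/x-raw,width=960, height=616 ! nvvidconv ! video/x-raw ! videoconvert ! video/x-raw,format=(string)BGR ! videoconvert ! appsink max-buffers=5"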
- Now you can run the application. Notice the new environment variable -e OPENCV_API_PREFERENCE=1800, volume and device. The list of OPENCV_API_PREFERENCE values can be found in the OpenCV VideoCaptureAPIs documentation.
docker run \
-t \
--runtime nvidia \
--privileged \
--name stream \
--restart="unless-stopped" \
--user `id -u`:`id -g` \
--group-add video \
-v /tmp/argus_socket:/tmp/argus_socket \
-v /path/to/stream_dir:/user-data \
-e OPENCV_API_PREFERENCE=1800 \
-e LICENSE_KEY=XXXXX \
-e TOKEN=YYYYY \
platerecognizer/alpr-stream:jetson
For the commands above, make sure to:
- Change XXXXX to the License Key that we gave you.
- Change YYYYY to your Plate Recognizer Token.