# Container Build
Every component of the architecture is containerized, so building and running each container requires a specific set of commands. The following commands can be used for the respective container builds.
The stream classification inference server is responsible for batched inference, results computation, and client handling. It can track multiple clients and multiple input modes for inference from each client. To build the container, the following command can be used:

```shell
docker build -t stream_classification:server -f Build_Server .
```

To execute the container, the following command can be used:
```shell
docker run --rm -it --gpus all \
    -v [ Optional: Your Path to Model Weights File ]:/resources/convnext.model \
    -v [ Optional: Your Path to One Hot Encoded Labels File ]:/resources/OHE.labels \
    -p [ Your Port to Expose Server ]:1234 \
    stream_classification:server [ Your Arguments ]
```

For a more automated approach, the following command can be used:
```shell
docker run --rm -it --gpus all \
    -v [ Optional: Your Path to Model Weights File ]:/resources/convnext.model \
    -v [ Optional: Your Path to One Hot Encoded Labels File ]:/resources/OHE.labels \
    -p [ Your Port to Expose Server ]:1234 \
    codeadeel/stream_classification:server [ Your Arguments ]
```

Training on the subject data can be handled automatically by the training pipeline. To build the container, the following command can be used:
```shell
docker build -t stream_classification:trainer -f Build_Trainer .
```

To execute the container, the following command can be used:
```shell
docker run --rm -it --gpus all \
    -v [ Required: Your Path to Data Directory ]:/data \
    -v [ Required / Optional: Your Directory Path to Save Output Files ]:/resources \
    stream_classification:trainer [ Your Arguments ]
```

For a more automated approach, the following command can be used:
```shell
docker run --rm -it --gpus all \
    -v [ Required: Your Path to Data Directory ]:/data \
    -v [ Required / Optional: Your Directory Path to Save Output Files ]:/resources \
    codeadeel/stream_classification:trainer [ Your Arguments ]
```

The single file processor is used to convert a video file into training data against a given label. To build the container, the following command can be used:
```shell
docker build -t stream_classification:single_processor -f Single_Processor .
```

To execute the container, the following command can be used:
```shell
docker run --rm -it \
    -v [ Required: Your Path to Video File ]:/Video.mp4 \
    -v [ Required: Your Directory Path to Save Generated / Updated Processed Data ]:/Output \
    stream_classification:single_processor [ Your Arguments ]
```

For a more automated approach, the following command can be used:
```shell
docker run --rm -it \
    -v [ Required: Your Path to Video File ]:/Video.mp4 \
    -v [ Required: Your Directory Path to Save Generated / Updated Processed Data ]:/Output \
    codeadeel/stream_classification:single_processor [ Your Arguments ]
```

The annotation-based file processor is used to convert a video file into training data against a given annotation file. More on the annotation file format. To build the container, the following command can be used:
```shell
docker build -t stream_classification:processor -f Processor .
```

To execute the container, the following command can be used:
```shell
docker run --rm -it \
    -v [ Required: Your Path to Video File ]:/Video.mp4 \
    -v [ Required: Your Path to Annotation File ]:/Annotation.csv \
    -v [ Required: Your Directory Path to Save Generated / Updated Processed Data ]:/Output \
    stream_classification:processor [ Your Arguments ]
```

For a more automated approach, the following command can be used:
```shell
docker run --rm -it \
    -v [ Required: Your Path to Video File ]:/Video.mp4 \
    -v [ Required: Your Path to Annotation File ]:/Annotation.csv \
    -v [ Required: Your Directory Path to Save Generated / Updated Processed Data ]:/Output \
    codeadeel/stream_classification:processor [ Your Arguments ]
```

This client can be used to perform inference on a live stream and display results on the local display, using the X11 socket. To build the container, the following command can be used:
```shell
docker build -t stream_classification:live_client -f Live_Client .
```

To execute the container, the following command can be used:
```shell
docker run --rm -it \
    -v [ Optional: Your Directory Path to Save Output ]:/Output \
    stream_classification:live_client [ Your Arguments ]
```

For a more automated approach, the following command can be used:
```shell
docker run --rm -it \
    -v [ Optional: Your Directory Path to Save Output ]:/Output \
    codeadeel/stream_classification:live_client [ Your Arguments ]
```

This client can be used to perform inference on a live stream and display results through a webpage. To build the container, the following command can be used:
```shell
docker build -t stream_classification:live_client_browser -f Live_Client_Browser .
```

To execute the container, the following command can be used:
```shell
docker run --rm -it \
    -v [ Optional: Your Directory Path to Save Output ]:/Output \
    -p [ Optional: Your Port to Expose ]:80 \
    stream_classification:live_client_browser [ Your Arguments ]
```

For a more automated approach, the following command can be used:
```shell
docker run --rm -it \
    -v [ Optional: Your Directory Path to Save Output ]:/Output \
    -p [ Optional: Your Port to Expose ]:80 \
    codeadeel/stream_classification:live_client_browser [ Your Arguments ]
```

This client can be used to perform inference on a live stream and display results through an RTSP stream. To build the container, the following command can be used:
```shell
docker build -t stream_classification:live_client_rtsp -f Live_Client_RTSP .
```

To execute the container, the following command can be used:
```shell
docker run --rm -it \
    -v [ Optional: Your Directory Path to Save Output ]:/Output \
    -p [ Optional: Your Port to Expose ]:80 \
    stream_classification:live_client_rtsp [ Your Arguments ]
```

For a more automated approach, the following command can be used:
```shell
docker run --rm -it \
    -v [ Optional: Your Directory Path to Save Output ]:/Output \
    -p [ Optional: Your Port to Expose ]:80 \
    codeadeel/stream_classification:live_client_rtsp [ Your Arguments ]
```

This client can be used to perform inference on multiple live streams and display results through a webpage. To build the container, the following command can be used:
```shell
docker build -t stream_classification:multi_live_client_browser -f Multi_Live_Client_Browser .
```

To execute the container, the following command can be used:
```shell
docker run --rm -it \
    -p [ Optional: Your Port to Expose ]:80 \
    stream_classification:multi_live_client_browser [ Your Arguments ]
```

For a more automated approach, the following command can be used:
```shell
docker run --rm -it \
    -p [ Optional: Your Port to Expose ]:80 \
    codeadeel/stream_classification:multi_live_client_browser [ Your Arguments ]
```

This client can be used to perform inference on multiple live streams and display results through an RTSP stream. To build the container, the following command can be used:
```shell
docker build -t stream_classification:multi_live_client_rtsp -f Multi_Live_Client_RTSP .
```

To execute the container, the following command can be used:
```shell
docker run --rm -it \
    -p [ Optional: Your Port to Expose ]:80 \
    stream_classification:multi_live_client_rtsp [ Your Arguments ]
```

For a more automated approach, the following command can be used:
```shell
docker run --rm -it \
    -p [ Optional: Your Port to Expose ]:80 \
    codeadeel/stream_classification:multi_live_client_rtsp [ Your Arguments ]
```

This client can be used to perform inference on a local video and display results on the local display, using the X11 socket. To build the container, the following command can be used:
```shell
docker build -t stream_classification:video_client -f Video_Client .
```

To execute the container, the following command can be used:
```shell
docker run --rm -it \
    -v [ Required: Your Path to Video File ]:/video.mp4 \
    -v [ Optional: Your Directory Path to Save Output ]:/Output \
    stream_classification:video_client [ Your Arguments ]
```

For a more automated approach, the following command can be used:
```shell
docker run --rm -it \
    -v [ Required: Your Path to Video File ]:/video.mp4 \
    -v [ Optional: Your Directory Path to Save Output ]:/Output \
    codeadeel/stream_classification:video_client [ Your Arguments ]
```

This client can be used to perform inference on a local video and display results through a webpage. To build the container, the following command can be used:
```shell
docker build -t stream_classification:video_client_browser -f Video_Client_Browser .
```

To execute the container, the following command can be used:
```shell
docker run --rm -it \
    -v [ Required: Your Path to Video File ]:/video.mp4 \
    -v [ Optional: Your Directory Path to Save Output ]:/Output \
    -p [ Optional: Your Port to Expose ]:80 \
    stream_classification:video_client_browser [ Your Arguments ]
```

For a more automated approach, the following command can be used:
```shell
docker run --rm -it \
    -v [ Required: Your Path to Video File ]:/video.mp4 \
    -v [ Optional: Your Directory Path to Save Output ]:/Output \
    -p [ Optional: Your Port to Expose ]:80 \
    codeadeel/stream_classification:video_client_browser [ Your Arguments ]
```

This client can be used to perform inference on a local video and display results through an RTSP stream. To build the container, the following command can be used:
```shell
docker build -t stream_classification:video_client_rtsp -f Video_Client_RTSP .
```

To execute the container, the following command can be used:
```shell
docker run --rm -it \
    -v [ Required: Your Path to Video File ]:/video.mp4 \
    -v [ Optional: Your Directory Path to Save Output ]:/Output \
    -p [ Optional: Your Port to Expose ]:80 \
    stream_classification:video_client_rtsp [ Your Arguments ]
```

For a more automated approach, the following command can be used:
```shell
docker run --rm -it \
    -v [ Required: Your Path to Video File ]:/video.mp4 \
    -v [ Optional: Your Directory Path to Save Output ]:/Output \
    -p [ Optional: Your Port to Expose ]:80 \
    codeadeel/stream_classification:video_client_rtsp [ Your Arguments ]
```
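As a concrete illustration of how the bracketed placeholders are filled in, the sketch below assembles a video RTSP client invocation with hypothetical values (the path, output directory, and port are examples only; adjust them to your environment). The leading `echo` prints the assembled command so it can be reviewed first; remove it to actually run the container:

```shell
# Hypothetical values -- adjust to your setup.
VIDEO="$HOME/videos/sample.mp4"
OUT_DIR="$PWD/output"
PORT=8554

# Print the assembled command for review; remove 'echo' to execute it.
echo docker run --rm -it \
    -v "$VIDEO":/video.mp4 \
    -v "$OUT_DIR":/Output \
    -p "$PORT":80 \
    codeadeel/stream_classification:video_client_rtsp
```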
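Once one of the RTSP client containers is running, the published stream can typically be viewed with a standard player such as ffplay. The exact RTSP URL below (port and path) is an assumption, not something documented above; the real address depends on the container's configuration, so check its startup logs:

```shell
# Hypothetical URL -- the real port and path depend on the container's configuration.
RTSP_URL="rtsp://localhost:8554/stream"

# Print the viewer command; remove 'echo' to launch ffplay. TCP transport is
# often more reliable than the UDP default across Docker port mappings.
echo ffplay -rtsp_transport tcp "$RTSP_URL"
```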
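A closing note on GPU access: the server and trainer commands above pass `--gpus all`, which requires the NVIDIA Container Toolkit on the host. A quick way to confirm Docker can see the GPU before launching those containers is to run `nvidia-smi` inside a CUDA base image (the image tag below is an example; pick one matching your driver):

```shell
# Print the GPU check command; remove 'echo' to run it (requires the
# NVIDIA Container Toolkit to be installed and configured for Docker).
echo docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi
```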