DeepStream SDK 4.0 is now included among the default NVIDIA components, so a basic install is easy using SDK Manager.
- Installing JetPack 4.2.1
https://ahyuo79.blogspot.com/2019/07/jetpack-421.html
NVIDIA JetPack related posts
https://ahyuo79.blogspot.com/search/label/NVIDIA-JetPack
- DeepStream documentation online
https://docs.nvidia.com/metropolis/index.html
DeepStream SDK 4.0
https://ahyuo79.blogspot.com/search/label/NVIDIA-DeepStream%20SDK%204.0
DeepStream SDK 3.0
https://ahyuo79.blogspot.com/search/label/NVIDIA-DeepStream%20SDK%203.0
DeepStream
https://ahyuo79.blogspot.com/search/label/NVIDIA-DeepStream
- Maximizing Jetson performance
$ sudo nvpmodel -m 0
$ sudo jetson_clocks    // the command changed from earlier releases
$ sudo nvpmodel -q
NV Fan Mode:quiet
NV Power Mode: MAXN
0
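The power mode can also be checked from a script. A minimal sketch; the query output is hard-coded below from the transcript above, so on the device you would pipe the real `sudo nvpmodel -q` instead:

```shell
# Parse the power mode out of nvpmodel-style output.
# (Sample text inlined; on the Jetson replace the echo with: sudo nvpmodel -q)
q_output='NV Fan Mode:quiet
NV Power Mode: MAXN
0'
echo "$q_output" | awk -F': *' '/NV Power Mode/ {print $2}'   # → MAXN
```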
- Locating the DeepStream SDK and checking the libraries
After installing, I searched for anything DeepStream-related to find out where things live and to get a rough picture of the library dependencies.
$ find / -name '*.caffemodel' 2> /dev/null    // find where the models are installed
/usr/src/tensorrt/data/googlenet/googlenet.caffemodel
/usr/src/tensorrt/data/resnet50/ResNet50_fp32.caffemodel
/usr/src/tensorrt/data/mnist/mnist.caffemodel
/usr/src/tensorrt/data/mnist/mnist_lenet.caffemodel
/usr/src/tegra_multimedia_api/data/Model/resnet10/resnet10.caffemodel
/usr/src/tegra_multimedia_api/data/Model/GoogleNet_one_class/GoogleNet_modified_oneClass_halfHD.caffemodel
/usr/src/tegra_multimedia_api/data/Model/GoogleNet_three_class/GoogleNet_modified_threeClass_VGA.caffemodel
/opt/nvidia/deepstream/deepstream-4.0/samples/models/Secondary_CarMake/resnet18.caffemodel
/opt/nvidia/deepstream/deepstream-4.0/samples/models/Secondary_CarColor/resnet18.caffemodel
/opt/nvidia/deepstream/deepstream-4.0/samples/models/Primary_Detector/resnet10.caffemodel
/opt/nvidia/deepstream/deepstream-4.0/samples/models/Primary_Detector_Nano/resnet10.caffemodel
/opt/nvidia/deepstream/deepstream-4.0/samples/models/Secondary_VehicleTypes/resnet18.caffemodel

$ which deepstream-app    // only the executables are placed in /usr/bin
/usr/bin/deepstream-app

$ ls /usr/bin/deepstream-*
/usr/bin/deepstream-app               /usr/bin/deepstream-nvof-app          /usr/bin/deepstream-test2-app
/usr/bin/deepstream-dewarper-app      /usr/bin/deepstream-perf-demo         /usr/bin/deepstream-test3-app
/usr/bin/deepstream-gst-metadata-app  /usr/bin/deepstream-segmentation-app  /usr/bin/deepstream-test4-app
/usr/bin/deepstream-image-decode-app  /usr/bin/deepstream-test1-app         /usr/bin/deepstream-user-metadata-app

// DeepStream 4.0 plugins
$ ls /opt/nvidia/deepstream/deepstream-4.0/lib/gst-plugins
libnvdsgst_dewarper.so   libnvdsgst_infer.so      libnvdsgst_msgconv.so      libnvdsgst_multistreamtiler.so  libnvdsgst_ofvisual.so  libnvdsgst_segvisual.so
libnvdsgst_dsexample.so  libnvdsgst_msgbroker.so  libnvdsgst_multistream.so  libnvdsgst_of.so                libnvdsgst_osd.so       libnvdsgst_tracker.so

// DeepStream 4.0 libraries
$ ls /opt/nvidia/deepstream/deepstream-4.0/lib
gst-plugins                  libnvbufsurftransform.so   libnvds_csvparser.so     libnvds_inferutils.so
libnvds_mot_iou.so           libnvds_nvdcf.so           libnvds_utils.so         libiothub_client.so
libnvds_amqp_proto.so        libnvdsgst_helper.so       libnvds_kafka_proto.so   libnvds_mot_klt.so
libnvds_nvtxhelper.so        libvpi.so.0                libiothub_client.so.1    libnvds_azure_edge_proto.so
libnvdsgst_meta.so           libnvds_logger.so          libnvds_msgconv.so       libnvds_opticalflow_jetson.so
libvpi.so.0.0.2              libnvbufsurface.so         libnvds_azure_proto.so   libnvds_infer.so
libnvds_meta.so              libnvds_msgconv.so.1.0.0   libnvds_osd.so

// check the shared-library dependencies of Gst-nvinfer
$ ldd /opt/nvidia/deepstream/deepstream-4.0/lib/gst-plugins/libnvdsgst_infer.so
	linux-vdso.so.1 (0x0000007f8bab2000)
	libnvds_infer.so => /opt/nvidia/deepstream/deepstream-4.0/lib/libnvds_infer.so (0x0000007f8b64b000)
	libglib-2.0.so.0 => /usr/lib/aarch64-linux-gnu/libglib-2.0.so.0 (0x0000007f8b53d000)
	libgobject-2.0.so.0 => /usr/lib/aarch64-linux-gnu/libgobject-2.0.so.0 (0x0000007f8b4df000)
	libgstreamer-1.0.so.0 => /usr/lib/aarch64-linux-gnu/libgstreamer-1.0.so.0 (0x0000007f8b3b0000)
	libgstbase-1.0.so.0 => /usr/lib/aarch64-linux-gnu/libgstbase-1.0.so.0 (0x0000007f8b33a000)
	libnvbufsurface.so.1.0.0 => /usr/lib/aarch64-linux-gnu/tegra/libnvbufsurface.so.1.0.0 (0x0000007f8b2bc000)
	libnvbufsurftransform.so.1.0.0 => /usr/lib/aarch64-linux-gnu/tegra/libnvbufsurftransform.so.1.0.0 (0x0000007f8ac2b000)
	libnvds_meta.so => /opt/nvidia/deepstream/deepstream-4.0/lib/libnvds_meta.so (0x0000007f8ac15000)
	libnvdsgst_helper.so => /opt/nvidia/deepstream/deepstream-4.0/lib/libnvdsgst_helper.so (0x0000007f8ac01000)
	libnvdsgst_meta.so => /opt/nvidia/deepstream/deepstream-4.0/lib/libnvdsgst_meta.so (0x0000007f8abed000)
	libcuda.so.1 => /usr/lib/aarch64-linux-gnu/tegra/libcuda.so.1 (0x0000007f89cc8000)
	libdl.so.2 => /lib/aarch64-linux-gnu/libdl.so.2 (0x0000007f89cb3000)
	libstdc++.so.6 => /usr/lib/aarch64-linux-gnu/libstdc++.so.6 (0x0000007f89b20000)
	libpthread.so.0 => /lib/aarch64-linux-gnu/libpthread.so.0 (0x0000007f89af4000)
	librt.so.1 => /lib/aarch64-linux-gnu/librt.so.1 (0x0000007f89add000)
	libc.so.6 => /lib/aarch64-linux-gnu/libc.so.6 (0x0000007f89984000)
	libnvparsers.so.5 => /usr/lib/aarch64-linux-gnu/libnvparsers.so.5 (0x0000007f89645000)
	libnvonnxparser.so.0 => /usr/lib/aarch64-linux-gnu/libnvonnxparser.so.0 (0x0000007f89221000)
	libnvinfer.so.5 => /usr/lib/aarch64-linux-gnu/libnvinfer.so.5 (0x0000007f8029d000)
	libnvinfer_plugin.so.5 => /usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so.5 (0x0000007f7ffc7000)
	libnvds_inferutils.so => /opt/nvidia/deepstream/deepstream-4.0/lib/libnvds_inferutils.so (0x0000007f7ffb1000)
	libm.so.6 => /lib/aarch64-linux-gnu/libm.so.6 (0x0000007f7fef7000)
	libgcc_s.so.1 => /lib/aarch64-linux-gnu/libgcc_s.so.1 (0x0000007f7fed3000)
	libpcre.so.3 => /lib/aarch64-linux-gnu/libpcre.so.3 (0x0000007f7fe61000)
	/lib/ld-linux-aarch64.so.1 (0x0000007f8ba87000)
	libffi.so.6 => /usr/lib/aarch64-linux-gnu/libffi.so.6 (0x0000007f7fe49000)
	libgmodule-2.0.so.0 => /usr/lib/aarch64-linux-gnu/libgmodule-2.0.so.0 (0x0000007f7fe35000)
	libnvrm.so => /usr/lib/aarch64-linux-gnu/tegra/libnvrm.so (0x0000007f7fdf3000)
	libEGL.so.1 => /usr/lib/aarch64-linux-gnu/libEGL.so.1 (0x0000007f7fdd2000)
	libnvos.so => /usr/lib/aarch64-linux-gnu/tegra/libnvos.so (0x0000007f7fdb4000)
	libnvbuf_fdmap.so.1.0.0 => /usr/lib/aarch64-linux-gnu/tegra/libnvbuf_fdmap.so.1.0.0 (0x0000007f7fda1000)
	libnvrm_graphics.so => /usr/lib/aarch64-linux-gnu/tegra/libnvrm_graphics.so (0x0000007f7fd82000)
	libnvddk_vic.so => /usr/lib/aarch64-linux-gnu/tegra/libnvddk_vic.so (0x0000007f7fd64000)
	libnvrm_gpu.so => /usr/lib/aarch64-linux-gnu/tegra/libnvrm_gpu.so (0x0000007f7fd21000)
	libnvidia-fatbinaryloader.so.32.2.0 => /usr/lib/aarch64-linux-gnu/tegra/libnvidia-fatbinaryloader.so.32.2.0 (0x0000007f7fcc3000)
	libcudnn.so.7 => /usr/lib/aarch64-linux-gnu/libcudnn.so.7 (0x0000007f68e5d000)
	libcublas.so.10.0 => /usr/local/cuda-10.0/lib64/libcublas.so.10.0 (0x0000007f634f4000)
	libcudart.so.10.0 => /usr/local/cuda-10.0/lib64/libcudart.so.10.0 (0x0000007f63483000)
	libnvdla_compiler.so => /usr/lib/aarch64-linux-gnu/tegra/libnvdla_compiler.so (0x0000007f63080000)
	libnvmedia.so => /usr/lib/aarch64-linux-gnu/tegra/libnvmedia.so (0x0000007f6301d000)
	libcrypto.so.1.1 => /usr/lib/aarch64-linux-gnu/libcrypto.so.1.1 (0x0000007f62dd8000)
	libGLdispatch.so.0 => /usr/lib/aarch64-linux-gnu/libGLdispatch.so.0 (0x0000007f62caa000)
	libnvdc.so => /usr/lib/aarch64-linux-gnu/tegra/libnvdc.so (0x0000007f62c8b000)
	libnvtvmr.so => /usr/lib/aarch64-linux-gnu/tegra/libnvtvmr.so (0x0000007f62bfc000)
	libnvparser.so => /usr/lib/aarch64-linux-gnu/tegra/libnvparser.so (0x0000007f62bc0000)
	libnvimp.so => /usr/lib/aarch64-linux-gnu/tegra/libnvimp.so (0x0000007f62bab000)

// check the Gst-nvinfer symbol table -- it has already been stripped
$ nm /opt/nvidia/deepstream/deepstream-4.0/lib/gst-plugins/libnvdsgst_infer.so
nm: /opt/nvidia/deepstream/deepstream-4.0/lib/gst-plugins/libnvdsgst_infer.so: no symbols

// check the functions in the dynamic symbol table instead
$ readelf -s /opt/nvidia/deepstream/deepstream-4.0/lib/gst-plugins/libnvdsgst_infer.so
Symbol table '.dynsym' contains 329 entries:
   Num:    Value          Size Type    Bind   Vis      Ndx Name
     0: 0000000000000000     0 NOTYPE  LOCAL  DEFAULT  UND
     1: 000000000000e280     0 SECTION LOCAL  DEFAULT    9
     2: 00000000000919e8     0 SECTION LOCAL  DEFAULT   19
     3: 0000000000000000     0 FUNC    GLOBAL DEFAULT  UND _Znam@GLIBCXX_3.4 (2)
     4: 0000000000000000     0 FUNC    GLOBAL DEFAULT  UND __fxstat@GLIBC_2.17 (3)
     5: 0000000000000000     0 FUNC    GLOBAL DEFAULT  UND g_object_new
     6: 0000000000000000     0 FUNC    GLOBAL DEFAULT  UND socket@GLIBC_2.17 (3)
     7: 0000000000000000     0 FUNC    GLOBAL DEFAULT  UND nvds_add_label_info_meta_
     8: 0000000000000000     0 FUNC    GLOBAL DEFAULT  UND sem_destroy@GLIBC_2.17 (4)
     9: 0000000000000000     0 FUNC    GLOBAL DEFAULT  UND memcpy@GLIBC_2.17 (3)
    10: 0000000000000000     0 FUNC    GLOBAL DEFAULT  UND gst_plugin_register_stati
    11: 0000000000000000     0 FUNC    GLOBAL DEFAULT  UND g_cond_clear
    12: 0000000000000000     0 FUNC    GLOBAL DEFAULT  UND fchmod@GLIBC_2.17 (3)
    13: 0000000000000000     0 FUNC    GLOBAL DEFAULT  UND g_queue_new
    14: 0000000000000000     0 FUNC    GLOBAL DEFAULT  UND fread@GLIBC_2.17 (3)
............

// check the dynamic linker/loader configuration
$ cat /etc/ld.so.conf
include /etc/ld.so.conf.d/*.conf
$ ls /etc/ld.so.conf.d/
aarch64-linux-gnu.conf  aarch64-linux-gnu_EGL.conf       aarch64-linux-gnu_GL.conf  cuda-10-0.conf
deepstream.conf         fakeroot-aarch64-linux-gnu.conf  libc.conf                  nvidia-tegra.conf
$ cat /etc/ld.so.conf.d/deepstream.conf
/opt/nvidia/deepstream/deepstream-4.0/lib

$ cd /opt/nvidia/deepstream/deepstream-4.0
$ ls
bin  doc  lib  LicenseAgreement.pdf  LICENSE.txt  README  samples  sources
$ cd ..
$ cp -a deepstream-4.0 ~/deepstream-4.0    // copy it, since we will also be modifying the sources
DeepStream SDK 4.0 configuration documentation
https://docs.nvidia.com/metropolis/deepstream/4.0/dev-guide/index.html
1.1 DeepStream SDK 4.0 Related Information
- DeepStream can be installed via NGC (NVIDIA GPU Cloud) Docker
https://ngc.nvidia.com/catalog/containers/nvidia:deepstream-l4t
- DeepStream SDK source and package download (with JetPack already installed)
https://developer.nvidia.com/deepstream-download
- DeepStream SDK 4.0 compatibility
https://docs.nvidia.com/metropolis/deepstream/4.0/faq/index.html#page/DeepStream_FAQ/DeepStream_FAQ.html#wwpID0EIHA
- DeepStream installation and basic usage
https://docs.nvidia.com/metropolis/deepstream/4.0/dev-guide/index.html
- DeepStream SDK Development Guide
https://docs.nvidia.com/metropolis/deepstream/4.0/dev-guide/DeepStream_Development_Guide/baggage/index.html
2. DeepStream SDK 4.0 Basic Samples
- DeepStream SDK 4.0 samples (contents overview)
https://docs.nvidia.com/metropolis/deepstream/4.0/dev-guide/index.html#page/DeepStream_Development_Guide/deepstream_quick_start.html#wwpID0E6HA
2.1 deepstream-app
I am not currently using the CSI or USB camera interfaces, so I tested everything except the configs that use them.
$ cd ~/deepstream-4.0
$ deepstream-app -c samples/configs/deepstream-app/source12_1080p_dec_infer-resnet_tracker_tiled_display_fp16_tx2.txt

// other sample configs
source1_csi_dec_infer_resnet_int8.txt
source1_usb_dec_infer_resnet_int8.txt
source2_csi_usb_dec_infer_resnet_int8.txt
source30_1080p_dec_infer-resnet_tiled_display_int8.txt
source4_1080p_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt
source6_csi_dec_infer_resnet_int8.txt
source8_1080p_dec_infer-resnet_tracker_tiled_display_fp16_nano.txt
source8_1080p_dec_infer-resnet_tracker_tiled_display_fp16_tx1.txt
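Selecting only the non-camera configs can be scripted as well. A sketch; the config names are inlined from the sample set above (on the device you could instead feed it `ls samples/configs/deepstream-app/source*.txt`):

```shell
# Print only the sample configs that do not need a CSI or USB camera.
printf '%s\n' \
  source1_csi_dec_infer_resnet_int8.txt \
  source1_usb_dec_infer_resnet_int8.txt \
  source2_csi_usb_dec_infer_resnet_int8.txt \
  source30_1080p_dec_infer-resnet_tiled_display_int8.txt \
  source4_1080p_dec_infer-resnet_tracker_sgie_tiled_display_int8.txt \
  source6_csi_dec_infer_resnet_int8.txt |
  grep -Ev '_(csi|usb)_'
```

Only the two `1080p` configs survive the filter; each survivor can then be passed to `deepstream-app -c`.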
2.2 deepstream-dewarper-app
This sample, used in NVIDIA's Smart Garage system, takes a fisheye-lens camera feed and dewarps it into four flattened views.
It is slower than expected, though; look at the detailed settings together with config_dewarper.txt.
$ cd ~/deepstream-4.0
$ cd sources/apps/sample_apps/deepstream-dewarper-test/
$ make
// requires config_dewarper.txt
$ ./deepstream-dewarper-app file:///home/nvidia/deepstream-4.0/samples/streams/sample_cam6.mp4 6
// multi-stream version
$ deepstream-dewarper-app file:///home/nvidia/deepstream-4.0/samples/streams/sample_cam6.mp4 6 file:///home/nvidia/deepstream-4.0/samples/streams/sample_cam6.mp4 6

In the case below, two channels are used, giving eight tiles in total.
2.3 deepstream-gst-metadata-app
Not much has changed from the earlier samples: metadata is still handled in a callback function, but unlike before there are now user metadata and a Gst-nvinfer config involved.
Looking at the source, it is written around an H.264 parser, so use the matching sample file; the structure is simple.
$ cd ~/deepstream-4.0
$ cd sources/apps/sample_apps/deepstream-gst-metadata-test
$ make
$ ./deepstream-gst-metadata-app ../../../../samples/streams/sample_720p.h264        // works
$ ./deepstream-gst-metadata-app ../../../../samples/streams/sample_1080p_h264.mp4   // fails
$ ./deepstream-gst-metadata-app ../../../../samples/streams/sample_1080p_h265.mp4   // fails -- the source shows why (H.264 only)
2.4 deepstream-image-decode-test
It can be tested with MJPEG or JPEG samples; it is similar to the one above, and the source is simple to follow.
$ cd ~/deepstream-4.0
$ cd sources/apps/sample_apps/deepstream-image-decode-test
$ make
$ ./deepstream-image-decode-app ../../../../samples/streams/sample_720p.mjpeg ../../../../samples/streams/sample_720p.mjpeg   // works
2.5 deepstream-infer-tensor-meta-test
I once wrote a program much like this myself, so seeing it arrive as a ready-made sample makes that effort feel wasted.
The app displays car and person information in the top-left corner; it seems like the most basic source to build on.
$ cd ~/deepstream-4.0
$ cd sources/apps/sample_apps/deepstream-infer-tensor-meta-test
$ export CUDA_VER=10.0
$ make
$ ./deepstream-infer-tensor-meta-app ../../../../samples/streams/sample_720p.h264   // works
2.6 deepstream-nvof-test
Optical-flow support has finally been ported as a sample; I once put a lot of effort into porting this myself, so it feels like wasted work.
Looking at the source, the output is 720p, and since input first goes through nvstreammux, multiple URIs are supported, which means RTSP works as well.
Optical flow reveals vehicle motion, but unlike the earlier demos it is not rendered side by side with the normal video, so that part you would have to implement yourself.
$ cd ~/deepstream-4.0
$ cd sources/apps/sample_apps/deepstream-nvof-test
$ make
$ ./deepstream-nvof-app file:///home/nvidia/deepstream-4.0/samples/streams/sample_720p.h264   // works
$ ./deepstream-nvof-app file:///home/nvidia/deepstream-4.0/samples/streams/sample_720p.h264 file:///home/nvidia/deepstream-4.0/samples/streams/sample_720p.h264   // 2 channels, works
$ ./deepstream-nvof-app file:///home/nvidia/deepstream-4.0/samples/streams/sample_720p.h264 rtsp://10.0.0.199:554/h264   // 2 channels, works
$ ./deepstream-nvof-app rtsp://10.0.0.199:554/h264   // works

Used as 1 channel / 2 channels / together with RTSP.
2.7 deepstream-perf-demo
Looking at the mux in the source, the output is 720p; the program sets the number of channels via rows, columns, and the tiler, and plays them back.
From the name I expected a performance-measurement part somewhere; it is the ENABLE_PROFILING code, but that macro is not defined by default.
Modify the source to enable it, and study the accumulated_base handling closely.
The notable part is source_switch_thread, which switches between the files in real time; that appears to be the main feature.
filesrc (switch)-> h264parser -> nvv4l2decoder -> nvstreammux -> pgie -> sgie1
-> sgie2-> sgie3 -> nvvideoconvert -> nvdsosd -> (nvtransform) ->nveglglessink
Kubernetes at GTC 2018
https://www.youtube.com/watch?v=8bbtAvMAI2c
$ cd ~/deepstream-4.0
$ cd sources/apps/sample_apps/deepstream-perf-demo
$ vi Makefile
ifeq ($(TARGET_DEVICE),aarch64)
  CFLAGS:= -DPLATFORM_TEGRA -DENABLE_PROFILING
endif

$ vi ./deepstream-perf-demo.cpp
// call the profile functions wherever you want to measure; analyze accumulated_base closely
static GstPadProbeReturn
eos_probe_cb(GstPad* pad, GstPadProbeInfo* info, gpointer u_data)
{
  .....
  if ((info->type & GST_PAD_PROBE_TYPE_BUFFER)) {
    GST_BUFFER_PTS(GST_BUFFER(info->data)) += prev_accumulated_base;
#if defined(ENABLE_PROFILING)
    // jhlee: not exact yet, revisit later
    if ((frame_number % 30) == 29) {
      profile_end();
      profile_result();
    } else if ((frame_number % 30) == 0)
      profile_start();
    frame_number++;
#endif
  }

$ make
$ mkdir ../../../../samples/streams2
$ cp ../../../../samples/streams/sample_720p.h264 ../../../../samples/streams2/sample_720p.h264   // copying just one file is enough
$ vi perf_demo_pgie_config.txt
model-engine-file=../../../../samples/models/Primary_Detector/resnet10.caffemodel_b1_int8.engine
$ vi perf_demo_sgie1_config.txt
model-engine-file=../../../../samples/models/Secondary_CarColor/resnet18.caffemodel_b16_int8.engine
$ vi perf_demo_sgie2_config.txt
model-engine-file=../../../../samples/models/Secondary_CarMake/resnet18.caffemodel_b16_int8.engine
$ vi perf_demo_sgie3_config.txt
model-engine-file=../../../../samples/models/Secondary_VehicleTypes/resnet18.caffemodel_b16_int8.engine
$ export GST_DEBUG=2    // to see accumulated_base from eos_probe_cb
$ ./deepstream-perf-demo 1 1 ../../../../samples/streams2   // rows columns stream-dir: 1 channel, looped
$ ./deepstream-perf-demo 2 2 ../../../../samples/streams2   // 4 channels, played back sequentially in a loop -- don't be confused
- How to debug GStreamer
GST_DEBUG=3 or GST_DEBUG=2,nvstreammux:4
Meaning of the level numbers:
1 | ERROR
2 | WARNING
3 | FIXME
4 | INFO
5 | DEBUG
6 | LOG
7 | TRACE
9 | MEMDUMP
https://gstreamer.freedesktop.org/documentation/tutorials/basic/debugging-tools.html?gi-language=c
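The `default,category:level` string can also be built programmatically. A small helper of my own (the function name is not part of GStreamer), matching the `GST_DEBUG=2,nvstreammux:4` form used above:

```shell
# Build a GST_DEBUG value from a default level plus per-category overrides.
# (gst_debug_string is my own helper, not a GStreamer tool.)
gst_debug_string() {
  s="$1"; shift
  for kv in "$@"; do s="$s,$kv"; done
  echo "$s"
}

export GST_DEBUG=$(gst_debug_string 2 nvstreammux:4)
echo "$GST_DEBUG"   # → 2,nvstreammux:4
```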
GStreamer's gst_pad_add_probe
https://gstreamer.freedesktop.org/documentation/gstreamer/gstpad.html?gi-language=c#gst_pad_add_probe
https://gstreamer.freedesktop.org/documentation/application-development/advanced/pipeline-manipulation.html?gi-language=c#page-description
GST_EVENT_* (GstEventType)
https://gstreamer.freedesktop.org/documentation/gstreamer/gstevent.html?gi-language=c#GstEventType
Even when configured for 4 channels, playback switches one channel at a time, so in the end only one channel is playing at any moment.
2.8 deepstream-segmentation-test
A DeepStream sample that segments video; the test only works with MJPEG and JPEG input.
Notably, the configs are based on the UFF format.
$ cd ~/deepstream-4.0
$ cd sources/apps/sample_apps/deepstream-segmentation-test
$ make
$ ./deepstream-segmentation-app dstest_segmentation_config_semantic.txt ../../../../samples/streams/sample_720p.mjpeg ../../../../samples/streams/sample_720p.mjpeg
$ ./deepstream-segmentation-app dstest_segmentation_config_industrial.txt ../../../../samples/streams/sample_720p.mjpeg ../../../../samples/streams/sample_720p.mjpeg
$ ./deepstream-segmentation-app dstest_segmentation_config_industrial.txt ../../../../samples/streams/sample_industrial.jpg   // the source image is a bit unusual
- dstest_segmentation_config_semantic config, MJPEG test
- dstest_segmentation_config_industrial config, MJPEG test
- dstest_segmentation_config_industrial config, JPEG test
2.9 deepstream-test-1/2/3
Similar to before; the difference is that TEST4 now supports AZURE/AMQP in addition to Kafka.
I already tested Kafka previously, so only AMQP is left to try (I am skipping AZURE).
$ cd ~/deepstream-4.0/sources/apps/sample_apps/deepstream-test1
$ make
$ ./deepstream-test1-app ../../../../samples/streams/sample_720p.h264

$ cd ~/deepstream-4.0/sources/apps/sample_apps/deepstream-test2
$ make
$ ./deepstream-test2-app ../../../../samples/streams/sample_720p.h264
$ vi dstest2_pgie_config.txt
model-engine-file=../../../../samples/models/Primary_Detector/resnet10.caffemodel_b1_int8.engine
$ vi dstest2_sgie1_config.txt
model-engine-file=../../../../samples/models/Secondary_CarColor/resnet18.caffemodel_b16_int8.engine
$ vi dstest2_sgie2_config.txt
model-engine-file=../../../../samples/models/Secondary_CarMake/resnet18.caffemodel_b16_int8.engine
$ vi dstest2_sgie3_config.txt
model-engine-file=../../../../samples/models/Secondary_VehicleTypes/resnet18.caffemodel_b16_int8.engine

$ cd ~/deepstream-4.0/sources/apps/sample_apps/deepstream-test3
$ make
$ ./deepstream-test3-app file:///home/nvidia/deepstream-4.0/samples/streams/sample_1080p_h264.mp4
$ ./deepstream-test3-app rtsp://10.0.0.199:554/h264
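The model-engine-file edits can be scripted instead of opening vi each time. A sketch done on a scratch file (the engine path is the one from the pgie config above; run the same sed against the real dstest2_*_config.txt files):

```shell
# Point model-engine-file at a prebuilt engine without opening an editor.
# (Scratch file here; in practice the target is e.g. dstest2_pgie_config.txt.)
cfg=$(mktemp)
echo 'model-engine-file=old.engine' > "$cfg"
sed -i 's|^model-engine-file=.*|model-engine-file=../../../../samples/models/Primary_Detector/resnet10.caffemodel_b1_int8.engine|' "$cfg"
cat "$cfg"
```

Using `|` as the sed delimiter avoids having to escape the slashes in the engine path.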
- TESTs 1/2/3 are nearly identical
- TEST1: same as before, but the car/person counts in the top-left corner now match the newer versions (improved)
- TEST2: same as before; on top of TEST1 it adds classification using the existing 1GIE/2GIE
- TEST3: same as before; supports FILE and RTSP input, with the primary GIE (1GIE) only
- Changes in TEST4 (nvmsgconv/nvmsgbroker)
This part has to be tested together with the libraries, so I will do it later.
NvDsFrameMeta
- NvDsObjectMetaList -> NvDsObjectMeta
https://docs.nvidia.com/metropolis/deepstream/4.0/dev-guide/DeepStream_Development_Guide/baggage/struct__NvDsFrameMeta.html
NvDsObjectMeta
https://docs.nvidia.com/metropolis/deepstream/4.0/dev-guide/DeepStream_Development_Guide/baggage/struct__NvDsObjectMeta.html
https://docs.nvidia.com/metropolis/deepstream/4.0/dev-guide/DeepStream_Development_Guide/baggage/nvdsmeta_8h_source.html
2.10 deepstream-user-metadata-test
The core of this source is that the metadata-handling pad-probe callbacks exist not only on Gst-nvosd but also on Gst-nvinfer, and each handles NvDsUserMeta.
The base pipeline only does object detection, but it is a good example of how to process user metadata.
$ cd ~/deepstream-4.0
$ cd sources/apps/sample_apps/deepstream-user-metadata-test
$ make
$ ./deepstream-user-metadata-app ../../../../samples/streams/sample_720p.h264
3. Checking the DeepStream Gst-plugins Build
They are already installed, so just build them and study the sources; if you ever need to write your own plugin, use gst-dsexample as a reference, as before.
$ cd ~/deepstream-4.0/sources/gst-plugins/gst-dsexample
$ make CUDA_VER=10.0
$ ls libnvdsgst_dsexample.so
libnvdsgst_dsexample.so

$ cd ~/deepstream-4.0/sources/gst-plugins/gst-nvinfer
$ make CUDA_VER=10.0
$ ls libnvdsgst_infer.so
libnvdsgst_infer.so

$ cd ~/deepstream-4.0/sources/gst-plugins/gst-nvmsgbroker
$ make
$ ls libnvdsgst_msgbroker.so
libnvdsgst_msgbroker.so

$ cd ~/deepstream-4.0/sources/gst-plugins/gst-nvmsgconv
$ make
$ ls libnvdsgst_msgconv.so
libnvdsgst_msgconv.so
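If you do modify a plugin, GStreamer will keep loading the installed copy until you replace it. A sketch of the install step against a scratch directory; on the Jetson the real target is /opt/nvidia/deepstream/deepstream-4.0/lib/gst-plugins (and you will need sudo):

```shell
# Install a rebuilt plugin over the system copy so the apps pick it up.
# (Scratch directories stand in for the build dir and the gst-plugins dir.)
WORK=$(mktemp -d) && cd "$WORK"
touch libnvdsgst_dsexample.so   # stand-in for the freshly built .so
PLUGIN_DIR=$(mktemp -d)         # stand-in for .../deepstream-4.0/lib/gst-plugins
cp libnvdsgst_dsexample.so "$PLUGIN_DIR"/
ls "$PLUGIN_DIR"
```

Afterwards, `gst-inspect-1.0` on the element is a quick way to confirm the new build is the one being loaded.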
- Compare with the DeepStream SDK 3.0 installation and execution posts
https://ahyuo79.blogspot.com/search/label/NVIDIA-DeepStream%20SDK%203.0