
Today’s concert: ONE WORLD : TOGETHER AT HOME. Yup, today I’ve updated my previous blog post with a lot of modifications. Khadas VIM3 is really a good product: with Amlogic’s A311D and its 5.0 TOPS NPU, the board comes with very powerful AI inference capability.

What a sunny day after the **FIRST** snow of this winter. Let me show you 3 pictures in the first row, and 3 videos in the second. We need to enjoy both R&D and life…
Pictures: Green Timbers Lake 1 · Green Timbers Lake 2 · Green Timbers Park
Videos: A Pair of Swans · A Group of Ducks · A Little Stream In The Snow

After a brief break, I started investigating Khadas VIM3 again.

1. About Khadas VIM3

Khadas VIM3 is a powerful single-board computer based on the Amlogic A311D. Before we start, let’s carry out several simple comparisons.

1.1 Raspberry Pi 4 Model B vs. Khadas VIM3 vs. Jetson Nano Developer Kit

Please refer to:

1.2 Amlogic A311D & S922X-B vs. Rockchip RK3399 (Pro) vs. Amlogic S912

Please refer to:

2. Install Prebuilt Operating System To EMMC Via Krescue

2.1 WIRED Connection Preferred

As mentioned in VIM3 Beginners Guide, Krescue is a Swiss Army knife. As of January 2020, Krescue can download and install OS images directly from the web via wired Ethernet.

2.2 Flash Krescue Onto SD Card

➜  Krescue sudo dd bs=4M if=VIM3.krescue-d41d8cd98f00b204e9800998ecf8427e-1587199778-67108864-279c13890fa7253d5d2b76000769803e.sd.img of=/dev/mmcblk0 conv=fsync 
[sudo] password for longervision:
16+0 records in
16+0 records out
67108864 bytes (67 MB, 64 MiB) copied, 4.03786 s, 16.6 MB/s
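
One hedged precaution before running dd: double-check which device node is actually the SD card (on my laptop it is /dev/mmcblk0, but it may well be /dev/sdX on yours).

```
# List block devices with sizes and mount points before writing the image.
lsblk -o NAME,SIZE,TYPE,MOUNTPOINT
# Or watch the kernel log right after plugging the card in.
dmesg | tail
```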

2.3 Setup Wifi From Within Krescue Shell

If you really don’t like the WIRED connection, boot into Krescue shell, and use the following commands to set up Wifi:

root@Krescue:~# wifi.config WIFI_NAME WIFI_PASSWORD
root@Krescue:~# wifi.client
root@Krescue:~# wifi.status
DRIVER=brcmfmac
OF_NAME=wifi
OF_FULLNAME=/soc/sd@ffe03000/wifi@1
OF_COMPATIBLE_0=brcm,bcm4329-fmac
OF_COMPATIBLE_N=1
SDIO_CLASS=00
SDIO_ID=02D0:4359
MODALIAS=sdio:c00v02D0d4359
[i] iw dev
phy#0
Interface wlan0
ifindex 4
wdev 0x1
addr xx:xx:xx:xx:xx:xx
ssid XXXXXXXXXX
type managed
channel 161 (5805 MHz), width: 80 MHz, center1: 5775 MHz
txpower 31.00 dBm
[i] CLIENT mode active 6754
Selected interface 'wlan0'
bssid=xx:xx:xx:xx:xx:xx
freq=5805
ssid=XXXXXXXXXX
id=0
mode=station
pairwise_cipher=CCMP
group_cipher=CCMP
key_mgmt=WPA2-PSK
wpa_state=COMPLETED
ip_address=192.168.1.110
address=xx:xx:xx:xx:xx:xx
uuid=xxxxxxxxxxxxxxxxxxxxxxxxxxxx
ieee80211ac=1

2.4 SSH Into Krescue Via Wireless Connection

Now, let’s try to connect to the Khadas VIM3 board remotely.

➜  ~ ping 192.168.1.110
PING 192.168.1.110 (192.168.1.110) 56(84) bytes of data.
64 bytes from 192.168.1.110: icmp_seq=1 ttl=64 time=140 ms
64 bytes from 192.168.1.110: icmp_seq=2 ttl=64 time=54.0 ms
64 bytes from 192.168.1.110: icmp_seq=3 ttl=64 time=13.1 ms
^C
--- 192.168.1.110 ping statistics ---
3 packets transmitted, 3 received, 0% packet loss, time 2001ms
rtt min/avg/max/mdev = 13.191/69.145/140.193/52.936 ms
➜ ~ ssh root@192.168.1.110
The authenticity of host '192.168.1.110 (192.168.1.110)' can't be established.
RSA key fingerprint is SHA256:0t0PZw/24nWc8hWaCJkltYtwCduMMSlRuux2Nn865Os.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added '192.168.1.110' (RSA) to the list of known hosts.


BusyBox v1.28.4 () built-in shell (ash)

OpenWrt 18.06.3, r7798-97ae9e0ccb

__ _____ Khadas ## hyphop ##
/ //_/ _ \___ ___ ______ _____
/ ,< / , _/ -_|_-</ __/ // / -_)
/_/|_/_/|_|\__/___/\__/\_,_/\__/

extreme tiny and fast rescue system

BUILD: Sat Apr 18 08:49:28 UTC 2020

[i] POST_CONFIG: ap_ssid=Krescue ap_passw=12345678 wifi_mode=2g script=sd:launcher.sh eth_hw=C0:4A:00:C0:3F:DB booted= hwver=VIM3.V12

=== WARNING! =====================================
There is no root password defined on this device!
Use the passwd command to set up a new password
in order to prevent unauthorized SSH logins.
--------------------------------------------------
root@Krescue:~# uname -a
Linux Krescue 5.4.5 #4 SMP PREEMPT Thu Apr 9 22:07:48 +09 2020 aarch64 GNU/Linux

2.5 Flash OS onto EMMC (WIRED Connection Preferred)

Let’s take a look at the block devices (both the SD card and the eMMC):

root@Krescue:~# ls /dev/mmcblk*
/dev/mmcblk1 /dev/mmcblk1p1 /dev/mmcblk1p2 /dev/mmcblk2 /dev/mmcblk2boot0 /dev/mmcblk2boot1 /dev/mmcblk2rpmb

2.5.1 Install OS Using Shell Command

Please refer to the Shell Commands Examples.

A one-liner like curl -sfL dl.khadas.com/.mega | sh -s - -Y -X > /dev/mmcblk? should do, where /dev/mmcblk? is the target eMMC device (see the sketch below).
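
As a hedged example based on the /dev/mmcblk* listing above (mmcblk2 carries the boot0/boot1/rpmb partitions that are typical of eMMC, so I assume it is the eMMC here), the command would look roughly like this. Double-check the device name on your own board before writing anything:

```
# Inside the Krescue shell: stream an OS image from the web straight onto eMMC.
# /dev/mmcblk2 is ASSUMED to be the eMMC here (it exposes boot0/boot1/rpmb);
# verify with `ls /dev/mmcblk*` first, since writing to the wrong device destroys its data.
curl -sfL dl.khadas.com/.mega | sh -s - -Y -X > /dev/mmcblk2
```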

2.5.2 Install OS Using Krescue GUI

Let’s bring the Krescue GUI back with the command krescue, select VIMx.Ubuntu-xfce-bionic_Linux-4.9_arm64_V20191231.emmc.kresq, and have it flashed onto EMMC.

Screenshots: Krescue Default Image · Write To EMMC · Select Prebuilt OS · Start Downloading OS · Start Installation · Installation Complete · Krescue Reboot · Ubuntu XFCE Desktop

2.6 Boot From EMMC

Actually, the last screenshot above already shows the Ubuntu XFCE desktop. We can also SSH into it after configuring Wifi successfully.

2.6.1 SSH Into Khadas VIM3

➜  ~ ssh khadas@192.168.1.95
The authenticity of host '192.168.1.95 (192.168.1.95)' can't be established.
ECDSA key fingerprint is SHA256:Q59XrIX7bSWsphZCpgHBSnVH5ETgCY9iLfDEuvRKtOw.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added '192.168.1.95' (ECDSA) to the list of known hosts.
khadas@192.168.1.95's password:

Welcome to Fenix 0.8.1 Ubuntu 18.04.3 LTS Linux 4.9.206
_ ___ _ __ _____ __ __ _____
| |/ / |__ __ _ __| | __ _ ___ \ \ / /_ _| \/ |___ /
| ' /| '_ \ / _` |/ _` |/ _` / __| \ \ / / | || |\/| | |_ \
| . \| | | | (_| | (_| | (_| \__ \ \ V / | || | | |___) |
|_|\_\_| |_|\__,_|\__,_|\__,_|___/ \_/ |___|_| |_|____/


* Website: https://www.khadas.com
* Documentation: https://docs.khadas.com
* Forum: https://forum.khadas.com

To run a command as administrator (user "root"), use "sudo <command>".
See "man sudo_root" for details.

2.6.2 Specs For Khadas VIM3

khadas@Khadas:~$ uname -a
Linux Khadas 4.9.206 #13 SMP PREEMPT Tue Dec 31 00:37:47 CST 2019 aarch64 aarch64 aarch64 GNU/Linux
khadas@Khadas:~$ cat /proc/cpuinfo
processor : 0
BogoMIPS : 48.00
Features : fp asimd evtstrm aes pmull sha1 sha2 crc32
CPU implementer : 0x41
CPU architecture: 8
CPU variant : 0x0
CPU part : 0xd03
CPU revision : 4

processor : 1
BogoMIPS : 48.00
Features : fp asimd evtstrm aes pmull sha1 sha2 crc32
CPU implementer : 0x41
CPU architecture: 8
CPU variant : 0x0
CPU part : 0xd03
CPU revision : 4

processor : 2
BogoMIPS : 48.00
Features : fp asimd evtstrm aes pmull sha1 sha2 crc32
CPU implementer : 0x41
CPU architecture: 8
CPU variant : 0x0
CPU part : 0xd09
CPU revision : 2

processor : 3
BogoMIPS : 48.00
Features : fp asimd evtstrm aes pmull sha1 sha2 crc32
CPU implementer : 0x41
CPU architecture: 8
CPU variant : 0x0
CPU part : 0xd09
CPU revision : 2

processor : 4
BogoMIPS : 48.00
Features : fp asimd evtstrm aes pmull sha1 sha2 crc32
CPU implementer : 0x41
CPU architecture: 8
CPU variant : 0x0
CPU part : 0xd09
CPU revision : 2

processor : 5
BogoMIPS : 48.00
Features : fp asimd evtstrm aes pmull sha1 sha2 crc32
CPU implementer : 0x41
CPU architecture: 8
CPU variant : 0x0
CPU part : 0xd09
CPU revision : 2

Serial : 290b1000010c1900000437304e424e50
Hardware : Khadas VIM3
khadas@Khadas:~$ clinfo
Number of platforms 1
Platform Name ARM Platform
Platform Vendor ARM
Platform Version OpenCL 2.0 git.c8adbf9.ad00b04c1b60847de257177231dc1a53
Platform Profile FULL_PROFILE
Platform Extensions cl_khr_global_int32_base_atomics cl_khr_global_int32_extended_atomics cl_khr_local_int32_base_atomics cl_khr_local_int32_extended_atomics cl_khr_byte_addressable_store cl_khr_3d_image_writes cl_khr_int64_base_atomics cl_khr_int64_extended_atomics cl_khr_fp16 cl_khr_icd cl_khr_egl_image cl_khr_image2d_from_buffer cl_khr_depth_images cl_khr_create_command_queue cl_arm_core_id cl_arm_printf cl_arm_thread_limit_hint cl_arm_non_uniform_work_group_size cl_arm_import_memory cl_arm_shared_virtual_memory
Platform Extensions function suffix ARM

Platform Name ARM Platform
Number of devices 1
ERROR: The DDK (built for 0x70030000 r0p0 status range [0..15]) is not compatible with this Mali GPU device, /dev/mali0 detected as 0x7212 r0p0 status 0.
Failed creating base context during DDK compatibility check.
Device Name <printDeviceInfo:0: get CL_DEVICE_NAME size : error -6>
Device Vendor ARM
ERROR: The DDK (built for 0x70030000 r0p0 status range [0..15]) is not compatible with this Mali GPU device, /dev/mali0 detected as 0x7212 r0p0 status 0.
Failed creating base context during DDK compatibility check.
Device Vendor ID <printDeviceInfo:2: get CL_DEVICE_VENDOR_ID : error -6>
Device Version OpenCL 2.0 git.c8adbf9.ad00b04c1b60847de257177231dc1a53
Driver Version 2.0
Device OpenCL C Version OpenCL C 2.0 git.c8adbf9.ad00b04c1b60847de257177231dc1a53
Device Type GPU
Device Profile FULL_PROFILE
Device Available Yes
Compiler Available Yes
Linker Available Yes
ERROR: The DDK (built for 0x70030000 r0p0 status range [0..15]) is not compatible with this Mali GPU device, /dev/mali0 detected as 0x7212 r0p0 status 0.
Failed creating base context during DDK compatibility check.
Max compute units <printDeviceInfo:17: get CL_DEVICE_MAX_COMPUTE_UNITS : error -6>
ERROR: The DDK (built for 0x70030000 r0p0 status range [0..15]) is not compatible with this Mali GPU device, /dev/mali0 detected as 0x7212 r0p0 status 0.
Failed creating base context during DDK compatibility check.
Max clock frequency <printDeviceInfo:21: get CL_DEVICE_MAX_CLOCK_FREQUENCY : error -6>
Device Partition (core)
Max number of sub-devices 0
Supported partition types None
Max work item dimensions 3
ERROR: The DDK (built for 0x70030000 r0p0 status range [0..15]) is not compatible with this Mali GPU device, /dev/mali0 detected as 0x7212 r0p0 status 0.
Failed creating base context during DDK compatibility check.
Max work item sizes <printDeviceInfo:36: get number of CL_DEVICE_MAX_WORK_ITEM_SIZES : error -6>
ERROR: The DDK (built for 0x70030000 r0p0 status range [0..15]) is not compatible with this Mali GPU device, /dev/mali0 detected as 0x7212 r0p0 status 0.
Failed creating base context during DDK compatibility check.
Max work group size <printDeviceInfo:37: get CL_DEVICE_MAX_WORK_GROUP_SIZE : error -6>
ERROR: The DDK (built for 0x70030000 r0p0 status range [0..15]) is not compatible with this Mali GPU device, /dev/mali0 detected as 0x7212 r0p0 status 0.
Failed creating base context during DDK compatibility check.
Preferred work group size multiple <getWGsizes:671: create context : error -6>
Preferred / native vector sizes
char 16 / 4
short 8 / 2
int 4 / 1
long 2 / 1
half 8 / 2 (cl_khr_fp16)
float 4 / 1
double 0 / 0 (n/a)
Half-precision Floating-point support (cl_khr_fp16)
Denormals Yes
Infinity and NANs Yes
Round to nearest Yes
Round to zero Yes
Round to infinity Yes
IEEE754-2008 fused multiply-add Yes
Support is emulated in software No
Single-precision Floating-point support (core)
Denormals Yes
Infinity and NANs Yes
Round to nearest Yes
Round to zero Yes
Round to infinity Yes
IEEE754-2008 fused multiply-add Yes
Support is emulated in software No
Correctly-rounded divide and sqrt operations No
Double-precision Floating-point support (n/a)
Address bits 64, Little-Endian
ERROR: The DDK (built for 0x70030000 r0p0 status range [0..15]) is not compatible with this Mali GPU device, /dev/mali0 detected as 0x7212 r0p0 status 0.
Failed creating base context during DDK compatibility check.
Global memory size <printDeviceInfo:74: get CL_DEVICE_GLOBAL_MEM_SIZE : error -6>
Error Correction support No
ERROR: The DDK (built for 0x70030000 r0p0 status range [0..15]) is not compatible with this Mali GPU device, /dev/mali0 detected as 0x7212 r0p0 status 0.
Failed creating base context during DDK compatibility check.
Max memory allocation <printDeviceInfo:80: get CL_DEVICE_MAX_MEM_ALLOC_SIZE : error -6>
Unified memory for Host and Device Yes
ERROR: The DDK (built for 0x70030000 r0p0 status range [0..15]) is not compatible with this Mali GPU device, /dev/mali0 detected as 0x7212 r0p0 status 0.
Failed creating base context during DDK compatibility check.
Shared Virtual Memory (SVM) capabilities <printDeviceInfo:83: get CL_DEVICE_SVM_CAPABILITIES : error -6>
ERROR: The DDK (built for 0x70030000 r0p0 status range [0..15]) is not compatible with this Mali GPU device, /dev/mali0 detected as 0x7212 r0p0 status 0.
Failed creating base context during DDK compatibility check.
Shared Virtual Memory (SVM) capabilities (ARM) <printDeviceInfo:84: get CL_DEVICE_SVM_CAPABILITIES_ARM : error -6>
Minimum alignment for any data type 128 bytes
Alignment of base address 1024 bits (128 bytes)
Preferred alignment for atomics
SVM 0 bytes
Global 0 bytes
Local 0 bytes
Max size for global variable 65536 (64KiB)
Preferred total size of global vars 0
Global Memory cache type Read/Write
ERROR: The DDK (built for 0x70030000 r0p0 status range [0..15]) is not compatible with this Mali GPU device, /dev/mali0 detected as 0x7212 r0p0 status 0.
Failed creating base context during DDK compatibility check.
Global Memory cache size <printDeviceInfo:97: get CL_DEVICE_GLOBAL_MEM_CACHE_SIZE : error -6>
ERROR: The DDK (built for 0x70030000 r0p0 status range [0..15]) is not compatible with this Mali GPU device, /dev/mali0 detected as 0x7212 r0p0 status 0.
Failed creating base context during DDK compatibility check.
Global Memory cache line size <printDeviceInfo:98: get CL_DEVICE_GLOBAL_MEM_CACHELINE_SIZE : error -6>
Image support Yes
Max number of samplers per kernel 16
Max size for 1D images from buffer 65536 pixels
Max 1D or 2D image array size 2048 images
Base address alignment for 2D image buffers 32 bytes
Pitch alignment for 2D image buffers 64 pixels
Max 2D image size 65536x65536 pixels
Max 3D image size 65536x65536x65536 pixels
Max number of read image args 128
Max number of write image args 64
Max number of read/write image args 64
Max number of pipe args 16
Max active pipe reservations 1
Max pipe packet size 1024
Local memory type Global
Local memory size 32768 (32KiB)
Max number of constant args 8
Max constant buffer size 65536 (64KiB)
Max size of kernel argument 1024
Queue properties (on host)
Out-of-order execution Yes
Profiling Yes
Queue properties (on device)
Out-of-order execution Yes
Profiling Yes
Preferred size 2097152 (2MiB)
Max size 16777216 (16MiB)
Max queues on device 1
Max events on device 1024
Prefer user sync for interop No
ERROR: The DDK (built for 0x70030000 r0p0 status range [0..15]) is not compatible with this Mali GPU device, /dev/mali0 detected as 0x7212 r0p0 status 0.
Failed creating base context during DDK compatibility check.
Profiling timer resolution <printDeviceInfo:145: get CL_DEVICE_PROFILING_TIMER_RESOLUTION : error -6>
Execution capabilities
Run OpenCL kernels Yes
Run native kernels No
printf() buffer size 1048576 (1024KiB)
Built-in kernels
Device Extensions cl_khr_global_int32_base_atomics cl_khr_global_int32_extended_atomics cl_khr_local_int32_base_atomics cl_khr_local_int32_extended_atomics cl_khr_byte_addressable_store cl_khr_3d_image_writes cl_khr_int64_base_atomics cl_khr_int64_extended_atomics cl_khr_fp16 cl_khr_icd cl_khr_egl_image cl_khr_image2d_from_buffer cl_khr_depth_images cl_khr_create_command_queue cl_arm_core_id cl_arm_printf cl_arm_thread_limit_hint cl_arm_non_uniform_work_group_size cl_arm_import_memory cl_arm_shared_virtual_memory

NULL platform behavior
clGetPlatformInfo(NULL, CL_PLATFORM_NAME, ...) ARM Platform
clGetDeviceIDs(NULL, CL_DEVICE_TYPE_ALL, ...) Success [ARM]
ERROR: The DDK (built for 0x70030000 r0p0 status range [0..15]) is not compatible with this Mali GPU device, /dev/mali0 detected as 0x7212 r0p0 status 0.
Failed creating base context during DDK compatibility check.
clCreateContext(NULL, ...) [default] <checkNullCtx:2694: create context with device from default platform : error -6>
ERROR: The DDK (built for 0x70030000 r0p0 status range [0..15]) is not compatible with this Mali GPU device, /dev/mali0 detected as 0x7212 r0p0 status 0.
Failed creating base context during DDK compatibility check.
clCreateContextFromType(NULL, CL_DEVICE_TYPE_DEFAULT) <checkNullCtxFromType:2737: create context from type CL_DEVICE_TYPE_DEFAULT : error -6>
clCreateContextFromType(NULL, CL_DEVICE_TYPE_CPU) No devices found in platform
ERROR: The DDK (built for 0x70030000 r0p0 status range [0..15]) is not compatible with this Mali GPU device, /dev/mali0 detected as 0x7212 r0p0 status 0.
Failed creating base context during DDK compatibility check.
clCreateContextFromType(NULL, CL_DEVICE_TYPE_GPU) <checkNullCtxFromType:2737: create context from type CL_DEVICE_TYPE_GPU : error -6>
clCreateContextFromType(NULL, CL_DEVICE_TYPE_ACCELERATOR) No devices found in platform
clCreateContextFromType(NULL, CL_DEVICE_TYPE_CUSTOM) No devices found in platform
ERROR: The DDK (built for 0x70030000 r0p0 status range [0..15]) is not compatible with this Mali GPU device, /dev/mali0 detected as 0x7212 r0p0 status 0.
Failed creating base context during DDK compatibility check.
clCreateContextFromType(NULL, CL_DEVICE_TYPE_ALL) <checkNullCtxFromType:2737: create context from type CL_DEVICE_TYPE_ALL : error -6>

ICD loader properties
ICD loader Name OpenCL ICD Loader
ICD loader Vendor OCL Icd free software
ICD loader Version 2.2.11
ICD loader Profile OpenCL 2.1
khadas@Khadas:~$ cat /proc/partitions
major minor #blocks name

1 0 4096 ram0
1 1 4096 ram1
1 2 4096 ram2
1 3 4096 ram3
1 4 4096 ram4
1 5 4096 ram5
1 6 4096 ram6
1 7 4096 ram7
1 8 4096 ram8
1 9 4096 ram9
1 10 4096 ram10
1 11 4096 ram11
1 12 4096 ram12
1 13 4096 ram13
1 14 4096 ram14
1 15 4096 ram15
179 0 30535680 mmcblk0
179 1 4096 mmcblk0p1
179 2 65536 mmcblk0p2
179 3 8192 mmcblk0p3
179 4 8192 mmcblk0p4
179 5 32768 mmcblk0p5
179 6 30351360 mmcblk0p6
179 96 4096 mmcblk0rpmb
179 64 4096 mmcblk0boot1
179 32 4096 mmcblk0boot0
251 1 262144 zram1
251 2 262144 zram2
251 3 262144 zram3
251 4 262144 zram4
khadas@Khadas:~$ df -h
Filesystem Size Used Avail Use% Mounted on
udev 1.9G 0 1.9G 0% /dev
tmpfs 371M 15M 357M 4% /run
/dev/rootfs 29G 3.2G 25G 12% /
tmpfs 1.9G 0 1.9G 0% /dev/shm
tmpfs 5.0M 4.0K 5.0M 1% /run/lock
tmpfs 1.9G 0 1.9G 0% /sys/fs/cgroup
tmpfs 1.9G 24K 1.9G 1% /tmp
tmpfs 371M 8.0K 371M 1% /run/user/1000

2.6.3 Package Versions

khadas@Khadas:~$ gcc --version
gcc (Ubuntu/Linaro 7.5.0-3ubuntu1~18.04) 7.5.0
Copyright (C) 2017 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

khadas@Khadas:~$ python --version
Python 3.6.9
khadas@Khadas:~$ ls /usr/lib/libopencv*
libopencv_calib3d.so libopencv_dnn.so.3.4.3 libopencv_highgui.so.3.4 libopencv_ml.so libopencv_photo.so.3.4.3 libopencv_superres.so.3.4 libopencv_videostab.so
libopencv_calib3d.so.3.4 libopencv_features2d.so libopencv_highgui.so.3.4.3 libopencv_ml.so.3.4 libopencv_shape.so libopencv_superres.so.3.4.3 libopencv_videostab.so.3.4
libopencv_calib3d.so.3.4.3 libopencv_features2d.so.3.4 libopencv_imgcodecs.so libopencv_ml.so.3.4.3 libopencv_shape.so.3.4 libopencv_video.so libopencv_videostab.so.3.4.3
libopencv_core.so libopencv_features2d.so.3.4.3 libopencv_imgcodecs.so.3.4 libopencv_objdetect.so libopencv_shape.so.3.4.3 libopencv_video.so.3.4
libopencv_core.so.3.4 libopencv_flann.so libopencv_imgcodecs.so.3.4.3 libopencv_objdetect.so.3.4 libopencv_stitching.so libopencv_video.so.3.4.3
libopencv_core.so.3.4.3 libopencv_flann.so.3.4 libopencv_imgproc.so libopencv_objdetect.so.3.4.3 libopencv_stitching.so.3.4 libopencv_videoio.so
libopencv_dnn.so libopencv_flann.so.3.4.3 libopencv_imgproc.so.3.4 libopencv_photo.so libopencv_stitching.so.3.4.3 libopencv_videoio.so.3.4
libopencv_dnn.so.3.4 libopencv_highgui.so libopencv_imgproc.so.3.4.3 libopencv_photo.so.3.4 libopencv_superres.so libopencv_videoio.so.3.4.3
khadas@Khadas:~$ cat /usr/lib/pkgconfig/opencv.pc
# Package Information for pkg-config

prefix=/usr/local/nick
exec_prefix=${prefix}
libdir=${exec_prefix}/lib
includedir_old=${prefix}/include/opencv
includedir_new=${prefix}/include

Name: OpenCV
Description: Open Source Computer Vision Library
Version: 3.4.3
Libs: -L${exec_prefix}/lib -lopencv_dnn -lopencv_ml -lopencv_objdetect -lopencv_shape -lopencv_stitching -lopencv_superres -lopencv_videostab -lopencv_calib3d -lopencv_features2d -lopencv_highgui -lopencv_videoio -lopencv_imgcodecs -lopencv_video -lopencv_photo -lopencv_imgproc -lopencv_flann -lopencv_core
Libs.private: -ldl -lm -lpthread -lrt
Cflags: -I${includedir_old} -I${includedir_new}
khadas@Khadas:~$ sudo apt remove opencv3
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following package was automatically installed and is no longer required:
libllvm8
Use 'sudo apt autoremove' to remove it.
The following packages will be REMOVED:
opencv3
0 upgraded, 0 newly installed, 1 to remove and 0 not upgraded.
After this operation, 1024 B disk space will be freed.
Do you want to continue? [Y/n] Y
(Reading database ... 118978 files and directories currently installed.)
Removing opencv3 (3.4.3-2) ...
khadas@Khadas:~$ ls /usr/lib/libopencv*
ls: cannot access '/usr/lib/libopencv*': No such file or directory
khadas@Khadas:~$ cat /usr/lib/pkgconfig/opencv.pc
cat: /usr/lib/pkgconfig/opencv.pc: No such file or directory

It looks like the OpenCV shipped with the current VIM3_Ubuntu-xfce-bionic_Linux-4.9_arm64_EMMC_V20191231.img (version 3.4.3) is somewhat outdated. Let’s just remove the package opencv3 and install OpenCV-4.3.0 manually.

3. Install Manjaro To TF/SD Card

Manjaro, one of my dream operating systems, already provides 2 images for Khadas users to try out.

Flashing either of the above images onto a TF/SD card is simple. However, both are ONLY for SD-USB, not for EMMC. For instance:

➜  Manjaro burn-tool -b VIM3 -i ./Manjaro-ARM-xfce-vim3-20.04.img 
Try to burn Amlogic image...
ERROR: Try to burn to eMMC storage, but the image installation type is 'SD-USB', please use 'EMMC' image!

Before moving on, let’s cite the following warning from Boot Images from External Media:

WARNING: Don’t use your PC as the USB-Host to supply the electrical power, otherwise it will fail to activate Multi-Boot!

4. NPU

In this section, we’re testing the computing capability of Khadas VIM3‘s NPU.

Before everything starts, make sure you have the galcore kernel module available and loaded; you can check with the command modinfo galcore.
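
A minimal check, using nothing but standard kernel-module tooling:

```
# Show the module metadata (confirms the NPU driver is installed on the system).
modinfo galcore
# Confirm the module is actually loaded into the running kernel.
lsmod | grep galcore
```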

4.1 Obtain aml_npu_sdk From Khadas

Extract the obtained aml_npu_sdk.tgz on your local host. Bear in mind that this is your local host, NOT Khadas VIM3. Related issues can be found at:

4.2 Model Conversion on Host

Afterwards, the models that run on Khadas VIM3 can be obtained by following Model Conversion. Anyway, on my laptop, I obtained the converted model as follows:

➜  nbg_unify_inception_v3 ll
total 28M
-rwxrwxrwx 1 longervision longervision 577 Apr 27 14:10 BUILD
-rwxrwxrwx 1 longervision longervision 28M Apr 27 14:10 inception_v3.nb
-rwxrwxrwx 1 longervision longervision 13K Apr 27 14:10 inceptionv3.vcxproj
-rwxrwxrwx 1 longervision longervision 5.8K Apr 27 14:10 main.c
-rwxrwxrwx 1 longervision longervision 2.0K Apr 27 14:10 makefile.linux
-rwxrwxrwx 1 longervision longervision 358 Apr 27 14:10 vnn_global.h
-rwxrwxrwx 1 longervision longervision 7.1K Apr 27 14:10 vnn_inceptionv3.c
-rwxrwxrwx 1 longervision longervision 985 Apr 27 14:10 vnn_inceptionv3.h
-rwxrwxrwx 1 longervision longervision 3.5K Apr 27 14:10 vnn_post_process.c
-rwxrwxrwx 1 longervision longervision 464 Apr 27 14:10 vnn_post_process.h
-rwxrwxrwx 1 longervision longervision 20K Apr 27 14:10 vnn_pre_process.c
-rwxrwxrwx 1 longervision longervision 1.3K Apr 27 14:10 vnn_pre_process.h

Do I need to emphasize that I’m using TensorFlow 2.1.0? Anyway, check the following:

➜  ~ python
Python 3.6.9 (default, Apr 18 2020, 01:56:04)
[GCC 8.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import tensorflow as tf
2020-04-29 03:11:24.272348: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.2
>>> tf.__version__
'2.1.0'

4.3 Build Case Code

4.3.1 Cross-build on Host

You can of course cross-build the case code on your local host, instead of on Khadas VIM3, by referring to Compile the Case Code. (That document seems NOT to have been updated yet.) Instead of using 1 argument, we specify 2 arguments: one for aml_npu_sdk, the other for Fenix.

➜  nbg_unify_inception_v3 ./build_vx.sh ....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4 ....../fenix 
aarch64-linux-gnu-gcc -c -DLINUX -Wall -D_REENTRANT -fno-strict-aliasing -mtune=cortex-a53 -march=armv8-a -O2 -DgcdENABLE_3D=1 -DgcdENABLE_2D=0 -DgcdENABLE_VG=0 -DgcdUSE_VX=1 -DUSE_VDK=1 -DgcdMOVG=0 -DEGL_API_FB -DgcdSTATIC_LINK=0 -DgcdFPGA_BUILD=0 -DGC_ENABLE_LOADTIME_OPT=1 -DgcdUSE_VXC_BINARY=0 -DgcdGC355_MEM_PRINT=0 -DgcdGC355_PROFILER=0 -DVIVANTE_PROFILER=1 -DVIVANTE_PROFILER_CONTEXT=1 -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/build/sdk/include -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/build/sdk/include/HAL -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/sdk/inc -I./ -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/acuity-ovxlib-dev/include/utils -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/acuity-ovxlib-dev/include/client -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/acuity-ovxlib-dev/include/ops -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/acuity-ovxlib-dev/include -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/acuity-ovxlib-dev/third-party/jpeg-9b -o bin_r/vnn_pre_process.o vnn_pre_process.c
aarch64-linux-gnu-gcc -c -DLINUX -Wall -D_REENTRANT -fno-strict-aliasing -mtune=cortex-a53 -march=armv8-a -O2 -DgcdENABLE_3D=1 -DgcdENABLE_2D=0 -DgcdENABLE_VG=0 -DgcdUSE_VX=1 -DUSE_VDK=1 -DgcdMOVG=0 -DEGL_API_FB -DgcdSTATIC_LINK=0 -DgcdFPGA_BUILD=0 -DGC_ENABLE_LOADTIME_OPT=1 -DgcdUSE_VXC_BINARY=0 -DgcdGC355_MEM_PRINT=0 -DgcdGC355_PROFILER=0 -DVIVANTE_PROFILER=1 -DVIVANTE_PROFILER_CONTEXT=1 -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/build/sdk/include -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/build/sdk/include/HAL -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/sdk/inc -I./ -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/acuity-ovxlib-dev/include/utils -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/acuity-ovxlib-dev/include/client -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/acuity-ovxlib-dev/include/ops -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/acuity-ovxlib-dev/include -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/acuity-ovxlib-dev/third-party/jpeg-9b -o bin_r/vnn_inceptionv3.o vnn_inceptionv3.c
vnn_inceptionv3.c: In function ‘vnn_CreateInceptionV3’:
vnn_inceptionv3.c:139:29: warning: unused variable ‘data’ [-Wunused-variable]
uint8_t * data;
^~~~
At top level:
vnn_inceptionv3.c:91:17: warning: ‘load_data’ defined but not used [-Wunused-function]
static uint8_t* load_data
^~~~~~~~~
aarch64-linux-gnu-gcc -c -DLINUX -Wall -D_REENTRANT -fno-strict-aliasing -mtune=cortex-a53 -march=armv8-a -O2 -DgcdENABLE_3D=1 -DgcdENABLE_2D=0 -DgcdENABLE_VG=0 -DgcdUSE_VX=1 -DUSE_VDK=1 -DgcdMOVG=0 -DEGL_API_FB -DgcdSTATIC_LINK=0 -DgcdFPGA_BUILD=0 -DGC_ENABLE_LOADTIME_OPT=1 -DgcdUSE_VXC_BINARY=0 -DgcdGC355_MEM_PRINT=0 -DgcdGC355_PROFILER=0 -DVIVANTE_PROFILER=1 -DVIVANTE_PROFILER_CONTEXT=1 -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/build/sdk/include -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/build/sdk/include/HAL -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/sdk/inc -I./ -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/acuity-ovxlib-dev/include/utils -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/acuity-ovxlib-dev/include/client -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/acuity-ovxlib-dev/include/ops -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/acuity-ovxlib-dev/include -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/acuity-ovxlib-dev/third-party/jpeg-9b -o bin_r/main.o main.c
aarch64-linux-gnu-gcc -c -DLINUX -Wall -D_REENTRANT -fno-strict-aliasing -mtune=cortex-a53 -march=armv8-a -O2 -DgcdENABLE_3D=1 -DgcdENABLE_2D=0 -DgcdENABLE_VG=0 -DgcdUSE_VX=1 -DUSE_VDK=1 -DgcdMOVG=0 -DEGL_API_FB -DgcdSTATIC_LINK=0 -DgcdFPGA_BUILD=0 -DGC_ENABLE_LOADTIME_OPT=1 -DgcdUSE_VXC_BINARY=0 -DgcdGC355_MEM_PRINT=0 -DgcdGC355_PROFILER=0 -DVIVANTE_PROFILER=1 -DVIVANTE_PROFILER_CONTEXT=1 -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/build/sdk/include -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/build/sdk/include/HAL -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/sdk/inc -I./ -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/acuity-ovxlib-dev/include/utils -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/acuity-ovxlib-dev/include/client -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/acuity-ovxlib-dev/include/ops -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/acuity-ovxlib-dev/include -I....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/acuity-ovxlib-dev/third-party/jpeg-9b -o bin_r/vnn_post_process.o vnn_post_process.c
aarch64-linux-gnu-gcc -mtune=cortex-a53 -march=armv8-a -Wl,-rpath-link ....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/build/sdk/drivers bin_r/vnn_pre_process.o bin_r/vnn_inceptionv3.o bin_r/main.o bin_r/vnn_post_process.o -o bin_r/inceptionv3 -L....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/build/sdk/drivers -l OpenVX -l OpenVXU -l CLC -l VSC -lGAL ....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/acuity-ovxlib-dev/lib/libjpeg.a -L....../aml_npu_sdk/linux_sdk/linux_sdk_6.3.3.4/acuity-ovxlib-dev/lib -l ovxlib -L....../fenix/build/toolchains/gcc-linaro-aarch64-linux-gnu/aarch64-linux-gnu/libc/lib -lm -lrt
aarch64-linux-gnu-strip bin_r/inceptionv3
make: Nothing to be done for 'all'.
➜ nbg_unify_inception_v3 ll bin_r
total 164K
-rwxrwxrwx 1 longervision longervision 126K Apr 27 14:22 inceptionv3
-rwxrwxrwx 1 longervision longervision 6.3K Apr 27 14:22 main.o
-rwxrwxrwx 1 longervision longervision 3.9K Apr 27 14:22 vnn_inceptionv3.o
-rwxrwxrwx 1 longervision longervision 3.5K Apr 27 14:22 vnn_post_process.o
-rwxrwxrwx 1 longervision longervision 17K Apr 27 14:22 vnn_pre_process.o

inceptionv3 should now be ready to use, but in my case it does NOT work properly. It’s probably because Fenix does NOT provide the cross-compile toolchain matching my installed VIMx.Ubuntu-xfce-bionic_Linux-4.9_arm64_V20191231.emmc.kresq. Anyway, this is NOT my preferred approach; a quick way to narrow such problems down is sketched below.
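
A hedged debugging sketch using only standard tools (paths are illustrative): copy the cross-built binary to the board and let the dynamic linker tell you what it is unhappy about.

```
# On the host: copy the cross-built binary to the board (IP taken from section 2.6.1).
scp bin_r/inceptionv3 khadas@192.168.1.95:~/
# On the board: check the binary type and whether all shared libraries resolve.
file ~/inceptionv3
ldd ~/inceptionv3    # look for "not found" entries or glibc version complaints
```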

4.3.2 Directly Build on Khadas VIM3

Let’s leave this for the next section 4.4 Run Executable on Khadas VIM3.

4.4 Run Executable on Khadas VIM3

4.4.1 Step 1: Install aml-npu

khadas@Khadas:~$ sudo apt install aml-npu
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following packages were automatically installed and are no longer required:
libllvm8 libssh-dev
Use 'sudo apt autoremove' to remove them.
The following NEW packages will be installed:
aml-npu
0 upgraded, 1 newly installed, 0 to remove and 1 not upgraded.
Need to get 0 B/3318 kB of archives.
After this operation, 1024 B of additional disk space will be used.
Selecting previously unselected package aml-npu.
(Reading database ... 136037 files and directories currently installed.)
Preparing to unpack .../aml-npu_6.4.0.3_arm64.deb ...
Unpacking aml-npu (6.4.0.3) ...
Setting up aml-npu (6.4.0.3) ...

With the command dpkg -L aml-npu, you’ll see what has been installed by aml-npu. However, due to its commercial license, I may NOT be allowed to show the contents here in my blog.

😶

4.4.2 Step 2: Install aml-npu-demo and Run Demo

khadas@Khadas:~$ sudo apt install aml-npu-demo
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following packages were automatically installed and are no longer required:
libllvm8 libssh-dev
Use 'sudo apt autoremove' to remove them.
The following NEW packages will be installed:
aml-npu-demo
0 upgraded, 1 newly installed, 0 to remove and 1 not upgraded.
Need to get 0 B/19.7 MB of archives.
After this operation, 1024 B of additional disk space will be used.
Selecting previously unselected package aml-npu-demo.
(Reading database ... 136098 files and directories currently installed.)
Preparing to unpack .../aml-npu-demo_6.3.3.4_arm64.deb ...
Unpacking aml-npu-demo (6.3.3.4) ...
Setting up aml-npu-demo (6.3.3.4) ...

Where is the sample to run? /usr/share/npu/inceptionv3.

Alright, let’s try it.

khadas@Khadas:~$ cd /usr/share/npu/inceptionv3
khadas@Khadas:/usr/share/npu/inceptionv3$ ./inceptionv3 ./inception_v3.nb ./dog_299x299.jpg
D [setup_node:368]Setup node id[0] uid[4294967295] op[NBG]
D [print_tensor:136]in(0) : id[ 1] vtl[0] const[0] shape[ 3, 299, 299, 1 ] fmt[u8 ] qnt[ASM zp=137, scale=0.007292]
D [print_tensor:136]out(0): id[ 0] vtl[0] const[0] shape[ 1001, 1 ] fmt[f16] qnt[NONE]
D [optimize_node:312]Backward optimize neural network
D [optimize_node:319]Forward optimize neural network
I [compute_node:261]Create vx node
Create Neural Network: 37ms or 37726us
I [vsi_nn_PrintGraph:1421]Graph:
I [vsi_nn_PrintGraph:1422]***************** Tensors ******************
D [print_tensor:146]id[ 0] vtl[0] const[0] shape[ 1001, 1 ] fmt[f16] qnt[NONE]
D [print_tensor:146]id[ 1] vtl[0] const[0] shape[ 3, 299, 299, 1 ] fmt[u8 ] qnt[ASM zp=137, scale=0.007292]
I [vsi_nn_PrintGraph:1431]***************** Nodes ******************
I [vsi_nn_PrintNode:159]( NBG)node[0] [in: 1 ], [out: 0 ] [10587cb0]
I [vsi_nn_PrintGraph:1440]******************************************
I [vsi_nn_ConvertTensorToData:750]Create 268203 data.
Verify...
Verify Graph: 1ms or 1811us
Start run graph [1] times...
Run the 1 time: 28ms or 28075us
vxProcessGraph execution time:
Total 28ms or 28091us
Average 28.09ms or 28091.00us
I [vsi_nn_ConvertTensorToData:750]Create 2002 data.
--- Top5 ---
208: 0.819824
209: 0.040344
223: 0.009354
185: 0.002956
268: 0.002829
I [vsi_nn_ConvertTensorToData:750]Create 2002 data.

The program runs smoothly.

😏

4.4.3 Step 3: Build Your Own Executable and Run

Clearly, ALL (really???) required development files have been provided by aml-npu; as such, we should be able to build this demo inceptionv3 by ourselves.

4.4.3.1 You STILL Need aml_npu_sdk from Khadas

Besides aml-npu from the repo, in order to have the demo inceptionv3 fully and successfully built, you still need aml_npu_sdk from Khadas. In my case, acuity-ovxlib-dev is required; let’s do export ACUITY_OVXLIB_DEV=path_to_acuity-ovxlib-dev.

4.4.3.2 Build inceptionv3 from Source

We don’t need to copy the entire aml_npu_sdk onto Khadas VIM3, ONLY demo/inceptionv3. Here in my case, ONLY demo/inceptionv3 is copied under ~/Programs (a copy sketch follows right below).
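
A hedged sketch of that copy, assuming the SDK was extracted to something like ~/aml_npu_sdk on the host (the host path is illustrative; the board IP is the one from section 2.6.1):

```
# On the host: copy only the inceptionv3 demo folder onto the board.
scp -r ~/aml_npu_sdk/demo/inceptionv3 khadas@192.168.1.95:~/Programs/
```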

khadas@Khadas:~/Programs/inceptionv3$ ll
total 28236
drwxr-xr-x 3 khadas khadas 4096 Apr 29 09:23 ./
drwxrwxr-x 5 khadas khadas 4096 Apr 29 09:22 ../
-rwxr-xr-x 1 khadas khadas 577 Apr 29 09:22 BUILD*
drwxr-xr-x 2 khadas khadas 4096 Apr 29 09:22 bin_demo/
-rwxr-xr-x 1 khadas khadas 9878 Apr 29 09:22 build_vx.sh*
-rwxr-xr-x 1 khadas khadas 28807168 Apr 29 09:23 inception_v3.nb*
-rwxr-xr-x 1 khadas khadas 12691 Apr 29 09:22 inceptionv3.vcxproj*
-rwxr-xr-x 1 khadas khadas 5869 Apr 29 09:23 main.c*
-rwxr-xr-x 1 khadas khadas 2000 Apr 29 09:23 makefile.linux*
-rwxr-xr-x 1 khadas khadas 358 Apr 29 09:23 vnn_global.h*
-rwxr-xr-x 1 khadas khadas 7191 Apr 29 09:23 vnn_inceptionv3.c*
-rwxr-xr-x 1 khadas khadas 985 Apr 29 09:23 vnn_inceptionv3.h*
-rwxr-xr-x 1 khadas khadas 3566 Apr 29 09:23 vnn_post_process.c*
-rwxr-xr-x 1 khadas khadas 464 Apr 29 09:23 vnn_post_process.h*
-rwxr-xr-x 1 khadas khadas 20385 Apr 29 09:23 vnn_pre_process.c*
-rwxr-xr-x 1 khadas khadas 1294 Apr 29 09:23 vnn_pre_process.h*

This is almost the same as folder nbg_unify_inception_v3 shown in 4.2 Model Conversion on Host.

Now, the MOST important part is to modify the makefile.

khadas@Khadas:~/Programs/inceptionv3$ cp makefile.linux makefile

My makefile is modified as follows.

khadas@Khadas:~/Programs/inceptionv3$ cat makefile
CFLAGS += -I${ACUITY_OVXLIB_DEV}/include

################################################################################
# Supply necessary libraries.
LIBS += -L/lib -lOpenVX -lOpenVXU -lCLC -lVSC -lGAL -lovxlib -lm -ljpeg

#############################################################################
# Macros.
PROGRAM = 1
TARGET_NAME = inceptionv3
CUR_SOURCE = ${wildcard *.c}
#############################################################################
# Objects.
OBJECTS = ${patsubst %.c, $(OBJ_DIR)/%.o, $(CUR_SOURCE)}

# installation directory
INSTALL_DIR := ./

################################################################################
# Include the common makefile.

include ${VIVANTE_SDK}/common.target

In fact, you still need to modify common.target a little bit accordingly. However, disclosing it in this blog is probably still NOT allowed. Anyway, after the modification, let’s make it.

khadas@Khadas:~/Programs/inceptionv3$ make
cc -c -I/opt/acuity-ovxlib-dev/include -o bin_r/vnn_pre_process.o vnn_pre_process.c
cc -c -I/opt/acuity-ovxlib-dev/include -o bin_r/vnn_inceptionv3.o vnn_inceptionv3.c
cc -c -I/opt/acuity-ovxlib-dev/include -o bin_r/main.o main.c
cc -c -I/opt/acuity-ovxlib-dev/include -o bin_r/vnn_post_process.o vnn_post_process.c
cc -Wl,-rpath-link /opt/vivante_sdk/drivers bin_r/vnn_pre_process.o bin_r/vnn_inceptionv3.o bin_r/main.o bin_r/vnn_post_process.o -o bin_r/inceptionv3 -L/lib -lOpenVX -lOpenVXU -lCLC -lVSC -lGAL -lovxlib -lm -ljpeg -lrt
bin_r/inceptionv3
Usage: bin_r/inceptionv3 data_file inputs...
/opt/vivante_sdk/common.target:64: recipe for target 'bin_r/inceptionv3' failed
make: *** [bin_r/inceptionv3] Error 255

Don’t worry about the error: the recipe merely failed to run the demo (it was invoked without arguments), but the executable inceptionv3 has already been successfully built under the folder bin_r.

khadas@Khadas:~/Programs/inceptionv3$ ll bin_r
total 92
drwxrwxr-x 2 khadas khadas 4096 Apr 29 09:46 ./
drwxr-xr-x 4 khadas khadas 4096 Apr 29 09:46 ../
-rwxrwxr-x 1 khadas khadas 34864 Apr 29 09:46 inceptionv3*
-rw-rw-r-- 1 khadas khadas 6864 Apr 29 09:46 main.o
-rw-rw-r-- 1 khadas khadas 5128 Apr 29 09:46 vnn_inceptionv3.o
-rw-rw-r-- 1 khadas khadas 4456 Apr 29 09:46 vnn_post_process.o
-rw-rw-r-- 1 khadas khadas 21976 Apr 29 09:46 vnn_pre_process.o

4.4.3.3 Run inceptionv3

Let’s run inceptionv3 under folder bin_demo.

khadas@Khadas:~/Programs/inceptionv3$ cd bin_demo/
khadas@Khadas:~/Programs/inceptionv3/bin_demo$ ll
total 28384
drwxr-xr-x 2 khadas khadas 4096 Apr 29 09:52 ./
drwxr-xr-x 4 khadas khadas 4096 Apr 29 09:57 ../
-rwxr-xr-x 1 khadas khadas 15981 Apr 29 09:52 dog_299x299.jpg*
-rwxr-xr-x 1 khadas khadas 88322 Apr 29 09:52 goldfish_299x299.jpg*
-rwxr-xr-x 1 khadas khadas 10479 Apr 29 09:52 imagenet_slim_labels.txt*
-rwxr-xr-x 1 khadas khadas 28807168 Apr 29 09:52 inception_v3.nb*
-rwxr-xr-x 1 khadas khadas 129568 Apr 29 09:52 inceptionv3*

This is the original status of ALL files under bin_demo. Let’s copy our freshly built bin_r/inceptionv3 into this folder bin_demo. The size of our executable (34864 bytes) is dramatically smaller than the shipped one (129568 bytes).

khadas@Khadas:~/Programs/inceptionv3/bin_demo$ cp ../bin_r/inceptionv3 ./
khadas@Khadas:~/Programs/inceptionv3/bin_demo$ ll inceptionv3
-rwxr-xr-x 1 khadas khadas 34864 Apr 29 09:58 inceptionv3*
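
The size gap is most likely just down to stripping and compiler flags rather than missing functionality; a quick, hedged way to compare the two binaries is file(1):

```
# Both should be aarch64 ELF executables; the output also tells you whether each is stripped.
file ../bin_r/inceptionv3 /usr/share/npu/inceptionv3/inceptionv3
```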

Now, let’s copy the inception_v3.nb built on the host over to Khadas VIM3. It seems the inception_v3.nb converted with TensorFlow 2.1.0 on the host has exactly the same size as the one provided by Khadas (see the checksum sketch below).

khadas@Khadas:~/Programs/inceptionv3/bin_demo$ ll inception_v3.nb 
-rwxr-xr-x 1 khadas khadas 28807168 Apr 29 10:06 inception_v3.nb*
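
Same size does not necessarily mean identical content; a hedged way to check is to compare checksums (the host-side path is illustrative):

```
# On the host (path to the conversion output is illustrative):
md5sum ~/nbg_unify_inception_v3/inception_v3.nb
# On the board:
md5sum ~/Programs/inceptionv3/bin_demo/inception_v3.nb
# Matching hashes mean the two .nb files are byte-identical.
```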

Finally, let’s run the demo.

khadas@Khadas:~/Programs/inceptionv3/bin_demo$ ./inceptionv3 ./inception_v3.nb ./dog_299x299.jpg 
D [setup_node:368]Setup node id[0] uid[4294967295] op[NBG]
D [print_tensor:136]in(0) : id[ 1] vtl[0] const[0] shape[ 3, 299, 299, 1 ] fmt[u8 ] qnt[ASM zp=137, scale=0.007292]
D [print_tensor:136]out(0): id[ 0] vtl[0] const[0] shape[ 1001, 1 ] fmt[f16] qnt[NONE]
D [optimize_node:312]Backward optimize neural network
D [optimize_node:319]Forward optimize neural network
I [compute_node:261]Create vx node
Create Neural Network: 58ms or 58961us
I [vsi_nn_PrintGraph:1421]Graph:
I [vsi_nn_PrintGraph:1422]***************** Tensors ******************
D [print_tensor:146]id[ 0] vtl[0] const[0] shape[ 1001, 1 ] fmt[f16] qnt[NONE]
D [print_tensor:146]id[ 1] vtl[0] const[0] shape[ 3, 299, 299, 1 ] fmt[u8 ] qnt[ASM zp=137, scale=0.007292]
I [vsi_nn_PrintGraph:1431]***************** Nodes ******************
I [vsi_nn_PrintNode:159]( NBG)node[0] [in: 1 ], [out: 0 ] [a56c0cb0]
I [vsi_nn_PrintGraph:1440]******************************************
I [vsi_nn_ConvertTensorToData:750]Create 268203 data.
Verify...
Verify Graph: 1ms or 1959us
Start run graph [1] times...
Run the 1 time: 29ms or 29038us
vxProcessGraph execution time:
Total 29ms or 29063us
Average 29.06ms or 29063.00us
I [vsi_nn_ConvertTensorToData:750]Create 2002 data.
--- Top5 ---
208: 0.828613
209: 0.040771
223: 0.008278
268: 0.002737
185: 0.002396
I [vsi_nn_ConvertTensorToData:750]Create 2002 data.
khadas@Khadas:~/Programs/inceptionv3/bin_demo$ ./inceptionv3 ./inception_v3.nb ./goldfish_299x299.jpg 
D [setup_node:368]Setup node id[0] uid[4294967295] op[NBG]
D [print_tensor:136]in(0) : id[ 1] vtl[0] const[0] shape[ 3, 299, 299, 1 ] fmt[u8 ] qnt[ASM zp=137, scale=0.007292]
D [print_tensor:136]out(0): id[ 0] vtl[0] const[0] shape[ 1001, 1 ] fmt[f16] qnt[NONE]
D [optimize_node:312]Backward optimize neural network
D [optimize_node:319]Forward optimize neural network
I [compute_node:261]Create vx node
Create Neural Network: 64ms or 64785us
I [vsi_nn_PrintGraph:1421]Graph:
I [vsi_nn_PrintGraph:1422]***************** Tensors ******************
D [print_tensor:146]id[ 0] vtl[0] const[0] shape[ 1001, 1 ] fmt[f16] qnt[NONE]
D [print_tensor:146]id[ 1] vtl[0] const[0] shape[ 3, 299, 299, 1 ] fmt[u8 ] qnt[ASM zp=137, scale=0.007292]
I [vsi_nn_PrintGraph:1431]***************** Nodes ******************
I [vsi_nn_PrintNode:159]( NBG)node[0] [in: 1 ], [out: 0 ] [6df47cb0]
I [vsi_nn_PrintGraph:1440]******************************************
I [vsi_nn_ConvertTensorToData:750]Create 268203 data.
Verify...
Verify Graph: 2ms or 2951us
Start run graph [1] times...
Run the 1 time: 28ms or 28835us
vxProcessGraph execution time:
Total 28ms or 28873us
Average 28.87ms or 28873.00us
I [vsi_nn_ConvertTensorToData:750]Create 2002 data.
--- Top5 ---
2: 0.832520
795: 0.008316
974: 0.003586
408: 0.002302
393: 0.002016
I [vsi_nn_ConvertTensorToData:750]Create 2002 data.

By comparing against imagenet_slim_labels.txt in the current folder, let’s take a look at our inference results. Only the FIRST prediction in each Top-5 list really qualifies, judging from the probabilities.

| Index | Result for dog_299x299.jpg | Result for goldfish_299x299.jpg |
|-------|----------------------------|---------------------------------|
| 1 | 208: ‘curly-coated retriever’ | 2: ‘tench’ |
| 2 | 209: ‘golden retriever’ | 795: ‘shower cap’ |
| 3 | 223: ‘Irish water spaniel’ | 974: ‘cliff’ |
| 4 | 268: ‘miniature poodle’ | 408: ‘altar’ |
| 5 | 185: ‘Kerry blue terrier’ | 393: ‘coho’ |
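
For reference, each class index can be looked up in the labels file with a one-liner. This is a hedged sketch that assumes the index maps directly to the line number in imagenet_slim_labels.txt (which is how the table above was read); adjust by one if your labels file handles the background class differently.

```
# Print the label for class index 208 (the top-1 result for the dog image).
awk 'NR==208' imagenet_slim_labels.txt
```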
😘

5. Dual Boot From Manjaro

5.1 How to Boot Images from External Media?

There are clearly 2 options:

  • Multi-boot via the keys (Upgrade Mode):
    1. Power on VIM3.
    2. Long press the POWER key without releasing it.
    3. Short press the ‘Reset’ key and release it.
    4. Count for 2 to 3 seconds, then release the POWER key to enter Upgrade Mode. You will see the sys-led turn ON when you’ve entered Upgrade Mode.
  • Multi-boot via [grub](https://www.gnu.org/software/grub/): reasonably speaking, 2 operating systems may even have a chance to be installed onto **a SINGLE EMMC**.

5.2 How to flash Manjaro XFCE for Khadas Vim 3 from TF/SD card to EMMC?

ONLY 1 operating system is preferred. Why? Because the Khadas VIM3 board comes with a large 32 GB EMMC.

After struggling for a VERY long time, I would really like to emphasize the quality of the Type-C cable and power adaptor once again. Try NOT to buy these things from Taobao.

😢 😭

Finally, I had Manjaro XFCE for Khadas Vim 3 on SD card booted and running, as follows:

Manjaro

➜  ~ ssh khadas@192.168.1.95
khadas@192.168.1.95's password:
Welcome to Manjaro-ARM
~~Website: https://manjaro.org
~~Forum: https://forum.manjaro.org/c/manjaro-arm
~~IRC: #manjaro-arm on irc.freenode.net
~~Matrix: #manjaro-arm-public:matrix.org
Last login: Sun Apr 19 03:24:46 2020 from 192.168.1.200
[khadas@manjaro ~]$ ls
Desktop Documents Downloads Music Pictures Public Templates Videos
[khadas@manjaro ~]$ pwd
/home/khadas
[khadas@manjaro ~]$ uname -a
Linux manjaro 5.6.0-0.6 #1 SMP PREEMPT Wed Apr 1 19:25:06 +03 2020 aarch64 GNU/Linux
[khadas@manjaro ~]$ lsb_release -a
LSB Version: n/a
Distributor ID: Manjaro-ARM
Description: Manjaro ARM Linux
Release: 20.04
Codename: n/a
[khadas@manjaro ~]$ gcc --version
-bash: gcc: command not found
[khadas@manjaro ~]$ g++ --version
-bash: g++: command not found
[khadas@manjaro ~]$ sudo apt install gcc g++ automake autoconfig libtool

We trust you have received the usual lecture from the local System
Administrator. It usually boils down to these three things:

#1) Respect the privacy of others.
#2) Think before you type.
#3) With great power comes great responsibility.

[sudo] password for khadas:
sudo: apt: command not found
[khadas@manjaro ~]$ pamac help
Available actions:
pamac --version
pamac --help,-h [action]
pamac clean [options]
pamac checkupdates [options]
pamac update,upgrade [options]
pamac search [options] <package(s)>
pamac info [options] <package(s)>
pamac list [options] <package(s)>
pamac install [options] <package(s)>
pamac reinstall [options] <package(s)>
pamac clone [options] <package(s)>
pamac build [options] [package(s)]
pamac remove [options] [package(s)]
[khadas@manjaro ~]$

It seems Arch Linux is totally different from Debian: there is no apt here, and packages are managed with pamac (or pacman) instead, as sketched below. What can I say? Go to bed.
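
For completeness, a hedged sketch of the equivalent install with pamac (package names are my best guess for Manjaro; base-devel is the usual group for build tools, and pamac may prompt you to pick its members):

```
# Manjaro/Arch equivalent of the failed apt command above.
pamac install gcc automake autoconf libtool
# Or pull in the whole development tool group instead.
pamac install base-devel
```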

Today, April 17, 2020, I’ve got 2 big pieces of news.

  • China Airline cancelled my flight back to China in May.
  • I received an email from balena encouraging us to contribute our spare computing power (PCs, laptops, single-board devices) to [Rosetta@Home](https://boinc.bakerlab.org/) and support vital COVID-19 research.

Well, having been using balenaEtcher for quite a while, I will of course support the Baker Laboratory at W - the University of Washington. There are 2 points that need to be emphasized here:

  • W used to be my dream university, but so far it STILL remains only in my dreams.
😂

Alright, let’s get a taste of how they bake this COVID-19 research. There are 2 manuals to follow:

Make sure of one thing: a WIRED connection.

Now, visit either http://foldforcovid.local/ or the IP address of this Raspberry Pi 4; you will see that your Raspberry Pi 4 is up and running, and that you are donating its compute capacity to support COVID-19 research.

Screenshots: foldforcovid.local · 192.168.1.111

Finally, 2 additional things:

Let me add a small update: besides this Fold for Covid, there are many more activities ongoing:

Visited Green Timbers Lake again.

Pictures: The Grassland · Ducks In the Lake · A Pair of Ducks In The Lake
The Lake · Me In a Facial Mask for COVID-19 · The Lake - The Other Side

1. About Tensorspace

2. Let’s Have Some Fun

We FIRST create an empty project folder, here named WebTensorspace.

2.1 Follow Towards Data Science Blog FIRST

Let’s strictly follow this part of the Towards Data Science blog (cited from the Towards Data Science blog):

Finally we need to create an .html file which will output the result. To avoid spending time on setting up TensorFlow.js and jQuery, I encourage you to just use my template in the TensorSpace folder. The folder structure looks as follows:
  • index.html — our html file to run the visualization
  • lib/ — folder storing all the dependencies
  • data/ — folder containing the .json file with network inputs
  • model/ — folder containing the exported model

For our html file we need to first import the dependencies and write a TensorSpace script.

Now, let’s take a look at our project.

➜  WebTensorspace ls
data index.html lib model

2.1.1 lib

Three resources are referenced for downloading ALL the required libraries.

Some required libraries suggested by me:

Now, let’s take a look at what’s under folder lib.

➜  WebTensorspace ls lib
Chart.min.js signature_pad.min.js tensorspace.min.js tf.min.js.map TrackballControls.js
jquery.min.js stats.min.js tf.min.js three.min.js tween.cjs.js

2.1.2 model

Let’s just use the tf_keras_model.h5 provided by TensorSpace as our example. You may have to click on the Download button to have this tf_keras_model.h5 downloaded into the model folder.

➜  WebTensorspace ls model 
tf_keras_model.h5

2.1.2.1 tensorspacejs_converter Failed to Run

Now, let’s try to run the following command:

➜  WebTensorspace tensorspacejs_converter \                              
--input_model_from="tensorflow" \
--input_model_format="tf_keras" \
--output_layer_names="padding_1,conv_1,maxpool_1,conv_2,maxpool_2,dense_1,dense_2,softmax" \
./model/tf_keras_model.h5 \
./model/converted
Traceback (most recent call last):
File "~/.local/lib/python3.6/site-packages/pkg_resources/__init__.py", line 584, in _build_master
ws.require(__requires__)
File "~/.local/lib/python3.6/site-packages/pkg_resources/__init__.py", line 901, in require
needed = self.resolve(parse_requirements(requirements))
File "~/.local/lib/python3.6/site-packages/pkg_resources/__init__.py", line 792, in resolve
raise VersionConflict(dist, req).with_context(dependent_req)
pkg_resources.ContextualVersionConflict: (tensorboard 2.2.1 (~/.local/lib/python3.6/site-packages), Requirement.parse('tensorboard<2.2.0,>=2.1.0'), {'tensorflow'})

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "~/.local/bin/tensorspacejs_converter", line 6, in <module>
from pkg_resources import load_entry_point
File "~/.local/lib/python3.6/site-packages/pkg_resources/__init__.py", line 3258, in <module>
@_call_aside
File "~/.local/lib/python3.6/site-packages/pkg_resources/__init__.py", line 3242, in _call_aside
f(*args, **kwargs)
File "~/.local/lib/python3.6/site-packages/pkg_resources/__init__.py", line 3271, in _initialize_master_working_set
working_set = WorkingSet._build_master()
File "~/.local/lib/python3.6/site-packages/pkg_resources/__init__.py", line 586, in _build_master
return cls._build_from_requirements(__requires__)
File "~/.local/lib/python3.6/site-packages/pkg_resources/__init__.py", line 599, in _build_from_requirements
dists = ws.resolve(reqs, Environment())
File "~/.local/lib/python3.6/site-packages/pkg_resources/__init__.py", line 787, in resolve
raise DistributionNotFound(req, requirers)
pkg_resources.DistributionNotFound: The 'tensorflow-estimator<2.2.0,>=2.1.0' distribution was not found and is required by tensorflow

Clearly, we need to downgrade tensorflow-estimator (and possibly tensorboard, given the version conflict above) into the <2.2.0,>=2.1.0 range that tensorflow 2.1 expects; a sketch follows below.
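
A hedged sketch of the downgrade with pip; the pins simply follow the requirement strings shown in the tracebacks above:

```
# Pin tensorflow-estimator and tensorboard into the range TensorFlow 2.1 expects.
pip install --user "tensorflow-estimator>=2.1.0,<2.2.0" "tensorboard>=2.1.0,<2.2.0"
```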

➜  WebTensorspace pip show tensorflow_estimator
Name: tensorflow-estimator
Version: 2.1.0
Summary: TensorFlow Estimator.
Home-page: https://www.tensorflow.org/
Author: Google Inc.
Author-email: UNKNOWN
License: Apache 2.0
Location: ~/.local/lib/python3.6/site-packages
Requires:
Required-by: tensorflow

Now, we try to re-run the above tensorspacejs_converter command:

➜  WebTensorspace tensorspacejs_converter \                     
--input_model_from="tensorflow" \
--input_model_format="tf_keras" \
--output_layer_names="padding_1,conv_1,maxpool_1,conv_2,maxpool_2,dense_1,dense_2,softmax" \
./model/tf_keras_model.h5 \
./model/converted
Using TensorFlow backend.
Traceback (most recent call last):
File "~/.local/bin/tensorspacejs_converter", line 11, in <module>
load_entry_point('tensorspacejs==0.2.0', 'console_scripts', 'tensorspacejs_converter')()
File "~/.local/lib/python3.6/site-packages/tensorspacejs-0.2.0-py3.6.egg/tensorspacejs/tsp_converters.py", line 167, in main
flags.output_layer_names
File "~/.local/lib/python3.6/site-packages/tensorspacejs-0.2.0-py3.6.egg/tensorspacejs/tf/tensorflow_conversion.py", line 35, in preprocess_tensorflow_model
output_node_names
File "~/.local/lib/python3.6/site-packages/tensorspacejs-0.2.0-py3.6.egg/tensorspacejs/tf/keras_model.py", line 15, in preprocess_hdf5_combined_model
with K.get_session():
File "~/.local/lib/python3.6/site-packages/Keras-2.3.1-py3.6.egg/keras/backend/tensorflow_backend.py", line 379, in get_session
'`get_session` is not available '
RuntimeError: `get_session` is not available when using TensorFlow 2.0.

2.1.2.2 Install tfjs-converter

Please check out tfjs, enter tfjs-converter, and then install the Python package under tfjs-converter/python:

➜  python git:(master) ✗ pwd
....../tfjs/tfjs-converter/python
➜ python git:(master) ✗ python setup.py install --user
running install
running bdist_egg
running egg_info
writing tensorflowjs.egg-info/PKG-INFO
writing dependency_links to tensorflowjs.egg-info/dependency_links.txt
writing entry points to tensorflowjs.egg-info/entry_points.txt
writing requirements to tensorflowjs.egg-info/requires.txt
writing top-level names to tensorflowjs.egg-info/top_level.txt
file tensorflowjs.py (for module tensorflowjs) not found
file tensorflowjs/converters.py (for module tensorflowjs.converters) not found
package init file 'tensorflowjs/op_list/__init__.py' not found (or not a regular file)
reading manifest template 'MANIFEST.in'
writing manifest file 'tensorflowjs.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_py
file tensorflowjs.py (for module tensorflowjs) not found
file tensorflowjs/converters.py (for module tensorflowjs.converters) not found
file tensorflowjs.py (for module tensorflowjs) not found
file tensorflowjs/converters.py (for module tensorflowjs.converters) not found
creating build/bdist.linux-x86_64/egg
creating build/bdist.linux-x86_64/egg/tensorflowjs
creating build/bdist.linux-x86_64/egg/tensorflowjs/converters
copying build/lib/tensorflowjs/converters/common.py -> build/bdist.linux-x86_64/egg/tensorflowjs/converters
copying build/lib/tensorflowjs/converters/converter.py -> build/bdist.linux-x86_64/egg/tensorflowjs/converters
copying build/lib/tensorflowjs/converters/fold_batch_norms.py -> build/bdist.linux-x86_64/egg/tensorflowjs/converters
copying build/lib/tensorflowjs/converters/fuse_depthwise_conv2d.py -> build/bdist.linux-x86_64/egg/tensorflowjs/converters
copying build/lib/tensorflowjs/converters/fuse_prelu.py -> build/bdist.linux-x86_64/egg/tensorflowjs/converters
copying build/lib/tensorflowjs/converters/graph_rewrite_util.py -> build/bdist.linux-x86_64/egg/tensorflowjs/converters
copying build/lib/tensorflowjs/converters/keras_h5_conversion.py -> build/bdist.linux-x86_64/egg/tensorflowjs/converters
copying build/lib/tensorflowjs/converters/keras_tfjs_loader.py -> build/bdist.linux-x86_64/egg/tensorflowjs/converters
copying build/lib/tensorflowjs/converters/tf_saved_model_conversion_v2.py -> build/bdist.linux-x86_64/egg/tensorflowjs/converters
copying build/lib/tensorflowjs/converters/wizard.py -> build/bdist.linux-x86_64/egg/tensorflowjs/converters
copying build/lib/tensorflowjs/converters/__init__.py -> build/bdist.linux-x86_64/egg/tensorflowjs/converters
creating build/bdist.linux-x86_64/egg/tensorflowjs/op_list
copying build/lib/tensorflowjs/op_list/arithmetic.json -> build/bdist.linux-x86_64/egg/tensorflowjs/op_list
copying build/lib/tensorflowjs/op_list/basic_math.json -> build/bdist.linux-x86_64/egg/tensorflowjs/op_list
copying build/lib/tensorflowjs/op_list/control.json -> build/bdist.linux-x86_64/egg/tensorflowjs/op_list
copying build/lib/tensorflowjs/op_list/convolution.json -> build/bdist.linux-x86_64/egg/tensorflowjs/op_list
copying build/lib/tensorflowjs/op_list/creation.json -> build/bdist.linux-x86_64/egg/tensorflowjs/op_list
copying build/lib/tensorflowjs/op_list/dynamic.json -> build/bdist.linux-x86_64/egg/tensorflowjs/op_list
copying build/lib/tensorflowjs/op_list/evaluation.json -> build/bdist.linux-x86_64/egg/tensorflowjs/op_list
copying build/lib/tensorflowjs/op_list/graph.json -> build/bdist.linux-x86_64/egg/tensorflowjs/op_list
copying build/lib/tensorflowjs/op_list/image.json -> build/bdist.linux-x86_64/egg/tensorflowjs/op_list
copying build/lib/tensorflowjs/op_list/logical.json -> build/bdist.linux-x86_64/egg/tensorflowjs/op_list
copying build/lib/tensorflowjs/op_list/matrices.json -> build/bdist.linux-x86_64/egg/tensorflowjs/op_list
copying build/lib/tensorflowjs/op_list/normalization.json -> build/bdist.linux-x86_64/egg/tensorflowjs/op_list
copying build/lib/tensorflowjs/op_list/reduction.json -> build/bdist.linux-x86_64/egg/tensorflowjs/op_list
copying build/lib/tensorflowjs/op_list/slice_join.json -> build/bdist.linux-x86_64/egg/tensorflowjs/op_list
copying build/lib/tensorflowjs/op_list/spectral.json -> build/bdist.linux-x86_64/egg/tensorflowjs/op_list
copying build/lib/tensorflowjs/op_list/transformation.json -> build/bdist.linux-x86_64/egg/tensorflowjs/op_list
copying build/lib/tensorflowjs/quantization.py -> build/bdist.linux-x86_64/egg/tensorflowjs
copying build/lib/tensorflowjs/read_weights.py -> build/bdist.linux-x86_64/egg/tensorflowjs
copying build/lib/tensorflowjs/resource_loader.py -> build/bdist.linux-x86_64/egg/tensorflowjs
copying build/lib/tensorflowjs/version.py -> build/bdist.linux-x86_64/egg/tensorflowjs
copying build/lib/tensorflowjs/write_weights.py -> build/bdist.linux-x86_64/egg/tensorflowjs
copying build/lib/tensorflowjs/__init__.py -> build/bdist.linux-x86_64/egg/tensorflowjs
byte-compiling build/bdist.linux-x86_64/egg/tensorflowjs/converters/common.py to common.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorflowjs/converters/converter.py to converter.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorflowjs/converters/fold_batch_norms.py to fold_batch_norms.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorflowjs/converters/fuse_depthwise_conv2d.py to fuse_depthwise_conv2d.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorflowjs/converters/fuse_prelu.py to fuse_prelu.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorflowjs/converters/graph_rewrite_util.py to graph_rewrite_util.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorflowjs/converters/keras_h5_conversion.py to keras_h5_conversion.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorflowjs/converters/keras_tfjs_loader.py to keras_tfjs_loader.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorflowjs/converters/tf_saved_model_conversion_v2.py to tf_saved_model_conversion_v2.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorflowjs/converters/wizard.py to wizard.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorflowjs/converters/__init__.py to __init__.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorflowjs/quantization.py to quantization.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorflowjs/read_weights.py to read_weights.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorflowjs/resource_loader.py to resource_loader.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorflowjs/version.py to version.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorflowjs/write_weights.py to write_weights.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorflowjs/__init__.py to __init__.cpython-36.pyc
creating build/bdist.linux-x86_64/egg/EGG-INFO
copying tensorflowjs.egg-info/PKG-INFO -> build/bdist.linux-x86_64/egg/EGG-INFO
copying tensorflowjs.egg-info/SOURCES.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying tensorflowjs.egg-info/dependency_links.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying tensorflowjs.egg-info/entry_points.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying tensorflowjs.egg-info/requires.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying tensorflowjs.egg-info/top_level.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
zip_safe flag not set; analyzing archive contents...
tensorflowjs.__pycache__.resource_loader.cpython-36: module references __file__
creating 'dist/tensorflowjs-1.7.0-py3.6.egg' and adding 'build/bdist.linux-x86_64/egg' to it
removing 'build/bdist.linux-x86_64/egg' (and everything under it)
Processing tensorflowjs-1.7.0-py3.6.egg
creating ~/.local/lib/python3.6/site-packages/tensorflowjs-1.7.0-py3.6.egg
Extracting tensorflowjs-1.7.0-py3.6.egg to ~/.local/lib/python3.6/site-packages
Adding tensorflowjs 1.7.0 to easy-install.pth file
Installing tensorflowjs_converter script to ~/.local/bin
Installing tensorflowjs_wizard script to ~/.local/bin

Installed ~/.local/lib/python3.6/site-packages/tensorflowjs-1.7.0-py3.6.egg
Processing dependencies for tensorflowjs==1.7.0
Searching for PyInquirer==1.0.3
Best match: PyInquirer 1.0.3
Adding PyInquirer 1.0.3 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Not found: prompt_toolkit==1.0.14
Not found: Pygments>=2.2.0
Not found: regex>=2016.11.21
Searching for gast==0.3.3
Best match: gast 0.3.3
Adding gast 0.3.3 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for tensorflow-hub==0.8.0
Best match: tensorflow-hub 0.8.0
Adding tensorflow-hub 0.8.0 to easy-install.pth file
Installing make_image_classifier script to ~/.local/bin
Installing make_nearest_neighbour_index script to ~/.local/bin

Using ~/.local/lib/python3.6/site-packages
Searching for tensorflow==2.1.0
Best match: tensorflow 2.1.0
Adding tensorflow 2.1.0 to easy-install.pth file
Installing estimator_ckpt_converter script to ~/.local/bin
Installing saved_model_cli script to ~/.local/bin
Installing tensorboard script to ~/.local/bin
Installing tf_upgrade_v2 script to ~/.local/bin
Installing tflite_convert script to ~/.local/bin
Installing toco script to ~/.local/bin
Installing toco_from_protos script to ~/.local/bin

Using ~/.local/lib/python3.6/site-packages
Searching for six==1.14.0
Best match: six 1.14.0
Adding six 1.14.0 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for numpy==1.18.2
Best match: numpy 1.18.2
Adding numpy 1.18.2 to easy-install.pth file
Installing f2py script to ~/.local/bin
Installing f2py3 script to ~/.local/bin
Installing f2py3.6 script to ~/.local/bin

Using ~/.local/lib/python3.6/site-packages
Searching for h5py==2.10.0
Best match: h5py 2.10.0
Adding h5py 2.10.0 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for prompt-toolkit==1.0.14
Best match: prompt-toolkit 1.0.14
Processing prompt_toolkit-1.0.14-py3.6.egg
prompt-toolkit 1.0.14 is already the active version in easy-install.pth

Using ~/.local/lib/python3.6/site-packages/prompt_toolkit-1.0.14-py3.6.egg
Searching for regex==2020.4.4
Best match: regex 2020.4.4
Adding regex 2020.4.4 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for Pygments==2.6.1
Best match: Pygments 2.6.1
Adding Pygments 2.6.1 to easy-install.pth file
Installing pygmentize script to ~/.local/bin

Using ~/.local/lib/python3.6/site-packages
Searching for protobuf==3.11.3
Best match: protobuf 3.11.3
Adding protobuf 3.11.3 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for grpcio==1.28.1
Best match: grpcio 1.28.1
Adding grpcio 1.28.1 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for opt-einsum==3.2.1
Best match: opt-einsum 3.2.1
Adding opt-einsum 3.2.1 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for Keras-Preprocessing==1.1.0
Best match: Keras-Preprocessing 1.1.0
Adding Keras-Preprocessing 1.1.0 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for wrapt==1.12.1
Best match: wrapt 1.12.1
Adding wrapt 1.12.1 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for scipy==1.4.1
Best match: scipy 1.4.1
Adding scipy 1.4.1 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for tensorflow-estimator==2.1.0
Best match: tensorflow-estimator 2.1.0
Adding tensorflow-estimator 2.1.0 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for tensorboard==2.1.1
Best match: tensorboard 2.1.1
Processing tensorboard-2.1.1-py3.6.egg
tensorboard 2.1.1 is already the active version in easy-install.pth
Installing tensorboard script to ~/.local/bin

Using ~/.local/lib/python3.6/site-packages/tensorboard-2.1.1-py3.6.egg
Searching for astunparse==1.6.3
Best match: astunparse 1.6.3
Adding astunparse 1.6.3 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for google-pasta==0.2.0
Best match: google-pasta 0.2.0
Adding google-pasta 0.2.0 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for termcolor==1.1.0
Best match: termcolor 1.1.0
Adding termcolor 1.1.0 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for absl-py==0.9.0
Best match: absl-py 0.9.0
Adding absl-py 0.9.0 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for wheel==0.34.2
Best match: wheel 0.34.2
Adding wheel 0.34.2 to easy-install.pth file
Installing wheel script to ~/.local/bin

Using ~/.local/lib/python3.6/site-packages
Searching for wcwidth==0.1.9
Best match: wcwidth 0.1.9
Adding wcwidth 0.1.9 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for setuptools==46.1.3
Best match: setuptools 46.1.3
Adding setuptools 46.1.3 to easy-install.pth file
Installing easy_install script to ~/.local/bin
Installing easy_install-3.8 script to ~/.local/bin

Using ~/.local/lib/python3.6/site-packages
Searching for Werkzeug==1.0.1
Best match: Werkzeug 1.0.1
Adding Werkzeug 1.0.1 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for requests==2.23.0
Best match: requests 2.23.0
Adding requests 2.23.0 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for Markdown==3.2.1
Best match: Markdown 3.2.1
Adding Markdown 3.2.1 to easy-install.pth file
Installing markdown_py script to ~/.local/bin

Using ~/.local/lib/python3.6/site-packages
Searching for google-auth==1.14.0
Best match: google-auth 1.14.0
Adding google-auth 1.14.0 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for google-auth-oauthlib==0.4.1
Best match: google-auth-oauthlib 0.4.1
Adding google-auth-oauthlib 0.4.1 to easy-install.pth file
Installing google-oauthlib-tool script to ~/.local/bin

Using ~/.local/lib/python3.6/site-packages
Searching for urllib3==1.25.9
Best match: urllib3 1.25.9
Adding urllib3 1.25.9 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for idna==2.9
Best match: idna 2.9
Adding idna 2.9 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for certifi==2020.4.5.1
Best match: certifi 2020.4.5.1
Adding certifi 2020.4.5.1 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for chardet==3.0.4
Best match: chardet 3.0.4
Adding chardet 3.0.4 to easy-install.pth file
Installing chardetect script to ~/.local/bin

Using /usr/lib/python3/dist-packages
Searching for pyasn1-modules==0.2.8
Best match: pyasn1-modules 0.2.8
Adding pyasn1-modules 0.2.8 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for cachetools==4.1.0
Best match: cachetools 4.1.0
Adding cachetools 4.1.0 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for rsa==4.0
Best match: rsa 4.0
Adding rsa 4.0 to easy-install.pth file
Installing pyrsa-decrypt script to ~/.local/bin
Installing pyrsa-encrypt script to ~/.local/bin
Installing pyrsa-keygen script to ~/.local/bin
Installing pyrsa-priv2pub script to ~/.local/bin
Installing pyrsa-sign script to ~/.local/bin
Installing pyrsa-verify script to ~/.local/bin

Using ~/.local/lib/python3.6/site-packages
Searching for requests-oauthlib==1.3.0
Best match: requests-oauthlib 1.3.0
Adding requests-oauthlib 1.3.0 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for pyasn1==0.4.8
Best match: pyasn1 0.4.8
Adding pyasn1 0.4.8 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for oauthlib==3.1.0
Best match: oauthlib 3.1.0
Adding oauthlib 3.1.0 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Finished processing dependencies for tensorflowjs==1.7.0
➜ python git:(master) ✗

2.1.2.3 Install tensorspace-converter

Please check out my modified tensorspace-converter and install its Python package. Please also keep an eye on my PR.

➜  tensorspace-converter git:(master) python setup.py install --user
running install
running bdist_egg
running egg_info
writing tensorspacejs.egg-info/PKG-INFO
writing dependency_links to tensorspacejs.egg-info/dependency_links.txt
writing entry points to tensorspacejs.egg-info/entry_points.txt
writing requirements to tensorspacejs.egg-info/requires.txt
writing top-level names to tensorspacejs.egg-info/top_level.txt
reading manifest template 'MANIFEST.in'
warning: no previously-included files matching '*' found under directory 'tensorspacejs/tfjs/node_modules'
warning: no previously-included files matching '*' found under directory 'tensorspacejs/tf/pb2json/node_modules'
writing manifest file 'tensorspacejs.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_py
creating build/bdist.linux-x86_64/egg
creating build/bdist.linux-x86_64/egg/tensorspacejs
copying build/lib/tensorspacejs/install.py -> build/bdist.linux-x86_64/egg/tensorspacejs
creating build/bdist.linux-x86_64/egg/tensorspacejs/krs
copying build/lib/tensorspacejs/krs/keras_conversion.py -> build/bdist.linux-x86_64/egg/tensorspacejs/krs
copying build/lib/tensorspacejs/krs/keras_model.py -> build/bdist.linux-x86_64/egg/tensorspacejs/krs
copying build/lib/tensorspacejs/krs/__init__.py -> build/bdist.linux-x86_64/egg/tensorspacejs/krs
creating build/bdist.linux-x86_64/egg/tensorspacejs/tf
copying build/lib/tensorspacejs/tf/frozen_model.py -> build/bdist.linux-x86_64/egg/tensorspacejs/tf
copying build/lib/tensorspacejs/tf/keras_model.py -> build/bdist.linux-x86_64/egg/tensorspacejs/tf
creating build/bdist.linux-x86_64/egg/tensorspacejs/tf/pb2json
copying build/lib/tensorspacejs/tf/pb2json/package.json -> build/bdist.linux-x86_64/egg/tensorspacejs/tf/pb2json
copying build/lib/tensorspacejs/tf/pb2json/pb2json_conversion.py -> build/bdist.linux-x86_64/egg/tensorspacejs/tf/pb2json
copying build/lib/tensorspacejs/tf/pb2json/README.md -> build/bdist.linux-x86_64/egg/tensorspacejs/tf/pb2json
creating build/bdist.linux-x86_64/egg/tensorspacejs/tf/pb2json/tools
copying build/lib/tensorspacejs/tf/pb2json/tools/compiled_api.js -> build/bdist.linux-x86_64/egg/tensorspacejs/tf/pb2json/tools
copying build/lib/tensorspacejs/tf/pb2json/tools/pb2json_converter.ts -> build/bdist.linux-x86_64/egg/tensorspacejs/tf/pb2json/tools
copying build/lib/tensorspacejs/tf/pb2json/__init__.py -> build/bdist.linux-x86_64/egg/tensorspacejs/tf/pb2json
copying build/lib/tensorspacejs/tf/saved_model.py -> build/bdist.linux-x86_64/egg/tensorspacejs/tf
copying build/lib/tensorspacejs/tf/tensorflow_conversion.py -> build/bdist.linux-x86_64/egg/tensorspacejs/tf
copying build/lib/tensorspacejs/tf/__init__.py -> build/bdist.linux-x86_64/egg/tensorspacejs/tf
creating build/bdist.linux-x86_64/egg/tensorspacejs/tfjs
creating build/bdist.linux-x86_64/egg/tensorspacejs/tfjs/app
copying build/lib/tensorspacejs/tfjs/app/Converter.js -> build/bdist.linux-x86_64/egg/tensorspacejs/tfjs/app
copying build/lib/tensorspacejs/tfjs/app/Summary.js -> build/bdist.linux-x86_64/egg/tensorspacejs/tfjs/app
copying build/lib/tensorspacejs/tfjs/main.js -> build/bdist.linux-x86_64/egg/tensorspacejs/tfjs
copying build/lib/tensorspacejs/tfjs/package.json -> build/bdist.linux-x86_64/egg/tensorspacejs/tfjs
copying build/lib/tensorspacejs/tfjs/tfjs_conversion.py -> build/bdist.linux-x86_64/egg/tensorspacejs/tfjs
creating build/bdist.linux-x86_64/egg/tensorspacejs/tfjs/utils
copying build/lib/tensorspacejs/tfjs/utils/Utils.js -> build/bdist.linux-x86_64/egg/tensorspacejs/tfjs/utils
creating build/bdist.linux-x86_64/egg/tensorspacejs/tfjs/wrapper
copying build/lib/tensorspacejs/tfjs/wrapper/ModelWrapper.js -> build/bdist.linux-x86_64/egg/tensorspacejs/tfjs/wrapper
copying build/lib/tensorspacejs/tfjs/__init__.py -> build/bdist.linux-x86_64/egg/tensorspacejs/tfjs
copying build/lib/tensorspacejs/tsp_converters.py -> build/bdist.linux-x86_64/egg/tensorspacejs
creating build/bdist.linux-x86_64/egg/tensorspacejs/utility
copying build/lib/tensorspacejs/utility/file_utility.py -> build/bdist.linux-x86_64/egg/tensorspacejs/utility
copying build/lib/tensorspacejs/utility/__init__.py -> build/bdist.linux-x86_64/egg/tensorspacejs/utility
copying build/lib/tensorspacejs/version.py -> build/bdist.linux-x86_64/egg/tensorspacejs
copying build/lib/tensorspacejs/__init__.py -> build/bdist.linux-x86_64/egg/tensorspacejs
creating build/bdist.linux-x86_64/egg/tensorspacejs/__pycache__
copying build/lib/tensorspacejs/__pycache__/version.cpython-36.pyc -> build/bdist.linux-x86_64/egg/tensorspacejs/__pycache__
copying build/lib/tensorspacejs/__pycache__/__init__.cpython-36.pyc -> build/bdist.linux-x86_64/egg/tensorspacejs/__pycache__
byte-compiling build/bdist.linux-x86_64/egg/tensorspacejs/install.py to install.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorspacejs/krs/keras_conversion.py to keras_conversion.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorspacejs/krs/keras_model.py to keras_model.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorspacejs/krs/__init__.py to __init__.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorspacejs/tf/frozen_model.py to frozen_model.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorspacejs/tf/keras_model.py to keras_model.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorspacejs/tf/pb2json/pb2json_conversion.py to pb2json_conversion.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorspacejs/tf/pb2json/__init__.py to __init__.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorspacejs/tf/saved_model.py to saved_model.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorspacejs/tf/tensorflow_conversion.py to tensorflow_conversion.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorspacejs/tf/__init__.py to __init__.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorspacejs/tfjs/tfjs_conversion.py to tfjs_conversion.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorspacejs/tfjs/__init__.py to __init__.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorspacejs/tsp_converters.py to tsp_converters.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorspacejs/utility/file_utility.py to file_utility.cpython-36.pyc
byte-compiling build/bdist.linux-x86_64/egg/tensorspacejs/utility/__init__.py to __init__.cpython-36.pyc
creating build/bdist.linux-x86_64/egg/EGG-INFO
copying tensorspacejs.egg-info/PKG-INFO -> build/bdist.linux-x86_64/egg/EGG-INFO
copying tensorspacejs.egg-info/SOURCES.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying tensorspacejs.egg-info/dependency_links.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying tensorspacejs.egg-info/entry_points.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying tensorspacejs.egg-info/requires.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying tensorspacejs.egg-info/top_level.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
zip_safe flag not set; analyzing archive contents...
tensorspacejs.__pycache__.install.cpython-36: module references __file__
tensorspacejs.__pycache__.tsp_converters.cpython-36: module references __file__
tensorspacejs.tf.pb2json.__pycache__.pb2json_conversion.cpython-36: module references __file__
tensorspacejs.tfjs.__pycache__.tfjs_conversion.cpython-36: module references __file__
creating 'dist/tensorspacejs-0.6.1-py3.6.egg' and adding 'build/bdist.linux-x86_64/egg' to it
removing 'build/bdist.linux-x86_64/egg' (and everything under it)
Processing tensorspacejs-0.6.1-py3.6.egg
creating ~/.local/lib/python3.6/site-packages/tensorspacejs-0.6.1-py3.6.egg
Extracting tensorspacejs-0.6.1-py3.6.egg to ~/.local/lib/python3.6/site-packages
Adding tensorspacejs 0.6.1 to easy-install.pth file
Installing tensorspacejs_converter script to ~/.local/bin

Installed ~/.local/lib/python3.6/site-packages/tensorspacejs-0.6.1-py3.6.egg
Processing dependencies for tensorspacejs==0.6.1
Searching for tensorflow==2.1.0
Best match: tensorflow 2.1.0
Adding tensorflow 2.1.0 to easy-install.pth file
Installing estimator_ckpt_converter script to ~/.local/bin
Installing saved_model_cli script to ~/.local/bin
Installing tensorboard script to ~/.local/bin
Installing tf_upgrade_v2 script to ~/.local/bin
Installing tflite_convert script to ~/.local/bin
Installing toco script to ~/.local/bin
Installing toco_from_protos script to ~/.local/bin

Using ~/.local/lib/python3.6/site-packages
Searching for Keras==2.3.1
Best match: Keras 2.3.1
Processing Keras-2.3.1-py3.6.egg
Keras 2.3.1 is already the active version in easy-install.pth

Using ~/.local/lib/python3.6/site-packages/Keras-2.3.1-py3.6.egg
Searching for tensorflowjs==1.7.2
Best match: tensorflowjs 1.7.2
Processing tensorflowjs-1.7.2-py3.6.egg
tensorflowjs 1.7.2 is already the active version in easy-install.pth
Installing tensorflowjs_converter script to ~/.local/bin

Using ~/.local/lib/python3.6/site-packages/tensorflowjs-1.7.2-py3.6.egg
Searching for tensorboard==2.1.1
Best match: tensorboard 2.1.1
Processing tensorboard-2.1.1-py3.6.egg
tensorboard 2.1.1 is already the active version in easy-install.pth
Installing tensorboard script to ~/.local/bin

Using ~/.local/lib/python3.6/site-packages/tensorboard-2.1.1-py3.6.egg
Searching for Keras-Preprocessing==1.1.0
Best match: Keras-Preprocessing 1.1.0
Adding Keras-Preprocessing 1.1.0 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for grpcio==1.28.1
Best match: grpcio 1.28.1
Adding grpcio 1.28.1 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for termcolor==1.1.0
Best match: termcolor 1.1.0
Adding termcolor 1.1.0 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for six==1.14.0
Best match: six 1.14.0
Adding six 1.14.0 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for protobuf==3.11.3
Best match: protobuf 3.11.3
Adding protobuf 3.11.3 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for wrapt==1.12.1
Best match: wrapt 1.12.1
Adding wrapt 1.12.1 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for opt-einsum==3.2.1
Best match: opt-einsum 3.2.1
Adding opt-einsum 3.2.1 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for google-pasta==0.2.0
Best match: google-pasta 0.2.0
Adding google-pasta 0.2.0 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for scipy==1.4.1
Best match: scipy 1.4.1
Adding scipy 1.4.1 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for h5py==2.10.0
Best match: h5py 2.10.0
Adding h5py 2.10.0 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for absl-py==0.9.0
Best match: absl-py 0.9.0
Adding absl-py 0.9.0 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for wheel==0.34.2
Best match: wheel 0.34.2
Adding wheel 0.34.2 to easy-install.pth file
Installing wheel script to ~/.local/bin

Using ~/.local/lib/python3.6/site-packages
Searching for gast==0.3.3
Best match: gast 0.3.3
Adding gast 0.3.3 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for numpy==1.18.2
Best match: numpy 1.18.2
Adding numpy 1.18.2 to easy-install.pth file
Installing f2py script to ~/.local/bin
Installing f2py3 script to ~/.local/bin
Installing f2py3.6 script to ~/.local/bin

Using ~/.local/lib/python3.6/site-packages
Searching for tensorflow-estimator==2.1.0
Best match: tensorflow-estimator 2.1.0
Adding tensorflow-estimator 2.1.0 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for astunparse==1.6.3
Best match: astunparse 1.6.3
Adding astunparse 1.6.3 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for Keras-Applications==1.0.8
Best match: Keras-Applications 1.0.8
Adding Keras-Applications 1.0.8 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for PyYAML==5.3.1
Best match: PyYAML 5.3.1
Adding PyYAML 5.3.1 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for tensorflow-hub==0.8.0
Best match: tensorflow-hub 0.8.0
Adding tensorflow-hub 0.8.0 to easy-install.pth file
Installing make_image_classifier script to ~/.local/bin
Installing make_nearest_neighbour_index script to ~/.local/bin

Using ~/.local/lib/python3.6/site-packages
Searching for Werkzeug==1.0.1
Best match: Werkzeug 1.0.1
Adding Werkzeug 1.0.1 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for setuptools==46.1.3
Best match: setuptools 46.1.3
Adding setuptools 46.1.3 to easy-install.pth file
Installing easy_install script to ~/.local/bin
Installing easy_install-3.8 script to ~/.local/bin

Using ~/.local/lib/python3.6/site-packages
Searching for requests==2.23.0
Best match: requests 2.23.0
Adding requests 2.23.0 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for Markdown==3.2.1
Best match: Markdown 3.2.1
Adding Markdown 3.2.1 to easy-install.pth file
Installing markdown_py script to ~/.local/bin

Using ~/.local/lib/python3.6/site-packages
Searching for google-auth==1.14.0
Best match: google-auth 1.14.0
Adding google-auth 1.14.0 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for google-auth-oauthlib==0.4.1
Best match: google-auth-oauthlib 0.4.1
Adding google-auth-oauthlib 0.4.1 to easy-install.pth file
Installing google-oauthlib-tool script to ~/.local/bin

Using ~/.local/lib/python3.6/site-packages
Searching for chardet==3.0.4
Best match: chardet 3.0.4
Adding chardet 3.0.4 to easy-install.pth file
Installing chardetect script to ~/.local/bin

Using /usr/lib/python3/dist-packages
Searching for certifi==2020.4.5.1
Best match: certifi 2020.4.5.1
Adding certifi 2020.4.5.1 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for idna==2.9
Best match: idna 2.9
Adding idna 2.9 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for urllib3==1.25.9
Best match: urllib3 1.25.9
Adding urllib3 1.25.9 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for cachetools==4.1.0
Best match: cachetools 4.1.0
Adding cachetools 4.1.0 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for rsa==4.0
Best match: rsa 4.0
Adding rsa 4.0 to easy-install.pth file
Installing pyrsa-decrypt script to ~/.local/bin
Installing pyrsa-encrypt script to ~/.local/bin
Installing pyrsa-keygen script to ~/.local/bin
Installing pyrsa-priv2pub script to ~/.local/bin
Installing pyrsa-sign script to ~/.local/bin
Installing pyrsa-verify script to ~/.local/bin

Using ~/.local/lib/python3.6/site-packages
Searching for pyasn1-modules==0.2.8
Best match: pyasn1-modules 0.2.8
Adding pyasn1-modules 0.2.8 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for requests-oauthlib==1.3.0
Best match: requests-oauthlib 1.3.0
Adding requests-oauthlib 1.3.0 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for pyasn1==0.4.8
Best match: pyasn1 0.4.8
Adding pyasn1 0.4.8 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Searching for oauthlib==3.1.0
Best match: oauthlib 3.1.0
Adding oauthlib 3.1.0 to easy-install.pth file

Using ~/.local/lib/python3.6/site-packages
Finished processing dependencies for tensorspacejs==0.6.1
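
Since quite a few overlapping eggs now live under ~/.local, it may be worth double-checking which versions setuptools actually resolved. A minimal check using the standard pkg_resources module (package names are simply the ones installed above):

# Print the versions that are actually active in this Python environment.
import pkg_resources

for name in ("tensorflow", "tensorflowjs", "tensorspacejs", "Keras"):
    try:
        print(name, pkg_resources.get_distribution(name).version)
    except pkg_resources.DistributionNotFound:
        print(name, "NOT found")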

2.1.2.4 Try tensorspacejs_converter Again

➜  WebTensorspace tensorspacejs_converter \
--input_model_from="tensorflow" \
--input_model_format="tf_keras" \
--output_layer_names="padding_1,conv_1,maxpool_1,conv_2,maxpool_2,dense_1,dense_2,softmax" \
./model/tf_keras_model.h5 \
./model/convertedModel
Using TensorFlow backend.
2020-04-17 04:45:23.473683: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1
2020-04-17 04:45:23.477188: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-04-17 04:45:23.477770: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1558] Found device 0 with properties:
pciBusID: 0000:01:00.0 name: GeForce GTX 980M computeCapability: 5.2
coreClock: 1.1265GHz coreCount: 12 deviceMemorySize: 3.94GiB deviceMemoryBandwidth: 149.31GiB/s
2020-04-17 04:45:23.478004: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.2
2020-04-17 04:45:23.479508: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2020-04-17 04:45:23.480978: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2020-04-17 04:45:23.481245: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2020-04-17 04:45:23.482737: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2020-04-17 04:45:23.483660: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2020-04-17 04:45:23.487100: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2020-04-17 04:45:23.487273: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-04-17 04:45:23.487902: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-04-17 04:45:23.488310: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1700] Adding visible gpu devices: 0
2020-04-17 04:45:23.488621: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: SSE3 SSE4.1 SSE4.2 AVX AVX2 FMA
2020-04-17 04:45:23.511713: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 2599990000 Hz
2020-04-17 04:45:23.512333: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x5bc9f30 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2020-04-17 04:45:23.512354: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
2020-04-17 04:45:23.553407: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-04-17 04:45:23.553764: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x5c320e0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
2020-04-17 04:45:23.553782: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): GeForce GTX 980M, Compute Capability 5.2
2020-04-17 04:45:23.553969: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-04-17 04:45:23.554208: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1558] Found device 0 with properties:
pciBusID: 0000:01:00.0 name: GeForce GTX 980M computeCapability: 5.2
coreClock: 1.1265GHz coreCount: 12 deviceMemorySize: 3.94GiB deviceMemoryBandwidth: 149.31GiB/s
2020-04-17 04:45:23.554271: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.2
2020-04-17 04:45:23.554302: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2020-04-17 04:45:23.554315: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2020-04-17 04:45:23.554344: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2020-04-17 04:45:23.554389: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2020-04-17 04:45:23.554413: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2020-04-17 04:45:23.554442: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2020-04-17 04:45:23.554519: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-04-17 04:45:23.554755: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-04-17 04:45:23.554956: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1700] Adding visible gpu devices: 0
2020-04-17 04:45:23.555015: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.2
2020-04-17 04:45:23.555794: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1099] Device interconnect StreamExecutor with strength 1 edge matrix:
2020-04-17 04:45:23.555806: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1105] 0
2020-04-17 04:45:23.555830: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1118] 0: N
2020-04-17 04:45:23.555939: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-04-17 04:45:23.556211: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-04-17 04:45:23.556465: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1244] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 133 MB memory) -> physical GPU (device: 0, name: GeForce GTX 980M, pci bus id: 0000:01:00.0, compute capability: 5.2)
Preprocessing hdf5 combined model...
Loading .h5 model into memory...
WARNING:tensorflow:From /home/longervision/.local/lib/python3.6/site-packages/tensorflow/python/ops/resource_variable_ops.py:1658: calling BaseResourceVariable.__init__ (from tensorflow.python.ops.resource_variable_ops) with constraint is deprecated and will be removed in a future version.
Instructions for updating:
If using Keras pass *_constraint arguments to layers.
Generating multi-output encapsulated model...
Saving temp multi-output .h5 model...
Converting .h5 to web friendly format...
Deleting temp .h5 model...
Mission Complete!!!
➜ WebTensorspace ls model/convertedModel
group1-shard1of1.bin model.json
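
The converter emits a tfjs-style model.json plus one weight shard. To make sure the multi-output encapsulation really happened, the JSON can be inspected directly; a generic sketch (not part of the original workflow), with the path taken from the ls output above:

# Peek into the converted model.json to verify its structure (generic check).
import json

with open("model/convertedModel/model.json") as f:
    model_json = json.load(f)

# A tfjs layers-format file normally carries 'modelTopology' and 'weightsManifest'.
print(list(model_json.keys()))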

2.1.3 index.html

2.1.3.1 helloworld-empty

Let’s copy and paste TensorSpace’s example helloworld-empty.html and make some trivial modifications for 3D visualization, as follows:

<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>AI Model Visualization By Tensorspace - 3D Interactive</title>
<meta name="author" content="Longer Vision / https://longervision.github.io/2020/04/17/AI/Visualization/model-visualization-tensorspace-3D-interactive/">

<script src="./lib/three.min.js"></script>
<script src="./lib/tween.cjs.js"></script>
<script src="./lib/tf.min.js"></script>
<script src="./lib/TrackballControls.js"></script>
<script src="./lib/tensorspace.min.js"></script>
<script src="./lib/jquery.min.js"></script>

<style>

html, body {
margin: 0;
padding: 0;
width: 100%;
height: 100%;
}

#container {
width: 100%;
height: 100%;
}

</style>

</head>

<body>

<div id="container"></div>

<script>

$(function() {

let modelContainer = document.getElementById( "container" );
let model = new TSP.models.Sequential( modelContainer );

model.add( new TSP.layers.GreyscaleInput( { shape: [ 28, 28 ] } ) );
model.add( new TSP.layers.Padding2d( { padding: [ 2, 2 ] } ) );
model.add( new TSP.layers.Conv2d( { kernelSize: 5, filters: 6, strides: 1 } ) );
model.add( new TSP.layers.Pooling2d( { poolSize: [ 2, 2 ], strides: [ 2, 2 ] } ) );
model.add( new TSP.layers.Conv2d( { kernelSize: 5, filters: 16, strides: 1 } ) );
model.add( new TSP.layers.Pooling2d( { poolSize: [ 2, 2 ], strides: [ 2, 2 ] } ) );
model.add( new TSP.layers.Dense( { units: 120 } ) );
model.add( new TSP.layers.Dense( { units: 84 } ) );
model.add( new TSP.layers.Output1d( {
units: 10,
outputs: [ "0", "1", "2", "3", "4", "5", "6", "7", "8", "9" ]
} ) );

model.init( function() {
console.log( "Hello World from TensorSpace!" );
} );

} );

</script>

</body>
</html>
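
Note that the browser has to fetch the converted model over HTTP, so the page should be served by a local web server rather than opened via file://. A tiny Python stand-in for the usual one-liner (port 8000 is my arbitrary choice), run from the WebTensorspace directory:

# Serve the project directory so the browser can fetch index.html and ./model/convertedModel/*.
from http.server import HTTPServer, SimpleHTTPRequestHandler

HTTPServer(("0.0.0.0", 8000), SimpleHTTPRequestHandler).serve_forever()

Then open http://localhost:8000/index.html in a browser.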

First snow in 2020. Actually, it is ALSO the FIRST snow of the winter from 2019 to 2020.

First Snow 1 First Snow 2 First Snow 3
First Snow 1 First Snow 2 First Snow 3

Both my son and the Chinese New Year are coming. Let’s switch into celebration mode. Today, I’m going to make hotpot.

Hotpot 1 Hotpot 2 Hotpot 3
Hotpot 1 Hotpot 2 Hotpot 3

It looks like, almost overnight, everybody is doing edge computing. Today, we’re going to have some fun with Google Coral.

1. Google Coral USB Accelerator

Image cited from the Coral official website.

Google Coral Accelerator

Trying out the Google Coral USB Accelerator is comparatively simple. The ONLY thing to do is to follow Google Doc - Get started with the USB Accelerator. Anyway, let’s test it out with the following commands.

Make sure we are able to list the device.

➜  classification git:(master) ✗ lsusb
Bus 002 Device 003: ID 1a6e:089a Global Unichip Corp.
...

We then check out Google Coral Edge TPU and test the example classify_image.py.

➜  edgetpu git:(master) pwd
/opt/google/edgetpu
➜ edgetpu git:(master) python3 examples/classify_image.py \
--model test_data/mobilenet_v2_1.0_224_inat_bird_quant_edgetpu.tflite \
--label test_data/inat_bird_labels.txt \
--image test_data/parrot.jpg
---------------------------
Ara macao (Scarlet Macaw)
Score : 0.6796875
---------------------------
Platycercus elegans (Crimson Rosella)
Score : 0.12109375

BTW, I’m going to discuss

sooner or later. Just keep an eye on my blog.

2. Google Coral Dev Board

In the following, we’re going to discuss the Google Coral Dev Board in more detail. Image cited from the Coral official website.

Google Coral Devboard

2.1 Mendel Installation

2.1.1 Mendel Linux Preparation

Google Coral’s Mendel Linux can be downloaded from https://coral.ai/software/. In our case, we are going to try Mendel Linux 4.0.

2.1.2 Connect Dev Board Via Micro-USB Serial Port

On the host, we should be able to see:

➜  mendel-enterprise-day-13 lsusb
......
Bus 001 Device 020: ID 10c4:ea70 Cygnal Integrated Products, Inc. CP210x UART Bridge
......
➜ mendel-enterprise-day-13 dmesg | grep ttyUSB
[45021.091322] usb 1-8: cp210x converter now attached to ttyUSB0
[45021.092681] usb 1-8: cp210x converter now attached to ttyUSB1
➜ mendel-enterprise-day-13 ll /dev/ttyUSB*
crwxrwxrwx 1 root dialout 188, 0 Apr 6 21:33 /dev/ttyUSB0
crwxrwxrwx 1 root dialout 188, 1 Apr 6 21:25 /dev/ttyUSB1
➜ mendel-enterprise-day-13 screen /dev/ttyUSB0 115200
.....
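
If screen is not available on the host, the same serial console can be read from Python with pyserial (an assumption on my part: pip install pyserial on the host; 115200 baud as in the screen command above):

# Minimal read-only serial console (assumes pyserial is installed on the host).
import serial

with serial.Serial("/dev/ttyUSB0", baudrate=115200, timeout=1) as ser:
    while True:
        chunk = ser.readline()
        if chunk:
            print(chunk.decode(errors="replace"), end="")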

For now, all you see is a black screen. After connecting the USB Type-C power cable, you should be able to see:

......
[ 2871.285085] NOHZ: local_softirq_pending 08
[ 2871.376306] NOHZ: local_softirq_pending 08
[ 2872.117716] NOHZ: local_softirq_pending 08
[ 2874.283909] NOHZ: local_softirq_pending 08
[ 2875.286859] NOHZ: local_softirq_pending 08

U-Boot SPL 2017.03.3 (Nov 08 2019 - 22:29:30)
power_bd71837_init
pmic debug: name=BD71837
Board id: 6
check ddr4_pmu_train_imem code
check ddr4_pmu_train_imem code pass
check ddr4_pmu_train_dmem code
check ddr4_pmu_train_dmem code pass
Training PASS
Training PASS
check ddr4_pmu_train_imem code
check ddr4_pmu_train_imem code pass
check ddr4_pmu_train_dmem code
check ddr4_pmu_train_dmem code pass
Training PASS
Normal Boot
Trying to boot from MMC1
hdr read sector 300, count=1


U-Boot 2017.03.3 (Nov 08 2019 - 22:29:30 +0000)

CPU: Freescale i.MX8MQ rev2.0 1500 MHz (running at 1000 MHz)
CPU: Commercial temperature grade (0C to 95C) at 64C
Reset cause: POR
Model: Freescale i.MX8MQ Phanbell
DRAM: 1 GiB
Board id: 6
Baseboard id: 1
MMC: FSL_SDHC: 0, FSL_SDHC: 1
*** Warning - bad CRC, using default environment

In: serial
Out: serial
Err: serial

BuildInfo:
- ATF
- U-Boot 2017.03.3

flash target is MMC:0
Net:
Warning: ethernet@30be0000 using MAC address from ROM
eth0: ethernet@30be0000
Fastboot: Normal
Hit any key to stop autoboot: 0
u-boot=> [A

That is what is shown in the screen session attached to the Google Coral Dev Board. We now need to enter fastboot 0 at the u-boot=> prompt. After connecting the USB Type-C OTG cable, we should be able to see on the host:

➜  mendel-enterprise-day-13 fastboot devices
101989d6f32efb39 fastboot

2.1.3 Flash Coral Dev Board

➜  mendel-enterprise-day-13 ls
boot_arm64.img flash.sh partition-table-16gb.img partition-table-64gb.img partition-table-8gb.img README recovery.img rootfs_arm64.img u-boot.imx
➜ mendel-enterprise-day-13 bash flash.sh
Sending 'bootloader0' (991 KB) OKAY [ 0.055s]
Writing 'bootloader0' OKAY [ 0.190s]
Finished. Total time: 0.266s
Rebooting into bootloader OKAY [ 0.024s]
Finished. Total time: 0.125s
Sending 'gpt' (33 KB) OKAY [ 0.018s]
Writing 'gpt' OKAY [ 0.309s]
Finished. Total time: 0.346s
Rebooting into bootloader OKAY [ 0.022s]
Finished. Total time: 0.122s
Erasing 'misc' OKAY [ 0.069s]
Finished. Total time: 0.079s
Sending 'boot' (131072 KB) OKAY [ 5.321s]
Writing 'boot' OKAY [ 3.632s]
Finished. Total time: 8.972s
Sending sparse 'rootfs' 1/4 (368422 KB) OKAY [ 14.792s]
Writing 'rootfs' OKAY [ 36.191s]
Sending sparse 'rootfs' 2/4 (408501 KB) OKAY [ 16.646s]
Writing 'rootfs' OKAY [ 18.944s]
Sending sparse 'rootfs' 3/4 (389107 KB) OKAY [ 15.881s]
Writing 'rootfs' OKAY [ 37.021s]
Sending sparse 'rootfs' 4/4 (231325 KB) OKAY [ 9.482s]
Writing 'rootfs' OKAY [ 65.204s]
Finished. Total time: 214.205s
Rebooting OKAY [ 0.005s]
Finished. Total time: 0.105s
➜ mendel-enterprise-day-13

2.1.4 Boot Mendel

Screen Snapshot Reboot

After a while, you’ll see:

Screen Snapshot Login

Now, log in with

  • username: mendel
  • password: mendel
➜  ~ mdt devices
mocha-shrimp (192.168.100.2)

You will be able to see that the Google Coral Dev Board is NOW connected. If you don’t see the EXPECTED output mocha-shrimp (192.168.101.2), just unplug and re-plug the USB Type-C power cable.

Unfortunately, the mdt tool does NOT work properly.

➜  mendel-enterprise-day-13 mdt shell
Waiting for a device...
Connecting to mocha-shrimp at 192.168.101.2
Key not present on mocha-shrimp -- pushing

It looks like you're trying to connect to a device that isn't connected
to your workstation via USB and doesn't have the SSH key this MDT generated.
To connect with `mdt shell` you will need to first connect to your device
ONLY via USB.

Cowardly refusing to attempt to push a key to a public machine.

This bug has been clarified on StackOverflow. By modifying $HOME/.local/lib/python3.6/site-packages/mdt/sshclient.py at line 86, changing if not self.address.startswith('192.168.100'): to if not self.address.startswith('192.168.10'):, the problem is solved.
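
The same one-line change can also be applied with a throw-away patch script (path exactly as quoted above; a backup copy is written first):

# One-off patch for mdt's sshclient.py, applying the change described above.
from pathlib import Path

p = Path.home() / ".local/lib/python3.6/site-packages/mdt/sshclient.py"
src = p.read_text()
(p.parent / "sshclient.py.bak").write_text(src)  # keep a backup
p.write_text(src.replace("startswith('192.168.100')", "startswith('192.168.10')"))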

➜  mendel-enterprise-day-13 mdt shell
Waiting for a device...
Connecting to mocha-shrimp at 192.168.101.2
Key not present on mocha-shrimp -- pushing
Linux mocha-shrimp 4.14.98-imx #1 SMP PREEMPT Fri Nov 8 23:28:21 UTC 2019 aarch64

The programs included with the Mendel GNU/Linux system are free software;
the exact distribution terms for each program are described in the
individual files in /usr/share/doc/*/copyright.

Mendel GNU/Linux comes with ABSOLUTELY NO WARRANTY, to the extent
permitted by applicable law.
Last login: Thu Feb 14 10:12:02 2019
mendel@mocha-shrimp:~$ ls
mendel@mocha-shrimp:~$ pwd
/home/mendel
mendel@mocha-shrimp:~$ uname -a
Linux mocha-shrimp 4.14.98-imx #1 SMP PREEMPT Fri Nov 8 23:28:21 UTC 2019 aarch64 GNU/Linux
mendel@mocha-shrimp:~$ lsb_release -a
No LSB modules are available.
Distributor ID: Mendel
Description: Mendel GNU/Linux 4 (Day)
Release: 10.0
Codename: day
mendel@mocha-shrimp:~$ ip -c address
1: lo: <LOOPBACK,UP,LOWER_UP> mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
inet 127.0.0.1/8 scope host lo
valid_lft forever preferred_lft forever
inet6 ::1/128 scope host
valid_lft forever preferred_lft forever
2: eth0: <NO-CARRIER,BROADCAST,MULTICAST,UP> mtu 1500 qdisc mq state DOWN group default qlen 1000
link/ether 7c:d9:5c:b1:fa:cc brd ff:ff:ff:ff:ff:ff
3: wlan0: <NO-CARRIER,BROADCAST,MULTICAST,UP> mtu 1500 qdisc mq state DOWN group default qlen 3000
link/ether 7c:d9:5c:b1:fa:cd brd ff:ff:ff:ff:ff:ff
4: p2p0: <BROADCAST,MULTICAST> mtu 1500 qdisc noop state DOWN group default qlen 3000
link/ether 00:0a:f5:89:89:81 brd ff:ff:ff:ff:ff:ff
5: usb0: <NO-CARRIER,BROADCAST,MULTICAST,UP> mtu 1500 qdisc pfifo_fast state DOWN group default qlen 1000
link/ether 02:22:78:0d:f6:df brd ff:ff:ff:ff:ff:ff
inet 192.168.100.2/24 brd 192.168.100.255 scope global noprefixroute usb0
valid_lft forever preferred_lft forever
inet6 fe80::cc6d:b3d4:f07e:eed1/64 scope link tentative noprefixroute
valid_lft forever preferred_lft forever
6: usb1: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc pfifo_fast state UP group default qlen 1000
link/ether 02:22:78:0d:f6:de brd ff:ff:ff:ff:ff:ff
inet 192.168.101.2/24 brd 192.168.101.255 scope global noprefixroute usb1
valid_lft forever preferred_lft forever
inet6 fe80::5bf4:c217:d9c9:859c/64 scope link noprefixroute
valid_lft forever preferred_lft forever
mendel@mocha-shrimp:~$

After activating the Internet connection via nmtui, we can NOW clearly see that wlan0 is automatically allocated an IP.

mendel@mocha-shrimp:~$ ip -c address
......
3: wlan0: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc mq state UP group default qlen 3000
link/ether 7c:d9:5c:b1:fa:cd brd ff:ff:ff:ff:ff:ff
inet 192.168.1.110/24 brd 192.168.1.255 scope global dynamic noprefixroute wlan0
valid_lft 86367sec preferred_lft 86367sec
inet6 2001:569:7e6e:dc00:d1c4:697a:f60e:b5a4/64 scope global dynamic noprefixroute
valid_lft 7468sec preferred_lft 7168sec
inet6 fe80::e10b:9dc6:60c4:b91b/64 scope link noprefixroute
valid_lft forever preferred_lft forever
......

Of course, we can set up a static IP for this particular Google Coral Dev Board afterwards.

2.1.5 SSH into Mendel

In order to SSH into Mendel and connect remotely, we need to follow Connect to a board’s shell on the host computer. You MUST run mdt pushkey before you can ssh into the board via its Internet IP instead of the virtual USB IPs, namely 192.168.100.2 or 192.168.101.2.

➜  ~ ssh -i  ~/.ssh/id_rsa_mendel.pub mendel@192.168.1.97
Connection closed by 192.168.1.97 port 22

However, for now, I’ve got NO idea why ssh NEVER works for the Google Coral Dev Board any more.

From this point on, there is a huge modification.

2.2 Flash from U-Boot on an SD card

If you get unlucky and you can't even boot your board into U-Boot, then you can recover the system by booting into U-Boot from an image on an SD card and then reflashing the board from your Linux host (cited from the Google Coral Dev Board’s official docs). Now, fastboot devices from the host is back.

➜  mendel-enterprise-day-13 fastboot devices
101989d6f32efb39 fastboot

Then, we reflash the Google Coral Dev Board.

➜  mendel-enterprise-day-13 bash flash.sh 
Sending 'bootloader0' (991 KB) OKAY [ 0.056s]
Writing 'bootloader0' OKAY [ 0.190s]
Finished. Total time: 0.267s
Rebooting into bootloader OKAY [ 0.024s]
Finished. Total time: 0.125s
Sending 'gpt' (33 KB) OKAY [ 0.018s]
Writing 'gpt' OKAY [ 0.308s]
Finished. Total time: 0.420s
Rebooting into bootloader OKAY [ 0.022s]
Finished. Total time: 0.122s
Erasing 'misc' OKAY [ 0.069s]
Finished. Total time: 0.079s
Sending 'boot' (131072 KB) OKAY [ 5.331s]
Writing 'boot' OKAY [ 3.604s]
Finished. Total time: 8.954s
Sending sparse 'rootfs' 1/4 (368422 KB) OKAY [ 14.924s]
Writing 'rootfs' OKAY [ 35.731s]
Sending sparse 'rootfs' 2/4 (408501 KB) OKAY [ 16.079s]
Writing 'rootfs' OKAY [ 18.689s]
Sending sparse 'rootfs' 3/4 (389107 KB) OKAY [ 15.396s]
Writing 'rootfs' OKAY [ 36.536s]
Sending sparse 'rootfs' 4/4 (231325 KB) OKAY [ 9.290s]
Writing 'rootfs' OKAY [ 64.694s]
Finished. Total time: 212.891s
Rebooting OKAY [ 0.005s]
Finished. Total time: 0.105s

Now we are able to run mdt shell successfully.

➜  mendel-enterprise-day-13 mdt shell
Waiting for a device...
Connecting to green-snail at 192.168.100.2
Key not present on green-snail -- pushing
Linux green-snail 4.14.98-imx #1 SMP PREEMPT Fri Nov 8 23:28:21 UTC 2019 aarch64

The programs included with the Mendel GNU/Linux system are free software;
the exact distribution terms for each program are described in the
individual files in /usr/share/doc/*/copyright.

Mendel GNU/Linux comes with ABSOLUTELY NO WARRANTY, to the extent
permitted by applicable law.
Last login: Mon Nov 11 18:19:48 2019

Run ssh-keygen and then mdt pushkey:

➜  .ssh ssh-keygen
Generating public/private rsa key pair.
Enter file in which to save the key (/home/longervision/.ssh/id_rsa): /home/longervision/.ssh/id_rsa_mendel
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in /home/longervision/.ssh/id_rsa_mendel.
Your public key has been saved in /home/longervision/.ssh/id_rsa_mendel.pub.
The key fingerprint is:
......
➜ .ssh mdt pushkey ~/.ssh/id_rsa_mendel.pub
Waiting for a device...
Connecting to green-snail at 192.168.100.2
Pushing /home/longervision/.ssh/id_rsa_mendel.pub
Key /home/longervision/.ssh/id_rsa_mendel.pub pushed.

Then, inside mdt shell, run the command nmtui to activate wlan0.

Let’s briefly summarize:

2.3 Demonstration

2.3.1 edgetpu_demo --device & edgetpu_demo --stream

Let’s ignore edgetpu_demo --device since I ALMOST NEVER work in GUI mode. The demo video is on my YouTube channel, please refer to:

On the console, it just displays:

mendel@deft-orange:~$ edgetpu_demo --stream
Press 'q' to quit.
Press 'n' to switch between models.

(edgetpu_detect_server:9991): Gtk-WARNING **: 07:56:57.725: Locale not supported by C library.
Using the fallback 'C' locale.
INFO:edgetpuvision.streaming.server:Listening on ports tcp: 4665, web: 4664, annexb: 4666
INFO:edgetpuvision.streaming.server:New web connection from 192.168.1.200:37536
INFO:edgetpuvision.streaming.server:Number of active clients: 1
INFO:edgetpuvision.streaming.server:New web connection from 192.168.1.200:37538
INFO:edgetpuvision.streaming.server:[192.168.1.200:37536] Rx thread finished
INFO:edgetpuvision.streaming.server:[192.168.1.200:37536] Tx thread finished
INFO:edgetpuvision.streaming.server:Number of active clients: 2
INFO:edgetpuvision.streaming.server:[192.168.1.200:37536] Stopping...
INFO:edgetpuvision.streaming.server:[192.168.1.200:37536] Stopped.
INFO:edgetpuvision.streaming.server:Number of active clients: 1
INFO:edgetpuvision.streaming.server:New web connection from 192.168.1.200:37540
INFO:edgetpuvision.streaming.server:Number of active clients: 2
INFO:edgetpuvision.streaming.server:New web connection from 192.168.1.200:37542
INFO:edgetpuvision.streaming.server:Number of active clients: 3
INFO:edgetpuvision.streaming.server:[192.168.1.200:37538] Rx thread finished
INFO:edgetpuvision.streaming.server:[192.168.1.200:37540] Rx thread finished
INFO:edgetpuvision.streaming.server:New web connection from 192.168.1.200:37544
INFO:edgetpuvision.streaming.server:[192.168.1.200:37538] Tx thread finished
INFO:edgetpuvision.streaming.server:[192.168.1.200:37542] Rx thread finished
INFO:edgetpuvision.streaming.server:[192.168.1.200:37542] Tx thread finished
INFO:edgetpuvision.streaming.server:Number of active clients: 4
......

2.3.2 Classification

Refer to Install the TensorFlow Lite library.

mendel@green-snail:~/.local$ pip3 install https://dl.google.com/coral/python/tflite_runtime-2.1.0.post1-cp37-cp37m-linux_aarch64.whl
Collecting tflite-runtime==2.1.0.post1 from https://dl.google.com/coral/python/tflite_runtime-2.1.0.post1-cp37-cp37m-linux_aarch64.whl
Downloading https://dl.google.com/coral/python/tflite_runtime-2.1.0.post1-cp37-cp37m-linux_aarch64.whl (1.9MB)
100% |████████████████████████████████| 1.9MB 203kB/s
Requirement already satisfied: numpy>=1.12.1 in /usr/lib/python3/dist-packages (from tflite-runtime==2.1.0.post1) (1.16.2)
Installing collected packages: tflite-runtime
Successfully installed tflite-runtime-2.1.0.post1
mendel@green-snail:~/Downloads/tflite/python/examples/classification$ python3 classify_image.py   --model models/mobilenet_v2_1.0_224_inat_bird_quant_edgetpu.tflite   --labels models/inat_bird_labels.txt   --input images/parrot.jpg
----INFERENCE TIME----
Note: The first inference on Edge TPU is slow because it includes loading the model into Edge TPU memory.
13.5ms
3.5ms
2.7ms
3.0ms
3.0ms
-------RESULTS--------
Ara macao (Scarlet Macaw): 0.77734
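
For reference, here is a minimal Python sketch of what classify_image.py does under the hood, assuming tflite_runtime and the Edge TPU runtime library libedgetpu.so.1 are installed as above; the model and image paths reuse the ones from the command line, and the naive score scaling is only an approximation of the example's proper dequantization:

import numpy as np
from PIL import Image
import tflite_runtime.interpreter as tflite

# Load the Edge-TPU-compiled model together with the Edge TPU delegate.
interpreter = tflite.Interpreter(
    model_path='models/mobilenet_v2_1.0_224_inat_bird_quant_edgetpu.tflite',
    experimental_delegates=[tflite.load_delegate('libedgetpu.so.1')])
interpreter.allocate_tensors()

# Resize the input image to the shape the model expects (1 x 224 x 224 x 3 here).
input_details = interpreter.get_input_details()[0]
_, height, width, _ = input_details['shape']
image = Image.open('images/parrot.jpg').convert('RGB').resize((width, height))
interpreter.set_tensor(input_details['index'], np.expand_dims(np.asarray(image), 0))

# Run inference and print the top class index with a rough 0..1 score
# (the output tensor is quantized uint8, so dividing by 255 is an approximation).
interpreter.invoke()
scores = np.squeeze(interpreter.get_tensor(interpreter.get_output_details()[0]['index']))
top = int(np.argmax(scores))
print(top, scores[top] / 255.0)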

2.3.3 Camera

2.3.3.1 Google Coral camera

The Google Coral camera can be detected as a video device:

mendel@mocha-shrimp:~$ v4l2-ctl --list-formats-ext --device /dev/video0
ioctl: VIDIOC_ENUM_FMT
Type: Video Capture

[0]: 'YUYV' (YUYV 4:2:2)
Size: Discrete 640x480
Interval: Discrete 0.033s (30.000 fps)
Size: Discrete 720x480
Interval: Discrete 0.033s (30.000 fps)
Size: Discrete 1280x720
Interval: Discrete 0.033s (30.000 fps)
Size: Discrete 1920x1080
Interval: Discrete 0.067s (15.000 fps)
Interval: Discrete 0.033s (30.000 fps)
Size: Discrete 2592x1944
Interval: Discrete 0.067s (15.000 fps)
Size: Discrete 0x0
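
Once the camera shows up as /dev/video0, grabbing a frame is straightforward. A minimal sketch, assuming opencv-python is installed on the board (it is not part of the original steps above):

import cv2

# Open the Coral camera enumerated above and request a 640x480 frame.
cap = cv2.VideoCapture(0)  # /dev/video0
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)

ok, frame = cap.read()  # OpenCV converts the YUYV frame to BGR for us
if ok:
    cv2.imwrite('coral_frame.jpg', frame)
cap.release()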

2.3.3.2 Face Detection Using Google TPU

My YouTube real-time face detection video clearly shows that the Google Edge TPU is seriously powerful.

On the console, it displays:

mendel@deft-orange:~$ edgetpu_detect_server \
> --model ${DEMO_FILES}/mobilenet_ssd_v2_face_quant_postprocess_edgetpu.tflite

(edgetpu_detect_server:4081): Gtk-WARNING **: 10:40:45.436: Locale not supported by C library.
Using the fallback 'C' locale.
INFO:edgetpuvision.streaming.server:Listening on ports tcp: 4665, web: 4664, annexb: 4666
INFO:edgetpuvision.streaming.server:New web connection from 192.168.1.200:33950
INFO:edgetpuvision.streaming.server:[192.168.1.200:33950] Rx thread finished
INFO:edgetpuvision.streaming.server:[192.168.1.200:33950] Tx thread finished
INFO:edgetpuvision.streaming.server:Number of active clients: 1
INFO:edgetpuvision.streaming.server:[192.168.1.200:33950] Stopping...
INFO:edgetpuvision.streaming.server:[192.168.1.200:33950] Stopped.
INFO:edgetpuvision.streaming.server:Number of active clients: 0
INFO:edgetpuvision.streaming.server:New web connection from 192.168.1.200:33952
INFO:edgetpuvision.streaming.server:Number of active clients: 1
INFO:edgetpuvision.streaming.server:[192.168.1.200:33952] Rx thread finished
INFO:edgetpuvision.streaming.server:[192.168.1.200:33952] Tx thread finished
INFO:edgetpuvision.streaming.server:New web connection from 192.168.1.200:33954
INFO:edgetpuvision.streaming.server:Number of active clients: 2
INFO:edgetpuvision.streaming.server:[192.168.1.200:33952] Stopping...
INFO:edgetpuvision.streaming.server:[192.168.1.200:33954] Rx thread finished
INFO:edgetpuvision.streaming.server:[192.168.1.200:33952] Stopped.
INFO:edgetpuvision.streaming.server:Number of active clients: 1
......

2.3.4 Bugs

mendel@green-snail:~$ edgetpu_demo --stream
Press 'q' to quit.
Press 'n' to switch between models.
Unable to init server: Could not connect: Connection refused

(edgetpu_detect_server:8391): Gtk-WARNING **: 20:18:07.433: Locale not supported by C library.
Using the fallback 'C' locale.
Unable to init server: Could not connect: Connection refused
Unable to init server: Could not connect: Connection refused

(edgetpu_detect_server:8391): Gtk-WARNING **: 20:18:07.473: cannot open display:
mendel@green-snail:~/Downloads/edgetpu/test_data$ edgetpu_detect_server --model ./mobilenet_ssd_v2_face_quant_postprocess_edgetpu.tflite
Unable to init server: Could not connect: Connection refused

(edgetpu_detect_server:8382): Gtk-WARNING **: 20:16:43.553: Locale not supported by C library.
Using the fallback 'C' locale.
Unable to init server: Could not connect: Connection refused
Unable to init server: Could not connect: Connection refused
Unable to init server: Could not connect: Connection refused

(edgetpu_detect_server:8382): Gtk-WARNING **: 20:16:44.967: cannot open display: :0

To solve this problem, run the following command:

mendel@green-snail:~$ sudo systemctl restart weston

Beautiful Sakura 1 Beautiful Sakura 2 Beautiful Sakura 3
Beautiful Sakura 4 Beautiful Sakura 5 Beautiful Sakura 6
Beautiful Sakura 7 Beautiful Sakura 8 The Willow

Spent some time with the beautiful flowers.

😆

Okay, today let's briefly talk about PlotNeuralNet, a tool for 3D visualization of various AI model architectures. We just follow the PlotNeuralNet instructions and generate some results for fun.

1. pyexamples

1.1 test_simple

➜  pyexamples git:(master) ✗ ../tikzmake.sh test_simple  

\documentclass[border=8pt, multi, tikz]{standalone}
\usepackage{import}
\subimport{../layers/}{init}
\usetikzlibrary{positioning}
\usetikzlibrary{3d} %for including external image


\def\ConvColor{rgb:yellow,5;red,2.5;white,5}
\def\ConvReluColor{rgb:yellow,5;red,5;white,5}
\def\PoolColor{rgb:red,1;black,0.3}
\def\UnpoolColor{rgb:blue,2;green,1;black,0.3}
\def\FcColor{rgb:blue,5;red,2.5;white,5}
\def\FcReluColor{rgb:blue,5;red,5;white,4}
\def\SoftmaxColor{rgb:magenta,5;black,7}


\newcommand{\copymidarrow}{\tikz \draw[-Stealth,line width=0.8mm,draw={rgb:blue,4;red,1;green,1;black,3}] (-0.3,0) -- ++(0.3,0);}

\begin{document}
\begin{tikzpicture}
\tikzstyle{connection}=[ultra thick,every node/.style={sloped,allow upside down},draw=\edgecolor,opacity=0.7]
\tikzstyle{copyconnection}=[ultra thick,every node/.style={sloped,allow upside down},draw={rgb:blue,4;red,1;green,1;black,3},opacity=0.7]


\pic[shift={(0,0,0)}] at (0,0,0)
{Box={
name=conv1,
caption= ,
xlabel={{64, }},
zlabel=512,
fill=\ConvColor,
height=64,
width=2,
depth=64
}
};


\pic[shift={ (0,0,0) }] at (conv1-east)
{Box={
name=pool1,
caption= ,
fill=\PoolColor,
opacity=0.5,
height=32,
width=1,
depth=32
}
};


\pic[shift={(1,0,0)}] at (pool1-east)
{Box={
name=conv2,
caption= ,
xlabel={{64, }},
zlabel=128,
fill=\ConvColor,
height=32,
width=2,
depth=32
}
};


\draw [connection] (pool1-east) -- node {\midarrow} (conv2-west);


\pic[shift={ (0,0,0) }] at (conv2-east)
{Box={
name=pool2,
caption= ,
fill=\PoolColor,
opacity=0.5,
height=28,
width=1,
depth=28
}
};


\pic[shift={(3,0,0)}] at (pool1-east)
{Box={
name=soft1,
caption=SOFT,
xlabel={{" ","dummy"}},
zlabel=10,
fill=\SoftmaxColor,
opacity=0.8,
height=3,
width=1.5,
depth=25
}
};


\draw [connection] (pool2-east) -- node {\midarrow} (soft1-west);


\end{tikzpicture}
\end{document}

This is pdfTeX, Version 3.14159265-2.6-1.40.18 (TeX Live 2017/Debian) (preloaded format=pdflatex)
restricted \write18 enabled.
entering extended mode
(./test_simple.tex
LaTeX2e <2017-04-15>
Babel <3.18> and hyphenation patterns for 84 language(s) loaded.
(/usr/share/texlive/texmf-dist/tex/latex/standalone/standalone.cls
Document Class: standalone 2015/07/15 v1.2 Class to compile TeX sub-files stand
alone
(/usr/share/texlive/texmf-dist/tex/generic/oberdiek/ifluatex.sty)
(/usr/share/texlive/texmf-dist/tex/generic/oberdiek/ifpdf.sty)
(/usr/share/texlive/texmf-dist/tex/generic/ifxetex/ifxetex.sty)
(/usr/share/texlive/texmf-dist/tex/latex/xkeyval/xkeyval.sty
(/usr/share/texlive/texmf-dist/tex/generic/xkeyval/xkeyval.tex
(/usr/share/texlive/texmf-dist/tex/generic/xkeyval/xkvutils.tex
(/usr/share/texlive/texmf-dist/tex/generic/xkeyval/keyval.tex))))
(/usr/share/texlive/texmf-dist/tex/latex/standalone/standalone.cfg)
(/usr/share/texlive/texmf-dist/tex/latex/base/article.cls
Document Class: article 2014/09/29 v1.4h Standard LaTeX document class
(/usr/share/texlive/texmf-dist/tex/latex/base/size10.clo))
(/usr/share/texlive/texmf-dist/tex/latex/pgf/frontendlayer/tikz.sty
(/usr/share/texlive/texmf-dist/tex/latex/pgf/basiclayer/pgf.sty
(/usr/share/texlive/texmf-dist/tex/latex/pgf/utilities/pgfrcs.sty
(/usr/share/texlive/texmf-dist/tex/generic/pgf/utilities/pgfutil-common.tex
(/usr/share/texlive/texmf-dist/tex/generic/pgf/utilities/pgfutil-common-lists.t
ex)) (/usr/share/texlive/texmf-dist/tex/generic/pgf/utilities/pgfutil-latex.def
(/usr/share/texlive/texmf-dist/tex/latex/ms/everyshi.sty))
(/usr/share/texlive/texmf-dist/tex/generic/pgf/utilities/pgfrcs.code.tex))
(/usr/share/texlive/texmf-dist/tex/latex/pgf/basiclayer/pgfcore.sty
(/usr/share/texlive/texmf-dist/tex/latex/graphics/graphicx.sty
(/usr/share/texlive/texmf-dist/tex/latex/graphics/graphics.sty
(/usr/share/texlive/texmf-dist/tex/latex/graphics/trig.sty)
(/usr/share/texlive/texmf-dist/tex/latex/graphics-cfg/graphics.cfg)
(/usr/share/texlive/texmf-dist/tex/latex/graphics-def/pdftex.def)))
(/usr/share/texlive/texmf-dist/tex/latex/pgf/systemlayer/pgfsys.sty
(/usr/share/texlive/texmf-dist/tex/generic/pgf/systemlayer/pgfsys.code.tex
(/usr/share/texlive/texmf-dist/tex/generic/pgf/utilities/pgfkeys.code.tex
(/usr/share/texlive/texmf-dist/tex/generic/pgf/utilities/pgfkeysfiltered.code.t
ex)) (/usr/share/texlive/texmf-dist/tex/generic/pgf/systemlayer/pgf.cfg)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/systemlayer/pgfsys-pdftex.def
(/usr/share/texlive/texmf-dist/tex/generic/pgf/systemlayer/pgfsys-common-pdf.de
f)))
(/usr/share/texlive/texmf-dist/tex/generic/pgf/systemlayer/pgfsyssoftpath.code.
tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/systemlayer/pgfsysprotocol.code.
tex)) (/usr/share/texlive/texmf-dist/tex/latex/xcolor/xcolor.sty
(/usr/share/texlive/texmf-dist/tex/latex/graphics-cfg/color.cfg))
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcore.code.tex
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmath.code.tex
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathcalc.code.tex
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathutil.code.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathparser.code.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.code.tex
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.basic.code
.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.trigonomet
ric.code.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.random.cod
e.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.comparison
.code.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.base.code.
tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.round.code
.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.misc.code.
tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.integerari
thmetics.code.tex)))
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathfloat.code.tex))
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepoints.code.te
x)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathconstruct.
code.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathusage.code
.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcorescopes.code.te
x)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcoregraphicstate.c
ode.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcoretransformation
s.code.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcorequick.code.tex
)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreobjects.code.t
ex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathprocessing
.code.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcorearrows.code.te
x)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreshade.code.tex
)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreimage.code.tex

(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreexternal.code.
tex))
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcorelayers.code.te
x)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcoretransparency.c
ode.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepatterns.code.
tex)))
(/usr/share/texlive/texmf-dist/tex/generic/pgf/modules/pgfmoduleshapes.code.tex
) (/usr/share/texlive/texmf-dist/tex/generic/pgf/modules/pgfmoduleplot.code.tex
)
(/usr/share/texlive/texmf-dist/tex/latex/pgf/compatibility/pgfcomp-version-0-65
.sty)
(/usr/share/texlive/texmf-dist/tex/latex/pgf/compatibility/pgfcomp-version-1-18
.sty)) (/usr/share/texlive/texmf-dist/tex/latex/pgf/utilities/pgffor.sty
(/usr/share/texlive/texmf-dist/tex/latex/pgf/utilities/pgfkeys.sty
(/usr/share/texlive/texmf-dist/tex/generic/pgf/utilities/pgfkeys.code.tex))
(/usr/share/texlive/texmf-dist/tex/latex/pgf/math/pgfmath.sty
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmath.code.tex))
(/usr/share/texlive/texmf-dist/tex/generic/pgf/utilities/pgffor.code.tex
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmath.code.tex)))
(/usr/share/texlive/texmf-dist/tex/generic/pgf/frontendlayer/tikz/tikz.code.tex

(/usr/share/texlive/texmf-dist/tex/generic/pgf/libraries/pgflibraryplothandlers
.code.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/modules/pgfmodulematrix.code.tex
)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/frontendlayer/tikz/libraries/tik
zlibrarytopaths.code.tex))))
(/usr/share/texlive/texmf-dist/tex/latex/import/import.sty) (../layers/init.tex

(/usr/share/texlive/texmf-dist/tex/generic/pgf/frontendlayer/tikz/libraries/tik
zlibraryquotes.code.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/libraries/pgflibraryarrows.meta.
code.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/frontendlayer/tikz/libraries/tik
zlibrarypositioning.code.tex) (../layers/Ball.sty) (../layers/Box.sty)
(../layers/RightBandedBox.sty))
(/usr/share/texlive/texmf-dist/tex/generic/pgf/frontendlayer/tikz/libraries/tik
zlibrary3d.code.tex)
No file test_simple.aux.
ABD: EveryShipout initializing macros
(/usr/share/texlive/texmf-dist/tex/context/base/mkii/supp-pdf.mkii
[Loading MPS to PDF converter (version 2006.09.02).]
) (/usr/share/texlive/texmf-dist/tex/latex/oberdiek/epstopdf-base.sty
(/usr/share/texlive/texmf-dist/tex/generic/oberdiek/infwarerr.sty)
(/usr/share/texlive/texmf-dist/tex/latex/oberdiek/grfext.sty
(/usr/share/texlive/texmf-dist/tex/generic/oberdiek/kvdefinekeys.sty
(/usr/share/texlive/texmf-dist/tex/generic/oberdiek/ltxcmds.sty)))
(/usr/share/texlive/texmf-dist/tex/latex/oberdiek/kvoptions.sty
(/usr/share/texlive/texmf-dist/tex/generic/oberdiek/kvsetkeys.sty
(/usr/share/texlive/texmf-dist/tex/generic/oberdiek/etexcmds.sty)))
(/usr/share/texlive/texmf-dist/tex/generic/oberdiek/pdftexcmds.sty)
(/usr/share/texlive/texmf-dist/tex/latex/latexconfig/epstopdf-sys.cfg))
Overfull \hbox (7.76463pt too wide) in paragraph at lines 87--87
[][][][][]\OT1/cmr/bx/n/10 SOFT[]
[1{/var/lib/texmf/fonts/map/pdftex/updmap/pdftex.map}] (./test_simple.aux) )
(see the transcript file for additional information)</usr/share/texlive/texmf-d
ist/fonts/type1/public/amsfonts/cm/cmbx10.pfb></usr/share/texlive/texmf-dist/fo
nts/type1/public/amsfonts/cm/cmr10.pfb></usr/share/texlive/texmf-dist/fonts/typ
e1/public/amsfonts/cm/cmr9.pfb>
Output written on test_simple.pdf (1 page, 27575 bytes).
Transcript written on test_simple.log.
rm: cannot remove '*.vscodeLog': No such file or directory
➜ pyexamples git:(master) ✗

The following shows that the generated .pdf file can be displayed in Hexo:

test_simple.png
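
Note that the .tex above is not written by hand: tikzmake.sh first runs the corresponding Python script (pyexamples/test_simple.py) to emit it, and then calls pdflatex. Here is a sketch of such a generator, using PlotNeuralNet's pycore.tikzeng helpers (function names follow the upstream repository; treat the exact arguments as approximate):

import sys
sys.path.append('../')          # make pycore importable from pyexamples/
from pycore.tikzeng import *    # PlotNeuralNet's TikZ-emitting helpers

# conv1 -> pool1 -> conv2 -> pool2 -> softmax, matching the boxes in the .tex above
arch = [
    to_head('..'),
    to_cor(),
    to_begin(),
    to_Conv("conv1", 512, 64, offset="(0,0,0)", to="(0,0,0)", height=64, depth=64, width=2),
    to_Pool("pool1", offset="(0,0,0)", to="(conv1-east)"),
    to_Conv("conv2", 128, 64, offset="(1,0,0)", to="(pool1-east)", height=32, depth=32, width=2),
    to_connection("pool1", "conv2"),
    to_Pool("pool2", offset="(0,0,0)", to="(conv2-east)", height=28, depth=28, width=1),
    to_SoftMax("soft1", 10, "(3,0,0)", "(pool1-east)", caption="SOFT"),
    to_connection("pool2", "soft1"),
    to_end(),
]

if __name__ == '__main__':
    # Write test_simple.tex next to this script; tikzmake.sh compiles it afterwards.
    to_generate(arch, sys.argv[0].split('.')[0] + '.tex')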

1.2 unet

➜  pyexamples git:(master) ✗ ../tikzmake.sh unet

\documentclass[border=8pt, multi, tikz]{standalone}
\usepackage{import}
\subimport{../layers/}{init}
\usetikzlibrary{positioning}
\usetikzlibrary{3d} %for including external image


\def\ConvColor{rgb:yellow,5;red,2.5;white,5}
\def\ConvReluColor{rgb:yellow,5;red,5;white,5}
\def\PoolColor{rgb:red,1;black,0.3}
\def\UnpoolColor{rgb:blue,2;green,1;black,0.3}
\def\FcColor{rgb:blue,5;red,2.5;white,5}
\def\FcReluColor{rgb:blue,5;red,5;white,4}
\def\SoftmaxColor{rgb:magenta,5;black,7}


\newcommand{\copymidarrow}{\tikz \draw[-Stealth,line width=0.8mm,draw={rgb:blue,4;red,1;green,1;black,3}] (-0.3,0) -- ++(0.3,0);}

\begin{document}
\begin{tikzpicture}
\tikzstyle{connection}=[ultra thick,every node/.style={sloped,allow upside down},draw=\edgecolor,opacity=0.7]
\tikzstyle{copyconnection}=[ultra thick,every node/.style={sloped,allow upside down},draw={rgb:blue,4;red,1;green,1;black,3},opacity=0.7]


\node[canvas is zy plane at x=0] (temp) at (-3,0,0) {\includegraphics[width=8cm,height=8cm]{../examples/fcn8s/cats.jpg}};


\pic[shift={ (0,0,0) }] at (0,0,0)
{RightBandedBox={
name=ccr_b1,
caption= ,
xlabel={{ 64, 64 }},
zlabel=500,
fill=\ConvColor,
bandfill=\ConvReluColor,
height=40,
width={ 2 , 2 },
depth=40
}
};


\pic[shift={ (0,0,0) }] at (ccr_b1-east)
{Box={
name=pool_b1,
caption= ,
fill=\PoolColor,
opacity=0.5,
height=32,
width=1,
depth=32
}
};


\pic[shift={ (1,0,0) }] at (pool_b1-east)
{RightBandedBox={
name=ccr_b2,
caption= ,
xlabel={{ 128, 128 }},
zlabel=256,
fill=\ConvColor,
bandfill=\ConvReluColor,
height=32,
width={ 3.5 , 3.5 },
depth=32
}
};


\pic[shift={ (0,0,0) }] at (ccr_b2-east)
{Box={
name=pool_b2,
caption= ,
fill=\PoolColor,
opacity=0.5,
height=24,
width=1,
depth=24
}
};


\draw [connection] (pool_b1-east) -- node {\midarrow} (ccr_b2-west);


\pic[shift={ (1,0,0) }] at (pool_b2-east)
{RightBandedBox={
name=ccr_b3,
caption= ,
xlabel={{ 256, 256 }},
zlabel=128,
fill=\ConvColor,
bandfill=\ConvReluColor,
height=25,
width={ 4.5 , 4.5 },
depth=25
}
};


\pic[shift={ (0,0,0) }] at (ccr_b3-east)
{Box={
name=pool_b3,
caption= ,
fill=\PoolColor,
opacity=0.5,
height=19,
width=1,
depth=19
}
};


\draw [connection] (pool_b2-east) -- node {\midarrow} (ccr_b3-west);


\pic[shift={ (1,0,0) }] at (pool_b3-east)
{RightBandedBox={
name=ccr_b4,
caption= ,
xlabel={{ 512, 512 }},
zlabel=64,
fill=\ConvColor,
bandfill=\ConvReluColor,
height=16,
width={ 5.5 , 5.5 },
depth=16
}
};


\pic[shift={ (0,0,0) }] at (ccr_b4-east)
{Box={
name=pool_b4,
caption= ,
fill=\PoolColor,
opacity=0.5,
height=12,
width=1,
depth=12
}
};


\draw [connection] (pool_b3-east) -- node {\midarrow} (ccr_b4-west);


\pic[shift={ (2,0,0) }] at (pool_b4-east)
{RightBandedBox={
name=ccr_b5,
caption=Bottleneck,
xlabel={{ 1024, 1024 }},
zlabel=32,
fill=\ConvColor,
bandfill=\ConvReluColor,
height=8,
width={ 8 , 8 },
depth=8
}
};


\draw [connection] (pool_b4-east) -- node {\midarrow} (ccr_b5-west);


\pic[shift={ (2.1,0,0) }] at (ccr_b5-east)
{Box={
name=unpool_b6,
caption= ,
fill=\UnpoolColor,
opacity=0.5,
height=16,
width=1,
depth=16
}
};


\pic[shift={ (0,0,0) }] at (unpool_b6-east)
{RightBandedBox={
name=ccr_res_b6,
caption= ,
xlabel={{ 512, }},
zlabel=64,
fill={rgb:white,1;black,3},
bandfill={rgb:white,1;black,2},
opacity=0.5,
height=16,
width=5.0,
depth=16
}
};


\pic[shift={(0,0,0)}] at (ccr_res_b6-east)
{Box={
name=ccr_b6,
caption= ,
xlabel={{512, }},
zlabel=64,
fill=\ConvColor,
height=16,
width=5.0,
depth=16
}
};


\pic[shift={ (0,0,0) }] at (ccr_b6-east)
{RightBandedBox={
name=ccr_res_c_b6,
caption= ,
xlabel={{ 512, }},
zlabel=64,
fill={rgb:white,1;black,3},
bandfill={rgb:white,1;black,2},
opacity=0.5,
height=16,
width=5.0,
depth=16
}
};


\pic[shift={(0,0,0)}] at (ccr_res_c_b6-east)
{Box={
name=end_b6,
caption= ,
xlabel={{512, }},
zlabel=64,
fill=\ConvColor,
height=16,
width=5.0,
depth=16
}
};


\draw [connection] (ccr_b5-east) -- node {\midarrow} (unpool_b6-west);


\path (ccr_b4-southeast) -- (ccr_b4-northeast) coordinate[pos=1.25] (ccr_b4-top) ;
\path (ccr_res_b6-south) -- (ccr_res_b6-north) coordinate[pos=1.25] (ccr_res_b6-top) ;
\draw [copyconnection] (ccr_b4-northeast)
-- node {\copymidarrow}(ccr_b4-top)
-- node {\copymidarrow}(ccr_res_b6-top)
-- node {\copymidarrow} (ccr_res_b6-north);


\pic[shift={ (2.1,0,0) }] at (end_b6-east)
{Box={
name=unpool_b7,
caption= ,
fill=\UnpoolColor,
opacity=0.5,
height=25,
width=1,
depth=25
}
};


\pic[shift={ (0,0,0) }] at (unpool_b7-east)
{RightBandedBox={
name=ccr_res_b7,
caption= ,
xlabel={{ 256, }},
zlabel=128,
fill={rgb:white,1;black,3},
bandfill={rgb:white,1;black,2},
opacity=0.5,
height=25,
width=4.5,
depth=25
}
};


\pic[shift={(0,0,0)}] at (ccr_res_b7-east)
{Box={
name=ccr_b7,
caption= ,
xlabel={{256, }},
zlabel=128,
fill=\ConvColor,
height=25,
width=4.5,
depth=25
}
};


\pic[shift={ (0,0,0) }] at (ccr_b7-east)
{RightBandedBox={
name=ccr_res_c_b7,
caption= ,
xlabel={{ 256, }},
zlabel=128,
fill={rgb:white,1;black,3},
bandfill={rgb:white,1;black,2},
opacity=0.5,
height=25,
width=4.5,
depth=25
}
};


\pic[shift={(0,0,0)}] at (ccr_res_c_b7-east)
{Box={
name=end_b7,
caption= ,
xlabel={{256, }},
zlabel=128,
fill=\ConvColor,
height=25,
width=4.5,
depth=25
}
};


\draw [connection] (end_b6-east) -- node {\midarrow} (unpool_b7-west);


\path (ccr_b3-southeast) -- (ccr_b3-northeast) coordinate[pos=1.25] (ccr_b3-top) ;
\path (ccr_res_b7-south) -- (ccr_res_b7-north) coordinate[pos=1.25] (ccr_res_b7-top) ;
\draw [copyconnection] (ccr_b3-northeast)
-- node {\copymidarrow}(ccr_b3-top)
-- node {\copymidarrow}(ccr_res_b7-top)
-- node {\copymidarrow} (ccr_res_b7-north);


\pic[shift={ (2.1,0,0) }] at (end_b7-east)
{Box={
name=unpool_b8,
caption= ,
fill=\UnpoolColor,
opacity=0.5,
height=32,
width=1,
depth=32
}
};


\pic[shift={ (0,0,0) }] at (unpool_b8-east)
{RightBandedBox={
name=ccr_res_b8,
caption= ,
xlabel={{ 128, }},
zlabel=256,
fill={rgb:white,1;black,3},
bandfill={rgb:white,1;black,2},
opacity=0.5,
height=32,
width=3.5,
depth=32
}
};


\pic[shift={(0,0,0)}] at (ccr_res_b8-east)
{Box={
name=ccr_b8,
caption= ,
xlabel={{128, }},
zlabel=256,
fill=\ConvColor,
height=32,
width=3.5,
depth=32
}
};


\pic[shift={ (0,0,0) }] at (ccr_b8-east)
{RightBandedBox={
name=ccr_res_c_b8,
caption= ,
xlabel={{ 128, }},
zlabel=256,
fill={rgb:white,1;black,3},
bandfill={rgb:white,1;black,2},
opacity=0.5,
height=32,
width=3.5,
depth=32
}
};


\pic[shift={(0,0,0)}] at (ccr_res_c_b8-east)
{Box={
name=end_b8,
caption= ,
xlabel={{128, }},
zlabel=256,
fill=\ConvColor,
height=32,
width=3.5,
depth=32
}
};


\draw [connection] (end_b7-east) -- node {\midarrow} (unpool_b8-west);


\path (ccr_b2-southeast) -- (ccr_b2-northeast) coordinate[pos=1.25] (ccr_b2-top) ;
\path (ccr_res_b8-south) -- (ccr_res_b8-north) coordinate[pos=1.25] (ccr_res_b8-top) ;
\draw [copyconnection] (ccr_b2-northeast)
-- node {\copymidarrow}(ccr_b2-top)
-- node {\copymidarrow}(ccr_res_b8-top)
-- node {\copymidarrow} (ccr_res_b8-north);


\pic[shift={ (2.1,0,0) }] at (end_b8-east)
{Box={
name=unpool_b9,
caption= ,
fill=\UnpoolColor,
opacity=0.5,
height=40,
width=1,
depth=40
}
};


\pic[shift={ (0,0,0) }] at (unpool_b9-east)
{RightBandedBox={
name=ccr_res_b9,
caption= ,
xlabel={{ 64, }},
zlabel=512,
fill={rgb:white,1;black,3},
bandfill={rgb:white,1;black,2},
opacity=0.5,
height=40,
width=2.5,
depth=40
}
};


\pic[shift={(0,0,0)}] at (ccr_res_b9-east)
{Box={
name=ccr_b9,
caption= ,
xlabel={{64, }},
zlabel=512,
fill=\ConvColor,
height=40,
width=2.5,
depth=40
}
};


\pic[shift={ (0,0,0) }] at (ccr_b9-east)
{RightBandedBox={
name=ccr_res_c_b9,
caption= ,
xlabel={{ 64, }},
zlabel=512,
fill={rgb:white,1;black,3},
bandfill={rgb:white,1;black,2},
opacity=0.5,
height=40,
width=2.5,
depth=40
}
};


\pic[shift={(0,0,0)}] at (ccr_res_c_b9-east)
{Box={
name=end_b9,
caption= ,
xlabel={{64, }},
zlabel=512,
fill=\ConvColor,
height=40,
width=2.5,
depth=40
}
};


\draw [connection] (end_b8-east) -- node {\midarrow} (unpool_b9-west);


\path (ccr_b1-southeast) -- (ccr_b1-northeast) coordinate[pos=1.25] (ccr_b1-top) ;
\path (ccr_res_b9-south) -- (ccr_res_b9-north) coordinate[pos=1.25] (ccr_res_b9-top) ;
\draw [copyconnection] (ccr_b1-northeast)
-- node {\copymidarrow}(ccr_b1-top)
-- node {\copymidarrow}(ccr_res_b9-top)
-- node {\copymidarrow} (ccr_res_b9-north);


\pic[shift={(0.75,0,0)}] at (end_b9-east)
{Box={
name=soft1,
caption=SOFT,
zlabel=512,
fill=\SoftmaxColor,
height=40,
width=1,
depth=40
}
};


\draw [connection] (end_b9-east) -- node {\midarrow} (soft1-west);


\end{tikzpicture}
\end{document}

This is pdfTeX, Version 3.14159265-2.6-1.40.18 (TeX Live 2017/Debian) (preloaded format=pdflatex)
restricted \write18 enabled.
entering extended mode
(./unet.tex
LaTeX2e <2017-04-15>
Babel <3.18> and hyphenation patterns for 84 language(s) loaded.
(/usr/share/texlive/texmf-dist/tex/latex/standalone/standalone.cls
Document Class: standalone 2015/07/15 v1.2 Class to compile TeX sub-files stand
alone
(/usr/share/texlive/texmf-dist/tex/generic/oberdiek/ifluatex.sty)
(/usr/share/texlive/texmf-dist/tex/generic/oberdiek/ifpdf.sty)
(/usr/share/texlive/texmf-dist/tex/generic/ifxetex/ifxetex.sty)
(/usr/share/texlive/texmf-dist/tex/latex/xkeyval/xkeyval.sty
(/usr/share/texlive/texmf-dist/tex/generic/xkeyval/xkeyval.tex
(/usr/share/texlive/texmf-dist/tex/generic/xkeyval/xkvutils.tex
(/usr/share/texlive/texmf-dist/tex/generic/xkeyval/keyval.tex))))
(/usr/share/texlive/texmf-dist/tex/latex/standalone/standalone.cfg)
(/usr/share/texlive/texmf-dist/tex/latex/base/article.cls
Document Class: article 2014/09/29 v1.4h Standard LaTeX document class
(/usr/share/texlive/texmf-dist/tex/latex/base/size10.clo))
(/usr/share/texlive/texmf-dist/tex/latex/pgf/frontendlayer/tikz.sty
(/usr/share/texlive/texmf-dist/tex/latex/pgf/basiclayer/pgf.sty
(/usr/share/texlive/texmf-dist/tex/latex/pgf/utilities/pgfrcs.sty
(/usr/share/texlive/texmf-dist/tex/generic/pgf/utilities/pgfutil-common.tex
(/usr/share/texlive/texmf-dist/tex/generic/pgf/utilities/pgfutil-common-lists.t
ex)) (/usr/share/texlive/texmf-dist/tex/generic/pgf/utilities/pgfutil-latex.def
(/usr/share/texlive/texmf-dist/tex/latex/ms/everyshi.sty))
(/usr/share/texlive/texmf-dist/tex/generic/pgf/utilities/pgfrcs.code.tex))
(/usr/share/texlive/texmf-dist/tex/latex/pgf/basiclayer/pgfcore.sty
(/usr/share/texlive/texmf-dist/tex/latex/graphics/graphicx.sty
(/usr/share/texlive/texmf-dist/tex/latex/graphics/graphics.sty
(/usr/share/texlive/texmf-dist/tex/latex/graphics/trig.sty)
(/usr/share/texlive/texmf-dist/tex/latex/graphics-cfg/graphics.cfg)
(/usr/share/texlive/texmf-dist/tex/latex/graphics-def/pdftex.def)))
(/usr/share/texlive/texmf-dist/tex/latex/pgf/systemlayer/pgfsys.sty
(/usr/share/texlive/texmf-dist/tex/generic/pgf/systemlayer/pgfsys.code.tex
(/usr/share/texlive/texmf-dist/tex/generic/pgf/utilities/pgfkeys.code.tex
(/usr/share/texlive/texmf-dist/tex/generic/pgf/utilities/pgfkeysfiltered.code.t
ex)) (/usr/share/texlive/texmf-dist/tex/generic/pgf/systemlayer/pgf.cfg)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/systemlayer/pgfsys-pdftex.def
(/usr/share/texlive/texmf-dist/tex/generic/pgf/systemlayer/pgfsys-common-pdf.de
f)))
(/usr/share/texlive/texmf-dist/tex/generic/pgf/systemlayer/pgfsyssoftpath.code.
tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/systemlayer/pgfsysprotocol.code.
tex)) (/usr/share/texlive/texmf-dist/tex/latex/xcolor/xcolor.sty
(/usr/share/texlive/texmf-dist/tex/latex/graphics-cfg/color.cfg))
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcore.code.tex
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmath.code.tex
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathcalc.code.tex
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathutil.code.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathparser.code.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.code.tex
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.basic.code
.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.trigonomet
ric.code.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.random.cod
e.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.comparison
.code.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.base.code.
tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.round.code
.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.misc.code.
tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.integerari
thmetics.code.tex)))
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmathfloat.code.tex))
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepoints.code.te
x)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathconstruct.
code.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathusage.code
.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcorescopes.code.te
x)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcoregraphicstate.c
ode.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcoretransformation
s.code.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcorequick.code.tex
)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreobjects.code.t
ex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathprocessing
.code.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcorearrows.code.te
x)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreshade.code.tex
)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreimage.code.tex

(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreexternal.code.
tex))
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcorelayers.code.te
x)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcoretransparency.c
ode.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepatterns.code.
tex)))
(/usr/share/texlive/texmf-dist/tex/generic/pgf/modules/pgfmoduleshapes.code.tex
) (/usr/share/texlive/texmf-dist/tex/generic/pgf/modules/pgfmoduleplot.code.tex
)
(/usr/share/texlive/texmf-dist/tex/latex/pgf/compatibility/pgfcomp-version-0-65
.sty)
(/usr/share/texlive/texmf-dist/tex/latex/pgf/compatibility/pgfcomp-version-1-18
.sty)) (/usr/share/texlive/texmf-dist/tex/latex/pgf/utilities/pgffor.sty
(/usr/share/texlive/texmf-dist/tex/latex/pgf/utilities/pgfkeys.sty
(/usr/share/texlive/texmf-dist/tex/generic/pgf/utilities/pgfkeys.code.tex))
(/usr/share/texlive/texmf-dist/tex/latex/pgf/math/pgfmath.sty
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmath.code.tex))
(/usr/share/texlive/texmf-dist/tex/generic/pgf/utilities/pgffor.code.tex
(/usr/share/texlive/texmf-dist/tex/generic/pgf/math/pgfmath.code.tex)))
(/usr/share/texlive/texmf-dist/tex/generic/pgf/frontendlayer/tikz/tikz.code.tex

(/usr/share/texlive/texmf-dist/tex/generic/pgf/libraries/pgflibraryplothandlers
.code.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/modules/pgfmodulematrix.code.tex
)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/frontendlayer/tikz/libraries/tik
zlibrarytopaths.code.tex))))
(/usr/share/texlive/texmf-dist/tex/latex/import/import.sty) (../layers/init.tex

(/usr/share/texlive/texmf-dist/tex/generic/pgf/frontendlayer/tikz/libraries/tik
zlibraryquotes.code.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/libraries/pgflibraryarrows.meta.
code.tex)
(/usr/share/texlive/texmf-dist/tex/generic/pgf/frontendlayer/tikz/libraries/tik
zlibrarypositioning.code.tex) (../layers/Ball.sty) (../layers/Box.sty)
(../layers/RightBandedBox.sty))
(/usr/share/texlive/texmf-dist/tex/generic/pgf/frontendlayer/tikz/libraries/tik
zlibrary3d.code.tex)
No file unet.aux.
ABD: EveryShipout initializing macros
(/usr/share/texlive/texmf-dist/tex/context/base/mkii/supp-pdf.mkii
[Loading MPS to PDF converter (version 2006.09.02).]
) (/usr/share/texlive/texmf-dist/tex/latex/oberdiek/epstopdf-base.sty
(/usr/share/texlive/texmf-dist/tex/generic/oberdiek/infwarerr.sty)
(/usr/share/texlive/texmf-dist/tex/latex/oberdiek/grfext.sty
(/usr/share/texlive/texmf-dist/tex/generic/oberdiek/kvdefinekeys.sty
(/usr/share/texlive/texmf-dist/tex/generic/oberdiek/ltxcmds.sty)))
(/usr/share/texlive/texmf-dist/tex/latex/oberdiek/kvoptions.sty
(/usr/share/texlive/texmf-dist/tex/generic/oberdiek/kvsetkeys.sty
(/usr/share/texlive/texmf-dist/tex/generic/oberdiek/etexcmds.sty)))
(/usr/share/texlive/texmf-dist/tex/generic/oberdiek/pdftexcmds.sty)
(/usr/share/texlive/texmf-dist/tex/latex/latexconfig/epstopdf-sys.cfg))
Overfull \hbox (15.26395pt too wide) in paragraph at lines 469--469
[][][][][]\OT1/cmr/bx/n/10 SOFT[]
[1{/var/lib/texmf/fonts/map/pdftex/updmap/pdftex.map} <../examples/fcn8s/cats.j
pg>] (./unet.aux) )
(see the transcript file for additional information)</usr/share/texlive/texmf-d
ist/fonts/type1/public/amsfonts/cm/cmbx10.pfb></usr/share/texlive/texmf-dist/fo
nts/type1/public/amsfonts/cm/cmr10.pfb></usr/share/texlive/texmf-dist/fonts/typ
e1/public/amsfonts/cm/cmr9.pfb>
Output written on unet.pdf (1 page, 101043 bytes).
Transcript written on unet.log.
rm: cannot remove '*.vscodeLog': No such file or directory
➜ pyexamples git:(master) ✗

unet.png

Pretty neat, isn’t it?

2. Provided Examples

Let's FIRST take a look at how many LaTeX files there are under the folder examples.

➜  PlotNeuralNet git:(master) ✗ cd examples 
➜ examples git:(master) ✗ find . -depth -name "*.tex"
./fcn32s/fcn32.tex
./fcn8s/fcn8.tex
./HED/HED.tex
./SoftmaxLoss/SoftmaxLoss.tex
./Unet/Unet.tex
./Unet_Ushape/Unet_ushape.tex
./VGG16/vgg16.tex

We then enter each subfolder and run the command ../../tikzmake.sh example_name, for instance:

➜  examples git:(master) ✗ cd fcn32s
➜ fcn32s git:(master) ✗ ../../tikzmake.sh fcn32

It's WEIRD that, so far, each .tex file is removed after the .pdf file for the AI model architecture has been generated. Anyway, let's take a look at all the generated files in .png format.

2.1 fcn32

fcn32.png

2.2 fcn8

fcn8.png

2.3 HED

HED.png

2.4 SoftmaxLoss

SoftmaxLoss.png

2.5 Unet

Unet.png

2.6 Unet_ushape

Unet_ushape.png

2.7 vgg16

vgg16.png

Beautiful Sakura 1 Beautiful Sakura 2 Beautiful Sakura 3
Beautiful Sakura 4 Beautiful Sakura 5 Beautiful Sakura 6
Beautiful Sakura 7 Beautiful Sakura 8 Beautiful Sakura 9

For simplicity, let's just pick the Python Server way to install Netron, via pip install --user netron. After installation:

➜  ~ pip show netron
Name: netron
Version: 4.0.6
Summary: Viewer for neural network, deep learning and machine learning models
Home-page: https://github.com/lutzroeder/netron
Author: Lutz Roeder
Author-email: lutzroeder@users.noreply.github.com
License: MIT
Location: /home/longervision/.local/lib/python3.6/site-packages
Requires:
Required-by:

We then test out some popular models, including

bvlcalexnet-9.onnx finetune_flickr_style.caffemodel

inception_v3.ckpt
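
The Python package can also be driven from a script instead of the netron command line. A minimal sketch (the model file name is taken from the list above):

import netron

# Serve the model in the browser; netron picks sensible host/port defaults on its own.
netron.start('bvlcalexnet-9.onnx')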

Today is Easter Sunday; the Prime Minister of the United Kingdom, Boris Johnson, recovered from COVID-19. Canada has been suffering from COVID-19 for a month already. I'm re-writing this blog, FIRST written in September 2019 and UPDATED ONCE in December 2019.

Merry Christmas and Happy New Year, everybody. I've been back in Vancouver for several days. These days, I'm updating this blog, FIRST written in September 2019. 2020 is coming, and we're getting one year older. Kind of sad, huh?

😝

Okay… No matter what, let's enjoy the song first: WE ARE YOUNG. Today, I joined the Free Software Foundation and started my journey of supporting Open Source Software BY CASH. For me, it's not about poverty or wealth. It's ALL about FAITH.

Writing something about the Raspberry Pi is a way to say GOODBYE to my Raspberry Pi 3B and, at the same time, WELCOME the Raspberry Pi 4. Our target today is to build an AI edge computing endpoint, as in the following YouTube video:

1. About Raspberry Pi 4

1.1 Raspberry Pi 4 vs. Raspberry Pi 3B+

Before we start, let’s carry out a simple comparison between Raspberry Pi 4 and Raspberry Pi 3B+.

1.2 Raspbian Installation

➜  raspbian sudo dd bs=4M if=2020-02-13-raspbian-buster.img of=/dev/mmcblk0 conv=fsync 
[sudo] password for longervision:
903+0 records in
903+0 records out
3787456512 bytes (3.8 GB, 3.5 GiB) copied, 198.861 s, 19.0 MB/s

1.3 BCM2711 is detected as BCM2835

➜  ~ cat /proc/cpuinfo
processor : 0
model name : ARMv7 Processor rev 3 (v7l)
BogoMIPS : 108.00
Features : half thumb fastmult vfp edsp neon vfpv3 tls vfpv4 idiva idivt vfpd32 lpae evtstrm crc32
CPU implementer : 0x41
CPU architecture: 7
CPU variant : 0x0
CPU part : 0xd08
CPU revision : 3

processor : 1
model name : ARMv7 Processor rev 3 (v7l)
BogoMIPS : 108.00
Features : half thumb fastmult vfp edsp neon vfpv3 tls vfpv4 idiva idivt vfpd32 lpae evtstrm crc32
CPU implementer : 0x41
CPU architecture: 7
CPU variant : 0x0
CPU part : 0xd08
CPU revision : 3

processor : 2
model name : ARMv7 Processor rev 3 (v7l)
BogoMIPS : 108.00
Features : half thumb fastmult vfp edsp neon vfpv3 tls vfpv4 idiva idivt vfpd32 lpae evtstrm crc32
CPU implementer : 0x41
CPU architecture: 7
CPU variant : 0x0
CPU part : 0xd08
CPU revision : 3

processor : 3
model name : ARMv7 Processor rev 3 (v7l)
BogoMIPS : 108.00
Features : half thumb fastmult vfp edsp neon vfpv3 tls vfpv4 idiva idivt vfpd32 lpae evtstrm crc32
CPU implementer : 0x41
CPU architecture: 7
CPU variant : 0x0
CPU part : 0xd08
CPU revision : 3

Hardware : BCM2835
Revision : c03111
Serial : 100000006c0c9b01
Model : Raspberry Pi 4 Model B Rev 1.1

This seems to be a well-known quirk rather than a real misdetection: /proc/cpuinfo keeps reporting the original BCM2835 family name even on a BCM2711 board. Raspberry Pi 4's specification can be retrieved from The MagPi Magazine. More details about the development history of the Raspberry Pi can be found on Wikipedia.

2. Movidius Neural Compute Stick on Raspberry Pi 4

Then, we just follow these two blogs, Run NCS Applications on Raspberry Pi and Adding AI to the Raspberry Pi with the Movidius Neural Compute Stick, to test out the Intel Movidius Neural Compute Stick 2:

Intel Movidius Neural Compute Stick 2 Intel Movidius Neural Compute Stick 1

The Intel Movidius Neural Compute Stick 1 is NOT listed on Intel's official website any more, but GitHub support for it can still be found at https://github.com/movidius/ncsdk.

2.1 NCSDK Installation

We FIRST need to have the ncsdk installed. Yup, here, as described in Run NCS Applications on Raspberry Pi, we carry out the installation directly under the folder ...../ncsdk/api/src.

➜  src make
cc -I/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/shared -I/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/pc -I/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLinkConsole/pc -I /home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/swCommon/include/ -I /home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/shared/include/ -D__PC__ -DUSE_USB_VSC -DVERSION_NAME="\"`cat ./version.txt`\"" -DDEVICE_SHELL_ENABLED -DEXCLUDE_HIGHCLASS -O2 -Wall -pthread -fPIC -MMD -MP -I. -I../include -I/usr/include/libusb-1.0 -c mvnc_api.c -o obj-armv7l/mvnc_api.o
mvnc_api.c:457:12: warning: ‘deviceGetNumberOfDevices’ defined but not used [-Wunused-function]
static int deviceGetNumberOfDevices()
^~~~~~~~~~~~~~~~~~~~~~~~
mvnc_api.c: In function ‘ncGraphCreate’:
mvnc_api.c:898:5: warning: ‘strncpy’ specified bound 28 equals destination size [-Wstringop-truncation]
strncpy(g->name, name, NC_MAX_NAME_SIZE);
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
mvnc_api.c: In function ‘ncFifoCreate’:
mvnc_api.c:2210:5: warning: ‘strncpy’ specified bound 28 equals destination size [-Wstringop-truncation]
strncpy(handle->name, name, NC_MAX_NAME_SIZE);
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
cc -I/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/shared -I/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/pc -I/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLinkConsole/pc -I /home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/swCommon/include/ -I /home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/shared/include/ -D__PC__ -DUSE_USB_VSC -DVERSION_NAME="\"`cat ./version.txt`\"" -DDEVICE_SHELL_ENABLED -DEXCLUDE_HIGHCLASS -O2 -Wall -pthread -fPIC -MMD -MP -I. -I../include -I/usr/include/libusb-1.0 -c fp16.c -o obj-armv7l/fp16.o
cc -I/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/shared -I/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/pc -I/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLinkConsole/pc -I /home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/swCommon/include/ -I /home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/shared/include/ -D__PC__ -DUSE_USB_VSC -DVERSION_NAME="\"`cat ./version.txt`\"" -DDEVICE_SHELL_ENABLED -DEXCLUDE_HIGHCLASS -O2 -Wall -pthread -fPIC -MMD -MP -I. -I../include -I/usr/include/libusb-1.0 -c /home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/pc/usb_boot.c -o obj-armv7l//home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/pc/usb_boot.o
cc -I/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/shared -I/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/pc -I/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLinkConsole/pc -I /home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/swCommon/include/ -I /home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/shared/include/ -D__PC__ -DUSE_USB_VSC -DVERSION_NAME="\"`cat ./version.txt`\"" -DDEVICE_SHELL_ENABLED -DEXCLUDE_HIGHCLASS -O2 -Wall -pthread -fPIC -MMD -MP -I. -I../include -I/usr/include/libusb-1.0 -c /home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/pc/pcie_host.c -o obj-armv7l//home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/pc/pcie_host.o
cc -I/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/shared -I/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/pc -I/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLinkConsole/pc -I /home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/swCommon/include/ -I /home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/shared/include/ -D__PC__ -DUSE_USB_VSC -DVERSION_NAME="\"`cat ./version.txt`\"" -DDEVICE_SHELL_ENABLED -DEXCLUDE_HIGHCLASS -O2 -Wall -pthread -fPIC -MMD -MP -I. -I../include -I/usr/include/libusb-1.0 -c /home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/shared/XLink.c -o obj-armv7l//home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/shared/XLink.o
cc -I/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/shared -I/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/pc -I/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLinkConsole/pc -I /home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/swCommon/include/ -I /home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/shared/include/ -D__PC__ -DUSE_USB_VSC -DVERSION_NAME="\"`cat ./version.txt`\"" -DDEVICE_SHELL_ENABLED -DEXCLUDE_HIGHCLASS -O2 -Wall -pthread -fPIC -MMD -MP -I. -I../include -I/usr/include/libusb-1.0 -c /home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/shared/XLinkDispatcher.c -o obj-armv7l//home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/shared/XLinkDispatcher.o
cc -I/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/shared -I/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/pc -I/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLinkConsole/pc -I /home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/swCommon/include/ -I /home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/shared/include/ -D__PC__ -DUSE_USB_VSC -DVERSION_NAME="\"`cat ./version.txt`\"" -DDEVICE_SHELL_ENABLED -DEXCLUDE_HIGHCLASS -O2 -Wall -pthread -fPIC -MMD -MP -I. -I../include -I/usr/include/libusb-1.0 -c /home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLinkConsole/pc/XLinkConsole.c -o obj-armv7l//home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLinkConsole/pc/XLinkConsole.o
/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLinkConsole/pc/XLinkConsole.c: In function ‘shellThreadWriter’:
/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLinkConsole/pc/XLinkConsole.c:44:33: warning: pointer targets in passing argument 2 of ‘XLinkWriteData’ differ in signedness [-Wpointer-sign]
XLinkWriteData(cId, str, bytes);
^~~
In file included from /home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLinkConsole/pc/XLinkConsole.c:18:
/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/shared/XLink.h:54:14: note: expected ‘const uint8_t *’ {aka ‘const unsigned char *’} but argument is of type ‘char *’
XLinkError_t XLinkWriteData(streamId_t streamId, const uint8_t* buffer, int size);
^~~~~~~~~~~~~~
/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLinkConsole/pc/XLinkConsole.c: In function ‘shellThreadReader’:
/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLinkConsole/pc/XLinkConsole.c:81:5: warning: implicit declaration of function ‘pthread_create’ [-Wimplicit-function-declaration]
pthread_create(&shellWriter, NULL, shellThreadWriter, (void*) context);
^~~~~~~~~~~~~~
cc -I/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/shared -I/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/pc -I/home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLinkConsole/pc -I /home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/swCommon/include/ -I /home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/shared/include/ -D__PC__ -DUSE_USB_VSC -DVERSION_NAME="\"`cat ./version.txt`\"" -DDEVICE_SHELL_ENABLED -DEXCLUDE_HIGHCLASS -O2 -Wall -pthread -fPIC -MMD -MP -I. -I../include -I/usr/include/libusb-1.0 -c /home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/pc/XLinkPlatform.c -o obj-armv7l//home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/pc/XLinkPlatform.o
if [ ! -e ./version.txt ] ; then echo "missing version.txt file"; exit 1; fi;
cc -shared obj-armv7l/mvnc_api.o obj-armv7l/fp16.o obj-armv7l//home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/pc/usb_boot.o obj-armv7l//home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/pc/pcie_host.o obj-armv7l//home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/shared/XLink.o obj-armv7l//home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/shared/XLinkDispatcher.o obj-armv7l//home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLinkConsole/pc/XLinkConsole.o obj-armv7l//home/pi/Downloads/ncsdk-2.10.01.01/api/src/common/components/XLink/pc/XLinkPlatform.o -o obj-armv7l/libmvnc.so.0 -lpthread -lusb-1.0 -ldl
ln -fs obj-armv7l/libmvnc.so.0 libmvnc.so
ln -fs obj-armv7l/libmvnc.so.0 libmvnc.so.0
NCSDK FW successfully installed
➜ src sudo make install
NCSDK FW successfully installed
mkdir -p /usr/local/include/
mkdir -p /usr/local/include/mvnc2
mkdir -p /usr/local/lib/
cp obj-armv7l/libmvnc.so.0 /usr/local/lib/
ln -fs libmvnc.so.0 /usr/local/lib/libmvnc.so
cp ../include/mvnc.h /usr/local/include/mvnc2
ln -fs /usr/local/include/mvnc2/mvnc.h /usr/local/include/mvnc.h
mkdir -p /usr/local/lib/mvnc
cp mvnc/MvNCAPI-*.mvcmd /usr/local/lib/mvnc/
mkdir -p /etc/udev/rules.d/
cp 97-usbboot.rules /etc/udev/rules.d/
mkdir -p /usr/local/lib/python3.7/dist-packages
mkdir -p /usr/local/lib/python3.7/dist-packages
cp -r ../python/mvnc /usr/local/lib/python3.7/dist-packages/
cp -r ../python/mvnc /usr/local/lib/python3.7/dist-packages/
udevadm control --reload-rules
udevadm trigger
ldconfig
➜ src

2.2 Test NCSDK Example Apps

2.2.1 For Movidius NCS 1

➜  hello_ncs_py lsusb
......
Bus 001 Device 004: ID 03e7:2150 Intel Myriad VPU [Movidius Neural Compute Stick]
......
➜ hello_ncs_py python hello_ncs.py
D: [ 0] ncDeviceCreate:307 ncDeviceCreate index 0
D: [ 0] ncDeviceCreate:307 ncDeviceCreate index 1
D: [ 0] ncDeviceOpen:523 File path /usr/local/lib/mvnc/MvNCAPI-ma2450.mvcmd
I: [ 0] ncDeviceOpen:529 ncDeviceOpen() XLinkBootRemote returned success 0
I: [ 0] ncDeviceOpen:567 XLinkConnect done - link Id 0
D: [ 0] ncDeviceOpen:581 done
I: [ 0] ncDeviceOpen:583 Booted 1.1-ma2450 -> VSC
I: [ 0] getDevAttributes:382 Device attributes
I: [ 0] getDevAttributes:385 Device FW version: 2.a.2450.8a
I: [ 0] getDevAttributes:387 mvTensorVersion 2.10
I: [ 0] getDevAttributes:388 Maximum graphs: 10
I: [ 0] getDevAttributes:389 Maximum fifos: 20
I: [ 0] getDevAttributes:391 Maximum graph option class: 1
I: [ 0] getDevAttributes:393 Maximum device option class: 1
I: [ 0] getDevAttributes:394 Device memory capacity: 522047856
Hello NCS! Device opened normally.
I: [ 0] ncDeviceClose:775 closing device
Goodbye NCS! Device closed normally.
NCS device working.

2.2.2 For Movidius NCS 2

➜  hello_ncs_py lsusb
......
Bus 001 Device 007: ID 03e7:2485 Intel Movidius MyriadX
......
➜ hello_ncs_py python hello_ncs.py
D: [ 0] ncDeviceCreate:307 ncDeviceCreate index 0
D: [ 0] ncDeviceCreate:307 ncDeviceCreate index 1
D: [ 0] ncDeviceOpen:523 File path /usr/local/lib/mvnc/MvNCAPI-ma2480.mvcmd
W: [ 0] ncDeviceOpen:527 ncDeviceOpen() XLinkBootRemote returned error 3
Error - Could not open NCS device.

The above bug has ALREADY been explained in the main online resources:

All of these hint that OpenVINO should be used instead of NCSDK 2.

2.3 mvnc Python Package

➜  ~ python
Python 3.7.3 (default, Dec 20 2019, 18:57:59)
[GCC 8.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import mvnc
>>> mvnc.__file__
'/usr/local/lib/python3.7/dist-packages/mvnc/__init__.py'
>>>
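
With the package importable, the hello_ncs.py flow shown above boils down to roughly the following sketch, using the NCSDK 2 Python API (mvncapi):

from mvnc import mvncapi

# Enumerate attached Neural Compute Sticks.
devices = mvncapi.enumerate_devices()
if not devices:
    raise SystemExit('No NCS device found.')

# Open and close the first stick, as hello_ncs.py does.
device = mvncapi.Device(devices[0])
device.open()
print('Hello NCS! Device opened normally.')
device.close()
device.destroy()
print('Goodbye NCS! Device closed normally.')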

3. Transitioning from Intel Movidius Neural Compute SDK to Intel OpenVINO

By following Intel’s official documentation Transitioning from Intel® Movidius™ Neural Compute SDK to Intel® Distribution of OpenVINO™ toolkit, we are transitioning to OpenVINO, which supports both Intel NCS 2 and Intel NCS 1.
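
On the Python side, the end goal of that transition looks roughly like the sketch below: the Inference Engine loads an IR (.xml/.bin pair) previously produced by the Model Optimizer and runs it on the MYRIAD device. The model file names here are hypothetical placeholders:

import numpy as np
from openvino.inference_engine import IECore

ie = IECore()
print(ie.available_devices)  # should include 'MYRIAD' when a stick is plugged in

# Hypothetical IR files produced beforehand by the Model Optimizer.
net = ie.read_network(model='model.xml', weights='model.bin')
exec_net = ie.load_network(network=net, device_name='MYRIAD')

# Feed a dummy blob with the network's input shape and collect the outputs.
input_name = next(iter(net.inputs))
blob = np.zeros(net.inputs[input_name].shape, dtype=np.float32)
result = exec_net.infer({input_name: blob})
print(list(result.keys()))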

3.1 Install OpenVINO for Raspbian

For the installation details of OpenVINO, please refer to the following 2 documents:

We now extract the MOST up-to-date l_openvino_toolkit_runtime_raspbian_p_2020.3.220.tgz under folder /opt/intel/openvino. Let’s take a brief look at this folder:

➜  openvino pwd
/opt/intel/openvino
➜ openvino ll
total 28K
drwxr-xr-x 2 pi pi 4.0K Jan 27 08:06 bin
drwxr-xr-x 4 pi pi 4.0K Jan 27 08:06 deployment_tools
drwxr-xr-x 2 pi pi 4.0K Jan 27 08:08 documentation
lrwxrwxrwx 1 pi pi 33 Jan 27 08:04 inference_engine -> deployment_tools/inference_engine
drwxr-xr-x 2 pi pi 4.0K Jan 27 08:08 install_dependencies
drwxr-xr-x 5 pi pi 4.0K Jan 27 08:08 licensing
drwxr-xr-x 8 pi pi 4.0K Jan 27 08:09 opencv
drwxr-xr-x 6 pi pi 4.0K Jan 27 08:09 python
➜ openvino ls inference_engine
external include lib samples share version.txt

Clearly, by comparing with the OpenVINO™ Toolkit - Deep Learning Deployment Toolkit repository, we know that the open-source version of deployment_tools contains more content than the trimmed version for Raspbian. We'll need model-optimizer for sure. Therefore, we checked out dldt and put it under the folder /opt/intel.

➜  intel pwd
/opt/intel
➜ intel ll
total 12K
drwxr-xr-x 8 pi pi 4.0K Apr 7 06:37 dldt
drwxr-xr-x 9 pi pi 4.0K Apr 7 14:04 l_openvino_toolkit_runtime_raspbian_p_2020.1.023
lrwxrwxrwx 1 root root 48 Apr 7 06:29 openvino -> l_openvino_toolkit_runtime_raspbian_p_2020.1.023

3.2 Build OpenVINO Samples

Before starting to build the OpenVINO samples, please have OpenCV built from source and installed. You can of course directly run the command build_samples.sh to build ALL the samples. However, I personally would like you to build and install OpenCV from source FIRST.

Note: Be sure to enable -DCMAKE_CXX_FLAGS='-march=armv7-a' while building dldt/samples, which is exactly the same as the source under l_openvino_toolkit_runtime_raspbian_p_2020.1.023/inference_engine/samples/cpp. In fact, to build l_openvino_toolkit_runtime_raspbian_p_2020.1.023/inference_engine/samples/c, -DCMAKE_CXX_FLAGS='-march=armv7-a' also needs to be enabled.

After having successfully built C/C++ samples, let’s enter folder /opt/intel/openvino/inference_engine/samples.

3.3 Device Query

3.3.1 C

There is NO such executable as hello_query_device_c.

3.3.2 C++

For NCS 2

➜  samples $ ./cpp/build/armv7l/Release/hello_query_device
Available devices:
Device: MYRIAD
Metrics:
DEVICE_THERMAL : UNSUPPORTED TYPE
RANGE_FOR_ASYNC_INFER_REQUESTS : { 3, 6, 1 }
SUPPORTED_CONFIG_KEYS : [ CONFIG_FILE PERF_COUNT EXCLUSIVE_ASYNC_REQUESTS DEVICE_ID VPU_MYRIAD_PLATFORM VPU_IGNORE_IR_STATISTIC VPU_CUSTOM_LAYERS VPU_MYRIAD_FORCE_RESET VPU_PRINT_RECEIVE_TENSOR_TIME LOG_LEVEL VPU_HW_STAGES_OPTIMIZATION ]
SUPPORTED_METRICS : [ DEVICE_THERMAL RANGE_FOR_ASYNC_INFER_REQUESTS SUPPORTED_CONFIG_KEYS SUPPORTED_METRICS OPTIMIZATION_CAPABILITIES FULL_DEVICE_NAME AVAILABLE_DEVICES ]
OPTIMIZATION_CAPABILITIES : [ FP16 ]
FULL_DEVICE_NAME : Intel Movidius Myriad X VPU
Default values for device configuration keys:
CONFIG_FILE : ""
PERF_COUNT : OFF
EXCLUSIVE_ASYNC_REQUESTS : OFF
DEVICE_ID : ""
VPU_MYRIAD_PLATFORM : ""
VPU_IGNORE_IR_STATISTIC : OFF
VPU_CUSTOM_LAYERS : ""
VPU_MYRIAD_FORCE_RESET : OFF
VPU_PRINT_RECEIVE_TENSOR_TIME : OFF
LOG_LEVEL : LOG_NONE
VPU_HW_STAGES_OPTIMIZATION : ON

For NCS 1

➜  samples ./cpp/build/armv7l/Release/hello_query_device
Available devices:
Device: MYRIAD
Metrics:
DEVICE_THERMAL : UNSUPPORTED TYPE
RANGE_FOR_ASYNC_INFER_REQUESTS : { 3, 6, 1 }
SUPPORTED_CONFIG_KEYS : [ CONFIG_FILE PERF_COUNT EXCLUSIVE_ASYNC_REQUESTS DEVICE_ID VPU_MYRIAD_PLATFORM VPU_IGNORE_IR_STATISTIC VPU_CUSTOM_LAYERS VPU_MYRIAD_FORCE_RESET VPU_PRINT_RECEIVE_TENSOR_TIME LOG_LEVEL VPU_HW_STAGES_OPTIMIZATION ]
SUPPORTED_METRICS : [ DEVICE_THERMAL RANGE_FOR_ASYNC_INFER_REQUESTS SUPPORTED_CONFIG_KEYS SUPPORTED_METRICS OPTIMIZATION_CAPABILITIES FULL_DEVICE_NAME AVAILABLE_DEVICES ]
OPTIMIZATION_CAPABILITIES : [ FP16 ]
FULL_DEVICE_NAME : Intel Movidius Myriad 2 VPU
Default values for device configuration keys:
CONFIG_FILE : ""
PERF_COUNT : OFF
EXCLUSIVE_ASYNC_REQUESTS : OFF
DEVICE_ID : ""
VPU_MYRIAD_PLATFORM : ""
VPU_IGNORE_IR_STATISTIC : OFF
VPU_CUSTOM_LAYERS : ""
VPU_MYRIAD_FORCE_RESET : OFF
VPU_PRINT_RECEIVE_TENSOR_TIME : OFF
LOG_LEVEL : LOG_NONE
VPU_HW_STAGES_OPTIMIZATION : ON

3.3.3 Python

For NCS 2

➜  samples python ./python/hello_query_device/hello_query_device.py
Available devices:
Device: MYRIAD
Metrics:
DEVICE_THERMAL: UNSUPPORTED TYPE
RANGE_FOR_ASYNC_INFER_REQUESTS: 3, 6, 1
SUPPORTED_CONFIG_KEYS: CONFIG_FILE, PERF_COUNT, EXCLUSIVE_ASYNC_REQUESTS, DEVICE_ID, VPU_MYRIAD_PLATFORM, VPU_IGNORE_IR_STATISTIC, VPU_CUSTOM_LAYERS, VPU_MYRIAD_FORCE_RESET, VPU_PRINT_RECEIVE_TENSOR_TIME, LOG_LEVEL, VPU_HW_STAGES_OPTIMIZATION
SUPPORTED_METRICS: DEVICE_THERMAL, RANGE_FOR_ASYNC_INFER_REQUESTS, SUPPORTED_CONFIG_KEYS, SUPPORTED_METRICS, OPTIMIZATION_CAPABILITIES, FULL_DEVICE_NAME, AVAILABLE_DEVICES
OPTIMIZATION_CAPABILITIES: FP16
FULL_DEVICE_NAME: Intel Movidius Myriad X VPU
AVAILABLE_DEVICES: 1.1.1-ma2480

Default values for device configuration keys:
CONFIG_FILE:
PERF_COUNT: OFF
EXCLUSIVE_ASYNC_REQUESTS: OFF
DEVICE_ID:
VPU_MYRIAD_PLATFORM:
VPU_IGNORE_IR_STATISTIC: OFF
VPU_CUSTOM_LAYERS:
VPU_MYRIAD_FORCE_RESET: OFF
VPU_PRINT_RECEIVE_TENSOR_TIME: OFF
LOG_LEVEL: LOG_NONE
VPU_HW_STAGES_OPTIMIZATION: ON

For NCS 1

➜  samples python ./python/hello_query_device/hello_query_device.py
Available devices:
Device: MYRIAD
Metrics:
DEVICE_THERMAL: UNSUPPORTED TYPE
RANGE_FOR_ASYNC_INFER_REQUESTS: 3, 6, 1
SUPPORTED_CONFIG_KEYS: CONFIG_FILE, PERF_COUNT, EXCLUSIVE_ASYNC_REQUESTS, DEVICE_ID, VPU_MYRIAD_PLATFORM, VPU_IGNORE_IR_STATISTIC, VPU_CUSTOM_LAYERS, VPU_MYRIAD_FORCE_RESET, VPU_PRINT_RECEIVE_TENSOR_TIME, LOG_LEVEL, VPU_HW_STAGES_OPTIMIZATION
SUPPORTED_METRICS: DEVICE_THERMAL, RANGE_FOR_ASYNC_INFER_REQUESTS, SUPPORTED_CONFIG_KEYS, SUPPORTED_METRICS, OPTIMIZATION_CAPABILITIES, FULL_DEVICE_NAME, AVAILABLE_DEVICES
OPTIMIZATION_CAPABILITIES: FP16
FULL_DEVICE_NAME: Intel Movidius Myriad 2 VPU
AVAILABLE_DEVICES: 1.1.1-ma2450

Default values for device configuration keys:
CONFIG_FILE:
PERF_COUNT: OFF
EXCLUSIVE_ASYNC_REQUESTS: OFF
DEVICE_ID:
VPU_MYRIAD_PLATFORM:
VPU_IGNORE_IR_STATISTIC: OFF
VPU_CUSTOM_LAYERS:
VPU_MYRIAD_FORCE_RESET: OFF
VPU_PRINT_RECEIVE_TENSOR_TIME: OFF
LOG_LEVEL: LOG_NONE
VPU_HW_STAGES_OPTIMIZATION: ON
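
Both Python runs above are thin wrappers around IECore. Here is a minimal sketch of the same device query, assuming the OpenVINO 2020.x Python API (openvino.inference_engine):

# Minimal device-query sketch, assuming the OpenVINO 2020.x Python API.
from openvino.inference_engine import IECore

ie = IECore()
for device in ie.available_devices:        # e.g. ['MYRIAD'] with an NCS plugged in
    print("Device:", device)
    print("  FULL_DEVICE_NAME :", ie.get_metric(device, "FULL_DEVICE_NAME"))
    print("  SUPPORTED_METRICS:", ie.get_metric(device, "SUPPORTED_METRICS"))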

3.4 Object Detection

Please refer to Device-specific Plugin Libraries for ALL possible device types.

We then download the two model files (face-detection-adas-0001.xml and face-detection-adas-0001.bin) as given in the blog Install OpenVINO™ toolkit for Raspbian* OS.

Please note that:

Afterwards, start running Object Detection on 2 images, me.jpg and parents.jpg, in C, C++ and Python. All three samples follow the same Inference Engine workflow, sketched right below before the per-language runs.
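
A minimal Python sketch of that common workflow (read the IR, load it onto MYRIAD, infer on one image, keep detections above a confidence threshold), assuming the OpenVINO 2020.x Python API; the image path and the 0.5 threshold are illustrative assumptions, not the exact sample code:

# Minimal SSD face-detection sketch, assuming the OpenVINO 2020.x Python API.
# Paths and the 0.5 probability threshold are illustrative assumptions.
import cv2
from openvino.inference_engine import IECore

ie = IECore()
net = ie.read_network(model="face-detection-adas-0001.xml",
                      weights="face-detection-adas-0001.bin")
input_blob = next(iter(net.inputs))
output_blob = next(iter(net.outputs))
n, c, h, w = net.inputs[input_blob].shape        # 1 x 3 x 384 x 672 for this model

exec_net = ie.load_network(network=net, device_name="MYRIAD")

image = cv2.imread("parents.jpg")
ih, iw = image.shape[:2]
blob = cv2.resize(image, (w, h)).transpose((2, 0, 1)).reshape(n, c, h, w)

result = exec_net.infer(inputs={input_blob: blob})
# Output shape is [1, 1, N, 7]: [image_id, label, prob, x_min, y_min, x_max, y_max]
for detection in result[output_blob][0][0]:
    prob = float(detection[2])
    if prob > 0.5:
        x_min, y_min = int(detection[3] * iw), int(detection[4] * ih)
        x_max, y_max = int(detection[5] * iw), int(detection[6] * ih)
        print("prob = %f (%d, %d)-(%d, %d)" % (prob, x_min, y_min, x_max, y_max))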

3.4.1 C

For NCS 2

➜  samples ./c/build/armv7l/Release/object_detection_sample_ssd_c -m face-detection-adas-0001.xml -d MYRIAD -i parents.jpg
[ INFO ] InferenceEngine:
2.1.2020.3.0-3467-15f2c61a-releases/2020/3
[ INFO ] Parsing input parameters
[ INFO ] Files were added: 1
[ INFO ] parents.jpg
[ INFO ] Loading Inference Engine
[ INFO ] Device info:
MYRIAD
myriadPlugin version ......... 2.1
Build ......... 2020.3.0-3467-15f2c61a-releases/2020/3
[ INFO ] Loading network files:
face-detection-adas-0001.xml
face-detection-adas-0001.bin
[ INFO ] Preparing input blobs
[ WARNING ] Image is resized from (1280, 960) to (672, 384)
[ INFO ] Batch size is 1
[ INFO ] Preparing output blobs
[ INFO ] Loading model to the device
[ INFO ] Create infer request
[ INFO ] Start inference
[ INFO ] Processing output blobs
[0, 1] element, prob = 1.000000 (826, 366)-(1026, 644) batch id : 0 WILL BE PRINTED!
[1, 1] element, prob = 0.996582 (539, 173)-(693, 429) batch id : 0 WILL BE PRINTED!
[2, 1] element, prob = 0.539062 (1094, 47)-(1135, 96) batch id : 0 WILL BE PRINTED!
[3, 1] element, prob = 0.212402 (848, 22)-(886, 70) batch id : 0
[4, 1] element, prob = 0.052734 (7, 783)-(151, 957) batch id : 0
[5, 1] element, prob = 0.039551 (1033, 78)-(1070, 122) batch id : 0
[6, 1] element, prob = 0.034668 (1123, 75)-(1158, 122) batch id : 0
[7, 1] element, prob = 0.032227 (1091, 109)-(1130, 171) batch id : 0
[8, 1] element, prob = 0.032227 (1086, 14)-(1138, 85) batch id : 0
[9, 1] element, prob = 0.030273 (1109, 43)-(1150, 93) batch id : 0
[10, 1] element, prob = 0.029297 (1151, 83)-(1190, 131) batch id : 0
[11, 1] element, prob = 0.027344 (1030, 2)-(1073, 43) batch id : 0
[12, 1] element, prob = 0.027344 (1091, 6)-(1133, 55) batch id : 0
[13, 1] element, prob = 0.027344 (1064, 73)-(1098, 121) batch id : 0
[14, 1] element, prob = 0.027344 (1058, 113)-(1096, 168) batch id : 0
[15, 1] element, prob = 0.027344 (1143, 74)-(1211, 165) batch id : 0
[16, 1] element, prob = 0.026855 (876, 73)-(915, 118) batch id : 0
[17, 1] element, prob = 0.026855 (1047, -3)-(1110, 59) batch id : 0
[18, 1] element, prob = 0.026855 (849, 13)-(1030, 236) batch id : 0
[19, 1] element, prob = 0.025879 (941, 22)-(981, 74) batch id : 0
[20, 1] element, prob = 0.025879 (1030, 35)-(1074, 89) batch id : 0
[21, 1] element, prob = 0.025879 (845, 112)-(886, 167) batch id : 0
[22, 1] element, prob = 0.025879 (1036, 111)-(1076, 171) batch id : 0
[23, 1] element, prob = 0.025879 (1113, -1)-(1175, 59) batch id : 0
[24, 1] element, prob = 0.025879 (1002, 40)-(1093, 151) batch id : 0
[25, 1] element, prob = 0.024902 (1059, 0)-(1098, 43) batch id : 0
[26, 1] element, prob = 0.024902 (876, 29)-(918, 79) batch id : 0
[27, 1] element, prob = 0.024902 (1002, 69)-(1048, 123) batch id : 0
[28, 1] element, prob = 0.024902 (1133, 104)-(1170, 163) batch id : 0
[29, 1] element, prob = 0.024902 (1056, 145)-(1094, 208) batch id : 0
[30, 1] element, prob = 0.024902 (861, 2)-(918, 63) batch id : 0
[31, 1] element, prob = 0.023926 (966, 1)-(1018, 45) batch id : 0
[32, 1] element, prob = 0.023926 (1098, 63)-(1131, 109) batch id : 0
[33, 1] element, prob = 0.023926 (1010, 84)-(1092, 204) batch id : 0
[34, 1] element, prob = 0.023926 (1022, 139)-(1076, 224) batch id : 0
[35, 1] element, prob = 0.023926 (943, 34)-(1073, 208) batch id : 0
[36, 1] element, prob = 0.022949 (811, 3)-(851, 48) batch id : 0
[37, 1] element, prob = 0.022949 (1154, 27)-(1198, 79) batch id : 0
[38, 1] element, prob = 0.022949 (1000, 155)-(1046, 226) batch id : 0
[39, 1] element, prob = 0.022949 (1086, 145)-(1125, 208) batch id : 0
[40, 1] element, prob = 0.022949 (958, 20)-(1030, 84) batch id : 0
[41, 1] element, prob = 0.022949 (934, 65)-(986, 155) batch id : 0
[42, 1] element, prob = 0.022949 (1045, 99)-(1106, 178) batch id : 0
[43, 1] element, prob = 0.022949 (983, -9)-(1141, 105) batch id : 0
[44, 1] element, prob = 0.022949 (1065, -16)-(1192, 124) batch id : 0
[45, 1] element, prob = 0.022949 (922, 81)-(1087, 304) batch id : 0
[46, 1] element, prob = 0.021973 (796, 3)-(834, 47) batch id : 0
[47, 1] element, prob = 0.021973 (911, 4)-(953, 55) batch id : 0
[48, 1] element, prob = 0.021973 (797, 29)-(835, 79) batch id : 0
[49, 1] element, prob = 0.021973 (821, 34)-(855, 81) batch id : 0
[50, 1] element, prob = 0.021973 (910, 28)-(945, 78) batch id : 0
[51, 1] element, prob = 0.021973 (755, 77)-(790, 125) batch id : 0
[52, 1] element, prob = 0.021973 (969, 80)-(1008, 128) batch id : 0
[53, 1] element, prob = 0.021973 (938, 108)-(980, 173) batch id : 0
[54, 1] element, prob = 0.021973 (1123, 186)-(1161, 252) batch id : 0
[55, 1] element, prob = 0.021973 (985, 28)-(1056, 105) batch id : 0
[56, 1] element, prob = 0.021973 (845, 56)-(898, 135) batch id : 0
[57, 1] element, prob = 0.021973 (1023, 62)-(1080, 131) batch id : 0
[58, 1] element, prob = 0.021973 (1041, 40)-(1125, 143) batch id : 0
[59, 1] element, prob = 0.021973 (1035, 116)-(1110, 242) batch id : 0
[60, 1] element, prob = 0.021973 (990, 166)-(1054, 269) batch id : 0
[61, 1] element, prob = 0.021973 (1078, 169)-(1130, 261) batch id : 0
[62, 1] element, prob = 0.021973 (971, 13)-(1140, 235) batch id : 0
[63, 1] element, prob = 0.021973 (980, 125)-(1108, 286) batch id : 0
[64, 1] element, prob = 0.020996 (940, 5)-(987, 55) batch id : 0
[65, 1] element, prob = 0.020996 (991, 0)-(1043, 41) batch id : 0
[66, 1] element, prob = 0.020996 (1126, 0)-(1165, 42) batch id : 0
[67, 1] element, prob = 0.020996 (998, 41)-(1039, 86) batch id : 0
[68, 1] element, prob = 0.020996 (1068, 40)-(1103, 86) batch id : 0
[69, 1] element, prob = 0.020996 (821, 77)-(856, 123) batch id : 0
[70, 1] element, prob = 0.020996 (913, 80)-(949, 128) batch id : 0
[71, 1] element, prob = 0.020996 (821, 112)-(861, 164) batch id : 0
[72, 1] element, prob = 0.020996 (878, 113)-(919, 175) batch id : 0
[73, 1] element, prob = 0.020996 (910, 110)-(950, 170) batch id : 0
[74, 1] element, prob = 0.020996 (1149, 111)-(1189, 163) batch id : 0
[75, 1] element, prob = 0.020996 (870, 149)-(918, 215) batch id : 0
[76, 1] element, prob = 0.020996 (1033, 152)-(1070, 209) batch id : 0
[77, 1] element, prob = 0.020996 (1128, 141)-(1165, 204) batch id : 0
[78, 1] element, prob = 0.020996 (1147, 135)-(1188, 199) batch id : 0
[79, 1] element, prob = 0.020996 (1064, 188)-(1099, 246) batch id : 0
[80, 1] element, prob = 0.020996 (1238, 903)-(1279, 963) batch id : 0
[81, 1] element, prob = 0.020996 (1146, 19)-(1208, 109) batch id : 0
[82, 1] element, prob = 0.020996 (1054, 58)-(1110, 126) batch id : 0
[83, 1] element, prob = 0.020996 (1111, 99)-(1190, 228) batch id : 0
[84, 1] element, prob = 0.020996 (1052, 161)-(1105, 265) batch id : 0
[85, 1] element, prob = 0.020996 (1060, 145)-(1139, 284) batch id : 0
[86, 1] element, prob = 0.020996 (-16, 860)-(79, 985) batch id : 0
[87, 1] element, prob = 0.020996 (1174, -17)-(1281, 136) batch id : 0
[88, 1] element, prob = 0.020996 (1073, 32)-(1198, 190) batch id : 0
[89, 1] element, prob = 0.020996 (950, 188)-(1072, 366) batch id : 0
[90, 1] element, prob = 0.020020 (943, 81)-(978, 130) batch id : 0
[91, 1] element, prob = 0.020020 (968, 104)-(1008, 159) batch id : 0
[92, 1] element, prob = 0.020020 (930, 148)-(973, 211) batch id : 0
[93, 1] element, prob = 0.020020 (1092, 191)-(1128, 246) batch id : 0
[94, 1] element, prob = 0.020020 (1124, 225)-(1161, 291) batch id : 0
[95, 1] element, prob = 0.020020 (1007, -2)-(1083, 55) batch id : 0
[96, 1] element, prob = 0.020020 (890, 60)-(960, 138) batch id : 0
[97, 1] element, prob = 0.020020 (962, 65)-(1023, 137) batch id : 0
[98, 1] element, prob = 0.020020 (1101, 145)-(1183, 284) batch id : 0
[99, 1] element, prob = 0.020020 (866, -10)-(1029, 107) batch id : 0
[100, 1] element, prob = 0.020020 (1005, 179)-(1111, 360) batch id : 0
[101, 1] element, prob = 0.020020 (1019, 342)-(1283, 785) batch id : 0
[102, 1] element, prob = 0.019043 (758, 2)-(795, 45) batch id : 0
[103, 1] element, prob = 0.019043 (756, 38)-(792, 89) batch id : 0
[104, 1] element, prob = 0.019043 (968, 36)-(1011, 82) batch id : 0
[105, 1] element, prob = 0.019043 (729, 77)-(763, 127) batch id : 0
[106, 1] element, prob = 0.019043 (1003, 109)-(1044, 169) batch id : 0
[107, 1] element, prob = 0.019043 (1030, 190)-(1066, 242) batch id : 0
[108, 1] element, prob = 0.019043 (887, 83)-(971, 209) batch id : 0
[109, 1] element, prob = 0.019043 (995, 99)-(1055, 192) batch id : 0
[110, 1] element, prob = 0.019043 (902, 135)-(963, 221) batch id : 0
[ INFO ] Image out_0.bmp created!
[ INFO ] Execution successful
➜  samples ./c/build/armv7l/Release/object_detection_sample_ssd_c -m face-detection-adas-0001.xml -d MYRIAD -i me.jpg     
[ INFO ] InferenceEngine:
2.1.2020.3.0-3467-15f2c61a-releases/2020/3
[ INFO ] Parsing input parameters
[ INFO ] Files were added: 1
[ INFO ] me.jpg
[ INFO ] Loading Inference Engine
[ INFO ] Device info:
MYRIAD
myriadPlugin version ......... 2.1
Build ......... 2020.3.0-3467-15f2c61a-releases/2020/3
[ INFO ] Loading network files:
face-detection-adas-0001.xml
face-detection-adas-0001.bin
[ INFO ] Preparing input blobs
[ WARNING ] Image is resized from (924, 1280) to (672, 384)
[ INFO ] Batch size is 1
[ INFO ] Preparing output blobs
[ INFO ] Loading model to the device
[ INFO ] Create infer request
[ INFO ] Start inference
[ INFO ] Processing output blobs
[0, 1] element, prob = 0.870117 (421, 202)-(745, 608) batch id : 0 WILL BE PRINTED!
[1, 1] element, prob = 0.026855 (-55, 556)-(234, 1100) batch id : 0
[2, 1] element, prob = 0.021973 (20, 925)-(239, 1340) batch id : 0
[3, 1] element, prob = 0.020020 (-18, -144)-(179, 581) batch id : 0
[4, 1] element, prob = 0.018066 (636, 315)-(757, 471) batch id : 0
[5, 1] element, prob = 0.017090 (-27, 672)-(137, 1217) batch id : 0
[6, 1] element, prob = 0.017090 (74, 976)-(364, 1322) batch id : 0
[7, 1] element, prob = 0.016113 (291, 958)-(349, 1122) batch id : 0
[8, 1] element, prob = 0.016113 (329, 963)-(391, 1126) batch id : 0
[9, 1] element, prob = 0.016113 (203, 928)-(423, 1346) batch id : 0
[10, 1] element, prob = 0.015625 (694, 1034)-(732, 1140) batch id : 0
[11, 1] element, prob = 0.015625 (832, 1121)-(937, 1295) batch id : 0
[12, 1] element, prob = 0.015625 (219, 39)-(388, 627) batch id : 0
[13, 1] element, prob = 0.015625 (76, 785)-(357, 1160) batch id : 0
[14, 1] element, prob = 0.014648 (355, 616)-(475, 778) batch id : 0
[15, 1] element, prob = 0.014648 (240, 779)-(331, 1048) batch id : 0
[16, 1] element, prob = 0.013672 (0, 893)-(24, 984) batch id : 0
[17, 1] element, prob = 0.013672 (658, 306)-(752, 405) batch id : 0
[18, 1] element, prob = 0.013672 (204, 1198)-(267, 1275) batch id : 0
[19, 1] element, prob = 0.013672 (322, 611)-(430, 794) batch id : 0
[20, 1] element, prob = 0.013672 (171, 244)-(423, 798) batch id : 0
[21, 1] element, prob = 0.013672 (252, 798)-(539, 1177) batch id : 0
[22, 1] element, prob = 0.013672 (133, 875)-(301, 1420) batch id : 0
[23, 1] element, prob = 0.013672 (443, 202)-(802, 1022) batch id : 0
[24, 1] element, prob = 0.012695 (718, 466)-(757, 540) batch id : 0
[25, 1] element, prob = 0.012695 (166, 527)-(206, 594) batch id : 0
[26, 1] element, prob = 0.012695 (694, 511)-(731, 590) batch id : 0
[27, 1] element, prob = 0.012695 (705, 441)-(771, 555) batch id : 0
[28, 1] element, prob = 0.012695 (272, 962)-(322, 1117) batch id : 0
[29, 1] element, prob = 0.012695 (268, 486)-(376, 688) batch id : 0
[30, 1] element, prob = 0.012695 (323, 489)-(422, 690) batch id : 0
[31, 1] element, prob = 0.012695 (277, 607)-(384, 796) batch id : 0
[32, 1] element, prob = 0.012695 (233, 709)-(342, 924) batch id : 0
[33, 1] element, prob = 0.012695 (256, 672)-(406, 957) batch id : 0
[34, 1] element, prob = 0.012695 (252, 1043)-(431, 1214) batch id : 0
[35, 1] element, prob = 0.011719 (695, 466)-(733, 539) batch id : 0
[36, 1] element, prob = 0.011719 (673, 514)-(709, 587) batch id : 0
[37, 1] element, prob = 0.011719 (370, 1005)-(399, 1078) batch id : 0
[38, 1] element, prob = 0.011719 (670, 256)-(753, 346) batch id : 0
[39, 1] element, prob = 0.011719 (151, 500)-(221, 610) batch id : 0
[40, 1] element, prob = 0.011719 (447, 489)-(511, 608) batch id : 0
[41, 1] element, prob = 0.011719 (126, 625)-(200, 719) batch id : 0
[42, 1] element, prob = 0.011719 (299, 626)-(387, 709) batch id : 0
[43, 1] element, prob = 0.011719 (164, 673)-(251, 747) batch id : 0
[44, 1] element, prob = 0.011719 (398, 850)-(453, 1008) batch id : 0
[45, 1] element, prob = 0.011719 (425, 851)-(476, 1008) batch id : 0
[46, 1] element, prob = 0.011719 (310, 915)-(371, 1077) batch id : 0
[47, 1] element, prob = 0.011719 (357, 925)-(413, 1073) batch id : 0
[48, 1] element, prob = 0.011719 (378, 923)-(435, 1074) batch id : 0
[49, 1] element, prob = 0.011719 (468, 906)-(521, 1070) batch id : 0
[50, 1] element, prob = 0.011719 (284, 1016)-(357, 1188) batch id : 0
[51, 1] element, prob = 0.011719 (203, -95)-(283, 182) batch id : 0
[52, 1] element, prob = 0.011719 (269, 378)-(374, 595) batch id : 0
[53, 1] element, prob = 0.011719 (-22, 575)-(71, 831) batch id : 0
[54, 1] element, prob = 0.011719 (318, 698)-(433, 911) batch id : 0
[55, 1] element, prob = 0.011719 (221, 716)-(395, 1258) batch id : 0
[56, 1] element, prob = 0.011719 (859, 771)-(987, 1518) batch id : 0
[57, 1] element, prob = 0.010742 (145, 466)-(185, 527) batch id : 0
[58, 1] element, prob = 0.010742 (166, 463)-(204, 521) batch id : 0
[59, 1] element, prob = 0.010742 (185, 477)-(224, 537) batch id : 0
[60, 1] element, prob = 0.010742 (214, 477)-(250, 540) batch id : 0
[61, 1] element, prob = 0.010742 (458, 478)-(488, 536) batch id : 0
[62, 1] element, prob = 0.010742 (151, 537)-(177, 594) batch id : 0
[63, 1] element, prob = 0.010742 (460, 522)-(488, 590) batch id : 0
[64, 1] element, prob = 0.010742 (481, 523)-(509, 588) batch id : 0
[65, 1] element, prob = 0.010742 (716, 512)-(755, 592) batch id : 0
[66, 1] element, prob = 0.010742 (129, 581)-(158, 641) batch id : 0
[67, 1] element, prob = 0.010742 (150, 579)-(179, 635) batch id : 0
[68, 1] element, prob = 0.010742 (171, 583)-(201, 641) batch id : 0
[69, 1] element, prob = 0.010742 (457, 564)-(491, 658) batch id : 0
[70, 1] element, prob = 0.010742 (481, 581)-(510, 647) batch id : 0
[71, 1] element, prob = 0.010742 (652, 565)-(689, 650) batch id : 0
[72, 1] element, prob = 0.010742 (674, 571)-(710, 650) batch id : 0
[73, 1] element, prob = 0.010742 (694, 571)-(734, 649) batch id : 0
[74, 1] element, prob = 0.010742 (143, 640)-(188, 705) batch id : 0
[75, 1] element, prob = 0.010742 (191, 743)-(224, 805) batch id : 0
[76, 1] element, prob = 0.010742 (437, 843)-(465, 920) batch id : 0
[77, 1] element, prob = 0.010742 (-1, 943)-(22, 1037) batch id : 0
[78, 1] element, prob = 0.010742 (414, 1007)-(443, 1080) batch id : 0
[79, 1] element, prob = 0.010742 (896, 1206)-(924, 1279) batch id : 0
[80, 1] element, prob = 0.010742 (678, 200)-(743, 294) batch id : 0
[81, 1] element, prob = 0.010742 (70, 374)-(190, 512) batch id : 0
[82, 1] element, prob = 0.010742 (422, 429)-(485, 563) batch id : 0
[83, 1] element, prob = 0.010742 (106, 477)-(213, 629) batch id : 0
[84, 1] element, prob = 0.010742 (428, 477)-(479, 615) batch id : 0
[85, 1] element, prob = 0.010742 (648, 498)-(727, 605) batch id : 0
[86, 1] element, prob = 0.010742 (703, 490)-(769, 614) batch id : 0
[87, 1] element, prob = 0.010742 (16, 531)-(150, 712) batch id : 0
[88, 1] element, prob = 0.010742 (87, 546)-(150, 685) batch id : 0
[89, 1] element, prob = 0.010742 (127, 552)-(197, 653) batch id : 0
[90, 1] element, prob = 0.010742 (425, 557)-(481, 671) batch id : 0
[91, 1] element, prob = 0.010742 (437, 509)-(516, 700) batch id : 0
[92, 1] element, prob = 0.010742 (467, 549)-(524, 678) batch id : 0
[93, 1] element, prob = 0.010742 (633, 551)-(704, 664) batch id : 0
[94, 1] element, prob = 0.010742 (416, 611)-(487, 735) batch id : 0
[95, 1] element, prob = 0.010742 (466, 602)-(530, 734) batch id : 0
[96, 1] element, prob = 0.010742 (124, 673)-(198, 759) batch id : 0
[97, 1] element, prob = 0.010742 (156, 705)-(260, 849) batch id : 0
[98, 1] element, prob = 0.010742 (54, 828)-(137, 933) batch id : 0
[99, 1] element, prob = 0.010742 (451, 812)-(498, 960) batch id : 0
[100, 1] element, prob = 0.010742 (470, 843)-(520, 1004) batch id : 0
[101, 1] element, prob = 0.010742 (447, 897)-(500, 1066) batch id : 0
[102, 1] element, prob = 0.010742 (641, 966)-(706, 1122) batch id : 0
[103, 1] element, prob = 0.010742 (164, 85)-(233, 410) batch id : 0
[104, 1] element, prob = 0.010742 (188, 497)-(307, 688) batch id : 0
[105, 1] element, prob = 0.010742 (205, 445)-(357, 715) batch id : 0
[106, 1] element, prob = 0.010742 (230, 602)-(341, 791) batch id : 0
[107, 1] element, prob = 0.010742 (274, 785)-(390, 1021) batch id : 0
[108, 1] element, prob = 0.010742 (26, 709)-(209, 1239) batch id : 0
[109, 1] element, prob = 0.010742 (309, 868)-(485, 1410) batch id : 0
[110, 1] element, prob = 0.010742 (229, 981)-(575, 1306) batch id : 0
[111, 1] element, prob = 0.010742 (29, 141)-(518, 883) batch id : 0
[ INFO ] Image out_0.bmp created!
[ INFO ] Execution successful

For NCS 1

➜  samples pwd
/opt/intel/openvino/inference_engine/samples
➜ samples ./c/build/armv7l/Release/object_detection_sample_ssd_c -m face-detection-adas-0001.xml -d MYRIAD -i parents.jpg
[ INFO ] InferenceEngine:
2.1.2020.3.0-3467-15f2c61a-releases/2020/3
[ INFO ] Parsing input parameters
[ INFO ] Files were added: 1
[ INFO ] parents.jpg
[ INFO ] Loading Inference Engine
[ INFO ] Device info:
MYRIAD
myriadPlugin version ......... 2.1
Build ......... 2020.3.0-3467-15f2c61a-releases/2020/3
[ INFO ] Loading network files:
face-detection-adas-0001.xml
face-detection-adas-0001.bin
[ INFO ] Preparing input blobs
[ WARNING ] Image is resized from (1280, 960) to (672, 384)
[ INFO ] Batch size is 1
[ INFO ] Preparing output blobs
[ INFO ] Loading model to the device
[ INFO ] Create infer request
[ INFO ] Start inference
[ INFO ] Processing output blobs
[0, 1] element, prob = 1.000000 (826, 366)-(1026, 644) batch id : 0 WILL BE PRINTED!
[1, 1] element, prob = 0.996582 (539, 173)-(693, 429) batch id : 0 WILL BE PRINTED!
[2, 1] element, prob = 0.552734 (1094, 47)-(1135, 95) batch id : 0 WILL BE PRINTED!
[3, 1] element, prob = 0.221680 (848, 22)-(886, 70) batch id : 0
[4, 1] element, prob = 0.053711 (8, 784)-(151, 956) batch id : 0
[5, 1] element, prob = 0.039551 (1033, 78)-(1070, 122) batch id : 0
[6, 1] element, prob = 0.034668 (1123, 75)-(1158, 122) batch id : 0
[7, 1] element, prob = 0.032227 (1086, 14)-(1138, 85) batch id : 0
[8, 1] element, prob = 0.031250 (1091, 110)-(1130, 171) batch id : 0
[9, 1] element, prob = 0.030273 (1108, 43)-(1150, 93) batch id : 0
[10, 1] element, prob = 0.029297 (1151, 83)-(1190, 131) batch id : 0
[11, 1] element, prob = 0.027344 (1091, 6)-(1133, 55) batch id : 0
[12, 1] element, prob = 0.027344 (1064, 73)-(1098, 121) batch id : 0
[13, 1] element, prob = 0.026855 (1030, 2)-(1073, 42) batch id : 0
[14, 1] element, prob = 0.026855 (1058, 113)-(1096, 168) batch id : 0
[15, 1] element, prob = 0.026855 (1047, -3)-(1110, 59) batch id : 0
[16, 1] element, prob = 0.026855 (1142, 75)-(1211, 165) batch id : 0
[17, 1] element, prob = 0.026855 (849, 14)-(1030, 236) batch id : 0
[18, 1] element, prob = 0.025879 (941, 22)-(981, 74) batch id : 0
[19, 1] element, prob = 0.025879 (1030, 36)-(1074, 89) batch id : 0
[20, 1] element, prob = 0.025879 (876, 73)-(915, 118) batch id : 0
[21, 1] element, prob = 0.025879 (845, 112)-(886, 167) batch id : 0
[22, 1] element, prob = 0.025879 (1036, 110)-(1076, 171) batch id : 0
[23, 1] element, prob = 0.025879 (1113, -1)-(1175, 59) batch id : 0
[24, 1] element, prob = 0.025879 (1003, 41)-(1094, 151) batch id : 0
[25, 1] element, prob = 0.024902 (1059, 0)-(1098, 43) batch id : 0
[26, 1] element, prob = 0.024902 (1002, 69)-(1048, 123) batch id : 0
[27, 1] element, prob = 0.024902 (1133, 105)-(1170, 164) batch id : 0
[28, 1] element, prob = 0.023926 (966, 1)-(1018, 45) batch id : 0
[29, 1] element, prob = 0.023926 (876, 29)-(918, 79) batch id : 0
[30, 1] element, prob = 0.023926 (1056, 145)-(1094, 208) batch id : 0
[31, 1] element, prob = 0.023926 (861, 2)-(918, 63) batch id : 0
[32, 1] element, prob = 0.023926 (1010, 84)-(1092, 204) batch id : 0
[33, 1] element, prob = 0.023926 (1022, 139)-(1076, 224) batch id : 0
[34, 1] element, prob = 0.023926 (942, 35)-(1072, 208) batch id : 0
[35, 1] element, prob = 0.022949 (811, 3)-(851, 48) batch id : 0
[36, 1] element, prob = 0.022949 (1098, 63)-(1131, 109) batch id : 0
[37, 1] element, prob = 0.022949 (983, -9)-(1141, 105) batch id : 0
[38, 1] element, prob = 0.022949 (921, 81)-(1087, 304) batch id : 0
[39, 1] element, prob = 0.021973 (796, 3)-(835, 47) batch id : 0
[40, 1] element, prob = 0.021973 (911, 4)-(953, 55) batch id : 0
[41, 1] element, prob = 0.021973 (797, 29)-(835, 79) batch id : 0
[42, 1] element, prob = 0.021973 (1154, 27)-(1198, 79) batch id : 0
[43, 1] element, prob = 0.021973 (969, 80)-(1008, 128) batch id : 0
[44, 1] element, prob = 0.021973 (1000, 155)-(1046, 226) batch id : 0
[45, 1] element, prob = 0.021973 (1086, 145)-(1125, 208) batch id : 0
[46, 1] element, prob = 0.021973 (1123, 186)-(1161, 252) batch id : 0
[47, 1] element, prob = 0.021973 (958, 20)-(1030, 84) batch id : 0
[48, 1] element, prob = 0.021973 (985, 28)-(1056, 105) batch id : 0
[49, 1] element, prob = 0.021973 (845, 56)-(898, 134) batch id : 0
[50, 1] element, prob = 0.021973 (934, 65)-(986, 155) batch id : 0
[51, 1] element, prob = 0.021973 (1046, 99)-(1107, 178) batch id : 0
[52, 1] element, prob = 0.021973 (1035, 116)-(1110, 241) batch id : 0
[53, 1] element, prob = 0.021973 (990, 166)-(1054, 269) batch id : 0
[54, 1] element, prob = 0.021973 (1065, -16)-(1192, 124) batch id : 0
[55, 1] element, prob = 0.021973 (971, 13)-(1140, 235) batch id : 0
[56, 1] element, prob = 0.021973 (980, 125)-(1108, 286) batch id : 0
[57, 1] element, prob = 0.020996 (991, 0)-(1043, 41) batch id : 0
[58, 1] element, prob = 0.020996 (1126, 0)-(1165, 42) batch id : 0
[59, 1] element, prob = 0.020996 (821, 35)-(856, 79) batch id : 0
[60, 1] element, prob = 0.020996 (910, 28)-(945, 78) batch id : 0
[61, 1] element, prob = 0.020996 (997, 41)-(1038, 86) batch id : 0
[62, 1] element, prob = 0.020996 (1068, 40)-(1103, 86) batch id : 0
[63, 1] element, prob = 0.020996 (754, 77)-(789, 125) batch id : 0
[64, 1] element, prob = 0.020996 (819, 77)-(855, 123) batch id : 0
[65, 1] element, prob = 0.020996 (913, 79)-(949, 128) batch id : 0
[66, 1] element, prob = 0.020996 (821, 112)-(861, 164) batch id : 0
[67, 1] element, prob = 0.020996 (938, 108)-(980, 173) batch id : 0
[68, 1] element, prob = 0.020996 (1149, 112)-(1189, 162) batch id : 0
[69, 1] element, prob = 0.020996 (870, 149)-(918, 215) batch id : 0
[70, 1] element, prob = 0.020996 (1033, 152)-(1070, 209) batch id : 0
[71, 1] element, prob = 0.020996 (1064, 188)-(1099, 246) batch id : 0
[72, 1] element, prob = 0.020996 (1023, 63)-(1081, 131) batch id : 0
[73, 1] element, prob = 0.020996 (1054, 58)-(1110, 126) batch id : 0
[74, 1] element, prob = 0.020996 (1041, 40)-(1125, 143) batch id : 0
[75, 1] element, prob = 0.020996 (1111, 99)-(1190, 228) batch id : 0
[76, 1] element, prob = 0.020996 (1078, 169)-(1130, 261) batch id : 0
[77, 1] element, prob = 0.020996 (950, 189)-(1072, 367) batch id : 0
[78, 1] element, prob = 0.020020 (940, 6)-(987, 56) batch id : 0
[79, 1] element, prob = 0.020020 (943, 81)-(978, 130) batch id : 0
[80, 1] element, prob = 0.020020 (878, 114)-(919, 175) batch id : 0
[81, 1] element, prob = 0.020020 (910, 110)-(950, 170) batch id : 0
[82, 1] element, prob = 0.020020 (968, 104)-(1008, 159) batch id : 0
[83, 1] element, prob = 0.020020 (1128, 141)-(1165, 204) batch id : 0
[84, 1] element, prob = 0.020020 (1147, 135)-(1188, 199) batch id : 0
[85, 1] element, prob = 0.020020 (1238, 903)-(1279, 963) batch id : 0
[86, 1] element, prob = 0.020020 (1007, -2)-(1083, 55) batch id : 0
[87, 1] element, prob = 0.020020 (1052, 161)-(1105, 266) batch id : 0
[88, 1] element, prob = 0.020020 (1061, 145)-(1140, 284) batch id : 0
[89, 1] element, prob = 0.020020 (-16, 860)-(79, 985) batch id : 0
[90, 1] element, prob = 0.020020 (866, -10)-(1029, 107) batch id : 0
[91, 1] element, prob = 0.020020 (1173, -17)-(1280, 136) batch id : 0
[92, 1] element, prob = 0.020020 (1072, 32)-(1197, 189) batch id : 0
[93, 1] element, prob = 0.020020 (1005, 179)-(1111, 359) batch id : 0
[94, 1] element, prob = 0.019043 (758, 2)-(795, 45) batch id : 0
[95, 1] element, prob = 0.019043 (968, 35)-(1011, 82) batch id : 0
[96, 1] element, prob = 0.019043 (1003, 109)-(1044, 169) batch id : 0
[97, 1] element, prob = 0.019043 (930, 148)-(973, 211) batch id : 0
[98, 1] element, prob = 0.019043 (1092, 191)-(1128, 246) batch id : 0
[99, 1] element, prob = 0.019043 (1123, 225)-(1161, 291) batch id : 0
[100, 1] element, prob = 0.019043 (1146, 19)-(1208, 109) batch id : 0
[101, 1] element, prob = 0.019043 (897, 59)-(958, 143) batch id : 0
[102, 1] element, prob = 0.019043 (962, 65)-(1023, 137) batch id : 0
[103, 1] element, prob = 0.019043 (886, 83)-(971, 209) batch id : 0
[104, 1] element, prob = 0.019043 (995, 99)-(1055, 192) batch id : 0
[105, 1] element, prob = 0.019043 (902, 135)-(963, 221) batch id : 0
[106, 1] element, prob = 0.019043 (1101, 145)-(1183, 284) batch id : 0
[107, 1] element, prob = 0.019043 (821, -17)-(946, 129) batch id : 0
[108, 1] element, prob = 0.019043 (940, -16)-(1076, 120) batch id : 0
[109, 1] element, prob = 0.019043 (819, 34)-(945, 205) batch id : 0
[110, 1] element, prob = 0.019043 (1019, 342)-(1283, 785) batch id : 0
[111, 1] element, prob = 0.018066 (1147, 7)-(1193, 54) batch id : 0
[112, 1] element, prob = 0.018066 (756, 38)-(792, 89) batch id : 0
[ INFO ] Image out_0.bmp created!
[ INFO ] Execution successful
➜  samples ./c/build/armv7l/Release/object_detection_sample_ssd_c -m face-detection-adas-0001.xml -d MYRIAD -i me.jpg     
[ INFO ] InferenceEngine:
2.1.2020.3.0-3467-15f2c61a-releases/2020/3
[ INFO ] Parsing input parameters
[ INFO ] Files were added: 1
[ INFO ] me.jpg
[ INFO ] Loading Inference Engine
[ INFO ] Device info:
MYRIAD
myriadPlugin version ......... 2.1
Build ......... 2020.3.0-3467-15f2c61a-releases/2020/3
[ INFO ] Loading network files:
face-detection-adas-0001.xml
face-detection-adas-0001.bin
[ INFO ] Preparing input blobs
[ WARNING ] Image is resized from (924, 1280) to (672, 384)
[ INFO ] Batch size is 1
[ INFO ] Preparing output blobs
[ INFO ] Loading model to the device
[ INFO ] Create infer request
[ INFO ] Start inference
[ INFO ] Processing output blobs
[0, 1] element, prob = 0.871094 (420, 202)-(745, 608) batch id : 0 WILL BE PRINTED!
[1, 1] element, prob = 0.026855 (-54, 555)-(234, 1100) batch id : 0
[2, 1] element, prob = 0.020996 (21, 925)-(239, 1340) batch id : 0
[3, 1] element, prob = 0.019043 (-18, -144)-(178, 583) batch id : 0
[4, 1] element, prob = 0.018066 (636, 315)-(757, 470) batch id : 0
[5, 1] element, prob = 0.017090 (75, 977)-(363, 1321) batch id : 0
[6, 1] element, prob = 0.016113 (291, 958)-(349, 1123) batch id : 0
[7, 1] element, prob = 0.015625 (329, 962)-(391, 1126) batch id : 0
[8, 1] element, prob = 0.015625 (832, 1121)-(937, 1293) batch id : 0
[9, 1] element, prob = 0.015625 (-27, 670)-(136, 1220) batch id : 0
[10, 1] element, prob = 0.015625 (76, 785)-(357, 1160) batch id : 0
[11, 1] element, prob = 0.015625 (204, 928)-(423, 1346) batch id : 0
[12, 1] element, prob = 0.014648 (694, 1035)-(732, 1141) batch id : 0
[13, 1] element, prob = 0.014648 (219, 40)-(388, 626) batch id : 0
[14, 1] element, prob = 0.013672 (658, 306)-(752, 405) batch id : 0
[15, 1] element, prob = 0.013672 (204, 1198)-(267, 1275) batch id : 0
[16, 1] element, prob = 0.013672 (322, 612)-(430, 793) batch id : 0
[17, 1] element, prob = 0.013672 (240, 779)-(331, 1046) batch id : 0
[18, 1] element, prob = 0.013672 (171, 245)-(424, 798) batch id : 0
[19, 1] element, prob = 0.013672 (443, 205)-(802, 1021) batch id : 0
[20, 1] element, prob = 0.012695 (694, 511)-(731, 590) batch id : 0
[21, 1] element, prob = 0.012695 (0, 893)-(24, 984) batch id : 0
[22, 1] element, prob = 0.012695 (272, 962)-(322, 1117) batch id : 0
[23, 1] element, prob = 0.012695 (268, 487)-(376, 688) batch id : 0
[24, 1] element, prob = 0.012695 (323, 490)-(422, 690) batch id : 0
[25, 1] element, prob = 0.012695 (277, 608)-(384, 795) batch id : 0
[26, 1] element, prob = 0.012695 (233, 710)-(342, 923) batch id : 0
[27, 1] element, prob = 0.012695 (256, 672)-(406, 957) batch id : 0
[28, 1] element, prob = 0.012695 (252, 799)-(540, 1176) batch id : 0
[29, 1] element, prob = 0.012695 (133, 876)-(301, 1420) batch id : 0
[30, 1] element, prob = 0.011719 (718, 466)-(757, 540) batch id : 0
[31, 1] element, prob = 0.011719 (166, 527)-(206, 594) batch id : 0
[32, 1] element, prob = 0.011719 (673, 514)-(709, 587) batch id : 0
[33, 1] element, prob = 0.011719 (-1, 943)-(22, 1038) batch id : 0
[34, 1] element, prob = 0.011719 (670, 256)-(753, 346) batch id : 0
[35, 1] element, prob = 0.011719 (705, 441)-(771, 555) batch id : 0
[36, 1] element, prob = 0.011719 (126, 625)-(200, 718) batch id : 0
[37, 1] element, prob = 0.011719 (164, 673)-(251, 747) batch id : 0
[38, 1] element, prob = 0.011719 (399, 850)-(453, 1008) batch id : 0
[39, 1] element, prob = 0.011719 (424, 851)-(476, 1009) batch id : 0
[40, 1] element, prob = 0.011719 (357, 925)-(413, 1073) batch id : 0
[41, 1] element, prob = 0.011719 (399, 919)-(454, 1074) batch id : 0
[42, 1] element, prob = 0.011719 (468, 906)-(521, 1070) batch id : 0
[43, 1] element, prob = 0.011719 (284, 1016)-(357, 1186) batch id : 0
[44, 1] element, prob = 0.011719 (269, 379)-(374, 595) batch id : 0
[45, 1] element, prob = 0.011719 (349, 563)-(485, 831) batch id : 0
[46, 1] element, prob = 0.011719 (252, 1042)-(431, 1215) batch id : 0
[47, 1] element, prob = 0.011719 (858, 771)-(987, 1518) batch id : 0
[48, 1] element, prob = 0.010742 (165, 463)-(204, 521) batch id : 0
[49, 1] element, prob = 0.010742 (185, 477)-(224, 537) batch id : 0
[50, 1] element, prob = 0.010742 (214, 477)-(250, 540) batch id : 0
[51, 1] element, prob = 0.010742 (458, 478)-(488, 536) batch id : 0
[52, 1] element, prob = 0.010742 (695, 466)-(733, 539) batch id : 0
[53, 1] element, prob = 0.010742 (460, 522)-(488, 590) batch id : 0
[54, 1] element, prob = 0.010742 (481, 523)-(509, 588) batch id : 0
[55, 1] element, prob = 0.010742 (716, 512)-(755, 592) batch id : 0
[56, 1] element, prob = 0.010742 (150, 579)-(179, 635) batch id : 0
[57, 1] element, prob = 0.010742 (171, 584)-(201, 641) batch id : 0
[58, 1] element, prob = 0.010742 (481, 582)-(510, 647) batch id : 0
[59, 1] element, prob = 0.010742 (652, 565)-(689, 650) batch id : 0
[60, 1] element, prob = 0.010742 (674, 571)-(710, 650) batch id : 0
[61, 1] element, prob = 0.010742 (694, 571)-(734, 648) batch id : 0
[62, 1] element, prob = 0.010742 (142, 640)-(188, 705) batch id : 0
[63, 1] element, prob = 0.010742 (335, 627)-(390, 695) batch id : 0
[64, 1] element, prob = 0.010742 (191, 744)-(225, 803) batch id : 0
[65, 1] element, prob = 0.010742 (370, 1005)-(399, 1078) batch id : 0
[66, 1] element, prob = 0.010742 (414, 1006)-(443, 1079) batch id : 0
[67, 1] element, prob = 0.010742 (896, 1207)-(924, 1280) batch id : 0
[68, 1] element, prob = 0.010742 (86, 386)-(160, 513) batch id : 0
[69, 1] element, prob = 0.010742 (106, 477)-(214, 630) batch id : 0
[70, 1] element, prob = 0.010742 (151, 501)-(221, 610) batch id : 0
[71, 1] element, prob = 0.010742 (428, 478)-(479, 615) batch id : 0
[72, 1] element, prob = 0.010742 (450, 478)-(506, 611) batch id : 0
[73, 1] element, prob = 0.010742 (647, 498)-(728, 605) batch id : 0
[74, 1] element, prob = 0.010742 (148, 556)-(220, 658) batch id : 0
[75, 1] element, prob = 0.010742 (425, 557)-(481, 671) batch id : 0
[76, 1] element, prob = 0.010742 (437, 510)-(516, 700) batch id : 0
[77, 1] element, prob = 0.010742 (463, 554)-(527, 674) batch id : 0
[78, 1] element, prob = 0.010742 (633, 551)-(704, 664) batch id : 0
[79, 1] element, prob = 0.010742 (280, 628)-(367, 711) batch id : 0
[80, 1] element, prob = 0.010742 (416, 612)-(488, 735) batch id : 0
[81, 1] element, prob = 0.010742 (155, 706)-(260, 850) batch id : 0
[82, 1] element, prob = 0.010742 (55, 828)-(137, 932) batch id : 0
[83, 1] element, prob = 0.010742 (451, 813)-(498, 960) batch id : 0
[84, 1] element, prob = 0.010742 (310, 914)-(371, 1076) batch id : 0
[85, 1] element, prob = 0.010742 (447, 896)-(500, 1065) batch id : 0
[86, 1] element, prob = 0.010742 (641, 966)-(706, 1121) batch id : 0
[87, 1] element, prob = 0.010742 (203, -95)-(283, 182) batch id : 0
[88, 1] element, prob = 0.010742 (164, 86)-(233, 410) batch id : 0
[89, 1] element, prob = 0.010742 (188, 498)-(307, 688) batch id : 0
[90, 1] element, prob = 0.010742 (205, 445)-(357, 715) batch id : 0
[91, 1] element, prob = 0.010742 (-22, 576)-(71, 832) batch id : 0
[92, 1] element, prob = 0.010742 (230, 603)-(342, 791) batch id : 0
[93, 1] element, prob = 0.010742 (318, 698)-(433, 909) batch id : 0
[94, 1] element, prob = 0.010742 (273, 784)-(390, 1021) batch id : 0
[95, 1] element, prob = 0.010742 (175, 740)-(444, 1237) batch id : 0
[ INFO ] Image out_0.bmp created!
[ INFO ] Execution successful

3.4.2 C++

For NCS 2

➜  samples ./cpp/build/armv7l/Release/object_detection_sample_ssd -m face-detection-adas-0001.xml -d MYRIAD -i me.jpg 
[ INFO ] InferenceEngine:
API version ............ 2.1
Build .................. 2020.3.0-3467-15f2c61a-releases/2020/3
Description ....... API
Parsing input parameters
[ INFO ] Files were added: 1
[ INFO ] me.jpg
[ INFO ] Loading Inference Engine
[ INFO ] Device info:
MYRIAD
myriadPlugin version ......... 2.1
Build ........... 2020.3.0-3467-15f2c61a-releases/2020/3
[ INFO ] Loading network files:
face-detection-adas-0001.xml
face-detection-adas-0001.bin
[ INFO ] Preparing input blobs
[ INFO ] Batch size is 1
[ INFO ] Preparing output blobs
[ INFO ] Loading model to the device
[ INFO ] Create infer request
[ WARNING ] Image is resized from (924, 1280) to (672, 384)
[ INFO ] Batch size is 1
[ INFO ] Start inference
[ INFO ] Processing output blobs
[0,1] element, prob = 0.870117 (421,202)-(745,608) batch id : 0 WILL BE PRINTED!
[1,1] element, prob = 0.0268555 (-55,556)-(234,1100) batch id : 0
[2,1] element, prob = 0.0219727 (20,925)-(239,1340) batch id : 0
[3,1] element, prob = 0.0200195 (-18,-144)-(179,581) batch id : 0
[4,1] element, prob = 0.0180664 (636,315)-(757,471) batch id : 0
[5,1] element, prob = 0.0170898 (-27,672)-(137,1217) batch id : 0
[6,1] element, prob = 0.0170898 (74,976)-(364,1322) batch id : 0
[7,1] element, prob = 0.0161133 (291,958)-(349,1122) batch id : 0
[8,1] element, prob = 0.0161133 (329,963)-(391,1126) batch id : 0
[9,1] element, prob = 0.0161133 (203,928)-(423,1346) batch id : 0
[10,1] element, prob = 0.015625 (694,1034)-(732,1140) batch id : 0
[11,1] element, prob = 0.015625 (832,1121)-(937,1295) batch id : 0
[12,1] element, prob = 0.015625 (219,39)-(388,627) batch id : 0
[13,1] element, prob = 0.015625 (76,785)-(357,1160) batch id : 0
[14,1] element, prob = 0.0146484 (355,616)-(475,778) batch id : 0
[15,1] element, prob = 0.0146484 (240,779)-(331,1048) batch id : 0
[16,1] element, prob = 0.0136719 (0,893)-(24,984) batch id : 0
[17,1] element, prob = 0.0136719 (658,306)-(752,405) batch id : 0
[18,1] element, prob = 0.0136719 (204,1198)-(267,1275) batch id : 0
[19,1] element, prob = 0.0136719 (322,611)-(430,794) batch id : 0
[20,1] element, prob = 0.0136719 (171,244)-(423,798) batch id : 0
[21,1] element, prob = 0.0136719 (252,798)-(539,1177) batch id : 0
[22,1] element, prob = 0.0136719 (133,875)-(301,1420) batch id : 0
[23,1] element, prob = 0.0136719 (443,202)-(802,1022) batch id : 0
[24,1] element, prob = 0.0126953 (718,466)-(757,540) batch id : 0
[25,1] element, prob = 0.0126953 (166,527)-(206,594) batch id : 0
[26,1] element, prob = 0.0126953 (694,511)-(731,590) batch id : 0
[27,1] element, prob = 0.0126953 (705,441)-(771,555) batch id : 0
[28,1] element, prob = 0.0126953 (272,962)-(322,1117) batch id : 0
[29,1] element, prob = 0.0126953 (268,486)-(376,688) batch id : 0
[30,1] element, prob = 0.0126953 (323,489)-(422,690) batch id : 0
[31,1] element, prob = 0.0126953 (277,607)-(384,796) batch id : 0
[32,1] element, prob = 0.0126953 (233,709)-(342,924) batch id : 0
[33,1] element, prob = 0.0126953 (256,672)-(406,957) batch id : 0
[34,1] element, prob = 0.0126953 (252,1043)-(431,1214) batch id : 0
[35,1] element, prob = 0.0117188 (695,466)-(733,539) batch id : 0
[36,1] element, prob = 0.0117188 (673,514)-(709,587) batch id : 0
[37,1] element, prob = 0.0117188 (370,1005)-(399,1078) batch id : 0
[38,1] element, prob = 0.0117188 (670,256)-(753,346) batch id : 0
[39,1] element, prob = 0.0117188 (151,500)-(221,610) batch id : 0
[40,1] element, prob = 0.0117188 (447,489)-(511,608) batch id : 0
[41,1] element, prob = 0.0117188 (126,625)-(200,719) batch id : 0
[42,1] element, prob = 0.0117188 (299,626)-(387,709) batch id : 0
[43,1] element, prob = 0.0117188 (164,673)-(251,747) batch id : 0
[44,1] element, prob = 0.0117188 (398,850)-(453,1008) batch id : 0
[45,1] element, prob = 0.0117188 (425,851)-(476,1008) batch id : 0
[46,1] element, prob = 0.0117188 (310,915)-(371,1077) batch id : 0
[47,1] element, prob = 0.0117188 (357,925)-(413,1073) batch id : 0
[48,1] element, prob = 0.0117188 (378,923)-(435,1074) batch id : 0
[49,1] element, prob = 0.0117188 (468,906)-(521,1070) batch id : 0
[50,1] element, prob = 0.0117188 (284,1016)-(357,1188) batch id : 0
[51,1] element, prob = 0.0117188 (203,-95)-(283,182) batch id : 0
[52,1] element, prob = 0.0117188 (269,378)-(374,595) batch id : 0
[53,1] element, prob = 0.0117188 (-22,575)-(71,831) batch id : 0
[54,1] element, prob = 0.0117188 (318,698)-(433,911) batch id : 0
[55,1] element, prob = 0.0117188 (221,716)-(395,1258) batch id : 0
[56,1] element, prob = 0.0117188 (859,771)-(987,1518) batch id : 0
[57,1] element, prob = 0.0107422 (145,466)-(185,527) batch id : 0
[58,1] element, prob = 0.0107422 (166,463)-(204,521) batch id : 0
[59,1] element, prob = 0.0107422 (185,477)-(224,537) batch id : 0
[60,1] element, prob = 0.0107422 (214,477)-(250,540) batch id : 0
[61,1] element, prob = 0.0107422 (458,478)-(488,536) batch id : 0
[62,1] element, prob = 0.0107422 (151,537)-(177,594) batch id : 0
[63,1] element, prob = 0.0107422 (460,522)-(488,590) batch id : 0
[64,1] element, prob = 0.0107422 (481,523)-(509,588) batch id : 0
[65,1] element, prob = 0.0107422 (716,512)-(755,592) batch id : 0
[66,1] element, prob = 0.0107422 (129,581)-(158,641) batch id : 0
[67,1] element, prob = 0.0107422 (150,579)-(179,635) batch id : 0
[68,1] element, prob = 0.0107422 (171,583)-(201,641) batch id : 0
[69,1] element, prob = 0.0107422 (457,564)-(491,658) batch id : 0
[70,1] element, prob = 0.0107422 (481,581)-(510,647) batch id : 0
[71,1] element, prob = 0.0107422 (652,565)-(689,650) batch id : 0
[72,1] element, prob = 0.0107422 (674,571)-(710,650) batch id : 0
[73,1] element, prob = 0.0107422 (694,571)-(734,649) batch id : 0
[74,1] element, prob = 0.0107422 (143,640)-(188,705) batch id : 0
[75,1] element, prob = 0.0107422 (191,743)-(224,805) batch id : 0
[76,1] element, prob = 0.0107422 (437,843)-(465,920) batch id : 0
[77,1] element, prob = 0.0107422 (-1,943)-(22,1037) batch id : 0
[78,1] element, prob = 0.0107422 (414,1007)-(443,1080) batch id : 0
[79,1] element, prob = 0.0107422 (896,1206)-(924,1279) batch id : 0
[80,1] element, prob = 0.0107422 (678,200)-(743,294) batch id : 0
[81,1] element, prob = 0.0107422 (70,374)-(190,512) batch id : 0
[82,1] element, prob = 0.0107422 (422,429)-(485,563) batch id : 0
[83,1] element, prob = 0.0107422 (106,477)-(213,629) batch id : 0
[84,1] element, prob = 0.0107422 (428,477)-(479,615) batch id : 0
[85,1] element, prob = 0.0107422 (648,498)-(727,605) batch id : 0
[86,1] element, prob = 0.0107422 (703,490)-(769,614) batch id : 0
[87,1] element, prob = 0.0107422 (16,531)-(150,712) batch id : 0
[88,1] element, prob = 0.0107422 (87,546)-(150,685) batch id : 0
[89,1] element, prob = 0.0107422 (127,552)-(197,653) batch id : 0
[90,1] element, prob = 0.0107422 (425,557)-(481,671) batch id : 0
[91,1] element, prob = 0.0107422 (437,509)-(516,700) batch id : 0
[92,1] element, prob = 0.0107422 (467,549)-(524,678) batch id : 0
[93,1] element, prob = 0.0107422 (633,551)-(704,664) batch id : 0
[94,1] element, prob = 0.0107422 (416,611)-(487,735) batch id : 0
[95,1] element, prob = 0.0107422 (466,602)-(530,734) batch id : 0
[96,1] element, prob = 0.0107422 (124,673)-(198,759) batch id : 0
[97,1] element, prob = 0.0107422 (156,705)-(260,849) batch id : 0
[98,1] element, prob = 0.0107422 (54,828)-(137,933) batch id : 0
[99,1] element, prob = 0.0107422 (451,812)-(498,960) batch id : 0
[100,1] element, prob = 0.0107422 (470,843)-(520,1004) batch id : 0
[101,1] element, prob = 0.0107422 (447,897)-(500,1066) batch id : 0
[102,1] element, prob = 0.0107422 (641,966)-(706,1122) batch id : 0
[103,1] element, prob = 0.0107422 (164,85)-(233,410) batch id : 0
[104,1] element, prob = 0.0107422 (188,497)-(307,688) batch id : 0
[105,1] element, prob = 0.0107422 (205,445)-(357,715) batch id : 0
[106,1] element, prob = 0.0107422 (230,602)-(341,791) batch id : 0
[107,1] element, prob = 0.0107422 (274,785)-(390,1021) batch id : 0
[108,1] element, prob = 0.0107422 (26,709)-(209,1239) batch id : 0
[109,1] element, prob = 0.0107422 (309,868)-(485,1410) batch id : 0
[110,1] element, prob = 0.0107422 (229,981)-(575,1306) batch id : 0
[111,1] element, prob = 0.0107422 (29,141)-(518,883) batch id : 0
[ INFO ] Image out_0.bmp created!
[ INFO ] Execution successful

[ INFO ] This sample is an API example, for any performance measurements please use the dedicated benchmark_app tool
➜  samples ./cpp/build/armv7l/Release/object_detection_sample_ssd -m face-detection-adas-0001.xml -d MYRIAD -i parents.jpg
[ INFO ] InferenceEngine:
API version ............ 2.1
Build .................. 2020.3.0-3467-15f2c61a-releases/2020/3
Description ....... API
Parsing input parameters
[ INFO ] Files were added: 1
[ INFO ] parents.jpg
[ INFO ] Loading Inference Engine
[ INFO ] Device info:
MYRIAD
myriadPlugin version ......... 2.1
Build ........... 2020.3.0-3467-15f2c61a-releases/2020/3
[ INFO ] Loading network files:
face-detection-adas-0001.xml
face-detection-adas-0001.bin
[ INFO ] Preparing input blobs
[ INFO ] Batch size is 1
[ INFO ] Preparing output blobs
[ INFO ] Loading model to the device
[ INFO ] Create infer request
[ WARNING ] Image is resized from (1280, 960) to (672, 384)
[ INFO ] Batch size is 1
[ INFO ] Start inference
[ INFO ] Processing output blobs
[0,1] element, prob = 1 (826,366)-(1026,644) batch id : 0 WILL BE PRINTED!
[1,1] element, prob = 0.996582 (539,173)-(693,429) batch id : 0 WILL BE PRINTED!
[2,1] element, prob = 0.539062 (1094,47)-(1135,96) batch id : 0 WILL BE PRINTED!
[3,1] element, prob = 0.212402 (848,22)-(886,70) batch id : 0
[4,1] element, prob = 0.0527344 (7,783)-(151,957) batch id : 0
[5,1] element, prob = 0.0395508 (1033,78)-(1070,122) batch id : 0
[6,1] element, prob = 0.034668 (1123,75)-(1158,122) batch id : 0
[7,1] element, prob = 0.0322266 (1091,109)-(1130,171) batch id : 0
[8,1] element, prob = 0.0322266 (1086,14)-(1138,85) batch id : 0
[9,1] element, prob = 0.0302734 (1109,43)-(1150,93) batch id : 0
[10,1] element, prob = 0.0292969 (1151,83)-(1190,131) batch id : 0
[11,1] element, prob = 0.0273438 (1030,2)-(1073,43) batch id : 0
[12,1] element, prob = 0.0273438 (1091,6)-(1133,55) batch id : 0
[13,1] element, prob = 0.0273438 (1064,73)-(1098,121) batch id : 0
[14,1] element, prob = 0.0273438 (1058,113)-(1096,168) batch id : 0
[15,1] element, prob = 0.0273438 (1143,74)-(1211,165) batch id : 0
[16,1] element, prob = 0.0268555 (876,73)-(915,118) batch id : 0
[17,1] element, prob = 0.0268555 (1047,-3)-(1110,59) batch id : 0
[18,1] element, prob = 0.0268555 (849,13)-(1030,236) batch id : 0
[19,1] element, prob = 0.0258789 (941,22)-(981,74) batch id : 0
[20,1] element, prob = 0.0258789 (1030,35)-(1074,89) batch id : 0
[21,1] element, prob = 0.0258789 (845,112)-(886,167) batch id : 0
[22,1] element, prob = 0.0258789 (1036,111)-(1076,171) batch id : 0
[23,1] element, prob = 0.0258789 (1113,-1)-(1175,59) batch id : 0
[24,1] element, prob = 0.0258789 (1002,40)-(1093,151) batch id : 0
[25,1] element, prob = 0.0249023 (1059,0)-(1098,43) batch id : 0
[26,1] element, prob = 0.0249023 (876,29)-(918,79) batch id : 0
[27,1] element, prob = 0.0249023 (1002,69)-(1048,123) batch id : 0
[28,1] element, prob = 0.0249023 (1133,104)-(1170,163) batch id : 0
[29,1] element, prob = 0.0249023 (1056,145)-(1094,208) batch id : 0
[30,1] element, prob = 0.0249023 (861,2)-(918,63) batch id : 0
[31,1] element, prob = 0.0239258 (966,1)-(1018,45) batch id : 0
[32,1] element, prob = 0.0239258 (1098,63)-(1131,109) batch id : 0
[33,1] element, prob = 0.0239258 (1010,84)-(1092,204) batch id : 0
[34,1] element, prob = 0.0239258 (1022,139)-(1076,224) batch id : 0
[35,1] element, prob = 0.0239258 (943,34)-(1073,208) batch id : 0
[36,1] element, prob = 0.0229492 (811,3)-(851,48) batch id : 0
[37,1] element, prob = 0.0229492 (1154,27)-(1198,79) batch id : 0
[38,1] element, prob = 0.0229492 (1000,155)-(1046,226) batch id : 0
[39,1] element, prob = 0.0229492 (1086,145)-(1125,208) batch id : 0
[40,1] element, prob = 0.0229492 (958,20)-(1030,84) batch id : 0
[41,1] element, prob = 0.0229492 (934,65)-(986,155) batch id : 0
[42,1] element, prob = 0.0229492 (1045,99)-(1106,178) batch id : 0
[43,1] element, prob = 0.0229492 (983,-9)-(1141,105) batch id : 0
[44,1] element, prob = 0.0229492 (1065,-16)-(1192,124) batch id : 0
[45,1] element, prob = 0.0229492 (922,81)-(1087,304) batch id : 0
[46,1] element, prob = 0.0219727 (796,3)-(834,47) batch id : 0
[47,1] element, prob = 0.0219727 (911,4)-(953,55) batch id : 0
[48,1] element, prob = 0.0219727 (797,29)-(835,79) batch id : 0
[49,1] element, prob = 0.0219727 (821,34)-(855,81) batch id : 0
[50,1] element, prob = 0.0219727 (910,28)-(945,78) batch id : 0
[51,1] element, prob = 0.0219727 (755,77)-(790,125) batch id : 0
[52,1] element, prob = 0.0219727 (969,80)-(1008,128) batch id : 0
[53,1] element, prob = 0.0219727 (938,108)-(980,173) batch id : 0
[54,1] element, prob = 0.0219727 (1123,186)-(1161,252) batch id : 0
[55,1] element, prob = 0.0219727 (985,28)-(1056,105) batch id : 0
[56,1] element, prob = 0.0219727 (845,56)-(898,135) batch id : 0
[57,1] element, prob = 0.0219727 (1023,62)-(1080,131) batch id : 0
[58,1] element, prob = 0.0219727 (1041,40)-(1125,143) batch id : 0
[59,1] element, prob = 0.0219727 (1035,116)-(1110,242) batch id : 0
[60,1] element, prob = 0.0219727 (990,166)-(1054,269) batch id : 0
[61,1] element, prob = 0.0219727 (1078,169)-(1130,261) batch id : 0
[62,1] element, prob = 0.0219727 (971,13)-(1140,235) batch id : 0
[63,1] element, prob = 0.0219727 (980,125)-(1108,286) batch id : 0
[64,1] element, prob = 0.0209961 (940,5)-(987,55) batch id : 0
[65,1] element, prob = 0.0209961 (991,0)-(1043,41) batch id : 0
[66,1] element, prob = 0.0209961 (1126,0)-(1165,42) batch id : 0
[67,1] element, prob = 0.0209961 (998,41)-(1039,86) batch id : 0
[68,1] element, prob = 0.0209961 (1068,40)-(1103,86) batch id : 0
[69,1] element, prob = 0.0209961 (821,77)-(856,123) batch id : 0
[70,1] element, prob = 0.0209961 (913,80)-(949,128) batch id : 0
[71,1] element, prob = 0.0209961 (821,112)-(861,164) batch id : 0
[72,1] element, prob = 0.0209961 (878,113)-(919,175) batch id : 0
[73,1] element, prob = 0.0209961 (910,110)-(950,170) batch id : 0
[74,1] element, prob = 0.0209961 (1149,111)-(1189,163) batch id : 0
[75,1] element, prob = 0.0209961 (870,149)-(918,215) batch id : 0
[76,1] element, prob = 0.0209961 (1033,152)-(1070,209) batch id : 0
[77,1] element, prob = 0.0209961 (1128,141)-(1165,204) batch id : 0
[78,1] element, prob = 0.0209961 (1147,135)-(1188,199) batch id : 0
[79,1] element, prob = 0.0209961 (1064,188)-(1099,246) batch id : 0
[80,1] element, prob = 0.0209961 (1238,903)-(1279,963) batch id : 0
[81,1] element, prob = 0.0209961 (1146,19)-(1208,109) batch id : 0
[82,1] element, prob = 0.0209961 (1054,58)-(1110,126) batch id : 0
[83,1] element, prob = 0.0209961 (1111,99)-(1190,228) batch id : 0
[84,1] element, prob = 0.0209961 (1052,161)-(1105,265) batch id : 0
[85,1] element, prob = 0.0209961 (1060,145)-(1139,284) batch id : 0
[86,1] element, prob = 0.0209961 (-16,860)-(79,985) batch id : 0
[87,1] element, prob = 0.0209961 (1174,-17)-(1281,136) batch id : 0
[88,1] element, prob = 0.0209961 (1073,32)-(1198,190) batch id : 0
[89,1] element, prob = 0.0209961 (950,188)-(1072,366) batch id : 0
[90,1] element, prob = 0.0200195 (943,81)-(978,130) batch id : 0
[91,1] element, prob = 0.0200195 (968,104)-(1008,159) batch id : 0
[92,1] element, prob = 0.0200195 (930,148)-(973,211) batch id : 0
[93,1] element, prob = 0.0200195 (1092,191)-(1128,246) batch id : 0
[94,1] element, prob = 0.0200195 (1124,225)-(1161,291) batch id : 0
[95,1] element, prob = 0.0200195 (1007,-2)-(1083,55) batch id : 0
[96,1] element, prob = 0.0200195 (890,60)-(960,138) batch id : 0
[97,1] element, prob = 0.0200195 (962,65)-(1023,137) batch id : 0
[98,1] element, prob = 0.0200195 (1101,145)-(1183,284) batch id : 0
[99,1] element, prob = 0.0200195 (866,-10)-(1029,107) batch id : 0
[100,1] element, prob = 0.0200195 (1005,179)-(1111,360) batch id : 0
[101,1] element, prob = 0.0200195 (1019,342)-(1283,785) batch id : 0
[102,1] element, prob = 0.019043 (758,2)-(795,45) batch id : 0
[103,1] element, prob = 0.019043 (756,38)-(792,89) batch id : 0
[104,1] element, prob = 0.019043 (968,36)-(1011,82) batch id : 0
[105,1] element, prob = 0.019043 (729,77)-(763,127) batch id : 0
[106,1] element, prob = 0.019043 (1003,109)-(1044,169) batch id : 0
[107,1] element, prob = 0.019043 (1030,190)-(1066,242) batch id : 0
[108,1] element, prob = 0.019043 (887,83)-(971,209) batch id : 0
[109,1] element, prob = 0.019043 (995,99)-(1055,192) batch id : 0
[110,1] element, prob = 0.019043 (902,135)-(963,221) batch id : 0
[ INFO ] Image out_0.bmp created!
object_detection_sample_ssd: ../../libusb/io.c:2116: handle_events: Assertion `ctx->pollfds_cnt >= internal_nfds' failed.
Aborted

For NCS 1

➜  samples ./cpp/build/armv7l/Release/object_detection_sample_ssd -m face-detection-adas-0001.xml -d MYRIAD -i me.jpg     
[ INFO ] InferenceEngine:
API version ............ 2.1
Build .................. 2020.3.0-3467-15f2c61a-releases/2020/3
Description ....... API
Parsing input parameters
[ INFO ] Files were added: 1
[ INFO ] me.jpg
[ INFO ] Loading Inference Engine
[ INFO ] Device info:
MYRIAD
myriadPlugin version ......... 2.1
Build ........... 2020.3.0-3467-15f2c61a-releases/2020/3
[ INFO ] Loading network files:
face-detection-adas-0001.xml
face-detection-adas-0001.bin
[ INFO ] Preparing input blobs
[ INFO ] Batch size is 1
[ INFO ] Preparing output blobs
[ INFO ] Loading model to the device
[ INFO ] Create infer request
[ WARNING ] Image is resized from (924, 1280) to (672, 384)
[ INFO ] Batch size is 1
[ INFO ] Start inference
[ INFO ] Processing output blobs
object_detection_sample_ssd: ../../libusb/io.c:2116: handle_events: Assertion `ctx->pollfds_cnt >= internal_nfds' failed.
Aborted
➜  samples ./cpp/build/armv7l/Release/object_detection_sample_ssd -m face-detection-adas-0001.xml -d MYRIAD -i parents.jpg
[ INFO ] InferenceEngine:
API version ............ 2.1
Build .................. 2020.3.0-3467-15f2c61a-releases/2020/3
Description ....... API
Parsing input parameters
[ INFO ] Files were added: 1
[ INFO ] parents.jpg
[ INFO ] Loading Inference Engine
[ INFO ] Device info:
MYRIAD
myriadPlugin version ......... 2.1
Build ........... 2020.3.0-3467-15f2c61a-releases/2020/3
[ INFO ] Loading network files:
face-detection-adas-0001.xml
face-detection-adas-0001.bin
[ INFO ] Preparing input blobs
[ INFO ] Batch size is 1
[ INFO ] Preparing output blobs
[ INFO ] Loading model to the device
[ INFO ] Create infer request
[ WARNING ] Image is resized from (1280, 960) to (672, 384)
[ INFO ] Batch size is 1
[ INFO ] Start inference
[ INFO ] Processing output blobs
object_detection_sample_ssd: ../../libusb/io.c:2116: handle_events: Assertion `ctx->pollfds_cnt >= internal_nfds' failed.
Aborted

3.3.3 Python

For NCS 2

➜  samples python ./python/object_detection_sample_ssd/object_detection_sample_ssd.py -m face-detection-adas-0001.xml -d MYRIAD -i me.jpg
[ INFO ] Loading Inference Engine
[ INFO ] Loading network files:
face-detection-adas-0001.xml
face-detection-adas-0001.bin
[ INFO ] Device info:
MYRIAD
MKLDNNPlugin version ......... 2.1
Build ........... 2020.3.0-3467-15f2c61a-releases/2020/3
inputs number: 1
input shape: [1, 3, 384, 672]
input key: data
[ INFO ] File was added:
[ INFO ] me.jpg
[ WARNING ] Image me.jpg is resized from (384, 672) to (384, 672)
[ INFO ] Preparing input blobs
[ INFO ] Batch size is 1
[ INFO ] Preparing output blobs
[ INFO ] Loading model to the device
[ INFO ] Creating infer request and starting inference
[ INFO ] Processing output blobs
[0,1] element, prob = 0.870117 (421,202)-(745,608) batch id : 0 WILL BE PRINTED!
[1,1] element, prob = 0.0268555 (-55,556)-(234,1100) batch id : 0
[2,1] element, prob = 0.0219727 (20,925)-(239,1340) batch id : 0
[3,1] element, prob = 0.0200195 (-18,-144)-(179,581) batch id : 0
[4,1] element, prob = 0.0180664 (636,315)-(757,471) batch id : 0
[5,1] element, prob = 0.0170898 (-27,672)-(137,1217) batch id : 0
[6,1] element, prob = 0.0170898 (74,976)-(364,1322) batch id : 0
[7,1] element, prob = 0.0161133 (291,958)-(349,1122) batch id : 0
[8,1] element, prob = 0.0161133 (329,963)-(391,1126) batch id : 0
[9,1] element, prob = 0.0161133 (203,928)-(423,1346) batch id : 0
[10,1] element, prob = 0.015625 (694,1034)-(732,1140) batch id : 0
[11,1] element, prob = 0.015625 (832,1121)-(937,1295) batch id : 0
[12,1] element, prob = 0.015625 (219,39)-(388,627) batch id : 0
[13,1] element, prob = 0.015625 (76,785)-(357,1160) batch id : 0
[14,1] element, prob = 0.0146484 (355,616)-(475,778) batch id : 0
[15,1] element, prob = 0.0146484 (240,779)-(331,1048) batch id : 0
[16,1] element, prob = 0.0136719 (0,893)-(24,984) batch id : 0
[17,1] element, prob = 0.0136719 (658,306)-(752,405) batch id : 0
[18,1] element, prob = 0.0136719 (204,1198)-(267,1275) batch id : 0
[19,1] element, prob = 0.0136719 (322,611)-(430,794) batch id : 0
[20,1] element, prob = 0.0136719 (171,244)-(423,798) batch id : 0
[21,1] element, prob = 0.0136719 (252,798)-(539,1177) batch id : 0
[22,1] element, prob = 0.0136719 (133,875)-(301,1420) batch id : 0
[23,1] element, prob = 0.0136719 (443,202)-(802,1022) batch id : 0
[24,1] element, prob = 0.0126953 (718,466)-(757,540) batch id : 0
[25,1] element, prob = 0.0126953 (166,527)-(206,594) batch id : 0
[26,1] element, prob = 0.0126953 (694,511)-(731,590) batch id : 0
[27,1] element, prob = 0.0126953 (705,441)-(771,555) batch id : 0
[28,1] element, prob = 0.0126953 (272,962)-(322,1117) batch id : 0
[29,1] element, prob = 0.0126953 (268,486)-(376,688) batch id : 0
[30,1] element, prob = 0.0126953 (323,489)-(422,690) batch id : 0
[31,1] element, prob = 0.0126953 (277,607)-(384,796) batch id : 0
[32,1] element, prob = 0.0126953 (233,709)-(342,924) batch id : 0
[33,1] element, prob = 0.0126953 (256,672)-(406,957) batch id : 0
[34,1] element, prob = 0.0126953 (252,1043)-(431,1214) batch id : 0
[35,1] element, prob = 0.0117188 (695,466)-(733,539) batch id : 0
[36,1] element, prob = 0.0117188 (673,514)-(709,587) batch id : 0
[37,1] element, prob = 0.0117188 (370,1005)-(399,1078) batch id : 0
[38,1] element, prob = 0.0117188 (670,256)-(753,346) batch id : 0
[39,1] element, prob = 0.0117188 (151,500)-(221,610) batch id : 0
[40,1] element, prob = 0.0117188 (447,489)-(511,608) batch id : 0
[41,1] element, prob = 0.0117188 (126,625)-(200,719) batch id : 0
[42,1] element, prob = 0.0117188 (299,626)-(387,709) batch id : 0
[43,1] element, prob = 0.0117188 (164,673)-(251,747) batch id : 0
[44,1] element, prob = 0.0117188 (398,850)-(453,1008) batch id : 0
[45,1] element, prob = 0.0117188 (425,851)-(476,1008) batch id : 0
[46,1] element, prob = 0.0117188 (310,915)-(371,1077) batch id : 0
[47,1] element, prob = 0.0117188 (357,925)-(413,1073) batch id : 0
[48,1] element, prob = 0.0117188 (378,923)-(435,1074) batch id : 0
[49,1] element, prob = 0.0117188 (468,906)-(521,1070) batch id : 0
[50,1] element, prob = 0.0117188 (284,1016)-(357,1188) batch id : 0
[51,1] element, prob = 0.0117188 (203,-95)-(283,182) batch id : 0
[52,1] element, prob = 0.0117188 (269,378)-(374,595) batch id : 0
[53,1] element, prob = 0.0117188 (-22,575)-(71,831) batch id : 0
[54,1] element, prob = 0.0117188 (318,698)-(433,911) batch id : 0
[55,1] element, prob = 0.0117188 (221,716)-(395,1258) batch id : 0
[56,1] element, prob = 0.0117188 (859,771)-(987,1518) batch id : 0
[57,1] element, prob = 0.0107422 (145,466)-(185,527) batch id : 0
[58,1] element, prob = 0.0107422 (166,463)-(204,521) batch id : 0
[59,1] element, prob = 0.0107422 (185,477)-(224,537) batch id : 0
[60,1] element, prob = 0.0107422 (214,477)-(250,540) batch id : 0
[61,1] element, prob = 0.0107422 (458,478)-(488,536) batch id : 0
[62,1] element, prob = 0.0107422 (151,537)-(177,594) batch id : 0
[63,1] element, prob = 0.0107422 (460,522)-(488,590) batch id : 0
[64,1] element, prob = 0.0107422 (481,523)-(509,588) batch id : 0
[65,1] element, prob = 0.0107422 (716,512)-(755,592) batch id : 0
[66,1] element, prob = 0.0107422 (129,581)-(158,641) batch id : 0
[67,1] element, prob = 0.0107422 (150,579)-(179,635) batch id : 0
[68,1] element, prob = 0.0107422 (171,583)-(201,641) batch id : 0
[69,1] element, prob = 0.0107422 (457,564)-(491,658) batch id : 0
[70,1] element, prob = 0.0107422 (481,581)-(510,647) batch id : 0
[71,1] element, prob = 0.0107422 (652,565)-(689,650) batch id : 0
[72,1] element, prob = 0.0107422 (674,571)-(710,650) batch id : 0
[73,1] element, prob = 0.0107422 (694,571)-(734,649) batch id : 0
[74,1] element, prob = 0.0107422 (143,640)-(188,705) batch id : 0
[75,1] element, prob = 0.0107422 (191,743)-(224,805) batch id : 0
[76,1] element, prob = 0.0107422 (437,843)-(465,920) batch id : 0
[77,1] element, prob = 0.0107422 (-1,943)-(22,1037) batch id : 0
[78,1] element, prob = 0.0107422 (414,1007)-(443,1080) batch id : 0
[79,1] element, prob = 0.0107422 (896,1206)-(924,1279) batch id : 0
[80,1] element, prob = 0.0107422 (678,200)-(743,294) batch id : 0
[81,1] element, prob = 0.0107422 (70,374)-(190,512) batch id : 0
[82,1] element, prob = 0.0107422 (422,429)-(485,563) batch id : 0
[83,1] element, prob = 0.0107422 (106,477)-(213,629) batch id : 0
[84,1] element, prob = 0.0107422 (428,477)-(479,615) batch id : 0
[85,1] element, prob = 0.0107422 (648,498)-(727,605) batch id : 0
[86,1] element, prob = 0.0107422 (703,490)-(769,614) batch id : 0
[87,1] element, prob = 0.0107422 (16,531)-(150,712) batch id : 0
[88,1] element, prob = 0.0107422 (87,546)-(150,685) batch id : 0
[89,1] element, prob = 0.0107422 (127,552)-(197,653) batch id : 0
[90,1] element, prob = 0.0107422 (425,557)-(481,671) batch id : 0
[91,1] element, prob = 0.0107422 (437,509)-(516,700) batch id : 0
[92,1] element, prob = 0.0107422 (467,549)-(524,678) batch id : 0
[93,1] element, prob = 0.0107422 (633,551)-(704,664) batch id : 0
[94,1] element, prob = 0.0107422 (416,611)-(487,735) batch id : 0
[95,1] element, prob = 0.0107422 (466,602)-(530,734) batch id : 0
[96,1] element, prob = 0.0107422 (124,673)-(198,759) batch id : 0
[97,1] element, prob = 0.0107422 (156,705)-(260,849) batch id : 0
[98,1] element, prob = 0.0107422 (54,828)-(137,933) batch id : 0
[99,1] element, prob = 0.0107422 (451,812)-(498,960) batch id : 0
[100,1] element, prob = 0.0107422 (470,843)-(520,1004) batch id : 0
[101,1] element, prob = 0.0107422 (447,897)-(500,1066) batch id : 0
[102,1] element, prob = 0.0107422 (641,966)-(706,1122) batch id : 0
[103,1] element, prob = 0.0107422 (164,85)-(233,410) batch id : 0
[104,1] element, prob = 0.0107422 (188,497)-(307,688) batch id : 0
[105,1] element, prob = 0.0107422 (205,445)-(357,715) batch id : 0
[106,1] element, prob = 0.0107422 (230,602)-(341,791) batch id : 0
[107,1] element, prob = 0.0107422 (274,785)-(390,1021) batch id : 0
[108,1] element, prob = 0.0107422 (26,709)-(209,1239) batch id : 0
[109,1] element, prob = 0.0107422 (309,868)-(485,1410) batch id : 0
[110,1] element, prob = 0.0107422 (229,981)-(575,1306) batch id : 0
[111,1] element, prob = 0.0107422 (29,141)-(518,883) batch id : 0
[ INFO ] Image out.bmp created!
[ INFO ] Execution successful

[ INFO ] This sample is an API example, for any performance measurements please use the dedicated benchmark_app tool
corrupted double-linked list
Aborted
➜  samples python ./python/object_detection_sample_ssd/object_detection_sample_ssd.py -m face-detection-adas-0001.xml -d MYRIAD -i parents.jpg
[ INFO ] Loading Inference Engine
[ INFO ] Loading network files:
face-detection-adas-0001.xml
face-detection-adas-0001.bin
[ INFO ] Device info:
MYRIAD
MKLDNNPlugin version ......... 2.1
Build ........... 2020.3.0-3467-15f2c61a-releases/2020/3
inputs number: 1
input shape: [1, 3, 384, 672]
input key: data
[ INFO ] File was added:
[ INFO ] parents.jpg
[ WARNING ] Image parents.jpg is resized from (384, 672) to (384, 672)
[ INFO ] Preparing input blobs
[ INFO ] Batch size is 1
[ INFO ] Preparing output blobs
[ INFO ] Loading model to the device
[ INFO ] Creating infer request and starting inference
[ INFO ] Processing output blobs
[0,1] element, prob = 1.0 (826,366)-(1026,644) batch id : 0 WILL BE PRINTED!
[1,1] element, prob = 0.996582 (539,173)-(693,429) batch id : 0 WILL BE PRINTED!
[2,1] element, prob = 0.539062 (1094,47)-(1135,96) batch id : 0 WILL BE PRINTED!
[3,1] element, prob = 0.212402 (848,22)-(886,70) batch id : 0
[4,1] element, prob = 0.0527344 (7,783)-(151,957) batch id : 0
[5,1] element, prob = 0.0395508 (1033,78)-(1070,122) batch id : 0
[6,1] element, prob = 0.034668 (1123,75)-(1158,122) batch id : 0
[7,1] element, prob = 0.0322266 (1091,109)-(1130,171) batch id : 0
[8,1] element, prob = 0.0322266 (1086,14)-(1138,85) batch id : 0
[9,1] element, prob = 0.0302734 (1109,43)-(1150,93) batch id : 0
[10,1] element, prob = 0.0292969 (1151,83)-(1190,131) batch id : 0
[11,1] element, prob = 0.0273438 (1030,2)-(1073,43) batch id : 0
[12,1] element, prob = 0.0273438 (1091,6)-(1133,55) batch id : 0
[13,1] element, prob = 0.0273438 (1064,73)-(1098,121) batch id : 0
[14,1] element, prob = 0.0273438 (1058,113)-(1096,168) batch id : 0
[15,1] element, prob = 0.0273438 (1143,74)-(1211,165) batch id : 0
[16,1] element, prob = 0.0268555 (876,73)-(915,118) batch id : 0
[17,1] element, prob = 0.0268555 (1047,-3)-(1110,59) batch id : 0
[18,1] element, prob = 0.0268555 (849,13)-(1030,236) batch id : 0
[19,1] element, prob = 0.0258789 (941,22)-(981,74) batch id : 0
[20,1] element, prob = 0.0258789 (1030,35)-(1074,89) batch id : 0
[21,1] element, prob = 0.0258789 (845,112)-(886,167) batch id : 0
[22,1] element, prob = 0.0258789 (1036,111)-(1076,171) batch id : 0
[23,1] element, prob = 0.0258789 (1113,-1)-(1175,59) batch id : 0
[24,1] element, prob = 0.0258789 (1002,40)-(1093,151) batch id : 0
[25,1] element, prob = 0.0249023 (1059,0)-(1098,43) batch id : 0
[26,1] element, prob = 0.0249023 (876,29)-(918,79) batch id : 0
[27,1] element, prob = 0.0249023 (1002,69)-(1048,123) batch id : 0
[28,1] element, prob = 0.0249023 (1133,104)-(1170,163) batch id : 0
[29,1] element, prob = 0.0249023 (1056,145)-(1094,208) batch id : 0
[30,1] element, prob = 0.0249023 (861,2)-(918,63) batch id : 0
[31,1] element, prob = 0.0239258 (966,1)-(1018,45) batch id : 0
[32,1] element, prob = 0.0239258 (1098,63)-(1131,109) batch id : 0
[33,1] element, prob = 0.0239258 (1010,84)-(1092,204) batch id : 0
[34,1] element, prob = 0.0239258 (1022,139)-(1076,224) batch id : 0
[35,1] element, prob = 0.0239258 (943,34)-(1073,208) batch id : 0
[36,1] element, prob = 0.0229492 (811,3)-(851,48) batch id : 0
[37,1] element, prob = 0.0229492 (1154,27)-(1198,79) batch id : 0
[38,1] element, prob = 0.0229492 (1000,155)-(1046,226) batch id : 0
[39,1] element, prob = 0.0229492 (1086,145)-(1125,208) batch id : 0
[40,1] element, prob = 0.0229492 (958,20)-(1030,84) batch id : 0
[41,1] element, prob = 0.0229492 (934,65)-(986,155) batch id : 0
[42,1] element, prob = 0.0229492 (1045,99)-(1106,178) batch id : 0
[43,1] element, prob = 0.0229492 (983,-9)-(1141,105) batch id : 0
[44,1] element, prob = 0.0229492 (1065,-16)-(1192,124) batch id : 0
[45,1] element, prob = 0.0229492 (922,81)-(1087,304) batch id : 0
[46,1] element, prob = 0.0219727 (796,3)-(834,47) batch id : 0
[47,1] element, prob = 0.0219727 (911,4)-(953,55) batch id : 0
[48,1] element, prob = 0.0219727 (797,29)-(835,79) batch id : 0
[49,1] element, prob = 0.0219727 (821,34)-(855,81) batch id : 0
[50,1] element, prob = 0.0219727 (910,28)-(945,78) batch id : 0
[51,1] element, prob = 0.0219727 (755,77)-(790,125) batch id : 0
[52,1] element, prob = 0.0219727 (969,80)-(1008,128) batch id : 0
[53,1] element, prob = 0.0219727 (938,108)-(980,173) batch id : 0
[54,1] element, prob = 0.0219727 (1123,186)-(1161,252) batch id : 0
[55,1] element, prob = 0.0219727 (985,28)-(1056,105) batch id : 0
[56,1] element, prob = 0.0219727 (845,56)-(898,135) batch id : 0
[57,1] element, prob = 0.0219727 (1023,62)-(1080,131) batch id : 0
[58,1] element, prob = 0.0219727 (1041,40)-(1125,143) batch id : 0
[59,1] element, prob = 0.0219727 (1035,116)-(1110,242) batch id : 0
[60,1] element, prob = 0.0219727 (990,166)-(1054,269) batch id : 0
[61,1] element, prob = 0.0219727 (1078,169)-(1130,261) batch id : 0
[62,1] element, prob = 0.0219727 (971,13)-(1140,235) batch id : 0
[63,1] element, prob = 0.0219727 (980,125)-(1108,286) batch id : 0
[64,1] element, prob = 0.0209961 (940,5)-(987,55) batch id : 0
[65,1] element, prob = 0.0209961 (991,0)-(1043,41) batch id : 0
[66,1] element, prob = 0.0209961 (1126,0)-(1165,42) batch id : 0
[67,1] element, prob = 0.0209961 (998,41)-(1039,86) batch id : 0
[68,1] element, prob = 0.0209961 (1068,40)-(1103,86) batch id : 0
[69,1] element, prob = 0.0209961 (821,77)-(856,123) batch id : 0
[70,1] element, prob = 0.0209961 (913,80)-(949,128) batch id : 0
[71,1] element, prob = 0.0209961 (821,112)-(861,164) batch id : 0
[72,1] element, prob = 0.0209961 (878,113)-(919,175) batch id : 0
[73,1] element, prob = 0.0209961 (910,110)-(950,170) batch id : 0
[74,1] element, prob = 0.0209961 (1149,111)-(1189,163) batch id : 0
[75,1] element, prob = 0.0209961 (870,149)-(918,215) batch id : 0
[76,1] element, prob = 0.0209961 (1033,152)-(1070,209) batch id : 0
[77,1] element, prob = 0.0209961 (1128,141)-(1165,204) batch id : 0
[78,1] element, prob = 0.0209961 (1147,135)-(1188,199) batch id : 0
[79,1] element, prob = 0.0209961 (1064,188)-(1099,246) batch id : 0
[80,1] element, prob = 0.0209961 (1238,903)-(1279,963) batch id : 0
[81,1] element, prob = 0.0209961 (1146,19)-(1208,109) batch id : 0
[82,1] element, prob = 0.0209961 (1054,58)-(1110,126) batch id : 0
[83,1] element, prob = 0.0209961 (1111,99)-(1190,228) batch id : 0
[84,1] element, prob = 0.0209961 (1052,161)-(1105,265) batch id : 0
[85,1] element, prob = 0.0209961 (1060,145)-(1139,284) batch id : 0
[86,1] element, prob = 0.0209961 (-16,860)-(79,985) batch id : 0
[87,1] element, prob = 0.0209961 (1174,-17)-(1281,136) batch id : 0
[88,1] element, prob = 0.0209961 (1073,32)-(1198,190) batch id : 0
[89,1] element, prob = 0.0209961 (950,188)-(1072,366) batch id : 0
[90,1] element, prob = 0.0200195 (943,81)-(978,130) batch id : 0
[91,1] element, prob = 0.0200195 (968,104)-(1008,159) batch id : 0
[92,1] element, prob = 0.0200195 (930,148)-(973,211) batch id : 0
[93,1] element, prob = 0.0200195 (1092,191)-(1128,246) batch id : 0
[94,1] element, prob = 0.0200195 (1124,225)-(1161,291) batch id : 0
[95,1] element, prob = 0.0200195 (1007,-2)-(1083,55) batch id : 0
[96,1] element, prob = 0.0200195 (890,60)-(960,138) batch id : 0
[97,1] element, prob = 0.0200195 (962,65)-(1023,137) batch id : 0
[98,1] element, prob = 0.0200195 (1101,145)-(1183,284) batch id : 0
[99,1] element, prob = 0.0200195 (866,-10)-(1029,107) batch id : 0
[100,1] element, prob = 0.0200195 (1005,179)-(1111,360) batch id : 0
[101,1] element, prob = 0.0200195 (1019,342)-(1283,785) batch id : 0
[102,1] element, prob = 0.019043 (758,2)-(795,45) batch id : 0
[103,1] element, prob = 0.019043 (756,38)-(792,89) batch id : 0
[104,1] element, prob = 0.019043 (968,36)-(1011,82) batch id : 0
[105,1] element, prob = 0.019043 (729,77)-(763,127) batch id : 0
[106,1] element, prob = 0.019043 (1003,109)-(1044,169) batch id : 0
[107,1] element, prob = 0.019043 (1030,190)-(1066,242) batch id : 0
[108,1] element, prob = 0.019043 (887,83)-(971,209) batch id : 0
[109,1] element, prob = 0.019043 (995,99)-(1055,192) batch id : 0
[110,1] element, prob = 0.019043 (902,135)-(963,221) batch id : 0
[ INFO ] Image out.bmp created!
[ INFO ] Execution successful

[ INFO ] This sample is an API example, for any performance measurements please use the dedicated benchmark_app tool
free(): invalid pointer
Aborted

For NCS 1

➜  samples python ./python/object_detection_sample_ssd/object_detection_sample_ssd.py -m face-detection-adas-0001.xml -d MYRIAD -i me.jpg
[ INFO ] Loading Inference Engine
[ INFO ] Loading network files:
face-detection-adas-0001.xml
face-detection-adas-0001.bin
[ INFO ] Device info:
MYRIAD
MKLDNNPlugin version ......... 2.1
Build ........... 2020.3.0-3467-15f2c61a-releases/2020/3
inputs number: 1
input shape: [1, 3, 384, 672]
input key: data
[ INFO ] File was added:
[ INFO ] me.jpg
[ WARNING ] Image me.jpg is resized from (384, 672) to (384, 672)
[ INFO ] Preparing input blobs
[ INFO ] Batch size is 1
[ INFO ] Preparing output blobs
[ INFO ] Loading model to the device
[ INFO ] Creating infer request and starting inference
[ INFO ] Processing output blobs
[0,1] element, prob = 0.871094 (420,202)-(745,608) batch id : 0 WILL BE PRINTED!
[1,1] element, prob = 0.0268555 (-54,555)-(234,1100) batch id : 0
[2,1] element, prob = 0.0209961 (21,925)-(239,1340) batch id : 0
[3,1] element, prob = 0.019043 (-18,-144)-(178,583) batch id : 0
[4,1] element, prob = 0.0180664 (636,315)-(757,470) batch id : 0
[5,1] element, prob = 0.0170898 (75,977)-(363,1321) batch id : 0
[6,1] element, prob = 0.0161133 (291,958)-(349,1123) batch id : 0
[7,1] element, prob = 0.015625 (329,962)-(391,1126) batch id : 0
[8,1] element, prob = 0.015625 (832,1121)-(937,1293) batch id : 0
[9,1] element, prob = 0.015625 (-27,670)-(136,1220) batch id : 0
[10,1] element, prob = 0.015625 (76,785)-(357,1160) batch id : 0
[11,1] element, prob = 0.015625 (204,928)-(423,1346) batch id : 0
[12,1] element, prob = 0.0146484 (694,1035)-(732,1141) batch id : 0
[13,1] element, prob = 0.0146484 (219,40)-(388,626) batch id : 0
[14,1] element, prob = 0.0136719 (658,306)-(752,405) batch id : 0
[15,1] element, prob = 0.0136719 (204,1198)-(267,1275) batch id : 0
[16,1] element, prob = 0.0136719 (322,612)-(430,793) batch id : 0
[17,1] element, prob = 0.0136719 (240,779)-(331,1046) batch id : 0
[18,1] element, prob = 0.0136719 (171,245)-(424,798) batch id : 0
[19,1] element, prob = 0.0136719 (443,205)-(802,1021) batch id : 0
[20,1] element, prob = 0.0126953 (694,511)-(731,590) batch id : 0
[21,1] element, prob = 0.0126953 (0,893)-(24,984) batch id : 0
[22,1] element, prob = 0.0126953 (272,962)-(322,1117) batch id : 0
[23,1] element, prob = 0.0126953 (268,487)-(376,688) batch id : 0
[24,1] element, prob = 0.0126953 (323,490)-(422,690) batch id : 0
[25,1] element, prob = 0.0126953 (277,608)-(384,795) batch id : 0
[26,1] element, prob = 0.0126953 (233,710)-(342,923) batch id : 0
[27,1] element, prob = 0.0126953 (256,672)-(406,957) batch id : 0
[28,1] element, prob = 0.0126953 (252,799)-(540,1176) batch id : 0
[29,1] element, prob = 0.0126953 (133,876)-(301,1420) batch id : 0
[30,1] element, prob = 0.0117188 (718,466)-(757,540) batch id : 0
[31,1] element, prob = 0.0117188 (166,527)-(206,594) batch id : 0
[32,1] element, prob = 0.0117188 (673,514)-(709,587) batch id : 0
[33,1] element, prob = 0.0117188 (-1,943)-(22,1038) batch id : 0
[34,1] element, prob = 0.0117188 (670,256)-(753,346) batch id : 0
[35,1] element, prob = 0.0117188 (705,441)-(771,555) batch id : 0
[36,1] element, prob = 0.0117188 (126,625)-(200,718) batch id : 0
[37,1] element, prob = 0.0117188 (164,673)-(251,747) batch id : 0
[38,1] element, prob = 0.0117188 (399,850)-(453,1008) batch id : 0
[39,1] element, prob = 0.0117188 (424,851)-(476,1009) batch id : 0
[40,1] element, prob = 0.0117188 (357,925)-(413,1073) batch id : 0
[41,1] element, prob = 0.0117188 (399,919)-(454,1074) batch id : 0
[42,1] element, prob = 0.0117188 (468,906)-(521,1070) batch id : 0
[43,1] element, prob = 0.0117188 (284,1016)-(357,1186) batch id : 0
[44,1] element, prob = 0.0117188 (269,379)-(374,595) batch id : 0
[45,1] element, prob = 0.0117188 (349,563)-(485,831) batch id : 0
[46,1] element, prob = 0.0117188 (252,1042)-(431,1215) batch id : 0
[47,1] element, prob = 0.0117188 (858,771)-(987,1518) batch id : 0
[48,1] element, prob = 0.0107422 (165,463)-(204,521) batch id : 0
[49,1] element, prob = 0.0107422 (185,477)-(224,537) batch id : 0
[50,1] element, prob = 0.0107422 (214,477)-(250,540) batch id : 0
[51,1] element, prob = 0.0107422 (458,478)-(488,536) batch id : 0
[52,1] element, prob = 0.0107422 (695,466)-(733,539) batch id : 0
[53,1] element, prob = 0.0107422 (460,522)-(488,590) batch id : 0
[54,1] element, prob = 0.0107422 (481,523)-(509,588) batch id : 0
[55,1] element, prob = 0.0107422 (716,512)-(755,592) batch id : 0
[56,1] element, prob = 0.0107422 (150,579)-(179,635) batch id : 0
[57,1] element, prob = 0.0107422 (171,584)-(201,641) batch id : 0
[58,1] element, prob = 0.0107422 (481,582)-(510,647) batch id : 0
[59,1] element, prob = 0.0107422 (652,565)-(689,650) batch id : 0
[60,1] element, prob = 0.0107422 (674,571)-(710,650) batch id : 0
[61,1] element, prob = 0.0107422 (694,571)-(734,648) batch id : 0
[62,1] element, prob = 0.0107422 (142,640)-(188,705) batch id : 0
[63,1] element, prob = 0.0107422 (335,627)-(390,695) batch id : 0
[64,1] element, prob = 0.0107422 (191,744)-(225,803) batch id : 0
[65,1] element, prob = 0.0107422 (370,1005)-(399,1078) batch id : 0
[66,1] element, prob = 0.0107422 (414,1006)-(443,1079) batch id : 0
[67,1] element, prob = 0.0107422 (896,1207)-(924,1280) batch id : 0
[68,1] element, prob = 0.0107422 (86,386)-(160,513) batch id : 0
[69,1] element, prob = 0.0107422 (106,477)-(214,630) batch id : 0
[70,1] element, prob = 0.0107422 (151,501)-(221,610) batch id : 0
[71,1] element, prob = 0.0107422 (428,478)-(479,615) batch id : 0
[72,1] element, prob = 0.0107422 (450,478)-(506,611) batch id : 0
[73,1] element, prob = 0.0107422 (647,498)-(728,605) batch id : 0
[74,1] element, prob = 0.0107422 (148,556)-(220,658) batch id : 0
[75,1] element, prob = 0.0107422 (425,557)-(481,671) batch id : 0
[76,1] element, prob = 0.0107422 (437,510)-(516,700) batch id : 0
[77,1] element, prob = 0.0107422 (463,554)-(527,674) batch id : 0
[78,1] element, prob = 0.0107422 (633,551)-(704,664) batch id : 0
[79,1] element, prob = 0.0107422 (280,628)-(367,711) batch id : 0
[80,1] element, prob = 0.0107422 (416,612)-(488,735) batch id : 0
[81,1] element, prob = 0.0107422 (155,706)-(260,850) batch id : 0
[82,1] element, prob = 0.0107422 (55,828)-(137,932) batch id : 0
[83,1] element, prob = 0.0107422 (451,813)-(498,960) batch id : 0
[84,1] element, prob = 0.0107422 (310,914)-(371,1076) batch id : 0
[85,1] element, prob = 0.0107422 (447,896)-(500,1065) batch id : 0
[86,1] element, prob = 0.0107422 (641,966)-(706,1121) batch id : 0
[87,1] element, prob = 0.0107422 (203,-95)-(283,182) batch id : 0
[88,1] element, prob = 0.0107422 (164,86)-(233,410) batch id : 0
[89,1] element, prob = 0.0107422 (188,498)-(307,688) batch id : 0
[90,1] element, prob = 0.0107422 (205,445)-(357,715) batch id : 0
[91,1] element, prob = 0.0107422 (-22,576)-(71,832) batch id : 0
[92,1] element, prob = 0.0107422 (230,603)-(342,791) batch id : 0
[93,1] element, prob = 0.0107422 (318,698)-(433,909) batch id : 0
[94,1] element, prob = 0.0107422 (273,784)-(390,1021) batch id : 0
[95,1] element, prob = 0.0107422 (175,740)-(444,1237) batch id : 0
[ INFO ] Image out.bmp created!
[ INFO ] Execution successful

[ INFO ] This sample is an API example, for any performance measurements please use the dedicated benchmark_app tool
➜  samples python ./python/object_detection_sample_ssd/object_detection_sample_ssd.py -m face-detection-adas-0001.xml -d MYRIAD -i parents.jpg
[ INFO ] Loading Inference Engine
[ INFO ] Loading network files:
face-detection-adas-0001.xml
face-detection-adas-0001.bin
[ INFO ] Device info:
MYRIAD
MKLDNNPlugin version ......... 2.1
Build ........... 2020.3.0-3467-15f2c61a-releases/2020/3
inputs number: 1
input shape: [1, 3, 384, 672]
input key: data
[ INFO ] File was added:
[ INFO ] parents.jpg
[ WARNING ] Image parents.jpg is resized from (384, 672) to (384, 672)
[ INFO ] Preparing input blobs
[ INFO ] Batch size is 1
[ INFO ] Preparing output blobs
[ INFO ] Loading model to the device
[ INFO ] Creating infer request and starting inference
[ INFO ] Processing output blobs
[0,1] element, prob = 1.0 (826,366)-(1026,644) batch id : 0 WILL BE PRINTED!
[1,1] element, prob = 0.996582 (539,173)-(693,429) batch id : 0 WILL BE PRINTED!
[2,1] element, prob = 0.552734 (1094,47)-(1135,95) batch id : 0 WILL BE PRINTED!
[3,1] element, prob = 0.22168 (848,22)-(886,70) batch id : 0
[4,1] element, prob = 0.0537109 (8,784)-(151,956) batch id : 0
[5,1] element, prob = 0.0395508 (1033,78)-(1070,122) batch id : 0
[6,1] element, prob = 0.034668 (1123,75)-(1158,122) batch id : 0
[7,1] element, prob = 0.0322266 (1086,14)-(1138,85) batch id : 0
[8,1] element, prob = 0.03125 (1091,110)-(1130,171) batch id : 0
[9,1] element, prob = 0.0302734 (1108,43)-(1150,93) batch id : 0
[10,1] element, prob = 0.0292969 (1151,83)-(1190,131) batch id : 0
[11,1] element, prob = 0.0273438 (1091,6)-(1133,55) batch id : 0
[12,1] element, prob = 0.0273438 (1064,73)-(1098,121) batch id : 0
[13,1] element, prob = 0.0268555 (1030,2)-(1073,42) batch id : 0
[14,1] element, prob = 0.0268555 (1058,113)-(1096,168) batch id : 0
[15,1] element, prob = 0.0268555 (1047,-3)-(1110,59) batch id : 0
[16,1] element, prob = 0.0268555 (1142,75)-(1211,165) batch id : 0
[17,1] element, prob = 0.0268555 (849,14)-(1030,236) batch id : 0
[18,1] element, prob = 0.0258789 (941,22)-(981,74) batch id : 0
[19,1] element, prob = 0.0258789 (1030,36)-(1074,89) batch id : 0
[20,1] element, prob = 0.0258789 (876,73)-(915,118) batch id : 0
[21,1] element, prob = 0.0258789 (845,112)-(886,167) batch id : 0
[22,1] element, prob = 0.0258789 (1036,110)-(1076,171) batch id : 0
[23,1] element, prob = 0.0258789 (1113,-1)-(1175,59) batch id : 0
[24,1] element, prob = 0.0258789 (1003,41)-(1094,151) batch id : 0
[25,1] element, prob = 0.0249023 (1059,0)-(1098,43) batch id : 0
[26,1] element, prob = 0.0249023 (1002,69)-(1048,123) batch id : 0
[27,1] element, prob = 0.0249023 (1133,105)-(1170,164) batch id : 0
[28,1] element, prob = 0.0239258 (966,1)-(1018,45) batch id : 0
[29,1] element, prob = 0.0239258 (876,29)-(918,79) batch id : 0
[30,1] element, prob = 0.0239258 (1056,145)-(1094,208) batch id : 0
[31,1] element, prob = 0.0239258 (861,2)-(918,63) batch id : 0
[32,1] element, prob = 0.0239258 (1010,84)-(1092,204) batch id : 0
[33,1] element, prob = 0.0239258 (1022,139)-(1076,224) batch id : 0
[34,1] element, prob = 0.0239258 (942,35)-(1072,208) batch id : 0
[35,1] element, prob = 0.0229492 (811,3)-(851,48) batch id : 0
[36,1] element, prob = 0.0229492 (1098,63)-(1131,109) batch id : 0
[37,1] element, prob = 0.0229492 (983,-9)-(1141,105) batch id : 0
[38,1] element, prob = 0.0229492 (921,81)-(1087,304) batch id : 0
[39,1] element, prob = 0.0219727 (796,3)-(835,47) batch id : 0
[40,1] element, prob = 0.0219727 (911,4)-(953,55) batch id : 0
[41,1] element, prob = 0.0219727 (797,29)-(835,79) batch id : 0
[42,1] element, prob = 0.0219727 (1154,27)-(1198,79) batch id : 0
[43,1] element, prob = 0.0219727 (969,80)-(1008,128) batch id : 0
[44,1] element, prob = 0.0219727 (1000,155)-(1046,226) batch id : 0
[45,1] element, prob = 0.0219727 (1086,145)-(1125,208) batch id : 0
[46,1] element, prob = 0.0219727 (1123,186)-(1161,252) batch id : 0
[47,1] element, prob = 0.0219727 (958,20)-(1030,84) batch id : 0
[48,1] element, prob = 0.0219727 (985,28)-(1056,105) batch id : 0
[49,1] element, prob = 0.0219727 (845,56)-(898,134) batch id : 0
[50,1] element, prob = 0.0219727 (934,65)-(986,155) batch id : 0
[51,1] element, prob = 0.0219727 (1046,99)-(1107,178) batch id : 0
[52,1] element, prob = 0.0219727 (1035,116)-(1110,241) batch id : 0
[53,1] element, prob = 0.0219727 (990,166)-(1054,269) batch id : 0
[54,1] element, prob = 0.0219727 (1065,-16)-(1192,124) batch id : 0
[55,1] element, prob = 0.0219727 (971,13)-(1140,235) batch id : 0
[56,1] element, prob = 0.0219727 (980,125)-(1108,286) batch id : 0
[57,1] element, prob = 0.0209961 (991,0)-(1043,41) batch id : 0
[58,1] element, prob = 0.0209961 (1126,0)-(1165,42) batch id : 0
[59,1] element, prob = 0.0209961 (821,35)-(856,79) batch id : 0
[60,1] element, prob = 0.0209961 (910,28)-(945,78) batch id : 0
[61,1] element, prob = 0.0209961 (997,41)-(1038,86) batch id : 0
[62,1] element, prob = 0.0209961 (1068,40)-(1103,86) batch id : 0
[63,1] element, prob = 0.0209961 (754,77)-(789,125) batch id : 0
[64,1] element, prob = 0.0209961 (819,77)-(855,123) batch id : 0
[65,1] element, prob = 0.0209961 (913,79)-(949,128) batch id : 0
[66,1] element, prob = 0.0209961 (821,112)-(861,164) batch id : 0
[67,1] element, prob = 0.0209961 (938,108)-(980,173) batch id : 0
[68,1] element, prob = 0.0209961 (1149,112)-(1189,162) batch id : 0
[69,1] element, prob = 0.0209961 (870,149)-(918,215) batch id : 0
[70,1] element, prob = 0.0209961 (1033,152)-(1070,209) batch id : 0
[71,1] element, prob = 0.0209961 (1064,188)-(1099,246) batch id : 0
[72,1] element, prob = 0.0209961 (1023,63)-(1081,131) batch id : 0
[73,1] element, prob = 0.0209961 (1054,58)-(1110,126) batch id : 0
[74,1] element, prob = 0.0209961 (1041,40)-(1125,143) batch id : 0
[75,1] element, prob = 0.0209961 (1111,99)-(1190,228) batch id : 0
[76,1] element, prob = 0.0209961 (1078,169)-(1130,261) batch id : 0
[77,1] element, prob = 0.0209961 (950,189)-(1072,367) batch id : 0
[78,1] element, prob = 0.0200195 (940,6)-(987,56) batch id : 0
[79,1] element, prob = 0.0200195 (943,81)-(978,130) batch id : 0
[80,1] element, prob = 0.0200195 (878,114)-(919,175) batch id : 0
[81,1] element, prob = 0.0200195 (910,110)-(950,170) batch id : 0
[82,1] element, prob = 0.0200195 (968,104)-(1008,159) batch id : 0
[83,1] element, prob = 0.0200195 (1128,141)-(1165,204) batch id : 0
[84,1] element, prob = 0.0200195 (1147,135)-(1188,199) batch id : 0
[85,1] element, prob = 0.0200195 (1238,903)-(1279,963) batch id : 0
[86,1] element, prob = 0.0200195 (1007,-2)-(1083,55) batch id : 0
[87,1] element, prob = 0.0200195 (1052,161)-(1105,266) batch id : 0
[88,1] element, prob = 0.0200195 (1061,145)-(1140,284) batch id : 0
[89,1] element, prob = 0.0200195 (-16,860)-(79,985) batch id : 0
[90,1] element, prob = 0.0200195 (866,-10)-(1029,107) batch id : 0
[91,1] element, prob = 0.0200195 (1173,-17)-(1280,136) batch id : 0
[92,1] element, prob = 0.0200195 (1072,32)-(1197,189) batch id : 0
[93,1] element, prob = 0.0200195 (1005,179)-(1111,359) batch id : 0
[94,1] element, prob = 0.019043 (758,2)-(795,45) batch id : 0
[95,1] element, prob = 0.019043 (968,35)-(1011,82) batch id : 0
[96,1] element, prob = 0.019043 (1003,109)-(1044,169) batch id : 0
[97,1] element, prob = 0.019043 (930,148)-(973,211) batch id : 0
[98,1] element, prob = 0.019043 (1092,191)-(1128,246) batch id : 0
[99,1] element, prob = 0.019043 (1123,225)-(1161,291) batch id : 0
[100,1] element, prob = 0.019043 (1146,19)-(1208,109) batch id : 0
[101,1] element, prob = 0.019043 (897,59)-(958,143) batch id : 0
[102,1] element, prob = 0.019043 (962,65)-(1023,137) batch id : 0
[103,1] element, prob = 0.019043 (886,83)-(971,209) batch id : 0
[104,1] element, prob = 0.019043 (995,99)-(1055,192) batch id : 0
[105,1] element, prob = 0.019043 (902,135)-(963,221) batch id : 0
[106,1] element, prob = 0.019043 (1101,145)-(1183,284) batch id : 0
[107,1] element, prob = 0.019043 (821,-17)-(946,129) batch id : 0
[108,1] element, prob = 0.019043 (940,-16)-(1076,120) batch id : 0
[109,1] element, prob = 0.019043 (819,34)-(945,205) batch id : 0
[110,1] element, prob = 0.019043 (1019,342)-(1283,785) batch id : 0
[111,1] element, prob = 0.0180664 (1147,7)-(1193,54) batch id : 0
[112,1] element, prob = 0.0180664 (756,38)-(792,89) batch id : 0
[ INFO ] Image out.bmp created!
[ INFO ] Execution successful

[ INFO ] This sample is an API example, for any performance measurements please use the dedicated benchmark_app tool
corrupted double-linked list
Aborted

3.3.4 Results

Parents At First Starbucks Parents Face Detection By Object Detection SSD NCS 1 Parents Face Detection By Object Detection SSD NCS 2
Parents At First Starbucks Parents At First Starbucks - Face Detection By Object Detection SSD NCS 1 Parents At First Starbucks - Face Detection By Object Detection SSD NCS 2
Me Me Face Detection By Object Detection SSD NCS 1 Me Face Detection By Object Detection SSD NCS 2
Me Me - Face Detection By Object Detection SSD NCS 1 Me - Face Detection By Object Detection SSD NCS 2

3.4 Model Optimization

In this section, we are going to test out another OpenVINO example: Image Classification C++ Sample Async. After reading this blog, it’s not hard to notice that the MOST important thing we’re missing here is the model file alexnet_fp32.xml. Let’s keep that in mind for now.

Also, let’s briefly review our previous example, Object Detection: we downloaded the face-detection-adas-0001 model and used it directly. So, two questions:

  • Can we simply download alexnet_fp32.xml this time as well?
  • Where can we download a whole bunch of open-source models?

3.4.1 Open Model Zoo

It’s not hard to google the OpenVINO™ Toolkit - Open Model Zoo repository, in which the model face-detection-adas-0001 is just sitting there. However, face-detection-adas-0001.xml and face-detection-adas-0001.bin are missing from it.

Let’s check out open_model_zoo and put it under the folder /opt/intel.
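As a minimal sketch of that checkout step (the GitHub URL below is an assumption — the repository has lived under both the opencv and openvinotoolkit organizations, so adjust it to whichever mirror you use):

cd /opt/intel
# Clone Open Model Zoo next to dldt and the OpenVINO runtime (URL assumed, see above)
sudo git clone https://github.com/openvinotoolkit/open_model_zoo.git
# Hand the checkout over to the regular user, matching the pi:pi ownership in the listing below
sudo chown -R pi:pi open_model_zoo

After that, /opt/intel looks like this: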

➜  intel pwd
/opt/intel
➜ intel ll
total 12K
drwxr-xr-x 8 pi pi 4.0K Apr 7 06:37 dldt
drwxr-xr-x 9 pi pi 4.0K Apr 7 14:04 l_openvino_toolkit_runtime_raspbian_p_2020.1.023
drwxr-xr-x 7 pi pi 4.0K Apr 7 14:04 open_model_zoo
lrwxrwxrwx 1 root root 48 Apr 7 06:29 openvino -> l_openvino_toolkit_runtime_raspbian_p_2020.1.023

Then, let’s enter the folder face-detection-adas-0001 under open_model_zoo and take a look:

➜  face-detection-adas-0001 git:(master) pwd
/opt/intel/open_model_zoo/models/intel/face-detection-adas-0001
➜ face-detection-adas-0001 git:(master) ls
description face-detection-adas-0001.prototxt model.yml

It seems that each folder under intel and public contains the detailed info of one model. For instance, the file intel/face-detection-adas-0001/model.yml contains all the info about the model face-detection-adas-0001. However, what we really need are a .xml file and a .bin file. In the following, we are going to generate these two files, optimized specifically for Movidius, by following Intel OpenVINO toolkit issue 798441.
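Before diving into the details, here is the whole two-step recipe in one place — a quick sketch that simply collects the exact commands used in sections 3.4.2 and 3.4.3 below:

# Step 1 (section 3.4.2): fetch the original Caffe model with the Open Model Zoo downloader
cd /opt/intel/open_model_zoo/tools/downloader
python downloader.py --name alexnet

# Step 2 (section 3.4.3): convert the Caffe model into IR (.xml + .bin) with the Model Optimizer
cd /opt/intel/dldt/model-optimizer
python mo_caffe.py --generate_deprecated_IR_V7 \
    --input_model /opt/intel/open_model_zoo/tools/downloader/public/alexnet/alexnet.caffemodel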

3.4.2 Download Caffe Model Files

➜  downloader git:(master) pwd
/opt/intel/open_model_zoo/tools/downloader
➜ downloader git:(master) ls
caffe2_to_onnx.py converter.py info_dumper.py README.md requirements.in tests
common.py downloader.py pytorch_to_onnx.py requirements-caffe2.in requirements-pytorch.in
➜ downloader git:(master) python downloader.py --name alexnet
################|| Downloading models ||################

========== Downloading /opt/intel/open_model_zoo/tools/downloader/public/alexnet/alexnet.prototxt
... 100%, 3 KB, 10485 KB/s, 0 seconds passed

========== Downloading /opt/intel/open_model_zoo/tools/downloader/public/alexnet/alexnet.caffemodel
... 100%, 238146 KB, 5428 KB/s, 43 seconds passed

################|| Post-processing ||################

========== Replacing text in /opt/intel/open_model_zoo/tools/downloader/public/alexnet/alexnet.prototxt
➜ downloader git:(master) ✗ ll public/alexnet/
total 233M
-rw-r--r-- 1 pi pi 233M Apr 9 14:24 alexnet.caffemodel
-rw-r--r-- 1 pi pi 3.6K Apr 9 14:24 alexnet.prototxt
-rw-r--r-- 1 pi pi 3.6K Apr 9 14:23 alexnet.prototxt.orig

Clearly, three files, including the large model file alexnet.caffemodel, have been downloaded.

3.4.3 Model Optimization

Now, we are going to optimize the downloaded Caffe model and turn it into a format that OpenVINO can consume.

➜  model-optimizer git:(2020) ✗ pwd
/opt/intel/dldt/model-optimizer
➜ model-optimizer git:(2020) ✗ python mo_caffe.py --generate_deprecated_IR_V7 --input_model /opt/intel/open_model_zoo/tools/downloader/public/alexnet/alexnet.caffemodel
[ WARNING ] Use of deprecated cli option --generate_deprecated_IR_V7 detected. Option use in the following releases will be fatal.
Model Optimizer arguments:
Common parameters:
- Path to the Input Model: /opt/intel/open_model_zoo/tools/downloader/public/alexnet/alexnet.caffemodel
- Path for generated IR: /opt/intel/dldt/model-optimizer/.
- IR output name: alexnet
- Log level: ERROR
- Batch: Not specified, inherited from the model
- Input layers: Not specified, inherited from the model
- Output layers: Not specified, inherited from the model
- Input shapes: Not specified, inherited from the model
- Mean values: Not specified
- Scale values: Not specified
- Scale factor: Not specified
- Precision of IR: FP32
- Enable fusing: True
- Enable grouped convolutions fusing: True
- Move mean values to preprocess section: False
- Reverse input channels: False
Caffe specific parameters:
- Path to Python Caffe* parser generated from caffe.proto: mo/front/caffe/proto
- Enable resnet optimization: True
- Path to the Input prototxt: /opt/intel/open_model_zoo/tools/downloader/public/alexnet/alexnet.prototxt
- Path to CustomLayersMapping.xml: Default
- Path to a mean file: Not specified
- Offsets for a mean file: Not specified
Model Optimizer version: unknown version
Please expect that Model Optimizer conversion might be slow. You are currently using Python protobuf library implementation.
However you can use the C++ protobuf implementation that is supplied with the OpenVINO toolkitor build protobuf library from sources.
Navigate to "install_prerequisites" folder and run: python -m easy_install protobuf-3.5.1-py($your_python_version)-win-amd64.egg
set PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=cpp


For more information please refer to Model Optimizer FAQ (https://docs.openvinotoolkit.org/latest/_docs_MO_DG_prepare_model_Model_Optimizer_FAQ.html), question #80.

[ SUCCESS ] Generated IR version 7 model.
[ SUCCESS ] XML file: /opt/intel/dldt/model-optimizer/./alexnet.xml
[ SUCCESS ] BIN file: /opt/intel/dldt/model-optimizer/./alexnet.bin
[ SUCCESS ] Total execution time: 615.36 seconds.
[ SUCCESS ] Memory consumed: 1982 MB.

Now, let’s take a look at what has been generated under the folder /opt/intel/dldt/model-optimizer.

➜  model-optimizer git:(2020) ✗ pwd
/opt/intel/dldt/model-optimizer
➜ model-optimizer git:(2020) ✗ ls
alexnet.bin extensions mo_caffe.py mo_onnx.py README.md requirements_kaldi.txt requirements_tf.txt
alexnet.mapping install_prerequisites mo_kaldi.py mo.py requirements_caffe.txt requirements_mxnet.txt requirements.txt
alexnet.xml mo mo_mxnet.py mo_tf.py requirements_dev.txt requirements_onnx.txt tf_call_ie_layer
➜ model-optimizer git:(2020) ✗ ll alexnet*
-rw-r--r-- 1 pi pi 233M Apr 9 15:26 alexnet.bin
-rw-r--r-- 1 pi pi 2.6K Apr 9 15:26 alexnet.mapping
-rw-r--r-- 1 pi pi 25K Apr 9 15:26 alexnet.xml

Note: You may encounter the following ERRORs during model optimization.

[ ERROR ]  
Detected not satisfied dependencies:
networkx: installed: 2.4, required: 2.4
protobuf: installed: 3.11.3, required: 3.6.1

Clearly, the ERROR message for networkx is a bit ridiculous: the installed version 2.4 is exactly the required version 2.4.
Anyway, if you run into the above 2 errors, please DOWNGRADE your packages as follows:

➜  ~  pip install protobuf==3.6.1 --user
Looking in indexes: https://pypi.org/simple, https://www.piwheels.org/simple
Collecting protobuf==3.6.1
Downloading https://files.pythonhosted.org/packages/77/78/a7f1ce761e2c738e209857175cd4f90a8562d1bde32868a8cd5290d58926/protobuf-3.6.1-py2.py3-none-any.whl (390kB)
|████████████████████████████████| 399kB 1.4MB/s
Requirement already satisfied: six>=1.9 in /home/pi/.local/lib/python3.7/site-packages (from protobuf==3.6.1) (1.13.0)
Requirement already satisfied: setuptools in /home/pi/.local/lib/python3.7/site-packages (from protobuf==3.6.1) (44.0.0)
Installing collected packages: protobuf
Found existing installation: protobuf 3.11.3
Uninstalling protobuf-3.11.3:
Successfully uninstalled protobuf-3.11.3
Successfully installed protobuf-3.6.1
➜ ~ pip install networkx==2.3 --user
Looking in indexes: https://pypi.org/simple, https://www.piwheels.org/simple
Collecting networkx==2.3
Downloading https://files.pythonhosted.org/packages/85/08/f20aef11d4c343b557e5de6b9548761811eb16e438cee3d32b1c66c8566b/networkx-2.3.zip (1.7MB)
|████████████████████████████████| 1.8MB 2.3MB/s
Requirement already satisfied: decorator>=4.3.0 in /home/pi/.local/lib/python3.7/site-packages (from networkx==2.3) (4.4.1)
Building wheels for collected packages: networkx
Building wheel for networkx (setup.py) ... done
Created wheel for networkx: filename=networkx-2.3-py2.py3-none-any.whl size=1556408 sha256=9964dc8e3b41e97f0228afb2c1c6ca40a5dfc07dc1a2574b92717114f3d5e533
Stored in directory: /home/pi/.cache/pip/wheels/de/63/64/3699be2a9d0ccdb37c7f16329acf3863fd76eda58c39c737af
Successfully built networkx
Installing collected packages: networkx
Found existing installation: networkx 2.4
Uninstalling networkx-2.4:
Successfully uninstalled networkx-2.4
Successfully installed networkx-2.3
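To double-check that the downgrade took effect before re-running the Model Optimizer, a quick sanity check (just a sketch) is:

python -c "import networkx, google.protobuf; print(networkx.__version__, google.protobuf.__version__)"
# Expected output: 2.3 3.6.1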

3.5 Image Classification

3.5.1 C

For NCS 2

➜  samples ./c/build/armv7l/Release/hello_classification_c /opt/intel/dldt/model-optimizer/alexnet.xml me.jpg HETERO:MYRIAD

Top 10 results:

Image me.jpg

classid probability
------- -----------
838 0.074951
978 0.072632
977 0.072083
975 0.060730
903 0.052948
638 0.040588
976 0.040161
433 0.037842
112 0.032471
639 0.029724
➜  samples ./c/build/armv7l/Release/hello_classification_c /opt/intel/dldt/model-optimizer/alexnet.xml parents.jpg HETERO:MYRIAD

Top 10 results:

Image parents.jpg

classid probability
------- -----------
707 0.072144
577 0.024185
704 0.022980
955 0.020844
813 0.020432
910 0.016937
515 0.016235
605 0.013985
918 0.013351
523 0.013031

For NCS 1

➜  samples ./c/build/armv7l/Release/hello_classification_c /opt/intel/dldt/model-optimizer/alexnet.xml me.jpg HETERO:MYRIAD

Top 10 results:

Image me.jpg

classid probability
------- -----------
978 0.082336
977 0.081665
838 0.081055
975 0.061188
903 0.054840
638 0.046387
433 0.040131
976 0.034607
112 0.032349
639 0.031494
➜  samples ./c/build/armv7l/Release/hello_classification_c /opt/intel/dldt/model-optimizer/alexnet.xml parents.jpg HETERO:MYRIAD

Top 10 results:

Image parents.jpg

classid probability
------- -----------
707 0.059418
577 0.026993
813 0.021347
704 0.020218
910 0.019669
515 0.018204
955 0.018051
523 0.013527
808 0.012856
918 0.012802

3.5.2 C++

For NCS 2

➜  samples ./cpp/build/armv7l/Release/hello_classification /opt/intel/dldt/model-optimizer/alexnet.xml me.jpg HETERO:MYRIAD

Top 10 results:

Image me.jpg

classid probability
------- -----------
838 0.0749512
978 0.0726318
977 0.0720825
975 0.0607300
903 0.0529480
638 0.0405884
976 0.0401611
433 0.0378418
112 0.0324707
639 0.0297241

This sample is an API example, for any performance measurements please use the dedicated benchmark_app tool
➜  samples ./cpp/build/armv7l/Release/hello_classification /opt/intel/dldt/model-optimizer/alexnet.xml parents.jpg HETERO:MYRIAD

Top 10 results:

Image parents.jpg

classid probability
------- -----------
707 0.0721436
577 0.0241852
704 0.0229797
955 0.0208435
813 0.0204315
910 0.0169373
515 0.0162354
605 0.0139847
918 0.0133514
523 0.0130310

This sample is an API example, for any performance measurements please use the dedicated benchmark_app tool

For NCS 1

➜  samples ./cpp/build/armv7l/Release/hello_classification /opt/intel/dldt/model-optimizer/alexnet.xml me.jpg HETERO:MYRIAD               

Top 10 results:

Image me.jpg

classid probability
------- -----------
978 0.0823364
977 0.0816650
838 0.0810547
975 0.0611877
903 0.0548401
638 0.0463867
433 0.0401306
976 0.0346069
112 0.0323486
639 0.0314941

This sample is an API example, for any performance measurements please use the dedicated benchmark_app tool
➜  samples ./cpp/build/armv7l/Release/hello_classification /opt/intel/dldt/model-optimizer/alexnet.xml parents.jpg HETERO:MYRIAD

Top 10 results:

Image parents.jpg

classid probability
------- -----------
707 0.0594177
577 0.0269928
813 0.0213470
704 0.0202179
910 0.0196686
515 0.0182037
955 0.0180511
523 0.0135269
808 0.0128555
918 0.0128021

This sample is an API example, for any performance measurements please use the dedicated benchmark_app tool

3.5.3 Python

For NCS 2

➜  samples python ./python/classification_sample/classification_sample.py -m /opt/intel/dldt/model-optimizer/alexnet.xml -i me.jpg -d HETERO:MYRIAD -nt 10
[ INFO ] Creating Inference Engine
[ INFO ] Loading network files:
/opt/intel/dldt/model-optimizer/alexnet.xml
/opt/intel/dldt/model-optimizer/alexnet.bin
[ INFO ] Preparing input blobs
[ WARNING ] Image me.jpg is resized from (1280, 924) to (227, 227)
[ INFO ] Batch size is 1
[ INFO ] Loading model to the plugin
[ INFO ] Starting inference in synchronous mode
[ INFO ] Processing output blob
[ INFO ] Top 10 results:
Image me.jpg

classid probability
------- -----------
838 0.0751343
978 0.0728149
977 0.0722656
975 0.0606079
903 0.0533142
638 0.0413513
976 0.0391541
433 0.0375061
112 0.0320740
639 0.0298767


[ INFO ] This sample is an API example, for any performance measurements please use the dedicated benchmark_app tool
➜  samples python ./python/classification_sample/classification_sample.py -m /opt/intel/dldt/model-optimizer/alexnet.xml -i parents.jpg -d HETERO:MYRIAD -nt 10
[ INFO ] Creating Inference Engine
[ INFO ] Loading network files:
/opt/intel/dldt/model-optimizer/alexnet.xml
/opt/intel/dldt/model-optimizer/alexnet.bin
[ INFO ] Preparing input blobs
[ WARNING ] Image parents.jpg is resized from (960, 1280) to (227, 227)
[ INFO ] Batch size is 1
[ INFO ] Loading model to the plugin
[ INFO ] Starting inference in synchronous mode
[ INFO ] Processing output blob
[ INFO ] Top 10 results:
Image parents.jpg

classid probability
------- -----------
707 0.0712891
577 0.0237122
704 0.0227814
955 0.0209045
813 0.0205231
910 0.0169983
515 0.0162354
605 0.0140991
918 0.0133438
523 0.0129318


[ INFO ] This sample is an API example, for any performance measurements please use the dedicated benchmark_app tool

For NCS 1

➜  samples python ./python/classification_sample/classification_sample.py -m /opt/intel/dldt/model-optimizer/alexnet.xml -i me.jpg -d HETERO:MYRIAD -nt 10
[ INFO ] Creating Inference Engine
[ INFO ] Loading network files:
/opt/intel/dldt/model-optimizer/alexnet.xml
/opt/intel/dldt/model-optimizer/alexnet.bin
[ INFO ] Preparing input blobs
[ WARNING ] Image me.jpg is resized from (1280, 924) to (227, 227)
[ INFO ] Batch size is 1
[ INFO ] Loading model to the plugin
[ INFO ] Starting inference in synchronous mode
[ INFO ] Processing output blob
[ INFO ] Top 10 results:
Image me.jpg

classid probability
------- -----------
978 0.0818481
977 0.0818481
838 0.0812378
975 0.0610657
903 0.0553894
638 0.0469971
433 0.0398865
976 0.0338440
112 0.0317993
639 0.0317993


[ INFO ] This sample is an API example, for any performance measurements please use the dedicated benchmark_app tool
➜  samples python ./python/classification_sample/classification_sample.py -m /opt/intel/dldt/model-optimizer/alexnet.xml -i parents.jpg -d HETERO:MYRIAD -nt 10
[ INFO ] Creating Inference Engine
[ INFO ] Loading network files:
/opt/intel/dldt/model-optimizer/alexnet.xml
/opt/intel/dldt/model-optimizer/alexnet.bin
[ INFO ] Preparing input blobs
[ WARNING ] Image parents.jpg is resized from (960, 1280) to (227, 227)
[ INFO ] Batch size is 1
[ INFO ] Loading model to the plugin
[ INFO ] Starting inference in synchronous mode
[ INFO ] Processing output blob
[ INFO ] Top 10 results:
Image parents.jpg

classid probability
------- -----------
707 0.0590515
577 0.0267334
813 0.0212250
704 0.0199432
910 0.0197906
955 0.0183716
515 0.0182190
523 0.0135040
808 0.0129242
918 0.0126648

3.5.4 Results

Classification Result for me.jpg

| Result from NCS 2 | Result from NCS 1 |
| ----------------- | ----------------- |
| 838: ‘sunscreen, sunblock, sun blocker’ | 978: ‘seashore, coast, seacoast, sea-coast’ |
| 978: ‘seashore, coast, seacoast, sea-coast’ | 977: ‘sandbar, sand bar’ |
| 977: ‘sandbar, sand bar’ | 838: ‘sunscreen, sunblock, sun blocker’ |
| 975: ‘lakeside, lakeshore’ | 975: ‘lakeside, lakeshore’ |
| 903: ‘wig’ | 903: ‘wig’ |
| 638: ‘maillot’ | 638: ‘maillot’ |
| 976: ‘promontory, headland, head, foreland’ | 433: ‘bathing cap, swimming cap’ |
| 433: ‘bathing cap, swimming cap’ | 976: ‘promontory, headland, head, foreland’ |
| 112: ‘conch’ | 112: ‘conch’ |
| 639: ‘maillot, tank suit’ | 639: ‘maillot, tank suit’ |

Classification Result for parents.jpg

| Result from NCS 2 | Result from NCS 1 |
| ----------------- | ----------------- |
| 707: ‘pay-phone, pay-station’ | 707: ‘pay-phone, pay-station’ |
| 577: ‘gong, tam-tam’ | 577: ‘gong, tam-tam’ |
| 704: ‘parking meter’ | 813: ‘spatula’ |
| 955: ‘jackfruit, jak, jack’ | 704: ‘parking meter’ |
| 813: ‘spatula’ | 910: ‘wooden spoon’ |
| 910: ‘wooden spoon’ | 515: ‘cowboy hat, ten-gallon hat’ |
| 515: ‘cowboy hat, ten-gallon hat’ | 955: ‘jackfruit, jak, jack’ |
| 605: ‘iPod’ | 523: ‘crutch’ |
| 918: ‘crossword puzzle, crossword’ | 808: ‘sombrero’ |
| 523: ‘crutch’ | 918: ‘crossword puzzle, crossword’ |

classid table:
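In case the classid table is not at hand, here is a minimal lookup sketch, assuming you have the standard Caffe ImageNet labels file synset_words.txt (an assumption — any ImageNet labels file ordered by class index works; class ids are 0-based line indices into it):

# Look up classid 838, the top-1 result for me.jpg on NCS 2
CLASSID=838
sed -n "$((CLASSID + 1))p" synset_words.txt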

4. OpenCV DNN with OpenVINO’s Inference Engine

For the example openvino_fd_myriad.py given on Install OpenVINO™ toolkit for Raspbian* OS, the TOUGH thing is how to build OpenCV with OpenVINO’s Inference Engine. NEVER forget to export InferenceEngine_DIR="/opt/intel/openvino/inference_engine/share" and to enable WITH_INF_ENGINE=ON when rebuilding OpenCV (a sketch of that rebuild is given right after the script below). Before moving forward, my modified version of openvino_fd_myriad.py is provided as follows:

import sys, argparse
import cv2

parser = argparse.ArgumentParser(description='OpenVINO Face Detection on MYRIAD')
parser.add_argument('-x', '--XML', help='Required: model XML file')
parser.add_argument('-b', '--BIN', help='Required: model BIN file')
parser.add_argument('-i', '--input', help='Required: input image')
parser.add_argument('-o', '--output', help='Not Required: output resultant image, by default: out.jpg')

args = parser.parse_args()
xmlfile = args.XML
binfile = args.BIN
inputimage = args.input
outputimage = args.output

if not outputimage:
    outputimage = "out.jpg"

# Bail out with a usage hint if any of the required arguments is missing.
if not xmlfile or not binfile or not inputimage:
    parser.print_usage()
    sys.exit('openvino_fd_myriad: -x/--XML, -b/--BIN and -i/--input are all required')

# Load the model.
net = cv2.dnn.readNet(xmlfile, binfile)
# Specify target device.
net.setPreferableTarget(cv2.dnn.DNN_TARGET_MYRIAD)
# Read an image.
frame = cv2.imread(inputimage)
if frame is None:
    raise Exception('Image not found!')
# Prepare input blob and perform an inference.
print("Frame Size: {}".format(frame.shape))
blob = cv2.dnn.blobFromImage(frame, scalefactor=1.0, size=(224, 224), swapRB=True, ddepth=cv2.CV_32F)
net.setInput(blob)
print("First Blob: {}".format(blob.shape))
out = net.forward()
count = 0
# Draw detected faces on the frame; each row is [image_id, label, conf, xmin, ymin, xmax, ymax].
for detection in out.reshape(-1, 7):
    confidence = float(detection[2])
    xmin = int(detection[3] * frame.shape[1])
    ymin = int(detection[4] * frame.shape[0])
    xmax = int(detection[5] * frame.shape[1])
    ymax = int(detection[6] * frame.shape[0])
    if confidence > 0.8:
        cv2.rectangle(frame, (xmin, ymin), (xmax, ymax), color=(0, 255, 0))
        count += 1
print(count)
# Save the frame to an image file.
cv2.imwrite(outputimage, frame)

4.1 Intel64

On my laptop, of course, OpenCV is built for the Intel64 architecture with NVIDIA GPU + CUDA support, since the DNN module in OpenCV can be accelerated by either CUDA or OpenCL.
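
To double-check that the rebuilt cv2 module really picked up the Inference Engine backend (and the CUDA bits on Intel64), the build information string can be inspected; a minimal sketch:

import cv2

# Print the OpenCV version and the build-info lines mentioning the Inference Engine
# and CUDA; these entries show up in cv2.getBuildInformation() when the corresponding
# CMake options were enabled at build time.
print(cv2.__version__)
for line in cv2.getBuildInformation().splitlines():
    if "Inference Engine" in line or "CUDA" in line:
        print(line.strip())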

My test of openvino_fd_myriad.py shows that the detection quality of the adopted model face-detection-adas-0001 is NOT as good as expected. One possible reason is that the script hard-codes a 224×224 input blob, while face-detection-adas-0001 is, to my knowledge, designed for 672×384 inputs.
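
To test this, here is a hedged tweak (not the original script) that feeds the model a 672×384 blob; the model paths are the Intel64 ones used in the commands below, and 672×384 is my assumed native resolution of face-detection-adas-0001:

import cv2

# Same pipeline as openvino_fd_myriad.py, but with the blob built at 672x384
# (assumed native resolution of face-detection-adas-0001) instead of 224x224.
xmlfile = "/opt/intel/openvino/inference_engine/samples/cpp/build_18.04/intel64/Release/face-detection-adas-0001.xml"
binfile = "/opt/intel/openvino/inference_engine/samples/cpp/build_18.04/intel64/Release/face-detection-adas-0001.bin"

net = cv2.dnn.readNet(xmlfile, binfile)
net.setPreferableTarget(cv2.dnn.DNN_TARGET_MYRIAD)

frame = cv2.imread("./parents.jpg")
blob = cv2.dnn.blobFromImage(frame, scalefactor=1.0, size=(672, 384),
                             swapRB=True, ddepth=cv2.CV_32F)
net.setInput(blob)
out = net.forward()

# Count detections above the same 0.8 confidence threshold used in the script.
count = sum(1 for det in out.reshape(-1, 7) if float(det[2]) > 0.8)
print(count)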

For NCS 2

➜  OpenVINO_Examples python ./openvino_fd_myriad.py -x /opt/intel/openvino/inference_engine/samples/cpp/build_18.04/intel64/Release/face-detection-adas-0001.xml -b /opt/intel/openvino/inference_engine/samples/cpp/build_18.04/intel64/Release/face-detection-adas-0001.bin -i ./parents.jpg
Frame Size: (960, 1280, 3)
First Blob: (1, 3, 224, 224)
2
➜ OpenVINO_Examples python ./openvino_fd_myriad.py -x /opt/intel/openvino/inference_engine/samples/cpp/build_18.04/intel64/Release/face-detection-adas-0001.xml -b /opt/intel/openvino/inference_engine/samples/cpp/build_18.04/intel64/Release/face-detection-adas-0001.bin -i ./me.jpg
Frame Size: (1280, 924, 3)
First Blob: (1, 3, 224, 224)
1

For NCS 1

➜  OpenVINO_Examples python ./openvino_fd_myriad.py -x /opt/intel/openvino/inference_engine/samples/cpp/build_18.04/intel64/Release/face-detection-adas-0001.xml -b /opt/intel/openvino/inference_engine/samples/cpp/build_18.04/intel64/Release/face-detection-adas-0001.bin -i ./parents.jpg
Frame Size: (960, 1280, 3)
First Blob: (1, 3, 224, 224)
2
➜ OpenVINO_Examples python ./openvino_fd_myriad.py -x /opt/intel/openvino/inference_engine/samples/cpp/build_18.04/intel64/Release/face-detection-adas-0001.xml -b /opt/intel/openvino/inference_engine/samples/cpp/build_18.04/intel64/Release/face-detection-adas-0001.bin -i ./me.jpg
Frame Size: (1280, 924, 3)
First Blob: (1, 3, 224, 224)
1

Results

[Images: “Parents At First Starbucks” (original; face detection by OpenCV DNN on NCS 1; face detection by OpenCV DNN on NCS 2) and “Me” (original; face detection by OpenCV DNN on NCS 1; face detection by OpenCV DNN on NCS 2)]

4.2 armv7l

For NCS 2

➜  Programs python ./openvino_fd_myriad.py -x /opt/intel/openvino/inference_engine/samples/face-detection-adas-0001.xml -b /opt/intel/openvino/inference_engine/samples/face-detection-adas-0001.bin -i ./parents.jpg
Frame Size: (960, 1280, 3)
First Blob: (1, 3, 224, 224)
2
➜ Programs python ./openvino_fd_myriad.py -x /opt/intel/openvino/inference_engine/samples/face-detection-adas-0001.xml -b /opt/intel/openvino/inference_engine/samples/face-detection-adas-0001.bin -i ./me.jpg
Frame Size: (1280, 924, 3)
First Blob: (1, 3, 224, 224)
1

For NCS 1

➜  Programs python ./openvino_fd_myriad.py -x /opt/intel/openvino/inference_engine/samples/face-detection-adas-0001.xml -b /opt/intel/openvino/inference_engine/samples/face-detection-adas-0001.bin -i ./parents.jpg
Frame Size: (960, 1280, 3)
First Blob: (1, 3, 224, 224)
2
➜ Programs python ./openvino_fd_myriad.py -x /opt/intel/openvino/inference_engine/samples/face-detection-adas-0001.xml -b /opt/intel/openvino/inference_engine/samples/face-detection-adas-0001.bin -i ./me.jpg
Frame Size: (1280, 924, 3)
First Blob: (1, 3, 224, 224)
1

Results

[Images: “Parents At First Starbucks” (original; face detection by OpenCV DNN on NCS 1; face detection by OpenCV DNN on NCS 2) and “Me” (original; face detection by OpenCV DNN on NCS 1; face detection by OpenCV DNN on NCS 2)]

5. My Built Raspbian ISO With OpenCV4 + OpenVINO

Finally, you are welcome to try my built image rpi4-raspbian20200213-opencv4.3-openvino2020.1.023-ncsdk2.10.01.01.img, which is about 9.5 GB and is composed of Raspbian (2020-02-13), OpenCV 4.3, OpenVINO 2020.1.023, and NCSDK 2.10.01.01.

Everything has ALREADY been updated and built successfully, EXCEPT for object_detection_sample_ssd. In addition, rpi4-raspbian20200213-opencv4.3-openvino2020.1.023-ncsdk2.10.01.01.img ALSO works properly on my Raspberry Pi 3 Model B Version 1.2, manufactured in 2015.