Good news. I’m flying back to China for some traditional Chinese food. 😍❤️😊
Raspberry Pi 5 + [Hailo] AI - (2)
Hailo Technologies Ltd. is indeed a high-tech semiconductor company based in Tel Aviv, Israel. Although this blog focuses on the Israeli product Hailo-8L Entry-Level AI Accelerator, I would like to take this opportunity to celebrate the successful holding of the 2025 China Victory Day Parade.
In my first blog post about Hailo last year, not using a Docker container to create a Python 3.10 environment caused me significant trouble, and I actually failed to complete the demonstration. Today, we are going to walk through the entire Hailo process again.
1. Raspberry Pi 5 Environment
1.1 neofetch
1.2 Package Installation
Please visit Hailo Software Downloads to download and install the following three MUST-HAVE software packages:
- HailoRT – Ubuntu package (deb) for arm64
```bash
➜ dpkg -L hailort
```
- HailoRT – PCIe driver Ubuntu package (deb)
```bash
➜ ~ dpkg -L hailort-pcie-driver
```
- HailoRT – Python package (whl) for Python 3.11, aarch64
- TAPPAS Python Binding (Optional)
```bash
➜ ~ pip list | grep hailo
```
Note: Since my Raspberry Pi 5 comes with Python 3.11 pre-installed, I have selected the Python 3.11 version. You should choose the appropriate version that matches your Python environment for installation.
Please also make sure that the packages you install are compatible with each other. As shown below, I’m using the newest compatible versions.
Finally, let’s test if Hailo is correctly configured on our Raspberry Pi 5.
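As a quick sanity check from Python, the HailoRT wheel installed above can be exercised directly. This is only a minimal sketch under the assumption that opening a `VDevice` succeeds once the PCIe driver and firmware are healthy:

```python
# Minimal sanity check: if the HailoRT wheel, PCIe driver and firmware are all
# in place, constructing a VDevice succeeds; otherwise it raises an exception.
from hailo_platform import VDevice

with VDevice() as device:
    print("Hailo device opened successfully:", device)
```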
1.3 Preparation
Please strictly follow the How to Set Up Raspberry Pi 5 and Hailo guide to fix any bugs that have already appeared or to prevent potential ones.
1.4 Identify Device Architecture
This step is a MUST.
```bash
➜ ~ hailortcli fw-control identify
```
Mine is a HAILO8L.
2. Models
2.1 Download .hefs
For simplicity, I directly downloaded some .hef files from HAILO8L Models.
2.2 hailortcli run .hefs
The .hef (Hailo Executable File) format is platform-independent: a single .hef file can be executed on both x86_64 hosts and aarch64 (ARM64) hosts, as long as the system has the appropriate Hailo runtime (HailoRT) and driver installed.
This is possible because the .hef file does not contain host-specific binaries or compiled CPU code. Instead, it encapsulates the compiled Hailo neural network graph targeted for the Hailo hardware accelerator itself. The host platform - whether x86_64 or aarch64 - acts mainly as a controller that loads the .hef into the Hailo device, configures it, and orchestrates inference.
In other words, the .hef file is tied to the Hailo hardware (e.g., Hailo-8, Hailo-8L, Hailo-10) but is independent of the host CPU architecture. This allows the same .hef model file to be deployed seamlessly across development environments (for example, a workstation with an x86_64 CPU and PCIe card) and edge devices (for example, a Raspberry Pi 5 or Jetson board with an ARM64 CPU and M.2 card).
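To make the host-as-controller idea concrete, below is a minimal inference sketch using the HailoRT Python API, following the pattern from Hailo’s own tutorials. Treat it as a sketch: the exact class names and signatures can differ between HailoRT releases, and the dummy input stands in for a real preprocessed image.

```python
import numpy as np
from hailo_platform import (HEF, VDevice, HailoStreamInterface, ConfigureParams,
                            InputVStreamParams, OutputVStreamParams, InferVStreams,
                            FormatType)

hef = HEF("yolov11s.hef")  # the compiled model; independent of the host CPU architecture

with VDevice() as target:  # the host only configures and feeds the Hailo accelerator
    params = ConfigureParams.create_from_hef(hef=hef, interface=HailoStreamInterface.PCIe)
    network_group = target.configure(hef, params)[0]

    in_params = InputVStreamParams.make(network_group, format_type=FormatType.UINT8)
    out_params = OutputVStreamParams.make(network_group, format_type=FormatType.FLOAT32)

    in_info = hef.get_input_vstream_infos()[0]
    dummy_frame = np.zeros((1, *in_info.shape), dtype=np.uint8)  # stands in for a real image

    with InferVStreams(network_group, in_params, out_params) as pipeline:
        with network_group.activate(network_group.create_params()):
            results = pipeline.infer({in_info.name: dummy_frame})
            print("output streams:", list(results.keys()))
```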
3. Demonstration
3.1 Single Image - Object Detection and Image Segmentation
In this demonstration, two well-known images from Ultralytics are used, one for object detection and one for image segmentation:
Images: Hailo object detection using yolov11s.hef; Hailo image segmentation using yolov8s_seg.hef.
3.2 Video - Detection and Tracking
- Roboflow supervision is used in this demo for tracking (see the sketch after this list).
- For demonstration, the video on the front page of Kaggle’s 720p Road and Traffic Video for Object Detection Dataset is adopted.
- The model used in this demo is yolov5s_wo_spp.hef.
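The tracking loop looks roughly like the sketch below. The Hailo-side detection is abstracted into a hypothetical `run_hailo_detector()` helper (a placeholder, not a real API) that returns boxes, confidences and class ids for one frame; the supervision calls follow its documented ByteTrack and annotator API.

```python
import cv2
import supervision as sv

# Hypothetical helper: runs yolov5s_wo_spp.hef on one BGR frame and returns
# (N, 4) xyxy boxes, (N,) confidences and (N,) class ids as numpy arrays.
from my_hailo_utils import run_hailo_detector  # placeholder module, not a real API

tracker = sv.ByteTrack()
box_annotator = sv.BoxAnnotator()
label_annotator = sv.LabelAnnotator()

cap = cv2.VideoCapture("road_traffic_720p.mp4")  # local copy of the Kaggle video (placeholder name)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    xyxy, confidence, class_id = run_hailo_detector(frame)
    detections = sv.Detections(xyxy=xyxy, confidence=confidence, class_id=class_id)
    detections = tracker.update_with_detections(detections)  # assign persistent track ids
    labels = [f"#{tid}" for tid in detections.tracker_id]
    frame = box_annotator.annotate(scene=frame, detections=detections)
    frame = label_annotator.annotate(scene=frame, detections=detections, labels=labels)
    cv2.imshow("Hailo + supervision tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```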
3.3 Detection and Tracking - Live Camera RTSP
3.3.1 On Server Raspberry Pi 5
```bash
✗ python3 camera_ai_rtsp.py
```
3.3.2 On Any Client
```bash
ffplay -rtsp_transport tcp rtsp://192.168.1.90:8554/ai
```
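Besides ffplay, any RTSP-capable client can consume the stream. For example, a small OpenCV reader (assuming opencv-python built with FFmpeg support, using the same URL as above):

```python
import cv2

# The URL matches the server command above; CAP_FFMPEG requires opencv-python built with FFmpeg.
cap = cv2.VideoCapture("rtsp://192.168.1.90:8554/ai", cv2.CAP_FFMPEG)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("Hailo RTSP stream", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```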
Images: live RTSP stream broadcast after being processed by Hailo on the Raspberry Pi 5; ffplay playing the processed live stream.
4. Compilation
Like other dedicated AI hardware platforms, Hailo‘s software stack involves compiling a trained model, converting it from the standard ONNX format into its proprietary HEF (Hailo Executable Format) file optimized for its own architecture.
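For reference, compilation with the Hailo Dataflow Compiler’s Python API (which runs on an x86_64 host, not on the Pi) roughly follows the pattern below. The names are from the DFC tutorials as I recall them and may differ between releases; the model name, node names, input shape and calibration data are all placeholders.

```python
import numpy as np
from hailo_sdk_client import ClientRunner  # Hailo Dataflow Compiler (x86_64 host only)

runner = ClientRunner(hw_arch="hailo8l")  # target the Hailo-8L used in this blog

# Parse the ONNX model; the model file, node names and input shape are placeholders.
runner.translate_onnx_model(
    "yolo_model.onnx", "yolo_model",
    start_node_names=["images"],
    end_node_names=["output0"],
    net_input_shapes={"images": [1, 3, 640, 640]},
)

# Quantize with a (placeholder) calibration set, then compile to a .hef for the accelerator.
calib_data = np.random.randint(0, 255, (64, 640, 640, 3), dtype=np.uint8)
runner.optimize(calib_data)
hef_bytes = runner.compile()

with open("yolo_model.hef", "wb") as f:
    f.write(hef_bytes)
```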
ESP32-C3 MAX98357A WiFi Speaker
1. Reference - About ESP32
All well documented by Espressif.
Cited from I2S,
> I2S (Inter-IC Sound) is a synchronous serial communication protocol usually used for transmitting audio data between two digital audio devices.
In sum:
Although a single I2S controller is nominally full-duplex, supporting both input (recording) and output (playback), in practice operations must be time-multiplexed. In other words, it is not truly capable of simultaneous input and output. To achieve real-time interaction, such as detecting user input and instantly interrupting playback, two independent I2S controllers are required: one dedicated to continuously handling microphone input, and the other to audio output to the speaker.
So, today, let’s make a Wi-Fi speaker based on the ESP32-C3, which ONLY requires I2S to serve as output (playback).
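The actual firmware lives in the GitHub repo referenced below. Purely to illustrate the TX-only I2S setup for the MAX98357A, here is a MicroPython-style sketch (an assumption on my side, not the code from the repo; the GPIO numbers are placeholders for whatever pins you wire BCLK/LRC/DIN to):

```python
import math
import struct
from machine import I2S, Pin

# Illustrative only: pin numbers are placeholders, adjust to your wiring.
audio_out = I2S(
    0,                # the ESP32-C3 has a single I2S controller
    sck=Pin(4),       # BCLK of the MAX98357A
    ws=Pin(5),        # LRC  of the MAX98357A
    sd=Pin(6),        # DIN  of the MAX98357A
    mode=I2S.TX,      # playback only, as discussed above
    bits=16,
    format=I2S.MONO,
    rate=16000,
    ibuf=8000,
)

# One period of a ~440 Hz sine tone at 16 kHz, 16-bit signed samples.
samples_per_period = 16000 // 440
tone = b"".join(
    struct.pack("<h", int(12000 * math.sin(2 * math.pi * i / samples_per_period)))
    for i in range(samples_per_period)
)

for _ in range(440):  # roughly one second of audio
    audio_out.write(tone)
audio_out.deinit()
```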
2. Hardware Components
Images: any ESP32-C3 board; MAX98357A I2S amplifier; any speaker.
3. Circuit Connection
3.1 KiCad Schematic
3.2 Real Connection
4. ESP32 Code
Please refer to Github LongerVisionRobot ESP32-C3-MAX98357A-WiFi-speaker.
5. Demonstration
Demos: ESP32-C3 playing a tone; ESP32-C3 playing an audio file.
Maix Bit - SiPeed Risc-V 64bit SBC
Happy Friday… And, how I miss my beloved son… God bless… Today, I’ll talk about the Maix Bit board.
1. Introduction
1.1 lsusb
```bash
......
```
1.2 Wiki
Detailed specifications for the Maix Bit board can be found on the MaixBit Wiki.
1.3 Three Meanings Of Maixduino
- MaixDuino board: the development board itself
- Maixduino Arduino Board Manager support: add `https://dl.sipeed.com/MAIX/Maixduino/package_Maixduino_k210_index.json` as one Additional Board Manager URL
- Maixduino Official Documentation
Note: Maixduino supports Maix Bit board.
1.4 MaixPy-v1
And, I am using MaixPy-v1 (Please refer to Introduction of MaixPy-v1).
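To give a flavor of what MaixPy-v1 code looks like, the canonical camera-preview snippet (assuming the stock camera and LCD that ship with the Maix Bit kit) is roughly:

```python
# MaixPy-v1 (MicroPython on the K210): grab frames from the camera and show them on the LCD.
import sensor
import lcd

lcd.init()
sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.run(1)

while True:
    img = sensor.snapshot()
    lcd.display(img)
```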
ESP32-C6 DRV8833 Controlled OWI-535 Robotic Arm
It is another International Children’s Day. Today, let’s have some fun using an ESP32 to control an OWI-535 robotic arm.
Finally, my DRV8833 arrived around Surrey Canada Day. Let’s FIRST carry out a simple comparison, generated with DeepSeek (so it might be incorrect).
0. Comparisons of Popular Motors of Year 2025
0.1. Motor Driver IC/Module Comparison Table
Comparison Item | L298N | DRV1508S (MX1508) | DRV8833 | CS9016C (Replacing TB6612) | HR4988 | 2025 Trends |
---|---|---|---|---|---|---|
Motor Type | DC (Brushed) | DC (Brushed) | DC (Brushed) | DC (Brushed) | Stepper (Bipolar) | BLDC/Stepper/Servo |
Commutation | Mechanical | Mechanical | Mechanical | Mechanical | Electronic | Electronic |
Voltage Range | 4.5V~46V | 5V~18V | 2.7V~10.8V | 2.5V~15V | 8V~35V | 5V~48V |
Continuous Current | 2A/ch (Peak 3A) | 0.5~1A | 1.5A/ch (Peak 2A) | 1.5A/ch (Peak 4A) | 2A/phase | 1A~30A |
Control Method | PWM + DIR | PWM + DIR | PWM + DIR | PWM + DIR | STEP + DIR | FOC/STEP/DIR |
Microstepping | No | No | No | No | Up to 1/16 | Up to 1/256 |
Efficiency | Low (BJT) | Medium (MOSFET) | High | Very High | High | Very High (FOC) |
Protection | External Diode | Basic | Comprehensive | Auto-Recovery | Comprehensive | Full Protection |
Package | Module | SOP-8 | HTSSOP-16 | TSSOP-16 | QFN-28 | Module |
Applications | Robot Cars | Toys | Battery Devices | Smart Cars/Drones | 3D Printers/CNC | Open-Source Robots |
Open-Source | No | No | Partial | Yes | Partial | Fully Open |
Price (USD) | $0.7~1.4 | $0.3~0.4 | $0.35~0.7 | $0.3~0.6 | $0.4~0.8 | $1.5~15 |
0.2. 2025 Recommendation Guide:
Application | Best Choice | Why? |
---|---|---|
Ultra-low cost (≤1A) | DRV1508S | Cheapest ($0.3) |
Balanced performance | DRV8833/CS9016C | CS9016C for current, DRV8833 for voltage range |
High-power DC (≥2A) | DRV8871 | Handles up to 3.6A continuous |
Stepper motors | HR4988 (budget) | Or TMC2209 for silent operation |
BLDC motors | VESC | Open-source FOC control |
In this particular blog, the DRV8833 is adopted rather than the L298N. Without circuit protection, the L298N is more prone to being damaged or burned out.
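Each DRV8833 channel is driven by a pair of PWM inputs. Purely to illustrate the idea, here is a MicroPython-style sketch (an assumption on my side; the real firmware is in the repo below, and the GPIO numbers are placeholders for whichever ESP32-C6 pins drive the DRV8833 inputs):

```python
from machine import Pin, PWM

# Illustrative only: placeholder GPIOs for one DRV8833 channel (AIN1/AIN2).
AIN1 = PWM(Pin(4), freq=20000)
AIN2 = PWM(Pin(5), freq=20000)
AIN1.duty_u16(0)
AIN2.duty_u16(0)

def drive(speed):
    """speed in [-1.0, 1.0]: the sign picks the direction, the magnitude sets the PWM duty."""
    duty = int(min(abs(speed), 1.0) * 65535)
    if speed >= 0:
        AIN1.duty_u16(duty)
        AIN2.duty_u16(0)
    else:
        AIN1.duty_u16(0)
        AIN2.duty_u16(duty)

drive(0.6)   # run one joint motor forward at ~60% duty
drive(0.0)   # stop
```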
1. Five DOFs of OWI-535
Refer to Pololu‘s OWI-535 Robotic Arm Edge Kit:
- a gripper
- 120° wrist motion
- 300° elbow motion
- 180° shoulder motion
- 270° base motion
2. Reference - About ESP32
Please refer to my blogs:
And:
3. Hardware Components
Images: any ESP32-C6 board; DRV8833 H-bridge motor driver; OWI-535.
The one I’m using for this blog is a WeActStudio.ESP32C6-MINI.
4. Circuit Connection
4.1 KiCad Schematic
4.2 Real Connection
5. ESP32 Code
Please refer to Github LongerVisionRobot ESP32-C6-DRV8833-WiFi-OWI-535-Robotic-Arm.
6. Demonstration
Raspberry Pi 5 + Hailo AI - (1)
1. Configuration
Images: Raspberry Pi 5 with Hailo AI + M.2 SSD; connection with the M.2 extension board.
Since I personally do NOT like running Hailo-8L on Ubuntu 24.04 using Docker, I just struggled to get everything running directly, without Docker:
2. libcamera
```bash
➜ ./libcamera-hello
```
3. Hailo Model Zoo
- Hailo Model Zoo
- My Pull Request: Remove the dependency of lap in the `hailomz` command
4. Demonstration
BeagleV-Fire 2
1. Prepare For BeagleV-Fire Ubuntu
Based on my previous blog BeagleV-Fire-1.md, we first check out BeagleV-Fire Ubuntu and make the following updates.
1.1 01_git_sync.sh
```bash
#!/bin/bash
```
1.2 04_build_linux.sh
```bash
#!/bin/bash
```
1.3 06_generate_ubuntu_console_root.sh
Just update 23.04 to 24.04
```bash
#!/bin/bash
```
1.4 git_linux_mainline.sh
```bash
#!/bin/bash
```
1.5 git_linux_mpfs.sh
```bash
#!/bin/bash
```
2. Manually Flash BeagleV-Fire Board With ALL Updates - Failed To Boot
2.1 01_git_sync.sh
```bash
➜ BeagleV-Fire-ubuntu git:(main) ✗ ./01_git_sync.sh
```
2.2 ./02_build_hss.sh
```bash
➜ BeagleV-Fire-ubuntu git:(main) ✗ ./02_build_hss.sh
```
2.3 ./03_build_u-boot.sh
```bash
➜ BeagleV-Fire-ubuntu git:(main) ✗ ./03_build_u-boot.sh
```
A new file `patches/u-boot/beaglev-fire/microchip_mpfs_icicle_defconfig` is generated as:
```
CONFIG_RISCV=y
```
2.4 04_build_linux.sh
You may meet the same issues solved in BeagleV-Fire-1.md.
```bash
➜ BeagleV-Fire-ubuntu git:(main) ✗ ./04_build_linux.sh
```
Three files are generated as follows:
- `patches/linux/dts/mpfs-beaglev-fire-fabric.dtsi`

  ```c
  // SPDX-License-Identifier: (GPL-2.0 OR MIT)
  ```

- `patches/linux/dts/mpfs-beaglev-fire.dts`

  ```c
  // SPDX-License-Identifier: (GPL-2.0 OR MIT)
  ```

- `patches/linux/mpfs_defconfig`

  ```
  #
  ```
2.5 05_generate_payload.bin.sh
```bash
➜ BeagleV-Fire-ubuntu git:(main) ✗ ./05_generate_payload.bin.sh
```
2.6 sudo ./06_generate_debian_console_root.sh
```bash
➜ BeagleV-Fire-ubuntu git:(main) ✗ sudo ./06_generate_debian_console_root.sh
```
2.7 sudo ./07_create_sdcard_img.sh
```bash
➜ BeagleV-Fire-ubuntu git:(main) ✗ sudo ./07_create_sdcard_img.sh
```
2.8 Take A Look At The Image
```bash
➜ BeagleV-Fire-ubuntu git:(main) ✗ ll deploy/images
```
2.9 Failed To Boot
```bash
➜ ~ tio /dev/ttyUSB0
```
3. Flash Official BeagleV-Fire Ubuntu 2023-11-21
3.1 Apply Official Flash
The official website of the BeagleV-Fire Board provides the official OS BeagleV-Fire Ubuntu 2023-11-21, which can be flashed onto the board directly.
Bingo! This time, we’re good to go.
The ONLY 2 commands we need to use before flashing are:
- `mmc`
- `usbdmsc`

And openssh-server has already been installed by default; namely, `ssh` is enabled by default.
3.2 ssh Into BeagleV-Fire Board
The username and password are respectively:
- username: `beagle`
- password: `temppwd`
```bash
➜ ~ uname -a
```
3.3 On-board FPGA
```bash
➜ ~ lspci
```
PinePhone
Today, let me have some fun with the open-source cellphone PinePhone.
1. Key Points
Please do strictly follow PinePhone and PinePhone Pro - 20.04 Focal Install and Update.
- Flash Tow-Boot onto TF Card
- Hold PinePhone’s Power button and Volume+ until the blue LED appears
- Flash Pine64 20.04 Ubuntu Touch Image Builder onto eMMC of PinePhone
2. Demonstration
2.1. PinePhone Flash Results
Images: Ubuntu Touch 16.04 on eMMC initially; Pine64 20.04 Ubuntu Touch Image Builder flashed on eMMC in the end.
2.2. Tow-Boot Process
Images: Tow-Boot installer interface; eMMC boot erased; Tow-Boot installed successfully; booting.
NVidia Jetson Orin Nano 8G Dev Kit
Today is Canada’s Thanksgiving. Let me do something extra on the Jetson Orin Nano.
1. My Working Environment
```bash
➜ ~ lsb_release -a
```
2. Flash From Scratch
2.1 Entering Recovery Mode
In order to enter the Recovery Mode, we need to connect Jetson Orin Nano’s Pin 9 and Pin 10 with a jumper, and then turn on the power, and connect the Jetson Orin Nano with your desktop, here in my case, Ubuntu 24.04.1 LTS.
2.2 lsusb
```bash
➜ Resource git:(master) lsusb | grep NVIDIA
```
2.3 Install Jetson Software with SDK Manager
To use SDK Manager, we need an Ubuntu 22.04.5 LTS or Ubuntu 20.04.6 LTS host. In my case, I tried with Ubuntu 20.04.6 LTS.
Images: Step 01; Step 02.
I got stuck at this point.
2.4 Jetson Orin Nano Tutorial: SSD Install, Boot, and JetPack Setup
What can I say?
My comment: so complicated. I’m too lazy to follow.
2.5 `dd` Is My Favorite Tool
For some reason, my Jetson Orin Nano has NO TF card slot, but a 128G SSD instead.
Therefore, in my case, my BEST solution is to flash the SSD directly.
Simply download and unzip the JetPack 6.1 Orin Nano SD Card Image first. Then, do `dd`:
```bash
➜ jetson sudo dd bs=1M if=sd-blob.img of=/dev/sdb conv=fsync
```
However, again, it did NOT work.
3. My LAST and FINAL Attempt - Jetson Linux Developer Guide Quick Start
I watched the entire video below, and was expecting a good result.
3.1 Before Flash
Let’s try to do `sudo ./apply_binaries.sh` again:
```bash
➜ Linux_for_Tegra sudo ./apply_binaries.sh
```
3.2 DANGER
Right before I tried to run the following command:
```bash
sudo ./tools/kernel_flash/l4t_initrd_flash.sh --external-device nvme0n1p1 \
```
I noticed nvme0n1p1!!! I ALREADY had another SSD connected to my desktop and listed as nvme0n1p1.
```bash
➜ Linux_for_Tegra lsblk
```
A whole bunch of commands have to be used in order to stop my OS from loading nvme0n1:
3.2.1 Umount
```bash
sudo umount /dev/nvme0n1p1
```
3.2.2 Check `nvme`s
```bash
➜ lspci | grep -i nvme
```
3.2.3 Find the Correct `nvme`’s PCI
```bash
➜ sudo udevadm info --query=all --name=/dev/nvme0n1 | grep ID_PATH
```
3.2.4 Unbind the Corresponding nvme
```bash
➜ Linux_for_Tegra echo "0000:01:00.0" | sudo tee /sys/bus/pci/drivers/nvme/unbind
```
3.3 Flash
Now, ALL ready. Let’s flash.
3.3.1 Package Preparations
You may still have to install a couple more packages on your host Ubuntu 24.04.1 LTS:
- sshpass
- abootimg
- nfs-kernel-server
3.3.2 Run l4t_initrd_flash.sh
However, in the end, in my case, on my host Ubuntu 24.04.1 LTS, I got:
```bash
➜ Linux_for_Tegra sudo ./tools/kernel_flash/l4t_initrd_flash.sh --external-device nvme0n1p1 \
```
Extremely sad now:
I believe there is ALWAYS a way out. The solution was finally found in the JetPack 6.1 Release Announcement: I had simply used the wrong command to flash. Try again:
```bash
sudo ./tools/kernel_flash/l4t_initrd_flash.sh --external-device nvme0n1p1 -c tools/kernel_flash/flash_l4t_t234_nvme.xml -p '-c bootloader/generic/cfg/flash_t234_qspi.xml' --showlogs --network usb0 jetson-orin-nano-devkit internal
```
This time, it succeeded.
4. Demonstration of Jetson Orin Nano
4.1 Initial Login Without ssh
You need DP (DisplayPort). I had to purchase this new cable, iCAN 28AWG 1080p DisplayPort to HDMI Cable Male to Male Gold-plated White Color - 3 ft., in order to successfully get a display for the Jetson Orin Nano’s first-time run.
4.2 ssh Into Jetson Orin Nano After ssh Is Enabled
4.3 jetson_release
4.4 jtop
4.5 jetson_clocks
4.6 tegrastats
4.7 nvidia-smi
5. Demonstration of YOLOv11 on Jetson Orin Nano
You can find PyTorch for JetPack 6.1 in Jetson Download Center. However….
5.1 Current Issue
- JetPack 6.1 Release Announcement
- PyTorch for Jetson - As of the latest update, it ONLY supports PyTorch v2.3.0 on JetPack 6.0 (L4T R36.2 / R36.3) + CUDA 12.2.
- Installing PyTorch for Jetson Platform - As of today, October 17, 2024, what I can reach so far is this .pdf file: Guide Of Installing PyTorch For Jetson Platform.
5.2 Make YOLOv11 On Jetson Orin Nano Work With GPU
Solution:
- `pip install ultralytics`
- Overwrite the `torch` installed with `ultralytics` by PyTorch for JetPack 6.1
- Build torchvision from source
To successfully build torchvision from source, you may need to allocate more swap memory:
```bash
sudo fallocate -l 4G /swapfile
```
Otherwise, you may meet the following error:
```
c++: fatal error: Killed signal terminated program cc1plus
```
ChatGPT explained it as:
> The error you're encountering while building torchvision seems to be caused by your system running out of resources, particularly memory. The message c++: fatal error: Killed signal terminated program cc1plus often indicates that the system killed the process due to memory exhaustion.
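Once the JetPack PyTorch wheel and the source-built torchvision are in place, a quick way to confirm that YOLOv11 actually runs on the GPU is a check along these lines (assuming the `ultralytics` package and the `yolo11n.pt` weights; the sample image URL is the one used in the Ultralytics docs):

```python
import torch
from ultralytics import YOLO

# Confirm the JetPack PyTorch wheel sees the Orin's GPU.
print(torch.__version__, torch.cuda.is_available(), torch.cuda.get_device_name(0))

# Run a single inference on the GPU (device=0) with a small YOLOv11 model.
model = YOLO("yolo11n.pt")
results = model.predict("https://ultralytics.com/images/bus.jpg", device=0)
print(results[0].speed)  # preprocess / inference / postprocess times in ms
```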
Images: ssh & neofetch; PyTorch for JetPack 6.1 with CUDA enabled.
Images: YOLOv11 initialization; YOLOv11 running on the GPU.
The End.
LuckFox Pico Ultra W RV1106G3
I haven’t tried out my LuckFox Pico Ultra W for over a month. Tonight, or actually today, is Canada’s Thanksgiving of year 2024.
1. Preparation
1.1 lsusb
- Step 1: hold the `BOOT` button
- Step 2: connect the USB cable
- Step 3: release the `BOOT` button
The LuckFox Pico Ultra W is quite picky about USB 3 cables.
```bash
➜ ~ lsusb | grep Fuzhou
```
1.2 Flash firmware
```bash
➜ ultra-w-buildroot sudo upgrade_tool uf update.img
```
1.3 The Wired Connection’s MAC Address Keeps Changing. How to Fix It?
Please add the following netplan configuration:
```bash
➜ ~ cat /etc/netplan/01-netcfg.yaml
```
and then run `sudo netplan apply`.
1.4 ssh Into LuckFox Pico Ultra W
- username: `root`
- password: `luckfox`
Images: Luckfox Pico Ultra W CPUInfo; Luckfox Pico Ultra W MemInfo.
2. Luckfox Pico Ultra W CSI Camera
2.1 Connection Of CSI Camera
For my LuckFox Pico Ultra W, I need to follow this citation (refer to CSI Camera):

> When connecting the camera to LuckFox Pico Ultra development boards, ensure that the metal side of the camera ribbon cable faces the chip on the development board.
2.2 Content of rkipc.ini
```bash
[root@luckfox data]# cat rkipc.ini
```
2.3 luckfox-config
Images: Luckfox Pico Ultra W luckfox-config; Luckfox Pico Ultra W About; enabling CSI; CSI enabled.