
Refer to Launchpad PPA:

Solution

  • dput Way
    dput ppa:longervision/googletest googletest_1.11.0_amd64.changes
  • FTP
    Edit ~/.dput.cf as:
    [googletest]
    fqdn = ppa.launchpad.net
    method = ftp
    incoming = ~longervision/ubuntu/
    login = jiapei100
    allow_unsigned_uploads = 0

LongerVision PPA

PPA googletest

Nothing but a collection of various tutorials.

1. Comprehensive

1.1 Tutorials

1.2 Quizzes

1.3 Interviews

1.4 E-Books

1.5 Blogs

2. Programming Languages

2.1 C

2.2 C++

2.3 GDB

2.4 Java

2.5 Python

2.6 Golang

2.7 Rust

2.8 Dart

3. Algorithms

4. Robotics

4.1 Comprehensive Open Source

4.2 Robotic Arm

5. Linux

5.1 Linux Operating System

5.2 Shell Scripts

6. Coding Tricks

6.1 Regular Expressions

6.2 Git

7. Cheat Sheets

I happened to come across the blog XGBoost vs LightGBM: How Are They Different. Let's investigate the following four machine learning open source packages a bit more widely and deeply.

1. Some Readings

Cited from XGBoost vs LightGBM: How Are They Different

Two of the most popular algorithms that are based on Gradient Boosted Machines are XGBoost and LightGBM. 

2. Concepts
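Before comparing the libraries, it helps to see the core idea they share. Here is a pure-Python toy of gradient boosting with decision stumps; the data and helper names are made up for illustration, and real libraries like XGBoost and LightGBM add second-order gradients, histogram binning, and heavy regularization on top of this skeleton:

```python
# Toy gradient-boosting regressor built from two-leaf decision stumps.
# Each round fits a stump to the current residuals (the negative gradient
# of squared error) and adds a shrunken copy of it to the ensemble.

def fit_stump(xs, residuals):
    """Fit a two-leaf stump: pick the threshold minimizing squared error."""
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue
        lmean, rmean = sum(left) / len(left), sum(right) / len(right)
        err = sum((r - lmean) ** 2 for r in left) \
            + sum((r - rmean) ** 2 for r in right)
        if best is None or err < best[0]:
            best = (err, t, lmean, rmean)
    _, t, lmean, rmean = best
    return lambda x, t=t, l=lmean, r=rmean: l if x <= t else r

def gradient_boost(xs, ys, rounds=50, lr=0.1):
    """Boost `rounds` stumps with learning rate `lr`."""
    pred = [0.0] * len(xs)
    stumps = []
    for _ in range(rounds):
        residuals = [y - p for y, p in zip(ys, pred)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        pred = [p + lr * stump(x) for p, x in zip(pred, xs)]
    return lambda x: sum(lr * s(x) for s in stumps)

# A tiny step-shaped dataset: the ensemble should learn values near 1 and 3.
xs = [1, 2, 3, 4, 5, 6, 7, 8]
ys = [1.0, 1.1, 0.9, 1.0, 3.0, 3.1, 2.9, 3.0]
model = gradient_boost(xs, ys)
print(model(2), model(7))
```

The shrinkage factor `lr` is the same "learning rate" knob exposed by all four libraries below; smaller values need more rounds but generalize better.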

3. Amazon.com - Employee Access Challenge

In this blog, I'm going to use a very old Kaggle dataset, Amazon.com - Employee Access Challenge.

4. XGBoost

Please make sure:

➜  ~ pip show xgboost
Name: xgboost
Version: 1.6.0.dev0
Summary: XGBoost Python Package
Home-page: https://github.com/dmlc/xgboost
Author:
Author-email:
License: Apache-2.0
Location: /home/lvision/.local/lib/python3.8/site-packages
Requires: numpy, scipy
Required-by: autoviz
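As a cross-check, the pip show lookups in this post can also be replicated from Python with the standard library's importlib.metadata. This is just a sketch; any of these distributions may be absent on your machine, in which case None is printed instead of a version string:

```python
# Query installed distribution versions, mirroring `pip show <pkg>`.
from importlib.metadata import version, PackageNotFoundError

def installed_version(dist):
    """Return the installed version of a distribution, or None if absent."""
    try:
        return version(dist)
    except PackageNotFoundError:
        return None

for dist in ["xgboost", "lightgbm", "catboost", "h2o", "mlflow"]:
    print(dist, installed_version(dist))
```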

5. LightGBM

Please make sure:

➜  ~ pip show lightgbm
Name: lightgbm
Version: 3.3.2.99
Summary: LightGBM Python Package
Home-page: https://github.com/microsoft/LightGBM
Author:
Author-email:
License: The MIT License (Microsoft)
Location: /home/lvision/.local/lib/python3.8/site-packages
Requires: numpy, scikit-learn, scipy, wheel
Required-by:

Pay Attention: In order to have XGBoost or LightGBM successfully built, you may have to:

6. CatBoost

➜  ~ pip show catboost
Name: catboost
Version: 1.0.4
Summary: Catboost Python Package
Home-page: https://catboost.ai
Author: CatBoost Developers
Author-email:
License: Apache License, Version 2.0
Location: /home/lvision/.local/lib/python3.8/site-packages
Requires: graphviz, matplotlib, numpy, pandas, plotly, scipy, six
Required-by:

7. H2O

➜  ~ pip show h2o
Name: h2o
Version: 3.37.0.6
Summary: H2O, Fast Scalable Machine Learning, for python
Home-page: https://github.com/h2oai/h2o-3.git
Author: H2O.ai
Author-email: support@h2o.ai
License: Apache v2
Location: /home/lvision/.local/lib/python3.8/site-packages
Requires: future, requests, tabulate
Required-by:

8. MLflow

➜  pip show mlflow
Name: mlflow
Version: 1.24.1.dev0
Summary: MLflow: A Platform for ML Development and Productionization
Home-page: https://mlflow.org/
Author: Databricks
Author-email:
License: Apache License 2.0
Location: /home/lvision/.local/lib/python3.8/site-packages
Requires: alembic, click, cloudpickle, databricks-cli, docker, entrypoints, Flask, gitpython, gunicorn, importlib-metadata, numpy, packaging, pandas, prometheus-flask-exporter, protobuf, pytz, pyyaml, querystring-parser, requests, scipy, sqlalchemy, sqlparse
Required-by:


1. Definition of EDA

Directly refer to Rapid-Fire EDA process using Python for ML Implementation.

Clearly, PCA is a part of EDA.

Here is another good tutorial for EDA: A Starter Pack to Exploratory Data Analysis with Python, pandas, seaborn, and scikit-learn.
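The very first EDA step in both tutorials, per-column summary statistics, can be sketched with the standard library alone before reaching for pandas. The column values below are made-up stand-ins for one feature of a real dataset:

```python
import statistics

def describe(column):
    """Pure-Python analogue of pandas' describe() for one numeric column."""
    return {
        "count": len(column),
        "mean": statistics.fmean(column),
        "std": statistics.stdev(column) if len(column) > 1 else 0.0,
        "min": min(column),
        "max": max(column),
    }

# Hypothetical values, standing in for one feature column of a real dataset.
resource_ids = [4675, 79092, 25993, 75078, 3853]
summary = describe(resource_ids)
print(summary)
```

In practice you would run this (or df.describe()) over every column, then move on to correlations and visualization.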

2. Visualization with t-SNE

3. Try Analyzing and Visualizing MNIST

Let’s just refer to the FIRST tutorial and try out how to analyze and visualize the famous dataset MNIST.

Please refer to my copy-paste-modification jupyter notebook code at https://github.com/LongerVision/Kaggle.

4. Try Analyzing and Visualizing Fashion-MNIST

Let’s then refer to the SECOND tutorial and try out how to analyze and visualize Fashion-MNIST.

Please refer to my copy-paste-modification jupyter notebook code at https://github.com/LongerVision/Kaggle.

Happy New Year everyone. It's been a long time since I last took a look at activation functions. It might be a good time for me to review and check.

1. Resource

Let's first take a look at the activation functions of 2022.

I would strongly recommend reading the two articles mentioned above, particularly the one on Smish.
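Since Smish is the highlight, here is a quick pure-Python sketch of it as I understand the definition, f(x) = x · tanh(ln(1 + sigmoid(x))); please double-check against the original article before relying on it:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def smish(x):
    """Smish activation: x * tanh(ln(1 + sigmoid(x)))."""
    return x * math.tanh(math.log(1.0 + sigmoid(x)))

# Smish is zero at the origin, slightly negative for negative inputs,
# and roughly linear for large positive inputs.
for x in (-2.0, 0.0, 2.0):
    print(x, smish(x))
```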

2.

It's been over a year and a half since JeVois introduced its second-generation smart camera, the JeVois Pro. Although JeVois failed to reach its campaign goal on Kickstarter, it's still my great pleasure to give it a try. Again: JeVois Pro is made in China.

1. Specification Comparisons

Specifications (JeVois Pro vs JeVois):

  • CPU - JeVois Pro: Amlogic A311D with 4x A73 @ 2.2 GHz + 2x A53 @ 1.8 GHz; JeVois: Allwinner A33 quad core ARM Cortex A7 @ 1.34GHz with VFPv4 and NEON
  • GPU - JeVois Pro: quad-core Mali G52 MP4 @ 800 MHz; JeVois: dual-core Mali-400 supporting OpenGL-ES 2.0
  • NPU - JeVois Pro: 5-TOPS (trillion operations/s) integrated Neural Processing Unit; JeVois: none
  • RAM - JeVois Pro: 4 GB LPDDR4-3200; JeVois: 256MB DDR3 SDRAM
  • Camera - JeVois Pro: 2MP Sony IMX290 back-illuminated Starvis sensor, 1/2.8”, 12mm lens, 1920x1080 at up to 120fps, rolling shutter, wide dynamic range support; JeVois: 1.3MP
  • IMU - JeVois Pro: TDK InvenSense ICM-20948 with 3-axis accelerometer, 3-axis gyro, 3-axis compass, SPI bus @ 7 MHz, can be synchronized with camera sensor; JeVois: none

JeVois Pro and JeVois respectively look like the following:

JeVois Pro and JeVois

2. Quick Start

Now, let's follow the Quick Start and give JeVois Pro a test. After successfully flashing the TF card, let's boot up JeVois Pro. For the first boot, it's better to connect HDMI, a keyboard, and a mouse.

JeVois Pro has 3 booting choices, as follows.

JeVois-Pro 3 Boot Choices

For convenience in setting up the network, I chose Ubuntu Graphical. Now is the time to plug in the Wifi dongle/module. Then, reboot.

JeVois-Pro XFCE

Clearly, JeVois Pro's Ubuntu Graphical comes with the XFCE GUI.

3. SSH Into JeVois Pro

➜  ~ ssh jevois@192.168.1.78
The authenticity of host '192.168.1.78 (192.168.1.78)' can't be established.
ECDSA key fingerprint is SHA256:JS0C5IOYk3dkPj8s5U/7QGvtswZowO5+QfpFJNMfBfE.
Are you sure you want to continue connecting (yes/no/[fingerprint])? yes
Warning: Permanently added '192.168.1.78' (ECDSA) to the list of known hosts.
jevois@192.168.1.78's password:

Welcome to JeVois Fenix 1.0.7 Ubuntu 20.04.3 LTS Linux 4.9.241
_ __ __ _ ____
| | __\ \ / /__ (_)___ | _ \ _ __ ___
_ | |/ _ \ \ / / _ \| / __| | |_) | '__/ _ \
| |_| | __/\ V / (_) | \__ \ | __/| | | (_) |
\___/ \___| \_/ \___/|_|___/ |_| |_| \___/


* Website: https://jevois.org
* Documentation: https://jevois.org/doc
* Forum: https://jevois.org/qa

jevois@JeVoisPro:~$

4. NO RTSP Video Streaming

According to my Email to JeVois Inc. and their reply, JeVois Pro hasn't implemented the video streaming part yet. Clearly, two streams need to be transferred via Wifi:

  • the original captured video stream
  • the processed resultant video stream
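Until RTSP support lands, one common stopgap (not an official JeVois feature; the boundary name and frame bytes below are hypothetical) is MJPEG over HTTP, which simply wraps JPEG frames in multipart chunks:

```python
def mjpeg_chunks(frames, boundary=b"frame"):
    """Wrap already-encoded JPEG frames in multipart/x-mixed-replace chunks,
    the framing used by MJPEG-over-HTTP streams."""
    for jpeg in frames:
        yield (b"--" + boundary + b"\r\n"
               + b"Content-Type: image/jpeg\r\n"
               + b"Content-Length: " + str(len(jpeg)).encode() + b"\r\n\r\n"
               + jpeg + b"\r\n")

# Two fake frames just to show the framing; a real loop would feed JPEG bytes
# captured from the camera (one such stream per video: raw and processed).
chunks = list(mjpeg_chunks([b"\xff\xd8AAA\xff\xd9", b"\xff\xd8BB\xff\xd9"]))
print(len(chunks))
```

Serving such chunks with Content-Type multipart/x-mixed-replace from any HTTP server would give a viewable stream in a browser, one server per stream.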

Let’s leave this issue aside for now…

Merry Christmas everyone. It is another Christmas; Year 2022 is coming, and I'm back to life. Last week, my JetBot arrived. It took me a couple of hours to make it work, and around one week to demonstrate that some of the JetBot Examples are NOT working properly. Anyway, my son Fionn is half a year old. Let's celebrate with JetBot, as well as the heavy Christmas snows.

(Photo gallery: Christmas snow, my son in the snow, skiing, and the Christmas tree.)

I believe we can flash JetPack first and have JetBot configured afterwards. Let's leave that topic for the next blog, The JetBot - 2. In this blog, we're going to follow the official JetBot Software Setup.

1. Setup JetBot Hardware

Just refer to this youtube video:

Since what I purchased is a Waveshare JetBot 2GB AI Kit, my assembled JetBot looks like the following:

My Assembled JetBot

As you can see, there is NO antenna. Instead, a Wifi USB dongle is provided. Actually, there are a couple of things to be emphasized for my Waveshare JetBot 2GB AI Kit:

  • No antenna, ONLY a Wifi USB dongle
  • A 64G TF card is adopted. A comprehensive overview of SD cards is summarized in the SD Standard Overview.

2. Setup JetBot Software Using SD Card

2.1 SD Card Image from JetBot Official Website

We just download the official JetBot SD card image from Software Setup (SD Card Image) and flash it to the 64G TF card.

➜  jetbot sudo dd bs=1M if=jetbot-043_nano-2gb-jp45.img of=/dev/mmcblk0 conv=fsync 
[sudo] password for lvision:
24679+1 records in
24679+1 records out
25877824000 bytes (26 GB, 24 GiB) copied, 1002.99 s, 25.8 MB/s

After that, as usual, we upgrade the entire JetBot system with sudo apt update and sudo apt upgrade. After all this, our JetBot looks like:

JetBot Linux Kernel

2.2 Change from using Python 2 to Python 3

➜  ~ python --version
Python 3.6.9
➜ ~ pip --version
WARNING: pip is being invoked by an old script wrapper. This will fail in a future version of pip.
Please see https://github.com/pypa/pip/issues/5599 for advice on fixing the underlying issue.
To avoid this problem you can invoke Python with '-m pip' instead of running pip directly.
pip 21.3.1 from /home/jetbot/.local/lib/python3.6/site-packages/pip (python 3.6)
➜ ~ whereis pip
pip: /usr/bin/pip /home/jetbot/.local/bin/pip3.6 /home/jetbot/.local/bin/pip
➜ ~ ll /usr/bin/pip
lrwxrwxrwx 1 root root 27 Dec 22 12:46 /usr/bin/pip -> /home/jetbot/.local/bin/pip
➜ ~ python -m pip --version
pip 21.3.1 from /home/jetbot/.local/lib/python3.6/site-packages/pip (python 3.6)
➜ ~

2.3 Jetson Stats

After tinkering with how to run pip under /home/jetbot/.local/bin as user root, and installing Jetson Stats with sudo -H pip install -U jetson-stats, we got the following results:

Jtop
Jetson_config
Jetson_release

2.4 Other Third Party Libraries

Anyway, let’s take a look at my installed results:

OpenCV and Protobuf

Python JetBot

2.5 Key Points

Do NOT forget to export a couple of environment variables in both .bashrc and /etc/environment:

export OPENBLAS_CORETYPE="AARCH64"
export LD_PRELOAD="/home/jetbot/.local/lib/python3.6/site-packages/torch/lib/libgomp-d22c30c5.so.1"
export LD_PRELOAD="/usr/lib/aarch64-linux-gnu/libgomp.so.1.0.0"

It appears that directly installing PyTorch via pip install torch -U --user installs a second libgomp. Both libgomp copies gave me the error cannot allocate memory in static TLS block; please refer to:

The solution to the above errors is:

3. Test JetBot

Pretty much similar to:

JetBot Basic Motion Demo is really a very cool demo for you to start with JetBot. It wouldn’t be hard for you to run a Jupyter Notebook Server remotely. Anyway, my testing results are given as follows: a headache video, as well as .pdf summary.

😰

4. Longer Vision JetBot OS

➜  sudo dd bs=1M if=/dev/mmcblk0 of=jetbot.img conv=fsync
[sudo] password for lvision:
60350+0 records in
60350+0 records out
63281561600 bytes (63 GB, 59 GiB) copied, 726.538 s, 87.1 MB/s

Finally, I uploaded my built JetBot OS to https://www.longervision.us/iso/jetbot-nano2g-ubuntu18.04.6-cuda10.02-cudnn8.0.0-opencv4.5.4.img. You are welcome to give it a try.

😄