* OpenMV camera (P4, P5) and control board (SCL, SDA) are connected through the I2C protocol.
* Laptop connects to the OpenMV camera and the control board (to update the execution code)
* 3 × 1500 mAh batteries power the control board; the control board also mounts to the Motor Shield
**High-Level Logic** (current):
```pseudocode
If [Camera] detects target --> (x, y):
    [Control_Board] runs PID control
    until it moves towards (x, y)
Else ([Camera] does not detect):
    [Control_Board] seeks:
        rotate horizontally
        no vertical action
```
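The loop above can be sketched as follows. This is a minimal sketch, not the project's actual code: `detect_target`, `pid_step`, and `rotate_horizontally` are hypothetical placeholders standing in for the camera RPC call and the control-board motor code.

```python
# Minimal sketch of the high-level detect/seek loop.
# detect_target() returns (x, y) or None, pid_step() computes the control
# action towards the target, rotate_horizontally() implements seeking.
def control_loop(detect_target, pid_step, rotate_horizontally):
    target = detect_target()
    if target is not None:
        x, y = target
        return pid_step(x, y)   # Detected: PID control towards (x, y)
    else:
        rotate_horizontally()   # Not detected: rotate horizontally only,
        return None             # no vertical action
```

One iteration of the real loop does the same thing: the camera call plays the role of `detect_target`, and the control board's PID or seeking routine runs depending on its result.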
## (Color) Detection Module
In `feather_main_personaldemo_PID.ino`, inside `loop(){...}`, the "Detect / Not Detect" branch has a condition called **`cam.exe_color_detection_biggestblob()`**.
It uses the `interface->call()` function of the OpenMV MicroPython [`rpc` library](https://docs.openmv.io/library/omv.rpc.html) to call the function `color_detection_single` in **Shahrul Kamil bin Hassan / blimp-autonomy / Codes / OpenMV H7 Plus Camera /[openMV_main.py](https://git.uclalemur.com/shahrulkamil98/blimp-autonomy/-/blob/main/Codes/OpenMV H7 Plus Camera/openMV_main.py)**
```python
# When called returns the x/y centroid of the largest blob
# within the OpenMV Cam's field-of-view.
#
# data is the 6-byte color tracking threshold tuple of
# L_MIN, L_MAX, A_MIN, A_MAX, B_MIN, B_MAX.
def color_detection_single(data):
    red_led.off()
    green_led.on()
    sensor.set_pixformat(sensor.RGB565)
    sensor.set_framesize(sensor.QVGA)
    thresholds = struct.unpack("<bbbbbb", data)
    print(thresholds)
    blobs = sensor.snapshot().find_blobs([thresholds],
                                         pixels_threshold=500,
                                         area_threshold=500,
                                         merge=True,
                                         margin=20)
    if not blobs:
        return struct.pack("<HH", 0, 0)  # No detections.
    out_blob = max(blobs, key=lambda b: b.density())  # Largest blob by density.
    return struct.pack("<HH", out_blob.cx(), out_blob.cy())
```
There is also another function called `color_detection(thresholds)`, a more complex version of `color_detection_single()` that can detect multiple balls.
Now we take a deeper look into the **`color_detection_single()`** function:
* **Input**: six `int8_t` numbers for the **threshold**
* Threshold is in [LAB format for color](http://shutha.org/node/851)
* Passed all the way down from `feather_main_personaldemo.ino`, as the
* 6-byte color tracking threshold tuple `L_MIN, L_MAX, A_MIN, A_MAX, B_MIN, B_MAX`
* Used as a criterion for [`img.find_blobs()`](https://docs.openmv.io/library/omv.image.html#image.image.Image.image.find_blobs) to find a blob
* `get_fb()` (Get Frame Buffer): returns the image object from the previous call to `sensor.snapshot()`
* `out_blob`: the blob with the maximum density
* **Output**: returns the center of the blob, `(out_blob.cx, out_blob.cy)`
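The byte layout of the input and output can be illustrated with Python's `struct` module. The threshold values below are made up for illustration; they are not the project's actual tuned green thresholds.

```python
import struct

# Hypothetical LAB thresholds (L_MIN, L_MAX, A_MIN, A_MAX, B_MIN, B_MAX).
thresholds = (30, 100, -64, -8, -32, 32)

# Caller side: pack six int8_t values into the 6-byte rpc payload.
data = struct.pack("<bbbbbb", *thresholds)
assert len(data) == 6

# Camera side: color_detection_single() unpacks the same payload.
assert struct.unpack("<bbbbbb", data) == thresholds

# The reply is the blob centroid packed as two uint16_t values;
# (0, 0) signals "no detection".
reply = struct.pack("<HH", 160, 120)
cx, cy = struct.unpack("<HH", reply)
```

Note that `(0, 0)` doubles as the no-detection sentinel, so a blob whose centroid is exactly the top-left pixel is indistinguishable from a miss.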
**Note**:
* Currently, the threshold is manually set to values suitable for detecting green inside the experiment space.
* We plan to upgrade this to an ML-based process for more accurate detection.
## PID Control Module
### PID Prelim
$$
\begin{aligned}
&u(t)=K_{p} e(t)+K_{i} \int e(t) \, d t+K_{d} \frac{d e}{d t} \\
&u(t)=\text { PID control variable } \\
&K_{p}=\text { proportional gain } \\
&K_{i}=\text { integral gain } \\
&K_{d}=\text { derivative gain } \\
&e(t)=\text { error value } \\
&d e=\text { change in error value } \\
&d t=\text { change in time }
\end{aligned}
$$
In the discrete domain:
* Integration becomes a summation
* Derivative becomes a difference
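The discrete form can be sketched as a small class; this is an illustrative sketch, not the Feather's actual implementation, and the gains and timestep are placeholder values.

```python
class PID:
    """Discrete PID: integration as a running sum, derivative as a difference."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0     # running sum approximating the integral term
        self.prev_error = 0.0   # previous error for the finite difference

    def step(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        # u = Kp*e + Ki*sum(e*dt) + Kd*(de/dt)
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```

Each call to `step()` takes the current error (e.g. the offset between the blob centroid and the image center) and returns the control output `u`.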
### Feather PID
**Shahrul Kamil bin Hassan / november-2021-blimp-competition / Code / Main Feather Code / [feather_main_personaldemo_PID.ino](https://git.uclalemur.com/shahrulkamil98/november-2021-blimp-competition/-/blob/main/Code/Main Feather Code/feather_main_personaldemo_PID.ino)**