Hedda · March 24, 2022, 12:50pm
FYI: Garrett Brown (a.k.a. garbear / a.k.a. eigendude, known as the RetroPlayer developer from Team Kodi/XBMC) has just released a new open-source smart home platform called "OASIS", built on the ROS 2 (Robot Operating System) stack. It is designed for computer-vision-based interaction and control, with camera viewing and input streaming as the two initial use cases:
In addition to releasing OASIS v1.0.0 on GitHub, he has submitted a pull request to Kodi for computer-vision-based interaction and smart home control using a Kinect One (Kinect v2). As a proof-of-concept test, he also shows how input streaming from peripherals (such as a gamepad or a keyboard and mouse) can be used to control Arduino-based ROS robot devices running Firmata firmware:
xbmc:feature_smarthome ← garbear:smarthome (opened 10:34AM - 24 Mar 22 UTC)
## Description
This PR introduces a new paradigm of Kodi usage: smart home interaction and control. Camera viewing and input streaming are provided as two initial use cases.
The approach I've taken with smart home is a bit unique: I built on ROS 2. While this introduces some heavy overhead, building on an industrial decentralized communication framework allows for scalability to virtually unlimited low power smart home devices. My smart home's computation graph is currently at 8 Linux nodes and 3 Arduino nodes and growing.
Alongside this PR is a repo devoted to my smart home OS: https://github.com/eigendude/OASIS.
OASIS provides a complete ROS 2 stack for computer vision, input streaming and general automation. It also provides a complete implementation of the Firmata protocol for communicating with Arduinos, with additional support for temperature and humidity sensors, I2C, servos, sonar, SPI, stepper motors, and 4-wire CPU fans.
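For illustration only (not part of this PR), here is a minimal sketch of what driving a Firmata-flashed Arduino from Python looks like, using the pymata-express library that this PR lists among its dependencies. The pin number is an arbitrary example (pin 13 is the typical on-board LED):
```python
# Minimal sketch: toggling a digital pin on a Firmata-flashed Arduino
# with pymata-express. Pin 13 (on-board LED) is just an example.
import asyncio
from pymata_express.pymata_express import PymataExpress

LED_PIN = 13

async def blink(board, pin):
    await board.set_pin_mode_digital_output(pin)
    for _ in range(5):
        await board.digital_write(pin, 1)   # LED on
        await asyncio.sleep(0.5)
        await board.digital_write(pin, 0)   # LED off
        await asyncio.sleep(0.5)
    await board.shutdown()

loop = asyncio.get_event_loop()
board = PymataExpress()
loop.run_until_complete(blink(board, LED_PIN))
```
The same library exposes the higher-level Firmata features mentioned above (servos, I2C, sonar, etc.); OASIS builds its Arduino nodes on top of that protocol.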
I'm not seriously proposing we merge this and add ROS 2 as a dependency; it adds 2 million lines of code. I'm just sharing the code I run every day at home, and maybe it'll inspire someone.
### Camera viewing
A new `cameraview` control is introduced. It takes a parameter, the ROS image topic to subscribe to. For testing, I created the following `SmartHome.xml` window:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<window>
  <controls>
    <control type="cameraview">
      <width>50%</width>
      <height>50%</height>
      <topic>/oasis/netbook/image_raw</topic>
    </control>
    <control type="cameraview">
      <left>50%</left>
      <height>50%</height>
      <topic>/oasis/lenovo/image_raw</topic>
    </control>
    <control type="cameraview">
      <width>50%</width>
      <top>50%</top>
      <topic>/oasis/netbook/foreground</topic>
    </control>
    <control type="cameraview">
      <left>50%</left>
      <top>50%</top>
      <topic>/oasis/lenovo/foreground</topic>
    </control>
  </controls>
</window>
```
This window creates a 2x2 matrix. The top two cameraview controls are the direct output of two laptop cameras. The bottom two cameraview controls are the output of a rudimentary person detector using a [background subtraction library](https://github.com/andrewssobral/bgslibrary) applied to the camera feeds.
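As a rough illustration of the foreground feeds (not the actual OASIS node, which uses the linked bgslibrary and publishes the result as a ROS image topic), a background-subtraction pass over a camera stream can be sketched with OpenCV's built-in MOG2 subtractor:
```python
# Stand-in sketch for the "foreground" feeds: background subtraction
# over a webcam stream. The PR uses bgslibrary; OpenCV's MOG2
# subtractor is used here only to illustrate the idea.
import cv2

cap = cv2.VideoCapture(0)                        # first attached camera
subtractor = cv2.createBackgroundSubtractorMOG2()

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)               # per-pixel foreground mask
    foreground = cv2.bitwise_and(frame, frame, mask=mask)
    cv2.imshow("foreground", foreground)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```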
Screenshot of the above window (not bad for low light netbook cameras):

I forked the [Kinect 2 driver](https://github.com/eigendude/oasis_kinect2) and ported it to ROS 2. You can see depth registration working, but I'll probably do skeletal tracking on the 2D image via MediaPipe as a person detection solution.
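For reference, the MediaPipe approach mentioned above could look roughly like the following sketch (not part of this PR; the input is assumed to be a single OpenCV BGR frame, e.g. from the Kinect's color camera):
```python
# Sketch of MediaPipe-based person/pose detection on a 2D BGR frame.
import cv2
import mediapipe as mp

frame = cv2.imread("kinect_color_frame.png")     # hypothetical test image

with mp.solutions.pose.Pose(static_image_mode=True) as pose:
    # MediaPipe expects RGB input
    results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))

if results.pose_landmarks:
    # A person was detected; landmarks are normalized (x, y) coordinates
    nose = results.pose_landmarks.landmark[mp.solutions.pose.PoseLandmark.NOSE]
    print(f"Person detected, nose at ({nose.x:.2f}, {nose.y:.2f})")
else:
    print("No person detected")
```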

### Input streaming
The second use case is input streaming for smart home control. Kodi advertises two ROS topics:
* `/peripherals` - A list of attached input peripherals, which is all controllers plus the keyboard and mouse
* `/input` - A topic that publishes the state of controllers
Kodi also provides a ROS service:
* `/capture_input` - Calling this service causes a peripheral's input to be captured and sent over the `input` topic.
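A hedged sketch of what a consumer of the peripherals topic might look like on the ROS 2 side. The message package and type names below (`oasis_msgs`, `PeripheralScan`) are placeholders rather than the actual OASIS definitions; the field names match the YAML output shown further down:
```python
# Sketch of a ROS 2 node that listens to Kodi's peripherals topic.
# NOTE: "oasis_msgs" / "PeripheralScan" are placeholder names; substitute
# the real message types from the OASIS repo.
import rclpy
from rclpy.node import Node
from oasis_msgs.msg import PeripheralScan  # hypothetical message type


class PeripheralListener(Node):
    def __init__(self):
        super().__init__("peripheral_listener")
        self.create_subscription(
            PeripheralScan, "/peripherals", self.on_peripherals, 10
        )

    def on_peripherals(self, msg):
        for peripheral in msg.peripherals:
            self.get_logger().info(
                f"{peripheral.name} -> {peripheral.controller_profile}"
            )


def main():
    rclpy.init()
    rclpy.spin(PeripheralListener())
    rclpy.shutdown()


if __name__ == "__main__":
    main()
```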
In my home, I'm using a PS controller to drive my childhood Lego train. The input is captured by Kodi running on my NAS with a Bluetooth dongle, and sent to a Raspberry Pi with an Arduino and a robotics motor controller that drives the train's 9V motors.
https://youtu.be/zMA9HYPH4Tw
## Motivation and context
Smart Home!!!!
## How has this been tested?
I've been using Kodi as part of my smart home setup since May of 2021. ROS 2 is so robust that my system barely goes down.
## What is the effect on users?
* Smart Home!!!
## Related PRs
Depends on:
* https://github.com/xbmc/xbmc/pull/21182
* https://github.com/a9183756-gh/Arduino-CMake-Toolchain/pull/55
* https://github.com/paul-shuvo/iai_kinect2_opencv4/pull/3
* https://github.com/MrYsLab/pymata-express/pull/40
* https://github.com/xbmc/peripheral.joystick/pull/235
* https://github.com/eigendude/OASIS
## Types of change
- [ ] **Bug fix** (non-breaking change which fixes an issue)
- [ ] **Clean up** (non-breaking change which removes non-working, unmaintained functionality)
- [ ] **Improvement** (non-breaking change which improves existing functionality)
- [x] **New feature** (non-breaking change which adds functionality)
- [ ] **Breaking change** (fix or feature that will cause existing functionality to change)
- [ ] **Cosmetic change** (non-breaking change that doesn't touch code)
- [ ] **None of the above** (please explain below)
xbmc:master ← garbear:smarthome-features (opened 10:21AM - 24 Mar 22 UTC)
## Description
This PR contains three "features", which are additions to the code to enable more use cases in the future. The primary use case is Smart Home support in https://github.com/xbmc/xbmc/pull/21183.
[`Peripherals: Allow devices to have a controller profile`](https://github.com/xbmc/xbmc/commit/81d268e7ba2a5d303080e02ae2b691bf2d50e756)
* This commit makes Kodi more aware of its input peripherals. In order to make use of the extended controller API, Mouse and Keyboard controller profiles were imported (and the other two synced). Importing mouse/keyboard was planned anyway, when keyboard/mouse mapping is expanded in the GUI.
[`Input: Expose input frame to higher-level input`](https://github.com/xbmc/xbmc/commit/b016039a6c4a7251a1f4d1e8b6d89db186333f2d)
* This commit adds a callback found in lower-level driver input to higher-level controller input. At the lower level, we group axes into analog sticks. At the higher level, we will be grouping entire controller state into single-frame events.
[`AppParamParser: Preserve command line arguments`](https://github.com/xbmc/xbmc/commit/c6b320565aa6e779ffb1366aec4892761eed7433)
* This commit stores the command line arguments in a service-accessible state. This is useful for any future services (like ROS 2) that are initialized with the command line parameters.
## Motivation and context
Required for https://github.com/xbmc/xbmc/pull/21183.
## How has this been tested?
My computer vision pipeline has been running relatively stably for about 10 months now.
Without the Mouse and Keyboard profiles, the ROS topic is missing data:
```yaml
header:
  stamp:
    sec: 1648117091
    nanosec: 438027612
  frame_id: nas
peripherals:
- type: 2
  name: Keyboard
  address: keyboard
  vendor_id: 0
  product_id: 0
  controller_profile: ''
- type: 3
  name: Mouse
  address: mouse
  vendor_id: 0
  product_id: 0
  controller_profile: ''
```
With the Mouse and Keyboard profiles included, the ROS topic contains the missing data:
```yaml
header:
  stamp:
    sec: 1648117091
    nanosec: 438027612
  frame_id: nas
peripherals:
- type: 2
  name: Keyboard
  address: keyboard
  vendor_id: 0
  product_id: 0
  controller_profile: game.controller.keyboard
- type: 3
  name: Mouse
  address: mouse
  vendor_id: 0
  product_id: 0
  controller_profile: game.controller.mouse
```
## What is the effect on users?
* Enables future use cases like Smart Home support
## Types of change
- [ ] **Bug fix** (non-breaking change which fixes an issue)
- [ ] **Clean up** (non-breaking change which removes non-working, unmaintained functionality)
- [ ] **Improvement** (non-breaking change which improves existing functionality)
- [x] **New feature** (non-breaking change which adds functionality)
- [ ] **Breaking change** (fix or feature that will cause existing functionality to change)
- [ ] **Cosmetic change** (non-breaking change that doesn't touch code)
- [ ] **None of the above** (please explain below)
I hope that he will develop a new stand-alone Home Assistant integration for this OASIS platform, or otherwise extend the existing Kodi integration with input from it.