merging bioengineering code for ML pipelines (#24)
* adding build using binary downloads (#8)
* adding build using binary downloads
* sorting out the build.rs
* updating build.rs for surrealml package
* prepping version for release
* now has target tracking (#10)
* adding check in build.rs for docs.rs
* removing build.rs for main surrealml to ensure that libraries using the core do not need to do anything in their build.rs
* Kings college london integration (#23)
* adding machine learning pipelines for bioengineering projects at Kings College London
* Remove integrated_training_runner/run_env/ from tracking
* adding machine learning pipelines for bioengineering projects at Kings College London
1 parent f4ad3aa, commit 70c7244. Showing 47 changed files with 1,034 additions and 1 deletion.
@@ -76,4 +76,4 @@ fn main() -> std::io::Result<()> {
    unpack_onnx()?;
    Ok(())
}
}
@@ -0,0 +1,31 @@
# SurgiSeek

Database for advanced surgical video searching - King's x SurrealDB

# Layout

At the time of writing we are in the discovery phase, so the layout of the documentation is fairly basic. As the systems evolve, we will revisit the layout of the machine learning documentation here and introduce new sections as they emerge; we aim for an iterative process to avoid over-engineering. For now we keep flat pages per subject and try to keep concepts isolated behind clean interfaces. We currently have the following modules:

## data_access

This is where we define code that directly interacts with data, either from a database or a file. We do not yet know how the structure will form, so we just have the `data_access/basic` Rust crate, where we can explore loading and storing file data. Once themes start to form we can build out a more defined structure. We have example tags in the `data_access/data_stash` directory.

A big data set can be downloaded using the link below:

```
https://s3.unistra.fr/camma_public/datasets/endoscapes/endoscapes.zip
```

And the data standards can be found in the link below:

```
https://github.com/CAMMA-public/Endoscapes?tab=readme-ov-file
```

## runners

Runners are where we build engines that run our modules. For instance, we can build a basic `main` that just processes files based on inputs passed in through the command line, as in the sketch below. However, we can also create a runner that monitors an input from a cable, or messages over a network like Tokio TCP.
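
As a rough illustration only (this sketch is not part of the commit, and the binary name and processing step are placeholders), such a flat-file runner could look like this:

```rust
use std::env;
use std::fs;

fn main() -> std::io::Result<()> {
    // Take the file path from the command line, e.g. `./runner image.jpg`.
    let path = env::args().nth(1).expect("usage: runner <file>");

    // Read the raw bytes of the file; a real runner would hand these to the
    // data_access code instead of just reporting the size.
    let bytes = fs::read(&path)?;
    println!("loaded {} bytes from {}", bytes.len(), path);

    Ok(())
}
```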
Binary file not shown.
@@ -0,0 +1,165 @@
# Data Access

Here is where we house libraries that handle the reading and writing of data. For now we are merely reading from jpeg files; however, we will move on to supporting networking interfaces.

# Basic

Basic is a library that handles the reading of jpeg files. It is a simple library that reads in the jpeg files and converts them to a stream of bytes. The loading and conversion of the jpeg files is done in `data_access/basic/src/images.rs`. For our images we are handling data in the following outline:

<p align="center">
  <img src="static/Image_plane.jpg" alt="Alt text" style="width:480px; height:400px;">
</p>

Here we can see that we have three layers of a frame. Each layer along the `Z` axis corresponds to one of the RGB values of the pixel at the `X, Y` coordinate. We package the image frame as a `1D` array of `u8` values and calculate the index of each pixel component in that array from its `(x, y, z)` position; the testing code further down uses a maximum width of `10`, a maximum height of `5`, and `3` layers for the RGB values. The frame shape follows this convention:

```bash
(3, 480, 853) => (channels, height, width) => (z, y, x)
```
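
The index calculation itself is the standard row-major formula, mirrored by `calculate_rgb_index` in `data_access/basic/src/images.rs`; a minimal sketch:

```rust
// Flat index of the value at channel z, row y, column x for a frame stored in
// row-major (z, y, x) order; this mirrors calculate_rgb_index in images.rs.
fn flat_index(z: usize, y: usize, x: usize, height: usize, width: usize) -> usize {
    z * height * width + y * width + x
}

fn main() {
    // With width 10 and height 5 (as in the tests), the pixel at (x=0, y=1)
    // has its red, green and blue components at indexes 10, 60 and 110.
    let (height, width) = (5, 10);
    assert_eq!(flat_index(0, 1, 0, height, width), 10);
    assert_eq!(flat_index(1, 1, 0, height, width), 60);
    assert_eq!(flat_index(2, 1, 0, height, width), 110);
}
```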

<p align="center">
  <img src="static/coordinates.jpg" alt="Alt text" style="width:480px; height:600px;">
</p>

Here we can see that we can map the `X, Y, Z` coordinates to the `1D` array. To see the sequence of this mapping we can look at the following testing code:

```rust
fn test_calculate_rgb_index() {
    // This will give x y chunks of 50 and an entire rgb image of 150
    let total_height = 5;
    let total_width = 10;

    let indexes = calculate_rgb_index(0, 0, total_width, total_height);
    assert_eq!(&0, &indexes.red);
    assert_eq!(&50, &indexes.green);
    assert_eq!(&100, &indexes.blue);

    let indexes = calculate_rgb_index(1, 0, total_width, total_height);
    assert_eq!(&1, &indexes.red);
    assert_eq!(&51, &indexes.green);
    assert_eq!(&101, &indexes.blue);

    let indexes = calculate_rgb_index(2, 0, total_width, total_height);
    assert_eq!(&2, &indexes.red);
    assert_eq!(&52, &indexes.green);
    assert_eq!(&102, &indexes.blue);

    let indexes = calculate_rgb_index(0, 1, total_width, total_height);
    assert_eq!(&10, &indexes.red);
    assert_eq!(&60, &indexes.green);
    assert_eq!(&110, &indexes.blue);

    let indexes = calculate_rgb_index(0, 2, total_width, total_height);
    assert_eq!(&20, &indexes.red);
    assert_eq!(&70, &indexes.green);
    assert_eq!(&120, &indexes.blue);
}
```

We can see that our mapping function follows the exact same pattern as `numpy`'s reshape function, which can be seen in the file `engines/pytorch_train/tests/test_numpy_quality_control.py`.
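
As an illustrative check only (not part of the test suite), walking `(z, y, x)` in order visits consecutive flat indexes, which is exactly the C-order layout `numpy` assumes when reshaping a flat array into `(channels, height, width)`:

```rust
// Illustrative check: iterating channels, then rows, then columns visits the
// flat indexes 0, 1, 2, ... in order, i.e. the same C-order layout that
// numpy's reshape assumes for a (3, height, width) array.
fn main() {
    let (total_height, total_width) = (5usize, 10usize);
    let mut expected = 0usize;
    for z in 0..3 {
        for y in 0..total_height {
            for x in 0..total_width {
                let flat = z * total_height * total_width + y * total_width + x;
                assert_eq!(flat, expected);
                expected += 1;
            }
        }
    }
    println!("flat indexes follow C order, matching numpy reshape");
}
```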

# Networking

At this point in time we are just handling image files in the `basic` module in Rust, and piping this data into the Python PyTorch engine as seen in the following example:

```bash
./data_access_rust_bin | python pytorch_engine.py
```

This means we can chunk the data into the stream so that the ML model can be trained incrementally. We are doing this to give users flexibility on the size of the RAM needed to train the model. For instance, if the user has a `60GB` folder of images, it is unlikely that they will be able to load all of these images into memory at once, as depicted in the following:

```bash
[rust (basic)] ===> [1, 0, 0, 1, 1, 0, 1] ===> [1, 0, 0, 1, 1, 0, 1] ===> [python (ML)]
```
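
To make the memory point concrete, here is a rough sketch (not the repository's actual binary; the folder path and the simple length-prefix framing are placeholders) of streaming one image at a time instead of loading the whole folder:

```rust
use std::fs;
use std::io::{self, Write};

fn main() -> io::Result<()> {
    let stdout = io::stdout();
    let mut handle = stdout.lock();

    // Forward one file's bytes at a time so the whole folder never has to sit
    // in memory at once.
    for entry in fs::read_dir("./data_stash/images")? {
        let bytes = fs::read(entry?.path())?;
        handle.write_all(&(bytes.len() as u32).to_le_bytes())?; // length prefix
        handle.write_all(&bytes)?;                               // frame payload
    }
    handle.flush()
}
```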

This also gives us a lot of flexibility in the future. For instance, if we need to send the training data over a network, we can easily swap out the `std::io::stdin` with a networking layer like the following:

```bash
[rust (basic)] ===> [TCP (packet)] ===> [TCP (packet)] ===> [python (ML)]
```
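
A minimal sketch of that swap using the standard library's TCP types (illustrative only; the address, buffer size, and downstream handling are placeholders, and Tokio would be another option):

```rust
use std::io::Read;
use std::net::TcpListener;

fn main() -> std::io::Result<()> {
    // Accept a single producer connection instead of reading from stdin.
    let listener = TcpListener::bind("127.0.0.1:40052")?;
    let (mut socket, _addr) = listener.accept()?;

    let mut buffer = vec![0u8; 4096];
    loop {
        // Read the next chunk of the frame stream, exactly as we would from stdin.
        let n = socket.read(&mut buffer)?;
        if n == 0 {
            break; // producer closed the connection
        }
        // Hand `buffer[..n]` to the next stage, e.g. forward it to the ML process.
        println!("received {} bytes", n);
    }
    Ok(())
}
```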

We can use `Command` in a program to coordinate pipes over multiple cores and manage the flow of data. We can also pipe in the `ffmpeg` command as seen in the following example:

```bash
ffmpeg -i 'srt://192.168.1.345:40052?mode=caller' | ./data_access_rust_bin | python pytorch_engine.py
```

We can map this with the following Rust code:

```rust
use std::process::{Command, Stdio};

fn main() -> std::io::Result<()> {
    // Start the ffmpeg process
    let ffmpeg_output = Command::new("ffmpeg")
        .args(["-i", "srt://192.168.1.345:40052?mode=caller"])
        .stdout(Stdio::piped())
        .spawn()?;

    // Assuming `data_access_rust_bin` is the compiled binary you want to run next
    let rust_binary_output = Command::new("./data_access_rust_bin")
        .stdin(ffmpeg_output.stdout.unwrap()) // Use the output of ffmpeg as input
        .stdout(Stdio::piped())
        .spawn()?;

    // Finally, pass the output of your Rust binary to the Python script
    let python_output = Command::new("python")
        .arg("pytorch_engine.py")
        .stdin(rust_binary_output.stdout.unwrap()) // Use the output of the Rust binary as input
        .output()?;

    // Here you can handle the final output, for example, print it
    println!("Python script output: {}", String::from_utf8_lossy(&python_output.stdout));

    Ok(())
}
```

# Local Test Setup

## SRT listener server in OBS Studio

At this stage, we do not have steady access to Panasonic [AW-UE150](https://eu.connect.panasonic.com/gb/en/products/broadcast-proav/aw-ue150) cameras. Hence, for initial testing purposes, we set up a Secure Reliable Transport (SRT) listener server using OBS Studio. The server URL is

```powershell
srt://127.0.0.1:9999?mode=listener&timeout=500000&transtype=live
```

- **'127.0.0.1':** IP address of the listener server
- **'9999':** Port of the listener server
- **'timeout=500000':** The listener server waits for a connection for 500 s before auto-abort
- **'transtype=live':** Optimised for live streaming

Output resolution is set to 1920 x 1080 with an FPS of 1. Images from [CAMMA-public/cholect50](https://github.com/CAMMA-public/cholect50) are sourced as an Image Slide Show.

## FFmpeg SRT caller server in Windows PowerShell

FFmpeg is installed and used in Windows PowerShell. We tested our SRT caller implementation in PowerShell using the command:

```powershell
ffmpeg -i srt://127.0.0.1:9999?mode=caller -f image2 -vcodec mjpeg -q:v 5 output%03d.jpg
```

- **'127.0.0.1':** Destination IP address the caller calls
- **'9999':** Destination port the caller calls
- **'-f image2':** Set FFmpeg output image format to *image2*
- **'-vcodec mjpeg':** Set FFmpeg output image codec to mjpeg
- **'-q:v 5':** Set the quality of the jpg images to 5 (on the 2-31 scale, where lower means higher quality)
- **'output%03d.jpg':** Sequentially incrementing file name, e.g. output001.jpg

The above command saves the SRT stream as a sequence of .jpg images. Through this process, we confirm that this experimental setup with OBS Studio, FFmpeg, and SRT works from the CLI.
@@ -0,0 +1,11 @@
[package]
name = "basic"
version = "0.1.0"
edition = "2021"

# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

[dependencies]
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
image = "0.24.8"
Empty file.
@@ -0,0 +1,177 @@
//! # Image Buffer
//! In this module we read the image into a buffer and calculate the RGB indexes for each pixel.
use image::io::Reader as ImageReader;
use image::{ImageBuffer, Rgb, DynamicImage};
use crate::tags::SurgeryStep;
use std::io::{self, Write};


/// Represents the indexes of the red, green and blue components of a pixel.
///
/// # Fields
/// * `red` - The index of the red component in relation to the (x, y) coordinates of the pixel.
/// * `green` - The index of the green component in relation to the (x, y) coordinates of the pixel.
/// * `blue` - The index of the blue component in relation to the (x, y) coordinates of the pixel.
#[derive(Debug)]
struct RgbIndexes {
    red: usize,
    green: usize,
    blue: usize,
}


/// Calculates the RGB indexes for a given pixel in relation to the (x, y) coordinates of the pixel.
///
/// # Arguments
/// * `x` - The x coordinate of the pixel.
/// * `y` - The y coordinate of the pixel.
/// * `total_width` - The total width of the image frame.
/// * `total_height` - The total height of the image frame.
///
/// # Returns
/// An `RgbIndexes` struct containing the indexes of the red, green and blue components of the pixel.
fn calculate_rgb_index(x: usize, y: usize, total_width: usize, total_height: usize) -> RgbIndexes {
    RgbIndexes {
        red: 0 * total_height * total_width + y * total_width + x,
        green: 1 * total_height * total_width + y * total_width + x,
        blue: 2 * total_height * total_width + y * total_width + x,
    }
}


/// Reads an RGB image from a file and returns the raw data in 1D form that can be mapped as a 3D
/// array by using the `calculate_rgb_index` function.
///
/// # Arguments
/// * `path` - The path to the image file.
/// * `height` - The total height of the image.
/// * `width` - The total width of the image.
///
/// # Returns
/// A 1D array containing the raw RGB data of the image (flattened).
pub fn read_rgb_image(path: String, height: usize, width: usize) -> Vec<u8> {
    // let height: usize = 480;
    // let width: usize = 853;
    let depth: usize = 3;

    let img: DynamicImage = ImageReader::open(path).unwrap().decode().unwrap();
    let resized_img: DynamicImage = img.resize_exact(width as u32, height as u32, image::imageops::FilterType::Nearest);

    // Convert to RGB and flatten to array if necessary
    let rgb_img: ImageBuffer<Rgb<u8>, Vec<u8>> = resized_img.to_rgb8();

    let mut raw_data = vec![0u8; depth * height * width]; // 3 channels, 480 height, 853 width

    for chunk in rgb_img.enumerate_pixels() {
        let x: u32 = chunk.0;
        let y: u32 = chunk.1;
        let pixel: &Rgb<u8> = chunk.2; // [u8, u8, u8]

        let indexes = calculate_rgb_index(x as usize, y as usize, width, height);

        raw_data[indexes.red as usize] = pixel[0]; // store red component
        raw_data[indexes.green as usize] = pixel[1]; // store green component
        raw_data[indexes.blue as usize] = pixel[2]; // store blue component
    }
    raw_data
}


/// Writes a frame to the standard output.
///
/// # Arguments
/// * `data` - The raw data of the frame.
/// * `tag` - The tag associated with the frame.
pub fn write_frame_to_std_out(data: Vec<u8>, tag: SurgeryStep) {
    let stdout = io::stdout();
    let mut handle = stdout.lock();

    // Write the tag as a 2-byte integer
    handle.write_all(&(tag.to_u8() as u16).to_le_bytes()).unwrap();

    // Write the len as a 4-byte integer
    handle.write_all(&(data.len() as u32).to_le_bytes()).unwrap();

    // Write each byte in data as a 2-byte integer
    for byte in data {
        handle.write_all(&(byte as u16).to_le_bytes()).unwrap();
    }

    handle.flush().unwrap();
}


#[cfg(test)]
mod tests {

    use super::*;
    use serde::Deserialize;

    #[derive(Debug, Deserialize)]
    struct DummyJson {
        data: Vec<u8>,
    }

    #[test]
    fn test_read_image() {
        let _data = read_rgb_image("../data_stash/images/169_6300.jpg".to_string(), 480, 853);
    }

    #[test]
    fn test_calculate_rgb_index() {
        // This will give x y chunks of 50 and an entire rgb image of 150
        let total_height = 5;
        let total_width = 10;

        let indexes = calculate_rgb_index(0, 0, total_width, total_height);
        assert_eq!(&0, &indexes.red);
        assert_eq!(&50, &indexes.green);
        assert_eq!(&100, &indexes.blue);

        let indexes = calculate_rgb_index(1, 0, total_width, total_height);
        assert_eq!(&1, &indexes.red);
        assert_eq!(&51, &indexes.green);
        assert_eq!(&101, &indexes.blue);

        let indexes = calculate_rgb_index(2, 0, total_width, total_height);
        assert_eq!(&2, &indexes.red);
        assert_eq!(&52, &indexes.green);
        assert_eq!(&102, &indexes.blue);

        let indexes = calculate_rgb_index(0, 1, total_width, total_height);
        assert_eq!(&10, &indexes.red);
        assert_eq!(&60, &indexes.green);
        assert_eq!(&110, &indexes.blue);

        let indexes = calculate_rgb_index(0, 2, total_width, total_height);
        assert_eq!(&20, &indexes.red);
        assert_eq!(&70, &indexes.green);
        assert_eq!(&120, &indexes.blue);
    }

    #[test]
    fn test_test_calculate_rgb_index_quality_control() {
        let raw_data = std::fs::read_to_string("../data_stash/images/dummy_rgb_data.json").unwrap();
        let data: DummyJson = serde_json::from_str(&raw_data).unwrap();

        // This will give x y chunks of 50 and an entire rgb image of 150
        let total_height = 5;
        let total_width = 10;

        let index = calculate_rgb_index(0, 0, total_width, total_height);
        assert_eq!(&data.data[index.red], &111); // z = 0
        assert_eq!(&data.data[index.green], &208); // z = 1
        assert_eq!(&data.data[index.blue], &12); // z = 2

        let index = calculate_rgb_index(5, 3, total_width, total_height);
        assert_eq!(&data.data[index.red], &65);
        assert_eq!(&data.data[index.green], &7);
        assert_eq!(&data.data[index.blue], &193);

        let index = calculate_rgb_index(8, 2, total_width, total_height);
        assert_eq!(&data.data[index.red], &253);
        assert_eq!(&data.data[index.green], &133);
        assert_eq!(&data.data[index.blue], &115);
    }
}