[2021.2.18] Deploy WG: Core weekly
Attendees: 1. Haris Shuaib (GSTT), 2. Pouria Rouzrokh (Mayo), 3. Kilian Hett (Vanderbilt), 4. Laurence Jackson (GSTT), 5. B. Selnur Erdal (Mayo Jacksonville), 6. Dana Groff (NVIDIA), 7. Rahul Choudhury (NVIDIA), 8. Steve Langer (FlowSigma), 9. David Bericat (NVIDIA)
MONAI Deploy core WG - bi-weekly-20210218_110741-Meeting Recording
New action items:
- AI1 - [DB] Sync up with Mayo (Pouria, Brad and Steve) to present their case next time. Then GST and DKFZ.
- AI2 - [Steve] IHE Air - pull up the diagram and see the transactions.
- AI3 - [DB] Move meetings to weekly to speed up use cases review timing.
Pending from last meeting:
- AI1 - [DB] To open a slide deck for all of us to document - [COMPLETED] - MONAI Deploy WG use cases
- AI2 - [All] Add slides with workflows per team from training to deployment and feedback - [IN PROGRESS]
- Data flows
- Input syncs
- Output syncs
- AI3 - [Selnur] Share how they document a use case and requirements sample - [COMPLETED]
- AI4 - [All] Identify a use case from MONAI and use the use case and reqs template as a way to document what we will design and build - [IN PROGRESS]
- Selnur reviewed past experience with OSU. Fantastic summary. Totally recommend watching the recording.
- Mayo will go next, then GST, then DKFZ.
Review use cases per institution with focus on workflows, inputs, outputs, personas and integrations with other systems
- [Selnur - Mayo Jacksonville] Presented past work on OSU (public) - MONAI Deploy - Selnur REMIX OSU slides.pdf
- Input and output patterns (see the sketch at the end of this section)
- Image to image
- Image to multiple images
- Multiple images to structure
- Images to DB
- Images to sender
- Evaluation stage but with live clinical data
- AI - regulations?
- Acceptance easier when radiologists are driving it
- Speed and ease of use - back-end response to user
- Rads singleview is how they build and deploy their apps. OSU built on the same app engine, so same interface. Then to PACS.
- Fast response - in-memory DB
- Bulk processing from VNA transfer, once done, communication speed is not important anymore
- Time measurement per click; per-transaction speed of inference feedback is what justifies adding AI or not
- Focus on standards
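As a way to make the patterns above concrete, here is a minimal, hypothetical Python sketch of how an app could declare its input/output pattern so a deployment layer can route data. The `IOPattern` and `AppIOSpec` names are illustrative assumptions, not part of MONAI Deploy or of anything OSU/Mayo built.

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List

# Hypothetical illustration only -- not part of any MONAI Deploy API.
# Each pattern names what an inference app consumes and what it emits.
class IOPattern(Enum):
    IMAGE_TO_IMAGE = auto()                # e.g. one series in, one derived series out
    IMAGE_TO_MULTIPLE_IMAGES = auto()      # e.g. one series in, several derived series out
    MULTIPLE_IMAGES_TO_STRUCTURE = auto()  # e.g. multi-series in, structured result out
    IMAGES_TO_DB = auto()                  # results written to a database
    IMAGES_TO_SENDER = auto()              # results pushed to a downstream sender (e.g. PACS)

@dataclass
class AppIOSpec:
    """Declares what an app expects and produces, so a deploy layer can route data."""
    pattern: IOPattern
    input_series: List[str] = field(default_factory=list)    # e.g. DICOM series descriptions
    output_targets: List[str] = field(default_factory=list)  # e.g. "PACS", "results-db"

# Example: a segmentation app that takes several series and emits a structure to PACS.
seg_spec = AppIOSpec(
    pattern=IOPattern.MULTIPLE_IMAGES_TO_STRUCTURE,
    input_series=["T1", "FLAIR"],
    output_targets=["PACS"],
)
print(seg_spec)
```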
- [Steve Langer]
- DICOM SR
- IHE Air - Kevin O’Connor
- Action item: pull up the diagram and review the transactions
- [Rahul]
- AI can be integrated at different stages
- Input and output varies
- AI inference as a service for rads, data representation, fast response. Boundaries and internal representation (see the sketch below).
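As a rough illustration of "AI inference as a service" behind a thin HTTP boundary, here is a minimal sketch assuming Flask and a placeholder `run_model()` function; the endpoint name, payload shape, and internal representation are assumptions, not anything decided in the meeting.

```python
# Minimal sketch: a single HTTP boundary that accepts a study reference and
# returns a compact result quickly. Flask and run_model() are illustrative.
from flask import Flask, jsonify, request

app = Flask(__name__)

def run_model(study_uid: str) -> dict:
    # Placeholder for the model call and its internal data representation.
    return {"study_uid": study_uid, "finding": "none", "confidence": 0.0}

@app.route("/v1/inference", methods=["POST"])
def inference():
    payload = request.get_json(force=True)
    study_uid = payload.get("study_uid")
    if not study_uid:
        return jsonify({"error": "study_uid is required"}), 400
    # Keep the service boundary thin: validate, call the model, return the result.
    return jsonify(run_model(study_uid)), 200

if __name__ == "__main__":
    app.run(port=5000)
```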
- [Haris]
- Non-segmentation data
- HL7 message
- DB search based on ID; you can query DICOM for that (see the C-FIND sketch below)
- Structured data (EMRs) vs free text (endoscopy, pathology?) vs image data
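A minimal sketch of querying DICOM by patient ID with a C-FIND using pynetdicom; the host, port, AE titles, and patient ID are placeholders, and the query attributes shown are just one reasonable choice.

```python
# Sketch of "DB search based on ID, you can query DICOM for that":
# a study-level C-FIND against a PACS/VNA using pynetdicom.
from pydicom.dataset import Dataset
from pynetdicom import AE
from pynetdicom.sop_class import PatientRootQueryRetrieveInformationModelFind

ae = AE(ae_title="MONAI_SCU")
ae.add_requested_context(PatientRootQueryRetrieveInformationModelFind)

# Query identifier: find all studies for one patient ID.
query = Dataset()
query.QueryRetrieveLevel = "STUDY"
query.PatientID = "12345"
query.StudyInstanceUID = ""   # empty attributes are returned by the SCP
query.StudyDescription = ""

assoc = ae.associate("pacs.example.org", 104, ae_title="PACS_SCP")
if assoc.is_established:
    for status, identifier in assoc.send_c_find(
        query, PatientRootQueryRetrieveInformationModelFind
    ):
        # 0xFF00 / 0xFF01 mean "pending" -- a matching identifier accompanies them.
        if status and status.Status in (0xFF00, 0xFF01) and identifier is not None:
            print(identifier.StudyInstanceUID, identifier.StudyDescription)
    assoc.release()
else:
    print("Association with the PACS could not be established")
```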
- [Laurence]
- New feature requests for the AI algorithm
- Single site
- Multiple sites - more layers of approval, vetting process by a committee, DS, IT, clinical
- Pipeline: Exploration Dev -> Production Dev (prioritization) -> Production
- Exploration Dev (6-8 months): measure of success is whether it works properly - accuracy, precision, published results
- Production Dev: IT gets involved; depending on where it will be deployed and what it does, it goes through a different testing process depending on which systems are involved (QA/Test is a whole team within IT)
- Test is clinically driven
- Does this have a clinical use? Then you can move it to prod.
- If it is research only, it may not move to production.