# infernet-ml
+ > [!WARNING]
+ > This library has been moved to
+ > the [Infernet Monorepo](https://github.com/ritual-net/infernet-monorepo/)
+ > repository. This repository is now archived; for `infernet-ml 1.0.0` and above,
+ > please refer to the monorepo.
+
+
`infernet-ml` is a lightweight library meant to simplify the implementation
of machine learning workflows for models intended for Web3.
@@ -27,7 +34,8 @@ uv pip install infernet-ml
## Optional Dependencies
- Depending on the workflow you're using, you may want to install optional dependencies. For example, if you're using the
+ Depending on the workflow you're using, you may want to install optional dependencies.
+ For example, if you're using the
`torch` workflow, you'll need to install its dependencies by running:
```bash
@@ -40,9 +48,11 @@ Alternatively, via [uv](https://github.com/astral-sh/uv):
uv pip install "infernet-ml[torch_inference]"
```
- > [!NOTE] The optional dependencies for this workflow require that `cmake` is installed on your system. You can install
+ > [!NOTE] The optional dependencies for this workflow require that `cmake` is installed
+ > on your system. You can install
`cmake` on MacOS by running `brew install cmake`. On Ubuntu & Windows,
- > consult [the documentation](https://onnxruntime.ai/docs/build/inferencing.html#prerequisites)
+ > consult [the documentation](https://onnxruntime.ai/docs/build/inferencing.html#prerequisites)
> for more information.
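
To give a sense of what the `torch_inference` extra is for, below is a minimal sketch of invoking a torch inference workflow once the optional dependencies are installed. The module path, class name, and arguments are assumptions for illustration only, not the library's confirmed API; see the docs and the Infernet Monorepo for the actual interface.

```python
# Illustrative sketch only: the import path, class name, and all arguments
# below are assumptions, not infernet-ml's confirmed API. Consult the docs
# in the Infernet Monorepo for the real interface.
from infernet_ml.workflows.inference.torch_inference_workflow import (
    TorchInferenceWorkflow,  # assumed class name
)

workflow = TorchInferenceWorkflow()   # constructor arguments omitted (unknown)
workflow.setup()                      # assumed: loads the model and its weights
result = workflow.inference({"values": [1.0, 2.0, 3.0]})  # assumed input format
print(result)
```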
## Docs