Releases: abetlen/llama-cpp-python

v0.2.78-metal

10 Jun 15:27
chore: Bump version

v0.2.78

10 Jun 17:03
chore: Bump version

v0.2.77-metal

04 Jun 05:00
Merge branch 'main' of https://github.com/abetlen/llama-cpp-python into main

v0.2.77-cu124

04 Jun 17:17
d634efc
feat: adding `rpc_servers` parameter to `Llama` class (#1477)

* Pass through rpc_servers params (wip)
* Enable llama RPC by default
* Convert string to bytes
* Add rpc package
* Revert "enable llama rpc by default" (reverts commit 832c6dd56c979514cec5df224bf2d2014dccd790)
* Update readme
* Only set rpc_servers when provided
* Add rpc servers to server options

Co-authored-by: Andrei Betlen <[email protected]>
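
Based on the commit list above, here is a minimal usage sketch of the new `rpc_servers` parameter. The model path, host addresses, and port are placeholders, and it assumes llama-cpp-python was built with llama.cpp's RPC backend enabled and that `rpc-server` instances are already listening on those hosts:

```python
from llama_cpp import Llama

# rpc_servers takes a comma-separated "host:port" list that is passed
# through to llama.cpp, which offloads work to those RPC backends.
llm = Llama(
    model_path="./models/model.gguf",            # placeholder model file
    rpc_servers="192.168.1.10:50052,192.168.1.11:50052",  # placeholder hosts/ports
    n_gpu_layers=-1,                              # offload layers to the available backends
)

out = llm("Q: Name the planets in the solar system. A:", max_tokens=64)
print(out["choices"][0]["text"])
```

The "Add rpc servers to server options" commit indicates the same setting is exposed by the OpenAI-compatible server, presumably as an `rpc_servers` field in its settings (and the matching CLI flag).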

v0.2.77-cu123

04 Jun 17:19
d634efc
feat: adding `rpc_servers` parameter to `Llama` class (#1477)

v0.2.77-cu122

04 Jun 17:18
d634efc
feat: adding `rpc_servers` parameter to `Llama` class (#1477)

v0.2.77-cu121

04 Jun 17:18
d634efc
feat: adding `rpc_servers` parameter to `Llama` class (#1477)

v0.2.77

04 Jun 06:36
Merge branch 'main' of https://github.com/abetlen/llama-cpp-python into main

v0.2.76-metal

24 May 06:16
chore: Bump version

v0.2.76-cu124

24 May 06:18
chore: Bump version