
Releases: 2noise/ChatTTS

v0.2.1: doc(conda): pinning python version (#816)

05 Nov 05:14
a67bfb5

New

Fixed

  • Missing remove_interjections in the Chinese normalizer (#783) by @IrisSally

Optimized
v0.2.0

09 Oct 15:10
71b42e0

New

Fixed

Optimized

  • Completely removed the pretrain_models dictionary (77c7e20) by @fumiama
  • Made the tokenizer a standalone class (77c7e20) by @fumiama
  • The normalizer now removes all unsupported characters to avoid inference errors (0f47a87) by @fumiama
  • Removed the config folder; settings are now embedded directly in the code for easier changes (27331c3) by @fumiama
  • Removed the extra whitespace at the end of streaming inference (#564) by @Ox0400
  • Added a manual_seed parameter that provides a generator directly to multinomial rather than seeding torch globally before the call, avoiding side effects on the torch environment (e675a59) by @fumiama
  • Tokenizer loading switched from a .pt file to the built-in from_pretrained method to eliminate potential malicious code loading (80b24e6) by @fumiama
  • Made the speaker class standalone and moved spk_stat-related content into it; because the model is small, its values are written directly into the settings class (f3dcd97) by @fumiama
  • Chat.load now sets compile=False by default (see the sketch after this list) (7e33889) by @fumiama
  • Switched the GPT weights to the safetensors format (8a503fd) by @fumiama
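
A minimal sketch of how the Chat.load and manual_seed changes above surface in user code. It assumes the public ChatTTS Python package; the params_infer_code keyword and the InferCodeParams dataclass as the home of manual_seed are assumptions based on the project examples, not an authoritative API reference.

```python
# Hedged sketch of the v0.2.0 loading/sampling behaviour; names other than
# Chat.load(compile=...) and manual_seed are assumptions about the API shape.
import ChatTTS

chat = ChatTTS.Chat()

# Chat.load now defaults to compile=False; opt back in explicitly if you
# want torch.compile acceleration and can accept the warm-up cost.
chat.load(compile=False)

# manual_seed is consumed as a per-call generator handed to multinomial,
# so it no longer reseeds (and no longer perturbs) the global torch RNG
# used by the rest of your program.
params = ChatTTS.Chat.InferCodeParams(manual_seed=42)  # assumed location of manual_seed
wavs = chat.infer(["A short reproducibility check."], params_infer_code=params)
```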

Dependencies

  • Changed the code license to the open-source AGPL-3.0 (9f402ba)

v0.1.1

04 Jul 05:31

New

  • Apple MPS GPU support (experimental, off by default) (#261, #472) by @rasonyang
  • Replacement of rare characters (Chinese characters) (#350) by @6drf21e
  • Added a local loading mode and renamed the original local mode to custom (#361) by @fumiama
  • Core supports streaming inference (see the sketch after this list) (#360) by @Ox0400
  • WebUI supports streaming inference (#380) by @v3ucn
  • User customizable logger (#398) by @fumiama
  • CMD supports batch inference (#366) by @Ox0400
  • Customizable DVAE coef parameter (#405) by @fumiama
  • download_models unload API (4dd1f88) by @fumiama
  • Normalizer changed to a registration model; users can register any interface that meets the requirements (#420) by @fumiama
  • Improved type annotations; all dict parameters changed to dataclasses for auto-completion at call sites (#422) by @fumiama
  • Interruptible inference that returns the part inferred so far when stopped (#433) by @fumiama
  • Experimental: NVIDIA TransformerEngine support (#496) by @fumiama
  • Added infer parameter show_tqdm (3836db8) by @fumiama
  • Experimental: flash_attention_2 support (c109089) by @fumiama
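
A minimal sketch pulling several of these additions together (custom logger, registration-based normalizer, dataclass parameters, streaming inference). It assumes the public ChatTTS package; the constructor and registration signatures shown are assumptions based on the project examples, not a verbatim API reference.

```python
# Hedged sketch of the v0.1.1 additions; exact signatures are assumptions.
import logging

import ChatTTS

# User-customizable logger (#398): hand Chat your own logging.Logger.
chat = ChatTTS.Chat(logger=logging.getLogger("my-chattts"))
chat.load()

# Registration-based normalizer (#420): register a callable that maps raw
# text to normalized text for a language key (a trivial one shown here).
chat.normalizer.register("en", lambda text: text.strip())

# Dict parameters became dataclasses (#422), so IDEs can auto-complete them.
refine = ChatTTS.Chat.RefineTextParams(prompt="[oral_2][laugh_0][break_4]")

# Core streaming inference (#360): with stream=True, infer yields audio
# chunks as they are generated instead of a single final array.
for chunk in chat.infer(["A streaming example sentence."],
                        stream=True, params_refine_text=refine):
    ...  # feed each chunk to your audio sink / player here
```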

Fixed

Optimized

Dependencies

  • Relaxed dependency restrictions for easier installation
