ValueError: do_sample is set to False. However, temperature is set to 0.5 -- this flag is only used in sample-based generation modes. Set do_sample=True or unset temperature to continue.
#419
# Run the following Python code to generate speech samples:
from transformers import AutoProcessor, BarkModel
import torch
import scipy.io.wavfile

processor = AutoProcessor.from_pretrained("suno/bark")
model = BarkModel.from_pretrained("suno/bark")

# move the model to the GPU if one is available
model = model.to("cuda:0" if torch.cuda.is_available() else "cpu")

voice_preset = "v2/en_speaker_6"
inputs = processor("Hello, my dog is cute", voice_preset=voice_preset)

audio_array = model.generate(**inputs)
audio_array = audio_array.cpu().numpy().squeeze()

sample_rate = model.generation_config.sample_rate

# Listen to the audio in a notebook:
# from IPython.display import Audio
# Audio(audio_array, rate=sample_rate)

# Or save it as a .wav file, e.g. with scipy:
scipy.io.wavfile.write("bark_out.wav", rate=sample_rate, data=audio_array)
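Side note, unrelated to the traceback below: the snippet moves the model to CUDA but leaves the processor outputs on the CPU. On a GPU machine the input tensors may also need to be moved to the same device before calling generate. A minimal sketch, assuming every tensor-like value in inputs exposes a .to() method (anything else is left untouched):

# hedged sketch: keep the model and its inputs on the same device
device = "cuda:0" if torch.cuda.is_available() else "cpu"
model = model.to(device)
inputs = {k: (v.to(device) if hasattr(v, "to") else v) for k, v in inputs.items()}
audio_array = model.generate(**inputs)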
Error:
Traceback (most recent call last):
File "...\app002.py", line 17, in <module>
audio_array = model.generate(**inputs, temperature=1)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\...\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "C:\...\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\transformers\models\bark\modeling_bark.py", line 1518, in generate
fine_generation_config = BarkFineGenerationConfig(**self.generation_config.fine_acoustics_config)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\...\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\transformers\models\bark\generation_configuration_bark.py", line 221, in __init__
super().__init__(temperature=temperature)
File "C:\...\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\transformers\generation\configuration_utils.py", line 316, in __init__
self.validate()
File "C:\...\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\transformers\generation\configuration_utils.py", line 354, in validate
raise ValueError(
ValueError: `do_sample` is set to `False`. However, temperature is set to 0.5 -- this flag is only used in sample-based generation modes. Set `do_sample=True` or unset temperature to continue.
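Based on the traceback, the rejected temperature=0.5 does not come from the generate() call itself but from generation_config.fine_acoustics_config, the plain dict that Bark unpacks into BarkFineGenerationConfig. A possible workaround, clearly an assumption and not an official fix (it may also change the output of the fine stage): overwrite that stored temperature with the greedy-compatible default of 1.0 before generating. If this is a validation regression on the library side, upgrading transformers may be the cleaner solution.

# hedged workaround sketch: the fine-acoustics sub-config carries
# temperature=0.5 while do_sample is left at its False default, which the
# validation rejects; resetting it to 1.0 (the default) sidesteps the check
model.generation_config.fine_acoustics_config["temperature"] = 1.0
audio_array = model.generate(**inputs)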
Has anyone else run into this error? The Python code and the full error output are posted above.