### Supported functions

|Speech recognition| [Speech synthesis][tts-url] | [Source separation][ss-url] |
|------------------|------------------|-------------------|
| ✔️ | ✔️ | ✔️ |

|Speaker identification| [Speaker diarization][sd-url] | Speaker verification |
|----------------------|-------------------- |------------------------|
| ✔️ | ✔️ | ✔️ |

| [Spoken Language identification][slid-url] | [Audio tagging][at-url] | [Voice activity detection][vad-url] |
|--------------------------------|---------------|--------------------------|
| ✔️ | ✔️ | ✔️ |

| [Keyword spotting][kws-url] | [Add punctuation][punct-url] | [Speech enhancement][se-url] |
|------------------|-----------------|--------------------|
| ✔️ | ✔️ | ✔️ |
### Supported platforms
|Architecture| Android | iOS | Windows | macOS | Linux | HarmonyOS |
|------------|---------|---------|------------|-------|-------|-----------|
| x64 | ✔️ | | ✔️ | ✔️ | ✔️ | ✔️ |
| x86 | ✔️ | | ✔️ | | | |
| arm64 | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ |
| arm32 | ✔️ | | | | ✔️ | ✔️ |
| riscv64 | | | | | ✔️ | |
### Supported programming languages

| 1. C++ | 2. C | 3. Python | 4. JavaScript |
|--------|-------|-----------|---------------|
| ✔️ | ✔️ | ✔️ | ✔️ |

|5. Java | 6. C# | 7. Kotlin | 8. Swift |
|--------|-------|-----------|----------|
| ✔️ | ✔️ | ✔️ | ✔️ |

| 9. Go | 10. Dart | 11. Rust | 12. Pascal |
|-------|----------|----------|------------|
| ✔️ | ✔️ | ✔️ | ✔️ |
For Rust support, please see [sherpa-rs][sherpa-rs].

It also supports WebAssembly.

[Join our Discord](https://discord.gg/fJdxzg2VbG)
## Introduction
This repository supports running the following functions **locally**
- Speech-to-text (i.e., ASR); both streaming and non-streaming are supported
- Text-to-speech (i.e., TTS)
- Speaker diarization
- Speaker identification
- Speaker verification
- Spoken language identification
- Audio tagging
- VAD (e.g., [silero-vad][silero-vad])
- Speech enhancement (e.g., [gtcrn][gtcrn])
- Keyword spotting
- Source separation (e.g., [spleeter][spleeter], [UVR][UVR])

on the following platforms and operating systems:
- x86, ``x86_64``, 32-bit ARM, 64-bit ARM (arm64, aarch64), RISC-V (riscv64), **RK NPU**
- Linux, macOS, Windows, openKylin
- Android, WearOS
- iOS
- HarmonyOS
- NodeJS
- WebAssembly
- [NVIDIA Jetson Orin NX][NVIDIA Jetson Orin NX] (supports running on both CPU and GPU)
- [NVIDIA Jetson Nano B01][NVIDIA Jetson Nano B01] (supports running on both CPU and GPU)
- [Raspberry Pi][Raspberry Pi]
- [RV1126][RV1126]
- [LicheePi4A][LicheePi4A]
- [VisionFive 2][VisionFive 2]
- [旭日X3派][旭日X3派]
- [爱芯派][爱芯派]
- [RK3588][RK3588]
- etc.

with the following APIs (see the Python sketch after this list):
- C++, C, Python, Go, ``C#``
- Java, Kotlin, JavaScript
- Swift, Rust
- Dart, Object Pascal
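
To give a concrete flavor of the Python API, here is a minimal sketch of non-streaming speech recognition with a transducer model. It is only an illustration: the file names (`encoder.onnx`, `decoder.onnx`, `joiner.onnx`, `tokens.txt`, `test.wav`) are placeholders for whichever pre-trained model and audio file you actually use (see the model links below).

```python
# Minimal sketch (not a full example): non-streaming (offline) ASR with the
# sherpa-onnx Python API. The file names are placeholders for an extracted
# pre-trained transducer model.
import wave

import numpy as np
import sherpa_onnx

recognizer = sherpa_onnx.OfflineRecognizer.from_transducer(
    encoder="encoder.onnx",
    decoder="decoder.onnx",
    joiner="joiner.onnx",
    tokens="tokens.txt",
    num_threads=2,
)

with wave.open("test.wav") as f:  # assumed: 16 kHz, 16-bit, mono WAV
    sample_rate = f.getframerate()
    samples = np.frombuffer(f.readframes(f.getnframes()), dtype=np.int16)
    samples = samples.astype(np.float32) / 32768  # scale to [-1, 1]

stream = recognizer.create_stream()
stream.accept_waveform(sample_rate, samples)
recognizer.decode_stream(stream)
print(stream.result.text)
```

Similar factory methods are documented for other non-streaming model types (e.g., Paraformer, Whisper, SenseVoice); see the documentation links below for the exact options.
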
### Links for Huggingface Spaces
<details>
<summary>You can visit the following Huggingface spaces to try sherpa-onnx without
installing anything. All you need is a browser.</summary>
| Description | URL | 中国镜像 |
|-------------------------------------------------------|-----------------------------------------|----------------------------------------|
| Speaker diarization | [Click me][hf-space-speaker-diarization]| [镜像][hf-space-speaker-diarization-cn]|
| Speech recognition | [Click me][hf-space-asr] | [镜像][hf-space-asr-cn] |
| Speech recognition with [Whisper][Whisper] | [Click me][hf-space-asr-whisper] | [镜像][hf-space-asr-whisper-cn] |
| Speech synthesis | [Click me][hf-space-tts] | [镜像][hf-space-tts-cn] |
| Generate subtitles | [Click me][hf-space-subtitle] | [镜像][hf-space-subtitle-cn] |
| Audio tagging | [Click me][hf-space-audio-tagging] | [镜像][hf-space-audio-tagging-cn] |
| Source separation | [Click me][hf-space-source-separation] | [镜像][hf-space-source-separation-cn] |
| Spoken language identification with [Whisper][Whisper]| [Click me][hf-space-slid-whisper] | [镜像][hf-space-slid-whisper-cn] |
We also have spaces built using WebAssembly. They are listed below:
| Description | Huggingface space| ModelScope space|
|------------------------------------------------------------------------------------------|------------------|-----------------|
|Voice activity detection with [silero-vad][silero-vad] | [Click me][wasm-hf-vad]|[地址][wasm-ms-vad]|
|Real-time speech recognition (Chinese + English) with Zipformer | [Click me][wasm-hf-streaming-asr-zh-en-zipformer]|[地址][wasm-ms-streaming-asr-zh-en-zipformer]|
|Real-time speech recognition (Chinese + English) with Paraformer |[Click me][wasm-hf-streaming-asr-zh-en-paraformer]| [地址][wasm-ms-streaming-asr-zh-en-paraformer]|
|Real-time speech recognition (Chinese + English + Cantonese) with [Paraformer-large][Paraformer-large]|[Click me][wasm-hf-streaming-asr-zh-en-yue-paraformer]| [地址][wasm-ms-streaming-asr-zh-en-yue-paraformer]|
|Real-time speech recognition (English) |[Click me][wasm-hf-streaming-asr-en-zipformer] |[地址][wasm-ms-streaming-asr-en-zipformer]|
|VAD + speech recognition (Chinese) with [Zipformer CTC](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-ctc/icefall/zipformer.html#sherpa-onnx-zipformer-ctc-zh-int8-2025-07-03-chinese)|[Click me][wasm-hf-vad-asr-zh-zipformer-ctc-07-03]| [地址][wasm-ms-vad-asr-zh-zipformer-ctc-07-03]|
|VAD + speech recognition (Chinese + English + Korean + Japanese + Cantonese) with [SenseVoice][SenseVoice]|[Click me][wasm-hf-vad-asr-zh-en-ko-ja-yue-sense-voice]| [地址][wasm-ms-vad-asr-zh-en-ko-ja-yue-sense-voice]|
|VAD + speech recognition (English) with [Whisper][Whisper] tiny.en|[Click me][wasm-hf-vad-asr-en-whisper-tiny-en]| [地址][wasm-ms-vad-asr-en-whisper-tiny-en]|
|VAD + speech recognition (English) with [Moonshine tiny][Moonshine tiny]|[Click me][wasm-hf-vad-asr-en-moonshine-tiny-en]| [地址][wasm-ms-vad-asr-en-moonshine-tiny-en]|
|VAD + speech recognition (English) with Zipformer trained with [GigaSpeech][GigaSpeech] |[Click me][wasm-hf-vad-asr-en-zipformer-gigaspeech]| [地址][wasm-ms-vad-asr-en-zipformer-gigaspeech]|
|VAD + speech recognition (Chinese) with Zipformer trained with [WenetSpeech][WenetSpeech] |[Click me][wasm-hf-vad-asr-zh-zipformer-wenetspeech]| [地址][wasm-ms-vad-asr-zh-zipformer-wenetspeech]|
|VAD + speech recognition (Japanese) with Zipformer trained with [ReazonSpeech][ReazonSpeech]|[Click me][wasm-hf-vad-asr-ja-zipformer-reazonspeech]| [地址][wasm-ms-vad-asr-ja-zipformer-reazonspeech]|
|VAD + speech recognition (Thai) with Zipformer trained with [GigaSpeech2][GigaSpeech2] |[Click me][wasm-hf-vad-asr-th-zipformer-gigaspeech2]| [地址][wasm-ms-vad-asr-th-zipformer-gigaspeech2]|
|VAD + speech recognition (Chinese, multiple dialects) with a [TeleSpeech-ASR][TeleSpeech-ASR] CTC model|[Click me][wasm-hf-vad-asr-zh-telespeech]| [地址][wasm-ms-vad-asr-zh-telespeech]|
|VAD + speech recognition (English + Chinese, plus multiple Chinese dialects) with Paraformer-large |[Click me][wasm-hf-vad-asr-zh-en-paraformer-large]| [地址][wasm-ms-vad-asr-zh-en-paraformer-large]|
|VAD + speech recognition (English + Chinese, plus multiple Chinese dialects) with Paraformer-small |[Click me][wasm-hf-vad-asr-zh-en-paraformer-small]| [地址][wasm-ms-vad-asr-zh-en-paraformer-small]|
|VAD + speech recognition (multilingual, including multiple Chinese dialects) with [Dolphin][Dolphin]-base |[Click me][wasm-hf-vad-asr-multi-lang-dolphin-base]| [地址][wasm-ms-vad-asr-multi-lang-dolphin-base]|
|Speech synthesis (English) |[Click me][wasm-hf-tts-piper-en]| [地址][wasm-ms-tts-piper-en]|
|Speech synthesis (German) |[Click me][wasm-hf-tts-piper-de]| [地址][wasm-ms-tts-piper-de]|
|Speaker diarization |[Click me][wasm-hf-speaker-diarization]|[地址][wasm-ms-speaker-diarization]|
</details>
### Links for pre-built Android APKs
<details>
<summary>You can find pre-built Android APKs for this repository in the following table</summary>
| Description | URL | 中国用户 |
|----------------------------------------|------------------------------------|-----------------------------------|
| Speaker diarization | [Address][apk-speaker-diarization] | [点此][apk-speaker-diarization-cn]|
| Streaming speech recognition | [Address][apk-streaming-asr] | [点此][apk-streaming-asr-cn] |
| Simulated-streaming speech recognition | [Address][apk-simula-streaming-asr]| [点此][apk-simula-streaming-asr-cn]|
| Text-to-speech | [Address][apk-tts] | [点此][apk-tts-cn] |
| Voice activity detection (VAD) | [Address][apk-vad] | [点此][apk-vad-cn] |
| VAD + non-streaming speech recognition | [Address][apk-vad-asr] | [点此][apk-vad-asr-cn] |
| Two-pass speech recognition | [Address][apk-2pass] | [点此][apk-2pass-cn] |
| Audio tagging | [Address][apk-at] | [点此][apk-at-cn] |
| Audio tagging (WearOS) | [Address][apk-at-wearos] | [点此][apk-at-wearos-cn] |
| Speaker identification | [Address][apk-sid] | [点此][apk-sid-cn] |
| Spoken language identification | [Address][apk-slid] | [点此][apk-slid-cn] |
| Keyword spotting | [Address][apk-kws] | [点此][apk-kws-cn] |
</details>
### Links for pre-built Flutter APPs
<details>
#### Real-time speech recognition
| Description | URL | 中国用户 |
|--------------------------------|-------------------------------------|-------------------------------------|
| Streaming speech recognition | [Address][apk-flutter-streaming-asr]| [点此][apk-flutter-streaming-asr-cn]|
#### Text-to-speech
| Description | URL | 中国用户 |
|------------------------------------------|------------------------------------|------------------------------------|
| Android (arm64-v8a, armeabi-v7a, x86_64) | [Address][flutter-tts-android] | [点此][flutter-tts-android-cn] |
| Linux (x64) | [Address][flutter-tts-linux] | [点此][flutter-tts-linux-cn] |
| macOS (x64) | [Address][flutter-tts-macos-x64] | [点此][flutter-tts-macos-x64-cn] |
| macOS (arm64) | [Address][flutter-tts-macos-arm64] | [点此][flutter-tts-macos-arm64-cn] |
| Windows (x64) | [Address][flutter-tts-win-x64] | [点此][flutter-tts-win-x64-cn] |
> Note: You need to build from source for iOS.
</details>
### Links for pre-built Lazarus APPs
<details>
#### Generating subtitles
| Description | URL | 中国用户 |
|--------------------------------|----------------------------|----------------------------|
| Generate subtitles (生成字幕) | [Address][lazarus-subtitle]| [点此][lazarus-subtitle-cn]|
</details>
### Links for pre-trained models
<details>
| Description | URL |
|---------------------------------------------|---------------------------------------------------------------------------------------|
| Speech recognition (speech to text, ASR) | [Address][asr-models] |
| Text-to-speech (TTS) | [Address][tts-models] |
| VAD | [Address][vad-models] |
| Keyword spotting | [Address][kws-models] |
| Audio tagging | [Address][at-models] |
| Speaker identification (Speaker ID) | [Address][sid-models] |
| Spoken language identification (Language ID)| See multi-lingual [Whisper][Whisper] ASR models from [Speech recognition][asr-models]|
| Punctuation | [Address][punct-models] |
| Speaker segmentation | [Address][speaker-segmentation-models] |
| Speech enhancement | [Address][speech-enhancement-models] |
| Source separation | [Address][source-separation-models] |
</details>
#### Some pre-trained ASR models (Streaming)
<details>
Please see
- <https://k2-fsa.github.io/sherpa/onnx/pretrained_models/online-transducer/index.html>
- <https://k2-fsa.github.io/sherpa/onnx/pretrained_models/online-paraformer/index.html>
- <https://k2-fsa.github.io/sherpa/onnx/pretrained_models/online-ctc/index.html>
for more models. The following table lists only **SOME** of them.
|Name | Supported Languages| Description|
|-----|-----|----|
|[sherpa-onnx-streaming-zipformer-bilingual-zh-en-2023-02-20][sherpa-onnx-streaming-zipformer-bilingual-zh-en-2023-02-20]| Chinese, English| See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/online-transducer/zipformer-transducer-models.html#csukuangfj-sherpa-onnx-streaming-zipformer-bilingual-zh-en-2023-02-20-bilingual-chinese-english)|
|[sherpa-onnx-streaming-zipformer-small-bilingual-zh-en-2023-02-16][sherpa-onnx-streaming-zipformer-small-bilingual-zh-en-2023-02-16]| Chinese, English| See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/online-transducer/zipformer-transducer-models.html#sherpa-onnx-streaming-zipformer-small-bilingual-zh-en-2023-02-16-bilingual-chinese-english)|
|[sherpa-onnx-streaming-zipformer-zh-14M-2023-02-23][sherpa-onnx-streaming-zipformer-zh-14M-2023-02-23]|Chinese| Suitable for Cortex A7 CPU. See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/online-transducer/zipformer-transducer-models.html#sherpa-onnx-streaming-zipformer-zh-14m-2023-02-23)|
|[sherpa-onnx-streaming-zipformer-en-20M-2023-02-17][sherpa-onnx-streaming-zipformer-en-20M-2023-02-17]|English|Suitable for Cortex A7 CPU. See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/online-transducer/zipformer-transducer-models.html#sherpa-onnx-streaming-zipformer-en-20m-2023-02-17)|
|[sherpa-onnx-streaming-zipformer-korean-2024-06-16][sherpa-onnx-streaming-zipformer-korean-2024-06-16]|Korean| See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/online-transducer/zipformer-transducer-models.html#sherpa-onnx-streaming-zipformer-korean-2024-06-16-korean)|
|[sherpa-onnx-streaming-zipformer-fr-2023-04-14][sherpa-onnx-streaming-zipformer-fr-2023-04-14]|French| See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/online-transducer/zipformer-transducer-models.html#shaojieli-sherpa-onnx-streaming-zipformer-fr-2023-04-14-french)|
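
As a rough, hedged sketch of how such a streaming model can be used from Python (the file names are placeholders for the contents of an extracted model archive, and the audio below is just silence standing in for real microphone input):

```python
# Minimal sketch: streaming (online) ASR with the sherpa-onnx Python API.
# File names are placeholders from an extracted streaming transducer model.
import numpy as np
import sherpa_onnx

recognizer = sherpa_onnx.OnlineRecognizer.from_transducer(
    tokens="tokens.txt",
    encoder="encoder.onnx",
    decoder="decoder.onnx",
    joiner="joiner.onnx",
    num_threads=2,
)

sample_rate = 16000
audio = np.zeros(5 * sample_rate, dtype=np.float32)  # stand-in for real audio

stream = recognizer.create_stream()
chunk = 1600  # feed 0.1 s at a time to mimic real-time capture
for start in range(0, len(audio), chunk):
    stream.accept_waveform(sample_rate, audio[start:start + chunk])
    while recognizer.is_ready(stream):
        recognizer.decode_stream(stream)
    print(recognizer.get_result(stream))  # partial result so far

stream.input_finished()
while recognizer.is_ready(stream):
    recognizer.decode_stream(stream)
print(recognizer.get_result(stream))  # final result
```

In a real application the chunks would come from a microphone callback; endpointing and other options shown in the documentation are omitted here.
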
</details>
#### Some pre-trained ASR models (Non-Streaming)
<details>
Please see
- <https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-transducer/index.html>
- <https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-paraformer/index.html>
- <https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-ctc/index.html>
- <https://k2-fsa.github.io/sherpa/onnx/pretrained_models/telespeech/index.html>
- <https://k2-fsa.github.io/sherpa/onnx/pretrained_models/whisper/index.html>
for more models. The following table lists only **SOME** of them.
|Name | Supported Languages| Description|
|-----|-----|----|
|[sherpa-onnx-nemo-parakeet-tdt-0.6b-v2-int8](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-transducer/nemo-transducer-models.html#sherpa-onnx-nemo-parakeet-tdt-0-6b-v2-int8-english)| English | It is converted from <https://huggingface.co/nvidia/parakeet-tdt-0.6b-v2>|
|[Whisper tiny.en](https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-whisper-tiny.en.tar.bz2)|English| See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/whisper/tiny.en.html)|
|[Moonshine tiny][Moonshine tiny]|English|See [also](https://github.com/usefulsensors/moonshine)|
|[sherpa-onnx-zipformer-ctc-zh-int8-2025-07-03](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-ctc/icefall/zipformer.html#sherpa-onnx-zipformer-ctc-zh-int8-2025-07-03-chinese)|Chinese| A Zipformer CTC model|
|[sherpa-onnx-sense-voice-zh-en-ja-ko-yue-2024-07-17][sherpa-onnx-sense-voice-zh-en-ja-ko-yue-2024-07-17]|Chinese, Cantonese, English, Korean, Japanese| Supports multiple Chinese dialects. See [also](https://k2-fsa.github.io/sherpa/onnx/sense-voice/index.html)|
|[sherpa-onnx-paraformer-zh-2024-03-09][sherpa-onnx-paraformer-zh-2024-03-09]|Chinese, English| Also supports multiple Chinese dialects. See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-paraformer/paraformer-models.html#csukuangfj-sherpa-onnx-paraformer-zh-2024-03-09-chinese-english)|
|[sherpa-onnx-zipformer-ja-reazonspeech-2024-08-01][sherpa-onnx-zipformer-ja-reazonspeech-2024-08-01]|Japanese|See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-transducer/zipformer-transducer-models.html#sherpa-onnx-zipformer-ja-reazonspeech-2024-08-01-japanese)|
|[sherpa-onnx-nemo-transducer-giga-am-russian-2024-10-24][sherpa-onnx-nemo-transducer-giga-am-russian-2024-10-24]|Russian|See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-transducer/nemo-transducer-models.html#sherpa-onnx-nemo-transducer-giga-am-russian-2024-10-24-russian)|
|[sherpa-onnx-nemo-ctc-giga-am-russian-2024-10-24][sherpa-onnx-nemo-ctc-giga-am-russian-2024-10-24]|Russian| See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-ctc/nemo/russian.html#sherpa-onnx-nemo-ctc-giga-am-russian-2024-10-24)|
|[sherpa-onnx-zipformer-ru-2024-09-18][sherpa-onnx-zipformer-ru-2024-09-18]|Russian|See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-transducer/zipformer-transducer-models.html#sherpa-onnx-zipformer-ru-2024-09-18-russian)|
|[sherpa-onnx-zipformer-korean-2024-06-24][sherpa-onnx-zipformer-korean-2024-06-24]|Korean|See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-transducer/zipformer-transducer-models.html#sherpa-onnx-zipformer-korean-2024-06-24-korean)|
|[sherpa-onnx-zipformer-thai-2024-06-20][sherpa-onnx-zipformer-thai-2024-06-20]|Thai| See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-transducer/zipformer-transducer-models.html#sherpa-onnx-zipformer-thai-2024-06-20-thai)|
|[sherpa-onnx-telespeech-ctc-int8-zh-2024-06-04][sherpa-onnx-telespeech-ctc-int8-zh-2024-06-04]|Chinese| Supports multiple Chinese dialects. See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/telespeech/models.html#sherpa-onnx-telespeech-ctc-int8-zh-2024-06-04)|
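
For instance, the Whisper tiny.en entry above could be used from Python roughly as follows. This is only a sketch: the paths inside the downloaded archive are assumptions, so check the archive you actually extract.

```python
# Minimal sketch: offline ASR with a Whisper model via the sherpa-onnx Python API.
# The paths below are assumptions; adjust them to the extracted archive contents.
import sherpa_onnx
import soundfile as sf  # used here only to read a 16 kHz mono wave file

recognizer = sherpa_onnx.OfflineRecognizer.from_whisper(
    encoder="sherpa-onnx-whisper-tiny.en/tiny.en-encoder.onnx",
    decoder="sherpa-onnx-whisper-tiny.en/tiny.en-decoder.onnx",
    tokens="sherpa-onnx-whisper-tiny.en/tiny.en-tokens.txt",
    language="en",
    task="transcribe",
    num_threads=2,
)

samples, sample_rate = sf.read("test.wav", dtype="float32")  # mono audio assumed

stream = recognizer.create_stream()
stream.accept_waveform(sample_rate, samples)
recognizer.decode_stream(stream)
print(stream.result.text)
```
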
</details>
### Useful links
- Documentation: https://k2-fsa.github.io/sherpa/onnx/
- Bilibili demo videos: https://search.bilibili.com/all?keyword=%E6%96%B0%E4%B8%80%E4%BB%A3Kaldi
### How to reach us
Please see
https://k2-fsa.github.io/sherpa/social-groups.html
for the Next-gen Kaldi **WeChat group** and **QQ group**.
## Projects using sherpa-onnx
### [BreezeApp](https://github.com/mtkresearch/BreezeApp) from [MediaTek Research](https://github.com/mtkresearch)
> BreezeAPP is a mobile AI application developed for both Android and iOS platforms.
> Users can download it directly from the App Store and enjoy a variety of features
> offline, including speech-to-text, text-to-speech, text-based chatbot interactions,
> and image question-answering
- [Download APK for BreezeAPP](https://huggingface.co/MediaTek-Research/BreezeApp/resolve/main/BreezeApp.apk)
- [APK 中国镜像](https://hf-mirror.com/MediaTek-Research/BreezeApp/blob/main/BreezeApp.apk)
### [Open-LLM-VTuber](https://github.com/t41372/Open-LLM-VTuber)
Talk to any LLM with hands-free voice interaction, voice interruption, and a Live2D talking
face, running locally across platforms.
See also <https://github.com/t41372/Open-LLM-VTuber/pull/50>
### [voiceapi](https://github.com/ruzhila/voiceapi)
<details>
<summary>Streaming ASR and TTS based on FastAPI</summary>
It shows how to use the ASR and TTS Python APIs with FastAPI.
</details>
### [腾讯会议摸鱼工具 TMSpeech](https://github.com/jxlpzqc/TMSpeech)
It uses streaming ASR in C# with a graphical user interface.
Video demo in Chinese: [【开源】Windows实时字幕软件(网课/开会必备)](https://www.bilibili.com/video/BV1rX4y1p7Nx)
### [lol互动助手](https://github.com/l1veIn/lol-wom-electron)
It uses the JavaScript API of sherpa-onnx along with [Electron](https://electronjs.org/).
Video demo in Chinese: [爆了!炫神教你开打字挂!真正影响胜率的英雄联盟工具!英雄联盟的最后一块拼图!和游戏中的每个人无障碍沟通!](https://www.bilibili.com/video/BV142tje9E74)
### [Sherpa-ONNX 语音识别服务器](https://github.com/hfyydd/sherpa-onnx-server)
A Node.js-based server providing a RESTful API for speech recognition.
### [QSmartAssistant](https://github.com/xinhecuican/QSmartAssistant)
A modular, fully offline-capable, low-resource-usage chatbot / smart speaker.

It uses Qt. Both [ASR](https://github.com/xinhecuican/QSmartAssistant/blob/master/doc/%E5%AE%89%E8%A3%85.md#asr)
and [TTS](https://github.com/xinhecuican/QSmartAssistant/blob/master/doc/%E5%AE%89%E8%A3%85.md#tts)
are used.
### [Flutter-EasySpeechRecognition](https://github.com/Jason-chen-coder/Flutter-EasySpeechRecognition)
It extends [./flutter-examples/streaming_asr](./flutter-examples/streaming_asr) by
downloading models from within the app, which reduces the app's size.
Note: [[Team B] Sherpa AI backend](https://github.com/umgc/spring2025/pull/82) also uses
sherpa-onnx in a Flutter APP.
### [sherpa-onnx-unity](https://github.com/xue-fei/sherpa-onnx-unity)
sherpa-onnx in Unity. See also [#1695](https://github.com/k2-fsa/sherpa-onnx/issues/1695),
[#1892](https://github.com/k2-fsa/sherpa-onnx/issues/1892), and [#1859](https://github.com/k2-fsa/sherpa-onnx/issues/1859)
### [xiaozhi-esp32-server](https://github.com/xinnan-tech/xiaozhi-esp32-server)
A backend service for xiaozhi-esp32 that helps you quickly build an ESP32 device control server.
See also
- [ASR新增轻量级sherpa-onnx-asr](https://github.com/xinnan-tech/xiaozhi-esp32-server/issues/315)
- [feat: ASR增加sherpa-onnx模型](https://github.com/xinnan-tech/xiaozhi-esp32-server/pull/379)
### [KaithemAutomation](https://github.com/EternityForest/KaithemAutomation)
Pure Python, GUI-focused home automation / consumer-grade SCADA.
It uses TTS from sherpa-onnx. See also [✨ Speak command that uses the new globally configured TTS model.](https://github.com/EternityForest/KaithemAutomation/commit/8e64d2b138725e426532f7d66bb69dd0b4f53693)
### [Open-XiaoAI KWS](https://github.com/idootop/open-xiaoai-kws)
It enables custom wake words for XiaoAi speakers.
Video demo in Chinese: [小爱同学启动~˶╹ꇴ╹˶!](https://www.bilibili.com/video/BV1YfVUz5EMj)
### [C++ WebSocket ASR Server](https://github.com/mawwalker/stt-server)
It provides a WebSocket server based on C++ for ASR using sherpa-onnx.
### [Go WebSocket Server](https://github.com/bbeyondllove/asr_server)
It provides a WebSocket server based on the Go programming language for sherpa-onnx.
### [Making robot Paimon, Ep10 "The AI Part 1"](https://www.youtube.com/watch?v=KxPKkwxGWZs)
A [YouTube video](https://www.youtube.com/watch?v=KxPKkwxGWZs) showing how the author
tried to use AI to have a conversation with Paimon.
It uses sherpa-onnx for speech-to-text and text-to-speech.
### [TtsReader - Desktop application](https://github.com/ys-pro-duction/TtsReader)
A desktop text-to-speech application built using Kotlin Multiplatform.
### [MentraOS](https://github.com/Mentra-Community/MentraOS)
> Smart glasses OS, with dozens of built-in apps. Users get AI assistant, notifications,
> translation, screen mirror, captions, and more. Devs get to write 1 app that runs on
> any pair of smart glasses.
It uses sherpa-onnx for real-time speech recognition on iOS and Android devices.
See also <https://github.com/Mentra-Community/MentraOS/pull/861>
It uses Swift for iOS and Java for Android.
[sherpa-rs]: https://github.com/thewh1teagle/sherpa-rs
[silero-vad]: https://github.com/snakers4/silero-vad
[Raspberry Pi]: https://www.raspberrypi.com/
[RV1126]: https://www.rock-chips.com/uploads/pdf/2022.8.26/191/RV1126%20Brief%20Datasheet.pdf
[LicheePi4A]: https://sipeed.com/licheepi4a
[VisionFive 2]: https://www.starfivetech.com/en/site/boards
[旭日X3派]: https://developer.horizon.ai/api/v1/fileData/documents_pi/index.html
[爱芯派]: https://wiki.sipeed.com/hardware/zh/maixIII/ax-pi/axpi.html
[hf-space-speaker-diarization]: https://huggingface.co/spaces/k2-fsa/speaker-diarization
[hf-space-speaker-diarization-cn]: https://hf.qhduan.com/spaces/k2-fsa/speaker-diarization
[hf-space-asr]: https://huggingface.co/spaces/k2-fsa/automatic-speech-recognition
[hf-space-asr-cn]: https://hf.qhduan.com/spaces/k2-fsa/automatic-speech-recognition
[Whisper]: https://github.com/openai/whisper
[hf-space-asr-whisper]: https://huggingface.co/spaces/k2-fsa/automatic-speech-recognition-with-whisper
[hf-space-asr-whisper-cn]: https://hf.qhduan.com/spaces/k2-fsa/automatic-speech-recognition-with-whisper
[hf-space-tts]: https://huggingface.co/spaces/k2-fsa/text-to-speech
[hf-space-tts-cn]: https://hf.qhduan.com/spaces/k2-fsa/text-to-speech
[hf-space-subtitle]: https://huggingface.co/spaces/k2-fsa/generate-subtitles-for-videos
[hf-space-subtitle-cn]: https://hf.qhduan.com/spaces/k2-fsa/generate-subtitles-for-videos
[hf-space-audio-tagging]: https://huggingface.co/spaces/k2-fsa/audio-tagging
[hf-space-audio-tagging-cn]: https://hf.qhduan.com/spaces/k2-fsa/audio-tagging
[hf-space-source-separation]: https://huggingface.co/spaces/k2-fsa/source-separation
[hf-space-source-separation-cn]: https://hf.qhduan.com/spaces/k2-fsa/source-separation
[hf-space-slid-whisper]: https://huggingface.co/spaces/k2-fsa/spoken-language-identification
[hf-space-slid-whisper-cn]: https://hf.qhduan.com/spaces/k2-fsa/spoken-language-identification
[wasm-hf-vad]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-sherpa-onnx
[wasm-ms-vad]: https://modelscope.cn/studios/csukuangfj/web-assembly-vad-sherpa-onnx
[wasm-hf-streaming-asr-zh-en-zipformer]: https://huggingface.co/spaces/k2-fsa/web-assembly-asr-sherpa-onnx-zh-en
[wasm-ms-streaming-asr-zh-en-zipformer]: https://modelscope.cn/studios/k2-fsa/web-assembly-asr-sherpa-onnx-zh-en
[wasm-hf-streaming-asr-zh-en-paraformer]: https://huggingface.co/spaces/k2-fsa/web-assembly-asr-sherpa-onnx-zh-en-paraformer
[wasm-ms-streaming-asr-zh-en-paraformer]: https://modelscope.cn/studios/k2-fsa/web-assembly-asr-sherpa-onnx-zh-en-paraformer
[Paraformer-large]: https://www.modelscope.cn/models/damo/speech_paraformer-large_asr_nat-zh-cn-16k-common-vocab8404-pytorch/summary
[wasm-hf-streaming-asr-zh-en-yue-paraformer]: https://huggingface.co/spaces/k2-fsa/web-assembly-asr-sherpa-onnx-zh-cantonese-en-paraformer
[wasm-ms-streaming-asr-zh-en-yue-paraformer]: https://modelscope.cn/studios/k2-fsa/web-assembly-asr-sherpa-onnx-zh-cantonese-en-paraformer
[wasm-hf-streaming-asr-en-zipformer]: https://huggingface.co/spaces/k2-fsa/web-assembly-asr-sherpa-onnx-en
[wasm-ms-streaming-asr-en-zipformer]: https://modelscope.cn/studios/k2-fsa/web-assembly-asr-sherpa-onnx-en
[SenseVoice]: https://github.com/FunAudioLLM/SenseVoice
[wasm-hf-vad-asr-zh-zipformer-ctc-07-03]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-asr-sherpa-onnx-zh-zipformer-ctc
[wasm-ms-vad-asr-zh-zipformer-ctc-07-03]: https://modelscope.cn/studios/csukuangfj/web-assembly-vad-asr-sherpa-onnx-zh-zipformer-ctc/summary
[wasm-hf-vad-asr-zh-en-ko-ja-yue-sense-voice]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-asr-sherpa-onnx-zh-en-ja-ko-cantonese-sense-voice
[wasm-ms-vad-asr-zh-en-ko-ja-yue-sense-voice]: https://www.modelscope.cn/studios/csukuangfj/web-assembly-vad-asr-sherpa-onnx-zh-en-jp-ko-cantonese-sense-voice
[wasm-hf-vad-asr-en-whisper-tiny-en]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-asr-sherpa-onnx-en-whisper-tiny
[wasm-ms-vad-asr-en-whisper-tiny-en]: https://www.modelscope.cn/studios/csukuangfj/web-assembly-vad-asr-sherpa-onnx-en-whisper-tiny
[wasm-hf-vad-asr-en-moonshine-tiny-en]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-asr-sherpa-onnx-en-moonshine-tiny
[wasm-ms-vad-asr-en-moonshine-tiny-en]: https://www.modelscope.cn/studios/csukuangfj/web-assembly-vad-asr-sherpa-onnx-en-moonshine-tiny
[wasm-hf-vad-asr-en-zipformer-gigaspeech]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-asr-sherpa-onnx-en-zipformer-gigaspeech
[wasm-ms-vad-asr-en-zipformer-gigaspeech]: https://www.modelscope.cn/studios/k2-fsa/web-assembly-vad-asr-sherpa-onnx-en-zipformer-gigaspeech
[wasm-hf-vad-asr-zh-zipformer-wenetspeech]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-asr-sherpa-onnx-zh-zipformer-wenetspeech
[wasm-ms-vad-asr-zh-zipformer-wenetspeech]: https://www.modelscope.cn/studios/k2-fsa/web-assembly-vad-asr-sherpa-onnx-zh-zipformer-wenetspeech
[reazonspeech]: https://research.reazon.jp/_static/reazonspeech_nlp2023.pdf
[wasm-hf-vad-asr-ja-zipformer-reazonspeech]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-asr-sherpa-onnx-ja-zipformer
[wasm-ms-vad-asr-ja-zipformer-reazonspeech]: https://www.modelscope.cn/studios/csukuangfj/web-assembly-vad-asr-sherpa-onnx-ja-zipformer
[gigaspeech2]: https://github.com/speechcolab/gigaspeech2
[wasm-hf-vad-asr-th-zipformer-gigaspeech2]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-asr-sherpa-onnx-th-zipformer
[wasm-ms-vad-asr-th-zipformer-gigaspeech2]: https://www.modelscope.cn/studios/csukuangfj/web-assembly-vad-asr-sherpa-onnx-th-zipformer
[telespeech-asr]: https://github.com/tele-ai/telespeech-asr
[wasm-hf-vad-asr-zh-telespeech]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-asr-sherpa-onnx-zh-telespeech
[wasm-ms-vad-asr-zh-telespeech]: https://www.modelscope.cn/studios/k2-fsa/web-assembly-vad-asr-sherpa-onnx-zh-telespeech
[wasm-hf-vad-asr-zh-en-paraformer-large]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-asr-sherpa-onnx-zh-en-paraformer
[wasm-ms-vad-asr-zh-en-paraformer-large]: https://www.modelscope.cn/studios/k2-fsa/web-assembly-vad-asr-sherpa-onnx-zh-en-paraformer
[wasm-hf-vad-asr-zh-en-paraformer-small]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-asr-sherpa-onnx-zh-en-paraformer-small
[wasm-ms-vad-asr-zh-en-paraformer-small]: https://www.modelscope.cn/studios/k2-fsa/web-assembly-vad-asr-sherpa-onnx-zh-en-paraformer-small
[dolphin]: https://github.com/dataoceanai/dolphin
[wasm-ms-vad-asr-multi-lang-dolphin-base]: https://modelscope.cn/studios/csukuangfj/web-assembly-vad-asr-sherpa-onnx-multi-lang-dophin-ctc
[wasm-hf-vad-asr-multi-lang-dolphin-base]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-asr-sherpa-onnx-multi-lang-dophin-ctc
[wasm-hf-tts-piper-en]: https://huggingface.co/spaces/k2-fsa/web-assembly-tts-sherpa-onnx-en
[wasm-ms-tts-piper-en]: https://modelscope.cn/studios/k2-fsa/web-assembly-tts-sherpa-onnx-en
[wasm-hf-tts-piper-de]: https://huggingface.co/spaces/k2-fsa/web-assembly-tts-sherpa-onnx-de
[wasm-ms-tts-piper-de]: https://modelscope.cn/studios/k2-fsa/web-assembly-tts-sherpa-onnx-de
[wasm-hf-speaker-diarization]: https://huggingface.co/spaces/k2-fsa/web-assembly-speaker-diarization-sherpa-onnx
[wasm-ms-speaker-diarization]: https://www.modelscope.cn/studios/csukuangfj/web-assembly-speaker-diarization-sherpa-onnx
[apk-speaker-diarization]: https://k2-fsa.github.io/sherpa/onnx/speaker-diarization/apk.html
[apk-speaker-diarization-cn]: https://k2-fsa.github.io/sherpa/onnx/speaker-diarization/apk-cn.html
[apk-streaming-asr]: https://k2-fsa.github.io/sherpa/onnx/android/apk.html
[apk-streaming-asr-cn]: https://k2-fsa.github.io/sherpa/onnx/android/apk-cn.html
[apk-simula-streaming-asr]: https://k2-fsa.github.io/sherpa/onnx/android/apk-simulate-streaming-asr.html
[apk-simula-streaming-asr-cn]: https://k2-fsa.github.io/sherpa/onnx/android/apk-simulate-streaming-asr-cn.html
[apk-tts]: https://k2-fsa.github.io/sherpa/onnx/tts/apk-engine.html
[apk-tts-cn]: https://k2-fsa.github.io/sherpa/onnx/tts/apk-engine-cn.html
[apk-vad]: https://k2-fsa.github.io/sherpa/onnx/vad/apk.html
[apk-vad-cn]: https://k2-fsa.github.io/sherpa/onnx/vad/apk-cn.html
[apk-vad-asr]: https://k2-fsa.github.io/sherpa/onnx/vad/apk-asr.html
[apk-vad-asr-cn]: https://k2-fsa.github.io/sherpa/onnx/vad/apk-asr-cn.html
[apk-2pass]: https://k2-fsa.github.io/sherpa/onnx/android/apk-2pass.html
[apk-2pass-cn]: https://k2-fsa.github.io/sherpa/onnx/android/apk-2pass-cn.html
[apk-at]: https://k2-fsa.github.io/sherpa/onnx/audio-tagging/apk.html
[apk-at-cn]: https://k2-fsa.github.io/sherpa/onnx/audio-tagging/apk-cn.html
[apk-at-wearos]: https://k2-fsa.github.io/sherpa/onnx/audio-tagging/apk-wearos.html
[apk-at-wearos-cn]: https://k2-fsa.github.io/sherpa/onnx/audio-tagging/apk-wearos-cn.html
[apk-sid]: https://k2-fsa.github.io/sherpa/onnx/speaker-identification/apk.html
[apk-sid-cn]: https://k2-fsa.github.io/sherpa/onnx/speaker-identification/apk-cn.html
[apk-slid]: https://k2-fsa.github.io/sherpa/onnx/spoken-language-identification/apk.html
[apk-slid-cn]: https://k2-fsa.github.io/sherpa/onnx/spoken-language-identification/apk-cn.html
[apk-kws]: https://k2-fsa.github.io/sherpa/onnx/kws/apk.html
[apk-kws-cn]: https://k2-fsa.github.io/sherpa/onnx/kws/apk-cn.html
[apk-flutter-streaming-asr]: https://k2-fsa.github.io/sherpa/onnx/flutter/asr/app.html
[apk-flutter-streaming-asr-cn]: https://k2-fsa.github.io/sherpa/onnx/flutter/asr/app-cn.html
[flutter-tts-android]: https://k2-fsa.github.io/sherpa/onnx/flutter/tts-android.html
[flutter-tts-android-cn]: https://k2-fsa.github.io/sherpa/onnx/flutter/tts-android-cn.html
[flutter-tts-linux]: https://k2-fsa.github.io/sherpa/onnx/flutter/tts-linux.html
[flutter-tts-linux-cn]: https://k2-fsa.github.io/sherpa/onnx/flutter/tts-linux-cn.html
[flutter-tts-macos-x64]: https://k2-fsa.github.io/sherpa/onnx/flutter/tts-macos-x64.html
[flutter-tts-macos-x64-cn]: https://k2-fsa.github.io/sherpa/onnx/flutter/tts-macos-x64-cn.html
[flutter-tts-macos-arm64]: https://k2-fsa.github.io/sherpa/onnx/flutter/tts-macos-arm64.html
[flutter-tts-macos-arm64-cn]: https://k2-fsa.github.io/sherpa/onnx/flutter/tts-macos-arm64-cn.html
[flutter-tts-win-x64]: https://k2-fsa.github.io/sherpa/onnx/flutter/tts-win.html
[flutter-tts-win-x64-cn]: https://k2-fsa.github.io/sherpa/onnx/flutter/tts-win-cn.html
[lazarus-subtitle]: https://k2-fsa.github.io/sherpa/onnx/lazarus/download-generated-subtitles.html
[lazarus-subtitle-cn]: https://k2-fsa.github.io/sherpa/onnx/lazarus/download-generated-subtitles-cn.html
[asr-models]: https://github.com/k2-fsa/sherpa-onnx/releases/tag/asr-models
[tts-models]: https://github.com/k2-fsa/sherpa-onnx/releases/tag/tts-models
[vad-models]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/silero_vad.onnx
[kws-models]: https://github.com/k2-fsa/sherpa-onnx/releases/tag/kws-models
[at-models]: https://github.com/k2-fsa/sherpa-onnx/releases/tag/audio-tagging-models
[sid-models]: https://github.com/k2-fsa/sherpa-onnx/releases/tag/speaker-recongition-models
[slid-models]: https://github.com/k2-fsa/sherpa-onnx/releases/tag/speaker-recongition-models
[punct-models]: https://github.com/k2-fsa/sherpa-onnx/releases/tag/punctuation-models
[speaker-segmentation-models]: https://github.com/k2-fsa/sherpa-onnx/releases/tag/speaker-segmentation-models
[GigaSpeech]: https://github.com/SpeechColab/GigaSpeech
[WenetSpeech]: https://github.com/wenet-e2e/WenetSpeech
[sherpa-onnx-streaming-zipformer-bilingual-zh-en-2023-02-20]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-streaming-zipformer-bilingual-zh-en-2023-02-20.tar.bz2
[sherpa-onnx-streaming-zipformer-small-bilingual-zh-en-2023-02-16]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-streaming-zipformer-small-bilingual-zh-en-2023-02-16.tar.bz2
[sherpa-onnx-streaming-zipformer-korean-2024-06-16]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-streaming-zipformer-korean-2024-06-16.tar.bz2
[sherpa-onnx-streaming-zipformer-zh-14M-2023-02-23]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-streaming-zipformer-zh-14M-2023-02-23.tar.bz2
[sherpa-onnx-streaming-zipformer-en-20M-2023-02-17]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-streaming-zipformer-en-20M-2023-02-17.tar.bz2
[sherpa-onnx-zipformer-ja-reazonspeech-2024-08-01]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-zipformer-ja-reazonspeech-2024-08-01.tar.bz2
[sherpa-onnx-zipformer-ru-2024-09-18]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-zipformer-ru-2024-09-18.tar.bz2
[sherpa-onnx-zipformer-korean-2024-06-24]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-zipformer-korean-2024-06-24.tar.bz2
[sherpa-onnx-zipformer-thai-2024-06-20]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-zipformer-thai-2024-06-20.tar.bz2
[sherpa-onnx-nemo-transducer-giga-am-russian-2024-10-24]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-nemo-transducer-giga-am-russian-2024-10-24.tar.bz2
[sherpa-onnx-paraformer-zh-2024-03-09]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-paraformer-zh-2024-03-09.tar.bz2
[sherpa-onnx-nemo-ctc-giga-am-russian-2024-10-24]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-nemo-ctc-giga-am-russian-2024-10-24.tar.bz2
[sherpa-onnx-telespeech-ctc-int8-zh-2024-06-04]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-telespeech-ctc-int8-zh-2024-06-04.tar.bz2
[sherpa-onnx-sense-voice-zh-en-ja-ko-yue-2024-07-17]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-sense-voice-zh-en-ja-ko-yue-2024-07-17.tar.bz2
[sherpa-onnx-streaming-zipformer-fr-2023-04-14]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-streaming-zipformer-fr-2023-04-14.tar.bz2
[Moonshine tiny]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-moonshine-tiny-en-int8.tar.bz2
[NVIDIA Jetson Orin NX]: https://developer.download.nvidia.com/assets/embedded/secure/jetson/orin_nx/docs/Jetson_Orin_NX_DS-10712-001_v0.5.pdf?RCPGu9Q6OVAOv7a7vgtwc9-BLScXRIWq6cSLuditMALECJ_dOj27DgnqAPGVnT2VpiNpQan9SyFy-9zRykR58CokzbXwjSA7Gj819e91AXPrWkGZR3oS1VLxiDEpJa_Y0lr7UT-N4GnXtb8NlUkP4GkCkkF_FQivGPrAucCUywL481GH_WpP_p7ziHU1Wg==&t=eyJscyI6ImdzZW8iLCJsc2QiOiJodHRwczovL3d3dy5nb29nbGUuY29tLmhrLyJ9
[NVIDIA Jetson Nano B01]: https://www.seeedstudio.com/blog/2020/01/16/new-revision-of-jetson-nano-dev-kit-now-supports-new-jetson-nano-module/
[speech-enhancement-models]: https://github.com/k2-fsa/sherpa-onnx/releases/tag/speech-enhancement-models
[source-separation-models]: https://github.com/k2-fsa/sherpa-onnx/releases/tag/source-separation-models
[RK3588]: https://www.rock-chips.com/uploads/pdf/2022.8.26/192/RK3588%20Brief%20Datasheet.pdf
[spleeter]: https://github.com/deezer/spleeter
[UVR]: https://github.com/Anjok07/ultimatevocalremovergui
[gtcrn]: https://github.com/Xiaobin-Rong/gtcrn
[tts-url]: https://k2-fsa.github.io/sherpa/onnx/tts/all-in-one.html
[ss-url]: https://k2-fsa.github.io/sherpa/onnx/source-separation/index.html
[sd-url]: https://k2-fsa.github.io/sherpa/onnx/speaker-diarization/index.html
[slid-url]: https://k2-fsa.github.io/sherpa/onnx/spoken-language-identification/index.html
[at-url]: https://k2-fsa.github.io/sherpa/onnx/audio-tagging/index.html
[vad-url]: https://k2-fsa.github.io/sherpa/onnx/vad/index.html
[kws-url]: https://k2-fsa.github.io/sherpa/onnx/kws/index.html
[punct-url]: https://k2-fsa.github.io/sherpa/onnx/punctuation/index.html
[se-url]: https://k2-fsa.github.io/sherpa/onnx/speech-enhancment/index.html
Raw data
{
"_id": null,
"home_page": "https://github.com/k2-fsa/sherpa-onnx",
"name": "sherpa-onnx",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.7",
"maintainer_email": null,
"keywords": null,
"author": "The sherpa-onnx development team",
"author_email": "dpovey@gmail.com",
"download_url": "https://files.pythonhosted.org/packages/c3/b8/46310a6bc56e99ca4a8d80a884d5f5bfcd80da3d7e289f996ae3e417f2b5/sherpa-onnx-1.12.13.tar.gz",
"platform": null,
"description": "### Supported functions\n\n|Speech recognition| [Speech synthesis][tts-url] | [Source separation][ss-url] |\n|------------------|------------------|-------------------|\n| \u2714\ufe0f | \u2714\ufe0f | \u2714\ufe0f |\n\n|Speaker identification| [Speaker diarization][sd-url] | Speaker verification |\n|----------------------|-------------------- |------------------------|\n| \u2714\ufe0f | \u2714\ufe0f | \u2714\ufe0f |\n\n| [Spoken Language identification][slid-url] | [Audio tagging][at-url] | [Voice activity detection][vad-url] |\n|--------------------------------|---------------|--------------------------|\n| \u2714\ufe0f | \u2714\ufe0f | \u2714\ufe0f |\n\n| [Keyword spotting][kws-url] | [Add punctuation][punct-url] | [Speech enhancement][se-url] |\n|------------------|-----------------|--------------------|\n| \u2714\ufe0f | \u2714\ufe0f | \u2714\ufe0f |\n\n\n### Supported platforms\n\n|Architecture| Android | iOS | Windows | macOS | linux | HarmonyOS |\n|------------|---------|---------|------------|-------|-------|-----------|\n| x64 | \u2714\ufe0f | | \u2714\ufe0f | \u2714\ufe0f | \u2714\ufe0f | \u2714\ufe0f |\n| x86 | \u2714\ufe0f | | \u2714\ufe0f | | | |\n| arm64 | \u2714\ufe0f | \u2714\ufe0f | \u2714\ufe0f | \u2714\ufe0f | \u2714\ufe0f | \u2714\ufe0f |\n| arm32 | \u2714\ufe0f | | | | \u2714\ufe0f | \u2714\ufe0f |\n| riscv64 | | | | | \u2714\ufe0f | |\n\n### Supported programming languages\n\n| 1. C++ | 2. C | 3. Python | 4. JavaScript |\n|--------|-------|-----------|---------------|\n| \u2714\ufe0f | \u2714\ufe0f | \u2714\ufe0f | \u2714\ufe0f |\n\n|5. Java | 6. C# | 7. Kotlin | 8. Swift |\n|--------|-------|-----------|----------|\n| \u2714\ufe0f | \u2714\ufe0f | \u2714\ufe0f | \u2714\ufe0f |\n\n| 9. Go | 10. Dart | 11. Rust | 12. Pascal |\n|-------|----------|----------|------------|\n| \u2714\ufe0f | \u2714\ufe0f | \u2714\ufe0f | \u2714\ufe0f |\n\nFor Rust support, please see [sherpa-rs][sherpa-rs]\n\nIt also supports WebAssembly.\n\n[Join our discord](https://discord.gg/fJdxzg2VbG)\n\n\n## Introduction\n\nThis repository supports running the following functions **locally**\n\n - Speech-to-text (i.e., ASR); both streaming and non-streaming are supported\n - Text-to-speech (i.e., TTS)\n - Speaker diarization\n - Speaker identification\n - Speaker verification\n - Spoken language identification\n - Audio tagging\n - VAD (e.g., [silero-vad][silero-vad])\n - Speech enhancement (e.g., [gtcrn][gtcrn])\n - Keyword spotting\n - Source separation (e.g., [spleeter][spleeter], [UVR][UVR])\n\non the following platforms and operating systems:\n\n - x86, ``x86_64``, 32-bit ARM, 64-bit ARM (arm64, aarch64), RISC-V (riscv64), **RK NPU**\n - Linux, macOS, Windows, openKylin\n - Android, WearOS\n - iOS\n - HarmonyOS\n - NodeJS\n - WebAssembly\n - [NVIDIA Jetson Orin NX][NVIDIA Jetson Orin NX] (Support running on both CPU and GPU)\n - [NVIDIA Jetson Nano B01][NVIDIA Jetson Nano B01] (Support running on both CPU and GPU)\n - [Raspberry Pi][Raspberry Pi]\n - [RV1126][RV1126]\n - [LicheePi4A][LicheePi4A]\n - [VisionFive 2][VisionFive 2]\n - [\u65ed\u65e5X3\u6d3e][\u65ed\u65e5X3\u6d3e]\n - [\u7231\u82af\u6d3e][\u7231\u82af\u6d3e]\n - [RK3588][RK3588]\n - etc\n\nwith the following APIs\n\n - C++, C, Python, Go, ``C#``\n - Java, Kotlin, JavaScript\n - Swift, Rust\n - Dart, Object Pascal\n\n### Links for Huggingface Spaces\n\n<details>\n<summary>You can visit the following Huggingface spaces to try sherpa-onnx without\ninstalling anything. 
All you need is a browser.</summary>\n\n| Description | URL | \u4e2d\u56fd\u955c\u50cf |\n|-------------------------------------------------------|-----------------------------------------|----------------------------------------|\n| Speaker diarization | [Click me][hf-space-speaker-diarization]| [\u955c\u50cf][hf-space-speaker-diarization-cn]|\n| Speech recognition | [Click me][hf-space-asr] | [\u955c\u50cf][hf-space-asr-cn] |\n| Speech recognition with [Whisper][Whisper] | [Click me][hf-space-asr-whisper] | [\u955c\u50cf][hf-space-asr-whisper-cn] |\n| Speech synthesis | [Click me][hf-space-tts] | [\u955c\u50cf][hf-space-tts-cn] |\n| Generate subtitles | [Click me][hf-space-subtitle] | [\u955c\u50cf][hf-space-subtitle-cn] |\n| Audio tagging | [Click me][hf-space-audio-tagging] | [\u955c\u50cf][hf-space-audio-tagging-cn] |\n| Source separation | [Click me][hf-space-source-separation] | [\u955c\u50cf][hf-space-source-separation-cn] |\n| Spoken language identification with [Whisper][Whisper]| [Click me][hf-space-slid-whisper] | [\u955c\u50cf][hf-space-slid-whisper-cn] |\n\nWe also have spaces built using WebAssembly. They are listed below:\n\n| Description | Huggingface space| ModelScope space|\n|------------------------------------------------------------------------------------------|------------------|-----------------|\n|Voice activity detection with [silero-vad][silero-vad] | [Click me][wasm-hf-vad]|[\u5730\u5740][wasm-ms-vad]|\n|Real-time speech recognition (Chinese + English) with Zipformer | [Click me][wasm-hf-streaming-asr-zh-en-zipformer]|[\u5730\u5740][wasm-hf-streaming-asr-zh-en-zipformer]|\n|Real-time speech recognition (Chinese + English) with Paraformer |[Click me][wasm-hf-streaming-asr-zh-en-paraformer]| [\u5730\u5740][wasm-ms-streaming-asr-zh-en-paraformer]|\n|Real-time speech recognition (Chinese + English + Cantonese) with [Paraformer-large][Paraformer-large]|[Click me][wasm-hf-streaming-asr-zh-en-yue-paraformer]| [\u5730\u5740][wasm-ms-streaming-asr-zh-en-yue-paraformer]|\n|Real-time speech recognition (English) |[Click me][wasm-hf-streaming-asr-en-zipformer] |[\u5730\u5740][wasm-ms-streaming-asr-en-zipformer]|\n|VAD + speech recognition (Chinese) with [Zipformer CTC](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-ctc/icefall/zipformer.html#sherpa-onnx-zipformer-ctc-zh-int8-2025-07-03-chinese)|[Click me][wasm-hf-vad-asr-zh-zipformer-ctc-07-03]| [\u5730\u5740][wasm-ms-vad-asr-zh-zipformer-ctc-07-03]|\n|VAD + speech recognition (Chinese + English + Korean + Japanese + Cantonese) with [SenseVoice][SenseVoice]|[Click me][wasm-hf-vad-asr-zh-en-ko-ja-yue-sense-voice]| [\u5730\u5740][wasm-ms-vad-asr-zh-en-ko-ja-yue-sense-voice]|\n|VAD + speech recognition (English) with [Whisper][Whisper] tiny.en|[Click me][wasm-hf-vad-asr-en-whisper-tiny-en]| [\u5730\u5740][wasm-ms-vad-asr-en-whisper-tiny-en]|\n|VAD + speech recognition (English) with [Moonshine tiny][Moonshine tiny]|[Click me][wasm-hf-vad-asr-en-moonshine-tiny-en]| [\u5730\u5740][wasm-ms-vad-asr-en-moonshine-tiny-en]|\n|VAD + speech recognition (English) with Zipformer trained with [GigaSpeech][GigaSpeech] |[Click me][wasm-hf-vad-asr-en-zipformer-gigaspeech]| [\u5730\u5740][wasm-ms-vad-asr-en-zipformer-gigaspeech]|\n|VAD + speech recognition (Chinese) with Zipformer trained with [WenetSpeech][WenetSpeech] |[Click me][wasm-hf-vad-asr-zh-zipformer-wenetspeech]| [\u5730\u5740][wasm-ms-vad-asr-zh-zipformer-wenetspeech]|\n|VAD + speech recognition (Japanese) with Zipformer trained with 
[ReazonSpeech][ReazonSpeech]|[Click me][wasm-hf-vad-asr-ja-zipformer-reazonspeech]| [\u5730\u5740][wasm-ms-vad-asr-ja-zipformer-reazonspeech]|\n|VAD + speech recognition (Thai) with Zipformer trained with [GigaSpeech2][GigaSpeech2] |[Click me][wasm-hf-vad-asr-th-zipformer-gigaspeech2]| [\u5730\u5740][wasm-ms-vad-asr-th-zipformer-gigaspeech2]|\n|VAD + speech recognition (Chinese \u591a\u79cd\u65b9\u8a00) with a [TeleSpeech-ASR][TeleSpeech-ASR] CTC model|[Click me][wasm-hf-vad-asr-zh-telespeech]| [\u5730\u5740][wasm-ms-vad-asr-zh-telespeech]|\n|VAD + speech recognition (English + Chinese, \u53ca\u591a\u79cd\u4e2d\u6587\u65b9\u8a00) with Paraformer-large |[Click me][wasm-hf-vad-asr-zh-en-paraformer-large]| [\u5730\u5740][wasm-ms-vad-asr-zh-en-paraformer-large]|\n|VAD + speech recognition (English + Chinese, \u53ca\u591a\u79cd\u4e2d\u6587\u65b9\u8a00) with Paraformer-small |[Click me][wasm-hf-vad-asr-zh-en-paraformer-small]| [\u5730\u5740][wasm-ms-vad-asr-zh-en-paraformer-small]|\n|VAD + speech recognition (\u591a\u8bed\u79cd\u53ca\u591a\u79cd\u4e2d\u6587\u65b9\u8a00) with [Dolphin][Dolphin]-base |[Click me][wasm-hf-vad-asr-multi-lang-dolphin-base]| [\u5730\u5740][wasm-ms-vad-asr-multi-lang-dolphin-base]|\n|Speech synthesis (English) |[Click me][wasm-hf-tts-piper-en]| [\u5730\u5740][wasm-ms-tts-piper-en]|\n|Speech synthesis (German) |[Click me][wasm-hf-tts-piper-de]| [\u5730\u5740][wasm-ms-tts-piper-de]|\n|Speaker diarization |[Click me][wasm-hf-speaker-diarization]|[\u5730\u5740][wasm-ms-speaker-diarization]|\n\n</details>\n\n### Links for pre-built Android APKs\n\n<details>\n\n<summary>You can find pre-built Android APKs for this repository in the following table</summary>\n\n| Description | URL | \u4e2d\u56fd\u7528\u6237 |\n|----------------------------------------|------------------------------------|-----------------------------------|\n| Speaker diarization | [Address][apk-speaker-diarization] | [\u70b9\u6b64][apk-speaker-diarization-cn]|\n| Streaming speech recognition | [Address][apk-streaming-asr] | [\u70b9\u6b64][apk-streaming-asr-cn] |\n| Simulated-streaming speech recognition | [Address][apk-simula-streaming-asr]| [\u70b9\u6b64][apk-simula-streaming-asr-cn]|\n| Text-to-speech | [Address][apk-tts] | [\u70b9\u6b64][apk-tts-cn] |\n| Voice activity detection (VAD) | [Address][apk-vad] | [\u70b9\u6b64][apk-vad-cn] |\n| VAD + non-streaming speech recognition | [Address][apk-vad-asr] | [\u70b9\u6b64][apk-vad-asr-cn] |\n| Two-pass speech recognition | [Address][apk-2pass] | [\u70b9\u6b64][apk-2pass-cn] |\n| Audio tagging | [Address][apk-at] | [\u70b9\u6b64][apk-at-cn] |\n| Audio tagging (WearOS) | [Address][apk-at-wearos] | [\u70b9\u6b64][apk-at-wearos-cn] |\n| Speaker identification | [Address][apk-sid] | [\u70b9\u6b64][apk-sid-cn] |\n| Spoken language identification | [Address][apk-slid] | [\u70b9\u6b64][apk-slid-cn] |\n| Keyword spotting | [Address][apk-kws] | [\u70b9\u6b64][apk-kws-cn] |\n\n</details>\n\n### Links for pre-built Flutter APPs\n\n<details>\n\n#### Real-time speech recognition\n\n| Description | URL | \u4e2d\u56fd\u7528\u6237 |\n|--------------------------------|-------------------------------------|-------------------------------------|\n| Streaming speech recognition | [Address][apk-flutter-streaming-asr]| [\u70b9\u6b64][apk-flutter-streaming-asr-cn]|\n\n#### Text-to-speech\n\n| Description | URL | \u4e2d\u56fd\u7528\u6237 |\n|------------------------------------------|------------------------------------|------------------------------------|\n| Android (arm64-v8a, 
armeabi-v7a, x86_64) | [Address][flutter-tts-android] | [\u70b9\u6b64][flutter-tts-android-cn] |\n| Linux (x64) | [Address][flutter-tts-linux] | [\u70b9\u6b64][flutter-tts-linux-cn] |\n| macOS (x64) | [Address][flutter-tts-macos-x64] | [\u70b9\u6b64][flutter-tts-macos-arm64-cn] |\n| macOS (arm64) | [Address][flutter-tts-macos-arm64] | [\u70b9\u6b64][flutter-tts-macos-x64-cn] |\n| Windows (x64) | [Address][flutter-tts-win-x64] | [\u70b9\u6b64][flutter-tts-win-x64-cn] |\n\n> Note: You need to build from source for iOS.\n\n</details>\n\n### Links for pre-built Lazarus APPs\n\n<details>\n\n#### Generating subtitles\n\n| Description | URL | \u4e2d\u56fd\u7528\u6237 |\n|--------------------------------|----------------------------|----------------------------|\n| Generate subtitles (\u751f\u6210\u5b57\u5e55) | [Address][lazarus-subtitle]| [\u70b9\u6b64][lazarus-subtitle-cn]|\n\n</details>\n\n### Links for pre-trained models\n\n<details>\n\n| Description | URL |\n|---------------------------------------------|---------------------------------------------------------------------------------------|\n| Speech recognition (speech to text, ASR) | [Address][asr-models] |\n| Text-to-speech (TTS) | [Address][tts-models] |\n| VAD | [Address][vad-models] |\n| Keyword spotting | [Address][kws-models] |\n| Audio tagging | [Address][at-models] |\n| Speaker identification (Speaker ID) | [Address][sid-models] |\n| Spoken language identification (Language ID)| See multi-lingual [Whisper][Whisper] ASR models from [Speech recognition][asr-models]|\n| Punctuation | [Address][punct-models] |\n| Speaker segmentation | [Address][speaker-segmentation-models] |\n| Speech enhancement | [Address][speech-enhancement-models] |\n| Source separation | [Address][source-separation-models] |\n\n</details>\n\n#### Some pre-trained ASR models (Streaming)\n\n<details>\n\nPlease see\n\n - <https://k2-fsa.github.io/sherpa/onnx/pretrained_models/online-transducer/index.html>\n - <https://k2-fsa.github.io/sherpa/onnx/pretrained_models/online-paraformer/index.html>\n - <https://k2-fsa.github.io/sherpa/onnx/pretrained_models/online-ctc/index.html>\n\nfor more models. The following table lists only **SOME** of them.\n\n\n|Name | Supported Languages| Description|\n|-----|-----|----|\n|[sherpa-onnx-streaming-zipformer-bilingual-zh-en-2023-02-20][sherpa-onnx-streaming-zipformer-bilingual-zh-en-2023-02-20]| Chinese, English| See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/online-transducer/zipformer-transducer-models.html#csukuangfj-sherpa-onnx-streaming-zipformer-bilingual-zh-en-2023-02-20-bilingual-chinese-english)|\n|[sherpa-onnx-streaming-zipformer-small-bilingual-zh-en-2023-02-16][sherpa-onnx-streaming-zipformer-small-bilingual-zh-en-2023-02-16]| Chinese, English| See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/online-transducer/zipformer-transducer-models.html#sherpa-onnx-streaming-zipformer-small-bilingual-zh-en-2023-02-16-bilingual-chinese-english)|\n|[sherpa-onnx-streaming-zipformer-zh-14M-2023-02-23][sherpa-onnx-streaming-zipformer-zh-14M-2023-02-23]|Chinese| Suitable for Cortex A7 CPU. See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/online-transducer/zipformer-transducer-models.html#sherpa-onnx-streaming-zipformer-zh-14m-2023-02-23)|\n|[sherpa-onnx-streaming-zipformer-en-20M-2023-02-17][sherpa-onnx-streaming-zipformer-en-20M-2023-02-17]|English|Suitable for Cortex A7 CPU. 
See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/online-transducer/zipformer-transducer-models.html#sherpa-onnx-streaming-zipformer-en-20m-2023-02-17)|\n|[sherpa-onnx-streaming-zipformer-korean-2024-06-16][sherpa-onnx-streaming-zipformer-korean-2024-06-16]|Korean| See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/online-transducer/zipformer-transducer-models.html#sherpa-onnx-streaming-zipformer-korean-2024-06-16-korean)|\n|[sherpa-onnx-streaming-zipformer-fr-2023-04-14][sherpa-onnx-streaming-zipformer-fr-2023-04-14]|French| See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/online-transducer/zipformer-transducer-models.html#shaojieli-sherpa-onnx-streaming-zipformer-fr-2023-04-14-french)|\n\n</details>\n\n\n#### Some pre-trained ASR models (Non-Streaming)\n\n<details>\n\nPlease see\n\n - <https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-transducer/index.html>\n - <https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-paraformer/index.html>\n - <https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-ctc/index.html>\n - <https://k2-fsa.github.io/sherpa/onnx/pretrained_models/telespeech/index.html>\n - <https://k2-fsa.github.io/sherpa/onnx/pretrained_models/whisper/index.html>\n\nfor more models. The following table lists only **SOME** of them.\n\n|Name | Supported Languages| Description|\n|-----|-----|----|\n|[sherpa-onnx-nemo-parakeet-tdt-0.6b-v2-int8](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-transducer/nemo-transducer-models.html#sherpa-onnx-nemo-parakeet-tdt-0-6b-v2-int8-english)| English | It is converted from <https://huggingface.co/nvidia/parakeet-tdt-0.6b-v2>|\n|[Whisper tiny.en](https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-whisper-tiny.en.tar.bz2)|English| See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/whisper/tiny.en.html)|\n|[Moonshine tiny][Moonshine tiny]|English|See [also](https://github.com/usefulsensors/moonshine)|\n|[sherpa-onnx-zipformer-ctc-zh-int8-2025-07-03](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-ctc/icefall/zipformer.html#sherpa-onnx-zipformer-ctc-zh-int8-2025-07-03-chinese)|Chinese| A Zipformer CTC model|\n|[sherpa-onnx-sense-voice-zh-en-ja-ko-yue-2024-07-17][sherpa-onnx-sense-voice-zh-en-ja-ko-yue-2024-07-17]|Chinese, Cantonese, English, Korean, Japanese| \u652f\u6301\u591a\u79cd\u4e2d\u6587\u65b9\u8a00. See [also](https://k2-fsa.github.io/sherpa/onnx/sense-voice/index.html)|\n|[sherpa-onnx-paraformer-zh-2024-03-09][sherpa-onnx-paraformer-zh-2024-03-09]|Chinese, English| \u4e5f\u652f\u6301\u591a\u79cd\u4e2d\u6587\u65b9\u8a00. 
See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-paraformer/paraformer-models.html#csukuangfj-sherpa-onnx-paraformer-zh-2024-03-09-chinese-english)|\n|[sherpa-onnx-zipformer-ja-reazonspeech-2024-08-01][sherpa-onnx-zipformer-ja-reazonspeech-2024-08-01]|Japanese|See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-transducer/zipformer-transducer-models.html#sherpa-onnx-zipformer-ja-reazonspeech-2024-08-01-japanese)|\n|[sherpa-onnx-nemo-transducer-giga-am-russian-2024-10-24][sherpa-onnx-nemo-transducer-giga-am-russian-2024-10-24]|Russian|See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-transducer/nemo-transducer-models.html#sherpa-onnx-nemo-transducer-giga-am-russian-2024-10-24-russian)|\n|[sherpa-onnx-nemo-ctc-giga-am-russian-2024-10-24][sherpa-onnx-nemo-ctc-giga-am-russian-2024-10-24]|Russian| See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-ctc/nemo/russian.html#sherpa-onnx-nemo-ctc-giga-am-russian-2024-10-24)|\n|[sherpa-onnx-zipformer-ru-2024-09-18][sherpa-onnx-zipformer-ru-2024-09-18]|Russian|See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-transducer/zipformer-transducer-models.html#sherpa-onnx-zipformer-ru-2024-09-18-russian)|\n|[sherpa-onnx-zipformer-korean-2024-06-24][sherpa-onnx-zipformer-korean-2024-06-24]|Korean|See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-transducer/zipformer-transducer-models.html#sherpa-onnx-zipformer-korean-2024-06-24-korean)|\n|[sherpa-onnx-zipformer-thai-2024-06-20][sherpa-onnx-zipformer-thai-2024-06-20]|Thai| See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/offline-transducer/zipformer-transducer-models.html#sherpa-onnx-zipformer-thai-2024-06-20-thai)|\n|[sherpa-onnx-telespeech-ctc-int8-zh-2024-06-04][sherpa-onnx-telespeech-ctc-int8-zh-2024-06-04]|Chinese| \u652f\u6301\u591a\u79cd\u65b9\u8a00. 
See [also](https://k2-fsa.github.io/sherpa/onnx/pretrained_models/telespeech/models.html#sherpa-onnx-telespeech-ctc-int8-zh-2024-06-04)|\n\n</details>\n\n### Useful links\n\n- Documentation: https://k2-fsa.github.io/sherpa/onnx/\n- Bilibili demo videos: https://search.bilibili.com/all?keyword=%E6%96%B0%E4%B8%80%E4%BB%A3Kaldi\n\n### How to reach us\n\nPlease see\nhttps://k2-fsa.github.io/sherpa/social-groups.html\nfor the \u65b0\u4e00\u4ee3 Kaldi (next-gen Kaldi) **WeChat group** and **QQ group**.\n\n## Projects using sherpa-onnx\n\n### [BreezeApp](https://github.com/mtkresearch/BreezeApp) from [MediaTek Research](https://github.com/mtkresearch)\n\n> BreezeAPP is a mobile AI application developed for both Android and iOS platforms.\n> Users can download it directly from the App Store and enjoy a variety of features\n> offline, including speech-to-text, text-to-speech, text-based chatbot interactions,\n> and image question-answering.\n\n - [Download APK for BreezeAPP](https://huggingface.co/MediaTek-Research/BreezeApp/resolve/main/BreezeApp.apk)\n - [APK \u4e2d\u56fd\u955c\u50cf](https://hf-mirror.com/MediaTek-Research/BreezeApp/blob/main/BreezeApp.apk)\n\n### [Open-LLM-VTuber](https://github.com/t41372/Open-LLM-VTuber)\n\nTalk to any LLM with hands-free voice interaction, voice interruption, and Live2D talking\nface running locally across platforms.\n\nSee also <https://github.com/t41372/Open-LLM-VTuber/pull/50>\n\n### [voiceapi](https://github.com/ruzhila/voiceapi)\n\n<details>\n <summary>Streaming ASR and TTS based on FastAPI</summary>\n\n\nIt shows how to use the ASR and TTS Python APIs with FastAPI.\n</details>\n\n### [\u817e\u8baf\u4f1a\u8bae\u6478\u9c7c\u5de5\u5177 TMSpeech](https://github.com/jxlpzqc/TMSpeech)\n\nIt uses streaming ASR in C# with a graphical user interface.\n\nVideo demo in Chinese: [\u3010\u5f00\u6e90\u3011Windows\u5b9e\u65f6\u5b57\u5e55\u8f6f\u4ef6\uff08\u7f51\u8bfe/\u5f00\u4f1a\u5fc5\u5907\uff09](https://www.bilibili.com/video/BV1rX4y1p7Nx)\n\n### [lol\u4e92\u52a8\u52a9\u624b](https://github.com/l1veIn/lol-wom-electron)\n\nIt uses the JavaScript API of sherpa-onnx along with [Electron](https://electronjs.org/).\n\nVideo demo in Chinese: [\u7206\u4e86\uff01\u70ab\u795e\u6559\u4f60\u5f00\u6253\u5b57\u6302\uff01\u771f\u6b63\u5f71\u54cd\u80dc\u7387\u7684\u82f1\u96c4\u8054\u76df\u5de5\u5177\uff01\u82f1\u96c4\u8054\u76df\u7684\u6700\u540e\u4e00\u5757\u62fc\u56fe\uff01\u548c\u6e38\u620f\u4e2d\u7684\u6bcf\u4e2a\u4eba\u65e0\u969c\u788d\u6c9f\u901a\uff01](https://www.bilibili.com/video/BV142tje9E74)\n\n### [Sherpa-ONNX \u8bed\u97f3\u8bc6\u522b\u670d\u52a1\u5668](https://github.com/hfyydd/sherpa-onnx-server)\n\nA Node.js-based server providing a RESTful API for speech recognition.\n\n### [QSmartAssistant](https://github.com/xinhecuican/QSmartAssistant)\n\nA modular, fully offline, low-resource-usage chatbot/smart speaker.\n\nIt uses Qt. 
Both [ASR](https://github.com/xinhecuican/QSmartAssistant/blob/master/doc/%E5%AE%89%E8%A3%85.md#asr)\nand [TTS](https://github.com/xinhecuican/QSmartAssistant/blob/master/doc/%E5%AE%89%E8%A3%85.md#tts)\nare used.\n\n### [Flutter-EasySpeechRecognition](https://github.com/Jason-chen-coder/Flutter-EasySpeechRecognition)\n\nIt extends [./flutter-examples/streaming_asr](./flutter-examples/streaming_asr) by\ndownloading models inside the app to reduce the size of the app.\n\nNote: [[Team B] Sherpa AI backend](https://github.com/umgc/spring2025/pull/82) also uses\nsherpa-onnx in a Flutter APP.\n\n### [sherpa-onnx-unity](https://github.com/xue-fei/sherpa-onnx-unity)\n\nsherpa-onnx in Unity. See also [#1695](https://github.com/k2-fsa/sherpa-onnx/issues/1695),\n[#1892](https://github.com/k2-fsa/sherpa-onnx/issues/1892), and [#1859](https://github.com/k2-fsa/sherpa-onnx/issues/1859)\n\n### [xiaozhi-esp32-server](https://github.com/xinnan-tech/xiaozhi-esp32-server)\n\n\u672c\u9879\u76ee\u4e3axiaozhi-esp32\u63d0\u4f9b\u540e\u7aef\u670d\u52a1\uff0c\u5e2e\u52a9\u60a8\u5feb\u901f\u642d\u5efaESP32\u8bbe\u5907\u63a7\u5236\u670d\u52a1\u5668\nBackend service for xiaozhi-esp32, helps you quickly build an ESP32 device control server.\n\nSee also\n\n - [ASR\u65b0\u589e\u8f7b\u91cf\u7ea7sherpa-onnx-asr](https://github.com/xinnan-tech/xiaozhi-esp32-server/issues/315)\n - [feat: ASR\u589e\u52a0sherpa-onnx\u6a21\u578b](https://github.com/xinnan-tech/xiaozhi-esp32-server/pull/379)\n\n### [KaithemAutomation](https://github.com/EternityForest/KaithemAutomation)\n\nPure Python, GUI-focused home automation/consumer grade SCADA.\n\nIt uses TTS from sherpa-onnx. See also [\u2728 Speak command that uses the new globally configured TTS model.](https://github.com/EternityForest/KaithemAutomation/commit/8e64d2b138725e426532f7d66bb69dd0b4f53693)\n\n### [Open-XiaoAI KWS](https://github.com/idootop/open-xiaoai-kws)\n\nEnable custom wake word for XiaoAi Speakers. \u8ba9\u5c0f\u7231\u97f3\u7bb1\u652f\u6301\u81ea\u5b9a\u4e49\u5524\u9192\u8bcd\u3002\n\nVideo demo in Chinese: [\u5c0f\u7231\u540c\u5b66\u542f\u52a8\uff5e\u02f6\u2579\ua1f4\u2579\u02f6\uff01](https://www.bilibili.com/video/BV1YfVUz5EMj)\n\n### [C++ WebSocket ASR Server](https://github.com/mawwalker/stt-server)\n\nIt provides a WebSocket server based on C++ for ASR using sherpa-onnx.\n\n### [Go WebSocket Server](https://github.com/bbeyondllove/asr_server)\n\nIt provides a WebSocket server based on the Go programming language for sherpa-onnx.\n\n### [Making robot Paimon, Ep10 \"The AI Part 1\"](https://www.youtube.com/watch?v=KxPKkwxGWZs)\n\nIt is a [YouTube video](https://www.youtube.com/watch?v=KxPKkwxGWZs),\nshowing how the author tried to use AI so he can have a conversation with Paimon.\n\nIt uses sherpa-onnx for speech-to-text and text-to-speech.\n|1|\n|---|\n||\n\n### [TtsReader - Desktop application](https://github.com/ys-pro-duction/TtsReader)\n\nA desktop text-to-speech application built using Kotlin Multiplatform.\n\n### [MentraOS](https://github.com/Mentra-Community/MentraOS)\n\n> Smart glasses OS, with dozens of built-in apps. Users get AI assistant, notifications,\n> translation, screen mirror, captions, and more. 
Devs get to write 1 app that runs on\n> any pair of smart glasses.\n\nIt uses sherpa-onnx for real-time speech recognition on iOS and Android devices.\nSee also <https://github.com/Mentra-Community/MentraOS/pull/861>\n\nIt uses Swift for iOS and Java for Android.\n\n[sherpa-rs]: https://github.com/thewh1teagle/sherpa-rs\n[silero-vad]: https://github.com/snakers4/silero-vad\n[Raspberry Pi]: https://www.raspberrypi.com/\n[RV1126]: https://www.rock-chips.com/uploads/pdf/2022.8.26/191/RV1126%20Brief%20Datasheet.pdf\n[LicheePi4A]: https://sipeed.com/licheepi4a\n[VisionFive 2]: https://www.starfivetech.com/en/site/boards\n[\u65ed\u65e5X3\u6d3e]: https://developer.horizon.ai/api/v1/fileData/documents_pi/index.html\n[\u7231\u82af\u6d3e]: https://wiki.sipeed.com/hardware/zh/maixIII/ax-pi/axpi.html\n[hf-space-speaker-diarization]: https://huggingface.co/spaces/k2-fsa/speaker-diarization\n[hf-space-speaker-diarization-cn]: https://hf.qhduan.com/spaces/k2-fsa/speaker-diarization\n[hf-space-asr]: https://huggingface.co/spaces/k2-fsa/automatic-speech-recognition\n[hf-space-asr-cn]: https://hf.qhduan.com/spaces/k2-fsa/automatic-speech-recognition\n[Whisper]: https://github.com/openai/whisper\n[hf-space-asr-whisper]: https://huggingface.co/spaces/k2-fsa/automatic-speech-recognition-with-whisper\n[hf-space-asr-whisper-cn]: https://hf.qhduan.com/spaces/k2-fsa/automatic-speech-recognition-with-whisper\n[hf-space-tts]: https://huggingface.co/spaces/k2-fsa/text-to-speech\n[hf-space-tts-cn]: https://hf.qhduan.com/spaces/k2-fsa/text-to-speech\n[hf-space-subtitle]: https://huggingface.co/spaces/k2-fsa/generate-subtitles-for-videos\n[hf-space-subtitle-cn]: https://hf.qhduan.com/spaces/k2-fsa/generate-subtitles-for-videos\n[hf-space-audio-tagging]: https://huggingface.co/spaces/k2-fsa/audio-tagging\n[hf-space-audio-tagging-cn]: https://hf.qhduan.com/spaces/k2-fsa/audio-tagging\n[hf-space-source-separation]: https://huggingface.co/spaces/k2-fsa/source-separation\n[hf-space-source-separation-cn]: https://hf.qhduan.com/spaces/k2-fsa/source-separation\n[hf-space-slid-whisper]: https://huggingface.co/spaces/k2-fsa/spoken-language-identification\n[hf-space-slid-whisper-cn]: https://hf.qhduan.com/spaces/k2-fsa/spoken-language-identification\n[wasm-hf-vad]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-sherpa-onnx\n[wasm-ms-vad]: https://modelscope.cn/studios/csukuangfj/web-assembly-vad-sherpa-onnx\n[wasm-hf-streaming-asr-zh-en-zipformer]: https://huggingface.co/spaces/k2-fsa/web-assembly-asr-sherpa-onnx-zh-en\n[wasm-ms-streaming-asr-zh-en-zipformer]: https://modelscope.cn/studios/k2-fsa/web-assembly-asr-sherpa-onnx-zh-en\n[wasm-hf-streaming-asr-zh-en-paraformer]: https://huggingface.co/spaces/k2-fsa/web-assembly-asr-sherpa-onnx-zh-en-paraformer\n[wasm-ms-streaming-asr-zh-en-paraformer]: https://modelscope.cn/studios/k2-fsa/web-assembly-asr-sherpa-onnx-zh-en-paraformer\n[Paraformer-large]: https://www.modelscope.cn/models/damo/speech_paraformer-large_asr_nat-zh-cn-16k-common-vocab8404-pytorch/summary\n[wasm-hf-streaming-asr-zh-en-yue-paraformer]: https://huggingface.co/spaces/k2-fsa/web-assembly-asr-sherpa-onnx-zh-cantonese-en-paraformer\n[wasm-ms-streaming-asr-zh-en-yue-paraformer]: https://modelscope.cn/studios/k2-fsa/web-assembly-asr-sherpa-onnx-zh-cantonese-en-paraformer\n[wasm-hf-streaming-asr-en-zipformer]: https://huggingface.co/spaces/k2-fsa/web-assembly-asr-sherpa-onnx-en\n[wasm-ms-streaming-asr-en-zipformer]: https://modelscope.cn/studios/k2-fsa/web-assembly-asr-sherpa-onnx-en\n[SenseVoice]: 
https://github.com/FunAudioLLM/SenseVoice\n[wasm-hf-vad-asr-zh-zipformer-ctc-07-03]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-asr-sherpa-onnx-zh-zipformer-ctc\n[wasm-ms-vad-asr-zh-zipformer-ctc-07-03]: https://modelscope.cn/studios/csukuangfj/web-assembly-vad-asr-sherpa-onnx-zh-zipformer-ctc/summary\n[wasm-hf-vad-asr-zh-en-ko-ja-yue-sense-voice]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-asr-sherpa-onnx-zh-en-ja-ko-cantonese-sense-voice\n[wasm-ms-vad-asr-zh-en-ko-ja-yue-sense-voice]: https://www.modelscope.cn/studios/csukuangfj/web-assembly-vad-asr-sherpa-onnx-zh-en-jp-ko-cantonese-sense-voice\n[wasm-hf-vad-asr-en-whisper-tiny-en]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-asr-sherpa-onnx-en-whisper-tiny\n[wasm-ms-vad-asr-en-whisper-tiny-en]: https://www.modelscope.cn/studios/csukuangfj/web-assembly-vad-asr-sherpa-onnx-en-whisper-tiny\n[wasm-hf-vad-asr-en-moonshine-tiny-en]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-asr-sherpa-onnx-en-moonshine-tiny\n[wasm-ms-vad-asr-en-moonshine-tiny-en]: https://www.modelscope.cn/studios/csukuangfj/web-assembly-vad-asr-sherpa-onnx-en-moonshine-tiny\n[wasm-hf-vad-asr-en-zipformer-gigaspeech]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-asr-sherpa-onnx-en-zipformer-gigaspeech\n[wasm-ms-vad-asr-en-zipformer-gigaspeech]: https://www.modelscope.cn/studios/k2-fsa/web-assembly-vad-asr-sherpa-onnx-en-zipformer-gigaspeech\n[wasm-hf-vad-asr-zh-zipformer-wenetspeech]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-asr-sherpa-onnx-zh-zipformer-wenetspeech\n[wasm-ms-vad-asr-zh-zipformer-wenetspeech]: https://www.modelscope.cn/studios/k2-fsa/web-assembly-vad-asr-sherpa-onnx-zh-zipformer-wenetspeech\n[reazonspeech]: https://research.reazon.jp/_static/reazonspeech_nlp2023.pdf\n[wasm-hf-vad-asr-ja-zipformer-reazonspeech]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-asr-sherpa-onnx-ja-zipformer\n[wasm-ms-vad-asr-ja-zipformer-reazonspeech]: https://www.modelscope.cn/studios/csukuangfj/web-assembly-vad-asr-sherpa-onnx-ja-zipformer\n[gigaspeech2]: https://github.com/speechcolab/gigaspeech2\n[wasm-hf-vad-asr-th-zipformer-gigaspeech2]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-asr-sherpa-onnx-th-zipformer\n[wasm-ms-vad-asr-th-zipformer-gigaspeech2]: https://www.modelscope.cn/studios/csukuangfj/web-assembly-vad-asr-sherpa-onnx-th-zipformer\n[telespeech-asr]: https://github.com/tele-ai/telespeech-asr\n[wasm-hf-vad-asr-zh-telespeech]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-asr-sherpa-onnx-zh-telespeech\n[wasm-ms-vad-asr-zh-telespeech]: https://www.modelscope.cn/studios/k2-fsa/web-assembly-vad-asr-sherpa-onnx-zh-telespeech\n[wasm-hf-vad-asr-zh-en-paraformer-large]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-asr-sherpa-onnx-zh-en-paraformer\n[wasm-ms-vad-asr-zh-en-paraformer-large]: https://www.modelscope.cn/studios/k2-fsa/web-assembly-vad-asr-sherpa-onnx-zh-en-paraformer\n[wasm-hf-vad-asr-zh-en-paraformer-small]: https://huggingface.co/spaces/k2-fsa/web-assembly-vad-asr-sherpa-onnx-zh-en-paraformer-small\n[wasm-ms-vad-asr-zh-en-paraformer-small]: https://www.modelscope.cn/studios/k2-fsa/web-assembly-vad-asr-sherpa-onnx-zh-en-paraformer-small\n[dolphin]: https://github.com/dataoceanai/dolphin\n[wasm-ms-vad-asr-multi-lang-dolphin-base]: https://modelscope.cn/studios/csukuangfj/web-assembly-vad-asr-sherpa-onnx-multi-lang-dophin-ctc\n[wasm-hf-vad-asr-multi-lang-dolphin-base]: 
https://huggingface.co/spaces/k2-fsa/web-assembly-vad-asr-sherpa-onnx-multi-lang-dophin-ctc\n\n[wasm-hf-tts-piper-en]: https://huggingface.co/spaces/k2-fsa/web-assembly-tts-sherpa-onnx-en\n[wasm-ms-tts-piper-en]: https://modelscope.cn/studios/k2-fsa/web-assembly-tts-sherpa-onnx-en\n[wasm-hf-tts-piper-de]: https://huggingface.co/spaces/k2-fsa/web-assembly-tts-sherpa-onnx-de\n[wasm-ms-tts-piper-de]: https://modelscope.cn/studios/k2-fsa/web-assembly-tts-sherpa-onnx-de\n[wasm-hf-speaker-diarization]: https://huggingface.co/spaces/k2-fsa/web-assembly-speaker-diarization-sherpa-onnx\n[wasm-ms-speaker-diarization]: https://www.modelscope.cn/studios/csukuangfj/web-assembly-speaker-diarization-sherpa-onnx\n[apk-speaker-diarization]: https://k2-fsa.github.io/sherpa/onnx/speaker-diarization/apk.html\n[apk-speaker-diarization-cn]: https://k2-fsa.github.io/sherpa/onnx/speaker-diarization/apk-cn.html\n[apk-streaming-asr]: https://k2-fsa.github.io/sherpa/onnx/android/apk.html\n[apk-streaming-asr-cn]: https://k2-fsa.github.io/sherpa/onnx/android/apk-cn.html\n[apk-simula-streaming-asr]: https://k2-fsa.github.io/sherpa/onnx/android/apk-simulate-streaming-asr.html\n[apk-simula-streaming-asr-cn]: https://k2-fsa.github.io/sherpa/onnx/android/apk-simulate-streaming-asr-cn.html\n[apk-tts]: https://k2-fsa.github.io/sherpa/onnx/tts/apk-engine.html\n[apk-tts-cn]: https://k2-fsa.github.io/sherpa/onnx/tts/apk-engine-cn.html\n[apk-vad]: https://k2-fsa.github.io/sherpa/onnx/vad/apk.html\n[apk-vad-cn]: https://k2-fsa.github.io/sherpa/onnx/vad/apk-cn.html\n[apk-vad-asr]: https://k2-fsa.github.io/sherpa/onnx/vad/apk-asr.html\n[apk-vad-asr-cn]: https://k2-fsa.github.io/sherpa/onnx/vad/apk-asr-cn.html\n[apk-2pass]: https://k2-fsa.github.io/sherpa/onnx/android/apk-2pass.html\n[apk-2pass-cn]: https://k2-fsa.github.io/sherpa/onnx/android/apk-2pass-cn.html\n[apk-at]: https://k2-fsa.github.io/sherpa/onnx/audio-tagging/apk.html\n[apk-at-cn]: https://k2-fsa.github.io/sherpa/onnx/audio-tagging/apk-cn.html\n[apk-at-wearos]: https://k2-fsa.github.io/sherpa/onnx/audio-tagging/apk-wearos.html\n[apk-at-wearos-cn]: https://k2-fsa.github.io/sherpa/onnx/audio-tagging/apk-wearos-cn.html\n[apk-sid]: https://k2-fsa.github.io/sherpa/onnx/speaker-identification/apk.html\n[apk-sid-cn]: https://k2-fsa.github.io/sherpa/onnx/speaker-identification/apk-cn.html\n[apk-slid]: https://k2-fsa.github.io/sherpa/onnx/spoken-language-identification/apk.html\n[apk-slid-cn]: https://k2-fsa.github.io/sherpa/onnx/spoken-language-identification/apk-cn.html\n[apk-kws]: https://k2-fsa.github.io/sherpa/onnx/kws/apk.html\n[apk-kws-cn]: https://k2-fsa.github.io/sherpa/onnx/kws/apk-cn.html\n[apk-flutter-streaming-asr]: https://k2-fsa.github.io/sherpa/onnx/flutter/asr/app.html\n[apk-flutter-streaming-asr-cn]: https://k2-fsa.github.io/sherpa/onnx/flutter/asr/app-cn.html\n[flutter-tts-android]: https://k2-fsa.github.io/sherpa/onnx/flutter/tts-android.html\n[flutter-tts-android-cn]: https://k2-fsa.github.io/sherpa/onnx/flutter/tts-android-cn.html\n[flutter-tts-linux]: https://k2-fsa.github.io/sherpa/onnx/flutter/tts-linux.html\n[flutter-tts-linux-cn]: https://k2-fsa.github.io/sherpa/onnx/flutter/tts-linux-cn.html\n[flutter-tts-macos-x64]: https://k2-fsa.github.io/sherpa/onnx/flutter/tts-macos-x64.html\n[flutter-tts-macos-arm64-cn]: https://k2-fsa.github.io/sherpa/onnx/flutter/tts-macos-x64-cn.html\n[flutter-tts-macos-arm64]: https://k2-fsa.github.io/sherpa/onnx/flutter/tts-macos-arm64.html\n[flutter-tts-macos-x64-cn]: 
https://k2-fsa.github.io/sherpa/onnx/flutter/tts-macos-arm64-cn.html\n[flutter-tts-win-x64]: https://k2-fsa.github.io/sherpa/onnx/flutter/tts-win.html\n[flutter-tts-win-x64-cn]: https://k2-fsa.github.io/sherpa/onnx/flutter/tts-win-cn.html\n[lazarus-subtitle]: https://k2-fsa.github.io/sherpa/onnx/lazarus/download-generated-subtitles.html\n[lazarus-subtitle-cn]: https://k2-fsa.github.io/sherpa/onnx/lazarus/download-generated-subtitles-cn.html\n[asr-models]: https://github.com/k2-fsa/sherpa-onnx/releases/tag/asr-models\n[tts-models]: https://github.com/k2-fsa/sherpa-onnx/releases/tag/tts-models\n[vad-models]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/silero_vad.onnx\n[kws-models]: https://github.com/k2-fsa/sherpa-onnx/releases/tag/kws-models\n[at-models]: https://github.com/k2-fsa/sherpa-onnx/releases/tag/audio-tagging-models\n[sid-models]: https://github.com/k2-fsa/sherpa-onnx/releases/tag/speaker-recongition-models\n[slid-models]: https://github.com/k2-fsa/sherpa-onnx/releases/tag/speaker-recongition-models\n[punct-models]: https://github.com/k2-fsa/sherpa-onnx/releases/tag/punctuation-models\n[speaker-segmentation-models]: https://github.com/k2-fsa/sherpa-onnx/releases/tag/speaker-segmentation-models\n[GigaSpeech]: https://github.com/SpeechColab/GigaSpeech\n[WenetSpeech]: https://github.com/wenet-e2e/WenetSpeech\n[sherpa-onnx-streaming-zipformer-bilingual-zh-en-2023-02-20]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-streaming-zipformer-bilingual-zh-en-2023-02-20.tar.bz2\n[sherpa-onnx-streaming-zipformer-small-bilingual-zh-en-2023-02-16]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-streaming-zipformer-small-bilingual-zh-en-2023-02-16.tar.bz2\n[sherpa-onnx-streaming-zipformer-korean-2024-06-16]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-streaming-zipformer-korean-2024-06-16.tar.bz2\n[sherpa-onnx-streaming-zipformer-zh-14M-2023-02-23]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-streaming-zipformer-zh-14M-2023-02-23.tar.bz2\n[sherpa-onnx-streaming-zipformer-en-20M-2023-02-17]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-streaming-zipformer-en-20M-2023-02-17.tar.bz2\n[sherpa-onnx-zipformer-ja-reazonspeech-2024-08-01]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-zipformer-ja-reazonspeech-2024-08-01.tar.bz2\n[sherpa-onnx-zipformer-ru-2024-09-18]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-zipformer-ru-2024-09-18.tar.bz2\n[sherpa-onnx-zipformer-korean-2024-06-24]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-zipformer-korean-2024-06-24.tar.bz2\n[sherpa-onnx-zipformer-thai-2024-06-20]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-zipformer-thai-2024-06-20.tar.bz2\n[sherpa-onnx-nemo-transducer-giga-am-russian-2024-10-24]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-nemo-transducer-giga-am-russian-2024-10-24.tar.bz2\n[sherpa-onnx-paraformer-zh-2024-03-09]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-paraformer-zh-2024-03-09.tar.bz2\n[sherpa-onnx-nemo-ctc-giga-am-russian-2024-10-24]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-nemo-ctc-giga-am-russian-2024-10-24.tar.bz2\n[sherpa-onnx-telespeech-ctc-int8-zh-2024-06-04]: 
https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-telespeech-ctc-int8-zh-2024-06-04.tar.bz2\n[sherpa-onnx-sense-voice-zh-en-ja-ko-yue-2024-07-17]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-sense-voice-zh-en-ja-ko-yue-2024-07-17.tar.bz2\n[sherpa-onnx-streaming-zipformer-fr-2023-04-14]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-streaming-zipformer-fr-2023-04-14.tar.bz2\n[Moonshine tiny]: https://github.com/k2-fsa/sherpa-onnx/releases/download/asr-models/sherpa-onnx-moonshine-tiny-en-int8.tar.bz2\n[NVIDIA Jetson Orin NX]: https://developer.download.nvidia.com/assets/embedded/secure/jetson/orin_nx/docs/Jetson_Orin_NX_DS-10712-001_v0.5.pdf?RCPGu9Q6OVAOv7a7vgtwc9-BLScXRIWq6cSLuditMALECJ_dOj27DgnqAPGVnT2VpiNpQan9SyFy-9zRykR58CokzbXwjSA7Gj819e91AXPrWkGZR3oS1VLxiDEpJa_Y0lr7UT-N4GnXtb8NlUkP4GkCkkF_FQivGPrAucCUywL481GH_WpP_p7ziHU1Wg==&t=eyJscyI6ImdzZW8iLCJsc2QiOiJodHRwczovL3d3dy5nb29nbGUuY29tLmhrLyJ9\n[NVIDIA Jetson Nano B01]: https://www.seeedstudio.com/blog/2020/01/16/new-revision-of-jetson-nano-dev-kit-now-supports-new-jetson-nano-module/\n[speech-enhancement-models]: https://github.com/k2-fsa/sherpa-onnx/releases/tag/speech-enhancement-models\n[source-separation-models]: https://github.com/k2-fsa/sherpa-onnx/releases/tag/source-separation-models\n[RK3588]: https://www.rock-chips.com/uploads/pdf/2022.8.26/192/RK3588%20Brief%20Datasheet.pdf\n[spleeter]: https://github.com/deezer/spleeter\n[UVR]: https://github.com/Anjok07/ultimatevocalremovergui\n[gtcrn]: https://github.com/Xiaobin-Rong/gtcrn\n[tts-url]: https://k2-fsa.github.io/sherpa/onnx/tts/all-in-one.html\n[ss-url]: https://k2-fsa.github.io/sherpa/onnx/source-separation/index.html\n[sd-url]: https://k2-fsa.github.io/sherpa/onnx/speaker-diarization/index.html\n[slid-url]: https://k2-fsa.github.io/sherpa/onnx/spoken-language-identification/index.html\n[at-url]: https://k2-fsa.github.io/sherpa/onnx/audio-tagging/index.html\n[vad-url]: https://k2-fsa.github.io/sherpa/onnx/vad/index.html\n[kws-url]: https://k2-fsa.github.io/sherpa/onnx/kws/index.html\n[punct-url]: https://k2-fsa.github.io/sherpa/onnx/punctuation/index.html\n[se-url]: https://k2-fsa.github.io/sherpa/onnx/speech-enhancment/index.html\n",
"bugtrack_url": null,
"license": "Apache licensed, as found in the LICENSE file",
"summary": null,
"version": "1.12.13",
"project_urls": {
"Homepage": "https://github.com/k2-fsa/sherpa-onnx"
},
"split_keywords": [],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "70799f8985c12399e6b5d6194aee070af5a18c7b6e66e18f834e958619402b4d",
"md5": "cdc419b31090199ea02801249cd94ced",
"sha256": "852c0bb9c43e3855bf8f3c258abef1a8819fb41d207de18b582307905157db8d"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp310-cp310-macosx_10_15_universal2.whl",
"has_sig": false,
"md5_digest": "cdc419b31090199ea02801249cd94ced",
"packagetype": "bdist_wheel",
"python_version": "cp310",
"requires_python": ">=3.7",
"size": 3890544,
"upload_time": "2025-09-12T09:19:18",
"upload_time_iso_8601": "2025-09-12T09:19:18.682890Z",
"url": "https://files.pythonhosted.org/packages/70/79/9f8985c12399e6b5d6194aee070af5a18c7b6e66e18f834e958619402b4d/sherpa_onnx-1.12.13-cp310-cp310-macosx_10_15_universal2.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "927d141012cfe68e514e3130356e3dd1ddfacfd4e5fd80903a1265e812d731f0",
"md5": "e4415b299447c1d8b081f6fc98a007e2",
"sha256": "364c852785a4edb89e45e7decd54f744043976c083e4d7c81e1c8bf9ab890808"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp310-cp310-macosx_10_15_x86_64.whl",
"has_sig": false,
"md5_digest": "e4415b299447c1d8b081f6fc98a007e2",
"packagetype": "bdist_wheel",
"python_version": "cp310",
"requires_python": ">=3.7",
"size": 2057025,
"upload_time": "2025-09-12T09:08:09",
"upload_time_iso_8601": "2025-09-12T09:08:09.171626Z",
"url": "https://files.pythonhosted.org/packages/92/7d/141012cfe68e514e3130356e3dd1ddfacfd4e5fd80903a1265e812d731f0/sherpa_onnx-1.12.13-cp310-cp310-macosx_10_15_x86_64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "f0756649f43f76a626c5c934fad3a5bd97677453cbfd639ef4587d031d9c84a8",
"md5": "519687feee4ffdfff59a8e81a0834ecf",
"sha256": "2d74800f57482e8a13768befadcbcc6e2355177bbe64649c2bf072948321297f"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp310-cp310-macosx_11_0_arm64.whl",
"has_sig": false,
"md5_digest": "519687feee4ffdfff59a8e81a0834ecf",
"packagetype": "bdist_wheel",
"python_version": "cp310",
"requires_python": ">=3.7",
"size": 1864125,
"upload_time": "2025-09-12T08:50:15",
"upload_time_iso_8601": "2025-09-12T08:50:15.826874Z",
"url": "https://files.pythonhosted.org/packages/f0/75/6649f43f76a626c5c934fad3a5bd97677453cbfd639ef4587d031d9c84a8/sherpa_onnx-1.12.13-cp310-cp310-macosx_11_0_arm64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "23970684994ee897cdc0cc3e464576f168cfd94e86512252302cdf34e3b489be",
"md5": "c4c948bfdc39862e7199246e76839a20",
"sha256": "7dd584d93b616ca6e24508cfb32d3c4c85ac30e9e92fb255a45b205fe4a5066a"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.whl",
"has_sig": false,
"md5_digest": "c4c948bfdc39862e7199246e76839a20",
"packagetype": "bdist_wheel",
"python_version": "cp310",
"requires_python": ">=3.7",
"size": 3930745,
"upload_time": "2025-09-12T08:51:17",
"upload_time_iso_8601": "2025-09-12T08:51:17.194777Z",
"url": "https://files.pythonhosted.org/packages/23/97/0684994ee897cdc0cc3e464576f168cfd94e86512252302cdf34e3b489be/sherpa_onnx-1.12.13-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "61ed9a7678a4df5304e370f5f9fe88b8a5d02869107dd2abf61ff677d6af355e",
"md5": "bbe13df2588326b01e3c463dc1f6835d",
"sha256": "7af7285d136df0340d149e212db839a0bb0c97587d78abcca2d41afb4999d4ba"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl",
"has_sig": false,
"md5_digest": "bbe13df2588326b01e3c463dc1f6835d",
"packagetype": "bdist_wheel",
"python_version": "cp310",
"requires_python": ">=3.7",
"size": 4111400,
"upload_time": "2025-09-12T09:09:55",
"upload_time_iso_8601": "2025-09-12T09:09:55.948803Z",
"url": "https://files.pythonhosted.org/packages/61/ed/9a7678a4df5304e370f5f9fe88b8a5d02869107dd2abf61ff677d6af355e/sherpa_onnx-1.12.13-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "3e58a3e6e1c2988176e92c9f5b1b8cd29720513fc2d37f1412a17a6484ab27c2",
"md5": "c0d20a208b1db4c81b929b0b9433c44c",
"sha256": "5ebbd5241a639dc6cc3865210b2a16e2e50e8d128b509a49366a7c535adcecb8"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp310-cp310-win32.whl",
"has_sig": false,
"md5_digest": "c0d20a208b1db4c81b929b0b9433c44c",
"packagetype": "bdist_wheel",
"python_version": "cp310",
"requires_python": ">=3.7",
"size": 1640756,
"upload_time": "2025-09-12T09:14:08",
"upload_time_iso_8601": "2025-09-12T09:14:08.823227Z",
"url": "https://files.pythonhosted.org/packages/3e/58/a3e6e1c2988176e92c9f5b1b8cd29720513fc2d37f1412a17a6484ab27c2/sherpa_onnx-1.12.13-cp310-cp310-win32.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "49b4c04c305937dd07ae3fa32b3f76fa33f163de1c3c8e474b576b832f40ae85",
"md5": "96436060807670736f9d4ce34cbb4f57",
"sha256": "6b18499a4a3be41faf167cf20dc1c942a4b069e60579f81f426babb0afea05bf"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp310-cp310-win_amd64.whl",
"has_sig": false,
"md5_digest": "96436060807670736f9d4ce34cbb4f57",
"packagetype": "bdist_wheel",
"python_version": "cp310",
"requires_python": ">=3.7",
"size": 1940763,
"upload_time": "2025-09-12T09:06:57",
"upload_time_iso_8601": "2025-09-12T09:06:57.149466Z",
"url": "https://files.pythonhosted.org/packages/49/b4/c04c305937dd07ae3fa32b3f76fa33f163de1c3c8e474b576b832f40ae85/sherpa_onnx-1.12.13-cp310-cp310-win_amd64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "192c6e3df02b8928cee858501bac03ec5456996bbe69e0db4476c44ff29901a5",
"md5": "902abd41b013f0f0ad1d2287d075a82c",
"sha256": "a0b6cac5d86a2e8ddef6a91bbe8013eed41b63c9fbcd9c43a41df69a1996ace9"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp311-cp311-macosx_10_15_x86_64.whl",
"has_sig": false,
"md5_digest": "902abd41b013f0f0ad1d2287d075a82c",
"packagetype": "bdist_wheel",
"python_version": "cp311",
"requires_python": ">=3.7",
"size": 2057955,
"upload_time": "2025-09-12T09:03:56",
"upload_time_iso_8601": "2025-09-12T09:03:56.365464Z",
"url": "https://files.pythonhosted.org/packages/19/2c/6e3df02b8928cee858501bac03ec5456996bbe69e0db4476c44ff29901a5/sherpa_onnx-1.12.13-cp311-cp311-macosx_10_15_x86_64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "0852dde854f0aa363f2786c522379841719511283b1a119eeae18bb8f4d1377c",
"md5": "6c7b85dd2a9a4c8a1bb016b1cfbd3178",
"sha256": "73795631908e6cd0db0ac4ef3640009e391d5140d7f1ac951b58c1de3ca2d76c"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp311-cp311-macosx_11_0_arm64.whl",
"has_sig": false,
"md5_digest": "6c7b85dd2a9a4c8a1bb016b1cfbd3178",
"packagetype": "bdist_wheel",
"python_version": "cp311",
"requires_python": ">=3.7",
"size": 1866127,
"upload_time": "2025-09-12T09:12:39",
"upload_time_iso_8601": "2025-09-12T09:12:39.564766Z",
"url": "https://files.pythonhosted.org/packages/08/52/dde854f0aa363f2786c522379841719511283b1a119eeae18bb8f4d1377c/sherpa_onnx-1.12.13-cp311-cp311-macosx_11_0_arm64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "5923e789ffa9b64b4fb65b3249dd9ed58d38b719f8218515a7df18ca9151c28d",
"md5": "b3dcbc998ecf5c3879fab1cc2bc2a1c2",
"sha256": "1b583d560b6311ee1843bb9ee0a35107332d0efa91f420d944e716cd0d4926d8"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl",
"has_sig": false,
"md5_digest": "b3dcbc998ecf5c3879fab1cc2bc2a1c2",
"packagetype": "bdist_wheel",
"python_version": "cp311",
"requires_python": ">=3.7",
"size": 3930531,
"upload_time": "2025-09-12T09:02:49",
"upload_time_iso_8601": "2025-09-12T09:02:49.174288Z",
"url": "https://files.pythonhosted.org/packages/59/23/e789ffa9b64b4fb65b3249dd9ed58d38b719f8218515a7df18ca9151c28d/sherpa_onnx-1.12.13-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "45446e13494dc442ed4ede171d2dfe837c37ce8e500f6b7e5baf636f58610827",
"md5": "3c5f81b3af24011c8ae9ce633df8f302",
"sha256": "560b79294f6bfe4043334ede7660585b8b7de54b739d721049b0bdf301aa9281"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl",
"has_sig": false,
"md5_digest": "3c5f81b3af24011c8ae9ce633df8f302",
"packagetype": "bdist_wheel",
"python_version": "cp311",
"requires_python": ">=3.7",
"size": 4111668,
"upload_time": "2025-09-12T08:56:04",
"upload_time_iso_8601": "2025-09-12T08:56:04.229045Z",
"url": "https://files.pythonhosted.org/packages/45/44/6e13494dc442ed4ede171d2dfe837c37ce8e500f6b7e5baf636f58610827/sherpa_onnx-1.12.13-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "884be3627c2a725a934d76133665f0743895b3e36a1d1a8d1effba4cd41f636c",
"md5": "9ef7bdf099e9879b84c0dc8598afcb34",
"sha256": "c4cad9857633a27e9fda69e8d49986cf6a9b5c7f06f9523b99bd6bd6cc595988"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp311-cp311-win32.whl",
"has_sig": false,
"md5_digest": "9ef7bdf099e9879b84c0dc8598afcb34",
"packagetype": "bdist_wheel",
"python_version": "cp311",
"requires_python": ">=3.7",
"size": 1639281,
"upload_time": "2025-09-12T09:06:06",
"upload_time_iso_8601": "2025-09-12T09:06:06.516075Z",
"url": "https://files.pythonhosted.org/packages/88/4b/e3627c2a725a934d76133665f0743895b3e36a1d1a8d1effba4cd41f636c/sherpa_onnx-1.12.13-cp311-cp311-win32.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "b7f20586dfced8d48ad3530628b8bdcdb74f1bb87fb00ba92e7c8b7ea2487f46",
"md5": "5d554537ddb9d0b72ab6824d6d168f90",
"sha256": "c5788923e979f9e526c65830e40879fdbdcf5ddc5c6c2a4766aee7320ce8b7f6"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp311-cp311-win_amd64.whl",
"has_sig": false,
"md5_digest": "5d554537ddb9d0b72ab6824d6d168f90",
"packagetype": "bdist_wheel",
"python_version": "cp311",
"requires_python": ">=3.7",
"size": 1941895,
"upload_time": "2025-09-12T09:13:45",
"upload_time_iso_8601": "2025-09-12T09:13:45.923673Z",
"url": "https://files.pythonhosted.org/packages/b7/f2/0586dfced8d48ad3530628b8bdcdb74f1bb87fb00ba92e7c8b7ea2487f46/sherpa_onnx-1.12.13-cp311-cp311-win_amd64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "e4989ededffee1bcb42e5498eaf14a0b4a346fa05fd1012fa5e9291397df0f95",
"md5": "2ee2bd1799dc35388d4a9831bacfcfe6",
"sha256": "71025a8244f8d6481502b8eb794164daae4bf1510f8b455325b69a2360cd3af8"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp312-cp312-macosx_10_15_universal2.whl",
"has_sig": false,
"md5_digest": "2ee2bd1799dc35388d4a9831bacfcfe6",
"packagetype": "bdist_wheel",
"python_version": "cp312",
"requires_python": ">=3.7",
"size": 3914899,
"upload_time": "2025-09-12T08:59:40",
"upload_time_iso_8601": "2025-09-12T08:59:40.223849Z",
"url": "https://files.pythonhosted.org/packages/e4/98/9ededffee1bcb42e5498eaf14a0b4a346fa05fd1012fa5e9291397df0f95/sherpa_onnx-1.12.13-cp312-cp312-macosx_10_15_universal2.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "69cea3320af4f8a475cca8c820da1b572e0961b051f1b5ff80bc03de13214859",
"md5": "95e797b066684f739ae9cf4fe101c88b",
"sha256": "1b90924cc13224efb0dc285cc4af225172c18049d9fdae97bb93eeb97839b21d"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp312-cp312-macosx_10_15_x86_64.whl",
"has_sig": false,
"md5_digest": "95e797b066684f739ae9cf4fe101c88b",
"packagetype": "bdist_wheel",
"python_version": "cp312",
"requires_python": ">=3.7",
"size": 2074276,
"upload_time": "2025-09-12T08:51:22",
"upload_time_iso_8601": "2025-09-12T08:51:22.834965Z",
"url": "https://files.pythonhosted.org/packages/69/ce/a3320af4f8a475cca8c820da1b572e0961b051f1b5ff80bc03de13214859/sherpa_onnx-1.12.13-cp312-cp312-macosx_10_15_x86_64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "4174c00541afb2c22492bc1cfa3bd53ec72e91dbc21abd27d8ca6d8caa61cf93",
"md5": "edba548cc62b9a344aff8d663bf5ba26",
"sha256": "8098e3bab8869aabde70d2f585f2f4e91a6a6d7e0aeb06daddc4845ddec8b322"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp312-cp312-macosx_11_0_arm64.whl",
"has_sig": false,
"md5_digest": "edba548cc62b9a344aff8d663bf5ba26",
"packagetype": "bdist_wheel",
"python_version": "cp312",
"requires_python": ">=3.7",
"size": 1871622,
"upload_time": "2025-09-12T08:57:46",
"upload_time_iso_8601": "2025-09-12T08:57:46.406086Z",
"url": "https://files.pythonhosted.org/packages/41/74/c00541afb2c22492bc1cfa3bd53ec72e91dbc21abd27d8ca6d8caa61cf93/sherpa_onnx-1.12.13-cp312-cp312-macosx_11_0_arm64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "8cce9e706c7f9d1abc550df8af509b40c61924f102f134904350600d5c85f957",
"md5": "8315c90e60a6074968bcc23b553d3982",
"sha256": "792951c064afa4c693d39f07b065a117ca6f74e2d0512f95ae097c047ce08cdf"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl",
"has_sig": false,
"md5_digest": "8315c90e60a6074968bcc23b553d3982",
"packagetype": "bdist_wheel",
"python_version": "cp312",
"requires_python": ">=3.7",
"size": 3931928,
"upload_time": "2025-09-12T08:46:15",
"upload_time_iso_8601": "2025-09-12T08:46:15.272494Z",
"url": "https://files.pythonhosted.org/packages/8c/ce/9e706c7f9d1abc550df8af509b40c61924f102f134904350600d5c85f957/sherpa_onnx-1.12.13-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "567c20888aeb07aed48826660569339fefb328290090997f582e3c63a000f271",
"md5": "090c7a17335db4394a695fc7c5b60e61",
"sha256": "8b170f2ff6670c856797160469080e52abbff5f45be8913441fba918932a4297"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl",
"has_sig": false,
"md5_digest": "090c7a17335db4394a695fc7c5b60e61",
"packagetype": "bdist_wheel",
"python_version": "cp312",
"requires_python": ">=3.7",
"size": 4110642,
"upload_time": "2025-09-12T08:45:51",
"upload_time_iso_8601": "2025-09-12T08:45:51.162223Z",
"url": "https://files.pythonhosted.org/packages/56/7c/20888aeb07aed48826660569339fefb328290090997f582e3c63a000f271/sherpa_onnx-1.12.13-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "5ba75f45c9ce8ae41c1dae8975a8174d85eca851e54b1f945153d298026cfab9",
"md5": "f0bc99cb8f67674fc299f758f5f6bd9e",
"sha256": "a80e111f3f5f00cc28bea97a2eb737e402ede9ce01c2144d3d5f2f927816ef3c"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp312-cp312-win32.whl",
"has_sig": false,
"md5_digest": "f0bc99cb8f67674fc299f758f5f6bd9e",
"packagetype": "bdist_wheel",
"python_version": "cp312",
"requires_python": ">=3.7",
"size": 1644149,
"upload_time": "2025-09-12T09:11:19",
"upload_time_iso_8601": "2025-09-12T09:11:19.061641Z",
"url": "https://files.pythonhosted.org/packages/5b/a7/5f45c9ce8ae41c1dae8975a8174d85eca851e54b1f945153d298026cfab9/sherpa_onnx-1.12.13-cp312-cp312-win32.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "1afad20b1d3aad2468de425388888c0bda404f16c93c942ffc7ce54e211a34b0",
"md5": "5285f41d4bfda0d04a282ef33eec6680",
"sha256": "1eb3ccef84d2733df5298273eaa2a0f85887fe73a5922c5f2ce81938d42ce011"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp312-cp312-win_amd64.whl",
"has_sig": false,
"md5_digest": "5285f41d4bfda0d04a282ef33eec6680",
"packagetype": "bdist_wheel",
"python_version": "cp312",
"requires_python": ">=3.7",
"size": 1942048,
"upload_time": "2025-09-12T08:51:54",
"upload_time_iso_8601": "2025-09-12T08:51:54.461307Z",
"url": "https://files.pythonhosted.org/packages/1a/fa/d20b1d3aad2468de425388888c0bda404f16c93c942ffc7ce54e211a34b0/sherpa_onnx-1.12.13-cp312-cp312-win_amd64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "e91f86e7cb553b9285d0fa88fbd93cfc2ddd1eb7e42b5ad4eb3e18d6cdaa2560",
"md5": "85871c382eef71a0a60351d4febc991a",
"sha256": "753a520820dd77476b1a7f89b75f623a6425598821e65fe503975e7396773146"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp313-cp313-macosx_10_15_x86_64.whl",
"has_sig": false,
"md5_digest": "85871c382eef71a0a60351d4febc991a",
"packagetype": "bdist_wheel",
"python_version": "cp313",
"requires_python": ">=3.7",
"size": 2074378,
"upload_time": "2025-09-12T08:40:15",
"upload_time_iso_8601": "2025-09-12T08:40:15.398695Z",
"url": "https://files.pythonhosted.org/packages/e9/1f/86e7cb553b9285d0fa88fbd93cfc2ddd1eb7e42b5ad4eb3e18d6cdaa2560/sherpa_onnx-1.12.13-cp313-cp313-macosx_10_15_x86_64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "9ea339372f33db07a122cebc87a96ef666b5c82e428bbee8001c0985f722745f",
"md5": "db2fc14556e4e765e85fbacd52c481dd",
"sha256": "89186d0db8ee3e526714aa67453d5ce8cad3087044e5dcc63478705f9d741daf"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp313-cp313-macosx_11_0_arm64.whl",
"has_sig": false,
"md5_digest": "db2fc14556e4e765e85fbacd52c481dd",
"packagetype": "bdist_wheel",
"python_version": "cp313",
"requires_python": ">=3.7",
"size": 1871459,
"upload_time": "2025-09-12T09:16:15",
"upload_time_iso_8601": "2025-09-12T09:16:15.885367Z",
"url": "https://files.pythonhosted.org/packages/9e/a3/39372f33db07a122cebc87a96ef666b5c82e428bbee8001c0985f722745f/sherpa_onnx-1.12.13-cp313-cp313-macosx_11_0_arm64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "4dedbb98fb82c3ca324bc94ec721d934ab90c476d3a8091a94a3b024cb4bfdfd",
"md5": "47d493202f94de764be68cfb88bcd41c",
"sha256": "982f61412bc9c29a47f377de85d7349f34e52fa0e6b4892e914b2d2b67e6a865"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl",
"has_sig": false,
"md5_digest": "47d493202f94de764be68cfb88bcd41c",
"packagetype": "bdist_wheel",
"python_version": "cp313",
"requires_python": ">=3.7",
"size": 3931355,
"upload_time": "2025-09-12T09:03:03",
"upload_time_iso_8601": "2025-09-12T09:03:03.190747Z",
"url": "https://files.pythonhosted.org/packages/4d/ed/bb98fb82c3ca324bc94ec721d934ab90c476d3a8091a94a3b024cb4bfdfd/sherpa_onnx-1.12.13-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "a6ee0b20e478377e1902b4378c494a1698d2250fba40f5410f9c12e1f6fd9ffa",
"md5": "8f575caa9d25d11345bd5944917eea6c",
"sha256": "5dabffab7f515536152b6db3029fdb5d2bc4611143170d69be7b17aad2015a9b"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl",
"has_sig": false,
"md5_digest": "8f575caa9d25d11345bd5944917eea6c",
"packagetype": "bdist_wheel",
"python_version": "cp313",
"requires_python": ">=3.7",
"size": 4111508,
"upload_time": "2025-09-12T09:00:13",
"upload_time_iso_8601": "2025-09-12T09:00:13.468232Z",
"url": "https://files.pythonhosted.org/packages/a6/ee/0b20e478377e1902b4378c494a1698d2250fba40f5410f9c12e1f6fd9ffa/sherpa_onnx-1.12.13-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "a9beb976166542b806f26546ab6c811a4e75ca70e6e343c33957ef55c4f214fd",
"md5": "93d1f0c240232ff7d41b10810ca66f1c",
"sha256": "e12e4ba1a16b1f58e7705e6b6c2ee07cf29b5ed7ac66f1821438de83201f9a5d"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp313-cp313-win32.whl",
"has_sig": false,
"md5_digest": "93d1f0c240232ff7d41b10810ca66f1c",
"packagetype": "bdist_wheel",
"python_version": "cp313",
"requires_python": ">=3.7",
"size": 1643595,
"upload_time": "2025-09-12T08:51:11",
"upload_time_iso_8601": "2025-09-12T08:51:11.273174Z",
"url": "https://files.pythonhosted.org/packages/a9/be/b976166542b806f26546ab6c811a4e75ca70e6e343c33957ef55c4f214fd/sherpa_onnx-1.12.13-cp313-cp313-win32.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "a430ae848c95166423f882f3bd5b0d14c348018ce4fa29df0093e2bda7f292dc",
"md5": "0f03062fd36d0fe771ff9f99a02e7259",
"sha256": "2674963be7fe35ccda1981fc3d1a98462069e336e03a6829e86301ee61cd342a"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp313-cp313-win_amd64.whl",
"has_sig": false,
"md5_digest": "0f03062fd36d0fe771ff9f99a02e7259",
"packagetype": "bdist_wheel",
"python_version": "cp313",
"requires_python": ">=3.7",
"size": 1942100,
"upload_time": "2025-09-12T09:11:44",
"upload_time_iso_8601": "2025-09-12T09:11:44.945671Z",
"url": "https://files.pythonhosted.org/packages/a4/30/ae848c95166423f882f3bd5b0d14c348018ce4fa29df0093e2bda7f292dc/sherpa_onnx-1.12.13-cp313-cp313-win_amd64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "a83f50f43470f814f9fe2af8c5c2749387a6445d77e6cfd62fedfb7ca95e0522",
"md5": "a94a084b93fed1c059000e00769e474a",
"sha256": "d5a96d50f222e1c1b1b504057769c08a8cc16996253a9d2607e5ca5938d43ea5"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp314-cp314-macosx_10_15_x86_64.whl",
"has_sig": false,
"md5_digest": "a94a084b93fed1c059000e00769e474a",
"packagetype": "bdist_wheel",
"python_version": "cp314",
"requires_python": ">=3.7",
"size": 2074720,
"upload_time": "2025-09-12T08:43:11",
"upload_time_iso_8601": "2025-09-12T08:43:11.015355Z",
"url": "https://files.pythonhosted.org/packages/a8/3f/50f43470f814f9fe2af8c5c2749387a6445d77e6cfd62fedfb7ca95e0522/sherpa_onnx-1.12.13-cp314-cp314-macosx_10_15_x86_64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "23363f959484d3e76717c4fec6afec5dc4cd7ace3dd041d937b7a330cf4d270e",
"md5": "b5ac72b2d13d04f2643214b5f72332ba",
"sha256": "0d3ce75754a2694ad24649bdabdf41456a02a1d1755f1259fa7eb93e4204e81b"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp314-cp314-macosx_11_0_arm64.whl",
"has_sig": false,
"md5_digest": "b5ac72b2d13d04f2643214b5f72332ba",
"packagetype": "bdist_wheel",
"python_version": "cp314",
"requires_python": ">=3.7",
"size": 1874890,
"upload_time": "2025-09-12T09:10:59",
"upload_time_iso_8601": "2025-09-12T09:10:59.880798Z",
"url": "https://files.pythonhosted.org/packages/23/36/3f959484d3e76717c4fec6afec5dc4cd7ace3dd041d937b7a330cf4d270e/sherpa_onnx-1.12.13-cp314-cp314-macosx_11_0_arm64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "d54507a4e8a89afe3c15f2e8f36f667f0c50627834b6fdb38cfdbe7ff3eaea2e",
"md5": "039ff07d2f735278e74dcb477c4d1f5a",
"sha256": "ee6a7dc1905d252f71b76279f75d9bdb098d9defc0aa1d49ebf73995296bcd94"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl",
"has_sig": false,
"md5_digest": "039ff07d2f735278e74dcb477c4d1f5a",
"packagetype": "bdist_wheel",
"python_version": "cp314",
"requires_python": ">=3.7",
"size": 3935779,
"upload_time": "2025-09-12T09:13:38",
"upload_time_iso_8601": "2025-09-12T09:13:38.380738Z",
"url": "https://files.pythonhosted.org/packages/d5/45/07a4e8a89afe3c15f2e8f36f667f0c50627834b6fdb38cfdbe7ff3eaea2e/sherpa_onnx-1.12.13-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "7822a834dc87663fe004877fc6b75fa0d7db99c0d8cbaac346d7b7ef082be092",
"md5": "aba92e8181a3f787d91aff73f293300a",
"sha256": "d23814d55597731b219274c29d34e5d8bd6878fffe0be7b55801ab4afc53964a"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl",
"has_sig": false,
"md5_digest": "aba92e8181a3f787d91aff73f293300a",
"packagetype": "bdist_wheel",
"python_version": "cp314",
"requires_python": ">=3.7",
"size": 4113007,
"upload_time": "2025-09-12T09:04:22",
"upload_time_iso_8601": "2025-09-12T09:04:22.874534Z",
"url": "https://files.pythonhosted.org/packages/78/22/a834dc87663fe004877fc6b75fa0d7db99c0d8cbaac346d7b7ef082be092/sherpa_onnx-1.12.13-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "be9bf9b771072cb9a97736677cff0e1a64e334868f14f59f317d1e75a86f62d7",
"md5": "8930134c360ed3d8711c421eda215ab4",
"sha256": "1e55c1c24fa8bf448d5900bbae5bd79603d1efaef4710c2542ccb08329d8e0ec"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp314-cp314-win32.whl",
"has_sig": false,
"md5_digest": "8930134c360ed3d8711c421eda215ab4",
"packagetype": "bdist_wheel",
"python_version": "cp314",
"requires_python": ">=3.7",
"size": 1677377,
"upload_time": "2025-09-12T09:00:43",
"upload_time_iso_8601": "2025-09-12T09:00:43.921366Z",
"url": "https://files.pythonhosted.org/packages/be/9b/f9b771072cb9a97736677cff0e1a64e334868f14f59f317d1e75a86f62d7/sherpa_onnx-1.12.13-cp314-cp314-win32.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "a53262f908a7eea39b13d82739532ccf24954c6a7dc539fb119863ccd2672a91",
"md5": "08076e835175f4355637b22cac9b9cf5",
"sha256": "74a48ace02d1f6c516ac5a002ae51a85522141d6ff8d8cab314a1da9974b344e"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp37-cp37m-win_amd64.whl",
"has_sig": false,
"md5_digest": "08076e835175f4355637b22cac9b9cf5",
"packagetype": "bdist_wheel",
"python_version": "cp37",
"requires_python": ">=3.7",
"size": 1941960,
"upload_time": "2025-09-12T08:54:08",
"upload_time_iso_8601": "2025-09-12T08:54:08.587569Z",
"url": "https://files.pythonhosted.org/packages/a5/32/62f908a7eea39b13d82739532ccf24954c6a7dc539fb119863ccd2672a91/sherpa_onnx-1.12.13-cp37-cp37m-win_amd64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "574364a06d710e0c971525faab2866bacbaa8eeee031c493c93d4223a62416b9",
"md5": "0cd4d0344efce090d04f99597d8ffa5a",
"sha256": "2675b89d77eda9648420c32745d21a9eee03202065e744fabd2d4b2c2d413118"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp38-cp38-macosx_10_15_x86_64.whl",
"has_sig": false,
"md5_digest": "0cd4d0344efce090d04f99597d8ffa5a",
"packagetype": "bdist_wheel",
"python_version": "cp38",
"requires_python": ">=3.7",
"size": 2056721,
"upload_time": "2025-09-12T09:06:46",
"upload_time_iso_8601": "2025-09-12T09:06:46.946470Z",
"url": "https://files.pythonhosted.org/packages/57/43/64a06d710e0c971525faab2866bacbaa8eeee031c493c93d4223a62416b9/sherpa_onnx-1.12.13-cp38-cp38-macosx_10_15_x86_64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "1a8db583e2945dc80901088d725a59a0ffb5d534ca2d966262a3b493d81410f2",
"md5": "f4ce5714daac3e1ba3a1d05bd879e0c6",
"sha256": "cd1659ee33dff693a811b71efad32df75c82e70730f1f6a9b8308b6d827d4b05"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp38-cp38-macosx_11_0_arm64.whl",
"has_sig": false,
"md5_digest": "f4ce5714daac3e1ba3a1d05bd879e0c6",
"packagetype": "bdist_wheel",
"python_version": "cp38",
"requires_python": ">=3.7",
"size": 1864144,
"upload_time": "2025-09-12T09:16:31",
"upload_time_iso_8601": "2025-09-12T09:16:31.053743Z",
"url": "https://files.pythonhosted.org/packages/1a/8d/b583e2945dc80901088d725a59a0ffb5d534ca2d966262a3b493d81410f2/sherpa_onnx-1.12.13-cp38-cp38-macosx_11_0_arm64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "f194dc2d4934d65f0bf695da8891d4db265af40bd8188650214b84b84461954b",
"md5": "67ca214a1847e30c7f8c463c075c568f",
"sha256": "e236dfa74c9969a762972aeab9d2774b40256da71ec631100c2d1c8fe6fc6dff"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp38-cp38-manylinux2014_aarch64.manylinux_2_17_aarch64.whl",
"has_sig": false,
"md5_digest": "67ca214a1847e30c7f8c463c075c568f",
"packagetype": "bdist_wheel",
"python_version": "cp38",
"requires_python": ">=3.7",
"size": 3929925,
"upload_time": "2025-09-12T08:44:18",
"upload_time_iso_8601": "2025-09-12T08:44:18.700226Z",
"url": "https://files.pythonhosted.org/packages/f1/94/dc2d4934d65f0bf695da8891d4db265af40bd8188650214b84b84461954b/sherpa_onnx-1.12.13-cp38-cp38-manylinux2014_aarch64.manylinux_2_17_aarch64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "0be9f27deef259f223c083ba8fd607dbea161a85b4e40a0a3ddb45ae5585c14a",
"md5": "35c176ec68c62396aab6465945fad046",
"sha256": "4ed82f7efb70b13f93019cb97822aca855c6d1ec928bf5d5f6feb08f69c9fdc5"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp38-cp38-manylinux2014_x86_64.manylinux_2_17_x86_64.whl",
"has_sig": false,
"md5_digest": "35c176ec68c62396aab6465945fad046",
"packagetype": "bdist_wheel",
"python_version": "cp38",
"requires_python": ">=3.7",
"size": 4111026,
"upload_time": "2025-09-12T09:01:25",
"upload_time_iso_8601": "2025-09-12T09:01:25.190279Z",
"url": "https://files.pythonhosted.org/packages/0b/e9/f27deef259f223c083ba8fd607dbea161a85b4e40a0a3ddb45ae5585c14a/sherpa_onnx-1.12.13-cp38-cp38-manylinux2014_x86_64.manylinux_2_17_x86_64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "c429d6a67aeed8abfdd6628b0592ea7af50db92e70ff68afcb9177e47e056609",
"md5": "39c5e706b05001ffb18ef4825cda187f",
"sha256": "c065f0929d44c0906d083362b5a6f9d757c4c15aa58b030438a5c1581172bce6"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp38-cp38-win32.whl",
"has_sig": false,
"md5_digest": "39c5e706b05001ffb18ef4825cda187f",
"packagetype": "bdist_wheel",
"python_version": "cp38",
"requires_python": ">=3.7",
"size": 1638110,
"upload_time": "2025-09-12T09:12:59",
"upload_time_iso_8601": "2025-09-12T09:12:59.013908Z",
"url": "https://files.pythonhosted.org/packages/c4/29/d6a67aeed8abfdd6628b0592ea7af50db92e70ff68afcb9177e47e056609/sherpa_onnx-1.12.13-cp38-cp38-win32.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "c0673ebae73312f8e09004ef47d02d04cc4af6240a8d4f061d4fdbba6f5a794d",
"md5": "bed7a182c8a5d83523ed16b945d0ae6f",
"sha256": "d5d70b3f9eff3077fd44088bb0d72e5c867440dca2917f565020e9a1ac881c81"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp38-cp38-win_amd64.whl",
"has_sig": false,
"md5_digest": "bed7a182c8a5d83523ed16b945d0ae6f",
"packagetype": "bdist_wheel",
"python_version": "cp38",
"requires_python": ">=3.7",
"size": 1940609,
"upload_time": "2025-09-12T09:10:25",
"upload_time_iso_8601": "2025-09-12T09:10:25.045941Z",
"url": "https://files.pythonhosted.org/packages/c0/67/3ebae73312f8e09004ef47d02d04cc4af6240a8d4f061d4fdbba6f5a794d/sherpa_onnx-1.12.13-cp38-cp38-win_amd64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "e76c021f67306dc3692a80c5f28a8084a302aefc5e5f61f4407fa01312823e16",
"md5": "e79ed27125fe0fffdf7426f74ceb4a70",
"sha256": "608f0d58faee14b858ffdf0131f1daef111da3749d93073a4d2f31b04418d626"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp39-cp39-macosx_10_15_x86_64.whl",
"has_sig": false,
"md5_digest": "e79ed27125fe0fffdf7426f74ceb4a70",
"packagetype": "bdist_wheel",
"python_version": "cp39",
"requires_python": ">=3.7",
"size": 2057102,
"upload_time": "2025-09-12T08:42:51",
"upload_time_iso_8601": "2025-09-12T08:42:51.757141Z",
"url": "https://files.pythonhosted.org/packages/e7/6c/021f67306dc3692a80c5f28a8084a302aefc5e5f61f4407fa01312823e16/sherpa_onnx-1.12.13-cp39-cp39-macosx_10_15_x86_64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "293886134303df8a0916511cdcb12cbf0cb6c8fa2a77c2bb738ca39f2aa691c4",
"md5": "df8f6cb2db270120126430732b78dce4",
"sha256": "9379c6afd1e7e6f99feeec62555702f08d7348c39b27b398c46fd70b51c84d19"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp39-cp39-macosx_11_0_arm64.whl",
"has_sig": false,
"md5_digest": "df8f6cb2db270120126430732b78dce4",
"packagetype": "bdist_wheel",
"python_version": "cp39",
"requires_python": ">=3.7",
"size": 1864222,
"upload_time": "2025-09-12T09:12:18",
"upload_time_iso_8601": "2025-09-12T09:12:18.803794Z",
"url": "https://files.pythonhosted.org/packages/29/38/86134303df8a0916511cdcb12cbf0cb6c8fa2a77c2bb738ca39f2aa691c4/sherpa_onnx-1.12.13-cp39-cp39-macosx_11_0_arm64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "f974f7fdcaf37a3fe28a7493b609aa37bae3b9334bd291aee718a1500b9744c8",
"md5": "f1026dcecaac88632b7e15fdc874f651",
"sha256": "9739bcb360670ed0cb8ea78ec3759f951dc35c4e91a7192fe4ff34925f96707e"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.whl",
"has_sig": false,
"md5_digest": "f1026dcecaac88632b7e15fdc874f651",
"packagetype": "bdist_wheel",
"python_version": "cp39",
"requires_python": ">=3.7",
"size": 3930319,
"upload_time": "2025-09-12T08:53:54",
"upload_time_iso_8601": "2025-09-12T08:53:54.603344Z",
"url": "https://files.pythonhosted.org/packages/f9/74/f7fdcaf37a3fe28a7493b609aa37bae3b9334bd291aee718a1500b9744c8/sherpa_onnx-1.12.13-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "b92bcbde479667809d7bad229635c73640188a884e92634b9efc529f9c7c12d1",
"md5": "82517714bae6b3126ba7c5bddaae8b3c",
"sha256": "4ce23b4d2f4bdc27e4b3e598682638e49ef4be2de9f7ce748624e2dfae74853a"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.whl",
"has_sig": false,
"md5_digest": "82517714bae6b3126ba7c5bddaae8b3c",
"packagetype": "bdist_wheel",
"python_version": "cp39",
"requires_python": ">=3.7",
"size": 4110804,
"upload_time": "2025-09-12T09:01:38",
"upload_time_iso_8601": "2025-09-12T09:01:38.022728Z",
"url": "https://files.pythonhosted.org/packages/b9/2b/cbde479667809d7bad229635c73640188a884e92634b9efc529f9c7c12d1/sherpa_onnx-1.12.13-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "ac2af7edd62682e9456fa12d1c3214267387ed9b344602c8c86e1d5604e35869",
"md5": "97c453e5fd3c2b2b3d2ec3dd97d04b69",
"sha256": "ca8a8f953e25ae57bc225949f8bc5d89455debd17bbfe79784833932f0a3576b"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp39-cp39-win32.whl",
"has_sig": false,
"md5_digest": "97c453e5fd3c2b2b3d2ec3dd97d04b69",
"packagetype": "bdist_wheel",
"python_version": "cp39",
"requires_python": ">=3.7",
"size": 1640513,
"upload_time": "2025-09-12T09:09:37",
"upload_time_iso_8601": "2025-09-12T09:09:37.378211Z",
"url": "https://files.pythonhosted.org/packages/ac/2a/f7edd62682e9456fa12d1c3214267387ed9b344602c8c86e1d5604e35869/sherpa_onnx-1.12.13-cp39-cp39-win32.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "4ff40e8e38213cfc75b43aa4453b2ebbb98361045a7dd0321a9b7c5784289b94",
"md5": "b8ade3fb4ec555cd276d3f5a4f3a90a8",
"sha256": "0acf54bc66294d8715b2fa90ca90e09909c7a411e207721524a3696ed958823b"
},
"downloads": -1,
"filename": "sherpa_onnx-1.12.13-cp39-cp39-win_amd64.whl",
"has_sig": false,
"md5_digest": "b8ade3fb4ec555cd276d3f5a4f3a90a8",
"packagetype": "bdist_wheel",
"python_version": "cp39",
"requires_python": ">=3.7",
"size": 1940165,
"upload_time": "2025-09-12T09:00:17",
"upload_time_iso_8601": "2025-09-12T09:00:17.992130Z",
"url": "https://files.pythonhosted.org/packages/4f/f4/0e8e38213cfc75b43aa4453b2ebbb98361045a7dd0321a9b7c5784289b94/sherpa_onnx-1.12.13-cp39-cp39-win_amd64.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "c3b846310a6bc56e99ca4a8d80a884d5f5bfcd80da3d7e289f996ae3e417f2b5",
"md5": "26f680af2dff6a3515483cba8bdb034d",
"sha256": "46cd0de566f87a731b51c1daed3d1055e889b5def8e024ef1d89e65953dd6ef1"
},
"downloads": -1,
"filename": "sherpa-onnx-1.12.13.tar.gz",
"has_sig": false,
"md5_digest": "26f680af2dff6a3515483cba8bdb034d",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.7",
"size": 598058,
"upload_time": "2025-09-12T09:01:27",
"upload_time_iso_8601": "2025-09-12T09:01:27.796276Z",
"url": "https://files.pythonhosted.org/packages/c3/b8/46310a6bc56e99ca4a8d80a884d5f5bfcd80da3d7e289f996ae3e417f2b5/sherpa-onnx-1.12.13.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-09-12 09:01:27",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "k2-fsa",
"github_project": "sherpa-onnx",
"travis_ci": false,
"coveralls": false,
"github_actions": true,
"lcname": "sherpa-onnx"
}
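
The `urls` array above is the standard PyPI file listing for this release: one entry per built wheel or sdist, each with its `filename`, `packagetype`, `python_version`, `size`, `digests`, and download `url`. As a minimal sketch (not part of the package metadata; it assumes only network access to pypi.org and the Python standard library), the same record can be fetched from the PyPI JSON API and its wheels listed like this:

```python
# Minimal sketch: fetch the release record shown above from the PyPI JSON API
# and print its wheels, using only fields that appear in the "urls" entries.
import json
import urllib.request

RECORD_URL = "https://pypi.org/pypi/sherpa-onnx/1.12.13/json"

with urllib.request.urlopen(RECORD_URL) as resp:
    record = json.load(resp)

for entry in record["urls"]:
    if entry["packagetype"] == "bdist_wheel":
        print(entry["filename"], entry["size"], entry["digests"]["sha256"])
```

pip performs the same file selection automatically when resolving `pip install sherpa-onnx==1.12.13`; the sketch is only useful for inspecting or mirroring the published files.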