
Noromaid 20B | Chat Online | Free AI Tool


Use this chatbot to chat with Noromaid 20B online.

Chatbot

App Overview

Noromaid 20B | Chat Online

Welcome to Noromaid 20B, a chatbot powered by the Noromaid 20B v0.1.1 - GGUF model created by IkariDev and Undi. The model offers a wide range of conversational capabilities and is compatible with a variety of platforms. This page covers its features, the available quantizations, and instructions for downloading and using it.

Introduction to Noromaid 20B

Noromaid 20B is a chatbot built on the Noromaid 20B v0.1.1 model. The model was quantized using hardware generously provided by Massed Compute, ensuring efficient performance and resource management. The chatbot is designed to provide engaging and interactive conversations, making it suitable for a variety of applications, including role-playing (RP), erotic role-playing (ERP), and general chat.

About GGUF

GGUF is a novel format introduced by the llama.cpp team on August 21st, 2023. It serves as a replacement for the deprecated GGML format and offers enhanced support and features. Noromaid 20B leverages the GGUF format to deliver its conversational prowess. Notable clients and libraries that support GGUF include llama.cpp, text-generation-webui, KoboldCpp, LM Studio, LoLLMS Web UI, Faraday.dev, ctransformers, and more.

Licensing

Noromaid 20B is released under the cc-by-nc-4.0 license, consistent with the original model's licensing terms. Additionally, it adheres to the Meta Llama 2 license terms. Questions regarding licensing or how these two licenses may interact should be directed to the original model repository maintained by IkariDev and Undi.

Compatibility

The quantized GGUFv2 files of Noromaid 20B are compatible with llama.cpp from August 27th, 2023 onwards, starting from commit d0cee0d. They are also compatible with various third-party UIs and libraries, expanding the possibilities for user interaction.
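If you would rather test that compatibility from Python than drive llama.cpp directly, the sketch below uses the llama-cpp-python bindings, which wrap llama.cpp. The bindings and the local filename are assumptions for illustration; this page itself only mentions llama.cpp, and the filename is one of the quants listed in the next section.

```python
# Minimal sketch using the llama-cpp-python bindings (assumed; not named on
# this page). The GGUF filename is one of the quants listed in the next section.
from llama_cpp import Llama

llm = Llama(
    model_path="noromaid-20b-v0.1.1.Q4_K_M.gguf",  # local GGUF file
    n_ctx=4096,       # context window size
    n_gpu_layers=35,  # layers offloaded to the GPU; use 0 for CPU-only
)

result = llm(
    "Write a short, friendly greeting.",
    max_tokens=64,
    temperature=0.7,
)
print(result["choices"][0]["text"])
```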

Explanation of Quantization Methods

Noromaid 20B offers a range of quantization options, each with different bit depths and quality levels. Users can select the quantization method that best suits their specific use case, balancing quality and resource consumption. Below is a summary of the available quantization methods:

| Name | Quant method | Bits | Size | Max RAM required | Use case |
| ---- | ------------ | ---- | ---- | ---------------- | -------- |
| noromaid-20b-v0.1.1.Q2_K.gguf | Q2_K | 2 | 8.31 GB | 10.81 GB | Smallest, significant quality loss |
| noromaid-20b-v0.1.1.Q3_K_S.gguf | Q3_K_S | 3 | 8.66 GB | 11.16 GB | Very small, high quality loss |
| noromaid-20b-v0.1.1.Q3_K_M.gguf | Q3_K_M | 3 | 9.70 GB | 12.20 GB | Very small, high quality loss |
| noromaid-20b-v0.1.1.Q3_K_L.gguf | Q3_K_L | 3 | 10.63 GB | 13.13 GB | Small, substantial quality loss |
| noromaid-20b-v0.1.1.Q4_0.gguf | Q4_0 | 4 | 11.29 GB | 13.79 GB | Legacy; small, very high quality loss |
| noromaid-20b-v0.1.1.Q4_K_S.gguf | Q4_K_S | 4 | 11.34 GB | 13.84 GB | Small, greater quality loss |
| noromaid-20b-v0.1.1.Q4_K_M.gguf | Q4_K_M | 4 | 12.04 GB | 14.54 GB | Medium, balanced quality - recommended |
| noromaid-20b-v0.1.1.Q5_0.gguf | Q5_0 | 5 | 13.77 GB | 16.27 GB | Legacy; medium, balanced quality |
| noromaid-20b-v0.1.1.Q5_K_S.gguf | Q5_K_S | 5 | 13.77 GB | 16.27 GB | Large, low quality loss - recommended |
| noromaid-20b-v0.1.1.Q5_K_M.gguf | Q5_K_M | 5 | 14.16 GB | 16.66 GB | Large, very low quality loss - recommended |
| noromaid-20b-v0.1.1.Q6_K.gguf | Q6_K | 6 | 16.40 GB | 18.90 GB | Very large, extremely low quality loss |
| noromaid-20b-v0.1.1.Q8_0.gguf | Q8_0 | 8 | 21.25 GB | 23.75 GB | Very large, extremely low quality loss - not recommended |

Note: The Max RAM figures assume no GPU offloading (each is roughly the file size plus about 2.5 GB); offloading layers to the GPU reduces RAM usage and uses VRAM instead.
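As a rough illustration of the quality/resource trade-off described above, the helper below picks the largest quant whose listed Max RAM fits a given budget. The numbers are copied from the table; the function itself is purely hypothetical.

```python
# Hypothetical helper: choose the largest quant (by RAM requirement) from the
# table above that fits a given RAM budget, assuming no GPU offloading.
QUANTS = [  # (quant method, max RAM required in GB), smallest to largest
    ("Q2_K", 10.81), ("Q3_K_S", 11.16), ("Q3_K_M", 12.20), ("Q3_K_L", 13.13),
    ("Q4_0", 13.79), ("Q4_K_S", 13.84), ("Q4_K_M", 14.54), ("Q5_0", 16.27),
    ("Q5_K_S", 16.27), ("Q5_K_M", 16.66), ("Q6_K", 18.90), ("Q8_0", 23.75),
]

def pick_quant(ram_budget_gb: float):
    """Return the most capable quant that fits the RAM budget, or None."""
    fitting = [name for name, ram in QUANTS if ram <= ram_budget_gb]
    return fitting[-1] if fitting else None

print(pick_quant(16.0))  # Q4_K_M -- the recommended balanced option on 16 GB
```

In practice you would also leave headroom for the operating system and other processes, so treat the budget as free RAM rather than total RAM.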

How to Download GGUF Files

To download GGUF files, you can use various methods depending on your preferences and needs. Some options include:

  • Using LM Studio, LoLLMS Web UI, Faraday.dev, or text-generation-webui for automatic downloads.
  • Command-line download using the huggingface-hub Python library.
  • More advanced huggingface-cli download usage.
  • Instructions for running in text-generation-webui.
  • How to load this model in Python code using ctransformers.

How to Use Noromaid 20B

To interact with Noromaid 20B and harness its conversational capabilities, you can utilize various methods, including web UIs, command-line interfaces, and Python code. Here are some examples of how to use Noromaid 20B:

Using LM Studio or LoLLMS Web UI

LM Studio and LoLLMS Web UI provide user-friendly interfaces for interacting with Noromaid 20B. Simply select the model repo and filename to initiate a download and engage in conversations effortlessly.

Command-Line Interface

You can use the huggingface-hub Python library for command-line downloads. This method offers high-speed downloading capabilities and flexibility for selecting specific files.
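Below is a minimal sketch of that approach using hf_hub_download from the huggingface-hub library. The repository ID is an assumption (this page does not spell out the exact Hugging Face repo), so substitute the real repo ID and the filename you chose from the table above.

```python
# Illustrative single-file download with huggingface-hub.
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="TheBloke/Noromaid-20B-v0.1.1-GGUF",   # assumed repo ID
    filename="noromaid-20b-v0.1.1.Q4_K_M.gguf",    # quant chosen from the table
    local_dir=".",                                 # save alongside your script
)
print(model_path)  # path to the downloaded GGUF file
```

For faster transfers, the huggingface-hub documentation describes installing the optional hf_transfer package and setting HF_HUB_ENABLE_HF_TRANSFER=1 before downloading.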

Python Code

Noromaid 20B can be accessed and utilized in Python code using the ctransformers library. Whether you have GPU acceleration or not, ctransformers offers an easy way to load the model and start conversations. Sample code for using ctransformers with Noromaid 20B is provided in the documentation.
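Here is a minimal sketch of that pattern with ctransformers, reusing the assumed repo ID and filename from the download example above; set gpu_layers=0 if you have no GPU acceleration.

```python
# Illustrative ctransformers usage for a GGUF model.
from ctransformers import AutoModelForCausalLM

llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/Noromaid-20B-v0.1.1-GGUF",            # assumed repo ID
    model_file="noromaid-20b-v0.1.1.Q4_K_M.gguf",   # quant chosen from the table
    model_type="llama",
    gpu_layers=50,  # layers to offload to the GPU; 0 for CPU-only
)

print(llm("Introduce yourself in two sentences.", max_new_tokens=128))
```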

Conclusion

Please note that Noromaid 20B is a test version, and while it offers powerful capabilities, occasional issues may occur. Your feedback and suggestions are valuable for improving the chatbot's performance.

Enjoy using Noromaid 20B for engaging conversations, role-playing, and more!

Introductory Prompts