microsoft/bloom-deepspeed-inference-fp16


This is a copy of the original BLOOM weights that is more efficient to use with DeepSpeed-MII and DeepSpeed-Inference. In this repo the original tensors are split into 8 shards targeting 8 GPUs, which allows the user to run the model with DeepSpeed-Inference tensor parallelism.
For specific details about the BLOOM model itself, please see the original BLOOM model card.
For examples of using this repo, please see the following (a minimal usage sketch is shown after the links):

  • https://github.com/huggingface/transformers-bloom-inference
  • https://github.com/microsoft/DeepSpeed-MII
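
Below is a minimal sketch (not the maintained script from the repos above) of loading these pre-sharded fp16 weights with DeepSpeed-Inference tensor parallelism across 8 GPUs. The checkpoint index file name (`checkpoints.json`) and generation parameters are assumptions; consult the transformers-bloom-inference repository for the supported example.

```python
# Launch with: deepspeed --num_gpus 8 run_bloom_ds_inference.py
import torch
import deepspeed
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

model_name = "microsoft/bloom-deepspeed-inference-fp16"

tokenizer = AutoTokenizer.from_pretrained(model_name)
config = AutoConfig.from_pretrained(model_name)

# Build the model structure on the meta device so no full-size weights are
# materialized; DeepSpeed-Inference loads the pre-sharded fp16 tensors itself.
with deepspeed.OnDevice(dtype=torch.float16, device="meta"):
    model = AutoModelForCausalLM.from_config(config, torch_dtype=torch.float16)

model = deepspeed.init_inference(
    model,
    mp_size=8,                      # tensor-parallel degree matching the 8 shards
    dtype=torch.float16,
    replace_with_kernel_inject=True,
    checkpoint="checkpoints.json",  # assumed name of the checkpoint index file
)

inputs = tokenizer("DeepSpeed is", return_tensors="pt").to(torch.cuda.current_device())
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the tensors are already split into 8 shards, each of the 8 ranks only reads its own partition, which avoids loading the full model on every GPU.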
