1. Introduction to vLLM
https://github.com/vllm-project/vllm
conda create -n vllm python=3.12
conda activate vllm
Install from the official source (https://github.com/vllm-project/vllm):
pip install vllm
To speed up installation, use the Tsinghua mirror:
pip install vllm -i https://pypi.tuna.tsinghua.edu.cn/simple
pip show vllm
vllm serve /path/to/your/model
vllm serve /home/ctq/Huggingface/Qwen2.5-1.5B-Instruct
vllm serve /home/ctq/Huggingface/DeepSeek-R1-Distill-Qwen-1.5B --port 8188 --host 0.0.0.0
Note: `vllm serve` exposes the OpenAI-compatible API, so requests go to /v1/completions (the /generate endpoint belongs to the older demo API server).

curl -X POST http://localhost:8000/v1/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "/path/to/your/model", "prompt": "Hello, how are you?", "max_tokens": 50}'
import requests

response = requests.post(
    "http://localhost:8000/v1/completions",
    json={
        "model": "/path/to/your/model",
        "prompt": "Hello, how are you?",
        "max_tokens": 50,
    },
)
print(response.json())
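Once the response comes back, the generated text sits inside the `choices` list. A minimal sketch of pulling it out, assuming the standard OpenAI-style response shape (the sample values below are illustrative, not a real server reply):

```python
def extract_completion_text(response_json):
    """Collect the generated text from an OpenAI-style /v1/completions response."""
    return [choice["text"] for choice in response_json.get("choices", [])]

# Illustrative response in the OpenAI-compatible shape (not a real server reply):
sample = {
    "id": "cmpl-123",
    "object": "text_completion",
    "choices": [
        {"index": 0, "text": "I'm doing well, thanks!", "finish_reason": "length"}
    ],
}
print(extract_completion_text(sample))  # → ["I'm doing well, thanks!"]
```

Using `.get("choices", [])` keeps the helper from raising when the server returns an error body with no `choices` field.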
pip list | grep -E "torch|transformers"
pip install torch transformers
python -c "import torch; print(torch.cuda.is_available())"
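The one-liner above only prints True or False. A small sketch that turns the same check into a device choice, and degrades gracefully if torch is not installed at all (in which case reinstall it with the pip commands above):

```python
# Pick a device based on the CUDA check; fall back to CPU
# when CUDA -- or torch itself -- is unavailable.
try:
    import torch

    device = "cuda" if torch.cuda.is_available() else "cpu"
except ImportError:
    # torch missing entirely: run the pip install step above first
    device = "cpu"

print("selected device:", device)
```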
pip install --upgrade pip
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118
conda create -n vllm python=3.12
conda activate vllm
pip install vllm -i https://pypi.tuna.tsinghua.edu.cn/simple