Contents

Install LLaMA2-Accessory


Simple Demo

# Clone the repository and create a dedicated conda environment
git clone https://github.com/Alpha-VLLM/LLaMA2-Accessory.git
cd LLaMA2-Accessory
conda create -n accessory python=3.10 -y
conda activate accessory
# Install the Python dependencies
pip install -r requirements.txt
# Install flash-attention and the CUDA compiler it needs to build
pip install flash-attn --no-build-isolation
conda install -c nvidia cuda-nvcc


# Build and install NVIDIA apex from source
git clone https://github.com/NVIDIA/apex
cd apex
pip install -v --disable-pip-version-check --no-build-isolation --no-cache-dir ./
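
After the steps above, a quick sanity check can confirm the environment. This snippet is a sketch, not part of the official docs; it only probes whether the three key packages installed above are importable from the active environment.

```shell
# Sanity check (assumption: run inside the "accessory" conda env):
# report whether each package built above can be found by Python.
python - <<'EOF'
import importlib.util

for pkg in ("torch", "flash_attn", "apex"):
    found = importlib.util.find_spec(pkg) is not None
    print(f"{pkg}: {'installed' if found else 'MISSING'}")
EOF
```

If any line reports MISSING, re-run the corresponding install command before launching the demo.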