Eric Wang df2a5dc4be cleanup notebooks 3 years ago
.gitignore 26f64780ad initial commit 3 years ago
DATA_LICENSE 63121244c8 Licenses and whatnot 3 years ago
LICENSE 63121244c8 Licenses and whatnot 3 years ago
README.md 357ec81a17 decapoda 3 years ago
alpaca_data.json 26f64780ad initial commit 3 years ago
conversion.py 26f64780ad initial commit 3 years ago
finetune.py df2a5dc4be cleanup notebooks 3 years ago
generate.py 357ec81a17 decapoda 3 years ago
lengths.ipynb 26f64780ad initial commit 3 years ago
loss.ipynb 357ec81a17 decapoda 3 years ago

README.md

alpaca-lora (WIP)

This repository contains code for reproducing the Stanford Alpaca results. Until LLaMA support is available upstream, users will need to install a fork of transformers (see Setup below).

Setup

  1. Install dependencies, including zphang's transformers fork:

    pip install -q datasets accelerate loralib sentencepiece
    
    pip install -q git+https://github.com/zphang/transformers@llama_push
    pip install -q git+https://github.com/huggingface/peft.git
    
  2. Install bitsandbytes from source

Inference

See generate.py.
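generate.py prompts the model using the Stanford Alpaca instruction template. As a rough sketch of what the script does (the template wording below is taken from the Alpaca release; the function name is illustrative, not necessarily the one in generate.py):

```python
def generate_prompt(instruction: str, input: str = "") -> str:
    """Build an Alpaca-style prompt; the model's completion follows '### Response:'."""
    if input:
        # Variant used when the task comes with additional context.
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input}\n\n"
            "### Response:\n"
        )
    # Variant used for instruction-only tasks.
    return (
        "Below is an instruction that describes a task. Write a response that "
        "appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )
```

The resulting string is tokenized and passed to the model's generate method; everything after "### Response:" in the decoded output is the model's answer.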

Training

Under construction.
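finetune.py trains against alpaca_data.json, which follows the Stanford Alpaca schema: each record has instruction, input, and output fields. A minimal sketch of loading that file (the helper name and the filtering step are assumptions, not code from this repo):

```python
import json

def load_alpaca(path: str) -> list[dict]:
    """Load Alpaca-format training records from a JSON file.

    Each record is expected to carry 'instruction' and 'output' fields
    ('input' may be empty); malformed records are dropped.
    """
    with open(path) as f:
        data = json.load(f)
    return [r for r in data if {"instruction", "output"} <= r.keys()]
```

Records loaded this way can be formatted with the Alpaca prompt template and tokenized for supervised fine-tuning.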