# requirements.txt
wheel
timm
einops
packaging
ninja
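# torch / torchvision / torchaudio are pinned as a matched release set (2.3.1 / 0.18.1 / 2.3.1).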
torch==2.3.1
torchvision==0.18.1
torchaudio==2.3.1
#flash-attn==1.0.9
# https://discuss.huggingface.co/t/how-to-install-flash-attention-on-hf-gradio-space/70698/2
https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.9.post1/flash_attn-2.5.9.post1+cu118torch1.12cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
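# Note (assumption about the build environment): the wheel filename above encodes the build it was
# compiled for (CUDA 11.8, torch 1.12, CPython 3.10, cxx11abi=FALSE) and must match the runtime's
# CUDA, Python, and the pinned torch. The "torch1.12" tag does not match torch==2.3.1 pinned above;
# if flash-attn fails to import at runtime, swap in the matching torch 2.x build from the same
# release page rather than changing the torch pin.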
streamlit
streamlit-extras
duckduckgo-search
plotly
pyyaml
Pillow
pandas
matplotlib
matplotlib-inline
tqdm
openai
mapboxgl
langchain
langchain-community
langchain-core
langchain-mistralai
langchain-openai
langchain-google-genai
langchain-experimental
langchain-google-vertexai
langchain-huggingface
jsonformer
PyMuPDF
gputil
vertexai
bitsandbytes
accelerate
tiktoken
wikipedia
wikibase-rest-api-client
Wikipedia-API
mediawikiapi
openpyxl
google-api-python-client
google-generativeai
google-cloud-storage
google-cloud-vision
google-auth
google-auth-oauthlib
google-cloud-aiplatform
opencv-python
chromadb
InstructorEmbedding
transformers
sentence-transformers
seaborn
dask
psutil
py-cpuinfo
Levenshtein
fuzzywuzzy
opencage
geocoder
pycountry_convert
tensorboard
pydantic
peft
labelbox
PyPDF2
mistralai==0.4.2