---
base_model: unsloth/mistral-7b-bnb-4bit
library_name: peft
license: mit
language:
- en
tags:
- chemistry
- text-generation-inference
pipeline_tag: text-generation
---

# Fine-tuned model for atomic structure generation of superconductor candidate materials

AtomGPT: Atomistic Generative Pretrained Transformer for Forward and Inverse Materials Design

GitHub: [https://github.com/usnistgov/atomgpt](https://github.com/usnistgov/atomgpt)

[Training colab notebook](https://colab.research.google.com/github/knc6/jarvis-tools-notebooks/blob/master/jarvis-tools-notebooks/atomgpt_example.ipynb)

[Inference colab notebook](https://colab.research.google.com/github/knc6/jarvis-tools-notebooks/blob/master/jarvis-tools-notebooks/atomgpt_example_huggingface.ipynb)
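For quick orientation, the snippet below is a minimal sketch of how such a PEFT adapter is typically loaded on top of the `unsloth/mistral-7b-bnb-4bit` base model for text-to-structure generation. It is not the official AtomGPT pipeline: the adapter repo id, the `build_prompt` helper, and the prompt wording are illustrative placeholders — see the inference notebook above for the exact workflow.

```python
# Hedged sketch, assuming a standard transformers + peft inference flow.
# The adapter repo id and prompt format are placeholders, not from AtomGPT.

def build_prompt(formula: str, target_tc_k: float) -> str:
    """Build a hypothetical inverse-design prompt: composition and a target
    superconducting transition temperature -> request for an atomic structure."""
    return (
        "Below is a target material property. Generate the atomic structure.\n"
        f"Formula: {formula}\n"
        f"Target Tc (K): {target_tc_k}\n"
        "Structure:"
    )

RUN_INFERENCE = False  # set True on a machine with a GPU and the model weights

if RUN_INFERENCE:
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    base_id = "unsloth/mistral-7b-bnb-4bit"  # base model from the card metadata
    adapter_id = "<this-repo-id>"            # placeholder: this adapter's repo id

    tokenizer = AutoTokenizer.from_pretrained(base_id)
    model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
    model = PeftModel.from_pretrained(model, adapter_id)  # attach the LoRA adapter

    inputs = tokenizer(build_prompt("MgB2", 39.0), return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=256)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```

The prompt-building step is kept separate from model loading so the heavy download only happens when `RUN_INFERENCE` is enabled.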

Reference: [Choudhary, K. (2024). AtomGPT: Atomistic Generative Pretrained Transformer for Forward and Inverse Materials Design. The Journal of Physical Chemistry Letters, 15, 6909-6917.](https://pubs.acs.org/doi/full/10.1021/acs.jpclett.4c01126)

### Framework versions

- PEFT 0.11.1