AtomGPT: Atomistic Generative Pretrained Transformer for Forward and Inverse Materials Design


About this Dataset

Updated: 2025-04-06
Metadata Last Updated: 2024-06-01 00:00:00
Date Created: N/A
Data Provided by:
Dataset Owner: N/A

Access this data

Contact dataset owner: mailto:[email protected]
Access URL: https://github.com/usnistgov/atomgpt
Table representation of structured data:

Title: AtomGPT: Atomistic Generative Pretrained Transformer for Forward and Inverse Materials Design
Description: AtomGPT: Atomistic Generative Pretrained Transformer for Forward and Inverse Materials Design
Modified: 2024-06-01 00:00:00
Publisher Name: National Institute of Standards and Technology
Contact: mailto:[email protected]
Keywords: Large language models, Materials design, JARVIS
{
    "identifier": "ark:\/88434\/mds2-3463",
    "accessLevel": "public",
    "contactPoint": {
        "hasEmail": "mailto:[email protected]",
        "fn": "Kamal Choudhary"
    },
    "programCode": [
        "006:045"
    ],
    "landingPage": "",
    "title": "AtomGPT: Atomistic Generative Pretrained Transformer for Forward and Inverse Materials Design",
    "description": "AtomGPT: Atomistic Generative Pretrained Transformer for Forward and Inverse Materials Design",
    "language": [
        "en"
    ],
    "distribution": [
        {
            "accessURL": "https:\/\/github.com\/usnistgov\/atomgpt",
            "description": "Large language models (LLMs) such as generative pretrained transformers (GPTs) have shown potential for various commercial applications, but their applicability for materials design remains underexplored. In this Letter, AtomGPT is introduced as a model specifically developed for materials design based on transformer architectures, demonstrating capabilities for both atomistic property prediction and structure generation. This study shows that a combination of chemical and structural text descriptions can efficiently predict material properties with accuracy comparable to graph neural network models, including formation energies, electronic bandgaps from two different methods, and superconducting transition temperatures. Furthermore, AtomGPT can generate atomic structures for tasks such as designing new superconductors, with the predictions validated through density functional theory calculations. This work paves the way for leveraging LLMs in forward and inverse materials design, offering an efficient approach to the discovery and optimization of materials.",
            "title": "AtomGPT: Atomistic Generative Pretrained Transformer for Forward and Inverse Materials Design"
        }
    ],
    "bureauCode": [
        "006:55"
    ],
    "modified": "2024-06-01 00:00:00",
    "publisher": {
        "@type": "org:Organization",
        "name": "National Institute of Standards and Technology"
    },
    "theme": [
        "Materials:Modeling and computational material science",
        "Information Technology:Data and informatics"
    ],
    "keyword": [
        "Large language models",
        "Materials design",
        "JARVIS"
    ]
}
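
The JSON above is a DCAT-style (Project Open Data) metadata record of the kind published in federal data catalogs. As a minimal sketch of how such a record can be consumed programmatically, the Python below extracts the title, contact, keywords, and access URL; it assumes the record has been saved locally as mds2-3463.json (the filename is illustrative, not part of the record).

import json

# Minimal sketch: load the metadata record shown above.
# The filename "mds2-3463.json" is illustrative; save the JSON locally first.
with open("mds2-3463.json", "r", encoding="utf-8") as fh:
    record = json.load(fh)

# Fields most useful for locating and citing the dataset.
title = record["title"]
contact = record["contactPoint"]["hasEmail"].replace("mailto:", "", 1)
keywords = record.get("keyword", [])
access_urls = [d["accessURL"] for d in record.get("distribution", []) if "accessURL" in d]

print(title)
print("Contact:", contact)
print("Keywords:", ", ".join(keywords))
print("Access URL(s):", ", ".join(access_urls))

The accessURL in the distribution entry points to the AtomGPT source repository (https://github.com/usnistgov/atomgpt), which is the primary access point listed for this dataset.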