
Meta Learning Paper Supplemental Code

Meta learning with LLMs: supplemental code for reproducing the computational results for MLT and MLT-plus-TM. Related research paper: "META LEARNING WITH LANGUAGE MODELS: CHALLENGES AND OPPORTUNITIES IN THE CLASSIFICATION OF IMBALANCED TEXT", A. Vassilev, H. Jin, M. Hasan, 2023 (to appear on arXiv). All code and data are contained in the zip archive arxiv2023.zip, subject to the licensing terms shown below. See the Readme.txt contained there for a detailed explanation of how to unpack and run the code. See also requirements.txt for the necessary dependencies (required libraries). This is not a dataset, but Python source code only.
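The authoritative unpack-and-run instructions are in the bundled Readme.txt. As a minimal, self-contained sketch of the unpacking step, the snippet below builds a tiny in-memory stand-in archive (its file contents are purely illustrative) so it runs without the real download; with the actual arxiv2023.zip, open the downloaded file path instead.

```python
# Sketch of unpacking a zip archive such as arxiv2023.zip and reading its
# Readme.txt and requirements.txt. A stand-in archive is created in memory
# first (illustrative content only) so the example is self-contained.
import io
import zipfile

# Stand-in for the downloaded arxiv2023.zip.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("Readme.txt", "Unpack and run instructions go here.")
    zf.writestr("requirements.txt", "numpy\n")  # illustrative dependency list
buf.seek(0)

# The same calls work on the real file: zipfile.ZipFile("arxiv2023.zip").
with zipfile.ZipFile(buf) as zf:
    zf.extractall("supplemental_code")           # unpack everything
    readme = zf.read("Readme.txt").decode()      # detailed run instructions
    deps = zf.read("requirements.txt").decode()  # libraries to pip-install
    print(sorted(zf.namelist()))
```

After extracting the real archive, the dependencies would be installed with `pip install -r requirements.txt`, per the description above.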

About this Dataset

Updated: 2024-02-22
Metadata Last Updated: 2023-09-11 00:00:00
Date Created: N/A
Data Provided by: natural language processing
Dataset Owner: N/A

Access this data

Title Meta Learning Paper Supplemental Code
Description Meta learning with LLMs: supplemental code for reproducing the computational results for MLT and MLT-plus-TM. Related research paper: "META LEARNING WITH LANGUAGE MODELS: CHALLENGES AND OPPORTUNITIES IN THE CLASSIFICATION OF IMBALANCED TEXT", A. Vassilev, H. Jin, M. Hasan, 2023 (to appear on arXiv). All code and data are contained in the zip archive arxiv2023.zip, subject to the licensing terms shown below. See the Readme.txt contained there for a detailed explanation of how to unpack and run the code. See also requirements.txt for the necessary dependencies (required libraries). This is not a dataset, but Python source code only.
Modified 2023-09-11 00:00:00
Publisher Name National Institute of Standards and Technology
Contact mailto:[email protected]
Keywords Natural language processing, Out of policy speech detection, Meta learning, Deep learning, Language Models
{
    "identifier": "ark:\/88434\/mds2-3074",
    "accessLevel": "public",
    "contactPoint": {
        "hasEmail": "mailto:[email protected]",
        "fn": "Apostol Vassilev"
    },
    "programCode": [
        "006:045"
    ],
    "@type": "dcat:Dataset",
    "landingPage": "https:\/\/data.nist.gov\/od\/id\/mds2-3074",
    "description": "Meta learning with LLMs: supplemental code for reproducibility of computational results for MLT and MLT-plus-TM. Related research paper: \"META LEARNING WITH LANGUAGE MODELS: CHALLENGES AND OPPORTUNITIES IN THE CLASSIFICATION OF IMBALANCED TEXT\", A. Vassilev, H. Jin, M. Hasan, 2023 (to appear on arXiv). All code and data are contained in the zip archive arxiv2023.zip, subject to the licensing terms shown below. See the Readme.txt contained there for a detailed explanation of how to unpack and run the code. See also requirements.txt for the necessary dependencies (libraries needed). This is not a dataset, but Python source code only.",
    "language": [
        "en"
    ],
    "title": "Meta Learning Paper Supplemental Code",
    "distribution": [
        {
            "accessURL": "https:\/\/github.com\/usnistgov\/NIST-AI-Meta-Learning-LLM",
            "format": "Zip archive",
            "description": "Meta learning with LLMs: supplemental code for reproducibility of computational results for MLT and MLT-plus-TM. Related research paper: \"META LEARNING WITH LANGUAGE MODELS: CHALLENGES AND OPPORTUNITIES IN THE CLASSIFICATION OF IMBALANCED TEXT\", A. Vassilev, H. Jin, M. Hasan, 2023 (to appear on arXiv). All code and data are contained in the zip archive arxiv2023.zip, subject to the licensing terms shown below. See the Readme.txt contained there for a detailed explanation of how to unpack and run the code. See also requirements.txt for the necessary dependencies (libraries needed).",
            "title": "Meta Learning Paper supplemental code"
        },
        {
            "downloadURL": "https:\/\/data.nist.gov\/od\/ds\/mds2-3074\/arxiv2023.zip",
            "format": "zip archive of text files (Python source code)",
            "description": "Meta learning with LLMs: supplemental code for reproducibility of computational results for MLT and MLT-plus-TM. Related research paper: \"META LEARNING WITH LANGUAGE MODELS: CHALLENGES AND OPPORTUNITIES IN THE CLASSIFICATION OF IMBALANCED TEXT\", A. Vassilev, H. Jin, M. Hasan, 2023 (to appear on arXiv). All code and data are contained in the zip archive arxiv2023.zip, subject to the licensing terms shown below. See the Readme.txt contained there for a detailed explanation of how to unpack and run the code. See also requirements.txt for the necessary dependencies (libraries needed).",
            "mediaType": "application\/zip",
            "title": "Meta Learning Paper supplemental code"
        }
    ],
    "license": "https:\/\/www.nist.gov\/open\/license",
    "bureauCode": [
        "006:55"
    ],
    "rights": "N\/A",
    "modified": "2023-09-11 00:00:00",
    "publisher": {
        "@type": "org:Organization",
        "name": "National Institute of Standards and Technology"
    },
    "accrualPeriodicity": "irregular",
    "theme": [
        "Mathematics and Statistics",
        "Information Technology:Computational science"
    ],
    "issued": "2023-10-13",
    "keyword": [
        "Natural language processing",
        "Out of policy speech detection",
        "Meta learning",
        "Deep learning",
        "Language Models"
    ]
}
