LLMs can generate feedback on their own output, use it to improve that output, and repeat the process iteratively.
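The generate–critique–revise loop described above can be sketched as follows. This is an illustrative sketch only, not the repository's actual API: `generate`, `critique`, and `refine` are hypothetical stand-ins for LLM calls, replaced here by toy functions so the loop is runnable.

```python
def self_refine(generate, critique, refine, task, max_iters=3):
    """Iteratively improve a draft: generate, critique, revise, repeat.

    Stops early when the critic returns empty feedback (i.e., it is
    satisfied) or when max_iters revisions have been applied.
    """
    draft = generate(task)
    for _ in range(max_iters):
        feedback = critique(task, draft)
        if not feedback:  # critic has no complaints; stop refining
            break
        draft = refine(task, draft, feedback)
    return draft


# Toy stand-ins for the three LLM calls: grow a string until the
# "critic" stops flagging it as too short.
generate = lambda task: "a"
critique = lambda task, out: "too short" if len(out) < 3 else ""
refine = lambda task, out, fb: out + "a"

result = self_refine(generate, critique, refine, task="demo")
print(result)  # → "aaa"
```

In a real setting, each of the three callables would be a separate prompt to the same model; the loop structure itself is unchanged.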
PAL: Program-Aided Language Models (ICML 2023)
Interpretability for sequence generation models 🐛 🔍
A method to fix GPT-3 after deployment with user feedback, without re-training.
Code for "Aligning Linguistic Words and Visual Semantic Units for Image Captioning", ACM MM 2019
XLNet for language generation.
On Generating Extended Summaries of Long Documents
NAACL'19: "Jointly Optimizing Diversity and Relevance in Neural Response Generation"
UNION: An Unreferenced Metric for Evaluating Open-ended Story Generation
Benchmark for evaluating open-ended generation
Design and build a chatbot in Keras using the Cornell Movie Dialogues corpus.
Official code for the NAACL 2022 paper "Fuse It More Deeply! A Variational Transformer with Layer-Wise Latent Variable Inference for Text Generation"
Pre-trained models for our work on Temporal Graph Generation
Synthetic QA generation for long documents.
Multi-Figurative Language Generation (COLING 2022)
PyTorch implementation of Continuous Language Generative Flow (ACL 2021)
Data and code for Kang et al., EMNLP 2019's paper titled "Linguistic Versus Latent Relations for Modeling a Flow in Paragraphs"
Using Machine Translation to "translate" non-humor into humor. Code for the paper "Humorous Headline Generation via Style Transfer" at FigLang 2020
Final project for the Multimedia course CSE6501.
Code for our ACL Findings paper "Fingerprinting Fine-tuned Language Models in the Wild".