
Relational Programming with Foundation Models

Ziyang Li, Jiani Huang, Jason Liu, Felix Zhu, Eric Zhao, William Dodds, Neelay Velingker, Rajeev Alur, Mayur Naik

AAAI 2024

Abstract
Foundation models have vast potential to enable diverse AI applications. The powerful yet incomplete nature of these models has spurred a wide range of mechanisms to augment them with capabilities such as in-context learning, information retrieval, and code interpreting. We propose Vieira, a declarative framework that unifies these mechanisms in a general solution for programming with foundation models. Vieira follows a probabilistic relational paradigm and treats foundation models as stateless functions with relational inputs and outputs. It supports neuro-symbolic applications by enabling the seamless combination of such models with logic programs, as well as complex, multi-modal applications by streamlining the composition of diverse sub-models. We implement Vieira by extending the Scallop compiler with a foreign interface that supports foundation models as plugins. We implement plugins for 12 foundation models including GPT, CLIP, and SAM. We evaluate Vieira on 9 challenging tasks that span language, vision, and structured and vector databases. Our evaluation shows that programs in Vieira are concise, can incorporate modern foundation models, and have comparable or better accuracy than competitive baselines.
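The abstract's central idea is to treat a foundation model as a stateless function from input relations to output relations, which can then be composed with ordinary logic rules. The sketch below illustrates that contract in plain Python; it is a hypothetical illustration, not the actual Vieira/Scallop foreign interface, and the mock model, relation types, and function names are assumptions made for the example.

    from typing import Callable, Set, Tuple

    # A relation is modeled as a set of tuples; a foundation-model plugin is a
    # stateless function mapping an input relation to an output relation.
    Relation = Set[Tuple]
    ForeignFn = Callable[[Relation], Relation]

    def mock_entity_model(sentences: Relation) -> Relation:
        # Stand-in for a foundation-model call (e.g. GPT-based entity extraction):
        # every (sentence,) tuple yields zero or more (sentence, entity) tuples.
        facts = set()
        for (sent,) in sentences:
            for token in sent.split():
                if token.istitle():  # toy heuristic in place of a real model
                    facts.add((sent, token))
        return facts

    def mention_of_place(entities: Relation, places: Relation) -> Relation:
        # Ordinary relational (Datalog-style) rule composed with the model output:
        # mention_of_place(S, E) :- entity(S, E), known_place(E).
        return {(s, e) for (s, e) in entities for (p,) in places if e == p}

    sentences = {("Alice flew from Paris to Tokyo",)}
    known_places = {("Paris",), ("Tokyo",)}
    entities = mock_entity_model(sentences)           # model as a relational function
    print(mention_of_place(entities, known_places))   # symbolic rule over its output

In the framework described by the paper, such a model call would instead be registered through the compiler's foreign interface and evaluated under probabilistic semantics; the sketch only conveys the relation-in, relation-out composition the abstract describes.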
Key words
KRR: Logic Programming, CV: Visual Reasoning & Symbolic Representations, ML: Deep Neural Architectures and Foundation Models, ML: Neuro-Symbolic Learning, ML: Statistical Relational/Logic Learning, NLP: Information Extraction, RU: Relational Probabilistic Models