Systems AI: A Declarative Learning Based Programming Perspective

@inproceedings{Kordjamshidi2018SystemsAA,
  title={Systems AI: A Declarative Learning Based Programming Perspective},
  author={Parisa Kordjamshidi and Dan Roth and Kristian Kersting},
  booktitle={IJCAI},
  year={2018}
}
Data-driven approaches are becoming dominant problem-solving techniques in many areas of research and industry. Unfortunately, current technologies do not make such techniques easy to use for application experts who are not fluent in machine learning nor for machine learning experts who aim at testing ideas on real-world data and need to evaluate those as a part of an end-to-end system. We review key efforts made by various AI communities to provide languages for high-level abstractions over… 

From Ontologies to Learning-Based Programs

TLDR
This paper discusses work in progress on designing a prototype for a novel declarative learning-based programming system, presents preliminary results, and proposes to automatically generate learning-based programs from current ontology representation languages such as OWL.

Symbolic Logic meets Machine Learning: A Brief Survey in Infinite Domains

  • Tabia
  • Computer Science
  • 2020
TLDR
There is a common misconception that logic is for discrete properties, whereas probability theory and machine learning, more generally, are for continuous properties; this survey provides further evidence for the connections between logic and learning, reports results that challenge this view of the limitations of logic, and exposes the role that logic can play for learning in infinite domains.

Ecological Semantics: Programming Environments for Situated Language Understanding

TLDR
It is argued that models must begin to understand and program in the language of affordances, both for online, situated discourse comprehension and for large-scale, offline common-sense knowledge mining, within an environment-oriented ecological semantics.

Teaching Machines to Classify from Natural Language Interactions

TLDR
It is demonstrated that language can define rich and expressive features for learning tasks, and machine learning can benefit substantially from this ability, and new algorithms for semantic parsing are developed that incorporate pragmatic cues, including conversational history and sensory observation, to improve automatic language interpretation.

Neural-Symbolic Argumentation Mining: An Argument in Favor of Deep Learning and Reasoning

TLDR
It is posited that neural-symbolic and statistical relational learning could play a crucial role in the integration of symbolic and sub-symbolic methods to achieve this goal.

srlearn: A Python Library for Gradient-Boosted Statistical Relational Models

We present srlearn, a Python library for boosted statistical relational models. We adapt the scikit-learn interface to this setting and provide examples for how this can be used to express learning…
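The "scikit-learn interface" mentioned in the srlearn blurb refers to the fit/predict estimator convention. A minimal sketch of that convention applied to relational-style data is below; the class and attribute names (`Database`, `RelationalMajorityClassifier`) are hypothetical illustrations of the pattern, not srlearn's actual API.

```python
# Sketch of a scikit-learn-style estimator over relational-style data.
# Hypothetical names -- NOT srlearn's actual API.

class Database:
    """Holds positive/negative examples and background facts as logical atoms."""
    def __init__(self, pos=None, neg=None, facts=None):
        self.pos = pos or []
        self.neg = neg or []
        self.facts = facts or []

class RelationalMajorityClassifier:
    """Follows the scikit-learn convention: fit(db) trains, predict(examples) labels."""
    def fit(self, db):
        # Trivial "model": remember which label was more frequent in training.
        self.majority_ = len(db.pos) >= len(db.neg)
        return self  # scikit-learn convention: fit returns self for chaining

    def predict(self, examples):
        return [self.majority_ for _ in examples]

db = Database(
    pos=["cancer(alice).", "cancer(bob)."],
    neg=["cancer(dan)."],
    facts=["friends(alice, bob).", "smokes(alice)."],
)
clf = RelationalMajorityClassifier().fit(db)
print(clf.predict(["cancer(erin)."]))  # -> [True]
```

The value of the convention is interchangeability: any estimator exposing `fit`/`predict` can be dropped into existing scikit-learn-style pipelines regardless of what model it trains internally.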

PE-TU Participation at TAC 2018 Drug-Drug Interaction Extraction from Drug Labels

TLDR
This paper provides a hybrid method that combines dictionary-based, rule-based, and machine learning techniques to extract mentions from drug label text and thereby extract information on drug-drug interactions from drug labels.

Neuro-Symbolic Artificial Intelligence: The State of the Art

References

SHOWING 1-10 OF 68 REFERENCES

Saul: Towards Declarative Learning Based Programming

TLDR
Saul is an object-functional programming language written in Scala that allows a programmer to learn, name, and manipulate abstractions over relational data, and supports seamless incorporation of trainable (probabilistic or discriminative) components into the program.

Learning Based Programming

TLDR
The paper provides the formalisms for the main constructs of the language, including knowledge representations and learning constructs as well as the notions of an interaction and sensors through which an LBP program interacts with its environment.

Programming with a Differentiable Forth Interpreter

TLDR
An end-to-end differentiable interpreter for the programming language Forth which enables programmers to write program sketches with slots that can be filled with behaviour trained from program input-output data, and shows empirically that this interpreter is able to effectively leverage different levels of prior program structure and learn complex behaviours such as sequence sorting and addition.

Structured Factored Inference: A Framework for Automated Reasoning in Probabilistic Programming Languages

TLDR
This work introduces a new framework called structured factored inference (SFI), using models encoded in a probabilistic programming language that provides a sound means to decompose a model into sub-models, apply an inference algorithm to each sub-model, and combine the resulting information to answer a query.
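The decompose/solve/combine strategy the SFI blurb describes can be illustrated with a toy example, under a strong simplifying assumption: the factor graph splits into sub-models over disjoint variables, so exact marginals computed per sub-model multiply to answer a joint query. The helper names here are illustrative, not the SFI framework's API.

```python
from itertools import product

# Toy decompose/solve/combine: run brute-force inference on each sub-model
# of a factor graph, then combine the per-sub-model answers.
# Assumes the sub-models share no variables (so the joint factorizes).

def brute_force_marginal(variables, factors, query):
    """Exact marginal P(query=True) by enumerating all boolean assignments."""
    total = p_true = 0.0
    for values in product([False, True], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        weight = 1.0
        for f in factors:  # unnormalized product of factor values
            weight *= f(assignment)
        total += weight
        if assignment[query]:
            p_true += weight
    return p_true / total

# Two independent sub-models (disjoint variable sets).
sub_a = (["rain", "wet"],
         [lambda a: 0.2 if a["rain"] else 0.8,          # prior on rain
          lambda a: 0.9 if a["wet"] == a["rain"] else 0.1])  # wet tracks rain
sub_b = (["spam"], [lambda a: 0.3 if a["spam"] else 0.7])

# Combine: a conjunctive query across sub-models is the product of the
# marginals computed independently on each sub-model.
p_wet = brute_force_marginal(*sub_a, query="wet")
p_spam = brute_force_marginal(*sub_b, query="spam")
p_joint = p_wet * p_spam
print(round(p_joint, 4))
```

Real systems in this space detect the decomposition automatically and use more efficient sub-model solvers than enumeration; the payoff is that each sub-model's state space is exponentially smaller than the whole model's.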

Better call Saul: Flexible Programming for Learning and Inference in NLP

TLDR
It is argued that Saul provides an extremely useful paradigm both for the design of advanced NLP systems and for supporting advanced research in NLP.

A Berkeley View of Systems Challenges for AI

TLDR
This paper proposes several open research directions in systems, architectures, and security that can address challenges and help unlock AI's potential to improve lives and society.

DeepDive: Declarative Knowledge Base Construction

TLDR
DeepDive is described, a system that combines database and machine learning ideas to help develop knowledge base construction (KBC) systems; KBC is a long-standing problem in industry and research that encompasses data extraction, cleaning, and integration.

Efficient programmable learning to search

TLDR
It is shown that the search space can be defined by an arbitrary imperative program, reducing the number of lines of code required to develop new structured prediction tasks by orders of magnitude.

WOLFE: Strength Reduction and Approximate Programming for Probabilistic Programming

TLDR
WOLFE is presented, a probabilistic programming language that enables practitioners to develop models such as structured prediction or matrix factorization, and that can yield very concise programs, high expressiveness, and efficient execution.

Design and Implementation of the LogicBlox System

TLDR
The design considerations behind the LogicBlox system are discussed, including the use of purely functional data structures; novel join processing strategies; advanced incremental maintenance and live programming facilities; and a novel concurrency control scheme.
...