
Generating Predicate Logic Expressions From Natural Language

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Formal logic expressions are commonly written in standardized mathematical notation. Learning this notation typically requires many years of experience and is not an explicit part of undergraduate academic curricula. Constructing and comprehending logical predicates can feel difficult and unintuitive. We hypothesized that this process can be automated using neural machine translation. Most machine translation techniques involve word-based segmentation as a preprocessing step. Given the nature of our custom dataset, which hosts first-order-logic (FOL) semantics primarily in unigram tokens, the word-based approach does not seem applicable. The proposed solution was to automate the translation of short English sentences into FOL expressions using character-level prediction in a recurrent neural network model. We trained four encoder-decoder models (LSTM, Bidirectional GRU with Attention, and two variants of Bidirectional LSTM with Attention). Our experimental results showed that several established neural translation techniques can be implemented to produce highly accurate machine translators of English sentences to FOL formalisms, given only characters as markers of semantics. We also demonstrated that attention-based enhancement to the encoder-decoder architecture can vastly improve translation accuracy.
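The published abstract includes no code; the sketch below illustrates, in PyTorch, the kind of character-level encoder-decoder with attention the abstract describes. It pairs a bidirectional GRU encoder with a dot-product-attention decoder. All names, layer sizes, and the specific attention variant are illustrative assumptions, not the authors' implementation.

    # Minimal character-level seq2seq with attention (PyTorch sketch).
    # Hypothetical throughout: the paper's dataset format, hyperparameters,
    # and exact architecture are not reproduced here.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    PAD, SOS, EOS = 0, 1, 2  # reserved special-token ids

    def build_charset(pairs):
        # Every character is its own token, so FOL operator glyphs
        # ('∀', '∃', '→', parentheses) stay atomic.
        chars = sorted({c for src, tgt in pairs for c in src + tgt})
        return {c: i + 3 for i, c in enumerate(chars)}

    def encode(text, charset):
        return torch.tensor([SOS] + [charset[c] for c in text] + [EOS])

    class Encoder(nn.Module):
        def __init__(self, vocab, emb=64, hid=128):
            super().__init__()
            self.embed = nn.Embedding(vocab, emb, padding_idx=PAD)
            self.rnn = nn.GRU(emb, hid, bidirectional=True, batch_first=True)

        def forward(self, x):                        # x: (B, T) char ids
            out, h = self.rnn(self.embed(x))         # out: (B, T, 2*hid)
            h = torch.cat([h[0], h[1]], dim=-1)      # merge the two directions
            return out, h.unsqueeze(0)               # states, initial decoder h

    class AttnDecoder(nn.Module):
        # One decoding step: dot-product attention over encoder states,
        # then a GRU fed [prev-char embedding ; context vector].
        def __init__(self, vocab, emb=64, hid=256):
            super().__init__()
            self.embed = nn.Embedding(vocab, emb, padding_idx=PAD)
            self.rnn = nn.GRU(emb + hid, hid, batch_first=True)
            self.out = nn.Linear(hid, vocab)

        def forward(self, y_prev, h, enc_out):       # y_prev: (B, 1)
            e = self.embed(y_prev)                   # (B, 1, emb)
            scores = torch.bmm(enc_out, h[-1].unsqueeze(2))         # (B, T, 1)
            weights = F.softmax(scores, dim=1)       # attention over time
            context = torch.bmm(weights.transpose(1, 2), enc_out)   # (B, 1, hid)
            out, h = self.rnn(torch.cat([e, context], dim=-1), h)
            return self.out(out.squeeze(1)), h       # logits over charset

At inference time, decoding would proceed greedily or via beam search: feed SOS, emit the argmax character, and repeat until EOS. The character-level vocabulary is what lets a single model emit well-formed FOL strings (quantifiers, connectives, parentheses) without any word segmentation.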

Original language: English
Title of host publication: SoutheastCon 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 465-472
Number of pages: 8
ISBN (Electronic): 9780738111315
ISBN (Print): 9780738111315
DOIs
State: Published - Mar 10 2021
Event: 2021 SoutheastCon, SoutheastCon 2021 - Atlanta, United States
Duration: Mar 10 2021 - Mar 13 2021

Publication series

Name: SoutheastCon 2021

Conference

Conference: 2021 SoutheastCon, SoutheastCon 2021
Country/Territory: United States
City: Atlanta
Period: 3/10/21 - 3/13/21

Bibliographical note

Publisher Copyright:
© 2021 IEEE.

ASJC Scopus Subject Areas

  • Computer Networks and Communications
  • Software
  • Electrical and Electronic Engineering
  • Control and Systems Engineering
  • Signal Processing

Keywords

  • Machine learning
  • Neural machine translation
  • NLP
  • Predicate logic
