
- Islam Beltagy, Cuong Chau, Gemma Boleda, Dan Garrette, Katrin Erk, Raymond J. Mooney
- *SEM@NAACL-HLT
- 2013

We combine logical and distributional representations of natural language meaning by transforming distributional similarity judgments into weighted inference rules using Markov Logic Networks (MLNs). We show that this framework supports both judging sentence similarity and recognizing textual entailment by appropriately adapting the MLN implementation of…
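The abstract above describes turning distributional similarity judgments into weighted MLN inference rules. A minimal sketch of one plausible similarity-to-weight mapping (a log-odds transform; the function name, the mapping, and the example predicates are illustrative assumptions, not the paper's exact method):

```python
import math

def similarity_to_weight(cosine_sim):
    """Map a distributional similarity score in (0, 1) to a rule weight
    via log-odds. This is a common heuristic, assumed here for
    illustration; the paper's actual mapping may differ."""
    eps = 1e-6
    s = min(max(cosine_sim, eps), 1.0 - eps)
    return math.log(s / (1.0 - s))

# A weighted inference rule pairs a first-order formula with a weight,
# e.g. linking two distributionally similar predicates (hypothetical example).
rule = ("forall x. grumpy(x) -> irritable(x)", similarity_to_weight(0.8))
```

Under this mapping, a similarity of 0.5 yields weight 0, and higher similarities yield increasingly confident (higher-weight) rules.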

- Islam Beltagy, Katrin Erk, Raymond J. Mooney
- ACL
- 2014

Probabilistic Soft Logic (PSL) is a recently developed framework for probabilistic logic. We use PSL to combine logical and distributional representations of natural-language meaning, where distributional information is represented in the form of weighted inference rules. We apply this framework to the task of Semantic Textual Similarity (STS) (i.e. judging…
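PSL relaxes Boolean truth values to the interval [0, 1] using Łukasiewicz logic, which is what makes its inference a convex optimization problem. A small sketch of the Łukasiewicz operators and the resulting "distance to satisfaction" of a rule (standard PSL semantics; the helper names are our own):

```python
def luk_and(a, b):
    # Łukasiewicz t-norm: soft conjunction over [0, 1]
    return max(0.0, a + b - 1.0)

def luk_or(a, b):
    # Łukasiewicz t-conorm: soft disjunction
    return min(1.0, a + b)

def luk_implies(body, head):
    # Soft implication: fully satisfied (1.0) whenever head >= body
    return min(1.0, 1.0 - body + head)

def distance_to_satisfaction(body, head):
    # How far a ground rule body -> head is from being satisfied;
    # PSL penalizes rules in proportion to this distance.
    return max(0.0, body - head)
```

For example, a rule whose body holds at 0.9 but whose head holds at only 0.4 is 0.5 away from satisfaction, and a weighted rule contributes that distance (scaled by its weight) to the objective.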

- Islam Beltagy, Moustafa Youssef, Mohamed N. El-Derini
- 2011 IEEE Wireless Communications and Networking…
- 2011

Routing in cognitive networks is a challenging problem due to the primary users' (PU) activities and mobility. Multipath routing is a general solution to improve reliability of connections. Existing multipath routing metrics for traditional wireless networks do not take into account PUs' activities. This work introduces a new route selection metric for…

- Islam Beltagy, Stephen Roller, Pengxiang Cheng, Katrin Erk, Raymond J. Mooney
- Computational Linguistics
- 2016

NLP tasks differ in the semantic information they require, and at this time no single semantic representation fulfills all requirements. Logic-based representations characterize sentence structure, but do not capture the graded aspect of meaning. Distributional models give graded similarity ratings for words and phrases, but do not capture sentence…

- Islam Beltagy, Stephen Roller, Gemma Boleda, Katrin Erk, Raymond J. Mooney
- SemEval@COLING
- 2014

We represent natural language semantics by combining logical and distributional information in probabilistic logic. We use Markov Logic Networks (MLN) for the RTE task, and Probabilistic Soft Logic (PSL) for the STS task. The system is evaluated on the SICK dataset. Our best system achieves 73% accuracy on the RTE task, and a Pearson's correlation of 0.71…
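The STS result is reported as a Pearson's correlation between system scores and gold similarity judgments, which is the standard STS evaluation metric. A self-contained sketch of that metric (a plain implementation, not the evaluation script the shared task used):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length
    sequences of scores, e.g. system vs. gold STS judgments."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

Perfectly linearly related scores give r = 1.0; the 0.71 reported above indicates a strong but imperfect linear agreement with the gold ratings.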


- Islam Beltagy, Raymond J. Mooney
- AAAI Workshop: Statistical Relational Artificial…
- 2014

Using Markov logic to integrate logical and distributional information in natural-language semantics results in complex inference problems involving long, complicated formulae. Current inference methods for Markov logic are ineffective on such problems. To address this problem, we propose a new inference algorithm based on SampleSearch that computes…

We propose a new approach to semantic parsing that is not constrained by a fixed formal ontology and purely logical inference. Instead, we use distributional semantics to generate only the relevant part of an on-the-fly ontology. Sentences and the on-the-fly ontology are represented in probabilistic logic. For inference, we use probabilistic logic…

- Islam Beltagy, Katrin Erk
- IWCS
- 2015

As a format for describing the meaning of natural language sentences, probabilistic logic combines the expressivity of first-order logic with the ability to handle graded information in a principled fashion. But practical probabilistic logic frameworks usually assume a finite domain in which each entity corresponds to a constant in the logic (domain closure…

With better natural-language semantic representations, computers can perform more tasks more efficiently as a result of a better understanding of natural text. However, no single semantic representation at this time fulfills all requirements needed for a satisfactory representation. Logic-based representations like first-order logic capture many of the…
