Neural architectures for database query processing, syntax analysis, knowledge representation, and inference

Date
1997
Authors
Chen, Chun-Hsien
Major Professor
Vasant Honavar
Department
Computer Science
Abstract

Artificial neural networks (ANN), with their inherent parallelism, potential for fault tolerance, and adaptation through learning, offer an attractive computational paradigm for a variety of applications in computer science and engineering, artificial intelligence, robotics, and cognitive modeling. Despite the successful application of ANN to a broad range of numeric tasks in pattern classification, control, function approximation, and system identification, the integration of ANN and symbolic computing is only beginning to be explored. This dissertation explores the integration of ANN with essential symbolic computations for content-based associative symbol processing, offering an opportunity to exploit ANN's inherent parallelism in the design of high-performance computing systems for real-time content-based symbolic processing applications. We develop methods to systematically design massively parallel architectures for pattern-directed symbol processing using neural associative memories as key components. In particular, we propose neural architectures for content-based as well as address-based data storage and recall, information retrieval and database query processing, elementary logical inference, sequence processing, and syntax analysis. The dissertation examines their potential advantages over conventional serial computer implementations of the same functions.
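The abstract's core idea, using a neural associative memory for content-based (rather than address-based) recall, can be illustrated with a minimal sketch. The following is a Willshaw-style binary associative memory written for illustration only; it is not the dissertation's actual architecture, and the function names and pattern sizes are invented for this example. Patterns are stored by superimposing Hebbian outer products into a binary weight matrix, and a stored pattern is recovered from a partial content cue rather than from a storage address.

```python
# Illustrative sketch (assumed, not from the dissertation): a Willshaw-style
# binary associative memory demonstrating content-based recall.

def train(patterns, n):
    """Store binary patterns (lists of 0/1, length n) in an n x n weight
    matrix by OR-ing together their outer products (Hebbian-style)."""
    w = [[0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if p[i] and p[j]:
                    w[i][j] = 1
    return w

def recall(w, cue, n):
    """Recover a stored pattern from a partial cue: unit i fires when its
    weighted input from the active cue bits reaches the cue's activity count."""
    k = sum(cue)
    out = []
    for i in range(n):
        s = sum(w[i][j] for j in range(n) if cue[j])
        out.append(1 if s >= k else 0)
    return out

if __name__ == "__main__":
    n = 8
    stored = [[1, 1, 0, 0, 1, 0, 0, 0],
              [0, 0, 1, 1, 0, 0, 1, 1]]
    w = train(stored, n)
    # Partial cue: only two of the three active bits of the first pattern.
    cue = [1, 1, 0, 0, 0, 0, 0, 0]
    print(recall(w, cue, n))  # → [1, 1, 0, 0, 1, 0, 0, 0]
```

In this toy memory every stored bit-pair is computed independently, which hints at the parallelism argument in the abstract: each output unit's sum can in principle be evaluated simultaneously in hardware, whereas a serial content search would scan the stored patterns one by one.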
