Dynamic inference-based learning of Markov network structure

Date
2007-01-01
Authors
Gandhi, Parichey
Major Professor
Dimitris Margaritis
Organizational Unit
Computer Science

Computer Science—the theory, representation, processing, communication and use of information—is fundamentally transforming every aspect of human endeavor. The Department of Computer Science at Iowa State University advances computational and information sciences through 1. educational and research programs within and beyond the university; 2. active engagement to help define national and international research and educational agendas; and 3. sustained commitment to graduating leaders for academia, industry and government.

History
The Computer Science Department was officially established in 1969, with Robert Stewart serving as the founding Department Chair. The faculty was composed of joint appointments with Mathematics, Statistics, and Electrical Engineering. In 1969, the building that now houses the Computer Science department, then simply called the Computer Science building, was completed; it was later named Atanasoff Hall. From the 1980s to the present, the department has expanded and developed its teaching and research agendas to cover many areas of computing.

Dates of Existence
1969-present

Abstract

In this thesis we address the problem of learning Markov network structure from data by presenting the Dynamic GSIMN (DGSIMN) algorithm. DGSIMN is an extension of the GSIMN algorithm: it works by conducting a series of statistical conditional independence tests on the data, using the axioms that govern the independence relation to avoid unnecessary tests, i.e., tests whose outcomes can be inferred from the results of known ones. DGSIMN improves on GSIMN by dynamically selecting the locally optimal test, namely the one expected to increase the state of knowledge about the structure the most. This is done by estimating the number of inferences that a test will yield before it is actually evaluated on data, and selecting the test that is expected to maximize the number of such inferences. This helps decrease the number of tests that must be evaluated on data, resulting in an overall decrease in the computational requirements of the algorithm. Experiments show that DGSIMN yields savings of up to 85% while achieving similar or better accuracy in most cases.
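The core idea described in the abstract is a greedy loop: before spending a conditional independence test on the data, estimate how many other tests its outcome would let the independence axioms resolve, and run the most informative test first. The Python sketch below illustrates that loop under stated assumptions only; it is not the thesis's implementation, and the helpers ci_test, infer_closure, and estimate_inferences are hypothetical placeholders for the statistical test on data, the axiom-based inference step, and the inference-count estimate.

def dgsimn_sketch(data, candidate_tests, ci_test, infer_closure, estimate_inferences):
    # known maps each resolved test to its outcome:
    # True = conditional independence holds, False = it does not.
    known = {}
    pending = set(candidate_tests)

    while pending:
        # Dynamically pick the locally optimal test: the one whose result is
        # expected to let the independence axioms resolve the most other tests.
        best = max(pending, key=lambda t: estimate_inferences(t, known, pending))

        # Only this test is actually evaluated on data.
        known[best] = ci_test(data, best)
        pending.discard(best)

        # Propagate the new result through the independence axioms so that
        # tests whose outcomes are now implied never touch the data.
        inferred = infer_closure(known, pending)
        known.update(inferred)
        pending -= set(inferred)

    return known

The selection is only locally optimal: each iteration maximizes the expected number of inferred results for the next test alone, which is what keeps the selection step cheap relative to evaluating additional tests on data.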

Copyright
2007