Views on more technical issues

The momentum of traditional scholarship and the evaluation of academics is enormous, but it is slowly changing. The change reflects this technology's ability to shift focus toward content (the ideas and their clarity) rather than form. The documents accessible here are in various states of development: abstracts only, historical descriptions, papers submitted for publication, papers already published, and papers that may never be published.

The list that follows includes, for each paper, a title, an author list, a citation (or a description of its publication status), an abstract, and downloadable PostScript files. When there is a corresponding presentation on-line, a link to that presentation is included.

Describing plan recognition as non-monotonic reasoning and belief revision (PostScript), (gzip PostScript)
P. Jachowicz
R.G. (Randy) Goebel
Single-page poster to appear at IJCAI97, Nagoya, Japan, August 23--29, 1997.

Abstract We provide a characterization of plan recognition in terms of a general framework of belief revision and non-monotonic reasoning. We adopt a generalization of classical belief revision to describe a competence model of plan recognition which supports dynamic change to all aspects of a plan recognition knowledge base, including background knowledge, action descriptions and their relationship to named plans, and accumulating sets of observations on agent actions. Our plan recognition model exploits the underlying belief revision model to assimilate observations, and answer queries about an agent's intended plans and actions. Supporting belief states are determined by observed actions and non-monotonic assumptions consistent with background knowledge and action descriptions. We use a situation calculus notation to describe plans and actions, together with a small repertoire of meta predicates which are used to specify observations to the belief revision system, and to query the reasoning system regarding the current status of plans and predictable actions. Our intent is to demonstrate the connections between a general plan recognition model and important concepts of belief revision and non-monotonic reasoning, to help establish a basis for improving the specification and development of specialized plan recognition systems.
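
The following is a deliberately tiny Python sketch of the recognition-as-revision idea described above; it is an illustration, not the paper's system, and the plan library and action names are invented. The belief state is the set of named plans still consistent with the accumulated observations, and a new observation can nonmonotonically retract earlier candidates.

    PLAN_LIBRARY = {  # hypothetical named plans as ordered action sequences
        "make_tea":    ["boil_water", "get_cup", "steep_tea"],
        "make_coffee": ["boil_water", "get_cup", "brew_coffee"],
        "wash_up":     ["get_cup", "rinse_cup"],
    }

    def explains(plan_actions, observations):
        # A plan explains the observations if they occur, in order, in the plan.
        it = iter(plan_actions)
        return all(obs in it for obs in observations)

    def recognize(observations):
        # Belief state: the named plans consistent with every observation so far.
        return {name for name, actions in PLAN_LIBRARY.items()
                if explains(actions, observations)}

    history = []
    for obs in ["boil_water", "get_cup", "steep_tea"]:
        history.append(obs)
        print(obs, "->", sorted(recognize(history)))
    # "make_coffee" is a supported belief after the first two observations,
    # and is retracted (nonmonotonically) once "steep_tea" is observed.
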
From association to meaning via correspondence connections: requirements and challenges for creating knowledge from behaviour (PostScript), (gzip PostScript) (presentation)
R.G. (Randy) Goebel
from the Proceedings of the Keio International Workshop on Verbalization of Tacit Knowledge based on Inductive Inference, pages 13--18, Keio University, Shonan Fujisawa Campus, Kanagawa, Japan, July 29--30, 1996.

Abstract The problem of transforming physical behaviour into symbolic knowledge is a kind of ``artificial intelligence complete'' problem, in the sense that almost all AI research ever done seems relevant to some part of the most general conception of the problem. The hypothesis sketched here claims that if any general computational framework for this problem is to emerge, it will necessarily involve the capture and deployment of domain knowledge at the information boundaries we will call correspondence connections, after Brian Smith's idea of the ``correspondence continuum.'' In this brief position paper, we scramble through a few central ideas that provide a basis for describing the idea of correspondence connections, and try to explain the requirements and challenges for the development and exploitation of such a framework.

If inductive logic programming leads, will data mining follow? (PostScript), (gzip PostScript) (presentation)
R.G. (Randy) Goebel
from the Proceedings of the Japanese Society for Artificial Intelligence, Foundations of AI Special Interest Group Workshop on Inductive Logic Programming, pages 39--49, Hokkaido University, Sapporo, Japan, July 31--August 2, 1996.

Abstract The increasing popularity of inductive logic programming (ILP) has provided one clear demonstration that machine learning has become practical. Despite its relatively conservative basis, ILP has natural avenues of both theoretical and practical development. One more general area in which induction has a role is so-called knowledge discovery in databases (KDD), sometimes called data mining. There, too, induction plays a part, but many of the current approaches are based on the creation of abstraction rules, guided by the use of explicit concept hierarchies and hypothesis rankings based on measures like ``support'' and ``confidence'' computed against extensional (ground) databases. We examine some of the directions in KDD, with the goal of identifying ILP research that can gracefully lead KDD to improved methods.
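
As a concrete reminder of the two measures just named, here is a minimal Python sketch that computes ``support'' and ``confidence'' for an association rule against a small extensional (ground) database; the transactions are invented for illustration.

    transactions = [                      # a toy ground database
        {"bread", "butter", "milk"},
        {"bread", "butter"},
        {"bread", "milk"},
        {"butter", "milk"},
        {"bread", "butter", "jam"},
    ]

    def support(itemset):
        # Fraction of transactions containing every item in the itemset.
        return sum(itemset <= t for t in transactions) / len(transactions)

    def confidence(antecedent, consequent):
        # Estimate of P(consequent | antecedent).
        return support(antecedent | consequent) / support(antecedent)

    print(support({"bread", "butter"}))         # 3/5 = 0.6
    print(confidence({"bread"}, {"butter"}))    # 0.6 / 0.8 = 0.75
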
Anytime default inference (PostScript), (gzip PostScript)
Aditya K. Ghose and R.G. (Randy) Goebel
manuscript draft of May 18, 1995

Abstract Default reasoning plays a fundamental role in a variety of information processing applications. Default inference is inherently computationally hard, and practical applications, especially time-bounded ones, may require that some notion of approximate inference be used. Anytime algorithms are a useful conceptualization of processes that may be prematurely terminated whenever necessary to return useful partial answers, with the quality of the answers improving in a well-defined manner with time. In this paper, we develop a repertoire of meaningful partial solutions for default inference problems and use these as the basis for specifying general classes of anytime default inference algorithms. We then present some of our earlier results on the connection between partial constraint satisfaction and default reasoning, and exploit this connection to identify a large space of possible algorithms for default inference based on partial constraint satisfaction techniques, which are inherently anytime in nature. The connection is useful because a number of existing techniques from the area of partial constraint satisfaction can be applied with little or no modification to default inference problems, and because tractable cases of partial constraint satisfaction suggest tractable default inference problems.
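
The sketch below is an assumption about the general shape of such algorithms rather than the paper's method: it treats default inference as a search for consistent subsets of defaults, and yields a usable partial answer at every step, so the caller may stop it whenever time runs out.

    from itertools import combinations

    def consistent(literals):
        # Toy consistency check: no proposition occurs with its negation.
        return not any(("-" + p) in literals
                       for p in literals if not p.startswith("-"))

    def anytime_extensions(facts, defaults):
        # Yield ever-larger consistent sets of facts plus assumed defaults;
        # answer quality improves in a well-defined manner with time.
        yield set(facts)                  # available immediately: facts alone
        for k in range(1, len(defaults) + 1):
            for subset in combinations(defaults, k):
                candidate = set(facts) | set(subset)
                if consistent(candidate):
                    yield candidate       # a better partial answer
                    break                 # then try to assume even more

    facts = {"bird", "-flies"}            # e.g. a penguin: a bird that cannot fly
    defaults = ["flies", "has_wings", "lays_eggs"]
    for answer in anytime_extensions(facts, defaults):
        print(sorted(answer))             # interruptible at any iteration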

Characteristics of multiplexed variable bit rate video sources (PostScript), (gzip PostScript)
Guohu Huang and R.G. (Randy) Goebel
manuscript draft of March 20, 1996

Abstract Multimedia applications will play a crucial role in future high-speed networks. Despite improvements in network switching and transmission capacity, the anticipated volume of multimedia traffic will remain a challenge, and any results which characterize multimedia data streams will be useful in understanding how to improve their distribution over digital networks. Clarifying the statistical characteristics of multimedia traffic, especially video traffic, is therefore of great significance. Here we present a statistical analysis of a one-hour sample of the aggregated traffic of three synthetic MPEG video sources. The sample was obtained from a simple network model running on the ATM-TN simulator, which was developed by the Western University Research Consortium on High Performance Computing and Networking (WurcNet) TeleSim Project. Existing results have shown that individual video streams are self-similar, but the complexity of self-similar processes makes theoretical analysis of multiplexed video traffic difficult. We therefore use a simple network model to gather empirical evidence about whether multiplexed video traffic is self-similar; the model is built to detect any smoothing of the multiplexed traffic. Contrary to expectation, we find that the aggregate of self-similar video streams is still self-similar. Our synthetic sample displays most of the properties found in measurement studies of individual video streams, including long-range dependence and self-similarity.
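
For readers unfamiliar with how self-similarity is tested empirically, the following Python sketch shows one standard diagnostic, the aggregated-variance estimate of the Hurst parameter H: for a self-similar series the variance of the m-aggregated series decays like m^(2H-2), so the slope of log-variance against log-block-size yields H. The input here is synthetic i.i.d. noise, not the paper's trace, and the code assumes Python 3.10+ for statistics.linear_regression.

    import math, random, statistics

    def aggregate(series, m):
        # Average the series over non-overlapping blocks of size m.
        n = len(series) // m
        return [statistics.mean(series[i*m:(i+1)*m]) for i in range(n)]

    def hurst_aggvar(series, block_sizes=(1, 2, 4, 8, 16, 32)):
        xs = [math.log(m) for m in block_sizes]
        ys = [math.log(statistics.variance(aggregate(series, m)))
              for m in block_sizes]
        slope = statistics.linear_regression(xs, ys).slope
        return 1 + slope / 2    # Var(X^(m)) ~ m^(2H-2)  =>  H = 1 + slope/2

    iid = [random.gauss(0, 1) for _ in range(4096)]
    print(round(hurst_aggvar(iid), 2))  # ~0.5 for i.i.d. noise; H > 0.5
                                        # would indicate long-range dependence
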
Hyper least general generalization and its application to higher-order concept learning (PostScript), (gzip PostScript)
K. Furukawa, M. Imai, and Randy Goebel
manuscript draft of January 7, 1995

Abstract We propose a simple extension to Popplestone and Plotkin's concept of Least General Generalization (LGG), in order to generalize literals with different predicates. We call the resulting algorithm Hyper Least General Generalization (HLGG). We discuss the importance of HLGG's ability to perform predicate invention, and explore the relationship between HLGG and the folding operation of program transformation. Using the inductive logic programming system GOLEM as a foundation, we apply HLGG to a problem that involves higher-order concept learning, and describe an example which extracts a higher-order concept such as transitivity. Finally, we compare HLGG to Higher Order LGG, as proposed by Feng and Muggleton.
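
To make the notion that HLGG extends concrete, here is a toy Python sketch of Plotkin-style least general generalization (anti-unification) over two terms. Unlike HLGG, it only generalizes terms whose functors already agree, replacing mismatched pairs with shared variables.

    def lgg(t1, t2):
        # Terms are nested tuples (functor, arg1, ...) or atomic constants.
        table, counter = {}, [0]
        def go(a, b):
            if a == b:
                return a
            if (isinstance(a, tuple) and isinstance(b, tuple)
                    and a[0] == b[0] and len(a) == len(b)):
                # Same functor and arity: generalize argument-wise.
                return (a[0],) + tuple(go(x, y) for x, y in zip(a[1:], b[1:]))
            # Mismatch: introduce a variable, reused for repeated pairs so that
            # lgg(p(a,a), p(b,b)) yields p(X1,X1) rather than p(X1,X2).
            if (a, b) not in table:
                counter[0] += 1
                table[(a, b)] = "X%d" % counter[0]
            return table[(a, b)]
        return go(t1, t2)

    # p(f(a), a) and p(f(b), b) generalize to p(f(X1), X1):
    print(lgg(("p", ("f", "a"), "a"), ("p", ("f", "b"), "b")))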