In this paper, we present a generalized, physically based model of erosion caused by water flow, built on ideas from fluid mechanics. The model uses the solution of the Navier-Stokes equations to provide the dynamics of velocity and pressure. These equations form the basis for balancing erosion and deposition, which determines the changes in the layers between water and erodible material. The eroded material is captured and relocated by the water according to a material transport equation. The resulting model is fully 3D and is able to simulate a variety of phenomena, including river meanders, sediment wash on low hills, natural water springs, and receding waterfalls. The model captures terrain morphogenesis and can be used for animation as well as static scene generation.
This paper introduces LocalLines -- a robust, high-resolution line detector that operates in linear time. LocalLines tolerates noisy images well and can be tuned for various specialized applications by adjusting configurable parameters, such as the mask values and mask size. As described in the paper, the resolution of LocalLines is the maximum that can be justified for pixelized data. Despite this high resolution, LocalLines has linear asymptotic complexity in the number of pixels in an image. The paper also provides a comparison of LocalLines with the prevalent Hough Transform line detector.
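The linear-time, mask-based design described in the abstract can be sketched generically: a fixed set of small masks is applied at every pixel, giving constant work per pixel and hence linear overall complexity. The masks and threshold below are classic 3x3 line-detection masks used as illustrative stand-ins, not the actual LocalLines parameters.

```python
import numpy as np

# Classic 3x3 line-detection masks, standing in for the configurable
# masks the paper describes (both mask values and mask size are tunable).
MASKS = {
    "horizontal": np.array([[-1, -1, -1],
                            [ 2,  2,  2],
                            [-1, -1, -1]], dtype=float),
    "vertical":   np.array([[-1,  2, -1],
                            [-1,  2, -1],
                            [-1,  2, -1]], dtype=float),
}

def detect_lines(img, threshold=4.0):
    """Apply each mask at every interior pixel (constant work per pixel,
    hence linear time in the number of pixels) and keep responses that
    exceed `threshold`."""
    H, W = img.shape
    hits = []
    for name, mask in MASKS.items():
        for y in range(1, H - 1):
            for x in range(1, W - 1):
                resp = float((img[y - 1:y + 2, x - 1:x + 2] * mask).sum())
                if resp > threshold:
                    hits.append((y, x, name))
    return hits
```

Lowering the threshold or adding more mask orientations widens what counts as a line, which mirrors the configurability the abstract mentions.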
This paper describes an indexing scheme that can perform image database indexing and retrieval without having to uncompress the images. It relies on the JPEG compression of the images and the distribution of their luminance values to develop an index and perform retrieval. It also allows configurable parameters to be adjusted to increase or decrease the number of images that match a query, and to limit the number of images in the output of a query. The system is tested using an image database from the Smithsonian Institution.
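A generic sketch of luminance-distribution indexing, under the assumption that per-block luminance values (e.g. the DCT DC terms JPEG already stores) are available without full decompression; the histogram signature, L1 distance, and both query knobs are illustrative choices, not the paper's exact scheme.

```python
import numpy as np

def luminance_signature(block_luminance, bins=8, lo=0.0, hi=255.0):
    """Normalized histogram of per-block luminance values."""
    hist, _ = np.histogram(block_luminance, bins=bins, range=(lo, hi))
    return hist / max(hist.sum(), 1)

def query(index, probe_sig, threshold=0.25, limit=10):
    """Return up to `limit` image ids whose signature is within `threshold`
    (L1 distance) of the probe. The two knobs mirror the configurable
    parameters that widen or narrow the match set and cap the output size."""
    dists = sorted((np.abs(sig - probe_sig).sum(), img_id)
                   for img_id, sig in index.items())
    return [img_id for d, img_id in dists if d <= threshold][:limit]
```

Raising `threshold` admits more matches; `limit` caps how many are returned, independent of how many matched.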
In this paper, we describe techniques for creating clusters of documents to improve the efficiency and effectiveness of information retrieval systems. The clusters are created using knowledge acquisition techniques based on personal construct theory. Our technique minimizes the time required to transfer from an existing manual clustering scheme for a collection. The technique is also applicable in other areas, such as routing messages to appropriate destinations.
In this paper, we build on our previous work on the use of perfect hash tables in image databases. The technique advanced in this paper has a moderate effect on retrieval performance but allows the database to be updated dynamically through the insertion and deletion of images. The technique is demonstrated through an asymptotic analysis of the new algorithms.
In this paper, we advance the use of knowledge acquisition techniques to develop a user profile. The profile can be used to customize queries in information retrieval systems so that the user receives only the documents he or she is looking for, with no spurious documents. In addition, the user should receive almost all the documents in the collection that are relevant to the current search. The system also ranks the documents in order of relevance to the query.
This paper describes human capabilities to recognize images of human faces that have been degraded electronically in terms of the number of pixels and the number of gray-scale levels. It describes the probability of an image being perceived as a human face at different levels of degradation. The results were collected from a large sample (over 200,000 trials) using an image database of over 400 original images, with each image degraded into 144 different images, yielding a database of over 57,000 images.
This paper improves on the algorithm for creating perfect hash tables in image database systems that was reported in an earlier paper. The new algorithm allows the associated values in the hash table to be negative by intelligently computing the starting point for each value. The paper also presents a detailed mathematical analysis of the algorithm.
This paper advances a new heuristic algorithm for creating hash tables in image database systems. The hash table is constructed by computing associated values for each picture object in the images contained in the database, and it allows one-step O(1) retrieval of images that contain a specified pattern. The algorithm is compared with an existing algorithm for creating hash tables using standard data sets from the literature.
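The associated-values idea can be illustrated with a minimal sketch: search for one integer value per object label such that every pattern of labels hashes to a distinct table slot, after which retrieval is a single hash computation. The brute-force search, the value range, and the sample patterns below are all illustrative assumptions, not the paper's heuristic.

```python
from itertools import product

def build_perfect_hash(patterns, table_size, max_value=10):
    """Find per-label associated values so that every pattern hashes to a
    distinct slot: h(p) = sum(assoc[label] for label in p) % table_size.
    Exhaustive search over a small value range, for illustration only;
    the paper's heuristic replaces this search."""
    labels = sorted({lab for p in patterns for lab in p})
    for values in product(range(max_value), repeat=len(labels)):
        assoc = dict(zip(labels, values))
        slots = {sum(assoc[l] for l in p) % table_size for p in patterns}
        if len(slots) == len(patterns):  # collision-free => perfect hash
            return assoc
    return None
```

Once `assoc` is built, looking up which image contains a pattern is O(1): compute the sum of the pattern's associated values modulo the table size and read that slot directly.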
This paper describes a texture synthesis technique that creates a large texture by wrapping patches of a small texture around in such a way that the repetition of the small texture is not noticeable. The technique selects small rectangular patches of texture from random areas of an input texture, provides a placement for these sub-textures, and provides a smooth blend across them. The algorithm can create large isotropic textures from a given anisotropic texture by using only the desired areas in the synthesized texture.
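The smooth-blend step between neighboring patches can be sketched with a simple linear cross-fade over an overlap region; this is a generic illustration of blending adjacent sub-textures, not the paper's specific blending function.

```python
import numpy as np

def blend_patches(left, right, overlap):
    """Join two grayscale patches horizontally, cross-fading linearly
    across `overlap` columns so no hard seam is visible. Assumes both
    patches have the same height and at least `overlap` columns."""
    alpha = np.linspace(1.0, 0.0, overlap)   # 1 -> keep left, 0 -> keep right
    seam = left[:, -overlap:] * alpha + right[:, :overlap] * (1 - alpha)
    return np.concatenate([left[:, :-overlap], seam, right[:, overlap:]], axis=1)
```

At the start of the overlap the output equals the left patch and at the end it equals the right patch, so each patch transitions smoothly into its neighbor.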
In this paper, I present a technique to create texture patterns that tile seamlessly against each other to form larger texture patterns. I describe the algorithm and the implementation of a tool that creates such texture patterns, called isotropic toroidal texture patterns, from any given image without human intervention.
This paper describes technical and mathematical solutions for simulating infrared sensor effects. We have implemented our simulation using a PC running Windows NT and off-the-shelf image processing hardware and software. In particular, we describe the computation of the dynamic characteristics of the actual sensor package within the constraints of the hardware and software environment. These characteristics include video polarity, gain, contrast enhancement, noise, blurring, AC coupling, sensor defects, and video overlays (reticles/test patterns), and they are applied in the post-processor phase. The paper describes the research and development of the algorithms needed to support the sensor simulation.
This paper describes a new technique I developed to create an index in image databases from the compressed images without uncompressing them. The need for such a technique was emphasized in the paper describing the QBIC system, published in September 1995 in IEEE Computer. I am currently working on further experiments that will be reported in a journal article.
This paper proposes the idea of a hash table that is dynamically modifiable under certain constraints. Different algorithms to insert, delete, and update entries in the hash table structure are described.
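As a generic illustration of a hash table that supports insertion, deletion, and update (the paper's constraint-specific algorithms are not reproduced here), one common approach is open addressing with tombstone deletion, sketched below; all names are illustrative.

```python
class DynamicHashTable:
    """Open-addressed hash table where deletion leaves a tombstone marker,
    so later probes still find keys that were inserted past the deleted
    slot. Inserting over a tombstone reclaims the slot."""
    _EMPTY, _TOMB = object(), object()

    def __init__(self, size=17):
        self.keys = [self._EMPTY] * size
        self.vals = [None] * size

    def _slots(self, key):
        """Linear probe sequence starting from the key's home slot."""
        h = hash(key) % len(self.keys)
        for i in range(len(self.keys)):
            yield (h + i) % len(self.keys)

    def insert(self, key, val):
        for s in self._slots(key):
            if self.keys[s] is self._EMPTY or self.keys[s] is self._TOMB \
                    or self.keys[s] == key:
                self.keys[s], self.vals[s] = key, val
                return
        raise RuntimeError("table full")

    def lookup(self, key):
        for s in self._slots(key):
            if self.keys[s] is self._EMPTY:
                return None          # never-used slot reached: key absent
            if self.keys[s] == key:
                return self.vals[s]
        return None

    def delete(self, key):
        for s in self._slots(key):
            if self.keys[s] is self._EMPTY:
                return               # key absent
            if self.keys[s] == key:
                self.keys[s] = self._TOMB
                return
```

Update falls out of `insert`: inserting an existing key overwrites its value in place.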
This paper describes preliminary results from a psychophysical experiment we conducted to determine the capabilities of human perception in recognizing human faces in degraded images. An expanded version of this paper was subsequently published as a journal article.
Conventional repertory grids are constructed by asking the interviewee to specify an integer-valued rating of an entity on a concept. We relaxed this restriction so that the interviewee can specify the rating as an interval, capturing the uncertainty in his or her thinking. In this paper, we propose an analysis method to develop a concept dependence tree from the elicited interval-valued ratings.
This paper describes the effects of adding a new heuristic to the ones we developed in an earlier paper. The new heuristic allows negative associated values for symbolic objects.
An expanded version of this paper was subsequently published in Pattern Recognition.
In this paper, we proposed the idea of using interval-valued repertory grids. The grids are analyzed by adapting a standard analysis technique to interval values. The technique was subsequently refined and published in the paper presented at Baden-Baden in 1994.