Event-Based Modeling and Processing of Digital Media

Rahul Singh, Georgia Institute of Technology
Zhao Li, Georgia Institute of Technology
Pilho Kim, Georgia Institute of Technology
Derik Pack, Georgia Institute of Technology
Ramesh Jain, Georgia Institute of Technology

Capture, processing, and assimilation of digital media-based information such as video, images, or audio require a unified framework within which signal processing techniques and data modeling and retrieval approaches can act and interact. In this paper, we present the rudiments of such a framework based on the notion of "events". The framework serves the dual roles of a conceptual data model and a prescriptive model that defines the requirements for appropriate signal processing. A key advantage of this framework is that it fundamentally brings together the traditionally distinct disciplines of databases and (various areas of) digital signal processing. In addition to the conceptual event-based framework, we present a physical implementation of the event model. Our implementation specifically targets the processing, storage, and querying of multimedia information related to indoor group-oriented activities such as meetings. Such multimedia information may comprise video, image, audio, and text-based data. We use this application context to illustrate many of the practical challenges encountered in this area, our solutions to them, and the open problems that require research across databases, computer vision, audio processing, and multimedia.
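As a minimal illustrative sketch (not taken from the paper), the following Python snippet shows one way an event-centric record for meeting media could be structured. The class and field names (Event, MediaSegment, overlaps, etc.) are hypothetical and chosen purely for illustration, assuming an event ties together participants, a location, a time interval, and references to the underlying media streams.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List


@dataclass
class MediaSegment:
    """Hypothetical reference to a span of an underlying media stream."""
    modality: str    # e.g. "video", "audio", "image", "text"
    uri: str         # location of the raw media
    start: datetime  # segment start time
    end: datetime    # segment end time


@dataclass
class Event:
    """Hypothetical event record linking media segments to a real-world occurrence."""
    name: str                # e.g. "weekly project meeting"
    participants: List[str]
    location: str
    start: datetime
    end: datetime
    media: List[MediaSegment] = field(default_factory=list)

    def overlaps(self, t0: datetime, t1: datetime) -> bool:
        """True if the event's time interval intersects [t0, t1]."""
        return self.start < t1 and t0 < self.end
```

Under this sketch, a query such as "find all meetings in a given room that overlap a given interval" reduces to filtering Event records with overlaps(), while the associated MediaSegment references point back to the raw video, audio, image, or text data for playback or further signal-level processing.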

Get the slides of the presentation