Conference Paper

A core for ambient and mobile intelligent imaging applications

Institute of Microelectronic Systems, University of Hannover, Germany
DOI: 10.1109/ICME.2003.1221538 · Conference: 2003 International Conference on Multimedia and Expo (ICME '03), Proceedings, Volume 2
Source: IEEE Xplore

ABSTRACT: This paper describes work in progress in the European IST-2001-34410 CAMELLIA project, which focuses on the platform-based development of a smart imaging core to be embedded in smart cameras. The work within the project comprises the specification and implementation of smart imaging applications, including the development of the required new algorithms and hardware. Starting from an existing video encoding architecture for the MPEG-4 Simple Profile, the aim is to design a smart imaging core suitable for automotive and mobile communication applications; to this end, the encoding architecture is to be extended with processing units for low- and mid-level smart imaging functions. To demonstrate the applicability of this platform-based development, a first approach to motion-estimation-based background detection using a hardware motion estimation unit is illustrated in this paper. Furthermore, first results of employing this background detection to detect moving objects in video sequences, obtained with a functional simulator of a video encoding architecture, are presented.
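The background detection mentioned in the abstract is only outlined there; a minimal sketch of one way to realise it is given below. This is an illustration under stated assumptions, not the CAMELLIA implementation: 16x16 macroblocks, a software full-search SAD block matcher standing in for the hardware motion estimation unit, the per-component median vector as an estimate of the global (camera) motion, and hypothetical names and parameters (detect_background, block_mv, thresh).

```c
/* Minimal sketch (not the CAMELLIA implementation): background detection
 * from block motion vectors.  A software full-search SAD matcher stands in
 * for the hardware motion estimation unit; blocks whose vector deviates
 * from the dominant (median) motion are flagged as moving foreground,
 * all other blocks as background. */
#include <stdlib.h>
#include <limits.h>

#define BLK   16   /* macroblock size             */
#define RANGE  8   /* +/- search range in pixels  */

typedef struct { int dx, dy; } MV;

/* Sum of absolute differences between a block in cur and a displaced block in ref. */
static int sad(const unsigned char *cur, const unsigned char *ref,
               int w, int h, int bx, int by, int dx, int dy)
{
    int s = 0;
    for (int y = 0; y < BLK; ++y)
        for (int x = 0; x < BLK; ++x) {
            int cx = bx + x, cy = by + y;
            int rx = cx + dx, ry = cy + dy;
            if (rx < 0 || ry < 0 || rx >= w || ry >= h)
                return INT_MAX;                  /* candidate leaves the frame */
            s += abs(cur[cy * w + cx] - ref[ry * w + rx]);
        }
    return s;
}

/* Full-search motion estimation for one block (stand-in for the hardware unit). */
static MV block_mv(const unsigned char *cur, const unsigned char *ref,
                   int w, int h, int bx, int by)
{
    MV best = { 0, 0 };
    int best_sad = sad(cur, ref, w, h, bx, by, 0, 0);
    for (int dy = -RANGE; dy <= RANGE; ++dy)
        for (int dx = -RANGE; dx <= RANGE; ++dx) {
            int s = sad(cur, ref, w, h, bx, by, dx, dy);
            if (s < best_sad) { best_sad = s; best.dx = dx; best.dy = dy; }
        }
    return best;
}

static int cmp_int(const void *a, const void *b) { return *(const int *)a - *(const int *)b; }

/* Fill block_mask (one byte per block): 0 = background, 1 = moving object.
 * The global (camera) motion is approximated by the per-component median vector. */
void detect_background(const unsigned char *cur, const unsigned char *ref,
                       int w, int h, unsigned char *block_mask, int thresh)
{
    int bw = w / BLK, bh = h / BLK, n = bw * bh;
    MV  *mv = malloc(n * sizeof *mv);
    int *xs = malloc(n * sizeof *xs), *ys = malloc(n * sizeof *ys);

    for (int by = 0; by < bh; ++by)
        for (int bx = 0; bx < bw; ++bx) {
            int i = by * bw + bx;
            mv[i] = block_mv(cur, ref, w, h, bx * BLK, by * BLK);
            xs[i] = mv[i].dx;
            ys[i] = mv[i].dy;
        }

    qsort(xs, n, sizeof *xs, cmp_int);
    qsort(ys, n, sizeof *ys, cmp_int);
    int gx = xs[n / 2], gy = ys[n / 2];          /* estimated global motion */

    for (int i = 0; i < n; ++i)
        block_mask[i] = (abs(mv[i].dx - gx) + abs(mv[i].dy - gy)) > thresh;

    free(mv); free(xs); free(ys);
}
```

In the project itself the block vectors would be delivered by the encoder's hardware motion estimation unit, so only the classification step at the end would remain in software.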

Related publications:
  • ABSTRACT: The gap between application-specific integrated circuits (ASICs) and general-purpose programmable processors in terms of performance, power, cost, and flexibility is well known. Application-specific instruction-set processors (ASIPs) bridge this gap. In this work, we demonstrate the key benefits of ASIPs for several video applications. One of the most compute- and memory-intensive functions in video processing is motion estimation (ME). The focus of this work is the design of an ME template that is useful for several video applications such as video encoding, obstacle detection, picture-rate up-conversion, and 2-D-to-3-D video conversion. An instruction set suitable for performing a variety of ME functions is developed. The ASIP is based on a very long instruction word (VLIW) processor template and meets low-power and low-cost requirements while still providing the flexibility needed for the application domain. The ME ASIP design consumes 27 mW and occupies an area of 1.1 mm² in 0.13 µm technology when performing picture-rate up-conversion at standard-definition (CCIR 601) resolution and 50 frames per second.
    IEEE Transactions on Circuits and Systems for Video Technology, 05/2005.
  • ABSTRACT: This survey addresses a number of challenges and research areas identified in real-time image processing for state-of-the-art hand-held device implementation in networked electronic media. The challenges appear when processing algorithms have to be developed and mapped not only onto fading, noisy, and multi-path band-limited transmission channels, but more specifically onto the limited resources available for decoding and scalable rendering on battery-limited hand-held mobile devices. Networked electronic media requires scalable video coding, which in turn introduces additional degradation. These problems raise complex issues discussed in the paper. The need to extend, modify, and even create new algorithms and tools, targeting architectures, technology platforms, and design techniques as well as scalability, computational load, and energy efficiency, has established itself as a key research area. A multidisciplinary approach is advocated.
    Journal of Real-Time Image Processing, 01/2006; 1:9-23.
  • ABSTRACT: This paper documents the development of a camera that combines real-time video encoding with the tracking of a moving non-rigid object. The object tracker reuses the motion vectors generated for video encoding by grouping similar vectors together. These motion vectors are produced using block-matching motion estimation. A rectangular representation of the moving object is then formed from these block clusters. The system was first implemented as a PC-based prototype as a proof of concept and then ported onto an Altera Nios II soft-core processor, where hardware-based support was used to meet the application's timing requirements. Practical tests have shown that it is feasible to extend an embedded real-time video encoder to track non-rigid objects using the prototype developed in this project.
    (A minimal sketch of the motion-vector grouping step is given after this list.)
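The last entry above forms a rectangular object representation by grouping similar encoder motion vectors into block clusters. The following is a minimal sketch of such a grouping step, not the prototype's code: the block vector field and a moving-block mask are assumed to be given (for instance by a routine like the hypothetical detect_background sketched earlier), 4-connected blocks are merged when their vectors differ by at most a small threshold, and the bounding rectangle of the largest cluster is returned in block units; track_object and mv_thresh are illustrative names.

```c
/* Minimal sketch (not the camera prototype's code): group block motion
 * vectors into clusters of similar, moving blocks and return the bounding
 * rectangle of the largest cluster.  The vector field and the moving-block
 * mask are assumed given, e.g. by the hypothetical detect_background()
 * sketched earlier; all coordinates are in block units. */
#include <stdlib.h>

typedef struct { int dx, dy; } MV;
typedef struct { int x0, y0, x1, y1; } Rect;

/* 4-neighbour flood fill: a neighbouring block joins the cluster when it is
 * moving and its vector differs from the current block's by at most mv_thresh. */
static void grow(const MV *mv, const unsigned char *mask, int *label,
                 int bw, int bh, int bx, int by, int id, int mv_thresh)
{
    int i = by * bw + bx;
    if (!mask[i] || label[i] != 0) return;
    label[i] = id;

    static const int nx[4] = { 1, -1, 0, 0 }, ny[4] = { 0, 0, 1, -1 };
    for (int k = 0; k < 4; ++k) {
        int jx = bx + nx[k], jy = by + ny[k];
        if (jx < 0 || jy < 0 || jx >= bw || jy >= bh) continue;
        int j = jy * bw + jx;
        if (mask[j] && label[j] == 0 &&
            abs(mv[j].dx - mv[i].dx) + abs(mv[j].dy - mv[i].dy) <= mv_thresh)
            grow(mv, mask, label, bw, bh, jx, jy, id, mv_thresh);
    }
}

/* Bounding rectangle of the largest cluster of moving blocks
 * (x1 < x0 in the result means no moving object was found). */
Rect track_object(const MV *mv, const unsigned char *mask,
                  int bw, int bh, int mv_thresh)
{
    int *label = calloc((size_t)bw * bh, sizeof *label);
    int ids = 0;

    for (int by = 0; by < bh; ++by)
        for (int bx = 0; bx < bw; ++bx)
            if (mask[by * bw + bx] && label[by * bw + bx] == 0)
                grow(mv, mask, label, bw, bh, bx, by, ++ids, mv_thresh);

    /* pick the cluster containing the most blocks */
    int best = 0, best_count = 0;
    for (int id = 1; id <= ids; ++id) {
        int count = 0;
        for (int i = 0; i < bw * bh; ++i)
            count += (label[i] == id);
        if (count > best_count) { best_count = count; best = id; }
    }

    Rect r = { bw, bh, -1, -1 };
    if (best_count > 0)
        for (int by = 0; by < bh; ++by)
            for (int bx = 0; bx < bw; ++bx)
                if (label[by * bw + bx] == best) {
                    if (bx < r.x0) r.x0 = bx;
                    if (by < r.y0) r.y0 = by;
                    if (bx > r.x1) r.x1 = bx;
                    if (by > r.y1) r.y1 = by;
                }
    free(label);
    return r;
}
```

Recursive flood fill keeps the sketch short; an embedded implementation would more likely use an iterative two-pass labelling to bound stack usage.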
