Optical flow-based motion compensation algorithm for very low-bit-rate video coding

Y. Q. Shi, S. Lin, Ya Qin Zhang

Research output: Contribution to journal › Article › peer-review



In this article, we propose an efficient compression algorithm for very low-bit-rate video applications. The algorithm is based on (a) optical-flow motion estimation to achieve more accurate motion prediction fields; (b) discrete cosine transform (DCT) coding of the motion vectors from the optical-flow estimation to reduce the motion overhead; and (c) an adaptive threshold technique to match the optical-flow motion prediction and minimize the residual errors. Unlike the classic block-matching-based DCT video coding schemes in MPEG-1/2 and H.261/3, the proposed algorithm uses optical flow for motion compensation, and the DCT is applied to the optical-flow field instead of the predictive errors. Thresholding techniques are used to treat different regions so as to complement the optical-flow technique and to code residual data efficiently. While maintaining a PSNR (peak signal-to-noise ratio) and computational complexity comparable to those of ITU-T H.263/TMN5, the reconstructed video frames of the proposed coder are free of annoying blocking artifacts and hence visually much more pleasing. Computer simulations are conducted to show the feasibility and effectiveness of the algorithm. Results at 11 kbps are presented, which can be used for videophone applications in the existing public switched telephone network (PSTN).
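The two core ideas of the abstract — warping the previous frame by a dense optical-flow field instead of block matching, and DCT-coding the flow field itself rather than the prediction residual — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the flow layout (`flow[..., 0]` = horizontal, `flow[..., 1]` = vertical displacement), the nearest-neighbor warp, and the orthonormal DCT helpers are all our own assumptions for demonstration.

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II basis matrix (rows are basis vectors)."""
    k = np.arange(n)
    M = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    M[0] *= np.sqrt(1.0 / n)   # DC row scaling
    M[1:] *= np.sqrt(2.0 / n)  # AC row scaling
    return M

def dct2(block, M):
    """2-D DCT of a square block (applied to each flow component)."""
    return M @ block @ M.T

def idct2(coef, M):
    """Inverse 2-D DCT (M is orthonormal, so the inverse is M.T)."""
    return M.T @ coef @ M

def motion_compensate(prev, flow):
    """Predict the current frame by warping the previous frame
    along a dense per-pixel flow field (nearest-neighbor sampling,
    clamped at the frame border)."""
    h, w = prev.shape
    ys, xs = np.mgrid[0:h, 0:w]
    src_y = np.clip(np.round(ys - flow[..., 1]).astype(int), 0, h - 1)
    src_x = np.clip(np.round(xs - flow[..., 0]).astype(int), 0, w - 1)
    return prev[src_y, src_x]

# Toy demonstration: a current frame that is the previous frame
# shifted down by 2 and right by 3 pixels.
rng = np.random.default_rng(0)
prev = rng.random((16, 16))
cur = np.roll(prev, (2, 3), axis=(0, 1))
flow = np.zeros((16, 16, 2))
flow[..., 0] = 3.0  # horizontal displacement
flow[..., 1] = 2.0  # vertical displacement

predicted = motion_compensate(prev, flow)
# Interior pixels (away from the clamped border) are predicted exactly.
print(np.allclose(predicted[2:, 3:], cur[2:, 3:]))

# DCT-coding the flow field: a smooth flow field concentrates its
# energy in a few low-frequency coefficients, which is what makes
# transmitting the flow (rather than the residual) affordable.
M = dct_matrix(16)
coef_x = dct2(flow[..., 0], M)
recon_x = idct2(coef_x, M)
print(np.allclose(recon_x, flow[..., 0]))
```

In the actual coder the flow field would also be quantized before transmission and an adaptive threshold would decide, per region, whether the flow-compensated prediction is good enough or residual data must be sent; the sketch above only shows the warp and the lossless DCT round trip.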

Original language: English (US)
Pages (from-to): 230-237
Number of pages: 8
Journal: International Journal of Imaging Systems and Technology
Issue number: 4
State: Published - 1998

All Science Journal Classification (ASJC) codes

  • Electronic, Optical and Magnetic Materials
  • Software
  • Computer Vision and Pattern Recognition
  • Electrical and Electronic Engineering


