Video shot boundary detection based on frames objects comparison and scale-invariant feature transform technique

Noor Khalid Ibrahim, Zinah Sadeq Abduljabbar

Abstract


Video is one of the most popular sources of data on the Internet and carries a large amount of information. Video structure analysis, through content-based video indexing and retrieval, aims to automate the management, indexing, and retrieval of videos. Recognizing shot changes is essential for video analysis, since shot boundary detection is a preliminary stage in the indexing, browsing, and retrieval of video material. In this context, a method for shot boundary detection (SBD) is proposed. The proposed system consists of three stages. In the first stage, frames are read in temporal sequence and converted to grayscale images; redundant frames within the same shot are then discarded based on a comparison of correlation values, which reduces processing time and computational complexity. In the second stage, candidate transitions are identified by comparing the objects of successive frames and analyzing the differences between them using the standard deviation metric. In the final stage, cut transitions are confirmed by matching key points with the scale-invariant feature transform (SIFT). The proposed system achieves an F-score of 0.97 while keeping time consumption low.
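As an illustration of the pipeline outlined above, the following is a minimal sketch of how the correlation-based redundancy filter (stage 1) and the SIFT key-point confirmation (stage 3) could be combined using OpenCV. The function name, the thresholds, and the omission of the object-comparison stage are assumptions made for illustration; this is not the authors' implementation.

```python
import cv2
import numpy as np

def detect_cuts(video_path, corr_thresh=0.9, min_good_matches=20):
    """Flag candidate cut transitions between consecutive frames.

    corr_thresh and min_good_matches are illustrative values only,
    not the tuned parameters from the paper.
    """
    sift = cv2.SIFT_create()
    matcher = cv2.BFMatcher()
    cap = cv2.VideoCapture(video_path)

    cuts, prev_gray, idx = [], None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev_gray is not None:
            # Stage 1: treat highly correlated consecutive frames as
            # redundant and skip further analysis for them.
            corr = np.corrcoef(prev_gray.ravel(), gray.ravel())[0, 1]
            if corr < corr_thresh:
                # Stage 3 (simplified): confirm the cut by counting
                # SIFT key-point matches that pass Lowe's ratio test.
                _, des1 = sift.detectAndCompute(prev_gray, None)
                _, des2 = sift.detectAndCompute(gray, None)
                good = 0
                if des1 is not None and des2 is not None:
                    for pair in matcher.knnMatch(des1, des2, k=2):
                        if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
                            good += 1
                if good < min_good_matches:
                    cuts.append(idx)  # few matches -> likely a cut
        prev_gray = gray
        idx += 1

    cap.release()
    return cuts
```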


Keywords


Frames correlation; Object comparison; Shot boundary; Video analysis; Video segmentation



DOI: https://doi.org/10.11591/csit.v5i2.p130-139



Computer Science and Information Technologies
ISSN: 2722-323X, e-ISSN: 2722-3221
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.