Measuring Game Lag

Thread in 'Discussion' started by Ezzelin, 25 Jul 2008.

  1. If you're talking about 24fps film material, then I suppose 1080i is fine, since if you do the right pulldown you'll get a 24fps progressive signal back anyway (a quick sketch of this follows below). But then they should have just broadcast in 1080p 24fps and been done with it, and used the same bandwidth if not less. However, it's really annoying when you have to deinterlace genuinely interlaced content, such as footage from a camcorder. All that can do is degrade quality in annoying ways and cause all sorts of artifacts.
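
     As an illustration of the pulldown mentioned in the post above, here is a minimal sketch (hypothetical frame/field labels in Python, not any decoder's actual code) showing that 3:2 pulldown of 24p material into 60 fields per second loses nothing and can be undone exactly:

         # Sketch: 3:2 pulldown of 24p film into 60 fields/s and back (inverse telecine).
         # Labels stand in for pixel data; real pulldown also alternates field order,
         # which this sketch glosses over.
         def telecine_32(frames):
             fields = []
             for i, f in enumerate(frames):
                 top, bot = (f, "top"), (f, "bottom")
                 # Every other frame gets its first field repeated: 2, 3, 2, 3, ... fields.
                 fields += [top, bot] if i % 2 == 0 else [top, bot, top]
             return fields

         def inverse_telecine(fields):
             # Drop the repeated fields and weave the rest back into progressive frames.
             frames, seen = [], set()
             for frame, _parity in fields:
                 if frame not in seen:
                     seen.add(frame)
                     frames.append(frame)
             return frames

         film = [f"frame{n}" for n in range(24)]         # one second of 24p film
         interlaced = telecine_32(film)                  # 60 fields after 3:2 pulldown
         print(len(interlaced))                          # 60
         print(inverse_telecine(interlaced) == film)     # True: the 24p signal comes back intact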
     
  2. tepples

    tepples Lockjaw developer

     In theory, 1080i at 60 fields would use only an insignificantly larger data rate than 1080i at 48 fields, as the extra 12 fields per second are signalled with nothing more than a "repeat the field from 2 fields ago" flag (the arithmetic is sketched below). So is there a reason that 1080i at 48 fields would use more bandwidth than 1080p at 24fps?
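
     For the record, the field arithmetic behind that question works out like this (a back-of-the-envelope sketch in Python; the counts are just the 3:2 cadence, not measured bitrates):

         # 24 film frames/s contain 48 unique fields/s; showing them at a 60-field/s
         # cadence needs 12 extra fields/s, which a telecined stream signals with
         # repeat flags instead of coding new picture data.
         frames_per_sec = 24
         unique_fields  = frames_per_sec * 2              # 48 fields that actually carry picture data
         display_fields = 60
         repeated       = display_fields - unique_fields  # 12 "repeat" indications per second
         print(unique_fields, repeated)                   # 48 12 -> the repeats cost flag bits, not field data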
     
  3. Muf

     Tepples is right. Properly authored telecine MPEG streams are in fact coded as 24fps progressive (so not 48 fields but 24 frames, which is a crucial difference in MPEG coding: the macroblocks can be coded progressively and there is no interlaced chroma offset), with so-called RFF (Repeat First Field) information telling the decoder how to display them on interlaced 60Hz displays; the flag cadence is sketched below. This has been the case ever since DVD. These flags take up mere bytes in the bitstream, and modern DVD and Blu-ray players equipped with HDMI will in fact ignore them and simply output 24fps progressive over a 1080p/24 HDMI link for displays that support that video mode (1080i/p at 60Hz otherwise).
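
     To make the flag mechanism concrete, here is a rough sketch of the usual 3:2 flag cadence, assuming the standard MPEG-2 picture flags repeat_first_field and top_field_first (the helper below is illustrative, not any player's actual code; a player that ignores the flags simply emits the 24 coded frames as 1080p/24):

         # Sketch: deriving 60 output fields/s from 24 coded progressive frames
         # using per-picture repeat_first_field (RFF) and top_field_first (TFF).
         CADENCE = [  # (top_field_first, repeat_first_field), repeating every 4 frames
             (True,  True),   # frame A: top, bottom, top      (3 fields)
             (False, False),  # frame B: bottom, top           (2 fields)
             (False, True),   # frame C: bottom, top, bottom   (3 fields)
             (True,  False),  # frame D: top, bottom           (2 fields)
         ]

         def fields_for_one_second(coded_frames=24):
             out = []
             for i in range(coded_frames):
                 tff, rff = CADENCE[i % 4]
                 order = ["top", "bottom"] if tff else ["bottom", "top"]
                 if rff:
                     order.append(order[0])  # repeat the first field of this frame
                 out += [(i, f) for f in order]
             return out

         print(len(fields_for_one_second()))  # 60 fields displayed, from only 24 coded frames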
     
