Measuring Game Lag

Thread in 'Discussion' started by Ezzelin, 25 Jul 2008.

  1. So I was reading Gamasutra the other day and stumbled across an interesting article about measuring controller lag using the 60fps video mode on a digital camera: ... video_.php

    Well, I have one of these cameras, but my wife has it and is away until the middle of next week. I think this is an ingenious solution for figuring out how much lag a game actually has, and I intend to try it out on a number of different Tetris games when I get the camera back. I KNOW my MAME setup has terrible lag on TAP ever since I used Texmaster and felt the lack of lag in that game. I'd be curious to see anyone else's results as well. Perhaps we can compete to see who can find the Tetris setup on the PC with the lowest lag?
    One important note is to make sure you're using a display device with as little lag as possible, so as not to add any extra lag to the results. Using a CRT monitor or TV to do this would be the best. If you need to find your display device's input lag, the standard technique is to set a computer to clone mode with a CRT and the display in question. Then, take a picture of both displays while a stopwatch program is running on the screens. The difference between the two times is the lag difference between the two displays, and since we assume the CRT has little to no lag, the difference is the lag of the other device.
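    A minimal on-screen stopwatch for this clone-mode comparison can be sketched as follows (a hypothetical Python/tkinter example, not any particular stopwatch program; any millisecond timer display works):

```python
import time

def format_elapsed(start: float, now: float) -> str:
    """Elapsed milliseconds, fixed width so it reads cleanly in a photo."""
    return f"{round((now - start) * 1000):8d} ms"

def run_stopwatch():
    """Show a continuously updating millisecond counter. Photograph both
    cloned displays in one shot and subtract the two readings; assuming
    the CRT is lag-free, the difference is the other display's lag."""
    import tkinter as tk  # imported here so format_elapsed stays usable headless
    root = tk.Tk()
    root.title("Display lag stopwatch")
    label = tk.Label(root, font=("Courier", 48))
    label.pack()
    start = time.perf_counter()

    def tick():
        label.config(text=format_elapsed(start, time.perf_counter()))
        root.after(1, tick)  # reschedule as fast as the event loop allows

    tick()
    root.mainloop()

if __name__ == "__main__":
    run_stopwatch()
```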

    Also, there was another article from the same person about the causes of controller lag in games, which was also an interesting read: ... veness.php
  2. We have a camcorder somewhere but I don't know where it is.

    Just for fun I did an informal comparison a few days ago:

    Texmaster -> CRT monitor = 0 lag
    Texmaster -> 15kHz CRT television (composite connection) = 0 lag
    Texmaster -> LCD HDTV (using VGA connection) = ~1 or 2 frames of lag
    Texmaster -> LCD HDTV (using S-Video or Composite connection) = Unplayable, at least 8 frames of lag
    Texmaster -> USB TV Tuner -> CRT monitor = ~2 or 3 frames of lag
    Texmaster -> DVD Recorder -> CRT television = Maybe 1 frame, I have to test this again
  3. thank you a lot for pointing me to this article. it is really enlightening. my favorite quote from that article:

    this gives me something to think about regarding how good "fixed timesteps" really are in light of this article.
  4. some more interesting quotes from user comments:

    interesting.. i always asked myself if usb is fine or not regarding lag.

    yes? howto do that?
  5. tepples (Lockjaw developer)

    For some reason, audio lag on PCs tends to be a lot longer than video lag. So back in November 2002, I ran some tests with VisualBoyAdvance, using a microphone that can hear both my keypresses and the game audio. I made a test rom that would play a sound every time I pressed a button. Then I ran it on an emulator and on my GBA using a flash card. After I reported the results to VBA's maintainer, the emulator got noticeably tighter over the next couple versions, at least in (eyeballed) control-to-video if not control-to-audio.
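    A recording like that can be analyzed with a rough sketch along these lines (hypothetical code, not tepples' actual tooling; it assumes the key click and the game's response sound are the first two bursts above a noise threshold, which real recordings would need more robust onset detection to find):

```python
def find_onsets(samples, threshold, min_gap):
    """Indices where |sample| first exceeds threshold, spaced at least
    min_gap samples apart so one burst isn't counted twice."""
    onsets = []
    last = -min_gap
    for i, s in enumerate(samples):
        if abs(s) >= threshold and i - last >= min_gap:
            onsets.append(i)
            last = i
    return onsets

def control_to_audio_ms(samples, rate, threshold=0.2, min_gap_ms=30):
    """Gap between the first two bursts (key click, then game sound), in ms."""
    min_gap = int(rate * min_gap_ms / 1000)
    onsets = find_onsets(samples, threshold, min_gap)
    if len(onsets) < 2:
        raise ValueError("need both a click and a response in the clip")
    return (onsets[1] - onsets[0]) * 1000 / rate
```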

    The control interface on the NES and on the GBA and DS allows for faster polling of the buttons, but most games don't do that because most games don't run their simulations more often than 60 Hz. I've read somewhere that fighting games might do this to gain accuracy in detecting button press combos.
  6. One thing I'm not sure about is how applicable the lag-causing methods are to PCs compared to consoles. For instance, I don't think PCs need to decide between 60fps and 30fps. (Not that I'm sure why consoles need to make this decision either, for that matter.) Generally PCs use as high a framerate as possible, up to the refresh rate. Still, measuring with 60fps video can give good information, especially if you set the PC's refresh rate to 60Hz. I can see how this method would be very useful for things that always run at a fixed FPS, like Tetris games and emulators.

    To measure USB polling rate, herc, there are two programs out there: mouserate.exe and dx_mouse_timer_dialog.exe. The first measures without using DirectX, and the second measures using DirectInput. Otherwise they function similarly: you move the mouse around quickly and the program tells you the polling rate. There are various utilities out there to change this rate, but some mice poll faster by default as well. My Logitech G3 polls at 500Hz, for example, while the wireless mouse from my Logitech Cordless Comfort Duo only polls at 60Hz, probably to save battery life. I would really be interested in finding a program like mouserate that works for USB HID joysticks as well.
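    The measurement those tools make can be sketched roughly like this (hypothetical code; feed it the arrival timestamps, in seconds, of raw input events collected while moving the mouse around quickly):

```python
def polling_rate_hz(timestamps):
    """Estimate a device's polling rate from event arrival times.
    Uses the median inter-event interval, which resists the occasional
    dropped or coalesced event."""
    if len(timestamps) < 2:
        raise ValueError("need at least two events")
    deltas = sorted(b - a for a, b in zip(timestamps, timestamps[1:]))
    median = deltas[len(deltas) // 2]
    return 1.0 / median
```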

    Also, I've never seen the USB lag that the poster in the comments section noticed. Perhaps it's a SLI specific issue? USB in general seems to not have much lag. This would be an interesting thing to test, however. Perhaps if someone could make a program that displays something when given a configurable input and for certain has absolutely no lag, then lag for various input methods could be tested with a 60fps video camera in much the same way.

    tepples, audio lag is a whole different subject. Still, it can definitely contribute to a sense of lagginess in a game. I would think it wouldn't make as big of a difference, though. I suppose the best way to determine whether audio is causing a perception of lag is to turn the audio off and see if the game still feels laggy. It would be interesting to test various Tetris setups with your microphone setup.
  7. I always find it funny how every time I play console games I feel safest using the cheapest and oldest TV in the house. I think it's even starting to have contact problems that make the image fuzzy toward the bottom for the first minute or so until it warms up. But it doesn't matter, as long as I have no lag. Back when I was playing SSBM I could distinctly notice lag on other systems, since I was a Fox main and was used to wavedash input sequences that were 4 or 5 frames long.

    Of course, games like Guitar Hero are another story completely. Guitar Hero does have lag calibration, and nowadays there's separate calibration for audio and video, which is even better, but there's no way to eliminate the effects of lag entirely. The notes and music can line up just fine, but the note hit/miss effect will still necessarily be delayed, which can still cause minor issues.

    As a side note, you can actually use the lag calibration feature of GH3/Rock band to get a -rough- estimate of the lag introduced by your system. I did this with my TV tuner and found that it had maybe 100ms lag or something like that (don't recall exactly). Which, for GH3 was pretty much unacceptable (luckily I had another method--routing the video from XBox360 into my monitor directly using DVI input). For SSB:Brawl it was roughly acceptable but I doubt I would have liked it in SSBM since that was more fast-paced in general. For Trauma Center:New Blood it didn't really cause any problems at all since that's not a very time-sensitive game.
  8. It's funny: so far I've seen exactly one TV that (apparently) doesn't lag noticeably on a 480i signal the way most TVs today do, and only when you set it to GAME mode.

    So why isn't this the case for most TVs today, when it was for (correct me if I'm wrong here) every single TV in existence over 10 years ago? It looks like a step backwards more than anything.
  9. Of course, it's simple: bright, processed images sell the most TVs in the showroom, and all that processing adds lag. The interesting thing about the technique in the article, however, is that it measures the other part of the lag equation, the part that DOESN'T have to do with the TV, which I hadn't really considered much before. I'm pretty sure games today are laggier overall. Look at Super Mario Bros, for example. I'm pretty sure one of the major things that contributed to the feel of that game was the lack of lag. In fact, I think I remember Miyamoto once saying that the secret to game design was to make the game do what the player wanted the moment the button was pressed. As in, don't lag.
  10. There certainly felt like there was input lag in SMB on the NES (hey, I friggin' pressed jump!), but that may be down to my Famicom limping along at the ripe old age of 20. But that's kind of off topic.
  11. tepples (Lockjaw developer)

    The little I've played of Lumines for PSP sounded extremely laggy. As I understand it, the game delays all sound effects to the next sixteenth note (double croche) in the music. Given the way the game's block clears are synchronized to music, I have a feeling it's intentional, but it still felt like lag to me. Even Luminesweeper through VBA felt less laggy.

    I can test control-to-audio lag:
    • Tetris for NES on an NES, and on Nintendulator, Nestopia, FCE Ultra, and PocketNES
    • Lockjaw for GBA on a DS, a Game Boy Player, VisualBoyAdvance, and NO$GBA, with Lockjaw for PC for comparison
    If I can borrow my cousin's camcorder, I can test control-to-video lag in these situations as well, distinguishing between a CRT SDTV and an LCD HDTV for the Game Boy Player. Which do you want to see first?

    For one thing, TV-sized flat-panel displays are much more common now than then. Unlike CRT SDTVs, flat-panel HDTVs have to take into account several image sizes as well as the fact that half the scanlines are delayed by one field, and resample them all to the 768p or 1080p signal that the panel is expecting. This usually involves one or more frame buffers to analyze the image, even if only to decide which parts of the picture to bob (stretch a single field) and which to weave (combine with previous field). Each frame buffer adds one field (16.7 ms) of lag, and some TVs might use four for best results on material converted from 24-frame progressive to 60-field interlaced (that is, feature films on TV or DVD).
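    The two deinterlacing strategies mentioned there can be sketched in miniature (illustrative code only; real deinterlacers choose bob or weave per region based on motion detection):

```python
def bob(field):
    """Stretch a single field to full height by doubling each scanline:
    no extra field of lag, but only half the vertical detail."""
    return [row for row in field for _ in (0, 1)]

def weave(even_field, odd_field):
    """Interleave two successive fields into one full frame: full detail
    on static images, combing on motion, and one extra field of lag
    spent waiting for the second field."""
    frame = []
    for even_row, odd_row in zip(even_field, odd_field):
        frame.append(even_row)
        frame.append(odd_row)
    return frame
```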

    I think the "game mode" on your TV uses a fixed proportion of "bob" and "weave" to deinterlace the signal. My TV (a Vizio 32" LCD) doesn't have a "game mode", but it exhibits a couple fields less lag through the (progressive) VGA input than through the (interlaced) composite input.

    Most TVs are designed primarily to watch OTA TV, cable TV, or satellite TV, not locally generated video game pictures. If you're watching something that is happening 100 miles away, lag just isn't an engineering factor, even more so if it happened months or years ago and is being played from tape. In addition, people in live-action footage and drawings in animated cartoons tend to change direction less often than objects in a video game. Filmed kung fu is 24 fps, and people accept that.
  12. Then I have to say you're just not the right person for the game, as the 16th-note quantization is not merely intentional; Tetsuya Mizuguchi considers it one of the most vital constituents of his music-oriented games, from Rez onward.
  13. I think the new Space Invaders Extreme does something like this as well, to good effect. All the shots go with the music. As for using a camcorder to test for lag, I'm not sure it will work nearly as well. Don't camcorders capture at around 30fps? I think one of the key things for this technique is to use a 60fps video capture device, as it allows a much more exact 1:1 ratio of frames displayed versus frames captured.
  14. If it's interlaced then it's just as good.
  15. tepples (Lockjaw developer)

    What kotetsu213 means is that analog camcorders capture at 480i or 576i, which has 60 or 50 fields per second respectively. My cousin's AIPTEK HD camcorder captures at 480p with 60 fps to .mov files on an SDHC card.
  16. See, that's what I'm talking about. 480p at 60fps. I really hate interlacing; I just wish it would die. There's really no reason for it anymore. Anyway, yes, 60 fields per second would work fairly well too, it's just not quite as simple as straight progressive 60fps.
  17. Found the camcorder and did these tests:

    # of frames of lag compared to a CRT monitor:

    CRT SDTV composite @ 720x480
    -1 - 0 frames

    CRT SDTV composite @ 640x480
    0 - 1 frames (closer to 0 than 1)

    DVD Recorder composite -> CRT SDTV composite @ 720x480
    0 - 1 frames (almost always 1)

    DVD Recorder composite -> CRT SDTV composite @ 640x480
    0 - 1 frames (almost always 1)

    LCD HDTV VGA @ 640x480
    1 - 2 frames (almost always 2)

    LCD HDTV VGA @ 1360x768
    2 frames

    HDTV S-Video @ 720x480
    6 - 7 frames

    Some of these numbers are weird but I don't feel like going back and doing them over again. Maybe later...

    (by the way, none of this means anything to you unless you have the same HDTV/camcorder/etc that I do)
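    For anyone converting these counts, frames translate to milliseconds at a 60fps capture rate with a trivial sketch like this:

```python
def frames_to_ms(frames, fps=60):
    """One frame at 60 fps is ~16.7 ms, so the 6-7 frame S-Video
    result above is roughly 100-117 ms of lag."""
    return frames * 1000.0 / fps
```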
  18. there still is a reason for interlacing: it lets manufacturers sell stuff labelled "fullHD" while only offering 1080i on sluggish, slow hardware... just another case of taking the piss out of customers...

    interlacing is probably one of the worst compromise inventions i can think of.
  19. For the 30s? I think it was pretty clever.
  20. Muf

    I think it's pretty clever compression in general, it literally halves the required bandwidth to transmit a signal that looks almost as good as the full picture. And if you look at modern deinterlacing circuits like Faroudja DCDi, one can imagine why 1080i is generally considered an "okay" alternative to 1080p, which, at the time that HDTV standards were ratified, was simply too heavy on hardware with the state of technology at that time. I also sincerely doubt that 90% or an even higher percentage of the population would be able to reliably tell the difference between a properly deinterlaced 1080i signal and a true 1080p signal. Mind you, most people can't even tell the difference between 720p and 1080p, or even 480p and 720p.
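    The "halves the bandwidth" claim checks out with simple arithmetic (a rough sketch; blanking intervals and chroma subsampling are ignored):

```python
def pixel_rate(width, lines_per_pass, passes_per_second):
    """Raw pixels per second for one scanning scheme."""
    return width * lines_per_pass * passes_per_second

# 1080i sends 540-line fields 60 times a second;
# 1080p sends full 1080-line frames 60 times a second.
rate_1080i = pixel_rate(1920, 540, 60)
rate_1080p = pixel_rate(1920, 1080, 60)
```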
