Thread in 'Discussion' started by jkwon23, 28 May 2016.
kevin said it felt like TAP
Arika actually did the ports for Cave games, not the other way around. Arcade platforms are PC now anyway, so there's not much "porting" to speak of.
There was a tweet around the TGM 2015 announcement where Mihara basically said "go ask IKD" when someone asked about a Steam release.
You're trusting Arika with a lot when you say going from arcade to PC/Steam is easy.
It's not a question of it being easy, it's basically a non-issue. Modern arcade systems ARE PCs. TGM3 is a Windows game.
Not sure why you'd want it on Steam though. Of course that's the most realistic approach, but when I do play PC games I prefer them DRM free.
TGM3 is a bad Windows game. For example, you cannot rebind inputs in the game, and you cannot adjust fullscreen/windowed options (if you do force it into a window, it will run too fast or too slow unless you're using a 60 Hz monitor). It has no scaling options; it just uses a single display mode. It relies on vsync, so on a faster machine you get input lag, and with vsync disabled the game runs too fast. The last two issues are probably fixed in TGM4, since there was word that input lag had been fixed, and TGM4 had supported HD displays in videos.
These things are the exact reason why people jumped on Ubisoft when Tetris Ultimate came out on Steam. Making games for PC is hard!
Yeah, low-latency gaming is not so easy on PC.
Since the Type X uses an I/O board instead of DirectInput, the input system has to be rewritten.
That said, Arika no doubt has the skills for a proper PC port, considering how well Jewelry Master played.
A proper PC port would run on Steam, not for the DRM, but for the replay sharing and online leaderboards. It would support keyboard as well as XInput and DirectInput, with a smart controller-mapping routine. I've discussed said routine before. XInput is trivially auto-detected. PS4 and PS3 controllers are also easily detected, except for the Sixaxis and DualShock 3, which need special drivers or special support in the game itself to send the activation code to the controller.
If something appears to be a PS1/PS2 controller (determined by inspecting its axes, POV hat, and number of buttons), it can ask "is this a PS2 controller?" If the user says yes, it will prompt: press the analog button if the analog light is not on, then press up on the d-pad, then right on the d-pad, then press X, press O, press square, press triangle, (etc.), and then it will ask you to move the analog sticks, and then to click them. Now we have PS2 button prompts.
And finally, a way to run the game at the right refresh rate, and to poll at the right time in the main loop for maximum responsiveness.
It would track your best replay online, and let you view other players' replays.
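The detection heuristic described above (guessing the controller family purely from its reported axes, buttons, and POV hat, then asking the user to confirm) can be sketched as pure logic. This is a hypothetical sketch, not Arika's code; the capability counts below are assumptions based on common USB adapter drivers.

```python
# Guess a controller family from DirectInput-style device capabilities.
# The exact counts are illustrative assumptions, not a specification.

def guess_controller(num_axes, num_buttons, num_hats):
    """Return a best-guess controller family from reported capabilities."""
    if num_axes == 4 and num_buttons == 12 and num_hats == 1:
        # Generic PS1/PS2-to-USB adapters often look like this:
        # two sticks (4 axes), 12 buttons, d-pad exposed as a POV hat.
        # The game should still ask "is this a PS2 controller?" and,
        # if yes, run the press-each-button mapping wizard.
        return "maybe-ps2"
    if num_axes >= 6 and num_buttons >= 10:
        # XInput-class pads expose the triggers as extra axes.
        return "modern-pad"
    return "unknown"
```

The point of the confirmation step is that capability counts alone are ambiguous, so the heuristic only narrows the question the user is asked.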
The majority of the problems there are caused by it still using the age-old method of locking the game's logic to its framerate. While it was a good idea back in the day, it certainly isn't now. While I'm no expert games developer, I see no reason why they couldn't ditch that method completely and run it off of actual timings instead.
Compared to making a game for console*/arcade, yes, it is harder, as said consoles/arcade machines will always run on a single hardware setup. Everyone's PC is different, and that's why we have APIs such as DirectX, Java and Vulkan to middle-man the hardware and software to create an even playing field tailored to the game's needs.
*Assuming you perfectly understand the console's arbitrary code language. Otherwise there are many simpler languages available that are all PC-compatible.
EDIT: I did realise something, though. Seeing as this would be a single-player (with local multiplayer) game, would it be too hard to buffer inputs by letting the player 'calibrate' the game before playing? That way there would be no input lag whatsoever.
Well, yes and no, depending on what you mean. Yes, they should refactor things such that game logic is decoupled from the computer's vsync, as they're no longer running on a fixed hardware and configuration where they can rely on it being something specific. But no, they shouldn't abandon a fixed timestep for game logic.
Adding a buffer will add lag. In order to remove input lag, you'd have to add a "negative" buffer (i.e. "time travel"). It's not like a music game where you can just move the timing windows and have everything work out OK; Tetris is a game with "moving parts" so to speak, so in order to achieve the same effect you'd have to "send inputs into the past" and re-evaluate the actual game state before pushing the next video frame into the buffer.
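The decoupling described above (fixed-timestep game logic that no longer depends on the computer's vsync) is commonly done with a time accumulator: each rendered frame deposits however much real time elapsed, and the logic consumes it in exact fixed steps. A minimal sketch, with names and structure of my own invention:

```python
# One rendered frame's worth of the classic fixed-timestep accumulator:
# logic always advances in exact `dt` steps regardless of how often the
# renderer / vsync fires, which keeps updates deterministic for replays.

def advance(accumulator, frame_time, step_logic, dt=1.0 / 60.0):
    """Consume `frame_time` seconds of real time; run zero or more
    fixed logic steps. Returns (leftover_time, steps_run)."""
    accumulator += frame_time
    steps = 0
    while accumulator >= dt:
        step_logic()          # deterministic, replay-safe update
        accumulator -= dt
        steps += 1
    return accumulator, steps
```

On a 30 Hz display this runs two logic steps per rendered frame; on a 144 Hz display most frames run zero or one step, but the logic rate stays exactly 60 per second either way.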
TGM3 was designed for arcade ONLY, on a specific environment, the Taito Type X, which was an affordable alternative for Arika at the time... but you probably already know that. The project was naturally oriented and compiled for one specific PC configuration, which greatly limits cost. Any functionality for supporting multiple hardware configurations would just be wasted money. Can I blame Miyamoto for not anticipating 1080p support in the original Super Mario Bros. code because some bunch of pirates would want it 20 years later? (Extreme, but same idea.)
Of course the input delay is the real issue... (we can't blame the durability and security of the hardware directly on Arika).
But knowing the Mihara drama about clones and gaijin, his position as a game designer protecting his beloved creation is a FAILURE imo: he did nothing to prevent the game from running on regular Windows PCs other than relying on Taito's encryption joke.
No speculation; I played it and it worked perfectly with what I would say felt like TAP input lag. CT can back that up.
?? Oh shit, I completely forgot this testing location in America!?
Ah ah, my bad!
When was that? 2 years ago?
Sorry Party man X
The game logic MUST run at a fixed frames per second; it's the only way for competition to be fair.
This way the game is equally responsive for everyone, and replays stored as controller input will play back correctly.
And every non-scrolling arcade game waits for vblank to update the screen, and polls input as late as possible in this loop to make it responsive.
When every millisecond matters, this is the right way to do it.
The issue is that PCs are not equal: not all of them are running their monitors at the same refresh rate, and there are a large number of possible latency sources on PC.
Input needs to be polled once per frame for most game types. The classic method arcade games used, and that PCs used back when they had joystick ports, is memory-mapped I/O. This means that all the arcade game had to do is look at a memory location; each switch essentially twiddles a bit directly. All the software has to do is AND the contents of that memory location with the right number to see if the fire button is down, or run pseudocode like this:
if x & 1 // up is pressed
if x & 2 // left is pressed
if x == 3 // up AND left are pressed, and nothing else
you get the idea.
You can fit one joystick and four buttons into a single address this way. (The coin switch and/or start button may be at a different one, though as often as not the start button was on the same memory location as the joystick it was for.)
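The packed-input idea above can be shown in a few lines. The bit positions here are illustrative, not from any particular board: one joystick plus four buttons, one bit per switch, all in a single byte, read with exactly the AND-against-a-mask trick the post describes.

```python
# One joystick and four buttons packed into a single byte, one bit each.
# Bit assignments are made up for illustration.
UP, DOWN, LEFT, RIGHT = 0x01, 0x02, 0x04, 0x08
A, B, C, D            = 0x10, 0x20, 0x40, 0x80

def pressed(port, mask):
    """AND the 'memory location' with a mask, as the arcade code would."""
    return (port & mask) != 0

# Player holds up-left and the A button: three bits set in one byte.
state = UP | LEFT | A
```

Reading every input is then one load plus one AND, which is why this style of I/O is so cheap and so easy to poll at exactly the right moment in the frame.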
While the original PC joystick port worked the same way, USB does not. Different PC inputs are handled in different ways, so there need to be multiple input handlers, which means more processing time. PCs are complicated. Even special-purpose PCs are trickier to get right than an arcade board using memory-mapped I/O for responsiveness (TAP).
The responsiveness requirements of modern analog-control 3D games aren't nearly as high. People simply won't notice a tiny bit of lag when actions are not instant anyway.
For certain types of games (rhythm, for one), latency matters a LOT more. You can cheat for DDR-style games by simply sliding the timing window forward, so that even if your input is late because of the lag, the game will see it as perfect.
On a fixed platform (like a Nintendo DS), any existing lag is known and can be corrected for (as in Rhythm Heaven).
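The window-sliding trick above amounts to judging each press against the note time plus the known latency, rather than the note time itself. A minimal sketch, where the window sizes and function names are my own illustrative choices:

```python
# Judge a button press against a note target, compensating for known lag
# by shifting the timing window later by `latency_ms`. Window widths are
# made-up numbers for illustration.
PERFECT_MS = 16.7   # roughly one frame of slack either side
GOOD_MS    = 50.0

def judge(press_ms, target_ms, latency_ms=0.0):
    """Return a judgement for a press, given a measured input latency."""
    error = abs(press_ms - (target_ms + latency_ms))
    if error <= PERFECT_MS:
        return "perfect"
    if error <= GOOD_MS:
        return "good"
    return "miss"
```

With a measured 50 ms of lag, a press that physically happened on the beat arrives 50 ms "late" but still judges as perfect. This works for rhythm games precisely because the judgement is the only thing that depends on the input; as the next posts note, it cannot work for pinball or TGM, where the input has to change the simulation itself.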
But for, say, a computer pinball game, this can't be done: you MUST simply reduce latency as much as possible to stop balls rolling off the flippers after you press flip. And TGM is the same way once the speed gets up there. Lag causes misdrops, and the faster the speed, the more it hurts.
Add LCD/LED display lag onto this, which CAN'T be fixed in the software itself without time travel, and, well, we see why it's so hard to get the TGM experience right on PC. That's why we use ShmupMAME: to remove lag that's there on original hardware, to make up for other PC lag sources.
It can be done, but it's quite complicated and it can feel awkward at times. It just means that after the ball rolls off, it jumps back to the position the ball would have reached by now if you had hit it 16 ms ago. This technique is often used in the network code of fast games, e.g. FPSes.
What you're describing is actually time travel, and is closer to how GGPO or rhythm games might work. Networked FPS games use interpolation to fill in gaps from dropped packets, and extrapolation to predict the current position of the networked players (to compensate for lag). These techniques aren't used in classic arcade games as far as I know.
No, it's not time travel. It's basically what happens when what you call "extrapolation" fails and the data is overruled by the server. As there are many arcade games with network code, I think it's quite safe to assume that there are at least some using those techniques.
You can also simulate multiple-frame scenarios and jump back to the real user action... multiverse.
Or just savestate several previous frames of game logic and jump back/fast-forward to match the user's actual input (surely way easier to implement), but it might be awkward to see in TGM.
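The savestate approach above is essentially rollback: keep the last few frames of state, and when an input turns out to have happened N frames ago, rewind to the saved state and re-simulate forward with the input applied. A toy sketch, where the "game" is just a counter (everything here is invented for illustration; real rollback also has to handle the state-history indexing once frames are trimmed):

```python
# Minimal savestate-rollback sketch. The game logic must be deterministic
# so re-simulating the same inputs reproduces the same states (this is
# also what makes input-based replays work).
HISTORY = 8  # how many frames of saved state we keep

def step(state, inp):
    """One frame of toy game logic: double on input, else increment."""
    return state * 2 if inp else state + 1

def simulate(inputs):
    """Run the game from state 0; return (final_state, saved_states).
    For simplicity this assumes len(inputs) <= HISTORY, so saved_states[i]
    is exactly the state at the start of frame i."""
    state, saved = 0, []
    for inp in inputs:
        saved.append(state)
        saved = saved[-HISTORY:]
        state = step(state, inp)
    return state, saved

def rollback(saved, inputs, late_frame, late_input):
    """Rewind to the saved state at `late_frame`, inject the late input,
    and fast-forward back to the present."""
    state = saved[late_frame]
    corrected = list(inputs)
    corrected[late_frame] = late_input
    for inp in corrected[late_frame:]:
        state = step(state, inp)
    return state
```

Because the logic is deterministic, the rolled-back result is identical to having received the input on time; the visible side effect is exactly the "jump" the posts above call awkward.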
Really? I was under the impression PC games had the least latency, I myself use a gsync monitor and play most games without vsync. The difference is very noticeable especially when I play these games counterparts on PS4. The PC version feels much more responsive.
Now, I know the implication here is that Tetris needs to run at a locked frame rate, but this shouldn't impede the reduced latency from running with vsync off.
Here is an article that goes in depth about the discrepancies with input lag on SFV.
I'd like to know what it is about windows that is introducing more input lag vs consoles, surely the console is using the same standard for reading data from the bus?
Flatscreen computer monitors tend to have less lag than flatscreen TVs, and a TV's PC mode tends to have even less lag than its game mode.
But there are issues that affect PC arcade-style gaming.
For example, video drivers will often have a flip queue, which can be multiple frames big. This works fine for floating-framerate 3D games when they go up to 180 fps or so.
But it totally KILLS 60 fps, poll-every-frame games like TGM, causing multiple frames' worth of display lag.
Though, if you're playing TGM in an emulator at "60fps", isn't that just the emulated speed? I'd assume the emulator would send more frames to the GPU per second if it can get away with it?
I'm not stating that as fact; I'm curious whether it actually works like this. Haven't used MAME myself for over 15 years.