Is Low Latency Emulation Cheating?

So, a little background about this article and why I have decided to write it. For those who have been following my humble website and Discord, this may be somewhat redundant, but I think it is best to try and give the full backstory. Prior to this article, I wrote a piece called “What is ShmupArch? Why Does it Matter?” The purpose of that writing was to provide some clarification on what exactly ShmupArch is, how it works, and hopefully answer some common questions I had received about it up to that point. At the time, I was focused almost exclusively on the technical aspects of the emulator, as all the feedback I had received on my ShmupArch project up to that point had been “This is awesome! … But how do you get it to work?”

That was my frame of mind when I wrote the ShmupArch article and then posted it to the Shmup System 11 forum (I recommend reading it if you haven’t already, as it gives context to this article). I was anticipating a lukewarm response of people expressing mild interest in the project and asking some technical questions about it. However, to my surprise, the response I received was anything but mild. I don’t want to dwell on the more heated aspects of the response, as the last thing I want to do is further antagonize people, so I’ll just say that the crux of the disagreement about ShmupArch had nothing to do with how to get it to run. Instead, the primary criticism was that the way ShmupArch achieves its lowered input latency is unfaithful to the original hardware, and that playing an emulated shmup at a lower level of input lag is a form of cheating. At least, that was my impression of what some of the posters were getting at. Even though I don’t necessarily agree, I can see where people are coming from. I’m sure I’m oversimplifying some of the posters’ viewpoints a bit, but I think the topic of emulation finally outperforming arcade hardware in input latency is a fascinating and complicated subject.

Up front, let me just say that I don’t expect this article to completely win people over or change their minds. This really is a debate about an underlying philosophy of how a person views shmups and video games in general, so I am sure this discussion could go on endlessly. Instead, all I am striving to do with this post is to explain my philosophy and outlook on the situation clearly. If you don’t agree with me, I understand, and I’m not going to be offended (as long as your response is somewhat respectful, ha). Also, this is just my general outlook on the situation, so I am sure there will be some exceptions to what I think here and there.

Going back to the beginning, it’s hard for me to pinpoint when exactly I started to really get into emulation and digging into different emulators. Certainly it was before I ever started playing shmups. I’d say it was probably around ten years ago, when my friend helped me put together a junker of a computer out of salvaged parts (the Frankenputer, we called it) and the poor thing couldn’t run anything except StarCraft: Brood War and ZSNES. In those years I was just starting to pick up Super Metroid speedrunning, before it was the popular beast that it is today.

Anyway, I bring this example up because I was playing Super Metroid on the SNES as well as on ZSNES, and I remember, even then, feeling like something was wrong with the inputs on the emulated version. The extended wall jump up the room with all the platforms and space pirates was especially troublesome on ZSNES. I was using a pretty crappy third-party USB controller at the time, so that is where I initially placed the blame for the input problems on ZSNES.

 

(screenshot from the famous Super Metroid AGDQ 4-Way Race, done on original hardware)

Soon enough I figured out that it wasn’t just the controller causing the input problems; it was emulation input lag. I could just stick to the SNES version, of course, but I wanted access to practice tools like save states and replay recording (this was before flashcarts supported save states). And so began my journey of minimizing latency. Ironically enough, it was when I was using RetroArch’s new run-ahead feature for Super Metroid that I realized it could be used for shmups as well.
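For anyone curious what run-ahead is actually doing, here is a rough sketch of the general idea as I understand it. This is not RetroArch’s actual code, just an illustration with a made-up `core` object: every real frame, the emulator rolls back the speculative frames it ran last time, advances “real” time by one hidden frame, then runs a few extra frames with your newest input and only shows the last one. The game’s built-in lag frames get skipped over, so your input shows up on screen sooner.

```python
# Conceptual sketch of run-ahead (N hidden frames). Not RetroArch's code;
# `core` is a hypothetical emulator object with save_state/load_state/run_frame.

def run_ahead_frame(core, runahead_frames, saved_state, poll_input):
    pad = poll_input()                        # newest input from the player

    if saved_state is not None:
        core.load_state(saved_state)          # roll back last frame's speculation

    core.run_frame(pad, render=False)         # advance "real" time by one frame, hidden
    saved_state = core.save_state()           # remember where real time actually is

    for i in range(runahead_frames):
        last = (i == runahead_frames - 1)
        core.run_frame(pad, render=last)      # run ahead; only the final frame is shown

    return saved_state                        # feed back into the next call
```

The cost is that the core runs N+1 times per real frame, which is why run-ahead wants a reasonably fast CPU, and why setting it higher than the game’s own internal lag makes the game visibly skip.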

I bring up Super Metroid because I think it is an important example to contrast with shmups and explain my viewpoint. As a player, I prioritize gameplay feel, gameplay accuracy, and my ability to interact with a game over other factors such as aesthetics or hardware accuracy. I also favor pragmatic, simple solutions, which will come into play later. What I ended up doing with Super Metroid was save-state practice on emulator, but using the original hardware for my actual runs; that makes sense, I think. Especially since, until this year, Super Metroid on the SNES was the most responsive way to play the game anyway.

But in the world of shmups, original hardware is not a $60 console, a $50 game, and a free CRT. Maybe in some cases (U.N. Squadron!), but for many games, like Dodonpachi, this is far from the reality of the situation. It also doesn’t help that many shmup console ports are stupid expensive to begin with. I know there are some high rollers out there who have the money and space for original arcade hardware, but making that a requirement to play and post scores is simply unrealistic for the majority of people. So unlike speedrunning, only playing on the original hardware really isn’t a viable solution.

So we now must turn to emulation in a way that speedrunning and fighting games don’t have to. Yes, there are console ports, but, overall, I think it’s safe to say there are many examples of great shmups that have subpar console ports (like Dodonpachi) or no port at all (like Batrider). Now the question becomes: how do we approach emulation? I’m sure this is going to be the sticking point in the article that divides readers, but I’ll try my best to fairly represent both perspectives.

The first perspective, which ShmupArch in some ways undermines, is using emulation to try and replicate the original hardware as closely as possible. The goal of this approach would be to be able to set two cabinets side by side, one PCB and one emulator, and have them be completely indistinguishable from one another. In many ways I respect this perspective, especially for those who originally did play their favorite shmups in an arcade setting. I remember in my interview with EX Mosquito for the podcast, he mentioned that he experienced this exact phenomenon when using GroovyMAME next to an original PCB. Visually, he could not tell them apart.

This, of course, is extremely impressive. However, one concern that I have about GroovyMAME is its input lag. NOW HOLD ON! Before everyone gets fired up, let me repeat that this is a CONCERN, in that I do not have all the answers and I am open to evidence on this matter. So, if you don’t agree with my concern and think it is incorrect, please try and present some evidence and testing so we can all get on the same page.

Anyway, the reason I have concerns about GroovyMAME’s input lag is that, in the testing I have performed, its input latency was a frame slower than ShmupMAME’s, when both had v-sync turned off. The test was performed on a VGA CRT monitor using a 60 fps camera and a light wired to the move-left input. For some reason ShmupArch was not delivering 1 frame of input lag, even though it was set up to do so in frame advance. It is likely that this can be improved with some setting tweaking on my part, but still, two frames of lag is a remarkable improvement.
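For reference, this is roughly how the camera footage turns into numbers: you count the camera frames between the LED lighting up and the first visible reaction on screen, and since both the camera and the display run at 60 Hz, each camera frame is about 16.7 ms, or one game frame. A quick sketch of the math (the trial counts below are made up, just to show the calculation):

```python
# Convert counted camera frames into milliseconds and an averaged lag figure.
# Assumes a 60 fps camera filming a 60 Hz display; trial values are placeholders.

CAMERA_FPS = 60.0
FRAME_MS = 1000.0 / CAMERA_FPS           # ~16.7 ms per camera/game frame

trials = [3, 2, 3, 3, 2]                 # camera frames from LED on -> on-screen reaction

avg_frames = sum(trials) / len(trials)
print(f"average lag: {avg_frames:.1f} frames ({avg_frames * FRAME_MS:.1f} ms)")
print(f"range: {min(trials)}-{max(trials)} frames")
```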

 

As I understand it, GroovyMAME’s edge over ShmupMAME, when it comes to input lag, is that it is able to run v-sync without any input lag penalty, which is awesome; there is no denying that. It is also completely possible that I had GroovyMAME set up in a manner where I was unable to take full advantage of its potential. I wouldn’t think so, but it is possible. I was using the Windows builds of GroovyMAME and ShmupMAME on a CRT computer monitor, rather than on a 15 kHz CRT. Obviously this setup is probably not the correct way to achieve 240p output, but, like I said, I was focusing on input lag, not video resolution.
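My understanding, and I am happy to be corrected here, is that GroovyMAME manages v-sync without the usual penalty through its frame delay setting: instead of emulating the frame right after vblank and then sitting on it for most of a refresh, it waits until late in the refresh period, polls input, emulates the frame as fast as it can, and presents it just before vsync. Here is a rough conceptual sketch of that idea, not GroovyMAME’s actual code; `emulate_frame`, `present`, and `poll_input` are stand-ins.

```python
# Conceptual sketch of "frame delay" under v-sync. Not GroovyMAME's actual code;
# frame_delay here is in tenths of the refresh period, similar in spirit to its 0-9 setting.

import time

REFRESH_S = 1.0 / 60.0                      # 60 Hz display

def frame_delay_loop(emulate_frame, present, poll_input, frame_delay=7):
    next_vsync = time.perf_counter() + REFRESH_S
    while True:
        # Sleep through most of the refresh period before doing any work...
        wake_at = next_vsync - (1.0 - frame_delay / 10.0) * REFRESH_S
        time.sleep(max(0.0, wake_at - time.perf_counter()))

        # ...then sample input as late as possible and emulate quickly, so the
        # image that goes out at vsync reflects very recent input.
        frame = emulate_frame(poll_input())
        present(frame)                       # blocks on v-sync in a real renderer
        next_vsync += REFRESH_S
```

The catch, as I understand it, is that the higher the delay, the less time is left to emulate the frame, which is why a fast machine matters.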

So let’s step back for a second. Now, I recognize that GroovyMAME and MAME in general are complicated pieces of software and I don’t fully understand how they work. I can already envision the replies saying as much, and I am completely OPEN to more hard information about MAME and GroovyMAME, rather than opinions or impressions.

So here is the problem that we run into, and this is really the crux of my concern. According to the players I have spoken with (Eaglet, Jaimers, Aquas, Bananamatic, Prometheus, and more), MAME-based emulation feels laggier than original hardware in cases like Garegga, DDP, and Batrider, whereas ShmupArch has the potential to feel closer to the input lag you would experience on PCB. ShmupArch, of course, also has the potential to feel even more responsive, but we’ll get back to that later. Now, I say “feels” because, to the best of my knowledge, there is not much documentation of what exactly the original PCB lag is on a lot of these shmups. What is Garegga’s exact latency? What is Dodonpachi’s? DaiOuJou’s? Batrider’s? Who knows? This is a HUGE problem, and here is why.

If we, as a community, decide that “fair” play consists of playing in an emulated environment that matches the exact latency of the original hardware, then, logically, we are going to need a comprehensive database of the latency levels we are trying to match. Not only is this information going to be extremely difficult to gather because of the technical requirements of creating accurate, reliable input lag tests; it also has the added problem of the original hardware being expensive and difficult to come by in the first place. I would be shocked, though impressed, if such a database came into existence.

Before proceeding to my solution and outlook on the situation, let’s review the first perspective again, along with my concerns about it. Theoretically, it does make some sense that the standard of play, when it comes to input lag, should match the original hardware. There is a certain feeling of fair play in this idea when you think about the arcade roots of the genre. However, even if we all end up agreeing to do this, there are a number of logistical problems that are going to be barriers. The first is that emulators like MAME and GroovyMAME may not accurately match the PCB input lag on many shmups in the first place (again, I am open to hard data that contradicts this point). The second is that we don’t have a detailed, publicly available database of original hardware latency to refer to, and it is unlikely one will ever be created. A third concern, which I have been saving until now, is the additional regulation and hassle this requirement is going to place on the community (are we going to require input lag checks?). This expectation of hardware purity also undermines how many in the community prefer to play shmups these days. Let me explain.

Sure, there are a number of people, like myself, who enjoy using CRT monitors to play shmups. However, from what I have gathered from talking to people on the forum and in my Discord, there is also a large group of players who prefer to play on modern digital displays. Most of these people I have spoken with do use low latency monitors, but no digital display is lagless. So what does this mean? It means that only allowing people to play on MAME or GroovyMAME is going to subject a large group of players, if not the majority of players, to additional input lag, since these emulators do not offer lag reduction features like ShmupArch does. Is that really what we want? Are we so committed to hardware purity that subjecting a large group of players to a laggy gameplay experience is just a necessary evil? For some, maybe the answer is yes, but for me the answer is no. So here are my thoughts on the situation and my solution.

The first thing I think we need to remember is that arcade hardware is over. CAVE is not going to build any more Dodonpachi PCBs. Original hardware for these games we love is aftermarket, and it’s not exactly plentiful. I’m a huge DDP fan, I’ve played and studied the game for years now, and yet I don’t have a Dodonpachi PCB. I would, of course, be happy to have one, but unless I unexpectedly roll deep with $$$ one day, I don’t see it happening. So if arcade hardware is not available to many of the current players, and will certainly become even less available to future players, we really have to ask ourselves … why bother? Why does it matter, for our community, that we play in a manner faithful to the arcade in terms of input lag? Sure, in places with more accessible arcade hardware and a more active arcade community, like Japan, this conversation is a little different. But we are not Japan, we will never be Japan, so, personally, I think it is time to move forward.

Also, I think it’s worthwhile to ask ourselves whether input lag is even something worth preserving. Arcade hardware is not perfect just because it’s arcade hardware. Is input lag a gameplay feature? Or just an unfortunate byproduct of the technology used to create these games? I think Garegga is a good example to look to. The game is known for having significant lag. Maybe this lag was intentional, maybe it was just caused by some kind of coding problem. It’s unclear. What is clear, though, is that when M2 created their extremely faithful port of the game, they added an input lag reduction option. Even more interesting is that you are still able to save your replays with this option turned on. Apparently M2 does not consider input lag reduction cheating.

 

(Screenshot of Revision 2016 Score Screen)

And so we arrive at another crossroads. In my previous post I talked about being able to reduce Battle Garegga’s input lag down to one frame using ShmupArch. For an experienced Garegga player, even one not married to the idea of hardware purity, this does sound pretty shocking, considering Garegga is known for its input lag being part of its gameplay (for better or for worse). Compare this to something like Dodonpachi at one frame of input lag, though, and it seems like more of a natural fit. In fact, when comparing the latency of Dodonpachi in ShmupArch with other emulators, Jaimers reported that ShmupArch actually felt the closest to the original PCB.

I guess a compromise that could be proposed is using ShmupArch’s adjustable input lag reduction to match the PCB input lag on your setup. Not a bad idea on paper, but when you think about the logistics of requiring players to do accurate input lag tests on their setups and adjust the lag on a per-game basis, this seems like an insane hassle. Then there is the question of fairness and trust. Pandora’s box is open, the future is now, run-ahead isn’t going anywhere. What is stopping a player from setting the run-ahead a little higher than everyone else? Why shouldn’t he? If one frame of lag is the most desirable, if one frame of lag is what the players want, why not make it the standard? Then we are all on the same playing field. If a player wants to add additional lag by turning on v-sync or using a slower monitor, then that is his choice. All options are open.
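To be fair, if someone did want to go the matching route, the back-of-the-napkin math is at least simple, even if the measuring isn’t. Something like the sketch below, where every number is a placeholder you would have to measure yourself for each game and each setup:

```python
# Illustrative math for the "match the PCB" compromise; all values are placeholders.

def runahead_to_match(pcb_lag_frames, emulator_lag_frames, display_lag_frames):
    """How many run-ahead frames would bring this setup down to the PCB's latency."""
    setup_lag = emulator_lag_frames + display_lag_frames
    return max(0, round(setup_lag - pcb_lag_frames))

# Hypothetical example: PCB measured at 4 frames, emulator chain at 5, monitor adds 1.
print(runahead_to_match(pcb_lag_frames=4, emulator_lag_frames=5, display_lag_frames=1))  # -> 2
```

Which is exactly the hassle: you need a trustworthy measured number for the PCB, the emulator, and the display before that calculation even means anything.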

When you sit down and put all the pieces together: the end of original hardware, the barriers to accurately emulating original hardware latency for the community, the possibility of an improved gameplay experience, the impossibility of regulating a player’s level of input lag, the lack of quality console ports, the need for ease of use and accessibility; the answer to the question “is low latency emulation cheating?” becomes clear. No. Low input latency is not cheating; it is the natural evolution of how we have been playing shmups outside of the arcade.
