In the world of software testing, it’s common to hear folks talk about simulators and emulators as if the terms are synonymous.
To a certain extent, that makes sense. Simulators and emulators are similar in many ways, and the differences between them don’t always matter from the perspective of a test engineer.
But the fact remains that simulators and emulators are different beasts. If you want to get the most out of each type of software testing tool, it's important to understand what makes simulators different from emulators, and why you'd choose one over the other.
That’s what this post explains.
Simulators and Emulators: What They Have in Common
To begin, let me explain how simulators and emulators are similar to each other.
Emulators and simulators both make it possible to run software tests inside flexible, software-defined environments. In this way, they allow you to run tests more quickly and easily than you could if you had to set up a real hardware device.
That is why most software tests are performed on simulators or emulators. Real-device testing tends to happen only late in the software delivery pipeline, just before releasing software into production. That way, you can take advantage of the speed and flexibility of simulated and emulated test environments for most of your testing, while still getting the deep insight of real-device testing before you release your software to end users.
Simulators vs. Emulators: How They’re Different
But the fact that simulators and emulators both serve similar purposes does not mean that they work in identical ways. There are essential differences between them.
A simulator is designed to create an environment that contains all of the software variables and configurations that will exist in an application’s actual production environment. However, simulators do not attempt to emulate the actual hardware that will host the application in production. Because simulators create only software environments, they can be implemented using high-level programming languages.
In contrast, an emulator does attempt to mimic all of the hardware features of a production environment, as well as its software features. To achieve this, emulators are typically written in low-level languages such as C (and occasionally assembly), which makes it practical to model hardware behavior closely and efficiently.
In a sense, then, you can think of emulators as occupying a middle ground between simulators and real devices. Whereas simulators only mimic environment features that can be configured or defined using software, emulators mimic both hardware and software features.
Of course, because emulators may not do a perfect job of emulating the hardware and software of a production environment, they are not a substitute for real-device testing. They just allow you to set up an environment that is closer to the one you’d have on a real device.
When to Use Simulators
Typically, simulators are best for software testing scenarios in which you’re focused on making sure that an application performs as expected when interacting with external applications or environments.
For example, you may want to test an app’s ability to send data to another application. A simulated environment will typically suffice for this, because the underlying hardware configuration is unlikely to have much of an impact on data transactions for your application. Similarly, if you want to make sure that an application’s interface displays properly under different screen resolutions, simulated testing environments are appropriate.
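To make the data-transaction case concrete, here is a minimal sketch of what a simulated dependency can look like in practice. The function and endpoint names below are hypothetical, and the "simulator" is simply a mock object standing in for the external service: no hardware, and not even a real network stack, is involved.

```python
from unittest.mock import Mock

# Hypothetical app code under test: sends a record to an external service.
def sync_record(client, record):
    response = client.post("/records", json=record)
    return response.status_code == 201

# Simulated environment: a mock stands in for the real service client.
fake_client = Mock()
fake_client.post.return_value = Mock(status_code=201)

# The test exercises the app's behavior against the simulated service.
assert sync_record(fake_client, {"id": 1}) is True
fake_client.post.assert_called_once_with("/records", json={"id": 1})
print("data transaction test passed against simulated service")
```

Because everything here is defined in software, this kind of environment is cheap to spin up and easy to vary, which is exactly the appeal of simulated testing.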
When to Use Emulators
On the other hand, emulators are most useful when you need to test how software interacts with underlying hardware, or a combination of hardware and software.
Do you want to know whether a firmware update will cause problems for your application? An emulator can help you find that out. Or perhaps you need to know how your application performs using different types of CPUs or different memory allocations. These are also scenarios where emulators come in handy.
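To suggest what "mimicking hardware in software" means, here is a toy fetch-decode-execute loop. Real emulators such as QEMU or the Android Emulator are vastly more sophisticated; this sketch only illustrates the core idea that an emulator models CPU state (registers, a program counter) and interprets guest instructions one at a time.

```python
# Toy illustration of the fetch-decode-execute cycle an emulator performs.
def run(program):
    regs = {"A": 0, "B": 0}      # emulated CPU registers
    pc = 0                       # emulated program counter
    while pc < len(program):
        op, *args = program[pc]  # fetch and decode the next instruction
        if op == "LOAD":         # LOAD reg, value
            regs[args[0]] = args[1]
        elif op == "ADD":        # ADD dst, src  ->  dst = dst + src
            regs[args[0]] += regs[args[1]]
        elif op == "HALT":
            break
        pc += 1                  # advance to the next instruction
    return regs

# "Guest program": A = 2; B = 3; A = A + B
state = run([("LOAD", "A", 2), ("LOAD", "B", 3), ("ADD", "A", "B"), ("HALT",)])
print(state)  # {'A': 5, 'B': 3}
```

Because the hardware itself is modeled in code, an emulator can swap in different CPU models or memory configurations without touching a physical device, which is what makes the testing scenarios above possible.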
To sum up: A simulator provides a fast and easy way to set up a software environment for application testing without mimicking actual hardware. An emulator takes things a step further by replicating hardware as well as software configurations. Both types of testing platforms are useful when you need to test code quickly across a wide range of configurations. But neither is a complete substitute for real-device testing, which you should still perform at critical points, such as just before releasing software into production.
Chris Riley (@HoardingInfo) is a technologist who has spent 15 years helping organizations transition from traditional development practices to a modern set of culture, processes and tooling. In addition to being an industry analyst, he is a regular author, speaker, and evangelist in the areas of DevOps, BigData, and IT. Chris believes the biggest challenges faced in the tech market are not tools, but rather people and planning.