There are more connected devices on the planet today than the total human population, and this number will continue to grow manifold. IoT performance testing teams face an unprecedented challenge today. How can they prove that their IoT infrastructure can reliably handle millions of connected devices? How can they ensure coverage of the entire spectrum of possible real-world IoT use case scenarios? How can they automate orchestration, execution, and results analysis so that they can focus on the key metrics of system performance? It turns out that developing a comprehensive testing strategy that challenges the limits of an IoT infrastructure is a major engineering challenge in itself.
A real-world IoT testing use case
To understand the complexity of testing IoT applications, let's consider an oil rig installation in an offshore location. A rig has hundreds of thousands of sensors, and your customer expects that 5 million sensor devices will eventually need to be connected to the IoT cloud platform. Assuming a single desktop load-generator workstation can handle 10K devices, your team would need 500 desktop machines to execute a full load test. Even if you could spawn all those machines in the cloud or a local data center, you would still need a way to manage and execute all those test cases centrally. And licensing 5 million endpoints with your favorite performance test software will probably cost a fortune.
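The fleet-sizing arithmetic above can be sketched in a few lines (both figures are the illustrative values from the scenario, not benchmarks):

```python
# Back-of-the-envelope sizing of the load-generator fleet.
# Both figures are the illustrative values from the scenario above.
TOTAL_DEVICES = 5_000_000        # simulated sensors the test must drive
DEVICES_PER_MACHINE = 10_000     # devices one desktop generator can sustain

# Ceiling division: round up so a last partial batch still gets a machine.
machines_needed = -(-TOTAL_DEVICES // DEVICES_PER_MACHINE)
print(machines_needed)  # 500
```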
Let's consider some functional requirements. Your product is an emergency fire alarm system. Each fire detector wakes up every 30 seconds and takes a temperature measurement. Your test needs to simulate sensors that occasionally report false positives (i.e., spurious readings above 100 degrees) and verify that the cloud platform starts a replacement procedure when the number of faults exceeds a certain count within a particular time window.
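One way to model this requirement in a test harness is a simulated detector plus a sliding-window fault counter. This is a minimal sketch: the 30-second wake-up interval and 100-degree threshold come from the scenario, while the 300-second window, the fault limit of 3, and all class and function names are assumptions made up for illustration.

```python
import random
from collections import deque

TEMP_FAULT_THRESHOLD = 100.0   # scenario value: readings above this are spurious
WAKEUP_INTERVAL = 30           # scenario value: seconds between measurements
FAULT_WINDOW = 300             # assumed: observation window in seconds
FAULT_LIMIT = 3                # assumed: faults in the window that trigger replacement

class SimulatedDetector:
    """A fire detector that wakes up periodically and sometimes misreads."""
    def __init__(self, false_positive_rate, seed=None):
        self.rng = random.Random(seed)
        self.false_positive_rate = false_positive_rate

    def read_temperature(self):
        if self.rng.random() < self.false_positive_rate:
            return self.rng.uniform(101.0, 150.0)  # spurious high reading
        return self.rng.uniform(18.0, 25.0)        # normal ambient reading

def needs_replacement(fault_times, now, window=FAULT_WINDOW, limit=FAULT_LIMIT):
    """True once more than `limit` faults land within the last `window` seconds."""
    recent = [t for t in fault_times if now - t <= window]
    return len(recent) > limit

# Drive one device through ten minutes of 30-second wake-ups.
detector = SimulatedDetector(false_positive_rate=0.5, seed=7)
faults = deque()
replacement_time = None
for t in range(0, 600, WAKEUP_INTERVAL):
    if detector.read_temperature() > TEMP_FAULT_THRESHOLD:
        faults.append(t)
    if replacement_time is None and needs_replacement(faults, t):
        replacement_time = t
print("replacement triggered at t =", replacement_time)
```

A real harness would run millions of these detector objects concurrently and assert that the cloud platform, not the test, makes the replacement decision.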
Your test needs to ensure that the latency between reporting a fire incident and activation of the sprinkler system never exceeds 30 seconds under any circumstances. If it does, your test needs to capture the logs of the entire sequence of events that caused the violation so that the error can be triaged. Your test will also need to simulate multiple fires in close geographic proximity so that the cloud platform can perform escalated incident reporting.
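A minimal sketch of that latency assertion, assuming the harness records timestamped events (the event names and tuple format here are hypothetical):

```python
LATENCY_SLA_SECONDS = 30.0   # scenario requirement: fire report -> sprinkler on

def check_latency(events):
    """Check the report-to-activation latency against the SLA.

    `events` is a list of (timestamp, name) tuples. On a violation the full
    event sequence is returned so it can be attached to the problem report.
    """
    reported = next(t for t, name in events if name == "fire_reported")
    activated = next(t for t, name in events if name == "sprinkler_activated")
    latency = activated - reported
    if latency > LATENCY_SLA_SECONDS:
        return False, latency, events   # keep the logs for triage
    return True, latency, None

# Synthetic trace with timestamps in seconds.
trace = [(100.0, "fire_reported"),
         (112.5, "cloud_ack"),
         (125.0, "sprinkler_activated")]
ok, latency, logs = check_latency(trace)
print(ok, latency)  # True 25.0
```

Returning the whole event sequence on failure is what makes the later triage requirement cheap to satisfy: the problem report already carries the history.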
At any given time, at most 10% of all fire alarms (up to 500K devices) may report a fire, and you will need to test what happens if the system exceeds that capacity. The test also needs to ensure that the system can handle messaging from up to 10,000 devices per second at peak load. There will be multiple generations of devices: the older ones speak CoAP, while the newer generation talks MQTT and HTTP. Your test must cover all such combinations.
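These peak-load requirements can be exercised with a weighted fleet model. A sketch only: the 40/40/20 protocol split and the scaled-down fleet size are assumptions, while the 10% alarm cap comes from the requirement above.

```python
import random

FLEET_SIZE = 100_000          # scaled down for illustration; the scenario targets 5M
MAX_ALARM_FRACTION = 0.10     # requirement: at most 10% may report fire at once
PROTOCOL_WEIGHTS = {"coap": 0.4, "mqtt": 0.4, "http": 0.2}  # assumed generation mix

def assign_protocols(n, weights=PROTOCOL_WEIGHTS, seed=0):
    """Give each simulated device a protocol according to the generation mix."""
    rng = random.Random(seed)
    return rng.choices(list(weights), weights=list(weights.values()), k=n)

def pick_alarming_devices(n, fraction=MAX_ALARM_FRACTION, seed=1):
    """Select which devices raise an alarm, capped at `fraction` of the fleet."""
    rng = random.Random(seed)
    return rng.sample(range(n), int(n * fraction))

protocols = assign_protocols(FLEET_SIZE)
alarming = pick_alarming_devices(FLEET_SIZE)
print(len(alarming))            # 10000 devices, i.e. the 10% cap
print(sorted(set(protocols)))   # ['coap', 'http', 'mqtt']
```

Pushing `fraction` past the cap is then a one-line change, which is exactly the over-capacity experiment the requirement calls for.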
Any problem report raised by QA will need to contain the entire history of messaging between the cloud and the simulated device to help the developer find the cause. As more and more system requirements are understood, this list will grow very large. It is clear that testing such a complex system is not an easy task and requires considerable engineering within the QA team.
IoT TESTING IS HARD
What makes IoT testing difficult?
The traditional QA approach of recording user behavior, pre-defining test cases, and executing them sequentially becomes limited when faced with the exponential demands of IoT. The usual method of triggering a set of inputs and then validating the output from the system becomes suboptimal due to the nature of the system under test.
A combination of varying sensor data and device states, together with possible error scenarios, complex device-to-device interactions, and unreliable network conditions, generates a nearly infinite number of test vectors. A comprehensive IoT performance test needs to shift its focus to the macro-level big picture.
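Because the full cross-product of these dimensions cannot be enumerated, one common tactic is to sample the scenario space instead. A sketch with invented dimension names, purely to illustrate the combinatorics:

```python
import itertools
import random

# Illustrative dimensions of the scenario space; real suites have many more.
SENSOR_STATES = ["ok", "false_positive", "offline"]
NETWORK_CONDITIONS = ["stable", "lossy", "partitioned"]
LOAD_LEVELS = ["idle", "nominal", "peak"]

def sample_scenarios(k, seed=0):
    """Randomly sample k combinations instead of enumerating the full space."""
    rng = random.Random(seed)
    space = list(itertools.product(SENSOR_STATES, NETWORK_CONDITIONS, LOAD_LEVELS))
    return rng.sample(space, min(k, len(space)))

subset = sample_scenarios(5)
print(len(subset))  # 5 scenarios drawn from the 27 possible combinations
```

Each added dimension multiplies the space, which is why exhaustive pre-defined test cases stop scaling long before exhaustive coverage is reached.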
IoT devices speak many protocols, talk to different cloud components, and also talk to each other over local networks. Simulating such a network is difficult.
IoT-Specific Challenges
IoT devices require registration, certificate deployment, configuration, user onboarding, servicing, software updates, and power management.
Testing enrollment for hundreds of thousands of devices is a major challenge. You need cloud-based infrastructure that can scale on demand to meet such needs.
Is simulation the answer to IoT testing challenges?
The complexity of IoT systems and their scalability challenges demand a ground-up, holistic approach to testing IoT system performance. The performance testing tools of tomorrow need to evolve to match the complexity and scale of the IoT. At IoTIFY, we saw a clear need in the market to solve these challenges ahead of time, so we designed our network simulation software from the ground up to meet the needs of the future enterprise. We took the same software infrastructure components that are used in production today to build scalable cloud platforms and developed performance simulation software from them. The result is the IoTIFY smart network simulator, a first-of-its-kind IoT performance testing tool designed for cloud platforms. Here are the key facts that make it different.