
Why IoT needs Simulation instead of Load testing

There are more connected devices on the planet today than there are people, and that number will continue to grow manifold. IoT performance testing teams face an unprecedented challenge: how can they prove that their IoT infrastructure can reliably handle millions of connected devices? How can they cover the entire spectrum of possible real-world IoT use case scenarios? How can they automate orchestration, execution, and results analysis so that they can focus on the key metrics of system performance? It turns out that developing a comprehensive testing strategy that can challenge the limits of an IoT infrastructure is a major engineering challenge in itself.

A real-world IoT testing use case

To understand the complexity of testing IoT applications, consider an oil rig installation at an offshore location. A single rig carries hundreds of thousands of sensors, and your customer expects that 5 million sensor devices will eventually need to be connected to the IoT cloud platform. Assuming a single desktop load-generator workstation can handle 10K devices, your team would need 500 desktop machines to execute a full load test. Even if you could spawn all those machines in the cloud or in a local data center, you would still need a way to manage and execute all those test cases centrally. And the bill for licensing 5 million endpoints with your favorite performance test software will probably cost a fortune.
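As a quick sanity check on that arithmetic (the 10K-devices-per-workstation capacity is the assumption stated above):

```python
# Back-of-the-envelope sizing for the oil rig example, using the figures above.
TOTAL_DEVICES = 5_000_000              # devices the customer eventually expects
DEVICES_PER_LOAD_GENERATOR = 10_000    # assumed capacity of one desktop workstation

machines_needed = TOTAL_DEVICES // DEVICES_PER_LOAD_GENERATOR
print(f"Load generator machines required: {machines_needed}")   # -> 500
```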

Now let's consider some functional requirements. Your product is an emergency fire alarm system. Each fire detector wakes up every 30 seconds and takes a temperature measurement. Your test needs to simulate sensors that occasionally report false positives (i.e., spurious readings above 100 degrees) and verify that the cloud platform starts a replacement procedure when the number of faults exceeds a certain threshold within a given time window.
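As a flavor of what such a device model looks like, here is a minimal Python sketch of one simulated detector. The 30-second wake-up interval and the 100-degree threshold come from the requirement above; the broker address, topic, and the 2% fault-injection rate are illustrative assumptions:

```python
import json
import random
import time

import paho.mqtt.publish as publish  # pip install paho-mqtt

# Broker address and topic layout are illustrative assumptions.
BROKER = "mqtt.example.com"
TOPIC = "site/rig-42/fire/telemetry"
WAKE_INTERVAL_S = 30          # detector wakes every 30 seconds (requirement)
FALSE_POSITIVE_RATE = 0.02    # assumed: roughly 2% of readings are spurious

def read_temperature() -> float:
    """Simulate a measurement; occasionally inject a spurious reading above 100 C."""
    if random.random() < FALSE_POSITIVE_RATE:
        return round(random.uniform(100.0, 150.0), 1)   # injected false positive
    return round(random.uniform(18.0, 35.0), 1)         # normal ambient range

def run_detector(device_id: str, cycles: int = 10) -> None:
    for _ in range(cycles):
        reading = {"device": device_id,
                   "temperature_c": read_temperature(),
                   "ts": time.time()}
        publish.single(TOPIC, json.dumps(reading), hostname=BROKER, qos=1)
        time.sleep(WAKE_INTERVAL_S)

if __name__ == "__main__":
    run_detector("detector-0001")
```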

Your test also needs to ensure that the latency between the reporting of a fire incident and the activation of the sprinkler system never exceeds 30 seconds under any circumstances. If it does, your test needs to capture the logs of the entire sequence of events that caused the violation so the error can be triaged. Your test will also need to simulate multiple fires in close geographic proximity so that the cloud platform can perform escalated incident reporting.
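A sketch of the corresponding latency check, which dumps the full event sequence whenever the 30-second budget is violated; the event names and log format are assumptions, not a prescribed schema:

```python
import time

LATENCY_BUDGET_S = 30.0   # max allowed time from fire report to sprinkler activation

def check_fire_to_sprinkler_latency(event_log: list[dict]) -> None:
    """event_log entries look like {"ts": float, "event": str, "device": str}.
    The event names "fire_reported" and "sprinkler_activated" are illustrative;
    map them to whatever your platform actually emits."""
    reported = {e["device"]: e["ts"] for e in event_log if e["event"] == "fire_reported"}
    activated = {e["device"]: e["ts"] for e in event_log if e["event"] == "sprinkler_activated"}
    for device, t_report in reported.items():
        t_act = activated.get(device)
        latency = (t_act - t_report) if t_act is not None else float("inf")
        if latency > LATENCY_BUDGET_S:
            # Persist the entire sequence of events so the failure can be triaged.
            with open(f"latency_violation_{device}.log", "w") as f:
                for e in sorted(event_log, key=lambda e: e["ts"]):
                    f.write(f'{e["ts"]:.3f} {e["device"]} {e["event"]}\n')
            raise AssertionError(
                f"{device}: fire-to-sprinkler latency {latency:.1f}s exceeds "
                f"{LATENCY_BUDGET_S}s budget"
            )
```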

At any given point in time, at most 10% of the fire alarm fleet (up to 500K devices) can be reporting a fire, and you will need to test what happens if the system exceeds that capacity. The test also needs to ensure that the system can handle messaging from up to 10,000 devices per second at peak load. Finally, there will be multiple generations of devices: the older ones speak CoAP, while the newer generation talks MQTT and HTTP. Your test has to cover all such combinations.
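One way to pace a load generator to the 10,000-messages-per-second peak figure is sketched below; the asyncio batching approach and batch size are illustrative choices, and send_message is a placeholder for a real protocol client:

```python
import asyncio
import time

TARGET_RATE = 10_000     # messages per second at peak load (requirement)
BATCH = 100              # send in small batches to keep scheduling overhead low

async def send_message(device_id: int) -> None:
    """Placeholder: swap in a real MQTT/CoAP/HTTP publish for your platform."""
    await asyncio.sleep(0)   # stands in for an async I/O call

async def generate_peak_load(duration_s: int = 10) -> None:
    sent = 0
    start = time.monotonic()
    while time.monotonic() - start < duration_s:
        batch_start = time.monotonic()
        await asyncio.gather(*(send_message(i) for i in range(sent, sent + BATCH)))
        sent += BATCH
        # Sleep off whatever is left of this batch's time slice to hold the rate.
        elapsed = time.monotonic() - batch_start
        await asyncio.sleep(max(0.0, BATCH / TARGET_RATE - elapsed))
    print(f"sent {sent} messages in {time.monotonic() - start:.1f}s")

if __name__ == "__main__":
    asyncio.run(generate_peak_load())
```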

Any problem report raised by QA will need to contain the entire history of messages exchanged between the cloud and the simulated device, to help the developer find the cause. As more system requirements are understood, this list will grow very large. It is clear that testing such a complex system is not an easy task and requires considerable engineering within the QA team.

IoT TESTING IS HARD

What makes IoT testing difficult?

The traditional QA approach of recording user behavior, pre-defining test cases, and executing them sequentially falls short of the exponential demands of IoT. The usual method of triggering a set of inputs and then validating the output becomes suboptimal due to the nature of the system under test.

Varying sensor data and device states, combined with possible error scenarios, complex device-to-device interactions, and unreliable network conditions, generate a nearly infinite number of test vectors. A comprehensive IoT performance test needs to shift its focus to the macro-level big picture.
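To see how quickly that vector space explodes, consider even a handful of dimensions (the specific values below are illustrative):

```python
from itertools import product

# Illustrative dimensions only; a real test matrix would be far larger.
sensor_states = ["normal", "spurious_high", "stuck", "offline"]
network_conditions = ["stable", "high_latency", "packet_loss", "disconnect"]
error_scenarios = ["none", "auth_failure", "firmware_mismatch"]
protocols = ["CoAP", "MQTT", "HTTP"]

vectors = list(product(sensor_states, network_conditions, error_scenarios, protocols))
print(len(vectors))   # 4 * 4 * 3 * 3 = 144 combinations for a single device
# Multiply by device count, device-to-device interactions and timing, and
# exhaustively pre-defining test cases quickly becomes impossible.
```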

Multiprotocol Approach

IoT devices speak many protocols, talk to different cloud components, and also talk to each other over the local network. Simulating such a network is difficult.

IoT Specific Challenges

IoT devices require registration, certificate deployment, configuration, user onboarding, servicing, software updates, and power management. A realistic simulation has to cover this entire device lifecycle.
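One way a simulator can model that lifecycle is as a simple state machine; the states and transitions below are illustrative, not a standard:

```python
from enum import Enum, auto

class DeviceState(Enum):
    FACTORY = auto()
    REGISTERED = auto()       # identity known to the cloud platform
    PROVISIONED = auto()      # certificate deployed, configuration pushed
    ONBOARDED = auto()        # bound to a user / tenant
    ACTIVE = auto()
    UPDATING = auto()         # software update in progress
    SLEEPING = auto()         # power management
    DECOMMISSIONED = auto()

# Allowed transitions a simulated device walks through during a test run.
TRANSITIONS = {
    DeviceState.FACTORY: {DeviceState.REGISTERED},
    DeviceState.REGISTERED: {DeviceState.PROVISIONED},
    DeviceState.PROVISIONED: {DeviceState.ONBOARDED},
    DeviceState.ONBOARDED: {DeviceState.ACTIVE},
    DeviceState.ACTIVE: {DeviceState.UPDATING, DeviceState.SLEEPING,
                         DeviceState.DECOMMISSIONED},
    DeviceState.UPDATING: {DeviceState.ACTIVE},
    DeviceState.SLEEPING: {DeviceState.ACTIVE},
    DeviceState.DECOMMISSIONED: set(),
}

def transition(current: DeviceState, target: DeviceState) -> DeviceState:
    if target not in TRANSITIONS[current]:
        raise ValueError(f"illegal transition {current.name} -> {target.name}")
    return target
```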

Security

Testing enrollment for hundreds of thousands of devices is a major challenge. You need a cloud-based infrastructure that can scale on demand to cater to such needs.
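Large-scale enrollment testing typically means generating per-device credentials on the fly. The sketch below creates a certificate signing request per simulated device with the Python cryptography library; the enrollment endpoint the CSR would be submitted to is platform-specific and therefore omitted:

```python
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec

def make_enrollment_csr(device_id: str) -> bytes:
    """Generate a fresh EC key and a PEM-encoded CSR for one simulated device."""
    key = ec.generate_private_key(ec.SECP256R1())
    csr = (
        x509.CertificateSigningRequestBuilder()
        .subject_name(x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, device_id)]))
        .sign(key, hashes.SHA256())
    )
    return csr.public_bytes(serialization.Encoding.PEM)

# Example: credentials for the first few simulated devices.
for i in range(3):
    pem = make_enrollment_csr(f"detector-{i:04d}")
    # submit `pem` to your platform's enrollment API here
```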


Is simulation an answer to IoT Testing challenges?

The complexity of IoT systems and their scalability challenges demand a ground-up, holistic approach to testing IoT system performance. The performance testing tools of tomorrow need to evolve to match the complexity and scale of IoT. At IoTIFY, we saw a clear need in the market to solve these challenges ahead of time, so we designed our network simulation software from the ground up to meet the needs of the future enterprise. We took the same software infrastructure components that are used in production today to build scalable cloud platforms and developed performance simulation software out of them. The result is the IoTIFY smart network simulator, a first-of-its-kind IoT performance testing solution designed for cloud platforms. Here are the key features that make it different.

Cloud First Approach

Our solution is horizontally scalable and can also be scaled down to a single machine. The result is seamless scalability across cloud, hybrid, or even local desktop deployments. The IoTIFY network simulator can truly match your IoT cloud platform and grow with it.

Advanced Simulation Capabilities

Features such as drive simulation under real traffic conditions, location functionality, and custom payload generation, together with several helper libraries, enable truly customizable simulations.
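As an illustration of what location functionality means for a simulated device (this is a generic sketch, not IoTIFY's scripting API), the snippet below interpolates GPS fixes along a segment with a little noise:

```python
import random

def gps_track(start, end, points=20, jitter=0.0001):
    """Yield (lat, lon) fixes linearly interpolated between two coordinates,
    with small random jitter to mimic GPS noise. Purely illustrative."""
    (lat0, lon0), (lat1, lon1) = start, end
    for i in range(points):
        t = i / (points - 1)
        yield (
            lat0 + (lat1 - lat0) * t + random.uniform(-jitter, jitter),
            lon0 + (lon1 - lon0) * t + random.uniform(-jitter, jitter),
        )

# Example: a short simulated drive (coordinates are arbitrary).
for lat, lon in gps_track((47.3769, 8.5417), (47.3900, 8.5500)):
    payload = {"lat": round(lat, 6), "lon": round(lon, 6)}
    print(payload)   # publish with your protocol of choice instead
```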

API Driven

We provide an API-driven standard interface which can be easily integrated with your existing test ecosystem. Test results can also be exported to any database of your choice.
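As an integration sketch (the results endpoint, authentication, and response shape are hypothetical placeholders, not a documented API), results could be pulled over HTTP and written to a local database like this:

```python
import sqlite3
import requests  # pip install requests

# Hypothetical results endpoint; substitute the actual API of your test tool.
RESULTS_URL = "https://api.example.com/v1/test-runs/1234/results"
API_TOKEN = "..."  # assumed bearer-token authentication

resp = requests.get(RESULTS_URL, headers={"Authorization": f"Bearer {API_TOKEN}"})
resp.raise_for_status()
results = resp.json()   # assumed: a list of {"metric": ..., "value": ..., "ts": ...}

conn = sqlite3.connect("perf_results.db")
conn.execute("CREATE TABLE IF NOT EXISTS results (metric TEXT, value REAL, ts REAL)")
conn.executemany(
    "INSERT INTO results (metric, value, ts) VALUES (:metric, :value, :ts)", results
)
conn.commit()
conn.close()
```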

Multiprotocol

The scripting is protocol agnostic, which means testers can easily create several versions of the same test with different protocols. We support CoAP, MQTT, HTTP, and LwM2M out of the box, and many other protocols will be supported in the future.
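The idea of protocol-agnostic scripting can be illustrated with a generic dispatcher (not IoTIFY's scripting API) that sends the same payload over MQTT or HTTP; the hostnames and topic are placeholders, and CoAP or LwM2M transports would plug in the same way:

```python
import json

import requests                      # HTTP transport
import paho.mqtt.publish as publish  # MQTT transport (pip install paho-mqtt)

# Hostnames, topic, and path are illustrative placeholders.
MQTT_HOST = "mqtt.example.com"
HTTP_URL = "https://ingest.example.com/telemetry"

def send(payload: dict, protocol: str) -> None:
    """Dispatch the same protocol-agnostic payload over the chosen transport."""
    body = json.dumps(payload)
    if protocol == "mqtt":
        publish.single("devices/telemetry", body, hostname=MQTT_HOST, qos=1)
    elif protocol == "http":
        requests.post(HTTP_URL, data=body,
                      headers={"Content-Type": "application/json"}, timeout=5)
    else:
        raise ValueError(f"unsupported protocol: {protocol}")

# The same test logic, exercised once per supported transport.
for proto in ("mqtt", "http"):
    send({"device": "detector-0001", "temperature_c": 22.5}, proto)
```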

Schedule a demo