SMS testing is a challenge regardless of location, device, or network. Due to inaccurate reporting and regional regulations, it’s almost impossible to consistently and accurately generate reports. DLR, or Delivery Receipt, is a feature of SMS message termination that reports whether the SMS message has been delivered. Typically, this report is sourced directly from the handset.
Telecom providers can report whether an SMS has been delivered, but with no regulations governing reporting, much of the data is inaccurate. To make matters more complex, many of the reports that are generated can only indicate whether the transaction succeeded or failed.
In this post, we demonstrate how we can leverage the HeadSpin platform for SMS testing in three high-volume countries: India, China, and Brazil, and discuss how each can be improved.
What is “active testing”?
Active testing means your service sends an SMS to a phone or device that you control and then checks whether the SMS was received and what aspects of the message changed in transit. The scope of active testing is larger than running unit or end-to-end tests: test SMS messages can be sent ad hoc when setting up a new service, or for detecting delivery issues on a particular carrier or within a specific country. The classic approach to “active testing” is to call your colleague in India and tell them you’re sending a message. Understandably, this approach doesn’t scale.
What does active testing look for?
Active testing is the best tool for determining whether a message sent via a provider will arrive at its destination looking as expected. Active testing can identify many of the following issues:
- Character encoding issues
- Consolidator substitution
- Concatenation behavior
- Content modification
- Sender ID modification
- Blocking or content filtering
- URLs
- Telephone numbers
- Email addresses
- Actual delivery (remember, DLRs can lie)
- Invalid/valid phone number support
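The comparison at the heart of active testing can be sketched as a simple diff between what was sent and what arrived on the test handset. The function below is illustrative only (the names and the encoding heuristic are our own, not a HeadSpin API); it flags two of the issues listed above, sender ID modification and content/encoding changes.

```python
# Hypothetical sketch: compare the SMS we sent with what actually arrived
# on the test handset, flagging the kinds of modification active testing
# is designed to catch. Function and field names here are illustrative.

def diff_sms(sent_sender: str, sent_body: str,
             received_sender: str, received_body: str) -> list[str]:
    """Return a list of issues observed between the sent and received SMS."""
    issues = []
    if received_sender != sent_sender:
        issues.append(f"sender ID modified: {sent_sender!r} -> {received_sender!r}")
    if received_body != sent_body:
        # If the received body equals the sent body with non-ASCII stripped,
        # a carrier likely mangled the character encoding.
        if sent_body.encode("ascii", "ignore").decode() == received_body:
            issues.append("character encoding issue: non-ASCII characters dropped")
        else:
            issues.append("content modified in transit")
    return issues
```

A real harness would pull the received values off the device (e.g. via an automation framework driving the handset) rather than passing them in directly.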
With HeadSpin, you can identify all kinds of telecom pitfalls from the user’s perspective including:
- Is your Alpha-Sender ID preserved? Shortcode?
- Are your US long codes working in Europe? EMEA? APAC?
- Are any of your numbers re-written?
- Does your app signup really work in the middle of Africa?
HeadSpin’s POP, or Point of Presence, phones allow any support or engineering team to have a physical presence across the globe.
Examples of failures identified with HeadSpin and “active testing” include:
- Identifying delivery failures in China.
- A customer’s improper phone-number regex configuration in African countries.
- Identifying carrier rewrites in numerous countries.
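The regex failure above is a common class of bug: a pattern tuned to one country silently rejects valid numbers elsewhere. A minimal sketch, assuming E.164 formatting (the `US_ONLY` pattern is a hypothetical example of the too-narrow configuration described above):

```python
import re

# E.164 allows "+" followed by up to 15 digits, the first being 1-9.
E164 = re.compile(r"^\+[1-9]\d{1,14}$")

# Hypothetical misconfiguration: a pattern hard-coded to US 10-digit numbers,
# which rejects perfectly valid numbers in, say, Kenya or Nigeria.
US_ONLY = re.compile(r"^\+1\d{10}$")

def is_valid_e164(number: str) -> bool:
    return bool(E164.fullmatch(number))
```

Active testing from a handset in the affected country surfaces this immediately: the signup form rejects the tester’s own number.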
Active SMS Testing in India, China, and Brazil
Let’s see how a leading CPaaS provider’s general (CPaaS SMS) and turnkey (CPaaS Turnkey) solutions work out of the box in three high-volume countries: India, China, and Brazil. India has a heavily regulated telecom system with many rules around when and what you can send. China, on the other hand, is constrained by the Great Firewall, which could, at any time, suspend or delay messages. China is also quite draconian about messages with illicit content; sending them may very well result in the suspension or loss of your SMS sending privileges. Brazil has historically had issues with carriers and Unicode, making it another fun country to experiment with.
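Those Unicode issues have a concrete cost: a body that fits the GSM-7 alphabet packs 160 characters per segment, while a single non-GSM character forces UCS-2 encoding and drops that to 70. The sketch below is a simplified estimate (it ignores the GSM-7 extension table, where characters like `[` count double):

```python
import math

# Core GSM-7 alphabet (simplified; extension-table characters omitted).
GSM7 = set(
    "@£$¥èéùìòÇ\nØø\rÅåΔ_ΦΓΛΩΠΨΣΘΞÆæßÉ !\"#¤%&'()*+,-./0123456789:;<=>?"
    "¡ABCDEFGHIJKLMNOPQRSTUVWXYZÄÖÑÜ§¿abcdefghijklmnopqrstuvwxyzäöñüà"
)

def segments(body: str) -> int:
    """Estimate how many SMS segments a body will occupy."""
    if set(body) <= GSM7:
        per = 160 if len(body) <= 160 else 153  # concatenated parts carry a header
    else:
        per = 70 if len(body) <= 70 else 67     # UCS-2 encoding
    return max(1, math.ceil(len(body) / per))
```

This is why the test plan below includes a plain body, a long body, and a Unicode body: each exercises a different encoding and concatenation path through the carriers.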
We’ll use the CPaaS Turnkey API’s default configuration to send OTPs. For India, this means it should deliver to recipients via a registered Alpha Sender ID. For CPaaS SMS, we’ll simply register a US long code with the provider to send directly to the HeadSpin POP handsets.
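Each test message boils down to the same request shape regardless of provider. The endpoint, field names, and masked numbers below are hypothetical (real CPaaS APIs differ), but the structure is the same: from-number, to-number, body.

```python
# Hypothetical sketch of the payload we build for each test message.
def build_sms_payload(from_number: str, to_number: str, body: str) -> dict:
    return {
        "From": from_number,  # registered US long code or Alpha Sender ID
        "To": to_number,      # HeadSpin POP handset, E.164 format
        "Body": body,
    }

payload = build_sms_payload("+1415XXXXXXX", "+91XXXXX", "Hello from San Francisco?!")
# POST this payload to the provider's message-send endpoint, then poll the
# POP handset's inbox to confirm arrival and compare the delivered body.
```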
Once we’ve found a phone to test with, the very first step is to simply text your own phone as a reality check. Honestly, it’s pretty amazing to initiate an SMS from the other side of the world and watch it show up on your own local phone within seconds. This first step just demonstrates that everything is working.
Discovery Data Points
The intent of this testing is to see the real-world SMS experience a user has on their phone in each location. We recorded the following data points:
- Mobile Number: Phone numbers associated with HeadSpin device
- Device: What kind of device?
- Location: Where is the device located?
- Carrier: Carrier the device is using
- Country to US: Are we able to send an SMS from the country to a device in the US?
- CPaaS SMS to Country: Are we able to send an SMS to the device from the US via the CPaaS SMS API?
- CPaaS Turnkey to Country: Are we able to send an SMS to the device via the CPaaS Turnkey API?
- Expected Body: Did the carriers process the body correctly?
- Message Body: “Hello from San Francisco?!”
- Long Message Body: “The lazy brown fox jumped over the fence. The lazy brown fox jumped over the fence. ……”
- Unicode Body: ★ ☆ ☇ ☈ ☉ ☊ ☋ ☌ ☍ ☏ ☐ ☒ ☓ ☚ ☛ ☜ ☞ ☟ ☡ ☤
- DNA = Did Not Arrive
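One way to record the data points above per device is a small result model; this is our own illustrative structure, not a HeadSpin schema, and it represents “DNA” as `None` in the latency fields.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SmsTestResult:
    mobile_number: str               # e.g. "+91XXXXX"
    device: str
    location: str
    carrier: str
    country_to_us_lag: Optional[str]  # None => DNA
    cpaas_sms_lag: Optional[str]      # None => DNA
    cpaas_turnkey_lag: Optional[str]  # None => DNA
    expected_body: bool
    unicode_issues: bool
    delivered_by: Optional[str]       # sender the handset actually saw

    def delivered(self) -> bool:
        """True if at least one delivery path reached the handset."""
        return self.cpaas_sms_lag is not None or self.cpaas_turnkey_lag is not None
```

Collecting results in a uniform shape like this makes the per-country tables below straightforward to generate.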
India Testing
A Note 5 Pro device on HeadSpin’s public cloud is depicted below. This is one of three devices we’re using to test telephony delivery in India. Each device uses a different carrier network and supports both automated and manual QA testing via any testing framework.
Data from Testing in India
| | Device 1 | Device 2 | Device 3 |
|---|---|---|---|
| Mobile Number | +91XXXXX | +91XXXXX | +91XXXXX |
| Device | iPhone 6 | iPhone 11 | XIAOMI Redmi Note 5 Pro |
| Location | Bangalore | Bangalore | Bangalore |
| Carrier | Airtel | Jio | Vodafone |
| India to US | No lag | No lag | No lag |
| CPaaS SMS to India | DNA | DNA | DNA |
| CPaaS Turnkey > India | No lag | No lag | No lag |
| Expected Body? | Yes | Yes | Yes |
| Unicode Issues? | No | No | No |
| Delivered By? | 5757556 | 5757556 / BZ-AUTH | MS503501 / 5676735 |
| Lag | No | No | Little |
India Conclusion
Admittedly, this is a pretty small sample size but it gives you a general idea of what kind of experience a user can have with different carriers in each location.
The most obvious issue here is the non-delivery of SMSs to India via the CPaaS SMS API. This is expected behavior as delivery to India typically requires jumping through several technological and bureaucratic hoops.
The real winner in this small sample size is CPaaS Turnkey. Obviously, they have nailed their implementation, as everything was delivered in a timely manner. The interesting observation is that, of the six attempts, only one seems to have been delivered via the Alpha Sender ID registered in India. This could be a side effect of intelligent retry logic or simply Indian carrier mischief.
China Testing
For testing in China, we focused on two major metro areas, Shanghai and Beijing. You’ll also note that each phone used a completely different carrier. As above, DNA denotes “did not arrive.”
Data from Testing in China
| | Device 1 | Device 2 | Device 3 |
|---|---|---|---|
| Mobile Number | +86XXXXX | +86XXXXX | +86XXXXX |
| Device | iPhone 7 | XIAOMI Mi A2 | iPhone 7 |
| Location | Beijing | Shanghai | Shanghai |
| Carrier | China Mobile | China Unicom | China Telecom |
| China to US | Zero lag | Zero lag | Zero lag |
| CPaaS SMS to China | DNA | Minimal lag | Minimal lag |
| Expected Body? | DNA | Yes | Yes |
| Unicode Issues? | DNA | Very fast | Very fast |
| Delivered By? | DNA | 13067643145 | 13136240203 |
| Lag | DNA | None | Few seconds |
| Multibody? | DNA | Supported | Supported |
| CPaaS Turnkey > China | DNA | DNA | DNA |
China Conclusion
Testing with HeadSpin in China resulted in the complete opposite experience from India. In this case, 33% of CPaaS SMS messages failed to arrive, and no CPaaS Turnkey messages sent to China arrived. As with most Voice or SMS API calls terminating on phones in China, misconfiguration or mischief by the Great Firewall is surely to blame.
Interestingly enough, the message sent from China to the US arrived more quickly than in any other test.
Brazil Testing
Testing in Brazil was limited to a single carrier network. Overall, Brazil demonstrated the best performance across devices. The carrier behavior did not vary between phones and provided the fastest experience across all three test countries.
Data from Testing in Brazil
| | Device 1 | Device 2 | Device 3 |
|---|---|---|---|
| Mobile Number | +55XXXXX | +55XXXXX | +55XXXXX |
| Device | iPhone XS | iPhone 6s | iPhone 7 |
| Location | São Paulo | São Paulo | São Paulo |
| Carrier | Vivo | Vivo | Vivo |
| Brazil to US | No lag | No lag | No lag |
| CPaaS SMS to Brazil | No lag | No lag | No lag |
| Expected Body? | Yes | Yes | Yes |
| Unicode Issues? | No | No | No |
| Delivered By? | 27199 | 27199 | 27199 |
| Lag | No | No | No |
| CPaaS Turnkey > Brazil | No lag | No lag | No lag |
| Expected Body? | Yes | Yes | Yes |
| Unicode Issues? | No | No | No |
| Delivered By? | 27199 | 27199 | 27199 |
| Lag | No | No | No |
Brazil Conclusion
Testing in Brazil provided the best results of the three countries. At the time of testing, there were limited Brazilian carriers to test with, but on a positive note, the experience was exactly the same across each device.
Why Active Testing Matters to You!
Working with telecom is a fickle business. When talking with customers, I frequently compare sending an SMS to sending a UDP packet. With both technologies, you initiate a send and the data is off, but you never know whether it reached its destination. Any number of issues can affect delivery:
- Queueing (is there a football game in Munich, a cricket game in Lahore?)
- Infrastructure Issues
- Political Suppression
- Grey Routes
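The UDP analogy can be made concrete: `sendto()` returns successfully as soon as the datagram is handed to the OS, regardless of whether anyone receives it, just as a successful SMS submit says nothing about handset delivery.

```python
import socket

# Fire-and-forget: send a UDP datagram to the local "discard" port, where
# (almost certainly) nobody is listening. The send still reports success.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
data = b"Hello from San Francisco?!"
sent = sock.sendto(data, ("127.0.0.1", 9))
sock.close()
# `sent` equals len(data): the OS accepted every byte, yet we have no idea
# whether the datagram was ever delivered, the same blind spot a DLR papers over.
```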
Another way to think of SMS telephony is to consider hiring a container ship to deliver goods to a port of call. You can hire a cheaper company that takes the long route around South America, or the more expensive company that goes through the Panama Canal. The latter route has a far higher likelihood of arriving on time at the port of call; your cheaper company’s container ship may end up at the bottom of the sea while rounding Cape Horn.
As a mobile developer, you’ll have a clear understanding of how your mobile experience will play out when leveraging HeadSpin’s robust suite of testing and performance monitoring solutions. You can take immediate control of a phone anywhere in the world with our Global Device Cloud and test to your heart’s content. Looking for a more robust long-term solution? Leverage your own private cloud for automated testing, or set up hourly, daily, or weekly testing and benchmark reporting to identify trends and issues around your KPIs.
HeadSpin can help you automate functional, performance, and load testing across applications, devices, and networks to optimize digital experiences enabled by mobile, web, IoT, and 5G. Want to give HeadSpin a try? We have Points of Presence in more than 90 locations, consisting of thousands of devices and endpoints. Reach out via our Contact Us form and we will coordinate a discussion and demo.
FAQs
Q1: What are the characteristics of active testing?
Active testing is the testing process where the tester interacts with the software and directly executes the test. It usually verifies that the critical parts of the application are working as intended. Some examples of active testing include functional testing and structural testing.
Q2: What is passive testing?
Passive testing is finding issues and bugs in the software without the tester interacting with the application, e.g., extracting and analyzing the application logs. Passive testing is practical when the regular run of the application cannot be interrupted, or the QA team does not have access to the application interface.
Q3: What is API testing?
API testing validates an API’s functionality, security, robustness, performance, and reliability. Automated testing of APIs is recommended as automated testing can identify risks and security vulnerabilities of APIs, which manual testing can easily miss.
Q4: Who performs acceptance testing?
User acceptance testing usually happens in the final stages of the SDLC, close to deployment. The customer or the end user typically performs it once the development team is reasonably sure that the application is bug-free.