Automating Biometric Authentication in iOS

July 1, 2019
By Prathitha Iyengar, Mobile Group

Hello, and welcome to HeadSpin’s webinar. My name is Joe Chasinga. I’m a software engineer at HeadSpin working on device instrumentation and the biometrics SDKs for both the iOS and Android platforms. Today, we’re going to be talking a little bit about the iOS biometrics SDK and how we can use it to automate our tests on the iOS platform. It’s a newly released SDK, alongside the Android one, which we released just a version ago. So, any feedback would be more than welcome.

Let’s start with what biometrics really are. Biometrics are the use of a user’s body measurements, and calculations derived from them, as metrics or keys in the process of authentication, identification, or access control.

The most common forms you’ll encounter today include fingerprint scanning, face recognition, iris scanning, and even skeleton detection, as you might have seen with the Xbox Kinect sensor.

Here’s a screenshot from the very popular movie “Minority Report” from back in 2002 – that was 17 years ago. As with many new technologies, it all started in sci-fi.

You see here that Tom Cruise is engaging in what’s now a common form of biometric authentication – iris scanning – which is pretty interesting. It’s wildly popular at this point.

Let’s jump into a more iOS-specific biometrics timeline here.

1. In 2013, Apple introduced its first biometric feature: Touch ID, in the iPhone 5S.

2. In the same year, it acquired a company called PrimeSense – the company behind the depth-camera technology that powers the Kinect sensor.

3. This acquisition may have laid the groundwork for the more recent Face ID technology, which Apple released in 2017 in the iPhone X and later models to replace Touch ID.

At this point, it’s worth noting that an Apple device can use either Touch ID or Face ID, depending on its model, but never both at the same time. Face ID is the successor to Touch ID and exists only in the iPhone X and later.

Let’s dive into the overall architecture of iOS biometric authentication from a really high level. You’ll see here that, depending on the model of your device, it’s going to have either a fingerprint scanner or a depth camera.

These biometric hardware sensors acquire inputs from the user. Then there’s the operating system – in this case iOS, of course – which manages how applications interact with the user’s biometric inputs via the iOS frameworks, or what’s collectively known as the Xcode environment.

Last but not least, here’s a very important component in the iOS biometrics process – it’s the microchip on the device called the Secure Enclave Processor.

This processor is totally isolated from iOS: it has its own operating system and runs its own kernel. It stores the most sensitive data, such as cryptographic keys and raw biometric data. The iOS operating system cannot access this data directly.

So, here comes the issue with automating biometric tests on the iOS platform: there’s just no way to interact with or sidestep the biometric prompt in the test flow without physical intervention, and there’s also no way to programmatically interact with the Secure Enclave Processor.

So, what should we do? One way is to hire people all over the world to run a biometrics click farm for all the biometrics-capable iOS devices you want, where locals can just run their fingers or faces against all the devices in real time.

That’s interesting – that could have been a great business model to launch in the near future. In the meantime, we’re just going to offer a more developer-first solution: a library, or an SDK, that developers can easily swap in as a mock framework to facilitate automation gracefully in the software layer, without any physical intervention.

Here’s where the SDK sits – it lives in your normal Xcode environment, and it sidesteps the biometrics prompt at the software layer.

Here’s a video demo of me testing a sample app called the Authenticator app. What this app does is present a login page with a red background and a login button that triggers the biometrics prompt. If the biometrics process is successful, it changes to a logged-in page with a green background.

So, let’s see how it goes. I just click the login button here and that’s it – it’s pretty quick. Let me repeat that, because it was so quick – there we go. Using my face, I can authenticate in the app without a problem.

And here’s another version of the demo with the same app – the Authenticator app – but this time, we are using the HeadSpin biometrics SDK. This means that, without my face, we’re going to try to use a remote HTTP request to do the same thing.

You see here that this is pretty much a plain old-school curl command. I just send my token in a POST request, and just like that, I’m in – without my face in front of my device.

I’ll just repeat that. That’s a POST request to our HeadSpin API endpoint with the action value succeed – I want it to succeed. And then, voilà, just like that, it works.
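
For reference, the request looked something like the sketch below. Only the action field comes from the demo; the endpoint path and the header name are placeholders I’m assuming here, so check the HeadSpin documentation for the real ones.

```sh
# Hypothetical sketch of the demo request; the endpoint and the auth
# header are placeholders, not the documented HeadSpin API.
curl -X POST \
  -H "Authorization: Bearer <your-headspin-api-token>" \
  -H "Content-Type: application/json" \
  -d '{"action": "succeed"}' \
  "https://<headspin-api-host>/<biometrics-endpoint>"
```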

We’re going to get into the code just a little bit so that you can get started with our SDK. But it’s worth noting, before you install it, that our SDK only works with iOS version 9 or later, because it depends on an API that’s only available in those versions. And also, please don’t forget to include the:

‘NSFaceIDUsageDescription’ key in ‘Info.plist’

This applies regardless of whether you’re going to use our SDK or not; if you forget it, your authentication process will likely fail. The SDK is not trying to do anything magical, so please make sure that your target device can actually authenticate using its respective Touch ID or Face ID.
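
For example, the entry in Info.plist looks like this; the usage string is just an example message:

```xml
<!-- In Info.plist; the usage string shown is only an example. -->
<key>NSFaceIDUsageDescription</key>
<string>This app uses Face ID to log you in.</string>
```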

In short, it means the device needs to have at least one legitimate fingerprint or face enrolled for its biometrics to work – and for our SDK to work, too, because it would be freaky if it worked without one.

Last but not least – this goes without saying – please do not distribute your test build publicly. There’s a real risk of people hijacking your app’s biometrics prompt over its TCP connections, so only trust HeadSpin and no one else.

Installing HeadSpin’s iOS biometrics SDK

Let’s get started here. To install the HeadSpin biometrics framework, just open up Xcode. I believe this is how most of your Xcode environments look. For your information, this is Xcode version 10.2.

Under your project target, under the General tab right here, find the Embedded Binaries section. Then, you can either click the add button and locate your downloaded biometrics.framework file, or drag the framework right in here to add it to your project, and then clean your build folder.

Make sure your build passes, and in your code, try to import the biometrics module. If you’re using Swift, just type “import biometrics” and wait a few seconds. If Xcode does not complain, then you’re good to go.

And please do not forget to install the SDK’s dependency: the SDK relies on a really popular networking library called CocoaAsyncSocket.

For those of you who are into iOS development, you’ve probably heard of this library. It’s awesome. We recommend using CocoaPods to manage your project’s dependencies, including the SDK’s.

To install CocoaAsyncSocket, just add a Podfile like the sketch below to your project. If you already have an existing Podfile, just integrate the relevant parts into it. Then you can run “pod install” or “pod update” to install the library.
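
A minimal Podfile along these lines should do it; the target name here is a placeholder for your own app target:

```ruby
# Minimal Podfile sketch; 'Authenticator' is a placeholder target name.
platform :ios, '9.0'

target 'Authenticator' do
  use_frameworks!
  pod 'CocoaAsyncSocket'
end
```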

So, once you’re set up, let’s go into the nooks and crannies and do a walk-through. This is the sample app that I showed you previously in the demo: the Authenticator app.

Here is the view controller that controls the login view of the application.

This is the conventional version, without our biometrics SDK. You have to import LocalAuthentication here and then, to begin with, initiate an instance of LAContext – you’re going to be interacting with this context instance most of the time.

This part is where you actually set up your application state. What’s really worth noting here is that you can inquire about the type of biometrics available on the device. So right here, I just want to make sure that if it’s Face ID, I show this label, and if it’s something else, like Touch ID, I hide it.

Right after this is the “viewDidLoad()” method. This function gets called when the view has finished loading. Just make sure you call “canEvaluatePolicy” on the context instance here. It returns a Boolean value telling you whether your device is capable of evaluating biometrics or not. It’s always a good idea to call this method early on, before presenting any kind of interaction to the users.
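
Put together, the setup described so far looks roughly like this. It’s a minimal sketch, not the exact sample-app code: the view controller name and the faceIDLabel outlet are placeholders I’m assuming for illustration.

```swift
import UIKit
import LocalAuthentication

class LoginViewController: UIViewController {
    // You interact with this context instance most of the time.
    var context = LAContext()

    // Placeholder outlet for the Face ID hint label mentioned above.
    @IBOutlet weak var faceIDLabel: UILabel!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Ask early on whether the device can evaluate biometrics at all.
        var error: NSError?
        let canUseBiometrics = context.canEvaluatePolicy(
            .deviceOwnerAuthenticationWithBiometrics, error: &error)

        // biometryType is populated only after canEvaluatePolicy has run,
        // and it requires iOS 11 or later.
        if #available(iOS 11.0, *), canUseBiometrics {
            // Show the label only when the device uses Face ID.
            faceIDLabel.isHidden = (context.biometryType != .faceID)
        } else {
            faceIDLabel.isHidden = true
        }
    }
}
```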

Here is where the user interacts with the Authenticator app. This “tapButton” method gets called when the login button is pressed. You’ll notice here that we actually initiate a new LAContext instance and reassign it to the same context variable. This is because you want a fresh context for every login, instead of reusing the previous one; otherwise, there’s a chance that all subsequent logins will just succeed, which is usually not something you want.

Then, you can set the cancel title – the message on the cancel button shown to the user. Right here you see that we call “canEvaluatePolicy” on the context again, but this time it’s inside a conditional “if” block, because you don’t want to run any of the code in the block if your device cannot do biometrics.

Here’s the meat of the biometric authentication: the “evaluatePolicy” method. You call this method on the context instance and pass it a callback function – this is really important – and this callback function accepts two parameters: the success flag and the error.

Just make sure to handle the success case. In this case, we’re just going to toggle the app state to “loggedin”, which triggers the change in the UI to the green logged-in page.

In the error case, you can catch several types of errors here. For instance, userCancel is raised when the user canceled the prompt using the cancel button; appCancel is when your own application code canceled the prompt; and systemCancel is when the system canceled it, for example because another app came to the foreground.

In the case of errors, please always make sure to provide a meaningful fallback message to the users. This is important for the interaction and the user experience, and it lets the user gracefully fall back to typing the conventional username and password.
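
Here’s how all of those pieces fit together in the “tapButton” handler. It’s a sketch of the flow just described; the state property and the showFallbackLogin() helper are placeholders I’m assuming, not part of the LocalAuthentication API.

```swift
@IBAction func tapButton(_ sender: UIButton) {
    // Use a fresh context for every login attempt instead of
    // reusing the previous one.
    context = LAContext()
    context.localizedCancelTitle = "Use username/password"

    var error: NSError?
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        showFallbackLogin()   // placeholder: conventional username/password UI
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Log in to your account") { success, error in
        // The reply callback is not delivered on the main queue.
        DispatchQueue.main.async {
            if success {
                self.state = .loggedin   // placeholder app-state property
                return
            }
            switch (error as? LAError)?.code {
            case .userCancel?:
                // The user tapped the cancel button on the prompt.
                self.showFallbackLogin()
            case .appCancel?, .systemCancel?:
                // The app or the system dismissed the prompt.
                self.showFallbackLogin()
            default:
                self.showFallbackLogin()
            }
        }
    }
}
```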

Let’s now go through this code again, but this time I’m going to swap in the biometrics module instead of the conventional context. Right here, please do not forget to import the biometrics module, and instead of creating an LAContext instance, you’ll be creating one of type HSLAContext.

Before that, there’s a detail you need to remember: always create a wrapper. Let’s create a wrapper here – it’s just an LAContext wrapper – and then only acquire the instance of HSLAContext using the wrapper’s createContext factory method.

The reason for this is that the wrapper itself persists the TCP connections in the app, so that the context can actually respond to the remote HTTP request calls.

If you use the HSLAContext constructor directly at this point, then whenever you create a fresh context, the TCP connections can get disconnected. So please make sure you remember this when you’re using our SDK.

And at this point, since you’re using the same context variable name, all the code stays pretty much the same, because the SDK was designed to require really few changes on your side. In the part where you actually need to initiate a fresh context, use the wrapper’s createContext method, reassign the result to your context variable, and you should be good to go.

One more thing: you might want to call the close method on the HSLAContext as well, because this helps clean up any TCP connections before the view gets destroyed.
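
Put together, the swapped-in version looks roughly like this. It’s a sketch based on the walkthrough: HSLAContext, createContext, and close are the names mentioned above, but the wrapper type name HSLAContextWrapper is a placeholder I’ve invented, and the SDK’s exact signatures may differ.

```swift
import UIKit
import LocalAuthentication
import biometrics   // the HeadSpin biometrics SDK module

class LoginViewController: UIViewController {
    // Keep the wrapper alive for the view's lifetime: it persists the
    // TCP connections the SDK uses to receive the remote requests.
    // "HSLAContextWrapper" is a placeholder name, not the documented one.
    let wrapper = HSLAContextWrapper()
    var context: HSLAContext!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Always acquire contexts through the wrapper's factory method,
        // never through the HSLAContext constructor directly.
        context = wrapper.createContext()
    }

    @IBAction func tapButton(_ sender: UIButton) {
        // A fresh context per login attempt, as before, but obtained via
        // the wrapper so the TCP connections stay up.
        context = wrapper.createContext()
        // ...then canEvaluatePolicy / evaluatePolicy exactly as in the
        // plain LAContext version shown earlier...
    }

    deinit {
        // Clean up the SDK's TCP connections before the view goes away.
        context.close()
    }
}
```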

One more thing here, on the error side: there’s a new extra error called HSBiometricError.intentionalError. This error gets raised when the user sends an HTTP request – the POST request I showed earlier – with an action value of error instead of succeed. This is for the case when you want to test what happens when the biometric prompt fails, so you can always catch this intentional error. As always, please do make sure that you provide a graceful fallback to users.
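
Catching it looks roughly like this inside the evaluatePolicy callback – again a sketch, assuming HSBiometricError is a Swift error enum, as the name suggests:

```swift
context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                       localizedReason: "Log in to your account") { success, error in
    if success {
        // ...the normal logged-in path...
    } else if let hsError = error as? HSBiometricError,
              case .intentionalError = hsError {
        // Raised when the remote request was sent with action "error":
        // the deliberately failed prompt used to test the failure path.
        self.showFallbackLogin()   // placeholder fallback, as before
    }
}
```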

We’re going to try to run this code and see if it’s successful. Awesome – the build succeeded. So, as you can see here [on my phone], this is the same app right here. When I press the login button, I can still use my face to authenticate. The only extra perk you get is that you can use the HeadSpin API endpoint to do this instead of your face.

Okay, it seems like we’ve concluded early. Thank you, everyone, for joining this webinar. For your information, there’s going to be a separate webinar tomorrow, at the same time, on the biometrics SDK for Android – so please hurry and register. Thank you.

Webinar Q&A

Q: How do I get the SDK?

A: Please sign up with HeadSpin and you’ll get access to all our awesome products, including the SDK downloads for both iOS and Android.

Q: Does it work with Objective-C projects?

A: Yes, it does. The SDK is written in Swift, so in your implementation file – the .m file – please import the header file instead. The name is biometrics.h, and it will work as is.

Q: Can I test using the SDK on a local device?

A: Yes, there’s nothing stopping you from testing on a local device, but you’ll need to use a TCP client and a special protocol for that. So, I highly recommend that you test on HeadSpin devices.

Q: Where can I read the documentation to learn more?

A: The documentation is available to all HeadSpin customers. If you’re interested, please get in touch with me or through any other HeadSpin channel, and we’d be more than happy to walk you through it.


About HeadSpin

HeadSpin helps Telcos, media organizations, and large enterprises analyze and improve the user experience of their digital products through its global real device infrastructure, on the edge end-to-end testing, and ML-driven performance and quality of experience analytics.

The HeadSpin data science platform enables collaboration among global teams to accelerate release cycles, build for complex real user environments, and proactively detect and resolve issues whether at the code, device, or network layer. HeadSpin currently works alongside a number of global telco and media organizations to:

  • Monitor and improve 5G user experience
  • Improve streaming experience for OTT apps
  • Test and optimize data, voice, and messaging services
  • Assess and validate device compatibility
  • Offer regression insights for accelerated development
  • Deploy software at the edge

About Infosys

Infosys is a global leader in next generation digital services and consulting. We enable clients in 50+ countries to navigate their digital transformation. With over three decades of experience in managing the system and workings of global enterprises, we expertly steer our clients through their digital journey. Visit www.infosys.com to see how Infosys (NYSE:INFY) can help your enterprise navigate your next.