Simply put, authentication is the act of verifying one's identity. In a social setting, authentication can be found in various forms – presenting your social security number, ID, passport, or a plain written signature. We can go as far as stating that simply recognizing a person in the office hall is also a form of authentication. However, authentication as we know it belongs in computing. The first things that pop to mind are the ever-present PINs and passwords, followed by OTPs and fingerprint authentication.
The whole story fits into 60 (seemingly short) years of continuous authentication improvements and development. To see how the history of authentication progressed to what we're dealing with today, ASEE prepared a comprehensive authentication timeline showcasing important milestones through decades.
The history of authentication begins with passwords in the 1960s, when computers first moved beyond single-operator use. Back then, computers were enormous, ridiculously expensive, and slow compared to today's standards. The fact that only a handful of universities and large-scale enterprises owned an actual computer made demand exceptionally high. To address this demand, universities such as MIT introduced time-sharing operating systems like the Compatible Time-Sharing System (CTSS), which enabled multiple users to share the resources of a single computer.
Solving the demand issue created a new challenge: the privacy of files within a shared filesystem. It is important to note that the users were researchers and students using the computer to calculate and store their research and new findings – nobody was concerned about losing their cute_dogs folder. Essentially, everyone had access to everything. The solution emerged in 1961 with Fernando Corbató – an MIT researcher and later professor.
Corbató implemented a simple password program. And by simple, we mean rudimentary – the passwords were stored in a plaintext file within the filesystem. It didn't take long for someone to crack the code. Allan Scherr, in an attempt to extend his limited four-hour session on the MIT computer, simply found the location of the password file and printed the complete list of user passwords.
The conclusion was clear: storing passwords in a plaintext file is a terrible idea. This mistake pushed development toward the next milestone in the history of authentication. Robert Morris took a concept from cryptography – the hash function. A hash function maps a password to a fixed-size digest that is easy to compute in one direction but infeasible to reverse, so the system can store the digest instead of the password itself.
As hashing became a standard, hackers managed to find their way around it, for instance with precomputed tables of common password hashes. Additional protection was introduced in the form of salts – random values combined with each password before hashing, making every stored hash unique and far more difficult to crack. Hash-based password storage was truly a breakthrough at the time.
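To make the idea concrete, here is a minimal sketch of salted password storage using Python's standard library. The function names and the iteration count are our own choices for illustration; real systems should rely on a vetted password-hashing library.

```python
import hashlib, hmac, os

def hash_password(password: str, salt=None):
    """Derive a salted hash using PBKDF2-HMAC-SHA256 (illustrative iteration count)."""
    salt = salt or os.urandom(16)  # fresh random salt for every password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, expected: bytes) -> bool:
    """Re-derive the hash with the stored salt and compare in constant time."""
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, expected)

salt, stored = hash_password("hunter2")
print(verify_password("hunter2", salt, stored))  # True
print(verify_password("hunter3", salt, stored))  # False
```

Because each user gets a different random salt, two users with the same password end up with different stored hashes, which is exactly what defeats precomputed lookup tables.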
Besides hashing, additional cryptography techniques proved to be beneficial for the further development of authentication. Asymmetric cryptography, the foundation of public key infrastructure, is one of those techniques. Without going into too much detail on how the technology works, it is based on a key pair: a public key that is safe to share with the rest of the world, and a private key kept secret and used for digital signing. Anyone holding your public key can verify a signature produced with your private key. A digital certificate ties the two together with an identity – it binds a public key to its owner and is itself signed by a trusted authority to vouch that the key really belongs to you.
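The sign-with-private, verify-with-public idea can be sketched with textbook RSA in a few lines. This is a toy for illustration only – the primes are chosen for brevity and there is no padding; real systems use vetted libraries and schemes such as RSA-PSS.

```python
import hashlib

# Toy RSA signing - educational sketch, NOT production cryptography.
p = 2**127 - 1            # Mersenne primes, picked so the example stays short
q = 2**89 - 1
n, phi = p * q, (p - 1) * (q - 1)
e = 65537                 # common public exponent (shared with the world)
d = pow(e, -1, phi)       # private exponent (kept secret; Python 3.8+)

def sign(message: bytes) -> int:
    """Sign a message digest with the PRIVATE key."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, signature: int) -> bool:
    """Verify a signature with the PUBLIC key."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h

sig = sign(b"I am really Alice")
print(verify(b"I am really Alice", sig))    # True
print(verify(b"I am really Mallory", sig))  # False
```

The asymmetry is the whole point: everyone can check the signature, but only the holder of the private key could have produced it.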
However, widespread use of public key infrastructure didn't happen until the 1990s. The reason is that the underlying technology was highly classified and present only within government institutions. Once the technology became available to the public, PKI and digital certificates took on an important role in the authentication ecosystem, covering the "something you have" part of the SCA (Strong Customer Authentication) requirement.
As security measures grew stronger, so did the attackers. This forced the security industry to constantly step up its game and look for more secure authentication solutions. Static passwords were no longer reliable: hackers could steal, intercept, or guess a password and replay it as many times as they wished. With this in mind, an idea was born – what if a user had a completely different password for each time they wanted to access a service? This is the era when One-Time Passwords (OTPs) were born.
When developing the OTP authentication concept, two main challenges needed to be addressed: how to generate a password that is unique for every login attempt yet impossible for an attacker to predict from previously used ones, and how to keep the user's token and the server in sync, so that both sides independently agree on which password is currently valid.
Once these challenges were answered, the final product was a time-based OTP delivered through a dedicated piece of hardware. Over the years, OTP standards grew to include time-based, challenge/response, hash-based, and event-based methods. OTP delivery eventually shed the need for a specific hardware device – One-Time Passwords are now delivered through SMS, email, or specialized mobile applications.
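The hash-based and time-based variants mentioned above are small enough to sketch directly. The snippet below follows the HOTP construction from RFC 4226 (event-based, using a counter) and derives a TOTP from it as in RFC 6238 (the "counter" is simply the current 30-second window).

```python
import hashlib, hmac, struct, time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """Event-based OTP (RFC 4226): HMAC-SHA1 over a counter, dynamically truncated."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                                   # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

def totp(secret: bytes, period: int = 30) -> str:
    """Time-based OTP (RFC 6238): counter = number of elapsed 30-second windows."""
    return hotp(secret, int(time.time()) // period)

# RFC 4226 test vector: secret "12345678901234567890", counter 0 -> "755224"
print(hotp(b"12345678901234567890", 0))  # 755224
```

Both sides (token and server) hold the same secret; as long as they agree on the counter or the clock, they independently compute the same short-lived password – which answers both challenges at once.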
Remember how public key infrastructure was kept secret for 20 years? Well, it's the late 90s, and it's finally going public. What made PKI a necessity in this era was the World Wide Web: with tons of sensitive information online, knowing who was accessing a particular service became imperative.
We already mentioned how PKI involves digital certificates and private/public key pairs. This laid the groundwork for securing web traffic: the SSL (Secure Sockets Layer) protocol, which introduced server authentication and session keys, was developed first, and was later succeeded by the better-known TLS (Transport Layer Security) protocol, whose design also drew on the earlier SP4 protocol.
Over time, PKI technology's role became the following: creation, storage, and distribution of digital certificates. This role also includes having a:
- certificate authority (CA) that issues and signs digital certificates;
- registration authority (RA) that verifies the identity of entities requesting certificates;
- certificate database that stores issued certificates and their metadata;
- mechanism for revoking certificates that are compromised or no longer valid.
Multi-Factor Authentication (MFA) requires a combination of multiple authentication factors to verify someone's identity. With OTPs from the 80s, we already scratched the surface of this topic. The 2000s eliminated the need for a dedicated hardware device for generating dynamic passwords: emerging technologies and digitalization brought mobile applications specialized for MFA and for generating user OTPs.
Three main authentication elements required for successful identity verification are the following:
- Knowledge – "something you know", such as a password or PIN;
- Possession – "something you have", such as a hardware token or mobile device;
- Inherence – "something you are", such as a fingerprint or facial scan.
Successful authentication implies combining two or more of these elements. If an attacker gets hold of your password, they would also need access to the second or even third factor you used to authenticate yourself. This puts the hacker at a serious disadvantage.
Single Sign-On was another big leap in the authentication ecosystem. The whole premise upon which SSO is built is that users are not to be trusted with passwords. We don't take them seriously enough, reuse the same password over and over, and sometimes share them with friends, family members, and colleagues. It is also no wonder that passwords tend to be forgotten, since we need one for every single service we sign up for. For these and many other reasons, the idea of a trusted third party responsible for identity verification came to life.
Single Sign-On is based on a trusted third party, which eliminates the need to verify a separate set of credentials on every single site. When a user logs in to a website, that website checks whether an SSO provider has already authenticated that user. If not, the user must log in; if so, the user is granted access. Although convenient, SSO brings certain risks. For example, if your Gmail account is compromised, every website or service you signed in to with Gmail is also at risk.
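The "check with a trusted third party" step usually boils down to verifying a signed token rather than a password. The sketch below is a hypothetical, much-simplified model of the idea (loosely JWT-shaped): the identity provider signs a claim, and any service holding the verification key can trust it without ever storing the user's password.

```python
import base64, hashlib, hmac, json, time

# Toy SSO token - hypothetical key and format, for illustration only.
IDP_KEY = b"shared-secret-between-idp-and-services"

def issue_token(user: str, ttl: int = 3600) -> str:
    """Identity provider: sign a claim saying 'this user authenticated here'."""
    claim = json.dumps({"sub": user, "exp": int(time.time()) + ttl}).encode()
    sig = hmac.new(IDP_KEY, claim, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(claim).decode() + "." + sig

def check_token(token: str):
    """Any service: verify the signature instead of keeping its own passwords."""
    body, _, sig = token.partition(".")
    claim = base64.urlsafe_b64decode(body)
    expected = hmac.new(IDP_KEY, claim, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None                        # forged or tampered token
    data = json.loads(claim)
    return data["sub"] if data["exp"] > time.time() else None

token = issue_token("alice")
print(check_token(token))        # alice
print(check_token(token + "0"))  # None (signature no longer matches)
```

This also makes the risk visible: every service trusts the same token, so a compromised identity-provider account compromises them all.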
Biometric authentication uses physical features to determine the identity of the user. This includes fingerprint scanning, facial recognition, voice patterning, iris scanning, etc. Essentially, biometrics cover the "something you are" authentication element within the SCA requirement.
To everyone's surprise, iPhones weren't the first to incorporate biometrics into their offering. The first smartphone with fingerprint authentication was the Android-powered Motorola ATRIX, released in 2011. Apple followed in 2013 with Touch ID, then swapped the convenient fingerprint feature for Face ID in 2017, which projects 30,000 infrared dots to scan the user's face and determine their identity.
Today, biometrics are part of our routine – unlocking our smartphones, confirming online purchases, and more. They're extremely hard to spoof and are considered among the most secure authentication methods out there. However, debates about privacy concerns, and the legal questions they raise, are still ongoing.
What more can we expect from authentication? In the 2010s, one of the most popular topics in authentication and online payment security was UX: how do you achieve a user experience that is truly frictionless? This is where behavioral authentication gets to shine.
Picture this. You go to checkout, tap on the buy now button, and that's pretty much it. No PINs, passwords, CVV codes, not even a fingerprint scan. A truly frictionless user experience. Actually, stop imagining – it's already here, and it's pretty neat!
Behavioral authentication is based on the user's unique patterns displayed while interacting with a device – smartphone, desktop, or tablet. Authentication factors include everything from the pressure you apply to the keyboard to the angle at which you hold your smartphone. This eliminates friction and is less invasive for the end user. An added bonus is that there are no additional sensor costs, since the technology builds on existing hardware.
Behavioral pattern factors include:
- typing rhythm and keystroke dynamics;
- touchscreen pressure and swipe gestures;
- the angle and motion of the device while it is held;
- mouse movement and navigation patterns.
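As a flavor of how such a factor might work, here is a toy keystroke-dynamics check. The threshold and the profile values are made up for illustration; real systems use far richer features and statistical models, not a single distance cutoff.

```python
# Toy keystroke-dynamics sketch: compare the rhythm of a login attempt
# against an enrolled typing profile (intervals between keystrokes, seconds).

def mean_abs_deviation(enrolled, attempt):
    """Average absolute difference between enrolled and observed intervals."""
    return sum(abs(a - b) for a, b in zip(enrolled, attempt)) / len(enrolled)

def matches_profile(enrolled, attempt, threshold=0.05):
    """Accept the attempt if its rhythm is close enough to the profile."""
    return mean_abs_deviation(enrolled, attempt) < threshold

profile = [0.11, 0.23, 0.15, 0.31, 0.12]   # learned during enrollment
genuine = [0.12, 0.21, 0.16, 0.30, 0.13]   # same user, slight variation
imposter = [0.25, 0.08, 0.33, 0.12, 0.27]  # right password, wrong rhythm

print(matches_profile(profile, genuine))   # True
print(matches_profile(profile, imposter))  # False
```

The appeal for UX is clear from the example: the user types their password as usual, and the rhythm check happens invisibly in the background.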
It is gaining a lot of popularity, especially in the banking and finance industries, due to the sensitive nature of the information they handle. However, mobile banking users are not the only ones that fall under the "sensitive information" category. With more and more services demanding personal details (SSN, address, email, phone number) and financial information (credit/debit cards), it is time to raise awareness in other industries as well. Examples of vulnerable areas include the gaming industry, healthcare, IoT, telecoms, and government services available online.
Besides fintech, a combination of biometric and behavioral factors has great potential in the customer service department. Behavioral data is a pillar for building a customer-centric approach, allowing companies to better understand and tend to their customers' needs. With the ongoing development of behavioral authentication, we're going to witness heavy application in customer service areas.
The SxS Authentication Solution by ASEE, based on the Multi-Factor Authentication concept, provides a wide variety of strong customer authentication methods (such as MAC, challenge/response, and One-Time Password) and combines them with biometrics and risk analysis mechanisms to provide the highest security standards for users, regardless of whether they're using hardware or software tokens.
The SxS Authentication solution easily integrates with various services (e.g., e-commerce, internet banking, and enterprise applications) and numerous devices (e.g., mobile/software tokens, EMV CAP, display cards, and hardware tokens). The solution is in line with the latest PSD2 requirements and offers flexible deployment: cloud hosting (SaaS) or on-premises at the client's site.
ASEE provides everything you need to build your security ecosystem.
We're excited to participate in the history of authentication as well as contribute to reliable and user-friendly security practices by delivering top-notch solutions.
In case you're curious, feel free to contact us - zero obligation. Our ASEE team will be happy to hear you out.