Why Should You Protect Sensitive Data Like PII and Credit Card Numbers with Tokenization?
Learn how to de-identify or pseudonymize sensitive data with Tokenization
Tokenization is the process of generating a non-sensitive identifier for a given sensitive data element. That non-sensitive identifier is called a Token. Think of a Token as a random UUID.
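As a rough illustration of this idea, here is a toy in-memory vault (hypothetical, not Strac's implementation) that generates a random-UUID token for each sensitive value and keeps the only mapping back to it:

```python
import uuid

# Toy, in-memory vault purely to illustrate the concept. A real
# tokenization service keeps this mapping in hardened, access-controlled
# storage, not a Python dict.
class TokenVault:
    def __init__(self):
        self._store = {}  # token -> sensitive value

    def tokenize(self, sensitive_value: str) -> str:
        # The token is a random UUID: it carries no information about
        # the value it references, so stealing it reveals nothing.
        token = f"tkn_{uuid.uuid4().hex}"
        self._store[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can map a token back to the original value.
        return self._store[token]

vault = TokenVault()
token = vault.tokenize("123-45-6789")  # an SSN, for example
print(token)                            # e.g. tkn_9f0c4a... (random every time)
print(vault.detokenize(token))          # 123-45-6789
```

Note that two calls to `tokenize` with the same input produce different tokens here; deterministic schemes (useful for querying) are discussed later.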
A Token does not have any intrinsic or exploitable meaning or value. In layman's terms, that means: If someone steals a Token, no harm can be done because the Token in and of itself is meaningless. It is just a reference to the sensitive data.
Tokenization is the technical solution for De-identification & Pseudonymization. De-identification is the process used to prevent someone's personal identity from being revealed. Pseudonymization is a data management and de-identification procedure by which personally identifiable information fields within a data record are replaced by one or more artificial identifiers, or pseudonyms.
Let's look at how the world would work without Tokenization, starting with how sensitive data would be stored in our database.
Although the sensitive fields would be encrypted at rest using the database server's encryption key, anyone who accesses the data sees it in plain text, aka raw form.
At a high level, there are four use cases for how data is stored in and retrieved from the database:

- Collecting sensitive data from customers
- Displaying sensitive data back to customers
- Sending sensitive data to third-party partners
- Querying sensitive data
Multiple services within your cloud touch this sensitive data to perform these four broad use cases. These services can be broadly categorized into Application Servers, Networks, Log Files and Databases. Internal employees will consume these services.
Take a step back and consider how many such services touch this sensitive data, and think of the security risks each one introduces: vulnerabilities at the application server (compute), the database server (storage), Identity & Access Management (IAM), the network, internet access, or the humans themselves!
Here is the simplest cloud server farm of a small company. Over time it grows so much that no single person knows the architecture of the entire company, much less the sensitive data flowing through this network of servers.
The idea of Tokenization is compelling because storage and compute services never deal with sensitive data. Sensitive data is tokenized, and the plain-text version is isolated to a Tokenization service; only that service can perform actions on the sensitive data.
Let's talk about the four broad use cases we discussed earlier and how they will be achieved in this new world with Tokenization.
Read more on how to avoid storing any sensitive data on your servers and still function.
It all starts with a simple HTML form and some basic JavaScript to collect any data, and the same goes for sensitive data. In the old world, sensitive data goes from the browser/app to a server API endpoint and gets passed around to multiple services until it hits the service that is the single source of truth. All of those services touch sensitive data when they don't need to, increasing the company's security and compliance risk burden.
In the new world, the sensitive data is accepted via input fields that are part of an iFrame. The Tokenization provider hosts this iFrame. For example: As Strac is the Tokenization provider, Strac provides UI Components. With Strac's UI Components, the parent page can never access sensitive data; therefore, sensitive data will never touch the business' server. Strac will tokenize the sensitive data and return those tokens to the UI application.
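To make the flow concrete, here is a minimal sketch (hypothetical handler and field names, not Strac's API) of what the business server receives in this new world: tokens, never raw values.

```python
# Hypothetical business-server handler in the tokenized world. The
# request body produced by the provider-hosted UI components contains
# tokens in place of the raw sensitive values, so this server only ever
# stores tokens.
def handle_signup(form_data: dict) -> dict:
    record = {
        "name": form_data["name"],            # non-sensitive, stays plain
        "ssn_token": form_data["ssn_token"],  # token returned by the iFrame
    }
    # ... persist `record` to the application database ...
    return record

record = handle_signup(
    {"name": "Jane", "ssn_token": "tkn_lT8RtnYLfpmfecvAfWqzlMnO"}
)
print(record["ssn_token"])  # tkn_lT8RtnYLfpmfecvAfWqzlMnO
```

Even if this server, its logs, or its database are compromised, the attacker obtains only meaningless tokens.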
Read more on how to collect sensitive data on your apps without touching it.
In the PII/PHI world, displaying data collected from customers/patients is pretty standard. For example: showing the last four digits of the SSN, a date of birth, or something as simple as a first/last name. Since the sensitive data is tokenized with a Tokenization provider like Strac, the same Strac UI Components also take care of displaying sensitive data securely.
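The masking idea itself is simple. A hedged sketch: the tokenization service, which alone can resolve the token to the real value, returns only a masked form to the application (the helper name and mask format below are illustrative, not Strac's):

```python
# Hypothetical helper running inside the tokenization service: given the
# detokenized SSN, return only the masked "last 4" form to the caller,
# so the full value never leaves the service.
def mask_ssn(ssn: str) -> str:
    digits = ssn.replace("-", "")
    return f"***-**-{digits[-4:]}"

print(mask_ssn("123-45-6789"))  # ***-**-6789
```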
Since business application databases store only tokens, sending real data to third-party partners is done through the Strac Interceptor API:
```shell
curl --location --request <your verb> 'https://api.strac.io/proxy' \
  --header 'X-Api-Key: <your API key>' \
  --header 'Content-Type: application/json' \
  --header 'Target-Url: <your third party endpoint>' \
  --data-raw '{
    "tin": "tkn_lT8RtnYLfpmfecvAfWqzlMnO"
  }'
```
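For illustration, the same Interceptor call can be built from Python with only the standard library. The endpoint and headers mirror the curl example; the API key and Target-Url remain placeholders you would supply:

```python
import json
import urllib.request

# Build the same request as the curl example above (POST shown as the
# example verb). The payload carries a token; the Interceptor swaps it
# for the real value before forwarding to the third-party endpoint.
payload = json.dumps({"tin": "tkn_lT8RtnYLfpmfecvAfWqzlMnO"}).encode()

request = urllib.request.Request(
    "https://api.strac.io/proxy",
    data=payload,
    method="POST",
    headers={
        "X-Api-Key": "<your API key>",
        "Content-Type": "application/json",
        "Target-Url": "<your third party endpoint>",
    },
)

# urllib normalizes header names, so Target-Url is stored as "Target-url".
print(request.get_header("Target-url"))

# Actually sending it requires network access and real credentials:
# response = urllib.request.urlopen(request)
```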
Read more on how to send data to third-party partners with tokens!
Performing queries against sensitive data, such as string equality on a date of birth or the zip code of an address, is super common! With Strac tokens, you can still perform database queries by leveraging Strac APIs.
Online businesses have to charge customers using a credit card, as it is the most common form of payment. To accept credit card data, the online business has to achieve PCI Compliance. PCI Compliance forces you to have a tokenization system so that the rest of your cloud (the application server farm) never touches credit card numbers.
Identity Verification is mandatory in almost all financial and health-related businesses, whether to perform a background check, a fraud check, a patient lookup, or even to do taxes.
Targeted marketing allows businesses to tailor and personalize online advertisements. Businesses can extract anonymized customer information (e.g., area of residence, ethnicity, gender, age group) from identity documents and perform analytics without handling PII on their servers. To learn more about how to redact sensitive documents, please check out this blog post.