Ensuring secure and private defaults for sensitive data in the enterprise
Never has a security mandate been more effectively stated than in the three words “Secure the Default”. With so many data breaches and privacy violations caused by simple mistakes, one always wonders why the defaults in the systems involved could not be made secure by design. Inevitably, we realize that as far as tools and platforms go, there is only so much that can be done to secure the defaults at the outset. After all, access has to be granted to at least one default user, and quite often that user or account is the one that ends up mistakenly unprotected, leading to the breaches and compromises we read about week after week. There are solutions in the market that seek to address this type of risk: the collection of tools that scan our environments for insecure configurations and settings, as well as the new crop of tools that scan for the same issues in code form, in our dev pipelines, prior to deployment. All of these solutions aim to test every fence that has been built around our data, so that the data, the crown jewels of the enterprise, is not inadvertently exposed.
But what about the data itself?
For data-at-rest and data-in-motion, this has been addressed by encryption. However, the very mechanisms we have built to use encryption effectively in our day-to-day operations, i.e. to instantly convert encrypted data into clear text data-in-use when it is needed, have become the means by which malicious entities get to our crown jewels in clear text.
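To make the exposure concrete, here is a deliberately simplified sketch (a toy XOR keystream, not real cryptography; production systems would use something like AES-GCM). The point it illustrates is the one above: conventional encryption protects stored bytes, but the moment the data is needed, it is decrypted back into memory in clear text.

```python
# Toy illustration only: NOT real crypto. Shows why at-rest
# encryption still leaves data-in-use exposed in clear text.
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy XOR "cipher"; the same operation encrypts and decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = secrets.token_bytes(32)
record = b"ssn=123-45-6789"

# Data-at-rest: what sits on disk is opaque bytes.
stored = xor_cipher(record, key)
assert stored != record

# Data-in-use: to search or analyze, the record is decrypted
# back into process memory -- clear text again, and reachable
# by anyone or anything that can read that memory.
in_use = xor_cipher(stored, key)
assert in_use == record
```

Any attacker, insider, or bug with access to the running process sees `in_use`, not `stored`.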
So let us talk about data-in-use. Some of the largest losses of clear text sensitive data in the last year have come from data-in-use. Often this is data that lives in non-core systems, powering business analytics, customer support, artificial intelligence, security operations, and other high-value applications that need to operate on private data and legitimately hold it within enterprise search platforms and big data indices. Very often, this data lives in cloud service back-ends, where enterprises entrust it to cloud service providers.
Could there be a data default that provides security and privacy from the get-go, where business can be conducted without revealing private data in clear text, where additional effort has to be made to remove the default protection rather than the other way around, and where the access control fence around the data store serves as an additional layer of protection rather than the only one? Could a platform or application be born secure simply by virtue of holding its data in a default secure and private format at all times? Could the default state for data-in-use be obfuscated, so that even when an enterprise falls victim to an insider attack or is compromised through human error, the impact is dramatically lower because its crown jewels were never exposed in clear text?
Titaniam envisions a modern secure enterprise, where all sensitive or private data is “secure by default” at all times, regardless of whether it is at-rest, in-motion, or in-use.
Titaniam Protect, from Titaniam, targets data-in-use protection at scale. Titaniam Protect provides a single point to plug into a data pipeline and convert everything sensitive that comes through that pipeline into the appropriate private and secure format. Titaniam implements, at scale, both well-known data protection techniques and new ways to obfuscate data-in-use, enabling for the first time the possibility of zero clear text sensitive data inside the enterprise.
From simple redaction of data that should never have been there in the first place, to tokenization for data that needs to be transacted upon but not studied directly, to masking, traditional encryption, and finally data entanglement (where sensitive data that needs to be manipulated directly is kept obfuscated, so it is never handled in clear text form), Titaniam provides a full spectrum of coverage. Entangled data can still be indexed, searched, aggregated, and analyzed by native platform processes, as long as the platform is enabled with the Titaniam Protect Plug-in or can call Titaniam as a service.
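The first three points on that spectrum can be sketched with generic, widely used techniques. This is an illustrative sketch only, not Titaniam's actual implementation; the `SECRET` key and function names are hypothetical. Note how tokenization is deterministic, so equality lookups and joins still work on the protected values.

```python
# Generic sketch of redaction, tokenization, and masking.
# Not Titaniam's implementation; SECRET is a hypothetical key.
import hashlib
import hmac

SECRET = b"per-tenant-tokenization-key"  # hypothetical

def redact(value: str) -> str:
    # Redaction: the value is removed entirely.
    return "[REDACTED]"

def tokenize(value: str) -> str:
    # Tokenization: a keyed, deterministic surrogate. The same
    # input always yields the same token, so the protected value
    # can still be matched, joined, and counted.
    return hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()[:16]

def mask(value: str, visible: int = 4) -> str:
    # Masking: hide all but the trailing characters an operator needs.
    return "*" * (len(value) - visible) + value[-visible:]

ssn = "123-45-6789"
print(redact(ssn))    # [REDACTED]
print(mask(ssn))      # *******6789
print(tokenize(ssn))  # stable 16-hex-char surrogate
```

The same field thus yields several protected representations, each suited to a different downstream use, which is the "spectrum" idea described above.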
With the spectrum of options and flexibility that Protect provides, data platforms can ingest data that is secure by default. And when the data itself remains secure and private even as it is pulled into memory, manipulated, returned in search results, aggregated, and shared between applications, all downstream applications and systems that utilize that data also become secure and private by default. A single data field processed by Titaniam Protect can be made available to downstream applications in a variety of secure and private formats, removing the unnecessary complexity that enterprises face today just to achieve a basic level of data privacy and security for a few applications.
“Secure the default” is an excellent mandate. For the hundreds of thousands of enterprises who put important data into Elasticsearch, MongoDB, or the multitude of other big data and not-so-big data platforms, and the thousands of applications and cloud service providers that use these platforms as back-ends; for the Hadoops, Cassandras, Redis’, Druids, the SIEMs, the SOARs, the DMPs, the AMPs, the AI engines… listing individual applications or services would not be practical…
…for all of you, “securing the default” is finally possible.
Imagine a world where implementing GDPR controls were not about mapping every bit of data in every application against a set of geos, users, permissions, articles, artifacts, timelines, and more.
You could simply protect the whole thing. By default. All the time. The data itself would be “secure by default”.
Interested in learning more about how Titaniam can help? Get started today with a demo.