In this episode of Modern Cyber, Jeremy sits down with Dan Draper, founder and CEO of CipherStash, to unpack the complexities of modern data encryption and secure access. From searchable encryption to post-quantum cryptography, Dan explains how CipherStash is making secure data access possible without sacrificing utility. The conversation covers Zero Trust, access control philosophies, real-world encryption applications, and why most people don’t actually care about security—until it blocks a deal. This is a highly technical, yet accessible dive into how data protection is evolving in the face of increasing complexity and looming quantum threats.
About Dan Draper
Dan Draper is the founder and CEO of CipherStash, a data security company focused on enabling trusted access to encrypted data. An experienced cryptography engineer, Dan previously served as VP of Engineering at MedicalDirector and Expert360. His mission is to empower developers with the tools and knowledge they need to build secure, privacy-respecting applications. Dan is a leading voice in applying searchable encryption to real-world business challenges and is passionate about rethinking how organizations manage data access securely and efficiently.
Host Note: “We checked Riverside’s encryption after recording and found this: https://www.ietf.org/id/draft-ietf-tls-ecdhe-mlkem-00.html — so quantum-safe 🙂”
CipherStash Website: https://cipherstash.com/
Dan on Linkedin: https://www.linkedin.com/in/ddraper/
Alright. Welcome back to another episode of Modern Cyber. We've got another guest joining us from the other side of the world today, and I always find that it's a lot of fun to talk to people from other parts of the globe because they might have different perspective, different experiences from what a lot of our audience experiences in North America. My guest today is Dan Draper. Dan is the CEO and founder of CipherStash, a data security startup building a searchable encrypted data storage platform for sensitive data.
We're gonna get into what that actually means in some level of detail in today's conversation. We'll see how we go. Dan previously worked as a VP of Engineering at MedicalDirector and at Expert360. He's an experienced cryptography engineer, and his mission is to empower developers with the knowledge they need to build secure applications. Something that is super important in the modern technology landscape.
Dan, thank you so much for taking the time to join us on modern cyber today. Hey, Jeremy. You're welcome. Great to be here. Awesome.
Awesome. Well, let's start a little bit with your journey and what you guys are up to at CipherStash. When I hear, you know, a searchable encrypted data storage platform for sensitive data, there's a lot in that phrase right there. So talk to us about some of the key components and some of the key problems that you're helping customers solve. Yeah.
Absolutely. So you're right. There's a lot of technology in CipherStash, but every piece of technology that we've created has been very intentional, very connected to the underlying problem. And so in some sense, I like to think beyond the technology. What is the actual problem we're trying to solve?
Right. And you think about it, every data protection company on the planet right now really has one thing in common. That one thing is that everybody's trying to limit access to data, make sure that it's controlled and it's locked down. Yeah. CipherStash actually takes the opposite approach.
We want to make data accessible. And it's this concept we call trusted data access. So Okay. If the people that work in an organization who need access to particular records or particular information can't get access to that data, then the data is effectively useless. So how do organizations capitalize on the data that they capture, but in a safe way?
And so that's really what CipherStash is trying to solve. It's sort of like the Goldilocks of access control. It's not too little. It's not too much. We wanna make sure that we give folks access to the right amount of data at the right time, and that such access is always recorded.
It's auditable. Yeah. And in some sense, provable, to the extent that such a thing can be done. Okay. I mean, that makes a ton of sense, and I guess it actually it's not quite what you were saying, like, every data access company is trying to limit access to data.
I would say, like, you know, what you're trying to do is make sure that only the people who have access to the data can access the data. Right? And similarly to that, I think one of the things we look at for instance at FireTail with regards to APIs is, do you have access to the data for the action that you're trying to take? So a lot of times on APIs, I I give the example of something like a social media profile where, like, I can access my profile, I can access your profile. I can edit my profile, but I can't edit your profile.
Right? So the the the action that I'm trying to take with the data also factors into that. Right? Absolutely. And I love that.
And I think you guys are really, you know, blazing a trail, pun intended, I suppose, as well. And and, you know, it's why it's why we've connected and and become friends is we have similar kind of mindsets. I think traditionally traditionally, though, if if an organization or a product had to fall on one side or the other, if you're going to have Yeah. A more restrictive approach versus a more, you know, allowing approach, it's it's always the more restrictive approach that that that folks end up taking. And that ends up Yeah.
You know, causing all kinds of problems in, you know, in teams and productivity and all all that sort of stuff. So and there's no there's no perfect solution. But, that that trusted data access piece where you Yeah. Focus on allowing the access but in a trusted way, that's that's really our our this our spirit. Yeah.
Yeah. Yeah. You said something else here that I think is a little bit, like, almost maybe a little bit of a controversial take, but I do kind of agree with your sentiment there, which is like, if you think about kind of, you know, the buzzword of Zero Trust. Most approaches that I've seen towards zero trust and and, you know, remember I'll remind people like and this is like a common saying, zero trust is a philosophy, not a piece of technology. You cannot buy zero trust.
It's like a strategy for how you you think about structuring access to systems, data, process, people, technology platforms, etcetera. Most zero trust approaches that I've seen, they kind of start with a denial, and then we, like, give explicit access to people based on certain conditions, whether that's, like, role based access or attribute based access or, let's say, application based access or what have you. And and, like, I've always thought about this approach is, like, it makes a lot of sense on paper, but I think something that you said kind of hints at what's going on is that, like, we're accumulating data at such velocity that it's very, very hard to keep track of who should have access to what. Right? Right.
Yes. A hundred percent. It gets very complicated very fast. And in my experience, what tends to happen is you're right. It starts with a denial type approach.
Although that in itself is actually more difficult than it seems in a lot of contexts, and we'll talk about that perhaps in a moment. Okay. But the person or the team that is making that decision about who's allowed access to what data, there's a huge responsibility on them to make sure that they make the right decisions. But also they have to have a really good sense of what might happen in the future at a given point in time.
So if I'm a data security engineer and I'm defining some policies, I want this group or this person to be able to have access to a record, what happens when the conditions that led me to that decision are no longer true? Right. Then things change. Right? So Right.
So whatever system you have in place has to be adaptable. And so one of the philosophies that we've leaned into very heavily at CipherStash is start with deny all, but access is very much a just in time thing. So you're only making the access decision at the point in time when the user needs, or is making a request to access, the data, using standard technologies to drive some fundamentals and then using some more, I guess, novel technologies to do the last mile, if you wanna think of it like that. Okay. Okay.
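Host Note: for readers who want to see the "decide at request time" idea concretely, here's a minimal Python sketch. The attribute names and the fetch_current_attributes helper are hypothetical stand-ins for an identity provider lookup; this is not CipherStash's implementation, which enforces the decision in the encryption layer discussed below.

```python
def fetch_current_attributes(user_id: str) -> dict:
    """Hypothetical lookup of the user's *current* attributes (from an IdP or
    HR system) at the moment of the request, not at provisioning time."""
    return {"team": "support", "active_assignment": "customer-123"}  # illustrative stub

def decide_access(user_id: str, record: dict) -> bool:
    """Deny by default; grant only if the live attributes satisfy the policy
    for this specific record, evaluated just in time."""
    attrs = fetch_current_attributes(user_id)
    return attrs.get("active_assignment") == record.get("owner_customer")

record = {"id": 42, "owner_customer": "customer-123"}
print(decide_access("jeremy", record))  # True only while the assignment still holds
```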
And so that's where some of your, let's say, your key technology comes in, really, is at that last mile. Right. Yeah. So think about, I guess I'll give you a concrete example or two concrete examples, actually. So one is, there are some really obvious ways in which you might wanna think about applying the deny-all pattern.
And so one is Okay. And this is sort of where CipherStash started, is the difference between access to infrastructure versus access to the data within that infrastructure. And I'll Okay. Talk about that concretely. Hopefully, that makes more sense.
So, you have a database. Yeah. Developers or engineers in your team need access to that database in order to run migrations or backups or, you know, schema changes, what have you. But they don't necessarily need access to some of the underlying data within that database, the actual records or fields in the table. Right.
Right. And so one really simple way to mitigate that is encrypting the values in that table, or at least the sensitive values. In contrast to something like row level security, where an administrator in a database could potentially modify the row level security rules. And so then you're reliant on trusting that the administration team within the organization is doing the right thing.
Where the encryption in this context plays a really important role is when customers themselves can control the key, or the key derivation process. So in other words, a vendor that is capturing sensitive data on behalf of its customers can store encrypted data in the database table, and only the customer can decide whether or not that data is allowed to be accessed. Okay. That really flips the model quite a lot compared to traditional technologies.
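Host Note: a rough sketch of the customer-controlled-key idea using plain envelope encryption with AES-GCM from the pyca/cryptography library. This is not CipherStash's scheme; it just illustrates how a vendor can store ciphertext it cannot read unless the customer's wrapping key is made available.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# The customer holds this key (or it lives in their KMS/HSM); the vendor never stores it.
customer_key = AESGCM.generate_key(bit_length=256)

def encrypt_field(plaintext: bytes) -> dict:
    """Encrypt one sensitive field with a fresh data key, then wrap the data key
    under the customer's key. The vendor stores only ciphertext plus wrapped key."""
    data_key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    ciphertext = AESGCM(data_key).encrypt(nonce, plaintext, None)

    wrap_nonce = os.urandom(12)
    wrapped_key = AESGCM(customer_key).encrypt(wrap_nonce, data_key, None)
    return {"nonce": nonce, "ciphertext": ciphertext,
            "wrap_nonce": wrap_nonce, "wrapped_key": wrapped_key}

def decrypt_field(row: dict, key: bytes) -> bytes:
    """Only a caller the customer trusts with (or grants use of) the key can unwrap."""
    data_key = AESGCM(key).decrypt(row["wrap_nonce"], row["wrapped_key"], None)
    return AESGCM(data_key).decrypt(row["nonce"], row["ciphertext"], None)

row = encrypt_field(b"patient: Jane Doe, diagnosis: ...")
print(decrypt_field(row, customer_key))
```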
Yeah. The other interesting example, I can see the cogs turning here. Yeah? Yeah. Exactly. I already have at least one follow-up question that I wanna ask.
Actually, you know what? Let me ask it now before we get into your second Sure. Yeah. Yeah. That's good.
Otherwise I might forget it. All of that makes total sense. But in real time, in the real world, how do you pull a human into that loop fast enough for it to be meaningful? Right? Because I think, like, one of these things around data access requests that I've seen in organization after organization over the years is that, like, okay, we realized, to your point, somebody's changed roles, or they've now got a new customer assignment, or whatever the case may be.
But they now have a a change in their data access status. Right? So they should or should not be allowed to access data based on some change. And like what I see time and again is that people get behind on these changes. Right?
Because somebody has to go and make this change. So, you know, Dan has to go in and change the group policy object, or adjust the permissions, or add an attribute to Jeremy's user profile, or whatever the case may be. And and like these things back up, and so what ends up happening inevitably over time is that we over correct towards the well, Jeremy's just a member of this team now, access to all data that this team has access to. Right? Like, we fall into that kind of, you know, I don't wanna call it a trap because that's not right.
But we fall into this, like, trade off situation again and again and again. So, like, how do you make those decisions quickly enough? It is really what it comes down to. So the context here is very, very important. And, actually, I think this does connect nicely with the other example that I Okay. I was gonna mention.
I was gonna mention. So let me talk about the other example, and then maybe we can try and join the dots. So Okay. The the second example is we have a customer. They are a health a health tech organization.
Okay. They have doctors and patients, and this is in Europe, in Norway. Doctors are saving patient records for the patients that they see. Now Okay. A traditional policy might be, so let's say the doctor is Doctor Smith.
You might have a policy that says only Doctor Smith is allowed to see these patient records. Because of the context, we can actually do something simpler but also more effective. And that is, you can say using encryption technology, only the person who wrote the record is the person who's allowed to read the record. And with traditional row level security, that's actually quite difficult to do, certainly in a robust way. With encryption, certainly with our encryption technology, what we can do is connect that with your identity provider.
We can know who you are and take the proof of your identity and embed it into the encryption. And then only when you can prove that you are the same person again can you retrieve those records. So then the context of the data access is the thing that determines how the policy is applied. And so you end up with, I guess, policy principles rather than concrete policies. Another example is when you've got, let's say you're a vendor.
You're a Yep. One example we use sometimes is, you're a calendar optimization vendor. There are a lot of them around at the moment. Yep. Yep.
Yep. We wanna connect to your corporate calendar, say, G Suite, pulling in all your calendar events. Now the problem is as soon as you sign up to an application that does that, a vendor that does that, they're gonna connect to your Google Calendar. They're gonna suck in all of your calendar events, say, the last twelve months. And then you might decide after a week of using the product that you don't like the product anymore.
You don't wanna use it. Now the challenge is that vendor still has twelve months of your data in their system. It's up to the vendor to decide, did we delete this data? Do we have an access rule in our system that means that no one in our team can see it anymore? Like, what happens to it after that?
Right. And so in this context, what you want is a system where, as soon as I, as the end user, decide to revoke my authentication, so I disconnect it from my G Suite, I should no longer be able to access the data in the system and nor should the vendor. So the context, the relationship between me and the vendor, has changed. So therefore, the policies that are inferred by that relationship are changed. So in some sense, what we're talking about here is highly context dependent policies Yeah.
That can be inferred or defined based on the nature of the relationship or the scenario, rather than having to rely on a security person determining what low level policies might make sense based on their best judgment. Yeah. Do you consider this parallel to, or the same as, kind of ABAC, attribute based access control? In a sense, it is a form of attribute based access control. In some sense, what we've done is extended the concept of ABAC in both directions.
So one, we can capture a broader range of attributes that can be proven. So your, you know, claims around your identity and so forth. So it's expanding the number of attributes you can use in a policy, and extending the enforcement layer. So the enforcement now becomes part of the encryption, and so you end up with a much, much more flexible and wider reaching system. In a sense, it is ABAC.
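Host Note: one way to read "embed the proof of your identity into the encryption" is deriving a per-writer key from a verified identity claim, so only the same authenticated subject can rederive it. This HKDF sketch is illustrative only, not CipherStash's construction; the "dr-smith" subject stands in for the verified sub claim from your identity provider's token.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

master_key = os.urandom(32)  # held by a key service, never handed to the application

def key_for_subject(subject: str) -> bytes:
    """Derive a per-identity data key from the verified subject claim in an IdP token."""
    return HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=b"record-key:" + subject.encode(),
    ).derive(master_key)

# Doctor Smith (authenticated as sub="dr-smith") writes a record.
nonce = os.urandom(12)
ct = AESGCM(key_for_subject("dr-smith")).encrypt(nonce, b"patient notes ...", None)

# Only a caller who proves the same identity rederives the same key and can read it back.
print(AESGCM(key_for_subject("dr-smith")).decrypt(nonce, ct, None))
```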
Yes. Okay. Okay. And and I guess, you know, if I think about kind of everything that we've talked about relative to this problem, right, that we're trying to solve, there's an underlying thought that sticks with me, which is that, you know, this has been a problem since I started doing IT right back in 1997 to date myself. You know, kind of who has access to what files, what data, whether it's files, data, rows in a database, tables, whatever.
Like, this problem has been around forever. What's changed about this problem over time? Yeah. So I think you're right. It's something that I've struggled with my whole career as well.
Once again, I think the context is important. In my experience, doing these kinds of things in databases is probably the most challenging. And that's consistently where the bulk of our data ends up. When I say databases, I mean warehouses and lakes and, you know, those kinds of things. Yeah.
Yeah. Yeah. Sure. And I think a couple of things have changed. So one, the encryption technology has been around for a while.
Now you don't have to use encryption technology to solve this problem, but I do think encryption is a very powerful way to do it, because encryption allows you to do two things. One, it becomes a deny by default policy mechanism. Whereas if you think about storing data in, say, a Postgres database, and then in Snowflake, and then in MongoDB, and then in some vector database, traditionally what you'd have to do is define policies and rules for each one of those different databases. Then very often, certainly in the case of a relational database to, say, a warehouse, you wanna synchronize data from your relational database into your warehouse. There's some ETL process.
So what permissions and policies do you have around the ETL process? Does that become a potential attack vector or weak point in your security chain? So what encryption allows you to do is encrypt the value once and then have the access to that data be denied by default no matter where it goes. So you end up with sort of a universal access control. Only through the application of the appropriate decryption algorithm can you access it.
Now that's very powerful. And historically, you know, in the last decade, you could use something like Amazon's KMS to do that, or an HSM. That sort of thing's possible. The reason, in my view, that hasn't been taken up a great deal is because if you apply that kind of encryption technique to existing databases, your database technology doesn't work properly anymore. And what I mean by that is, you wanna query, say, somebody by their email address.
You wanna sort them by their first name. None of those things work anymore. And so one of the enabling technologies that has started to move out of academia and into industry over the last decade is this idea of searchable encryption. So you can take a query, an SQL query, say, encrypt that, send it to the database, run queries over encrypted values, get an encrypted result, and then only when the authorized user tries to access that encrypted result can they actually access the values. So I think searchable encryption is an important one.
Yeah. And this is as a result of something like a one way hash where, like, the unidirectional hash always resolves to the same encrypted value. So you don't have to search for, you know, Jeremy at FireTail dot io. You search for the hash that's generated through the unidirectional, you know, kind of encryption of that value. Right?
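Host Note: Jeremy's one-way hash intuition roughly corresponds to what's often called a blind index for exact-match lookups: a keyed HMAC of the value stored alongside the ciphertext. Real searchable encryption schemes, including CipherStash's, are more sophisticated, as Dan notes next; this sketch only covers the naive equality case.

```python
import hmac, hashlib, os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

index_key = os.urandom(32)   # key for the blind index
data_key = os.urandom(32)    # key for the actual value

def blind_index(value: str) -> str:
    """Deterministic keyed hash: the same input always yields the same index,
    so equality queries work without revealing the plaintext."""
    return hmac.new(index_key, value.lower().encode(), hashlib.sha256).hexdigest()

def encrypt_value(value: str) -> bytes:
    nonce = os.urandom(12)
    return nonce + AESGCM(data_key).encrypt(nonce, value.encode(), None)

# "Insert" a row: ciphertext plus its blind index.
table = [{"email_ct": encrypt_value("jeremy@firetail.io"),
          "email_idx": blind_index("jeremy@firetail.io")}]

# Query by hashing the search term, never by comparing plaintext.
matches = [r for r in table if r["email_idx"] == blind_index("jeremy@firetail.io")]
print(len(matches))  # 1
```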
Exactly. Yeah. So a one way hash is a good analogy for it. And some of the different searchable encryption schemes use one way hashes as part of the construction, but they're actually much more sophisticated than that. And so as an example, you can do range queries.
There's a form of encryption called order revealing encryption, where you can say, is this value more than, less than, or equal to some query? That allows you to do sorting as well. Yeah. At CipherStash, we've also developed a scheme to do fuzzy text matches. So we can effectively replace the LIKE keyword in SQL Yeah.
Yeah. And do that over encrypted data. Okay. Now some of your audience may be familiar with a technology called homomorphic encryption. So homomorphic encryption has been around for a few years as well.
Yeah. I've heard this phrase, and I've never actually had the chance to kind of do a deep dive onto what that actually means. So maybe give us, like, a one minute primer or something if if one minute is possible. Yeah. Sure.
I'll have a crack. Yeah. Yeah. Yeah. So homomorphic encryption is, I guess, like the superset of searchable encryption, in that you can take two values that are encrypted using this scheme, this homomorphic encryption scheme, and apply any operation to them if you like.
You might add them together. You might multiply them. You might compare them. It becomes a general purpose, I guess, mathematical function that you can apply to encrypted values. So you take two encrypted values in.
You run some function. You get some encrypted value out. And then when you decrypt that value, it is the same as if you had applied the operation over the unencrypted values. So hopefully, that all makes sense. It's gained a lot of interest over the last decade because it is very compelling.
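Host Note: a tiny worked illustration of the property Dan is describing, using textbook (unpadded) RSA, which happens to be multiplicatively homomorphic: multiply two ciphertexts, decrypt, and you get the product of the plaintexts. Toy parameters, insecure by design, and not a real fully homomorphic scheme.

```python
# Toy textbook RSA with the classic small parameters (insecure, illustration only).
p, q = 61, 53
n = p * q            # 3233
e, d = 17, 2753      # e*d = 1 (mod (p-1)*(q-1))

def enc(m: int) -> int:
    return pow(m, e, n)

def dec(c: int) -> int:
    return pow(c, d, n)

c1, c2 = enc(7), enc(3)
# Operate on ciphertexts without ever decrypting them...
c_product = (c1 * c2) % n
# ...and the decrypted result equals the operation applied to the plaintexts.
print(dec(c_product))           # 21, i.e. 7 * 3
print(dec(c_product) == 7 * 3)  # True
```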
The problem, and the major limitation with homomorphic encryption, is it is incredibly slow. To put that in perspective, if you were to use it, say, to query a database, and say you wanted to use the comparison function on a homomorphic value just to do a really simple sequential scan in a database, you know, that might, over a few million records, take days to run. Right. These operations can take, like, a quarter of a second per operation. Yeah.
Searchable encryption, on the other hand, is incredibly fast. So there's hardly any overhead compared to doing it on unencrypted values. You know? And so the schemes we're using, for example, even the slowest scheme is still literally a hundred thousand times faster than homomorphic, and can operate over data in, you know, a few milliseconds. Yeah.
So that's the that's that's the major piece that's that's exciting there. Gotcha. Gotcha. That makes a lot more sense to me now than it did, you know, three minutes ago. So I appreciate that that primer on that.
It kinda brings me to the next question, and I wanna go a little bit deeper on this kind of question of encryption. Encryption is something to me that I I I think like a lot of people, I just look at it. It's like, hey, dude. Does the website have the little lock icon on it? Are we using SSL?
And like, oh, yeah. There's this TLS thing and blah blah blah. Right? But, like, generally speaking, other than, you know, encrypted files on my hard drive and and whatnot, I kinda take it as a given. And and I kinda take it as like a, you know, turn it on for the places that I know I need it.
Whether that is again, like my hard drive or whether that is again, you know, an SSL certificate or an API gateway in front of an API or whatever it is that I that my use case is. But I know that there are different algorithms behind all of these different encryption techniques. And and you know, one of the things that we've all heard a lot about over the last couple of years is, oh my gosh, quantum computing is going to break encryption. So I guess like my perspective on it is, I hear about this. I don't know necessarily if it's hype or truth or nobody really knows what the current state of it is.
For you as somebody who spends a lot of time looking at encryption, like, what's your take on it? What is real for what is coming around quantum computing, and how it will or won't impact the state of current encryption? Yeah. Such an interesting question. So I'll start with the bottom line upfront, which is that I am not the slightest bit worried about quantum computers breaking encryption.
Okay. And let me let me clarify why. So there's a lot of hype around quantum computers, and this this threat, this potential threat of being able to break encryption. And there's a couple of different ways to think about it. So let's imagine for a moment that a quantum computer, a stable enough, powerful enough quantum computer was available tomorrow.
What might happen? In the sense that, you know, right now we're using what you might call encryption in transit to communicate over this link via, you know, our podcast studio. That is using a form of public key encryption. And in fact, if you were to go and, you know, open the developer tools in Chrome or something, it would probably tell you what kind of encryption it's using. Now more than likely it's using something called elliptic curve cryptography.
If you happen to have an old browser, it might be using RSA. I hope that it's not, but, you know, RSA is a commonly used public key encryption technology as well. Now these are the encryption schemes that are theoretically vulnerable to a quantum computer. And that's because of a guy called Peter Shor, who in the nineties, long before any practical quantum computers were in existence, developed a theoretical algorithm called Shor's algorithm that can break the public key encryption schemes of RSA or elliptic curve in a fraction of the time. Technically speaking, it's a much more efficient algorithm, and so the theoretical run time is much, much smaller.
It still does take time, but it's something that can run, you know, within a few hours or maybe days, rather than if you were to try and break elliptic curve encryption, where a classical computer would take billions of years. So suddenly the public key encryption becomes vulnerable. What people don't talk about enough is, and I'm gonna give you a couple of things to think about, one is that the public key encryption that is used is only for the exchange of the key. So right now, what is actually doing the encryption is the symmetric cipher.
So the scheme, typically, is AES, the Advanced Encryption Standard, or you might have something more exotic like ChaCha20-Poly1305 or something like that, but it doesn't really matter. None of these symmetric schemes are vulnerable to the algorithm that Peter Shor proposed in the nineties. And so then what we need to think about, if these symmetric schemes are not really of concern, is how do we improve the security of the public key encryption schemes? And so there's quite a lot of work being done right now on what's called post quantum secure encryption schemes. So NIST, the National Institute of Standards and Technology in the US, has ratified new post quantum secure encryption schemes.
There's a number of them now. Probably the most famous one is Kyber, now increasingly being referred to as ML-KEM. It's a key encapsulation mechanism. And in fact, if you're using Cloudflare or a number of modern, kind of content delivery or website delivery platforms, more than likely you already are using quantum safe encryption. Like, if you're in Chrome and you go and click inspect and you look at the certificate and the encryption mechanism, it's most likely gonna be using post quantum encryption already.
And so then that's assuming that a quantum computer already exists and is stable enough. We've kind of already covered our backsides. So then the next interesting question is, well, okay, how far away is a quantum computer that could actually do this? Now I'm not a quantum computer expert, but you start to speak to the experts and you pull back the curtain from a lot of the, you know, the media hype and so forth, and you start to realize that we're still a very, very long way away.
And to give you an example, Shor's algorithm that I mentioned a moment ago is a factorization algorithm. So it's designed to very quickly factorize, or determine the two numbers that multiply together to result in, an asymmetric key. There's a bit of math and sort of technical nuance there. But effectively, what the algorithm is doing is working out what those factors are incredibly fast. The largest number that any team anywhere in the world has ever used a quantum computer to factorize is in the twenties.
So I'm talking, like, 23 or something. Okay. Whereas we're talking about, with RSA or with elliptic curve, RSA, for example, is a 4,096 bit number. So that's a very, very long number. And most quantum computer experts that you speak to will say, in order for us to factor that kind of number, we're at least a decade away.
So while we're getting all these interesting advances, and I am excited about quantum computer technology, by the time we get a quantum computer that's even capable of factorizing those numbers, everybody's gonna be using post quantum cryptography. And not to mention that the other technology that you mentioned, the idea of disk encryption or database encryption. Yeah. These are always using symmetric encryption, which is not vulnerable to Shor's algorithm. Gotcha.
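Host Note: a back-of-the-envelope feel for the gap described here. Trial division factors a number in the twenties instantly, while an RSA-4096 modulus runs to roughly 1,233 decimal digits; the 21 below matches the scale of published quantum factoring demos, and the comparison is purely illustrative.

```python
import math

def trial_division(n: int) -> tuple:
    """Classical factoring by trial division; fine for tiny numbers only."""
    for f in range(2, int(math.isqrt(n)) + 1):
        if n % f == 0:
            return f, n // f
    return n, 1

print(trial_division(21))     # (3, 7), the scale quantum demos have reached
print(4096 * math.log10(2))   # ~1233 decimal digits in an RSA-4096 modulus
```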
And and if I say to you thank you for that, by the way. That was extremely helpful way of thinking about it. But if I say to you just to kind of, like, test my understanding and also test some of what you said, okay. Everybody except for the people who are still stuck on mainframes. Right.
Like, is that a concern, or is it no, no, no, even those systems are going to be, let's say, transmitting over lines in network infrastructure where the encryption is going to be done at a different layer, and this becomes irrelevant? Yeah. So the problem is not in the sort of ephemeral key that is being generated. So right now, you know, as I mentioned, we're talking over this encryption in transit link. There is an encryption key that's being used to encrypt our traffic.
But as soon as we end the call, that encryption key will be discarded. And the next time we get on a call, it'll be a different key. So as long as the technology continues to update, then any new keys that are generated will be safe. Of course, the point you make about mainframes and legacy technology is indeed a concern. And I think we are going to have a problem in the next decade or so, not dissimilar to the Y2K problem, for those that are old enough to remember what that was about, where we had to update our systems.
We're probably gonna end up with a similar situation. Right? Where we've got this kind of, like, oh, in three months, IBM or whoever is going to release the new quantum computer that is going to be, like, widely commercially available, or at least commercially available, maybe via one of the cloud provider platforms or who knows where, at a price point where it is actually feasible for somebody, threat actor, nation state, whoever, to use it for whatever purposes they might. Right. Right.
Exactly. Yeah. So there is a process that we have to adopt as an industry now, over the next decade or so, to start to migrate to these post quantum safe encryption schemes. Yep. The reason that I'm not concerned, although, like, I get it.
You know, Y2K was not straightforward, but the reason Y2K ended up not being a problem was because everybody updated their systems. It's nontrivial to go and update systems, and we have to do the work. Yeah. But the reason I'm not concerned is because this threat of quantum computers is not imminent. It's not something that's right around the corner.
It's not something that we have to go and solve for tomorrow. And this concept that, you know, people talk about, the kind of data collection problem, where you have some nation state collecting a bunch of data to store for later use. That is still potentially a concern, and we do need to be conscious of that. Nonetheless, there's still some time. We have time to go and solve these problems.
So I think my number one advice to folks that are listening, if you are thinking about this quantum problem, probably the most impactful thing you can do with any legacy systems is try to understand what your migration to post quantum safe encryption systems looks like. If you're considering a modern encryption scheme, something like CipherStash or any of the number of players on the market now, make sure that they're using quantum safe technologies. And I would suggest that by and large, we all are now. So certainly for new technology, it's not a concern. Out of curiosity, while we were talking.
I just pulled up the certificate signature algorithm, not for Riverside that we're using to record, but actually for Google Docs that I've got here in another browser window. And I've got PKCS #1 SHA-256 with RSA. And so I assume that PKCS is already kind of post quantum, or stands for something along those lines. No. In that case, PKCS, I forget what the algorithm is, but that's referring to the certificate type, I think.
We're gonna do a bit of a live record here. Live Googling, captivating content. So that's just standing for Public Key Cryptography Standards. Right? So Got it.
The keyword that you've mentioned there is RSA. So you're using RSA, which is interesting. It's a shame that Google Docs is using that. But I'm just curious now, for your listeners, if we're using it on Riverside. I might not be able to check while everybody's listening. We'll come back to that perhaps.
But I'm reasonably certain, like, cipherstash.com, for example, Linear, the dev management tool, GitHub, a lot of the companies that we use today are now using the post quantum encryption standards. Got it. Yeah. So we'll start to see more and more of that over time. X9.62 ECDSA signature with SHA-384.
Yeah. So the ECDSA in that context is the digital signature algorithm. So that is using the traditional, either RSA or, in this case, elliptic curve, essentially using it for digital signatures. So that is not quantum safe. So we haven't had a good hit rate yet.
No. No. But for those of you listening, I encourage you to have a look at the websites that you're browsing and see what they're using. The one thing I would mention is that you do have to have a really up to date browser to capitalize on that. So the older browsers don't support those new schemes.
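Host Note: if you'd rather check from a script than from browser devtools, here's a rough sketch using Python's ssl module plus pyca/cryptography to read a site's negotiated TLS version, cipher, and certificate signature algorithm (the "SHA-256 with RSA" style string read out above). It shows the certificate signature, not the key-exchange group, so it won't tell you whether the handshake used a hybrid post-quantum group like the ML-KEM draft linked in the host note at the top.

```python
import socket, ssl
from cryptography import x509

host = "cipherstash.com"  # any site you want to inspect
ctx = ssl.create_default_context()

with socket.create_connection((host, 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname=host) as tls:
        print(tls.version())  # e.g. 'TLSv1.3'
        print(tls.cipher())   # (cipher suite, protocol, secret bits)
        der = tls.getpeercert(binary_form=True)

cert = x509.load_der_x509_certificate(der)
print(cert.signature_algorithm_oid)        # OID of the signature scheme on the cert
print(cert.signature_hash_algorithm.name)  # e.g. 'sha256' or 'sha384'
```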
Yeah. Yeah. Yeah. Yeah. Yeah.
It's funny. The browser has become so important in this whole well, really in, like, the whole of the modern workspace. I I think it really is one of these kind of overlooked but super critical points. And we've got some stuff we're doing on that side, around AI that we'll be announcing in the next little while. But, I wanna shift gears for a second as we kind of, you know, come towards the end of today's episode, because there there was something that you said kind of in our prerecord that really resonated with me, which is like, you know, we talk about, let's say, like, the the different parts of the problem that we have are trying to solve today, whether it's encryption, whether it's data access that we talked about at the beginning of the conversation.
And you said something to me which is like, you know, part of the the discovery and the lessons learned of your own journey in cyber security and as you've built out CipherStash is that sometimes it's hard to get non security people to care about security. So share with us a little bit about kind of like what kind of led you to have that position and how do you think about getting people to change their mindset around it? Yeah. I have a another slightly controversial take actually, Jeremy. My my theory or my hypothesis is that unless you are a security engineer or a security person working in cyber, you probably actually don't care about security.
What you care about is something that security enables for you. Right. And most business people and I I am a business person. You are a business person as well as a cyber person, so we have a foot in each camp. But what we actually care about is growing a business.
We care about, you know, building trust with our customers, closing deals, you know, generating more sales, all those kinds of things. Right? And it's an interesting realization that we've had at CipherStash. How do you convince somebody that security is important when actually, deep down, even though they probably will never admit it, they don't really care about security? Yeah.
Yeah. And once again, present company excepted, but most security vendors will try and sell on fear. So this fear of data breach, or fear of lost business, or, you know, being held ransom, or whatever. Right? And those are genuine concerns, and probably, if you're not afraid of what's going on in the world right now, maybe you should be a little bit.
Yeah. But fear is not, I think in our industry, people get sick of the fear angle. It is a powerful motivator. But when you're getting bombarded all the time with you should be afraid, you should be afraid.
You kinda get desensitized to it. Yeah. And so what we've started to think about is, well, what are the positive ways that you can sell security? What about if you didn't have to spend so much time worrying about whether you were gonna get breached, and could just focus on closing deals and generating more business and building trust with your customers? And so Yeah.
At CipherStash, we think a lot more about this idea. Our motto is kind of protect data, close deals. Yeah. So if you are a vendor, SaaS vendor or an infrastructure provider, and you can convince your customer that using your product is a more secure, safer option, and you don't have to go through anywhere near as much of the procurement pain as you had to in the past, is that compelling to you? Turns out the answer is yes. Yep.
Yep. Yeah. It's funny. I personally had a very specific moment when I actually realized that cybersecurity is important. And mine was just a random set of circumstances that kind of led to a little bit of an epiphany moment, honestly, in my own career trajectory.
I'd been in cybersecurity. I'd done it as a hands on keyboard practitioner for the first half of my career, about thirteen years. And it was one of those things that, like, you always had to do, but, you know, at that time, I will say this is kind of the 1997 through, like, 2010-ish time frame, I was still kind of involved in the day to day operations of both our IT systems and our cybersecurity systems. Even at that time, what was really interesting about it was my own perspective on it was, IT is the thing that enables the organization, because they need tools and they need systems that they can use to build on top of. Right?
Every marketing guy needs a laptop. Every salesperson needs a mobile phone. Whatever the the thing that they need in order to do their job is, that's what it is. And what's happened over, let's say, like the ten year time period from 2010 until 2020, was that the number of threats really went up from my perspective. And also by the way, the number of risks, like self inflicted risks.
Right? Like the kinds of accidental data exposures, because we all moved to the cloud in the mid-2010s, and we didn't understand that we shouldn't leave S3 buckets open to the world and stuff like that. Right? Like, so we had this kind of proliferation of data, proliferation of new systems and new technologies, and new threat actors, and, you know, criminal gangs, and ransomware as a service, and the dark web, and all the things that are real things. Right?
That are real threats to organizations and to their data out there in the world. And even I, with all of that exposure and all that experience, and by the way, like, from 2016 to 2020, I worked at a cloud security posture management company where we grew very rapidly, and we helped a lot of customers around the world. And even in 2020, at the end of that journey, I was like, man, security is rough. I just don't think my heart's in it. I just don't think, like, I'm doing anything super meaningful.
And then we had COVID. Mhmm. And during COVID, I know so many healthcare organizations were just under constant attack. And I talked to a couple of organizations that were instrumental in kind of helping all of humanity out in getting out of the pandemic. And I can't say more about, you know, exactly what those customers were and what they did, But let's just say that, like, we're all here today in large part thanks to them and the work that they did.
And I kinda realized something was that, like, you know, the work that I had done in the previous four or five years up to that point actually helped them move forward. It actually helped them, like, to embrace cloud systems where they could do new things and they could build with a sense of security knowing that their intellectual property wasn't gonna be hacked or stolen or or whatnot on a regular basis. And so, like, all of these, amazing minds who came together and worked on some very, very important difficult problems. I realized, okay. I was not going to be one of those people.
That's not my mindset. That's not like my my, like, my core competencies as a human to to go be that person. But I do understand how these security systems work. And I do understand, like, to your point, the role that this can be as an enabling function to allow an organization to move forward. And in my case, it was a little bit less on the let's say the deal focus around what what you just expressed.
But I do take that point, by the way. I do think, like, any small company who's going and trying to sell to a large company, one of the first questions you're gonna ask if you hold the customer data is show me your certification, show me your security standards, show me your whatever. Right? So, like, I I truly, empathize with that. I think it's a great way to frame the question.
I've had a similar kind of gripe on my own, and a little bit of a hot take, which is that I think only security companies care about secure by design. When I talk to organizations about what they're doing for their security strategy, I hear a lot of kind of defensive stuff around production environments. When I talk to security companies, I'll hear that, but then I'll also hear about what they're doing to eliminate vulnerabilities during the build phase. What they're doing for secure code reviews. What they're doing for all types of things, you know, sometimes down to the basics of, like, how do you control request parameters on an API?
Like, do you have good input standardization and validation? Like, some of these core basic functions Mhmm. That similarly you would think are part of programming best practice for the last twenty plus years, and yet we still see, like, day after day after day, that they just get overlooked in favor of move quick and break things. Right. And so, like, you know, I think we share similar perspectives on this, but I really like your take on, like, it actually enables more sales, and that's an important thing.
Yeah. And more sales is, you know, our particular flavor of that, you know, that characterization of it. But to think of it more broadly, I think it's just simply about confidence. And I know we're running out of time, but to wrap up, the way that I think of it, the analogy that I have, is if you have to cross, you know, a chasm, you need to get across the Grand Canyon. You could walk across a tightrope.
You're a highly skilled tightrope walker. You've got your big long pole, and, you know, if you know what you're doing, there's a reasonable chance that you'll get across safely. But let's be honest. Most people are not gonna do that because that's Yeah. Very high risk.
That is what the world looks like when you don't have good security in place. In contrast, if you have a bridge with nice high guard rails, it's made by engineers, it's built with steel and concrete, everybody's gonna feel perfectly safe going across that bridge. And so people now have confidence to operate and do the thing that they need to do, because they trust that the security is doing its job. They don't care about the security per se. They care about their goal, but they just need to know that they can have the confidence to go and do that thing, because the systems underlying what they need to do are working properly and they're secure.
Yeah. Yeah. I think that's a great analogy, and I think that's a great note to end today's episode on. Dan, if people wanna learn more about you, your work, what you're doing, what's the best place for them to check out? Yeah.
Take a look at cipherstash.com. Learn about our business and our technology. We have a number of open source technologies that people may be interested in as well on our GitHub, which is linked from the website. And, of course, if you'd like to connect with me on LinkedIn, Dan Draper, just hook me up, and I'd be happy to connect and always happy to chat. Awesome.
Awesome. We'll have both of those linked in the show notes. Dan, thank you so much for taking the time to join us on Modern Cyber today. Thanks, Jeremy. It's been fun.
Awesome. Awesome. We will talk to you next time on another episode of modern cyber. Remember, if you know somebody who should come on the show, please have them reach out. And if you know somebody who wants to come on and talk about their breach story and what they learned from it, we still got a few slots on our breach series.
Season one. Excuse me. Talk to you next time. Bye bye.