May 12, 2021

Software Security Gurus Episode #19: Bankim Tejani

Welcome to Software Security Gurus with Matias Madou.

In episode 19, he chats to Bankim Tejani, Chief Security Architect and Distinguished Engineer at the Charles Schwab Corporation. They discuss his experience with big data breaches, as well as how secure coding can speed up remediation, and drive quality, performance, and scalability.

Want to nominate a guru? Get in touch!

Find Matias on LinkedIn
Find Bankim on LinkedIn

--

When engineers move into security: 01:28
Less finding security problems, more building secure software: 05:15
The ingredients for creating successful, secure software among engineering teams: 10:27
Living through a data breach: 16:04

Read the transcription:

Matias Madou:
Welcome to the Software Security Gurus webcast with Bankim Tejani. Welcome Bankim.

Bankim Tejani:
Thanks Matias. Thank you.

Matias Madou:
Bankim it's great to have you. Can you share a few words about yourself?

Bankim Tejani:
I'm happy to. Yeah. So, I'm Bankim Tejani. I think of myself really as an engineer who became a security person. I grew up as a tinkerer; I always liked to take things apart and build them. I learned to program at a pretty young age and worked out from there. And then I transitioned in my career from software development into security, and that pathway has kind of informed a lot of what I do. That's where we crossed paths later on in that journey, when I moved over from InfoSec into application security and got to be with you and many other great people at Fortify, in the early years when we were a startup and then going through the acquisitions. Since then I've been focused on cloud security, product security, and now really helping to keep pushing that forward and modernize.

Matias Madou:
An engineer that moves into security. These are the best, right?

Bankim Tejani:
I think so. I think we should all be that way. [crosstalk 00:01:32] I also... yeah, I agree. I was listening to some other episodes and I fully agree with Flea, right. I hope that all security people are also engineers, right? It is awkward that we talk about security engineers who don't have an engineering background. But I'll say something a little bit more controversial: I also think it's awkward that we call software engineers engineers, when we're not professional engineers, right? If you build a building and it crashes because you built it wrong, because your materials or your components were defective, you lose your license. You're liable for that. When we look at software, that's not the case. We don't have that kind of teeth, that structure of accountability, in our industry, not just in security and software security but also in software engineering in general and software products.

Bankim Tejani:
And so that's, I think, an untold aspect of the picture. Especially when you talk about recent things like SolarWinds, there are all of these things happening, and we're seeing that the bad things can happen. Right. [inaudible 00:02:50] clearly the bad things did happen with that. But because it happened with software, there is no liability, right? There's no impact of that. And so you can effectively push out insecure software at will, and there are very few consequences, right? No one's lost their CISSP or their CSSLP, right. But if you were an architect on a building that collapsed in a nefarious way, you would lose your license, or you'd at least be liable for malpractice. Those are things that don't happen in our industry.

Matias Madou:
I do think it's going to change as software is more and more embedded in systems like cars and airplanes.

Bankim Tejani:
Yes.

Matias Madou:
If it's in those systems and people lose their lives... I don't know, I think that's a different thing. If it's just software, the worldwide web, a SaaS solution that does whatever and only lives on a computer and stays on a computer, that's different from real-life things. So, I agree with you, but I think there are going to be changes because [crosstalk 00:04:06]

Bankim Tejani:
Yeah, and certainly things like self-driving and lots more will drive the impact of that. But I also think it's a matter of technology in medical devices and other areas where there are lives at stake. And so we do have to be better. We have to write better software as an industry, and we have to hold ourselves accountable for that.

Matias Madou:
Absolutely. And that actually leads nicely into the first topic I want to touch on, which is: back in the day we were at a company that was really good at finding problems in code, and the more problems you could find, the better. The mountain of problems. And then we wanted to find them faster, more categories, more everything.

Bankim Tejani:
Yeah.

Matias Madou:
I think right now we take a different stance on it. It's not about how big the pile is or how many we can find. It's more like, "Hey, how can we build better software, as you're saying? How can we fix that?"

Bankim Tejani:
Yeah.

Matias Madou:
How do you see that? How do you see that transition? And maybe if you can talk a little bit about your company organization and how that transition happens in your company.

Bankim Tejani:
I should probably qualify. Currently I work for Charles Schwab. I'm the chief security architect there, but I'm not representing Charles Schwab in any way on this call. Nothing I'm talking about is from Schwab. But in my experience, like you said, when we were at Fortify in the early days, you could run a scanner like that, and you still can today: you can run a static analysis scan on almost any application and come back with hundreds of vulnerabilities. When we were there at Fortify, that was kind of our professional services program. We could scan anything, right? Any application you want, and you're going to get somewhere between a hundred and a thousand vulnerabilities depending on the size of the application. Any sufficiently large application, you'd come back with hundreds of vulnerabilities.

Bankim Tejani:
So we knew we had that confidence, and we could only get better. Like you said, you only get better. And so much of what security industry folks wanted as well was: help me find the next one, help me find the next category of issues. And I think it's funny, because you look at the OWASP Top 10, and if you go back to 2004, 2007, and look at the different years, it's not that different, right? Yeah, they've repackaged them a little bit here and there, they've collapsed one or two into a higher-level category, but at the core you're still talking about the same sets of design patterns. And we're still finding them. Right?

Bankim Tejani:
And so there's something to that other piece that I recognized. At that time, in the early-to-mid 2000s, late 2000s, it was really the security team that was buying these tools and saying, "Okay, we need to use this because we don't know what's happening. There are these applications over here, they're very complicated." We don't know what's happening. We don't have the expertise, right? We're not developers. We run networks and we run infrastructure. So I don't know what's happening.

Bankim Tejani:
I need a tool that's going to tell me what's happening, and what's good and what's bad. And then I'm going to kind of smack this report down on a development team and say, "Hey, go fix these things." And what I found from spending years working with our customers, a lot of financial services companies and government agencies and startups, was that the ethos really was: okay, we find these vulnerabilities, we're going to flip them back to the development team, and they've got to go fix them. And at one example organization, I won't say who, their average time to fix one, just a single finding, was somewhere in the neighborhood of 60 to 90 days.

Bankim Tejani:
Right. Well, what does that mean? If you find a thousand things in your application and you do the math, you're never getting out from under that. You will never get out from under that with that philosophy. And so you have to really think about this differently: okay, what can we do to change this model? And there were reasons for that, right? There are reasons because it's a separate team that is providing the results and a different team that's receiving them. They're not talking the same language. There is disagreement about things like false positives and false negatives, right? So you end up in these long-term negotiations, and this was all happening via email. Now it's happening via Slack. But it doesn't matter, right? The point is you're not spending time fixing, you're spending time communicating and arguing about what you should fix.
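
To make the backlog arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. The thousand findings and the 60-to-90-day fix time are the figures from the conversation; the assumed level of parallel remediation is purely illustrative.

# Back-of-the-envelope version of the backlog math from the conversation.
# The finding count and per-finding fix time come from the discussion above;
# the number of findings worked in parallel is an illustrative assumption.
findings = 1000
days_per_finding = 75      # midpoint of the 60-90 day range mentioned
parallel_remediations = 5  # assumed: findings being fixed at the same time

calendar_days = findings * days_per_finding / parallel_remediations
print(f"Roughly {calendar_days / 365:.0f} years to clear the backlog")
# Roughly 41 years, before counting any new findings that arrive,
# which is the "never getting out from under it" point.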

Bankim Tejani:
And so we have to find ways to accelerate that, but also to create more ownership. One example scenario: there was a customer we went to that had this paradigm, right, where it took 60 days to fix one issue. And so we went in and said, "Okay, we want to change this. We want to do it differently. We're not going to follow that same process, because we want to show the value of doing it differently." And we said, "Okay, let's go with three development teams. And what we're going to ask of these development teams is that they fix 20 issues within 30 days. Within a month, they're going to fix 20 issues." And the development teams were like, "Whoa, this is a lot, right? You're asking us to do all of this at once?"

Bankim Tejani:
And the security team was like, "Well, that seems too small, but we'll try it. We still don't think it's going to happen because it's [inaudible 00:09:48]." And we said, "Okay, we're going to change the way you engage, change the way you communicate. Rather than you giving them a report that says, here's what you have to fix, we're going to give them the direct information. But you also can't dictate to them what they fix, right? We need them to have urgency in what they choose to fix." And so we set those ground rules, everybody agreed, and we sat down and we did it. We did training. We showed them how to use the software, but more than anything else, we actually showed the developers their code.

Bankim Tejani:
We were showing them their own code, their own flaws. And then we said, "Okay, what can you fix? How can you use this and change it?" And they set aside a month's worth of man-hours to do this, times three teams. In the end, I forget the exact numbers, they actually spent about 10 to 12 man-days across the three teams. So they were planning for a man-month each, right? Three man-months. And the actual time they spent was about 12 man-days. And what they actually fixed across the three teams was, I think, around 120 to 130 vulnerabilities.

Matias Madou:
Very nice.

Bankim Tejani:
Right? Because once they cut through the noise and the back-and-forth ping pong between different groups, what they actually found was, "Oh, I can make the code better."

Bankim Tejani:
And you know what, this thing over here is the same design pattern; I already know how to fix that. Once they learned it and applied it, and they had the right motivation to do it, and they had the planned time, right, that was the other piece of this: they actually had time planned, where they had set aside the resources to do this work. And that's not the typical model. The typical model, when you think of a developer's experience, is you get your stories in JIRA from your product owner, you develop and build it, you test it, then you move on. And then a month later somebody comes along and says, "Hey, we found this issue with this code that you wrote a month ago." Well, now you have to go carve out new time, pull your mental energy back, and go backwards a month: where was I? What was I doing? What was I thinking?

Bankim Tejani:
And so that doesn't work, right? We have to find ways to get that information to the person at the right time, in the right actionable moments. But we also have to be more respectful about the relationship and the urgency. I think of it like this: if you go to your partner and you say, "There's a hundred things I don't like about you," how's that going to go in terms of a dynamic? That's not going to go very well. It's not a productive way to build a relationship, but that's what we're doing. We're dropping findings and issues and risks on people and saying, "Hey, go fix this. What's wrong with you? Why haven't you fixed these critical issues, or these high issues, or these medium issues, or these low issues?"

Bankim Tejani:
And that's not the right approach. We have to create a different structure where they have urgency, but can also turn those things into planned work, where they can take ownership of when it's done and how it's done, and actually build that into the schedule. And I think there's really great thinking around the industry on this, right? SREs talk a lot about tech debt and the idea of a debt clock, where once you accumulate a certain amount of tech debt, you have to stop future work and pay it down. I think we can take similar approaches with security. There are other really great approaches to create planned capacity, right? Set aside 10 or 15% of your sprints and negotiate that upfront and say, "Okay, 10 or 15% is just set aside to do security defects." Whatever that is.

Bankim Tejani:
And if you get to a place where, hey, you've fixed all the security defects, you get that 10 or 15% of time back, right? You're not taking it away and returning it; it's still set aside, because if something happens later, it's still set aside. But if you engineer your way ahead of the curve, now you've got bonus time-

Matias Madou:
More time.

Bankim Tejani:
...that you can spend on the feature that your engineers really want to do, but your product manager doesn't necessarily want to prioritize, right? So you can play with that in lots of ways, but you can create these incentives for teams to self-actualize around what they want to do. And I think, on the whole, engineers, developers, even we as security people, we want to do good work. We want it to be valued. We want it to be useful. And so we have to find ways to tap that ethos in the way that we engage, from software security to our developers.
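
Here is a minimal sketch, in Python, of the planned-capacity carve-out described above, assuming a points-based sprint. The plan_sprint function, its name, and the 15% default are illustrative assumptions rather than anything specified in the episode.

# Sketch of the "planned capacity" idea: reserve a slice of each sprint
# for security defects, and release it back as bonus engineering time
# once the security backlog is empty. The 15% default is an assumption.
def plan_sprint(total_points, security_backlog_points, reserve=0.15):
    reserved = round(total_points * reserve)
    # Only spend as much of the reserve as the backlog actually needs.
    security = min(reserved, security_backlog_points)
    return {
        "security": security,
        "features": total_points - reserved,
        "bonus": reserved - security,  # earned back by staying ahead of the curve
    }

print(plan_sprint(40, 10))  # {'security': 6, 'features': 34, 'bonus': 0}
print(plan_sprint(40, 0))   # {'security': 0, 'features': 34, 'bonus': 6}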

Matias Madou:
Absolutely. And developers want to do the right thing. Actually, in your experiment there are a couple of key ingredients, I would say, that made it successful. There's for sure a cultural shift: hey, everybody has to rally around security, you make it a top priority. You acknowledged that they need time to upskill themselves, to make sure they can fix this [inaudible 00:15:26]. You brought the two teams together, which didn't happen 10 years ago either; as you were saying, people were throwing security issues over the wall. You gave them time. You had the right setup. So what I hear is that there are a lot of key ingredients in that experience that made it successful, and that people who hopefully listen to this need to copy into their organization to do something successful: bring people together, rally around it, and actually get it done.

Bankim Tejani:
Yeah. I think there are a lot of aspects to that. And it took work. It is a team effort to drive all of that. Yeah.

Matias Madou:
Then I have a second topic. I'm actually very, very curious how people live through a breach, and fortunately, or unfortunately, you lived through a breach.

Bankim Tejani:
Yeah, I did. Yeah.

Matias Madou:
So I would love to hear, from a personal experience, how it feels to live through a breach, because people tell me sometimes it's hard, or they've heard from somebody who knows somebody that it is hard, but you've lived through it. So I'm very curious.

Bankim Tejani:
Yeah. I lived through it. This happened in 2018 when MyFitnessPal was impacted. At that time I ran product security for Under Armour Connected Fitness, which included MyFitnessPal, Endomondo, MapMyFitness and then UA e-commerce. One of those properties had the breach, and we went through that and went public. It was a very difficult thing. I can't talk about all the specifics because of company info, but certainly the personal experience of it was breathtaking, in some ways, right? It changes so much. But I'll go back to beforehand. Even before then, years before, I had seen data breaches. I'd seen Target. I had seen what happened with TJ Maxx and so many others. And I always felt bad for the people who were affected, but hoped it would never happen to us.

Bankim Tejani:
And even just in your dialogue, when you talk about things, it's like, "Oh, well, this could happen and it could be bad. We could have a breach." It's this ominous thing that looms, right? The end of the world. And then it happened to us. And I remember the day that I knew it had happened, because we'd gotten some information from a security intelligence company that, "Hey, this might be something that's affecting you." It was a Sunday, and I remember because I went to go play hockey. I started a big file download before I left, and I went to go play my beer league ice hockey game. I came back from that, the data download had finished, and I started looking through it and I was like, "Whoa fuck, this is our data."

Matias Madou:
Wow.

Bankim Tejani:
And because like, wow, okay. And then the rest of that week, from that Sunday night to Friday, I think I slept maybe 20 hours [crosstalk 00:18:34] total in that timeframe. So it started right away. We knew it. And there were some things I learned from that, right, that were really of value. I mean, one, it is scary. I'd be like, I don't know, does this mean I'm fired? And that always kind of occurs as a possibility. But in truth, that was an unfounded fear. Because one of the things I learned from that experience is that it's not so much the thing that happened as what you did about it, and what we did was pretty amazing.

Bankim Tejani:
And so, when you look at the data, the average disclosure time, from the time you know something happened to the time you disclose to the public, is around 33 days, give or take, on average. And there are some that are much longer than that. We went public in four. That Sunday was when I found out and we got going, and we went public Thursday, right. That-

Matias Madou:
It's fast.

Bankim Tejani:
That was the whirlwind. That was fast. And it was a team effort, right? It took a ton of collaboration, and also the relationships that I had built there over the previous two years, that we had built collectively between the security team and engineering. We were embedded in the same offices and sitting together: our customer support apparatus, legal, privacy in particular, our communications team, all the engineers and so many more, right.

Bankim Tejani:
I mean, even technology and corporate risk and all of these different functions, and having the right things in place: having vendors on retainer for forensics, and outside counsel on retainer and ready to go. We had all of those building blocks in place, but it also took a tremendous amount of collaboration for four days to get ourselves lined up, to have answers, to have a diagnosis, to really drill in, stem any damage, and make sure to protect things. All of that was a huge thing. And one of the things I took away from it is that the fear is so much worse than the reality, and the actions you can take are so much better, right? Was it a good thing that we got breached? No, of course not.

Bankim Tejani:
But when we look at what happened, what we demonstrated was that we had the utmost care and concern for our customers, and we took all the right steps in terms of a response. I think we even got called out by some of the consumer data regulators as, "This is the way you should respond to an event like this." So is it great that it happened? No. But what did we learn? So much. We learned that having those relationships is so critical. This was one of the key elements of a talk I gave at [inaudible 00:21:38] afterwards about it: one of the killer things that made us execute that way was Slack. Because we had adopted a Slack culture, we did so much of that collaboration in Slack, in private channels, and so little of it was on email.

Bankim Tejani:
And the driver for that was just that that's how we were collaborating. But it was so evident afterwards, when you look back and do a retrospective, that it made us so much faster. Each day, when new people got read in and we said, "Hey, here's what's going on, how can you help?", they had a very clear history to go through. They didn't have to get forwarded 50 different email threads and then digest it all. It's all very linear, very clear: here's exactly what happened, here's who's doing what, here's what's been talked about. As questions came in, people ramped up quickly. And so when one person asked questions, the next person who came in got the benefit of those questions, and it wasn't starting over each time new people were brought in. [crosstalk 00:22:51]

Bankim Tejani:
All of those things helped us to operate that way. And we had a lot of other infrastructure. We had great engineers, brilliant engineers with us to help scale and automate, and they were there with us night and day through that weekend. So it was a team effort, but I feel like [inaudible 00:23:13] one of the things I took away is to be less scared, right, to not let that fear drive us. And fundamentally, I learned that there was nothing really we could have done earlier to prevent it. I mean, it's bad that it happened, yes. But when you look at the landscape, it's not the end of the world. And chasing perfect security... I don't think I had illusions of perfect security beforehand, because I had lived through enough with Fortify and everything else and seen what's out there.

Bankim Tejani:
But afterwards I also have a more resolved view: I'm not chasing perfection. What I really want to chase is, do we have the right dynamics? Have we set up the right incentives so that our developers can be their best and make the best decisions? Are we empowering our security people to do that? Are we being driven not by fear, but by the customer, by the value that we're providing, and making sure that we're delivering on their risk expectations and our risk expectations? The only system with zero risk is the one that's off, and if it's off, you're out of business. That's not a place you want to be. So it's much better to take risks and be smart about it than to strive for zero risk, because I just don't think that's feasible.

Matias Madou:
Yeah. So what I hear is you must have had a very well-prepared team. You're doing whatever you can to prevent something like that, but even then there's a plan B, a backup plan, so that if something goes south you are prepared. And the Slack one I had never heard before. It actually makes perfect sense that a communication mechanism like that, where you have the whole history in one place, can help in [inaudible 00:25:11] circumstances like that.

Bankim Tejani:
And there's also... we were talking about culture a little bit before. There's a cultural aspect to it too, of being able to connect and bond, and that drives the communication. Having that connectedness was really important. And we did that from the beginning. My effective CSO at the time at Under Armour, Brian, was so amazing at embedding our team with the engineering work, right. That's something he wanted to do from the beginning, and when I came in to drive that focus, that's what we did. And so that set the groundwork for everything that we did later, because we were all together, right? We were on the same team. The security team wasn't separate from the engineering team working on different goals; we were on the same set of goals.

Matias Madou:
What I also hear is that there's a lot of openness and transparency, not only to the outside world but also internally, because if you add people to Slack channels and they can read up on the history, there must be a lot of transparency, which then helps in resolving-

Bankim Tejani:
It does.

Matias Madou:
...and the communication part.

Bankim Tejani:
Yeah, yeah.

Matias Madou:
Bankim, we're coming to the end of our session. And I actually have one more question.

Bankim Tejani:
I'm waiting, man. I want to hear the homework that you've done-

Matias Madou:
Yes, so you're one of the people behind LASCON, right? And unfortunately it's not going through in 2021 and '22. For the people who do not know what LASCON is, it's the Lonestar Application Security Conference for builders and breakers, from application developers to security engineers. And you're one of the people behind that, so congratulations; it's a fantastic conference. My question for you is this: the speakers make the conference, and you've had some great speakers in the past. I was wondering, who would you like to have at LASCON? And if you say, "Well, we've already had everybody at LASCON that I ever dreamed of," who is your top speaker that you think would fit perfectly within LASCON?

Bankim Tejani:
Oh, man. That's a great question.

Matias Madou:
Except for yourself. I know you've spoken yourself. You can't pick yourself.

Bankim Tejani:
I can't pick myself. Oh, that's a great one. Well, I'll drop a plug for LASCON and the OWASP Austin chapter. I've been involved as a volunteer with LASCON since 2009. It started out of the Austin chapter of OWASP and has been shepherded by it. I was lucky: I was a volunteer for a number of years, and then I was able to step up into the conference chair position for a couple of years. Actually, just last year I stepped away from the conference chair because we had a toddler and I wanted to spend more time on that. And then it turned out that, because of COVID, we didn't have LASCON 2020. So I hope we're going to come back and be stronger in the future. Who would the speakers be? Oh, man. That's great. I'm going to preface this while I'm thinking about it: some of the really great speakers that I've gotten the privilege to meet and get to know over the years stand out.

Bankim Tejani:
Oh, there are so many more: Shannon, Wickets. I mean, there are some fantastic people. But who would I love to see there that I haven't seen already? There is one in particular, and that's Shawn Carpenter. It's going to sound interesting, and I don't know if you know him or have ever seen him or met him. I'm going to say it for a couple of reasons. One is because he was kind of a role model for me very early in my career. At that time we were both at Sandia National Laboratories, and he was actually my customer. He was one of the people who got me interested in and aware of security as a profession. Prior to that, I just knew of security as a hobby. And he was this whirlwind of energy tracking down security for Sandia National Laboratories, which is part of the nuclear weapons laboratories for the United States.

Bankim Tejani:
And he was my customer. I was actually writing data mining software to identify network anomalies for Shawn and his team. Since then he's gone on; he was a key person at NetWitness and an investor, and he became famous, or [inaudible 00:29:57] famous, for Titan Rain. During the time that I knew him, he was actually tracking a group of attackers. You can look it up: Google "Titan Rain" and you'll see the articles. He was tracking this group, actually discovered them, and then was able to be a whistleblower and report them to the FBI, and help protect many government agencies against continued attack by this group.

Bankim Tejani:
And a bunch of things happened to him after that, but he really believed in what he was doing, and he showed me this model of how to operate. So I think, for that reason, one, being a role model; but since then, he was at NetWitness and did amazing things, has been an investor in security companies and an advisor to a bunch of others, and he's just really one of the most brilliant people I have ever known. So that's who I'd pick.

Matias Madou:
Nice, so I really hope you guys come back with LASCON and it's going to be stronger than ever and that you're able to get him to the conference.

Bankim Tejani:
Well, I am hoping now that this is going to hit the social medias and stuff, and I'm going to tag him and hopefully he'll come down and give a keynote. That'd be awesome.

Matias Madou:
Okay. That would be fantastic. So Bankim, thank you very, very much for being the 19th guru on the Software Security Gurus webcast. Thank you very, very much for this lovely chat.

Bankim Tejani:
Thanks. Good to see you again as well.

Matias Madou:
Thank you.

Bankim Tejani:
Bye.

Never want to miss an episode? Get in touch and subscribe!