This Brand Story episode explores how security operations can move past the limits of data normalization and enable every analyst to perform like an entire team. Monzy Merza, Co-Founder and CEO of Crogl, joins Sean Martin and Marco Ciappelli to show how AI-driven collaboration reshapes what’s possible in the SOC.
When “Normal” Doesn’t Work: Rethinking Data and the Role of the SOC Analyst
Monzy Merza, Co-Founder and CEO of Crogl, joins Sean Martin and Marco Ciappelli to discuss how cybersecurity teams can finally move beyond the treadmill of normalization, alert fatigue, and brittle playbooks that keep analysts from doing what they signed up to do—find and stop bad actors.
Merza draws from his experience across research, security operations, and leadership roles at Splunk, Databricks, and one of the world’s largest banks. His message is clear: the industry’s long-standing approach of forcing all data into one format before analysis has reached its limit. Organizations are spending millions trying to normalize data that constantly changes, and analysts are paying the price—buried under alerts they can’t meaningfully investigate.
The conversation highlights the human side of this issue. Analysts often join the field to protect their organizations, but instead find themselves working on repetitive tickets with little context, limited feedback loops, and an impossible expectation to know everything—from email headers to endpoint logs. They are firefighters answering endless 911 calls, most of which turn out to be false alarms.
Crogl’s approach replaces that normalization-first mindset with an analyst-first model. By operating directly on data where it lives—without requiring migration or schema alignment—it allows every analyst to investigate deeper, faster, and more consistently. Each action taken by one team member becomes shared knowledge for the next, creating an adaptive, AI-driven system that evolves with the organization.
For CISOs, this means measurable consistency, auditability, and trust in outcomes. For analysts, it means rediscovering purpose—focusing on meaningful investigations instead of administrative noise.
The result is a more capable, connected SOC where AI augments human reasoning rather than replacing it. As Merza puts it, the new normal is no normalization—just real work, done better.
Watch the full interview and product demo: https://youtu.be/7C4zOvF9sdk
Learn more about CROGL: https://itspm.ag/crogl-103909
Note: This story contains promotional content.
GUEST
Monzy Merza, Co-Founder and CEO of CROGL | On LinkedIn: https://www.linkedin.com/in/monzymerza/
RESOURCES
Learn more and catch more stories from CROGL: https://www.itspmagazine.com/directory/crogl
Brand Spotlight: The Schema Strikes Back: Killing the Normalization Tax on the SOC: https://brand-stories-podcast.simplecast.com/episodes/the-schema-strikes-back-killing-the-normalization-tax-on-the-soc-a-corgl-spotlight-brand-story-conversation-with-cory-wallace [Video: https://youtu.be/Kx2JEE_tYq0]
Are you interested in telling your story?
▶︎ Full Length Brand Story: https://www.studioc60.com/content-creation#full
▶︎ Spotlight Brand Story: https://www.studioc60.com/content-creation#spotlight
[00:00:32] Sean Martin: Marco,
[00:00:33] Marco Ciappelli: Sean,
[00:00:35] Sean Martin: guess what time it is?
[00:00:36] Marco Ciappelli: It's story time! One of my favorite times.
[00:00:41] Sean Martin: It's, uh, it's not a normal story time. Nor is it normalized. This
[00:00:48] Marco Ciappelli: It is not, it's not normalized, you know. I spend way too much time with you and do too many interviews and podcasts with you, 'cause I knew you were gonna go there. [00:01:00] Well, I was gonna say, yeah, it's not normal. I don't know. You've been around for a while in this industry. Our guest has been in this industry for a long time, and I'm here, you know, as the younger guy, not actually in age, but younger in the industry.
So I wanna learn from you guys. But it's not normal, is it?
[00:01:20] Sean Martin: It's not normal. It's not normal, and people are gonna understand what the heck we're talking about as we dig into this. But I'm thrilled to have Monzy on from Crogl. How are you, Monzy?
[00:01:30] Monzy Merza: I'm really well. Good to see you guys again. Good to
[00:01:32] Sean Martin: Good to see you again. Yes. We had a quick chat not too long ago, got a quick overview of what you're up to, and we wanted to dig in and kind of talk about some stuff. You have some news that you're gonna share; we're gonna have another conversation with the team on that as well. And what it all means for organizations trying to secure their business and have an effective and [00:02:00] efficient SOC to do that. So we're gonna get into that.
But first, a few words about your past leading you up to the founding of Crogl, and then what the catalyst was for starting the company.
[00:02:15] Monzy Merza: Yeah. Well, thanks for having me, and everybody who's watching this, thanks for listening in. So I'm a lab rat by training. And what I mean by that is I worked in the nuclear weapons complex for about a dozen years as an applied security researcher, working on hardware and software problems, all related to cybersecurity, and building things for the weapons complex for that mission work.
And then I went and worked for Splunk for about a decade, and after that started the cybersecurity go-to-market function for Databricks. And throughout this time period, there was a thing that kept getting stuck in my head: there's lots of data, there's lots of tools.
Conventional wisdom is, you know, put all your data in one [00:03:00] place and analyze it, and that's the way to do it. And I realized that was not factually tenable or operationally tenable, 'cause organizations have data in lots and lots of places. I mean, I've literally shaken hands with probably a thousand or so CISOs, and thousands of security practitioners in the community, over the years.
Security is all I've ever done. And that was the reason to start Crogl. But before starting that, I thought to myself, well, I should go back and really get my hands dirty. So I joined one of the largest banks in the world to work on the security team. The executives there sort of joked with me. They were like, hey, you're an executive at Databricks.
Why don't you come be an executive here? How big of a team do you want? I was like, no, I don't want a team. I have this idea. I want to create a startup. I want a keyboard, and I want to work on tickets. They're like, you are crazy, because that's a [00:04:00] grungy job. I said, I know.
And I really wanna understand how it works, 'cause I don't want to be the guy who starts a company without having done the work. I mean, you don't eat food from a chef who's never eaten food. It doesn't work that way. Why is it okay to build a company like that? And so I did that.
So we had this crazy idea, and we sat around the fireplace and we said, what if you created a product which would make every security analyst as effective as the entire team? Now think about that. That's a crazy proposition. And we're not naive. We've been in the security industry for a long time, as practitioners and researchers, done all the things, right?
So we knew it was a ridiculous proposition, but that was the ridiculous proposition. And then we said, well, what would it take for that proposition to come true? What could we do to even scratch the surface on that one and get deeper? So that was our North Star, and that was the starting point for Crogl.
So Crogl's now been around for two and a [00:05:00] half years. And we have customers in large government agencies and other commercial enterprises. We're really trying to focus on organizations that have really, really hard problems.
[00:05:13] Marco Ciappelli: Yeah, I really love the story, and I know you're gonna actually show us something, like a little demo, in a little bit. But there's a few things you said. I mean, we started joking about normalization, but then you said one magic sentence, which is "what if." And I think that's the key for any innovation, any hacking mentality, not the bad guys.
Hacking meaning the scientists, the inventors. They're like, what if I do it differently? I love that. And normalization is exactly when you say, well, I've always done it this way, why should I change? So it is really how you break stuff and you innovate. So, love that.
[00:05:56] Sean Martin: Yeah, and I think the challenge, we've touched on this before, [00:06:00] Monzy, both on recording and other times. I mean, the problem we were solving 10, 15, maybe even 20 years ago now was hard to solve, right? The technology available to solve it hadn't quite come around yet, to scale with the businesses, and the number of systems connecting and the amount of data coming in was difficult to deal with.
It's only expanded exponentially since, and I think we ended up with a method for which we manage our data in the SOC, to your point. And we found, I guess, a common ground where it kind of did okay, but it wasn't gonna scale to the future very well, which is what you've seen.
So maybe now's a good time to touch on what that really looks like, and the challenge there, and how you guys are really shifting the method and the model for how we address this. So I'm gonna [00:07:00] pull up your slides here, and you can kind of walk us through some things, and then we'll get into the demo.
[00:07:05] Monzy Merza: Great. Um, okay, cool. Let me see. We're seeing that, yeah. I'm not gonna do death by PowerPoint, but I do want to call out a couple of things just to set the stage, right? Like, why, what was happening? So before we founded Crogl, I mean, we're in 2023 at that time, and security teams are missing breaches.
And you look at why they miss breaches, and there's lots of reasons, and people play a lot of blame games, like, this person's lazy, or that person's not competent. Okay, throw away all of that and really look at the physics of what's happening. And the first part of the physics of what's happening is that as technology footprints have evolved, sensors have been developed to check the security posture, or to create alarms, or to create compliance modalities for the organization. So every time there is a new thing, it creates more [00:08:00] alerts, and the security team is somehow magically expected to understand all those alerts, and to understand all of the behavioral patterns of the user, understand what's normal and what's not normal from an investigative point of view.
So that's a very important thing to call out. And so that's why I say that analysts have a domain knowledge gap. This is not a slight against analysts. This is a weird thing, and maybe I'm just hungry, so I keep going back to kitchen analogies, but, you know, you don't expect a baker to be a bartender, or a bartender to be a dentist. But in cybersecurity, we expect our SOC analysts or security operators to understand what email headers look like, what PCI means, how to remember where all of the data is in the enterprise, and be competent in writing exceptionally clean and performant queries in a particular data lake. It's a ridiculous proposition.
[00:08:55] Sean Martin: On and on and on. I mean, you just picked a few.
[00:08:59] Monzy Merza: Yeah, just
[00:08:59] Sean Martin: [00:09:00] It's endless.
[00:09:01] Monzy Merza: Right? It's a ridiculous proposition. And then what we do is we say, oh, we have a security team. But it's a team like an Olympic team is a team, and an Olympic team is really not a team, because everybody runs their own events. And that's how most security teams operate, where Bob gets an alert and Bob works on that alert. What Bob learns, unless there's some meeting or some crazy retrospective or a whole bunch of document writing and retraining, Sally doesn't get to benefit from Bob's experience. In contrast, take some other sports, which are truly team sports, where each player benefits from the other players' activities.
Security teams don't behave that way. Product development teams behave that way, but security teams don't. So when we tore down the problem, that is sort of the core problem in my mind on what was happening, and what is still happening in many organizations. That's [00:10:00] kind of the base principle.
And now this other stuff makes sense. And by this other stuff, I mean: why didn't the prior stuff work? So we tried SIEMs. I was there. Thank you to everybody who is a customer of Splunk that has bought the Splunk SIEM, and we tried to solve that problem. We tried to create this single pane of glass.
It has its benefits, but it has detriments, and one of those big detriments is everything has to be normalized. We tried it with SOAR, and again, I was in that industry for a long time as well, and SOAR was useful, but those playbooks are so brittle, and they're so hard to maintain, that every time something changes slightly, everything breaks.
Most organizations that I talk to maybe have four or five SOAR playbooks that are deployed in production. That's about the extent of it. And then we started seeing this emergence of zero-shot. If we look at the November 2022 time period, when ChatGPT really came out of the private beta into a public beta, people started seeing the opportunity, [00:11:00] what could be done with a frontier model.
People started to build these zero-shot AI SOC things. And even that didn't work. So that's kind of the world that we're in. So we look at security leaders: there's tool competency issues, there's data sprawl, there's use case explosion. But then we talked to the security teams, especially the more sophisticated ones, because we wanted to do this one-person-being-as-effective-as-the-entire-team mantra. So we talked to teams that were doing really, really well, and what we realized was they still had the problems. They had a lot of capability, but they were really focused on trust and safety of the future of their security operations using AI.
They wanted to use the best tool for the job, because they realized that data was everywhere and things were growing really, really fast. And so when we built Crogl, we built it on three core principles that are rooted in data, process, and governance. [00:12:00] And those things are: first, we want to live in a world where you investigate every alert. There are just no ifs, ands, or buts around that. Because you can say, oh, that's a false positive, or that should be tuned out. We can say that, but you can also see the logical conclusion of that: at some point, if you don't pay attention to the alert, whether you're desensitized to it or you just don't have the time for it, you're gonna miss the breach.
[00:12:29] Sean Martin: Yeah.
[00:12:30] Monzy Merza: And that's gonna be the thing that costs the
[00:12:32] Marco Ciappelli: Are you saying you're normalizing missing the alert?
[00:12:37] Monzy Merza: Yeah.
[00:12:38] Sean Martin: Well, yeah. It becomes part of something bigger, right? All of a sudden that one thing that's become irrelevant is part of a chain that is relevant.
[00:12:46] Marco Ciappelli: no.
[00:12:48] Monzy Merza: And I think that's the issue. I've talked to so many folks, and myself being a practitioner, they were like, well, we'll filter alerts, we need more alert filtering. It's like, yes, we do, but we're not doing that because we're gonna get a [00:13:00] better security posture. We're doing that because we're tired of crushing our teams, because they just don't have the time, and we don't want to make them look bad.
I've literally seen people close tens of alerts with one checkbox without investigating any one of them. It's not because they're lazy; they don't have the time. They have to go do something else, so they don't have the time to go through every single one where there's such a low probability of finding anything meaningful in there.
But then the other big one is this schema thing. And this was crazy. When we shared with people that we're gonna build a product that will work regardless of what schema you have under the hood, people were like, that's just stupid. Come on. For 40 years, every single person in the world has said that you have to normalize your data, whether you use an ETL approach or an ELT approach or whatever approach you're using.
You have to normalize your data across all these things. And our [00:14:00] statement wasn't, oh, you know, we hate those people who build the data warehouse platforms. It's that when we look observationally at what's really happening in an organization, and operationally at what's happening, data's not normalized. And people have spent millions of dollars on these treadmills of trying to normalize data, but it's not having the outcome. Something always changes. The data store changes. The data format changes. A new data format shows up. People have multiple data lakes that they utilize. They're sitting on cloud, on-prem, hybrid, all over the place.
And so to expect it, it's just not true. It's like saying you're gonna breathe without oxygen. It's just a weird thing to say. Nobody would buy that. It's sort of an extreme example, but it has to be true. And then there's the auditable action on the governance side of this.
We really wanted to build a system that people could trust. And the only [00:15:00] way you do that is by showing the work that was done, so that people can learn from it. It seems like a simple argument, but it's unbelievable how many systems are built by popular cybersecurity vendors, who do good work, where the tools or the techniques are not visible to the security operator. And then we layered this other thing. We said, you know, AI is very important. We're gonna embrace it, but it has to be private, it has to be customer-managed. You can't just be leaking stuff to it. So those are kind of our main objectives. And tactically, the data, process, and governance pieces get encapsulated in very specific aspects, and packaged up into a private system.
So that's sort of the thing that we built. I mean, I can show you a quick customer example slide, and then we can go to a demo, or maybe you guys have some [00:16:00] reactions to this.
[00:16:01] Sean Martin: Well, yeah, let me say this first, and maybe you can touch on this to kind of set the stage for the customer example and the demo. When vendors sell security to CISOs, it's often about setting them up for success.
The CISO, right? Which means hopefully their program is successful, and then they can demonstrate to the executive leadership team, and the board perhaps, that they're successful. And when you bring that down to the analyst, the analyst being successful gets lost. Right? I think it's either assumed, or, we'll figure that out, that doesn't really matter. It's how the CISO and all the programs look, so they can talk about it.
I mean, to your point earlier, just the ability for them to keep [00:17:00] up on all the different domains and all the different data sets coming in. Even if it's all normalized, they might be missing context, and to your point, they close tens at a time.
What do the teams, the analysts, feel when they're dealing with this challenge? Do they feel like they're doing the right thing? Do they feel like they're missing things? Do they feel that they can't keep up with the changes? Do they feel there's information there that they can't get because it's not part of the schema, not part of the normalized data set?
What do you hear from them?
[00:17:40] Monzy Merza: Yeah, I think there's multiple layers, right? So at a very, very high level, most security operators that I talked to joined the security industry, and joined the world of being an operator, because they wanted to find bad guys, [00:18:00] and they wanted to protect or defend whatever organization it is that they joined.
And the reality of it is, what they're really doing is working on alerts that usually don't yield anything. They're working on misconfiguration-related alerts, where when they pull on the thread to find out, well, you know, Bob just forgot to change his password when he was supposed to.
And so he's still trying his old password, and that's why the alarm's going off, that somebody's brute-force attacking him. And it's really actually Bob; it's not some bad person doing it. And so it kind of went from what in their mind was supposed to be a really noble profession, where they were gonna make a huge contribution, to something that is really of little consequence.
They get measured on weird things, in terms of how fast they respond, or how many tickets they work on. And they have this ridiculous [00:19:00] expectation from the organization that they're supposed to know everything. All the things that I listed before, whether it's email or application services or tools or whatever, they're supposed to know everything.
And they have this strange job where, when they do good things over and over and over again, they're just doing their job. And when they miss one thing, it is catastrophic. The CISO gets fired, the leadership gets fired, people get shuffled around. Everybody gets blamed for a bunch of stuff.
And that's what happens, right? That's the reality. I've talked to some folks who were like, well, it feels like we're firefighters who just keep answering the 911 call, but we never actually get to go firefight. And most of those 911 calls are crank calls. So that's all it is. We actually never get to do the thing that we always wanted to do. So that's, I think, the meta level, the physics [00:20:00] level, of what is actually happening on a day-to-day basis.
Alerts get dished out. Individual security analysts have to work on those alerts. They write their documentation. Somebody sometimes sees something, sometimes doesn't see something. There is no direct line for them to be able to communicate, like, hey, these are all the things that we're seeing, maybe somebody ought to do something about that. They just keep seeing their repetitive alerts, so they don't have a general mechanism for their voice to get through. Then we go one layer deeper, to the tactics of it. This is what I saw at the bank. This is what I see in many organizations.
When an alert comes in, usually one of the first messages that goes out to the rest of the team is: who knows how to write the query to look for this thing in this tool? Like, who knows how to write the query to look at how many emails Bob got last week? And the first response usually is, Sally's really good at that, but she's not here today.
But here's the thing that she sent me when I [00:21:00] asked her this last time. Maybe this will work for you. And so that's the kind of thing. So it doesn't always feel like you're doing awesome things. So security practitioners often have to find these other kinds of side things that they take on with their jobs.
And this is where they're going and investigating things, or reading blog posts, or trying to stay aware, or tinkering with tools. Which is when the leadership says, well, why are they doing that? Why are they not working on alerts? Because working alerts is boring. They have ideas for things to do, but don't have the time.
[00:21:37] Sean Martin: Right.
[00:21:38] Marco Ciappelli: I'm telling you, I've learned two things: you don't wanna be Bob, and maybe you wanna be Sally. But, but let,
[00:21:46] Sean Martin: there.
[00:21:47] Marco Ciappelli: let's talk about how, and maybe this is where you kick in the customer slide, how the solution is gonna help Bob. 'Cause I feel really, really bad for [00:22:00] Bob right now.
[00:22:00] Monzy Merza: Yes. We all need to build some empathy for Bob, because this is Bob's world. That first bluish, grayish box. Bob is living in a world, and this is one of Crogl's government customers, where there are 70-plus terabytes
of data coming in day after day, just constantly. And I think most people don't really rationalize what 70 terabytes of data is. So I'll give you a visual. This phone of mine has about half a gig, excuse me, 512 gigs of memory. So it's half a terabyte. And so 70 terabytes a day is: I'm gonna throw this phone at you every 20 minutes or so. Three times an hour, I'm gonna throw at you a phone's worth of data. Who knows what the heck's in here. I'm just gonna keep doing that, every day. And when you go home, I'm still throwing; you're not sitting in that chair, but I'm still [00:23:00] throwing this phone at you, right?
That's what's happening. That's 70 terabytes a day, just to kind of wrap your head around a physical example of it. That's crazy. How do you even analyze, or work through, that problem? And then you have all these tools. Bob in this case has a SOAR, an EDR, and this
two-SIEMs thing, which is at the core of the problem that I was starting to observe. Like, who has two SIEMs? You'd be surprised that many people do.
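Merza's phone analogy lends itself to a quick back-of-envelope check. Here is a minimal Python sketch using only the figures quoted in the conversation (70 TB/day of ingest, a 512 GB phone); everything else is simple arithmetic:

```python
# Quick check of the "phone's worth of data" analogy.
# Assumptions from the conversation: 70 TB/day of ingest, a 512 GB (~0.5 TB) phone.

TB_PER_DAY = 70
PHONE_TB = 0.5  # 512 GB is roughly half a terabyte

phones_per_day = TB_PER_DAY / PHONE_TB          # phones of data per day
phones_per_hour = phones_per_day / 24           # phones per hour
minutes_per_phone = (24 * 60) / phones_per_day  # minutes between phones

print(f"{phones_per_day:.0f} phones/day, {phones_per_hour:.1f}/hour, "
      f"one every {minutes_per_phone:.1f} minutes")
```

Worked through, 70 TB/day is 140 half-terabyte phones per day, which is closer to one phone every ten minutes than the on-air estimate of three per hour; the analogy understates the firehose, if anything.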
[00:23:31] Sean Martin: Mm. Yep.
[00:23:33] Monzy Merza: But no CISO will ever stand up in public and say, guys, great news: we have two SIEMs.
[00:23:41] Sean Martin: Great news.
[00:23:44] Monzy Merza: But that's reality. And behind those two SIEMs, in this case, and in other cases, sometimes we've seen organizations with even more, there are multiple data lakes that connect up, and multiple detection things, and all of that. Right? So
[00:23:57] Sean Martin: Yeah, I was gonna say there's probably some [00:24:00] extra stack underneath all that too.
[00:24:01] Monzy Merza: Oh, there is a lot. Yeah, it's not even the tip of the iceberg.
It's the thing at the very top that you just kind of see. So that's what's happening. So this is a very complex organization, but we're starting to see even smaller organizations that are getting to this point, 'cause they have so many different tools. So why are they using Crogl?
They're using Crogl because, one, they want to investigate every alert that comes through. The analyst, Bob, should not be investigating every alert. Bob is an intelligent human being and is dedicated, and Sally is an intelligent human being. She's dedicated to doing a good job and wants to do good for the organization.
They want Bob and Sally to focus on things that they should really be paying attention to, and not just by saying, oh, this is important, trust us, but in a way where there's evidence there. So I'll show you in the demo how the alert gets investigated, and how a person like Bob or Sally, when they get the result from Crogl, has some confidence in terms of what it's doing. [00:25:00] As a consequence, you'll be able to see why things go so much faster and why things go so much deeper. Especially on the depth piece: because you're supposed to know so many different tools in order to do a deep investigation, I'll show you how Crogl reduces the need to learn every single tool, 'cause it will touch multiple tools. And then you'll see the consistency, because of the response plan that's in place. Despite the fact that we're using AI capability, we still make things consistent, because to an auditor, you have to demonstrate that that's what you're doing.
And when Bob does something and Sally does something, it should be consistent. So this is the other point that I'll make, and then we'll switch to the demo: consistent doesn't mean incomplete
[00:25:49] Sean Martin: Or rigid.
[00:25:51] Monzy Merza: or rigid, or the same constantly.
[00:25:53] Sean Martin: Yeah.
[00:25:55] Monzy Merza: Consistent can evolve, and it should. And so we'll [00:26:00] talk about that as we go through the demo.
So with that, I
[00:26:04] Sean Martin: I'm excited. Yeah, let's
[00:26:06] Monzy Merza: my, uh, my screen. So let's get to it. Let me log in and share this tab. So this is the main dashboard. Imagine that you have deployed Crogl in your environment. The two minimal requirements for Crogl to work are: one, there has to be some sort of case management system.
For most people that's a ticketing system like Jira or ServiceNow, or it's their SIEM, because that's where the alert goes and that's where they do alert management, or it's even their SOAR platform, 'cause that's maybe where they're doing case management. So you have some system like that that you've deployed.
The other thing that you need is at least one data lake of some kind. That could be anything: Splunk, Elastic, whatever it is that you use. Even S3 buckets, it doesn't matter. There has to be some data store. Databricks, whatever. So some case [00:27:00] management system at least, and some data store.
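The two prerequisites described here, a case management system plus at least one data store, can be pictured as a minimal configuration check. This Python sketch is purely illustrative: the field names, store types, and URLs are invented for the example and are not Crogl's actual configuration format.

```python
# Hypothetical sketch of the two minimum integrations described above:
# one case-management system (where alerts land) and at least one data store.
# All names here are invented for illustration.

deployment = {
    "case_management": {"type": "servicenow", "url": "https://example.service-now.com"},
    "data_stores": [
        {"type": "splunk", "url": "https://splunk.example.com:8089"},
        {"type": "s3", "bucket": "edr-telemetry-archive"},
    ],
}

def meets_minimum(config: dict) -> bool:
    """True when both prerequisites are present: a case-management
    system and at least one data store of any kind."""
    return bool(config.get("case_management")) and len(config.get("data_stores", [])) >= 1

print(meets_minimum(deployment))  # True
```

The point of the check mirrors the conversation: the data stores can be of any type and in any number, because nothing downstream assumes a single normalized location.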
And then you deploy Crogl. So in this case that is true: Crogl is deployed, you have a data store, you have a SIEM, and alerts are coming in. Crogl is investigating those alerts. So I'm gonna click on this Malware Detected on Host alert, and at the very top, Crogl is just giving you a summary, saying: I investigated this alert and a threat has been found.
Why is there a threat? We're just gonna give you a quick summary, because we evaluated across eight phases of the kill chain utilizing 13 different actions, and we found evidence. And there are some number of assets and identities that are impacted. So right there, in a very short blurb, you know that something was done deeper than just the singular malware detection alert, and some analysis was done.
So you can start to build some confidence that, okay, the story's going somewhere. Farther below you see a little bit of an overview. This is something you can copy and paste [00:28:00] into another system, or into a Slack message or something, so that it's somewhat human readable.
The other thing that we do: you say, okay, cool, I get it, you did all that, but what did you actually do? 'Cause you're trying to build confidence in the system, right? As a human being. So this is the actual set of actions that was done. It's a malware detection alert. A responsible Bob or a responsible Sally will want to know: what was the initial access for that alert?
Where did that bad piece of malware come from? So Crogl follows the kill chain methodology to look across the different techniques that an attacker might use, and then figure out if there is a threat. And in order to do that, data is spread across multiple places. It's not always sitting in the same place.
Crogl goes to the appropriate location to do that. So in this case, you can see that it created a Splunk query, and it's able to investigate that and get some results. Farther below, as [00:29:00] the investigation continues, it's creating queries against Cribl, a Cribl Search. And this one is interesting because it's not just Cribl Search, it's a Cribl Search query that's looking at data sitting in an S3 bucket for CrowdStrike Falcon.
So you're like three degrees of separation out, and Crogl is able to figure out that that's the data sitting in that S3 bucket, and it's gonna use it under this circumstance to analyze execution. And likewise there's other phases, and that's how Crogl figured out that there was a threat here in this case, and that's why we say we looked at eight phases and 13 actions.
These are the 13 actions that have kind of built on top of each other.
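The investigation flow described here, walking kill-chain phases and dispatching a query to whichever store holds the relevant data, can be sketched roughly as follows. This is a minimal illustration, not Crogl's actual implementation; the store names, queries, and canned results are all hypothetical.

```python
# Sketch: phase-by-phase alert investigation that queries data where it
# lives. Phase names follow the kill chain; stores, queries, and results
# below are illustrative placeholders.

from dataclasses import dataclass

@dataclass
class Action:
    phase: str   # kill-chain phase this action investigates
    store: str   # where the relevant data lives, e.g. "splunk", "cribl_s3"
    query: str   # native query to run against that store

def run_query(store: str, query: str) -> list[dict]:
    # Placeholder: a real system would call the store's API
    # (Splunk REST search, Cribl Search over S3, etc.).
    canned = {
        "splunk": [{"host": "ws-042", "evidence": "dropper.exe executed"}],
        "cribl_s3": [{"host": "ws-042", "evidence": "Falcon detection in S3 logs"}],
    }
    return canned.get(store, [])

def investigate(alert: dict, plan: list[Action]) -> dict:
    findings = []
    for action in plan:
        for row in run_query(action.store, action.query):
            findings.append({"phase": action.phase, **row})
    return {
        "alert": alert["title"],
        "threat_found": bool(findings),                        # any evidence at all?
        "phases_with_evidence": sorted({f["phase"] for f in findings}),
        "actions_run": len(plan),                              # "13 actions" in the demo
    }

plan = [
    Action("initial_access", "splunk", "index=email attachment=*"),
    Action("execution", "cribl_s3", "dataset=falcon event=ProcessRollup2"),
    Action("persistence", "splunk", "index=endpoint registry_path=*Run*"),
]
report = investigate({"title": "Malware detected on host"}, plan)
print(report["threat_found"], report["actions_run"])
```

The point of the structure is that each action names both a phase and a store, so evidence from Splunk and from Cribl Search over S3 rolls up into one summary, like the "eight phases, 13 actions" blurb in the demo.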
[00:29:44] Sean Martin: I'm gonna pause you here quickly, Monzy, 'cause the alternative is: this is one of 10 alerts that look similar, and the analyst says, I see this all the time. These 10 this hour are almost identical to the [00:30:00] 10 last hour, and the hour before that, and the 10 blocks from the day before.
So I'm just gonna close it. If they wanted to say, this seems suspicious, they would have to do all the stuff you just showed, the phases and the actions, manually.
[00:30:18] Monzy Merza: Yes.
[00:30:19] Sean Martin: And that's why they don't. Not that they're lazy, but if they did that for all 10 and only one of them resulted in this, obviously it's a good thing they did.
They found the one, but they would've wasted all that time manually doing it for the other nine, over and over, every hour. So do you find that the teams and the customers you're working with recognize this and go, my goodness, I can't even describe the value that it provides?
[00:30:48] Monzy Merza: Yeah, that is the reaction. I mean, you look at the query that's in front of you right now, towards the middle of the screen. Forget about whether you can remember the schema. You still have to type the darn thing [00:31:00] out,
[00:31:01] Sean Martin: Yep.
[00:31:02] Monzy Merza: so you have to, you know.
[00:31:04] Sean Martin: With no errors. Yeah.
[00:31:05] Monzy Merza: Yeah, yeah. When you have to write a Splunk query, you have two problems, at least. Or in this case, a Cribl query, right? Or Cribl Search. Who remembers all of this?
So yeah, when we show this to security practitioners, they're like, yeah, this is what I want, 'cause this will help. And then their immediate next question is: well, what if it makes a mistake? Or how do I know that this is correct? And so that's why, as we go through this, there are two things. Behind the scenes, to achieve that consistency bit that I called out before we jumped into the demo, Crogl creates a response plan. And that response plan actually includes what Crogl is gonna do.
So let me show you this query. In this particular [00:32:00] case, you'll notice that this is the template of the query that we would use. Now, this template was not prepackaged with Crogl. This template was learned from your environment. So those of us watching can see "recipient" between two squiggly brackets.
It's not a particular field label from any specific specification, like OCSF or the Common Information Model or the CEF format. It just so happens that in this particular case, in this organization, that's the field label that's used for recipient email addresses. Crogl figured that out. And so now it knows that if it's gonna go look to investigate files and attachments, this is the query that it's gonna run. And if somebody wants to change it, they can change this.
If somebody wants to disable it, because they say, Crogl, we don't trust you, this is the wrong query or whatever, they can disable it. So what it's going to do in a given circumstance is already [00:33:00] visible to the org, to the team, ahead of time, because it's gonna follow a process. But we talked about consistency:
being consistent does not equal static. So as Crogl continues its operation, as people do more work, Crogl will learn from that work, and Crogl will update the response plan. So this is not like a SOAR playbook, where a human would have to change it. In this case it updates as data changes and as people's actions change. This is the Bob and Sally part.
So when Sally works on this alert, after Crogl has said something, Crogl will learn from Sally. Crogl will change the response plan. When Bob gets the alert, the result that Bob is gonna see is synthesized from Sally's work. It's incorporated. This goes back to the higher level mantra: make every analyst as effective as the entire team.
And the only way you can do that is by shrinking that collaboration gap. Otherwise you're trying to put Sally and Bob in a room constantly to talk to each other. That's [00:34:00] just not a good way to learn, and that's not a good way to action things. There are too many meetings.
[00:34:04] Sean Martin: Yeah. Well, I love that. We're coming close on time here, so I'm gonna say that was a great demo. I wanna recap with some of the final points here, so I'll put us back into conversation mode, because I wanna bring it back to the normalization part.
There isn't a bunch of systems, there isn't a collection of other data sets that are now normalized and missing context and missing critical data. All of that is kind of gone, right? It goes straight into Crogl from the data lake, with access to the other systems, so we can do the different phases of activities.
I remember building collectors and connectors and normalization technologies, and new stuff would come, and you'd change the schema, and everybody would get pissed off. That, [00:35:00] in itself, was a challenge, and complex, and it leaves things open for risk and error. So to bring it full circle: the new normal is no normalization. Eliminate all that junk, just give the data to the system, and ultimately to the analysts, so they can actually help train the system to do what it needs to, and then obviously they learn as well,
[00:35:27] Monzy Merza: You have the access to the system, right? So Crogl is working on the data where the data lives. That's another important piece of it: it just goes and queries the system, or interacts with that system, makes the API call with that system, whatever needs to be done, and leaves the data where the data is. It's emulating an analyst, essentially, right?
That's what analysts do. We don't print out pages and pages of logs, put 'em on our desk, and start sifting through them. So why is it okay for another company to come tell you: oh, you wanna do security analysis? You have to buy this thing, and now you have to migrate all of your data into this thing, normalize it, and then you can use it. [00:36:00] We're like, no, if the data's already there, just use it.
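The "leave the data where it lives" point amounts to a thin adapter per store: speak the store's native query language over its API and return results, instead of ingesting and re-normalizing up front. A minimal sketch, with stubbed adapters standing in for real API calls; class names and queries are illustrative assumptions:

```python
# Sketch: per-store adapters that query data in place. Each adapter
# speaks the store's native query language over its API; nothing is
# migrated or normalized ahead of time. API calls are stubbed here.

from abc import ABC, abstractmethod

class StoreAdapter(ABC):
    @abstractmethod
    def search(self, query: str) -> list[dict]:
        """Run a native query against the remote store and return rows."""

class SplunkAdapter(StoreAdapter):
    def search(self, query: str) -> list[dict]:
        # Real code would submit the query to Splunk's search REST API.
        return [{"source": "splunk", "query": query}]

class CriblSearchAdapter(StoreAdapter):
    def search(self, query: str) -> list[dict]:
        # Real code would call Cribl Search, which can scan S3 in place.
        return [{"source": "cribl", "query": query}]

REGISTRY = {"splunk": SplunkAdapter(), "cribl": CriblSearchAdapter()}

def query_in_place(store: str, query: str) -> list[dict]:
    # Dispatch to the right store; results come back, data stays put.
    return REGISTRY[store].search(query)

rows = query_in_place("cribl", "dataset=falcon_s3 event=DetectionSummary")
print(rows[0]["source"])
```

The design choice mirrors what an analyst does by hand: open each console, run the native query, read the results, with no bulk migration step in between.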
[00:36:03] Sean Martin: It's incredible.
[00:36:04] Marco Ciappelli: Well, what I like is, you know, obviously I'm not a SOC analyst, but I use generative AI, and there is the learning curve, and the learning according to the way you work. What's my style? What is my environment? And when you buy something prepackaged, it's almost like a recipe: this is the recipe. Well, that's great, but this ingredient is not in season, so how are you gonna adapt according to what your environment is? And not every environment is the same.
It's like the difference between buying something that is packaged beforehand, where everybody gets the same upgrade, versus upgrading it as you go. And that's why I like it very, very much. Well, I think we could have another conversation. I'm very fascinated by this whole [00:37:00] way of thinking, and the way the business is changing, the way the cybersecurity world is changing. So looking forward to many more of these.
[00:37:10] Sean Martin: Yep, and stay tuned. Not only is it cool, the outcomes are obvious, at least evident to me, and I'm sure analysts and SOC managers are probably seeing it as well. Hopefully the CISOs recognize that their teams can benefit from this, not just the program. But you've also patented this stuff, so we're gonna have a chat about that too.
So stay tuned. We'll be talking more about the patent that helps make all this possible. Good stuff.
[00:37:40] Monzy Merza: Yeah. Thank you guys. I really appreciate the time and really appreciate the questions, and we're looking forward to sharing this with more and more people.
[00:37:47] Sean Martin: Yep. I love it.
[00:37:48] Marco Ciappelli: love it.
[00:37:49] Sean Martin: Thanks, Monzy. And thanks, everybody, for listening to this Brand Story here on ITSP Magazine. Be sure to connect with Monzy on LinkedIn, and connect with the [00:38:00] Crogl team. See how it can connect to your systems and data sets and give you the actions that your analysts need taken.
Really appreciate it.
[00:38:13] Monzy Merza: Thank you guys. Have a great day.
[00:38:15] Marco Ciappelli: Take care everybody.