ITSPmagazine

AI Adoption Without Readiness: When AI Ambition Collides With Data Reality | A TrustedTech Brand Story Conversation with Julian Hamood, Founder and Chief Visionary Officer at TrustedTech

Episode Summary

AI does not fix messy environments; it amplifies them. This episode explores what real AI readiness looks like when data, security, and architecture are treated as prerequisites rather than afterthoughts.

Episode Notes

As organizations race to adopt AI, many discover an uncomfortable truth: ambition often outpaces readiness. In this episode of the ITSPmagazine Brand Story Podcast, host Sean Martin speaks with Julian Hamood, Founder and Chief Visionary Officer at TrustedTech, about what it really takes to operationalize AI without amplifying risk, chaos, or misinformation.

Julian shares that most organizations are eager to activate tools like AI agents and copilots, yet few have addressed the underlying condition of their environments. Unstructured data sprawl, fragmented cloud architectures, and legacy systems create blind spots that AI does not fix. Instead, AI accelerates whatever already exists, good or bad.

A central theme of the conversation is readiness. Julian explains that AI success depends on disciplined data classification, permission hygiene, and governance before automation begins. Without that groundwork, organizations risk exposing sensitive financial, HR, or executive data to unintended audiences simply because an AI system can surface it.

The discussion also explores the operational reality beneath the surface. Most environments are a patchwork of Azure, AWS, on-prem infrastructure, SaaS platforms, and custom applications, often shaped by multiple IT leaders over time. When AI is layered onto this complexity without architectural clarity, inaccurate outputs and flawed business decisions quickly follow.

Sean and Julian also examine how AI initiatives often emerge from unexpected places. Legal teams, business units, and individual contributors now build their own AI workflows using low-code and no-code tools, frequently outside formal IT oversight. At the same time, founders and CFOs push for rapid AI adoption while resisting the investment required to clean and secure the foundation.

The episode highlights why AI programs are never one-and-done projects. Ongoing maintenance, data validation, and security oversight are essential as inputs change and systems evolve. Julian emphasizes that organizations must treat AI as a permanent capability on the roadmap, not a short-term experiment.

Ultimately, the conversation frames AI not as a shortcut, but as a force multiplier. When paired with disciplined architecture and trusted guidance, AI enables scale, speed, and confidence. Without that discipline, it simply magnifies existing problems.

Note: This story contains promotional content. Learn more.

GUEST

Julian Hamood, Founder and Chief Visionary Officer at TrustedTech | On LinkedIn: https://www.linkedin.com/in/julian-hamood/

Are you interested in telling your story?
▶︎ Full Length Brand Story: https://www.studioc60.com/content-creation#full
▶︎ Spotlight Brand Story: https://www.studioc60.com/content-creation#spotlight
▶︎ Highlight Brand Story: https://www.studioc60.com/content-creation#highlight

Keywords: sean martin, julian hamood, trusted tech, ai readiness, data governance, ai security, enterprise ai, brand story, brand marketing, marketing podcast, brand story podcast, brand spotlight

Episode Transcription

AI Adoption Without Readiness: When AI Ambition Collides With Data Reality | A TrustedTech Brand Story Conversation with Julian Hamood, Founder and Chief Visionary Officer at TrustedTech
 

​[00:00:00]  
 

[00:00:22] Sean Martin: Hello everybody. You're very welcome to a new brand story here on ITSPmagazine. This is Sean Martin, where I get to hear all kinds of cool things being built to solve some super complex problems, uh, in the world of technology. And, uh, oftentimes, given my background and past in the world of cybersecurity and privacy, and, yeah, data has been called the new oil. 
 

And, uh, certainly with the evolution of AI and, uh, everybody wanting to get their hands on all their data to do cool things with AI, [00:01:00] that's never been more true than now. And with new capabilities come new challenges, and, uh, specifically with AI and data and systems and all that, a world of complexity that can make things difficult for IT and security teams to keep up with. 
 

And I'm thrilled to have Julian Hamood on today. We're gonna talk about all that stuff and probably a bit more. Julian, how are you? 
 

[00:01:25] Julian Hamood: Sean, thank you for having me, uh, doing well and really excited for our conversation. 
 

[00:01:29] Sean Martin: Yeah, absolutely. And you, uh, you are the founder and also, I love this title, the Chief Visionary Officer, 
 

[00:01:36] Julian Hamood: Yeah. Yeah. It's, uh. 
 

[00:01:38] Sean Martin: at TrustedTech. Uh, trying to steer the ship. So, I mean, it's cool to be visionary, right? And I'm sure you get to engage with a lot of organizations that are trying to do that as well. 
 

Uh, tell me about what that role means to you at TrustedTech, and maybe a little [00:02:00] background on how you, uh, how you landed in that role. And then we'll talk about the forming of the company. 
 

[00:02:05] Julian Hamood: Yeah, definitely. Um, again, with, uh, our rebrand that we kicked off earlier this year, um, and strategically, with how much technology has advanced, especially as it relates to, you know, the ChatGPT LLMs kicking off three years ago, uh, we really believe that the future of the company is not just about continuing to bring in more revenue, but really looking for niche holes in regards to what's going on. 
 

Not necessarily the AI practice as a whole, but also there are tons of gaps in data, tons of gaps with AI agents, tons of misinformation, and issues with security. So again, uh, Chief Visionary Officer: pretty much look through the gaps and recognize opportunities, uh, not just for TrustedTech, but for our clients as well. 
 

Um, and in regards to the founding of the company, um, I founded the company nine years ago now, uh, going into our 10th year of operation next year. And, um, really, [00:03:00] uh, I found a niche within some of the licensing, uh, that Microsoft was pushing out. I realized no one nationally or even globally had an understanding of how to license Microsoft products correctly. 
 

Um, and I kicked off the company with three or four individuals. We ended up doing $5 million in revenue the first year and really just compounded into, you know, what we have now at a $500 million company, really focused on Microsoft, Microsoft offerings and services, um, and managed services. 
 

[00:03:27] Sean Martin: Yeah, just those figures alone make me think: how much money are you saving other companies to generate that money yourself? And, yeah, that just speaks to the complexity of that whole ecosystem, right? 
 

[00:03:41] Julian Hamood: Yeah, yeah, absolutely. Uh, first line of business, and we have gotten away from this, is we don't wanna be a cost house. Uh, we don't wanna just, you know, save people money, but along the process, if it's not up front with the licensing, it's professional service engagements, it's the managed service practice. 
 

It's a third-party [00:04:00] offering. Um, and especially as it relates to, you know, Copilot and AI readiness, uh, we really look to put a package together that does eventually, um, you know, draw down that bottom line for our clients. 
 

[00:04:10] Sean Martin: Yeah. And make things more efficient and effective. And, um, I believe that's, well, you'll tell me, but I'm gonna guess that that's part of the reason why, uh, you've been recognized by Microsoft. 
 

[00:04:22] Julian Hamood: Yeah. Yeah. So, um, again, going to the rebrand and maybe digging in a little bit more, uh, to the Microsoft managed piece, which is less than, uh, 1% of partners globally are managed by what Microsoft calls a product, uh, or partner development manager. But yeah, simply put, the shift from Trusted Tech Team to TrustedTech. 
 

It wasn't just a cosmetic rebrand, it was really a reflection of our maturity as a company. And while Team spoke to our roots in really getting our hands dirty with hands-on support, we realized we'd really outgrown the name. Uh, so again, we're not just an extension of the help desk anymore, um, we're, uh, architects for complex global enterprises, and that ranges from, you know, 200 employees all the way up to 23,000 employees. 
 

And we needed a brand that stood toe to toe with the C-suite executives we advise every day. That's exactly why we're in that highest-tier Microsoft partnership. And we work with a PDM, um, you know, weekly. And honestly, we're incredibly proud of it. Uh, but for our clients, it's not just a badge on our website. 
 

It means, uh, we have direct access to Microsoft engineering that, you know, standard partners just don't have. And it really validates that when we deploy a solution, it's done at the highest global standard. And, you know, it's taken a lot of time to get up to that expectation, for not only myself but the team that we have in place now, and of course, Microsoft's expectations. 
 

So yeah, that momentum is real. And we're taking that model into the UAE now. Um, we've officially expanded into Dubai after our successful launch into, uh, the UK and EU markets almost three years ago. And we're seeing that, you know, the problems we solve, uh, whether it's complexity, security, [00:06:00] data chaos, they're universal, and we're really ready to tackle them. 
 

[00:06:03] Sean Martin: Yeah. So many questions, and, um, I'll hold on to those, uh, before we get in. Um, but I guess, I don't want to paint too much of a barrier around what you do, but maybe a couple examples of your most common engagements, some of the projects you do, just to kind of set the stage. Because I can go to Office deployments, and that may or may not be it. 
 

Go 
 

[00:06:28] Julian Hamood: Yeah, we still do, you know, the Office deployments, the mail migrations, all the things that are very, very common still with, uh, Microsoft partners. But really we've, uh, evolved our offering, and the majority of the offerings, it's AI agent readiness, it's agentic AI readiness, it's Copilot readiness, uh, it's data readiness. 
 

And again, it's, you know, a project that can span from $5,000, let's tell you what's going on with the environment, to, if you're gonna spend $500,000 on a complete overhaul, so all your data's talking together and your externally [00:07:00] facing application that you send out to 200,000, a quarter million customers can really have, uh, the next phase of evolution as it relates to hooking up to different LLMs or AI or, you know, any type of agentic AI customers are trying. 
 

And it's such a wide array of, you know, what we're seeing, even as it relates to AI app development. But yeah, it's wide. I would say we used to be an inch deep and a mile wide, and now we're a mile deep and a mile wide. So it's been a lot. But, um, every customer, every engagement's different, and we're learning a ton as we go, similar to every other company. 
 

[00:07:35] Sean Martin: Yeah, absolutely. Uh, it makes me wanna be there with you to completely explore all these things and... 
 

[00:07:41] Julian Hamood: Sean. It's challenging. 
 

[00:07:42] Sean Martin: It's, uh, I'm a nerd that way. I like those challenges. Um, but speaking of which, I mean, talk to me about what readiness means, um, because in the past year or so I've heard a lot about resilience, that everybody's focused on resilience. 
 

There's even a lot of [00:08:00] talk about sovereignty as well. Um, but kind of the other, the other part of it is to actually be ready to do this stuff in the first place. So tell me what that looks like. 
 

[00:08:10] Julian Hamood: Yeah. And we compare this to AI hype versus the reality of what's really going on. And, uh, the reality that I'm seeing on the ground is everyone wants the magic of AI, but few are really ready for the maintenance of it. And a good analogy for this is the iceberg effect. So everyone sees the shiny chat interface above the water. 
 

They ignore the massive unstructured data mess beneath the surface. And if you turn something on, let's just use a Microsoft product for example, Microsoft Copilot, if you turn it on without cleaning up your data, you're essentially supercharging the chaos. And we're not just talking about Microsoft Copilot for Office applications to help you, you know, formulate cells within Excel, but we're talking about the entire organization utilizing a copilot to make decisions, financial analysis, HR, et cetera. 
 

So yeah, we see [00:09:00] uncontrolled data sprawl and, again, in this Copilot example, files living in personal OneDrive, old SharePoint sites, legacy servers. Uh, the risk here is really oversharing, um, just at the surface level. Uh, suddenly an AI tool like Copilot can surface sensitive HR data or even executive financial records, for example, to a junior employee just because they asked the right questions and the permissions weren't locked down. 
 

So we've always had the same message: practical readiness has to come before the hype. And I say that, and I say that, and I say that, and it never happens. But, uh, you have to label your data, you have to classify it, and, uh, set internal boundaries and parameters. Um, so, you know, if you feed AI incomplete or conflicting data, I can confidently say it'll always give you the wrong answer. So again, we focus on getting the fuel right so the engine runs effectively. Um, but again, you hit on it at the onset of, uh, this meeting: the data needs to be the focus. And I hate that everyone's just saying that. [00:10:00] It used to be AI hype, and then data readiness hype. But that's where we're seeing the most engagement: let's look at the data. 
 

Let's look at, uh, our Azure environment. Let's look at our, uh, SQL environment that's hosted in Azure. Let's look at what we have on-prem. Let's look at our billing applications. Uh, let's look at our customer-facing applications and really integrate all that data together. And that's where probably 70, 80% of the work's being done right now; the other 20% is tying those systems together and having agentic AI do what you want it to do. 
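
To make the classification and permission-hygiene step concrete, here is a minimal sketch of the kind of pre-Copilot audit described above: walk a file inventory and flag anything whose sharing scope is broader than its sensitivity label should allow, before an AI assistant can surface it. The labels, file records, and allowed scopes are illustrative assumptions, not TrustedTech's actual tooling.

```python
# Illustrative sketch only: flag files whose sharing scope is broader than
# their sensitivity label should allow, before an AI assistant can surface them.
# The labels, records, and allowed scopes below are hypothetical examples.
from dataclasses import dataclass

# Broadest audiences each sensitivity label should tolerate (assumed policy).
ALLOWED_SCOPE = {
    "public": {"anyone"},
    "internal": {"anyone", "org"},
    "confidential-hr": {"hr-team"},
    "confidential-finance": {"finance-team", "executives"},
}

@dataclass
class FileRecord:
    path: str
    label: str        # sensitivity classification applied during data readiness
    shared_with: set  # groups that currently have access

def oversharing_report(files):
    """Return files an AI assistant could surface to audiences outside policy."""
    findings = []
    for f in files:
        allowed = ALLOWED_SCOPE.get(f.label, set())
        too_broad = f.shared_with - allowed
        if too_broad:
            findings.append((f.path, f.label, sorted(too_broad)))
    return findings

# Hypothetical inventory: one HR file shared with the whole organization.
inventory = [
    FileRecord("hr/salaries_2025.xlsx", "confidential-hr", {"hr-team", "org"}),
    FileRecord("marketing/brand_deck.pptx", "internal", {"org"}),
]

for path, label, groups in oversharing_report(inventory):
    print(f"OVERSHARED: {path} ({label}) visible to {groups}")
```

A real engagement would pull the inventory, labels, and sharing data from the actual content and identity systems rather than a hard-coded list.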
 

[00:10:29] Sean Martin: Yeah, because, especially when you have a user interface like a ChatGPT or a Claude or whatever, it's so easy to do something. I mean, you can even vibe code and get stuff done. And maybe your perspective on this, 'cause what I can envision is a pilot project or a proof of concept or some skunkworks thing where I just wanna see if it's possible to do this with my HR, to [00:11:00] do this with my marketing, to do this with my sales, whatever it is. 
 

And then, oh, by the way, it, it looks pretty cool. Let's just let it fly. 
 

[00:11:08] Julian Hamood: Yeah. Yeah. 
 

[00:11:09] Sean Martin: So to your point, data's being surfaced and maybe being exposed to people who have now been given access to that app or that agent service, but who shouldn't have access to the data that service has. So what are some other examples of things like that happening that you're... 
 

[00:11:28] Julian Hamood: Uh, again, without digging in, we have dozens of examples that have surfaced from our clients reaching out and saying, yeah, we just had massive exposure, or, the data that we used to make a huge financial decision or a budget decision was absolutely incorrect. So again, whether it's tying it to an ERP system... 
 

We've had examples of customers tying AI agents into the financials, something like NetSuite, publishing information and goals and markers for the sales team, and that being completely inaccurate. We had, again, I won't give any [00:12:00] specific examples, but an organizational goal for a specific department was set at 13%, and this was based on NetSuite, uh, conducting the consolidation of the financials and publishing it to a director-level individual: a 13% goal. 
 

When they hit the 13% goal, it ended up being 4.3% of what the actuals were. So again, lack of data readiness, lack of verifying the output for what's going on. And we just have example after example after example. But again, I could point out finance, I could point out HR, uh, I could point out, um, IT having massive issues with what's going on right now. 
 

And we've had clients that are like, okay, perfect, we have worked with you guys, uh, we're ready to go ahead and push out the AI. There are still, you know, security concerns. There's still, you know, inaccurate reporting. So it's a work in progress, and customers need to be ready to not only put six months into this; it's ongoing. 
 

It's never gonna stop. You need to, when you're constructing an AI strategy, you need to put this permanently on the roadmap, uh, for the company [00:13:00] and really be hiring around it. That's what we've done here. 
 

[00:13:02] Sean Martin: Yeah, we'll talk about that as well. But I wanna touch on the ongoing piece, and you mentioned maintenance earlier. Um, yeah, training models and setting this stuff up once can be a bear, and expensive, but it's super important to maintain it, right? So you're not using old data. Um, so how does that look? 
 

Do companies actually build in the process? I mean, like updates, uh, syncing data and all that kind of stuff. 
 

[00:13:37] Julian Hamood: Yeah, we try to put, uh, parameters in place, uh, that ensure after the project is completed there are typical maintenance points. Uh, for example, uh, Salesforce: agentic AI targeting Salesforce directly to provide outputs for the C-suite. The only thing is you still have humans that are inputting data, and when humans are inputting data, even after massive, you know, data cleanup and a readiness [00:14:00] assessment and a completed project, the maintenance still needs to happen. 
 

Those data points will start moving after three months or four months or five months and start providing inaccurate responses. So yeah, again, when we're completing a project, whatever the size is, a $5,000 to $1.5 million project, we always have maintenance and checkpoints. And if the client wants to perform those maintenance checkpoints individually, absolutely, we will set them up for success. 
 

But again, we'll typically work with our clients on a move-forward basis. But it just depends on the scope and what the customer's looking for. But yeah, security parameters and also maintenance parameters are extremely important with everything going on right now. 
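
As a rough illustration of the maintenance checkpoints described here, the sketch below compares figures an AI agent has published against the system-of-record actuals and flags anything that has drifted beyond a tolerance. The metric names, numbers, and tolerance are hypothetical; the 13% versus 4.3% pair simply echoes the earlier example.

```python
# Illustrative sketch only: a recurring maintenance checkpoint that compares
# figures an AI agent has published against the system-of-record actuals and
# flags drift beyond a tolerance. Numbers and tolerance are hypothetical.

def drift_check(published: dict, actuals: dict, tolerance: float = 0.05):
    """Yield (metric, published, actual, relative_error) for metrics out of tolerance."""
    for metric, pub_value in published.items():
        actual = actuals.get(metric)
        if actual is None:
            # Metric disappeared from the source of record: treat as drift.
            yield metric, pub_value, None, float("inf")
            continue
        rel_error = abs(pub_value - actual) / max(abs(actual), 1e-9)
        if rel_error > tolerance:
            yield metric, pub_value, actual, rel_error

# Hypothetical example: the agent published a 13% growth goal; actuals say 4.3%.
published_by_agent = {"dept_growth_goal": 0.13}
system_of_record = {"dept_growth_goal": 0.043}

for metric, pub, act, err in drift_check(published_by_agent, system_of_record):
    print(f"CHECKPOINT FAILED: {metric} published={pub} actual={act} error={err:.0%}")
```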
 

[00:14:34] Sean Martin: What else about the environment can you tell me that potentially adds more complexity? I mean, we talked a lot about the data, and, I don't know, maybe different types of data. There are lakes and stores and... 
 

[00:14:51] Julian Hamood: Yeah, 
 

[00:14:51] Sean Martin: databases and all, all kinds of, but then. Certainly the infrastructure underneath, uh, the build environments [00:15:00] that the stuff is built on is changing. 
 

I'd mentioned vibe coding, so kind of paint a picture of what you're seeing there, um, up and down that stack, from hardware to, yeah, operating systems and services and cloud and apps and building on all of that. 
 

[00:15:18] Julian Hamood: Yeah. We rarely see environments set up correctly. Uh, and there's typically more than one director of IT, uh, or CTO that was involved in the structuring of the environment, and everyone has a preference. And I'll keep it, you know, 101, base level: we see split environments between AWS and Microsoft Azure probably 35, 40% of the time. 
 

And just by having to coordinate two different platforms, uh, we're muddying the waters already. And then you add a whole set of complexities for a lot of clients, regardless of employee size. Even 15,000-, 18,000-employee companies, they still have on-prem servers, um, hosted outside their office. 
 

They're paying a lot of money for it. But again, now you're just talking about a tertiary system that needs to be integrated in. [00:16:00] And then you talk about hosted applications, and then you talk about development work that's been conducted, uh, for in-house applications. And then you talk about an ERP, then you talk about a CRM, um, then they move from, uh, HubSpot to Salesforce and they're using Service Cloud, for example, to manage a lot of the support requests. 
 

Uh, and those are just maybe seven or eight different examples of how just the foundation of the environment can be extremely muddy. And even when you have a full picture painted, or if a CTO presents a document that is a review of the entire architecture, it's still really, really muddy, because they thought that all the data for, uh, client accessibility, uh, client data, uh, employee records was stored exclusively on AWS, but it was hooked up to ADP and it was on Azure 
 

as well. So, yeah, this has been the most interesting time for what we're seeing in the market, and we've been really well positioned going into even this year. But yeah, environment-wise, I hope that gives you enough [00:17:00] of a picture of the cluster that we're having to deal with here. 
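
One simple way to picture the blind spot Julian describes (data assumed to live only in AWS that also turns up in Azure or ADP) is an inventory pass that lists which platforms each dataset actually touches. Everything below, systems, platforms, and dataset names, is a hypothetical sketch rather than a real architecture review.

```python
# Illustrative sketch only: a system inventory that surfaces datasets living in
# more than one platform, the "we thought it was only in AWS" blind spot.
# All entries are hypothetical.
from collections import defaultdict

# Each tuple: (system, platform, datasets it holds)
inventory = [
    ("client-portal", "AWS", {"client_accounts", "client_data"}),
    ("payroll-sync", "ADP", {"employee_records"}),
    ("reporting-warehouse", "Azure SQL", {"client_data", "employee_records"}),
    ("crm", "Salesforce", {"client_accounts"}),
]

locations = defaultdict(set)
for system, platform, datasets in inventory:
    for dataset in datasets:
        locations[dataset].add(f"{platform}:{system}")

# Any dataset with more than one location is an integration and exposure point.
for dataset, places in sorted(locations.items()):
    if len(places) > 1:
        print(f"{dataset} lives in {len(places)} places: {sorted(places)}")
```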
 

[00:17:02] Sean Martin: And I'll use your iceberg example. That's probably just the tip. 
 

[00:17:05] Julian Hamood: Yeah. 
 

Just the tip. We're, uh, yeah, technology janitors, uh, at such a high scale. 
 

[00:17:11] Sean Martin: Yeah. So help me get a picture of how you and your team can help, 'cause I used to build products, and I used to beat the drum loudly that the requirements phase was the most important. You had to define what it was, who it's for, what it's intended to do, what you don't want it to do, how you're gonna verify all that stuff. 
 

And that included architecting the system and understanding the environments it can be deployed in and the teams that are gonna use it and all that stuff. So how do you help with some of that, and then kind of actually getting into the nitty-gritty of the architecture and everything else that goes on in there? 
 

[00:17:52] Julian Hamood: It just depends on really the scale, uh, of the company we're working with. Um, we feel like a private equity company at 4,000 employees is gonna be [00:18:00] completely different than a private company at 400 employees. So we really look to see what the IT coordinator, whoever it is, uh, typically it's gonna be a CTO or director of IT. 
 

Really the infrastructure that they have built. But it's identical to app development as well: you need to observe all aspects of what's in place and also look at the sensitive areas, which are exposure points they might not think of. And again, I think if we just relate this to the AI practice and what we're seeing in the marketplace. 
 

Again, I think you really hit it on the head here: you can look at, uh, you know, coding a website, building an application, or agentic AI, it's all identical. What do we wanna accomplish? What are the exposure points? What do we have now, and what is it gonna take to get there? It's the same four buckets that we see, pretty much, whether it's, again, a massive, multimillion-dollar AI project or we just wanna get our data clean. It's equal. 
 

[00:18:54] Sean Martin: And so you mentioned CIO, CTO, head of IT, um, [00:19:00] kind of the primary folks you work with. Um, my experience is that oftentimes there's a line-of-business owner saying, we wanna do this. And I was at a legal conference not too long ago where they were actually empowering attorneys to build stuff with AI. 
 

Um, so we're seeing end users actually do things that have... talk about the ultimate shadow IT stuff, right? So how do organizations kind of get a view of just what's going on in the first place? 
 

[00:19:33] Julian Hamood: Yeah. 
 

[00:19:33] Sean Martin: doing what? 
 

[00:19:34] Julian Hamood: Uh, Sean, really good point. Two topics to cover there. You have founder- or CEO-led initiatives as it relates to AI. The secondary aspect of that, we touched upon attorneys, uh, sampling different AI agents. Uh, n8n is a great tool for people to go and get day-to-day help from AI agents super cheap, and a non- 
 

technology-based individual can go and create different AI agents, including legal and attorneys, and we'll see [00:20:00] that quite often. That's a whole other mess, number one. Number two is the founder-led initiative as it relates to AI. And when it's founder-led, we have founder-led or we have CFO-led, and they're typically looking at the top and bottom line for any project that's going on. 
 

So typically we always like to look at the technology burden for each company and propose that. But again, the biggest obstacle, you know, when I was talking to CNBC, this was a question that they asked, is how do you really teeter in regards to, everyone wants to get to AI, but the founders and the CFOs, they don't wanna spend the money to actually get there. 
 

So again, it's a pushing and pulling match in regards to the proposition of, why can't we just get AI done now? You gotta go clean the backyard first. You gotta cut the grass, trim the trees, put a fence up before you can actually start looking at, uh, you know, the agentic piece. So yeah, those are, you know, two different issues that arise. 
 

And, you know, I always look at the negatives of the setup, but again, um, yeah, two different avenues there, and we could dig into either of those on individual projects and the [00:21:00] pros and cons of an attorney creating an AI agent to email all his clients confirming the meeting for the day, you know, the safety of that, versus a founder making a decision, hey, I'm looking to save $1.2 million on my internal team by cutting my HR staff and my IT staff down, but I don't wanna spend $300,000 on a project, and there are issues there as well. 
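
To show how small and innocuous this kind of shadow-IT automation can look, here is the attorney's "confirm today's meetings" agent written as plain Python rather than in a low-code tool like n8n. Every function and record below is a hypothetical stand-in; the point is that the workflow quietly reads the full calendar and client list, which is exactly the exposure formal IT never sees.

```python
# Illustrative sketch only: a "confirm today's meetings" automation a
# non-technical user might wire up in a low-code tool, written here as plain
# Python. get_todays_meetings and send_email are hypothetical stand-ins.
from datetime import date

def get_todays_meetings(calendar):
    """Stand-in for a calendar lookup: return meetings scheduled for today."""
    return [m for m in calendar if m["date"] == date.today().isoformat()]

def send_email(to, subject, body):
    """Stand-in for an email step; a real workflow would call a mail API."""
    print(f"To: {to}\nSubject: {subject}\n{body}\n")

calendar = [  # hypothetical data the agent can now read end to end
    {"date": date.today().isoformat(), "client": "client@example.com", "time": "10:00"},
]

for meeting in get_todays_meetings(calendar):
    send_email(
        to=meeting["client"],
        subject="Confirming today's meeting",
        body=f"Confirming our meeting today at {meeting['time']}.",
    )
```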
 

[00:21:21] Sean Martin: Yeah, so I guess, I would imagine most business leaders, including the executives, uh, probably have a decent understanding of ROI, right? What do they hope to get out of this by spending on it? Um, but then there's the reduction of risk with the investment, or the introduction of risk, however you wanna look at it. 
 

And I think that's a little less tangible for many, especially when you get down to the individual building something. So how do organizations kind of get their heads wrapped around the risk, the exposure that you [00:22:00] mentioned, um, that... 
 

[00:22:01] Julian Hamood: Yeah. And again, we have an ROI calculator when it comes to these larger projects that we're working on with clients where there is an investment. There's a $1.8 million investment, for example, for a client we work with, and they didn't break even until month 19. And we made very conservative assessments for them. 
 

That's one aspect there: you know, it's gonna take some time, and this AI hype, luckily, over the last, like, two months is starting to settle in a little bit. You had $340 billion or so of spend this year. Where's the revenue coming from? That's a big question, you know, to see where the market is teetering or whatnot. 
 

But, um, yeah, again, as it relates to, um, the safety parameters around it, we also have to bake that in as well. And that's tough to quantify for, uh, for the C-suite: hey, we're not only gonna perform an AI project, we're gonna accomplish your goal for you, but we also need to look at the security aspect of, like, okay, we need your data. 
 

When we're working with a client, you know, it's all hosted in Azure, it's gonna be [00:23:00] public. Um, so again, we always have a safety parameter. We never touch any customer's data. We're never, uh, exposing that data. But the company that we're working with, all the employees, all the C-suite, are now gonna have potential exposure to their own data. 
 

As a founder or as a CTO or CFO, you have very tight parameters in regards to that. But again, for AI to be successful, it needs a company's data to be held within an ecosystem. And yeah, so that's, you know, where the security training is a huge part of a lot of our professional service projects we're working on right now. 
 

Um, we would even have, you know, a 10% budget in regards to these projects where we can train their director of IT or their, you know, head of IT on the parameters that we can assist in setting up and how they can maintain that. But yeah, again, if we throw a round number out there, I would say about 10% of project spend is in regards to security parameters at minimum. 
 

And that could scale up to about 30, 35% of the total budget for, uh, planned expenditure. 
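
For a sense of the break-even math behind the ROI calculator Julian mentions, here is a minimal sketch. Only the $1.8 million investment and the roughly month-19 break-even come from the conversation; the monthly benefit and running-cost figures are assumed for illustration.

```python
# Illustrative sketch only: the shape of a break-even calculation like the ROI
# calculator described above. Monthly benefit and running costs are assumed;
# only the $1.8M investment and ~month-19 break-even echo the conversation.

def breakeven_month(investment: float, monthly_benefit: float,
                    monthly_running_cost: float, horizon_months: int = 60):
    """Return the first month where cumulative net benefit covers the investment."""
    cumulative = 0.0
    for month in range(1, horizon_months + 1):
        cumulative += monthly_benefit - monthly_running_cost
        if cumulative >= investment:
            return month
    return None  # never breaks even within the horizon

month = breakeven_month(
    investment=1_800_000,        # project spend
    monthly_benefit=120_000,     # assumed savings/revenue lift per month
    monthly_running_cost=25_000, # assumed maintenance + security oversight
)
print(f"Break-even at month {month}")  # -> month 19 under these assumptions
```

Under these assumptions the net benefit of $95,000 a month covers the $1.8 million spend in the nineteenth month; making the assumptions more conservative, as Julian's team did, pushes the break-even further out.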
 

[00:23:55] Sean Martin: Yeah, interesting. What other [00:24:00] functions or roles end up being needed that organizations may not recognize up front? 
 

[00:24:08] Julian Hamood: CISO is probably the most common, uh, that's PE, uh, private equity, and non-private equity. Um, having a security officer in place seems to be the most common role that we're seeing, uh, get brought in mid-project. Uh, so a project is kicked off with a CTO or a director of IT, head of IT, and a CISO is introduced. 
 

So we are actually loving the, uh, environment shift for companies to be able to bring in a CISO. But again, you know, a 300-, 400-, 500-employee company, that's an expense that some might not want to bring on. But when we're looking at the 4,000-, 5,000-, 6,000-employee companies, or the 15,000, or the 22,000, 23,000, uh, we're commonly working with a CISO, and the CISO's hiring a security team underneath them. 
 

Again, most commonly that's not the case with the majority of the customers we work with. But again, yeah, you're seeing CISOs, you're seeing, uh, different CTOs being brought in with a security background. Um, but yeah, I would [00:25:00] say those two roles. Um, most common is founder, COO, and CTO, with an extension, uh, to a, uh, security officer as well. 
 

[00:25:08] Sean Martin: Got it. Got it. And I wanna touch on one scenario that, uh, I have a feeling we'll continue to see more of: um, acquisitions, mergers, those types of things. Talk about two houses coming together, with two different infrastructures and ecosystems and teams and cultures and all kinds of fun stuff that makes those connections and integrations, uh, fun, we'll say. 
 

Um, so what about some of those things that you've worked on? Any unique stories to share there? 
 

[00:25:42] Julian Hamood: Yeah, and I know we were talking about the, uh, the two dirty houses analogy right before we, uh, were on here. So yeah, this is really where the rubber meets the road, especially in M&A. Um, and again, I always like to use the two dirty houses analogy. And the analogy there is: imagine getting married, you [00:26:00] move in together. 
 

You both have messy houses: attics full of junk, closets completely disorganized, appliances, washer, dryer, everything just broken. Um, you can't just dump the contents of house B into house A and expect a happy home. And I imagine I probably first said that about a year ago, and I said it to a client live: you end up with two dirty houses inside of one address. 
 

So this is really exactly what we see and what modern M&A looks like technically. You have two different, uh, identity systems colliding. You have completely separate security protocols, and then you have shadow IT that nobody ever knew existed until the merger happened. It was never in DD. 
 

Uh, so yeah, our job is to really just go in and clean the house, and we provide the visibility to say, you know, keep this, archive that, secure this. And I'm keeping this very simple, but, uh, the lesson we've learned... 
 

[00:26:55] Sean Martin: That's your job. Simplify it. 
 

[00:26:57] Julian Hamood: Security is really an enabler. Uh, and when you clean up that [00:27:00] infrastructure and standardize it, you aren't just safer, you're faster, and you really create a foundation where the new, combined company can actually scale rather than tripping over its own, you know, let's say, digital cluster for years: 24 months, 36 months, 48 months. 
 

Uh, that's not uncommon. 
 

[00:27:17] Sean Martin: Yeah. So I know you've moved away from a cost-savings, uh, model to, uh, an enablement model. You talked about scaling, you talked about performance, you talked about, uh, delivering. Uh, what do some of your clients say? What do they experience after working with you? What are some of the outcomes? 
 

[00:27:37] Julian Hamood: Yeah. Uh, when we work with a client, we want to enable them to be able to handle their environment by themselves. We don't want to set ourselves up to be anchored in with a client. And, uh, based on a successful project, a few things are accomplished. Number one, the environment is stable, the data's clean. 
 

Number two, they have a team in place that can handle things moving [00:28:00] forward. And number three, again, they know what the next objective is for growth. So whether we're involved in that aspect or not, and again, uh, clients that work with us on one project are 61% likely to work with us on a future project based on the data that we've seen in the last 36 months. 
 

But again, we wanna set our customers up for success and to independently handle everything, uh, that's structured, whether it's, again, uh, AI agents, same thing, app development, um, security protocols, uh, utilizing Microsoft Copilot for Security independently. We wanna set customers up for success. Uh, so yeah, if... 
 

If our clients say, you know, TrustedTech, you did an amazing job, it's because we set them up for success moving forward. 
 

[00:28:41] Sean Martin: And an example of success? You don't have to name names unless you can, but, uh, I don't know, any cool project that was delivered that actually really cleaned house and connected data to systems and produced some really cool results? 
 

[00:28:59] Julian Hamood: [00:29:00] Yeah, we've done, uh, AI agents that publish infographics for the sales team, at the very, very base level. We've done AI projects that build a deal desk for the sales team. We've done, of course, the chat automation, uh, the triage automation. Uh, really, really cool. We've done, uh, AI agents that actually speak to you on the phone. 
 

And Sean, I'm telling you, you could not tell that they're AI. They have a little, either UK accent, Australian accent, or even, like, just a deeper English accent, and it's completely, you know, blind. There's no way you could tell. They breathe, they're talking, they're feeling emotion. We've done projects successfully on that, but all aspects of the business: HR automations that we've put into place. 
 

Um, you know, utilization of SharePoint docs to publish financial information using, uh, agentic AI, where the AI is actually making decisions on financial projections and forecasts, and even publishing goals and expectations for the next year, pulling in the market [00:30:00] data for that industry. Um, yeah, we've done some really cool stuff. 
 

And again, we take each project independently because we don't really see the same project more than once. Um, but we're learning, and, you know, thousands and thousands of projects, we're learning as we go. Um, and again, we just have the best resources, you know, available. We've pulled people from Microsoft directly, we've pulled from, you know, SHI, CDW, uh, different technology vendors. 
 

Um, and again, a lot of this is just we're teaching ourselves and we're working with our clients, and we're continuing to progress. Um, so I would say we have some of the best, if not the best, uh, domestic, uh, AI expertise in-house here, and the UK market is advancing. I would say we're probably number two or three in the UK and EU market as well right now in regards to the quality and quantity of AI projects and the complexity of projects we're working on. 
 

[00:30:47] Sean Martin: Let's close with this, Julian, 'cause I think there's tremendous value in that, right? Because if you're sitting in an organization, you know your organization, but you may not know AI and what's possible and certainly what all the [00:31:00] stumbling blocks might be. And that's where you and your team have the knowledge. 
 

And because you've done thousands of projects, you know different businesses as well: tech stacks, business models, desired outcomes, the capabilities of the team or not, maturity for security, that kind of stuff. So, final word: the value of all the stuff you've seen, your visionary, uh, it's in your title, 
 

your visionary, uh, view of what's to come, and the value of all that for an organization that wants to tackle, uh, AI. 
 

[00:31:36] Julian Hamood: Yeah, the best advice I can give, even if it's not TrustedTech, is seek an advisor that knows what they're doing in the industry, because everything is changing so rapidly and everyone has a bias in regards to what they think is right. And again, we follow Microsoft guidelines and expectations for every project we're working on, but you're gonna have an industry or a sector or a client where, you know, there are [00:32:00] no guidelines, there are no expectations, it's never been done before. 
 

So again, whether it's TrustedTech or whether it's a partner you really trust, just get advice. For a client coming to us, it's free; we don't charge for, you know, an assessment to analyze the environment. Um, but yeah, that's the best advice I can give. Don't do it by yourself. Don't trust your CTO independently. 
 

They're gonna have the most knowledge of what you guys need, but there are gaps that every individual, including myself, has when we're analyzing an environment, analyzing an industry, or analyzing a client. So, yeah, best piece of advice: seek help, trusted help. Not the mom-and-pop shop down the road that says, automate your Google Calendar, your Outlook. 
 

That's not, that's not what we're looking at here. So, 
 

[00:32:41] Sean Martin: And I'll close with, uh, it's a trusted partnership, uh, that you're looking for. And what I'm hearing there is: you might know a lot, they might know a lot, but it's the coming together that uncovers what you don't know and how you move through that, because you're [00:33:00] gonna uncover stuff you haven't seen before either. 
 

And it's your experience and how to deal with those things, uh, that's gonna be super valuable, I think. So... 
 

[00:33:08] Julian Hamood: Yeah. End of 2025, it's time for all hands on deck. Uh, this isn't going away, uh, and a lot of people are falling behind. So yeah, seek a trusted advisor and continue pushing this forward. 
 

[00:33:17] Sean Martin: Very good. Well, Julian, fantastic, uh, chatting with you. Thanks for all the insights into what you're seeing, and it's exciting times for sure. And, uh, everybody listening and watching, connect with Julian and, uh, TrustedTech and the whole crew there. And, uh, yeah, I'll link to Julian's LinkedIn, and of course we'll include links to the website so you can get in touch with the team. 
 

And, uh, stay tuned for more brand stories here on ITSPmagazine. Thanks, everybody.  
 

​[00:34:00]