How to get started in AI for your Business, Expert AI Strategies - Episode 012

Show notes

Join us as we dive into the exciting world of AI strategy with Svetlana, a seasoned professional in digital innovation and strategic initiatives across multiple industries. In this episode, Svetlana shares her extensive experience and insights into implementing AI solutions that drive new business value and revenue.

Discover:
- What AI strategy entails and how to implement it effectively
- The importance of creating new value and enhancing existing products through AI
- Insights into Svetlana’s background and her pioneering role in digital strategy

Stay tuned till the end for an exclusive surprise from Edgar that could benefit your own AI journey! If you have specific areas you'd like us to explore further or questions for Svetlana, don't hesitate to drop a comment below or reach out for a dedicated follow-up episode.

Show transcript

00:00:00: Hello and welcome to this week's episode of the AI Boardroom.

00:00:05: And today we have a really special expert, Svetlana.

00:00:11: You're the expert actually.

00:00:14: Tell the people about what we're doing.

00:00:17: Yeah, I think what we wanted to dive in just to give you guys a little glimpse as

00:00:22: to what AI strategy is and how to implement it, what it means to actually

00:00:27: bring solutions to

00:00:30: kind of fruition, right, in your enterprise.

00:00:33: So, Edgar's gonna do some talking.

00:00:35: We'll kind of do a little bit of talking.

00:00:37: But the idea, and again, it's a limited amount of time, so we are gonna scratch

00:00:42: the surface.

00:00:42: And if there's any particular parts that you wanna dive deeper on, we're happy to

00:00:48: do a separate episode.

00:00:49: And I think if you're staying until the end, there's a little bit of a surprise

00:00:52: that I think Edgar is gonna offer.

00:00:54: So, I hope you guys stay till the end to find out what it is.

00:01:01: Yeah, thank you very much.

00:01:03: So Svetlana, why are you an expert in strategy?

00:01:06: Tell people a bit about your background.

00:01:09: Yeah, so I've been working in digital for over a decade.

00:01:16: So building products, supporting strategic initiatives across different

00:01:21: organizations.

00:01:22: And it's always been kind of in the innovation front.

00:01:25: So the types of products or projects I've been leading

00:01:29: have been at new portfolio levels or new initiatives that, you know, the

00:01:34: organization was either trialing, or where

00:01:36: there's not a lot of research or maturity in that space.

00:01:40: So I was kind of initially recruited as one of the first to kind of lead those

00:01:45: efforts and that hasn't changed even when I entered healthcare.

00:01:48: So I do have a full-time job where I continue to lead these innovative efforts

00:01:53: and the type of projects that I continue to work on and create strategies,

00:01:58: whether it's at the portfolio level or at the individual product level, they tend to

00:02:02: be very new.

00:02:04: And it's exciting to me because it's things that create new business value.

00:02:09: It's the things that drive additional revenue or drive additional value to the

00:02:12: organization in new ways.

00:02:15: And I know that there's a lot of talk and we spoke with Edgar quite a lot about even

00:02:21: in our past episodes on the news covering...

00:02:26: You know, the problems with cheat sheets and, you know, rolling out and talking

00:02:31: about prompt guides and all kinds of things.

00:02:34: Yes, they streamline operations, but they're not creating that new value for

00:02:38: your business.

00:02:38: And so I think that's what I do.

00:02:41: And I, you know, have an understanding of the business.

00:02:45: You know, I'm a trained MBA.

00:02:47: I went to a top 15 school for my MBA, paid a lot of money

00:02:52: for my degree as we kind of discussed with Edgar.

00:02:57: And I'm also trained in workshops for opportunity mapping, you know,

00:03:05: specifically in the AI space.

00:03:07: I have been a practitioner in developing AI solutions for healthcare.

00:03:13: And then I've been working with other companies to help ideate solutions within

00:03:17: their business, how to bring those and scale solutions across different

00:03:21: industries.

00:03:22: Construction, finance, really, I mean, it's not limited to healthcare.

00:03:26: I think it's this innovative aspect of it that excites me most.

00:03:30: It's like, how do you bring more and new value to the organization and how do you

00:03:34: have the strategic vision into the future?

00:03:36: Because AI, to be frank, it does take a little bit of time to mature and to

00:03:40: actually get value once the solution matures and scales.

00:03:46: So how do you have that forward thinking?

00:03:49: to bring that new value to your organization.

00:03:52: What is the interpretation of new value versus more value?

00:03:59: More value, I think, is meant to elevate existing stuff.

00:04:05: And new value would be going down a new path.

00:04:10: How would we understand that?

00:04:12: Yeah, there's a few ways that you could define new value.

00:04:16: So one is, let's say you do have legacy products.

00:04:19: And for example, they have been built traditionally and they haven't been

00:04:23: improved over a long time just because technology was the limiting factor.

00:04:27: And so you can create new value by improving or rethinking that whole

00:04:32: workflow with, for example, AI.

00:04:35: So AI kind of opens up a new world of possibilities, new world of efficiencies.

00:04:40: So sometimes it's time to create or rethink that whole architecture.

00:04:45: And again, I have experience rebuilding tools that were...

00:04:48: built with legacy software, and then you're rethinking it

00:04:53: completely with AI using data and things that people were kind of seeing as like

00:04:58: impossible before.

00:05:00: And they were doing, you know, parts of that solution providing value to different

00:05:05: parts of the organization.

00:05:07: Now it's possible to create more new value for those legacy, like what used to be

00:05:13: legacy products, but now AI can take that to...

00:05:17: a whole net new scale and provide that value to the rest of the organization that

00:05:22: wasn't possible before.

00:05:24: You can also create new value in thinking about possibilities that you just didn't

00:05:31: think could be done before.

00:05:34: So when you create new value, and these are also some of the projects I'm very

00:05:39: excited about doing, is looking at white space opportunities.

00:05:43: So white space opportunities basically means that...

00:05:45: there maybe has been some research or ideation happening, but there are no

00:05:49: solutions in the space that have successfully tackled that problem.

00:05:55: So it requires some level of strategic thinking in order to find

00:06:01: solutions for those spaces.

00:06:03: Because if it was easy, people would have figured it out and it would not be a white

00:06:08: space opportunity.

00:06:09: But you can create new value where no solutions exist and they're now made

00:06:14: possible with AI.

00:06:16: And they do tend to be a little bit more complex and complicated, and require a

00:06:19: little bit more thinking and maybe potentially budget.

00:06:22: So the types of solutions that are released in that space tend to come from

00:06:25: larger organizations and enterprises.

00:06:28: But that's really, you know, a huge competitive moat for the organization when

00:06:32: you're putting your data to use in more unique and innovative ways, because that's

00:06:36: IP that you could potentially sell too.

00:06:39: So again, those types of solutions and products really excite me.

00:06:44: because again, the potential is limitless.

00:06:47: Interesting.

00:06:48: And so you said you've been doing this for a long time.

00:06:51: I think AI wasn't always involved back in the day.

00:06:56: What makes AI special or AI projects in specific?

00:07:03: So traditional products, digital products go through a maturity cycle.

00:07:09: So...

00:07:10: just think about not even digital, but flash disks, right?

00:07:14: So flash drives, you barely see those around anymore, right?

00:07:20: Because they do follow this kind of bell curve type of a maturity: they rise to

00:07:28: the top and then they reach the peak of like the highest demand and then they

00:07:33: decline in demand just because there's not much revamp that is happening and not much

00:07:39: improvement that you can do

00:07:40: beyond what that product can actually handle.

00:07:43: And that's true of digital products.

00:07:46: Google, you could even argue, is going through that phase right now where they've

00:07:51: improved it and optimized it to a certain point.

00:07:54: And now a new emerging solution came in, which is doing solutioning for that space

00:07:59: in a completely different way.

00:08:01: And it's being disrupted.

00:08:02: So you can also position it as kind of going down the decline.

00:08:07: For AI...

00:08:09: products, what's great about those solutions is that user feedback kind of

00:08:17: drives the improvement.

00:08:18: So they call this the flywheel effect.

00:08:21: So as you're putting in more data, you're providing a better user experience, better

00:08:27: recommendations.

00:08:28: So think Netflix.

00:08:30: And why people continue to come back to it is because of those really...

00:08:35: attention-grabbing headlines or attention-grabbing thumbnails, people want to see

00:08:39: more because Netflix gets me.

00:08:42: Netflix understands what I want.

00:08:45: And because users experience such a great user experience, they recruit more users,

00:08:50: which drives more data and it kind of continues going up in that flywheel effect

00:08:55: again, until that product potentially gets disrupted.

00:08:58: But the

00:09:01: great thing about these solutions is this:

00:09:03: There's really no limit for how well that system, for how much that flywheel can get

00:09:10: better and better because the more data, the more users and the scalability of AI

00:09:16: solutions is limitless.

00:09:17: Again, the only limitation is the compute behind it.

00:09:21: I think at some point, improvements get so marginal that you kind

00:09:29: of...

00:09:30: It might be limitless in theory, but at some point the improvements get so

00:09:37: small that they are not really benefiting the whole process anymore.

00:09:43: But if you think about also the scale, so when the traditional products kind of

00:09:49: provide a generic experience, right?

00:09:52: So you can't do the same level of personalization that you can with these AI

00:09:58: systems.

00:09:59: So...

00:09:59: I would argue that the improvements are done at a user level, right?

00:10:03: So like you can get that experience fine tuned to that exact user over time and our

00:10:08: preferences change, right?

00:10:09: So, and then the beauty of it is that the systems are reactive and then they're

00:10:13: going to adapt to those changes as the user adapts in their kind of cycle.

00:10:18: So that's another difference: AI can handle, yes, more data, more users and

00:10:24: things like that.

00:10:24: So generally the product itself will continue to improve.

00:10:28: But the level of personalization and how customized that experience for each

00:10:32: individual user is, is a huge differentiating factor for what AI systems

00:10:39: can do at that scale.

00:10:41: So you can actually personalize every experience.

00:10:44: That's what Netflix does.

00:10:45: Your experience would be completely different from mine and the types of

00:10:49: thumbnails I see would be different from yours.

00:10:52: I've seen an interview of the CEO of

00:11:00: Nvidia.

00:11:00: He was at Stanford because I think he graduated there back in the days and then

00:11:06: he built Nvidia and up to this day he is CEO there.

00:11:11: So, and he was talking about generative AI and also like huge performance

00:11:16: improvements that are going to come in the next like four, five to six years.

00:11:21: And he was saying something really interesting.

00:11:26: He said like everything,

00:11:27: we had up to this point was kind of pre-recorded, even, like, video,

00:11:32: audio, of course, but also if you have an app, the UX is also pre-recorded,

00:11:42: kind of, because like you put in the ways to go through the app beforehand and

00:11:50: they're kind of fixed.

00:11:51: You might have branches, but they are limited.

00:11:55: And rightfully so.

00:11:57: And in the future, we will have basically everything generated on the fly, which I

00:12:04: found interesting.

00:12:06: I mean, the dynamic experiences, right?

00:12:08: So like, it doesn't have to be static.

00:12:10: And I think, again, I don't remember what user experience I

00:12:17: was seeing, but like, why not?

00:12:19: Why not, you know, adjust the experience of the app based on what you're

00:12:26: kind of experiencing?

00:12:27: So maybe the future even of iPhones, like, why do you have to see a home screen full

00:12:30: of like these busy apps?

00:12:32: Why can't it anticipate, based on your schedule or what you have going on in that

00:12:36: day and, you know, bring it to you on a platter, like, here are some options of what

00:12:41: you can do now and it becomes much more efficient.

00:12:44: So maybe the future of like these operating systems becomes much more

00:12:48: contextualized to what you have going on.

00:12:50: I think we are close to the first step of this.

00:12:54: I'm pretty sure

00:12:55: that this year's developments will be a huge step toward our mobile devices becoming

00:13:02: personal assistants.

00:13:05: We'll see.

00:13:06: Apple is releasing their new operating systems and maybe a new version of the

00:13:10: iOS.

00:13:12: Yeah, yeah, like the new Android is coming out, it's like right around the corner.

00:13:16: And in June, we have the Apple Developer Conference.

00:13:20: So yeah, everything's close.

00:13:22: I'm really curious what this will bring.

00:13:25: Well, yeah, let's say I'm a new company.

00:13:30: Or an old company.

00:13:32: That's better.

00:13:34: I have my processes, but I'm seeing my competitors, which like we've seen in the

00:13:40: last news episode, is that 79% of managers think that they have to integrate AI to

00:13:49: stay competitive.

00:13:51: And I highly agree.

00:13:52: because the efficiency gains you can get are so high that your value output

00:14:02: grows really, not exponentially, but at a really high rate after implementing AI.

00:14:12: So either you save costs and you can be more price competitive, or you

00:14:17: create more value and can just take higher prices.

00:14:21: or you can just create more, like handle more load to sell even more.

00:14:26: So there is a lot of opportunity in there.

00:14:31: And, like I always say, someone out there will do it.

00:14:34: So go ahead and actually do it.

00:14:40: Now, I'm completely non-technical, but I know I need at least to evaluate how I

00:14:48: could use AI to elevate my business.

00:14:51: Where do I start?

00:14:53: What are typical use cases in businesses that might kind of be a blueprint to start

00:15:01: from?

00:15:02: I would say maybe three different steps I would recommend as starting points.

00:15:07: One is understanding what AI can do.

00:15:12: So I'm not talking about the history of AI or how all of these AI ML...

00:15:19: you know, deep learning techniques actually work.

00:15:21: Just trying to understand at a high level, elevate yourself to understand what AI

00:15:27: can do, to be able to, again, translate that into your business.

00:15:33: Where do those capabilities fit? What could you utilize across your different business

00:15:40: functions in order to progress your business?

00:15:42: So at that first step, you're just trying to understand the fundamentals of AI, what

00:15:46: it can do.

00:15:47: and to set yourself up to understand like realistically what value it can deliver

00:15:51: into your business.

00:15:52: And that's why I think, you know, trying to maybe partner with a mentor, you know,

00:15:57: I do also coaching on AI. Or take some introductory courses, again, with the

00:16:03: intent to understand what AI can do from a capability standpoint and not the

00:16:07: technology.

00:16:09: Two is, you know, looking at it

00:16:11: with respect to your industry, because you are going to see different utilization of

00:16:16: those capabilities across different businesses.

00:16:18: Let's say healthcare.

00:16:19: Then in legal or banking, you're going to see how some of those technologies get

00:16:24: leveraged in completely different ways.

00:16:27: So getting an idea for how some of these AI capabilities are being leveraged across

00:16:34: those industries.

00:16:36: And sometimes I think it makes sense to even look at parallel industries to just

00:16:39: get ideas for

00:16:41: how you could even potentially create new value in your business with how maybe

00:16:45: parallel industries are utilizing it.

00:16:48: So first you build the foundation, the foundational understanding of

00:16:51: capabilities, you look at applications of those in the industry just to get your

00:16:56: ideas flowing.

00:16:56: And then you look in your own business internally as step number three, and you

00:17:00: kind of look objectively, well, what objectives do I have set for my business?

00:17:05: And it's not uncommon for organizations to have one-year, two-year, five-year

00:17:09: objectives,

00:17:10: to kind of understand like, how do I actually operate my business and what

00:17:15: matters to my business to focus on this year?

00:17:18: Are you focusing on minimizing customer churn?

00:17:22: Are you looking to increase profits for this year?

00:17:25: What are some of your high level objectives?

00:17:29: And then you look at your either product mix or services mix.

00:17:32: You look at your workflows, what kind of services you are providing.

00:17:36: And you're looking to see, okay, well, based on my objectives,

00:17:40: which one of these functions or product lines or service lines can actually

00:17:44: improve with AI.

00:17:45: And then you would layer on, you know, workshops.

00:17:48: One of the other things that I do also is AI design thinking workshops or design

00:17:53: sprint workshops where you kind of ideate and you align the objectives to the

00:17:58: different product mixes or service lines and the capabilities, and you ideate

00:18:03: different use cases within that.

00:18:06: So you're kind of coming in there with the lens of like not designing the end

00:18:09: solution, but asking:

00:18:10: if I have this product, how can I boost it with AI?

00:18:15: How can I do that work better?

00:18:17: Or are there things that we have been thinking about doing, revamping this and

00:18:21: it wasn't possible before?

00:18:22: Because before AI, we just thought the amount of business rules that we

00:18:27: had to implement was unfeasible.

00:18:29: Can we revisit that improvement for that product?

00:18:33: Is there a process that's inefficient in the business that could be improved again?

00:18:39: in marketing and sales?

00:18:41: Are there things that are taking long, like getting approved marketing copy?

00:18:46: Does it currently take a week, two weeks, five weeks,

00:18:49: I don't know, to get from a concept into the market?

00:18:54: Could we cut down on that time, and can we get a few more turns of those marketing ads

00:19:02: into the market?

00:19:05: Like you look more objectively into your business and align it to your objectives.

00:19:10: And see, okay, what are some opportunities?

00:19:12: And what that enables you to do is to focus on not chasing the AI technology for

00:19:18: the sake of chasing technology.

00:19:19: You're looking more strategically at where can I invest money that matters to my

00:19:27: business?

00:19:28: So you make sure that that investment can actually give you a positive ROI and that

00:19:33: you're not investing it into like just chasing a technology and saying like, hey,

00:19:36: we actually do AI.

00:19:38: but it's not an impactful product that's actually moving the needle on any of your

00:19:44: strategic objectives.

00:19:45: So you kind of have to start top down.

00:19:47: And I think we talked about it yesterday too, in our news episode, where the

00:19:52: dangers of not driving strategies top down is that sometimes employees, because of

00:19:58: how much value AI has, they're going to go and hunt those solutions themselves

00:20:02: because they want to do their job

00:20:04: more efficiently and effectively.

00:20:06: And I think the report that we saw from Microsoft yesterday said that 75% of people

00:20:10: are using some AI tools to support their workflows.

00:20:13: And at that point, it becomes really hard to measure the investment or the ROI in

00:20:19: those initiatives.

00:20:20: So you want to make sure that you are democratizing access, but you're doing it

00:20:25: strategically.

00:20:26: You're investing in the right things in the business so that you can measure the ROI.

00:20:30: You can measure the investment that you've put in and how much value you're getting

00:20:33: out.

00:20:33: You're actually assigning metrics to that if you kind of do it in those steps.

00:20:38: So not like just handing it out and hit and hope.

00:20:44: But yeah, one thing I would love to add is if you have existing processes and you try

00:20:52: to evaluate them, look at all the parts that are actually not creating value.

00:20:59: For example, in sales,

00:21:02: managing the contacts, or putting them into the CRM, or writing a protocol, stuff like

00:21:07: that.

00:21:09: It has value with the information that's inside, but the process itself is just

00:21:16: tedious.

00:21:17: And that's something where something like a large language model with its language

00:21:23: understanding capabilities might be a way to streamline the process.

00:21:30: And I'm...

00:21:31: That's just an example.

00:21:32: So the value lies in getting rid of the non-value-creating parts of value

00:21:38: creation.

00:21:42: So yeah.

00:21:43: And another thing, what you can look at if you don't want to touch your existing

00:21:49: processes is like, what stuff did we want to do, but we weren't able to because it

00:21:55: wasn't cost effective enough to do it.

00:21:58: Might AI help me do it cost-effectively, or at a cost

00:22:09: rate where it actually makes business sense now where it hasn't before?

00:22:15: So definitely looking at marketing stuff you didn't do because you didn't have

00:22:24: the...

00:22:25: spare time or didn't have enough ROI to justify it.

00:22:30: And now you can maybe pump up the volume, stuff like that.

00:22:35: Yeah, and I think just to add to that, historically, natural

00:22:40: language processing, as a field or kind of a capability on its own, has

00:22:45: been something that big corporations did because of compute,

00:22:50: or the technology that was required in order to develop those systems or

00:22:54: customize them to your industry.

00:22:56: It was just so cost prohibitive that people would not even just go there

00:23:00: because it's like, oh yeah, it's for the big corporations to play around with.

00:23:04: But now if you look at, and we talk about large language models right now, so if you

00:23:08: think about the amount of value and the use cases that that opens, like you could

00:23:13: use this for summarization, but you can use it to curate data.

00:23:16: You could look at...

00:23:18: using large language models for some analysis tasks and things like that.

00:23:23: So things that you were potentially looking to leverage natural language

00:23:27: processing for, but it was so cost prohibitive before.

00:23:29: You could totally revisit that now because the costs of some of this compute and

00:23:35: utilizing these large language models in your infrastructure have gone down

00:23:39: significantly.

00:23:39: So for the benefit that you're gaining out of it...

00:23:42: weekly basically.

00:23:44: Yeah.

00:23:45: Yeah.

00:23:45: Yeah, so I think, I mean, definitely revisit that because I think, yeah, the

00:23:49: ROI is much easier to demonstrate now than it probably was in the past.

00:23:55: So you've done this for some time now.

00:23:57: What have you experienced, maybe as common pitfalls, during AI projects or

00:24:05: during strategizing AI projects?

00:24:09: There are a few, I think.

00:24:10: Um, inflated expectations, or unrealistic expectations of what AI can do and in what

00:24:16: timeframes.

00:24:16: Cause I think depending on the complexity of the projects, it does take a little bit

00:24:20: of time to implement.

00:24:22: Um, there's also some experimentation that happens with AI systems.

00:24:26: Um, unlike again, traditional software development techniques where everything's

00:24:31: kind of, you know, rule-based: you determine the rules and then the outcome

00:24:34: is pretty certain. With AI,

00:24:37: it really depends on the right technique

00:24:39: to use; sometimes it's very unique to the use case, the industry and things like

00:24:45: that.

00:24:46: So the types of techniques or models you're using.

00:24:48: So for example, in the healthcare space where I have my full-time role, we are looking

00:24:54: for equivalents of general models, but that are trained specifically on medical

00:24:59: text.

00:24:59: So there's again, nuances to the types of models, some experimentation, just the

00:25:04: fact that you're building something

00:25:07: means that it doesn't exist elsewhere.

00:25:08: And that's another thing that we may need to talk about, but there's a build versus

00:25:12: buy assessment that you have to do.

00:25:13: So you shouldn't just build these tools because it's quite expensive, but because

00:25:18: it is new, it means that it doesn't exist elsewhere.

00:25:21: So there's a level of experimentation that you have to do.

00:25:23: So it's not as deterministic.

00:25:26: And also look in the open source space, there's a lot of movement there, a lot of

00:25:31: things you can just integrate or fork and...

00:25:36: Yeah, adapt to your use case.

00:25:38: So that might give you a great head start.

00:25:57: One other thing that I'll just add is that with AI systems,

00:26:08: because they are probabilistic systems, it's hard to predict the accuracy or the

00:26:13: outcome, right?

00:26:14: So it's not as like, you know, you put this in and you're going to get a

00:26:18: deterministic output.

00:26:19: Sometimes, you know, as you're kind of getting the systems up to a certain

00:26:23: accuracy,

00:26:24: you might get to 80% with the first two iterations, or 90% accuracy by the third iteration,

00:26:31: which may take a little bit of time.

00:26:32: But even then, you're only promising 80% accuracy of the output.

00:26:38: So you can't also calculate your ROI the same way you are doing it for standard

00:26:44: processes.

00:26:45: So the way that you could actually do it is incremental.

00:26:48: So for every percent of improvement that you gain with each iteration,

00:26:53: you promise an X amount of value gained.

00:26:56: And so that way you're not anchoring yourself to a hundred percent accuracy,

00:27:00: which is very hard to achieve with some of these AI systems.

00:27:03: If you're lucky, you will get into the nineties.

00:27:05: But you are also kind of safeguarding yourself, so that if your

00:27:13: solution is built to 80%, you demonstrate, okay, the ROI on that investment, how

00:27:17: much you put in to get to that 80%, is X.

00:27:20: So now for every 5% improvement,

00:27:22: we are looking to project Y, and that's what would make sense.

00:27:25: And when you reach that time where the invested amount of money is just not worth

00:27:33: the ROI at that point, that's when you say like, this is probably as good as we're

00:27:36: going to get.

00:27:37: And any change beyond a certain point does kind of start to level off and the

00:27:42: improvement tends to be very incremental.

00:27:44: So sometimes it's just not worth putting in huge amounts of dollars into an

00:27:49: initiative if you're only getting a small increment of value back.
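
The incremental framing described above can be sketched in a few lines of Python. All figures here are invented for illustration, and the break-even threshold is a simplistic stand-in for a real business case:

```python
# Rough sketch of the incremental ROI framing described above.
# All numbers are hypothetical, for illustration only.

def incremental_roi(value_per_point: float, cost: float, gain_points: float) -> float:
    """ROI of one iteration: accuracy points gained, valued per point, vs. its cost."""
    return (gain_points * value_per_point) / cost

# Say each accuracy point is worth $2,000 a year to the business.
VALUE_PER_POINT = 2_000.0

# Iterations typically show diminishing gains at rising cost.
iterations = [
    {"gain": 10.0, "cost": 5_000.0},   # e.g. 70% -> 80%
    {"gain": 5.0,  "cost": 8_000.0},   # 80% -> 85%
    {"gain": 2.0,  "cost": 12_000.0},  # 85% -> 87%
]

for i, step in enumerate(iterations, start=1):
    roi = incremental_roi(VALUE_PER_POINT, step["cost"], step["gain"])
    verdict = "worth funding" if roi > 1.0 else "probably as good as it gets"
    print(f"iteration {i}: ROI {roi:.2f} -> {verdict}")
```

With these made-up numbers, the third iteration falls below break-even, which is exactly the "level off" point where further investment stops making sense.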

00:27:52: And also, I would love to add: the value is not only calculated by saying,

00:28:01: for every percent, this is the value you add, but you also

00:28:06: have to weigh it against what it means if it's not right.

00:28:12: Like if the answer or the outcome of your AI solution is not right, what is

00:28:17: the consequence of it? Let me

00:28:22: rephrase the question.

00:28:23: If the consequence is just that someone has to cancel some bookings, that's one

00:28:31: thing, but in another case it might be a different beast.

00:28:33: And what I would also love to add is: really think about how you sell the

00:28:40: stuff to your people, to the people using it.

00:28:42: Because if you sell a 100% solution to them, and you

00:28:48: may not even be at 70% right now, it's fatal, because people will just stop accepting it,

00:28:58: saying it only works, like, every third time, and dismiss it.

00:29:05: So you have to sell it properly to have it implemented successfully in the end.
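
The weighing described above can be made concrete with a small expected-value sketch. The numbers are invented, and `cost_per_error` is a hypothetical knob standing in for the consequence of a wrong answer:

```python
# Rough sketch: weigh accuracy against what it costs when the output is wrong.
# All figures are hypothetical, for illustration only.

def net_value(accuracy: float, n_cases: int,
              value_per_success: float, cost_per_error: float) -> float:
    """Net value over n_cases: value of correct outputs minus cost of wrong ones."""
    successes = accuracy * n_cases
    errors = (1.0 - accuracy) * n_cases
    return round(successes * value_per_success - errors * cost_per_error, 2)

N = 1_000  # cases handled by the AI solution

# Low-stakes error: someone just has to cancel a booking (cheap to fix).
low_stakes = net_value(0.8, N, value_per_success=5.0, cost_per_error=2.0)

# High-stakes error: same 80% accuracy, but each mistake is expensive.
high_stakes = net_value(0.8, N, value_per_success=5.0, cost_per_error=50.0)

print(low_stakes)   # positive: 80% accuracy is plenty here
print(high_stakes)  # negative: the same accuracy destroys value
```

The same 80% accuracy is a win in one setting and a loss in the other, which is why the consequence of being wrong belongs in the calculation, not just the accuracy number.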

00:29:11: I would say, oh, maybe to add one more that you'll

00:29:16: probably frequently encounter, especially if you work in larger

00:29:19: organizations or enterprises: when you roll out the product, you have to set the

00:29:22: expectations.

00:29:23: Yeah, that's what I mean.

00:29:24: For the product, as to what you can expect as output, because

00:29:32: the output may not be perfect in every scenario.

00:29:37: And so we preface that. Then we go on road shows before we ever deploy anything

00:29:41: organization-wide,

00:29:43: to educate the organization on what the solution is, how it works, and the

00:29:49: importance of feedback.

00:29:50: So the solution becomes better with them engaging with the tool, with them

00:29:57: providing feedback and telling us what's wrong with a specific piece of output.

00:30:01: So that feedback comes to us.

00:30:03: There's also a phase that we do controlled rollouts, especially for the tools that

00:30:07: are internally facing.

00:30:09: where we are more proactive and there's different phases that we do for rollouts.

00:30:14: But we seek that feedback more actively initially.

00:30:17: And then once we roll out, let's say a beta, we have other more automated

00:30:22: mechanisms like implicit feedback or explicit feedback that we collect

00:30:25: additional data points to improve the product.

00:30:28: And we proactively seek it out, so that when we roll it out into the practice, for

00:30:34: example, or into the enterprise,

00:30:37: people are educated as to what to expect of that product.
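To make the implicit/explicit feedback idea above concrete, here is a minimal sketch of what collecting those two kinds of signals might look like. All class, method, and field names here are illustrative assumptions, not taken from any specific product:

```python
# Hypothetical sketch of the two feedback channels described in the episode:
# explicit (a rating, optionally with a comment) and implicit (usage signals
# such as the user dismissing an output). Names are illustrative only.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class FeedbackStore:
    events: list = field(default_factory=list)

    def explicit(self, output_id: str, rating: int, comment: Optional[str] = None):
        """User explicitly rates a piece of output (e.g. thumbs up = 1, down = -1)."""
        self.events.append({"type": "explicit", "output_id": output_id,
                            "rating": rating, "comment": comment})

    def implicit(self, output_id: str, signal: str):
        """Implicit signal, e.g. the user copied, edited, or dismissed the output."""
        self.events.append({"type": "implicit", "output_id": output_id,
                            "signal": signal})

    def negative_outputs(self):
        """Outputs flagged for review: bad ratings or dismissals."""
        return {e["output_id"] for e in self.events
                if (e["type"] == "explicit" and e["rating"] < 0)
                or (e["type"] == "implicit" and e["signal"] == "dismissed")}


store = FeedbackStore()
store.explicit("out-1", rating=1)
store.explicit("out-2", rating=-1, comment="wrong policy cited")
store.implicit("out-3", signal="dismissed")
print(sorted(store.negative_outputs()))  # -> ['out-2', 'out-3']
```

The point of the sketch is the review queue at the end: the team can proactively pull the flagged outputs and use them as the "additional data points" Svetlana mentions to improve the product.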

00:30:40: So we don't hear, although it does come up sometimes, "oh, this is really wrong."

00:30:45: But once you go back and educate, or revisit some of the things about

00:30:52: how this product works, people actually appreciate it.

00:30:55: But I don't want to underestimate the amount of education and change management

00:30:59: that needs to happen to educate folks within the organization on

00:31:04: what's different about this product and why it's not built the same way as the

00:31:09: traditional products that they're used to.

00:31:11: From our talks, I remember that you often put an emphasis on

00:31:21: scalable solutions, and that's important to you.

00:31:25: What does this mean?

00:31:27: So, yeah, I say this a lot, but quality and scale are the two elements of AI

00:31:32: solutions that you want to

00:31:33: make sure you have, and it depends on the use case.

00:31:36: So of course, if you're in a small organization, maybe scale is going to be

00:31:41: like scaling to five users or maybe 10 users.

00:31:44: But when you're talking about enterprises and scalability, what that means is:

00:31:49: quality ensures that you're building something where you have a pathway to

00:31:54: improve the solution, up to a hundred percent, over time.

00:31:58: So there's a roadmap for how you continue to improve, whether it's feedback or other

00:32:03: means.

00:32:04: And then the scalability aspect: you ensure that you're not creating a small

00:32:10: use case for a small population that could be served just as well by writing a few

00:32:15: business rules.

00:32:16: So sometimes, if it's a use case that you invest a lot of money in, but it is so small

00:32:22: in terms of impact to the organization, it may not be worth the investment.

00:32:27: So you want to make sure that you scale it, but you also do it in a way that

00:32:32: you can deliver value not only to one business unit: you develop

00:32:37: it as a platform, or you get the solution to a point where you let other teams

00:32:46: leverage the technology, or use the modular components

00:32:51: of the architecture, in order to customize it to their unique use cases.

00:32:55: So, this is highly emphasized: the way large organizations and enterprises actually do

00:33:00: this, how they're able not just to scale, but to

00:33:08: deploy hundreds and thousands of models a year, is by building

00:33:11: these platforms.

00:33:12: And then each individual business unit customizes them to its needs.

00:33:17: But then also, I have experienced use cases where a solution is

00:33:23: supporting a specific workflow, but historically, because of rule-based

00:33:28: solutions that were tried in the past, it was only able to support a few

00:33:36: scenarios, right?

00:33:37: Like: for this situation, that situation, and that situation, this is how you use it.

00:33:42: So the question is how you actually challenge the system to do these

00:33:49: things, but then a hundred other things too, right?

00:33:51: So,

00:33:52: how do you actually create this engine that takes the data and lets the data do the hard

00:33:57: work to provide recommendations, not just for this one use case? How do

00:34:01: you actually create that model?

00:34:03: It's really challenging, I'll tell you that, but we have experience doing this.

00:34:06: But you have to make sure that it is something that the whole enterprise can

00:34:09: take advantage of, not just a specific, exclusive group whose use case is so

00:34:14: small that it becomes a shiny object for their organization that everyone else wants.

00:34:19: So you need to build a path again, whether you democratize the architecture or

00:34:23: modularize it so other teams can use it, or you build the solution itself in a

00:34:27: scalable way so that the whole enterprise can take advantage of that use case.

00:34:31: Again, it's a case-by-case basis, but there are two different ways to think about

00:34:35: it.

00:34:36: Interesting.

00:34:38: Every project has to be done by people.

00:34:42: What is the team one would need

00:34:48: if you want to do it internally? For example, I have my own IT

00:34:53: department and I have my own people, and they have to do the project.

00:35:02: Who are they?

00:35:03: I think you have internal teams, augmented with some external

00:35:08: contractors, right?

00:35:11: Yeah, so I would look internally first to see what skill sets exist.

00:35:17: So,

00:35:17: Not every IT team has a data science or data analytics function with the

00:35:25: right skillset to work on AI solutions.

00:35:28: So I would evaluate your internal team, structure, and skillsets to see if you even

00:35:32: have the right skillsets to deploy AI solutions.

00:35:36: People with MLOps experience, or who have worked on ML solutions, data scientists, data engineers.

00:35:43: Do you have those skillsets, and are they appropriate for the project?

00:35:46: If you do,

00:35:48: I would also say the technical piece of it is just one part of the story.

00:35:52: You do have to have business representation, right?

00:35:56: Someone that drives the vision of that product, of that function, and the

00:36:00: metrics and KPIs for, if this product or this initiative is developed,

00:36:07: what's considered success?

00:36:09: You also need, I want to say, leadership champions: someone who really

00:36:14: believes in the effort

00:36:16: and can champion it across the organization, and has influence across the

00:36:22: organization.

00:36:22: So this does not become the siloed effort of someone experimenting with AI

00:36:27: while the rest of the organization is not aligned.

00:36:31: Right.

00:36:32: If they're just independently working on this on their own, and they don't have a

00:36:35: voice to really get the organization excited about the effort,

00:36:38: then it could quickly get shut down as an experiment, something that gets swept

00:36:43: under the rug that no one cares about.

00:36:45: So you do want to make sure that you structure this team: someone that's,

00:36:50: again, a strong leader.

00:36:52: You need the product manager, some leadership representation

00:36:55: in this, and you do need to have a technical team in order to

00:36:59: implement it and see it through: proof of concept, beta deployment,

00:37:05: operationalizing the tool, and whatever else you have.

00:37:09: But I think that the key here is that...

00:37:12: Not everyone is well suited for innovative work.

00:37:15: So there are going to be challenges, especially with use cases around

00:37:19: automation.

00:37:20: I haven't experienced this myself, but I recently read a book about a

00:37:26: specific use case where a business leader was trying to deploy an

00:37:30: automation tool, automating certain aspects of the process.

00:37:34: And she was delegating,

00:37:37: I want to say, the process mapping to the target team,

00:37:40: so basically the SMEs on the ground. And what can happen,

00:37:45: and what actually happened in that case, is people started ghosting the effort.

00:37:49: People started taking unplanned vacations, calling out sick, and just

00:37:55: not documenting.

00:37:56: So a month went by without much movement.

00:38:00: So again, it takes a certain level of leadership, and also stamina and the right

00:38:05: mindset,

00:38:06: to lead the initiative; and the teams you surround yourself with

00:38:10: have to be in that same mindset of innovation, and not be stagnant,

00:38:15: reverting to their old ways of traditional problem thinking.

00:38:21: And then there are some telltale signs for when things are not going in the right

00:38:26: direction.

00:38:27: And you can tell, again, by the velocity of the team, where the initiative is going,

00:38:31: and how long it's taking.

00:38:34: Yeah.

00:38:34: And I have some of my own targets for how long things like a proof

00:38:38: of concept should take, right?

00:38:40: Building out even an initial version of the solution shouldn't take

00:38:43: more than three to six months max, even for the most complex project;

00:38:47: you need to demonstrate value.

00:38:49: So you don't want to invest years.

00:38:52: Six weeks.

00:38:54: Yeah, it depends on the complexity.

00:38:57: Yeah, just get something out.

00:38:59: Just prove that it works,

00:39:02: right?

00:39:02: You just need an indication, some signal that, hey, this can actually

00:39:07: deliver value, and then put it in front of stakeholders and say, hey, do you

00:39:11: want to scale this?

00:39:12: I'm actually going through some of those initiatives right now.

00:39:16: It's: do you want to scale this?

00:39:17: Do you want to staff it?

00:39:18: Do you want to realize the full value and the full potential of this

00:39:21: product?

00:39:22: That's when you make that decision, and then you calculate the ROI for each increment,

00:39:26: and when it makes sense for you to improve it further in order to get

00:39:31: the value from that product.
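The "ROI for each increment" decision can be sketched as simple arithmetic: compare the value each phase is expected to deliver against what it costs, and decide go or stop per increment. The figures and phase names below are made up for illustration, not from any real project:

```python
# Back-of-the-envelope ROI per increment, in the spirit of deciding whether
# the next phase (POC -> beta -> scale) is worth funding. All numbers are
# illustrative assumptions.

def roi(value_delivered: float, cost: float) -> float:
    """Simple ROI: net gain divided by cost."""
    return (value_delivered - cost) / cost


increments = [
    {"name": "POC",   "value": 50_000,  "cost": 20_000},
    {"name": "Beta",  "value": 120_000, "cost": 60_000},
    {"name": "Scale", "value": 400_000, "cost": 250_000},
]

for inc in increments:
    r = roi(inc["value"], inc["cost"])
    # Positive ROI suggests funding the increment; otherwise revisit scope.
    print(f"{inc['name']}: ROI {r:.0%} -> {'go' if r > 0 else 'stop'}")
```

The real decision of course also weighs strategic value and risk, but making the per-increment numbers explicit is what lets stakeholders answer "do you want to scale this?" with more than a gut feeling.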

00:39:33: That's also something that's especially useful

00:39:40: if you are at a point where you have an AI project.

00:39:46: I said it about ten minutes ago: you have to weigh the successful

00:39:52: implementations against the failures,

00:39:54: or the successful runs against the failures.

00:39:59: That makes it even more important to get a proof of concept running, to even get the

00:40:06: chance to evaluate how far we might come.

00:40:10: There are certain things you can do in setting up such an AI solution

00:40:18: where you know: okay, if we improve the prompt, if we improve the architecture,

00:40:24: we can go even further, but this is a baseline we can work from.

00:40:29: And for our solution, I myself built the first prototype in a week or

00:40:38: so.

00:40:39: Yeah, that's a good point.

00:40:40: Yeah.

00:40:41: I just made the one or two things I wanted to have happen work,

00:40:47: and then I said, okay, the rest I think we can figure out.

00:40:52: Let's go.

00:40:54: Did it work completely as planned?

00:40:56: No.

00:40:57: But do we have a proper working solution now?

00:40:59: Yes, we do.

00:41:01: And that's the thing.

00:41:04: For that, it's also not bad to have someone, at least for a short period of

00:41:10: time, helping evaluate things, who actually has experience doing AI projects

00:41:16: beyond typing stuff into ChatGPT.

00:41:19: And I think that's a really great point.

00:41:22: So when I say it shouldn't take longer than three months: it's a red flag

00:41:26: when

00:41:26: teams start taking longer than that.

00:41:29: But what I was getting at: I'm doing quite a complex project right now where

00:41:35: we're on a timeline to deliver a proof of concept in five weeks.

00:41:38: And we're talking about knowledge graphs, large language models, natural language

00:41:42: processing, and a machine learning recommendation system, all orchestrated

00:41:46: under one solution.

00:41:47: And that's being delivered in five weeks.

00:41:49: So it is possible to do it.

00:41:51: And again, you have to have the right people actually doing this, but

00:41:55: it should become a red flag when things are taking

00:41:59: longer than anticipated.

00:42:00: And then you should really evaluate whether the right people or the right skillsets are at

00:42:04: the table ideating these solutions.

00:42:06: And if not, I think you should augment your teams with contractors, experts, or

00:42:12: people who have experience actually building these solutions, or educate your teams

00:42:17: on how to strategize and how to put these solutions together.

00:42:20: So I know Edgar, you do some of this, I do some of that as well.

00:42:24: How do you actually build these solutions for a proof of concept, but do it in a way

00:42:28: that you can scale those solutions even further later?

00:42:32: So pick your scope; you don't have to boil the ocean with the initial version.

00:42:36: You do have to define the scope for those five weeks, what you're trying to

00:42:40: show as an indication.

00:42:42: It can be small, but you want to make sure that effort is not throwaway,

00:42:45: so you can just build on it

00:42:47: if you do get leadership buy-in to continue.

00:42:52: I would say a little note to that.

00:42:59: It can be throwaway if that helps you: if a disposable camera is good enough

00:43:08: to take the shot right now to see if it works, take it, and then go get the expensive camera

00:43:15: afterwards.

00:43:16: So that's totally fine.

00:43:18: And for AI solutions, the prototyping doesn't

00:43:22: have to happen in the system where you will implement the real solution later.

00:43:30: I myself oftentimes just go ahead, mock up a prompt with some mock data, and

00:43:38: put it into ChatGPT just to see how the AI reacts to what I want from it.

00:43:45: Or I use a low-code solution or something like that,

00:43:50: just to get some results, because getting and evaluating results is really key

00:43:57: to creating successful AI solutions.
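A throwaway prototyping harness in that spirit might look like the sketch below: mock data, a templated prompt, and a stub you would swap for a real model call (or just paste the prompt into ChatGPT by hand). The ticket data, template, and the `call_model` stub are all invented for illustration:

```python
# Hypothetical mock-data prototyping harness: build prompts from mock data,
# get answers (here a stub instead of a real LLM call), and score the run.
# The evaluation step at the end is the whole point of the exercise.

MOCK_TICKETS = [
    {"id": 1, "text": "Invoice total does not match the PO."},
    {"id": 2, "text": "Password reset link expired twice."},
]

PROMPT_TEMPLATE = (
    "Classify the support ticket as BILLING or TECHNICAL.\n"
    "Ticket: {text}\n"
    "Answer with one word."
)


def call_model(prompt: str) -> str:
    # Stub: replace with a real API call, or paste the prompt into a chat UI
    # by hand, which is exactly the quick-and-dirty loop described above.
    return "BILLING" if "invoice" in prompt.lower() else "TECHNICAL"


def evaluate(tickets, expected):
    """Score the run: fraction of tickets the model classified as expected."""
    hits = 0
    for t in tickets:
        answer = call_model(PROMPT_TEMPLATE.format(text=t["text"]))
        hits += answer == expected[t["id"]]
    return hits / len(tickets)


print(evaluate(MOCK_TICKETS, {1: "BILLING", 2: "TECHNICAL"}))  # -> 1.0
```

None of this code survives into the real solution; it exists only to answer, cheaply and fast, "does the AI react the way I want on realistic inputs?"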

00:44:04: Yeah, totally agree.

00:44:05: So that can be throwaway, but only if it really speeds up the process.

00:44:12: Otherwise, go ahead.

00:44:13: Of course.

00:44:14: And I think, again, I'll just emphasize that AI is experimental.

00:44:18: Do even the most experienced data scientists or ML engineers go into it

00:44:23: knowing exactly how it's going to work?

00:44:25: Not always.

00:44:27: So they are going to be trying different algorithms, different

00:44:30: approaches.

00:44:30: So exactly.

00:44:33: So by definition, anything they tried that didn't work is throwaway.

00:44:37: But I'm just saying that you shouldn't just build

00:44:40: a solution for the sake of building it: something that should be using machine

00:44:44: learning, where you decided to do the proof of concept with large language

00:44:47: models just because that was something you could get done in two weeks,

00:44:51: but that's not the technology you end up using at scale.

00:44:55: That's what I mean.

00:44:56: And also, I would love to add something to that.

00:45:02: The jump from a prototype, which already gets you a lot of the way,

00:45:09: to an actual product: don't underestimate that.

00:45:13: The jump to an actual usable product, with all the concerns about privacy and

00:45:21: data security and everything.

00:45:26: Really, really take into consideration that just because you

00:45:34: have built a proof of concept in two weeks,

00:45:36: the solution won't be ready in four weeks' time.

00:45:39: So there is a big gap from "okay, barely usable" to production ready.

00:45:49: And there is also, and that was, for example, my learning building our solution

00:45:55: at one company:

00:45:58: we went ahead and I built the AI algorithm in November,

00:46:06: and only attached it just now, in March.

00:46:10: So that's how long it took me to get everything else ready.

00:46:15: All the systems, all the integrations, all the UI for administering things, concepts

00:46:22: for interactions, authentication, authorization, all that stuff.

00:46:29: That's a lot of work.

00:46:31: That's a lot of work just for the surroundings,

00:46:36: to then give the AI the proper environment to work in, which is the most important

00:46:42: thing, because otherwise you won't ever get a good result if you don't have the

00:46:46: right context.

00:46:49: And yeah, that's something I'd love to emphasize, because I know, from my

00:46:56: partners and even from myself: I thought, okay, I built this in a week,

00:47:02: I'll be done in two months.

00:47:05: I wasn't. I wasn't even close to being done in two months.

00:47:08: So keep that in mind.

00:47:10: The gap: I've done software professionally for 15 years, or close to 15

00:47:18: years,

00:47:20: and I never had anything that was so capable in its prototype phase, but took

00:47:28: so long to really bring to production.

00:47:31: So yeah, keep that in mind.

00:47:34: Yeah, I totally agree.

00:47:36: I think: don't underestimate the security emphasis.

00:47:39: I would even layer that depending on whether this is B2C, meaning that it's an

00:47:46: outward-facing, customer-type solution.

00:47:49: You probably want to build in additional time to do the due diligence to

00:47:52: make sure, especially for large language models, that it's not going to

00:47:55: hallucinate.

00:47:56: You put the right guardrails in place, system prompts, and you anticipate all

00:48:01: kinds of scenarios.

00:48:02: So there's probably an additional

00:48:05: level of QA that you want to do with smaller groups to make sure you iron out

00:48:14: all of those kinks before you put a customer-facing product like a

00:48:19: chatbot on the market.
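A guardrail layer of the kind described, a system prompt plus simple pre- and post-checks around the model, can be sketched roughly like this. The company name, blocked topics, and fallback wording are all illustrative assumptions, not any product's actual rules:

```python
# Illustrative guardrail layer for a customer-facing chatbot: a system prompt
# that scopes the assistant, a pre-check that rejects out-of-scope requests
# before they reach the model, and a post-check that falls back to a safe
# answer instead of risking a hallucinated one. Names and rules are made up.

SYSTEM_PROMPT = (
    "You are a support assistant for ACME. Answer only questions about "
    "ACME products. If unsure, say you don't know and offer a human agent."
)

BLOCKED_TOPICS = ("medical advice", "legal advice")


def pre_check(user_msg: str) -> bool:
    """Reject clearly out-of-scope requests before calling the model."""
    return not any(topic in user_msg.lower() for topic in BLOCKED_TOPICS)


def post_check(answer: str) -> str:
    """Replace empty or uncertain model output with a safe fallback."""
    if "i don't know" in answer.lower() or not answer.strip():
        return "I'm not sure - let me connect you with a human agent."
    return answer


print(pre_check("How do I reset my ACME router?"))   # in scope
print(pre_check("Can you give me medical advice?"))  # blocked
print(post_check(""))                                # safe fallback
```

Real deployments layer far more on top (input sanitization, output classifiers, rate limits), but even this shape makes the QA rounds with smaller groups concrete: testers try to break the pre- and post-checks before customers can.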

00:48:21: And again, depending on the industry as well, for things that are internally

00:48:25: facing, it depends.

00:48:27: There's a potentially different approach that you can take with change management,

00:48:30: where you would still do

00:48:33: your due diligence of rolling it out to a smaller set of users, choosing the types

00:48:36: of people you roll it out to initially,

00:48:38: and then the subsequent phases; there's an approach to that too.

00:48:42: But I would also add that in as well, because again, these systems are

00:48:48: not meant to be perfect.

00:48:49: And getting that last level of accuracy built in is actually

00:48:56: better driven not by techniques like fine-tuning, but

00:49:01: by getting additional feedback.

00:49:03: So, yeah, it really depends, but I totally agree with you that the time from

00:49:09: POC to a production-level product does take time, and it can be

00:49:13: disproportionate to the amount of time it took you to get to that proof of concept.

00:49:16: Yeah, definitely.

00:49:17: Okay.

00:49:18: I think this was a good closing.

00:49:25: Should we talk about what we'd love to do maybe during the next episode?

00:49:29: And if people want to take us up on it.

00:49:33: Yeah, a little surprise.

00:49:34: I completely forgot about it, to be honest.

00:49:40: But it's the middle of the night here where I am.

00:49:43: So have some, how do you say, pity on me?

00:49:49: I don't know.

00:49:52: Yeah.

00:49:52: What we'd love to offer you: through the comments on YouTube, or a DM to

00:50:01: us on LinkedIn,

00:50:02: hand over some ideas, stuff that you would love to try out, or would love to know whether it's

00:50:11: working or not, regarding AI integration.

00:50:16: And we would offer, maybe one time and then see how it goes, to take one of the

00:50:24: suggested solutions or implementations and just do it live with you.

00:50:28: Just building a live POC,

00:50:33: which you can take with you and do what you want with.

00:50:36: I don't mind.

00:50:38: It's about showing how to approach it.

00:50:42: So I'm doing that with some of my own projects live on my channel.

00:50:47: And we would love to do it in the AI boardroom context as well.

00:50:52: So if you have any interesting topics you wanna tackle, just approach us down in the

00:51:00: comments or DM us.

00:51:02: Yeah, just let us know what industry you're in and what use case or

00:51:08: what problem you're trying to solve with AI.

00:51:13: And then we'll be in touch with you further to get additional

00:51:18: information, maybe documents, or whatever you're hoping to achieve through this.

00:51:22: So drop your comments below.

00:51:23: We'll select someone from the comments, we'll be in touch and we'll be building

00:51:29: out the solution live with you.

00:51:30: to show you exactly how it works behind the scenes and let Edgar's

00:51:36: skillset shine live.

00:51:38: Hopefully we won't do it at such a late hour, so he doesn't fall asleep

00:51:43: while building.

00:51:44: Honestly, I stream from 10 PM every time I stream, so it will be late at night, but

00:51:50: it's fine.

00:51:52: Okay.

00:51:52: Just make sure you drink some coffee so you don't fall asleep on that.

00:51:54: Yeah, I didn't have any coffee today.

00:51:56: That's my problem at the moment, I think.

00:52:00: Okay, yeah, thank you very much, Svetlana.

00:52:04: That was really, really insightful.

00:52:07: Thank you for opening up to us about all the experiences you've had.

00:52:14: And yeah, to you all, thanks for listening.

00:52:17: It was a pleasure, as always, talking with Svetlana about our most favorite topics.

00:52:25: Yeah, I hope you get value from it.

00:52:29: I myself found it value-packed.

00:52:31: I think you have to listen several times to get everything out of it.

00:52:36: But yeah, that's what we want: to deliver value until it comes out of your ears.

00:52:43: I mean, the invitation still stands.

00:52:44: So if there's anything that you guys want us to dive deeper into, maybe some

00:52:48: live diagramming, I think we can talk and do a deep dive on a specific topic that you

00:52:53: want to understand more about.

00:52:56: So again,

00:52:57: Drop it in the comments.

00:52:58: We read your comments.

00:53:03: Thank you so much, again, for engaging with us and taking the time to listen.

00:53:03: But yeah, I hope you guys enjoyed this and we'll see you, I guess, on the next one.

00:53:09: And yeah, if you're not subscribed to the channel, please subscribe.

00:53:13: Oh, yes.

00:53:13: Leave us a like.

00:53:18: Yeah.

00:53:19: And comment down below.

00:53:20: We are always curious to hear from you.

00:53:23: And that's it.

00:53:26: See you.
