[MUSIC PLAYING] Hi, everyone. How are you doing? Thank you for joining. I'm hopping over from the day one track into the day two track, just providing coverage. And I am super, super excited by this talk. It's about the foundations of AI, and what a year we've had for AI. But I think, more importantly, it's about next year and what enterprises can do to set sustainable foundations for AI in their organizations.
And no better person to talk about that than Sri from DXC. Sri, thank you so much for joining. Can I ask you to introduce yourself to the audience and why you're the right person to talk to them about this topic?
Cool. Yeah, thanks, Mike. Hi, everyone. This is Sri Kaligotla. I'm a Senior Practice Principal and Director for Data and AI Practice at DXC Technology.
I have about 20 years of experience, almost the entirety of it in data analytics-- a lot of it at a Fortune 50 retailer and then at DXC Technology, where I dealt with the whole life cycle of data maturity, the breadth of [INAUDIBLE] solutions, and the actual value and power of AI. So I'm excited to talk to you today. We'll cover some pretty good stuff.
Yeah, Sri, can I just ask-- it sounds like you've had many years of experience. This isn't a just-this-year thing. You've been doing this for a very long time. How many years?
I'm at 20 years, going on, right now.
Wow. OK. Wow. OK, so you're looking at everyone jumping on the bandwagon. OK, gotcha. Sri, let's jump into it.
Awesome. So the couple of topics you see on the slide here are about the hottest topics we deal with, across every client we interact with and the industry partners we have. The first is how do we get to the foundations of setting up an AI capability, which leads into the data management side of things. They're very much interrelated, interlinked, reinforcing topics. So let's get right into them.
So we'll cover the data management aspect first, and then we'll jump into the AI, because I like to build up the data management and then show you how it organically blends into an AI foundation setup. What you see in this slide is an overview of the data management capabilities. When you ask someone what data management means, you'll actually get many versions of it, but there's some commonality across all of them.
It largely deals with metadata and then, more importantly, how you treat the data, how you extract the maximum value out of it, and how you get some consistency in it. And the biggest part of it is the governance. One of the best analogies I use and have heard is this: you have these Lego bricks, and you need to build a structure.
You can build them any way you like. But the way to build the most robust one is to have the right pieces, the right support, and the right components all fitting in, so that it can stand and also give stability and longevity to the structure. Similar to that, there are many components here. Many organizations do some of them well. Very few actually do all of them well.
But this is a good mix of what you can expect from a data management capability standpoint. And if someone is able to do all of them, that is the nirvana state, right? So right from starting off with the asset catalog, where it literally lists down everything that you have in the company.
Having worked at a Fortune 50 company, I know it was a hassle to know exactly what systems we had, what each of those platforms contained, and whether we were stepping on each other's toes or doing the same thing. Sadly, that's still a reality: you have all these different applications that, for whatever reason, have been siloed. And sometimes, unfortunately, they are probably solving the same problem. So that's one thing catalogs are useful for: getting you a list of, OK, what do I have in the company.
And then the other very important thing is the business glossary. Knowing what an asset does for the business is its real purpose. It might be the shiniest toy-- a simple report or the fanciest ML algorithm or AI program. But until you test its business utility and attach a definition to it, it's most likely just another shiny toy. So a very important component is how you get to the business glossary and then actually connect it to the asset catalog.
The third one that you see here is the asset and end-to-end lineage of it. So now you have the asset catalog. You have the business definition. But now you've got to stitch it up.
That's where the end-to-end data asset lineage comes into play: you have the interconnectivity, an understanding of how the data is flowing, how the relationships work, and traceability of something coming from an ERP into a WMS and eventually getting to the utility of the end product-- be it on the floor, or for a marketing person, a merchandising person for a retailer, or a sales manager. That whole lineage plays such a big role, not only for understanding; you'll see it in the AI foundations as well.
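To make the lineage idea concrete, here is a minimal sketch of end-to-end lineage as a directed graph that can be traced back to its sources. The asset names and edges are invented for illustration; a real catalog product would capture this at column level.

```python
# Minimal illustration of end-to-end data asset lineage as a directed graph.
# Asset names and edges are hypothetical examples, not a real system.
LINEAGE = {
    # downstream asset: list of upstream assets it is built from
    "erp_orders": [],
    "wms_receipts": ["erp_orders"],
    "sales_mart": ["wms_receipts", "erp_orders"],
    "merch_dashboard": ["sales_mart"],
}

def upstream_sources(asset, graph=LINEAGE):
    """Trace an asset back to every upstream asset it depends on."""
    seen = set()
    stack = [asset]
    while stack:
        node = stack.pop()
        for parent in graph.get(node, []):
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return seen

print(sorted(upstream_sources("merch_dashboard")))
# → ['erp_orders', 'sales_mart', 'wms_receipts']
```

The same traversal run in the other direction gives you impact analysis: which dashboards break if an upstream feed fails.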
The other piece, following on from the lineage, is a concept that is gaining a lot of popularity-- and it is actually in some of the products that erwin has as well-- which is the process mind maps and ontology concept: getting to a definition of a business process. For example, let's say supply chain receiving. It sounds very simple, but it has so many pieces to it.
And it could be, hey, what's my trailer number? What's the product inside it? What's the container quantity? Can I package it? Can I move it on the floor? Do I have to rack it up?
So many things make up that simple definition that you need a proper process mind map, and a definition that goes with it, again connected to the lineage. That opens up the whole gamut of data usability to a whole spectrum of users who are more the subject matter experts. And they can get excited: hey, what I'm saying, what I'm doing, and what the actual business is are related and connected to the data. So that's a really cool thing that we have implemented, and some platforms like erwin have it.
Then moving on-- up to this point, you've figured out the what, right? The different pieces of the Lego. What is there, and how do I use it, is what follows. One key aspect-- I wouldn't say underestimated, but at least underutilized or underperformed in terms of building a capability-- is data quality with certification and democratization.
What that means is building a capability that can not only check the accuracy of the data but also make the process easily repeatable and scalable. Many people can do it, even different personas, versus it being such a hard process that you lose the benefit of it.
So as an example, let's say a SKU, a product, is set up to be received in bulk, whereas you only need it in eaches. Now, all of a sudden, it throws a wrench in the process: the warehouse is stuck, because the system won't let it receive eaches, and now you have to receive multiple items.
And now you don't have the storage space, and you've potentially lost-- at least if it's a bigger company-- about $10,000 per hour. So it's ridiculous how small things like that can cause a lot of issues and actual monetary impact as well. That's a huge aspect, the data quality and certification, and it's another feature that's in the erwin Data Intelligence product.
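A certification check like the bulk-versus-eaches example can be as simple as a declarative rule run over the item master. The field names and records below are made up for illustration; a real platform would run hundreds of such rules and publish the results as a certification score.

```python
# Illustrative data quality rule: flag SKUs whose receiving unit of measure
# doesn't match how the warehouse needs to handle them (bulk vs. eaches).
# Field names and records here are invented for the example.
def check_uom(records, expected_uom="EACH"):
    """Return the SKUs that fail the unit-of-measure certification rule."""
    return [r["sku"] for r in records if r["receive_uom"] != expected_uom]

records = [
    {"sku": "A100", "receive_uom": "EACH"},
    {"sku": "B200", "receive_uom": "BULK"},  # would jam receiving on the floor
]
print(check_uom(records))  # → ['B200']
```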
Then one other piece is the intelligent operations and traceability. Like I mentioned with the data lineage aspect, now it's about actual execution. You have this ecosystem of pipelines and jobs and everything running end to end. But they are so interdependent.
If something breaks, and the downstream job doesn't wait for it, or doesn't make sure it is reconciled or rerun after the upstream one is fixed, everything is a mess. Now, all of a sudden, you have a salesperson using a wrong metric, or someone who has made a decision with the wrong data. So it becomes extremely critical to have that intelligent operations setup in place, so you can alert the right people, take the right action, and make sure your data is authenticated, so that trust is developed.
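The upstream-failure scenario can be sketched as a dependency-aware run loop: a downstream job only runs if its upstreams succeeded, and anything that fails or gets skipped raises an alert. Job names and statuses are hypothetical; real orchestrators add retries, SLAs, and reconciliation.

```python
# Sketch of dependency-aware pipeline operations: a downstream job waits on
# its upstreams and is skipped (with an alert) if an upstream is not healthy.
# Job names, dependency map, and statuses are invented for illustration.
def run_pipeline(jobs, deps, run_job):
    """jobs: topologically ordered list; deps: job -> upstream jobs;
    run_job: callable returning True on success. Returns (status, alerts)."""
    status, alerts = {}, []
    for job in jobs:
        if any(status.get(up) != "ok" for up in deps.get(job, [])):
            status[job] = "skipped"
            alerts.append(f"{job} skipped: upstream not ok")
        else:
            status[job] = "ok" if run_job(job) else "failed"
            if status[job] == "failed":
                alerts.append(f"{job} failed")
    return status, alerts

status, alerts = run_pipeline(
    jobs=["ingest", "transform", "publish"],
    deps={"transform": ["ingest"], "publish": ["transform"]},
    run_job=lambda job: job != "ingest",  # simulate an upstream failure
)
print(status)
# → {'ingest': 'failed', 'transform': 'skipped', 'publish': 'skipped'}
```

The point of the sketch is the cascade: one upstream failure stops the wrong numbers from ever reaching the salesperson's report.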
And then, finally, it's how you enable discovery and collaboration, and make sure there's governance end to end. Today, again, it's very siloed, very broken. One concept we have introduced, and seen a lot of success with, is centralizing all of that so it's easy to discover, easy to learn what something is doing, and then ask for access. And there are checks and balances to make sure that persona is actually using it, and then you go from there.
So some of the benefits, as you can see: citizen development through business value-driven data analytics. That is how you actually make that adoption better-- how you show the value to a citizen of the company who is not a data analytics practitioner.
So you got to show them the value. You got to make sure they are able to trust the data. And then-- you then build that foundation to unleash it across the organization for various use cases, for different personas, set up the data governance in the right way with the right sequence of events happening, and then eventually improve the asset utilization of whatever investments you're making.
None of these are inexpensive, right? There could be a lot of effort, subscription fees, and huge volumes of actual data flowing through. You want to get the maximum out of it, which eventually should lead to your IT cost optimization. So there's a lot packed into this, but that's why it's so important to focus on it as one of the stepping stones towards the AI foundation.
It's a fundamental, really, isn't it?
Exactly.
Moving on to the data management ecosystem, what you see here is how it flows. Yeah, I've listed a whole bunch of capabilities on the previous slide. But OK, what does this mean? How do we put the Lego pieces together?
This is a way, right? A way that's repeatable, that makes sense, and that is, I would say, 95% applicable to almost every industry out there-- specifically retail and CPG, which I deal with a lot. So you see all your data products on the left side. And data product is a very contentious definition, too, but you can actually standardize it.
It's a collection of one or more data assets that can serve a business value at the heart of it. So it could be a data structure, attributes, KPIs, pipelines, everything coming together to serve a business value. So how do you put that all together in a data asset catalog? That's the step 1.
Like I mentioned earlier, you put it all in an enterprise data catalog, and then you layer the business glossary on top of it as step 2. Step 3 is your lineage. Step 4 is your quality and intelligent operations. Then you open the doors to data asset discovery, to your citizens, making sure they are able to learn, understand, and use it effectively. And then provisioning-- now they're like, oh, I know what this is. It's needed for my job, or it can make me do my job better.
Now you go into the asset provisioning of it. Make sure they are the right persona, it's in the right hands, and they get the right access in the right quantity. That's why step 7 is also such a crucial thing. Usually it just happens at step 6, where you're only focused on who gets the right data and whatnot. But it has to happen end to end.
Otherwise, you're sort of losing sight, and you're only trying to fix the problem at the very end versus evening it out across the flow. So this is a proposal for how this ecosystem of capabilities will work. And it should work, for the most part.
Sri, before we move on, how does all this come together with respect to data processing and storage?
Yeah, that's a great question. Thanks, Mike. So if we just go to the next one--
Sorry. [LAUGHS]
That's still the heart of the value, right? The data is still the real thing. Whatever we've spoken about so far is how we manage the data so it can unlock more value. You still see it here-- we've talked about the business process mind maps. We talked about the semantic layer.
The knowledge database is the one big thing supporting the ontology and semantic layer. There's data literacy. There's data quality anomaly detection, governance, security compliance, observability, and lineage. We spoke about all of that.
But front and center is always going to be the data storage, processing, and consumption. Most clients have already figured this out, or are at least on the path to figuring it out, either through their modernization journey or because they have been there for a while. So this is the bread and butter. And whatever you see surrounding it is the set of key things necessary to unlock the value from that data storage, processing, and consumption.
Sure.
So this is my favorite slide, to be honest. How do you contextualize something that we spoke about so far in terms of, OK, I'm a business user. Let's say I'm a forecaster in the merchandising or a demand planner or, let's say, a supply chain senior manager or director. What does it mean to me? Why do I care about this?
Because of these four things at the end: discoverability, traceability, interoperability, and addressability. These four are what you get by participating in this process as a business SME or a business partner, not necessarily an IT or data practitioner.
So when this whole thing plays out-- when you have the business process, when you have the glossary defined, when you have the data all mapped out and everything done in the right way-- as an end user, for you and your team, from a business leader standpoint, you now know how to find this stuff. Usually that's a pain point. You know how to trace it back: OK, where is it coming from? Can I rely on it? Can I actually make sense of it?
Interoperability-- how do you relate multiple topics, be it HR to inventory to warehouse to labor to suppliers and all that? And then addressability-- what to do with it. What is it telling me? Is it compliant or not? How do I use it? Those four topics are some of the things that, today, are a problem area for the business.
So the thinking so far has been, hey, you build it, and they will come. But it doesn't happen all the time. You build the house, and they'll come? No-- you still have to furnish it, make sure it's useful, the AC is on, the heating is on, and it's better than what they have, right? Then they'll say, yeah, OK, I'll figure out when to move, and they'll start moving.
You know what, I love that analogy. I remember buying my house and moving in, and we furnished it room by room. And it seems to me that, coming to the business, we're always told to start small. Take one problem at a time, as opposed to trying to swallow an elephant-- I'm mixing my analogies. What advice would you offer the business teams, the business functions, to shed more light on particular problems or where to start?
Yep. See, I was [INAUDIBLE].
Take your room analogy. Do you start in the kitchen, or do you start in the living room?
Yeah, you've got to take care of the basics, right? You've got to take care of the table stakes, which is running the business. It's always great to have your fancy 80-inch TV, but you cannot get away without having heat or AC, or the kitchen running, or the refrigerator.
So focus on those first: hey, am I using the data properly and effectively to run my business? Is it helping me make my business more efficient? And then you start building on top of it to say, OK, how is it now fueling my growth plan for the next year or two?
And that's how the evolution happens in data-driven decision making: you trust it to run the business, and now you don't have to worry about that. Now you can rely on it to take more strategic bets and bigger dollar bets-- OK, now I'm going to finish my basement with this data and build a fancy theater there. So you have to start with the core things. It doesn't have to be small, but it has to be very core to your business. Make sure it is able to fulfill that need, and then you build on top of it.
OK.
So moving on-- that was around data management, right? Now let's say, assuming you invested in all those aspects, you have a very rich catalog.
You have-- you know exactly what each thing means and what it's doing, what's the value it's adding. You know, at an entity level, how it is mapped and the attributes from a source system to the end product. You know the quality of it, and you also know the intelligent operations of it, where you know exactly when it is arriving, and what is the reliability of anything that my team or a business function is using.
So now it's, OK, to your point earlier, Mike, how do I use it for much fancier stuff? How do I accelerate? Running the business-- great. I've made sure there's a roof over my head and everything. But now let's do some fancy stuff and accelerate. It still has to be tied to a business value, but how do I get to it faster? That's where the foundations of AI, and how you operationalize it, come into play.
OK.
So as you can see here, one of the key things at the bottom right is embedding AI with a proven framework. Now, how does that happen? Start off with an impactful AI strategy: make AI an integral part of the boardroom discussion, which could mean not only from a business unit, but higher up.
Make your business unit leader talk to the CEO, or whoever is in the boardroom. AI has to be a very purposeful discussion at that level. Some of these AI solutions may seem like a black box. But at the end of the day, some of the business decisions they're making could be solved through AI without them realizing that AI is needed, right?
For example, in financial operations, you could use AI to get better at it, to get faster results: hey, how do I make sure my cost allocation is better? Or how do I make sure it is deferred to a different year, et cetera? So that's an example from financial ops. But even from a retailer standpoint-- the asset allocation, or the geography you're playing in-- all of that can be leveraged or solved to a very good extent through AI.
So what does that mean now? Creating an operating model to leverage AI-- the capabilities we already discussed should help quite a bit. And then you have to create the value definition. Like I said, it doesn't have to be something different from what you're doing today.
So if a boardroom has a discussion around, hey, I need to penetrate a certain market, or I have to attract a new market segment-- Gen Z or whichever customers-- that is still a problem statement valid for AI. You don't have to make up a new thing for AI. It's still the same thing. So you see the same old framework-- people, process, technology-- but you insert AI on top of it without having to reinvent the wheel.
The problem happens when you try to overthink it, over-solution it. You think, oh, it has to be a really complex problem for AI to be used. No-- you can take your current goals and current issues, as simple as they are, and have AI help you with them.
Well, I got to say, we've had a few talks already over the last while. And a lot of the topics or a lot of the discussion is about productivity. And what I find is, the smaller the task, the more time saving AI or GenAI can offer. So absolutely, 100%, you don't need this grand scheme and AI to solve that problem. In fact, you might be better off having a collection of smaller topics.
Exactly. So again, echoing some of the very simple things: take pricing, pricing optimization. There are simple algorithms being run today. But with AI-- with the amount of compute and the ability to skim through that amount of data and the relationships, especially if you have all the ontology figured out-- it can get you more real-time, more accurate, and more sophisticated pricing in a much more effective way. So something as simple as a pricing discussion is also where you can have AI actually help you build the solutions.
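As a toy version of the "simple pricing algorithm" being referred to: with a linear demand model q = a - b*p, revenue p*q is maximized at p = a/(2b). The coefficients below are made up; in practice a model would fit them from sales data, and the AI angle is refreshing them continuously across far more signals.

```python
# Toy pricing optimization with linear demand q = a - b*p.
# Revenue R(p) = p*(a - b*p) is a downward parabola, maximized at p = a/(2b).
# The demand coefficients here are hypothetical, not fit from real data.
def optimal_price(a, b):
    """Revenue-maximizing price for linear demand q = a - b*p (b > 0)."""
    return a / (2 * b)

a, b = 1000.0, 20.0           # hypothetical demand coefficients
p = optimal_price(a, b)       # → 25.0
revenue = p * (a - b * p)     # → 12500.0
print(p, revenue)
```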
Yeah.
So moving on to the whole aspect of vision to value: how do you bring this whole AI vision to life? We discussed that you can take your current vision, overlay it, and use AI for it, right? But you still want a vision for how you do it.
The what will remain the same. The key message is the business should not try to invent problems for the sake of using AI. So you use the same problem, but then figure out how do you have AI solve that for you.
So we already discussed the foundation with sustainable AI products. Some of the products are actually within the data management realm that we already talked about-- the data quality, the intelligent operations, the data catalog, the whole relationship with lineage and ontology. So it starts with that as your stepping stone. Then you democratize the development.
Today, it's narrow: many organizations have their own data science teams, and then they hand off to their MLOps teams, which is still a very effective process. But one of the problems today is speed to market, which you'll see in one of the following slides.
When you democratize it, you cut down some of that cycle, and you get participation from the broader personas-- not just the data science folks, but the whole community of data practitioners, data producers, data analytics users, and actual data consumers participating in the process, which leads to a unified vision. Then you get to commoditizing: once you have experimented, once you have figured out the big problems, you start going at a more rapid pace than one massive project, and then realizing the value.
And then scaling it up. If an organization deals with multiple industries, you do it there. Usually what we have seen is that the marketing or digital sides of things are obviously easier, where applying AI is easier. But how do you bring that to something more complex, like retail or supply chain, or even pharma to an extent, where there's a human element to it? It's not that easy to say, hey, AI, just give me a new recommendation, and boom, you're done.
So that's where you figure out how to mainstream it. For a retailer, marketing and digital are good, but their bread and butter is still going to be the warehouse and the retail side. So you've got to scale it to the mainstream for it to actually get the scalability and mass adoption it requires.
Yeah. Quick question, Sri-- do you have any examples of where you've, I suppose, implemented this or used this?
Yeah, so a couple of examples. One is at a car manufacturer, where there have been challenges-- again, we'll cover the challenges in a little bit. Not too long ago, the traditional models were only able to get you to 80% to 90% accuracy. For example, you have a car manufacturer trying to minimize the number of rejects from a dealership because of potential damage happening in transit.
So in order to predict that, based on a lot of parameters, you were able to get to only 80%. But now, with the advent of the technology, the ability to use some of the foundations we showed earlier, and the compute power, we are able to get the prediction error down to close to 3%. And that has resulted in either fewer claims, or just not sending the product, so you don't even have the return. So that's one example.
Similarly, there's damage from a retailer standpoint, especially for heavy and large products that can be susceptible to returns. You make sure there are attributes in there that can detect an anomaly. For example, for a certain vendor, a certain period of time, or a certain route, you know, OK, it's going to be susceptible to damage, based on either the returns data, the sales data, or just the condition the packaging arrives in.
So now you've introduced a model to detect it and say, OK, this has a high likelihood of damage. You reject it, and you make sure the retailer isn't taking the hit on the product, and then you collaborate with the manufacturer. At the end of the day, we're not trying to ding the manufacturer as part of this process. But each one has their own ways of collaborating and being mutually successful.
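A first-cut version of that kind of anomaly check can be very simple: flag routes whose damage/return rate is far above the typical rate. The route names, rates, and threshold below are invented; a production model would use many more attributes (vendor, season, packaging) as described above.

```python
# Illustrative anomaly check on damage/return rates by route: flag any route
# whose rate exceeds `factor` times the median rate across all routes.
# Route names, rates, and the factor are hypothetical.
import statistics

def flag_damage_routes(return_rates, factor=3.0):
    """return_rates: route -> damage/return rate. Returns flagged routes."""
    med = statistics.median(return_rates.values())
    return [route for route, rate in return_rates.items() if rate > factor * med]

rates = {"route_a": 0.02, "route_b": 0.03, "route_c": 0.02, "route_d": 0.15}
print(flag_damage_routes(rates))  # → ['route_d']
```

A median-based threshold is used here because a single bad route would inflate a mean-and-standard-deviation rule on such a small sample.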
So those are two of the key examples. And we have done some really good work in terms of automated driving with BMW as well. So there are several examples of how we have used AI to go into the mainstream and actually achieve real mass adoption.
Sri, I've got a question. And I know-- and I think I know the answer. But if you could pick out one component that got you further than others, what was that component? Was it the algorithm? Was it the compute power? Was it the setting the data management foundations? Or was it all of them?
I think it has to be the foundations and the compute. Those two-- the algorithms have stayed the same. To be honest, some of the original statistical models still work very beautifully, as long as you have the right compute and the ability to churn through a lot of that data.
So there have been advances even in the algorithms, but more or less, they have been fairly standard for a long time. It's the compute, and having that data management foundation, that will lead you to better results and better adoption. Thanks for that question.
Cool. So moving on to the next topic here, so the types of AI implementations-- so I'll just quickly go through all of these here. So as you can see here, there are multiple types from an AI model implementations. It could be custom. There's your AutoML, model API, and GenAI.
So what does this mean? Most models today are custom models, where a business is trying to solve a very unique problem. They want that specific data set, and then they train it and make sure it is tested. Then you deploy it based on the results, and you keep monitoring the results. Essentially, that's the bulk of everything out there.
The challenge of that type of implementation is: what if you have a small change in the use case? What if it's a similar thing, but for something else? That's where AutoML has come in, where now you have something more reusable, more effective at addressing multiple use cases.
It can act on multiple sets of data. It can accelerate the whole development cycle. So with AutoML, you have some sort of pre-built models, and now you're tuning them, getting there faster, and applying more use cases to them.
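The AutoML idea can be boiled down to a selection loop: try several pre-built model types on the same data, score each on a holdout set, and keep the winner. The two candidate "models" below are deliberately tiny stand-ins for what a real AutoML service would search over.

```python
# Toy "AutoML" selection loop: fit each candidate model type on the training
# data, score on a holdout set, and keep the best. The model zoo here (a mean
# predictor and a one-variable linear fit) is an illustrative stand-in.
def fit_mean(xs, ys):
    m = sum(ys) / len(ys)
    return lambda x: m

def fit_linear(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    a = my - b * mx
    return lambda x: a + b * x

def auto_select(candidates, train, holdout):
    """Fit every candidate on `train`, return the name with lowest holdout MSE."""
    (tx, ty), (hx, hy) = train, holdout
    def mse(model):
        return sum((model(x) - y) ** 2 for x, y in zip(hx, hy)) / len(hx)
    fitted = {name: fit(tx, ty) for name, fit in candidates.items()}
    return min(fitted, key=lambda name: mse(fitted[name]))

train = ([1, 2, 3, 4], [2, 4, 6, 8])   # y = 2x, so the linear model should win
holdout = ([5, 6], [10, 12])
best = auto_select({"mean": fit_mean, "linear": fit_linear}, train, holdout)
print(best)  # prints "linear"
```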
Model APIs are essentially your endpoints. Like I said, if you're using a model for a specific use case, you then consume it through that specific application integration, with the output going to either an action or an analysis. That's how you actually put it into action.
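The endpoint pattern can be sketched as a handler that takes a JSON payload and returns a JSON prediction. The model here is a hand-written placeholder score (not a trained artifact), and the field names are invented; a real deployment would sit behind a web framework or a managed serving platform.

```python
# Sketch of a model API endpoint: JSON in, JSON out, with basic validation.
# The "model", route, and field names are placeholders, not a real service.
import json

def predict(features):
    """Stand-in model: a hand-written linear score, not a trained artifact."""
    return 0.5 * features["transit_days"] + 0.1 * features["handling_events"]

def handle_request(body):
    """Endpoint handler: parse the payload, validate it, return a prediction."""
    payload = json.loads(body)
    missing = {"transit_days", "handling_events"} - payload.keys()
    if missing:
        return json.dumps({"error": f"missing fields: {sorted(missing)}"})
    return json.dumps({"damage_score": predict(payload)})

print(handle_request('{"transit_days": 4, "handling_events": 3}'))
```

Wrapping the model this way is what lets the downstream application act on the score automatically instead of the insight stopping at a report.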
Most models, sadly, end at, OK, I've gotten some insights; I know more details about it. But it's not put into action. Only-- if you remember what I said about running the business-- those models are the ones that always get put into action. That's why those are always important.
But those are probably only 20% of it. There's still the 80% of custom or AutoML models that aren't being productionized, because they either lack the accuracy or the ability to figure out, OK, how do I put that into production or actual usability [INAUDIBLE].
And then GenAI-- you must have heard of it. It's the latest thing, at least from a buzzword standpoint, though it's been around for a while. Now you're using AI for all kinds of task automation-- every single thing that no longer requires a hands-on approach.
So it's almost like very interactive, verbal inputs and very simple commands to get to a very high-quality, repeatable output-- using large language models, stable diffusion, chatbots and everything, where they're getting way better at taking a human input, either text or verbal, and translating it into really solid, useful output.
Yeah. So I have to admit, I am constantly conflating AI and GenAI. So apologies if I'm doing that even here. Thank you for clarifying that.
Yep, no problem. At the heart of it, they're very similar, except the applications have now proliferated with GenAI, right? For almost every task that's probably adding cost to your bottom line, you can think, hey, can I use GenAI to automate it and make life better? It's not necessarily to replace someone. But how do you make it incrementally faster so that you can do more and do different things?
I had a call-- well, we had a conversation earlier. And Kamal, he's a DBA, and he was talking about the repetitive nature of a DBA's work, the repetitive tasks, the mundane tasks that a DBA has to do. And that is a key area where AI can really, really help: you're not taking the function away, but you're certainly reducing the mundanity and the repetitive nature of that role.
Exactly. Yep, and the steps in GenAI will probably still require the same person to get to the end product, right? However, like I said, there are profoundly more use cases now, and the utility is a lot greater. But it also comes with high maintenance and the need to have 10 or 20 different models, whereas before I had only, let's say, five custom models-- one for marketing, one for retail, one for finance, and that's it. Now you have almost every process trying to leverage the power of GenAI.
So moving on to the next aspect here-- I'm not going to read through the whole slide. It's very standard in terms of how it happens: data preparation, model building, deployment, and embedding. And as you can see, effort is spent across almost all of these.
But predominantly, a lot is spent on data preparation, and a lack of data management capabilities is one of the big reasons. The actual data quality, the rawness of the data, and the contextualization all play a part. But all of that is solved when you actually have the glossary, the lineage, and all these other aspects readily there, right?
Now all you have to do is say, oh, OK, I know what this is. I know why it is this way. And I know it is 95% accurate. Now you can jump into the model-building process, versus spending a decent amount of time, as sadly happens today, preparing the data, cleaning the data, and making sure it's ready. So you can see the link between our topics: with the foundations built, it accelerates your timeline across the whole AI life cycle.
Moving on to some of the challenges we face in general -- I already touched on some of these: time to market, collaboration, expertise, repeatability, and scale. And the way you address all of that is through the grid you see here, starting from the strategy at the top -- we covered some of the process and operating model -- then the actual AI product building, and then your collaboration platform and the operations of it.
Some of those challenges are resolved through frameworks we have already built, which, as you can see in the next step, lay out a four-step process, with a fifth step being support. The first step is ideation. Any time you work on setting up an AI model or an AI initiative -- be it a center of excellence or trying to derive value out of your AI implementation -- this is the typical life cycle we follow. Ideation is usually pretty quick.
Then there's value identification with a specific use case, followed by defining the operating model and framework. That step applies when you don't have one yet, assuming this is your first initiative; once the framework and operating model are defined, you can almost say those three weeks are not needed. Then we do six weeks of advisory and MVP development. Eventually comes value realization, either through projects or a scaled rollout, and it ends with the support model.
So this is how we recommend doing things and packaging it all together. But different companies are in different phases of this, and each one can adapt: either run this end to end, or, if there are certain aspects they have trouble with, focus on that particular aspect.
Sri, I've got to say, this is not alien to me. I saw this in my business analysis days, when we were designing data warehouses, a long, long time ago. But can I ask a quick question? Or maybe I'll let you move ahead for a bit.
Cool. So this slide shows something very similar -- again, just to highlight the differences, or the similarities, with the GenAI life cycle. It's a quicker turnaround with GenAI because, like I said, the use cases are different, the proliferation is bigger, and the impact needs to come faster. Even though it's at a smaller scale, it sometimes has a huge impact, so you also need the turnaround to be quicker. So you have scoping, selecting the models, adapting and aligning, and then application integration, from a GenAI standpoint.
So, having been here for a while now, how can DXC help? Together with our partners, we have accelerators and capabilities built to bring it all together quickly, efficiently, and sustainably: across the collaboration workbench, be it the data prep or the model prep, and the data management foundations, from an experimentation standpoint.
There's also the aspect of the edge and application embedding, where a lot of the time you need the model to perform on a real-time basis -- especially, like I said, in the automated driving example, where you are literally processing the data and sending the output within milliseconds. To manage those models, there are accelerators and frameworks for that too, all across multiple cloud environments, plus the other pieces we talked about from a data management standpoint, be it the modeling, the data product definitions, or the workflows.
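The millisecond constraint just described is essentially a latency budget around each inference. Here is a minimal sketch of that idea, under stated assumptions: the 50 ms budget, the toy model, and the "fallback" behavior are all illustrative, not how any real automated-driving stack works.

```python
import time

# Sketch of an edge-serving latency budget: every inference must return
# within a millisecond-level deadline, or the embedding application falls
# back to a safe default. Budget and model are illustrative assumptions.

BUDGET_S = 0.050  # assumed 50 ms end-to-end budget

def toy_model(reading):
    """Stand-in for a deployed model: maps a sensor reading to an action."""
    return "brake" if reading > 0.8 else "cruise"

def infer_with_budget(reading, model=toy_model, budget_s=BUDGET_S):
    start = time.perf_counter()
    decision = model(reading)
    elapsed = time.perf_counter() - start
    if elapsed > budget_s:
        return "fallback", elapsed  # too slow: degrade safely
    return decision, elapsed

decision, elapsed = infer_with_budget(0.9)
print(decision)
```

In a real edge deployment the budget enforcement usually sits in the serving runtime rather than application code, but the contract is the same: a late answer is treated as no answer.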
So ultimately, what does this help with? Increased productivity, optimized costs, and, eventually, getting to market faster to realize the value of AI. With that, I would like to end the session. Thanks for the opportunity.
Sri, thank you so much. This is an absolutely fascinating subject. And what I have to say I took from this is that it's reachable. This isn't blue sky, five years away; it's very reachable. I'm super excited to hear how other organizations try this out, and I would love to hear more at another point in time.
Thank you so much; you've been very generous with your time. And I look forward to seeing you again.
Thank you.
[MUSIC PLAYING]