[AI & the Modern Data Stack] #184 Accelerating AI Workflows with Nuri Cankaya & LaTiffaney Santucci

**The Future of Artificial Intelligence: Insights and Innovations**

---

### The Evolution of AI Development

In recent years, artificial intelligence (AI) has emerged as one of the most transformative technologies of our time. The development journey of AI is no longer confined to large corporations or specialized teams. With advancements in both software and hardware, AI is becoming more accessible than ever before.

One key aspect of AI development is the growing importance of data science and machine learning. These fields require a unique skill set that combines technical expertise with analytical thinking. As an AI engineer, the ability to work with various frameworks like TensorFlow and PyTorch is essential. However, the role extends far beyond coding; it involves understanding how different models can be integrated to solve complex problems.

The democratization of AI is one of the most exciting trends in this field. With the introduction of AI-powered personal computers (AI PCs), individuals now have access to advanced AI capabilities right on their desks. This shift is making AI development more inclusive, allowing everyday people and small businesses to experiment with cutting-edge technologies.

---

### The Role of Data in AI

Data lies at the heart of any successful AI implementation. Unlike traditional programming, where clear instructions dictate outcomes, AI models learn from data. This means that the quality and quantity of data are critical factors in determining the performance of an AI system.

When working with AI, organizations must prioritize data privacy and security. Sensitive information, such as personally identifiable data (PII), requires careful handling to avoid breaches and comply with regulations like GDPR.不慎处理数据可能导致模型意外学习到内部机密信息,从而引发严重后果。

For example, if a company inadvertently trains an AI model on internal communications, it could potentially leak confidential information when prompted. This highlights the importance of proper data classification and access controls.

---

### The Intersection of AI and Ethics

As AI becomes more prevalent, ethical considerations are becoming increasingly important. Developers must ensure that their models are free from bias and operate transparently. Achieving this requires collaboration between data scientists, ethicists, and domain experts.

In the marketing industry, for instance, using AI to target customers raises questions about privacy and consent. Companies must balance the benefits of personalized advertising with the need to respect user preferences. This ethical approach not only builds trust but also aligns businesses with regulatory expectations.

The future of AI will likely involve stricter guidelines and certifications, similar to those seen in other industries like automotive or healthcare. Organizations that embrace ethical practices early will be better positioned for long-term success.

---

### The Future of AI: Democratization and AGI

One of the most promising developments in AI is the democratization of access. With tools like Intel's OpenVINO and the availability of AI PCs, individuals and small businesses can now experiment with advanced AI technologies without needing expensive infrastructure. This shift is opening up new possibilities for innovation across industries.

Looking ahead, the concept of artificial general intelligence (AGI) represents an exciting yet challenging frontier. AGI refers to systems that possess the ability to understand and execute any intellectual task that a human can perform. While this vision is still in its early stages, it has the potential to revolutionize how we live and work.

The journey toward AGI will require significant advancements in both hardware and software. Companies like Intel are already paving the way by developing solutions like Gaudi, which optimize AI workloads for large language models. These innovations are laying the groundwork for a future where AI is deeply integrated into our daily lives.

---

### Conclusion: The Road Ahead

The future of AI is both exciting and complex. As the technology continues to evolve, it will require collaboration between diverse stakeholders—ranging from developers to policymakers—to ensure that it benefits society as a whole.

For individuals interested in entering the field, now is an opportune time to start. With continuous learning and adaptability, professionals can navigate the rapidly changing landscape of AI. The demand for skilled AI engineers is growing, offering promising career opportunities for those willing to embrace this transformative field.

In conclusion, AI is not just a technological advancement—it's a fundamental shift in how we approach problem-solving. By embracing innovation while staying true to ethical principles, we can unlock the full potential of AI and create a future that is both smarter and more inclusive.

"WEBVTTKind: captionsLanguage: enin the next uh five to 10 years we will merge with AGI so like we will be a part of uh like the artificial general intelligence and we will all as Humanity we will benefit from that so if you look at from that perspective everything we interact today from an industry perspective will definitely and drastically change so we need engineers and the the the term Engineers is really looking at the phases of development maybe parsing them out and make them more efficient so in the industrial engineering age we did it perfect with manufacturing right so like the Henry Ford's all Model T and like production in the mass it it changed what we get today from a like a mass production and industrialization perspective similarly I think the AI Engineers will change the future of AI in terms of how do we interpret the data how do we collect the data in a secure price private and like all the uh consistent ways and generate the right level of output in a Val sequenced way which will really like accelerate the Humanity's growth uh in the journey hi Nori and La Tiffany great to have you on the show hi Rich than for having USC so I'd love to talk a little bit about uh adopting um AI in the Enterprise and since you both work in marketing let's start off with um how AI is changing marketing analytics it's definitely like um the year of AI maybe the decade of AI in the upcoming uh years we will feel AI is infusing our marketing execution deeply uh starting with the data analysis I think we see today uh we are heavily involved in getting some of the insights and predicting some of the trends uh before it happens so so and previously we weren't able to analyze the uh data models and come up with some of the learning so I believe understanding customer behavior and really getting into the next prediction level before they knew they want the next product using the marketing tactics to really help them hey like we know your minds are all like going this way and this is the best fit for your needs I think that's a really tactical way of using uh the marketing and AI together but line I don't know if you want to comment more on this yeah no definitely so um I I think that the data piece that we see with AI is really what's going to change how marketing is done with AI um you know you you're seeing things already like personalization like product recommendations those sorts of things it's only going to get better and that those future trends that we're going to be able to forecast from that information is also going to be at the Forefront so both great points there like Nori I really like the idea of being able to predict uh what customers want before the customers even know it and let yeah the idea that data is going to be the key to success with AI does seem incredibly important uh all right so I'd love these are things I'd love to get into in a bit more detail later but uh for now I think one of the big trends is companies trying to include AI in their existing products so maybe can you just give me some examples of how companies are trying to do this um Tiffany do you want to go first this time absolutely so yes that is absolutely true we see that 58% of CEOs from leading public companies are actively investing in AI this is what is on the top of everyone's minds and um I think when you look at what is most common right now it's really that customer service with the the chat Bots um and this is very popular right now with the rise of um all of the llms that we've been seeing lately um but also Beyond just the customer service with the chat bods we're also seeing a lot of vertical use cases so depending on which vertical you're in from you know Ai and Health Care looking at treatment plans Diagnostics things like that but also retail like we mentioned earlier right a lot of those personalization engines um and then finally um there's even a lot of vertical specific use cases with an automotive right we're seeing this with the self driving cars and um and and you know enhancing the safety features as well there that's brilliant and I think yeah maybe chatbot is the thing that comes to everyone's mind initially but I do like the idea that there are there really are sort of vertical specific use cases um Nori do you have any more examples of uh ways AI is being incorporated into products yeah absolutely I think let shared like the key ones especially I think generative AI will touch many aspects of all those uh industries that LE mentioned uh and we will see like more disruption in many of the other uh untouched industri so if you look at uh today I mean the jobs is always like a tough question that I get like and initially when I think of the AI I thought like hey this will start from the process automation some of the basic things but if you look at where the Gen Solutions are it started with the the white color jobs writing code like generating images generating videos every day when I log into social media I see a NE AI solution coming up so it's really unpredictable what the next major uh workflow and industry but in a nutshell I think all the industries in the next couple of years will be heavily impacted by Ai and as Intel we are trying to be ready with our solutions to make that transition happen absolutely I mean the the impact of AR is certainly very wide ranging I do think most uh Industries are going to be impacted um in terms of some of the Practical details can you talk me through what a typical workflow is for going from hey we should probably do something with AI to it actually becoming implemented absolutely yeah go nor yeah let me go first and maybe will cover but it all starts with the problem definition right so like today what we really want to achieve AI uh at the customer level is solve some of the problems or mitigate some of the issues uh I always approach this as an outcome based solution so uh all the customers really look for three outcomes they want to make more money they want to save more money or they want to protect their money so that's the the purpose of the business and I think if you look at the uh identification of the problem let's say they want to do a new uh way of innovation in their business lineup the first thing they have to do is data preparation so without the data AI will not work so it's really dependent on it and like you cannot just like dump the data I mean like there are models that you need to choose from if you look at the large language models and how they work today it's like really predicting the next word in the sequence and really making it like really effective H but in order to start with that you need to have the data so you need to agree on what are the training data that you will provide what's the data model you will train and you start so like the training is a big process and people think on the AI side I will train once and I'm done unfortunately it's not the case because AI will always transform the business it will predict the next thing and like the it will come even better accuracy every time you do the the modeling and of course the next level is inferencing so and we need to make sure sure that this is one of the workflows that is integrated into existing products so at that point you have the outcomes from like uh the the AI algorithms and make it more sense but the journey starts again so it's it never ends like you have to get all the outputs from your uh let's say inferencing you have to do all the training once again and then like you have to fine-tune the model so fine tuning is a big part of the the journey uh and like in some cases you have to be the moderator you so you see some of the user inputs and you have seen like uh on chat GPT and open AI when you ask specific questions like give me the predictions for the next big stock boom it will not answer that because that's the moderated learning so you say like hey I'm not I'm just a chatbot like it interferes the model data and gives a user input so the companies needs to put some U Gates uh before it's shown to the the users but that's kind of the the journey like from uh the data collection to training and inferencing and all this process repeating itself for a long time I love that there is a pretty well defined process now for adopting AI I want to highlight the thing you said first there that you really need to start with like a a business problem have a well- defined problem definition otherwise it's all going to be meaningless um so maybe we talk a bit about like things that can go wrong so I guess not a world defined problem is one of them umy can you talk about any uh of the pain points in terms of adopting AI yeah absolutely so one of the main pain points that we really see with companies is that you they have this excitement around Ai and they want to um include AI in their current workflows but they don't really know what AI is needed right and AI is such a broad term in terms of what they could actually do for that um so I think that is number one um secondly um if they even really need AI right so sometimes um there are certain uh solutions that we can come up with that might not include um you know it something that is in the AI Arena right so that's another thing is just figuring out what is the outcome that they really want before we back into what that solution is or and what it's going to look like um you know and and one of the things we always tell our um customers is that that's why it's so key to work with Partners like Intel um to really understand that customer outcome that they're looking for and then we can work with them along the way to find out those business requirements find out what Hardware or software is needed um to really bring that um solution to life um but having a partner um is is definitely key in this journey it's new for everyone and um we want to be um you know at your side every step of the way for sure excellent yeah um so talking about the uh the software and Hardware requirements I'd love to get into that in in more depth so um I remember SAA Nadella the CEO of Microsoft recently saying that U Microsoft having to just change the whole infrastructure of AIA just to cope with all these new AI workload so the cost is are pretty ey watering um and large language models in particular sort of notoriously expensive to train and to run so do you see this trend continuing or do you think that they're going to get cheaper what's going to happen um Nori yeah uh I think we are in the like transition phase for AI everywhere and that's why maybe you heard about Intel is trying to bring AI everywhere I think the Azure example that you give is like a good example so I worked at company for 17 years before joining Intel and I have seen the early days of AI like again my team worked heavily on uh getting the collaboration with open AI team and in the early days like I think the applications made it uh more tangible uh but in the early days we had the models on the back end but when you don't have a client to interact with like the cat GPT and that's a brilliant idea because it uses natural language and any language actually to really get a response request from the AI it made it obvious that this is going to impact everyone and with that we are seeing that from AWS last week the reinvent event and also like with our OEM Partners like d Lenovo HP everyone is trying to be ahead of the game ER including uh like the market dynamics are changing in ER both on on premise world where there is still Data Centers of customers they have to reink like hey am I going to be in the AI game and my Hardware is really compatible with the needs of the AI uh because again I will remind that like from data preparation to training to deployment it's like a circle and you need AI compute so and I I almost GNA say like the new compute is the AI compute like everybody needs it so in the early days of Cloud solution providers like we were always thinking about virtual machines so they will just like run your on Virtual machines on your data center on the cloud in a cost effective way I think there's a value to have a hardware Innovation back to your question I think this will be just accelerated in the upcoming months and I think we will also maybe share some of the news but as an example Intel acquired a company called Habana Labs a couple of years back and our hero solution is Intel Gaudi and Gaudi is an AI accelerator so it's a specific purpose built accelerator for AI H in the early days of I think AI people were using gpus because like Graphics processing is really fast and you need that fast for all the training inferencing and like deployment Solutions and all the training but now like you need to focus on AI acceleration so simple answer to your question we will just see a huge wave of innovation in the upcoming months even like not years months on this journey and there will be new hardware from Intel and also from many of the ecosystem players so this is really interesting because um my sort of uh mental model for this was that a lot of people are using gpus at the moment to train these models and know because Intel is sort of maybe perhaps more famously known for your your CPUs you're saying um that so you also have a product called G which is chips specifically for AI training so can you talk me through is that like a is that an alternative to a GPU what what's what's going on there as I said it's an AI accelerator so I think uh the Gaudi is specifically designed for AI in in the mind of uh the early days of deep learning now we are at the Gaudi to the second version and like we already introduced in 2024 uh we will in this year so we will have gaud tree in the market so again it's like not years but it's months almost we introduce me products with the pace of uh the AI and the reason we speakly have an AI accelerator is the models require not only the GPU again there is like specific tasks that GPU can be really uh a good uh let's say resolution for any AI problem but moving forward we we will see more AI specific Hardware uh not only the graphics acceleration but like acceleration in the neural processing acceleration in the the training models acceleration in the inferencing models like everything will be a part of the journey so when you have a dedicated AI accelerator which covers all those Solutions uh you will get like more uh faster results uh and it's going to be way cheaper uh because one of the things uh it's hard to find gpus in the market nowadays the reason why is again like everybody wants to do something on the AI uh and they require all the models uh to to use a GPU but in the long run we believe this is not going to be the case uh before handing over to let like one of the we also have a GPU product by the way so and it has specific purposes so if you look at high performance Computing for example which is like there for a long time but predicting the next weather by seconds like not days but like it's possible today with uh Solutions like Intel GPU Max so it's a special product we recently introduced some of it at the uh super competing event at Danver couple of months back and you can predict what's going to happen in the like space exploration like in the cancer treatment so you can really do high performance competing for AI uh using the gpus because that's where it's really successful uh but also handing over to li like it's also possible to do it even on the PC today uh so we call it the AI PC but Le do you want to add anything on that one so as we mentioned earlier right that um a lot of companies um will have different AI needs and based on these needs is where um they'll be able to choose which Hardware is best for them so whether that's an accelerator or GPU or CPU um it's really going to be based on what their needs are so um that's why we are excited here at Intel to bring AI everywhere and to really have options for our customers uh because again not everyone is going to need um you know that AI power um uh horsepower that you know we sometimes talk about right um it's really going to be based on that and um in in terms of making AI more accessible um we're so excited that we now have our core Ultra processors that's coming to um the AI PC um that we've recently launched and within that customers are going to be able to run those models on their PC and they're going to be able to utilize all of um the goodness that we've put into these new gpus um so they can start uh going Beyond just uh the traditional gpus for their AIS okay this is interesting um can you maybe talk me through like when you would want to do um when you would want to use each of these different Hardware Solutions like when would you want to use an AI accelerator when would you want to use a GPU when would you want to use CPU like what are the different uh use cases for each yeah I'm I can start on that um so um for example um where you want an accelerator such as gouty is where you're doing some of that deep learning and training um of the models up front right so again something where you really need that horsepower um then as you kind of move down the line and um you want to utilize say one of our general purpose gpus um that is something that you can um have one of our gpus um would be sufficient there and then um finally so going back to the AI PC um use case that is a great way um to do again general purpose AI however um this is going to be really useful when you have sensitive data for example so um if you want to utilize uh your PC so that that data does not leave um your local device and doesn't go up to the cloud now we see this a lot of times with um specific um organizations or governments that um for security reasons they don't want things to go up to the cloud um or sometimes if you're offline right and you don't have access to the internet having the having um the your CPUs do a lot of this work is going to U be really helpful as well and just to add on top of that I think we haven't covered the EDI specifically until now but thanks a lot L mentioning that uh Richie I think that's a very important part of the AI Journey because we always think like AI will be on the clouds AI will be on the gpus but it's not true like most of the cases today like it's impacting our life it's actually happening on the edge uh so if you think about all the like s driving cars like you cannot just ask like shall I pass this lane or not I mean you might just cry like there is no tolerance for a latency so you need that at the edge at the like in on the car running AI in stly and like you have to learn all the way so if you look at uh like the industry leaders on S driving cars they make the models like fine-tuned for Edge scenarios and similarly it's impacting many other Industries like manufacturing is another example but La just mentioned security is a big concern you don't want to train your models if you are let's say government or imagine like Ministry of defense like I mean you are really protecting your p uh and you need to be secure so like you need to add that layer on top so where we call it the hybrid AI so we will see more of those scenarios where there's some AI at the edge some AI on the client side but also like whenever needed you go to U the data center or the cloud to get that enablement so it really brings us back to Bringing AI everywhere because there's no one siiz fits so in the AI so there needs to be Solutions at the edge at the client at the data center at the cloud and I think Intel is really positioned well in terms of covering all those four components of air okay so that's actually kind of interesting because I think maybe one of the big push backs in terms of um saying well okay I need all this Hardware in order to run AI or actually a lot of times you can just call an API and have Hardware be someone else's problem so those are really interesting examples the idea that you need if you're doing something in your self-driving car you can't called the cloud or if you're a government agent then maybe you want to build things yourself do you have any more examples of like the trade-off between working with like other people's Cloud models or um and building something there are two let's say models today in the market so if you look at the large language models there is like open AI uh likes where all the training and everything is done by the wendor so you just use the output of that and you build your Solution on top of that uh it's kind of the closed AI solution you don't have any like influence on the parameters and everything that is used by that model or if you go to hugging phase and download a let's say a llama model which is like completely open source then you can see what's this trained for and then you can add your own training components and you can customize it so there's of like pros and cons on both models one is like if you want to be fast in the market and get all the API support from open AI stability AI scale Ai and like many others then you can use that route to be fast in the market but if you need any I mean it boils down to developers right they have to choose what's right for the company and what's right for uh the solution uh it might be really easy to build a solution I'm just making up but like if you're looking at a healthare solution if you have a llm that wants to focus on cancer uh like let's say early detection on specific uh industrial use cases then there might be a let's say three billion parameter large language model uh open source uh which might be really helpful for you so you don't have to train the model and like pay lot of compute power for AI but you really I call it the Nimble AI so there will be solutions that is really nimble like you don't need a GPT 4.5 or like moving forward like trillions of parameters but like you can shrink the size of parameters to your domain and it will be more accurate and again it will be open source so you can feed everything uh within your uh ecosystem so again depending on the sensitive data security those are the concerns I think we will be discussing moving forward that's actually interesting this idea of nimble ey that you don't necessarily need the best all powerful Cutting Edge model you just need something that's good enough for your purposes but maybe is a bit cheaper to run um leany do you want to expand on that like when might you sort of care about having a smaller like less powerful AI rather than having The Cutting Edge yeah absolutely so it's really going to depend on um the organization and the needs and it's also going to um depend on like you know again like what does that company want to do and what sort of outcome they want to um come up with when you think about AI everyone just gets so creative and um the ideas are endless of what they what they can do with this right and so um I looking at what is capable of that company as well something that we're going to need to um give some attention to um the other thing we always like to talk about as well is the the data the quantity of the data um that a company might have as well as the quality of that data um so as we spoke about earlier AI is is all about um the data piece right so we don't have enough of it or if the quality isn't there then we might find that certain companies aren't ready to do that right everyone's AI journey is going to look different and um every company is going to be at a different stage um and that's okay right um but something that we um definitely want to think about as we're looking for the right solution okay and adding on this idea of solutions um we've talked a bit about Hardware I'd love to talk about the software as well so uh can you me about like what of the most common software Stacks people use for working with AI the stack again is going to really vary um based on um what their needs are um but for AI um overall commonly what you're going to have within that stack is kind of starting off with whatever the programming language is um and then um choosing which libraries and Frameworks are going to be best for them when it comes to the machine learning portion um as spoke about earlier um and then finally you know you want to process that data and then kind of start to choose which visual visualization tools um might be right for you um it might be um different U analysis tools um and and and more right um but again it's really going to be based on like what their needs are that stat can look so different um it's just really going to be based on what the solution is excellent yeah uh so all those components seem to make sense um norri can you fill in any of the details on like what people might go for in terms of like sort of programming languages Frameworks database tools all that sort of stuff yeah I think there's like a couple of Frameworks already in the market which are again like I will call tensor flow is one of them pytorch is a big framework and these are really like uh big foundations like as a example Intel is now a part of the pytor foundation because we completely support open source and we want the developers to get the best out of uh the solutions uh in that and specifically maybe unpacking where Intel sits on the software because when I joined Intel it was like is Intel in the software business like I didn't know that so and I really did my let's say deep dive into the topic and I was really surprised like I mean especially on AI I want to highlight two solutions that Intel provides which is really important for all the developers who are listening this podcast one is open and it's a runtime choice for the developers and really helps the um the developers to deploy the AI Solutions everywhere uh it's heavily initiated as an edge deployment solution it's expanded to client and of course with our data center capabilities open window provides really a wide variety of AI deployment Solutions as like uh core foundation and the big thing is the one API uh one API is again open source standard based software that simplifies the programming across diverse architectures so the developers can have the flexibility uh you might heard about Cuda for example so uh Nvidia at 2008 come up with the idea of Cuda and brought the the developers but that's a close Source platform so like I mean if you're a Cuda developer you have to develop on NV media uh chips set so what we believe is one API it's open source so you should be able to develop wherever you want you can deploy uh whatever you want so that's giving the flexibility and power uh to the developers and Architects uh and maybe the final thing uh which I was a part of the launch process actually Intel developer Cloud so that's a big step forward so if you think about the Innovation Journey that we have at Intel it is uh it's really exciting like almost every six months you have a new product coming into Market uh just mapping the pace of the AI Innovation and for example Intel Gaudi 2 in the near future Intel Gaudi 3 so whenever that products launch it will be first available on Intel developer cloud and it's a cloud uh infrastructure provided by Intel to developers you people can just go to Intel developer Cloud create an account and immediately try those things without really need to purchase a million dollar of uh Hardware it's like really easy to get access you can use the platform develop your Solutions and this applies like the wide variety of solutions including our xon Hardware GPU Max GPU Flex so Intel has like wi Solutions on the the AI and I'm just connecting the do but you can use open one API and open Mino Solutions on top of Intel developer Cloud which makes it easy for developers to try before they commit to the platform and they want to like run uh so it's just easy Port because they have done all the testing development on Intel developer Cloud uh we are not really planning to uh steal any let's say uh customers from our vendors we are trying to incubate them in early stage and when they are ready like G is available on AWS as an example they can go to AWS uh they can go to Zeon for fifth generation uh for gcp so again like I mean those are all the customers that are really uh uh have a journey to end their AI continum so it sounds like there's a lot of software in a lot of different areas so that's changing so everything from like the lowlevel sort of data center infrastructure stuff up to sort of tools for people who are developing with AI um I'm going have to make you both pick your favorite so go on uh what do you think is going to have the biggest impact um La Tiffany do you want to go first uh which bit of software do you think is going to have the most impact in the next year or so um yeah I think um it's really going to be open V now um for me you know as we mentioned this is really going to be the client runtime of choice and it's just going to open up what's possible um for developers in terms of um where they're coding and where they're deploying I think you're really going to see uh just an explosion of different use cases and um inst solutions that come from this okay so easy deployment that sounds like a very useful uh thing to have uh Nori do you have a favorite bit of software you think it's gonna have high impact I will also plus one uh let F on this one open know especially again we recently launched the AI PC which is again bringing AI to computers and this is a drastic change for the industry like AI was like somewhere in the cloud like I mean it's like the Greek mythical Gods like I mean you need to have like gpus now you have a PC and you can run as an example stable diffusion on your PC like you don't need any like expensive Hardware or like commitment for a cloud provider for a long term so this is going to change and in order to deploy that solution open Mino is the only solution in the market right now which is really exciting for 2024 I think this is gonna be a big innovation for Intel and also like great help to Developers for deployment okay after look out for that then um I'd like to take a little Sid step now into uh talking about privacy since this is um a big deal for a lot of Enterprises so um can you talk me through how privacy requirements affect the use of AI what the organizations need to consider um yeah Nori do you want to go first this time yeah I think it starts with really classifying the data first like uh uh privacy and like again security is like a foundation for a company data uh process uh and uh I always when I met with customers I asked like who's your data Chief data officer like if you don't have one there's a problem then like you don't know where your data is like what's the classification that has been done and in AI the problem is um llms will never forget like if you train accidentally an LL which is public facing it will be a part of the training data for the upcoming training model and it will leave there forever and we have seen some customers unfortunately again like uh I will not name the companies but they accidentally train all the information on their private data and now you can just like prompted like hey give me all the uh C in backwards of this company uh writing a code like them and then boom boom boom so it it gives you all the back so like it's really important to uh put that guard rails in front of your data making the right classification and who access to the data uh like preparation because that's a very important step so today at like at marketing at least like the pii it's really important for us to keep like customers name phone and everything uh private like we don't really uh do anything with that data and really few people can access and if it's an opin but what we see is like the companies are throwing all their internal data to the training of a large language model or fine-tuning one of the the things and it's a little bit risky so you have to make sure that this is going to be understood by the the machine learning and it will be a part of the deep deep learning algorithms which can be like by simple prompting it might be unveiling your secrets uh for trade H or any other company confidential information so that's why I think secure AI is an important step that before jumping into this AI Journey you have to know what's my security Journey what's my privacy what's my uh really like models and approach to this uh like AI uh implementation and then move on so if you drive a car today or you produce a car you get to like all this uh certifications all this assessments there are government bodies that are approving those but if you look at AI deployment today companies are deploying like back and forth everywhere so and I think getting that ethical usage of AI data sources getting the privacy and making sure AI is secure uh is an important thing I think it will impact the next couple of years for all of us yeah definitely so I like that analogy that if you're sort of building anything else that you know you put out a phone or car or something that there's a ton of regulatory hurdles you have to go through but there's not that much for AI um Lely you look like you had something you wanted to add there so do you have any more advice on like how to think about when do you need to get about um privacy in the context of AI yeah absolutely so Intel has a long history of always following gdpr rate so um it's going to be the same with AI for us and um you're going to see that integrated within our products and um within um everything that we do when it comes to AI so um I I'm very confident in terms of where Intel's going to be going with the Privacy requirements for AI um and and you've even probably seen this in the news um Intel um collaborated with bware for a new Venture called private AI um where um we worked with them to build um a a program that gives um privacy from Edge to cloud and this is just the beginning right um we're going to see um other initiatives like this happen um we're uh we're working with um all different types of companies to ensure that that security um comes along with the per performance okay um and so you talked about from Edge to Cloud there is there going to be a difference then in terms of the Privacy requirements if you're not doing something in the cloud and you want your your competition to happen like where your data is um yeah so um we think it's going to be both right so um you can have that security that you need and want in the cloud um but there's also going to be um the there's also going to be additional um Securities that you can have by keeping some of your data local um so this is a um a great example of what we see of being hybrid AI um so we talked about this a little bit earlier right where you see there is a um a seamless transition um from your data that you have on Prem um and then also utilizing uh the cloud when it needs to when you need to get additional data that might not be private right um so this is is going to really enable organizations to use their AIP PCS and um or their Edge um infrastructure for some of that sensitive data and then pull from the the cloud um additional data um that might not be um s sensitive I'd like to talk a little bit about processes to touch on it briefly and it seems like um most of the processes you want um for AI development are going to be similar to standard software development are there anything is that a different here like is there what's peculiar to developing stuff for AI I think the key differentiation is the data piece so like we have seen the data science jobs are like incrementally getting popular in the market because AI is built on top of the data without the input there's no output and you need to write data scientist skills to develop that machine learning model the data science correlation uh we we have seen this machine learning machine uh learning engineering roles and again I'm a computer science graduate but like I believe there will be artificial intelligence engineers in the near future so it will be a really a major study area because this is really becoming like on top of that data science layer you develop everything uh like from a like a modeling base I just covered the tensor flow pytorch those foundations but on top of that also there is like uh the development needs to Think Through uh the synthetic data creation like where uh you need to feed the engine all the time with the latest and potential upcoming learning H and we covered the ethics and compliance but that's also like an important piece of the development Journey you cannot just I mean historically I'm a developer like I develop uh websites with asp.net at a point of time and it was easy you just use NET Framework and then like you use a like a client which is a web browser it's easy everything is on the server side you don't worry like AI is multifaceted so like I mean you have the models in a private environment you have to get the right uh let's say training on top of that uh like you have to deploy and we discussed the edge like you need to do Edge inferencing in some of the cases and like it's a life cycle of Journey if you miss the ethics and compliance piece like you might uh it's not just a server s side thing anymore the data is in every location so uh those are I think the additional uh level of complexity for developers to think through and final thoughts is like writing a programming language was like a big task so but now it's not like I mean if you're building some blocks you can just ask any GPT like hey please write me a python code to enable this so you are more uh enabled from like the time consumption perspective so you have to spend more time as a developer on the the right architecture the right model work more collaboratively with the data scientist uh to brief down what you want and to work with the compliance team on like what's going to be the outcome so you don't really Buri into the line of codes and like test and deploy the solutions so I think AI is gonna give a good uh breeding area to interact with the rest of the organization in a better way for developers absolutely and so you mentioned this the idea of this AI engineer being a sort of fairly new career and this is like a ton of things they have to do can you break it down like what sort of skills you need to get to being that AI engineer yeah let me go and then let's fin will maybe drive more depth conversation but it's all about analytical thinking so I think uh with AI with artificial intelligence we are trying to mitigate the intelligence that we have in the humans right so we want to repeat some of the the process underlying and then really surpass some of the things that with our human intelligence what we achieved so as an AI engineer I think um this is a first of all a really uh interesting area to work on because like we don't know what's going to be the next big thing right so and maybe we will cover that but like we all heading to this artificial general intelligence and in my opinion in the next uh five to 10 years we will merge with AGI so like we will be a part of uh like the artificial general intelligence and we will all as Humanity we will benefit from that so if you look at from that perspective everything we interact today from an industry perspective will definitely and drastically change so we need engineers and the the the term Engineers really looking at the phases of development and maybe parsing them out and make them more efficient so in the industrial engineering age we did it perfect at manufacturing right so like the Henry Ford's all modality and like production in the mass it it changed what we get today from a like a mass production and industrialization perspective similarly I think the AI Engineers will change the future of AI in terms of how the we interpret the data how do we collect the data in a secure private and like all the uh consistent ways and generate the right level of output in a Val sequenced way which will really like accelerate the Humanity's growth uh in the journey okay and uh Tiffany uh like so if uh people who are listening who want a career as an AI engineer or for managers who are wanting to hire an AI engineer like what do you think are the the top skills you need here yeah um so I think that um if someone is wanting to really um develop their skills in AI I think um one is going to be um you know having that Proficiency in machine learning and Ai and all these other disciplines that are under this AI umbrella that we've talked about today um that's number one and secondly is really going to be um having a good understanding of how all of these different models can kind of like work together right um as ner mentioned what we're going to see with these engineers in the future is they're going to be putting together um all these different aspects into one solution so um it's not going to be a one-size fits-all so really having that flexibility and honestly uh creativity um to come up with these Solutions um based on whatever challenges that they they are faced with um I think that's really going to be essential um and then and um you know also I would say that just uh keeping a um a an attitude of uh continuous learning um because everything in this AI space is changing so rapidly um from you know the new developments happening to uh the new uh software and Hardware coming out you have to just keep learning to stay up to date on this um so I think if someone wants to get in the AI field uh those are some of the the key areas uh they should focus on absolutely I do like that idea of continuous learning because you're right it just changing so fast I keep teaching things and then a few months later it's it's all out date again so I suppose related to that do you think there are any skills that going to be made obsolete by AI so um I don't think there's going to be um skills that are made obsolete I really think what's going to happen is we're going to see this transformation happen um with those skills so AI is really just going to unlock um additional time and resources for us so it's going to take away some of those automated routine and repetitive tasks that you were doing and give you more time to uh build additional skills or um you know use your um yeah use your skill set to uh learn new things um and I think that's really what we're going to see is more of a transformation rather than one skill or the other becoming obsolete absolutely um so I want to talk a little bit about like what you're most excited about for 2024 but before that I'd like to take a little step back nor I know you were involved uh when chat GPT was first coming out and you were involved in in the marketing of that uh when you your from your time at Microsoft can you just talk me through a little bit about what happened there and what you were aiming to do in terms of uh bringing AI to the masses then absolutely and I think it's a good story to share with everyone because like this is a uh like this was a journey where uh like Microsoft partner with open Ai and then initially in some of the components I will give the specific GitHub co-pilot as an example uh so when we run the first phase with GitHub co-pilot we have seen like tremendous results on like hey this thing can write code like sometimes better than many coders and uh we observed what's going on like for a while and then I think like before the chat GPT it was easy to H generate some of the uh the coding dolly was another example so like when we interacted with Del's first version wow I mean this is really creating some images uh which are really creative in a way that uh that surpasses like some of the uh human Ingenuity and really goes Beyond some of the the thinking uh but almost in 18 months uh the number of code lines written on GitHub uh by AI became 51% so the human generated content was 49 so in 18 months so this is like a huge change all the history of software engineering programming and everything but and again like I see it's going to be maybe 90 to 10 moving forward uh and back to your question to lean I'm not really afraid of like hey the software Engineers will lose their job so I think on some of the repetitive tasks well it's a also a company time and efficiency problem because you ask those developers to write a code which can be written by AI because they try to solve the same problem over again and again again like the the basic foundation of the the models are the same so and I think this will affect in many Industries so uh when uh I was at Microsoft so we used open AI for our internal marketing team uh there's a term called mpf the messaging and positioning framework we said like imagine you are launching a new product so what will be the mpf boom boom boom boom like it's and you can just name you can like upload some documents to train it even further and it was like so good like I mean we said like hey we will save most of the time like writing documents or creating some messaging Frameworks versus really marketing them so and it accelerated our journey and again November 30th 2022 so when we launched open AI uh on check GPT to the world running on Azure that was a aha moment for everyone uh to interact with that using their own language so you can start English turn to Spanish whatever like because technically the large language model understands the way and it responds the way and getting that uh I think the visual components like the the uh video images everything is happening so fast if you look at uh the last one year uh it really drastically made AI a number one topic for every leader in the world so if they want to be in the game in the next five to 10 years they have to spend some time on AI so the learning that I had was like it was taking for my team long time to explain like AI is going to change AI is going to change the world but when people see it in action boom like it was an instant pass yeah we have to be in it otherwise this will going to be disrupt our industry so you have to do it before your competitor does uh so I think that that's a good example and I'm trying to apply that uh at Intel uh so like we are trying to be head of the game like we covered until now lot of products but all those products I will go back to the first question that you asked what are we trying to solve we are trying to help the customers on the outcomes so either it's accelerating Innovation it's securing the the the data we covered it a lot and also the TCO like if you look at some of the GPU prices in the last one year like everybody added margins into the game and like a customer is not benefiting from from that they're not saving dollars so as Intel we want to provide like and bring AI everywhere so that's that's a big change so again with the core Ultra processors we made it to the PC we are making it with Zeon to Data Centers but also I will reh highlight Intel gudi is a big innovation it's really bringing AI for large language models you asked Rich question like where do we position gudi it's whenever there's a large language model work that's the area for an AI accelerator and then Intel gudi will definitely help it and we already seen some of the great examples of uh that solution in the market so if you have kids for example Roblox is running Intel GOI on the back end if you are watching Netflix it's running in t on the back end so all these recommendation systems and everything that is in our lives are already filled by Intel technology today um yeah uh it's amazing how much uh is sort of pervasive throughout everyone's Liv and I I do like the point you mentioned when you're kind of going to promote CH gbt was like well you know until people saw it it wasn't obvious that this is going to change the world uh but once you've seen it like oh yeah yeah I get it now brilliant um okay so uh before we wrap up can you tell me what you're most excited about for 2024 and AI uh the D me yeah so for me I am most excited about seeing this uh democratization of AI um really come to fruition and we're really seeing that through the AI PC I mean this is truly putting AI into the hands of anyone who wants to to have it right so I'm so excited about that and uh this just underscores Intel's mission of bringing AI everywhere as well um you know it's now it's accessible not only just if you know not only just the big companies but Everyday People who want to um start using these models and so just so excited about um the opportunities that's going to create as well as seeing what people make all right nice I do like the the fact it's coming to everyone yeah the the democratization of a brilliant uh norri what are you most excited about I think I gave some Clues uh before but I'm excited about AGI and I think artificial general intelligence is not that far and I'm looking forward to like think through as Intel how do we collaborate into this like massive change that Humanity will go through uh because I believe like we will merge with AGI uh for example if you leave your phone at home today you are disadvantaged right so you cannot get a taxi ride you cannot get food so you can still live but you are like not advantaged similarly I think in the next five to six years when the AGI comes like people will use AGI and merge with AGI so you will really utilize the power of AI in your daily life embedded uh and like I'm excited about like that discussion because this requires all the prev that we discuss like how do we make it secure how do we uh don't create like digital divide between different countries who own the AI how do we bring AI everywhere to from devices to software how do we make it really easy to deploy easy to train uh so again I think I'm so excited about Intel's portfolio of solutions to enable AGI in the long term excellent yeah um look definitely looking forward to um yeah seeing some uh super powerful well maybe maybe artificial general intelligence that that's G to be an exciting thing all right uh on that note uh I think we'll wrap up uh so thank you both for joining me thank you Tiffany thank you Nuri it's been great to have you on the show thanks a lot for having usin the next uh five to 10 years we will merge with AGI so like we will be a part of uh like the artificial general intelligence and we will all as Humanity we will benefit from that so if you look at from that perspective everything we interact today from an industry perspective will definitely and drastically change so we need engineers and the the the term Engineers is really looking at the phases of development maybe parsing them out and make them more efficient so in the industrial engineering age we did it perfect with manufacturing right so like the Henry Ford's all Model T and like production in the mass it it changed what we get today from a like a mass production and industrialization perspective similarly I think the AI Engineers will change the future of AI in terms of how do we interpret the data how do we collect the data in a secure price private and like all the uh consistent ways and generate the right level of output in a Val sequenced way which will really like accelerate the Humanity's growth uh in the journey hi Nori and La Tiffany great to have you on the show hi Rich than for having USC so I'd love to talk a little bit about uh adopting um AI in the Enterprise and since you both work in marketing let's start off with um how AI is changing marketing analytics it's definitely like um the year of AI maybe the decade of AI in the upcoming uh years we will feel AI is infusing our marketing execution deeply uh starting with the data analysis I think we see today uh we are heavily involved in getting some of the insights and predicting some of the trends uh before it happens so so and previously we weren't able to analyze the uh data models and come up with some of the learning so I believe understanding customer behavior and really getting into the next prediction level before they knew they want the next product using the marketing tactics to really help them hey like we know your minds are all like going this way and this is the best fit for your needs I think that's a really tactical way of using uh the marketing and AI together but line I don't know if you want to comment more on this yeah no definitely so um I I think that the data piece that we see with AI is really what's going to change how marketing is done with AI um you know you you're seeing things already like personalization like product recommendations those sorts of things it's only going to get better and that those future trends that we're going to be able to forecast from that information is also going to be at the Forefront so both great points there like Nori I really like the idea of being able to predict uh what customers want before the customers even know it and let yeah the idea that data is going to be the key to success with AI does seem incredibly important uh all right so I'd love these are things I'd love to get into in a bit more detail later but uh for now I think one of the big trends is companies trying to include AI in their existing products so maybe can you just give me some examples of how companies are trying to do this um Tiffany do you want to go first this time absolutely so yes that is absolutely true we see that 58% of CEOs from leading public companies are actively investing in AI this is what is on the top of everyone's minds and um I think when you look at what is most common right now it's really that customer service with the the chat Bots um and this is very popular right now with the rise of um all of the llms that we've been seeing lately um but also Beyond just the customer service with the chat bods we're also seeing a lot of vertical use cases so depending on which vertical you're in from you know Ai and Health Care looking at treatment plans Diagnostics things like that but also retail like we mentioned earlier right a lot of those personalization engines um and then finally um there's even a lot of vertical specific use cases with an automotive right we're seeing this with the self driving cars and um and and you know enhancing the safety features as well there that's brilliant and I think yeah maybe chatbot is the thing that comes to everyone's mind initially but I do like the idea that there are there really are sort of vertical specific use cases um Nori do you have any more examples of uh ways AI is being incorporated into products yeah absolutely I think let shared like the key ones especially I think generative AI will touch many aspects of all those uh industries that LE mentioned uh and we will see like more disruption in many of the other uh untouched industri so if you look at uh today I mean the jobs is always like a tough question that I get like and initially when I think of the AI I thought like hey this will start from the process automation some of the basic things but if you look at where the Gen Solutions are it started with the the white color jobs writing code like generating images generating videos every day when I log into social media I see a NE AI solution coming up so it's really unpredictable what the next major uh workflow and industry but in a nutshell I think all the industries in the next couple of years will be heavily impacted by Ai and as Intel we are trying to be ready with our solutions to make that transition happen absolutely I mean the the impact of AR is certainly very wide ranging I do think most uh Industries are going to be impacted um in terms of some of the Practical details can you talk me through what a typical workflow is for going from hey we should probably do something with AI to it actually becoming implemented absolutely yeah go nor yeah let me go first and maybe will cover but it all starts with the problem definition right so like today what we really want to achieve AI uh at the customer level is solve some of the problems or mitigate some of the issues uh I always approach this as an outcome based solution so uh all the customers really look for three outcomes they want to make more money they want to save more money or they want to protect their money so that's the the purpose of the business and I think if you look at the uh identification of the problem let's say they want to do a new uh way of innovation in their business lineup the first thing they have to do is data preparation so without the data AI will not work so it's really dependent on it and like you cannot just like dump the data I mean like there are models that you need to choose from if you look at the large language models and how they work today it's like really predicting the next word in the sequence and really making it like really effective H but in order to start with that you need to have the data so you need to agree on what are the training data that you will provide what's the data model you will train and you start so like the training is a big process and people think on the AI side I will train once and I'm done unfortunately it's not the case because AI will always transform the business it will predict the next thing and like the it will come even better accuracy every time you do the the modeling and of course the next level is inferencing so and we need to make sure sure that this is one of the workflows that is integrated into existing products so at that point you have the outcomes from like uh the the AI algorithms and make it more sense but the journey starts again so it's it never ends like you have to get all the outputs from your uh let's say inferencing you have to do all the training once again and then like you have to fine-tune the model so fine tuning is a big part of the the journey uh and like in some cases you have to be the moderator you so you see some of the user inputs and you have seen like uh on chat GPT and open AI when you ask specific questions like give me the predictions for the next big stock boom it will not answer that because that's the moderated learning so you say like hey I'm not I'm just a chatbot like it interferes the model data and gives a user input so the companies needs to put some U Gates uh before it's shown to the the users but that's kind of the the journey like from uh the data collection to training and inferencing and all this process repeating itself for a long time I love that there is a pretty well defined process now for adopting AI I want to highlight the thing you said first there that you really need to start with like a a business problem have a well- defined problem definition otherwise it's all going to be meaningless um so maybe we talk a bit about like things that can go wrong so I guess not a world defined problem is one of them umy can you talk about any uh of the pain points in terms of adopting AI yeah absolutely so one of the main pain points that we really see with companies is that you they have this excitement around Ai and they want to um include AI in their current workflows but they don't really know what AI is needed right and AI is such a broad term in terms of what they could actually do for that um so I think that is number one um secondly um if they even really need AI right so sometimes um there are certain uh solutions that we can come up with that might not include um you know it something that is in the AI Arena right so that's another thing is just figuring out what is the outcome that they really want before we back into what that solution is or and what it's going to look like um you know and and one of the things we always tell our um customers is that that's why it's so key to work with Partners like Intel um to really understand that customer outcome that they're looking for and then we can work with them along the way to find out those business requirements find out what Hardware or software is needed um to really bring that um solution to life um but having a partner um is is definitely key in this journey it's new for everyone and um we want to be um you know at your side every step of the way for sure excellent yeah um so talking about the uh the software and Hardware requirements I'd love to get into that in in more depth so um I remember SAA Nadella the CEO of Microsoft recently saying that U Microsoft having to just change the whole infrastructure of AIA just to cope with all these new AI workload so the cost is are pretty ey watering um and large language models in particular sort of notoriously expensive to train and to run so do you see this trend continuing or do you think that they're going to get cheaper what's going to happen um Nori yeah uh I think we are in the like transition phase for AI everywhere and that's why maybe you heard about Intel is trying to bring AI everywhere I think the Azure example that you give is like a good example so I worked at company for 17 years before joining Intel and I have seen the early days of AI like again my team worked heavily on uh getting the collaboration with open AI team and in the early days like I think the applications made it uh more tangible uh but in the early days we had the models on the back end but when you don't have a client to interact with like the cat GPT and that's a brilliant idea because it uses natural language and any language actually to really get a response request from the AI it made it obvious that this is going to impact everyone and with that we are seeing that from AWS last week the reinvent event and also like with our OEM Partners like d Lenovo HP everyone is trying to be ahead of the game ER including uh like the market dynamics are changing in ER both on on premise world where there is still Data Centers of customers they have to reink like hey am I going to be in the AI game and my Hardware is really compatible with the needs of the AI uh because again I will remind that like from data preparation to training to deployment it's like a circle and you need AI compute so and I I almost GNA say like the new compute is the AI compute like everybody needs it so in the early days of Cloud solution providers like we were always thinking about virtual machines so they will just like run your on Virtual machines on your data center on the cloud in a cost effective way I think there's a value to have a hardware Innovation back to your question I think this will be just accelerated in the upcoming months and I think we will also maybe share some of the news but as an example Intel acquired a company called Habana Labs a couple of years back and our hero solution is Intel Gaudi and Gaudi is an AI accelerator so it's a specific purpose built accelerator for AI H in the early days of I think AI people were using gpus because like Graphics processing is really fast and you need that fast for all the training inferencing and like deployment Solutions and all the training but now like you need to focus on AI acceleration so simple answer to your question we will just see a huge wave of innovation in the upcoming months even like not years months on this journey and there will be new hardware from Intel and also from many of the ecosystem players so this is really interesting because um my sort of uh mental model for this was that a lot of people are using gpus at the moment to train these models and know because Intel is sort of maybe perhaps more famously known for your your CPUs you're saying um that so you also have a product called G which is chips specifically for AI training so can you talk me through is that like a is that an alternative to a GPU what what's what's going on there as I said it's an AI accelerator so I think uh the Gaudi is specifically designed for AI in in the mind of uh the early days of deep learning now we are at the Gaudi to the second version and like we already introduced in 2024 uh we will in this year so we will have gaud tree in the market so again it's like not years but it's months almost we introduce me products with the pace of uh the AI and the reason we speakly have an AI accelerator is the models require not only the GPU again there is like specific tasks that GPU can be really uh a good uh let's say resolution for any AI problem but moving forward we we will see more AI specific Hardware uh not only the graphics acceleration but like acceleration in the neural processing acceleration in the the training models acceleration in the inferencing models like everything will be a part of the journey so when you have a dedicated AI accelerator which covers all those Solutions uh you will get like more uh faster results uh and it's going to be way cheaper uh because one of the things uh it's hard to find gpus in the market nowadays the reason why is again like everybody wants to do something on the AI uh and they require all the models uh to to use a GPU but in the long run we believe this is not going to be the case uh before handing over to let like one of the we also have a GPU product by the way so and it has specific purposes so if you look at high performance Computing for example which is like there for a long time but predicting the next weather by seconds like not days but like it's possible today with uh Solutions like Intel GPU Max so it's a special product we recently introduced some of it at the uh super competing event at Danver couple of months back and you can predict what's going to happen in the like space exploration like in the cancer treatment so you can really do high performance competing for AI uh using the gpus because that's where it's really successful uh but also handing over to li like it's also possible to do it even on the PC today uh so we call it the AI PC but Le do you want to add anything on that one so as we mentioned earlier right that um a lot of companies um will have different AI needs and based on these needs is where um they'll be able to choose which Hardware is best for them so whether that's an accelerator or GPU or CPU um it's really going to be based on what their needs are so um that's why we are excited here at Intel to bring AI everywhere and to really have options for our customers uh because again not everyone is going to need um you know that AI power um uh horsepower that you know we sometimes talk about right um it's really going to be based on that and um in in terms of making AI more accessible um we're so excited that we now have our core Ultra processors that's coming to um the AI PC um that we've recently launched and within that customers are going to be able to run those models on their PC and they're going to be able to utilize all of um the goodness that we've put into these new gpus um so they can start uh going Beyond just uh the traditional gpus for their AIS okay this is interesting um can you maybe talk me through like when you would want to do um when you would want to use each of these different Hardware Solutions like when would you want to use an AI accelerator when would you want to use a GPU when would you want to use CPU like what are the different uh use cases for each yeah I'm I can start on that um so um for example um where you want an accelerator such as gouty is where you're doing some of that deep learning and training um of the models up front right so again something where you really need that horsepower um then as you kind of move down the line and um you want to utilize say one of our general purpose gpus um that is something that you can um have one of our gpus um would be sufficient there and then um finally so going back to the AI PC um use case that is a great way um to do again general purpose AI however um this is going to be really useful when you have sensitive data for example so um if you want to utilize uh your PC so that that data does not leave um your local device and doesn't go up to the cloud now we see this a lot of times with um specific um organizations or governments that um for security reasons they don't want things to go up to the cloud um or sometimes if you're offline right and you don't have access to the internet having the having um the your CPUs do a lot of this work is going to U be really helpful as well and just to add on top of that I think we haven't covered the EDI specifically until now but thanks a lot L mentioning that uh Richie I think that's a very important part of the AI Journey because we always think like AI will be on the clouds AI will be on the gpus but it's not true like most of the cases today like it's impacting our life it's actually happening on the edge uh so if you think about all the like s driving cars like you cannot just ask like shall I pass this lane or not I mean you might just cry like there is no tolerance for a latency so you need that at the edge at the like in on the car running AI in stly and like you have to learn all the way so if you look at uh like the industry leaders on S driving cars they make the models like fine-tuned for Edge scenarios and similarly it's impacting many other Industries like manufacturing is another example but La just mentioned security is a big concern you don't want to train your models if you are let's say government or imagine like Ministry of defense like I mean you are really protecting your p uh and you need to be secure so like you need to add that layer on top so where we call it the hybrid AI so we will see more of those scenarios where there's some AI at the edge some AI on the client side but also like whenever needed you go to U the data center or the cloud to get that enablement so it really brings us back to Bringing AI everywhere because there's no one siiz fits so in the AI so there needs to be Solutions at the edge at the client at the data center at the cloud and I think Intel is really positioned well in terms of covering all those four components of air okay so that's actually kind of interesting because I think maybe one of the big push backs in terms of um saying well okay I need all this Hardware in order to run AI or actually a lot of times you can just call an API and have Hardware be someone else's problem so those are really interesting examples the idea that you need if you're doing something in your self-driving car you can't called the cloud or if you're a government agent then maybe you want to build things yourself do you have any more examples of like the trade-off between working with like other people's Cloud models or um and building something there are two let's say models today in the market so if you look at the large language models there is like open AI uh likes where all the training and everything is done by the wendor so you just use the output of that and you build your Solution on top of that uh it's kind of the closed AI solution you don't have any like influence on the parameters and everything that is used by that model or if you go to hugging phase and download a let's say a llama model which is like completely open source then you can see what's this trained for and then you can add your own training components and you can customize it so there's of like pros and cons on both models one is like if you want to be fast in the market and get all the API support from open AI stability AI scale Ai and like many others then you can use that route to be fast in the market but if you need any I mean it boils down to developers right they have to choose what's right for the company and what's right for uh the solution uh it might be really easy to build a solution I'm just making up but like if you're looking at a healthare solution if you have a llm that wants to focus on cancer uh like let's say early detection on specific uh industrial use cases then there might be a let's say three billion parameter large language model uh open source uh which might be really helpful for you so you don't have to train the model and like pay lot of compute power for AI but you really I call it the Nimble AI so there will be solutions that is really nimble like you don't need a GPT 4.5 or like moving forward like trillions of parameters but like you can shrink the size of parameters to your domain and it will be more accurate and again it will be open source so you can feed everything uh within your uh ecosystem so again depending on the sensitive data security those are the concerns I think we will be discussing moving forward that's actually interesting this idea of nimble ey that you don't necessarily need the best all powerful Cutting Edge model you just need something that's good enough for your purposes but maybe is a bit cheaper to run um leany do you want to expand on that like when might you sort of care about having a smaller like less powerful AI rather than having The Cutting Edge yeah absolutely so it's really going to depend on um the organization and the needs and it's also going to um depend on like you know again like what does that company want to do and what sort of outcome they want to um come up with when you think about AI everyone just gets so creative and um the ideas are endless of what they what they can do with this right and so um I looking at what is capable of that company as well something that we're going to need to um give some attention to um the other thing we always like to talk about as well is the the data the quantity of the data um that a company might have as well as the quality of that data um so as we spoke about earlier AI is is all about um the data piece right so we don't have enough of it or if the quality isn't there then we might find that certain companies aren't ready to do that right everyone's AI journey is going to look different and um every company is going to be at a different stage um and that's okay right um but something that we um definitely want to think about as we're looking for the right solution okay and adding on this idea of solutions um we've talked a bit about Hardware I'd love to talk about the software as well so uh can you me about like what of the most common software Stacks people use for working with AI the stack again is going to really vary um based on um what their needs are um but for AI um overall commonly what you're going to have within that stack is kind of starting off with whatever the programming language is um and then um choosing which libraries and Frameworks are going to be best for them when it comes to the machine learning portion um as spoke about earlier um and then finally you know you want to process that data and then kind of start to choose which visual visualization tools um might be right for you um it might be um different U analysis tools um and and and more right um but again it's really going to be based on like what their needs are that stat can look so different um it's just really going to be based on what the solution is excellent yeah uh so all those components seem to make sense um norri can you fill in any of the details on like what people might go for in terms of like sort of programming languages Frameworks database tools all that sort of stuff yeah I think there's like a couple of Frameworks already in the market which are again like I will call tensor flow is one of them pytorch is a big framework and these are really like uh big foundations like as a example Intel is now a part of the pytor foundation because we completely support open source and we want the developers to get the best out of uh the solutions uh in that and specifically maybe unpacking where Intel sits on the software because when I joined Intel it was like is Intel in the software business like I didn't know that so and I really did my let's say deep dive into the topic and I was really surprised like I mean especially on AI I want to highlight two solutions that Intel provides which is really important for all the developers who are listening this podcast one is open and it's a runtime choice for the developers and really helps the um the developers to deploy the AI Solutions everywhere uh it's heavily initiated as an edge deployment solution it's expanded to client and of course with our data center capabilities open window provides really a wide variety of AI deployment Solutions as like uh core foundation and the big thing is the one API uh one API is again open source standard based software that simplifies the programming across diverse architectures so the developers can have the flexibility uh you might heard about Cuda for example so uh Nvidia at 2008 come up with the idea of Cuda and brought the the developers but that's a close Source platform so like I mean if you're a Cuda developer you have to develop on NV media uh chips set so what we believe is one API it's open source so you should be able to develop wherever you want you can deploy uh whatever you want so that's giving the flexibility and power uh to the developers and Architects uh and maybe the final thing uh which I was a part of the launch process actually Intel developer Cloud so that's a big step forward so if you think about the Innovation Journey that we have at Intel it is uh it's really exciting like almost every six months you have a new product coming into Market uh just mapping the pace of the AI Innovation and for example Intel Gaudi 2 in the near future Intel Gaudi 3 so whenever that products launch it will be first available on Intel developer cloud and it's a cloud uh infrastructure provided by Intel to developers you people can just go to Intel developer Cloud create an account and immediately try those things without really need to purchase a million dollar of uh Hardware it's like really easy to get access you can use the platform develop your Solutions and this applies like the wide variety of solutions including our xon Hardware GPU Max GPU Flex so Intel has like wi Solutions on the the AI and I'm just connecting the do but you can use open one API and open Mino Solutions on top of Intel developer Cloud which makes it easy for developers to try before they commit to the platform and they want to like run uh so it's just easy Port because they have done all the testing development on Intel developer Cloud uh we are not really planning to uh steal any let's say uh customers from our vendors we are trying to incubate them in early stage and when they are ready like G is available on AWS as an example they can go to AWS uh they can go to Zeon for fifth generation uh for gcp so again like I mean those are all the customers that are really uh uh have a journey to end their AI continum so it sounds like there's a lot of software in a lot of different areas so that's changing so everything from like the lowlevel sort of data center infrastructure stuff up to sort of tools for people who are developing with AI um I'm going have to make you both pick your favorite so go on uh what do you think is going to have the biggest impact um La Tiffany do you want to go first uh which bit of software do you think is going to have the most impact in the next year or so um yeah I think um it's really going to be open V now um for me you know as we mentioned this is really going to be the client runtime of choice and it's just going to open up what's possible um for developers in terms of um where they're coding and where they're deploying I think you're really going to see uh just an explosion of different use cases and um inst solutions that come from this okay so easy deployment that sounds like a very useful uh thing to have uh Nori do you have a favorite bit of software you think it's gonna have high impact I will also plus one uh let F on this one open know especially again we recently launched the AI PC which is again bringing AI to computers and this is a drastic change for the industry like AI was like somewhere in the cloud like I mean it's like the Greek mythical Gods like I mean you need to have like gpus now you have a PC and you can run as an example stable diffusion on your PC like you don't need any like expensive Hardware or like commitment for a cloud provider for a long term so this is going to change and in order to deploy that solution open Mino is the only solution in the market right now which is really exciting for 2024 I think this is gonna be a big innovation for Intel and also like great help to Developers for deployment okay after look out for that then um I'd like to take a little Sid step now into uh talking about privacy since this is um a big deal for a lot of Enterprises so um can you talk me through how privacy requirements affect the use of AI what the organizations need to consider um yeah Nori do you want to go first this time yeah I think it starts with really classifying the data first like uh uh privacy and like again security is like a foundation for a company data uh process uh and uh I always when I met with customers I asked like who's your data Chief data officer like if you don't have one there's a problem then like you don't know where your data is like what's the classification that has been done and in AI the problem is um llms will never forget like if you train accidentally an LL which is public facing it will be a part of the training data for the upcoming training model and it will leave there forever and we have seen some customers unfortunately again like uh I will not name the companies but they accidentally train all the information on their private data and now you can just like prompted like hey give me all the uh C in backwards of this company uh writing a code like them and then boom boom boom so it it gives you all the back so like it's really important to uh put that guard rails in front of your data making the right classification and who access to the data uh like preparation because that's a very important step so today at like at marketing at least like the pii it's really important for us to keep like customers name phone and everything uh private like we don't really uh do anything with that data and really few people can access and if it's an opin but what we see is like the companies are throwing all their internal data to the training of a large language model or fine-tuning one of the the things and it's a little bit risky so you have to make sure that this is going to be understood by the the machine learning and it will be a part of the deep deep learning algorithms which can be like by simple prompting it might be unveiling your secrets uh for trade H or any other company confidential information so that's why I think secure AI is an important step that before jumping into this AI Journey you have to know what's my security Journey what's my privacy what's my uh really like models and approach to this uh like AI uh implementation and then move on so if you drive a car today or you produce a car you get to like all this uh certifications all this assessments there are government bodies that are approving those but if you look at AI deployment today companies are deploying like back and forth everywhere so and I think getting that ethical usage of AI data sources getting the privacy and making sure AI is secure uh is an important thing I think it will impact the next couple of years for all of us yeah definitely so I like that analogy that if you're sort of building anything else that you know you put out a phone or car or something that there's a ton of regulatory hurdles you have to go through but there's not that much for AI um Lely you look like you had something you wanted to add there so do you have any more advice on like how to think about when do you need to get about um privacy in the context of AI yeah absolutely so Intel has a long history of always following gdpr rate so um it's going to be the same with AI for us and um you're going to see that integrated within our products and um within um everything that we do when it comes to AI so um I I'm very confident in terms of where Intel's going to be going with the Privacy requirements for AI um and and you've even probably seen this in the news um Intel um collaborated with bware for a new Venture called private AI um where um we worked with them to build um a a program that gives um privacy from Edge to cloud and this is just the beginning right um we're going to see um other initiatives like this happen um we're uh we're working with um all different types of companies to ensure that that security um comes along with the per performance okay um and so you talked about from Edge to Cloud there is there going to be a difference then in terms of the Privacy requirements if you're not doing something in the cloud and you want your your competition to happen like where your data is um yeah so um we think it's going to be both right so um you can have that security that you need and want in the cloud um but there's also going to be um the there's also going to be additional um Securities that you can have by keeping some of your data local um so this is a um a great example of what we see of being hybrid AI um so we talked about this a little bit earlier right where you see there is a um a seamless transition um from your data that you have on Prem um and then also utilizing uh the cloud when it needs to when you need to get additional data that might not be private right um so this is is going to really enable organizations to use their AIP PCS and um or their Edge um infrastructure for some of that sensitive data and then pull from the the cloud um additional data um that might not be um s sensitive I'd like to talk a little bit about processes to touch on it briefly and it seems like um most of the processes you want um for AI development are going to be similar to standard software development are there anything is that a different here like is there what's peculiar to developing stuff for AI I think the key differentiation is the data piece so like we have seen the data science jobs are like incrementally getting popular in the market because AI is built on top of the data without the input there's no output and you need to write data scientist skills to develop that machine learning model the data science correlation uh we we have seen this machine learning machine uh learning engineering roles and again I'm a computer science graduate but like I believe there will be artificial intelligence engineers in the near future so it will be a really a major study area because this is really becoming like on top of that data science layer you develop everything uh like from a like a modeling base I just covered the tensor flow pytorch those foundations but on top of that also there is like uh the development needs to Think Through uh the synthetic data creation like where uh you need to feed the engine all the time with the latest and potential upcoming learning H and we covered the ethics and compliance but that's also like an important piece of the development Journey you cannot just I mean historically I'm a developer like I develop uh websites with asp.net at a point of time and it was easy you just use NET Framework and then like you use a like a client which is a web browser it's easy everything is on the server side you don't worry like AI is multifaceted so like I mean you have the models in a private environment you have to get the right uh let's say training on top of that uh like you have to deploy and we discussed the edge like you need to do Edge inferencing in some of the cases and like it's a life cycle of Journey if you miss the ethics and compliance piece like you might uh it's not just a server s side thing anymore the data is in every location so uh those are I think the additional uh level of complexity for developers to think through and final thoughts is like writing a programming language was like a big task so but now it's not like I mean if you're building some blocks you can just ask any GPT like hey please write me a python code to enable this so you are more uh enabled from like the time consumption perspective so you have to spend more time as a developer on the the right architecture the right model work more collaboratively with the data scientist uh to brief down what you want and to work with the compliance team on like what's going to be the outcome so you don't really Buri into the line of codes and like test and deploy the solutions so I think AI is gonna give a good uh breeding area to interact with the rest of the organization in a better way for developers absolutely and so you mentioned this the idea of this AI engineer being a sort of fairly new career and this is like a ton of things they have to do can you break it down like what sort of skills you need to get to being that AI engineer yeah let me go and then let's fin will maybe drive more depth conversation but it's all about analytical thinking so I think uh with AI with artificial intelligence we are trying to mitigate the intelligence that we have in the humans right so we want to repeat some of the the process underlying and then really surpass some of the things that with our human intelligence what we achieved so as an AI engineer I think um this is a first of all a really uh interesting area to work on because like we don't know what's going to be the next big thing right so and maybe we will cover that but like we all heading to this artificial general intelligence and in my opinion in the next uh five to 10 years we will merge with AGI so like we will be a part of uh like the artificial general intelligence and we will all as Humanity we will benefit from that so if you look at from that perspective everything we interact today from an industry perspective will definitely and drastically change so we need engineers and the the the term Engineers really looking at the phases of development and maybe parsing them out and make them more efficient so in the industrial engineering age we did it perfect at manufacturing right so like the Henry Ford's all modality and like production in the mass it it changed what we get today from a like a mass production and industrialization perspective similarly I think the AI Engineers will change the future of AI in terms of how the we interpret the data how do we collect the data in a secure private and like all the uh consistent ways and generate the right level of output in a Val sequenced way which will really like accelerate the Humanity's growth uh in the journey okay and uh Tiffany uh like so if uh people who are listening who want a career as an AI engineer or for managers who are wanting to hire an AI engineer like what do you think are the the top skills you need here yeah um so I think that um if someone is wanting to really um develop their skills in AI I think um one is going to be um you know having that Proficiency in machine learning and Ai and all these other disciplines that are under this AI umbrella that we've talked about today um that's number one and secondly is really going to be um having a good understanding of how all of these different models can kind of like work together right um as ner mentioned what we're going to see with these engineers in the future is they're going to be putting together um all these different aspects into one solution so um it's not going to be a one-size fits-all so really having that flexibility and honestly uh creativity um to come up with these Solutions um based on whatever challenges that they they are faced with um I think that's really going to be essential um and then and um you know also I would say that just uh keeping a um a an attitude of uh continuous learning um because everything in this AI space is changing so rapidly um from you know the new developments happening to uh the new uh software and Hardware coming out you have to just keep learning to stay up to date on this um so I think if someone wants to get in the AI field uh those are some of the the key areas uh they should focus on absolutely I do like that idea of continuous learning because you're right it just changing so fast I keep teaching things and then a few months later it's it's all out date again so I suppose related to that do you think there are any skills that going to be made obsolete by AI so um I don't think there's going to be um skills that are made obsolete I really think what's going to happen is we're going to see this transformation happen um with those skills so AI is really just going to unlock um additional time and resources for us so it's going to take away some of those automated routine and repetitive tasks that you were doing and give you more time to uh build additional skills or um you know use your um yeah use your skill set to uh learn new things um and I think that's really what we're going to see is more of a transformation rather than one skill or the other becoming obsolete absolutely um so I want to talk a little bit about like what you're most excited about for 2024 but before that I'd like to take a little step back nor I know you were involved uh when chat GPT was first coming out and you were involved in in the marketing of that uh when you your from your time at Microsoft can you just talk me through a little bit about what happened there and what you were aiming to do in terms of uh bringing AI to the masses then absolutely and I think it's a good story to share with everyone because like this is a uh like this was a journey where uh like Microsoft partner with open Ai and then initially in some of the components I will give the specific GitHub co-pilot as an example uh so when we run the first phase with GitHub co-pilot we have seen like tremendous results on like hey this thing can write code like sometimes better than many coders and uh we observed what's going on like for a while and then I think like before the chat GPT it was easy to H generate some of the uh the coding dolly was another example so like when we interacted with Del's first version wow I mean this is really creating some images uh which are really creative in a way that uh that surpasses like some of the uh human Ingenuity and really goes Beyond some of the the thinking uh but almost in 18 months uh the number of code lines written on GitHub uh by AI became 51% so the human generated content was 49 so in 18 months so this is like a huge change all the history of software engineering programming and everything but and again like I see it's going to be maybe 90 to 10 moving forward uh and back to your question to lean I'm not really afraid of like hey the software Engineers will lose their job so I think on some of the repetitive tasks well it's a also a company time and efficiency problem because you ask those developers to write a code which can be written by AI because they try to solve the same problem over again and again again like the the basic foundation of the the models are the same so and I think this will affect in many Industries so uh when uh I was at Microsoft so we used open AI for our internal marketing team uh there's a term called mpf the messaging and positioning framework we said like imagine you are launching a new product so what will be the mpf boom boom boom boom like it's and you can just name you can like upload some documents to train it even further and it was like so good like I mean we said like hey we will save most of the time like writing documents or creating some messaging Frameworks versus really marketing them so and it accelerated our journey and again November 30th 2022 so when we launched open AI uh on check GPT to the world running on Azure that was a aha moment for everyone uh to interact with that using their own language so you can start English turn to Spanish whatever like because technically the large language model understands the way and it responds the way and getting that uh I think the visual components like the the uh video images everything is happening so fast if you look at uh the last one year uh it really drastically made AI a number one topic for every leader in the world so if they want to be in the game in the next five to 10 years they have to spend some time on AI so the learning that I had was like it was taking for my team long time to explain like AI is going to change AI is going to change the world but when people see it in action boom like it was an instant pass yeah we have to be in it otherwise this will going to be disrupt our industry so you have to do it before your competitor does uh so I think that that's a good example and I'm trying to apply that uh at Intel uh so like we are trying to be head of the game like we covered until now lot of products but all those products I will go back to the first question that you asked what are we trying to solve we are trying to help the customers on the outcomes so either it's accelerating Innovation it's securing the the the data we covered it a lot and also the TCO like if you look at some of the GPU prices in the last one year like everybody added margins into the game and like a customer is not benefiting from from that they're not saving dollars so as Intel we want to provide like and bring AI everywhere so that's that's a big change so again with the core Ultra processors we made it to the PC we are making it with Zeon to Data Centers but also I will reh highlight Intel gudi is a big innovation it's really bringing AI for large language models you asked Rich question like where do we position gudi it's whenever there's a large language model work that's the area for an AI accelerator and then Intel gudi will definitely help it and we already seen some of the great examples of uh that solution in the market so if you have kids for example Roblox is running Intel GOI on the back end if you are watching Netflix it's running in t on the back end so all these recommendation systems and everything that is in our lives are already filled by Intel technology today um yeah uh it's amazing how much uh is sort of pervasive throughout everyone's Liv and I I do like the point you mentioned when you're kind of going to promote CH gbt was like well you know until people saw it it wasn't obvious that this is going to change the world uh but once you've seen it like oh yeah yeah I get it now brilliant um okay so uh before we wrap up can you tell me what you're most excited about for 2024 and AI uh the D me yeah so for me I am most excited about seeing this uh democratization of AI um really come to fruition and we're really seeing that through the AI PC I mean this is truly putting AI into the hands of anyone who wants to to have it right so I'm so excited about that and uh this just underscores Intel's mission of bringing AI everywhere as well um you know it's now it's accessible not only just if you know not only just the big companies but Everyday People who want to um start using these models and so just so excited about um the opportunities that's going to create as well as seeing what people make all right nice I do like the the fact it's coming to everyone yeah the the democratization of a brilliant uh norri what are you most excited about I think I gave some Clues uh before but I'm excited about AGI and I think artificial general intelligence is not that far and I'm looking forward to like think through as Intel how do we collaborate into this like massive change that Humanity will go through uh because I believe like we will merge with AGI uh for example if you leave your phone at home today you are disadvantaged right so you cannot get a taxi ride you cannot get food so you can still live but you are like not advantaged similarly I think in the next five to six years when the AGI comes like people will use AGI and merge with AGI so you will really utilize the power of AI in your daily life embedded uh and like I'm excited about like that discussion because this requires all the prev that we discuss like how do we make it secure how do we uh don't create like digital divide between different countries who own the AI how do we bring AI everywhere to from devices to software how do we make it really easy to deploy easy to train uh so again I think I'm so excited about Intel's portfolio of solutions to enable AGI in the long term excellent yeah um look definitely looking forward to um yeah seeing some uh super powerful well maybe maybe artificial general intelligence that that's G to be an exciting thing all right uh on that note uh I think we'll wrap up uh so thank you both for joining me thank you Tiffany thank you Nuri it's been great to have you on the show thanks a lot for having us\n"