Hello there. Welcome to day two of the demo days event this July. And I’m pleased to report that it’s finally stopped raining in London, so thank you very much, whoever intervened on our behalf, much appreciated. And wherever you are joining from in the world, I don’t know if we’ve got new people from yesterday. Looking down the list, we do seem to have a few new people, so please do just tell us where you’re joining from in the chat to say hello. I’d like to kick things off with a little friendly interaction. I’m joined today by Joe Reaney, who is commercial product lead for MarketLogic software. And he’s gonna be taking us through the DeepSights solution, which is a generative AI platform for knowledge management, for consumer insights access, part of that kind of conversational interface for accessing all of those rich insights, the data, the research reports, all of the third-party connected tools that you have as a large in-house insights and data function. So Joe will be taking us through how that works and the way that you interact with it. In this session, if you wanna listen on mute, or if English isn’t your first language and you wanna follow along with subtitles, you can enable the closed captions down the bottom there. Apparently, that will work okay as we’re talking; it won’t work in all the sessions. And I woke up this morning to emails from ten different people asking about the replays from yesterday’s event. So my default response is that that’s user error, but, obviously, we haven’t communicated it properly. So if you wanna head back to watch any of those replays, you can click the schedule button anytime, and then there should be a watch replay tab against each of the sessions that have finished. So you’ll be able to use your link to come back in here until the end of the weekend, and then we’ll have them streaming from the Insights platform site. 
So they’ll all be available there. So great. Cincinnati, France, New Jersey, Sao Paulo, Berlin, all over the map. And I’m in London, by the way, if you weren’t aware. And I’m gonna shut up now and hand over to Joe, who’s gonna take us through DeepSights. Great. Thanks, Mike. So just to reiterate, Joe Reaney, senior commercial product manager at MarketLogic. Excited to be here today to talk about MarketLogic’s DeepSights, which is our Gen AI assistant for consumer insights and knowledge management. Just quickly, what I wanna go into today: a little introduction about MarketLogic, who we are, what we do, and then really get into kinda guiding you through a demo of the platform, really focusing on our latest released supercharged answers and report generation. But then a little change of pace: I’m gonna take us out of the standard in-platform demo into a couple external applications we have. So the MS Teams integration application, and then focus a bit on our API integration, specifically showing you something we’ve got with ChatGPT for enterprise. Then, of course, we’ll come back for some Q&A. So who is MarketLogic? For those of you who don’t know us, we are a SaaS provider of an AI platform for insights management. We help some of the most innovative companies in the world run insights-driven operations. We’ve been in the industry for about fifteen years, with a hundred-plus global customers that we’re very proud of. You can take a look at the right-hand side there at some of our great customers, some highlights there. I personally have been working with Unilever for about ten years, and on the flip side, maybe an IKEA, a very new customer of ours that we’re super excited to be working with. We have a hundred-plus integration partners with various data sources and so on that we bring into our platform. 
Just quickly, a couple of awards we’ve received recently, specifically the twenty twenty four AI Breakthrough Award for an innovative Gen AI offering, which is, of course, our DeepSights. So quickly on DeepSights. DeepSights reads across your repository of knowledge, whether that’s your market research trackers, ad hoc research, reports, but also your transcripts, news sources, integrated third-party content, and so on. So anywhere that you’ve got market insights, DeepSights is able to analyze and read those and provide you with business answers, which I will show in a second. Of course, these are trusted answers grounded in your knowledge repository, available twenty four seven. So I’m gonna show you the in-platform web browser. But as I said, we have a couple different ways to access DeepSights’ output: an email integration, and we connect to all of the big knowledge and business communication tools like your MS Teams, Google Chat, Slack. And then our API offering is a real differentiator for us. Via the API, we can make all of that Gen AI insights output available throughout your organization, specifically in big tools like your MS Copilot and enterprise ChatGPT, which, as I said, I will show. So the first thing I wanna do is come into one of our demo tenants and show you our supercharged answers look and feel. So here I am in one of our demo platforms. Of course, this would be populated with your market insights and integrated sources and so on; this is just demo content for the purposes of today. I’m gonna go ahead and ask DeepSights here a question like, what do we know about millennial spending? And that kicks DeepSights’ process off. So what DeepSights has done is ingest all of your documents into the system, embedding them with a large language model, and it’s now scanning them for relevant content and evidence to support that business question. 
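As an aside, the ingest-embed-scan step Joe describes is the standard retrieval pattern behind this kind of assistant. Here is a minimal, purely illustrative sketch using a toy bag-of-words embedding in place of a real LLM embedding model; the corpus, threshold, and scoring function are assumptions for illustration, not MarketLogic’s actual implementation.

```python
import math
import re
from collections import Counter

def embed(text):
    """Toy bag-of-words 'embedding'; a real system would call an LLM
    embedding model here. Purely illustrative."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def retrieve(question, chunks, threshold=0.1):
    """Score every ingested chunk against the question and keep only
    those above an evidence threshold, best matches first."""
    q = embed(question)
    scored = sorted(((cosine(q, embed(c)), c) for c in chunks), reverse=True)
    return [c for score, c in scored if score >= threshold]

# Hypothetical mini-repository of document chunks.
chunks = [
    "Millennial spending on experiences rose 12% year over year.",
    "Gen Z favors short-form video content.",
    "Millennials report high price sensitivity in grocery spending.",
]
hits = retrieve("What do we know about millennial spending?", chunks)
```

With this toy corpus, the two millennial-related chunks pass the threshold in score order while the Gen Z chunk is filtered out; the retained chunks would then be handed to the LLM to synthesize a cited answer.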
You can see it gives you a little information about what it’s doing as it goes through the process, and it starts to really populate out that supercharged answer as we go. So it comes back with, first and foremost, a synthesized answer across any of the relevant sources in your repository, and it cites these directly in-text, as you can see. So if I hover over one of these sources, I can see that that sentence there was summarized from a given market research document, and it points me to the relevant pages and so on. And I’ll show you in a bit how I can click back into those sources to really see the original page in the source. We also, as I’ve stated, look at all of your integrated third-party sources. So this would be maybe an RSS feed, or an integrated syndicated report, which is also being pulled in here, which is quite novel for MarketLogic’s offering. I’ll come back to the sources in a second. Just scrolling down, we also give you further reading. So this shows you content that maybe didn’t make it into that answer because it hadn’t reached the evidence threshold, or an integrated source that the legal framework doesn’t exist around to wrap into a generative answer. So we also surface some of that content here for users to guide their research going forward. With all of the sources that we show on the right-hand side, you can expand these out. You get a little Gen AI-created summary attached to the source so you have a bit better understanding of the context behind what that source is all about. And as I said, you can click back to the sources, which I’ll show you in a second. Something really key that I wanna highlight, which is new with our supercharged answer look and feel, is the AI caveats that we identify. So if you see here, it’s letting me know that the AI has identified a couple of potential source conflicts. Let’s expand that. 
And you can see that, via the LLM, it’s letting me know that there are some discrepancies in the sources DeepSights wants me to consider. Right? So it’s saying that some of the information around the financial data in the answer might not represent global trends. It’s pointing to some other areas where I might wanna look at contradictions and so on, and then showing me the sources where those conflicting pieces of data appear. So this is really taking us beyond just generating an answer for us. It’s now taking that kind of first step of really being an insights colleague, giving me an idea of where maybe the answer might go off or where I might wanna take a little extra look, maybe do some further research or compare across what those sources are telling me. We also give the ability to generate a deep dive report, which I’ll show in a second. Before we do that, I just wanna pop over and show you we have a history area here. So you can see all of the past questions or reports that I’ve generated. And the reason I’m gonna click on this one is because I wanna drive home a really key point and a differentiator for MarketLogic, which is that we not only scan all of the text and so on, which is kind of the common capability of the LLMs today, but we are also now able, with cutting-edge models, to look at all of the infographics, the tables and graphs and so on that show up in the document repository and pull insights directly from those. So if we take a look at, for instance, this sentence here, it’s about price consciousness for this subset, and there are some quite specific data points around some of the behavior there. So you’d think perhaps this comes from text within the given source document. Let’s go to that source document to take a look. So I’m now going back to the repository to view the sort of raw evidence. And as you take a look, you can see that those numbers that I showed you are not coming up in the text here. 
Rather, they’re appearing right in the graphic itself. And DeepSights is able to assess that, understand what that graphic is saying, pull out the given age cohort that I asked about, pull out the given percentage associated with it, and then, if you come back to the answer, really associate that correctly in the report. So a huge value gain in terms of tapping into all of those visual insights, which we know from working with our customers are where a lot of the key evidence lives in your repository. And then it’s also about making that content really show up for DeepSights, because from experience we know that when trying to search across a knowledge repository, if the system is not understanding the infographic content, it’s not gonna find that content in the first place. So really a huge value add for us there. Let’s now go ahead and take a look at one of those reports I mentioned. So when I click this, it’s gonna now really do a deep dive. The LLM’s gonna kick back in and start to look at other evidence, follow-up questions, and so on to generate me a deep dive report. That takes a little bit of time because the LLM really does some work, so I’m just gonna go to my history, to a report I’ve saved. And what you can see is it’s simply gone deeper now. It’s asked follow-up questions, and it’s wrapped those all into a more one-pager-style report with the sourcing and so on preserved as always. Additionally, in the DeepSights reports, we provide a secondary tab where we show all the sources cited, but we also do a full summary of how each source is relevant to the given question asked. Right? So that’s not a prespecified summary; that summary is really specific to the question that I’ve asked. So it really forms a complete kind of takeaway report around the question. You can, of course, download or export that. And I’ve got that here on my screen. 
So it’s now available in a Word document for me, and the entire report, with the full Gen AI answer, all of the sources, and the summary of how the sources are relevant, is there on my desktop, and I’m able to work with it further, send it off to a colleague, and so on. Great. So coming back to the presentation, just quickly to reiterate, because I really covered all of this: DeepSights is looking at all of your primary, secondary, integrated sources, and so on. We provide further reading to deep dive into the other knowledge. The watch outs give specific looks at limitations and conflicts in the data, and then we’re really able to look into visuals and extract meaningful findings from them in addition to just text. A couple of key things on our road map that are coming out this summer that I really wanted to highlight, because they kind of move us forward in this whole space, are all around source control. So there are two key features that we’re releasing that allow users to really specify the particular sources DeepSights would look at. The first is the ability to either deactivate, or tick off, and then activate, or tick on, sources from that supercharged answer view and then regenerate the answer. So I might see that, actually, there are two reports in there that I’m clear wouldn’t actually feed the answer correctly, but I’d really like to see that Mintel report in. So I can simply toggle those out and regenerate the answer. And then we’re also gonna be releasing the capability to upload a single document to DeepSights, so that users can come with a document that they are already aware of, they don’t need to go look for it, upload it, and then really query it with all of DeepSights’ Gen AI power, obtaining answers exclusively from that document. And then a second really key piece that we’re releasing is our control center. 
So this allows customers to provide guidance to DeepSights, steering the AI and tailoring it to the way that your business works. And it’s laid out across a number of sections, as you can see here. You can specify, in plain English, the kinda general background for the business, internal target audiences for DeepSights, specific contextual guidance about how DeepSights should answer certain questions, certain sources that it might go to for particular question types. So if you’ve got a key data source that’s relevant to financial-style questions, all of that can be specified, and then that feeds into DeepSights’ sort of way of working when it receives questions and really steers the output to tailor it to your business. Great. So having now explained a little bit about DeepSights’ main platform and how it really works, I wanted to show you our MS Teams application. So I’m gonna pull up our Teams assistant. I’ve been working in this today. This lives right in your Teams environment. I’m currently chatting directly with the assistant, but you can also call this up in a group or a one-on-one chat in Teams by adding DeepSights. So let’s go ahead. I’m gonna send that question off to DeepSights. The assistant gets back to me, lets me know that it’s now undertaking that process of looking for relevant evidence and formulating an answer. Let’s just give it a second to do its work here, and then it’s gonna come back to us in the Teams environment with the answer. Great. So it’s come back. As you can see, similar to the look and feel of the in-platform answer, it’s got the answer, it’s linking directly to the sources, and I could click back by hitting any of these, and I would end up back in the platform to view the underlying knowledge that’s fed into that answer. Great. And now I wanted to talk about our API integration. So the DeepSights API lets DeepSights reach larger audiences within our organizations. 
Working with a lot of our big enterprise customers, we’re aware that a lot of the target users who could access DeepSights’ output are not gonna be coming into the platform on a daily basis. There may be other tools as well that could stand to be fed with insights content, and that’s where the API comes in. So the API allows tech teams to build out applications around the DeepSights platform, accessing those answers and reports that I showed you, also directly accessing the underlying search that provides content to DeepSights, and even doing auxiliary tasks such as uploading and downloading content directly from DeepSights. So this is really a step towards enabling IT teams and Gen AI-focused teams in the large organization to also leverage DeepSights’ output. And just some of the use cases that we’re seeing already live with customers are feeding AI-driven concept and idea generation tools that you may have at your organization with insights output, so that the output of those tools really can be grounded in your repository of insights knowledge. We’re also seeing customers integrating DeepSights into internal search platforms, so that from some other internal, maybe intranet-based search platform, they can hit DeepSights and get back either the DeepSights answers or reports, or maybe just the document subset and so on. As I said, the API is quite free-flowing in that sense, so really flexible to your organization’s needs. And then more and more customers are starting to use the API to plug DeepSights’ output directly into some of the bigger enterprise AI systems, so the MS Copilots, the ChatGPTs for enterprise. And I want to show you a live demo integration that we have. So I’m in our MarketLogic team’s ChatGPT for enterprise environment here. 
And we’ve configured this GPT to understand when a market research-focused question is coming in and then delegate that out to DeepSights via the API to get back the trusted output from DeepSights. So let me just type a question here: what do we know about Gen Z? So first of all, ChatGPT is assessing that question. Okay, yes, that’s something that belongs to the knowledge repository that underpins DeepSights; I confirm that it should go out of the ChatGPT environment here. Now it pings DeepSights, and that whole process kicks off that I showed you in platform. We’re scanning the repository for information, we’re finding the most relevant documents, external sources, etcetera, and generating an answer that’s gonna come back to me in a second in the ChatGPT environment. And here we see that really where, potentially, you know, business users or maybe even your insights users are working, that DeepSights answer, underpinned by your knowledge repository, is coming back in the ChatGPT space. Let’s just stop this. And as this content now sits in ChatGPT, I can continue to work with it as I might. So, you know, something like summarize the points, and off it will go, leveraging ChatGPT’s capabilities there. I could ask it to do a SWOT analysis, ask it to draft an email for me, and so on. So really marrying up the kind of freedom of the ChatGPT or Copilot space with the evidence-backed DeepSights output that we bring to the table. Yeah, coming out of that, that kinda sums up what I really wanted to show, and happy to come back to you, Mike, for any questions that may have come in. Thanks, Joe. That really brought it to life for me, I think, you know, the integrations, particularly with Teams and the API, that puts it into other places. Visualizing it makes it a lot clearer now as to why that’s a benefit and how it helps put it into other teams’ workflows, rather than it being a specialist tool. So thanks for that. 
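The delegation flow Joe demonstrated, where the GPT decides a question is market-research territory and hands it to an external answers API, can be sketched as a simple router. Everything here is a hypothetical stand-in: the keyword heuristic replaces the model’s own routing decision, and the stubbed call replaces a real HTTP request to a (hypothetical) answers endpoint; in production this would typically be done via the model’s function-calling mechanism.

```python
# Terms that mark a question as market-research territory. A crude
# substring heuristic standing in for the GPT's routing decision.
RESEARCH_TERMS = {"gen z", "millennial", "consumer", "market", "brand", "spending"}

def is_research_question(question: str) -> bool:
    """Decide whether to delegate this question out of the chat environment."""
    q = question.lower()
    return any(term in q for term in RESEARCH_TERMS)

def ask_deepsights_stub(question: str) -> str:
    """Stub for a POST to a hypothetical answers endpoint; a real
    integration would send the question and poll for the grounded answer."""
    return f"[grounded answer with sources for: {question}]"

def route(question: str, fallback_llm) -> str:
    """Delegate research questions to the grounded service; let the
    general-purpose LLM handle everything else (drafting, SWOTs, etc.)."""
    if is_research_question(question):
        return ask_deepsights_stub(question)
    return fallback_llm(question)

reply = route("What do we know about Gen Z?", fallback_llm=lambda q: "[general LLM answer]")
```

A research question is routed to the grounded stub, while something like an email-drafting request falls through to the general model, mirroring the split Joe describes between evidence-backed answers and open-ended ChatGPT tasks.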
Do please ask your questions here. We’ve got one that’s come in from Chloe; I’ll come back to that in a minute. If you do have questions for Joe about the sort of latest innovations in insights, then please ask away. I mean, I’ve got a couple. And I think one big one, which is probably something you get asked a lot, is how do you sort of differentiate, or what are the relative advantages of Copilot or ChatGPT for enterprise versus a specialized solution like DeepSights? You know, what are the key differences there? Yeah, great question. So, I mean, that’s the type of conversation we’re coming up against a lot, either with new customers or existing customers. I would say both have their strengths and weaknesses. So many of our customers are using either a Copilot or ChatGPT for the day to day. As I tried to kinda show there, those tools are really kind of unbridled LLMs. So they’re free to go off and create text, as everyone who’s using a ChatGPT is aware, but it’s not linked back to evidence. So it becomes very challenging to ask ChatGPT a question and get a response that you can then link to any kind of underlying evidence, let alone your sort of proprietary set of insights, essentially. That’s where DeepSights lets you search for particular content, really evaluate whether the content is of the correct standard to formulate an answer, and then provide that answer. And the sort of API integration I showed there marries those two up quite nicely, because you then have the grounded insights coming from DeepSights and the freedom to go off and do the more creative or open-ended tasks that the ChatGPTs, Copilots, and so on are good at. Yeah. Okay. That makes sense. 
So if you’ve got the sort of context or some of the business rules or process built into that, then, on your point about uploading a single document to be able to get answers and to summarize it, presumably that’s more focused on the context and the insights understanding and the kind of semantic framework than it would be if you just uploaded it into Claude or ChatGPT. Is that correct as well? Yeah, right. So even with that single document upload that I talked about, we still kick in our entire retrieval-augmented generation process, as it’s called, that underpins DeepSights. So first of all, we split that document out into little chunks of text, we embed them with the large language model, and then we’re able, even within the document, to find the most relevant pieces of information, ensure that we can generate answers from them, and so on, and generate an answer. That would be a little closer to what ChatGPT would probably do if you would upload a document to it and start querying it. But all of that hallucination control, all that parsing to the page level and so on, you wouldn’t find there, which is where we use that process. Yeah. Okay. Interesting. You talked about a couple of things: if the evidence doesn’t quite meet the threshold for going into a summarized answer, but it could still be interesting. And you also referred to maybe not having the legal framework to put something into generative AI. Can you just talk through those a little bit more? Sure. So, yeah, I showed those both in that sort of recommended sources component of DeepSights’ answer. So even initially, before we went to the supercharged answer view, we were treating questions that couldn’t come back with enough evidence in a way that we would say, look. 
We can’t answer that question convincingly, but here are some documents within your repository that may answer the question. They might give you a hint; go look at them. So rather than just killing the sort of flow for a user, we’re pointing them somewhere that they can go off and continue the research flow. In that supercharged answer look now, we’re doing a component of that. So we’re showing content that didn’t make the threshold, the stringent standard we have to formulate an answer, but nonetheless may be relevant for the user to go look at and use their judgment. And if they would, for instance, find it useful, they could, as I showed there, tick it on to integrate it into the answer. The other thing, around the legal framework: we’re constantly working with third-party providers, syndicated source providers and so on, to make their integrated content available to DeepSights. But there are, for some of these providers, legal challenges around wanting to really expose all of their content to an LLM and let us work on it, so to speak. And in the case where we don’t have a framework to fully digest that content with the Gen AI, we rather take a snippet of it, understand that it’s relevant, and show it to the end user there and say, look, we can’t deal with this content, but go click there and you’ll find probably an answer that would be relevant. Yeah. Okay. That makes sense. It wasn’t something that had occurred to me, actually, that with all those different data sources, you’re gonna have limits on what you can do with some of them because of the licensing and the agreements. Yeah, definitely. It’s a big challenge for us, actually. Yeah, I can imagine. Okay, great. So one of the questions that Chloe is asking, and now this feels like a big step change for me, is when you talked about being able to read the content of PDFs, being able to get access to data in tabular format, in Excel, and so on. 
Can we just go over that a little bit and explain, you know, to what level of depth is that? Because I noticed that in the example you gave, it didn’t even seem to be labeled by age, but it recognized that sixteen to twenty five was an age range. So there’s obviously some contextual kind of understanding there. So can you just talk to us about where that’s fully cracked and where there may still be limitations? Sure. Yeah. I think that’s a huge step for us. And, yeah, you’re right: in the example I gave, it wasn’t fully labeled. I mean, we’ve got internal examples that are even less labeled and more complicated. But, basically, these LLMs at this point, the cutting-edge models, can really look at those infographics, tables, charts, etcetera, and pull meaning out of them almost as you or I would and put that into plain language, you know, plain English text, which can then be wrapped back into the answer. So we’ve done testing where we’ve taken insights PowerPoint decks, where there’s usually a callout at the top that’s explaining the context; we remove that, fed it in there, asked questions, and it does a human-level job at understanding those charts and so on. So a big step change. As I mentioned, it’s not just that they can summarize the charts and pull evidence out of them. You can also find that content now, because a big challenge with a huge repository of knowledge is that if the content’s in a chart and the chart’s not being read, it just won’t come up. You’re gonna find a less relevant text-based snippet there. And then Chloe’s question was also around whether we could do larger charts, like big Excel files and so on. Currently, what I described there would be the tables, charts, and infographics in the PPTs, the PowerPoints, and the Word docs that are going in. 
What we’re now working on and piloting with customers is a couple of different ways to take structured data, as she’s more describing. So the first would be a manual upload, or some kind of integrated upload, of Excel content, and then you could really query it with DeepSights. So in plain English, or whatever language you’re using, asking questions, it would then go to those Excel charts and read them. And the other is API-based integrations directly to structured data sources that are probably currently underpinning Power BI dashboards and so on, really circumventing the dashboard, going directly to the data source, and then making that sort of queryable as well. So those are big steps that are coming, kind of at the cutting edge, I guess, of what LLMs can do for us. Yeah, that’s really interesting. In fact, I think everyone here will probably be more or less subscribed to our weekly news email that will come out later today about the research tools. And there’s a link to an article by Ethan Mollick, who’s one of the sort of leading writers on AI among the academics. And he’s talking about how the latest Claude model took, you know, roughly labeled data in a table format and then created a perfect chart visualization for it and then gave narrative on that chart and data. So I guess this is a slightly different use case, but coming at this from a different angle, where the pace of change in some of these language models is quite dramatic. So, yeah, interesting to see how that develops. Steven, you’ve got a great question here, which is: could DeepSights be used to help with sort of internal back-office process documentation? So things like proposals, or screeners for qualitative research questionnaires. You know, could you load all that up and have it as almost your kind of knowledge repository for the process end rather than the output end? Yep. 
Yeah, great question. So I didn’t have a road map slide in there, but, I mean, as I described at the beginning, we’re a larger platform that’s, you know, sort of helping insights teams throughout their process, whether executing research, carrying out the research flow, or looking for answers to business questions based on that research. The next sort of iteration of offerings we’re bringing out, of course, in addition to improvements to DeepSights, are all around stuff like using Gen AI to prepopulate research briefs and then start actually creating research briefs, using the power of these LLMs. Also, using them to create meaningful assets. So we call them knowledge zones; they’re like these beautiful curated blog-like pieces of content, and we’re starting to leverage Gen AI to build those out and so on. So, yeah, that’s kind of the next step change, I would say: going from finding and summarizing knowledge into generating, creating proposals, creating kind of final content that can be shared within the business. Right. Okay. Because I was gonna ask you, you know, one of the historic features in the MarketLogic platform is a sort of research process management capability as well, isn’t it? I guess that’s what you’re describing here, which is, you know, where DeepSights goes. Yep. Exactly. So our research management offering is, yeah, one of our first sort of value props. A lot of our customers are leveraging it to execute their global research. It underpins the entire research flow. So there’s a lot of different aspects of that that we’re starting to bring Gen AI into. 
One piece is around creating the briefs, either via uploading some content and having the AI generate a brief for you. But we’re also able to plug in the kind of DeepSights output, in a little bit of a different view than I showed, so that it can be sourced directly into the research project, pulling in previous studies or letting you know about other research you may have done. So we’re now using Gen AI to solve a lot of the value props that we were always attempting to solve, and solving with the current technology; we can just do it much better now with Gen AI. So, yeah, our research management offering is coming along with the whole Gen AI wave. Okay, great. Tyler, your question, I think, overlaps with what I was talking about earlier, not specific to DeepSights, but I think, Joe, you hinted at that. So is there a future iteration where DeepSights could generate, you know, visualizations, display charts in the output? Yeah, sure, absolutely. So we’re experimenting with, I mean, I’ve even seen pilots that we’re doing internally with audio generation, you know, using some kind of third-party offerings to plug in there. But absolutely, around either generating tables, charts, and so on in the answer. It hasn’t been the first place we’ve gone; we’ve kind of focused on identifying answers to the business questions in the knowledge repository of our customers. And now we’re starting to iterate on all of the kinds of outputs and so on, working with our customers. So we’ve got a very good base of users from these big Fortune 500 companies, and they’re able to feed us with kind of the most pressing outputs and so on that they’d like to see DeepSights kinda generate. Great. The audio thing is interesting, actually. I’ve been looking a lot at the Eleven Labs audio capabilities for audio conversion of some of our content on the Insights platform site. 
And I would recommend anyone, I think you need to be in the US or the UK for this right now, if you are, download the Eleven Labs iPhone app and try listening to any bit of content, a blog post, a link, with some of the new voices they’ve got on there, because they’ve synthesized Laurence Olivier, Burt Reynolds, and a handful of other stars, and it’s incredible. So I’ve been listening to various blog posts voiced by Laurence Olivier, and it makes the news so much more dramatic. It’s fantastic. So, yeah, audio isn’t something we’d thought about as an output for the platform. Great. Martin, you’re asking a big question, which I think is, you know, a sort of broader philosophical one about the role of human partners and, you know, experts. If the AI is gonna be able to do more of the interpretation of different formats and inputs, where does the human partner focus? I don’t know whether this is what you’ve heard from customers, Joe, you know, or where they’re thinking about the skill set and the expertise they have in teams going forward. Yeah. That’s a great question. So, I think, I mean, we all see this also working with Gen AI in our own day-to-day work, that there’s still a huge role for humans in terms of understanding what the AI is doing, and that’s changing quickly. But as you stay on top of it, you’re aware of, you know, the quality, the trustworthiness of a given output, and where you as a human need to verify that, maybe ask the Gen AI to rework it, and so on. We’re trying to connect the dots on that with things like the caveats that I pointed out, so that we’re giving you a hint. Like, you’d be aware that there are likely contradictions in your own knowledge base, but you’d have to go search for them again or try to re-pose the question to the Gen AI. Right? So we point them out proactively, saying, look, here are some clear things you’d probably agree with if you go and read them. 
You’d get used to it as well as the end user, saying, yeah, that’s either a good catch, or, yeah, I know the AI is actually over-conservative for my taste, and, you know, I’d like to disregard some of that. And then we’re also moving into the agentic space; AI agents are kind of the next big thing coming beyond this kind of question-and-answer technology. And there, there will also be a huge role for the human to intervene at places within the agentic flow. So maybe the agent goes off and does some component like a research piece, comes back, and you as the human evaluate the evidence before you tell it, yeah, draft me up something here, and so on. So, also there, it’s somewhere we’re really gonna be plugging the insights expert into the whole flow, to more quickly and effectively generate the output. Yeah. Fascinating. You’re talking about the agents as an iteration, you know, coming. Actually, for those of you in the audience, if you can stick around for our next session, I recommend that you do, just to see agents brought to life in a very different kind of context. That’s with Verve, who have an agent-based solution that overlaps a little with this sort of knowledge synthesis, but then also with your primary research and your synthetic persona generation, using different agents to kind of accelerate the innovation and insights process. So it’s very much an evolving space. Okay. I did enjoy seeing the AI caveats as an explicit feature, because it does feel like we’re entering a kind of more realistic understanding now. We’ve had an explosion of capability, and we’ve all gone, wow. And now it’s like, okay, practically, where are the limits, and where should we be, if not skeptical, at least aware that it’s gonna need more work in different places? 
And if any of you missed yesterday’s session with mx8 Labs, I can recommend catching up on that, because it’s an end-to-end survey creation solution, but built natively with AI. And it’s a great example of, you know, an AI product leader actually underselling AI, showing you how it can be a little bit, you know, lacking in places. Very frank, very honest, and well worth a look on the replay. So do check that out if you can. And I see we’ve had some comments, some hellos from Mexico, Argentina, Chicago. Wonderful. Nice to know that we’ve got such a broad spread in the audience again today. Good. Do you know, I think that probably is a reasonable place to, you know, bring this to a close. I thought that was great, very interesting. Thanks, Caroline. You’ve just put a link there if people wanna go in deeper and find out more information. We also have several other webinars and thought leadership pieces with MarketLogic on the Insights platform site. You can go and download and check those out anytime you like. Joe, thank you very much for what I would say was flawless. Now, some of you may know that we often encourage the demo parts of these to be prerecorded, because technology isn’t always kind in live demos, but that looked to me like it ran without any hiccups in front of a live audience. So you can breathe a sigh of relief. Yeah. Thank you. Good stuff. Well, thanks very much, everyone. We’ve got about twenty minutes until our next session with Verve on AI agents. So a little bit of time to send some emails or have a little cup of tea. But, Joe, thanks very much to you and to the MarketLogic team. Thanks to you, Mike. Take care, everybody. Thanks. Bye bye.
Watch the Demo Days for Research & Analytics Tools to:
- Elevate your research with the latest tools and techniques.
- Discover data analytics to enhance your reporting and insights.
- Learn about the latest developments in research and analytics.
- Stay ahead of the curve on innovations in AI.
Join hundreds of researchers, designers, consultants, product managers and insights professionals at this free virtual conference.
Market Logic Session Description:
Market Logic’s end-to-end market insights platform helps the world’s leading brands to generate and capitalize on insights. Our AI software enables customer-centric business decisions with a comprehensive suite of tools for insights professionals and an intuitive AI insights assistant for business users.
Our insights engine connects all your consumer and market data and tools to leverage existing investments. Our DeepSights™ AI assistant equips decision-makers to get insight-rich answers to business questions 24/7, and our DeepSights™ Workspace offers a suite of expert tools for market research teams to increase the impact and accessibility of their insights teams across the business.
Market Logic’s longstanding industry leadership has been recognized by many independent analysts including AI Breakthrough Awards, Forrester Research, the Market Research Society’s Best Data Solution Award, and the BIG Innovation Awards.
Join our DeepSights™ session to learn how to:
- Get direct answers to market questions in seconds from AI specially trained for insights
- Generate synthesized summary reports drawing from research studies, trusted newsfeeds, and secondary content subscriptions
- Integrate insights into your daily workflows via connectors to MS Teams and Google Chat, or other business applications via APIs
- Keep stakeholders easily up-to-date with AI-curated newsfeeds and newsletters
