All right. Great. Thank you, everybody, for joining. My name is Joe Rini. I’m director of product management at Market Logic, where I lead our development efforts around our AI-powered platform. And I wanna thank everybody for joining today, kinda midmorning, day two. I’m actually from Canada, but I’ve been in Germany for a decade, so I’m riding the jet lag, so to speak; it’s more like late afternoon for me. But, again, thanks to everybody for joining. So I’m thrilled to be here alongside Nancy Stettler from Mars, and we’re gonna talk today all about how Market Logic and Mars are working together to really drive AI in consumer insights. I’ll quickly turn it over to Nancy to introduce herself.

Hi, everyone. I’m Nancy Stettler. I work for Mars Petcare. I’m out of the Nashville area, and I’m happy to be here today. I work in global consumer technical insights, and I also do capabilities management. So that’s why I’m here today with our Market Logic team: to talk about Synapse and the DeepSights function. And as you saw in our title, it’s all about knowledge management. So as you think through the conversation today, maybe have in the back of your head some of the things you do right now when you’re about to do research: what are the steps you take to look through what’s come before, and what can you leverage? We’re gonna talk a little bit about that, but I’d also welcome your questions and your input on that at the end.

Great. So the basic plan is a quick introduction to Market Logic and to DeepSights, a little bit about the partnership with Mars, and then I’ll turn it over to Nancy to learn about her case study in how DeepSights is really helping them in action, and then, of course, some questions at the end. So first of all, who’s Market Logic? Many of you may know us.
We’ve been in this space for about fifteen years. And by and large, we’re a SaaS platform. We work with some of the biggest companies in the world and their global insights teams, providing them with the platform to store, curate, analyze, and really understand all of their market and consumer insights, right, across brands, across geographies, and so on. So in the typical case, we’ll really have global buy-in, and all of the content that our consumer insights teams are dealing with is serviced to stakeholders in our platform. Our main user base will be those kinds of users, but we’ll also have all of the business stakeholders throughout the organization, ideally coming to our platform either to self-serve content or to view curated content that our insights teams are building in the platform. And I always like to frame this a bit in terms of the type of people who might be in the audience. So, of course, the one group is all of the insights teams from these big organizations who may be using us, or who may one day be using us. But all of the research suppliers, the vendors who are providing ad hoc research, qual/quant, DIY research and so on, data panels: we’re also probably an important consideration for you to have in mind. In speaking with a lot of you over the years at these kinds of conferences, and with all of our customer base, there’s always this feeling that a lot of the time there’s a little bit of a white space in terms of understanding where your content actually ends up once you serve it to your customer base. Right? So of course, there’s the one-time decision made on top of the research you’re providing. But where is that content ending up? Who in the organization is seeing it? How is it being disseminated? How is it being looked at going forward? So I think that’s an important consideration that that side of the house could also take out of a talk like this.
So I don’t wanna spend much time on this; it’s really how we frame the problem. This is why we started as an organization, and it’s still a problem that we see in the space. Quickly said, it’s this trade-off between the idea that there’s pressure to innovate, there’s also pressure on costs, and at the same time there’s this perception from the top down, not just in the insights community but really coming from even the C-level, that insights are not always or consistently leveraged in business decisions, right, which I think we can all relate to. And that’s something that Market Logic tries to address in a number of ways, more and more so by leveraging Gen AI technologies, which is what we really wanna talk about today. So what we offer is, of course, a Gen AI platform. We sit on top of all of our customers’ content, and we really try and split that into three pillars. The first piece is around democratizing content. That’s what we’re really talking about today. But we also have some offerings where users can be synthesizing and building content in the platform. And with this whole Gen AI move, we’re starting to move more and more into automated, AI-agent-based offerings, which we won’t focus on today, but that’s where we see the technology going, and, you know, happy to talk about it in the Q&A if it makes sense. So DeepSights is now really the focus of what the platform does. DeepSights reads through all of your content: trackers, charts, structured data, reports, visual data within those reports, transcripts, news RSS feeds, and any integrated content that our customers will be looking at, such as Mintel reports and so on. That’s all plugged into our platform in a single space. And then we’re really leveraging the power of Gen AI to provide a synthesized, immediate, real-time answer to business questions.
So I don’t wanna spend too much time demoing through the platform. I just wanna quickly highlight what the tool looks like and some of its key attributes, so you have some context when Nancy starts describing how they leverage it in their day-to-day work. So this is a short look at what DeepSights looks like for the end user. And what you’ll notice is, first of all, this is all generated by a large language model. It’s underpinned by all of that research that we hold on behalf of the customers, but ultimately this answer is generated by a large language model for the end user. So you see, first of all, we have a synthesized answer generated across all those content types. We really focus on ensuring that any of the output is tied to a statement, a fact, a data point from within that trusted set of knowledge. And we make an effort to always source and cite those sources, which you can see here on the right-hand side as well as in text. So you can think of it like the technology that you’re used to from a ChatGPT or Copilot, but underpinned by all of the content which the customer holds, to ensure that it’s fact-based, that there’s no hallucination, and so on. And then we’ve built in a lot of features that we’ve worked on together with our insights community to ensure that it’s tailored to the way that they work. So the first thing we do is give the freedom to really tailor the sources. We give a first pass at figuring out the most relevant sources to a business question to create the answer. But we ultimately give the user the ability to go, you know what? That study is not relevant. I’ll take that out because I don’t wanna send that answer off to my stakeholder. Let’s bring in a couple other studies that I really wanna focus on here.
We also use the large language model to reflect upon the answer, and you can see that on the screen, where we actually ask it to go, look, here’s the answer, here’s the evidence: what are the inconsistencies you may see here? Are there issues with the data potentially? Are there contradictory statements? These are all things that an insights person or a marketer could, of course, see themselves, but we let the large language model do a lot of the work to put that in front of the user quickly, give them a hint, follow-up questions they might want to take a look at, and so on. We also leverage best-in-class models to read through all the visuals, tables, infographics, and so on in all these reports. We know that a lot of the content in these reports is actually stored in these types of visual pieces, and we’ve been able to tap into them as well. So that experience is actually drawing on the full set of the content found in the reports. And then, just this quarter, and really kind of cutting edge in terms of what these large language models can do now, we are able to integrate structured databases. So any kind of brand trackers, sales data, NPS scores and so on that insights teams or their business stakeholders need to source from structured databases, they often have to go to a data analyst or another data team who’s holding that data. We can now loop that into DeepSights. I don’t wanna spend too much time on it, but technically, what we do is run a SQL query based off the user’s question. The end user doesn’t see any of this; they just get the answer back, and we’re combining all of the insights coming out of the structured content together with all that unstructured content feeding the answer. So in terms of the Mars partnership, just before turning it over to Nancy to really talk about the use case: we’ve worked together for ten years now, and we are on v3, version three, of what we really consider our platform.
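As a rough sketch of that structured-data "tap-in" flow: a question is converted to SQL, run against a database, and the value is folded back into the answer. The question-to-SQL step is mocked with a lookup table here (in the real product a language model generates the SQL), and the table, metric, and question are invented for illustration, not Mars's actual schema.

```python
# Illustrative sketch only: the lookup table stands in for the LLM's
# text-to-SQL step, and brand_tracker is a made-up example table.
import sqlite3

QUESTION_TO_SQL = {
    "average nps score": "SELECT AVG(nps) FROM brand_tracker",
}

def answer_structured(question, conn):
    """Return a hard number from the database, or None to fall back
    to the unstructured (report) search path."""
    sql = QUESTION_TO_SQL.get(question.lower())
    if sql is None:
        return None
    return conn.execute(sql).fetchone()[0]

# Toy in-memory database standing in for a customer's tracker data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE brand_tracker (brand TEXT, nps REAL)")
conn.executemany("INSERT INTO brand_tracker VALUES (?, ?)",
                 [("BrandA", 40.0), ("BrandB", 60.0)])
```

The end user never sees the SQL; they only see the returned figure cited in the synthesized answer.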
Twenty twenty-one was a big period in our relationship, when Mars went through what they called the Zero Waste Initiative: looking at all their tools, platforms, and so on, amalgamating them, and deciding what to continue with. We were able to continue on with them. In ’twenty-two and ’twenty-three we really worked through the v3 of the platform. And then finally, just last year, they adopted DeepSights. And Nancy will talk more about how they’re actually leveraging it, and some of the usage stats around how they really use the DeepSights piece, which I showed. The tool’s been live since twenty twenty-three, and it’s been a slow and gradual adoption across both our existing customer base and, of course, all the new customers that we’re taking on. And I don’t wanna overly focus on the exact number here, but this just gives you a rough estimation of the volume and the value of the content that we’re holding on behalf of customers like Mars. So, potentially, seven hundred million plus worth of their knowledge assets are summarized, contained on our platform, and able to be served up to the end user kind of instantly. And then another really key point is that we wanted to focus on all of these partners that you see on the right-hand side. So, you know, your Euromonitors, Voxpopme, Zappi: a lot of the players actually out there in the hall today, we’re integrating with them. So Mars is doing research with those guys, but then we’re able to pull the end outputs through into our platform, making it available going forward. And with that, I’ll turn it over to Nancy to take us through. Thank you.

Thank you. Yeah. So thinking back to what I said earlier about knowledge management, right?
And when you’re about to do some research or you’re kicking off a project, sometimes it’s great to go back and look at what’s been done before. But sometimes it’s not so easy to find that research. Right? Depending on where you’re at, maybe you’re using shared drives from previous research, you could be using SharePoint, or it could be coming from a lot of different places. And then sometimes it’s a lot of tribal knowledge. Right? You’re asking your coworkers, hey, do you know who did this research before me? I need to chat with them on this. And sometimes it ends up being a lot of work, right, just to get that knowledge harvest going, looking back to see what sort of insights had been gleaned previously. So the great thing about Synapse is it makes it a one-stop shop for our team. It really enables us to go at the speed of innovation, essentially, and also helps us leverage those insights that we’ve gained previously more effectively. So, essentially, I just gave a case in point here, since I work in pet food, around dog kibble. We could say, maybe our marketing insights partners have discovered that, hey, we have a proposition: we wanna make mealtime a little bit more like what we, the pet parents, would have. So let’s say we’re gonna go with a flavor that’s more like a roasted chicken dinner. What does that look like? So then the first questions you might have, as someone who’s leading that research and building those learning plans, might be: okay, how do I give guidance to the product development team to really bring this to life? So I’m gonna look into what’s been done before. Hop into Synapse and ask some questions of DeepSights within that initiative. And then I might ask questions very simply like, hey, what do we know about dog kibble appearance and how it’s appealing to pet parents? Right? Because I’m gonna start thinking, okay.
How do I make this kibble look a little bit like a roast chicken dinner, even though we know it’s kibble? So we’ll start asking questions around that. We’ll also know that aroma could have an impact as well, but how much does that matter? So those could be more questions we could ask DeepSights. And what it’s gonna do is pull out all of the research that’s been done previously. It’ll mark it up a little bit like Joe showed on the previous slide and tell you where it’s pulling from. And then on top of that, you can ask it to generate a report, which is great, because now you don’t have to pull it all together yourself. It pulls this report out, and then you can share that back to your team as well as the key stakeholders. So as you’re giving your project updates to key stakeholders, you can say, hey, this is what we knew before, and this is how we can move forward and how we’re gonna build our learning plan based on that previous knowledge. Right? So now you have this, and it gives you some guardrails to take back to your product development team: hey, this is what we know about what it should look like, here’s what we know about aroma, let’s work on this now to get some prototypes out there to show to pet parents. Right? So while they’re working on that and doing their magic, I’ll start thinking, okay, I’m gonna do some groups here, maybe some focus groups or maybe some home-use testing. But I wanna look at what research has been done previously again, and leverage it not just for the insights now, but also for how they did it. You know, do they have examples of moderator guides that they used? Do they have examples of questionnaires that I could leverage as well? That’s also in here. So it’s great. You get a foundation, and then you just customize it for your piece of research moving forward. And so that’s what you do.
And then you do your research, and you’re like, okay, great. It then becomes pet-parent approved. Long story short. Right? Pet-parent approved, and you know that it looks good, it smells good, and their dogs love it. So that’s what you do. You can use Synapse throughout that process without having to repeat or duplicate what research has been done before, really leveraging it instead. And the key thing here too is: how do you really drive awareness within an organization? As part of my role within capabilities for the consumer technical insights team, I really do have to help my teammates understand how to leverage this tool. Right? But it’s not only my teammates; it could be within our marketing organization or the broader R&D. So these trainings could be direct to users, or they could be done more indirectly, maybe embedded into other trainings, or even, as I said earlier, as you’re giving your project updates to leadership or other key stakeholders, you can show how you used Synapse in that process. And it’s great. We usually include links in that as well, so people can go back and click on it themselves. And now you have leadership popping in there and doing their own research as well, just to become a little bit more embedded in it and more aware. So those are some examples of people and how you would do it. And I have an example here on capabilities training. Sometimes you might have someone coming new into the organization, or they’ve just moved roles within R&D, let’s say. And now they’re working on innovation projects and they need to understand that stage-gate process. Right? So you can show how Synapse can be embedded throughout that stage-gate process to help build your learning plans. And then there’s DeepSights. Before that was added into Synapse, it was just kind of a search, and it would pull out different reports and whatnot.
But now that you have DeepSights, and it really shows you the references and turns it into a report, I think it really just catapulted the use of Synapse within the organization. You can see in Q4 it went up quite a bit, then we really hit some record numbers, and it continues to grow moving forward. And it really does help our innovation teams move forward. And it’s not just for innovation, either. You could use it even for change management, and the sales teams use it as well. So it just keeps growing kind of on its own now at this point. Word of mouth, I guess you could say. Yeah. And that’s it. So I know I asked that question upfront about what you do right now to get your information. If anybody has any questions on that, and maybe on how this works, we’d love to hear them. Go ahead. You definitely can. The great thing is, it does show the references and the articles or the research that it’s pulling from. So you definitely have that lens; you can see how old the research was. I do believe you can actually take some of that research out and say, you know what, I wanna redo this now and not utilize that particular research. Does that sound right, Joe? Yeah. Good to hear. So that’s a good way to put it. I would say, purely from the product side, what we do is run a search experience. We have an algorithm in there to try and prioritize recency versus relevancy, of course biasing towards more recent content. But we also wanna show highly relevant content that maybe is a bit older. It’s often dependent on the nature of the type of content. There are big studies out there from a couple years back that are still super relevant. And then we have a bunch of features, as Nancy mentioned, to let you go in, remove older content, or bring in older content if it’s actually relevant but didn’t make it through.
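A minimal sketch of that recency-versus-relevancy trade-off: each source gets a blended score, biased toward recent content but still letting a highly relevant older study surface. The weights and half-life here are illustrative numbers, not Market Logic's actual algorithm.

```python
# Illustrative ranking sketch: blend a relevance score with an
# exponential recency decay. Weights are assumptions for the example.
import math

def blended_score(relevance, age_years, half_life=2.0, recency_weight=0.3):
    """relevance in [0, 1]; recency halves every `half_life` years."""
    recency = math.exp(-math.log(2) * age_years / half_life)
    return (1 - recency_weight) * relevance + recency_weight * recency

# A very relevant four-year-old study can still outrank a
# mediocre brand-new one under this weighting.
old_but_relevant = blended_score(relevance=0.9, age_years=4.0)
new_but_mediocre = blended_score(relevance=0.4, age_years=0.0)
```

Tuning `recency_weight` up or down is exactly the "biasing towards more recent content" knob described above.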
And that whole effort of really trying to show the sources, even the pages within the sources that have been referenced, is all an effort to make it really obvious what content’s being brought in, why, and so on. And that little AI watch-outs piece that I showed, the reflection piece by the large language model, factors in there as well. So it’s gonna say, hey, that study’s maybe a bit older, but here’s why it was surfaced, and so on. And just generally, in the best case, we would actually have all the content that we would have done with Mars together since twenty fifteen. Depending on the situation, customers may elect to put just a more recent subset of that through the Gen AI piece. The rest is still available in a regular search experience to go back to, but maybe just the more recent stuff’s been ingested by the Gen AI piece and made available to DeepSights. And just one follow-up. Sure. Yeah. You can actually add that in as well. Absolutely. It is. It’s across the whole organization. So sometimes you might be asking a question about supplementation, for example. And maybe I’m really focused on pet care, but it’s also gonna show maybe what we’ve learned about supplementation in humans. Right? And that could also be relevant, because usually that’s a leading indicator: hey, what’s happening with us as people that we can later leverage and use for our pets? So yeah. Mhmm. Yeah. I would just add to that. I think, especially in this kind of previous era, the whole idea of taxonomies around all of the content, so tagging the content with, let’s say, the various categories that would be relevant to, say, Mars, was highly important. You could then filter on particular taxonomies and so on.
Still important, but with the improved search capabilities that come with this whole Gen AI movement, so to speak, the system is able to really understand the question, actually find the stuff, even though it may be from Petcare, from a different group, and try to show it to the most relevant user. And then the flip side is there’s also a permission consideration. Some content within the organization can only be viewed by certain teams and so on; just think of R&D content or M&A reports that shouldn’t be seen by everyone. Of course, we have permissions around that at the user and user-group level. And that all filters through to the answer you saw there. So you would actually only see an answer based on sources that you yourself have access to. That’s how we handle that. Yeah. Go ahead. Yeah. The question was how we can personalize feeds and so on by, let’s say, title or different positions within the organization. Absolutely, yes. In the DeepSights part you saw there, there are some of the things I just mentioned in terms of that personalization. We also have feeds, news feeds and so on, in the main platform, which we didn’t highlight today, that are personalized by your behavior in the system, your title, and so on. And actually, again, Gen AI and large language models bring an added capability there as well. So we’re starting to work on tools that build up a personalized profile of each person in the system, and that can then reflect back against the content to build personalized feeds. So yes, we have it now, but it’s also improving with this whole large-language-model world. Go ahead. Yeah. Absolutely. You can definitely use that. You’re right, sometimes you do get the same questions on and on. It doesn’t hurt to go back in and check again, because you might have more recent research.
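A small sketch of that permission filtering: before an answer is generated, retrieved sources are cut down to those the asking user's groups may see, so the answer only ever cites accessible content. The group and document names are made up for illustration.

```python
# Illustrative permission gate: an answer is only built from sources
# the asking user is allowed to see. Names here are invented examples.
from dataclasses import dataclass, field

@dataclass
class Document:
    title: str
    allowed_groups: set = field(default_factory=set)

def visible_sources(docs, user_groups):
    """Keep only documents sharing at least one group with the user."""
    return [d for d in docs if d.allowed_groups & user_groups]

docs = [
    Document("Kibble aroma study", {"insights", "rnd"}),
    Document("M&A due diligence report", {"executives"}),
]
```

Running the answer pipeline on `visible_sources(docs, user_groups)` rather than `docs` is what makes two users see different answers to the same question.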
But you’re right: you could ultimately build FAQs from that as well. I’m not sure that’s necessarily embedded within it currently, but... Right. Good thought. Yeah. And we have something called knowledge zones, which are kind of like a microsite or a blog that users can curate in the system. We have customers leveraging those to build up a permanent or semi-permanent repository (they update it manually) around a particular topic area, often used for onboarding or key topic areas. Yeah. Go ahead. I don’t know if you can really always set a hurdle, because it just seems like every project is so individual. Right? But it definitely can give you some guardrails. I don’t know if you would necessarily always set hurdles with that, though. But yeah. Mhmm. Sure. Mhmm. Mhmm. It’s all of them. It’s all of them. And it will also search out some secondary sources, with which we might have partnerships, that are also embedded into that. So absolutely. Yeah. Yeah. I think Mars is a good case of that. You can imagine other customers where it’s more of a challenge to get all of the content in, or, like you said, some of the concepts or reports that didn’t make it through. In the ideal world, of course, they’re in there too, because there are learnings in them as well. And that’s a change-management piece that we also work with the customers on. So, yeah. Good question. Go ahead. Yeah, good question. So the question was, how is it getting the actual hard numbers that you saw in some of the answers that I showed there; you know, it had data points and so on. There are basically three ways, let’s say. First of all, it can pull what might have been a statement in a report, either in text, or in a table, an infographic, and so on in that report. It’s pulled that out and simply displayed it in the answer, if that’s the case.
It also might have been doing that text-to-SQL database tap-in thing that I mentioned, where it’s actually taken the query that the user asked (I think it was something like, what do we know about millennial spending), converted that to an actual SQL query in the back end, which the user doesn’t see, run it, pulled the data point from a database, and brought it back. At the limit, it can also do small calculations within the large language model’s capability, technically speaking. So it can divide two numbers, if that’s in the context of the question the person was asking, and give you the percentage, let’s say, or something like that. With the SQL piece, it can do more of that natively on the SQL side and bring it back. But the model itself doesn’t do much math or calculation; it tries to stick to what’s expressly stated in the report and so on. Yeah, good question. Go ahead. Well, I think that’s really where it’s important. You might see those conflicting data points, and it could also just be how the research was done and how those data points were acquired. How was the research conducted, and how was that data point acquired? That’s where it’s really important that someone in my field, consumer technical insights or consumer marketing insights, can help add that additional lens, so you take good care when you’re looking at that. Right? Because that can happen, especially depending on how the research was conducted. So, go ahead. Yeah. Good question. So the question is, where is all that data stored? For all of the, we call them internal documents, all the reports and so on that Mars would be uploading, we store them in the cloud, in the Google Cloud environment. So we hold that content. We also then process it; that’s what makes it available to that Gen AI DeepSights piece I showed you. For any of the, well, not for any.
For the third-party content that we’re integrating with, there are a number of different ways we do it. Technically speaking, we can do some API calls out to them, search their environment, and surface it in real time. We also have integrations where we actually host their content. And there are variations within that. But by and large, the core content, we hold it on our customers’ behalf in the cloud. It’s a report, actually. Yeah. So you can have it put into a report for you, for sure. Mhmm. Yeah. Good question. Quickly, on dashboards: currently, we don’t build, let’s say make, dashboards out of the data. We can host them, though, on behalf of the customers. So we kind of iframe them in from wherever they’re sitting. That’s something customers sometimes do in one of these knowledge zones I mentioned. But ultimately, the end report is this text-based report. Go ahead. Yeah. Yes. Yeah. Good question. It was, what if it doesn’t have an answer? Do you see that often? Nancy, do you have something to say on that? I don’t, really. But sometimes, yeah, I suppose I’ll get a question, maybe within our R&D organization, that’s something very specific. It might even be around nutrients and understanding more. And it could just mean that the question they’re asking is so detailed, it’s not something that you glean a specific insight from. It might be a little too technical, and that can happen. Right? So it does happen. It just means that you’re not gonna have that for your knowledge harvest; you’re not gonna have an insight on it. So that means you gotta discover it. Right? Yeah. Mhmm. Yeah. No. Yeah. Yeah. So the question was, it’s common enough, right? She’s working with another internal Gen AI platform that’s large-language-model based, and the finding is that it’s answering everything, just writing away answers and so on.
So that’s one of the major value props of what we tried to build. By forcing the system to ground the answer in the content that we hold, everything in there is coming from that content, because otherwise the large language model can just go off and spew out whatever it wants. So we explicitly prevent that from happening. And we have a couple of actual hard-coded features. If there’s really no content found, it will simply say we don’t have any content on that. If there’s limited content, it will give you a smaller answer, clearly reflecting the fact that there’s not much data there, and so on. And then that AI watch-outs piece is also meant to go, hey, look, there’s not much content on this; it’s a white space. Yeah. Good question. So we have about a minute left. Was there a further question? Yes. Go ahead. Yeah. The question was, you ask a specific question, but does it then somehow take in the related, let’s say, subset of questions around it? It might. Or at least it gives you thought starters sometimes. Right? So then you go, oh, wow, maybe I need to dig into this a little bit more. But usually it’s pretty close to what you’re asking. Obviously, it comes from research, and maybe sometimes that research was conducted for something very different. Right? But you’re gleaning insights from it, so it’ll still relate back a little bit. So yeah, it can happen for sure. Yeah. For us, it’s an interesting trade-off. It depends on the different customers, but sometimes people will say, oh, hey, this answer is actually veering a bit from what I asked about. On the other hand, people are saying, hey, that’s great, you’re showing me something from the overall space that’s related.
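A minimal sketch of the hard-coded grounding guards just described: no retrieved content means an explicit "no content" response rather than free generation, and thin evidence gets flagged. The threshold and messages are illustrative, and the synthesis step is replaced by simple concatenation to keep the sketch runnable.

```python
# Illustrative grounding guard. In the real system a language model
# synthesizes from the retrieved passages; here we just join them.
def grounded_answer(question, retrieved_passages):
    if not retrieved_passages:
        # Hard refusal: never let the model answer without evidence.
        return "We don't have any content on that."
    prefix = ""
    if len(retrieved_passages) < 3:  # assumed "limited evidence" cutoff
        prefix = "[Limited evidence] "
    return prefix + " ".join(retrieved_passages)
```

The point of the guard is that the empty-retrieval branch runs before any generation happens, so a white space in the knowledge base surfaces as a white space, not a hallucination.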
In this report generation that Nancy spoke about, we actually explicitly do something in the back end where we generate follow-up questions using a large language model. We don’t show them to the end user, but then we have five or six sub-questions which we pose to the knowledge base as well. So we’re explicitly broadening out the question, like you referred to. But it somehow intuitively happens in any case, just based on the way it searches across the content. Okay. That takes us to exactly the thirty-minute mark. Thank you very much, everybody, for your time.
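A sketch of that back-end question fan-out: the user's question is broadened into a handful of sub-questions, each posed to the knowledge base, and the retrieved evidence is pooled before the report is written. The fixed templates here stand in for what a language model would generate, and all names are illustrative.

```python
# Illustrative fan-out: fixed templates stand in for LLM-generated
# sub-questions; `search` is any callable returning document ids.
def fan_out(question, n=5):
    templates = [
        "What do we know about {q}?",
        "What recent research covers {q}?",
        "What are the key drivers behind {q}?",
        "Where do findings on {q} conflict?",
        "What gaps remain in our knowledge of {q}?",
    ]
    return [t.format(q=question) for t in templates[:n]]

def pooled_evidence(question, search):
    """Run the original question plus its fan-out, de-duplicating hits."""
    seen, pooled = set(), []
    for q in [question] + fan_out(question):
        for doc in search(q):
            if doc not in seen:
                seen.add(doc)
                pooled.append(doc)
    return pooled
```

The user only ever sees the final report; the sub-questions exist purely to widen retrieval, which is why answers sometimes feel slightly broader than the question asked.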
In April 2025, Market Logic and longtime partner Mars presented at the Quirks Conference in Chicago, IL. Our presentation, “Reimagining Knowledge Management in the Age of AI,” explores how Mars and Market Logic have partnered to reshape Mars’s insights capabilities by leveraging new gen AI research workflows.
Presented by Joseph Rini, Director of Product Management at Market Logic Software, and Nancy Stettler, Global R&D Sensory and Consumer Scientist at Mars Petcare.

Session Details:
Today’s insights and market research teams face increasing pressure to deliver timely, relevant insights on demand. But how do enterprises like Mars not only ensure that their consumer insights are accessible but also drive measurable impact at scale?
This session will look at the evolving role of the insights function in the age of AI. Join Market Logic and Mars to discuss how teams are leveraging reimagined knowledge management technology to unite enterprise functions, centralize their knowledge assets and embed insights directly into daily business decisions.
We will also take a look at the future trajectory of AI-powered insights technology and outline the best practices around how enterprises can deploy these rapidly evolving solutions effectively. The end result? AI that helps you maintain competitive advantage and fosters a culture of insights-led innovation throughout your organization.
Key takeaways:
- Strategies for democratizing insights and embedding them into everyday business operations through AI-enabled knowledge management.
- The role of AI in generating actionable, trusted consumer and market insights – and how insights teams can work strategically with AI.
- Use cases: A firsthand look at how Mars is scaling the impact of its insights across enterprise functions.