Securely leverage all your enterprise data to build scalable and trustworthy generative AI (GenAI) applications with an agile architectural pattern.
Elevate skills and productivity across the organization with AI-enhanced applications. Progress® MarkLogic® allows you to provide large language models (LLMs) with your domain-specific knowledge to democratize access to information across your organization.
Augment large language models with your enterprise information and business rules to improve the validity of generative AI responses.
Create a rich, trustworthy content source for any generative AI model to surface insights with repeatable confidence.
Enhance search with securely personalized recommendations and human-digestible insights for better user experiences.
Build fact-based applications that can interpret intent and enable smarter operations to multiply your workforce productivity.
The effectiveness of generative AI depends on the data it uses to generate results. Graph Retrieval Augmented Generation (RAG) allows you to augment generative AI prompts by securely incorporating enterprise data, guiding the model to generate contextually relevant responses and reducing hallucinations and data biases for more accurate AI outputs. Connecting LLMs and knowledge graphs empowers generative AI models with access to private data and a deep understanding of that data to retrieve fact-based information about your enterprise.
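Conceptually, the pattern comes down to retrieving trusted content first and letting the model answer only from that content. The sketch below is a minimal, framework-free illustration of that prompt-augmentation step; the retrieve() function and the sample policy passages are placeholders standing in for a real MarkLogic query.

```python
# Minimal sketch of the RAG pattern: retrieved enterprise content is injected
# into the prompt so the LLM answers from facts rather than from memory alone.
# retrieve() is a placeholder; in a MarkLogic-backed system it would run a
# search or SPARQL query against your documents and graphs.

def retrieve(question: str) -> list[str]:
    # Placeholder: return the passages most relevant to the question.
    return [
        "Policy 14.2: Customer data may only be retained for 24 months.",
        "Policy 14.3: Retention exceptions require legal-team approval.",
    ]

def build_prompt(question: str) -> str:
    context = "\n".join(f"- {passage}" for passage in retrieve(question))
    return (
        "Answer the question using only the context below. "
        "If the context is insufficient, say so.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )

print(build_prompt("How long can we keep customer data?"))
```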
Webinar
Watch our hands-on webinar and learn how to design and build a robust semantic RAG workflow for trustworthy AI applications.
The Progress MarkLogic platform combines multi-model data management with real-time, relevance-based search and semantic capabilities to provide an adaptable, secure foundation for your generative AI solutions.
Easily switch generative AI models to adapt to new business requirements or take advantage of technology advancements. With the MarkLogic platform’s flexible data model, you can use the same memory and data against multiple generative AI models to enable a variety of use cases without incurring the extra costs of re-indexing.
Harness the wealth of your enterprise content and provide a rich information set to your generative AI models. The MarkLogic platform integrates diverse data sources and formats, including structured and unstructured data, to create a curated, consistent and high-quality data source for your AI-enhanced applications with easy model-driven mapping, entity modeling and smart mastering.
Improve the robustness of LLM responses with a semantic knowledge graph as your AI model’s external long-term memory. MarkLogic allows you to store, index and search RDF triples and enrich your data models with new semantic relationships and metadata to provide enhanced context for your AI systems.
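As a minimal sketch of how an application can query such a knowledge graph, the example below sends a SPARQL query to the MarkLogic REST API's /v1/graphs/sparql endpoint. The host, credentials and the ex:hasTopic predicate are assumptions for illustration.

```python
# Query RDF triples in MarkLogic through the REST API's SPARQL endpoint.
# Host, port, credentials and the example predicate/IRIs are assumptions.
import requests
from requests.auth import HTTPDigestAuth

MARKLOGIC = "http://localhost:8000"            # assumed REST app server
AUTH = HTTPDigestAuth("rag-user", "password")  # assumed credentials

sparql = """
PREFIX ex: <http://example.org/ontology#>
SELECT ?document ?topic
WHERE { ?document ex:hasTopic ?topic . }
LIMIT 10
"""

response = requests.post(
    f"{MARKLOGIC}/v1/graphs/sparql",
    data=sparql,
    headers={
        "Content-Type": "application/sparql-query",
        "Accept": "application/sparql-results+json",
    },
    auth=AUTH,
)
response.raise_for_status()

# Standard SPARQL JSON results: iterate over the variable bindings.
for binding in response.json()["results"]["bindings"]:
    print(binding["document"]["value"], "->", binding["topic"]["value"])
```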
Power natural language search and human-centric experiences with multi-model, real-time search and a unified query API. The MarkLogic native search engine identifies the most relevant information to answer a user’s question with comprehensive indexing, relevance ranking, and co-occurrence and proximity boosting, returning high-confidence results.
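For illustration, the sketch below runs a relevance-ranked search through the REST API's /v1/search endpoint and prints each hit's score and URI; the connection details and query text are assumptions.

```python
# Relevance-ranked full-text search via the MarkLogic REST API.
# The default JSON search response returns results with uri, score and snippets.
import requests
from requests.auth import HTTPDigestAuth

MARKLOGIC = "http://localhost:8000"            # assumed REST app server
AUTH = HTTPDigestAuth("rag-user", "password")  # assumed credentials

response = requests.get(
    f"{MARKLOGIC}/v1/search",
    params={"q": "data retention policy", "format": "json", "pageLength": 5},
    auth=AUTH,
)
response.raise_for_status()

for result in response.json().get("results", []):
    print(f"{result['score']:>8}  {result['uri']}")
```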
Significantly improve search relevance to maximize the retrieval accuracy of your RAG systems. The MarkLogic native vector operations allow you to store vector embeddings close to your data in JSON or XML format and perform large-scale similarity searches, effectively prioritizing the content that best matches the user's question.
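The toy sketch below illustrates the underlying idea: candidate chunks are ranked by cosine similarity between their embeddings and the question embedding. In a MarkLogic deployment this scoring would run server-side with the native vector operations; the embeddings here are made-up values.

```python
# Illustrative vector retrieval: score candidates by cosine similarity to the
# question embedding and keep the best matches. The embeddings are toy values.
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

question_embedding = [0.12, 0.88, 0.30]
candidates = {
    "/policies/retention.json": [0.10, 0.90, 0.25],
    "/policies/travel.json": [0.85, 0.05, 0.40],
}

ranked = sorted(
    candidates.items(),
    key=lambda item: cosine_similarity(question_embedding, item[1]),
    reverse=True,
)
for uri, embedding in ranked:
    print(uri, round(cosine_similarity(question_embedding, embedding), 3))
```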
Take AI projects from incubation to production with robust security and unmatched scalability. The MarkLogic advanced security controls tightly couple role-based and query-based access to the content used by GenAI to generate the answer users get, helping to elevate data privacy. Automated lineage and provenance explain how generative AI models reach conclusions and reference the sources generating the response to build trust.
Progress MarkLogic and Progress® Semaphore™ can enhance generative AI’s answers with enterprise data and SME knowledge, improving AI trustworthiness.
Accelerate your AI implementation with our RAG examples and sample code for the most common AI use cases.
See examples of how to split text into smaller chunks that can be stored in the same document or as a separate document.
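As a hedged sketch of that chunking step, the example below uses LangChain's RecursiveCharacterTextSplitter; the chunk size, overlap and file name are assumptions to adapt to your content.

```python
# Split source text into overlapping chunks and shape them as one JSON document.
# Chunk size, overlap and the input file are assumptions; chunks could equally
# be written as separate documents.
from langchain_text_splitters import RecursiveCharacterTextSplitter

splitter = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50)

source_text = open("contract.txt", encoding="utf-8").read()  # assumed input file
chunks = splitter.split_text(source_text)

# One possible shape: keep all chunks inside a single JSON document.
document = {
    "sourceUri": "/contracts/contract.txt",
    "chunks": [{"sequence": i, "text": chunk} for i, chunk in enumerate(chunks)],
}
print(f"{len(chunks)} chunks prepared")
```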
See how to build a RAG retriever for your AI application using a text, semantic or vector query.
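A minimal retriever along those lines, sketched here against the MarkLogic REST API (connection details and database content are assumptions), runs a text query and pulls back the top documents to use as LLM context; the same pattern extends to semantic (SPARQL) or vector queries.

```python
# A simple text-query retriever: search for relevant URIs, then fetch the
# matching documents as context passages for the LLM prompt.
import requests
from requests.auth import HTTPDigestAuth

MARKLOGIC = "http://localhost:8000"            # assumed REST app server
AUTH = HTTPDigestAuth("rag-user", "password")  # assumed credentials

def retrieve_context(question: str, limit: int = 3) -> list[str]:
    search = requests.get(
        f"{MARKLOGIC}/v1/search",
        params={"q": question, "format": "json", "pageLength": limit},
        auth=AUTH,
    )
    search.raise_for_status()
    passages = []
    for result in search.json().get("results", []):
        doc = requests.get(
            f"{MARKLOGIC}/v1/documents",
            params={"uri": result["uri"]},
            auth=AUTH,
        )
        doc.raise_for_status()
        passages.append(doc.text)
    return passages

print(retrieve_context("What is our data retention policy?"))
```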
Learn how to add vector embeddings to documents in MarkLogic Server with LangChain and the MarkLogic Data Movement SDK.
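The sketch below illustrates the same idea with the REST API standing in for the (Java) Data Movement SDK: chunks are embedded with LangChain's OpenAIEmbeddings (an OPENAI_API_KEY is assumed) and the enriched JSON document is written back to MarkLogic. The URIs, model name and sample text are assumptions.

```python
# Enrich a chunked document with vector embeddings and write it to MarkLogic
# via PUT /v1/documents. Connection details and content are assumptions.
import requests
from requests.auth import HTTPDigestAuth
from langchain_openai import OpenAIEmbeddings

MARKLOGIC = "http://localhost:8000"            # assumed REST app server
AUTH = HTTPDigestAuth("rag-user", "password")  # assumed credentials

embedder = OpenAIEmbeddings(model="text-embedding-3-small")  # assumed model

doc = {
    "sourceUri": "/contracts/contract.txt",
    "chunks": [{"sequence": 0, "text": "Customer data is retained for 24 months."}],
}
for chunk in doc["chunks"]:
    chunk["embedding"] = embedder.embed_query(chunk["text"])

response = requests.put(
    f"{MARKLOGIC}/v1/documents",
    params={"uri": "/contracts/contract-chunks.json"},
    json=doc,
    auth=AUTH,
)
response.raise_for_status()
```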
Watch a demo of how to orchestrate a hybrid search in MarkLogic Server and use native vector operations to refine your results.
Graph RAG is an enhanced approach to Retrieval Augmented Generation (RAG) that utilizes knowledge graphs to fetch related documents and improve the accuracy of results generated through open generative AI models.
By semantically tagging and indexing documents based on nodes and edges within the knowledge graph, the Graph RAG approach directly retrieves semantically related documents. Context is generated by combining the semantic search scores from Graph RAG with RAG’s similarity search scores; the semantic scores also make it possible to verify the basis of each retrieval, enhancing the accuracy of the fetched information.
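One way to picture that blending is the toy sketch below, which combines a semantic score (documents reached through the knowledge graph) with a vector similarity score and ranks documents by a weighted sum; the weights and scores are illustrative assumptions.

```python
# Blend Graph RAG and vector RAG signals: documents reachable through the
# knowledge graph carry a semantic score, similarity search supplies a vector
# score, and the weighted sum decides what enters the prompt context.
semantic_hits = {"/docs/a.json": 0.90, "/docs/b.json": 0.70}  # e.g. from SPARQL
vector_hits = {"/docs/b.json": 0.82, "/docs/c.json": 0.78}    # e.g. from similarity search

W_SEMANTIC, W_VECTOR = 0.6, 0.4  # assumed weights
combined = {
    uri: W_SEMANTIC * semantic_hits.get(uri, 0.0) + W_VECTOR * vector_hits.get(uri, 0.0)
    for uri in set(semantic_hits) | set(vector_hits)
}

for uri, score in sorted(combined.items(), key=lambda item: item[1], reverse=True):
    print(f"{score:.2f}  {uri}")
```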
Multi-model Retrieval Augmented Generation (RAG) leverages all the tools in the MarkLogic data management platform to get the best (high recall with high precision) content for your generative AI RAG context. The better the context, the more accurate your generative AI’s answers will be.
The tools available through the MarkLogic platform’s Optic API include search with use-case-tunable relevance, reusable semantic graphs validated with humans in the loop, tabular queries over structured content, geospatial queries, key-value co-occurrence queries and more.
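As one hedged example of the tabular side, an Optic query can be posted to the REST API's /v1/rows endpoint as Optic Query DSL; the Policies/Retention view and connection details below are assumptions for illustration.

```python
# Run a tabular Optic query over structured content via POST /v1/rows,
# sending the Optic Query DSL as the request body.
import requests
from requests.auth import HTTPDigestAuth

MARKLOGIC = "http://localhost:8000"            # assumed REST app server
AUTH = HTTPDigestAuth("rag-user", "password")  # assumed credentials

optic_dsl = "op.fromView('Policies', 'Retention').limit(10)"  # assumed view

response = requests.post(
    f"{MARKLOGIC}/v1/rows",
    data=optic_dsl,
    headers={
        "Content-Type": "application/vnd.marklogic.querydsl+javascript",
        "Accept": "application/json",
    },
    auth=AUTH,
)
response.raise_for_status()
print(response.json())
```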
Fine-tuning generative AI models is a popular technique to adjust a pre-trained GenAI model to perform a specific task or improve its current performance on a particular set of data. However, it comes with significant drawbacks, including security concerns, narrow data and high implementation costs.
Progress MarkLogic and Semaphore empower you to generate rich semantic knowledge graphs to provide externalized capabilities for your generative AI to surface insights with repeatable confidence. You can continuously refresh enterprise knowledge graphs with new facts to contextualize prompts for AI systems without the inference costs.
Multi-model data management technologies allow you to leverage the widest and richest set of information within and outside your organization and prepare it for use by the AI model. Graph databases work only with graph data models, limiting the data formats and information you can transform and feed into your generative AI applications.
Role-based access control (RBAC) and query-based access control (QBAC) at the document and element level allow content presented to the generative AI to match the user’s access rights to the enterprise knowledge and support a zero-trust architecture. Advanced security options like dynamic redaction provide even more granular access control to your enterprise knowledge.
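As a minimal sketch of RBAC on ingest (role names, URIs and credentials are assumptions), a document can be written through the REST API with explicit read and update permissions, so GenAI retrieval performed on behalf of a user only ever sees what that user is entitled to read.

```python
# Write a document with explicit role permissions via PUT /v1/documents.
# The "analyst" and "rag-admin" roles are assumed to exist in the cluster.
import requests
from requests.auth import HTTPDigestAuth

MARKLOGIC = "http://localhost:8000"             # assumed REST app server
AUTH = HTTPDigestAuth("rag-admin", "password")  # assumed credentials

response = requests.put(
    f"{MARKLOGIC}/v1/documents",
    params={
        "uri": "/restricted/salaries.json",
        "perm:analyst": "read",       # only the analyst role can read
        "perm:rag-admin": "update",   # only rag-admin can update
    },
    json={"department": "HR", "note": "compensation summary"},
    auth=AUTH,
)
response.raise_for_status()
```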
MarkLogic supports a variety of RAG system implementations, depending on your data models and metadata maturity. You can implement semantic search against a knowledge graph or, if knowledge graphs are not feasible, leverage the comprehensive set of native search technologies in the MarkLogic platform to run hybrid search (full-text, vector and semantic) against multi-model data for the best results with generative AI. You can also combine both approaches for even greater accuracy and cost-efficiency.
Develop contextualized and trustworthy generative AI-enhanced applications.