Highlights:
- Neo4j is introducing its fully managed graph database platform, Neo4j Aura Professional, to the AWS Marketplace.
- To represent and store data, Neo4j uses graph structures of nodes, edges, and properties instead of rows and columns in tables.
Neo4j Inc., a maker of graph databases, recently announced a multi-year strategic partnership with cloud computing leader Amazon Web Services Inc. to help businesses improve the capabilities of their generative artificial intelligence models.
The concept involves integrating knowledge graphs with native vector search to minimize occurrences of “hallucinations.” This approach aims to enhance model accuracy, transparency, and explainability. Additionally, the partnership aims to address challenges related to the necessity of long-term memory in large language models trained on enterprise datasets.
In conjunction with the partnership, Neo4j is introducing its fully managed graph database platform, Neo4j Aura Professional, to the AWS Marketplace. Now available to the public, it will allow companies to swiftly initiate the use of the Neo4j database, the company confirmed.
The platform offered by Neo4j differs from those of conventional database systems. To represent and store data, it uses graph structures of nodes, edges, and properties instead of rows and columns in tables. This format has the benefit of making data retrieval much simpler, often requiring only one operation.
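The property-graph idea can be illustrated with a minimal sketch in plain Python (this is an assumed toy model, not Neo4j's actual API or Cypher query language): nodes and relationships each carry properties, and following a relationship is a direct lookup rather than a multi-table join.

```python
# Toy property graph: nodes and edges with properties (illustrative only).
nodes = {
    1: {"labels": ["Person"], "props": {"name": "Alice"}},
    2: {"labels": ["Company"], "props": {"name": "Acme"}},
}

# Each edge: (source id, relationship type, target id, properties).
edges = [
    (1, "WORKS_AT", 2, {"since": 2020}),
]

def neighbors(node_id, rel_type):
    """Return nodes reachable from node_id via rel_type — a single traversal,
    with no join across separate tables."""
    return [nodes[dst] for src, rel, dst, _ in edges
            if src == node_id and rel == rel_type]

employers = neighbors(1, "WORKS_AT")
print(employers[0]["props"]["name"])  # Alice's employer
```

In a relational store, the same lookup would typically require joining a person table, a join table, and a company table; in the graph model the relationship itself is the index.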
Neo4j’s notable strength lies in its native vector search functionalities. This feature enables storing unstructured data, including images and written notes, as vector embeddings. It captures both explicit and implicit relationships and patterns. Through this capability, AI systems can retrieve and utilize a broader range of data, significantly improving their capacity for reasoning and inference.
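The core of vector search is ranking stored embeddings by similarity to a query embedding. The sketch below uses made-up 3-dimensional vectors and cosine similarity; real systems use model-generated embeddings with hundreds of dimensions and an approximate-nearest-neighbor index rather than a full scan.

```python
import math

# Toy vector store: items mapped to (made-up) embedding vectors.
store = {
    "invoice scan":  [0.9, 0.1, 0.0],
    "meeting note":  [0.1, 0.8, 0.2],
    "product photo": [0.7, 0.5, 0.3],
}

def cosine(a, b):
    """Cosine similarity: dot product normalized by vector lengths."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def nearest(query, k=2):
    """Return the k stored items most similar to the query embedding."""
    return sorted(store, key=lambda key: cosine(store[key], query),
                  reverse=True)[:k]

print(nearest([0.88, 0.15, 0.05]))  # ['invoice scan', 'product photo']
```

Because similar content ends up near each other in the embedding space, a query can surface relevant images or notes even when no keywords match.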
Neo4j asserts that these capabilities make its database well-suited to serve as a foundation for LLMs. At the same time, it serves as a repository for long-term memory, contributing to more precise, explainable, and transparent results from generative AI systems.
Through collaboration with AWS, Neo4j is integrating its platform with Amazon Bedrock, a fully managed service that gives AWS users access to third-party foundation models. Neo4j has also developed a reference architecture that lets joint customers access a catalog of models through application programming interfaces and use them as the groundwork for AI applications.
The company asserts various benefits, including a reduction in hallucinations, in which AI generates fabricated responses. Through retrieval-augmented generation, or RAG, companies can give LLMs access to proprietary data, creating virtual assistants grounded in enterprise knowledge. This approach allows for more personalized experiences and more complete answers by using real-time search, ensuring that models stay continuously updated.
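The RAG loop the article describes can be sketched in a few lines. This is an assumed minimal illustration, not Neo4j or Bedrock code: real deployments would retrieve via a vector or graph index and send the prompt to an LLM endpoint, whereas here retrieval is naive keyword overlap and the final model call is omitted.

```python
def retrieve(question, documents, k=1):
    """Naive keyword-overlap retrieval, standing in for a vector-index lookup."""
    q_words = set(question.lower().split())
    scored = sorted(documents,
                    key=lambda d: len(q_words & set(d.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(question, context):
    """Stuff retrieved passages into the prompt so the model answers from
    enterprise data rather than from its training data alone."""
    joined = "\n".join(f"- {c}" for c in context)
    return f"Answer using only this context:\n{joined}\n\nQuestion: {question}"

docs = [
    "Refunds are processed within 14 days of a return.",
    "Our headquarters is in San Mateo.",
    "Support is available 24/7 via chat.",
]
question = "How long do refunds take?"
prompt = build_prompt(question, retrieve(question, docs))
print(prompt)
```

Because the answer is constrained to retrieved passages, the model has less room to fabricate, which is the hallucination-reduction benefit the company claims.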
Lastly, the company said customers can now harness their unstructured data, converting it into structured information through knowledge graphs and vector embeddings. Once that data is securely stored in a knowledge graph, AI models can access it for training and inference.
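As a toy illustration of that unstructured-to-structured step (assumed for this article, not Neo4j tooling), the sketch below pulls subject-relation-object triples out of free text with a single regex; production pipelines use NLP or LLM-based extraction over far messier input.

```python
import re

# Match simple "X works at Y" statements in free text (illustrative only).
PATTERN = re.compile(r"(\w+) works at (\w+)")

def extract_triples(text):
    """Convert matching sentences into (subject, relation, object) triples,
    ready to be loaded into a knowledge graph."""
    return [(m.group(1), "WORKS_AT", m.group(2))
            for m in PATTERN.finditer(text)]

triples = extract_triples("Alice works at Acme. Bob works at Initech.")
print(triples)
```

Each triple becomes two nodes and a relationship in the graph, where vector embeddings of the source text can be attached alongside for similarity search.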
Atul Deo, General Manager of Amazon Bedrock, said, “With Neo4j’s graph database and Amazon Bedrock’s integration, we aim to provide customers sophisticated options to deliver more accurate, transparent, and personalized experiences for their end-users in a fully managed manner.”