Highlights:
- The company claims customers can go from zero to a production-grade large language model, ready for validation and deployment, in 30 minutes or less.
- SeekrFlow gives enterprise customers the ability to see inside models, challenge the outcomes, and validate them at the token level to address the issues of accuracy and hallucinations.
Seekr Technologies Inc., a developer of business-ready AI platforms, has unveiled SeekrFlow, a self-service AI product that enables business customers to train, validate, deploy and optimize AI applications, with trust built into the application lifecycle.
Businesses are embracing AI technology in growing numbers, but many are finding that integrating AI and machine learning into projects is difficult. Seekr says it simplifies this process by providing a single source.
With a single application programming interface call, SeekrFlow handles everything. It also offers a software development kit and, as of now, a user-friendly no-code interface. The company claims that customers can go from zero to a production-grade large language model, ready for validation and deployment, in 30 minutes or less.
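Seekr has not published the exact call shape in this announcement, so the sketch below is purely illustrative of what a "single API call" fine-tuning workflow can look like; the endpoint, payload fields and response handling are assumptions, not SeekrFlow's documented API.

```python
import requests

# Hypothetical illustration only: the endpoint and payload fields are
# assumptions for demonstration, not SeekrFlow's actual API.
API_BASE = "https://api.example-seekrflow.test/v1"
API_KEY = "YOUR_API_KEY"

def create_fine_tune(base_model: str, training_file_id: str) -> dict:
    """Kick off a fine-tuning job with one HTTP call (illustrative)."""
    response = requests.post(
        f"{API_BASE}/fine-tunes",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"base_model": base_model, "training_file": training_file_id},
        timeout=30,
    )
    response.raise_for_status()
    # The returned job record would then be used for validation and deployment.
    return response.json()

if __name__ == "__main__":
    job = create_fine_tune("llama-3-8b", "file_123")
    print(job)
```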
Businesses can use the same technology to train their models with a feature called “Principle Alignment,” an intelligent agent that makes it easier to keep a model aligned with domain-specific knowledge such as brand guidelines, industry-specific rules and company policies. According to the company, the feature improves the accuracy of base model responses by three to six times, at a 90% lower data preparation cost and 2.5 times faster than standard methods.
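The announcement does not describe the interface for supplying that domain knowledge, but conceptually the agent starts from principles expressed as structured data, as in the minimal sketch below; the field names and example principles are assumptions, not SeekrFlow's schema.

```python
from dataclasses import dataclass

# Illustrative only: a simple representation of the domain-specific knowledge
# the article mentions (brand guidelines, industry rules, company policies).
@dataclass
class Principle:
    source: str      # e.g. "brand guidelines", "industry regulation", "company policy"
    statement: str   # the rule an aligned model should follow

principles = [
    Principle("brand guidelines", "Always refer to the product as 'Acme Cloud', never 'ACME'."),
    Principle("industry regulation", "Do not provide individualized investment advice."),
    Principle("company policy", "Escalate refund requests over $500 to a human agent."),
]

# An alignment agent could expand each principle into training examples;
# here we only show the raw inputs it would start from.
for p in principles:
    print(f"[{p.source}] {p.statement}")
```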
Seekr President and Chief Technology Officer Rob Clark said: “Many enterprise AI projects today have been stalled due to complexity, cost and hallucinations. SeekrFlow addresses all of those concerns, and by being platform- and hardware-agnostic, makes it available no matter where the customer runs AI or where their data resides.”
SeekrFlow gives enterprise customers the ability to see inside models, challenge the outcomes and validate them at the token level to address the issues of accuracy and hallucinations. Users can troubleshoot by asking the model to evaluate its own results and assign confidence scores between one and 100. Color coding enables side-by-side comparisons of prompts across different models for real-time evaluation, making it easier for users to identify individual tokens and determine where additional validation is required.
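As a rough picture of what token-level review means in practice, the sketch below flags low-confidence tokens for human validation, assuming per-token scores in the 1-100 range described above; the data shape and threshold are assumptions, not SeekrFlow output.

```python
# Illustrative sketch of token-level validation, not SeekrFlow's API.
LOW_CONFIDENCE_THRESHOLD = 60

# Hypothetical per-token confidence scores on a 1-100 scale.
scored_tokens = [
    ("The", 97), ("merger", 92), ("closed", 88), ("in", 95), ("Q3", 41), ("2019", 35),
]

def flag_for_validation(tokens, threshold=LOW_CONFIDENCE_THRESHOLD):
    """Return tokens whose confidence falls below the review threshold."""
    return [(tok, score) for tok, score in tokens if score < threshold]

for token, score in flag_for_validation(scored_tokens):
    # In the SeekrFlow UI this kind of result is surfaced with color coding;
    # here we simply print the tokens that warrant a closer look.
    print(f"review needed: '{token}' (confidence {score}/100)")
```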
Naturally, a model’s lifecycle does not end with its launch. SeekrFlow provides a visual dashboard with real-time visibility into LLM performance and health while running in production. Token counts, memory use, API requests and uptime are just a few of the metrics the backend surfaces for developers, engineering teams and other operators so they can quickly understand what’s going on. This lets users scale up resources and optimize costs as needed.
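To make the monitoring idea concrete, here is a minimal sketch of acting on such metrics programmatically; the metric names mirror those listed in the article, but the data structure, thresholds and scale-up rule are illustrative assumptions, not SeekrFlow's dashboard API.

```python
from dataclasses import dataclass

# Illustrative only: these fields echo the metrics named in the article
# (token counts, memory use, API requests, uptime); thresholds are assumptions.
@dataclass
class DeploymentMetrics:
    tokens_processed: int
    memory_used_gb: float
    memory_total_gb: float
    api_requests_per_min: int
    uptime_pct: float

def needs_scale_up(m: DeploymentMetrics) -> bool:
    """Flag a deployment for scale-up when memory or traffic runs hot."""
    return (m.memory_used_gb / m.memory_total_gb > 0.85
            or m.api_requests_per_min > 1_000)

snapshot = DeploymentMetrics(
    tokens_processed=4_200_000,
    memory_used_gb=68.0,
    memory_total_gb=80.0,
    api_requests_per_min=1_250,
    uptime_pct=99.95,
)

if needs_scale_up(snapshot):
    print("scale-up recommended based on current metrics")
```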
Because SeekrFlow is AI model-agnostic, it can be used with almost any open-source or closed-source LLM the user wants to bring, such as Mistral AI’s Mixtral, OpenAI’s GPT-4 and Meta Platforms Inc.’s Llama 3.
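In practice, "model-agnostic" usually means the base model is just a configuration choice while the rest of the pipeline stays the same, roughly as in the hypothetical sketch below; the configuration keys and model identifiers are assumptions, not SeekrFlow settings.

```python
# Illustrative only: a minimal bring-your-own-model configuration pattern.
configs = {
    "open_source": {"base_model": "meta-llama/Meta-Llama-3-8B", "provider": "self-hosted"},
    "open_weights_moe": {"base_model": "mistralai/Mixtral-8x7B-v0.1", "provider": "self-hosted"},
    "closed_source": {"base_model": "gpt-4", "provider": "openai"},
}

def build_pipeline(config_name: str) -> dict:
    """Only the base model changes; training, validation and deployment steps stay identical."""
    cfg = configs[config_name]
    return {"fine_tune": cfg["base_model"], "serve_via": cfg["provider"]}

print(build_pipeline("open_weights_moe"))
```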
The company has signed a multiyear partnership with Intel Corp., although the platform can run on any hardware or AI architecture. Under the arrangement, customers can use Seekr to deploy trusted AI on the Intel Tiber Developer Cloud.