Qwak Feature Store provides a unified store for features during training and real-time inference, without the need to write additional code or build manual processes to keep features consistent.
Because we support several ways to ingest features into the Qwak Feature Store (batch, streaming, and non-materialized features), we recently added MongoDB as a supported batch data source to pull data from.
The data source connectors give you a consistent interface over any database, and they provide a standard way to combine streaming and batch data sources in feature transformations.
Defining a Feature Set enables you to create features from your analytical data: when calculating feature values, Qwak will simply read from the underlying data source.
For example, in a fraud detection model, we might pull some feature values from the MongoDB data source and others from streaming events in Kafka.
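As an illustration (the concrete feature names and logic here are hypothetical, not taken from the original post), a fraud model might combine batch features such as account age and historical average transaction amount with streaming features such as a rolling transaction count and sum:

```python
from datetime import datetime, timedelta

# Hypothetical fraud-detection features; names and logic are illustrative only.

# Batch-style features (in practice, read from a MongoDB collection of user profiles):
def batch_features(user, now=datetime(2024, 1, 1)):
    account_age_days = (now - user["created_at"]).days
    avg_txn_amount = sum(user["historical_amounts"]) / len(user["historical_amounts"])
    return {"account_age_days": account_age_days, "avg_txn_amount": avg_txn_amount}

# Streaming-style features (in practice, aggregated from a Kafka topic of transactions):
def streaming_features(events, window=timedelta(hours=1), now=datetime(2024, 1, 1)):
    recent = [e for e in events if now - e["ts"] <= window]
    return {"txn_count_1h": len(recent), "txn_sum_1h": sum(e["amount"] for e in recent)}

user = {"created_at": datetime(2023, 1, 1), "historical_amounts": [20.0, 40.0]}
events = [
    {"ts": datetime(2023, 12, 31, 23, 30), "amount": 15.0},  # inside the 1h window
    {"ts": datetime(2023, 12, 31, 20, 0), "amount": 99.0},   # outside the 1h window
]
features = {**batch_features(user), **streaming_features(events)}
print(features)
# → {'account_age_days': 365, 'avg_txn_amount': 30.0, 'txn_count_1h': 1, 'txn_sum_1h': 15.0}
```

The point of the feature store is that both kinds of values end up behind the same serving interface, so the model never needs to know which pipeline produced them.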
MongoDB data source connector definition:
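A sketch of what the connector definition might look like. The import path, class name, and parameter names below are assumptions based on the general shape of Qwak's data source API, not the verified SDK; consult the Qwak SDK reference for the exact interface:

```python
# Hypothetical sketch: import path, class name, and parameters are assumptions,
# not the verified Qwak SDK API.
from qwak.feature_store.data_sources import MongoDbSource  # assumed import path

mongo_source = MongoDbSource(
    name="transactions_mongo",            # logical name for the data source
    description="Raw transactions collection",
    hosts="mongodb.example.internal",     # connection details (assumed parameter names)
    database="payments",
    collection="transactions",
    username_secret_name="mongo-user",    # credentials resolved from a secret store
    password_secret_name="mongo-pass",
)
```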
Register batch feature using the MongoDB connector:
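A sketch of registering a batch feature set on top of that source. Again, the decorator names, the scheduling API, and the transformation class are assumptions about the Qwak SDK rather than verified API, and the SQL is illustrative:

```python
# Hypothetical sketch: decorator names and the transformation class are assumptions,
# not the verified Qwak SDK API.
from qwak.feature_store.feature_sets import batch
from qwak.feature_store.feature_sets.transformations import SparkSqlTransformation

@batch.feature_set(
    name="user-transaction-features",
    key="user_id",                          # entity key the features are joined on
    data_sources=["transactions_mongo"],    # the MongoDB source defined above
)
@batch.scheduling(cron_expression="0 * * * *")  # recompute hourly (assumed decorator)
def user_transaction_features():
    # Aggregate raw transactions into per-user feature values
    return SparkSqlTransformation(
        """
        SELECT user_id,
               COUNT(*)    AS txn_count,
               AVG(amount) AS avg_txn_amount
        FROM transactions_mongo
        GROUP BY user_id
        """
    )
```

Once registered, the feature set is materialized on the schedule and its values become available for both offline training and online serving.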
Qwak Feature Store helps ensure that models make accurate predictions by serving the same feature values for both training and inference.
For more information, or if you are missing a data source connector, feel free to reach out to us at firstname.lastname@example.org