Benefit from a Unified solution to connect to various technologies
In any data project, your information system will have various other use cases to cater to, and managing data exchanges with Apache Kafka may well be one of them.
With so many technologies, platforms, databases, and file formats in play, a unified data integration solution that sits at the center of the entire technological landscape and caters to all sorts of data requirements makes life much easier for data engineers.
The Stambia component for Apache Kafka lets users easily manage data movements between sources and targets with a simple drag and drop.
The solution is completely templatized to provide agility and flexibility in your projects.
A key feature of this component is its ability to seamlessly manage schemas.
Kafka metadata in Stambia can be configured to connect to the Confluent Schema Registry, with SSL encryption for a secured Kafka cluster.
The schemas themselves are created separately as JSON or Avro metadata.
The Stambia templates for Kafka use this information to automatically manage schema creation and updates during executions.
Topics can be configured in the metadata with a key and a value, whose JSON or Avro schema is created simply by dragging and dropping the metadata of the respective format.
This way, the entire process of managing an evolving schema becomes metadata-driven, which reduces complexity.
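Under the hood, registering a schema with the Confluent Schema Registry comes down to a REST call. As a minimal illustration (not Stambia's actual implementation), the plain-Python sketch below builds the payload that would be POSTed to the registry's /subjects/&lt;topic&gt;-value/versions endpoint for a hypothetical Customer record:

```python
import json

# Hypothetical Avro schema for a topic's value; the record name and
# fields are illustrative examples only.
AVRO_SCHEMA = {
    "type": "record",
    "name": "Customer",
    "fields": [
        {"name": "id", "type": "long"},
        {"name": "email", "type": "string"},
    ],
}

def registration_payload(schema: dict) -> str:
    """Build the JSON body POSTed to /subjects/<topic>-value/versions."""
    # The registry expects the Avro schema itself serialized as a string
    # inside the "schema" field of the request body.
    return json.dumps({"schemaType": "AVRO", "schema": json.dumps(schema)})

payload = registration_payload(AVRO_SCHEMA)
```

The templates automate this kind of exchange, so schema updates follow the metadata instead of hand-written calls.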
Similarly, consumers can be configured to point to a specific topic. A reject topic can also be specified, so that rejected or erroneous records are moved aside without causing a failure.
In the consumer metadata, specifying the topic it is subscribed to provides the schema from the topic itself. When you drag and drop the consumer, you therefore see its structure, which can then be mapped to the intended target.
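The reject-topic behaviour can be sketched in a few lines of plain Python; the topic names and the validation rule here are hypothetical examples, not Stambia configuration:

```python
# Records that fail validation are routed to a reject topic instead of
# failing the whole run. Topic names and the rule are hypothetical.
MAIN_TOPIC = "customers"
REJECT_TOPic = None  # placeholder removed below

MAIN_TOPIC = "customers"
REJECT_TOPIC = "customers_reject"

def route(record: dict) -> str:
    """Return the topic a consumed record should be forwarded to."""
    is_valid = isinstance(record.get("id"), int) and "email" in record
    return MAIN_TOPIC if is_valid else REJECT_TOPIC

routed = [route(r) for r in (
    {"id": 1, "email": "a@example.com"},   # valid record
    {"id": "oops"},                        # rejected, the run keeps going
)]
```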
Simplicity and agility with no need to write code manually
With the Kafka component, users can quickly configure the metadata and get going with different mappings to produce and consume data wherever required.
The dedicated templates do the heavy lifting of generating the code as well as managing the Schema Registry.
As a result, the focus is on what to do rather than how to do it.
Stambia, A Unified Solution for data projects
As a unified solution, Stambia can cater to the various needs of connecting to different technologies and integrating with different architectures.
Apart from Kafka, users can work with various source and target databases, applications, big data technologies, data frameworks such as Apache Spark, and so on.
This results in a consistent integration layer and increases the team's productivity.
• Any relational database system such as Oracle, PostgreSQL, MSSQL, ...
• Any NoSQL database system such as MongoDB, Elastic, ...
• Any Cloud system such as Amazon Web Services (AWS), Google Cloud Platform (GCP), Microsoft Azure, ...
• Any ERP application such as SAP, Microsoft Dynamics, ...
• Any SAAS application such as Salesforce, ...
Performance when loading or extracting data is improved through a range of options on the connectors that customize how data is processed, such as choosing a Spark JDBC load or specific data loaders from the database.
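As an illustration of the kind of tuning a Spark JDBC load exposes, the sketch below lists standard Spark JDBC read options as a plain dictionary; the connection URL and table name are hypothetical, and the PySpark call shown in the comment is how such options would typically be consumed:

```python
# Standard Spark JDBC read options (as documented by Apache Spark);
# the URL and table here are hypothetical examples. Partitioned reads
# parallelize extraction across executors instead of one single query.
jdbc_options = {
    "url": "jdbc:postgresql://db-host:5432/sales",  # hypothetical URL
    "dbtable": "public.orders",                     # hypothetical table
    "numPartitions": "8",           # parallel connections to the database
    "partitionColumn": "order_id",  # numeric column used to split the read
    "lowerBound": "1",
    "upperBound": "1000000",
    "fetchsize": "10000",           # rows fetched per round trip
}
# With PySpark this would be used as:
#   spark.read.format("jdbc").options(**jdbc_options).load()
```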