Stambia Data Integration

In a context where data is at the heart of organizations, data integration has become a key success factor for digital transformation: there is no digital transformation without the movement and transformation of data.

Organizations must meet several challenges:

  • Remove the silos in their information systems
  • Process growing data volumes and very different types of information (structured, semi-structured or unstructured data) with agility and speed
  • Manage massive loads as well as ingest data in real time (streaming), to support the most relevant decisions
  • Control the infrastructure costs of their data

In this context, Stambia responds by providing a unified solution for any type of data processing, which can be deployed both in the cloud and on premises, and which guarantees control and optimization of the costs of owning and transforming data.


For further information, see our page Why Stambia?

[Diagram: Stambia Data Integration architecture]

The use cases of Stambia Data Integration

Projects with Stambia

With a unique architecture and a single development platform, Stambia Enterprise can address any type of data integration project, from projects handling very large volumes of data to more real-time-oriented projects.

Here is a non-exhaustive list of feasible projects:

  • Business Intelligence (datawarehouses, datamarts, infocentres) & Analytics
  • Big Data, Hadoop, Spark and NoSQL
  • Platform migration to the Cloud (Google Cloud Platform, Amazon, Azure, Snowflake…)
  • Exchanges of data with third parties (API, Web Services)
  • Data Hub project, exchange of data between applications (batch or real-time mode, exposure or use of Web Services)
  • Replication between heterogeneous databases
  • Integration or production of flat files
  • Management of business data repositories


    Some examples of architectures with Stambia:

    • API, Web Services and microservices management
    • Global Data Hub
    • Automatic data migration to Snowflake with Stambia


    How does Stambia Data Integration work?

    Stambia is based on several key concepts

    A Universal Mapping

    Unlike traditional approaches, which are oriented toward technical processes, Stambia offers a vision based on "universal" mapping: any technology must be able to be read or fed in a simple way, regardless of its structure and complexity (table, file, XML, Web Service, SAP application, ...). It is a business-oriented vision of data.

    To learn more, see the page "Universal Mapping".
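The idea can be illustrated with a minimal sketch (not Stambia's actual API; all names are hypothetical): the same declarative mapping is applied regardless of the source technology, here an in-memory table and a CSV stream.

```python
# Illustrative sketch of a "universal mapping": one mapping definition,
# applied to sources of different technologies. Hypothetical names throughout.
import csv
import io

# target column -> source field
mapping = {"customer_id": "id", "city": "town"}

def apply_mapping(rows, mapping):
    """Project each source row into the target shape described by the mapping."""
    return [{tgt: row[src] for tgt, src in mapping.items()} for row in rows]

# Source 1: rows already in memory (e.g. read from a database table)
table_rows = [{"id": "1", "town": "Paris"}]

# Source 2: a CSV file with the same logical fields
csv_rows = list(csv.DictReader(io.StringIO("id,town\n2,Lyon\n")))

# Both sources land in the same target structure
result = apply_mapping(table_rows, mapping) + apply_mapping(csv_rows, mapping)
```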

    A Model Driven Approach

    The Stambia approach is based on models. The notion of templates, or adaptive technologies, provides a capacity for abstraction and industrialization of data flows. This approach increases productivity, agility and quality throughout project delivery.

    To learn more, see the page "The Model Driven Approach".

    An ELT philosophy

    The "delegation of transformation", or ELT architecture, makes it possible to maximize performance, lower infrastructure costs, and keep control of data flows.

    To learn more, see the page "The ELT approach".
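To make the "delegation of transformation" concrete, here is a hedged sketch (not Stambia's implementation; table and column names are hypothetical): instead of pulling rows into the integration engine, the engine generates SQL that the target database executes itself, so the transformation work runs where the data lives.

```python
# ELT sketch: the engine builds an INSERT ... SELECT statement and delegates
# the transformation to the database. All identifiers are hypothetical.

def build_elt_statement(target, source, mapping, where=None):
    """Generate SQL so the target database performs the transformation."""
    cols = ", ".join(mapping.keys())
    exprs = ", ".join(mapping.values())
    sql = f"INSERT INTO {target} ({cols}) SELECT {exprs} FROM {source}"
    if where:
        sql += f" WHERE {where}"
    return sql

stmt = build_elt_statement(
    target="dw.customers",
    source="staging.customers_raw",
    mapping={
        "customer_id": "CAST(id AS INTEGER)",
        "full_name": "UPPER(TRIM(name))",  # transformation runs in-database
    },
    where="id IS NOT NULL",
)
```

The engine never touches the rows; it only orchestrates the statement, which is what keeps infrastructure costs on the engine side low.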

    Master the trajectory

    Stambia's vision is to enable its customers to control the cost of ownership of data integration projects and platforms. This is made possible by Stambia's business model and by technological approaches that increase productivity and shorten the learning curve.

    To learn more, see the Pricing Model page.

    And a simple, lightweight architecture

    The architecture of Stambia Enterprise is based on three simple components:

    • Designers are the development and test user interfaces. They rely on an Eclipse-based architecture for easy sharing and flexible management of developments and team projects.

    • Runtimes are the processes that execute the jobs in production. They are based on a Java architecture, which facilitates their deployment on premises or in the cloud, and they are compatible with Docker and Kubernetes architectures.

    • Stambia Analytics is the web component that handles production operations (deployment, configuration and scheduling), job tracking, and the management of the different Runtimes.

    [Diagram: Stambia Data Integration architecture]

    Technical specifications and prerequisites


    Simple and agile architecture

    • 1. Designer: the data integration development studio
    • 2. Runtime: the engine that executes data integration processes
    • 3. Production Analytics: production management (logs, deliveries)

    You can extract data from:

    • Any relational database system such as Oracle, PostgreSQL, MSSQL, ...
    • Any NoSQL database system such as MongoDB, Elasticsearch, Cassandra, ...
    • Any high-performance database system such as Vertica, Teradata, Netezza, Actian Vector, Sybase IQ, ...
    • Any Cloud system such as Amazon Web Service (AWS), Google Cloud Platform (GCP), Microsoft Azure, ...
    • Any ERP application such as SAP, Microsoft Dynamics, ...
    • Any SaaS application such as Salesforce, Snowflake, BigQuery, ...
    • Any Big Data system such as Spark, Hadoop, ...
    • Any MOM or ESB messaging system such as JMS, Apache ActiveMQ, Kafka, ...
    • Any file type such as XML, JSON, ...
    • Any spreadsheet such as Excel, Google Spreadsheet, ...

    For more information, consult the technical documentation.


    Technical connectivity
    • Email (SMTP)
    • LDAP, OpenLDAP
    • Kerberos

    Data Integration - Standard features

    • Reverse: the database structure can be reverse-engineered into a dedicated Metadata
    • DDL/DML Operations: DML/DDL operations can be performed on the database, such as Insert, Update, Select, Delete, Create or Drop
    • Integration methods: Append, Incremental Update
    • Staging: the databases can be used as a staging area for data transformation, mutualization, ... The following modes are supported:
      • staging as subquery
      • staging as view
      • staging as table
    • Reject: reject rules can be defined to filter or detect data that does not fulfill the rules during integration
      • Three types of rules can be created: Fatal, Warning, Reject
      • Depending on the specified type, the rejected data is handled differently
      • Rejects from previous executions can also be recycled automatically
    • Replication: replication of databases is supported. The replication source can be any RDBMS, flat files, XML files, Cloud sources, ...
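The reject mechanism described above can be sketched as follows (a simplified illustration, not Stambia's implementation; the rule set and field names are hypothetical): each rule carries a severity, rows failing a Reject rule are diverted for later recycling, Warning rows pass through flagged, and a Fatal violation aborts the load.

```python
# Sketch of severity-based reject handling. Hypothetical rules and fields.
FATAL, WARNING, REJECT = "fatal", "warning", "reject"

# (severity, check) pairs: a row fails a rule when its check returns falsy
rules = [
    (REJECT,  lambda row: row.get("email")),        # missing email -> reject row
    (WARNING, lambda row: row.get("age", 0) >= 0),  # negative age -> warn only
]

def integrate(rows):
    loaded, rejected, warnings = [], [], []
    for row in rows:
        failed = [sev for sev, check in rules if not check(row)]
        if FATAL in failed:
            raise RuntimeError(f"fatal rule violated: {row}")
        if REJECT in failed:
            rejected.append(row)   # kept aside so it can be recycled later
            continue
        if WARNING in failed:
            warnings.append(row)   # flagged but still integrated
        loaded.append(row)
    return loaded, rejected, warnings

loaded, rejected, warnings = integrate([
    {"email": "a@b.c", "age": 30},   # clean row
    {"email": "", "age": 25},        # fails the Reject rule
    {"email": "x@y.z", "age": -1},   # fails the Warning rule, still loaded
])
```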
    Data Integration - Advanced features
    • Slowly Changing Dimension integrations: integrations can be performed using Slowly Changing Dimensions (SCD)
    • Load methods (depending on connectivity):
      • Generic load
      • COPY loader
    • Change Data Capture (CDC)
    • Privacy Protection : GDPR capabilities
      • Anonymization
      • Pseudonymization
      • Audits
      • ...
    • Data Quality Management (DQM) : Data Quality Management capabilities added to the metadata and the Stambia Designer
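As a rough illustration of the Slowly Changing Dimension integration method listed above (a Type 2 variant; this is a hypothetical sketch, not Stambia's generated code, and all column names are invented): when a tracked attribute changes, the current record is closed and a new versioned record is opened.

```python
# SCD Type 2 sketch: expire the current record and append a new version
# when a tracked column changes. Hypothetical column names throughout.
from datetime import date

def scd2_apply(dimension, incoming, key, tracked, today=None):
    """Apply incoming rows to a dimension with start_date/end_date versioning."""
    today = today or date.today().isoformat()
    current = {r[key]: r for r in dimension if r["end_date"] is None}
    for row in incoming:
        old = current.get(row[key])
        if old is None:
            # brand-new key: open its first version
            dimension.append({**row, "start_date": today, "end_date": None})
        elif any(old[c] != row[c] for c in tracked):
            old["end_date"] = today  # close the previous version
            dimension.append({**row, "start_date": today, "end_date": None})
    return dimension

dim = [{"id": 1, "city": "Paris", "start_date": "2020-01-01", "end_date": None}]
scd2_apply(dim, [{"id": 1, "city": "Lyon"}],
           key="id", tracked=["city"], today="2024-06-01")
```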
    Prerequisites

    • Operating System:
      • Windows XP, Vista, 2008, 7, 8 or 10 (both 32-bit and 64-bit supported)
      • Linux (both 32-bit and 64-bit supported)
      • Mac OS X (64-bit only)
    • Memory: at least 1 GB of RAM
    • Disk Space: the system must have at least 300 MB of free disk space
    • Java Virtual Machine: JVM version 1.8 or higher
    • Miscellaneous: on Linux environments, the GTK+ 2.6.0 windowing system and all its dependencies are required
    • Cloud Deployment: Docker images are available for the Runtime and Production Analytics components

    Standards supported
    • OpenAPI Specification (OAS)
    • Swagger 2.0
    • W3C XML
    • WS-I compliant
    • SQL

    • Scripting languages: Jython, Groovy, Rhino (JavaScript), ...
    • Source versioning system: any plugin supported by Eclipse (CVS, SVN, Git, …)
    • Migration from: Oracle Data Integrator (ODI)*, Informatica*, DataStage*, Talend, Microsoft SSIS

    * seamless migration capabilities

    Want to know more?

    Consult our resources:

    • Video: An introduction to Stambia Designer (Open Video)
    • Tutorial: Template Reject Management (Open Video)
    • Your customized demonstration (Get your demo)
    • Ask for advice from our experts in data integration (Contact us)
    • Discover our training and certifications (Learn more)