Teradata is a widely used, fully scalable database for managing large volumes of data. The Teradata Vantage platform offers a variety of data and analytics solutions with hybrid cloud offerings.
The Stambia component for Teradata is built to complement Teradata's robust capabilities and to simplify integration in analytics projects that use Teradata solutions.
In any analytics project, it is important to choose solutions that improve your agility. Many solutions become quite complex in the long run, and a lot of time and effort then goes into managing the tools themselves.
Teradata, with its Vantage platform, focuses on providing the best answers and offers hybrid cloud products that simplify your analytics journey.
Integration should offer the same simplicity, flexibility and ease of use, so that you can focus your time on implementations and data requirements and be more agile in your projects.
With changing data landscapes, more and more customers use various kinds of technologies and architectures to fulfill different types of requirements.
As a result, many IT teams are dealing with new types of data formats, applications and ecosystems. The ability to integrate and exchange data between Teradata and Hadoop technologies, or to integrate Teradata with Spark, is an essential feature for a data integration tool.
These features should be part of the same solution (same design, same architecture) so that IT teams can work consistently across different types of projects.
Cloud continues to become mainstream in most organizations, and adoption of "as-a-service" models keeps growing. In fact, many organizations are now considering, and some have already moved to, a multi-cloud architecture.
However, some organizations still prefer to keep part of their data on-premises, resulting in a hybrid architecture. It is therefore essential for IT teams to be able to support integration needs from all these standpoints.
A key point is owning a solution that supports hybrid and multi-cloud implementations: for example, integration projects where you have an on-premises Teradata instance and/or Teradata Vantage on AWS or Azure, as well as GCP for some other need.
When you own a Teradata solution, an integration tool that relies on a proprietary, external engine to process and transform the data is simply unnecessary.
With robust capabilities to ingest, analyze and manage data, Teradata checks all the boxes in terms of integration (or ETL).
An integration solution should leverage these capabilities to improve the overall performance of your analytics projects. The solution should also automate integration to provide agility and flexibility in your development, and be able to manage batch data integration for large amounts of data as well as real-time data ingestion for immediate analysis of your business. The ELT approach is the best way to optimize your investment and get the best performance.
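The ELT idea can be made concrete with a small sketch: instead of pulling rows out to an external engine, the transformation is expressed as set-based SQL that runs inside Teradata after the raw data has been staged. The table names, columns and filter below are hypothetical, and the generated statement is a simplified illustration rather than what any particular tool emits.

```python
# ELT push-down sketch: the transform is a single INSERT ... SELECT executed
# by Teradata itself, so the database's MPP engine does the heavy lifting.
# All object names here are hypothetical examples.

def build_elt_transform(staging: str, target: str, columns: list[str]) -> str:
    """Build an INSERT ... SELECT that pushes the transform down to Teradata."""
    cols = ", ".join(columns)
    return (
        f"INSERT INTO {target} ({cols}) "
        f"SELECT {cols} FROM {staging} "
        f"WHERE load_date = CURRENT_DATE"  # hypothetical incremental filter
    )

sql = build_elt_transform("stg.sales_raw", "dwh.sales", ["sale_id", "amount"])
print(sql)
```

The point is that the integration tool only generates and orchestrates the SQL; the data itself never leaves the database during the transformation step.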
In an age where machine learning algorithms are the new norm, the ability to feed them the right data is very important. Insufficient or poor-quality data can distort the results of these algorithms.
At the same time, the Teradata platform is constantly evolving, adding new and improved features to give users the best out of the solution. A rigid solution has no place in the current technological landscape.
Integration tools therefore have to be highly customizable and ready to absorb technological changes and new data requirements.
Cost optimization is one of the critical topics for most CIOs in any new or existing data and analytics project or initiative. The complexity of managing cost, at every stage of the project, depends on the tool sets used.
For integration, the cost starts with buying software licenses, the hardware to support them, the human resources to design and implement the projects, and their long-term maintenance. Evaluation of a data integration solution should therefore focus both on its technical capabilities and on the overall expense you incur using it.
This expense should not be just the up-front cost: what matters, before moving forward, is a well-thought-out estimate of expenditure over the next few years.
The Stambia component for Teradata is the best way to simplify the extraction or integration of data with the Teradata MPP system, providing increased productivity with an easy-to-use graphical solution.
The Stambia component for Teradata reverse-engineers all the information about the database, which becomes really beneficial for optimization and automation in the designs.
This includes standard information such as schemas, tables, columns and datatypes, as well as Teradata-specific elements like primary indexes (including UPIs).
This metadata can also be customized and adapted to achieve specific optimization objectives.
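Reverse-engineering this kind of metadata typically means querying Teradata's data dictionary. The sketch below shows the style of lookup involved, using the documented dictionary views `DBC.ColumnsV` and `DBC.IndicesV`; the exact queries Stambia issues are not documented here, so this is purely illustrative.

```python
# Illustrative dictionary lookups for reverse-engineering table metadata.
# DBC.ColumnsV and DBC.IndicesV are Teradata dictionary views; the database
# and table names used below are hypothetical.

def columns_query(database: str, table: str) -> str:
    """SQL listing the columns and datatypes of a table."""
    return (
        "SELECT ColumnName, ColumnType, ColumnLength "
        "FROM DBC.ColumnsV "
        f"WHERE DatabaseName = '{database}' AND TableName = '{table}'"
    )

def primary_index_query(database: str, table: str) -> str:
    """SQL listing the primary index columns; UniqueFlag 'Y' marks a UPI."""
    return (
        "SELECT ColumnName, UniqueFlag "
        "FROM DBC.IndicesV "
        f"WHERE DatabaseName = '{database}' AND TableName = '{table}' "
        "AND IndexType = 'P'"
    )

print(columns_query("dwh", "sales"))
print(primary_index_query("dwh", "sales"))
```

Once captured, this metadata lets the templates choose distribution-aware strategies (for example, joining or loading along the primary index) without the developer spelling them out.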
Stambia's Model Driven Approach (through templates) adapts the connector to all types of projects.
Simply connect to various technologies and file formats, then design a very simple mapping to extract from the source, load into Teradata and transform inside Teradata. The Universal Data Mapper in Stambia lets users focus on business rules and work with a very simple, high-level design.
Customers using Teradata and Stambia testify that their turnaround time to implement a new data project is much shorter than it was previously.
As an ELT solution, Stambia uses the native utilities to ingest, analyze and transform data. This delivers the performance needed to process large data sets, significantly reduces the need for a dedicated ETL server, and gives the tool a light footprint.
Stambia's Model Driven Approach also automates many redundant steps that are manual in some other tools. Some of the benefits of this approach are:
- Stambia uses specific methods adapted to Teradata in order to integrate or extract the data.
- Loads and exports can be done with tools such as Teradata Parallel Transporter, FastLoad, MultiLoad, FastExport and other utilities provided by Teradata.
- Incremental integration offers different methods, such as "insert and update", table renaming, "delete and insert" and "merge" operations.
- Primary indexes and statistics are detected and used to improve performance.
- A "query band" can be used to track the SQL statements generated by Stambia, providing a way to optimize and master the processes.
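Two of these features can be sketched concretely: an incremental "merge" load and a session query band. `SET QUERY_BAND ... FOR SESSION` and `MERGE INTO` are standard Teradata SQL; the generator functions, object names and band keys below are hypothetical, and the generated statements are simplified versions of what a template might emit.

```python
# Sketch of incremental "merge" integration tagged with a query band.
# The SQL shapes follow standard Teradata syntax; all names are hypothetical.

def query_band(pairs: dict[str, str]) -> str:
    """Build a SET QUERY_BAND statement so generated SQL can be tracked."""
    band = "".join(f"{key}={value};" for key, value in pairs.items())
    return f"SET QUERY_BAND = '{band}' FOR SESSION;"

def merge_statement(target: str, source: str, key: str, cols: list[str]) -> str:
    """Build a simplified MERGE: update matched rows, insert new ones."""
    sets = ", ".join(f"{c} = src.{c}" for c in cols)
    ins_cols = ", ".join([key] + cols)
    ins_vals = ", ".join(f"src.{c}" for c in [key] + cols)
    return (
        f"MERGE INTO {target} AS tgt USING {source} AS src "
        f"ON (tgt.{key} = src.{key}) "
        f"WHEN MATCHED THEN UPDATE SET {sets} "
        f"WHEN NOT MATCHED THEN INSERT ({ins_cols}) VALUES ({ins_vals});"
    )

print(query_band({"ApplicationName": "Stambia", "JobName": "load_sales"}))
print(merge_statement("dwh.sales", "stg.sales_delta", "sale_id", ["amount"]))
```

With the query band set, DBAs can later filter DBQL logs by `ApplicationName` or `JobName` to see exactly which workload each generated statement belongs to.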
Most ETL tools on the market fail to work inside Teradata. As a result, a lot of data transformation has to be done through BTEQ.
Users then write BTEQ scripts to fulfill their data transformation needs.
Try something new with the Stambia component for Teradata Vantage and discover how the tool entirely automates the migration of BTEQ scripts, producing Stambia mappings and processes.
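For readers unfamiliar with BTEQ, a hand-written transformation script typically looks something like the fragment below: dot commands for session control and error handling wrapped around plain SQL. All names and credentials here are hypothetical placeholders, and the script is a minimal illustration, not a recommended template.

```
.LOGON mytdp/etl_user,password
DELETE FROM dwh.sales_agg;

INSERT INTO dwh.sales_agg (sale_date, total_amount)
SELECT sale_date, SUM(amount)
FROM dwh.sales
GROUP BY sale_date;

.IF ERRORCODE <> 0 THEN .QUIT 8
.LOGOFF
.QUIT 0
```

It is precisely this kind of hand-maintained scripting, with its embedded error handling and credentials, that a migration to designed mappings and processes aims to replace.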
| Feature | Details |
| --- | --- |
| Architecture | Simple and agile |
| Protocols | JDBC, HTTP, HTTPS |
| Storages | Depend on the architecture (see the technical documentation) |
| Sources | See the technical documentation for the technologies you can extract data from |
| Data formats | Structured and semi-structured: XML, JSON, Avro |
| Data Integration | Standard and advanced features |
| Cloud Deployment | Docker image available for Runtime and Production Analytics components |
| Sources versioning system | Any plugin supported by Eclipse (CVS, SVN, Git, …) |
| Migrating from | Oracle Data Integrator (ODI) \*, Informatica \*, DataStage \*, Talend, Microsoft SSIS |

\* Capabilities to migrate seamlessly.

For more information, consult the technical documentation.
Other resources:
- Replay: Teradata Vantage with Hybrid Integration Platform
- Webinar: How a Hybrid Integration Platform helps you solve integration complexities in your evolving data landscape