
How Data Ingestion Works in Adobe Experience Platform


Data ingestion in Adobe Experience Platform is the process of bringing data from various sources into the platform, where it can be used for customer profiling, segmentation, and analysis. The ingestion process involves several steps that help ensure data quality, governance, and compliance.

Overview

Adobe Experience Platform supports ingesting data from various sources, including but not limited to:

  • Cloud storage (e.g., Amazon S3, Azure Data Lake Storage)
  • Streaming data sources (e.g., Apache Kafka, Amazon Kinesis)
  • Customer Relationship Management (CRM) systems
  • Marketing automation platforms
  • Offline data sources (e.g., CSV files)

The ingestion process follows these general steps:

  • Data source connection
  • Data mapping
  • Data ingestion
  • Data monitoring

Data Source Connection

The first step in the ingestion process is to establish a connection between Adobe Experience Platform and the data source. This can be done through source connectors, which are pre-built integrations that simplify the process of ingesting data from various sources.
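As a rough illustration, a source connection is typically created by sending a JSON payload to the Flow Service API that names the source and supplies its credentials. The sketch below builds such a payload for an Amazon S3 source; the connection-spec ID and authentication field names are placeholders, not verified values, so check them against Adobe's Sources documentation before use.

```python
# Sketch: build a Flow Service base-connection payload for an S3 source.
# The connection-spec ID and auth field names below are placeholders;
# look up the real values for your source in Adobe's documentation.

def build_s3_connection_payload(name, access_key, secret_key):
    """Return a JSON body for creating a base connection to S3."""
    return {
        "name": name,
        "connectionSpec": {
            "id": "<s3-connection-spec-id>",  # placeholder spec ID
            "version": "1.0",
        },
        "auth": {
            "specName": "Access Key",
            "params": {
                "s3AccessKey": access_key,
                "s3SecretKey": secret_key,
            },
        },
    }

payload = build_s3_connection_payload("crm-exports", "AKIA...", "<secret>")
```

In a real integration this payload would be POSTed with an authenticated HTTP client; source connectors in the UI generate the equivalent configuration without hand-written code.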

Data Mapping

Once the connection is established, the next step is to map the data from the source to the Experience Data Model (XDM) schema. XDM is a standardized framework that defines how data is structured and formatted in Adobe Experience Platform. Mapping ensures that the ingested data conforms to the XDM schema, enabling consistent data governance and interoperability across Adobe applications.
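Conceptually, mapping takes flat source fields and places their values at nested XDM paths. The minimal sketch below shows that transformation with hypothetical source field names; in Adobe Experience Platform this is configured as a Data Prep mapping set rather than hand-written code.

```python
# Minimal sketch of field mapping: apply a list of (source field ->
# XDM path) pairs to a flat source record, producing a nested
# XDM-shaped dict. Source field names here are hypothetical.

def apply_mapping(record, mappings):
    """Build a nested XDM-shaped dict from a flat source record."""
    xdm = {}
    for source_field, xdm_path in mappings:
        if source_field not in record:
            continue  # skip fields absent from this record
        node = xdm
        parts = xdm_path.split(".")
        for part in parts[:-1]:
            node = node.setdefault(part, {})
        node[parts[-1]] = record[source_field]
    return xdm

mappings = [
    ("email_address", "personalEmail.address"),
    ("first_name", "person.name.firstName"),
]
row = {"email_address": "ada@example.com", "first_name": "Ada"}
profile = apply_mapping(row, mappings)
# profile == {"personalEmail": {"address": "ada@example.com"},
#             "person": {"name": {"firstName": "Ada"}}}
```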

Data Ingestion

After mapping, the data is ingested into Adobe Experience Platform through batch or streaming ingestion methods:

Batch Ingestion

  1. Data is uploaded to a cloud storage location (e.g., Amazon S3 bucket).
  2. A batch ingestion request is initiated, specifying the data source and mapping details.
  3. Adobe Experience Platform retrieves the data from the cloud storage location and processes it according to the specified mapping.
  4. The ingested data is stored in the Data Lake, a scalable and secure data repository within Adobe Experience Platform.
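The batch flow above can be sketched as three REST calls against the Batch Ingestion API: create a batch, upload the file into it, and signal completion so processing begins. The sketch below builds that request sequence without sending anything; the dataset ID and file name are placeholders, and the exact endpoint shapes should be verified against Adobe's API reference.

```python
# Sketch of the batch ingestion call sequence (no requests are sent).
# Endpoint shapes follow Adobe's Batch Ingestion API as described in
# its documentation; the dataset ID and file name are placeholders.

BASE = "https://platform.adobe.io/data/foundation/import"

def batch_request_sequence(dataset_id, file_name):
    """Return the (method, url, body) triples for one batch upload."""
    batch_id = "{batchId}"  # returned by the first call in a real run
    return [
        # 1. Create a batch bound to the target dataset.
        ("POST", f"{BASE}/batches",
         {"datasetId": dataset_id, "inputFormat": {"format": "json"}}),
        # 2. Upload the data file into the batch.
        ("PUT",
         f"{BASE}/batches/{batch_id}/datasets/{dataset_id}/files/{file_name}",
         None),
        # 3. Mark the batch complete so processing starts.
        ("POST", f"{BASE}/batches/{batch_id}?action=COMPLETE", None),
    ]

steps = batch_request_sequence("<dataset-id>", "part-0.json")
```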

Streaming Ingestion

  1. Data is streamed from the source in real time or near real time.
  2. Adobe Experience Platform connects to the streaming source (e.g., Apache Kafka) and ingests the data as it arrives.
  3. The ingested data is processed and stored in the Data Lake.
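Streaming ingestion sends records one message at a time over HTTP, each wrapped in an envelope that identifies the target schema and dataset. The sketch below builds such a message; the header and body field names follow Adobe's streaming ingestion format as documented, but treat the exact shape as an assumption to verify, and all IDs are placeholders.

```python
# Sketch: wrap one event in a streaming-ingestion envelope. Verify the
# header/body field names against current Adobe documentation; the org,
# dataset, and schema IDs here are placeholders.

def build_streaming_message(ims_org_id, dataset_id, schema_ref, xdm_entity):
    """Return a streaming-ingestion message for one XDM record."""
    return {
        "header": {
            "schemaRef": schema_ref,
            "imsOrgId": ims_org_id,
            "datasetId": dataset_id,
            "source": {"name": "example-source"},  # hypothetical name
        },
        "body": {
            "xdmMeta": {"schemaRef": schema_ref},
            "xdmEntity": xdm_entity,
        },
    }

msg = build_streaming_message(
    "<org-id>",
    "<dataset-id>",
    {"id": "https://ns.adobe.com/<tenant>/schemas/<schema-id>",
     "contentType": "application/vnd.adobe.xed-full+json;version=1"},
    {"personalEmail": {"address": "ada@example.com"}},
)
```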

Data Monitoring

After ingestion, Adobe Experience Platform provides monitoring capabilities to track the ingestion process, identify potential issues, and ensure data quality. This includes monitoring ingestion metrics, error handling, and data validation.
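A common monitoring pattern is to poll a batch's status until it reaches a terminal state. The sketch below shows that loop with a stand-in fetch function instead of a real call to the Catalog Service batches endpoint; the status strings used here are assumptions.

```python
# Sketch: poll a batch's status until it finishes. fetch_status is a
# stand-in for a GET against the Catalog Service batches endpoint; the
# status strings ("processing", "success", "failed") are assumptions.

import time

def wait_for_batch(fetch_status, batch_id, poll_seconds=0, max_polls=10):
    """Poll fetch_status(batch_id) until a terminal state or give up."""
    for _ in range(max_polls):
        status = fetch_status(batch_id)
        if status in ("success", "failed"):
            return status
        time.sleep(poll_seconds)
    return "timeout"

# Simulated status sequence standing in for real API responses.
responses = iter(["processing", "processing", "success"])
result = wait_for_batch(lambda _id: next(responses), "<batch-id>")
# result == "success"
```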

Conclusion

Data ingestion in Adobe Experience Platform is a crucial process that enables organizations to bring data from various sources into a unified platform for further analysis, segmentation, and personalization. The ingestion process involves establishing data source connections, mapping data to the XDM schema, ingesting data through batch or streaming methods, and monitoring the ingestion process for data quality and governance.
