How Data Ingestion Works in Adobe Experience Platform

Data ingestion in Adobe Experience Platform is the process of bringing data from various sources into the platform for downstream use cases such as customer profiling, segmentation, and analysis. The ingestion process involves several steps that help ensure data quality, governance, and compliance.

Overview

Adobe Experience Platform supports ingesting data from various sources, including but not limited to:

  • Cloud storage (e.g., Amazon S3, Azure Data Lake Storage)
  • Streaming data sources (e.g., Apache Kafka, Amazon Kinesis)
  • Customer Relationship Management (CRM) systems
  • Marketing automation platforms
  • Offline data sources (e.g., CSV files)

The ingestion process follows these general steps:

  • Data source connection
  • Data mapping
  • Data ingestion
  • Data monitoring

Data Source Connection

The first step in the ingestion process is to establish a connection between Adobe Experience Platform and the data source. This can be done through source connectors, which are pre-built integrations that simplify the process of ingesting data from various sources.
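Under the hood, source connections are created through REST calls to the platform. The sketch below builds a hypothetical request body for an Amazon S3 source connector; the field names (`specName`, `s3AccessKey`, and so on) are illustrative assumptions, not the documented API contract, so check the official Sources documentation for the exact shape.

```python
# Sketch: assembling a connection request body for an S3 source connector.
# Field names here are ASSUMPTIONS for illustration; the real payload shape
# is defined by Adobe's Sources (Flow Service) API documentation.

def build_s3_connection_payload(name: str, access_key: str, secret_key: str) -> dict:
    """Build a base-connection request body for an Amazon S3 source (hypothetical shape)."""
    return {
        "name": name,
        "auth": {
            "specName": "Access Key",       # authentication spec chosen for S3
            "params": {
                "s3AccessKey": access_key,  # credentials for the bucket
                "s3SecretKey": secret_key,
            },
        },
    }
```

A pre-built connector hides most of this detail: once the connection is authorized, the platform handles listing and retrieving the source data.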

Data Mapping

Once the connection is established, the next step is to map the data from the source to the Experience Data Model (XDM) schema. XDM is a standardized framework that defines how data is structured and formatted in Adobe Experience Platform. Mapping ensures that the ingested data conforms to the XDM schema, enabling consistent data governance and interoperability across Adobe applications.
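In practice, mapping means translating flat source field names into the nested paths of an XDM schema. The sketch below shows the idea with a hypothetical CRM-to-XDM field map; the XDM paths used (`person.name.firstName`, `personalEmail.address`) follow common XDM field-group conventions, but a real schema is defined in the Schema Registry.

```python
# Sketch: mapping flat source columns to nested XDM paths.
# The source field names and XDM paths below are illustrative assumptions.

XDM_FIELD_MAP = {
    "email_address": "personalEmail.address",
    "first_name": "person.name.firstName",
    "last_name": "person.name.lastName",
}

def map_record_to_xdm(record: dict) -> dict:
    """Convert a flat source record into a nested, XDM-shaped dict."""
    xdm = {}
    for src_field, xdm_path in XDM_FIELD_MAP.items():
        if src_field not in record:
            continue  # skip fields the source record does not provide
        node = xdm
        *parents, leaf = xdm_path.split(".")
        for part in parents:
            node = node.setdefault(part, {})  # create nested objects on demand
        node[leaf] = record[src_field]
    return xdm
```

For example, `{"email_address": "a@b.com", "first_name": "Ada"}` becomes `{"personalEmail": {"address": "a@b.com"}, "person": {"name": {"firstName": "Ada"}}}`, which conforms to the nested structure an XDM schema expects.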

Data Ingestion

After mapping, the data is ingested into Adobe Experience Platform through batch or streaming ingestion methods:

Batch Ingestion

  1. Data is uploaded to a cloud storage location (e.g., Amazon S3 bucket).
  2. A batch ingestion request is initiated, specifying the data source and mapping details.
  3. Adobe Experience Platform retrieves the data from the cloud storage location and processes it according to the specified mapping.
  4. The ingested data is stored in the Data Lake, a scalable and secure data repository within Adobe Experience Platform.
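The batch sequence above can be sketched as three calls: create a batch, upload a file into it, and signal completion. The endpoint paths below mirror the general shape of a batch-ingestion REST API but are assumptions for illustration; a pluggable `client` object stands in for an authenticated HTTP session so the flow can be exercised without credentials.

```python
# Sketch of the batch-ingestion sequence. Endpoint paths and payload fields
# are illustrative assumptions; verify them against the official API reference.

BASE = "https://platform.adobe.io/data/foundation/import"

def ingest_batch(client, dataset_id: str, file_name: str, payload: bytes) -> str:
    """Create a batch, upload one file into it, then mark the batch complete."""
    # 1. Create an empty batch attached to the target dataset.
    batch = client.post(f"{BASE}/batches",
                        json={"datasetId": dataset_id,
                              "inputFormat": {"format": "json"}})
    batch_id = batch["id"]
    # 2. Upload the file contents into the batch.
    client.put(f"{BASE}/batches/{batch_id}/datasets/{dataset_id}/files/{file_name}",
               data=payload)
    # 3. Signal that no more files are coming; the platform begins processing.
    client.post(f"{BASE}/batches/{batch_id}?action=COMPLETE")
    return batch_id
```

Once completed, the platform validates the files against the dataset's schema and lands the data in the Data Lake.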

Streaming Ingestion

  1. Data is streamed from the source in real-time or near real-time.
  2. Adobe Experience Platform connects to the streaming source (e.g., Apache Kafka) and ingests the data as it arrives.
  3. The ingested data is processed and stored in the Data Lake.
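For streaming sources that push over HTTP, each event is wrapped in an envelope identifying the organization and the target schema before being posted to a streaming endpoint. The envelope below is a hedged sketch; the field names (`imsOrgId`, `xdmEntity`, the schema URL) are assumptions modeled on common patterns, not the exact documented format.

```python
# Sketch: shaping one event for a streaming HTTP endpoint.
# Envelope field names and the schema reference URL are illustrative assumptions.

def build_streaming_event(org_id: str, xdm_entity: dict) -> dict:
    """Wrap an XDM-shaped record in a streaming-ingestion envelope (hypothetical shape)."""
    return {
        "header": {
            "imsOrgId": org_id,                  # which organization owns the data
            "source": {"name": "crm-stream"},    # hypothetical source label
        },
        "body": {
            "xdmMeta": {
                # Schema the record claims to conform to (placeholder URL).
                "schemaRef": {"id": "https://ns.adobe.com/example/schemas/profile"},
            },
            "xdmEntity": xdm_entity,             # the actual record payload
        },
    }
```

Each envelope would then be POSTed to the streaming connection's endpoint as events occur, giving the near-real-time arrival described above.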

Data Monitoring

After ingestion, Adobe Experience Platform provides monitoring capabilities to track the ingestion process, identify potential issues, and ensure data quality. This includes monitoring ingestion metrics, error handling, and data validation.
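Programmatically, monitoring often reduces to polling a batch's status until it reaches a terminal state. The sketch below assumes hypothetical status values (`success`, `failed`) and a pluggable client; real status names and endpoints should be taken from the monitoring documentation.

```python
import time

# Sketch: polling a batch until it finishes. Status names ("success", "failed")
# and the endpoint path are illustrative assumptions.

def wait_for_batch(client, batch_id: str,
                   poll_seconds: float = 5.0, max_polls: int = 10) -> str:
    """Poll a batch's status until it reaches a terminal state or polling times out."""
    terminal = {"success", "failed"}
    for _ in range(max_polls):
        status = client.get(f"/batches/{batch_id}")["status"]
        if status in terminal:
            return status
        time.sleep(poll_seconds)  # back off between polls
    raise TimeoutError(f"batch {batch_id} did not finish after {max_polls} polls")
```

A `failed` result would then be followed up via the error-diagnostics the platform surfaces for that batch, e.g. rows rejected during schema validation.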

Conclusion

Data ingestion in Adobe Experience Platform is a crucial process that enables organizations to bring data from various sources into a unified platform for further analysis, segmentation, and personalization. The ingestion process involves establishing data source connections, mapping data to the XDM schema, ingesting data through batch or streaming methods, and monitoring the ingestion process for data quality and governance.
