
Mid-Level PEGA CDH Data Engineer II

Ampcus, Inc
United States, Virginia, Vienna
Sep 22, 2024

Location: Vienna, VA/ Pensacola, FL/ San Diego, CA

6+ Months Contract

Description:

Top 3 Required Skills:

Hands-on experience creating automated data pipelines using Pega DataFlow and other modern technology stacks for batch ETL or API integrations

Hands-on experience with Pega 8.x, Postgres, and Cassandra

Knowledge and experience with configuration, load-balancing, auto-scaling, monitoring, networking, and problem-solving in a cloud environment

Top 3 Desired Skills:

Load Customer Data into Pega CDH on Pega Cloud, monitor data quality and implement related controls

Migrate existing ETL for the Pega Marketing PaaS solution to a Pega-managed SaaS solution in the cloud

Implement integration plans and interface with testing teams to incorporate plans into the integration testing process

Basic Purpose:

Develop technical solutions for importing and ingesting Customer Data into Pega CDH on Pega Cloud. The Data Engineer will be responsible for guiding the design and development of the Client's data solutions, with a specific focus on Pega DataFlow and ETL pipelines to support Pega Marketing system capabilities. Solves highly complex problems; takes a broad perspective to identify solutions. Interacts with other functional teams or projects. Works independently.

Responsibilities:

Load Customer Data into Pega CDH on Pega Cloud, monitor data quality and implement related controls

Migrate existing ETL for the Pega Marketing PaaS solution to a Pega-managed SaaS solution in the cloud

Evaluate possible designs, improve methods, and implement optimizations

Document best practices for data models, data loads, and query performance, and enforce them with other team members

Implement integration plans and interface with testing teams to incorporate plans into the integration testing process

Perform data archival

Create reports to provide insight into ETL execution statistics and status

Ensure the security and integrity of solutions, including compliance with Client, industry engineering, and Information Security principles and practices

Analyze and validate data-sharing requirements with internal and external data partners

Work directly with business leadership to understand data requirements; propose and develop solutions that enable effective decision-making and drive business objectives

Qualifications and Education Requirements:

Bachelor's degree in Information Systems, Computer Science, Engineering, or a related field, or the equivalent combination of education, training, and experience

Hands-on experience creating automated data pipelines using Pega DataFlow and other modern technology stacks for batch ETL or API integrations

Hands-on experience with Pega 8.x, Postgres, and Cassandra

Knowledge and experience with configuration, load-balancing, auto-scaling, monitoring, networking, and problem-solving in a cloud environment

Experience with processing various file types

Experienced in the use of ETL tools and techniques, with knowledge of CI/CD

Advanced understanding of SQL

Demonstrated change management and/or excellent communication skills

Pega Cloud experience is desired, but not required
