Description: **Onsite | Charlotte, NC** Our client has an exciting opportunity for an AWS Software Engineer. Due to client requirements, applicants must be willing and able to work on a W2 basis. For our W2 consultants, we offer a great benefits package that includes Medical, Dental, and Vision benefits, 401k with company matching, and life insurance. Rate: $80 - $85/hr. (W2)
Responsibilities:
- Provide technical direction, guide the team on key technical aspects, and be responsible for product tech delivery.
- Lead the design, build, test, and deployment of components in collaboration with Lead Developers (Data Engineer, Software Engineer, Data Scientist, Technical Test Lead).
- Understand requirements/use cases to outline technical scope and lead delivery of technical solutions.
- Work closely with the Product Owner to align on delivery goals and timing.
- Collaborate with Data and Solution architects on key technical decisions.
- Lead the design and implementation of data quality check methods (see the sketch after this list).
- Ensure Test-Driven Development practices are followed.
- Interact effectively with business stakeholders.
- Ensure data security and permissions solutions, including data encryption, user access controls, and logging.
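To illustrate the kind of data quality check work this role would lead, here is a minimal sketch, assuming PySpark (consistent with the Spark and Python experience listed below); the dataset path, column names, and checks are hypothetical and would come from the pipeline's actual requirements:

```python
# Minimal data quality gate in PySpark. Path and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
df = spark.read.parquet("s3://example-bucket/curated/orders/")  # hypothetical path

# Check 1: the record key must be present and unique.
null_keys = df.filter(F.col("order_id").isNull()).count()
dup_keys = df.count() - df.select("order_id").distinct().count()

# Check 2: business rule - amounts must be non-negative.
bad_amounts = df.filter(F.col("amount") < 0).count()

failures = {"null_keys": null_keys, "dup_keys": dup_keys, "bad_amounts": bad_amounts}
if any(count > 0 for count in failures.values()):
    raise ValueError(f"Data quality checks failed: {failures}")
```

Run as a gated pipeline step, checks like these fail a bad batch fast instead of letting it propagate downstream.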
Experience Requirements:
- At least 5 years in a Data Engineering role.
- Must-Have Tech Experience:
  - AWS Tech Stack:
    - AWS Lake Formation
    - AWS EMR
    - Apache Hudi
    - Flink
    - Aurora
    - PostgreSQL
    - AWS Glue (Glue Catalog, Glue ETL, and Crawler)
    - AWS Athena
    - Redshift
    - SageMaker/ML
    - AWS Lambda
    - DynamoDB
    - RDS (Relational Database Service)
    - AWS S3: strong foundational concepts such as object storage vs. block storage, encryption/decryption, and storage tiers (see the S3 sketch after this list)
  - Implementation and tuning experience with streaming use cases and the big data ecosystem (EMR, Hadoop, Spark, Hudi, Kafka/Kinesis, etc.); see the streaming sketch after this list.
  - Ability to build scalable data infrastructure and understand distributed systems concepts from a data storage and compute perspective.
  - Good understanding of Data Lake and Data Warehouse concepts; familiarity with Modern Data Architecture.
  - Ability to define standards and guidelines with an understanding of various compliance and auditing needs.
  - Proficiency in Python or Java for large-volume data processing.
  - Terraform Enterprise
  - Kafka
  - Qlik Replicate
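For the S3 fundamentals above, here is a minimal boto3 sketch showing server-side encryption and storage-tier selection at write time; the bucket name, object keys, and KMS alias are hypothetical, and valid AWS credentials are assumed:

```python
# Writing S3 objects with SSE-KMS encryption and an explicit storage tier.
# Bucket name, object keys, and KMS alias are hypothetical.
import boto3

s3 = boto3.client("s3")

# Encrypt at rest with a customer-managed KMS key (SSE-KMS).
s3.put_object(
    Bucket="example-data-bucket",
    Key="raw/events/2024/01/part-0000.json",
    Body=b'{"event": "demo"}',
    ServerSideEncryption="aws:kms",
    SSEKMSKeyId="alias/example-data-key",
)

# Route colder data to a cheaper tier at write time (Glacier Instant Retrieval).
s3.put_object(
    Bucket="example-data-bucket",
    Key="archive/events/2023/part-0000.json",
    Body=b'{"event": "old"}',
    StorageClass="GLACIER_IR",
)
```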
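For the streaming item above, a minimal sketch of one common pattern on this stack: Kafka into Spark Structured Streaming with an Apache Hudi sink. The broker address, topic, schema, and S3 paths are hypothetical, and the spark-sql-kafka and Hudi Spark bundle packages are assumed to be on the Spark classpath:

```python
# Kafka -> Spark Structured Streaming -> Hudi upserts on S3.
# Broker, topic, schema, and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("kafka-to-hudi").getOrCreate()

schema = (StructType()
          .add("order_id", StringType())
          .add("amount", DoubleType())
          .add("updated_at", TimestampType()))

# Read JSON events from a Kafka topic and parse the binary value column.
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "orders")
          .load()
          .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

hudi_options = {
    "hoodie.table.name": "orders",
    "hoodie.datasource.write.recordkey.field": "order_id",
    "hoodie.datasource.write.precombine.field": "updated_at",
    "hoodie.datasource.write.operation": "upsert",
}

# Continuously upsert into a Hudi table on S3, checkpointing stream progress.
(events.writeStream
 .format("hudi")
 .options(**hudi_options)
 .outputMode("append")
 .option("checkpointLocation", "s3://example-bucket/checkpoints/orders/")
 .start("s3://example-bucket/hudi/orders/")
 .awaitTermination())
```

The precombine field lets Hudi keep only the latest version of each record key when late or duplicate events arrive.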
Additional Helpful Experience:
- AWS Services: CloudTrail, SNS, SQS, CloudWatch, Step Functions
- Experience with secrets management platforms such as Vault and AWS Secrets Manager (see the sketch after this list)
- Experience with DevOps pipelines (CI/CD): Bitbucket; Concourse
- Experience with RDBMS platforms and strong proficiency with SQL
- Knowledge of IAM roles and policies
- Experience with Event-Driven Architecture
- Experience with native AWS technologies for data and analytics such as Kinesis, OpenSearch
- Databases: DocumentDB, MongoDB
- Hadoop platform (Hive; HBase; Druid)
- Java, Scala, Node.js
- Workflow Automation
- Experience transitioning on-premises big data platforms to cloud-based platforms such as AWS
- Background in Kubernetes, Distributed Systems, Microservice architecture, and containers
- Experience with REST APIs and API gateway
- Deep understanding of networking: DNS, TCP/IP, and VPN
- Kafka Schema Registry
- Confluent Avro
- Linux and Bash Scripting
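As a small example of the secrets-management experience noted above, here is a minimal boto3 sketch that reads a database credential from AWS Secrets Manager rather than hard-coding it; the secret name and its JSON fields are hypothetical:

```python
# Fetch a database credential from AWS Secrets Manager at runtime.
# Secret name and JSON field names are hypothetical.
import json

import boto3

client = boto3.client("secretsmanager")
resp = client.get_secret_value(SecretId="example/prod/postgres")
secret = json.loads(resp["SecretString"])

# Build a connection string without the credential ever touching source control.
dsn = f"postgresql://{secret['username']}:{secret['password']}@{secret['host']}:5432/app"
```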