Job Description
Title: Data Engineer – Cloud Data Pipelines
Location: Remote / Tysons Corner, VA (Hybrid)
Employment Type: Full-Time

Overview:
We are seeking a Data Engineer experienced in building scalable, cloud-native data pipelines and data products. The ideal candidate has hands-on experience with AWS, Snowflake, Airflow, dbt, and streaming or batch ingestion tools. This engineer will be responsible for developing ingestion pipelines, transformation logic, and curated datasets used across the enterprise.

Responsibilities:
• Build and maintain ingestion pipelines from multiple source systems (APIs, databases, streaming platforms).
• Develop transformation logic using dbt to support curated data products.
• Implement workflow orchestration using Apache Airflow.
• Deploy and optimize Snowflake schemas, warehouses, and performance settings.
• Collaborate with Data Analysts and Product Developers to translate requirements into scalable pipelines.
• Ensure pipelines meet SLAs for latency, quality, and data availability.
• Troubleshoot pipeline failures and perform root-cause analysis.
• Work with DevSecOps to deploy pipelines via CI/CD in AWS.

Required Qualifications:
• 5+ years of Data Engineering experience
• Strong experience with AWS data services (Lambda, S3, Glue, Step Functions, Kinesis, etc.)
• Hands-on experience with Snowflake (warehouses, roles, integrations, performance tuning)
• Proficiency with Airflow for scheduling/orchestration
• Experience using dbt for transformations
• Experience with SQL and Python
• Experience integrating with APIs, SQL Server, Salesforce, or similar platforms

Preferred Qualifications:
• Experience with Kafka or OpenFlow ingestion
• Experience with Precisely or similar data catalog/data integrity tools
• Experience in GovCloud environments
• Familiarity with UDM or enterprise semantic models