Job Description
- Lead the foundational setup of new data environments, including Snowflake provisioning, SSO configuration, and initial system architecture.
- Design, build, and manage scalable data pipelines and ETL/ELT processes to transform raw data into actionable insights.
- Support the scaling of engineering teams to handle high volumes of commercial projects while maintaining business continuity during organizational transitions.
- Develop and maintain robust dbt data models to support semantic layer build-outs and advanced analytics capabilities.
- Partner closely with business leaders and stakeholders to bridge the gap between technical delivery and strategic objectives.
- Provide engineering support for custom backend applications, ensuring seamless data integration and reliable exposure to end users.
- Establish and enforce best practices for data engineering, architecture, and deployments that can scale across multiple organizations.
Requirements:
- Proven experience in data engineering with a strong focus on building pipelines, supporting custom applications, and leading deployments.
- Hands-on experience with Snowflake and other modern cloud data platforms such as AWS, Azure, or GCP.
- Strong technical expertise in dbt for scalable data transformation, modeling, and semantic layer development.
- Demonstrated ability to design and build systems from the ground up, not just extend existing infrastructure.
- Advanced SQL and Python skills for complex data modeling, performance optimization, and pipeline development.
- Ability to balance deep technical expertise with business acumen, confidently navigating complex stakeholder environments.
- Comfortable operating in a nimble, mid-size consulting environment where priorities can shift and adaptability is key.
Benefits:
Apply to this job