
Senior Data Engineer Lead

Randstad Digital

Markham, Canada

Posted: 1 day ago

Job Description

Position Overview:

This is a critical leadership and hands-on role within our Information Technology Enterprise Data Services Group, focused on accelerating the company's multi-year journey to build a next-generation data platform on Snowflake that enables advanced AI and GenAI capabilities. You will serve as a Staff Engineer-level technical leader, responsible for the architecture, design, and implementation of robust, scalable enterprise data solutions.

Key Requirements & Core Technical Skills (Must-Haves)

To be effective from day one, candidates must have hands-on, proven experience with the following four core technologies. This experience will be the primary focus of the technical evaluation:

- Snowflake (database/data warehouse): deep expertise in modern cloud data warehousing architecture and performance.
- dbt (Core/Cloud) (data transformation): expert-level proficiency in modeling and transforming data within Snowflake.
- Python (programming language): core skill for data pipeline development, scripting, and dbt integrations.
- AWS (cloud computing): practical experience with relevant AWS services (e.g., EC2, S3, Airflow).

What You'll Do

Technical Leadership & Strategy

- Define and Drive Strategy: shape the technical roadmap, data engineering strategy, and best practices for the Enterprise Data Services group.
- Solution Architecture: lead the development of both high-level (working with Enterprise Architecture) and low-level solution designs for enterprise-scale data ecosystems, ensuring alignment with business goals.
- Mentorship & Excellence: provide technical guidance, code review, and mentorship to Data Engineers and project teams, fostering a culture of engineering excellence and delivery focus.

Hands-On Engineering & Delivery

- Pipeline Development: lead the design and implementation of highly scalable, high-performance data pipelines, primarily using dbt for transformation and AWS Managed Airflow/Zena for orchestration (a minimal illustrative sketch of this pattern follows the requirements below).
- Advanced Coding: write and maintain clean, reusable, high-quality code in SQL, Python, Shell, and Terraform, with a focus on performance and maintainability.
- Data Modeling: design and review conceptual, logical, and physical data models to support new and evolving business requirements.
- Quality & Governance: champion data quality, governance, and cataloging practices across the platform.

Collaboration & Project Management

- Agile Leadership: lead agile ceremonies, maintain a delivery-focused mindset, and drive the timely execution of data initiatives.
- Cross-Functional Collaboration: work closely with Architects, Data Designers, QA Engineers, and Business stakeholders to deliver cohesive, customer-centric data products.
- Issue Resolution: perform root cause analysis and implement effective solutions for complex, high-priority data issues.

What You'll Bring

- Extensive Experience: a proven track record of architecting and delivering complex, high-impact data projects from inception to production.
- Core Technical Stack: mandatory expertise in Snowflake, dbt, Python, and AWS.
- Advanced Expertise: deep knowledge of relational databases (e.g., PostgreSQL, Aurora) and modern coding practices in SQL and Python.
- Resilience & Communication: exceptional communication and presentation skills (technical and business), with the ability to thrive in fast-paced, high-pressure environments.
- Domain Knowledge (Asset): familiarity with insurance industry processes and systems.
- AI/ML Exposure (Asset): experience operationalizing AI/ML and GenAI models.
- Certifications (Asset): two or more certifications such as SnowPro Advanced Data Engineer, dbt Developer, or AWS Cloud Practitioner are a strong advantage; candidates who do not hold them are expected to be willing to attain them within 3-6 months.
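The posting names dbt for transformation and AWS Managed Airflow for orchestration. As a rough illustration of that pattern, the following is a minimal sketch of an Airflow DAG that triggers a dbt build against Snowflake. It assumes Airflow 2.x with dbt Core available on the worker; the DAG name, schedule, and project/profile paths are hypothetical placeholders, not details taken from the posting.

```python
# Illustrative sketch only: a minimal Airflow DAG that orchestrates a dbt build.
# Paths, DAG name, and schedule are assumptions, not specifics from the job posting.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_build",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",        # assumed cadence
    catchup=False,
) as dag:
    # Build the dbt models; the dbt profile is assumed to point at a Snowflake account.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/my_project --profiles-dir /opt/dbt",
    )

    # Run dbt tests after the models build, reflecting the data-quality focus described above.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/my_project --profiles-dir /opt/dbt",
    )

    dbt_run >> dbt_test
```

In practice, the same hand-off could also be expressed with a managed dbt Cloud job or a container task; the BashOperator version above is simply the most self-contained way to show the dbt-plus-Airflow orchestration the role describes.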