
Data Engineer

CloudVital

Toronto, Canada

Posted: 5 days ago

Job Description

<p>We are looking for a highly skilled and specialized <strong>Data Engineer</strong>.</p><p><br></p><p><strong>Job Title:</strong> Data Engineer</p><p><strong>Job Location:</strong> Toronto</p><p><strong>Job Type:</strong> Full-Time</p><p><br></p><p>This is an onsite position requiring employees to work from the office five days per week.</p><p><br></p><p>This role requires a unique blend of expertise spanning established open-source tools (<strong>Talend, Pentaho</strong>), modern cloud-native platforms (<strong>Azure Data Factory, Airbyte</strong>), and deep technical data management skills, including advanced data modeling, SQL mastery, and rigorous STTM (Source-to-Target Mapping) documentation.</p><p><br></p><p>The ideal candidate will bridge legacy systems and modern cloud architectures within a demanding, data-intensive environment.</p><p><br></p><p><strong>Key Responsibilities</strong></p><ul><li>Design, build, automate, and support robust ETL pipelines utilizing a hybrid stack:<ul><li>Orchestrate and execute data movement within the Azure ecosystem.</li><li>Manage and modernize existing on-premise and open-source data flows.</li><li>Implement and customize connectors for rapid SaaS/API integration.</li></ul></li><li>Design and implement highly optimized data warehouse schemas (star schema, snowflake schema, data vaults) to support analytical reporting and business intelligence needs.</li><li>Drive the requirements-gathering process, creating detailed, rigorous STTM documents that clearly define data transformations, data quality rules, and metadata definitions.</li><li>Write complex SQL, T-SQL, or PL/SQL queries and stored procedures, and perform expert-level performance tuning on large-scale relational databases (e.g., SQL Server, Azure SQL DB, Snowflake).</li><li>Implement rigorous data validation, cleansing, and error-handling mechanisms across all platforms to ensure data integrity and compliance with industry standards.</li><li>Leverage Azure services (Data Lake Storage Gen2, Synapse Analytics, etc.) in conjunction with ETL tools to build scalable, cloud-native data solutions.</li><li>Use scripting languages (Python, PowerShell) to automate job orchestration, monitoring, and testing procedures not natively covered by the ETL tools.</li></ul><p><strong>Required Skills and Qualifications</strong></p><ul><li>10+ years of dedicated experience in ETL development and data warehousing, specifically working with multiple technologies mentioned above in an enterprise setting.</li><li>Familiarity with BI tools and specific technologies such as <strong>SSAS (SQL Server Analysis Services), MDX, and OLAP cubes</strong>.</li><li>Experience with big data frameworks such as <strong>Hadoop and Spark, and related components such as HDFS, Hive, and Sqoop</strong>.</li><li>Experience in scripting languages such as <strong>Python, Java, or Scala</strong> for automation and complex data manipulation.</li><li>Understanding of how to build and manage data pipelines that feed machine learning models and AI applications.</li><li>Hands-on experience building pipelines and managing linked services, datasets, and triggers.</li><li>Experience using <strong>Python/Pandas</strong> for custom, production-ready ETL pipelines.</li><li>Experience managing and developing jobs within open-source frameworks such as <strong>Talend and Pentaho (PDI)</strong>.</li><li>Familiarity with deployment and connector utilization/customization using <strong>Airbyte</strong>.</li><li>Proven experience designing dimensional and normalized data models.</li><li>Mastery of complex SQL query writing and performance optimization techniques.</li><li>Demonstrated ability to produce detailed and accurate Source-to-Target Mappings and functional specifications.</li><li>Experience with Python or PowerShell scripting.</li><li>Bachelor's degree in Computer Science, Information Technology, or a related quantitative field.</li><li>Strong analytical ability, meticulous attention to detail, excellent communication skills, and the capacity to translate complex technical concepts into clear documentation for stakeholders.</li></ul><p><br></p><p><strong>Preferred Qualifications</strong></p><ul><li>Microsoft Certified: Fabric Data Engineer Associate (DP-700)</li><li>Microsoft SQL Certification</li><li>Microsoft Certified Professional (MCP)</li><li>Microsoft Certified IT Professional (MCITP)</li><li>Experience with workflow orchestration tools like Apache Airflow.</li><li>Knowledge of Big Data platforms or NoSQL databases.</li></ul><p><br></p><p>If interested, please apply through LinkedIn or share your resume at </p>
