AWS BigData Developer

  • Full Time
  • Toronto

Tata Consultancy Services

Inclusion without Exception:

Tata Consultancy Services (TCS) is an equal opportunity employer and embraces diversity in race, nationality, ethnicity, gender, age, physical ability, neurodiversity, and sexual orientation to create a workforce that reflects the societies we operate in. Our continued commitment to culture and diversity is reflected in our people stories across our workforce and implemented through equitable workplace policies and processes.



About TCS:

TCS is an IT services, consulting, and business solutions organization that has been partnering with many of the world’s largest businesses in their transformation journeys for over 55 years. Its consulting-led, cognitive-powered portfolio of business, technology, and engineering services and solutions is delivered through its unique Location Independent Agile™ delivery model, recognized as a benchmark of excellence in software development.

A part of the Tata group, India’s largest multinational business group, TCS employs over 612,000 of the world’s best-trained consultants in 55 countries. The company generated consolidated revenues of US $29 billion in the fiscal year ended March 31, 2024, and is listed on the BSE and the NSE in India. TCS’ proactive stance on climate change and award-winning work with communities across the world have earned it a place in leading sustainability indices such as the MSCI Global Sustainability Index and the FTSE4Good Emerging Index.

Required Skills:


• Understand requirements from product owners and translate them into requirement and scope documents.

• Decide on the best fit among the technologies/services available in scope.


• Create solutions for data ingestion and data transformation using Hadoop services such as Spark, Spark Streaming, Hive, etc.

• Create technical design documents to communicate solutions to the team, and mentor the team in developing the solution.

• Build the solution with Hadoop services as per design specifications.


• Assist teams in building test cases and support testing of the solution.

• Coordinate with upstream, downstream, and other supporting teams for production implementation.

• Provide post-production support for implemented solutions.


• Develop data engineering frameworks in Spark on the AWS Data Lake platform.

• Coordinate with clients, data users, and key stakeholders to understand required features and merge them into reusable design patterns.

• Onboard data using the developed frameworks.

• Analyze the existing Netezza and Hadoop code to design the best way to implement its current features in the AWS Data Lake.


• Unit test code and aid with QA/SIT/performance testing.

• Migrate solutions to the production environment.

• Candidate should have strong working experience with Hadoop platform.


• Strong hands-on experience with Hive and Spark with Scala.

• In-depth knowledge and extensive experience in building batch workloads on Hadoop.

• Adept at analyzing and refining requirements and consumption query patterns, and at choosing the right technology fit, such as RDBMS, data lake, or data warehouse.


• Should have knowledge of analytical data modelling on any RDBMS platform or Hive.

• Should have working experience with Pentaho.

• Proven practical experience in migrating RDBMS based data to Hadoop on-prem


• Good experience with data warehouse and data lake platforms.

• Good experience in implementing AWS Data Lake, S3, EMR/Glue, Python, AWS RDS, Amazon Redshift, AWS Lake Formation, Airflow, data models, etc.

• MUST have very strong knowledge of PySpark.


• Must understand data encryption techniques and be able to implement them.

• Must have experience working with Bitbucket, Artifactory, and AWS CodePipeline.

• Hands-on experience working with terabyte- to petabyte-scale data and millions of transactions per day.


• Skills to develop ETL pipelines using Airflow. Knowledge of Spark Streaming or other streaming jobs.

• Ability to deploy code using AWS CodePipeline and Bitbucket is a plus. Expert in either Scala or Java, and comfortable working on the Linux platform.

• Knowledge of Python, CI/CD pipeline design, and AWS cloud infrastructure for services such as S3, Glue, Secrets Manager, KMS, Lambda, etc.


Tata Consultancy Services Canada Inc. is committed to meeting the accessibility needs of all individuals in accordance with the Accessibility for Ontarians with Disabilities Act (AODA) and the Ontario Human Rights Code (OHRC).

Should you require accommodations during the recruitment and selection process, please inform Human Resources.



Thank you for your interest in TCS. Candidates that meet the qualifications for this position will be contacted within a 2-week period. We invite you to continue to apply for other opportunities that match your profile.

To apply, please visit the following URL:

THISJOB.CA