
Cognizant
We are looking for a highly skilled Big Data Java Developer who will play a pivotal role in driving our Big Data initiatives using expertise in Apache Hadoop, Java, and advanced Java technologies. The role involves working with Spark, Scala, and Big Data technologies to perform advanced data transformations and analytics, delivering high-quality solutions on time for our clients.
We are Cognizant Artificial Intelligence:
Digital technologies, including analytics and AI, give companies a once-in-a-generation opportunity to perform orders of magnitude better than ever before. However, to truly understand their customers and business operations, clients need new business models built on analyzing them from every angle.
With the power to apply artificial intelligence and data science to business decisions via enterprise data management solutions, we help leading companies prototype, refine, validate, and scale the most desirable products and delivery models within weeks.
In this role, you will
- Design, develop and test a large-scale, custom distributed software system using the latest Java, Scala and Big Data technologies.
- Build data pipelines for batch and real-time data ingestion, transformation, and loading.
- Collaborate with cross-functional teams to design and enhance Java-based applications, ensuring seamless integration and performance.
- Work with cloud platforms (e.g., AWS, Azure, GCP) for scalable data solutions.
- Conduct code reviews and provide constructive feedback to ensure high-quality software development practices.
- Ensure data security and compliance with industry standards, safeguarding sensitive information.
- Participate in agile development processes, contributing to sprint planning and retrospective meetings.
What you’ll need to succeed (required skills)
- 8+ years of backend application development experience in Java, Microservices, and Spring/Spring Boot.
- 5+ years of experience developing and optimizing Big Data applications using Java/Scala and Spark on Cloudera/HDP.
- 3+ years of hands-on experience with Hadoop ecosystem tools (e.g., HDFS, Hive, Pig, MapReduce).
- Strong experience in real-time data streaming using Apache Kafka or similar technologies.
- Experience in developing/designing micro-service architecture.
- Experience in building end-to-end data pipelines on AWS/Azure and/or Databricks.
- Proven hands-on experience in containerization (Docker, Kubernetes, OpenShift, etc.).
- Proficient in SQL and working with large datasets.
- Working knowledge of Jenkins CI, Git, CI/CD pipelines.
- Superb communication and interpersonal skills.
What will help you stand out (preferred skills)
- Experience in large scale on-premise to Cloud migration projects.
- Programming experience in Python (2+ years).
At Cognizant, we’re eager to meet people who believe in our mission and can make an impact in various ways. We encourage you to apply if you have most of the skills above and feel you are strongly suited for this role. Consider what transferable experience and skills make you a unique applicant and help us see how you’d be beneficial in this role.
Cognizant will only consider applicants for this position who are legally authorized to work in Canada without requiring employer sponsorship, now or at any time in the future.
Working arrangements:
We believe hybrid work is the way forward as we strive to provide flexibility wherever possible.
Based on this role’s business requirements, this is a hybrid position requiring 3 days a week in a client office in Toronto, ON. Regardless of your working arrangement, we are here to support a healthy work-life balance through our various wellbeing programs.
Note: The working arrangements for this role are accurate as of the date of posting. This may change based on the project you’re engaged in, as well as business and client requirements.
Rest assured, we will always be clear about role expectations.