The Data Services and Solutions team is responsible for the design, development, testing, deployment, and operation of large enterprise data warehouse/BI solutions using both on-prem and cloud technologies. The Full Stack Data Engineer, Senior will report to the Manager, Data Services and Solutions.

In this role you will be deeply involved in the design, development, and deployment of secure, high-quality software solutions. Your role will focus on integrating security and automation throughout the software development lifecycle (SDLC), with an emphasis on writing clean, maintainable code and building infrastructure that supports CI/CD pipelines, automated testing, and cloud-native delivery. You'll implement and enforce DevSecOps best practices tailored for Azure, contribute to infrastructure as code, and work closely with developers, testers, and cloud engineers to ensure code is secure, scalable, and production-ready from day one. This role requires a hands-on engineer who thrives in a collaborative environment and is passionate about code quality, automation, and secure cloud development.

Our leadership model is about developing great leaders at all levels and creating opportunities for our people to grow personally, professionally, and financially. We are looking for leaders who are energized by creative and critical thinking, building and sustaining high-performing teams, getting results the right way, and fostering continuous learning.
Your Knowledge and Experience
- Requires a bachelor's degree in Computer Science, Information Technology, Management Information Systems, or a related field (or equivalent experience), with a minimum of 5 years of relevant experience in enterprise application support and cloud-based solution delivery
- Experience with a cloud platform, preferably Azure (or AWS or GCP), and its related technical stack, including ADLS, Synapse, Azure Data Factory, etc.
- Experience with Snowflake and/or Databricks
- Solid experience with JavaScript, along with responsive CSS design practices
- Strong technical understanding of data modeling (Data Vault 2.0), data mining, master data management, data integration, data architecture, data virtualization, data warehousing, and data quality techniques
- Hands-on experience using data management technologies such as Informatica PowerCenter/IICS, Collibra, Reltio Master Data Management, dbt Cloud, dbt Core, Denodo, and GoldenGate or Striim replication
- Working knowledge of testing tools and systems, and of scheduling software (Tidal, Control-M)
- Basic experience working with data governance and data security, specifically partnering with information stewards and privacy and security officers to move data pipelines into production with appropriate data quality, governance, and security standards and certification
- Proficiency in Unix command-line operations, including directory navigation, file manipulation, shell scripting, and Python, along with utilities such as awk and sed
- Hands-on experience with CI/CD pipelines (e.g., Bitbucket Pipelines, GitHub Actions) and Infrastructure as Code tools like Ansible to automate cloud deployments
- Demonstrated ability to influence and collaborate with stakeholders, vendors, and cross-functional teams, with excellent verbal and written communication skills to translate and execute technical deliverables
- Preferred experience in the healthcare industry
- Strong process orientation with the ability to follow and improve procedures for critical maintenance and operational tasks