Responsibilities:
- Design, build, and maintain scalable data pipelines and ETL processes
- Develop and optimize data models, data marts, and curated datasets
- Perform data ingestion, cleansing, and transformation from multiple sources
- Ensure data quality, integrity, and consistency across systems
- Implement data architecture best practices and standards
- Collaborate with cross-functional teams to understand business requirements
- Apply CI/CD practices within data environments
- Support data security initiatives such as row-level security (RLS), data masking, and encryption (see the sketch after this list)
- Monitor and troubleshoot data workflows and performance issues
- Contribute to continuous improvement of data platforms and processes
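For candidates unfamiliar with the term, the row-level security item above can be illustrated with a minimal Python sketch. It assumes a PostgreSQL warehouse reached via psycopg2; the transactions table, branch_id column, and app.branch_id session setting are hypothetical placeholders, not part of this role's actual stack.

```python
# Minimal sketch of enabling PostgreSQL row-level security (RLS).
# Assumes psycopg2 and a PostgreSQL database; the transactions table,
# branch_id column, and app.branch_id setting are hypothetical.
import psycopg2

RLS_DDL = """
ALTER TABLE transactions ENABLE ROW LEVEL SECURITY;

-- Each session sees only rows for its own branch, as set via
-- SET app.branch_id = '<id>'; by the application.
CREATE POLICY branch_isolation ON transactions
    USING (branch_id = current_setting('app.branch_id')::int);
"""

def apply_rls(dsn: str) -> None:
    # psycopg2's connection context manager wraps the DDL in a
    # transaction and commits it on successful exit.
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(RLS_DDL)

if __name__ == "__main__":
    apply_rls("dbname=warehouse")  # hypothetical DSN
```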
Requirements:
- 5+ years of experience in data engineering or related roles
- Strong experience with ETL, data pipelines, and data modeling
- Hands-on experience with modern data platforms such as Databricks, Snowflake, Cloudera, Microsoft Fabric, Teradata, or similar
- Experience with data ingestion and orchestration tools (e.g., Kafka, NiFi, Airflow, Informatica); a minimal orchestration sketch follows this list
- Proficiency in SQL and working with relational and NoSQL databases
- Good understanding of data architecture and data warehousing concepts
- Experience with data security practices (RLS, masking, encryption)
- Familiarity with CI/CD in data environments
- Experience with visualization tools (Power BI, Tableau, Qlik) is a plus
- Banking or payments industry experience is a strong advantage
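As an illustration of the orchestration tooling named in the requirements above, the following sketch shows a minimal Airflow 2.x DAG wiring a daily extract-transform-load sequence. The dag_id, task bodies, and schedule are hypothetical examples, not a prescribed implementation for this role.

```python
# Minimal sketch of a daily ETL pipeline as an Airflow 2.x DAG.
# Assumes Airflow >= 2.4 (for the `schedule` argument); the dag_id,
# task bodies, and schedule below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract() -> None:
    # Pull raw records from a source system (stubbed here).
    print("extracting")

def transform() -> None:
    # Cleanse and conform the extracted records (stubbed here).
    print("transforming")

def load() -> None:
    # Write curated rows into the warehouse (stubbed here).
    print("loading")

with DAG(
    dag_id="daily_sales_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency chain: extract -> transform -> load.
    extract_task >> transform_task >> load_task
```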