Raya Holding for Financial Investments

Job description

·        Develops and optimizes batch and streaming data pipelines to ensure timely and reliable data delivery for business analytics.

·        Architects end-to-end ETL/ELT workflows using modern integration tools to streamline data ingestion from diverse sources.

·        Manages the Hadoop ecosystem (Spark, Hive, HDFS) to maintain high-performance big data processing capabilities.

·        Designs Medallion architecture layers (Bronze, Silver, Gold) to ensure structured data progression and high-quality consumption.

·        Implements advanced data modeling (Star, Snowflake, and SCD patterns) to provide scalable and efficient data structures for reporting.

·        Monitors data quality, observability, and governance protocols to safeguard data integrity across the organization.

·        Supports machine learning workflows by building feature pipelines and MLOps frameworks to accelerate AI model deployment.

·        Collaborates with cloud infrastructure teams to refine system performance and cost efficiency while ensuring environment scalability.

·        Automates deployments using Git, CI/CD, and containerized environments to maintain seamless and error-free code releases.

Preferred candidate

Years of experience

No experience required

About Raya Holding for Financial Investments
Egypt, Cairo
Investment Management