Big Data Engineer - VOIS

Role purpose: 
The Big Data Engineer is a core member of the agile teams that deliver data pipelines and capabilities within the organization by building and automating data flows. The role provides expert guidance and delivers, through self and others, to:
1.    Integrate the necessary data from several sources for analysis and for Technology actions
2.    Build applications and products that make use of large volumes of data and produce outputs that enable actions delivering incremental value
3.    Deliver and implement core capabilities (frameworks, platform, development infrastructure, documentation, guidelines and support) to speed up delivery in the Big Data Programme, assuring the quality and performance of component releases in the platform and their alignment with the Group technology blueprint
4.    Support stakeholders and functions in deriving business value from the operational data


Key accountabilities and decision ownership:
•    Designing and producing high-performing, stable end-to-end applications that perform complex processing of massive volumes of batch and streaming data in a multi-tenant big data platform in the cloud (on-premises Hadoop will be considered) and output insights back to business systems according to their requirements (see the sketch after this list)
•    Ingesting and automating the flow of the necessary data from local and group sources onto the GCP platform
•    Accountable for ensuring delivery of solution and use-case enablement, GCP project and resource enablement, data-source ingestion for Networks sources, application production rollouts, and code/execution optimisation for big data solutions
•    Working with key stakeholders such as the Group Big Data/Neuron team, ADS, ACoE, and local market IT and Big Data teams to define the strategy for evolving the Big Data capability, including solution architecture decisions aligned with the platform architecture
•    Investigating new technologies and driving their adoption where they can bring benefits
•    Ensuring common data architecture, structure and definitions, data cleansing and data integrity
•    Supporting data security and privacy processes
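For illustration, a pipeline of the kind described in the first bullet could look like the following minimal PySpark Structured Streaming sketch, which parses events from a Kafka topic and appends them to a BigQuery table. This is a sketch under assumptions, not a Vodafone implementation: it assumes the spark-bigquery connector is on the classpath, and the broker address, topic, event schema, table and bucket names are all hypothetical placeholders.

```python
# Minimal sketch: Kafka -> parse JSON -> BigQuery, via Structured Streaming.
# All names below (broker, topic, schema, table, bucket) are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, LongType

spark = SparkSession.builder.appName("network-events-stream").getOrCreate()

# Assumed shape of the incoming JSON events.
event_schema = StructType([
    StructField("cell_id", StringType()),
    StructField("metric", StringType()),
    StructField("value", LongType()),
    StructField("event_ts", StringType()),
])

# Consume the raw Kafka stream and parse the JSON payload.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "network-events")             # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Append the parsed events to BigQuery through the spark-bigquery connector.
(
    events.writeStream
    .format("bigquery")
    .option("table", "my-project.analytics.network_events")            # placeholder
    .option("checkpointLocation", "gs://my-bucket/chk/network_events") # placeholder
    .option("temporaryGcsBucket", "my-bucket")                         # placeholder
    .start()
    .awaitTermination()
)
```

The checkpoint location is what makes the stream restartable; in practice the same transformation logic is typically reused in batch mode for backfills.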

Core competencies, knowledge and experience:
•    Experience of working on projects of differing size and complexity that span multiple disciplines, meeting business expectations and delivering results
•    Confident and able to liaise and influence at all levels within Vodafone and/or relevant customer organizations
•    Able to communicate effectively across organisational, technical and political boundaries, understanding the context
•    Understands business requirements and drives improvements and process innovation


Must have technical / professional qualifications: 
•    Experience with modern data architectures and cloud data analytics technologies, preferably on GCP.
•    Hands-on software development experience in Python; Scala and Java desirable
•    Hands-on experience in writing complex SQL queries and in query tuning
•    Experience in UNIX/Linux administration and bash scripting
•    Hands-on experience with Apache Spark, Apache Kafka, BigQuery, CDAP and Apache Airflow (an orchestration sketch follows this list)
•    Strong knowledge of data movement techniques and best practices to handle large volumes of data.
•    Experience with data warehousing architecture and data modelling best practices.
•    Experience with File Systems, server architectures, and distributed systems.
•    Google Cloud certified Professional Data Engineer and/or Professional Cloud Architect desirable
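As a rough illustration of the Airflow and GCP experience listed above, the sketch below shows a minimal daily ingestion DAG that loads exported CSV files from Cloud Storage into a BigQuery staging table. It assumes Airflow 2.4+ with the apache-airflow-providers-google package installed; the project, bucket, dataset and table names are hypothetical.

```python
# Minimal sketch: a daily GCS -> BigQuery ingestion DAG.
# Project, bucket, dataset and table names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_network_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Load the run date's exported CSV files into a BigQuery staging table.
    load_to_bq = GCSToBigQueryOperator(
        task_id="load_network_csv",
        bucket="my-ingest-bucket",                    # placeholder bucket
        source_objects=["network/{{ ds }}/*.csv"],    # one folder per run date
        destination_project_dataset_table="my-project.staging.network_raw",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )
```

Templating the source path with {{ ds }} scopes each run to its own date folder, which keeps reruns and backfills idempotent.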

#VOIS

Publication date: Today
Publisher: Vodafone jobs