Job Description
As a Data Architect, you will be responsible for designing, implementing, and managing data solutions across various lines of business. You will play a critical role in developing robust and scalable data architectures that meet the evolving needs of different data projects. The ideal candidate should have a strong background in big data technologies, data engineering, and architecture design.
Requirements:
- Proven experience as an architect and engineering lead in the Data & Analytics stream
- In-depth understanding of data structure principles and data platforms
- Problem-solving attitude and a solution-oriented mindset, with implementation expertise
- Working experience with modern data platforms involving big data technologies, data management solutions, and data virtualization
- Well-versed in end-to-end data management philosophies and governance processes
- Pre-sales experience and involvement in RFP/RFI/RFQ processes
- Creative problem-solver with strong communication skills
- Excellent understanding of traditional and distributed computing paradigms
- Excellent knowledge of data warehouse / data lake technologies and business intelligence concepts
- Good knowledge of relational, NoSQL, and big data databases, with the ability to write complex SQL queries
Personal Skills
- Good command of the English language.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration skills.
- Ability to work well in a team environment.
- Attention to detail and ability to manage multiple tasks simultaneously.
- Strong time management and organizational skills.
- TOGAF certification is preferred.
Technical Skills
- Data integration - ETL tools like Talend and Informatica; ingestion mechanisms like Flume and Kafka.
- Data modeling - Dimensional and transactional modeling using RDBMS, NoSQL, and big data technologies.
- Experience in Snowflake modeling would be an advantage.
- Data visualization - Tools like Tableau, Power BI, and Kibana.
- Master data management (MDM) - Concepts and expertise in tools like Informatica MDM and Talend MDM.
- Big data - Hadoop ecosystem, distributions like Cloudera / Hortonworks, Pig, and Hive.
- Data processing frameworks - Spark and Spark Streaming.
- Hands-on experience with multiple databases such as PostgreSQL, Snowflake, Oracle, MS SQL Server, and NoSQL stores (HBase / Cassandra, MongoDB) is required.
- Knowledge of various data modeling techniques, with hands-on experience in data modeling tools like ERwin, TOAD, PowerDesigner, etc.
- Experience with cloud data ecosystems - AWS, Azure, or GCP.
- Strong analytical and problem-solving capabilities.
- Good understanding of the data ecosystem, including both current and future data trends.
Education
- Bachelor's degree in Computer Science, Software Engineering, or a related field.