Big Data Engineer
- Leading insurance group working in more than 170 countries.
- Big Data Engineer
About our client
Zurich is one of the world's leading insurance groups, and one of the few to operate on a global basis. Our mission is to help our customers understand and protect themselves from risk. With about 55,000 employees serving customers in more than 170 countries, we aspire to become the best global insurer as measured by our shareholders, customers and employees. We help individuals, small and medium-sized companies and global corporations around the world understand and protect themselves from risk by offering a wide range of insurance products, solutions and advisory services.
Job description
- Join the Big Data delivery team to help implement the new data lake platform; define and implement a data integration and workflow framework that covers data loading, integration, quality management, aggregation, joins, etc., and distribution to various systems using advanced Big Data technologies.
- Ensure, in collaboration with the Solution Designer and Platform Manager, scalable and high-performance solutions.
- Provide application support for IT related problems in the area of data processing workflows.
- Develop and maintain scalable data management and analytics solutions using Hive, HBase, Informatica, etc.
- Hands-on technical role; contribute to all phases of the software development lifecycle, including analysis, architecture, design, implementation and QA.
- Ensure peer reviews and unit tests are prepared and conducted across the entire team.
- Provide the effort estimates required to develop software.
- Transition SQL and file system application data stores to a Big Data scalable framework where appropriate.
Candidate profile (m/f)
- A very strong SQL, data analysis or data mining background; experience with Business Intelligence, Data Warehousing, common/reusable components, audit controls and ETL frameworks.
- Solid understanding of large scale data management environments (relational and/or NoSQL).
- 3+ years of experience building massively scalable distributed data processing solutions with Hadoop, HBase and Hive.
- Demonstrable knowledge of Continuous Integration, Test-Driven Development and code analysis, including their application within the software development lifecycle.
- 2+ years of experience developing ETL routines, preferably with Informatica Big Data Edition.
- Experience in the development and configuration of REST APIs.
- Experience in agile IT development methodologies (e.g. Scrum).
- Prior experience in a financial services environment desirable, preferably insurance.
- Ability to work effectively in a global organization, under time pressure and to manage ambiguity.
- Bachelor's degree (IT), Master's degree, Technical Diploma or equivalent.
- Good English skills (written and oral).
- Strong knowledge of Informatica PowerCenter toolset.