14 Dec 2019
This exciting role involves working as part of a dynamic team to design and implement innovative strategies and scripts that automate the acquisition, ingestion, integration, cataloging and delivery of data, alongside the development of pioneering data-driven products.
In this role you will:
Lead initiatives to optimize and maintain data ingestion pipelines, including both stream and batch technologies
Apply a solid understanding of data governance, including access management, data modelling, data warehousing, data quality processes and metadata management
Own accountability for the development of data storage table and message schemas
Support Hadoop technologies to ensure data is stored and accessed reliably
Streamline the operation of self-service data lakes and manage the effective implementation of role-based access controls through technical enforcement and governance
Discover, report and mitigate platform issues through proactive monitoring of health and performance KPIs
Identify best practices for data ingestion and metadata management
Communicate and report data ingestion and analysis related risks and issues to the Project Manager.
You will bring:
Demonstrated technical experience working with Hadoop environments or similar technologies
A sharp eye for the characteristics of high-quality data (complete, valid, timely, unique, accurate, consistent)
The ability to rapidly acquire new technical skills in an ever-changing environment
A passionate, determined, go-getter attitude.