VP/AVP, Big Data Delivery Lead, Middle Office Technology, Technology & Operations

Job Title: VP/AVP, Big Data Delivery Lead, Middle Office Technology, Technology & Operations
Contract Type: Permanent
Location: Singapore
Reference: WD20685
Contact Name: Jasper Tan
Job Published: April 19, 2021 18:31

Job Description

Business Function

Group Technology and Operations (T&O) enables and empowers the bank with an efficient, nimble and resilient infrastructure through a strategic focus on productivity, quality & control, technology, people capability and innovation. In Group T&O, we manage the majority of the Bank's operational processes and aspire to delight our business partners through our multiple banking delivery channels.

Responsibilities

  • Drive big data architecture design within the team for implementation across several projects
  • Work with key business and technical stakeholders to drive robust and effective big data architecture design
  • Deliver solutions for online data extraction, data integration and data management
  • Contribute to the establishment and maintenance of the distributed computing platform and big data services
  • Write documents that clearly explain how algorithms should be implemented, verified and validated
  • Manage key stakeholders and explain design decisions in a clear and articulate fashion
  • Lead and advise project teams in delivering enterprise projects, especially where big data is concerned
  • Analyse and perform performance tuning for jobs and queries
  • Mentor junior developers in the team to help them adopt best practices
  • Review pull requests and give constructive feedback to peers and junior developers
  • Write production-quality code

Requirements

  • Experience in a field encompassing distributed computing, big data analytics and data transformation
  • Relevant industry experience in cloud technology would be favourably considered
  • Knowledge of Core Java / Scala preferred
  • Hands-on experience in Spark is a must
  • Hands-on experience with HDFS (Hadoop), Spark, Impala and Hive
  • Knowledge of database technologies such as MariaDB TX and X5
  • Computer Science fundamentals in algorithm design, problem solving and complexity analysis
  • Knowledge of professional software engineering practices and best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing and operations
  • Understanding of the big data ecosystem for enterprise applications, with the ability to design data architecture that satisfies project requirements
  • Able to perform Unix / Linux scripting

We offer a competitive salary and benefits package and the professional advantages of a dynamic environment that supports your development and recognises your achievements.
