
Lead Software Engineer (Big Data Stack Admin)

Location: Singapore
Company: Great Eastern Holdings

Job Purpose

We are seeking a highly motivated and talented Hadoop Admin or Big Data Stack Admin to work on Great Eastern Life's next-generation data platform. Working alongside a team of engineers and architects, you will be responsible for running and managing our big data stack in DEV/SIT/UAT/PROD and supporting a hybrid data platform. This is a great opportunity to be an integral part of a team building GE's technology platform on open-source technologies, and to work on challenging, business-impacting projects.

The Job / Responsibilities

  • Responsible for the implementation and ongoing administration of platform infrastructure in environments of at least 15 nodes.
  • Takes care of the day-to-day running of big data clusters (including, but not limited to, Hadoop and MariaDB databases).
  • Responsible for working closely with the database, network, BI and application teams to ensure that all big data applications are highly available (24x7) and performing as expected.
  • Responsible for capacity planning and estimating the requirements for scaling the Hadoop/MariaDB cluster capacity up or down.
  • Responsible for enabling different levels of Hadoop/MariaDB security across the ecosystem.
  • Performance tuning of Hadoop clusters and Hadoop ecosystem.
  • Perform POCs of new capability in Hadoop/MariaDB Platform
  • Monitor and enhance Hadoop cluster jobs performance and capacity planning
  • Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments.
  • Cluster maintenance, including the creation and removal of nodes using Ambari or similar tools and CI/CD pipelines.
  • Monitor (24x7) Hadoop/MariaDB cluster performance, connectivity and security
  • Manage and review Hadoop log files and enhance the retention policy.
  • Handles performance tuning of Hadoop MapReduce routines.
  • File system management and monitoring.
  • Work diligently with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability.
  • Collaborate with application teams to install operating system updates, patches and version upgrades when required.
  • Backup and recovery tasks.
  • Good understanding of OS concepts, process management and resource scheduling.
  • Basic knowledge of networking, CPU, memory and storage.
  • Strong command of shell scripting.
  • Takes accountability in considering business and regulatory compliance risks and takes appropriate steps to mitigate the risks.
  • Maintains awareness of industry trends on regulatory compliance, emerging threats and technologies in order to understand the risk and better safeguard the company.
  • Highlights any potential concerns/risks and proactively shares best risk management practices.

Our Requirements

  • Bachelor's degree in Computer Science, Software Engineering or a related field.
  • 10 years in the software engineering profession, including 5 years of hands-on experience with the Hadoop ecosystem.
  • Must have 3 years' experience owning multiple critical applications on a big data platform with batch and real-time data pipelines.
  • 5 years' experience as a Hadoop Admin, having worked on more than one cluster of at least 15 nodes each.
  • Must have experience setting up Hadoop clusters from the ground up.
  • Must have strong experience with design, development and analytical skills in handling both structured and unstructured data.
  • Must have the ability to develop and maintain strong collaborative relationships at all levels across IT and Business Stakeholders.
  • Must have the ability to prioritize multiple tasks and deal with urgent requests in a demanding environment.
  • Excellent written and oral communication skills. Adept at presenting complex topics, influencing stakeholders and executing with timely, actionable follow-through.
  • Hands-on knowledge of Elasticsearch, Kibana dashboards and log management mechanisms.
  • Good understanding of financial/insurance data warehouse models.
  • Ability to propose innovative ideas and see them through to implementation using HDFS, Spark, Hive, Kafka, Scala and Python.
  • Extensive experience working with data warehouses and big data platforms.
  • Experience in building strong relationships with senior leaders will be preferable.
  • Aptitude to acquire new skills as needed for the role
  • High level of integrity; takes accountability for their work and brings a positive attitude to teamwork.
  • Takes the initiative to improve the current state of things and adapts readily to change.

Characteristics we look for…

  • A trendsetter. You thrive in an intellectually challenging environment with leading edge technologies.
  • A team player. 'We' over 'I'.
  • A learner. You have an insatiable thirst for knowledge and greater understanding.
  • A pragmatist. Your goal is to create useful products, not build technology for technology's sake.
  • An empath. You understand what the customer needs and use that perspective to create the best user experience.

About Great Eastern

Established in 1908, Great Eastern places customers at the heart of everything we do. Our legacy extends beyond our products and services to our culture, which is defined by our core values and how we work. As champions of Integrity, Initiative and Involvement, our core values act as a compass, guiding and inspiring us to embrace the behaviours associated with each value, upholding our promise to our customers - to continue doing our best for them in a sustainable manner.

We work collaboratively with our stakeholders to look for candidates who exhibit or have the potential to embrace our core values and associated behaviours, as these are the key traits that we expect from our employees as they develop their careers with us.

We embrace inclusivity, giving all employees an equal opportunity to shine and play their role in exploring possibilities to deliver innovative insurance solutions.

Since 2018, Great Eastern has been a signatory to the United Nations (UN) Principles for Sustainable Insurance. Our sustainability approach around environmental, social and governance (ESG) considerations plays a key role in every business decision we make. We are committed to being a sustainability-driven company that works towards a low-carbon economy by managing the environmental footprint of our operations and incorporating ESG considerations into our investment portfolios; improving people's lives by actively helping customers live healthier, better and longer; and driving responsible business practices through material ESG risk management.

To all recruitment agencies: Great Eastern does not accept unsolicited agency resumes. Please do not forward resumes to our email or our employees. We will not be responsible for any fees related to unsolicited resumes.
