Transforming data into innovation requires a skilled and passionate professional.
">
* A Data Engineer will play a vital role in designing, developing, and delivering business-critical features on key products within the payment ecosystem, and will help foster an innovative, dynamic culture driven by diversity and inclusion.
* This individual will work alongside a fantastic team and highly invested managers focused on career progression.
The ideal candidate is passionate about making jobs simpler and more rewarding.
We are a team of big thinkers who engineer the future of banking, creating industry-leading products with lifecycles supported by the right capabilities, people, and technology.
Together we can create something remarkable. As a Data Engineer, you will be empowered to deliver high-quality solutions, drive technical excellence, and contribute to the development of next-generation data platforms and pipelines.
Data is at the heart of our operations. We rely on robust data warehousing solutions, such as Amazon Redshift, Google BigQuery, or Snowflake, to drive informed decision-making and business growth.
You will have the opportunity to:
* Work with the Cloudera Data Platform and contribute to large migration programs from on-premises to the cloud
* Develop expertise in data warehousing solutions such as Amazon Redshift, Google BigQuery, or Snowflake
* Build next-generation data platforms and data pipeline solutions across the organization
* Lead and work on the end-to-end delivery of projects, leveraging Agile methodologies
* Apply data modeling, schema design, and data architecture principles (a brief schema sketch follows this list)
* Guide engineers and lead teams towards pragmatic and fit-for-purpose solutions
* Cultivate creative problem-solving skills to tackle complex data challenges
* Drive DevOps practices and CI/CD tooling for efficient software delivery
* Maintain knowledge of data governance practices, data privacy regulations, and security measures
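To make the schema-design responsibility concrete, here is a minimal star-schema sketch expressed as Spark SQL DDL. It is purely illustrative: the table names, columns, and payments domain shown are assumptions, not a description of our actual data model.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("schema-design-sketch").getOrCreate()

# Dimension table: one row per merchant (illustrative columns only)
spark.sql("""
    CREATE TABLE IF NOT EXISTS dim_merchant (
        merchant_key  BIGINT,
        merchant_name STRING,
        country_code  STRING
    ) USING parquet
""")

# Fact table: one row per payment event, keyed to the dimension and
# partitioned by date for efficient warehouse-style queries
spark.sql("""
    CREATE TABLE IF NOT EXISTS fact_payment (
        payment_id   STRING,
        merchant_key BIGINT,
        amount       DECIMAL(18, 2),
        currency     STRING,
        event_date   DATE
    ) USING parquet
    PARTITIONED BY (event_date)
""")
```

Separating slowly changing reference data (dimensions) from high-volume transactional records (facts) keeps queries simple and storage efficient, which is the core idea behind the facts/dimensions concepts listed under the tech skills below.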
Tech skills required:
* Hadoop ecosystem (HDFS, MapReduce, YARN)
* Advanced SQL programming
* Proficiency in Spark for large-scale data processing
* Experience with AWS cloud services, including certification
* Deep understanding of, and experience with, Kafka for real-time data streaming (see the sketch after this list)
* Experience handling complex file formats and structures
* Traditional data warehouse concepts (facts/dimensions, normalization)
* Event-driven data architecture
* Apache Iceberg
* Amazon MSK (managed Kafka)
* Amazon Redshift
* AWS Glue ETL
* Amazon S3
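As a rough sketch of how parts of this stack fit together, the snippet below uses PySpark Structured Streaming to read payment events from a Kafka (MSK) topic and land them in S3 as Parquet for downstream warehouse loads. Broker addresses, topic names, and bucket paths are placeholders, and the job assumes the spark-sql-kafka connector is on the classpath.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("payments-event-stream").getOrCreate()

# Subscribe to a payments topic on MSK (broker and topic names are placeholders)
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "b-1.example-msk.amazonaws.com:9092")
       .option("subscribe", "payments.events")
       .option("startingOffsets", "latest")
       .load())

# Kafka delivers keys/values as bytes; cast the value to a string for downstream parsing
events = raw.selectExpr("CAST(value AS STRING) AS event_json", "timestamp")

# Land the raw events in S3 as Parquet; checkpointing gives exactly-once file output
query = (events.writeStream
         .format("parquet")
         .option("path", "s3://example-data-lake/raw/payments/")
         .option("checkpointLocation", "s3://example-data-lake/checkpoints/payments/")
         .trigger(processingTime="1 minute")
         .start())

query.awaitTermination()
```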
We offer a flexible and inclusive work environment, allowing you to balance your work-life needs. Our team is committed to driving positive change, and we encourage creativity, collaboration, and continuous learning.
If you share our passion for innovation and customer satisfaction, apply now. We look forward to welcoming talented individuals who embody our values: Care, Courage, and Commitment.