The Senior AWS Data Analyst is responsible for designing, building, and supporting scalable data pipelines and curated datasets on AWS. The role holder will work with cross-functional teams to ingest, transform, and serve data for reporting, analytics, and downstream applications. The ideal candidate is hands-on, strong in SQL/Python, and experienced with AWS-native data services and modern data engineering practices.
Key Responsibilities
* Design, develop, and maintain end-to-end data pipelines (batch and near-real-time) on the AWS data platform
* Build and manage ETL/ELT workflows using AWS services (e.g., AWS Glue, S3, Redshift, Athena, EMR) and orchestration tools such as Airflow
* Implement data ingestion patterns from diverse sources (databases, APIs, files, event streams) into raw, cleansed, and curated lake/warehouse layers
* Develop transformation logic using SQL and Python/PySpark for cleansing, enrichment, and standardisation
* Implement robust data quality checks, reconciliation controls, and monitoring/alerting for failures and anomalies
* Collaborate with data analysts/data scientists to model datasets for analytics and machine learning consumption
* Contribute to DataOps/DevOps practices: version control, CI/CD, automated testing, release management, and operational support
* Produce and maintain technical documentation (data flows, mappings, job schedules, runbooks, and operational procedures)
* Optimise data pipeline performance and support workflow orchestration and scheduling
* Support production deployments and operations
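To make the ingestion, cleansing, and reconciliation responsibilities above concrete, here is a minimal sketch in plain Python (field names, quality rules, and sample records are hypothetical, and a real pipeline would run on Glue/PySpark rather than in-memory lists): raw records flow through a cleansed layer with basic quality rules, then into a curated aggregate, with a row-count reconciliation control after the cleansing stage.

```python
# Hypothetical raw-layer records; the empty customer_id simulates a bad row.
RAW = [
    {"customer_id": " 101 ", "plan": "Mobile Plus ", "monthly_fee": "49.90"},
    {"customer_id": "102", "plan": "broadband basic", "monthly_fee": "29.90"},
    {"customer_id": "", "plan": "Mobile Plus", "monthly_fee": "49.90"},
]

def cleanse(rows):
    """Standardise types/whitespace and drop rows failing basic quality rules."""
    out = []
    for r in rows:
        cid = r["customer_id"].strip()
        if not cid:  # quality rule: the business key must be present
            continue
        out.append({
            "customer_id": int(cid),
            "plan": r["plan"].strip().title(),
            "monthly_fee": float(r["monthly_fee"]),
        })
    return out

def curate(rows):
    """Aggregate cleansed rows into a curated, analytics-ready dataset."""
    revenue = {}
    for r in rows:
        revenue[r["plan"]] = revenue.get(r["plan"], 0.0) + r["monthly_fee"]
    return revenue

cleansed = cleanse(RAW)
# Reconciliation control: accepted + rejected must equal the raw count.
rejected = len(RAW) - len(cleansed)
assert len(cleansed) + rejected == len(RAW)

print(curate(cleansed))  # {'Mobile Plus': 49.9, 'Broadband Basic': 29.9}
```

In a production setting the reconciliation counts would typically be written to a control table and wired into monitoring/alerting rather than asserted inline.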
Required Skills & Experience
* Hands-on experience working with Teradata and Siebel CRM datasets
* Experience delivering data pipelines in a large-scale enterprise data platform environment
* Strong hands-on AWS experience with common data services such as Amazon S3, AWS Glue, Amazon Redshift, Amazon Athena, and Amazon EMR
* Strong Python programming capability and strong data transformation experience with Spark, preferably via PySpark
* Advanced SQL skills (query optimisation, complex joins, window functions, performance tuning)
* Experience with workflow orchestration tools such as Airflow
* Solid understanding of data warehousing concepts (dimensional modelling, partitioning, incremental loads, CDC concepts)
* Experience implementing monitoring, logging, alerting, and operational support processes
* Strong communication skills and ability to work with stakeholders to translate requirements into data deliverables
* Telco industry experience is highly desirable
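As one illustration of the SQL window-function and CDC/incremental-load skills listed above, the following sketch uses Python's bundled sqlite3 module (table, columns, and sample data are hypothetical) to apply a common pattern: `ROW_NUMBER()` partitioned by the business key, ordered by change timestamp, to keep only the latest change record per customer.

```python
import sqlite3

# In-memory database standing in for a staging table fed by a CDC stream.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer_changes (customer_id INT, status TEXT, change_ts TEXT);
INSERT INTO customer_changes VALUES
    (1, 'active',    '2024-01-01'),
    (1, 'suspended', '2024-02-01'),
    (2, 'active',    '2024-01-15');
""")

# Keep the most recent record per customer_id (rn = 1 after ordering DESC).
latest = conn.execute("""
    SELECT customer_id, status
    FROM (
        SELECT customer_id, status,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_id
                   ORDER BY change_ts DESC
               ) AS rn
        FROM customer_changes
    )
    WHERE rn = 1
    ORDER BY customer_id
""").fetchall()

print(latest)  # [(1, 'suspended'), (2, 'active')]
```

The same deduplicate-latest-per-key query translates directly to Redshift, Athena, or Spark SQL when merging an incremental CDC feed into a curated layer.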
Interested candidates can send their updated resume to or reach me at M: 61283195529