Improve the existing systems that collect, manage, and convert data so our teams can interpret it in a meaningful way. Experience building AI/ML models and rule-based engines is required.
Responsibilities
- Design, build, and maintain scalable data pipelines to collect, process, and store large volumes of structured and unstructured data from various sources.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and ensure data solutions meet business needs.
- Implement and optimise data models, data architecture, and data warehousing solutions for efficient data retrieval and analysis.
- Monitor data systems' performance, identify bottlenecks, and implement solutions to optimise data processing and storage.
- Ensure data security and privacy measures are in place to safeguard sensitive information and comply with relevant regulations.
Job requirements
- Bachelor's degree in Computer Science, Engineering, or a related field; Master's degree preferred.
- Strong programming skills in languages such as Python, Java, or Scala, plus SQL for data manipulation and analysis.
- Proficiency in working with big data technologies, such as Hadoop, Spark, or similar distributed computing frameworks.
- Experience with data modelling, data warehousing, and ETL development using tools like Apache Airflow or similar.
- Knowledge of database systems like MySQL, PostgreSQL, MongoDB, or Cassandra.
📌 Data Engineer
🏢 Lendaly Pty
📍 Sydney