· 5+ years’ experience in Data Engineering and Data Warehousing.
· 3-5 years’ experience integrating data into analytical platforms.
· Experience in ingestion technologies (e.g., Sqoop, Flume), processing technologies (e.g., Spark/Scala), and storage technologies (e.g., HDFS, HBase, Hive).
· Experience in data profiling, source-to-target mappings, ETL development, SQL optimization, testing, and implementation.
· Expertise in streaming frameworks (Kafka, Spark Streaming, or Storm) is essential.
· Experience building applications with Spark SQL and the Spark DataFrame API.
· Experience managing structured and unstructured data types.