• Ingesting data from different sources into Kafka and HDFS/Hive (see the sketch after this list).
• Working knowledge of big data and stream-processing tools such as Hadoop, Kafka, and Spark Streaming.
• Experience with Cloudera/Hortonworks Hadoop distributions.
• Evolving systems for problem prevention, performance improvement, monitoring, and administration.
• Familiarity with data movement (extract, transform, load), especially Trinity with Teradata BTEQ scripts.
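The first requirement covers ingestion into Kafka and HDFS/Hive. Below is a minimal, illustrative sketch of one common way to do this, assuming Spark Structured Streaming; the broker address, topic name, and HDFS paths are placeholders, not details from this posting.

```python
# Illustrative sketch: read a Kafka topic with Spark Structured Streaming and
# land it on HDFS as Parquet (a Hive external table can be defined over the path).
# Requires the spark-sql-kafka package to be supplied at spark-submit time.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = (
    SparkSession.builder
    .appName("kafka-to-hdfs-ingest")
    .getOrCreate()
)

# Subscribe to a Kafka topic (broker and topic names are hypothetical placeholders).
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka delivers key/value as binary; cast the payload to string before writing.
parsed = events.select(
    col("key").cast("string"),
    col("value").cast("string"),
    col("timestamp"),
)

# Write the stream to HDFS as Parquet with a checkpoint for fault tolerance.
query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "hdfs:///data/raw/events")
    .option("checkpointLocation", "hdfs:///checkpoints/events")
    .start()
)

query.awaitTermination()
```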
3 years of experience required
No management responsibility