• Design, construct, install, test and maintain highly scalable data pipelines with state-of-the-art monitoring and logging practices.
• Investigate and resolve performance and stability issues in data processing systems, and advise on necessary infrastructure changes.
• Select and integrate the data tools and frameworks required to provide requested capabilities.
• Recommend ways to improve data reliability, efficiency and quality.
• Collaborate with Data Scientists, DevOps and Project Managers on meeting project goals.
• BS or MS in Computer Science or Computer Engineering
• Proficient understanding of distributed computing principles
• Data warehouse experience with a major ETL tool; solid hands-on experience with at least one ETL tool is required.
• Familiarity with relational and NoSQL databases, such as MySQL, Hadoop/HBase, MongoDB, and Redis.
• 3+ years of experience in working with big data using technologies like Spark, Kafka, Flink, Hadoop, and NoSQL datastores.
• 2+ years of experience with distributed, high-throughput, low-latency architectures.
• 2+ years of experience deploying or managing data pipelines that support data-science-driven decision-making at scale.
• A successful track record of processing and extracting value from large, disconnected datasets.
• Machine learning or statistics-related knowledge.
• Experience with AWS ecosystem tools and system-level DevOps tools.
TECHDesign was founded in 2015 under Winbond Electronics Corp, a trusted name in semiconductors since 1987. The idea behind the formation of TECHDesign was simple: we saw a growing need to make the increasingly complex electronics supply chain more accessible to small and medium-sized enterprises and big corporations alike. Since then, we have stayed true to our roots by making TECHDesign a place where our clients and suppliers can connect with ease.