We are expanding the team, and you will have the opportunity to work with teams in Taiwan and overseas, taking ownership of our big data environment and services. You will be involved in development projects (e.g. workflow integration, BI solutions) and maintain the current reporting services and integrations.
1. Develop tools and services that support the data team's analysis work.
2. Maintain existing data pipeline services.
3. Provide technical support for the Data Analysis Team.
Must Have Requirements:
1. Fluent in English
2. Familiar with Hadoop / Hive
3. Practical knowledge of Java / Python
4. Good understanding of any SQL/NoSQL database (MySQL / Cassandra / etc.)
Good to Have Requirements:
1. Experience with Airflow / Hue / Ambari
2. Familiarity with building scalable, maintainable RESTful APIs on Linux
3. Experience building an in-house BI solution from scratch
4. Knowledge of metadata management and OLAP cubes
5. Hands-on experience with Machine Learning or AI projects