Data Engineer with five years of industry experience and a proven track record of delivering high-quality results in data processing and management. Skilled in Python programming and container technologies, with a strong background in building CI/CD pipelines and leveraging serverless services on Google Cloud Platform (GCP). Seeking a challenging role where I can apply my expertise and contribute to innovative data engineering solutions.
Nov 2021 - Present
Taichung, Taiwan
Oct 2019 - Oct 2021
Taichung, Taiwan
Developed and maintained ETL processes in Python to transfer data into the Hadoop ecosystem (HBase and Hive) for efficient data storage and retrieval.
Applied SQL for data manipulation and query optimization.
Collaborated with cross-functional teams to design and implement data pipelines, ensuring data integrity and accuracy.
Streamlined data processing workflows, resulting in significant time and resource savings.
Integrated data with Snowflake, enhancing the company's data warehousing capabilities.
Data Warehousing (Snowflake)
Hadoop Ecosystem (HBase, Hive)
RESTful API Development (FastAPI)
Container Technologies (Docker, Docker Compose, Kubernetes)
CI/CD Pipeline Development (Jenkins and Ansible)
Google Cloud Platform (GCP): Cloud Build, Cloud Deploy, GKE, Cloud Functions, Pub/Sub
Serverless Service Integration
Mandarin - Native
English - Proficient
Sep 2015 - Sep 2017
Sep 2010 - Sep 2014