CakeResume Talent Search

Avatar of 郭懿萱.
Past
Specialist @台灣之星
Project Manager, Product Manager, System Analyst
Within one month
Automation project lead, saving manpower and improving efficiency: consolidated routine analysis reports into automated, scheduled Tableau dashboards, saving 120 hours of labor per month. Automation tools: SQL, ETL. Visualization tool: Tableau. [台灣之星] Customer Retention Division - Senior Specialist, March 2017 to August ... Responsible for value-added services, international roaming, churn analysis, and customer-service...
ETL
Google Workspace
Tableau
Unemployed
Ready to interview
Full-time / Interested in remote work
4-6 years
東吳大學
Department of Psychology
Avatar of Zheng Tzer Lee (李政澤).
Consultant @Startup
2023 ~ 2024
Pre-sales/PM/Business Consultant/Business Analyst/System Analyst
Within one month
Industry trend analysis • Integrated internal data with the Taiwan government's open data to automate parts of business development • Built the company data warehouse with Tableau Server • Automated ETL with Python and Tableau Prep to build data pipelines • Led A/B test design and execution for Data Science projects • Developed a recommendation system • Built an internal CRM system with the development team • Implemented membership management and...
Python
Tableau Prep/Tableau Desktop
ETL
Employed
Ready to interview
Full-time / Not interested in remote work
4-6 years
Fu Jen Catholic University
Brand and Fashion Management
Avatar of Chia-Wei Yen.
Technology Consultant @台灣易思資訊科技股份有限公司
2019 ~ 2024
Programmer
Within one month
Chia-Wei Yen New Taipei City, Taiwan || [email protected] A dedicated and results-oriented professional with over 10 years of comprehensive experience in database management, analytics, and software development. Proficient in SQL with specialized expertise in ETL processes, data warehousing, and business intelligence solutions. Skilled in leveraging IBM Cognos Planning Analytics (TM1) and Oracle Hyperion Essbase for advanced financial planning and analysis. Possesses strong analytical acumen complemented by proficiency in Python scripting. Demonstrates adaptability by effectively navigating UNIX environments, including AIX and Linux. Currently expanding proficiency by learning JavaScript and C+
Microsoft Office
SQL
Linux
Employed
Ready to interview
Full-time / Interested in remote work
10-15 years
National United University
Information Manager
Avatar of the user.
Senior Engineer @資拓宏宇國際股份有限公司 International Integrated Systems, Inc.
2015 ~ present
Senior Backend Engineer
Within one month
python
Django
kafka
Employed
Ready to interview
Full-time / Interested in remote work
6-10 years
National Taiwan University of Science and Technology
Avatar of Justin Liu.
Manager @GOMAJI 夠麻吉
2017 ~ present
Project Lead / Tech Lead / Team Lead / Technical Manager
Within one month
services. Implemented CI/CD systems and designed internal software processes, communicating closely with executive leadership. (2) Achievement: Enhanced IT infrastructure flexibility and scalability, improved system reliability and operational efficiency, reduced costs. 4. Data Platform and Personalized Recommendation System: (1) Responsibility: Built AWS data platform including ETL, data warehousing, and lakes. Led development of a personalized recommendation system using AWS Personalize, custom algorithms, and generative AI (e.g., OpenAI, Gemini). (2) Achievement: Boosted customer conversion rates by 2% through detailed customer profiles and targeted insights. 5. Research and Training on
Team Lead
Management Team
Cloud Architecture
Employed
Ready to interview
Full-time / Interested in remote work
10-15 years
Shih Hsin University
Management Information Systems, General
Avatar of the user.
Technical Manager @SYSTEX 精誠資訊
2021 ~ present
Hybrid Architect and Backend Engineer
Within one month
Golang
Oracle Database
PostgreSQL
Employed
Ready to interview
Full-time / Interested in remote work
More than 15 years
臺灣國立空中大學
Law Program
Avatar of 朱建銘.
Software Engineer @銓鍇國際股份有限公司
2023 ~ present
Java development
Within one month
...management and analysis tools. Built an AWS SAM project for Lambda deployment. Set up Prometheus and Grafana monitoring for backend and MongoDB resources. Studied billing-domain preprocessing and Spark-related techniques for ETL processing. Experimented with Terraform deployments. Took part in refactoring planning and backend development, integrated with the frontend, and learned the business logic behind big-data processing, and...
Java EE
JavaScript / ES6 / jQuery
JBoss Application Server
Employed
Ready to interview
Full-time / Interested in remote work
4-6 years
東南科技大學
Department of Information Technology and Communication
Avatar of 林冠安.
Past
Data Analyst @趨勢科技 TrendMicro
2021 ~ 2024
Data Analyst、Data Engineer、Data Scientist、Customer Experience Analyst
Within one month
...tools: Tableau, Oracle OBIEE, SSRS. Skills - Programming languages: R, Python, T-SQL, PL/SQL, VBA. Databases: MS SQL, Oracle SQL, ADX. Reporting systems: Tableau, SSRS, OBIEE. Systems: OFSAA, MSCI RiskMetrics. Hands-on experience in financial domain knowledge, data analysis, and ETL processes, plus strong teamwork skills. Education: 輔仁大學 - Graduate Institute of Finance; 輔仁大學 - Department of Statistics and Information Science. Experience: 趨勢科技 2021/11 - 2024/1...
R
PL/SQL
Python
Unemployed
Ready to interview
Full-time / Interested in remote work
6-10 years
天主教輔仁大學 FU JEN CATHOLIC UNIVERSITY
Graduate Institute of Finance
Avatar of the user.
Senior PM @SetNet 技術顧問公司
2019 ~ present
Within one month
Lean Six Sigma
Microsoft Project
Minitab
Ready to interview
Full-time / Interested in remote work
10-15 years
Gabriela Mistral University
International Business
Avatar of Vel Tien-Yun Wu.
Data Engineer @Groundhog Technologies Inc.
2021 ~ 2024
Data Analyst、Data Engineer、Data Scientist、Customer Experience Analyst
Within one month
Vel Tien-Yun Wu I bring 5 years of hands-on experience in data engineering and software development, with a focus on building scalable data processing systems utilizing Hadoop, Spark, Kafka and Docker. My expertise in developing efficient ETL pipelines has been fundamental in optimizing data workflows for various data warehouses, enhancing data integrity and availability. My track record includes managing high-volume data pipelines, automating scheduling processes to improve operational efficiency, and deploying monitoring solutions that have reduced Mean-Time-To-Repair (MTTR) by 40%. I have a strong foundation in SQL, especially PostgreSQL, which enables
Git
Python
Scala
Employed
Ready to interview
Full-time / Interested in remote work
4-6 years
University of Illinois at Urbana-Champaign, School of Information Sciences
Information Management


Within two months
Sr. Data Engineer
17LIVE
2021 ~ 現在
Taipei, Taiwan
Professional Background
Current status
Employed
Job search status
Open to opportunities
Professions
Data Engineer, Python Developer, System Architecture
Fields of Employment
Information Services
Work experience
4-6 years
Management
None
Skills
Python
MySQL
Linode
API Development
Linux
RabbitMQ
Celery
Nginx
Flask(Python)
Django(Python)
Git
docker swarm
Docker
docker-compose
Data Mining
Machine Learning
Traefik
Redis
ELK(ElasticSearch)
ELK
Prometheus
Grafana
Airflow
dolphindb
SQL
FastAPI
GKE
K8S
Real-Time Systems
GCP
Languages
English
Intermediate
Job search preferences
Desired positions
Data Solution Architect, Sr. Data Engineer, Data Engineer Manager
Job type
Full-time
Desired work location
Taipei, Taiwan
Remote work
Interested in remote work
Freelance
Yes, I am an amateur freelancer.
Education
School
NDHU
Major
Statistics

linsam

Data Engineer, Backend Engineer

• 0972724528 • Taiwan • [email protected]

5-6 years of experience in data engineering and software engineering (distributed queue systems, databases, web crawling, RESTful APIs, ETL, Docker, CI/CD, GCP, K8S, Airflow, etc.).

1-2 years of experience in data science (data analysis, machine learning, and deep learning).

Work Experience


17 Live - Senior Data Engineer (IC5), May 2021 - present

• Refactored the ETL stack: built an Airflow project on Cloud Composer to migrate ETL tooling from digdag to Airflow and ETL development from shell scripts to Python.
• Maintained more than 100 BigQuery tables.
• Created pipelines from MySQL and MongoDB to BigQuery.
• Established a solid development culture, introducing CI/CD, a dev-stage-uat-master branching flow, release notes, unit tests, and test coverage.
• Unified scheduled jobs under Airflow (Cloud Functions schedulers, BigQuery scheduled queries, crontab, and ML models in R or Python, etc.), reducing the Data Team's cost by 25%.
• Built the Data Team's first real-time ETL system on GKE, Pub/Sub, and Memorystore for sending push notifications to users (see the sketch after this list).
• Built the Data Team's first API on GKE serving an ML model, with graceful shutdown, ApacheBench stress testing, and HPA auto-scaling; 95th-percentile latency under 200 ms at over 200 RPS.
• Built a Tagging System for tracking groups of users.
• Built a BigQuery Resource Monitor to track per-user BQ slot and query-count usage.
• Established a documentation culture on Confluence.
• Finalist for the Break the Norm award in 2021 Q3 and 2021 Q4.
• Assisted in interviewing more than 10 data engineer candidates.
• Mentored junior data engineers to become more effective individual contributors.
• Applied the data team's models in the company's app (automatically sending push notifications and in-app messages).
• Automatically updated the app's recommended-streamer list using the data team's models.
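
The real-time push-notification pipeline above can be pictured with a minimal sketch like the one below. It assumes a Pub/Sub subscription named user-events-sub, a Memorystore (Redis) instance used as a per-user throttle, and a hypothetical send_push() helper standing in for the app's notification service; the actual 17 Live implementation is not public, so treat this as an illustration of the GKE + Pub/Sub + Memorystore pattern rather than the production code.

```python
# Minimal sketch of a real-time push-notification consumer (GKE + Pub/Sub + Memorystore).
# Assumptions: subscription "user-events-sub", Redis on Memorystore, and a hypothetical
# send_push() helper; this illustrates the pattern, not the production system.
import json
import os

import redis
from google.cloud import pubsub_v1

PROJECT_ID = os.environ["GCP_PROJECT"]
SUBSCRIPTION = "user-events-sub"          # assumed name
COOLDOWN_SECONDS = 3600                   # at most one push per user per hour (assumed policy)

cache = redis.Redis(host=os.environ.get("REDIS_HOST", "localhost"), port=6379)


def send_push(user_id: str, message: str) -> None:
    """Hypothetical stand-in for the app's push-notification service."""
    print(f"push -> {user_id}: {message}")


def handle(message) -> None:
    event = json.loads(message.data)
    user_id = event["user_id"]
    # Memorystore acts as a throttle: skip users who were pushed recently.
    if cache.set(f"pushed:{user_id}", 1, nx=True, ex=COOLDOWN_SECONDS):
        send_push(user_id, event.get("text", "You have a new recommendation"))
    message.ack()


def main() -> None:
    subscriber = pubsub_v1.SubscriberClient()
    sub_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION)
    future = subscriber.subscribe(sub_path, callback=handle)
    try:
        future.result()                   # block the worker; GKE handles restarts/scaling
    except KeyboardInterrupt:
        future.cancel()                   # graceful shutdown on SIGINT


if __name__ == "__main__":
    main()
```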

SinoPac Holdings - Software Engineer (Python), Nov 2019 - May 2021

• Developed the Python API (shioaji) for stock/option/futures order placement and account management.

• Developed the C# API (shioaji) for stock/option/futures order placement and account management, and set up CI/CD with GitHub Actions.

• Deployed a simulated-trading test system with Docker Swarm.

• Collected distributed-system logs with ELK, Grafana, and Prometheus (about 13 GB of log data per day).

• Monitored the distributed system and sent alerts via chatbot.

• Developed APIs for trade-by-trade and odd-lot trading.

Open Up Summit Speaker (FinMind) - 2019-12-01

Tripresso - Data Engineer, Oct. 2018 - Nov. 2019 

• Analyzed travel data and built a machine learning model, estimated to increase orders (revenue) by 3%.

• Maintained and developed a distributed ETL queuing system running on 20 machines.

• Optimized the ETL system, reducing execution time by more than 50%.

• Developed new product crawlers, increasing product volume by 1.5%.

• Built analytical BI charts for other departments.

Mandatory Military Service, Oct 2017 - Oct 2018

NDHU - RA, Mar. 2016 - Aug. 2017

Analyzed G7 financial data: model validation and parameter estimation with regression models (SUR, MLE, bootstrapping), comparing single-equation estimators and confidence intervals against system-of-equations estimators.

NDHU - TA, Sep. 2015 - Jul. 2017

Calculus, Linear Algebra, Statistics.

Projects


FinMind Open Data API


Open-source financial data: more than 50 datasets, served through an API.

More than 2,000 registered users.

2,000 stars on GitHub.

Updated automatically every day via Docker Swarm and a RabbitMQ/Celery distributed queue system (10 cloud machines).

More than 1 billion records in total, with 10 million streaming records per day.

Architecture diagram.



Bosch Production Line Performance - Kaggle

Post-competition analysis, top 6% rank.

Highly imbalanced data (ratio about 1000:1), 10 GB dataset with roughly 50% missing values. More than 4,000 variables, but the models were built with only 50 features.


Rossmann Store Sales - Kaggle 

Post-competition analysis, top 10% rank.

Time-series problem: built models to predict sales 48 days ahead.


Grupo Bimbo Inventory Demand - Kaggle

Post-competition analysis, top 8% rank. 

Time-series problem with about eighty million records: built models to predict inventory demand two weeks ahead.


Instacart Market Basket Analysis - Kaggle

Real competition, top 25% rank. 

Predicted which products a consumer will purchase again.



Verification Code to Text

Created a Python package that converts the Taiwan Railway booking captcha (verification code) to text.

The model is a CNN built with Keras (a minimal sketch follows).
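
As a rough illustration of the kind of model described above, a Keras CNN for fixed-length captcha decoding can be sketched as below. The input size, character set, and one-softmax-head-per-character design are assumptions; the real package's architecture may differ.

```python
# Minimal sketch of a Keras CNN captcha classifier, in the spirit of the package above.
# Assumptions: 60x160 grayscale captcha images, a digits-only character set, and one
# softmax head per character position (the real package's architecture may differ).
from tensorflow import keras
from tensorflow.keras import layers

NUM_CLASSES = 10          # e.g. digits 0-9 (assumed character set)
NUM_CHARS = 4             # assumed number of characters in the code

inputs = keras.Input(shape=(60, 160, 1))
x = layers.Conv2D(32, 3, activation="relu")(inputs)
x = layers.MaxPooling2D()(x)
x = layers.Conv2D(64, 3, activation="relu")(x)
x = layers.MaxPooling2D()(x)
x = layers.Flatten()(x)
x = layers.Dense(128, activation="relu")(x)
# One softmax output per character position in the code.
outputs = [layers.Dense(NUM_CLASSES, activation="softmax", name=f"char_{i}")(x)
           for i in range(NUM_CHARS)]

model = keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```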

Skills


Distributed Queue System

1. RabbitMQ, Celery, and Flower (a minimal sketch follows this list).

2. 8-node (cloud) distributed queue system for web crawling.

3. Deployed with Docker and GKE.

4. Graceful shutdown.
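
A minimal sketch of the RabbitMQ + Celery setup listed above is shown below. The broker URL and the fetch_page task are placeholders for illustration; the real crawler defines its own tasks and settings.

```python
# Minimal sketch of a Celery worker backed by RabbitMQ, in the spirit of the
# crawling queue described above. The broker URL and the fetch_page task are
# assumptions for illustration; the real project defines its own tasks.
import requests
from celery import Celery

app = Celery(
    "crawler",
    broker="amqp://guest:guest@localhost:5672//",   # assumed local RabbitMQ
    backend="rpc://",
)


@app.task(bind=True, max_retries=3, acks_late=True)  # acks_late helps graceful shutdown
def fetch_page(self, url: str) -> int:
    """Download one page; retry with backoff on network errors."""
    try:
        resp = requests.get(url, timeout=10)
        resp.raise_for_status()
        return len(resp.text)
    except requests.RequestException as exc:
        raise self.retry(exc=exc, countdown=2 ** self.request.retries)


# Producer side (e.g. from a scheduler):
#   fetch_page.delay("https://example.com")
# Worker side, one per node, stopped with SIGTERM for a warm (graceful) shutdown:
#   celery -A <module> worker --concurrency=4
```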


Database

1. MySQL (RDBMS).

2. Redis (NoSQL).

3. DolphinDB (TSDB).


GCP

1. Pub/Sub.
2. GKE (K8S).
3. GCE.
4. BigQuery (BQ) (a small query example follows this list).
5. Composer.
6. Memorystore.
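
As a small example of the BigQuery usage implied above, a query through the official google-cloud-bigquery Python client looks roughly like this; the dataset and table names are placeholders, not real project tables.

```python
# Tiny BigQuery example using the official google-cloud-bigquery client.
# The dataset/table in the query are placeholders, not real project tables.
from google.cloud import bigquery

client = bigquery.Client()  # picks up project/credentials from the environment

query = """
    SELECT user_id, COUNT(*) AS events
    FROM `my_project.analytics.events`   -- placeholder table
    WHERE DATE(event_time) = CURRENT_DATE()
    GROUP BY user_id
    ORDER BY events DESC
    LIMIT 10
"""

for row in client.query(query).result():
    print(row.user_id, row.events)
```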

CI/CD

1. Created automated tests and automated deployment for the FinMind team.

2. Uses GitLab Runner.

3. CD pipeline that automatically publishes the Python package.

4. CD pipeline that automatically updates and deploys new service versions.


Log Collect & Monitor

1. Distributed-system log collection with ELK.

2. Prometheus and Grafana for monitoring user usage, request latency, and request count (a minimal instrumentation sketch follows this list).

3. Alerting via Telegram bot and Slack bot.

4. VM and container monitoring with Netdata and cAdvisor.
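
The Prometheus/Grafana item above can be illustrated with a minimal instrumentation sketch: a request counter and a latency histogram exposed on a /metrics endpoint for Prometheus to scrape. The metric names and port are illustrative only.

```python
# Minimal sketch of the kind of Prometheus instrumentation described above:
# a request counter and a latency histogram exposed on /metrics. The metric
# names and port are illustrative, not taken from the original services.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

REQUESTS = Counter("app_requests_total", "Total requests", ["endpoint"])
LATENCY = Histogram("app_request_latency_seconds", "Request latency", ["endpoint"])


def handle_request(endpoint: str) -> None:
    REQUESTS.labels(endpoint=endpoint).inc()
    with LATENCY.labels(endpoint=endpoint).time():
        time.sleep(random.uniform(0.01, 0.2))   # stand-in for real work


if __name__ == "__main__":
    start_http_server(8000)                     # Prometheus scrapes http://host:8000/metrics
    while True:
        handle_request("/api/data")
```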



Data Pipeline

1. Designed data pipelines for crawling, backend, and analysis with Airflow (a minimal DAG sketch follows this list).
2. Designed more than 200 ETL jobs in Airflow.
3. Built Airflow on Cloud Composer.
4. Built a real-time pipeline for sending push notifications to users.
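
A minimal Airflow DAG in the spirit of the pipelines described above (extract from a source database, transform, load to the warehouse) might look like the sketch below. Task bodies, names, and the schedule are placeholders; none of the 200+ production DAGs are reproduced here.

```python
# Minimal sketch of an Airflow ETL DAG of the kind described above (extract from
# a source DB, transform, load to the warehouse). Task names and the schedule
# are illustrative; the real DAGs are not reproduced here.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull rows from MySQL/MongoDB in the real pipelines.
    return [{"user_id": 1, "amount": 10}, {"user_id": 2, "amount": 5}]


def transform(ti, **context):
    rows = ti.xcom_pull(task_ids="extract")
    return [{**r, "amount_usd": r["amount"] * 0.032} for r in rows]


def load(ti, **context):
    rows = ti.xcom_pull(task_ids="transform")
    print(f"would load {len(rows)} rows into the warehouse")  # placeholder loader


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3
```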

Machine Learning

XGBoost, random forest, SVM; statistics: OLS, lasso.


Web Crawling

1. Python - requests, BeautifulSoup, lxml, Selenium (a minimal sketch follows this list).

2. Automatic captcha recognition with a CNN model.
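
A minimal example of the requests + BeautifulSoup crawling style listed above is shown below; the target URL and CSS selector are placeholders and depend entirely on the real site's markup.

```python
# Minimal sketch of the requests + BeautifulSoup crawling style listed above.
# The URL and CSS selector are placeholders for illustration only.
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/news"          # placeholder target


def crawl(url: str) -> list[str]:
    resp = requests.get(url, timeout=10, headers={"User-Agent": "demo-crawler/0.1"})
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "lxml")
    # Collect headline texts; the selector depends on the real site's markup.
    return [a.get_text(strip=True) for a in soup.select("a.headline")]


if __name__ == "__main__":
    for title in crawl(URL):
        print(title)
```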


Data Mining

Python - numpy, pandas, sklearn. 

R - parallel, dplyr, data.table, mice.


WEB

1. https://finmindtrade.com/

2. Nginx.

3. Frontend - Vue.

4. Backend - Python.

5. Traefik.


API

1. FastAPI (a minimal sketch follows this list).
2. WebSocket.
3. Load balancing.
4. Async.
5. Graceful shutdown.
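
The FastAPI items above (async handlers plus graceful shutdown) can be sketched as below, using FastAPI's lifespan hook so shared resources are closed cleanly after in-flight requests finish. The endpoints and the shared httpx client are illustrative, not the FinMind or 17 Live production code.

```python
# Minimal sketch of an async FastAPI service with a graceful-shutdown hook,
# matching the items above. The endpoints and the shared client are placeholders.
from contextlib import asynccontextmanager

import httpx
from fastapi import FastAPI


@asynccontextmanager
async def lifespan(app: FastAPI):
    # Startup: create shared resources (here, an async HTTP client).
    app.state.client = httpx.AsyncClient(timeout=10)
    yield
    # Shutdown: runs after in-flight requests finish, so connections close cleanly.
    await app.state.client.aclose()


app = FastAPI(lifespan=lifespan)


@app.get("/health")
async def health() -> dict:
    return {"status": "ok"}


@app.get("/proxy")
async def proxy(url: str) -> dict:
    # Placeholder async endpoint that fans out to another service.
    resp = await app.state.client.get(url)
    return {"status_code": resp.status_code, "length": len(resp.text)}

# Run with: uvicorn main:app --workers 2   (assuming this file is main.py)
```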

Stress Test 

1. ApacheBench.
2. The FinMind API handles up to 8,000 requests per minute.


Education

National Dong Hwa University, Master of Science, Sep. 2017.

Major: Mathematics and Statistics.

Tamkang University, Bachelor of Science, Sep. 2015.

Major: Mathematics.

Languages


Programming: R, Python. Basic English and proficient in Chinese.
