This position is currently not accepting applications.
Job updated 9 months ago

Job Description

Responsibilities Include:

  • Collect, define and document engineering data requirements.
  • Design and develop the data pipeline to integrate the engineering data into the data lake
  • Design analytics database schema
  • Automate and monitor ETL/ELT jobs for analytics database
  • Design data model to integrate with existing business data
  • Work with the existing team to integrate, adapt, or identify new tools to efficiently collect, clean, prepare, and store data for analysis
  • Design and implement data quality checking steps to ensure high data quality for dashboard and ML/AI models
  • Provide technical and backend configuration support to the engineering applications

Job Requirements

Basic Qualifications:

  • Bachelor’s degree in Computer Science, Mathematics, Engineering, or a related field
  • Experience writing shell scripts, scheduling cron jobs, and working in Linux environments
  • Experience using Airflow or similar data pipeline tools
  • Experience using Gitlab / GitHub or similar version control tools
  • 4+ years’ experience with an object-oriented programming language such as Python or Java
  • 4+ years’ experience building processes supporting data pipelines, data cleansing/transformation, and data quality monitoring
  • 4+ years’ experience in DB schema design, data pipeline design, and database management
  • 4+ years’ experience optimizing data pipelines, architectures, and data sets
  • Fluency in structured and unstructured data and their management through modern data transformation methodologies
  • Experience with engineering data management tools such as Cadence, Pulse, Windchill, ELOIS, or Creo
  • Strong analytical, problem solving, verbal and written communication skills
  • Not afraid of conflict, and able to build consensus through direct interaction and compromise
  • Ability to work effectively cross-culturally and across multiple time zones
  • Ability to work with cross-functional teams and stakeholders

Preferred Qualifications:

  • 4+ years’ experience designing and managing data in modern ETL architectures such as Airflow, Spark, Kafka, Hadoop, or Snowflake
  • Experience working with engineering data or similar data
  • Experience creating APIs for MS SQL databases
  • Experience with Microsoft Power Platform
  • Experience designing and developing dashboards
  • A successful history of manipulating, processing and extracting value from large datasets
  • Brings established relationships across Lenovo ISG to the role
  • English language proficiency is preferred, and Mandarin capability is an advantage

Interview Process

The interview will be conducted via Teams.

A minimum of 2 and a maximum of 3 interview rounds will be conducted by the team, which is based in the US.

5+ years of work experience required
1,300,000+ TWD / year
Partial remote work

About Us

Lenovo is a US$60 billion global technology company, ranked on the Fortune Global 500 and operating in 180 markets. Focused on our vision of “Smarter Technology for All”, we deliver smart devices and infrastructure to millions of customers every day, and help them build smart solutions, services, and software toward a more inclusive, trustworthy, and sustainable digital future.

Our values guide how we act. Lenovo’s values include: serving customers, integrity and win-win, entrepreneurship, and innovation.


Team

Talent Acquisition Team
Talent Acquisition Specialist
Talent Acquisition Specialist
Talent Acquisition Coordinator
Talent Acquisition Partner

Other Openings

Intern
Internship
2
30,000 ~ 40,000 TWD / month