Feb 2019 - Present
Bengaluru, Karnataka, India
Leading the Metrics team to ensure the sanity of energy consumption data.
Hands-on experience with Snowflake functions and utilities, including stages and file upload features, Time Travel, and Fail-safe.
Designing, developing, and maintaining data pipelines and ETL processes using Snowflake and SQL.
Optimizing data models and schema design in Snowflake for efficient query performance.
Expertise in engineering platform components such as data pipelines, data orchestration, data quality, data governance, and analytics.
Ensuring data quality, consistency, and integrity by implementing data validation and cleansing processes.
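A minimal sketch of the kind of validation and cleansing step described above, assuming a hypothetical record shape for energy-consumption readings (the actual checks ran against Snowflake tables):

```python
from datetime import datetime

# Hypothetical record shape for an energy-consumption reading.
REQUIRED_FIELDS = {"meter_id", "timestamp", "kwh"}

def validate(record: dict) -> list:
    """Return a list of validation errors for one reading (empty = valid)."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append("missing fields: %s" % sorted(missing))
        return errors
    if not isinstance(record["kwh"], (int, float)) or record["kwh"] < 0:
        errors.append("kwh must be a non-negative number")
    try:
        datetime.fromisoformat(record["timestamp"])
    except (TypeError, ValueError):
        errors.append("timestamp is not ISO-8601")
    return errors

def cleanse(records: list) -> tuple:
    """Split records into (valid, rejected) for downstream loading."""
    valid, rejected = [], []
    for r in records:
        (rejected if validate(r) else valid).append(r)
    return valid, rejected
```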
Monitoring and troubleshooting data integration and processing issues to keep data pipelines robust and scalable.
Creating Explores and dashboards in Looker and analyzing them for data sanity.
Optimizing Looker performance by tuning LookML or the underlying Redshift queries.
Handling ETL in Xplenty when required.
Writing LookML code for metrics requirements.
Writing SQL to extract data and run quick ad-hoc analyses.
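An illustrative quick-analysis query of the kind described above. For a self-contained sketch, sqlite3 stands in for Snowflake/Redshift, and the table and column names are hypothetical:

```python
import sqlite3

# In-memory stand-in for the warehouse; real queries ran on Redshift/Snowflake.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE consumption (meter_id TEXT, day TEXT, kwh REAL);
    INSERT INTO consumption VALUES
        ('m1', '2023-01-01', 1.2),
        ('m1', '2023-01-02', 1.8),
        ('m2', '2023-01-01', 0.9);
""")

# Quick sanity check: total usage and reading count per meter.
rows = conn.execute("""
    SELECT meter_id, ROUND(SUM(kwh), 2) AS total_kwh, COUNT(*) AS days
    FROM consumption
    GROUP BY meter_id
    ORDER BY meter_id
""").fetchall()
for meter_id, total_kwh, days in rows:
    print(meter_id, total_kwh, days)
```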
Worked on projects such as hybrid watchdog, the disagg dashboard, and email/paper subscribe-unsubscribe flows.
Developed the Analytics Workbench application with Looker integrated as the backend.
Created incremental PDTs in Looker to avoid full rebuilds of PDTs over large datasets.
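In LookML this is configured with the derived table's `increment_key`; the underlying refresh idea can be sketched in Python, assuming hypothetical row shapes and a date watermark:

```python
# Incremental-refresh sketch: instead of rebuilding the whole derived
# table, transform and append only rows newer than the last watermark.

def incremental_refresh(source_rows, target_rows, key="day"):
    """Append only source rows newer than the target's current watermark."""
    watermark = max((r[key] for r in target_rows), default="")
    new_rows = [r for r in source_rows if r[key] > watermark]
    return target_rows + new_rows

source = [{"day": "2023-01-01", "kwh": 1.2},
          {"day": "2023-01-02", "kwh": 1.8},
          {"day": "2023-01-03", "kwh": 0.7}]
target = [{"day": "2023-01-01", "kwh": 1.2},
          {"day": "2023-01-02", "kwh": 1.8}]

target = incremental_refresh(source, target)  # only the 2023-01-03 row is added
```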
Designing data streaming into AWS Redshift.
Creating separate Looker models for different database connections, adding data security with access filters, administering the Looker instance, and optimizing Looker views to reduce PDT build times.