- Develop & Maintain Data Pipelines: Design, build, and orchestrate scalable, automated ETL/ELT pipelines using Google Cloud Dataflow, Cloud Functions, and Cloud Composer to process data from diverse sources (a minimal orchestration sketch follows this list).
- Manage Data Warehouse: Architect and manage our enterprise data warehouse in BigQuery, owning data modeling, schema design, and query optimization to ensure high performance and data quality (see the partitioned-table sketch after this list).
- Deliver Business Intelligence: Collaborate directly with business stakeholders to understand their needs, translate business requirements into technical specifications, and build and maintain interactive dashboards and reports in Looker Studio.
- Ensure Data Integrity: Implement data validation, quality checks, and monitoring frameworks to ensure the accuracy and reliability of the data delivered to business users (see the data-quality check sketch after this list).
- System Integration: Ingest data from various internal and external sources into Cloud Storage, ensuring seamless integration with the data lake ecosystem (see the ingestion sketch after this list).
- Collaboration & Support: Work closely with data analysts, data scientists, and other members of the data team to support their data needs and promote a data-driven culture across the organization.
- Perform other related duties as assigned.
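
The following sketches illustrate the kind of work described above; they are not project code. First, a minimal Cloud Composer (Apache Airflow) orchestration sketch for the pipeline responsibilities. The DAG id, schedule, and the `raw.sales_raw` / `analytics.sales_clean` tables are illustrative assumptions, and the example presumes the `apache-airflow-providers-google` package available in Composer environments.

```python
# Sketch of a daily ELT DAG for Cloud Composer (Apache Airflow).
# DAG id, schedule, and dataset/table names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_sales_elt",
    schedule_interval="0 2 * * *",  # run once a day at 02:00
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # ELT pattern: push the transformation down into BigQuery.
    transform_sales = BigQueryInsertJobOperator(
        task_id="transform_sales",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE analytics.sales_clean AS "
                    "SELECT order_id, customer_id, amount, order_date "
                    "FROM raw.sales_raw "
                    "WHERE amount IS NOT NULL"
                ),
                "useLegacySql": False,
            }
        },
    )
```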
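
For the warehouse modeling and query-optimization duties, a partitioned and clustered BigQuery table is one common way to keep scans cheap. The project, dataset, and column names below are assumptions for illustration; the sketch uses the official `google-cloud-bigquery` Python client.

```python
# Sketch: create a date-partitioned, clustered fact table in BigQuery.
# Project, dataset, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

table = bigquery.Table(
    "my-analytics-project.analytics.fact_orders",
    schema=[
        bigquery.SchemaField("order_id", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("customer_id", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("amount", "NUMERIC"),
        bigquery.SchemaField("order_date", "DATE", mode="REQUIRED"),
    ],
)
# Partition by order_date and cluster by customer_id so that
# date-bounded, per-customer queries scan less data.
table.time_partitioning = bigquery.TimePartitioning(field="order_date")
table.clustering_fields = ["customer_id"]

client.create_table(table, exists_ok=True)
```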
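
For the data-integrity duties, a minimal validation sketch: a scheduled task runs lightweight assertions against BigQuery and fails loudly when they break. The table name, check queries, and thresholds are assumptions.

```python
# Sketch: simple null-rate and freshness checks against a BigQuery table.
# Table name and thresholds are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

CHECKS = {
    "no_null_order_ids": """
        SELECT COUNT(*) AS bad_rows
        FROM `my-analytics-project.analytics.fact_orders`
        WHERE order_id IS NULL
    """,
    "loaded_recently": """
        SELECT IF(MAX(order_date) >= CURRENT_DATE() - 1, 0, 1) AS bad_rows
        FROM `my-analytics-project.analytics.fact_orders`
    """,
}

def run_checks() -> None:
    failures = []
    for name, sql in CHECKS.items():
        bad_rows = next(iter(client.query(sql).result())).bad_rows
        if bad_rows:
            failures.append(name)
    if failures:
        # In practice this would alert a monitoring channel instead.
        raise RuntimeError(f"Data quality checks failed: {failures}")

if __name__ == "__main__":
    run_checks()
```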
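
For the ingestion and integration duties, a common pattern is landing source files in Cloud Storage and then loading them into the warehouse. The bucket, object path, and table names are illustrative assumptions.

```python
# Sketch: land a source file in Cloud Storage, then load it into BigQuery.
# Bucket, object path, and table names are hypothetical.
from google.cloud import bigquery, storage

# 1) Land the raw export in the data lake bucket.
storage_client = storage.Client()
bucket = storage_client.bucket("company-data-lake")
bucket.blob("raw/sales/2024-01-01.csv").upload_from_filename(
    "/tmp/sales_export.csv"
)

# 2) Load the landed file into a raw BigQuery table.
bq_client = bigquery.Client()
load_job = bq_client.load_table_from_uri(
    "gs://company-data-lake/raw/sales/2024-01-01.csv",
    "my-analytics-project.raw.sales_raw",
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
    ),
)
load_job.result()  # wait for the load job to finish
```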