Position: Senior GCP Data Engineer
Location: Wilmington, Delaware or New Jersey (Onsite)
Experience: 12 years
Skills:
Sqoop, Dataflow, Dataproc, Cloud Pub/Sub, Cloud Composer, PySpark, Python, GCS, BigQuery, DAGs
Certifications:
GCP Professional Data Engineer
Responsibilities:
- Source Data Analysis & Mapping: Conduct thorough analysis of source data systems, collaborate with stakeholders to define detailed source-to-target mappings, and translate business requirements into technical specifications for data ingestion and transformation
- Effort Estimation & Planning: Provide accurate effort and resource estimates for development tasks, supporting sprint planning and roadmap alignment within the agile framework
- Data Ingestion & Pipeline Development: Design, build, and maintain robust, scalable ETL/ELT pipelines that efficiently ingest data from the data lake into the BigQuery data warehouse and onward to business-specific data marts (a minimal DAG sketch follows this list)
- Transformation Implementation: Implement complex data transformation logic, ensuring data quality, accuracy, and timeliness to meet various analytical and operational business needs (see the PySpark sketch after this list)
- Unit Testing & Quality Assurance: Develop and execute comprehensive unit tests for all data pipelines and transformation logic to ensure functionality, accuracy, and performance prior to deployment, committing to high-quality, reliable solutions that meet business requirements (see the pytest sketch after this list)
- Collaboration & Communication: Work closely with the Data Architect to align on architectural guidelines, with the Scrum Master to support agile delivery processes, with the PM to meet project timelines, and with the BA to ensure requirements are fully captured and addressed
- Best Practices & Cost Optimization: Follow industry best practices for BigQuery schema design, data partitioning, clustering, and query optimization, while proactively managing cost-control measures such as efficient resource usage and storage lifecycle management (see the table-definition sketch after this list)
- Reliability & Scalability: Ensure that all pipelines and data workflows are resilient, fault-tolerant, and scalable to support growing data volumes and evolving business demands
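For illustration of the ingestion work described above, here is a minimal Cloud Composer DAG sketch that loads files from a GCS data lake prefix into a BigQuery staging table, assuming Airflow 2.4+; the project, bucket, dataset, and table names are hypothetical placeholders, not part of this posting:

```python
# Minimal sketch: daily GCS -> BigQuery ingestion DAG for Cloud Composer.
# All resource names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="lake_to_bq_ingest",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    load_orders = GCSToBigQueryOperator(
        task_id="load_orders_to_staging",
        bucket="example-data-lake",                    # hypothetical bucket
        source_objects=["orders/{{ ds }}/*.parquet"],  # one folder per run date
        destination_project_dataset_table="example-project.staging.orders",
        source_format="PARQUET",
        write_disposition="WRITE_TRUNCATE",            # idempotent daily reload
    )
```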
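The transformation bullet could look like the following PySpark job (e.g., submitted to Dataproc). This is a sketch under the assumption that the spark-bigquery connector is available on the cluster; the paths, columns, and table names are hypothetical:

```python
# Minimal PySpark transformation sketch: deduplicate and type-cast raw
# records, then write to BigQuery. All names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_transform").getOrCreate()

raw = spark.read.parquet("gs://example-data-lake/orders/")  # hypothetical path

cleaned = (
    raw.dropDuplicates(["order_id"])                     # enforce uniqueness
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .filter(F.col("order_id").isNotNull())            # basic quality gate
)

(cleaned.write.format("bigquery")                        # spark-bigquery connector
    .option("table", "example-project.curated.orders")
    .option("temporaryGcsBucket", "example-staging-bucket")
    .mode("overwrite")
    .save())
```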
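The unit-testing bullet could be satisfied with plain pytest; the sketch below tests a hypothetical pure transformation function, so it runs without any GCP access:

```python
# Minimal pytest sketch for transformation logic. normalize_amount is a
# hypothetical transform, kept pure so tests need no GCP credentials.
from decimal import Decimal

import pytest


def normalize_amount(raw: str) -> Decimal:
    """Parse a currency string such as '$1,234.50' into a Decimal."""
    return Decimal(raw.replace("$", "").replace(",", ""))


def test_normalize_amount_strips_symbols():
    assert normalize_amount("$1,234.50") == Decimal("1234.50")


def test_normalize_amount_rejects_garbage():
    with pytest.raises(ArithmeticError):  # decimal.InvalidOperation
        normalize_amount("not-a-number")
```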
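For the partitioning, clustering, and cost-control bullet, here is a sketch using the google-cloud-bigquery client; the schema and names are hypothetical, and the partition expiration illustrates one storage-lifecycle lever:

```python
# Minimal sketch: create a date-partitioned, clustered BigQuery table with
# a partition expiration. All project/dataset/table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

table = bigquery.Table(
    "example-project.marts.orders",
    schema=[
        bigquery.SchemaField("order_id", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("order_date", "DATE", mode="REQUIRED"),
        bigquery.SchemaField("amount", "NUMERIC"),
    ],
)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="order_date",
    expiration_ms=365 * 24 * 60 * 60 * 1000,  # drop partitions after ~1 year
)
table.clustering_fields = ["customer_id"]      # prune scans on common filters

client.create_table(table, exists_ok=True)
```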
Note: Momento USA is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.