Data Engineer

Employer Active

1 Vacancy
Job Location

Bangalore - India

Monthly Salary

Not Disclosed

Job Description

Data Engineer (2-3 Years Experience)
We are looking for an experienced Data Engineer with 2-3 years of experience to join our growing team. This role involves working with cutting-edge tools in the Google Cloud Platform (GCP) ecosystem, focusing on BigQuery, Dataflow, Dataform, Cloud Storage, Pub/Sub, and Airflow. You will play a key role in building and maintaining scalable data pipelines, ensuring data integrity, and supporting the analytics team to drive data-driven decisions.
Key Responsibilities:
  • Design, develop, and maintain data pipelines using GCP BigQuery, Dataflow, Dataform, and Airflow to ensure seamless data integration and transformation.
  • Collaborate with cross-functional teams to understand data needs and deliver effective data solutions and integrations.
  • Implement best practices for data quality, monitoring, and optimization in cloud-based environments.
  • Work with GCP Cloud Storage and Pub/Sub to manage and process large datasets and enable real-time data streaming.
  • Write efficient Python scripts to automate data workflows and integrate with other tools within the GCP environment.
  • Leverage DBT (Data Build Tool) for data transformation, testing, and documentation.
  • Monitor and troubleshoot data pipeline issues, ensuring smooth and continuous operations.
  • Participate in code reviews, testing, and deployment to production environments.
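The day-to-day work described above follows a standard extract-transform-load pattern. As a rough illustration only (in the actual role this logic would live in an Airflow task and use the GCP client libraries, not the stand-in data structures used here):

```python
# Minimal ETL sketch in pure Python. All names and data are hypothetical;
# real pipelines would read from Cloud Storage / Pub/Sub and load to BigQuery.

def extract(rows):
    """Simulate reading raw records, dropping empty ones."""
    return [r for r in rows if r]

def transform(records):
    """Normalize raw fields before loading (trim, lowercase, cast types)."""
    return [
        {"user": r["user"].strip().lower(), "amount": float(r["amount"])}
        for r in records
    ]

def load(records, sink):
    """Append transformed rows to a destination table (a list stands in here)."""
    sink.extend(records)
    return len(records)

sink = []
raw = [
    {"user": " Alice ", "amount": "10"},
    {},  # malformed/empty record, filtered out in extract
    {"user": "BOB", "amount": "2.5"},
]
loaded = load(transform(extract(raw)), sink)
print(loaded)  # prints 2
```

In production each stage would typically be a separate, independently retryable Airflow task, with DBT handling the in-warehouse transformation and testing steps.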
Required Skills:
  • 2-3 years of experience as a Data Engineer, preferably with hands-on experience in Google Cloud Platform (GCP).
  • Strong expertise in GCP BigQuery, Dataflow, Dataform, and Airflow for building data pipelines and workflows.
  • Proficiency in Python for scripting and automation.
  • Experience with GCP Cloud Storage and Pub/Sub for managing and processing data.
  • Familiarity with DBT for data transformation and version control.
  • Understanding of ETL and data warehousing concepts, as well as data modeling and schema design.
  • Ability to troubleshoot, optimize, and ensure high performance of data processes.
  • Good communication skills and the ability to work collaboratively within a team.
Preferred Skills:
  • Experience with data analytics tools like Looker Studio or Tableau.
  • Knowledge of SQL and NoSQL databases and data visualization concepts.
  • Familiarity with CI/CD pipelines and version control using Git.
Education:
  • Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field.


Employment Type

Full Time
