Job Summary
NetApp is developing a portfolio of data-centric storage platforms and services to help the world's leading organizations harness their data in new and interesting ways. In support of that mission, we are rapidly growing our Keystone team. The Keystone Service team is responsible for the cutting-edge technologies that enable NetApp's pay-as-you-go offering. Keystone Service helps customers manage data on premises or in the cloud and be invoiced on a subscription basis.
As a Senior Data Engineer in NetApp's Keystone organization, you will lead and execute our most challenging and complex projects. You will be responsible for decomposing complex problems into simple solutions, understanding system interdependencies and limitations, and applying engineering best practices.
Job Requirements
- 8 years of experience in Data Engineering and Software Engineering.
- Well versed in data modeling and data migration.
- Expert in data engineering with knowledge of Oracle and PostgreSQL.
- Knowledge of ETL tools and version control tools such as Git, GitHub, and GitLab.
- Work experience building real-time data lakes; skilled in data modeling and big data technologies such as Hadoop, Spark, Kafka, Airflow, BigQuery, and AWS.
- Coding/scripting in Python is recommended.
- Cloud technologies: Amazon Web Services (AWS), Google Cloud Platform (GCP).
- Familiarity with bug-tracking tools such as Jira.
- Build and maintain data lakes; proficiency in ETL jobs.
- Collaborate with external and internal teams to troubleshoot performance and functional issues.
- Work with DevOps team to integrate new code into existing continuous integration (CI) and continuous delivery/deployment (CD) pipelines.
- 5 years of experience designing and implementing highly scalable cloud architectures, including data lakes, data warehouses, Spark clusters, NoSQL databases, streaming data analytics, and cloud automation.
- Ability to communicate and resolve design issues and escalations.
Education
- Bachelor of Science degree in Engineering (Computer Science preferred) and/or relevant experience required.
- 8 years of experience in commercial software development.
Required Experience:
Senior IC