Itron is innovating new ways for utilities and cities to manage energy and water. We create a more resourceful world to protect essential resources for today and tomorrow. Join us.
Opening Paragraph:
We are seeking a highly skilled Senior Software Engineer with 5-8 years of experience in designing and building scalable data engineering solutions using modern big data and cloud technologies. The ideal candidate will have strong expertise in PySpark, Databricks, Delta Lake, SQL, and streaming platforms such as Kafka. You will be responsible for developing reliable data pipelines, enabling governed data products, and collaborating with cross-functional teams to drive high-impact data initiatives. This role requires technical ownership, strong problem-solving skills, and the ability to mentor other engineers.
Duties & Responsibilities:
- Design and develop large-scale data processing pipelines using PySpark and Spark SQL on Databricks.
- Build and maintain streaming and batch ingestion pipelines using cloud-native services and Kafka.
- Implement and manage Delta Lake tables with proper governance and lifecycle management.
- Work with data governance frameworks and implement secure, compliant data access patterns.
- Ensure high reliability through testing, monitoring, validation, and proper operational practices.
- Collaborate with analytics, platform, and data science teams to deliver curated datasets.
- Apply engineering best practices for code quality, version control, documentation, and automation.
- Troubleshoot production issues and contribute to improving system robustness.
- Participate in design discussions, provide technical guidance, and mentor junior engineers.
Required Skills & Experience:
- 5-8 years of experience as a Software Engineer or Data Engineer.
- Strong hands-on experience with PySpark and distributed computing frameworks.
- Proficiency with Databricks, including working with clusters, notebooks, and workflows.
- Advanced SQL skills for analytical transformations and data modeling.
- Practical experience with Apache Kafka and building data pipelines on streaming platforms.
- Solid understanding of Delta Lake and production-grade data lakehouse implementations.
- Strong programming skills in Python including testing and clean coding practices.
- Experience with cloud platforms (Azure, AWS, or GCP).
- Familiarity with data governance, data quality concepts, and secure access patterns.
- Experience working with Git and CI/CD processes.
Preferred Skills & Experience:
- Experience working with Databricks Unity Catalog for centralized governance, access control, lineage, and auditing.
- Experience with orchestration tools such as Airflow, Databricks Workflows, Prefect, or ADF.
- Exposure to data contracts, schema management, or metadata governance frameworks.
- Experience with cloud monitoring and observability tools.
- Familiarity with Infrastructure-as-Code (e.g., Terraform) and containerization tools.
- Understanding of secure data handling, RBAC, and compliance practices.
- Experience collaborating with ML teams or enabling ML pipelines (optional).
Itron is committed to building an inclusive and diverse workforce and providing an authentic workplace experience for all employees. If you are excited about this role but your past experiences don't perfectly align with every requirement, we encourage you to apply anyway; in the end, you may be just who we are looking for!
The successful candidate's starting wage will be determined based on permissible, non-discriminatory factors such as skills and experience.
Itron is proud to be an Equal Opportunity Employer. If you require an accommodation to apply, please contact a recruiting representative at 1- or email .
Itron is transforming how the world manages energy, water, and city services. Our trusted, intelligent infrastructure solutions help utilities and cities improve efficiency, build resilience, and deliver safe, reliable, and affordable service. With edge intelligence, we connect people, data insights, and devices so communities can better manage the essential resources they rely on to live. Join us as we create a more resourceful world.
Required Experience:
Senior IC