Senior Software Engineer
Let's be unstoppable together!
At Circana, we are fueled by our passion for continuous learning and growth, we seek and share feedback freely, and we celebrate victories both big and small in an environment that is flexible and accommodating to our work and personal lives. We have a global commitment to diversity, equity, and inclusion, because we believe in the undeniable strength that diversity brings to our business, employees, clients, and communities (with us, you can always bring your full self to work). Join our inclusive, committed team to be a challenger, own outcomes, and stay curious together. Learn more at .
We are seeking a skilled and motivated Data Engineer to join our growing team. In this role, you will be responsible for designing, building, and maintaining robust data pipelines and infrastructure on on-premises and cloud platforms. You will leverage your expertise in PySpark, Spark, Python, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability. If you have a passion for data engineering and a desire to make a significant impact, we encourage you to apply!
Key Responsibilities:
Data Engineering & Data Pipeline Development:
- Design, develop, and optimize scalable data workflows using Python, PySpark, and Airflow.
- Implement real-time and batch data processing using Spark.
- Enforce best practices for data quality, governance, and security throughout the data lifecycle.
- Ensure data availability, reliability, and performance through monitoring and automation.
Cloud Data Engineering:
- Manage cloud infrastructure and cost optimization for data processing workloads.
- Implement CI/CD pipelines for data workflows to ensure smooth and reliable deployments.
Big Data & Analytics:
- Build and optimize large-scale data processing pipelines using Apache Spark and PySpark.
- Implement data partitioning, caching, and performance tuning for Spark-based workloads (a brief illustrative sketch follows this list).
- Work with diverse data formats (structured and unstructured) to support advanced analytics and machine learning initiatives.
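The sketch below is purely illustrative of the partitioning and caching work described above, not part of the role description. It assumes PySpark 3.x; the dataset path and the column names (country, event_date) are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("partitioning-and-caching-example")
    .config("spark.sql.shuffle.partitions", "200")  # tune shuffle parallelism for the cluster
    .getOrCreate()
)

# Read a hypothetical Parquet dataset of events.
events = spark.read.parquet("/data/events")

# Repartition by an aggregation key to balance work across tasks,
# and cache because the result feeds two separate actions below.
events_by_country = events.repartition("country").cache()

daily_counts = (
    events_by_country
    .groupBy("country", "event_date")
    .agg(F.count("*").alias("events"))
)
daily_counts.show(5)

# Write output partitioned by date so downstream jobs can prune partitions.
(daily_counts.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("/data/daily_counts"))

events_by_country.unpersist()  # free cached blocks once finished
spark.stop()
```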
Workflow Orchestration (Airflow):
- Design and maintain DAGs (Directed Acyclic Graphs) in Airflow to automate complex data workflows (see the sketch after this list).
- Monitor, troubleshoot, and optimize jobs and dependencies.
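As an illustration only, here is a minimal Airflow DAG of the kind referenced above. It assumes Airflow 2.4 or later; the DAG id, task names, and extract/transform logic are hypothetical placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Hypothetical extract step: pull raw records from a source system.
    return [{"id": 1, "amount": 10.0}]


def transform(**context):
    # Hypothetical transform step: read the upstream result via XCom and reshape it.
    rows = context["ti"].xcom_pull(task_ids="extract")
    return [{**row, "amount_usd": row["amount"]} for row in rows]


with DAG(
    dag_id="example_daily_pipeline",   # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ scheduling argument
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    # Declare the dependency: transform runs only after extract succeeds.
    extract_task >> transform_task
```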
Required Skills & Experience:
- 4 years of experience in data engineering, with expertise in PySpark and Spark.
- Strong programming skills in Python and SQL, with the ability to write efficient and maintainable code.
- Deep understanding of Spark internals (RDDs, DataFrames, DAGs, partitioning, etc.).
- Experience with Airflow DAGs, scheduling, and dependency management.
- Knowledge of Git, Docker, and Kubernetes, and the ability to apply DevOps best practices to CI/CD workflows.
- Experience with cloud platforms such as Azure or AWS is an advantage.
- Excellent problem-solving skills and the ability to optimize large-scale data processing.
- Experience in Agile/Scrum environments.
- Strong communication and collaboration skills, with the ability to work effectively with remote teams.
Bonus Points:
- Experience with data modeling and data warehousing concepts.
- Familiarity with data visualization tools and techniques.
- Knowledge of machine learning algorithms and frameworks.
Circana Behaviors
As well as the technical skills, experience, and attributes required for the role, our shared behaviors sit at the core of our organization. Therefore, we always look for people who can continuously champion these behaviors throughout the business within their day-to-day role:
- Stay Curious: Being hungry to learn and grow, always asking the big questions
- Seek Clarity: Embracing complexity to create clarity and inspire action
- Own the Outcome: Being accountable for decisions and taking ownership of our choices
- Center on the Client: Relentlessly adding value for our customers
- Be a Challenger: Never complacent, always striving for continuous improvement
- Champion Inclusivity: Fostering trust in relationships, engaging with empathy, respect, and integrity
- Commit to each other: Contributing to making Circana a great place to work for everyone
Location
This position can be located in the following area(s): Bangalore
Prospective candidates may be asked to consent to background checks (in accordance with local legislation and our candidate privacy notice). Your current employer will not be contacted without your permission.
Required Experience:
Senior IC