Company Description:
We are a consulting company with a bunch of technology-interested and happy people! We love technology, we love design, and we love quality. Our diversity makes us unique and creates an inclusive and welcoming workplace where each individual is highly valued. With us, each individual can be themselves and respects others for who they are, and we believe that when a fantastic mix of people gather and share their knowledge, experiences, and ideas, we can help our customers on a completely different level. We are looking for you who are an immediate joiner and want to grow with us! With us you have great opportunities to take real steps in your career and the opportunity to take on great responsibility.
Job description:
Requirement
- At least 9 years of work experience
- Take end-to-end responsibility for building, optimizing, and supporting existing and new data products towards the defined target vision
- Be a champion of the DevOps mindset and principles, and able to manage CI/CD pipelines, Terraform, and cloud infrastructure; in our context this is GCP (Google Cloud Platform)
- Evaluate and drive continuous improvement and reduce technical debt in the teams
- Design and implement efficient data models and data pipelines that support analytical requirements; good understanding of different data modelling techniques and their trade-offs
- Should have experience with data modelling
- Experience in data query languages (SQL or similar); knowledge of ETL processes and tools
- Experience in data-centric and API programming (for automation) using one or more programming languages: Python, Java, and/or Scala
- Knowledge of NoSQL and RDBMS databases
- Experience with different data formats (Avro, Parquet)
- Have a collaborative and co-creative mindset with excellent communication skills
- Motivated to work in an environment that allows you to work and make decisions independently
- Experience in working with data visualization tools
- Experience with GCP tools: Cloud Functions, Dataflow, Dataproc, and BigQuery
- Experience with data processing frameworks: Beam, Spark, Hive, Flink
- GCP data engineering certification is a plus
- Have hands-on experience with analytical tools such as Power BI or similar visualization tools
- Exhibit understanding of creating intermediate-level DAX measures to enhance data models and visualizations
- Have an understanding of Microsoft Excel functions such as Power Pivot and Power Query, as well as Tabular Editor, DAX, etc.
- Fluent in English both written and verbal
Required cloud certification: Yes
Start: Immediate
Location: Bangalore
Form of employment: Full-time until further notice; we apply a 6-month probationary period.
We interview candidates on an ongoing basis, so do not wait to submit your application.