Job Title: Data Engineer SDE III
Company Name: JSW One
Job Type: Technical
No Of Openings: 1
CTC: 41-55 LPA
Location: Mumbai / Bengaluru
Experience: 9-12 Years
About Company:
JSW One Platforms is a venture founded by the world-renowned JSW Group. JSW One is our integrated technology platform that seeks to transform India through increased transparency, trust, and ease of business. The JSW One Platforms are home to JSW One MSME and JSW One Homes. JSW One MSME is a one-stop, multi-product digital marketplace for MSMEs planning on taking their business to the next level. We connect manufacturers to resources by leveraging JSW One's collective access and expertise in the field. JSW One MSME is a consistent, flexible, and trusted steel solution partner supporting the raw material needs of MSMEs of all sizes. We help MSMEs streamline their steel supply and demand, thereby delivering a useful digital experience for steel buyers.
Job Description
Role & Responsibilities
- Architect, plan, and develop our data strategy and work with our team to turn it into working software.
- Co-create the detailed application architecture strategy with the teams, aligning your strategy with the teams' deliverables.
- Take an active role in collaborating to develop strategic direction, systems roadmap, and business and operational processes by providing the required technical guidance.
- Work with business stakeholders, product managers, and architects to understand the business data and related processes.
- Own the communication and documentation of the strategy, syndicating with the CTO, business, and the development teams.
- Get hands-on in the code, building prototypes and upholding best design and engineering practices, demonstrating the patterns you would like realized.
- Apply extensive knowledge of data integration architecture/ETL/data warehousing, especially for large volumes with complex integrations.
- Help the teams slice their deliveries to allow for agile and incremental delivery of business value
- Consult with development teams on building applications in a cloud-native way.
- Evaluate tools / services and new technologies and suggest the right service and technology to be used.
- Speed as a habit: able to operate in a fast-moving environment, make quick decisions, and execute fiercely to deliver outcomes.
- Using Agile and DevOps methods, build platform architecture using a variety of sources (such as cloud IaaS/SaaS).
- Integrate data from a variety of business subject areas: leads management, Customer Portal, Seller App, SAP, Salesforce, etc.
- Implement rules and automate data cleansing, mapping, transformation, logging, and exception handling.
- Design, build, and deploy databases and data stores.
- Participate in cross-functional teams to promote technology strategies, analyze and test products, perform proofs-of-concept, and pilot new technologies and/or methods.
- Establish and document standards, guidelines, and best practices for teams utilizing the solutions.
- Review vendor solution designs to ensure technology appropriateness, standards compliance, and platform capacity alignment.
Ideal Candidate
- At least 8 years of overall experience, with over 3 years in architecting and working on big data applications and technologies.
- BS/MS in computer science or equivalent work experience.
- Strong object-oriented programming concepts.
- Proficient in server-side (Java/Linux) technologies.
- Expertise in GCP (Google Cloud Platform) and ability to operate in a DevOps model.
- Expertise in architecting or developing features for enterprise-scale systems will be an added advantage.
- Passion for being the technology ambassador and coaching junior engineers toward engineering excellence.
- Strong understanding of software design/architecture/data-backed decision-making processes.
- Hands-on, deep knowledge of Java and Spring Boot.
Special Remarks from Company:
Mandatory
- Strong Data Engineering profiles
- Mandatory (Experience 1): Must have 8 YOE as a Data Engineer with ETL, data architecture, and integration.
- Mandatory (Experience 2): Must have 5 YOE in software development with Java / Python.
- Mandatory (Core Skill 1): Must have experience in data / big data engineering technologies, any one of e.g. Hive, Hadoop, HBase, HDFS, Airflow, Spark, Kafka, or Google Cloud data engineering technologies (Google Cloud BigQuery, Dataflow, Datalab, Dataplex, Dataproc, etc.)
- Mandatory (Core Skill 2): Must have experience in any database: MySQL / PostgreSQL / Oracle / SQL Server / DB2 / SQL
- Mandatory (Core Skill 3): Must have experience with data pipelines
- Mandatory (Company): Product companies
- Mandatory (Education): B.E / B.Tech / M.Tech
- CV Attachment is mandatory
- What is your preferred location: Mumbai / Bengaluru?
- Please provide CTC break-up (Fixed, Variable, ESOPs)
*** Max Notice Period 30 Days
*** Outstation Candidates allowed