Incedo is a US-based consulting, data science, and technology services firm with over 3,000 people helping clients from our six offices across the US, Mexico, and India. We help our clients achieve competitive advantage through end-to-end digital transformation. Our uniqueness lies in bringing together strong engineering, data science, and design capabilities coupled with deep domain understanding. We combine services and products to maximize business impact for our clients in the telecom, banking, wealth management, product engineering, and life sciences & healthcare industries.
Working at Incedo will provide you an opportunity to work with industry-leading client organizations, deep technology and domain experts, and global teams. Incedo University, our learning platform, provides ample learning opportunities, starting with a structured onboarding program and continuing throughout the various stages of your career. A variety of fun activities is also an integral part of our friendly work environment. Our flexible career paths allow you to grow into a program manager, a technical architect, or a domain expert based on your skills and interests.
Our mission is to enable our clients to maximize business impact from technology. In this role, you will:
Provide leadership for the overall architecture, design, development, and deployment of a full-stack, cloud-native data analytics platform.
Design and augment solution architecture for data ingestion, data preparation, data transformation, data load, ML & simulation modelling, Java back end & front end, state machine, API management, and intelligence consumption using data products on the cloud.
Understand business requirements and help develop high-level and low-level data engineering and data processing documentation for the cloud-native architecture.
Develop conceptual, logical, and physical target-state architecture, engineering, and operational specs.
Work with the customer, users, technical architects, and application designers to define the solution requirements and structure for the platform.
Model and design the application data structure, storage, and integration.
Lead the database analysis, design, and build effort.
Work with the application architects and designers to design the integration solution.
Ensure that the database designs fulfill the requirements, including data volume, frequency needs, and long-term data growth.
Able to perform data engineering tasks using Spark.
Knowledge of developing efficient frameworks for development and testing using Sqoop/NiFi/Kafka/Spark Streaming/WebHDFS/Python to enable seamless data ingestion onto the Hadoop/BigQuery platforms (a minimal PySpark ingestion sketch appears after this list).
Enabling data governance and data discovery.
Exposure to job monitoring frameworks along with validation automation.
Exposure to handling structured, unstructured, and streaming data.
Experience building data platforms on the cloud (data lake, data warehouse environments, Databricks).
Strong technical understanding of data modeling, design, and architecture principles and techniques across master data, transaction data, and derived/analytic data.
Proven background in designing and implementing architectural solutions that solve strategic and tactical business needs.
Deep knowledge of best practices from relevant experience across data-related disciplines and technologies, particularly enterprise-wide data architectures, data management, data governance, and data warehousing.
Highly competent with database design.
Highly competent with data modeling.
Strong data warehousing and business intelligence skills, including handling ELT and scalability issues for an enterprise-level data warehouse.
Creating ETLs/ELTs to handle data from various data sources and in various formats.
Strong hands-on experience with programming languages such as Python and Scala, using Spark and Beam.
Solid hands-on and solution-architecting experience in cloud technologies: AWS, Azure, and GCP (GCP preferred).
Hands-on working experience with data processing at scale using event-driven systems and message queues (Kafka/Flink/Spark Streaming); see the Kafka + Spark Structured Streaming sketch after this list.
Hands-on working experience with GCP services such as BigQuery, Dataproc, Pub/Sub, Dataflow, Cloud Composer, API Gateway, data lake, Bigtable, Spark, and Apache Beam; see the Beam pipeline sketch after this list.
Feature engineering/data processing to be used for model development.
Experience gathering and processing raw data at scale (including writing scripts, web scraping, calling APIs, writing SQL queries, etc.).
Experience building data pipelines for structured/unstructured, real-time/batch, synchronous/asynchronous events using MQ, Kafka, and stream processing.
Hands-on working experience analyzing source system data and data flows, working with structured and unstructured data.
Must be very strong in writing Spark SQL queries; an illustrative example appears after this list.
Strong organizational skills with the ability to work autonomously as well as lead a team.
Pleasant personality with strong communication and interpersonal skills.
A bachelor's degree in computer science, computer engineering, or a related discipline is required to work as a technical lead.
Certification in GCP would be a big plus.
Individuals in this field can further display their leadership skills by completing the Project Management Professional certification offered by the Project Management Institute.
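As a flavor of the data-ingestion work described above, here is a minimal PySpark batch-ingestion sketch. It assumes the spark-bigquery connector is available on the cluster; the bucket, project, dataset, and column names are placeholders rather than actual client resources.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("ingest-to-bigquery").getOrCreate()

# Read raw CSV files landed in a (hypothetical) GCS bucket.
raw = (
    spark.read
    .option("header", "true")
    .csv("gs://example-landing-zone/sales/2024-01-01/*.csv")
)

# Light data preparation: de-duplicate and drop rows missing key fields.
clean = raw.dropDuplicates(["sale_id"]).na.drop(subset=["sale_id", "amount"])

# Load into BigQuery via the spark-bigquery connector (assumed on the classpath).
(
    clean.write
    .format("bigquery")
    .option("table", "example-project.analytics.sales")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("append")
    .save()
)
```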
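For the event-driven/message-queue item, a sketch of Kafka consumption with Spark Structured Streaming might look like the following; the broker address, topic name, schema, and paths are illustrative assumptions, and the spark-sql-kafka package must be available.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import (StructType, StructField, StringType,
                               DoubleType, TimestampType)

spark = SparkSession.builder.appName("kafka-events").getOrCreate()

# Hypothetical event schema for an incoming transactions topic.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Subscribe to the topic and parse the JSON payload.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "transactions")
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Persist the parsed stream to a bronze layer with checkpointing.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "/data/bronze/transactions")
    .option("checkpointLocation", "/checkpoints/transactions")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```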
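For the Dataflow/Apache Beam item, a minimal Beam pipeline sketch in Python is shown below; it runs locally with the DirectRunner by default, and the bucket paths and record layout are placeholders. On GCP, the same pipeline would be submitted with DataflowRunner options.

```python
import apache_beam as beam

# Minimal Beam pipeline: read delimited text, filter malformed rows, write results.
with beam.Pipeline() as p:  # DirectRunner locally; pass DataflowRunner options on GCP
    (
        p
        | "ReadLines" >> beam.io.ReadFromText("gs://example-bucket/input/*.csv")
        | "SplitFields" >> beam.Map(lambda line: line.split(","))
        | "KeepValidRows" >> beam.Filter(lambda fields: len(fields) == 3)
        | "Reassemble" >> beam.Map(lambda fields: ",".join(fields))
        | "WriteOutput" >> beam.io.WriteToText("gs://example-bucket/output/clean")
    )
```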
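For the Spark SQL item, an illustrative query is shown below; the table names, columns, and paths are hypothetical and serve only to indicate the expected level of fluency.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("spark-sql-example").getOrCreate()

# Register curated datasets as temporary views (paths are placeholders).
spark.read.parquet("/data/curated/orders").createOrReplaceTempView("orders")
spark.read.parquet("/data/curated/customers").createOrReplaceTempView("customers")

# Aggregate order value per customer segment for the last 30 days.
result = spark.sql("""
    SELECT c.segment,
           COUNT(DISTINCT o.order_id) AS order_cnt,
           SUM(o.order_amount)        AS total_amount
    FROM orders o
    JOIN customers c ON o.customer_id = c.customer_id
    WHERE o.order_date >= date_sub(current_date(), 30)
    GROUP BY c.segment
    ORDER BY total_amount DESC
""")
result.show()
```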
We value diversity at Incedo. We do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
Full Time