Demonstrate a strong sense of ownership and responsibility for release goals. This includes understanding requirements, technical specifications, design, architecture, implementation, unit testing, builds/deployments, and code management.
Ensure compliance through the adoption of enterprise standards and the promotion of best practices and guiding principles aligned with organizational standards.
Build and maintain the environment for speed, accuracy, consistency, and uptime.
This is a hands-on position requiring strong analytical, architecture, development, and debugging skills spanning both development and operations.
Attain in-depth functional knowledge of the domain we are working in.
Understand Incident Management, Change Management, and Problem Management, including root cause analysis.
Ensure data governance principles are adopted, and that data quality checks and data lineage are implemented at each hop of the data.
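As an illustrative sketch only (not from the posting), a per-hop data quality check might partition each batch into records that pass validation and records that are quarantined before data moves to the next stage; the function and field names below are hypothetical:

```python
# Hypothetical per-hop data quality check: a record fails if any
# required field is missing or None. Failed records are quarantined
# rather than passed downstream.

def check_quality(records, required_fields):
    """Split a batch of dict records into (passed, failed) lists."""
    passed, failed = [], []
    for rec in records:
        if all(rec.get(f) is not None for f in required_fields):
            passed.append(rec)
        else:
            failed.append(rec)
    return passed, failed

batch = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},   # fails the null check
]
good, bad = check_quality(batch, required_fields=["id", "amount"])
```

In a real pipeline the failed partition would typically be logged with lineage metadata so the bad records can be traced back to their source hop.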
Stay in tune with emerging trends, cloud technologies, and Big Data.
Drive and execute complex technical requirements.
Demonstrate excellent verbal and written communication skills.
Collaborate with team members across the globe.
Interface with cross functional teams and downstream applications as needed.
Be a self-starter that is also an excellent team player.
Follow agile best practices; experience working with global teams is expected.
Regularly evaluate cloud applications, hardware, and software.
Work closely with the cyber security team to monitor the organization's cloud policy.
Focus on building a team culture based on trust. Inspire your team members to focus on continuous growth.
What We're Looking For:
Minimum 10 years of working experience in Technology (application development and production support).
5 years of experience developing pipelines that extract, transform, and load data into an information product that helps the organization reach its strategic goals.
Minimum 3 years of experience developing and supporting ETLs using Python/Scala on the Databricks/Spark platform.
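To make the extract-transform-load requirement concrete, here is a minimal sketch in plain Python; the data, field names, and sink are hypothetical, and on Databricks/Spark the same three stages would map to `spark.read`, DataFrame transformations, and `DataFrame.write`:

```python
# Minimal ETL sketch with hypothetical data. Each stage is kept as a
# separate function, mirroring the extract/transform/load structure
# a Spark pipeline would have.
import csv
import io

RAW = "id,amount\n1,10\n2,20\n"          # stand-in for an extracted source


def extract(text):
    """Parse the raw CSV text into a list of dict rows."""
    return list(csv.DictReader(io.StringIO(text)))


def transform(rows):
    """Cast types and derive amount_cents from the raw amount."""
    return [{"id": int(r["id"]), "amount_cents": int(r["amount"]) * 100}
            for r in rows]


def load(rows, sink):
    """Append rows to the sink (stand-in for a warehouse write)."""
    sink.extend(rows)
    return len(rows)


sink = []
loaded = load(transform(extract(RAW)), sink)
```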
Experience with Python, Spark, and Hive, and an understanding of data-warehousing and data-modeling techniques.
Knowledge of industry-wide visualization and analytics tools (e.g., Tableau, R).
Strong data engineering skills on the AWS cloud platform.
Experience with streaming frameworks such as Kafka.
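The consume-process loop at the heart of such streaming work can be sketched as below; this is an in-memory stand-in, not Kafka itself, and the names are hypothetical. With a real client such as kafka-python, the loop body would iterate over a `KafkaConsumer` instead of a deque:

```python
# Illustrative consume-process loop. The deque stands in for a Kafka
# topic partition; each message is a JSON-encoded string, as is common
# in event streams.
import json
from collections import deque

topic = deque(['{"id": 1}', '{"id": 2}'])   # stand-in for a Kafka topic


def process(message):
    """Deserialize one message and extract the id field."""
    return json.loads(message)["id"]


seen = []
while topic:
    seen.append(process(topic.popleft()))
```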
Knowledge of Core Java, Linux, SQL, and any scripting language.
Experience working with relational databases, preferably Oracle.
Experience with continuous delivery through CI/CD pipelines, containers, and orchestration technologies.
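As a hedged illustration of what such a CI/CD pipeline can look like, here is a hypothetical GitHub Actions workflow (job names, file paths, and the image tag are assumptions, not from the posting) that checks out the code, runs the test suite, and builds a container image on every push:

```yaml
# Hypothetical CI workflow sketch: test, then build a container image.
name: ci
on: [push]
jobs:
  build-and-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install -r requirements.txt   # assumed dependency file
      - run: pytest                            # assumed test runner
      - run: docker build -t my-app:${{ github.sha }} .
```

A deployment stage would typically follow as a separate job, gated on the build succeeding, pushing the image to a registry and rolling it out via an orchestrator such as Kubernetes.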
Experience working in an Agile development environment.
Experience working with cross-functional teams with strong interpersonal and written communication skills.
The candidate must have the desire and ability to quickly understand and work with new technologies.