- Expert proficiency in Python programming language
- Expert proficiency in PySpark including Spark SQL and other Spark APIs
- Testing and debugging applications with Python test framework tools like Pytest, PyUnit, etc.
- In-depth knowledge of Python frameworks and libraries such as Django or Flask
- Experience with AWS cloud platforms, including services like S3, Databricks, and Data Lake Storage
- Experience with continuous integration/continuous deployment (CI/CD) pipelines and tools
- Experience with data pipeline tools such as Airflow, Kafka, and Jenkins
- Knowledge of design principles for building scalable applications
- A bachelor's degree in computer science, information technology, or a relevant field; a master's degree is preferred but not mandatory
- Strong knowledge of cloud concepts, architecture patterns, and best practices
- Proven experience in designing, implementing, and maintaining AWS solutions
- Effective communication and teamwork skills, including the ability to collaborate with cross-functional teams
- Strong problem-solving and analytical abilities
Qualifications :
These duties are sufficiently complex and specialized that they require at least a bachelor's degree in computer science, computer information systems, information technology, or a related field.
Additional Information :
All your information will be kept confidential according to EEO guidelines.
Remote Work :
No
Employment Type :
Contract