This is a remote position.
Job Description: AI Developer (5 Years Experience)
Mandatory Skills:
Python Development with AI/ML
TensorFlow, PyTorch, scikit-learn
NLP, LLMs, RAG Systems, Computer Vision
AWS Bedrock & AWS Services (S3, Lambda, SageMaker, ECS/EKS, IAM)
RESTful APIs (Flask/FastAPI)
Docker & Kubernetes
Data Engineering (ETL, Apache Airflow/AWS Glue/Spark)
Relational & NoSQL databases (PostgreSQL, MySQL, DynamoDB, MongoDB)
Secondary or Good to Have Skills:
CI/CD Pipelines and DevOps
Project management tools (Jira/Trello)
Cloud infrastructure optimization and security
Years of Experience: 5 Years
Role Type: Permanent (Talpro)
CTC Offered: 12 LPA
Notice Period: Immediate
Work Mode: Permanent Remote
Job Summary:
We are seeking a proficient AI Developer to join our remote team, bringing extensive experience in Python-based AI/ML development, cloud computing (AWS Bedrock), and backend development. The role demands expertise in deploying robust, scalable AI solutions, optimizing machine learning models, and managing data pipelines efficiently. Strong analytical and collaborative skills, combined with an ability to clearly communicate technical concepts, are essential.
Job Responsibilities:
1. AI & Machine Learning Development:
Develop, train, fine-tune, and deploy sophisticated ML models using TensorFlow, PyTorch, scikit-learn, and NumPy.
Work extensively with NLP, Large Language Models (LLMs), Retrieval-Augmented Generation (RAG), and computer vision technologies.
Integrate and optimize pre-trained models provided by AWS Bedrock for scalability and efficiency.
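To make the RAG responsibility above concrete, here is a minimal sketch of the retrieval step: documents and the query are represented as toy embedding vectors, and the closest document by cosine similarity is returned as context. The corpus, vectors, and function names are illustrative assumptions; in practice the embeddings would come from a model (for example, one hosted on AWS Bedrock).

```python
# Hypothetical sketch of the retrieval step in a RAG system.
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def retrieve(query_vec: list[float], corpus: dict[str, list[float]]) -> str:
    """Return the corpus entry whose embedding is most similar to the query."""
    return max(corpus, key=lambda doc: cosine(query_vec, corpus[doc]))

# Toy corpus: hand-written vectors standing in for real model embeddings.
corpus = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.8, 0.2],
}
context = retrieve([0.85, 0.15, 0.05], corpus)  # → "refund policy"
```

The retrieved context would then be prepended to the LLM prompt, which is the "augmented generation" half of RAG.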
2. Backend Development & API Integration:
Design and implement RESTful APIs with Flask or FastAPI for AI model deployment and integration.
Develop robust microservices architectures, ensuring security, efficiency, and scalability of backend services.
3. Cloud & DevOps:
Deploy AI solutions on AWS Bedrock, integrating seamlessly with AWS ecosystem services (S3, Lambda, SageMaker, ECS/EKS, IAM).
Manage cloud infrastructure, ensuring optimized costs, performance, and security.
Implement containerized solutions with Docker and Kubernetes, along with CI/CD practices for rapid and reliable deployments.
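As a sketch of the containerization mentioned above, a minimal Dockerfile for a Python AI service might look like the following; the base image, entry-point module, and port are illustrative assumptions, not a prescribed setup:

```dockerfile
# Hypothetical Dockerfile for a Python AI microservice (names are illustrative).
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
# Assumes the API entry point is app/main.py exposing "app" (e.g. FastAPI + uvicorn).
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
```

The resulting image would typically be pushed to a registry and deployed via Kubernetes (ECS/EKS) from a CI/CD pipeline.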
4. Database & Data Engineering:
Design, query, and optimize databases (PostgreSQL, MySQL, DynamoDB, MongoDB).
Develop automated data pipelines using tools like Apache Airflow, AWS Glue, or Spark.
Manage robust ETL processes tailored for AI applications.
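The ETL responsibilities above can be sketched as a minimal extract-transform-load pipeline; SQLite and the hard-coded sample records here are stand-ins for a real source and warehouse, and in production this logic would live inside an Airflow task, AWS Glue job, or Spark job:

```python
# Hypothetical minimal ETL sketch: extract raw records, validate/transform
# them, and load into a database (SQLite used as a stand-in).
import sqlite3

def extract() -> list[dict]:
    # Stand-in for reading from an API, S3 object, or source database.
    return [
        {"name": "  Ada ", "score": "91"},
        {"name": "Grace", "score": "not-a-number"},  # invalid row
        {"name": "Alan", "score": "87"},
    ]

def transform(rows: list[dict]) -> list[tuple[str, int]]:
    clean = []
    for row in rows:
        try:
            clean.append((row["name"].strip(), int(row["score"])))
        except ValueError:
            continue  # skip rows that fail validation
    return clean

def load(rows: list[tuple[str, int]], conn: sqlite3.Connection) -> None:
    conn.execute("CREATE TABLE IF NOT EXISTS scores (name TEXT, score INTEGER)")
    conn.executemany("INSERT INTO scores VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
```

Keeping each stage a separate function makes the pipeline easy to unit-test and to map onto orchestrator tasks.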
5. Collaboration & Documentation:
Collaborate closely with data scientists, engineers, and product teams to integrate AI capabilities.
Prepare comprehensive documentation on AI models, system architectures, and deployment processes.
Clearly communicate complex AI concepts to non-technical stakeholders.
Essential Requirements:
Programming & AI Development:
Minimum 5 years' experience in Python-centric AI/ML development.
Proven expertise in frameworks: TensorFlow, PyTorch, scikit-learn.
Solid experience with neural networks, NLP techniques, LLMs, RAG, and computer vision.
Backend & API Development:
Demonstrated proficiency developing RESTful APIs using Flask or FastAPI.
Strong understanding of microservices architecture, OOP, and functional programming.
Cloud & DevOps:
Significant hands-on experience with AWS Bedrock and related AWS services (S3, Lambda, SageMaker, ECS/EKS, IAM).
Practical skills in containerization (Docker) and orchestration (Kubernetes).
Familiarity with CI/CD workflows and DevOps methodologies.
Data Engineering & Databases:
Expertise in relational (PostgreSQL, MySQL) and NoSQL (DynamoDB, MongoDB) database systems.
Extensive knowledge of ETL practices and Apache Airflow, AWS Glue, or Spark-based data pipelines.
Version Control & Collaboration:
Proficient with Git, collaborative coding practices, and code versioning workflows.
Experience using project management and collaboration tools (e.g., Jira, Trello).
Soft Skills:
Strong analytical and problem-solving capabilities.
Excellent verbal and written communication skills.
Team-oriented with proven collaborative abilities.
Adaptable and proactive learner, keen on adopting emerging AI technologies.
Full Time