Job Brief:
Employment Type: Full-time, Onsite
Timings: 12 PM - 9 PM
Location: Johar Town, G4
Experience: 4-5 years
Responsibilities:
- Develop AI-Driven APIs: Build internal tools and endpoints using FastAPI that serve data insights powered by LLMs.
- Automate Business Logic: Replace manual data entry and analysis tasks with autonomous n8n workflows.
- Data Quality Assurance: Ensure the inputs fed into LLM pipelines are clean, and that the outputs generated are accurate and free of hallucinations.
- Database Management: Maintain and optimize the PostgreSQL instance to handle increasing loads of relational and vector data.
Requirements:
Candidates must possess hands-on proficiency in the following areas:
Core Programming & APIs
- Python Proficiency: Advanced knowledge of Python for data manipulation and backend logic.
- API Development: Proven experience building, documenting, and maintaining RESTful APIs using FastAPI (preferred for its async capabilities) and Flask.
- Microservices: Ability to wrap data models and scripts into deployable services.
AI & LLM Integration
- LLM Pipelines: Experience constructing chains and agents (e.g., using LangChain, LlamaIndex, or custom scripts) to process text and data.
- OpenAI API: Deep familiarity with the OpenAI ecosystem (GPT-4, Assistants API, function calling) and cost-optimization strategies.
- Prompt Engineering: Demonstrated ability to design system prompts that yield consistent structured outputs (JSON) from unstructured data.
Database & Data Management
- PostgreSQL Mastery: Advanced SQL skills, including complex joins, window functions, and performance tuning.
- Vector Data: Experience with vector embeddings and vector similarity search (e.g., using pgvector in PostgreSQL) for Retrieval-Augmented Generation (RAG) implementations.
- Data Modeling: Ability to design efficient database schemas to store historical data and LLM interaction logs.
Automation & Orchestration
- n8n Workflow Automation: Strong experience creating complex, conditional workflows in n8n to connect disparate services (webhooks, APIs, databases).
- ETL/ELT: Ability to automate data extraction and transformation pipelines using n8n and Python scripts.