This is a remote position.
We are seeking a Senior Software Developer (Algorithms) to join our team.
Responsibilities:
- Build real-time data analysis backend services using Python, Node.js, a proprietary database, and Redis.
- Design config file schemas, data models, and an ETL generator to streamline customer data onboarding.
- Deploy and manage AWS infrastructure using Nomad, Terraform, Consul, Nginx, S3, and EFS.
- Develop Semantic Data Graphs to automatically infer relationships between data columns.
- Create Ontologies for a unified metrics and instructions vocabulary, enabling easy data labeling and insights/anomaly configuration.
- Implement Abstract Data Representation to optimize data storage decisions, such as sharding.
- Develop an Advanced Pipeline Compiler to automate query/ETL configuration, supporting cross-source data consolidation and external API integrations.
- Write clean, documented code and collaborate with data scientists to integrate experimental results.
- Engage in problem-solving discussions to develop innovative solutions.
- Establish best practices and help recruit top Software Engineers.
Requirements:
- 8+ years of software industry experience.
- Bachelor's in Computer Science, Software Engineering, or equivalent.
- Strong analytical and problem-solving skills.
- Deep computer science fundamentals, with the ability to simplify complex problems into efficient algorithms and data structures.
- Proven experience delivering scalable, reliable products in Agile environments.
- Expertise in ETL pipelines and Big Data tools (e.g., BigQuery, dbt, Hadoop, Spark).
- Extensive experience with Python, NumPy, and Pandas for computational/data challenges.
- Knowledge of scalable application design, REST APIs, and real-time data processing.
- Experience with AWS/GCP analytics and interest in Semantic Data Graphs or Ontologies.
- Fluent, efficient coder with enthusiasm for data; experience with TypeScript, JavaScript, and React in building modern web applications.
Benefits:
- Work Location: Remote
- 5-day work week