Join our quest as a Data Architect, where you'll be at the heart of innovation, shaping our trailblazing Generative AI platform. Our vision: to revolutionize the GenAI landscape, making it as universal and user-friendly as possible, touching lives from seasoned developers to creative designers and everyone in between.
Our mantra is simple yet profound: Focus, Flow, and Joy. If you have a fervent interest in crafting innovative products for a broad audience and are excited about leveraging state-of-the-art technology, this is the right role for you.
Imagine being part of our Data Platform Engineering team, a place where the status quo is questioned, processes are perfected, and cutting-edge tech is used not just for the sake of it, but to fundamentally transform effort into efficiency and ideas into reality. This isn't just a job; it's a journey to redefine the future of technology.
What you'll do
You will be part of a dynamic team of experienced professionals dedicated to delivering comprehensive solutions that harness the power of Data and Generative AI technology, including the development of custom-built products.
By joining our community of passionate builders, you will contribute to our shared goal of providing the most valuable, user-friendly, and enjoyable experiences! You will play a key role in ensuring the quality and rapid delivery of the products built on the Generative AI platform.
We enjoy:
- Exploring bleeding-edge technologies, tools, and frameworks to experiment with and build better products for existing customers
- Evaluating areas of improvement in the technical products we have built, and implementing ideas that make us better than yesterday
- Collaborating with developers on technical designs, and developing code, configurations, and scripts to enhance the development lifecycle and integrate systems
- Collaborating proactively and respectfully with our team and customers
- Developing tools and integrations to support other developers in building products
- Taking solutions from concept to production by writing code, configurations, and scripts
- Improving existing platforms and implementing new features for any of our products
- Creating comprehensive documentation for implemented solutions, including implementation details and usage instructions
- Promoting our culture of focus, flow, and joy to gain developers' support for our solutions
Qualifications
What you bring
- Build data pipelines required for optimal extraction, anonymization, and transformation of data from a wide variety of data sources using SQL, NoSQL, and AWS big data technologies (streaming and batch)
- Work with stakeholders, including Product Owners, Developers, and Data Scientists, to assist with data-related technical issues and support their data infrastructure needs
- Ensure that data is secure and separated in accordance with corporate compliance and data governance policies
- Take ownership of existing ETL scripts, maintaining them and rewriting them in modern data transformation tools whenever needed
- Advocate for automation in data transformation, cleaning, and reporting tools
- You are proficient in developing software from idea to production
- You can write automated test suites in your preferred language
- You have frontend development experience with modern JavaScript frameworks
- You have backend development experience building and integrating with REST APIs and databases, using languages such as Java (Spring), JavaScript, or Python (Flask)
- You have experience with cloud-native technologies such as Cloud Composer, Dataflow, Dataproc, BigQuery, GKE, Cloud Run, Docker, Kubernetes, and Terraform
- You have used cloud platforms such as Google Cloud or AWS for application hosting
- You have used and understand CI/CD best practices with tools such as GitHub Actions and GCP Cloud Build
- You have experience with YAML and JSON for configuration
- You are up to date on the latest trends in AI technology
Great-to-haves
- 3 years of experience as a data or software architect
- 3 years of experience in SQL and Python
- 2 years of experience with ELT/ETL platforms (Airflow, dbt, Apache Beam, PySpark, Airbyte)
- 2 years of experience with BI reporting tools (Looker, Metabase, QuickSight, Power BI, Tableau)
- Extensive knowledge of the Google Cloud Platform, specifically Google Kubernetes Engine
- Experience with GCP data-related services (Dataflow, GCS, Datastream, Data Fusion, BigQuery, Dataproc, Dataplex, Pub/Sub, Cloud SQL, Bigtable)
- Experience in the health industry is an asset
- Expertise in Python and Java
- Interest in PaLM, LLM usage, and LLMOps
- Familiarity with Langfuse, Backstage plugins, or GitHub Actions
- Strong experience with GitHub beyond source control
- Familiarity with monitoring, alerting, and logging solutions

Join us on this exciting journey to make Generative AI accessible to all and create a positive impact with technology!