Job Description
We’re in search of a Full Stack Engineer who thrives on innovation and creative problem-solving across both server-side and client-side technologies. This role is ideal for someone excited to design, build, and maintain the backbone of our AI solutions while also creating intuitive, dynamic user experiences. You’ll be pivotal in enhancing our AI capabilities by developing high-performance back-end systems and integrating advanced AI into daily data workflows. Simultaneously, you’ll craft impactful front-end interfaces, from our React-based SaaS platform to our VSCode extension—offering data professionals a seamless, AI-driven developer experience.
Qualifications:
- 3+ years of experience building Python-based web application backends (e.g., Flask, FastAPI, Django) and the infrastructure on which they run
- 5+ years of experience designing and developing scalable back-end systems, APIs, and microservices
- Strong proficiency in JavaScript/TypeScript and React for building user-facing applications
- Hands-on expertise in front-end tooling and libraries (Webpack, Babel, Redux, etc.), with a focus on responsive design, usability, and performance optimization
- (Desirable) Working knowledge of static code analysis of SQL
- (Desirable) Experience developing and extending VSCode extensions in JavaScript/TypeScript
- (Desirable) Familiarity with AI and machine learning concepts
- (Desirable) Experience with cloud platforms (AWS, Azure, GCP) and containerization technologies (Docker, Kubernetes)
- (Desirable) Involvement in open-source projects
Projects You'll Work On:
- Backend Integration of LLM Architectures: Lead the development and maintenance of the backend infrastructure that powers state-of-the-art Large Language Model (LLM) architectures, including RAG, ReAct, and Agent-based systems. Architect data pipelines for processing extensive datasets and devise AI-powered endpoints crucial for our SaaS application and VSCode Extension.
- Front-End Development for DataPilot: Build and enhance intuitive, user-friendly interfaces in React for our browser-based SaaS platform. Ensure seamless interactions with our AI backend and deliver an exceptional user experience to thousands of data professionals worldwide.
- VSCode Extension: Develop and maintain our VSCode extension (JavaScript/TypeScript) to bring AI-driven functionalities directly into developers' IDE workflow. Focus on smooth integration, performance, and developer experience.
- Graph-Based Data Structure Design: Design and develop large-scale graph-based data structures that effectively model complex data stores. Create intricate, efficient structures that can be fed to our AI applications as context, ensuring both scalability and performance.
Impact:
- Innovation at the Forefront: Push the boundaries of software engineering by combining traditional techniques with cutting-edge AI technologies.
- High Visibility & Impact: Directly affect the productivity and capabilities of global data teams, as your contributions will be crucial to the daily operations of thousands of users spread across hundreds of countries.
- Open Source Contribution: As part of our commitment to the developer community, you will contribute to our open-source initiatives, gaining recognition in the tech community.
- Career Growth: This role is a launchpad into the rapidly advancing field of AI, offering exposure to state-of-the-art technologies and generative AI applications.
What We Offer:
- Competitive salary and equity
- Base Salary Range: $130,000 - $180,000 (depending on your expertise, location, interview performance, and fit for the role)
- Equity: 0.0% - 0.2%
- Extensive professional development opportunities and resources for continuous learning
- Dynamic and intellectually stimulating work environment with a team of talented engineers
- Opportunities to shape the direction of the company and leave a lasting impact
Altimate AI's DataPilot offers data incident assistance and prevention by securely connecting your data to LLMs (PII-free). It's a collection of AI agents, natively integrated into the tools data teams already use, that generate data documentation, data quality tests, data contracts, and improved versions of existing queries.