About Cartesia
At Cartesia, we are on a mission to revolutionize artificial intelligence by creating interactive, ubiquitous intelligence that can operate seamlessly in any environment. Despite recent advances, current AI models struggle to process long streams of audio, video, and text simultaneously; this is the limitation we aim to overcome.
Our team, which originated from the Stanford AI Lab, pioneered State Space Models (SSMs), a breakthrough in training efficient, large-scale foundation models. Combining deep expertise in model engineering with a design-centric approach, we are dedicated to delivering cutting-edge AI models and applications.
Backed by investors including Index Ventures and Lightspeed Venture Partners, alongside an array of industry experts and advisors, we are well positioned to lead the future of AI.
About the Role
As we open our first European office in London, we are eager to welcome talented individuals who share our vision for advancing real-time multimodal intelligence.
Your Impact
Conduct pioneering research in neural network architecture design to advance the state of the art in alternative architectures such as state space models, efficient Transformers, and hybrid architectures.
Design novel architectures that improve model quality, inference efficiency, and adaptability across environments, from cloud infrastructure to on-device deployment.
Investigate capabilities like statefulness, long-range memory, and advanced conditioning mechanisms to boost model expressiveness and generalization.
Analyze how architectural choices affect model trade-offs, including scalability, robustness, latency, and energy consumption.
Formulate new frameworks and tools to assess architectural innovations, benchmarking performance in both research and production contexts.
Collaborate with multidisciplinary teams to translate architectural research into scalable, impactful systems for real-world applications.
