About Us

At Prior Labs, we are pioneering foundation models designed to understand tabular data, the backbone of scientific and business work. While foundation models have revolutionized the way we process text and images, the potential of structured data remains largely untapped. We are seizing a $600B opportunity to redefine how organizations work with scientific, medical, financial, and business datasets.

Our Momentum: We are recognized as the global leader in structured data machine learning. Our TabPFN v2 model, featured in Nature, has set a new benchmark for tabular machine learning. Since its launch, we have scaled the model's capabilities by more than 20x, reaching over 2.5 million downloads and more than 5,500 stars on GitHub. Our technology is rapidly gaining traction in both research and industry, paving the way for the next generation of tabular foundation models, which we are actively commercializing with leading enterprises across Europe and the US.

Our Team: We are a selective team of over 20 engineers and researchers, chosen from a pool of more than 5,000 applicants. Our backgrounds span Google, Apple, Amazon, Microsoft, G-Research, Jane Street, Goldman Sachs, and CERN. We are led by the creators of TabPFN and advised by eminent AI researchers, including Bernhard Schölkopf and Turing Award laureate Yann LeCun. Discover more about our team here.

Join Us: Backed by top-tier investors and leaders from Hugging Face, DeepMind, and Silo AI, we are growing rapidly. This is your chance to help shape the future of structured data AI. Read our manifesto.

Your Impact Areas

As a Research Scientist Intern, you will collaborate on a new class of AI models that go beyond incremental improvements. At our early-stage startup focused on foundation models for tabular data, you will find research opportunities and challenges that match your interests and expertise. Our initiatives include:

- Enhancing our transformer architectures to handle datasets from 10K to over 1M samples without sacrificing performance.
- Developing multimodal models that integrate textual and tabular data on proprietary datasets.
- Creating tailored architectures for time series analysis, forecasting, and anomaly detection.
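For context on the open-source model mentioned above, here is a minimal sketch of how TabPFN can be tried through a scikit-learn-style interface. This assumes the `tabpfn` Python package and its `TabPFNClassifier` class as published in the public repository; exact constructor arguments may differ between versions, so treat it as an illustration rather than a definitive recipe.

    # Minimal sketch (assumption: the open-source `tabpfn` package exposes a
    # scikit-learn-style TabPFNClassifier; exact arguments may vary by version).
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    from tabpfn import TabPFNClassifier  # assumed import path from the TabPFN repository

    # Load a small tabular dataset and split it for evaluation.
    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

    # Fit the pre-trained tabular foundation model; no task-specific training loop is needed.
    clf = TabPFNClassifier()
    clf.fit(X_train, y_train)

    # Predict and evaluate like any other scikit-learn estimator.
    pred = clf.predict(X_test)
    print("accuracy:", accuracy_score(y_test, pred))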
Jan 15, 2025