Martian is doing for LLMs what Google did for websites. In the early internet, the number of websites was exploding, and it was hard to figure out which website to use for which task. Google solved that problem by building a search engine that indexed websites across the internet. A similar problem exists in AI today: the number of models is exploding, and it's hard to figure out which model to use for which task. Martian solves that problem with a model router: you give us your prompt, and we run it on the best model in real time. Model routing is just the first tool we're building to help people understand how models behave. By pioneering techniques like this, we want to solve the most fundamental problem in AI: understanding why models behave the way they do, and guaranteeing they'll behave the way we want.
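The routing idea can be sketched in a few lines. This is a toy illustration only, not Martian's actual API or algorithm; every name and the scoring heuristic here are hypothetical:

```python
# Toy model router: score each candidate model for a prompt,
# then dispatch to the highest-scoring one.
# (Hypothetical sketch -- not Martian's real routing logic.)

def score(model: dict, prompt: str) -> float:
    """Toy scoring: count how many of the model's declared strengths
    appear in the prompt text."""
    text = prompt.lower()
    return sum(1.0 for tag in model["strengths"] if tag in text)

def route(prompt: str, models: list) -> str:
    """Return the name of the best-scoring model for this prompt."""
    return max(models, key=lambda m: score(m, prompt))["name"]

# Hypothetical model registry for illustration.
MODELS = [
    {"name": "code-model", "strengths": ["code", "python", "debug"]},
    {"name": "chat-model", "strengths": ["chat", "summarize", "write"]},
]

print(route("Please debug this Python function", MODELS))  # -> code-model
```

A production router would replace the keyword heuristic with a learned model of each LLM's strengths, but the control flow is the same: score candidates per request, then forward the prompt to the winner.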