<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[Intelligent Founder AI: Funding Playbook]]></title><description><![CDATA[Practical templates and guides for funding strategy: choosing the right options, pitching investors, structuring deals, and managing the money you raise.]]></description><link>https://www.intelligentfounder.ai/s/funding-playbook</link><image><url>https://substackcdn.com/image/fetch/$s_!OUKy!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0f8365fe-ae42-41a5-af04-76cf78b93a45_1280x1280.png</url><title>Intelligent Founder AI: Funding Playbook</title><link>https://www.intelligentfounder.ai/s/funding-playbook</link></image><generator>Substack</generator><lastBuildDate>Sun, 19 Apr 2026 05:33:46 GMT</lastBuildDate><atom:link href="https://www.intelligentfounder.ai/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Poonam Parihar]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[poonamparihar@gmail.com]]></webMaster><itunes:owner><itunes:email><![CDATA[poonamparihar@gmail.com]]></itunes:email><itunes:name><![CDATA[Poonam Parihar]]></itunes:name></itunes:owner><itunes:author><![CDATA[Poonam Parihar]]></itunes:author><googleplay:owner><![CDATA[poonamparihar@gmail.com]]></googleplay:owner><googleplay:email><![CDATA[poonamparihar@gmail.com]]></googleplay:email><googleplay:author><![CDATA[Poonam Parihar]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[$12 Billion for a Fine‑Tuning API?]]></title><description><![CDATA[Mira Murati, Thinking Machines, and Tinker API. 
the one AI Bubble story that went largely unnoticed in 2025, plus why it actually matters!]]></description><link>https://www.intelligentfounder.ai/p/12-billion-for-a-finetuning-api</link><guid isPermaLink="false">https://www.intelligentfounder.ai/p/12-billion-for-a-finetuning-api</guid><dc:creator><![CDATA[Poonam Parihar]]></dc:creator><pubDate>Mon, 15 Dec 2025 17:16:32 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!7rf1!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8eefd9e-4dff-4cb1-ab10-7c294e178f74_3668x2048.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!7rf1!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8eefd9e-4dff-4cb1-ab10-7c294e178f74_3668x2048.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!7rf1!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8eefd9e-4dff-4cb1-ab10-7c294e178f74_3668x2048.jpeg 424w, https://substackcdn.com/image/fetch/$s_!7rf1!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8eefd9e-4dff-4cb1-ab10-7c294e178f74_3668x2048.jpeg 848w, https://substackcdn.com/image/fetch/$s_!7rf1!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8eefd9e-4dff-4cb1-ab10-7c294e178f74_3668x2048.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!7rf1!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8eefd9e-4dff-4cb1-ab10-7c294e178f74_3668x2048.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!7rf1!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8eefd9e-4dff-4cb1-ab10-7c294e178f74_3668x2048.jpeg" width="1456" height="813" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d8eefd9e-4dff-4cb1-ab10-7c294e178f74_3668x2048.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:813,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:4721696,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.intelligentfounder.ai/i/181597412?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8eefd9e-4dff-4cb1-ab10-7c294e178f74_3668x2048.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!7rf1!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8eefd9e-4dff-4cb1-ab10-7c294e178f74_3668x2048.jpeg 424w, https://substackcdn.com/image/fetch/$s_!7rf1!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8eefd9e-4dff-4cb1-ab10-7c294e178f74_3668x2048.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!7rf1!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8eefd9e-4dff-4cb1-ab10-7c294e178f74_3668x2048.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!7rf1!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8eefd9e-4dff-4cb1-ab10-7c294e178f74_3668x2048.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h1>Fine&#8209;tuning infra is a genuine bottleneck</h1><p>Fine&#8209;tuning infra is one of the many hard AI&#8209;infra 
problems. It sits in the <strong>model&#8209;development / personalization layer</strong> of the AI stack. It comes <em><strong>after</strong></em><strong> a foundation model has been pre&#8209;trained, and </strong><em><strong>before</strong></em><strong> that adapted model is deployed </strong>and served to users: the step where you specialize a generic LLM with your own data, tasks, and tone. Beyond fine&#8209;tuning, teams still struggle with data pipelines, GPU&#8209;efficient serving, security/compliance, observability, and integration into messy legacy systems. </p><p>But fine&#8209;tuning infra is<strong> one of the main pain points</strong>, because <strong>most teams don&#8217;t struggle with </strong><em><strong>ideas</strong></em><strong> for customizing models; they struggle with the plumbing needed to actually run those experiments.</strong> Fine&#8209;tuning large LLMs means coordinating high&#8209;end GPUs, fast storage, and low&#8209;latency networking while keeping utilization high and costs under control, which demands <strong>deep distributed&#8209;systems expertise</strong> that many orgs simply don&#8217;t have. Even &#8220;lighter&#8221; approaches like<strong> LoRA still hit GPU memory limits, data&#8209;pipeline bottlenecks, and orchestration complexity, </strong>so without a <strong>solid infra layer,</strong> teams can waste 30&#8211;50% of training time on underutilized hardware and operational glitches instead of iterating on models.</p><h3>But can a narrow tool (not a full platform), a small slice of the AI infrastructure stack, justify a $12B valuation? 
</h3><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.intelligentfounder.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.intelligentfounder.ai/subscribe?"><span>Subscribe now</span></a></p><p></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!CCs8!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17af93a4-b61f-4e26-a0a0-61c9923a427f_3668x2048.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!CCs8!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17af93a4-b61f-4e26-a0a0-61c9923a427f_3668x2048.jpeg 424w, https://substackcdn.com/image/fetch/$s_!CCs8!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17af93a4-b61f-4e26-a0a0-61c9923a427f_3668x2048.jpeg 848w, https://substackcdn.com/image/fetch/$s_!CCs8!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17af93a4-b61f-4e26-a0a0-61c9923a427f_3668x2048.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!CCs8!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17af93a4-b61f-4e26-a0a0-61c9923a427f_3668x2048.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!CCs8!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17af93a4-b61f-4e26-a0a0-61c9923a427f_3668x2048.jpeg" width="1456" height="813" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/17af93a4-b61f-4e26-a0a0-61c9923a427f_3668x2048.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:813,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2234315,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.intelligentfounder.ai/i/181597412?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17af93a4-b61f-4e26-a0a0-61c9923a427f_3668x2048.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!CCs8!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17af93a4-b61f-4e26-a0a0-61c9923a427f_3668x2048.jpeg 424w, https://substackcdn.com/image/fetch/$s_!CCs8!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17af93a4-b61f-4e26-a0a0-61c9923a427f_3668x2048.jpeg 848w, https://substackcdn.com/image/fetch/$s_!CCs8!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17af93a4-b61f-4e26-a0a0-61c9923a427f_3668x2048.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!CCs8!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17af93a4-b61f-4e26-a0a0-61c9923a427f_3668x2048.jpeg 1456w" sizes="100vw"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p></p><p>The last time Mira Murati really broke through my LinkedIn news feed was in <strong>August 2025</strong>. Fortune ran and amplified<strong> MIT&#8217;s NANDA report </strong>claiming that <strong>around 95% of generative&#8209;AI pilots at companies were failing</strong>, a story that went viral for its stark headline about the<strong> &#8216;GenAI divide&#8217; </strong>, and that was before anyone could get their hands on the report to analyze its sample size. Around the same time, another Fortune piece from a few weeks before (late June to early July) on <strong>Thinking Machines&#8217;</strong> <strong>$2 billion seed round at a roughly $12 billion valuation</strong> showed up, presented as a<strong> milestone / record&#8209;breaking / largest raise ever by a female founder in frontier tech. 
</strong>No doubts about the AI boom being alive and well in that one, while the NANDA piece became the go&#8209;to reference for skepticism about enterprise AI and AI hype in general. </p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.intelligentfounder.ai/p/is-fortune-playing-both-sides-of&quot;,&quot;text&quot;:&quot;Fortunes' two sides on AI Hype Cycle&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.intelligentfounder.ai/p/is-fortune-playing-both-sides-of"><span>Fortunes' two sides on AI Hype Cycle</span></a></p><p></p><p>So when I saw a bunch of <strong><a href="https://thinkingmachines.ai/blog/tinker-general-availability/">updates on Tinker</a>,</strong> including the latest one from two days ago about both Tinker&#8217;s general availability and Mira Murati eyeing a <strong>$50B valuation,</strong> I wanted to dig deeper. 
</p><h1>SO What is Tinker Exactly?</h1><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!MLeh!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F374e6acb-02b0-4b6b-af0b-89073ad62c6d_3668x2048.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!MLeh!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F374e6acb-02b0-4b6b-af0b-89073ad62c6d_3668x2048.jpeg 424w, https://substackcdn.com/image/fetch/$s_!MLeh!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F374e6acb-02b0-4b6b-af0b-89073ad62c6d_3668x2048.jpeg 848w, https://substackcdn.com/image/fetch/$s_!MLeh!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F374e6acb-02b0-4b6b-af0b-89073ad62c6d_3668x2048.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!MLeh!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F374e6acb-02b0-4b6b-af0b-89073ad62c6d_3668x2048.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!MLeh!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F374e6acb-02b0-4b6b-af0b-89073ad62c6d_3668x2048.jpeg" width="1456" height="813" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/374e6acb-02b0-4b6b-af0b-89073ad62c6d_3668x2048.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:813,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1395624,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.intelligentfounder.ai/i/181597412?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F374e6acb-02b0-4b6b-af0b-89073ad62c6d_3668x2048.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!MLeh!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F374e6acb-02b0-4b6b-af0b-89073ad62c6d_3668x2048.jpeg 424w, https://substackcdn.com/image/fetch/$s_!MLeh!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F374e6acb-02b0-4b6b-af0b-89073ad62c6d_3668x2048.jpeg 848w, https://substackcdn.com/image/fetch/$s_!MLeh!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F374e6acb-02b0-4b6b-af0b-89073ad62c6d_3668x2048.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!MLeh!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F374e6acb-02b0-4b6b-af0b-89073ad62c6d_3668x2048.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" 
width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>If you ask people outside the AI infra bubble <strong>what Tinker does,</strong> most won&#8217;t know. And that&#8217;s partly because the answer is boring:<strong> it&#8217;s an API for fine-tuning and training large language models.</strong></p><p>Fine-tuning, which I already explained above (and which might have bored you already), is one of the least sexy parts of AI.<strong> It&#8217;s not building a frontier model. It&#8217;s not creating the next GPT.</strong> </p><h3>It&#8217;s the infrastructure work that happens after a model is built</h3><p>It&#8217;s the process of<strong> taking a base model and adapting it for a specific task, domain, or organization.</strong></p><p>Today, if you want to fine-tune a model, you have a few options. 
You can try to do it yourself, or you can use one of the big cloud providers like AWS, Google Cloud, or Azure, but then you&#8217;re locked into their ecosystem and their pricing.</p><h3>Tinker&#8217;s pitch is simpler</h3><p> <strong>Write your training loop in Python, upload your data, and Tinker handles the heavy lifting.</strong> The company manages the GPU orchestration, the fault tolerance, the storage, the scaling. You keep control of your algorithms and your data flows. You pay per token for what you use.</p><p><strong>It&#8217;s useful. It solves a real problem. </strong>Research labs are already using it. </p><h4>But it&#8217;s also... not revolutionary. </h4><p>It&#8217;s managed infrastructure on top of open-weight models. Think of it like Databricks for LLMs, but narrower and earlier-stage.</p><p>So here&#8217;s the tension that sits at the heart of this entire story: VCs have valued Mira Murati&#8217;s fine-tuning API at $12 billion because they believe &#8220;the vision&#8221;: that this <strong>wedge </strong>will expand <strong>into a full control plane for how open-weight models are customized, trained, aligned, and deployed</strong>, even though the current product is just one slice of that vision.</p><p><strong>The Tension:</strong></p><ul><li><p>Product shipped: One API for fine-tuning</p></li><li><p>Vision marketed: <strong>A control plane for how all open-weight models get customized and deployed</strong></p></li><li><p>There&#8217;s a<strong> MASSIVE GAP</strong> between what exists and what&#8217;s being priced</p></li></ul><p>Here&#8217;s where it gets more interesting - </p><h2>Not Open Source (Despite the Framing)</h2><p><em>(I wrote at length about open source <a href="https://www.intelligentfounder.ai/p/5-ai-shifts-that-actually-changed">here</a> and <a href="https://www.intelligentfounder.ai/p/openai-co-founds-the-agentic-ai-foundation">here</a>, especially the model gap and open agentic&#8209;AI frameworks)</em></p><p>I needed to clarify something 
that kept confusing me: Tinker gets marketed with &#8220;open&#8221; language, but it&#8217;s not actually open source.</p><p><strong>What&#8217;s Proprietary:</strong></p><ul><li><p><strong>Tinker&#8217;s core infrastructure is closed and proprietary</strong></p></li><li><p>Runs on Thinking Machines&#8217; servers in their cloud</p></li><li><p><strong>You don&#8217;t own or control the training infrastructure</strong></p></li></ul><p><strong>What&#8217;s Open:</strong></p><ul><li><p>The models it supports are <strong>open-weight</strong> (Llama, Qwen, Kimi, etc.)</p></li><li><p><strong>They publish the &#8220;Tinker Cookbook&#8221; as open source</strong> (example implementations)</p></li><li><p>The framing emphasizes &#8220;<strong>not locked into closed systems</strong>&#8221;</p></li></ul><p><strong>The Reality Check:</strong><br>The business model is<strong> classic SaaS infra: proprietary service, open-weight model support, usage-based pricing.</strong> </p><h3>The &#8220;open&#8221; positioning is more about marketing than product architecture.</h3><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.intelligentfounder.ai/p/12-billion-for-a-finetuning-api?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.intelligentfounder.ai/p/12-billion-for-a-finetuning-api?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><p></p><p>I don&#8217;t think I need to go on at length about who Mira Murati is, but some details may help make the case. </p><h2>Who is Mira Murati?</h2><p>Mira Murati spent the last six years at <strong>OpenAI running product</strong>. 
She was the CTO, <em>the</em> product lead for ChatGPT, for RLHF (Reinforcement Learning from Human Feedback, the breakthrough technique that made LLMs usable), and for every major release that turned OpenAI from a research lab into a company valued in the hundreds of billions and defined the frontier of consumer AI.</p><p>When you run OpenAI&#8217;s product, you&#8217;re not just shipping features. You&#8217;re shaping how hundreds of millions of people interact with AI. <strong>You&#8217;re learning what works, what breaks, how to build alignment and safety into systems that billions will use.</strong> You&#8217;re in the room when the most important technical decisions in AI are made. That&#8217;s valuable, yes. But worth a $12 billion bet on day one? Hmm. </p><p>She didn&#8217;t launch <strong>Thinking Machines</strong> alone, however. Her co-founders include six other <strong>OpenAI veterans, plus researchers from Mistral AI and DeepMind</strong>. Her investors include VC firms like a16z and Accel, but also strategic checkbooks from Nvidia, Cisco, ServiceNow, AMD, and Jane Street. The cap table reads like a who&#8217;s-who of the companies -</p><h3>that have the most to gain if this lab becomes another foundational piece of the AI stack.</h3><blockquote><p>So to understand why a fine-tuning API is worth $12 billion before it has a business model to show, I guess we have to understand who built it.<strong> - the AI celebrities? </strong></p><p>So the first thing to understand: </p></blockquote><h2><strong>This wasn&#8217;t VC capital making a bet on a product. </strong></h2><p>It was VC capital making a bet on a team and a narrative: the idea that these specific people, at this specific moment, could launch the<strong> next foundational AI company</strong>. Yes? </p><h2>And Why Do VCs Care So Much? Why $12 Billion for a Fine-Tuning API?</h2><blockquote><p>This is the question that started my whole investigation. When I looked at the funding timeline, it looked insane. 
and the answers I found paint a picture of a market that&#8217;s either brilliantly optimized or fundamentally broken, depending on, well, who you actually ask. I am asking AI, mostly. So far, yes. </p></blockquote><p><strong>The Capital Trajectory:</strong></p><ul><li><p>June 2025: Raised $2 billion at a $10 billion valuation (the largest seed round ever in that category; <strong>AI didn&#8217;t care to highlight a woman here, btw</strong>)</p></li><li><p>July 2025: Valuation hit $12 billion</p></li><li><p><strong>November 2025: In talks to raise at a ~$50 billion valuation</strong></p></li></ul><p><strong>For Reference:</strong></p><ul><li><p>Facebook: 8 years to a $50B valuation</p></li><li><p>Stripe: 6 years to a $50B+ valuation</p></li><li><p><strong>Thinking Machines: 11 months to $50B+ &#8220;in talks&#8221;; the company was founded in January 2025. </strong></p></li></ul><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Gh_5!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e81d1b2-4aa7-4234-aee1-d17a4a3a718c_3668x2048.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Gh_5!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e81d1b2-4aa7-4234-aee1-d17a4a3a718c_3668x2048.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Gh_5!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e81d1b2-4aa7-4234-aee1-d17a4a3a718c_3668x2048.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Gh_5!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e81d1b2-4aa7-4234-aee1-d17a4a3a718c_3668x2048.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!Gh_5!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e81d1b2-4aa7-4234-aee1-d17a4a3a718c_3668x2048.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Gh_5!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e81d1b2-4aa7-4234-aee1-d17a4a3a718c_3668x2048.jpeg" width="1456" height="813" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2e81d1b2-4aa7-4234-aee1-d17a4a3a718c_3668x2048.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:813,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:944035,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.intelligentfounder.ai/i/181597412?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e81d1b2-4aa7-4234-aee1-d17a4a3a718c_3668x2048.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Gh_5!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e81d1b2-4aa7-4234-aee1-d17a4a3a718c_3668x2048.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Gh_5!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e81d1b2-4aa7-4234-aee1-d17a4a3a718c_3668x2048.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!Gh_5!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e81d1b2-4aa7-4234-aee1-d17a4a3a718c_3668x2048.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Gh_5!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2e81d1b2-4aa7-4234-aee1-d17a4a3a718c_3668x2048.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>With one product shipped. 
That had been live for two months.</p><h2>So VCs&#8217; Real Bet - Even if Tinker flops, this team is valuable enough to acquire!</h2><ol><li><p><strong>Team Pedigree = Optionality</strong></p><ul><li><p>Mira Murati: Former OpenAI CTO, led ChatGPT and RLHF</p></li><li><p>Six other OpenAI veterans on the founding team</p></li><li><p>Signal to investors: &#8220;This group could launch the next frontier AI lab&#8221;</p></li><li><p><strong>Narrative: &#8220;Even if Tinker flops, this team is valuable enough to acquire&#8221;</strong></p></li></ul></li><li><p><strong>Strategic Positioning</strong></p><ul><li><p><strong>Positioned as the control layer for open-weight model training</strong></p></li><li><p>If successful, becomes a <strong>chokepoint in the AI stack</strong> (like Snowflake for data)</p></li><li><p><strong>Market framing: &#8220;Whoever owns model customization infrastructure owns the future&#8221;</strong></p></li></ul></li><li><p><strong>Real Market Tailwinds</strong></p><ul><li><p>Fine-tuning and orchestration services forecast at $10&#8211;15B by 2033</p></li><li><p>~30% compound annual growth expected</p></li><li><p><strong>Every AI deployment needs model customization,</strong> alignment, and experimentation</p></li></ul></li><li><p><strong>Capital Concentration &amp; FOMO</strong></p><ul><li><p>40% of all VC exit value in 2025 comes from AI</p></li><li><p>Most of that concentrates in a handful of &#8220;obvious&#8221; bets</p></li><li><p>Large funds need massive checks; they can&#8217;t find enough mega-deals</p></li><li><p>Fear of missing out is stronger than fear of overpaying</p></li></ul></li><li><p><strong>Strategic Investor Incentives</strong></p><ul><li><p>Nvidia (strategic investor) benefits from selling GPUs to Thinking Machines</p></li><li><p>Big cloud providers benefit from infra that locks in open-weight workflows</p></li><li><p>These aren&#8217;t purely financial bets; they&#8217;re ecosystem plays</p></li></ul></li></ol><p><strong>The 
Bottom Line:</strong></p><ul><li><p><strong>Actual traction: Minimal </strong>(early adopter researchers, a few enterprise pilots)</p></li><li><p>What&#8217;s being priced: <strong>The optionality that this team becomes indispensable</strong></p></li><li><p>The gap between these: <strong>Enormous</strong></p></li></ul><p></p><p>Now that we&#8217;ve answered the $12B question, I wanted to understand - </p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.intelligentfounder.ai/p/turns-out-throwing-money-at-ai-doesnt&quot;,&quot;text&quot;:&quot;AI market maturity Aug 2025&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.intelligentfounder.ai/p/turns-out-throwing-money-at-ai-doesnt"><span>AI market maturity Aug 2025</span></a></p><p></p><h2>What&#8217;s the Business Plan, Really?</h2><p>I looked at what Murati and the team publicly state about their strategy.</p><h3><strong>The Official Narrative:</strong></h3><ul><li><p><strong>&#8220;Democratize access to advanced model customization&#8221;</strong></p></li><li><p><strong>&#8220;Enable researchers and developers to experiment without massive GPU clusters&#8221;</strong></p></li><li><p><strong>Focus on &#8220;meta-learning&#8221; and &#8220;superhuman learners&#8221; rather than just bigger base models</strong></p></li><li><p><strong>Implicit bet: &#8220;The value isn&#8217;t in the biggest model; it&#8217;s in who controls how models are customized&#8221;</strong></p></li></ul><p>And this is interesting because<strong> it&#8217;s implicitly </strong><em><strong>against</strong></em><strong> the frontier model thesis </strong>that OpenAI and Anthropic are pursuing. 
Murati is betting that the <strong>next wave of AI value comes from infrastructure,</strong> not from who builds the best base model.</p><h2>Why Fine-Tuning as the First Product?</h2><p>If you&#8217;re going to build a <strong>multi-billion-dollar platform, why start with fine-tuning? </strong>It seems narrow.</p><blockquote><p><strong>The Strategic Logic:</strong></p></blockquote><ul><li><p>It&#8217;s the <strong>highest-friction point</strong> in open-weight model development (as noted at the start)</p></li><li><p><strong>Every lab, every enterprise, every developer</strong> building on Llama/Qwen/DeepSeek hits this problem</p></li><li><p>Fine-tuning infrastructure is genuinely hard: GPU orchestration, distributed training, fault tolerance</p></li><li><p><strong>If you own the access point where people customize models, you control a huge slice of downstream value</strong></p></li></ul><blockquote><p><strong>The Bet Underlying the Valuation:</strong></p></blockquote><ul><li><p>Today: Tinker is a fine-tuning API</p></li><li><p>Tomorrow: Expands to hosting, routing, monitoring, alignment tooling</p></li><li><p>Endgame: <strong>Becomes the control plane</strong> for open-weight AI development</p></li><li><p>The $12B+ price tag assumes that transition from<strong> &#8220;wedge&#8221; to &#8220;platform&#8221;</strong></p></li></ul><blockquote><p><strong>The Problem with This Logic:</strong></p></blockquote><ul><li><p>Big clouds (AWS, Google Cloud, Azure) are already commoditizing fine-tuning</p></li><li><p><strong>The longer Thinking Machines stays &#8220;just&#8221; a fine-tuning API, the more vulnerable it is to this commoditization</strong></p></li><li><p>They have maybe 2&#8211;3 years to expand before this window closes</p></li></ul><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.intelligentfounder.ai/?utm_source=substack&amp;utm_medium=email&amp;utm_content=share&amp;action=share&quot;,&quot;text&quot;:&quot;Share The Intelligent 
Founder&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.intelligentfounder.ai/?utm_source=substack&amp;utm_medium=email&amp;utm_content=share&amp;action=share"><span>Share The Intelligent Founder</span></a></p><p></p><h2>And About Controlling the AI Infrastructure - </h2><p>Below is the <em>full stack</em>, from hardware up through deployment. Tinker is just a slice of it. </p><p><strong>The Full AI Infrastructure Stack:</strong></p><ul><li><p>Compute layer: GPUs, TPUs, specialized chips</p></li><li><p>Storage and networking: Distributed systems, high-speed interconnects</p></li><li><p>Orchestration: How jobs get scheduled and run</p></li><li><p>Data pipelines and management</p></li><li><p>MLOps and observability</p></li><li><p>Model training and fine-tuning</p></li><li><p>Model serving and inference</p></li><li><p>Monitoring and alerting</p></li></ul><p>So when VCs call Tinker &#8220;AI infrastructure,&#8221; what do they actually mean?</p><h2><strong>The Honest Assessment:</strong></h2><ul><li><p><strong>Is it AI infrastructure? Technically, yes (it&#8217;s part of the stack)</strong></p></li><li><p><strong>Is it </strong><em><strong>the</strong></em><strong> AI infrastructure? No, it&#8217;s a slice</strong></p></li><li><p><strong>Is calling it &#8220;infrastructure&#8221; generous? 
Absolutely</strong></p></li></ul><p><strong>Why It Still Gets That Label:</strong></p><ul><li><p>If Thinking Machines expands to hosting, inference, orchestration, and monitoring, it could become broader</p></li><li><p>Investors are pricing the <em>optionality</em> of that expansion</p></li><li><p><strong>It&#8217;s a bet on a future position, not the current product</strong></p></li></ul><p><strong>Comparable Analogy:</strong></p><ul><li><p><strong>Kubernetes started as &#8220;just&#8221; container orchestration</strong></p></li><li><p>It expanded into a <strong>foundational platform </strong>for distributed systems</p></li><li><p>Today, knowing Kubernetes is a table-stakes requirement for infra engineers</p></li></ul><h3>So the question now: Is Tinker the Kubernetes of open-weight model training, or is it a more niche tool?</h3><p></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.intelligentfounder.ai/p/40-of-these-billion-dollar-ai-companies&quot;,&quot;text&quot;:&quot;who's profiting the most from AI agents?&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.intelligentfounder.ai/p/40-of-these-billion-dollar-ai-companies"><span>who's profiting the most from AI agents?</span></a></p><p></p><h2>A Quick Look at the Money Trail and Who Really Wins</h2><p><strong>Funding Breakdown (approximately $2B):</strong></p><ul><li><p>a16z (traditional VC): ~$500M</p></li><li><p>Nvidia (strategic hardware): ~$600M</p></li><li><p>Accel (traditional VC): ~$400M</p></li><li><p>Other corporates (ServiceNow, Cisco, AMD): ~$350M</p></li><li><p>Jane Street (prop trading firm): ~$150M</p></li></ul><p><strong>I already wrote about Nvidia&#8217;s engineered demand on <a 
href="https://www.linkedin.com/posts/pariharpoonam_in-1999-lucent-technologies-had-a-problem-activity-7405332177342746624-ri-8?utm_source=share&amp;utm_medium=member_desktop&amp;rcm=ACoAABOwyVABqGh1W58dox6xqbl3c10tDVwz7x4">LinkedIn</a> from an ecosystem perspective, but I didn&#8217;t pay much attention to its Thinking Machines investment. It&#8217;s not far from the original game, though; just to reiterate - </strong></p><ul><li><p>Nvidia isn&#8217;t investing to get 5-10x returns on equity</p></li><li><p>Nvidia is investing because Thinking Machines will buy Nvidia GPUs</p></li><li><p>Every dollar of Nvidia investment that flows back as GPU purchases is revenue for Nvidia</p></li><li><p><strong>This is called &#8220;engineered demand&#8221; and it&#8217;s entirely legal and rational</strong></p></li></ul><p><strong>The Flywheel:</strong></p><ul><li><p>Nvidia invests in startups &#8594; startups buy Nvidia GPUs &#8594; Nvidia revenue grows &#8594; Nvidia stock up &#8594; Nvidia invests in more startups &#8594; repeat</p></li><li><p>It&#8217;s not a conspiracy; it&#8217;s capital and incentives aligning</p></li></ul><p><strong>The Implication:</strong></p><ul><li><p>The $12B valuation is partly driven by strategic investors locking in customer/ecosystem relationships</p></li></ul><h3>But all this makes the valuation even less connected to &#8220;fair financial pricing&#8221;</h3><p></p><p>One of the key questions that followed in this research was: <strong>is Thinking Machines unique, or is this pattern everywhere?</strong></p><h2><strong>The Answer:</strong> This pattern is everywhere in 2025&#8217;s AI funding.</h2><p>You&#8217;re Not Alone&#8212;And That&#8217;s the Problem. 
</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!WJbF!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa591bfec-e9f8-4811-b491-00437f04aa47_3668x2048.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!WJbF!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa591bfec-e9f8-4811-b491-00437f04aa47_3668x2048.jpeg 424w, https://substackcdn.com/image/fetch/$s_!WJbF!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa591bfec-e9f8-4811-b491-00437f04aa47_3668x2048.jpeg 848w, https://substackcdn.com/image/fetch/$s_!WJbF!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa591bfec-e9f8-4811-b491-00437f04aa47_3668x2048.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!WJbF!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa591bfec-e9f8-4811-b491-00437f04aa47_3668x2048.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!WJbF!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa591bfec-e9f8-4811-b491-00437f04aa47_3668x2048.jpeg" width="1456" height="813" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a591bfec-e9f8-4811-b491-00437f04aa47_3668x2048.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:813,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1288092,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.intelligentfounder.ai/i/181597412?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa591bfec-e9f8-4811-b491-00437f04aa47_3668x2048.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!WJbF!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa591bfec-e9f8-4811-b491-00437f04aa47_3668x2048.jpeg 424w, https://substackcdn.com/image/fetch/$s_!WJbF!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa591bfec-e9f8-4811-b491-00437f04aa47_3668x2048.jpeg 848w, https://substackcdn.com/image/fetch/$s_!WJbF!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa591bfec-e9f8-4811-b491-00437f04aa47_3668x2048.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!WJbF!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa591bfec-e9f8-4811-b491-00437f04aa47_3668x2048.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" 
width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><strong>The Pattern: ( image above) </strong></p><ul><li><p>All valued in the billions despite being <strong>early-stage or single-product</strong></p></li><li><p>All have<strong> founder pedigree (ex-OpenAI, ex-FAANG, high-profile billionaires)</strong></p></li><li><p>All betting on &#8220;<strong>big future narrative</strong>&#8221; over current traction</p></li><li><p>All getting <strong>special treatment vs. 
&#8220;normal&#8221; startups with real revenue</strong></p></li></ul><h2>The Two-Tier System</h2><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.intelligentfounder.ai/p/12-billion-for-a-finetuning-api/comments&quot;,&quot;text&quot;:&quot;Leave a comment&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.intelligentfounder.ai/p/12-billion-for-a-finetuning-api/comments"><span>Leave a comment</span></a></p><p></p><p>This is where it gets really unfair.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!f-ln!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d709233-2f97-4edd-9172-05585ec0c910_3668x2048.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!f-ln!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d709233-2f97-4edd-9172-05585ec0c910_3668x2048.jpeg 424w, https://substackcdn.com/image/fetch/$s_!f-ln!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d709233-2f97-4edd-9172-05585ec0c910_3668x2048.jpeg 848w, https://substackcdn.com/image/fetch/$s_!f-ln!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d709233-2f97-4edd-9172-05585ec0c910_3668x2048.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!f-ln!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d709233-2f97-4edd-9172-05585ec0c910_3668x2048.jpeg 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!f-ln!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d709233-2f97-4edd-9172-05585ec0c910_3668x2048.jpeg" width="1456" height="813" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/5d709233-2f97-4edd-9172-05585ec0c910_3668x2048.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:813,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1181257,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.intelligentfounder.ai/i/181597412?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d709233-2f97-4edd-9172-05585ec0c910_3668x2048.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!f-ln!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d709233-2f97-4edd-9172-05585ec0c910_3668x2048.jpeg 424w, https://substackcdn.com/image/fetch/$s_!f-ln!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d709233-2f97-4edd-9172-05585ec0c910_3668x2048.jpeg 848w, https://substackcdn.com/image/fetch/$s_!f-ln!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d709233-2f97-4edd-9172-05585ec0c910_3668x2048.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!f-ln!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5d709233-2f97-4edd-9172-05585ec0c910_3668x2048.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><strong>Tier 1: AI Lab / Infra (Celebrity Founders)</strong></p><ul><li><p>Founders: ex-OpenAI, ex-Anthropic, high-profile billionaires</p></li><li><p>Stage: Pre-product or single product</p></li><li><p>Valuation multiples: Basically &#8220;<strong>what can we get away with asking?&#8221;</strong></p></li><li><p>VC behavior: <strong>&#8220;How much do we need 
to commit to get in?&quot;</strong></p></li><li><p>Reality: Narrative and FOMO drive the round</p></li></ul><p><strong>Tier 2: Everything Else (Normal Founders)</strong></p><ul><li><p>Founders: Competent people without OpenAI pedigree</p></li><li><p>Stage: Revenue, product-market fit</p></li><li><p>Valuation multiples: 10&#8211;50x revenue (traditional startup metrics)</p></li><li><p>VC behavior: &#8220;Prove this isn&#8217;t a fad and show unit economics&#8221;</p></li><li><p><strong>Reality: You need to actually execute</strong></p></li></ul><p><strong>The Gap:</strong></p><ul><li><p>Mira Murati gets $12B on a team and a narrative</p></li><li><p><strong>A normal team with the same product would struggle to raise $100M</strong></p></li></ul><h3><strong>This is the venture capital market in 2025: radically unfair based on founder brand</strong></h3><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.intelligentfounder.ai/p/ais-gold-rush-20-unpacking-the-psychology&quot;,&quot;text&quot;:&quot;Unpacking VC psychology behind $B~  bets&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.intelligentfounder.ai/p/ais-gold-rush-20-unpacking-the-psychology"><span>Unpacking VC psychology behind $B~  bets</span></a></p><p></p><p>To summarize all of this - </p><h2>Why This Matters Beyond Tinker</h2><ul><li><p>Capital concentration: <strong>Most VC dollars flowing to ~20 AI &#8220;obvious&#8221; bets </strong></p><ul><li><p>Capital is concentrating in fewer hands</p></li><li><p>The biggest mega-funds control most of the money</p></li><li><p>They&#8217;re making bigger bets on fewer companies</p></li><li><p>Within AI, this is extreme: 40% of 2025 VC exit value from AI, but flowing to a tiny number of companies</p></li></ul></li><li><p>Two-tier system: <strong>Celebrity founders </strong>get priced on optionality; everyone else <strong>needs real 
traction</strong></p><ul><li><p><strong>The Numbers:</strong></p><ul><li><p>Roughly 60% of all VC capital flowing to AI now goes to the top 20 AI labs/startups</p></li><li><p>The next 30% goes to the next 100 AI companies</p></li><li><p>The final 10% goes to all other startups</p></li></ul><p><strong>What This Creates:</strong></p><ul><li><p>If you&#8217;re Mira Murati: You can raise billions on reputation</p></li><li><p>If you&#8217;re not: You&#8217;re competing for scraps</p></li></ul></li></ul></li><li><p><strong>Systemic incentive misalignment:</strong> Nvidia is the most obvious example, but it&#8217;s not unique. Also - </p><p><strong>Nvidia&#8217;s Plan Is Smart, Not Nefarious:</strong></p><ul><li><p>It&#8217;s legal and transparent</p></li><li><p>It&#8217;s actually quite rational</p></li><li><p>But it means the &#8220;valuation&#8221; partly reflects supply-chain integration, not just financial returns</p></li></ul><p><strong>The Implication:</strong><br>The $12B valuation is inflated by strategic value, not just equity value.</p></li></ul><p></p><h2>So Where&#8217;s This All Heading?</h2><h2>First, for Tinker and Mira Murati 
- The Most Likely Scenarios</h2><p>Based on everything I&#8217;ve learned, here are the probability-weighted outcomes:</p><p><strong>Scenario 1: Moderate Success (50% probability)</strong></p><ul><li><p><strong>Tinker becomes a real infra business with solid customer base</strong></p></li><li><p>Revenue grows to $500M&#8211;$1B annually by 2030</p></li><li><p>Never justifies $50B valuation on fundamentals</p></li><li><p>Investors mark it as a &#8220;win&#8221; anyway because the team proved the thesis</p></li><li><p>Murati remains influential in AI</p></li></ul><p><strong>Scenario 2: Strategic Acquisition (35% probability)</strong></p><ul><li><p>In 3&#8211;4 years, market cools, valuations normalize</p></li><li><p><strong>Large tech company or hyperscaler acquires Thinking Machines</strong></p></li><li><p>Reason: Get the team + plug Tinker into their infra</p></li><li><p>Price: $5&#8211;20B (depending on market conditions)</p></li><li><p>Outcome: Murati gets high-level AI role at acquirer</p></li></ul><p><strong>Scenario 3: Overfunded Drift (15% probability)</strong></p><ul><li><p>Tinker never achieves product-market fit at scale</p></li><li><p>Company stays well-capitalized but doesn&#8217;t grow explosively</p></li><li><p>Big clouds commoditize fine-tuning before Thinking Machines expands</p></li><li><p>Slow fade or acqui-hire in 5&#8211;7 years</p></li><li><p><strong>Murati&#8217;s brand survives; she lands a top role elsewhere</strong></p></li></ul><p><strong>My Estimate:</strong><br>Scenario 1 or 2 is most likely. Scenario 3 is possible if execution falters. 
A fourth scenario, explosive success, would require frontier-lab-level execution, which is rare.</p><h2>What This All Reveals About 2025</h2><p>My tiny weekend investigation of Thinking Machines turned into a much larger story about <strong>how capital, narrative, and incentives are shaping AI right now</strong>.</p><p><strong>The Core Truth:</strong></p><ul><li><p>Thinking Machines is getting $12 billion <strong>partly because</strong> Mira Murati is genuinely talented and has a proven track record</p></li><li><p>But it&#8217;s also getting that money <strong>because of hype</strong>, FOMO, strategic investor incentives, and capital concentration</p></li><li><p>The product (fine-tuning API) is <strong>real but narrow</strong></p></li><li><p><strong>The valuation is disconnected from fundamentals</strong></p></li><li><p><strong>And yet this isn&#8217;t fraud,</strong> it&#8217;s the logical output of the current venture capital incentive structure</p></li></ul><p><strong>What This Means:</strong></p><ul><li><p><strong>For builders:</strong> If you&#8217;re not in the &#8220;obvious&#8221; AI category, you&#8217;re playing a different game with much harsher metrics</p></li><li><p><strong>For investors:</strong> You&#8217;re probably overpaying if you&#8217;re investing in 2025 AI at these valuations</p></li><li><p><strong>For the market: </strong>Capital misallocation on this scale usually ends badly</p></li></ul><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!hMqX!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13e01c5c-f4ad-4f34-948e-88699713dbb0_3668x2048.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://substackcdn.com/image/fetch/$s_!hMqX!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13e01c5c-f4ad-4f34-948e-88699713dbb0_3668x2048.jpeg 424w, https://substackcdn.com/image/fetch/$s_!hMqX!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13e01c5c-f4ad-4f34-948e-88699713dbb0_3668x2048.jpeg 848w, https://substackcdn.com/image/fetch/$s_!hMqX!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13e01c5c-f4ad-4f34-948e-88699713dbb0_3668x2048.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!hMqX!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13e01c5c-f4ad-4f34-948e-88699713dbb0_3668x2048.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!hMqX!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13e01c5c-f4ad-4f34-948e-88699713dbb0_3668x2048.jpeg" width="1456" height="813" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/13e01c5c-f4ad-4f34-948e-88699713dbb0_3668x2048.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:813,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:956554,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.intelligentfounder.ai/i/181597412?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13e01c5c-f4ad-4f34-948e-88699713dbb0_3668x2048.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" 
class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!hMqX!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13e01c5c-f4ad-4f34-948e-88699713dbb0_3668x2048.jpeg 424w, https://substackcdn.com/image/fetch/$s_!hMqX!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13e01c5c-f4ad-4f34-948e-88699713dbb0_3668x2048.jpeg 848w, https://substackcdn.com/image/fetch/$s_!hMqX!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13e01c5c-f4ad-4f34-948e-88699713dbb0_3668x2048.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!hMqX!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13e01c5c-f4ad-4f34-948e-88699713dbb0_3668x2048.jpeg 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p><strong>The Uncomfortable Truth:</strong><br>Mira Murati probably deserves some level of premium valuation based on her track record. <strong>But $12B for a fine-tuning API that&#8217;s been live for two months? That&#8217;s not a premium.</strong> That&#8217;s the financial equivalent of FOMO dressed up as venture capital.</p><h2>Did all of the above really solve the $12 Billion Question?</h2><p>I was curious why no one was talking about Tinker all that much. Will that change now that it&#8217;s generally available? Perhaps. But the real story is this:</p><p><strong>The silence around Tinker isn&#8217;t a bug - it&#8217;s a feature. </strong>Tinker is infrastructure. It doesn&#8217;t change how people use AI. 
It&#8217;s valuable for researchers, <strong>but it&#8217;s not a cultural moment.</strong></p><ol><li><p><strong>Tinker is real</strong> &#8211; A useful product that solves a genuine problem for researchers and ML teams</p></li><li><p><strong>The valuation is not justified by current traction</strong> &#8211; It&#8217;s priced on optionality, narrative, and founder brand</p></li><li><p><strong>This isn&#8217;t unique</strong> &#8211; Anthropic, xAI, Mistral, and Cursor all follow similar patterns</p></li><li><p><strong>Strategic incentives matter</strong> &#8211; Nvidia and others aren&#8217;t purely financial investors</p></li><li><p><strong>Capital is extremely concentrated</strong> &#8211; A tiny number of AI bets soak up most of the money</p></li><li><p><strong>It&#8217;s probably not a scam</strong> &#8211; But it&#8217;s also not transparent investing</p></li></ol><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.intelligentfounder.ai/p/12-billion-for-a-finetuning-api?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.intelligentfounder.ai/p/12-billion-for-a-finetuning-api?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><p>What makes this a real story, however, is <strong>how we got to a place where a fine-tuning API can command a $12 billion valuation</strong>. That story reveals something about the state of venture capital, the concentration of AI funding, and the narratives we tell ourselves about which companies matter.</p><blockquote><p>Thinking Machines may become a solid infra company, or it may be acquired by AWS or Google in 3&#8211;4 years. Either way, 2025 will be remembered as the year capital went truly crazy for AI. Honestly, the real question isn&#8217;t whether Thinking Machines will succeed or fail. 
It&#8217;s whether an entire venture capital ecosystem that can value a fine-tuning API at $12 billion is working the way it&#8217;s supposed to.</p><p>I think the answer is no.</p></blockquote><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.intelligentfounder.ai/p/12-billion-for-a-finetuning-api/comments&quot;,&quot;text&quot;:&quot;Leave a comment&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.intelligentfounder.ai/p/12-billion-for-a-finetuning-api/comments"><span>Leave a comment</span></a></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.intelligentfounder.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.intelligentfounder.ai/subscribe?"><span>Subscribe now</span></a></p><p></p>]]></content:encoded></item></channel></rss>