Hug all the faces
Hugging Face announced a mega Series D funding round of $235 million this week. The investment from Salesforce, Google, Amazon, Nvidia, Intel, AMD, Qualcomm, and others values the company at $4.5 billion.
Between the lines:
The company, a leader in the MLOps space, plans to double down on other areas like AI research and enterprise support.
But it's worth noting that Hugging Face's annualized revenue is reported to be under $45 million - less than 1/100th of its latest valuation.
Undoubtedly, a large chunk of the money will go towards more Nvidia GPUs, as the chipmaker plans to triple its production of H100s in 2024.
I don't cover much funding news, because raising money is not the same as doing significant research or building an outstanding product. But to provide some context, here are a few AI fundraises from August alone:
Anthropic raised $100 million with a plan to build a telco-focused LLM.
Modular, a platform for developing AI systems, raised $100 million.
Resilience, a cyber insurance startup, raised $100 million.
Tenstorrent, an AI chipmaker, raised $100 million.
Viome, which sells supplements based on AI-driven microbiome analysis, raised $86.5 million.
Elemental Cognition, makers of enterprise chatbot LLMs, raised $60 million.
Simon Data, which uses ML for analytics and marketing automation, raised $54 million.
Weights & Biases, a (different) AI development platform, raised $50 million.
Tractian, which uses AI and sensors to spot mechanical failures, raised $45 million.
And CoreWeave, which previously raised over $500 million to offer GPUs for AI training, secured $2.3 billion in debt financing.
Code Llama
Not content to go a whole month between foundation model releases, Meta rolled out Code Llama. The model can generate code and help debug human-written software.
Why it matters:
Code Llama is based on Llama 2, and is similarly licensed for research and commercial use.
Unlike other recently released coding models, Code Llama seems to be competitive with GitHub's Copilot - meaning we might see more sophisticated code-generating programs and agents soon.
But maybe the most interesting detail is from the research paper - the best model version, "Unnatural Code Llama," wasn't released to the public. It appears to have been trained on another LLM's output, which may violate that model's terms of service.
Elsewhere in the FAANG free-for-all:
Meta also released SeamlessM4T, a model that can translate and transcribe nearly 100 languages across text and speech.
YouTube and UMG are exploring how to use music in AI tools and how to pay artists for AI-generated music. Meanwhile, The Verge looks deeper at Google's various copyright dilemmas.
Microsoft adds another AI ally in Databricks as it plans a new business service.
And while not technically Big Tech, OpenAI is bringing fine-tuning to ChatGPT.
Peak AI
At this moment, AI is overhyped. It's clear to me that expectations for AI far outstrip its current capabilities. But we're somewhere on the technology sigmoid curve with AI - and we don't know whether recent breakthroughs have put us near the end of that curve, or just the beginning.
Yes, but:
Critics argue that AI demand is already shrinking, and that most use cases revolve around spamming and scamming.
Others believe we're at the start of a new technology wave, and offer predictions on what the future holds (though even some optimists admit we're reaching the limits of current AI).
And if you want a refresher on how exactly we got here, Ars Technica has a great deep dive on exactly that.
The bottom line: For what it's worth, I do think LLMs are a transformational technology - one that we'll be grappling with for years to come. The promise of ChatGPT is still far from realized, with several critical hurdles to overcome. But I'm not ready to dismiss the long-term potential of AI just yet - though I'm not ready to pray at the altar of ChatGPT, either.
Things happen
Stanford and UCSF turn thoughts into AI speech. Twilio's AI-powered tools. Inside the internet's biggest AI porn marketplace. 40% of the S&P 500 mentioned AI in earnings calls (but not official filings). Nvidia's older chips, hobbled for the Chinese market, still see huge demand. Interviews with Microsoft's CEO and CTO on AI. The UK in talks to buy £100 million in GPUs for national AI resources. Books3, a widely used LLM dataset of 170K works, including copyrighted novels. Why Sergey Brin left retirement to work on AI. How Creative Commons licenses work with generative AI. Science, Nature, and other journals grapple with AI-written papers. You (probably) don't need to fine-tune an LLM. ElevenLabs can now fake your voice in 30 languages. DeepMind's AI art project.
Charlie, I understand why you don't cover funding as much, nor do I think you ought to, but of course it's a tremendous force. The market is very likely going to be a huge participant in deciding what the next successful venture will be, and where the brightest minds will go.
Glad to see some diversification in where those funds go!
For sure! I think part of why I shy away from it is my own background/bias - having been in Silicon Valley most of my career, I'm a little jaded when it comes to flashy fundraises. But cutting-edge AI is so capital intensive that the money actually does matter.