AI Roundup 026: AudioCraft
August 4, 2023
Another week, another state-of-the-art open-source release from Meta. AudioCraft is an AI tool that lets users create music and sound effects from text prompts.
How it works:
AudioCraft bundles three models: MusicGen (music from text), AudioGen (environmental sounds from text), and EnCodec, a neural codec that compresses audio into tokens the generative models work with.
It's the most advanced audio-generating AI currently available - the "Llama of audio," if you will.
Meta hopes AudioCraft will become a foundation model for audio, but it might only be good for elevator music.
Elsewhere in the FAANG free-for-all:
And in an earnings call, Amazon CEO Andy Jassy said "every single" team at the company is working on generative AI.
Global Processing Unavailability
As we've discussed before, the GPU shortage is one of the biggest bottlenecks in AI. GPU Utils has an extremely thorough deep dive on GPU supply and demand, the Nvidia-TSMC relationship, and how to actually get your hands on an H100.
GPUs are, at this point, considerably harder to get than drugs.
– Elon Musk
The big picture:
Multiple tech CEOs, from OpenAI to Poe to X, have said they can't launch products fast enough because of the GPU shortage.
The Biden Administration reportedly plans to limit US investments in Chinese chips (and AI) as soon as this month.
CoreWeave, which offers Nvidia GPUs in the cloud, has raised $2.3 billion in debt to offer more compute - collateralized by its existing Nvidia chips.
And some VCs are offering access to their GPU clusters as a perk for startups they invest in.
Here comes the fuzz
Over the past several months, we've seen louder and louder rumblings from Washington about AI. Vox has a great overview of what future action might look like, who the stakeholders are, and what the main arguments are about.
Why it matters:
While the White House has taken some action, Congress has yet to create major laws or agencies - though it has held some flashy hearings with tech CEOs.
The federal government is pretty serious about regulation, and there seems to be at least some bipartisan appetite for it. It's a matter of when, not if, AI gets regulated.
The Vox piece breaks the ideas into four categories: rules, institutions, money, and people. The first two - new laws for building and releasing AI, and new agencies to enforce those laws - are likely the most impactful.
Elsewhere in AI anxiety:
The New York Times examines what happens when a hallucinating chatbot lies about you.
WormGPT, FraudGPT, and DarkBERT - LLMs trained for phishing and malware. This isn’t the last we’ll see of harmful generative AI models.
AI for ADHD.
ChatGPT gets quality-of-life improvements.
Kickstarters must now disclose if they use or plan to build AI tools.
200K stolen OpenAI logins for sale on the Dark Web.
Apple pulls generative AI apps from its China App Store.
The economic case for generative AI.
White Castle is bringing more AI to its drive-thrus.
AI-supported mammograms detect 20% more breast cancer.
A geospatial foundation model for climate science AIs.
Thanks for reading Artificial Ignorance! Subscribe for free to receive new posts and support my work.