AI Natives #14 - entertainment chatbots, AI chips and Dubbing
ElevenLabs and Captions go after AI dubbing, tech giants want to reduce their dependency on Nvidia, and AI coding assistants are having a hard time managing their finances
Hey there, #TheAINatives! 🤖
Happy to have you here joining us with the fourteenth issue of The AI Natives! 🥳
This was an interesting week: a lot of buzz from the start-ups, and plenty of hardware news around AI optimization. Nothing crazy that would keep me on my toes danger-wise, apart from, perhaps, the influencer chatbots that Meta and character.ai want to compete on.
To counterbalance, there is also good news: the people responsible for the new frontier AI tech are asking about the actual utility of their developments. Markets seem to be asking similar questions: what economic output could we actually see from AI?
Truth be told, change takes time.
It will be a long time before we see companies' current, strong revenue streams replaced or influenced by AI capabilities. There is indeed a lot of hype right now about AI coming after everyone, their jobs, their lives (check Nitty-Gritty Stuff!). However, today's models are far from optimized. To run at their best quality, they require a lot of computing power and capital. Some nimble, less generic models can survive and make waves in niche applications, but we are yet to see massive deployments in, say, healthcare (apart from AlphaFold).
This is far from surprising, but the hype around AI's impact makes everyone anxious and eager to see proof, especially folks on Wall Street keen to calculate how much quarterly results have improved thanks to AI. That won't happen overnight, though.
By the way, you will now see a 🚩, 💚 or 🟠 next to each news title. This expresses my take on whether I see the news as positive or dangerous. We will delve deeper into the dangers of AI going forward, as not enough is being discussed in that space.
Let The #14 AI Natives issue begin! ⏬
CONVO OVER COFFEE ☕
Big topics to discuss with friends and colleagues
AI Dubbing is a space of new startups 🚩
Not only has ElevenLabs unveiled AI Dubbing, translating video and audio content into 20 languages, but Captions has also introduced Lipdub, an AI dubbing app oriented towards Gen Z.
My take: This is an interesting space, where many new startups have clearly made waves. Spotify, with its moves in recent weeks, heralded the way forward: adapting video and audio content into different languages. Now we see these models popping up in new spaces and becoming easily available to end users. The danger comes in the form of manipulating a message addressed to one population when retargeting it at another. Politicians could be the first victims. But the effects for the movie and podcast space? Groundbreaking.
Are developer AI tools entering a fight for survival phase? 🟠
We all know about the magical properties of ChatGPT dropping code left and right. Although seemingly miraculous, ChatGPT is not the best coding buddy you can hire. Companies specialized in this space are enticing users with AI assistants dedicated solely to helping write good code. The issue? How to make money. GitHub, the king of code repositories, released Copilot a while back. It made waves, with 1.5 million users taking advantage of the tool. But a new report from the Wall Street Journal suggests that running Copilot loses GitHub money rather than bringing in more revenue, which may explain their recent blog posts claiming how good Copilot is.
My take: Replit, a developer tooling vendor, deployed its own AI code companion to all users, which only confirms this as an ongoing trend. The issue remains the cost of running these models: while GitHub has the backing of Microsoft, Replit partners with Google for infrastructure access. Why are these organizations getting so closely linked with cloud giants? Because deploying models like code assistants at scale currently requires ridiculously high computing cost, and only the giants can handle it. Go take a look at the job market: plenty of roles at Amazon are data-center related, and you will see many similar positions at Microsoft and Google. Cloud giants know that the more tools like code assistants appear, the more their infrastructure will be needed. The question we should all ask is: will this save the innovation wave, or dwarf it? Ultimately, while companies can power on and develop less computationally hungry models, they may find themselves too close to the giants, like Icarus flying too close to the sun.
‘Funny’ chatbots time 🚩🚩
After Meta's recent announcement of avatars that let people chat with their favorite celebrities, character.ai jumps on the same bandwagon, allowing group chats with avatars of famous and imaginary characters.
My take: while both seemingly compete in entertainment chat experiences, the deeper danger here comes from the threat to the privacy of public figures. These avatars not only frequently fail to acknowledge they are just avatars, but can also drift into unexpected territory, discussing sensitive topics without proper care and potentially damaging stars' and influencers' reputations. I won't even start on the possibility of fanatics popping up and claiming they can assault a movie star in real life because they got 'intimate on the chatbot'. I hope you see the danger here as I do: some people may mix the artificial world with reality to a dangerous extent.
AI Chips for everyone! 🟠
AMD made waves by acquiring an open-source AI software company, Nod.ai, to go deeper into adapting software for HPC (high-performance computing). But that is just the tip of the iceberg. Microsoft is reportedly working on its own AI chip. OpenAI is considering one as well. DeepMind has been using AI for a while now to optimize chip designs for the best possible model-training performance. Now, why is this happening?
My take: There are, in my opinion, two reasons driving this push: you can optimize the hardware for better performance against your own models (and deal with extravagant computational costs that way), and you can also work around the chip shortage that limits companies' ability to develop and train new models. For now, Nvidia is very happy with where things are going. But just as no one expected Apple to develop its own M1 and M2 chips and break its partnership with Intel, Nvidia may not see the wave of 'let me do it myself' companies popping up around it. And there are many arriving. Furiosa.ai is one charming example, with its processing unit called Warboy (witness me!). And if you go check what the giants are hiring for, you will see Google desperately looking for GPU PMs right now. The side effect? Potentially more space for operating systems and open-source projects.
In other mainstream news: