Good evening (or morning or afternoon). We are three AI enthusiasts and business leaders on a journey to demystify AI for ourselves, and in the process, for all our readers! It’s been two months since we conceptualised this newsletter, and we haven’t had a dull moment since.
Let’s dive right in!
News That Matters
Google’s annual developer conference - I/O 2025 - was buzzing with AI news
Google is bringing AI Mode to Search, compiling a single AI-generated response instead of showing you a list of search results. This may forever change the way we “google”! And that’s not the only exciting update: Google Meet’s live speech translation demo feels straight out of sci-fi.
To compete with OpenAI’s upcoming shopping features (which we covered in issue 3), Google’s Shop with AI Mode will offer personalized recommendations, real-time price tracking, and virtual try-on.
On the creative side, Veo 3 can whip up high-quality videos with sound, Imagen 4 crafts detailed AI images, and Flow lets you easily turn clips into smooth, cinematic videos. Phew.
OpenAI’s Operator gets smarter and safer
OpenAI upgraded its AI agent Operator from GPT-4o to the smarter o3 model. Operator autonomously handles tasks like research, booking, and form-filling. The update improves its ability to manage complex jobs and enhances safety for sensitive actions and payments.
One Trend of Note
Impact of AI on Jobs
AI's impact on jobs is a hot topic. Some say it’s going to kill jobs, others think it’ll create new ones. The reality? Early trends suggest some hiring slowdown (e.g. entry level coding), but, AI replacements haven’t always been smooth (more of that below).
The World Economic Forum’s Future of Jobs Report 2025 predicts AI might replace 92 million jobs by 2030, but create 170 million new ones. So, a net gain? Some jobs will grow, others will shrink or change. Routine work like data entry and admin is most at risk, while roles that need problem-solving, creativity, and tech know-how are on the rise.
The report also highlights that AI is changing how people work, not just what they do. This shift isn't just for entry-level roles. Even experienced professionals are using AI for initial drafts, data crunching, or summarizing large reports. People’s real value? Adding human judgment and guidance.
Klarna, the Swedish fintech, offers a recent example of AI adoption challenges. Klarna initially claimed its AI chatbot could replace 700 support agents and cut costs, but is now rehiring staff after service quality dropped. The CEO admitted cost-cutting was overemphasized. Klarna is shifting to a hybrid model: AI handles routine queries, while humans take on complex or sensitive issues.
There’s still a lot playing out, with different studies highlighting different risks and opportunities. But most agree: AI is unlikely to leave the job market untouched. The biggest changes may be in how work gets done, rather than whether it disappears entirely. Job titles might stay the same, but the day-to-day tasks will evolve. Going back to the Klarna example above, its pivot shows that while AI can manage high volumes of simple tasks, it (thus far) struggles with empathy, critical thinking, and unforeseen situations: skills that remain uniquely human. The lesson for the rest of us? Focus on developing and leveraging these distinctly human skills, and get adept at being "AI-augmented" rather than replaced by AI.
Key takeaway? AI literacy matters. Get comfortable with AI, sharpen your analytical skills and adaptability, and invest in relationships and stakeholder management; they're becoming more valuable than ever.
What are your thoughts on how AI might change the nature of work? We'd love to hear.
AI Term of the Fortnight
Getting to Know AI Models: LLMs, SLMs, and Beyond
In AI, language models are trained to understand and generate human language. Specifically, Large Language Models (LLMs) learn from enormous amounts of text and have billions to trillions of "parameters" (the internal settings a model adjusts during training and then uses to make predictions).
Wondering if there are Small Language Models (SLMs)? Yep! SLMs are lighter, faster models trained on smaller datasets. “Smaller” is relative: they can still have hundreds of millions, or even a few billion, parameters. They’re easier to run and cheaper to deploy, perfect when you want smart AI without the full power of an LLM.
For instance, an SLM can be sufficient for a FAQ-based chatbot, but an LLM is needed for a banking chatbot that handles complex, multi-step queries. Email spam classification can run on an SLM, while LLMs are better for content moderation across nuanced categories like misinformation or hate speech.
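If you’re curious what all these “parameters” actually are, here’s a minimal sketch (our illustration, nothing official) that loads a small open model and counts them, to make the size numbers concrete. It assumes Python with the Hugging Face transformers library installed; distilgpt2 is just one freely available small model.

```python
# A minimal sketch: load a small open language model and count its
# parameters. Assumes `pip install transformers torch` has been run.
from transformers import AutoModelForCausalLM

# distilgpt2 is a small, freely available model, used here purely to
# illustrate what "model size" means.
model = AutoModelForCausalLM.from_pretrained("distilgpt2")

# Each parameter is one learned number; summing them gives the model's size.
total = sum(p.numel() for p in model.parameters())
print(f"distilgpt2 has about {total / 1e6:.0f} million parameters")
```

Running this prints a figure in the tens of millions; frontier LLMs are believed to be thousands of times larger.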
With us so far? To make matters more complex, LLMs come in different flavours. Here's a quick peek at a few types:
1. ‘General Purpose’ Models: Fast, versatile and great at generating natural language, but not built for deep reasoning.
Examples: GPT-4o, Gemini 2.5, DeepSeek-V3, Grok, Llama 3
Use cases: Generating business reports and market analysis, Chatbots, Real-time language translation
Why it matters: Great for automating repetitive tasks. But they can miss the mark on logic-heavy or memory-intensive work.
2. Reasoning Models: Smarter models that can follow logic, and think through complex tasks like structured problem solving, math, or decision-making.
Examples: OpenAI’s o1 and o3-mini, DeepSeek-R1, Anthropic’s Claude 3 Opus
Use cases: Financial analysis, Legal case reviews, Business plans
Why it matters: Can help businesses with complex, strategic decisions, not just automation
3. Hybrid Models: Combine quick responses and deeper, step-by-step reasoning in a single model, switching between the two as needed
Examples: Anthropic’s Claude 3.7 Sonnet
Use cases: Autonomous AI agents (more on this soon!), Research assistants
Why it matters: More context-aware and adaptable for advanced business needs.
Keep in mind, these classifications aren’t set in stone, as AI companies keep experimenting with various approaches. For instance, Google’s Gemini 2.5 Pro, a general purpose model, now has enhanced reasoning capabilities, and an experimental ‘Deep Think’ mode. Phew.
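For the slightly geekier reader: from a developer’s seat, picking between these model types often comes down to a single string in an API call. Here’s a minimal sketch using OpenAI’s Python SDK; the model names are examples current at the time of writing and may change, and the question is ours, purely for illustration.

```python
# A minimal sketch using OpenAI's Python SDK (`pip install openai`).
# Assumes an OPENAI_API_KEY environment variable is set; model names
# are examples and may change over time.
from openai import OpenAI

client = OpenAI()
question = "Should a 50-person startup build or buy its CRM? Give a recommendation."

# General-purpose model: fast and versatile, good for everyday drafting.
fast = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": question}],
)

# Reasoning model: slower, but works through the problem step by step.
careful = client.chat.completions.create(
    model="o3-mini",
    messages=[{"role": "user", "content": question}],
)

print(fast.choices[0].message.content)
print(careful.choices[0].message.content)
```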
AI in Practice
Otter.AI: From Minutes to Action Items
Haven’t we all attended meetings where we were the designated note-taker? Even when not designated, jotting down summaries, decisions, and action items is key to so many of our daily work lives.
AI has made the whole cottage industry of note-taking redundant, letting us focus on actually absorbing a meeting instead of obsessing over notes. In today’s AI in Practice, we explore how to use Otter.AI for real-time transcription in business meetings.
1. Install the Otter.AI app or create a login on the web.
2. Give your Otter account access to your calendar (device / Google / Teams). Otter can then directly join your meetings as a participant and take notes.
3. If you’d rather not give that access, you can still have Otter join your meetings by adding the meeting link in your Otter account.
4. Another option is to upload or import audio / video files into Otter.
Tip: The audio-upload option helps immensely for fully face-to-face meetings, which have become more prevalent in a post-Covid, work-from-office world.
Otter’s output includes a summary, decisions, and action items, all of which can be edited before sharing.
Do note that Otter integrates most deeply with Zoom, enabling near real-time transcription. Other platforms may have slightly more latency, but the output quality remains strong.
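Otter itself needs no coding at all. But if you’d like a peek at what speech-to-text looks like under the hood, here’s a minimal sketch using OpenAI’s open-source Whisper library, a different tool from Otter, shown purely for illustration. The file name meeting.mp3 is hypothetical, and the library also needs ffmpeg installed.

```python
# A minimal sketch of speech-to-text using the open-source Whisper
# library (`pip install openai-whisper`; ffmpeg must also be installed).
# This is NOT how Otter works internally, just an illustration.
import whisper

# "base" is one of Whisper's smaller models, trading some accuracy for speed.
model = whisper.load_model("base")

# "meeting.mp3" is a hypothetical recording you might have exported.
result = model.transcribe("meeting.mp3")

print(result["text"])  # the full transcript as plain text
```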
That's it for this issue. Do share your feedback and what you’d like to see next on AI UnGeeked.
Cheers!