Do Something Great
Plus: How Perplexity builds products, Controversial thoughts on money, Distribution is King and much more...
Hey everyone,
Stocks To Space has just surpassed 1,000 subscribers—a MASSIVE welcome to everyone who is new!
I’m absolutely blown away, not just by the fact that I reached this target more than seven months early but also by your interest in what is essentially a diary of my curiosities.
You are all officially the Stocks To Space OGs. Thank you for being here.
I hope you enjoy this week’s Sunday Space, where I serve up the best ideas, tools and resources I’ve found each week as we explore the technology shaping the future.
If you find something thought-provoking, forward it to a friend.
IDEAS
Do Something Great
Just because you might not get a biography written about your life doesn’t mean you shouldn’t do something great.
RESEARCH
What is RAG…?
If you’re like me, you’ve probably had this experience on X/Twitter before: someone drops the term “RAG” and you have no idea what they’re talking about.
If that’s you, I promise it won’t happen ever again.
Because today, we’re studying how LLMs are being made more accurate and reliable via Retrieval-Augmented Generation (aka RAG).
The best way to understand RAG is by way of an analogy.
Think of a GP who handles many health issues. For more complex cases, GPs refer patients to specialist doctors with in-depth knowledge of that area.
Similarly, LLMs can address all types of broad questions, but they rely on RAG to bring in specialist knowledge for detailed and authoritative answers.
A simple RAG workflow, Source: Armin Norouzi, Ph.D
As you can see, RAG’s primary function is to deepen the model’s knowledge base with facts fetched from external sources.
This is a huge boon for AI application users who want to explore more current or specific topics.
For example, an LLM augmented with a medical index would be the ultimate assistant for nurses. Financial analysts would benefit hugely from LLMs augmented with current market data.
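To make the workflow concrete, here’s a minimal sketch of the RAG pattern in Python. It’s an illustration only: the knowledge base, the word-overlap retriever and the final prompt are simplified stand-ins, where a real system would use embeddings plus a vector database (like Pinecone) and send the augmented prompt to an LLM.

```python
# A toy illustration of the RAG pattern: retrieve relevant facts from an
# external knowledge base, then prepend them to the prompt before asking
# the LLM. The retriever here is a naive word-overlap score; real systems
# use vector embeddings and a vector database instead.
import re

KNOWLEDGE_BASE = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Support hours are 9am to 5pm AEST, Monday to Friday.",
    "Premium subscribers receive priority email support.",
]

def words(text: str) -> set[str]:
    """Lowercase and strip punctuation so 'policy?' matches 'policy'."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def retrieve(query: str, docs: list[str], top_k: int = 1) -> list[str]:
    """Rank documents by how many words they share with the query (toy retriever)."""
    return sorted(docs, key=lambda d: len(words(query) & words(d)), reverse=True)[:top_k]

def build_prompt(query: str) -> str:
    """Augment the user's question with retrieved context before generation."""
    context = "\n".join(retrieve(query, KNOWLEDGE_BASE))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# The resulting prompt would then be sent to your LLM of choice.
print(build_prompt("What is your refund policy?"))
```

The same pattern scales from this toy example to the medical index or market-data use cases above: only the knowledge base and the retriever change.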
Other benefits of RAG include:
Being faster and cheaper than retraining the entire model,
Giving models sources they can cite, which builds trust, and
Limiting model hallucination.
The possibilities for using RAG in your AI apps are almost limitless. It allows users to converse with data repositories, opening up many new experiences.
Any business can turn its technical docs, policy manuals, customer support tickets or onboarding videos into knowledge bases to enhance LLMs.
If companies like AWS, Google, Microsoft, NVIDIA and Pinecone are building AI experiences using RAG, so should we.
AI Word of the Day
Large Language Models (LLMs)
Created with Midjourney
LLMs are advanced AI systems trained on vast datasets of diverse text from books, video transcriptions and articles on the internet.
They use deep learning techniques to identify patterns in the data and understand contextual relationships between words, enabling them to generate text that is remarkably similar to human language.
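To make that a bit more concrete, here’s a toy sketch of the underlying idea (my own illustration, nothing like a production LLM): learn which words tend to follow which in some text, then generate new text by repeatedly predicting a plausible next word.

```python
# A toy "language model": count which word follows which in some training
# text, then generate new text by repeatedly sampling a likely next word.
# Real LLMs do this with neural networks over billions of tokens, but the
# core loop of "predict the next token from context" is the same idea.
import random
from collections import defaultdict

corpus = "the model reads text and the model learns patterns in the text".split()

# Count which words follow each word in the training text.
next_words = defaultdict(list)
for current, following in zip(corpus, corpus[1:]):
    next_words[current].append(following)

# Generate text by sampling a plausible next word at each step.
word = "the"
output = [word]
for _ in range(8):
    word = random.choice(next_words.get(word, corpus))
    output.append(word)

print(" ".join(output))
```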
INSIGHTS
From Articles I Read
Source: Lenny’s Newsletter
Perplexity isn’t just a bold startup with an idea to kill Google Search.
They’re also a high-performing organisation.
Some takeaways from Perplexity’s co-founder, Johnny Ho, on how they build products:
An AI-first approach: Everyone is urged to use AI tools as a first port of call to solve problems rather than bothering colleagues.
IC focus: Perplexity has only two PMs in their 50-person company. They prefer product-focussed engineers who will get shit done.
Small teams: Johnny builds teams ideally made up of two people. This reduces coordination headwinds and keeps the organisation fluid.
I love getting insights into how startups are being built in the AI era. Perplexity is just one of many, in my opinion, doing an excellent job.
Created with Midjourney
I really enjoyed this concise read on timeless money principles.
Three of my favourite takes that are pretty controversial:
“Money can’t buy happiness” is something only rich people who don’t struggle to make ends meet will say.
Lifestyle creep is fine if you also have savings rate creep.
People who brag about money probably aren’t that successful.
I firmly believe that the key to building wealth is not just learning how to earn money. It’s also about developing a healthy mindset and respect for it.
A good start to the latter is confronting the reality of money (as a resource) and having open, honest and reasonable conversations about it.
I’ve always thought one must build the best product to make a great company. Evan Armstrong argues otherwise in this eye-opening piece.
Here are my key insights from the article on building distribution:
Distribution isn't the same as the marketing cost line item. It’s a series of choices around pricing, packaging, novel usage patterns and social sharing built into the product.
The principle for distribution innovation: Exploit underpriced opportunities to acquire customers profitably.
Platform shifts are underway even today (think Apple Vision Pro), meaning tomorrow’s biggest companies will focus on distribution innovation and building great products.
“Long-term enterprise value—that is, building a big-ass business—is only possible by having a differentiated way of getting that product to a customer.”
Focus on distribution.
From X/Twitter
In the 90s you could fairly reliably make video games with bad performance knowing that in 18 months, thanks to Moore’s law, they would perform great.
Feels like AI is experiencing an accelerated version of this.
— Yuri Sagalov (@yuris)
2:04 AM • May 14, 2024
It's hard to predict what AI will do to the world, but that's all the more reason to learn to program. AI may churn up any industry, but the programmers have the best chance of surfing on this wave instead of having it crash on their heads.
— Paul Graham (@paulg)
3:23 PM • May 13, 2024
TOOLS
Perplexity
Perplexity.ai is a new type of search engine. In fact, they call themselves an “answer engine” rather than a search engine, the term Google owns.
Instead of 10 blue links, Perplexity provides instant answers to your query that are summarised using sources from around the internet. The great thing is that its sources are cited within the answer, so users can verify the response.
In addition to getting context-specific answers in chat, users can engage in thoughtful conversations with Perplexity by following up in the thread or using their new Copilot function, which refines queries.
I’ve recently started using it, and, like when I first used ChatGPT, I’m obsessed with the user experience of conversational, answer-driven search.
While Google catches up with AI Overviews, I highly recommend using Perplexity if you want to optimise your search experience.
THOUGHTS
Quote I’m Pondering
“Founders have to go all in on their best idea. Don’t hedge. Don’t try to do five things at once. Just go all in on your best idea.”
— David Sacks, from E179 of the All-In podcast
What did you think of today's edition? How can I improve? What topics interest you the most?
Was this email forwarded to you? If you liked it, sign up here.
If you loved it, forward it along and share the love.
Thanks for reading,
— Luca
*These are affiliate links—we may earn a commission if you purchase through them.