Cassidy Visit, and Tool-Using LLM Reflection

Published: April 5, 2026

This reflection highlights Cassidy Williams' visit and her advice to become an AI-native engineer while still building core skills without over-relying on AI. It also explains key ideas from RAG and MCP resources, showing how AI can be improved with external context and tool access. The post ends with a fictional MCP-powered investing chat and a personal plan to compare AI-generated campus images with original photos for this week's Nogramming progress.

Takeaway from Cassidy Williams Visit

Recently, our class had the privilege of hearing from Cassidy Williams, a software engineer, developer advocate, and content creator who has made her mark across the tech industry at companies like Amazon, CodePen, and Netlify. Known in the developer community for making tech feel approachable and fun, whether through her writing, social media, or her famously enthusiastic love of mechanical keyboards, Cassidy brought a refreshingly grounded perspective on AI and what it means for the next generation of engineers.

She encouraged us to embrace AI tools and develop as AI-native engineers, while also warning against using them as a crutch, both for the security risks involved and for the sake of our own growth. She stressed the importance of still building things without AI so you can truly understand what you are creating.

On the question of whether AI will replace software engineers, she was direct: it won't. The field will look very different in ten years, but engineers are not going anywhere, and those who learn to work alongside AI will have a real edge. Beyond the technical, she got candid about what actually drives career decisions, touching on compensation, burnout, changes in job scope, and even something as practical as a company's location. It was a reminder that career decisions are rarely just about the work itself, and a great window into what a role in developer advocacy really looks like day to day.

Two Resources: RAG & MCP

RAG Resource: Code a simple RAG from scratch

RAG (Retrieval-Augmented Generation) is a powerful technique that combines information retrieval and text generation to improve an LLM's performance by giving it context from external knowledge sources.

For example, if you ask an LLM-based chatbot "What is the weather today in Washington D.C.?", it will most likely struggle since it doesn't have access to weather data. RAG fixes this by giving the model access to a weather database, from which the model can retrieve previous weather patterns and predict future weather.

This is a very simple explanation of what RAG actually does. Underneath, the dataset has to be broken into small chunks so that the embedding model can create an accurate vector representation of each one. This vector database is what the chatbot model will use for generation.

There is room for improvement with this model:

- If the question covers multiple topics, the system may not be able to connect the various pieces together, since it retrieves chunks based on their similarity to the query.
- The database is stored in memory, so it might not scale to large datasets.
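The retrieve-then-generate flow described above can be sketched in a few lines of Python. This is a toy illustration, not a real pipeline: the "embedding" here is just a bag-of-words count vector and the "vector database" is a plain list, whereas a real RAG system would use a trained embedding model and a proper vector store. The sample chunks and query are made up for the example.

```python
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": a bag-of-words count vector over lowercase tokens.
    # A real RAG pipeline would call an embedding model here instead.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, chunks: list[str], k: int = 2) -> list[str]:
    # Rank stored chunks by similarity to the query and return the top k.
    q = embed(query)
    ranked = sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)
    return ranked[:k]

# In-memory "vector database" of document chunks (hypothetical data).
chunks = [
    "Washington D.C. weather records show mild springs.",
    "The capital of France is Paris.",
    "Average April temperature in Washington D.C. is 57F.",
]

# The retrieved chunks would be prepended to the prompt sent to the LLM.
context = retrieve("What is the weather today in Washington D.C.?", chunks)
prompt = "Answer using this context:\n" + "\n".join(context)
```

The key idea is that the most relevant chunks are found by vector similarity and pasted into the prompt, so the model generates from retrieved facts rather than from its training data alone.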

MCP Resource: Model Context Protocol Docs

"MCP (Model Context Protocol) is an open-source standard for connecting AI applications to external systems" ~ MCP Docs

In simpler terms, MCP is a protocol for connecting AI applications to data sources and tools. This is helpful for developers, who can integrate AI applications/chatbots with their IDEs, and for AI applications and agents, which gain access to data sources and tools that enhance their capabilities.
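To make this concrete, here is a sketch of what an MCP-style tool invocation looks like on the wire. Based on my reading of the MCP docs, messages use JSON-RPC 2.0 framing and tools are invoked with a `tools/call` request; the tool name `get_weather` and its arguments below are purely hypothetical examples, not part of the protocol itself.

```python
import json

def make_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    # Build a JSON-RPC 2.0 request in the shape MCP uses for tool calls.
    request = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",  # MCP method for invoking a server-provided tool
        "params": {
            "name": tool_name,       # which tool to run
            "arguments": arguments,  # tool-specific inputs
        },
    }
    return json.dumps(request)

# Hypothetical example: the AI app asks an MCP server for weather data.
msg = make_tool_call(1, "get_weather", {"city": "Washington D.C."})
```

An MCP server receiving this message would run the named tool and send back a JSON-RPC response, which the AI application folds into its context before generating a reply.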

Fictional Chat with a Super-Powered MCP LLM

Below is a fictional conversation with an AI chatbot that has access to a variety of tools and data sources through MCP.

Me: "I have $100 and I want to invest it. Invest it for me, mainly in stocks, avoid cryptocurrency, and make sure to diversify."

LLM: "According to your current investment, I have invested $50 in CompanyA, $30 in CompanyB, and $20 in CompanyC."

Me: "That sounds good, can you provide me with an explanation of why you chose those stocks?"

LLM: "Given your current investment standings, CompanyA and CompanyB are both strong performers: CompanyA currently has a 150% return on investment, and CompanyB has a 120% return on investment. Additionally, CompanyA and CompanyB are both leaders in their respective industries, with strong financials and a history of consistent growth. CompanyC is also a strong performer in the e-commerce industry, with a dominant market share and a history of innovation."

Me: "That is true, but isn't CompanyB about to increase their prices for their products? I heard that they are going to increase the price of their products by 20% in the next month, which might lead to a decrease in sales and a decrease in stock price."

LLM: "You are right! However, CompanyB has strong brand loyalty and a large customer base, which may help mitigate the impact of the price increase. Additionally, the company has a history of successfully navigating market challenges and adapting to changing consumer preferences. While there may be some short-term volatility in the stock price, I believe that CompanyB's strong fundamentals and market position will help it weather any potential downturns."

This Week's Nogramming Progress Plan

Since I changed my mind about my project, I will go through the photos I took of campus last Thursday, along with my camera roll, for photos that would be interesting to ask AI to recreate. After that, I will ask GPT to generate photos based on the ones I have. Then I will create a scrapbook in Canva to show and compare the AI photos with my photos.


~Shree