Here’s the latest installment in the series on working with LLMs: https://thenewstack.io/the-future-of-sql-conversational-hands-on-problem-solving/
I keep returning to the theme of choral explanations (#4 on my list of best practices), and it’s especially relevant in the SQL domain where there are just so many ways to write a query.
Exploring the range of possibilities used to be arduous, time-consuming, and hard to justify. Now it’s becoming hard to justify not doing so; optimizations (sometimes major ones) can and do emerge.
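To make the "many ways to write a query" point concrete, here is a minimal sketch (hypothetical schema and data, not from the article) of the same question answered two ways. Both queries find each author's most recent post; the correlated subquery re-scans per row, while the grouped join makes one pass, and an LLM conversation is a cheap way to surface that kind of alternative.

```python
import sqlite3

# Hypothetical toy schema for illustration only.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE posts (author TEXT, title TEXT, published INTEGER);
INSERT INTO posts VALUES
  ('ann', 'first', 1), ('ann', 'latest', 3),
  ('bob', 'only', 2);
""")

# Version 1: correlated subquery -- easy to write, re-evaluated per row.
v1 = con.execute("""
SELECT author, title FROM posts p
WHERE published = (SELECT MAX(published)
                   FROM posts WHERE author = p.author)
ORDER BY author
""").fetchall()

# Version 2: join against a grouped aggregate -- one pass over the table.
v2 = con.execute("""
SELECT p.author, p.title
FROM posts p
JOIN (SELECT author, MAX(published) AS latest
      FROM posts GROUP BY author) m
  ON p.author = m.author AND p.published = m.latest
ORDER BY p.author
""").fetchall()

assert v1 == v2  # same answer, different plans
```

Either formulation is correct; which one the query planner handles better is exactly the sort of thing worth asking a choir of assistants about.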
The rest of the series:
1 When the rubber duck talks back
2 Radical just-in-time learning
3 Why LLM-assisted table transformation is a big deal
4 Using LLM-Assisted Coding to Write a Custom Template Function
5 Elevating the Conversation with LLM Assistants
6 How Large Language Models Assisted a Website Makeover
7 Should LLMs Write Marketing Copy?
8 Test-Driven Development with LLMs: Never Trust, Always Verify
9 Learning While Coding: How LLMs Teach You Implicitly
10 How LLMs Helped Me Build an ODBC Plugin for Steampipe
11 How to Use LLMs for Dynamic Documentation
12 Let’s talk: conversational software development
13 Using LLMs to Improve SQL Queries
14 Puzzling over the Postgres Query Planner with LLMs
15 7 Guiding Principles for Working with LLMs
16 Learn by Doing: How LLMs Should Reshape Education
17 How to Learn Unfamiliar Software Tools with ChatGPT
18 Using AI to Improve Bad Business Writing
19 Code in Context: How AI Can Help Improve Our Documentation