This is such a refreshing inversion of the ‘edtech’ trend—rather than trying to scale education through software, FractalU scales motivation through community. Makes me wonder: instead of designing better UIs for MOOCs or LLM tutors, maybe the real unlock is designing better social containers for learning.
> the real unlock is designing better social containers
FTFY.
Note - a lot of the classic social containers have been systematically disappearing since the 1970s everywhere except rural areas, for a variety of reasons. I'm not qualified to hypothesize about the causes, but I do see the effects.
You mentioned using Claude to help set up a GitHub Action for reviewing Rails migrations. How do you see agentic tools like Claude evolving in their ability to reason about big-picture concerns—not just boilerplate generation, but things like validating database changes, architectural decisions, or spotting long-term risks that aren’t immediately visible?
They're really good if you give them guidance and a tight scope. For example, for the database migration review bot, we give it our Cursor rules file on database migrations (about 200 lines) and tell it to review the PR against those rules.
It works particularly well for migrations because all the context is in the PR. We haven't had as much luck reviewing general PRs, where the reasons a change is good or bad can live outside the diff and there are no clearly defined rules about what to avoid.
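In sketch form, the review step is just a small script the Action calls. The rules-file path, model name, and diff plumbing below are illustrative guesses, not our exact setup:

```python
# Illustrative sketch only: file paths, model name, and diff plumbing are
# assumptions, not our actual configuration.
import sys
import anthropic

RULES_PATH = ".cursor/rules/db-migrations.md"  # assumed location of the ~200-line rules file

def review_migration(diff_text: str) -> str:
    rules = open(RULES_PATH).read()
    client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
    response = client.messages.create(
        model="claude-3-5-sonnet-latest",
        max_tokens=1024,
        # Tight scope: the rules file is the whole rubric, the diff is the whole context.
        system=(
            "You review Rails database migrations. Flag only violations of the "
            "rules below; if the diff complies, reply 'LGTM'.\n\n" + rules
        ),
        messages=[{"role": "user", "content": "Review this migration diff:\n\n" + diff_text}],
    )
    return response.content[0].text

if __name__ == "__main__":
    # The workflow would write the PR diff to a file first, e.g.:
    #   git diff origin/main...HEAD -- db/migrate > pr.diff
    print(review_migration(open(sys.argv[1]).read()))
```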
How do you square this with the fact that LLM-based tools aren’t actually doing any analysis whatsoever, but are just pachinko machines that produce statistically likely output tokens when given input tokens?
Last year, I repurposed an old laptop into a simple home server.
Linux skills?
Just the basics: cd, ls, mkdir, touch.
Nothing too fancy.
As things got more complex, I found myself constantly copy-pasting terminal commands from ChatGPT without really understanding them.
So I built a tiny, offline Linux tutor:
- Runs locally with Phi-2 (a 2.7B model trained on textbook-quality data)
- Uses MiniLM embeddings to vectorize Linux textbooks and TLDR examples
- Stores everything in a local ChromaDB vector store
- When I run a command, it retrieves the relevant passages and feeds them to Phi-2 for a clear explanation (rough sketch below).
No internet. No API fees. No cloud.
Just a decade-old ThinkPad and some lightweight models.
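For anyone who wants to build something similar, the core loop looks roughly like this. It's a simplified sketch: the chunking, model loading, and prompt format here are illustrative (the Instruct/Output prompt style comes from Phi-2's model card), not my exact code:

```python
# Simplified sketch of the pipeline above; chunking, prompts, and model
# loading are illustrative, not my exact code.
import chromadb
from sentence_transformers import SentenceTransformer
from transformers import AutoModelForCausalLM, AutoTokenizer

embedder = SentenceTransformer("all-MiniLM-L6-v2")       # MiniLM embeddings
db = chromadb.PersistentClient(path="./linux_tutor_db")  # local ChromaDB store
docs = db.get_or_create_collection("linux_docs")

def index(chunks: list[str]) -> None:
    """One-time step: embed textbook/TLDR chunks and store them locally."""
    docs.add(
        ids=[str(i) for i in range(len(chunks))],
        documents=chunks,
        embeddings=embedder.encode(chunks).tolist(),
    )

def explain(command: str) -> str:
    """Retrieve the most relevant passages, then ask Phi-2 to explain."""
    hits = docs.query(
        query_embeddings=embedder.encode([command]).tolist(),
        n_results=3,
    )
    context = "\n\n".join(hits["documents"][0])
    prompt = (
        "Instruct: Using the notes below, explain what this shell command does.\n"
        f"Notes:\n{context}\n\nCommand: {command}\nOutput:"
    )
    tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-2")
    model = AutoModelForCausalLM.from_pretrained("microsoft/phi-2")
    inputs = tokenizer(prompt, return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=200)
    return tokenizer.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)

if __name__ == "__main__":
    index(["tar -x extracts files from an archive; -z filters through gzip."])
    print(explain("tar -xzvf backup.tar.gz"))
```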