Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or browse the random samples below.
Random samples

- `ytc_UgwDlL-7K…` — "Elon Musk's unlicenced methane gas turbines sound like something that was cut fr…"
- `ytc_UgzFulLWl…` — "Who is going to refill the fuel tanks? You'll either need to pay someone to be o…"
- `ytr_UgxWmn2Gu…` — "@SteveCave Except you don't have to learn how to use AI for generating images. T…"
- `ytc_UgyrNsW9i…` — "Public service centres, and yes, that does create social bonds. Far more useful …"
- `ytc_Ugzdze_f_…` — "I'm 49. When I as 16 it was computers that would end up replacing people. Later …"
- `ytc_Ugwfoc3GO…` — "Dont these data centres just ignore local orders and build anyway? Its going to …"
- `ytc_UgxDfLLCh…` — "At the end of the video, you reference a previous major shift in industry, and l…"
- `ytc_Ugy6m4DwI…` — "He's forgets humans are actually scientific. There's no way technology is capabl…"
Comment

> Yeah that's the big problem with AI for me - I like having the option to use it when I need it, but a lot of AI tools keep trying to be helpful even when I'd rather do things myself. I think it's going to be particularly an issue when it comes to learning things, it'll be much easier to become OK at something, but much harder to become good at something because AI will keep replacing learning opportunities with autofilled solutions. Ironically, it's even a problem in AI use - image generators now have prompt generators attached to them, so you don't even have the opportunity to learn how to consistently prompt what you want to see because the AI prompts for you.

Source: youtube · 2025-07-21T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzJeM4vf-99-DTSKhJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxkL3E8WrwmED7eMKF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzKGAPlm4aVPNyVnih4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxm9wDXvgVnmubN2FJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzHen8fyC9aOdYya2R4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxMw5nDgqM3iSX7Hn94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyBqP4mjrTUaC_cABV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxjJxPzURp5SDrbVxh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw8e-1YnFPvZudnGup4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxLMdNcQ3XtpBnQvP14AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
```
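Since the raw response is a plain JSON array keyed by comment ID, the "look up by comment ID" view amounts to parsing that array and indexing it. A minimal sketch, using a two-row excerpt of the response above as `raw_response` (in practice you would feed in the full response text):

```python
import json

# Two rows excerpted from the raw LLM response above, for illustration.
raw_response = """
[
  {"id":"ytc_UgzKGAPlm4aVPNyVnih4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw8e-1YnFPvZudnGup4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
"""

# Index the coded rows by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding result for the comment shown in the detail view.
row = codes["ytc_UgzKGAPlm4aVPNyVnih4AaABAg"]
print(row["emotion"])  # resignation
```

Note that this matches the "Coding Result" table above: the row for `ytc_UgzKGAPlm4aVPNyVnih4AaABAg` carries the same responsibility, reasoning, policy, and emotion values shown there.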