Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `ytc_UgxRuTL6O…` — "Of course it's art, it's in the name innit? Joking aside, you can debate the v…"
- `rdc_n6ud2pu` — "AI engineer to be here. Even I would not attend an AI interview. If I am not wo…"
- `ytc_UgwH_5oQP…` — "I truly think you're way too trusting of Tesla. They ARE death traps. The auto p…"
- `ytc_UgwCBxYq9…` — "> The AI is a chat bot > The AI needs to interact with humans and have them inte…"
- `ytc_UgzrwagKq…` — "guys do you think if we start making bad art on purpose the ai would steal it…"
- `ytc_UgzA7fuPY…` — "On the long run cheating is giving people bad luck, I guess. On the short run th…"
- `ytc_UgwBPmTdj…` — "France has a strict no autopilot usage. The transportation safety ministry will …"
- `ytc_UgwktSr-A…` — "I've mentioned this in similar videos, but the people behind these AI art genera…"
Comment
Says we're going to AGI then immediately backtracks and says we need to use domain specific models cause of the energy/memory limitations. That's the big lunch pin going forward. Yes, we could do all of these great things but is it going to be cost effective. Simply, with how much it takes to train models. Probably not. It goes to show that most people can't even fine-tune a model at home. When we get close that point, maybe we can start talking about gen ai as a cost effective strat to problem solving.
youtube · AI Responsibility · 2025-10-21T23:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxW7VrcDxo1BYimqA14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxK17EqoJbTiwDdG7Z4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy0MNzA3MMtDU1e2jN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzCDMGq460IsYDoUW54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgzDaWml_F-2d66cOC94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyv3qUxf1ffCyMX2xR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugws5h1CdJnByFHdp3B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxrogRh3BVCkTdMSz14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzV4bNvbP_QWNLT7e54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugy9u7RTMNvH6Jq47xN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```
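The lookup-by-comment-ID view above can be reproduced from a raw response like this one. Below is a minimal sketch, assuming the model returns a JSON array of records keyed by `id`; the `index_codes` helper and the `ALLOWED` value sets (inferred only from the codes visible in this sample, not a full codebook) are hypothetical.

```python
import json

# Allowed values per coding dimension, inferred from this sample batch.
# Assumption: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "distributed", "ai_itself", "developer"},
    "reasoning": {"mixed", "consequentialist", "deontological"},
    "policy": {"none", "industry_self", "liability", "regulate"},
    "emotion": {"approval", "indifference", "fear", "resignation", "outrage"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of coded records) and index
    the records by comment ID, skipping any record whose dimension values
    fall outside the allowed sets."""
    by_id = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            by_id[rec["id"]] = rec
    return by_id

# One record from the batch above, used as a worked example.
raw = (
    '[{"id":"ytc_UgzCDMGq460IsYDoUW54AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"industry_self",'
    '"emotion":"resignation"}]'
)
codes = index_codes(raw)
print(codes["ytc_UgzCDMGq460IsYDoUW54AaABAg"]["emotion"])  # resignation
```

Validating against the allowed sets before indexing means a malformed or hallucinated code is dropped rather than silently stored alongside clean records.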