Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coded comment by its ID, or inspect one of the random samples below.
- “All you say here was true like a year ago. This is not true at all. Right now AI…” (ytc_Ugyrjmco1…)
- “Wrong Neil. No invention has ever been twice as intelligent as pur smarteat huma…” (ytc_UgwoRdWUv…)
- “At 1:18:46 LeCun claims that ChatGPT was no suprise as the basic technology had …” (ytc_Ugy9wRiF9…)
- “We demand the withdrawal of Russian commandos from the Ukraine. We also demand …” (rdc_cfkv8ig)
- “I’ll give you an example. We know and understand that cutting off our arm will b…” (ytc_UgxLmHOY9…)
- “I can't fully decide if this is really cool and interesting or really "The Termi…” (ytc_UgwvFCBI1…)
- “MIT also discovered that firms that had implemented AI had made exactly zero imp…” (rdc_nm02z5h)
- “@LatulaArts It seems like most people genuinely wish to harm artists, too. Many …” (ytr_UgzPiHS6j…)
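The sample IDs above carry short platform prefixes (`ytc_`, `rdc_`, `ytr_`). A minimal sketch of resolving a prefix to its source, assuming the prefixes denote YouTube comments, Reddit comments, and YouTube replies respectively (an inference from the samples on this page, not a documented scheme):

```python
# Map a comment ID to its assumed source platform by prefix.
# The prefix meanings below are inferred from the sample IDs above,
# not taken from any authoritative schema.
PREFIXES = {
    "ytc_": "youtube comment",
    "ytr_": "youtube reply",
    "rdc_": "reddit comment",
}

def platform_of(comment_id: str) -> str:
    """Return the assumed platform for a coded comment ID."""
    for prefix, platform in PREFIXES.items():
        if comment_id.startswith(prefix):
            return platform
    return "unknown"
```

For example, `platform_of("rdc_cfkv8ig")` would resolve to `"reddit comment"` under these assumptions.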
Comment
> Automation isn't new, it's just jumped from rail to road, autonomous trains have been a thing since the 80s. The only real solution is just to not allow companies to replace paid positions. Automate them if they believe it will improve safety or productivity but still require someone, an ACTUAL PERSON, get paid for it. Either pay into universal income pool or something of that nature, everyone will still get paid for "their work" but humanity will be able to defer most of the actual work to machines, they aren't alive, they don't have feelings and therefore can be exploited in this way without moral issues, at least for now, AI development is somewhat concerning.

Source: youtube · Topic: AI Jobs · Posted: 2025-10-31T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw3qOemnnZHT7t_KrB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzOOeia2leIjlFv-Dp4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzEoVkMc2AJyaStP9J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwI-3CQH_2T3KuFNiF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy-qoiAOcqBwDG5yR14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzwQ83aaR7BE7ckpup4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyKLPC-b2LEjJDDAR14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy0NGJ0qPvcvtJcBcJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzPruSyFRNtxAFW74d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzkpGEXZecply7mjPF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
```
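The raw response is a JSON array of per-comment codes across the four dimensions shown in the result table. A minimal sketch of parsing and validating such a batch, where the allowed value sets are assumptions collected only from values visible on this page rather than an authoritative codebook:

```python
import json

# Allowed values per dimension. These sets are assumptions drawn from the
# examples on this page, not a documented schema.
SCHEMA = {
    "responsibility": {"company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "contractualist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "mixed", "approval", "outrage", "indifference", "resignation"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID.

    Records with a missing ID or an out-of-schema value are dropped,
    so downstream lookup by comment ID only ever sees clean codes.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            continue
        # Keep only records whose every dimension holds an allowed value.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            coded[cid] = {dim: rec[dim] for dim in SCHEMA}
    return coded
```

With the batch above, `parse_codes(raw)["ytc_UgzOOeia2leIjlFv-Dp4AaABAg"]` would return the coding shown in the result table (`company` / `contractualist` / `regulate` / `mixed`).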