Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "the goal of a humanoid robot is versatility, i can see a future where there woud…" (ytc_Ugw4d868K…)
- "I personally only use AI when I feel my art has something missing, or that I don…" (ytc_Ugw6zXEYX…)
- "If you want to program a robot to be more like a human you should get rid of the…" (ytc_UgxDcf1-t…)
- "Am I tripping or does this dude literally label anyone as the "godfather of AI"…" (ytc_Ugwz3jKaa…)
- "Meh. I started talking about my problems with chatgpt and i haven't needed tk us…" (ytc_UgxLGUyts…)
- "OMG this is how AI gets convinced that it is actually conscious!!! 😱arrrrrh I me…" (ytc_UgxLbXOsR…)
- "ChatGPT says a 100MW datacenter uses 1.1 million gallons a day. 100MW is the lo…" (ytc_Ugwa-0XpS…)
- "A little confusing when u asked if they wanted control u used the word ,,"you" b…" (ytc_Ugy2z9yf9…)
Comment
The Luddites were RIGHT though! They destroyed the machines that were going to take their jobs because they were demanding that the technology wait for humans to be taken care of before it was used as an excuse to screw over those humans.
Automation means there will be fewer jobs. People should not need to fight each other for jobs in order to survive. People should not need to fight with machines for work in order to survive.
I would sooner burn down an AI server warehouse than make a writer or artist unable to pay rent. These chuds spitting the word Luddite like an insult have no idea what they're invoking when they use the word
Source: youtube · Viral AI Reaction · 2025-07-01T19:0… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxYcTor5sUnpG3wDJ54AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzUwY-HBH3NhMG71Vx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxk-O_yupmY4_rX6gV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxZrgCRGxFLbEQGTYV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwsQOCIaGf_hRYwMZh4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugxi9XDdqg50v-lvSSF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxu787x7dGaiyCTi4x4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyBWtne2-TPPbNu5D94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzoQCSef7XKMmPi1Gx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwX_eZsifBqRJfJY3N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
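The raw response above is a JSON array of per-comment coding records. A minimal sketch of how the "look up by comment ID" view might parse and index such a response (the function name and validation logic are illustrative, not the tool's actual implementation):

```python
import json

# A raw LLM response in the format shown above: a JSON array of
# coding records, one per comment ID.
raw_response = """
[
  {"id": "ytc_Ugxk-O_yupmY4_rX6gV4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "outrage"}
]
"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw response and index records by comment ID,
    skipping any record that is missing a coding dimension."""
    records = json.loads(raw)
    return {
        r["id"]: {d: r[d] for d in DIMENSIONS}
        for r in records
        if all(d in r for d in DIMENSIONS)
    }

codings = index_codings(raw_response)
print(codings["ytc_Ugxk-O_yupmY4_rX6gV4AaABAg"]["policy"])  # regulate
```

Indexing by ID up front makes each lookup O(1), which matters when the same batch of codings is inspected repeatedly from the UI.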