Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgzglSfGw…`: "I feel like AI and robots should have been created to take the tasks off our sho…"
- `ytc_Ugzd36Qe3…`: "Google A.I framed me for a serious felony charges and a crimes that never commi…"
- `ytc_Ugxyoc3Zk…`: "I was thinking what if when the person was waving their hand in front of this ro…"
- `ytc_UgxTVTebn…`: "9:39 on the basis that the a.i. learns the internet tells us thats the state of …"
- `ytc_UgyL3sR1y…`: "To be clear, I don’t actually mind the idea of AI taking over all jobs; I’m agai…"
- `ytc_UgwOY5kF1…`: "They call him the godfather of AI because as a good practitioner of his culture …"
- `ytc_UgyEnviYD…`: "hol on a minute cause i think I've lost o whole season here not a single episode…"
- `ytc_UgzHCH_7D…`: "Too long to watch, just unplug the super intelligent AI if it starts talking fun…"
Comment
UBI really is a fantasy though. With our current technology, AI can't build roads or do maintenance on sewer plants. All the cushy desk jobs are getting replaced by AI, but not the physical labor jobs. Which means, if you create UBI to solve the "Nobody can do easy desk jobs anymore" problem, you instantly create the "Nobody wants to do back breaking labor when they can get paid not to" problem.
That is the most dangerous part about AI taking jobs; its taking the BEST jobs to work, and leaving the worst jobs untouched. we want to do something to help those who are displaced from their jobs, but you can't do anything that would disincentivize working in the fields nobody wants to do, but absolutely NEEDS to get done for society to continue to function. We really should have regulated AI a long time ago... There are no clean solutions to this problem now.
Source: youtube · Viral AI Reaction · 2025-11-24T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgwGSErC6c-KlmfOoqF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxkJfcOUFGLHrTjgoB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgwtV-rwZvCG1tHMz2V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugx_0pEY0HiSz-zcTZ54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_Ugzuk_45tsLimsG_TUp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugxy6BxgRYJgtRJU-Dh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgwWNQ4D7dAexe4QOJJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxuuwgXLuYBoRzldYN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_Ugw4weInbW6TUrO5ZJV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgyGlni-4o2ofQ85QOF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"}]
```
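A raw response like the one above can be parsed and validated before its codes are stored against each comment. A minimal sketch of that step, where the allowed value sets are an assumption inferred only from the codes visible in this sample, not a documented codebook:

```python
import json

# Allowed values per dimension. These sets are inferred from the codes
# visible in this sample; the actual codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "resignation", "indifference", "approval", "outrage"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of per-comment codes) into
    a mapping from comment ID to coded dimensions, rejecting any value
    outside the allowed sets."""
    coded = {}
    for item in json.loads(raw):
        cid = item["id"]
        dims = {k: v for k, v in item.items() if k != "id"}
        for dim, value in dims.items():
            if value not in ALLOWED.get(dim, set()):
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = dims
    return coded

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
codes = parse_codes(raw)
print(codes["ytc_example"]["emotion"])  # fear
```

Validating at ingest keeps a single malformed or hallucinated label from silently entering the coded dataset; a failed batch can simply be re-sent to the model.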