Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "A slight problem with poisoning is: if it is done by a program (i.e. done by an a…" (ytc_UgwfJek1M…)
- "@group555_ So AI can't be bad, if it's the user choosing to do something bad wit…" (ytr_UgwEiUuS2…)
- "@Constellasian it shouldn't have been invented in the first place if I'm being fair …" (ytr_UgyknPmTh…)
- "I want to meet someone so scared of AI taking all the jobs that they go out of t…" (ytc_UgyNpe1zs…)
- [translated from Hindi] "I feel like a time will come when AI will domina… over the world, over human beings" (ytc_Ugzt5hr11…)
- "Thank you for commenting, @elianalia1082! Maybe the robot is always in the first…" (ytr_Ugxn-Wdy8…)
- "So people are mad he's telling everyone it's AI and others know this and buy it?…" (ytc_Ugz7v_bQI…)
- "'I read your A.I. chats' / Me: 'So?' / 'All of them' / Me: *prepping for suicide*…" (ytc_UgzOGKrAn…)
Comment
There are still no AI tools. LLMs are fancy autocomplete with no concept of reality, or even conception. They can only ever serve as tools, and will never become a reliable replacement for any sort of worker. These are inherent limitations of what the technology is. LLMs cannot think, and relying on them too much will impair your own ability to think.
youtube · AI Jobs · 2025-09-17T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxpAT7phYUdn1x7d5p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyV-21Sv7efZP-WI2p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzjLswr4JBLNJ4XvYV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyodzMLT8lUS2iJbrN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw47jxSGZZUgTTFyep4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgyzxAbcoBQFuoP6kot4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugye_7j7f3cAsA9xWGh4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyttd0zobpsz3foi3p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy7k-zAxBxj6v3bhE94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx1jV8QIJlZdK0DQ1Z4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
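The raw response above is a JSON array in which each element carries an `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). The "look up by comment ID" feature can be sketched as a small parse-and-index step; this is a minimal illustration assuming only the field names visible in the sample response, and the function name `index_by_comment_id` is ours, not part of the tool.

```python
import json

# A raw LLM response in the format shown above: a JSON array of coded
# comments. Only one entry is reproduced here for brevity.
raw_response = """
[
  {"id": "ytc_Ugyttd0zobpsz3foi3p4AaABAg",
   "responsibility": "developer",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "indifference"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw model response and build an id -> coding lookup."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

# Look up the coding for a specific comment ID, as the dashboard does.
codings = index_by_comment_id(raw_response)
coding = codings["ytc_Ugyttd0zobpsz3foi3p4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer indifference
```

Note that the lookup matches the displayed Coding Result for the comment shown: the entry with this ID codes responsibility as `developer` and emotion as `indifference`.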