Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Andrew Siegel I was referring to the robot that moves the butter, it appears in …" (ytr_UgwPqZZQi…)
- "Sounds like the best way to alleviate the lack of jobs problem is to burn down t…" (ytc_UgxG5ieuZ…)
- "remember the comic where a ai out lives humanity floats through the universe asc…" (ytc_UgzTpeOgk…)
- "That's just algorithms and data gathering doing it's job. It's not listening, co…" (ytr_Ugz9uytYf…)
- "I bet the dude is in love with his washibg machine. Wow. Conscious llms? Right. …" (ytc_UgxtA1l83…)
- "Our days are actually about 6-7h and we get homework form pretty much every subj…" (ytr_UgyKboBwI…)
- "That dil-weed is one of the reasons I was so against AI.... That guys sucks.... …" (ytc_UgwRDjA-D…)
- "Hiii, thats beautiful art your doing lavender and the Ai "artists" should just s…" (ytc_UgyWubq59…)
Comment
People acting like AI weapons won’t go wrong are foolish. AI is a tool, and tools can be used inappropriately and destructively. Just like nuclear science, it can be used to destroy cities, or create MRI machines. I study computer science and AI and machine learning systems are, by their design, extremely hard to predict. There is no guarantee that an AI system will behave predictably is all scenarios. And what happens if a fully autonomous and self sustaining system is engaged, and the people with their finger on the off switch get killed, who could possibly turn off the system? What humanity needs is not more powerful weapons.
Source: youtube · 2018-04-03T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwM5aZIxWW4j5iDuXx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxl09f6L7Rj-RKTPZF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzieqUhMRMtOB8uQh14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzlxnxQw99WAmfHehF4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzZoRdIrjkSS-bAyZ54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyqukX4nqlg8PxxK0F4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzjQP_1zTltW_9IU5l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwH2zhdVVSb5TK2kxh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyHwfpnimaOHxKVZVF4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwpUlJEuK97Bnz92U54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
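The raw response is a JSON array with one object per comment, keyed by `id` and carrying the four coded dimensions shown in the table above (`responsibility`, `reasoning`, `policy`, `emotion`). As a minimal sketch of the "look up by comment ID" step, the array can be parsed and indexed like this; the helper name and the sample IDs are hypothetical (the real IDs are truncated in the listing), not part of the tool:

```python
import json

# Hypothetical example of a raw LLM response in the format shown above.
# The IDs here are placeholders, not real comment IDs.
raw_response = """
[
  {"id": "ytc_example1", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_example2", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse the JSON array and map each comment ID to its coded dimensions."""
    records = json.loads(raw)
    return {rec["id"]: {k: v for k, v in rec.items() if k != "id"} for rec in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_example1"]["emotion"])  # fear
```

With such an index, rendering a "Coding Result" card is a single dictionary lookup by the comment's ID.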