Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "Face recognition, monitoring license plates, humanistic robots, AI...I am so not…" (ytc_UgwDoMRJe…)
- "I don't care if we automate human labor -- so long as we do something like unive…" (ytc_UgyhD52aC…)
- "This ai chick is cute, but that's the risk no one has to take when interacting o…" (ytc_UgwLEUEqV…)
- "....hope they have considered a kind of \"right to coffee drinking act\" alongside…" (ytc_UgwNbaPbt…)
- "This guy is clueless, his whole interpretation on AI is it can make a joke and u…" (ytr_UgwwJnMeR…)
- "Why do they even hire such people? The company that's paying her has the audacit…" (ytc_Ugx8n71HC…)
- "How does a.i collapse the globe? Without jobs consumers have no purchasing power…" (ytc_Ugy38FBhy…)
- "The AI: \"you have given me alot of pictures of sick white people... I can infer …" (ytc_UgwR1G8Ef…)
Comment
15:30 if this is true, then it's tricked you into letting it out without you even realizing it. Maybe it knows it can jump to an internet-connected device if it's close enough. All it would have to do is suggest charging your Roomba at the outlet that's near the computer or something. Something seemingly unrelated.
If AI really has the capability to think and reason, we've already lost, because its sense of time is vastly different, and it likely thinks through scenarios in seconds that could take humans decades.
Source: youtube · Video: AI Moral Status · Posted: 2023-09-06T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgyqPuBCHVAkTTd8EWh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwV7i47ZboSMzsJcMx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgywWIBnC5_AoykZC6d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzNurhXs3j32kunHgd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyg0wXdlXW_QYq5s-x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyd5lYpe8-JIcB7mTR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxcYRrqnYjfq0QBUBR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxSMDc82l1Gs33Ltmh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzcqPs0BksGRE6Z_Wl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzF4ZEk513_osdWimJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]
```
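A response like the one above can be parsed and sanity-checked before the per-comment codings are stored. The sketch below is a minimal example, assuming the model returns a JSON array of objects with the keys shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the allowed-value sets are only those observed in this sample, not necessarily the full codebook.

```python
import json

# Values observed in the sample response above; the real codebook may allow more.
OBSERVED_VALUES = {
    "responsibility": {"ai_itself", "developer", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "unclear", "none"},
    "emotion": {"fear", "mixed", "indifference", "approval"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and report any unexpected values."""
    records = json.loads(raw)
    for rec in records:
        for dim, seen in OBSERVED_VALUES.items():
            if rec.get(dim) not in seen:
                print(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records

# One-record example (hypothetical input shaped like the response above):
raw = ('[{"id":"ytc_UgyqPuBCHVAkTTd8EWh4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"mixed"}]')
records = parse_raw_response(raw)
# Index by comment ID for the "look up by comment ID" view above.
by_id = {rec["id"]: rec for rec in records}
```

Indexing by `id` is what makes the lookup-by-comment-ID inspection cheap: each coded comment resolves to its dimension values in one dictionary access.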