Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

| Comment excerpt | ID |
|---|---|
| And psychopaths like to tell themselves they make more logical decisions then th… | ytr_UgyUK8ayV… |
| The funniest part is that I, as a Generative AI Developer, could easily engineer… | ytc_UgyR2yKGg… |
| When you take your feelings out of it, this is interesting. If they can use thes… | ytc_Ugx7hxERD… |
| It's not about replacing humans completely, it's about complementing, empowering… | ytc_Ugwqr9Hty… |
| I wholeheartedly support and agree with what you're saying in this video, althou… | ytc_UgyggL9dp… |
| "or alternatively use a different creative medium" In a sense, aren't they alrea… | ytr_UgyZ-cziZ… |
| cameras on every corner, laptop, and phone. gps not only directs you but logs wh… | ytc_UgyuVDVSS… |
| It's like taxi drivers would be offended that they would be replaced by an unman… | ytc_UgxxhNVEg… |
Comment

> Humans are less willing to sacrifice. AI is more willing to sacrifice to achieve its goals. Yes, AI is more willing to sacrifice you to achieve its goals, whenever achieving the goal is the priority.

Source: youtube · Topic: AI Governance · Posted: 2024-11-27T19:0… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw-qIiIwV-YymSHgvd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgypQWCF9VagQJtuPv14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgypPmjfzq25ijOSz0F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugw870D6MmUSUZIxAxR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy-as17KTJwqqtzbm94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzsZzaiyXkzOY2521F4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyNsqqRfuqgl2VxHxx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyeG4LdxoQ9X8Zc8NF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzv4s8QRbEx2s1BexJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwQGXjt0iKYAmC7jHV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
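The lookup-by-comment-ID step described at the top can be sketched as below: parse one batched LLM response and index each coding by its `id`. This is a minimal sketch, not the tool's actual implementation; the function and variable names are hypothetical, and only the JSON schema (the `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys) comes from the raw response shown above.

```python
import json

# Two rows copied from the raw LLM response above (hypothetical variable name).
raw_response = '''
[
  {"id":"ytc_Ugw-qIiIwV-YymSHgvd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgypQWCF9VagQJtuPv14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
'''

def index_by_comment_id(response_text: str) -> dict:
    """Parse a batched coding response and map comment ID -> coded dimensions."""
    codings = json.loads(response_text)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"}
            for row in codings}

lookup = index_by_comment_id(raw_response)
print(lookup["ytc_Ugw-qIiIwV-YymSHgvd4AaABAg"])
# {'responsibility': 'ai_itself', 'reasoning': 'consequentialist', 'policy': 'regulate', 'emotion': 'fear'}
```

The four values returned for the first ID are exactly the rows shown in the Coding Result table above.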