Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- Hi Rituja, you got the right answer. Kudos. The contest is over and winners have… (ytr_Ugyj-y6D4…)
- Ya it's creepy didn't you already have a problem with the robots talking to them… (ytc_UgwyeA9Y0…)
- "Our AI's are not sentient. They are advanced algorithms." This. People have a p… (ytr_UgwqVWL1_…)
- This shows AI is far from perfect. And if fully implemented, there will be probl… (ytc_UgxzA6vas…)
- If AI becomes as smart as humans there is nothing to worry about. Americans do n… (ytc_UgyRuJ15h…)
- The one thing AI doesn't have is soul and you can tell the music has zero soul. … (ytc_UgxFrYBFI…)
- also AI is not "just a Tool" at least not used like one.. A tool doesnt do your … (ytc_UgzH6bh1A…)
- We should start spreading the message that “Tesla forces you to disengage full s… (ytc_Ugx2ZzOYI…)
Comment

> I’m old enough to have watched the original Terminator movie at the theater. Every time I see anything on AI, I’m immediately brought back to that first Terminator movie. We obviously didn’t learn and decided to open Pandora’s Box regardless of the potential consequences. Now we’ll pay the price no matter how steep that may be.

youtube · AI Governance · 2025-10-18T07:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx00VnaDyXksMRKNal4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzGSMBwPEQHLiVl7kJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzDsmtf8XkcrlmLZlR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxc0UW9sATQ4xM-eht4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyNpx3HWzSRqOe6bh54AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwvo0iwMH5dtFJ4dnB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgycmjNlSG1RudqSea14AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw3CFxmQsBX6sjK-NB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzx_nZvHm1aYEWvPeF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyWpgKPz54q-8rNEzx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
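The raw response is a JSON array of per-comment codes across the four dimensions shown in the coding table. A minimal sketch of parsing and validating such a response before ingesting it, assuming Python; the `ALLOWED` sets are inferred only from the values visible in this dump (the full codebook may define more), and `parse_codes` plus the sample id `ytc_x` are hypothetical names for illustration:

```python
import json

# Allowed values per dimension, inferred from the codes seen in this dump
# (assumption: the actual codebook may permit additional values).
ALLOWED = {
    "responsibility": {"none", "distributed", "ai_itself", "government", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting bad rows."""
    out = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing id: {row!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row.get(dim)!r}")
        out[cid] = {dim: row[dim] for dim in ALLOWED}
    return out

# Hypothetical single-row response for illustration.
raw = ('[{"id":"ytc_x","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
codes = parse_codes(raw)
print(codes["ytc_x"]["emotion"])  # fear
```

Validating against a fixed value set catches the common failure mode where the model invents an off-codebook label, so bad rows fail loudly instead of silently entering the coded dataset.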