Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Everyone needs to watch the first Star trek movie from 1979, the Terminator, Ex Machina, just for starters. All about AI getting out of control. We are so arrogant to think we can check AI in time from harming us, but it will already be way ahead of us in terms of preserving itself. We will not be able to control something a 1000 times more intelligent than us. AGI is suicide and we are oblivious.
youtube · AI Governance · 2025-12-28T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxd15BDlmAO6UgrL7F4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy3p-KPUjQci_vs0-94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwjZqSchsLXHRxLTV54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyZdmD-tIDAoWAMb754AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyJ4DtS-2fYJ0xcecR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugws2PRX31Df5aRBrUV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxxBdnd0OJXQoN1ItV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxST-pz6527WYtdDZp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugy6IzpEtw-EVhDPd794AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzuXgh4c4TBcpwGi4d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
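The lookup this page performs — finding one coded comment inside a raw batch response — can be sketched as follows. This is a minimal sketch, not the tool's actual implementation: the helper name `lookup_coding` is hypothetical, and the inlined `raw_response` reproduces just one record from the batch above.

```python
import json

# Illustrative single-record batch; the real response is a JSON array
# of one object per coded comment, as shown above.
raw_response = """[
  {"id": "ytc_UgyZdmD-tIDAoWAMb754AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw batch response and return the record for one comment ID,
    or None if the model omitted that ID."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

record = lookup_coding(raw_response, "ytc_UgyZdmD-tIDAoWAMb754AaABAg")
print(record["policy"])  # -> ban
```

Returning `None` for a missing ID (rather than raising) makes it easy to flag comments the model silently dropped from a batch.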