Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "What is security in automated cars? I am afraid we will see cases soon where aut…" (ytc_Ugw0rok7M…)
- "Isn't it interesting that every idea for making AI "safe," are nearly all just t…" (ytc_UgzxgIjTe…)
- "Lmao obviously he would want that because he would be pointed over sear of Ai, A…" (ytc_Ugwyg1rDa…)
- "This video is like they been listening to all public schools parents in America …" (ytc_UgxH9Zn9D…)
- "I HATE AI. Whhhhhy do we need it? Waste scarecrow water, takes away farmland, h…" (ytc_UgyC_whgX…)
- "Tesla autopilot is like smoking 50 cigs per day. Eventually it will get you kill…" (ytc_UgzmCCqup…)
- "So... are Google ready to acknowledge that Gemini is sentient? Because Anthropic…" (rdc_oi3j7jt)
- "I have serious doubts about our ability to control AI because, let’s face it, tr…" (ytc_UgyXgjhJ5…)
Comment

> The only way to break this a little bit is to regulate the robot development, as it is a hardware you can SEE. You can try to regulate the systems development, but it's impossible to be sure nobody will be working on it under the covers.

Source: youtube · Topic: AI Governance · 2026-02-04T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
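A coded record like the one above is only useful if every dimension value comes from the expected vocabulary. The sketch below validates a record against codebooks inferred from the values visible on this page; the actual codebooks used by the pipeline are an assumption, not something this dump confirms.

```python
# Assumed codebooks, reconstructed from the values observed in this dump;
# the real pipeline's category lists may differ.
CODEBOOK = {
    "responsibility": {"government", "developer", "user", "ai_itself",
                       "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "unclear"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is valid."""
    problems = []
    for dim, allowed in CODEBOOK.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in codebook")
    return problems

coded = {"responsibility": "government", "reasoning": "deontological",
         "policy": "regulate", "emotion": "fear"}
print(validate(coded))  # []
```

Running this check before writing a coding to the database catches both model hallucinations (values outside the codebook) and missing dimensions in one pass.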
Raw LLM Response

```json
[
{"id":"ytc_UgxFEVppC0bD64JrIbF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxNQixGdWVVWmRlzcJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz7w5PlsiYnj1W3-9h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyXG9ZTr2WkvOP-iv14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzgCGot3WqygDQunql4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxkQ6kEajaSKLI7mx54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxQ8n-uFz6MUyOiol94AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyHDwXddgvz_r4sfp14AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzH3sZGi5T2cSxO8a54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz7Cts11Q39wuZ4z8d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
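The "look up by comment ID" view above can be sketched as a small parsing step: load the batch response (a JSON array of per-comment codings) and index it by `id`. The batch below reuses one row from the raw response above; how the real tool stores and queries codings is an assumption.

```python
import json

# One row copied from the raw LLM response shown above, used as sample input.
raw_response = """
[
  {"id": "ytc_UgxQ8n-uFz6MUyOiol94AaABAg",
   "responsibility": "government", "reasoning": "deontological",
   "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a batch coding response and key each coding by its comment ID."""
    rows = json.loads(raw)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"}
            for row in rows}

codings = index_by_id(raw_response)
print(codings["ytc_UgxQ8n-uFz6MUyOiol94AaABAg"]["policy"])  # regulate
```

Keeping the ID out of the value dict makes the lookup result match the Coding Result table directly: one entry per dimension.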