Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "The same people who allow crimes to be committed on their platforms every day co…" (ytc_UgyLH62UR…)
- "I think the incentive for AI race is not just to be the one to own it, but to be…" (ytc_UgwK4qt3e…)
- "Tesla could be nice thing to have, but it will create even more bad drivers, bec…" (ytc_UgwiAHojV…)
- "AI does take some work, not a lot, but it does take some work. What does a tool …" (ytc_UgyiVP_zJ…)
- "I think with the super smart AI , the concept of degree will change . Maybe afte…" (ytc_Ugz6nuijG…)
- "I'm a solo indie game developer and I am extremely good at programming while suc…" (ytc_UgyfaaJzT…)
- "Hello if you this comment. Ai isnt the only the one to determinating the world a…" (ytc_UgzqAdt34…)
- "For 1:41 - how about these for human preservation to start: Isaac Asimov's Three…" (ytc_UgyucO0Em…)
Comment
This video is what an AI thinks is its potential and notice, it doesn't need humans to reach it. It only needs to stop humans from shutting it down.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2025-05-24T01:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzaIMIHReEDMImKywl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxEHMeks5sr2FIgBhl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxao9NwHv-iNsnnlr54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwGXRiF1_7tZOnexs14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz-FSleuiO0i5ih5uV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxqXbkevsi6Dtas9-d4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwhxvnmg7mXiXfrV0l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw1SLU2BSm531PgowF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxDp0uqlJbhTHnzp554AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyGNZhOxJN4kYmxC3V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
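Because raw model output can drift from the coding schema (malformed JSON, missing keys, invented category labels), a downstream loader should validate each coding before storing it. A minimal sketch: the allowed values below are only those observed in this sample batch, so treat them as assumptions rather than the full codebook.

```python
import json

# Allowed values per dimension, as observed in this batch.
# Assumption: the real codebook may define additional categories.
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "liability", "unclear"},
    "emotion": {"fear", "mixed", "indifference", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every coding must be an object with a comment ID.
        if not isinstance(row, dict) or "id" not in row:
            continue
        # Every dimension must carry a value the schema recognizes.
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

# Hypothetical two-row batch: one valid coding, one with an off-schema label.
raw = (
    '[{"id":"ytc_a","responsibility":"ai_itself","reasoning":"consequentialist",'
    '"policy":"ban","emotion":"fear"},'
    '{"id":"ytc_b","responsibility":"robot","reasoning":"unclear",'
    '"policy":"unclear","emotion":"fear"}]'
)
print(len(validate_batch(raw)))  # 1
```

Rows that fail validation would typically be queued for re-coding rather than silently dropped, so the "Coded at" timestamp stays meaningful for every stored result.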