Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Random samples

- "We need human we don't need AI we are human animals tree happy children we love…" (ytc_Ugw9oG0aE…)
- "Stupid dumb and crazy the whole creation the whole Henson this guy China everyth…" (ytc_UgznYfZGJ…)
- "The modern loneliness crisis is really boiling down to addictive technology taki…" (rdc_ohy85do)
- "I've had arguments with people telling me AI doesn't rely on artists and can gro…" (ytc_UgxCj3iYo…)
- "@Scruffy-qi3ik lose the cope and ad hominem attacks if you want to be taken ser…" (ytr_UgwXPDybv…)
- "The AI I've used has had a terrible time trying to write code (SQL). Sure it can…" (ytc_UgzdeNwSW…)
- "Bernie, I came in figuring you had an inside track to the future applications of…" (ytc_UgwvXZb_9…)
- "In the Bible, the term \"Ai\" is interpreted to mean \"heap of ruins.\" Destructio…" (ytc_UgzMMBT5g…)
Comment

> Isaac Asimov
> The Three Laws of Robotics:
> 1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
> 2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
> 3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

youtube · AI Governance · 2025-06-18T01:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
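The coded dimensions above can be sanity-checked programmatically. Below is a minimal sketch of such a check; `ALLOWED` and `validate` are hypothetical names, and the value sets are only the labels that appear in this batch's raw response, not necessarily the full codebook:

```python
# Hypothetical validation helper. The allowed values below are only the
# labels observed in this batch's raw response, not the full codebook.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "government", "user",
                       "none", "distributed"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"approval", "fear", "resignation", "mixed", "outrage"},
}

def validate(row: dict) -> list[str]:
    """Return the names of dimensions whose value falls outside ALLOWED."""
    return [dim for dim in ALLOWED if row.get(dim) not in ALLOWED[dim]]

# The coding result shown in the table above passes cleanly.
row = {"responsibility": "developer", "reasoning": "deontological",
       "policy": "regulate", "emotion": "approval"}
print(validate(row))  # []
```

A row with an unknown or missing label is reported by dimension name, which makes it easy to flag malformed model output before it enters the dataset.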
Raw LLM Response

```json
[
  {"id":"ytc_UgzKMW6Y0OT6hTaUGE54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugysbuq9403cPD3mZNt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwubkC7tifs_5n2CJx4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwZqxWqP4bKiWoIE0x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw9f9mpxFp6z1nIghR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxECXqYH8qfn5mzWq14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyiQg1FlDRUPSDGDrV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwmtC7fi939-RZLR_F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz3CfR3V2PQ8YV0jCh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxOK-dI469z0yYDLKB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
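The look-up-by-comment-ID flow described at the top of this page can be sketched as follows, assuming the raw model output is a JSON array like the one shown (the field names `id`, `responsibility`, `reasoning`, `policy`, and `emotion` come from the response itself; `lookup` is a hypothetical helper name):

```python
import json

# Raw LLM response: a JSON array of coded comments (two rows excerpted
# from the batch above for brevity).
raw_response = """
[
  {"id": "ytc_UgzKMW6Y0OT6hTaUGE54AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwmtC7fi939-RZLR_F4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"}
]
"""

# Index the coded rows by comment ID for constant-time look-up.
coded = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for a comment; raises KeyError if absent."""
    return coded[comment_id]

print(lookup("ytc_UgwmtC7fi939-RZLR_F4AaABAg")["policy"])  # regulate
```

Indexing once into a dict keyed by ID keeps repeated inspections cheap even when a batch response contains many coded comments.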