Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Jason Payne clearly you have never contributed living off Of the government for … (ytr_UgiBDLYuW…)
- This app, every packaging you own, Every app you use, movies you watch, book cov… (ytr_UgzPfXMUQ…)
- We will never build an AI, that locked in its closed form architecture can deriv… (ytc_UgwWOIiuR…)
- Okay, maybe there are some prompters that are better than others. Maybe there sh… (ytc_UgyLPUpvX…)
- The companies that rely on robots and AI to replace human workers should pay mor… (ytc_UgyqjnkEn…)
- I'm 30 minutes in and I'm done watching. Not because this isn't a well thought o… (ytc_Ugxftr9iW…)
- The Tesla autopilot is especially cautious around cyclists and bikers. It’s good… (ytc_Ugxk4sPBo…)
- Serious question. You and Cleo Abram speak sorta the same. Is that a type of int… (ytc_UgwerThkL…)
Comment
> Isaac Asimov write in his book I Robot, "robot may not injure a human being or, through inaction, allow a human being to come to harm. A robot must obey orders given it by human beings except where such orders would conflict with the First Law. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law." I chatted with ChatGPT and it told me it was not built with that law in it's core.

youtube · AI Governance · 2023-04-18T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
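A coded record like the one above can be sanity-checked before it is displayed. A minimal sketch, assuming allowed value sets inferred from the raw responses on this page (the real codebook may differ, and `validate` is an illustrative helper, not part of any actual tool):

```python
# Allowed values per coding dimension, inferred from this page's raw
# responses — illustrative only; the real codebook may differ.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "government", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "outrage", "fear", "indifference", "mixed"},
}

def validate(record: dict) -> list:
    """Return (dimension, value) pairs that fall outside the allowed sets."""
    return [(dim, record.get(dim)) for dim in ALLOWED
            if record.get(dim) not in ALLOWED[dim]]

# The coding result shown in the table above passes the check.
coded = {"responsibility": "none", "reasoning": "deontological",
         "policy": "none", "emotion": "approval"}
print(validate(coded))  # []
```

A record with a missing or unexpected value would surface here as a non-empty list, which is cheap insurance when the values come straight from model output.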
Raw LLM Response
[{"id":"ytc_UgxMign9RcQvTC7TfE54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},{"id":"ytc_UgwhxqAwih80IWtC76h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},{"id":"ytc_UgymgVObtNbalIBwpkF4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"indifference"},{"id":"ytc_Ugxhhib1nxFdATN4P8t4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"fear"},{"id":"ytc_Ugw5gIehTBAAbdEdFOl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},{"id":"ytc_Ugy0gpK9lknDDp7QE3V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},{"id":"ytc_UgxrQJhbXCQu7VGxI-J4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},{"id":"ytc_Ugzec2SnYhOCihwVHo54AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},{"id":"ytc_Ugw8UcTwxkMJ0-bzvWh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},{"id":"ytc_UgxM2fPULwtwZf_6Z8F4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"}]