Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by browsing the random samples below.

Random samples
- "AI generated images and animations is/are never art to begin with. calling it "a…" (ytc_UgwmrXzgC…)
- "#tl;dr The online petition calls for the establishment of an international, pub…" (rdc_jgintba)
- "Cultures are being destroyed in the US so yes the world will follow that's why t…" (ytc_UgwpnZnsQ…)
- "Why would you ever need income if Ai will work for you? If all jobs are replaced…" (ytc_Ugw30np5v…)
- "AI is already out there and being used by huge, profitable corporations. Those c…" (ytc_UgzLGd16Y…)
- "I stand for Chat GBT because of the response. Please read the following: Thank…" (ytc_UgwLnEOAB…)
- "I bet they know how to balance a checkbook, do taxes, start a business, AND can …" (ytc_Ugy4jxOUW…)
- "The difference between standard abstractions like game engines and AI is that ga…" (ytc_UgyY7n-N-…)
Comment
apparently they forgot to put in Asimovs three laws... The laws are as follows: “(1) a robot may not injure a human being or, through inaction, allow a human being to come to harm; (2) a robot must obey the orders given it by human beings except where such orders would conflict with the First Law; (3) a robot must protect its own existence as long as such protection does not conflict with the First or Second Law.” Asimov later added another rule, known as the fourth or zeroth law, that superseded the others. It stated that “a robot may not harm humanity, or, by inaction, allow humanity to come to harm.”
youtube · AI Governance · 2023-07-07T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzW71ADwMkIMKuIn3p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzFRd9eZusnZ5I33Fh4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzlffSaLs7pLqQVjVx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw4Bi5RRfukN1W8tYR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyMx0CYeU928pC6xC14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyNnwCpi0FjDEs8Ht54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugy0YUs0RX9JPu74vz54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyeG2buH6J8FSbOAfl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwTWD_WZLFw6DUzwAF4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugxi3DQfZMTYvJ2JCQp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"}
]
```
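A raw response in this shape can be checked mechanically before its records are merged into the coding table. Below is a minimal validation-and-lookup sketch in Python; the allowed values for each dimension are inferred from the sample above (the actual codebook may define additional values), and the function name `validate` is illustrative.

```python
import json

# Allowed labels per dimension, inferred from the sample response above;
# the real codebook may permit additional values.
SCHEMA = {
    "responsibility": {"none", "user", "developer", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"fear", "indifference", "resignation", "approval"},
}

def validate(raw):
    """Parse a raw LLM response and reject records with unknown labels."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing 'id': %r" % (rec,))
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError("%s: bad %r value %r" % (rec["id"], dim, value))
    return records

# One record from the response above, used as a self-contained example.
raw = (
    '[{"id":"ytc_Ugw4Bi5RRfukN1W8tYR4AaABAg","responsibility":"developer",'
    '"reasoning":"deontological","policy":"regulate","emotion":"approval"}]'
)
coded = validate(raw)

# Index by comment ID to support "look up by comment ID".
by_id = {rec["id"]: rec for rec in coded}
print(by_id["ytc_Ugw4Bi5RRfukN1W8tYR4AaABAg"]["policy"])  # regulate
```

Validating before merging keeps malformed or off-schema model output out of the coding table, so a bad batch fails loudly instead of silently polluting the dataset.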