Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytr_UgxIA4vv0…: @yellow01umrella dolphins are very smart yes, if an AI reached dolphin awareness…
- ytc_Ugzg8l_uk…: Since only the rich would be able to afford a robot AI more than likely, it woul…
- ytr_UgzIPdtGY…: @fevad1246 Certainly there are courses that are less applicable than others. In …
- ytr_Ugy-AAo8L…: Use imagen 4 ultra in Google AI Studio. Type "make a character in the style of th…
- ytr_UgxkttZ79…: Just wait for advanced robotics and AI. The latter part of this century will not…
- ytc_UgxAEmSV5…: To be honest this isn’t how it works but yea write your own scripts it’ll be bet…
- ytc_UgzDXhT9v…: Advanced technology in the past has been used to create worse mass murders by th…
- ytc_UgwHVn60O…: All in all that bot's intelligence went rogue as Megan did. There is no way tha…
Comment
I'm not 100% sure - that example of escaping the lab was when specifically prompted in a specific way, telling it to be creative and providing it with a hidden scratchpad. That would seem to be purposefully intending that outcome. I have no doubt AI will be more intelligent than us by many times and steal our jobs, but the examples of it having free will don't seem realistic.
youtube · AI Moral Status · 2026-03-23T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
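The four coded dimensions plus the coding timestamp map naturally onto a small record type. Below is a minimal sketch in Python; the field names follow the table above, but the allowed label sets are only inferred from the values visible on this page and may be incomplete.

```python
from dataclasses import dataclass

# Label sets inferred from values seen in this dashboard; the actual
# codebook may define additional categories (assumption).
RESPONSIBILITY = {"developer", "user", "ai_itself", "none", "unclear"}
REASONING = {"consequentialist", "deontological", "mixed", "unclear"}
POLICY = {"regulate", "liability", "none", "unclear"}
EMOTION = {"fear", "outrage", "resignation", "indifference", "approval", "unclear"}


@dataclass
class CodingResult:
    comment_id: str      # e.g. "ytc_UgyE1AYBEKjLE4ARqsJ4AaABAg"
    responsibility: str  # who the comment holds responsible
    reasoning: str       # moral-reasoning style
    policy: str          # policy stance
    emotion: str         # dominant emotion
    coded_at: str        # ISO-8601 timestamp recorded when the batch was coded

    def is_valid(self) -> bool:
        """Check each dimension against the (assumed) allowed label sets."""
        return (
            self.responsibility in RESPONSIBILITY
            and self.reasoning in REASONING
            and self.policy in POLICY
            and self.emotion in EMOTION
        )
```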
Raw LLM Response
```json
[
{"id":"ytc_UgxrVml1QYXxR0YNRxV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzBaz0rVPDQfHL21ih4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzl6vb7VFcbjct9DyB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwUVy3MsEkTMziM4wF4AaABAg","responsibility":"user","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgznaZgGXKxFTxlgSkB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyE1AYBEKjLE4ARqsJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugw-8VJu3CPvvC32daB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx-SCCcJDY8Wl_8bk94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyi5MufWuBZ4IY74Ux4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwPzQaPsxdhoeyZG9t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
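The raw response is one JSON array per batch, with one object per coded comment, which is what makes the "look up by comment ID" view possible. A minimal sketch of that lookup, assuming raw batch responses are saved as JSON files on disk (the directory layout and function name are illustrative, not the tool's actual API):

```python
import json
from pathlib import Path
from typing import Optional


def find_coded_comment(raw_dir: Path, comment_id: str) -> Optional[dict]:
    """Scan saved raw LLM batch responses for the record matching comment_id.

    Assumes each file in raw_dir holds one JSON array shaped like the batch
    shown above; the directory layout is hypothetical.
    """
    if not raw_dir.is_dir():
        return None
    for path in sorted(raw_dir.glob("*.json")):
        try:
            batch = json.loads(path.read_text(encoding="utf-8"))
        except json.JSONDecodeError:
            continue  # skip malformed model output rather than failing the lookup
        for record in batch:
            if record.get("id") == comment_id:
                return record
    return None


# Example: the record whose values match the Coding Result table above.
# (The "raw_responses" directory name is an assumption.)
record = find_coded_comment(Path("raw_responses"), "ytc_UgyE1AYBEKjLE4ARqsJ4AaABAg")
if record is not None:
    print(record["responsibility"], record["reasoning"],
          record["policy"], record["emotion"])
```

Skipping files that fail to parse keeps the lookup usable even when the model occasionally returns malformed JSON for a batch.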