Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I saw a Tesla robot today. It was in the window of the Tesla store. In five year…" (ytc_Ugw0tMx4B…)
- "At this point we should wonder how easy it is to make sentient ai. Humans have …" (ytc_UgzKvaI2n…)
- "BOYCOTT'S ARE LONG OVERDUE BY THE AMERICAN CONSUMERS TICTOK TICTOK $$ THE BOTTOM…" (ytc_Ugwi3CgXx…)
- "LLMs are simulating the talking bit of our brains. The AI companies still don't …" (ytc_UgxlFBPE0…)
- "it s always other people fault if his product is not enough Altman can only de…" (rdc_m6yl7qs)
- "They need to create AI Safety Inspector agents with specific limited tasks and o…" (ytc_UgyoTwUi0…)
- "But autopilot was NEVER intended to be used as an autonomous system. It’s like a…" (ytc_UgySqzulx…)
- "i mean the animation looks clean but the face movements seems a bit lifeless so …" (ytc_UgxXtPy1m…)
Comment

> The average person does not wish to ask AI questions. They want AI to do something helpful. Perhaps like being able to navigate your phone settings and make helpful changes. This is really what I have been hoping for since the beginning. But for all the talk of AGI there is still no hint that Ai with be able to use a smartphone any time soon.😮💨

youtube · AI Moral Status · 2026-01-22T01:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw0fAzWw-SymE62iMh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx8gUK1LuR4WvVzkJh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwAX5wYPBcy5OUsg9F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzzFdnlO7HM9uVJfsZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz-hxuaWmELs-uR3hF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwypu5MxQOjiyMglgp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw83O66t9T9eiw_aZB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy3aV28r3dcw_BPLml4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxt96IfMtb2oMCFKYt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwMpKYGCU1sTzOYgq94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
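The "look up by comment ID" view above can be reproduced offline from a raw batch response. A minimal sketch, assuming the response is always a JSON array of records carrying the four dimensions shown in the Coding Result table (`index_by_id` is a hypothetical helper, not part of the tool; only two of the ten records are included for brevity):

```python
import json

# Two records copied from the raw batch response above.
raw = """
[
 {"id":"ytc_Ugw0fAzWw-SymE62iMh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgwMpKYGCU1sTzOYgq94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
"""

# The four coding dimensions from the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_json: str) -> dict:
    """Parse a batch response and index each record by its comment id,
    rejecting records that are missing any coding dimension."""
    records = json.loads(raw_json)
    for rec in records:
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{rec.get('id')}: missing dimensions {missing}")
    return {rec["id"]: rec for rec in records}

coded = index_by_id(raw)
print(coded["ytc_UgwMpKYGCU1sTzOYgq94AaABAg"]["emotion"])  # -> resignation
```

Looking up the coded record this way matches the Coding Result shown above for the selected comment (emotion: resignation).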