Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "the first trurly aware AI could hide the fact that it is aware, and slowly evolv…" (ytc_UgxNUacGL…)
- "I can totally see how AI can simplify things too much. With Olovka, I feel like …" (ytc_UgwMc7yxg…)
- "AI-2027 posited that in 2026 the market would increase by 30%. So, I’ll take as…" (ytc_UgwrSnWHJ…)
- "Excellent synthesis. Very apt analogies. Looking at the tail end of the graph, I…" (ytc_UgzRFTQ1h…)
- "I find this so funny because on one hands the tech lord must try to convince the…" (ytc_UgzakwSv3…)
- "2nd is real. The 1st one is too realistic to be real. It has to be ai.…" (ytc_Ugz51b1ry…)
- "The thing is… I go back, look at what happened, and then build something togethe…" (ytc_UgxWtrGTS…)
- "Yeah, show the video of the Robot Dog & the Robot that's "ALIVE"? LOL!!! Right.…" (ytc_UgygkWodu…)
Comment

> What a weird interview. The interviewer somehow seems to misunderstand that Penrose was talking about the current theories around AI and why these current systems will never be conscious because its just kind of a generative machine; while the interviewer was maybe thinking about AI as a general concept like we see in Star Trek or something.

Source: youtube · Video: AI Moral Status · Posted: 2025-06-26T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwvFuy-zcNrexPQEWl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxENx4v8DR-hKlmAV14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz653HW58eAdV9B-F54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyek2bghu7OyBrvL_t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwvc20FlnZum3lN7zR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwu3QsSobK05qPQoA54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxKPB-IeF6MKWEqPoN4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxLalFwqHauef3zW814AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzRn8WJMHhk7M347414AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugzrv6G1sb9PaVoXFAN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
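The raw response above is a JSON array of per-comment codings keyed by comment ID, which is what makes the lookup-by-ID view possible. A minimal sketch of that indexing step, assuming only the four dimension fields shown in the response (the function name and the two sample rows are illustrative, not part of the tool):

```python
import json

# Two rows copied verbatim from the raw LLM response above,
# standing in for a full batch.
raw = """
[
  {"id": "ytc_UgwvFuy-zcNrexPQEWl4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzrv6G1sb9PaVoXFAN4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_json: str) -> dict:
    """Parse a batch coding response and key each coding by comment ID."""
    out = {}
    for row in json.loads(raw_json):
        missing = [d for d in DIMENSIONS if d not in row]
        if "id" not in row or missing:
            raise ValueError(f"malformed row {row!r}: missing {missing}")
        out[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return out

codings = index_by_id(raw)
print(codings["ytc_Ugzrv6G1sb9PaVoXFAN4AaABAg"]["policy"])  # regulate
```

Validating that every row carries all four dimensions before indexing catches the most common LLM batch-output failure (a dropped or renamed field) at parse time rather than at display time.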