Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- ytc_UgzOdKiZu…: "@8:00 Straight out of 2001: A Space Odyssey, a film made in 1968 where the AI li…"
- ytc_UgwPa6bQE…: "here’s the straight-up biological truth without the politics: 👉 Yes. Abortion do…"
- ytc_UgxfSr2UQ…: "2025 AI is much smarter than any monkey and has been for a few years now…"
- ytc_UgwJ3iBfJ…: "i had a recent revelation about AI. Google could make millions off of AI by cha…"
- ytc_UgxCMKJrq…: "There is no Future. Besides smaller microchips, the Internet and AI, nothing muc…"
- ytc_UgzKBSAbq…: "What this gentleman refers to in malevolence, in conjunction with the gentleman …"
- ytr_UgyBiabA1…: "Eh i get what you’re saying, and I’m with you on doing tasks yourself without us…"
- ytc_Ugw0Yk9fM…: "Let AI figure out how to routinely give hands 5 fingers, then we'll talk about t…"
Comment

> I heard a conversation held by Stephen Hawking trying to explain the dangers of AI. The conversation was actually very scary. Thank you for having the Godfather of AI as a guest speaker.🙏🏽🙏🏽

youtube · AI Governance · 2025-08-11T04:5… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
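Each coded row assigns one value per dimension from a small category set. A minimal validation sketch in Python, assuming the category sets are exactly the values visible on this page (the actual codebook may define more):

```python
# Category sets inferred from the values visible in this sample;
# the project's real codebook may differ (assumption).
SCHEMA = {
    "responsibility": {"developer", "government", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "industry_self", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

def validate(row: dict) -> list[str]:
    """Return a list of problems with one coded row (empty if valid)."""
    problems = []
    for dim, allowed in SCHEMA.items():
        if row.get(dim) not in allowed:
            problems.append(f"{dim}={row.get(dim)!r} not in codebook")
    return problems

# The coding shown in the table above:
coded = {"responsibility": "unclear", "reasoning": "consequentialist",
         "policy": "unclear", "emotion": "fear"}
print(validate(coded))  # → []
```

Checking every returned row this way catches the most common LLM-coding failure: a value outside the codebook that would otherwise silently pollute the counts.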
Raw LLM Response
[
{"id":"ytc_UgwDfox2ehZr4UMdU2B4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwvqkT6eZB9YZLjIwx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy7mjx5iPk3BHRdgvZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzxkG4mMwtIoTEeo6l4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugxlh4444vyymgCTck54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwonBo1bcGmvlmjV914AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwpGJbPzOPkTMpKaYt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugyb4jfb7l6RbK5RZ8F4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwNlawkUR_Ga_TUGkh4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw9qli2xamRyHmUNAx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"outrage"}
]
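Because the raw response is a plain JSON array with one object per comment, it can be parsed and indexed for the by-ID lookup with only the standard library. A minimal sketch (the two IDs and their values are copied from the response above; the real pipeline's loader is not shown here):

```python
import json

# Raw LLM response: a JSON array of per-comment codings
# (IDs and values copied from the response shown above).
raw = """[
  {"id": "ytc_UgwvqkT6eZB9YZLjIwx4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugy7mjx5iPk3BHRdgvZ4AaABAg",
   "responsibility": "government", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"}
]"""

# Index by comment ID so a single coding can be retrieved directly,
# as the "Look up by comment ID" feature does.
codings = {row["id"]: row for row in json.loads(raw)}

row = codings["ytc_UgwvqkT6eZB9YZLjIwx4AaABAg"]
print(row["emotion"])  # → fear
```

Keying the dict on `id` makes lookups O(1) and also deduplicates any comment the model coded twice (the last occurrence wins).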