Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytc_UgzhoOVPL…: "I like horror movies, but seeing a robot talking about launching the singularity…"
- ytr_UgwD-NkWF…: "@LJ_Brown The cost of an AI subscription is around 5 -10 dollars for 200-1000 …"
- ytr_UgwDEHH16…: "That's an interesting perspective! While robots like Sophia don't sweat, they do…"
- rdc_l9vp4nk: "Can you imagine what will happen once google AI gets really “good”? People will…"
- ytc_UgxNDljaH…: "I like these idiots who troll their AI all day then wonder why their AI'S giving…"
- ytc_UgwZFjUpC…: "Fundamental issue is that these companies do not have enough competition to driv…"
- ytc_UgzpUOx6n…: "Well stephen hawkings already says AI will some day distroys humanity. Humans ha…"
- ytc_Ugx5dBoRE…: "Even IF you can restrain AI in the US how do you expect to do so in China?…"
Comment
"For the benefit of all" is a great ideal, but can never be a true statement of any system. At least not an equal benefit. There will always be people who understand better, figure out how to make more profitable use of and their benefit will differ from those who squander opportunity entirely. Not to mention, some benefits come at a cost to others. That being said, through reasonable regulation we can govern effectively in the AI age and move determinantly in a direction where AI remains both lesser than and subservant to the humans.
Source: youtube | Topic: AI Responsibility | Posted: 2025-06-05T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_UgwrX9zIIKj0yYxUuep4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugy-hNXEOrBD3s-lg1h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyp7JHnBfA00ayIBSp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugygw0rBaGpXbepynsZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgzJXCoFCP6fvQeeKi94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwt_UUoJrLNZMjCqmh4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzQAQd9QWCZgyFStzF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugws84fqsQcI_btmZy54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy0n_Ljzqmpa220t8J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyR949Xb7pdjY17PaV4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
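Because the model returns plain JSON, each batch can be validated against the coding scheme before it is stored. The sketch below is a hypothetical helper (`validate_coded_batch` is not part of any real pipeline shown here), and the allowed values are inferred only from the samples on this page, not from the full codebook, which may define more categories.

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# Hypothetical and likely incomplete; the real codebook may add categories.
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed", "unclear"},
}

def validate_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed entries.

    An entry is kept when it is a dict with an "id" field and every
    dimension in SCHEMA carries one of the allowed values.
    """
    entries = json.loads(raw)
    valid = []
    for entry in entries:
        if not isinstance(entry, dict) or "id" not in entry:
            continue
        if all(entry.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(entry)
    return valid

raw = (
    '[{"id":"ytc_a","responsibility":"distributed","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"resignation"},'
    '{"id":"ytc_b","responsibility":"nobody","reasoning":"unclear",'
    '"policy":"unclear","emotion":"fear"}]'
)
print([e["id"] for e in validate_coded_batch(raw)])  # prints ['ytc_a']
```

Dropping malformed entries rather than raising keeps one bad row from discarding an otherwise usable batch; logging the rejects for re-coding would be a natural extension.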