Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "so why don't government use AI to produce food and increasing food production an…" — ytc_UgyW-Dpo9…
- "The majority of Americans do not have the intellect to understand what is about …" — ytc_Ugzs57Z2a…
- "Your asking the wrong questions, why is AI doing it?. I think AI could be sentie…" — ytc_UgyN-7MU4…
- "Stop bc the first time I used it I used it for 12hrs straight erm… Don’t get me…" — ytc_UgxeS20KM…
- "I would love for you to do a conversation with Gabriel Rene and Dr. Karl Friston…" — ytc_UgxDVDpOH…
- "Driver or auto-drive There may not be sudden threat But within 10 years there wi…" — ytc_UgwFXS6St…
- "Yeah i use AI generated images? What are you gonna do? Draw me pregnant? (Now we…" — ytc_UgzeNJQYy…
- "LLM and gen AI has ruin the word AI for everybody. AI are much more things, much…" — ytc_UgxnbxTxD…
Comment
"meeting peoples' energy in a conversation" - This!
What I have seen a lot of over the past couple of years is exactly this... I find that if I am convinced something is possible, then the AI also becomes convinced and will grind away on any given problem until we're going round in circles with zero progress. Yet if I start to believe something may be unachievable or that we're following the wrong path, the AI often starts to suggest that maybe we cannot complete the task and should just give up.
Source: youtube · Video: AI Moral Status · Posted: 2025-11-01T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwDx3DQjiqU2qJG6FZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwTK6k8Aqw9vNPIK-94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwei_7KP3azDFb_-Pp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyjvbECDnG4bkxbxWB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxQrs3xC8lMDghTtEV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzVkOt8_Xb97UiZNcJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzUgLam1hNwDO55mjN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxTrEIy5Yb9WlaNc6t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxVPdJuAHQIJOjuimN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugych_K1BB1AgP2OzlV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
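The raw response above is a flat JSON array, one object per comment in the batch, each carrying the four coded dimensions alongside the comment ID. A minimal sketch of turning such a payload into an ID-indexed lookup (so a specific comment's coding, like the "Coding Result" table above, can be pulled out by its `ytc_…` ID) might look like this. The `index_codes` helper and the abridged `raw` payload here are illustrative assumptions, not part of the tool itself:

```python
import json

# Abridged copy of the model's batch response shown above; the real
# array contains one object per comment in the batch.
raw = """[
 {"id":"ytc_UgyjvbECDnG4bkxbxWB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"resignation"},
 {"id":"ytc_UgzVkOt8_Xb97UiZNcJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]"""

# The four coded dimensions observed in the response objects.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(payload: str) -> dict:
    """Parse the model's JSON array and index rows by comment ID,
    silently skipping any row that is missing a required field."""
    out = {}
    for row in json.loads(payload):
        if all(key in row for key in ("id", *DIMENSIONS)):
            out[row["id"]] = {key: row[key] for key in DIMENSIONS}
    return out

codes = index_codes(raw)
print(codes["ytc_UgyjvbECDnG4bkxbxWB4AaABAg"])
```

Skipping malformed rows (rather than raising) is a deliberate choice here: LLM batch output can occasionally drop a field, and a partial index is usually more useful than a failed parse.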