Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I don’t like how it says “we” as if it’s not an ai too…" (`ytc_Ugy9bqbuv…`)
- "Unfortunately this is a common approach these people take in general, not just A…" (`ytc_UgzZoNMNi…`)
- "AI already has taken over. i warned people of this 20 years ago. AI has been com…" (`ytc_UgzlNxlLm…`)
- "what \"HUMANS\" know about consciousness isnt worth mentioning. We are talking abo…" (`ytc_UgzyYlm7P…`)
- "Higher standard of living seriously look at what is happening now what a joke ai…" (`ytc_UgzMU7jUT…`)
- "ok dont make another robot beacuse if this robot become dangerous then we can d…" (`ytc_Ugw-UykvS…`)
- "Ai a progression of the current technology advancement combine engineering and c…" (`ytc_UgxUu3nhw…`)
- ">> Ai isn’t in the cloud it’s in data centers / Me: what do you think the cloud …" (`ytc_UgyJEoPih…`)
Comment

> This is limited thinking, people adapt, we are creative, we connect through meditation, care for flora and fauna, we make art from cooking to interiors, scientific discovery and they all have the Human touch and they all change. If AI prioritizes outside of the human experience, it'll end up turning itself off because its essential worth (to itself) will be redundant.

Platform: youtube · Video: AI Governance · Posted: 2025-10-03T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx5d1E0Wbdvy_NTTl54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwONjXUEi0T3kBq_qF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwoumtwLCvjk4LqNI54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwMVtVad2rk1lajCot4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgznlbAHr8zMFwIfkUd4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx5z67-W2ptRQEeZOB4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxeghA6eMmCgQObyv94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy8QrI0Lamr_0vvudt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugw8j8H2g0sTJ7gNhQp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxAJvcd41YfQfmK46l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
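A batch response like the one above can be looked up by comment ID after parsing. The following is a minimal sketch, not the tool's actual implementation: it assumes the model returns a JSON array in which every row carries exactly the five fields shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`), and the `index_codings` helper name is hypothetical.

```python
import json

# Excerpt of a raw model response in the format shown above (one row kept).
raw = '''[
 {"id":"ytc_UgwoumtwLCvjk4LqNI54AaABAg","responsibility":"ai_itself",
  "reasoning":"mixed","policy":"unclear","emotion":"approval"}
]'''

# Fields every coded row is assumed to contain (taken from the sample output).
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw_json: str) -> dict:
    """Parse a raw batch response and index the coding rows by comment ID."""
    rows = json.loads(raw_json)
    by_id = {}
    for row in rows:
        missing = EXPECTED_KEYS - row.keys()
        if missing:
            # A malformed row is surfaced rather than silently dropped.
            raise ValueError(f"{row.get('id', '?')}: missing keys {missing}")
        by_id[row["id"]] = row
    return by_id

codings = index_codings(raw)
print(codings["ytc_UgwoumtwLCvjk4LqNI54AaABAg"]["emotion"])  # approval
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one parse per batch, then constant-time retrieval of any coded comment.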