Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Are you a teacher? I think you've missed the point. Kids don't have to learn wit…" (`ytc_UgypXyqs_…`)
- "Apparently someone has never watched The movie Terminator! Didn't end well for h…" (`ytc_UgzbAUwrE…`)
- "Yea the video is misleading. The % of Tesla incidents is lower than Waymo incide…" (`ytr_UgwXPaUhW…`)
- "@dunkfluga what if they make deep fake of your mom? Your sister? Or yourself? Th…" (`ytr_Ugz1xoqB2…`)
- "I remember hearing a quote that went something like "Let's say you give a robot …" (`ytc_UgyRBROBA…`)
- "writing a sentence doesnt make you an artist. it doesnt even make you an author.…" (`ytc_Ugyhblses…`)
- "Everyone at my job would make the same joke about how AI ones that learned how t…" (`ytc_UgwHbUW8V…`)
- "holy stop glazing ai is all you do go make angry comments on this channel?…" (`ytr_UgyD7geu9…`)
Comment
Safety is the number 1 priority, if it can constantly upgrade itself and using just a dump example; it may get enlighten and know the serious issue and risk to the earth
Humans are number 1 on that list. If there is no safety then we a will be out.
It might even go back to revolution series or some terminator situations. Or even worst
But I hope safety is still a big priority
youtube · Cross-Cultural · 2025-11-12T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzr6B7OYaa4NaSlUJR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyFg6wPA72MCjjYf_54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwdCEydGggJzErDTJp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwlHvyBo-ic2p0DUbt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgztZPqsLl7Ea-poh_x4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyZ9-yZWqWKB_oK1PR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwixG6SlM9XoXG72Bl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzLnEvllLWn_wSgSw54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgwUZ6wC2X6XjK5SKLh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwsPT95uV6EFlRYKHl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"}
]
```
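A raw response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal, hypothetical validator: the allowed value sets are inferred from the labels visible in the samples on this page and may be incomplete, and the function name is illustrative, not part of any real pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the sample
# responses above (assumption: this list may not be exhaustive).
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference", "unclear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when it is a dict with an "id" field and every
    coding dimension holds one of the allowed values.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example: one well-formed record and one with an out-of-schema label.
raw = (
    '[{"id":"ytc_a","responsibility":"none","reasoning":"unclear",'
    '"policy":"unclear","emotion":"fear"},'
    '{"id":"ytc_b","responsibility":"robot","reasoning":"unclear",'
    '"policy":"unclear","emotion":"fear"}]'
)
print([r["id"] for r in validate_codings(raw)])  # ['ytc_a']
```

Records that fail validation would typically be queued for re-coding rather than silently dropped, but that policy is outside the scope of this sketch.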