Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I'm so excited for the future that I don't even care what happens as long as we …
ytc_Ugglgq8eI…
Why don't they sell these personalized masks? That way I wouldn't need to do proc…
ytc_UgxAyoKMX…
there are plenty of factual problems with ai centers. but it is very critical th…
ytc_UgxaUCGCz…
If people are really this f'n stupid then we definitely are doomed. Not by AI , …
ytc_UgwWgRjw0…
I used AI in college. It built all my personalized study guides based on what my…
ytc_Ugy01tNXv…
I don't think consciousness is the goal they are aiming for building these ai sy…
ytr_Ugw-RBqYd…
Most I know don't have this problem anymore. It's developing so fast. We're just…
ytc_UgweNe3I2…
I find it ironic, listening to the Godfather of A.I. speak about A.I. becoming a…
ytc_UgxtbEDOg…
Comment
Best case sounds like we will create a mother ai, which will care for us as their pets or kids. I was listening to the podcast whilst resting and hugging my tiny Maltese and was thinking, doesn't sound so bad to have her life. Worry free, everything is provided, she can do mostly what she wants. It's true, she gets many limitations to her freedom, like, she goes out when I take her out, she eats when I give her and what I give her and things like this. But she also gets spoiled like a queen. But first, yes, as I mentioned, her freedom is sort of restricted and more importantly, probably not all dog parents behave as I do. So yeah, idk, it's scary to have someone or something else rule over our lives.
youtube
AI Governance
2025-10-07T09:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxf3PgpfIPzVsibBjt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwQ3YoOIa6iA35zAjB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwvm5KHXnOMrbaDt5B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwO2149GfQsRkPSk-R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugyc7-laCXohTR19Cv94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwpG6VDTXP_haHYJ1p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugznrcc0GTZQUDkksCB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugyd1J1brnFJV7ZWHVh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy2eZDlKg3EohN0net4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzleBQBFssWltZYaWh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
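A raw response like the one above can be looked up by comment ID by parsing the JSON array and indexing it on the `id` field. This is a minimal sketch, not the dashboard's actual implementation; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the response shown above, and the two sample entries are copied from it.

```python
import json

# Raw LLM response: a JSON array with one coding object per comment.
# These two entries are copied from the response shown above.
raw_response = """
[
  {"id": "ytc_Ugxf3PgpfIPzVsibBjt4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwpG6VDTXP_haHYJ1p4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
"""

# The four coding dimensions plus the comment ID.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def index_codings(raw: str) -> dict:
    """Parse the response and index codings by comment ID,
    skipping any entry missing one of the expected keys."""
    codings = json.loads(raw)
    return {c["id"]: c for c in codings if EXPECTED_KEYS <= c.keys()}


by_id = index_codings(raw_response)
print(by_id["ytc_UgwpG6VDTXP_haHYJ1p4AaABAg"]["policy"])  # regulate
```

Dropping malformed entries rather than raising keeps a single bad coding from discarding the whole batch; a stricter pipeline might log or re-prompt instead.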