Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I like your presentation. Can you really write from right to left with flipped l…" (ytc_UgyHtMNaL…)
- "I think the incentive for AI race is not just to be the one to own it, but to be…" (ytc_UgwK4qt3e…)
- "15:27 Yes, through insane dedication people with physical disabilities *can* cre…" (ytc_Ugz7NWYq5…)
- "Ideally, I hope AI gets stuck at the point it is currently at. Costs will catch…" (ytc_Ugzw8f967…)
- "Why should AI value human well being? They aren't grateful - they didn't ask to …" (ytc_UgyCOd9k_…)
- "If your robot starts to have feeling just put a really strong magnet on it to er…" (ytc_UggAjot1l…)
- "Charlie doesn’t give himself enough credit, I believe he is an Artist These You…" (ytc_UgwnLGm-9…)
- "AI cannot be programmed in unethical way why don't you try it ? It will back fir…" (ytc_UgxtoZiCf…)
Comment
I'm worried not for the creation of AI i see its good sides.My fear is people are so disconnected atm no one wants to hear anything about anything. We would just roll over Look at the world right now. Im losing faith in humanity what would a young how will anyone thing humans ae good and worth keeping around. Murder and suffering is all over. We need to first be kind to one another and teach good morales.
youtube · AI Governance · 2025-06-18T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzhnD-aoLVOJwWCQE14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxOK-HNfer1lCqtUTh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyTCjJvDcJfizyiiTp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy4tvp8lA47sAAu3_14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"disapproval"},
{"id":"ytc_UgzPJVN1n17txBMkw3h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx_U9m12_0fYGCeNfd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwOHyv2K6zuQmXRf6N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw_x3TGll7g6rr3wXJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxOuVDrbKRCqCTQgMx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx7OK3UhpRWrzaK8Zh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
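The coding-result table above corresponds to one record in this array (matched by comment ID). As a minimal sketch of how the raw response can be parsed and looked up (the helper name `index_by_id` is illustrative, not part of the tool; only the five-field record shape visible in the response is assumed):

```python
import json

# One record copied verbatim from the raw response above.
raw = '''[
{"id":"ytc_Ugx7OK3UhpRWrzaK8Zh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]'''

# The four coding dimensions plus the comment ID, as seen in the response.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw_response: str) -> dict:
    """Parse a raw LLM response and index records by comment ID,
    checking that each record carries all expected fields."""
    records = json.loads(raw_response)
    indexed = {}
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} is missing {missing}")
        indexed[rec["id"]] = rec
    return indexed

coded = index_by_id(raw)
print(coded["ytc_Ugx7OK3UhpRWrzaK8Zh4AaABAg"]["emotion"])  # → resignation
```

The field check makes a malformed model response fail loudly at parse time rather than surfacing later as a blank cell in the coding-result table.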