Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "Me who chats with ai that would probably get me arrested if someone saw 😅 I once…" · ytc_UgwG5_hz8…
- "New Year's Resolution: This year I resolve to get quaint with the idea that rea…" · ytc_UgyT9P4d5…
- "So we've really taught AI to be like us and it turns out we don't want super hum…" · ytr_UgzAFidDe…
- "Yes, yes, yes, we need self driving cars, and those self driving cars will need …" · ytc_UgzrSJEwQ…
- "0:21 I understand that AI images are either goofy, horrifying, or soulless, but …" · ytc_UgwUvBpB8…
- "Yes and no about the AI thing, the \"AI\" we have right now are essentially big YE…" · ytc_Ugz9daB98…
- "Autopilot isn't full self driving mode its an assistant to lessen the stress for…" · ytc_UgxRV6RhM…
- "No, ai \"art\" doesn't have background, feeling, the inspiration that an artist cr…" · ytc_UgwY722K9…
Comment
I enjoyed watching this video and found the material very interesting, as long as they stayed on topic. But Geoffrey should remain on a constant bias and not attack the morality certain individuals and/or blatantly certain parties. Using examples like Musk has done good things, like electric cars and Starlink....Musk has done some bad things...doesn't give examples on how his morality would affect the development of AI. At least give examples so that we can learn only because the definition of morality isn't the same across the board.
Source: youtube · Topic: AI Governance · Posted: 2025-06-17T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzljMaRZtl_nzzLg-94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzLKhTgbpY-gGjF6iF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxz8dCJlgLV32bftJ54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw65N6zbOrZUIS4UvR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx2TG0zTGq2oS2crld4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwcfUZ3PrbRNMsqSot4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyzGD7wVUTRZCsiwJ94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw-1aJzVyEJdgaWjZ94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwqOYsz3aQhvou12Pl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw6DosDpS_i9e3cSjN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"}
]
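The raw response above is a JSON array with one object per comment ID, so looking up a single comment's coding is a matter of parsing the batch and indexing by `id`. A minimal sketch of that step, with a sanity check on dimension values (the `VOCAB` sets below are an assumption drawn only from the values visible in this sample, not the tool's actual codebook, and `parse_batch` is a hypothetical helper, not part of the tool):

```python
import json

# Dimension vocabularies observed in this one sample; the real codebook
# may define additional values (assumption: illustrative sets only).
VOCAB = {
    "responsibility": {"none", "developer", "user", "distributed", "ai_itself"},
    "reasoning": {"unclear", "mixed", "virtue", "consequentialist", "deontological"},
    "policy": {"none"},
    "emotion": {"indifference", "mixed", "approval", "outrage", "fear"},
}

def parse_batch(raw: str) -> dict[str, dict]:
    """Parse one raw LLM response (a JSON array of coded comments)
    and index the rows by comment ID, flagging out-of-vocabulary values."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        for dim, allowed in VOCAB.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in VOCAB}
    return coded

# Usage with a made-up comment ID:
sample = ('[{"id":"ytc_example","responsibility":"none",'
          '"reasoning":"mixed","policy":"none","emotion":"indifference"}]')
print(parse_batch(sample)["ytc_example"]["emotion"])  # indifference
```

Indexing by ID up front makes the "look up by comment ID" path a constant-time dictionary access, and the vocabulary check surfaces any coding the model emitted outside the expected label set instead of silently storing it.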