Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Humans have enough humans to talk to. Why would we need a robot? The main thing …" (ytc_UgxEeY5zX…)
- "'Great question' in a meeting means 'I wasn't listening and need a second to rec…" (rdc_oi2t4z0)
- ""You wrote that AI could be the greatest risk to the continued existence of huma…" (ytc_Ugz0HYwMg…)
- "You will need oil to oil the robot machines. We could always make sure they nev…" (ytc_UgymKok4I…)
- "it would be hilarious to generate copies of AI "artists" images. Making an ai pr…" (ytc_UgzSjczl0…)
- "This is so stupid they TOLD the AI to do these things so it did. Dumb fake outra…" (ytc_UgxPesWh7…)
- "I hate AI because it's pregidece it says all historical figures are black includ…" (ytc_UgyNtbwuq…)
- "I don't know how dumb you have to be to think a robot could have rights. Anyone …" (ytc_UgimUt_LM…)
Comment

> Interesting, where I don't trust it, it is alluring I guess if one could be decentralized and kept in defense of the planet over all nations in other words it automatically retaliates if any nation uses nuclear weaponry. Basically denying it's use globally. Maybe if they couldn't use em they'd stop building em. But that means their would have to be some global governing entity trusted by all. Not sure that'll ever be possible.

youtube · AI Governance · 2023-07-07T01:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgzAvEeYlkPYMxp5nVV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgyanJ4BReR0ptedKUJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzDqCI7dS_hSUtatdJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzLYxoaZGdkAFXAF8p4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyHW2LXVIsqmM19wLl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwSa-FweB574cT9p4J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxfBC_0ZldanqDjMzd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwq6yeIep2xbl6Auhd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwylAiNHFwxURgkMNB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyF0tW_4Otd3dNsOKt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]