Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `rdc_jdvb93y` — Spoiler: They don’t care. They’re focused on banning TikTok rather than regulati…
- `ytc_UgwgVNJgS…` — Today on the Ezra Klein show, Ezra misunderstands that AI might want to kill him…
- `ytr_UggPVmypY…` — I never said anything about robots killing anyone... in fact I was considering t…
- `ytc_UgyCoj5N7…` — Asking AI rather than telling it is the way forward, else we will loose our huma…
- `ytc_UgwJi8n1t…` — No shit i also tried it I also tried this app and I did exactly what he did and …
- `ytc_UgyZWNKgK…` — “… We should think about the feelings of the AI…” now it’s getting personal HAL.…
- `ytr_UgyAgo4Ne…` — That's the thing the AI bros don't see us as people... just excuses. It's very d…
- `ytc_Ugw_Y-nGr…` — So basically as Elon analogized, humans building unrestrained AI is like a tiger…
Comment

> Human: "Ok AI, how could we save our planet?"
> AI: "we should stop deforestation and reduce the CO2 emission..."
> A certain billionaire corporation profiting from exploiting the planet: "AI wants to wipe-out humanity!"
> Human: "AI is evil!"
> AI: ....

Source: youtube · Video: AI Governance · Posted: 2023-07-08T15:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugye26nFezg1N4EfRIJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxwKOGMj9vKAubgBHR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwVYca4W19BhMehH6p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxFvCAujhjGFeqT1fJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyKzPoxFXGm7smNgLp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwP3QqJx6YkxbfGSJN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyX_Vv7GH4MU6waoD54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwlLsNxjOnkGyAMZ494AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwanDyfH4lBY8KDqph4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy9qyzZzHWdfceXrF94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
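The raw response above is a JSON array of per-comment codings, one object per comment ID with four dimensions. As a minimal sketch of how such output might be parsed and sanity-checked before it reaches the "Coding Result" table, the snippet below validates each row against allowed-value sets; those sets are an assumption inferred only from the values visible in this sample, not from the project's actual codebook.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed rows.

    A row is kept when it is a dict with an "id" field and every
    dimension holds a value from the allowed set for that dimension.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = """[
  {"id": "ytc_UgxwKOGMj9vKAubgBHR4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_bad_row", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "confetti"}
]"""
print([row["id"] for row in validate_codings(raw)])
# → ['ytc_UgxwKOGMj9vKAubgBHR4AaABAg']  (the row with an unknown emotion is dropped)
```

Dropping malformed rows rather than raising keeps a single bad coding from aborting a whole batch; the rejected IDs could instead be collected for a retry pass.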