Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgwSg6tPi…: "One day I was on my phone and I was looking at my wallpaper, then my father walk…"
- ytr_UgyoCXSal…: "@rachtreiber3811 Skillful* You call me a dropout when you seem like you dropped …"
- ytc_Ugw4M5c76…: "I just debated Grok, way better debate because it wants an AI government and a h…"
- ytc_UgzuU5r-j…: "A ROBOT does not have feelings , it s artificial intelligence which does not hav…"
- ytc_UgwKc6Cod…: "Ai generated content is probably not copyright protected so good luck actually m…"
- ytc_UgyAhLOFm…: "Not to make light of this or be disrespectful , but isn’t the fact that these pe…"
- ytr_UgyyEkvkm…: "Here is the most important thing you need to know about AI: AI will eventually h…"
- ytc_Ugy5GiZyL…: "\"They're not gonna stop even because of those 2 children\" That's sounds crazy.Th…"
Comment
Does artificial intelligence have the ability to know what hurts everybody the most? Or does it simply follow the standard "ethics" we have now? Please ask yourselves that and then ask yourselves how to prevent this if we abdicate our human role. Humanity isn't a joke, considering our history but this Big Machine would solve that little chore for us...
youtube
AI Governance
2023-06-17T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyZWh8EOuBq5oOw2Y54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgysS1H_fEv2T6iVf6R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyYw0DLoWg3fwHTwQF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyOyvhgyquvie1V05h4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgwtFHmHAswSUAruLFt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwFeiIqlMt5RE-1FF14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwRE6JpWjkyMftbov14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxtFxOZ6nxyr3UiFhp4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxipXuw1eABhVgGiqZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxVpaR4t1zY-LLB78t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
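A response like the one above can be parsed and indexed by comment ID for the lookup shown earlier. The sketch below is a minimal Python illustration, not the pipeline's actual code; the value sets in `ALLOWED` are assumptions taken from the values observed in this batch and may not be the full coding scheme.

```python
import json

# Values observed in this batch per dimension.
# Assumption: illustrative only, not necessarily the complete scheme.
ALLOWED = {
    "responsibility": {"government", "company", "developer",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological",
                  "contractualist", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM response and index the codings by comment ID.

    Raises ValueError on a missing ID or an unexpected dimension value,
    so a malformed model response fails loudly instead of silently
    entering the dataset.
    """
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec.get(dim)!r}")
        by_id[cid] = {dim: rec[dim] for dim in ALLOWED}
    return by_id
```

With the batch parsed, looking up a single coding is a plain dictionary access, e.g. `parse_batch(raw)["ytc_…"]["emotion"]`, which mirrors the lookup-by-comment-ID view in this panel.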