Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- “Microsoft. Bill gates made his billions by selling defective operating systems t…” (ytc_Ugz91DQJg…)
- “Lazy govt school teachers won't let this get popular. Actually teaching kids us…” (ytc_UgxOoc5-s…)
- “David Friedberg: "AI means more time with your family and more time with your fr…” (ytc_Ugw4i3Sbt…)
- “Training AI does NOT require anyone's consent! It is a Fair Use of the content. …” (ytc_UgxlypwDp…)
- “I am not saying there is no danger, but I don't believe in these scenarios just …” (ytc_UgxO9j3Qv…)
- “The data the algorithms use come from encounters from biased police. That means …” (ytc_UgxpRhKce…)
- “When it gets to the point that it takes millions of people’s jobs you’ll see a w…” (ytc_Ugw7GCtjJ…)
- “For the first controversy, the answer is no. The AI is trained on actual art, me…” (ytc_Ugy2fWeuK…)
Comment
> The danger is we are going to base the AI in space, with limitless solar power, beaming down to earth through massive wifi constellations into millions of robots... what could possibly go wrong? And unlike the film where we blow up ground stations.. these are untouchable... it will be fine.

youtube · AI Governance · 2025-12-30T19:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_UgwftAS0wyIPEcqGwhd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzsWCO4o_spcon71CJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz629SqRlpm3rCay3B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy0vhhwZUyxOqALN3V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzxee8ZjMK4GnLC8RF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxKyUcuc1k4DjTSCzB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw_3Y4AcF08qHEylXt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw7BqWbx3Z0KcWE3x94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyNLxmKZY2wZHSuWul4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy2pYbkJVazv8xzWM94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
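A raw batch response like the one above has to be parsed and sanity-checked before the dimension values are stored. Below is a minimal sketch of that step in Python; the allowed category sets are only those observed in the sample output on this page (the actual codebook may define more values), and `validate_batch` is a hypothetical helper name, not part of the real pipeline.

```python
import json

# Category values observed in the sample response above.
# Assumption: the real codebook may allow additional values.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "developer", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and keep only well-formed entries."""
    entries = json.loads(raw)
    valid = []
    for entry in entries:
        # Each entry must be an object carrying a comment ID.
        if not isinstance(entry, dict) or "id" not in entry:
            continue
        # Every coded dimension must be present with a known value.
        if all(entry.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(entry)
    return valid
```

Dropping malformed entries (rather than raising) keeps a single bad line from discarding the whole batch; the skipped IDs can then be re-queued for coding.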