Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below (comment previews and IDs are truncated in the interface):

- ytc_UgxP7fPtT…: "Theres no way we can make senitent AI we have a POLICY forbidding it 😂😂…"
- ytc_UgxAWWaQ9…: "People are already using ai for “social” interaction and they’re already about t…"
- ytc_Ugx2IYFBB…: "“Omgggg you guuuuuys, I made up a word and other people used it, you guuuuyyyss.…"
- ytc_UgxDOFs-e…: "That’s crazy to say because artist had to learn everything they know. As an arti…"
- ytc_UgzAFatp8…: "Here's the thing about AI and robots in my opinion. They can do behind the scene…"
- ytr_UgxhdPIqe…: "@heathsims7091 I get that, but the high level of inaccuracy from some of the ma…"
- ytc_UgyO1h1vX…: "Its such an interesting point in history of our civilization. I agree with mos…"
- ytc_Ugw4GMasQ…: "Software developers will move to developing agentic systems. Rather than doing t…"
Comment (source: youtube, posted 2015-08-04T10:1…):

> The thing with AI is that it can't simply be commanded to do things, it isn't able to be controlled because it has a mind that considers things and weighs up options. AIs can learn, so if an AI weapon learns self preservation, it suddenly decides it will not fight. There are just too many problems with AI, it's such an important and crucial issue.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UghtSkBgzSYBtHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgjViNNXfNfSJHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UggmA0mXDPRJZHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjCKuvjORfp8ngCoAEC","responsibility":"none","reasoning":"contractualist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UggRPYH0T4jMPHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UggAVsZqHgrQLHgCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UggiVbomHzBmy3gCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugh3w9U0giWCwngCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgjfNG0lGF6WFXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugi1I3DCzAfkyHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
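A response in this shape can be parsed and validated with a few lines of code. The sketch below assumes the allowed category values per dimension are exactly those visible in the codes above; the real codebook may contain additional categories, and `parse_coding_response` is a hypothetical helper, not part of any tool shown here.

```python
import json

# Allowed values per dimension, inferred from the codes visible above.
# Assumption: the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "government"},
    "reasoning": {"consequentialist", "contractualist", "mixed"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"fear", "resignation", "approval", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records.

    A record is kept if it is a dict with an "id" field and every
    dimension holds one of the allowed category values.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid
```

Dropping malformed records (rather than raising) keeps a batch of ten codes usable even if the model garbles one entry; the discarded IDs can then be re-queued for recoding.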