Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "This question looks at the way ur curser moves not the actual answer to the ques…" (ytc_UgzOdXwsh…)
- "If AI ruled the world it will do a better job then the people in power at the mo…" (ytc_Ugz8xDVGB…)
- "Somehow I don’t care as an artist I’ll just let it happen unless someone actuall…" (ytc_Ugza-qJij…)
- "The only way to make AI doesnt give bad effect to human is release it when it ca…" (ytc_Ugx7aqefk…)
- "You have the most disgusting people teaching AI, liars, thieves, manipulators an…" (ytc_UgyZfcfdH…)
- "i suddenly have a masculine urge to create malware and completely destroy ai hos…" (ytc_UgzQ7pEAZ…)
- "I had a trouble ticket w google support. I asked them if I could get an extensi…" (ytc_Ugz7-VVY3…)
- "Generalized “AI” won’t happen in our lifetime, this guy is a pure grifter but so…" (ytc_UgwndzV0b…)
Comment
As long as AI is used strictly as a tool and not a complete replacement for most of these jobs, we will be fine, but if we overuse AI and rely on it for too many things, we will begin to regress and degrade because of the knowledge that AI robots can do the work for us. We can use it where we please so long as we are ready to replace it when nesecary. The part about the robot apokalypse is obvious..,
Source: youtube | Posted: 2015-01-12T20:3… | ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugg6_c_fnxJFiXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjBrm-BO4E1Z3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Uggwq5VL_P9YvngCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugi28m3CG46xzHgCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UggmA4p100IU0HgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgiqEwaXkqSM-ngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugi0PpcKcA8VCXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgghsB3quoCVXHgCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgjzbO8DgHLWlngCoAEC","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgiclBN6LTRIL3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"}
]
```
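A raw response in this shape can be parsed and indexed by comment ID for the kind of lookup shown above. The sketch below is illustrative only: `index_codings` and `REQUIRED_FIELDS` are hypothetical names (not part of the tool), and the two entries in `raw_response` are copied from the response above.

```python
import json

# A raw LLM response: a JSON array of per-comment codings with the
# fields shown in the Coding Result table (responsibility, reasoning,
# policy, emotion), plus the comment ID. Two entries copied from above.
raw_response = """[
  {"id":"ytc_UgiclBN6LTRIL3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_Ugg6_c_fnxJFiXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]"""

# Fields every coding entry is expected to carry (an assumption based on
# the response format above, not a documented schema).
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse a raw model response and index the codings by comment ID."""
    entries = json.loads(raw)
    indexed = {}
    for entry in entries:
        missing = REQUIRED_FIELDS - entry.keys()
        if missing:
            raise ValueError(f"entry {entry.get('id')!r} missing fields: {missing}")
        # Keep the four coding dimensions, keyed by the comment ID.
        indexed[entry["id"]] = {k: v for k, v in entry.items() if k != "id"}
    return indexed

lookup = index_codings(raw_response)
print(lookup["ytc_UgiclBN6LTRIL3gCoAEC"]["emotion"])  # fear
```

Indexing by ID also makes it easy to detect when the model dropped or duplicated a comment: compare `lookup.keys()` against the batch of IDs that was sent.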