Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "We should take him seriously. Same as human, intelligence can be use for good or…" (ytc_UgwlpdV9K…)
- "Here's how I see it: if the goal is to lift a weight, use a robot. If the goal i…" (ytc_UgxjaqKeB…)
- "Definitely would rather a bot do it. Bots dont get tired or cranky. They dont ha…" (ytc_UgwG8ZLe4…)
- "Claude at least has a fair point when it comes to pulling the leaver over world …" (ytc_Ugwmyj7S5…)
- "The only thing I can defend Machine Learning stuff for is when they are used to …" (ytc_Ugzx9vj4B…)
- "I'd be pissed at the mods for sending such a horrible response. Not about the fa…" (ytc_Ugx25ykZR…)
- "if nobody has a job in the future, then nobody has money, then nobody buys anyth…" (ytc_UgxWYRNe9…)
- "We already know all of this. Masses of people are being fired now... university …" (ytc_UgwJhQ9um…)
Comment
This school has been in Texas for a while now and they have already proven success. The AI teach the basic and the teachers (they call them something else) teach emotional intelligence, relationship skills and etc. We may not like it but change is coming. If you remember Trump went after DOE because of their shortcomings, meanwhile kids in this school are scoring in the top 1-2% on the required assessments.
Source: youtube · Posted: 2026-03-29T00:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwhOXesg682GPkzF794AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxLbh6B1HqycXd9rRR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzjM3Wh0qIERlvv4CJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzOGvJz2Ss_WCrZgkR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxV6HHriqEDMU2rUvJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwQnUJnUfxGzcBhrpl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwKPjsqwoRMGCSt4vx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy8Vyd_x7sfOEKCu6l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxZAIJqmYxNQZ5XzOR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzSvO0Emd3VFXpivbR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
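The coding result shown for the comment above can be recovered from the raw batch response by matching on the comment ID. As a minimal sketch (the `lookup` helper is hypothetical, not part of the tool; the rows are a subset of the actual response above), one way to do that lookup in Python:

```python
import json

# A few rows copied verbatim from the raw model response above.
raw_response = """
[
 {"id":"ytc_UgwhOXesg682GPkzF794AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugy8Vyd_x7sfOEKCu6l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
 {"id":"ytc_UgzSvO0Emd3VFXpivbR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
"""

def lookup(coded: str, comment_id: str):
    """Parse a raw batch response and return the coded row for one comment ID."""
    rows = json.loads(coded)
    # Each row carries the four coded dimensions plus the comment ID.
    return next((r for r in rows if r["id"] == comment_id), None)

row = lookup(raw_response, "ytc_Ugy8Vyd_x7sfOEKCu6l4AaABAg")
print(row["policy"], row["emotion"])  # industry_self approval
```

This reproduces the table above for the Texas-school comment: policy `industry_self`, emotion `approval`.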