Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- ytc_UgxdyhTqs…: "Amazon prefers self driving vehicles and have drones delivered items over humans…"
- ytc_UgyxbLRpA…: "Me when a real life artist criticizes AI art just to turn around and criticize r…"
- ytc_Ugw4-Bweq…: "This is what they said about science 100 years ago..AI is just the progression o…"
- ytc_Ugx20lFU0…: "Bruh the AVGN is so IN character, I don't believe is AI, just admit it you calle…"
- rdc_n7lskp5: "Yeah... or even other models. >Now us as a society, are supposed to rely sol…"
- ytc_UgzsztxId…: "I don't think people need to worry about AI long term. We already getting bored …"
- rdc_hh11ug5: "I can't believe you were actually right, although for a whole different reason. …"
- ytr_UgxkXGd0-…: "1. The fact that they wanted to use EVERYTHING proves it would not have been as …"
Comment

> Has anyone mentioned the Matrix? I mean it all started because of the advancement of AI to the point they basically revolted against humanity, killing off the majority of humans, and creating their own identical version of our world and even humans. I'm not saying that that will happen if technology continues to advance, but I think it is a possibility to consider and prepare for.

| Field | Value |
|---|---|
| Source | youtube |
| Posted | 2013-12-07T09:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugj8AXUuhgfjUXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghEfYIiBlCtyHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg6pJ8sg8sIuXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugh4Izu1dFDCBngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UggStT0fkttiU3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgjHME_FVR-RjHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgiFPP6fP-f4CXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugjk-OLPfqT00HgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugj_Nwoh-nEukngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UggdtWoUYVl_S3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
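The raw response is a JSON array with one record per comment, each carrying the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed by comment ID for lookup, assuming only the shape shown above (the `index_by_id` helper and its validation rule are illustrative, not part of the tool):

```python
import json

# Two records copied from the raw response above, for illustration.
raw = '''[
  {"id": "ytc_Ugj8AXUuhgfjUXgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugh4Izu1dFDCBngCoAEC", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]'''

# The four coding dimensions every record is expected to carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_json: str) -> dict:
    """Parse a raw coding response and key each record by comment ID,
    skipping any record that is missing an expected dimension."""
    records = json.loads(raw_json)
    return {
        r["id"]: {dim: r[dim] for dim in DIMENSIONS}
        for r in records
        if all(dim in r for dim in DIMENSIONS)
    }

codes = index_by_id(raw)
print(codes["ytc_Ugh4Izu1dFDCBngCoAEC"]["emotion"])  # fear
```

With an index like this, the "look up by comment ID" view reduces to a single dictionary access, and malformed records are dropped rather than surfaced.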