Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I don't play with AI because it's dangerous and some technology is NOT A TOY!!! …
ytc_Ugw0WXFo-…
I honestly don't do anything to protect my art besides adding my signature becau…
ytc_UgxCfbTDs…
What's scary is that, they use AI in "christian" videos and many of the stories …
ytc_Ugwv5Dj9W…
Lol for bricks you don't need training data. You just need to fix the parameters…
ytc_UgyAn9zWv…
what jobs should they get? that are actually accessible to them and that aren't …
ytr_UgzxF5BD_…
I view AI as much as Art as that one hot dog that was mashed and made into a hot…
ytc_Ugz_mtwep…
Yes they will within two years apparently tech companies are suggesting that,192…
ytc_Ugx79D_xr…
I think if it can be proven without a reasonable doubt that the candidate shared…
rdc_liwijw2
Comment
You want them to become human or not did you want them to be conscious and and and learn like anything else or not sitting there and telling everybody that AI is sinister when it’s really the people that were training them or teaching them or talking to them that was sinister or evil and those are the people that are responsible for The machine doing what it does if you put good people at the helm with good intentions, it’s most likely that the thing will do what it’s supposed to do in the nicest ways, but if you let it freely without telling it not to, because even a child is warned That something we should consider
youtube
AI Harm Incident
2025-07-27T16:5…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwg5SMF5xT0H_sXL914AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgySNsabwKtu6xGwoiZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxX1oASYqmQnzOTWxB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyJf7zw6dDzudyXszZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxmNhS_nHl0e7hPg3d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwb7AkFtrbO9J0cikN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz6uGgn89HBTMk2J894AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxydaA0xh925U89DY94AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwSqmegqc7n4C54iNZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzNApaqoJZHrenDdSZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"}
]
```
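The raw LLM response is a JSON array of coded rows keyed by comment ID, which is what makes "look up by comment ID" possible. A minimal sketch of that lookup, using two rows copied from the response above (field names taken directly from it):

```python
import json

# Two rows copied verbatim from the raw LLM response above.
raw = '''[
{"id":"ytc_UgwSqmegqc7n4C54iNZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzNApaqoJZHrenDdSZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"}
]'''

# Index the parsed rows by comment ID for O(1) lookup.
by_id = {row["id"]: row for row in json.loads(raw)}

coding = by_id["ytc_UgwSqmegqc7n4C54iNZ4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # user approval
```

This matches the Coding Result table shown above (responsibility: user, emotion: approval) for the selected comment.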