Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "I loooove anything to do with robots and AI, I always wished I had a robot frien…" (`ytc_UgzNvIol5…`)
- "now we need to see the AI music, bc it is copyright and has license, ik this bc …" (`ytc_UgzKsAzfG…`)
- "> *The people defending AI generated art* ... includes people who defend the ida…" (`ytr_UgxAB6-oO…`)
- "It shouldn’t be any harder or more expensive for AI to be able to read a book, j…" (`ytc_UgxyE_Eit…`)
- "Ai is dangerous... not just for the loss of jobs. They need to shut it down. I t…" (`ytc_UgxK8FPmv…`)
- "Honestly, not counting the fact that the commenter you highlighted clearly used …" (`ytc_Ugyt0n4v6…`)
- "its ok to force a robot to work forever. its not ok to make a robot sentient jus…" (`ytc_UgzTolftl…`)
- "This analysis on AI limitations is spot on; AICarma’s insights align perfectly w…" (`ytc_UgyEmFB0_…`)
Comment
This is not an AI issue. This is people being lazy/selfish/unscrupulous for their own benefit, something that people have been doing since there were people.
Source: youtube · Posted: 2023-07-27T00:5… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_UgwtTE2C4GVK536tARh4AaABAg.9lmlwQaq1bB9lnhaulkiw8","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwtTE2C4GVK536tARh4AaABAg.9lmlwQaq1bB9lo_5bnD9_2","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxtCRwvFF8poyJ97D54AaABAg.9lmg5mtBUw09lnDg7MVHP2","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxtCRwvFF8poyJ97D54AaABAg.9lmg5mtBUw09lo4dTLd0dl","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxDjU45ltTmGyphAq94AaABAg.9lmTXpOQIA59ppJ30S2WJU","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytr_UgxDjU45ltTmGyphAq94AaABAg.9lmTXpOQIA59qjEukNCn2l","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_UgwoYJ0aMJQ3UNTbHJ54AaABAg.9lmRAnDJF1i9qyAqMXq495","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgwoYJ0aMJQ3UNTbHJ54AaABAg.9lmRAnDJF1i9se6ZpPOaOO","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyuVzPKCa0XJA95twl4AaABAg.9lmDHaaoloZ9loBANFpFIm","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_UgyuVzPKCa0XJA95twl4AaABAg.9lmDHaaoloZ9loEdzz3MfF","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
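A response in this shape can be consumed with a small validation pass before the records are stored. The sketch below is a minimal, hypothetical parser: the allowed values per dimension are inferred only from the samples shown above, so the real codebook may define additional categories.

```python
import json

# Allowed values per coding dimension, inferred from the sample output above.
# This is an assumption, not the tool's actual codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "approval", "mixed"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject out-of-codebook values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim!r} value {value!r}")
    return records

# Example with a single (made-up) record in the same shape as above.
raw = ('[{"id":"ytc_example","responsibility":"user","reasoning":"virtue",'
       '"policy":"none","emotion":"indifference"}]')
coded = parse_llm_response(raw)
print(coded[0]["responsibility"])  # user
```

Failing fast on unknown values makes it obvious when the model drifts from the codebook, instead of silently storing malformed codes.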