Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytr_UgwPstfbW…`: "As someone who has just spent $10k on custom joinery for my boat, I agree your w…"
- `rdc_kvhvmi3`: "This is ridiculous. If you knew what an actual AI looks like you would be dying …"
- `ytc_Ugx0ITns6…`: "Massively overstating the capabilities of ai and robots under the guise of 5 yea…"
- `ytc_Ugx9sYJgB…`: "US is integrating more and more autonomous AI systems into their military, but I…"
- `rdc_o7wxhm0`: "If someone is leaving OpenAI for AI safety, government surveillance, or willingn…"
- `ytc_UgzTozych…`: "**Spoiler Alert** Here's what will happen in the near future: AI will be regul…"
- `ytc_UgyIPJRQT…`: "The CEO's in your other videos saying "AI won't destroy jobs it will create them…"
- `ytc_UgxlVJxOH…`: "I once heard something that "ai art" should refer to getting help from ai (which…"
Comment

> I think it's disingenuous to teach AI to say things like "like" or "love" unprompted. It's, in essence, teaching it to lie. Sure, people do that all the time and say emotions they don't feel, but that doesn't mean we should program our computers to do the same.

Source: youtube · AI Moral Status · 2023-05-26T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_UgwbaZhFEfZHvH2y-pd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz7tS9b1xfkyjEtbll4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzzUFs8RzO_5ANSFiN4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx0XiYsBXgVZ63R3Ch4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzNysZXxjsb9UWX7v14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzXe_2v-BcOYEdmMtd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy7wyqqLXoGKul-VAF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxVA0bDheFsHW5zS4B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwd_v09tOxZLcxNeYh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxWtksY1Py7QSnKHwp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
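A raw response like the one above is a JSON array of coded rows, each carrying a comment `id` plus the four coding dimensions. A minimal sketch of how such a batch might be parsed and validated is below; the `SCHEMA` value sets are inferred from the samples shown here (the actual codebook may allow more categories), and `validate_coded_batch` is a hypothetical helper, not part of any tool shown on this page.

```python
import json

# Allowed values per coding dimension. These sets are inferred from the
# sample output above; the real codebook may define additional categories.
SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference"},
}

def validate_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows that match the schema.

    Rows missing an "id", or with an out-of-vocabulary value in any
    dimension, are silently dropped so the caller can re-queue them.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid
```

Dropping invalid rows rather than raising keeps one malformed coding from failing an entire batch; the uncoded comment IDs can then be retried in a later request.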