Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or pick one of the random samples below.
Random samples
- That story AI wrote about world war 4 is the plot to Terminator. James Cameron, … (ytc_UgzFiDKqQ…)
- I love his optimism but the takeover of AI is now inevitable. Those human jobs W… (ytc_UgyF2Zi5_…)
- Did chatGPT tell her to say all that or did she he compose this presentation wit… (ytc_UgzRld4ru…)
- He clearly understands what he's speaking about. I believe that the consent issu… (ytc_Ugx7ORKwQ…)
- I think this is the gripe of capitalism. Automation is great for menial tasks, … (rdc_jj3ga2h)
- WAYMO has been in Arizona Phoenix Valley for years [since ive moved here] It to… (ytc_UgyTLKSaf…)
- 1. The first thing wrong with this video was the robot having a gun. 2… (ytc_UgwZHQmCh…)
- I wouldn't say "never" in the case of AI right now. Right now everything is poss… (ytc_Ugw4q3d19…)
Comment
I love this video, especially parts 3 and 4, which share a lot of the same views as mine on alignment and safety.
I would also stress what was briefly touched on: that consciousness is not strictly required for any of the apocalyptic scenarios - nor is AGI. Any AI that's substantially better than us at the relevant tasks would be equally capable - whether or not it can predict rain from a feeling in its bones, make a perfect pizza, or ride a bike. AGI doesn't need to do everything we can do, just the things that affect the outcome. If the outcome were to be decided by fighting a legal case, then we probably already have the beginnings of AGI, and the line of when we have AGI will only become more blurry over time. It's far more likely that AGI will never learn some things because it doesn't need to.
Source: youtube · Video: AI Moral Status · Posted: 2023-09-03T15:4… · ♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
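The dimension/value pairs above mirror the fields of the raw model response shown next. As a minimal sketch, one might check a coded record against the label sets seen on this page (`OBSERVED_LABELS` and `check_record` are illustrative names, and the label sets are inferred only from values visible here, not from the project's full codebook):

```python
# Label sets inferred only from the values visible on this page;
# an assumption, not the project's actual codebook.
OBSERVED_LABELS = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none"},
    "emotion": {"indifference", "approval", "mixed", "fear", "outrage"},
}

def check_record(record: dict) -> list[str]:
    """Return the dimensions whose coded value falls outside the observed sets."""
    return [dim for dim, allowed in OBSERVED_LABELS.items()
            if record.get(dim) not in allowed]
```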
Raw LLM Response
[
{"id":"ytc_UgwuvPMvW1Yd5WAkGBd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy3gnCoP-xn7-S94MF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyCEPREMTYhaJWhC5J4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwk6wTlq55JmN6yt8t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxn1WdFZU03E9Dt6NN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwBaL9lMw5ttOIWXfJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz2p_H50QBytY42knB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwq61-e9i9c3mZ6Hy54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx_tLfNlBoYL7u6GuB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzshgq8A2AcZcL7Ftt4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
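To connect a batch response like this back to a single comment's coding (the Coding Result above matches the record for ytc_Ugwq61-e9i9c3mZ6Hy54AaABAg), the lookup is a straightforward scan of the parsed array. A minimal sketch, assuming the response is valid JSON as shown; `lookup_coding` and `RAW_RESPONSE` are illustrative names, not part of this tool:

```python
import json

# Illustrative only: RAW_RESPONSE stands in for the model output captured
# above, abbreviated here to two of the ten records.
RAW_RESPONSE = '''[
  {"id": "ytc_Ugwq61-e9i9c3mZ6Hy54AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzshgq8A2AcZcL7Ftt4AaABAg", "responsibility": "company",
   "reasoning": "unclear", "policy": "none", "emotion": "outrage"}
]'''

def lookup_coding(raw: str, comment_id: str) -> dict | None:
    """Return the coding record for one comment ID, or None if absent."""
    for record in json.loads(raw):
        if record.get("id") == comment_id:
            return record
    return None

coding = lookup_coding(RAW_RESPONSE, "ytc_Ugwq61-e9i9c3mZ6Hy54AaABAg")
if coding is not None:
    print(coding["reasoning"], coding["emotion"])  # consequentialist approval
```

Returning `None` rather than raising on a missing ID seems the safer choice here, since a model may occasionally drop or mangle an ID in a batch response.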