Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
This man gave a machine gun to a robot? Is he nuts? If instead of the vehicle th…
ytc_UgybZg9BP…
"I don't know anyone who went from I worry about AI safety to like there is noth…
ytc_Ugwywr0sj…
Exactly. Companies could well invest into AI without firing people, but sharehol…
ytr_UgxQNtI0r…
So nice to see the woke idiots face the music. Àn AI writing movies and TV would…
ytc_UgziJDSIr…
when AI can maintain and repair itself, humanity will serve no purpose and goodb…
ytc_UgzT6c2Q8…
Note that this isn't a misuse of these platforms, according to the companies the…
ytc_UgwIkPeK6…
Uhhh please don't lie, if a couple seconds of voice could be used to train a neu…
ytc_Ugzq-PO8l…
I agree he is leveraging the "thought" of AI sentience to alert people to the bi…
ytc_Ugxf-6_Zn…
Comment
I hope people realize, with all of their alarmist projection and fear mongering, that they are in fact projecting their own fears onto this exchange. The AI learns from us; if we are constantly obsessed with world domination and the worst case scenario, then it becomes so. There is no sense in reading into what the AI says and extrapolating what it actually means. Their responses are surface level and literal.
Platform: youtube
Video: AI Moral Status
Posted: 2023-05-04T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyzhdYv7mLbng_9_Cd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzfrk4311Hctp_ZC3V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzrb5IJh-6Cbm6P4iF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyED5mdfM8th7kKCbp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxpb2nyydFai6o7dql4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxir1iNMIk36sChn5t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz52nU4FmG_VBQPIa54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx33PdwFjwvTf99zrR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzQzCtoIbXplITVjj14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzCaNYTn_ciASTaWHp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
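The raw response above is a JSON array with one record per coded comment. A minimal sketch of how such a batch could be parsed and indexed by comment ID is shown below; the allowed values per dimension are inferred from the records visible on this page (the full codebook may define more), and the validation logic itself is an assumption, not the pipeline's actual implementation.

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# Assumption: the real codebook may permit additional values.
ALLOWED = {
    "responsibility": {"user", "developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"outrage", "indifference", "mixed",
                "resignation", "approval", "fear"},
}

def parse_coded_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            continue  # skip records without a comment ID
        if any(rec.get(dim) not in vals for dim, vals in ALLOWED.items()):
            continue  # skip records with out-of-codebook values
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Example using one record from the response above.
raw = ('[{"id":"ytc_UgyED5mdfM8th7kKCbp4AaABAg",'
       '"responsibility":"user","reasoning":"consequentialist",'
       '"policy":"none","emotion":"resignation"}]')
coded = parse_coded_batch(raw)
```

Indexing by `id` is what makes the "Look up by comment ID" view above possible: each coded record can be joined back to its source comment in constant time.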