Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_UgzeeSwYH… — "I think the thing upsets me the most about the whole situation, it's not the tec…"
- rdc_mvjds5z — "Sorry about your rough week but it’s sad a 4 year old had to talk to ChatGPT for…"
- ytc_Ugx6W4Wat… — "His 5 minute speech declares it's not a long term strategy... well in the long r…"
- ytc_UgwS-mw3P… — "Unfortunately, they don’t care and will continue to roll out their plans to have…"
- ytc_Ugx3DAYaS… — "I mean, if you think asking ChatGPT for medical advice is a sound idea, it's not…"
- ytc_UgyEA08h0… — "u cant make it so EU cant have AI i weapons and just hope that Russia or China w…"
- ytc_UgwS6-46Y… — "DISCLAIMER: This video was made before AI generation became what it is today. I …"
- ytc_Ugz6g4uo4… — "I don't think people understand the jobs that are cutting with AI our jobs that …"
Comment

> im no computer scientist.... but i wonder... since us humans create AI.... why cant we put restrictions on it? like.... could AI ever turn against us if we deny it access to internet? or to somehow make it so they couldnt ever feel the need/want to do something "bad"? idk much at all about AI at all tbh.... guess im gonna check that out :o

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2024-01-18T22:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugy1UtSEqExuIaa6Lv14AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_Ugxbo16YqAfbqQn9W5d4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxV-FAhOA1qqV5VlW14AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_Ugx4D9k-ChzSQcbiCIR4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxVuSpfgtvWbDZEU7R4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxEq7YdaN7zmXU09mZ4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgwvDdo3JIt5aVXB_F94AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzeXcLhY6HHL8v-vOB4AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "unclear", "emotion": "disapproval"},
  {"id": "ytc_UgxHq_quEZlPVQSrfx14AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugzhpy0r6TzTWeu5Dud4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
```
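The raw response is a plain JSON array of coding records keyed by comment ID, so looking up the coding for any comment is a one-pass scan. A minimal sketch, assuming only the array format shown above (the `lookup_coding` helper and the two-record sample payload are illustrative, not part of the actual tool):

```python
import json

# Illustrative payload in the same shape as the raw LLM response above
# (two records copied from the array; the real response has one per comment).
raw_response = """
[
  {"id": "ytc_Ugy1UtSEqExuIaa6Lv14AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgxV-FAhOA1qqV5VlW14AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "indifference"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coding record for `comment_id`, or None if it is absent."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

record = lookup_coding(raw_response, "ytc_UgxV-FAhOA1qqV5VlW14AaABAg")
print(record["emotion"])  # indifference
```

A record's four coded dimensions (responsibility, reasoning, policy, emotion) then map directly onto the Coding Result table shown for the inspected comment.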