Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews with comment IDs):

- "If AI takes over all the working class jobs, there needs to be a new system for …" (ytc_Ugyz84ukG…)
- "Robot: To be or not to be…. *shoots the guy* Robot: I already forgot the quote…" (ytc_Ugz7wWaNq…)
- "That ironically never happens for some reason. I guess the perk of being friend/…" (ytr_UgwJIAJN6…)
- "AI can do anything you can do better AI can do anything better than you…" (ytc_Ugy0zk5Cy…)
- "One thing I will say is the prompt at the beginning saying”this is my first time…" (ytc_UgyGjd9d7…)
- "Yudkowsky larger macro argument that AI will develop a purpose or goal of its ow…" (ytc_UgziqkC0Y…)
- "Ai will only become a problem because they are programed to be that way. 🤨…" (ytc_UgzUPrrlb…)
- "Can't wait for AI to go for management and CEO positions. They will try really h…" (ytc_UgzFVN2ri…)
Comment

@Saucegod207 you really can’t look past the clip. let me explain something very obvious. This has terrifying implications and is not far off from ai-objectives today, for the not so distant future.

Source: youtube · AI Moral Status · 2023-09-21T19:0… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgzTh7r5or_PTl2ls4x4AaABAg.9uxRyLUeK-q9uysZOkRjRX","responsibility":"unclear","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytr_UgwD_M3xMYRkxUQDMaZ4AaABAg.9uxA5UG820Y9v-NBvcYCza","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwHfsL9kUj6wxfCqGB4AaABAg.9ux9qwP-Ltm9uyjjsIHEpf","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgyN7p0c2Zo42020YYd4AaABAg.9uwbsBwU5no9uwz8xQDsRr","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgzcsMhDY2lSt0cyO-x4AaABAg.9uw61pGUYcc9uwGmhxrAA4","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzcsMhDY2lSt0cyO-x4AaABAg.9uw61pGUYcc9uwK61jO9Ua","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwXeRp4BDYyMV0_ePl4AaABAg.9uthrBX_aoD9uuYj_KTu7L","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgwXeRp4BDYyMV0_ePl4AaABAg.9uthrBX_aoD9uvVmJkl0Ew","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_UgyRh9511C7RvTsS3eJ4AaABAg.9usxnMC3jpQ9usySV4LDts","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_UgzfHzZb2MjPJvrYPg54AaABAg.9urOluJbSol9usMq8fAT6U","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
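A raw response like the one above has to be parsed and checked before its codes are trusted, since the model can return malformed JSON or invent category values. The sketch below shows one way to do that in Python. The allowed values are only those observed in this sample; the real codebook (not shown here) may define more, so treat `ALLOWED` as an assumption.

```python
import json

# Code values observed in the sample response above; the full codebook
# is assumed and may include additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "user",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records.

    A record is kept if it is a dict with an "id" field and every coding
    dimension holds a value from the allowed set.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical example: the second record uses an unknown responsibility
# value, so only the first survives validation.
raw = ('[{"id":"ytr_x","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"},'
       '{"id":"ytr_y","responsibility":"alien",'
       '"reasoning":"unclear","policy":"none","emotion":"mixed"}]')
coded = parse_coding_response(raw)
print(len(coded))  # → 1
```

Dropping bad records rather than raising keeps a long batch run alive; the discarded IDs can be logged and re-coded in a follow-up pass.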