Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Random samples (click to inspect):

- "11months old and it's still news to me. I didn't think everyone would be so dumb…" (ytc_Ugy2qPJeo…)
- "Well I guess it's best to start learning about robots then 🤷🏻♂️" (Human Intellig… — ytc_UgyBIrj8b…)
- "Maybe someday we'll just put our super ai on Mars with limited resources so it c…" (ytc_UgwmVogrr…)
- "I have the same feeling but with Anthropic's Claude after exhaustively using it …" (rdc_my7ouq0)
- "I have this economical Ai theory: I think there needs to be a restructuring of …" (ytc_UgxIBdRtm…)
- "i much prefer the feeling of hand making my art rather than having a computer do…" (ytc_Ugwyi2RrO…)
- "As a cinematographer, I gotta say. The way this video was framed is a little su…" (ytc_UgxuUfarA…)
- "12:12 If \"everyone should own themselves\" (which is not something which can be c…" (ytc_UgwQ91Xxt…)
Comment (youtube · AI Governance · 2023-03-31T22:5…):

> @douglee4687 MS shrank its 30 member ethics and society group down to 7 due to the interference of the team to push their OpenAI product to market before their competition. They want that sweet sweet chedda'!
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
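Each coding assigns exactly one value per dimension. A minimal sketch of checking a coding against the value sets that appear in this batch (the sets below are only the values observed in this sample output, not necessarily the tool's full codebook):

```python
# Dimension values observed in this batch; the real codebook may define more.
OBSERVED_VALUES = {
    "responsibility": {"company", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"liability", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "resignation"},
}

def validate_coding(coding: dict) -> list[str]:
    """Return the dimensions whose value falls outside the observed sets."""
    return [dim for dim, allowed in OBSERVED_VALUES.items()
            if coding.get(dim) not in allowed]

# The coding shown in the table above passes cleanly.
print(validate_coding({"responsibility": "company",
                       "reasoning": "consequentialist",
                       "policy": "liability",
                       "emotion": "outrage"}))
# → []
```

A check like this is useful for catching codings where the model drifted outside the expected labels.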
Raw LLM Response
[
{"id":"ytr_UgwJZoDDlg98dprfXbN4AaABAg.9nsCEb0moA19nwdIDdWecF","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugx1wSrFcb5bmBIHq_14AaABAg.9nsBfcRxwZZ9nsZt7lmgi_","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugx53gVQRfrB0WewCaF4AaABAg.9nsAIP2yrw79nskCqH2BwA","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugx53gVQRfrB0WewCaF4AaABAg.9nsAIP2yrw79nsnL-EDRGD","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugx53gVQRfrB0WewCaF4AaABAg.9nsAIP2yrw79nsnyjXjYjB","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugx3wd-12H38vyuTQVF4AaABAg.9nsAHCa3XKJ9nsEcgrVI7Q","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwDt-1g4RgGWajY3Td4AaABAg.A3puKxGCGt8AHhOKd2zzUp","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytr_UgzJybAmyxfZ1NFPZiZ4AaABAg.A1qNMAAfHDnA3Pr6yIKlyz","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytr_UgwdP0GOKRf8AHUxQIx4AaABAg.A1-vMckcgCJAEyIRr__FPt","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxIvfOx8Zxyquvg3Yl4AaABAg.9xyTSj5KVQe9zf4FTIRbW1","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
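The raw response is a JSON array with one coding object per comment, each keyed by its comment ID. A minimal sketch of parsing such a response and looking up a coding by ID (the function and variable names are illustrative, not part of the tool; the excerpt uses two entries from the array above):

```python
import json

# Two-entry excerpt of the raw LLM response shown above.
raw_response = """[
  {"id": "ytr_UgwJZoDDlg98dprfXbN4AaABAg.9nsCEb0moA19nwdIDdWecF",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_Ugx3wd-12H38vyuTQVF4AaABAg.9nsAHCa3XKJ9nsEcgrVI7Q",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "regulate", "emotion": "fear"}
]"""

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse the raw response and key each coding object by its comment ID."""
    return {coding["id"]: coding for coding in json.loads(raw)}

codings = index_by_id(raw_response)
print(codings["ytr_UgwJZoDDlg98dprfXbN4AaABAg.9nsCEb0moA19nwdIDdWecF"]["emotion"])
# → outrage
```

Indexing by ID is what makes the "look up by comment ID" workflow above a single dictionary access per query.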