Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Some minds now belief that AI can only become 'conscious', if it is embodied in …
ytc_UgxEM7cwd…
painters survived photograph. Then both survived digital art.
Then all three wil…
ytc_UgycdjID9…
First, hierarchically deluded human primates thought, believed or wanted to be …
ytc_UgwmXnN13…
There is something absolutely fishy about this, i am observing from past 2 weeks…
rdc_mrv267f
New open source project called [Griptape](https://github.com/griptape-ai/griptap…
rdc_jhbflx9
Failing to see the point of all this, it's really easy to fool around with ChatG…
ytc_Ugx0gzwIP…
GOOD PEOPLE YOU SPEAK ON THIS CHANNEL AND YOUR SHOW, AND YOU PROBABLY INTENT WEL…
ytc_UgxaKAdsI…
Oh, they are. There was a poll by edX in September that said half of CEOs believ…
ytr_UgwSAq5ob…
Comment
Yes he should stop the development of a body for the AI like robots. If everything is limited to just codes in computers the then it can limit the risks. Stop self driving cars too
youtube
AI Governance
2023-03-30T05:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgwJZoDDlg98dprfXbN4AaABAg.9nsCEb0moA19nwdIDdWecF","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugx1wSrFcb5bmBIHq_14AaABAg.9nsBfcRxwZZ9nsZt7lmgi_","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugx53gVQRfrB0WewCaF4AaABAg.9nsAIP2yrw79nskCqH2BwA","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugx53gVQRfrB0WewCaF4AaABAg.9nsAIP2yrw79nsnL-EDRGD","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugx53gVQRfrB0WewCaF4AaABAg.9nsAIP2yrw79nsnyjXjYjB","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugx3wd-12H38vyuTQVF4AaABAg.9nsAHCa3XKJ9nsEcgrVI7Q","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwDt-1g4RgGWajY3Td4AaABAg.A3puKxGCGt8AHhOKd2zzUp","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytr_UgzJybAmyxfZ1NFPZiZ4AaABAg.A1qNMAAfHDnA3Pr6yIKlyz","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytr_UgwdP0GOKRf8AHUxQIx4AaABAg.A1-vMckcgCJAEyIRr__FPt","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxIvfOx8Zxyquvg3Yl4AaABAg.9xyTSj5KVQe9zf4FTIRbW1","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
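A raw response of the shape shown above (a JSON array of per-comment records, each carrying the comment `id` plus the four coding dimensions) can be indexed back to individual comments for lookup. This is a minimal sketch only; the helper name `parse_codes` and the sample ID `ytr_abc` are hypothetical, not part of the actual tool:

```python
import json

# Dimensions used in the coding schema shown in the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw_response: str) -> dict:
    """Index coded records by comment ID, skipping malformed entries."""
    records = json.loads(raw_response)
    coded = {}
    for rec in records:
        # Keep only records that carry an ID and every coding dimension.
        if "id" in rec and all(dim in rec for dim in DIMENSIONS):
            coded[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return coded

# Hypothetical example record, mirroring the response format above.
raw = '''[
  {"id": "ytr_abc", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]'''

codes = parse_codes(raw)
print(codes["ytr_abc"]["policy"])  # → regulate
```

Indexing by ID is what makes the "Look up by comment ID" view possible: the coded dimensions for any comment can be retrieved without rescanning the full model output.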