Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or browse the random samples below.

Random samples:
- "Forget about the worries of AI what about the worries of Hybrid AI, When we have…" (ytc_UgysNTFQW…)
- "Just a thought, sometime in the future you may be able to Connect a robot or rob…" (ytc_UgyVgE6c6…)
- "This guy thinks the AI's reply was funny? Is it not defying him? No, Its lying …" (ytc_UgxHjDWtc…)
- "what would the positives be? What if AI gains / has sentience - how could we eth…" (ytr_UgyhiAiSO…)
- "This is basically the same thing running chat support. How often does the chat b…" (ytc_Ugx1auMf-…)
- "The creators of the ai programs have literally begged the public to stop saying …" (ytc_UgzlGIZFi…)
- "Right. The trend for a century has been to increase productivity while reducing …" (rdc_lv8j518)
- "Hans ends with, "Good Riddance". Stephen Hawking warned us of AI for a reason...…" (ytc_UgxhWYKUP…)
Comment

> Always had a feeling that humans only last another 500-700 years. Humans can't even get all the basics in order and yet they keep trying to exponentially advance. Whether it's AI or some other way, humans will destroy themselves.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2023-05-04T07:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwPR2Kmr68oavJc64B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzctNjZOQxS8IxhAF14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxR3mtlRHv9Ka0FvFJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyvYU36nnExB_Rvd_d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgztE_des2X7uXGUS294AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzIBDTjA77Mpwn3yiB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyKZ-QFXQINmVyOOnl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwCvJPB0cnp8dAMJCR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx9ihceHIq5Ts0ceq54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxrfQQ4UHofbm9NDTZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
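A batch response like the one above is only usable if every entry sticks to the codebook's controlled vocabularies. The sketch below, a minimal Python validator, parses a raw response and rejects out-of-vocabulary codes. The allowed value sets are assumptions inferred from the values visible on this page; the real codebook may define additional categories.

```python
import json

# Controlled vocabularies per coding dimension. These sets are assumed
# from the values observed in the dashboard, not the official codebook.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "user",
                       "ai_itself", "distributed"},
    "reasoning": {"mixed", "virtue", "deontological",
                  "consequentialist", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "outrage", "approval", "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject out-of-vocabulary codes."""
    entries = json.loads(raw)
    for entry in entries:
        for dim, allowed in ALLOWED.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{entry.get('id')}: bad {dim}={value!r}")
    return entries

# Hypothetical single-entry batch for illustration.
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"fear"}]')
print(len(validate_batch(raw)))  # 1
```

Failing loudly on an unexpected code is usually preferable to silently storing it, since a model that drifts off-vocabulary would otherwise corrupt downstream dimension tallies.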