Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "oh cool a video that doesn't treat ai as a pure evil or pure good thing and actu…" — `ytc_Ugz-shLGt…`
- "I know it was a bit after his passing, but it really felt like after Kim Jung Gi…" — `ytc_UgwcP_Gyf…`
- "People might think from looking at this footage that there was plenty of time fo…" — `ytc_UgyB9ZWHD…`
- "Never was a good artist myself. But i just draw what i like and how i like. AI a…" — `ytc_Ugx1KFII-…`
- "AI is not a problem. AI is a tool. People need to learn how to use it, but most …" — `ytc_Ugy0FBIFM…`
- "Platforms like Mechanical Turk hve been around for a while, not surprising there…" — `ytc_Ugxi5XWnI…`
- "People seem to forget that ss power and relatively easy as AI is, it's not withi…" — `ytc_UgxMEE-ub…`
- "All AI should have 3 rules of robotics from Isaac Asimov. These should be the 1s…" — `ytc_UgzjZYXOC…`
Comment
> 5:00 I don't think telling people that AI might kill everyone is a good advert for AI.
> But really I'm fed up with what AI is doing to art and music at the moment. But hay! If it does kill everyone then that will be the least of our worries!
> Also, 10%? Over what time period? Once AI is built then it will be here forever. And it will constantly improve and "evolve" and we will be doing less and less. So in 100 years when we have this "luxury communist utopia" and AI is more intelligent than all the people and self aware. What use then does AI have for us useless meat bags?
Platform: youtube
Posted: 2026-02-20T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyMh5AMppNHjZYKAuV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyxCaVk5ecRAKfm6hR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwsUqOk736FhEv3qbp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxmAAq3w2DCzuqFPgl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgySN5b4my4KuvjwbJZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugx2rD2voSyHgHWbpWN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz4lEkEbILk4AFg_4F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyhOIDb-cns0luRg394AaABAg","responsibility":"government","reasoning":"contractualist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy0n8PJBHsrLZngxwp4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxfULvpGWlkMZyd8814AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
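A raw response like the one above has to be parsed and checked before its codes reach the results table. The sketch below is a minimal, hypothetical validator, not the pipeline's actual code: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) and the values come from the JSON shown here, but the full allowed value sets per dimension are an assumption beyond what these samples reveal.

```python
import json

# Code book per dimension. Values observed in this dashboard; the complete
# allowed sets are an assumption, not confirmed by the source.
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "government", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"ban", "liability", "regulate", "none"},
    "emotion": {"fear", "outrage", "resignation", "mixed", "indifference"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting rows
    with missing fields or values outside the code book."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        if not (isinstance(cid, str) and cid.startswith("ytc_")):
            raise ValueError(f"bad comment id: {cid!r}")
        codes = {}
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: {dim}={value!r} not in code book")
            codes[dim] = value
        coded[cid] = codes
    return coded

# One row from the response above, coded as in the "Coding Result" table.
raw = ('[{"id":"ytc_UgwsUqOk736FhEv3qbp4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"resignation"}]')
coded = parse_coding_response(raw)
print(coded["ytc_UgwsUqOk736FhEv3qbp4AaABAg"]["emotion"])  # resignation
```

Failing loudly on out-of-code-book values is deliberate: silently accepting a novel label from the model would corrupt the coded dataset without any trace in the dashboard.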