Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> I keep asking what will happen when the AI discovers that we are plain idiots. Will it still define our goals to be it's goals? Or is it going to redefine things for us to something it figures out as 'truly worthwhile'.

youtube · AI Governance · 2024-05-15T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgygHOY4OoO3K0q-O_54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwI-0nte8IHC2LqV3t4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugww8otnNsnHTzrT-4J4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwWxS297z5PRVlQJIl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugwwun-sRZ7l8oEHhhJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyobUJZaDcrmmH6LlR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzOzkJ0kOmEf5HbbBp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzKFAnOXsRve4PWYJV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugx2RAzKKwNSEA3a4tV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzWXgSB_ljPEOM4YSB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
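A response like the one above should be validated before the codes are stored, since the model can emit malformed records or values outside the codebook. Below is a minimal sketch of such a check. The allowed values are inferred only from the codes visible in this response (the real codebook may include more), and `validate_batch` is a hypothetical helper name, not part of the actual tool.

```python
import json

# Allowed code values per dimension, inferred from the response shown
# above — an assumption, not the project's authoritative codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "user",
                       "government", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Each record must be an object with a comment ID.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Every dimension must be present with an allowed value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"unclear",'
       '"policy":"unclear","emotion":"fear"},'
       '{"id":"ytc_y","responsibility":"alien"}]')
print(len(validate_batch(raw)))  # 1 — the second record is dropped
```

Records that fail the check can then be logged and re-queued for recoding rather than silently stored with out-of-vocabulary values.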