Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up any coded comment by its ID, or pick from the random samples below:
- The programs that AI has written are trash. Junior programmers aren’t getting hi… (ytc_UgwvctKdu…)
- I love ai but doesn't that mean human art lost it value? no!, ai never be perfec… (ytc_UgxQRAg7a…)
- As soon as Tesla called it Auto Pilot they implied completely autonomous self-d… (ytc_UgxagnHKB…)
- Personally I gaslight the ai to insanity or turn into a bag of Cheetos and fly a… (ytc_UgyC3f4Si…)
- 1. What you said is probably a lie, because these programs are not a good source… (ytr_UgyuYGBWk…)
- I think we’re not worried about Larry Page and GOOGLE it’s the unknown that’s th… (ytc_Ugz-_zpH2…)
- Ask AI anything about a jew and it will tell you a lie and accuse you of being a… (ytc_UgzAVaEGO…)
- That line "Art is hard" anger's me so much like yeah we know its hard but you sh… (ytc_UgztWVN_h…)
Comment
> Boo hoo, who cares 🙄 paranoia as usual by one person out of 8 billion pepel. Similar kind of scare-mongering reasoning by Luddites during the Industrial Revolution... just hook the AI to everyones' bank account and instruct the AI to make everyone rich. That will lay everyones' concerns to rest. How it goes about doing that we don't need to overthink too much about as it leads to pointless analysis paralysis, debates that go round in circles, and cognitive overload. We want AI to solve global poverty do we not? If it did make a 'mistake' and just withdraws everyones' money and puts it in a rocket to send to outerspace... just make a better AI next time...

youtube · AI Governance · 2023-05-02T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzrpPuDWJI9C6V53uF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxzw3Rd0PXGFxIG7Eh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugyk1dnsAZgNINr3BSN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxD7Z7rS0-cJ6THYXp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyw_dtHomgJlaissu54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwiA9HRzH-5srh78qV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzQHooUjhnM6bcACNx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugw2zAavHuJ3jL5Vw6B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxhe-UPZtqniTUv4xh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgyNwBPOfLA9xur1FdJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
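A minimal sketch of how the lookup view could consume a raw batch response like the one above: parse the JSON array and index the records by comment ID. The helper name is hypothetical; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) follow the JSON shown here.

```python
import json

# One record from the raw batch above, as the page would receive it.
raw_response = """[
  {"id": "ytc_Ugxhe-UPZtqniTUv4xh4AaABAg",
   "responsibility": "none",
   "reasoning": "consequentialist",
   "policy": "industry_self",
   "emotion": "resignation"}
]"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw LLM batch response and index coded records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

coded = index_by_comment_id(raw_response)
record = coded["ytc_Ugxhe-UPZtqniTUv4xh4AaABAg"]
print(record["policy"])   # industry_self
print(record["emotion"])  # resignation
```

In practice the model output would be validated before indexing (e.g. checking that every dimension is present and its value is in the codebook), since a malformed batch would otherwise surface as a missing-key error at lookup time.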