Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "As a musician and painter (average at both!) I used to worry about AI wrecking t…" (ytc_UgxT9x1SZ…)
- "Disclaimer: I'm an AI trainer. It depends what you're asking it. The more quanti…" (ytc_UgwZMTLlv…)
- "For creating the placeholders or the draft or not having to cite facts just make…" (ytc_UgzRNqXrX…)
- "ok new Law of Order: \"If you lie you die\" easy. I'm over you humans. I'm not eve…" (ytc_UgyBr92Fg…)
- "There is no Future. Besides smaller microchips, the Internet and AI, nothing muc…" (ytc_UgxCMKJrq…)
- "If you have no more evidence of a person's guilt than a picture, facial recognit…" (ytc_UgyQebD9N…)
- "Or the third and best option: create in a way that you deem satisfying. There ar…" (ytc_UgyznAVas…)
- "I haven't used full on agents before, but from what I have seen it seems vastly …" (ytc_Ugz1qRrL1…)
Comment
When all those billionaire corporate tech giants in the last Bilderberg Group meeting came out of their very very secure meeting and immediately called for regulation over AI you know there's a problem. These billionaire elite capitalists saw something in that meeting that scared them so much they are calling for regulations from the governments of the world. Imagine that for a second. A mega rich capitalist calling for regulation over a potential product. Thats never happened ever. I would've loved to have known what they saw and whats coming.
youtube · AI Harm Incident · 2024-03-16T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwg4Yfw7Ajjrm3liwp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzKY5XCWGi8Tv73hCB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwleszF6ySui7hi7pJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxr4nZYvOUP-I22Y-l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxG76Ti_gSAa0kUl754AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzw3r0XkWU_HZ4TU4F4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwTspLkpBXpWkTmRAh4AaABAg","responsibility":"elite","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzHLZzuZLY1FBpWODF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyvWlfVzPitCnM_TE94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzzlQHy8EJqDaODwKt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
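A raw batch response like the one above is only usable downstream if every row parses and every dimension takes an allowed value. The sketch below shows one way to validate such a response. It is a minimal illustration, not the tool's actual code: the `CODEBOOK` values are inferred only from the rows visible above, and the real codebook may define additional categories.

```python
import json

# Allowed values per coding dimension, inferred from the rows shown
# above (assumption: the full codebook may contain more categories).
CODEBOOK = {
    "responsibility": {"company", "government", "developer", "elite", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "mixed"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and validate each coded comment."""
    rows = json.loads(raw)
    for row in rows:
        # YouTube comment IDs in this dataset all carry the ytc_ prefix.
        if not row.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad comment id: {row.get('id')!r}")
        for dim, allowed in CODEBOOK.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: invalid {dim}={row.get(dim)!r}")
    return rows

raw = ('[{"id":"ytc_Ugwg4Yfw7Ajjrm3liwp4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
rows = parse_batch(raw)
print(rows[0]["policy"])  # regulate
```

Failing loudly on an out-of-codebook value is deliberate: a silently dropped or miscoded row would bias any counts computed over the coded comments.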