Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "Is anybody else but me considering that each mass manufactured electronic device…" (ytc_UgwkpEEV5…)
- "Guys. My brother snitched on me when I was talking to Boxten on poly ai. My mom …" (ytc_UgwvRUwnT…)
- "There have been many movies before Ai even came into action. Its nothing new . B…" (ytc_UgzFNPDOd…)
- "AI or not, in 50 years we'll have mostly automated companies passing the same mo…" (ytc_Ugw-WV0KP…)
- "Everyone is so worried about us getting wiped out by AI, but we never think abou…" (ytc_UgyALVkOH…)
- "A REALLY good reason to be polite to AI is simply to practice being polite with …" (ytc_UgzTx7PoQ…)
- "When a robot tell you that it was surreal experience, can't get more real than t…" (ytc_UgzeXn1MD…)
- "One of the main reasons it didn't see the police cars was the flashing lights. I…" (ytc_Ugwx86xes…)
Comment

> I don't know why the media is all over this guy. He's just peddling the tech-centrist liberal talking points that frighten the public and make tech companies more money. What we really need is something that's most likely not going to happen: a government that cares more about people than profits. What we should really be afraid of is the fact governments won't regulate AI out of their irrational economic exuberance, and humanity will suffer as a result.

Source: youtube · Topic: AI Governance · Posted: 2023-05-11T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
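Each coded comment gets one value per dimension. A minimal sketch of validating such a record against the category sets observed in the data on this page (the real codebook may include values not shown here, so these sets are an assumption):

```python
# Category sets as observed in the samples on this page; the actual
# codebook may be larger (assumption for illustration).
ALLOWED = {
    "responsibility": {"government", "company", "developer", "ai_itself", "none"},
    "reasoning": {"contractualist", "consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "approval", "mixed"},
}

def validate_code(record: dict) -> list[str]:
    """Return the dimension names whose value falls outside the known sets."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coding result shown above passes with no violations.
record = {"responsibility": "government", "reasoning": "contractualist",
          "policy": "regulate", "emotion": "outrage"}
print(validate_code(record))  # []
```

Running this on every entry of a batch response is a quick way to catch off-schema values before they reach the database.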
Raw LLM Response

```json
[
  {"id":"ytc_Ugxhm06UvSKPeD20KGd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxpAWSPiv9Y93L9Dpl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxLT_au8qL4QTepL3Z4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwktViPMxCks1ZiEkV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzQcFTtWkR4vvwCcwN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugz6O3JyRdABbuyhcrN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz3okGizgtfrI-I0Xl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw_37HB9HYj9KJu0pl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw4Jrs6SrObZUubfwh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwEK0HM8jUwFiRV--B4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]
```
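The raw response is a JSON array with one object per comment, keyed by comment ID. Looking up a coded comment by its ID, as the tool above does, can be sketched like this (the function name is hypothetical; the field names match the response shown above):

```python
import json

# A two-entry excerpt of the batch response format shown above.
raw_response = """
[
  {"id": "ytc_UgwEK0HM8jUwFiRV--B4AaABAg", "responsibility": "government",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugw_37HB9HYj9KJu0pl4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a batch coding response and index each entry by its comment ID."""
    entries = json.loads(response_text)
    return {entry["id"]: entry for entry in entries}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgwEK0HM8jUwFiRV--B4AaABAg"]["policy"])  # regulate
```

Indexing once and looking up by ID keeps retrieval O(1) per comment, which matters when a run codes thousands of comments per batch.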