Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> The risk is that human civilization is entirely destroyed. This is not an edge-case risk - the end of life on earth is a plausible outcome of deploying this technology, making AI as potentially dangerous as nuclear weapons. Having these special interest groups that seek profit make the case for AI is not a good idea.

Source: youtube · Video: AI Governance · Posted: 2023-05-17T13:0… · ♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugy7077bntBMstBT1R94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxGZ-mM2OcnMSWDPpt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwI1q-het8yxTQn-Yl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugyk9JPqd43xFjrh26d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyptKh4c5yfbWW6SbB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzqo3vyetWpEbt4TNF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy69TkwGtyotvTzy_V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz6U5F2bS0YoLThD7J4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzNilJdNQWMbrFcsTN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzzQ-JIgxOU4gsnN4l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
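The raw response above is a JSON array of per-comment records, each carrying the comment ID plus the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and indexed for ID lookup is below; the function name `index_codings` and the `DIMENSIONS` value sets are assumptions inferred from the values visible in this batch, not the tool's actual codebook or API.

```python
import json

# Two records taken verbatim from the batch above; the full response has ten.
RAW_RESPONSE = """
[
  {"id": "ytc_UgyptKh4c5yfbWW6SbB4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy7077bntBMstBT1R94AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]
"""

# Hypothetical value vocabularies, inferred from the values that appear in
# this batch; the real codebook may define additional categories.
DIMENSIONS = {
    "responsibility": {"none", "ai_itself", "government", "company", "developer"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"approval", "mixed", "outrage", "fear",
                "resignation", "indifference"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM batch response and index the codings by comment ID,
    dropping any record with a missing or out-of-vocabulary value."""
    by_id = {}
    for record in json.loads(raw):
        if all(record.get(dim) in allowed
               for dim, allowed in DIMENSIONS.items()):
            by_id[record["id"]] = record
    return by_id

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_UgyptKh4c5yfbWW6SbB4AaABAg"]["emotion"])  # fear
```

Validating against fixed vocabularies before indexing is what makes the "look up by comment ID" view reliable: a malformed or hallucinated category in the model output is dropped at parse time rather than surfacing in the coded table.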