Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgxskQN6o…`: I don’t see why people are linking the words “art” and “ai” now. They have no re…
- `ytc_UgxN67PwP…`: The ai can have this planet. All we do is fight each other and exploit .....ceo…
- `ytc_Ugy4OJP2N…`: Excellent watch. Seeing all the stats pushed out about new AI jobs creation, I a…
- `ytc_UgxOnwtr1…`: It's way more than 40% if hes only counting these "basic to semi complex" jobs. …
- `ytr_UgwxNP_Zx…`: You can't... Make a diagram yourself? What's an art night?? Why do you need an o…
- `ytc_UgzpMnvuw…`: The government needs citizens to tax...if robots and AI take over everything the…
- `ytc_UgwUxsPT9…`: I think this is one of the best video - that is what robot will do to humans if …
- `ytr_Ugwdb59h9…`: Indeed, but... As said above, even 10% is way too high. Second, "I don't know" i…
Comment
What everyone needs to understand is, AI DOESN'T NEED to be in the same overall level of a human to be able to do harm. They're already better than us in many things and very good at SIMULATING others. All we need to do is to create one that simulates self-preservation and self-awareness and self-perpetuating and is connected in any means to the internet. Done, judgementday.exe in a nutshell.
Source: youtube · Posted: 2024-03-13T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwAIQWdhka0W2tkgWV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxO-Ml3WkgZN35lxOp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwP0CEsw8TkNyJfbyB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugyebo3dIVaybKCzPjh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxhSg_hj9EtcvpBpcR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyOMK9pS_7hX0F2x5F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugy-EzZvXtLEO5bYzCZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxZZymxW2GukYG1sOV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyDt4diDrqXrwfXdw14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz_w1paPRmMaqq-1wp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
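The raw response is a JSON array with one object per coded comment. A minimal validation sketch for such a batch, assuming the dimension vocabularies are exactly the values visible in the responses above (the real codebook may define additional categories, and `VOCAB` here is an inferred placeholder, not the project's actual schema):

```python
import json

# Allowed values per dimension, inferred only from the codes visible above
# (assumption: the actual codebook may contain more categories).
VOCAB = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "approval",
                "indifference", "resignation"},
}

def validate_batch(raw):
    """Parse a raw LLM batch response and check every code against VOCAB."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs here start with ytc_ (top-level) or ytr_ (reply).
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError("bad comment id: %r" % row.get("id"))
        for dim, allowed in VOCAB.items():
            if row.get(dim) not in allowed:
                raise ValueError("%s: %s=%r not in vocab"
                                 % (row["id"], dim, row.get(dim)))
    return rows

raw = ('[{"id":"ytc_UgwAIQWdhka0W2tkgWV4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
print(len(validate_batch(raw)))  # 1
```

Rejecting the whole batch on the first out-of-vocabulary code keeps malformed model output from silently entering the coded dataset; a lenient variant could instead map unknown values to "unclear".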