Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "You told an LLM to roleplay, and it did. Then you reacted as if the LLM had the…" (ytc_UgwNGGrYR…)
- "Even companies run by AI have to sell their products and services to someone. An…" (ytc_UgxmorSz6…)
- "I made Skibidi Elmo out of the prompt: “Elmo with a long neck peeking out of a b…" (ytc_UgzTYpxE3…)
- "2nd attempt: the basis of crime is the class divide, a lack of a job, lack of b…" (ytr_UgwpUoUAr…)
- "A.I. is a crock. Plumbers, electricians, doctors, lawyers, police officers, a…" (ytc_UgwMze8t9…)
- "Publicity stunt, they themselves hire someone to make such videos as it makes t…" (ytc_UgzqxlP1z…)
- "Ai is not the problem. Or the machines. The problem is the humans. We never lea…" (ytc_Ugxjfa6j5…)
- "So when AI gets into those Boston Dynamics robots they will be literally dancing…" (ytc_UgxZ42R9O…)
Comment
But... Our P-Doom if not centered around a specific, like nukes or AI it's 100%! So the clock idea makes sense, but I feel like we're comparing apples and oranges, here?
Also, if our doom were to come from AI, nuclear is very low probability of being the tool used by it; but could be our desperate response. AI would not use something that will hurt itself and the resources, while using a biological weapon can be selective.
youtube · AI Governance · 2026-02-25T06:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyguxlSmhlIKh4gZdd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgweDRYen7rHTPUc3lR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgywEqcCcgDCABDNsJt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwWtSy0N1tjtMhxcZd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyVfVxsND3Ua3tNcqV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwrISWJ7hLjSvcP1Zd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwyK5F1g2Q8-W0m5Wl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzGSkrwJrn7aXPYS454AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzlL3VQZqpQHX0KRBN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw0R9HqqG275eEcUxt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
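The raw response above is a JSON array with one coding object per comment, each carrying the comment ID and the four coded dimensions (responsibility, reasoning, policy, emotion). Looking up a coding by comment ID then reduces to parsing the array and indexing it by `id`. The sketch below illustrates this under stated assumptions: the `index_by_id` helper is hypothetical (not part of the tool shown), and the two sample rows are copied verbatim from the response above.

```python
import json

# Two rows copied from the raw model response shown above
# (a JSON array of per-comment coding objects).
raw_response = """[
  {"id":"ytc_UgyguxlSmhlIKh4gZdd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgywEqcCcgDCABDNsJt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]"""

def index_by_id(response_text: str) -> dict:
    """Parse a batch coding response and index the rows by comment ID.

    Hypothetical helper: assumes the response is a valid JSON array
    whose elements each contain an "id" field, as in the dump above.
    """
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_by_id(raw_response)

# Look up the coding for one comment by its ID.
coding = codings["ytc_UgywEqcCcgDCABDNsJt4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # ai_itself fear
```

In practice the model may wrap the array in prose or a code fence, so a production version would strip such wrappers before calling `json.loads`; the sketch assumes a clean array as displayed here.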