Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- rdc_m2e2hr7: As a person who actually works for an early psychosis intervention clinic I HEAV…
- ytc_UgwfwbP9Y…: "when the time comes to build a highway, we don't ask animals for permission " A…
- ytc_UgwON1by8…: i think the same think about AI in general the politicias will use it to optimiz…
- ytr_UgzCSpE4l…: Great observation! "Sophia" does indeed mean wisdom in Greek, which ties beautif…
- ytc_Ugy3JiCMr…: Yeah, I once tried letting chatgpt and bard have a go at a simple calculation ba…
- ytc_Ugy-cKU8P…: I'll just quietly note that I don't think anyone is particularly curious about t…
- ytc_Ugztddheq…: My personal opinion is that people who haven’t learned an art and gained the pre…
- rdc_jkonh4t: For what it's worth, I recently had some serious medical issues and dumped the r…
Comment

> I see no reason that developing AI to be safe, and exploring our physical universe with first robots and then people, could happen. It'd be an infinitely stimulating experience and a replacement for war. Lets go Star Trek.

Source: youtube
Topic: AI Governance
Posted: 2025-07-22T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzXwvVzfsEetQL0lAZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwNeBZhFOooC4gJ4714AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugxg9ed8Me48bDo83V94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwMt4FolwMUwXvQ6zx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxlgaY6nlJOdchja4x4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy52xOdGvDrKEXINnB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxDpai-SqtvuZnqjYt4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyidCFxP6moqTPqp954AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxzBUbOiKGAIbq45Yl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzxIBvPgAnjxDKgQ_h4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
```