Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Excellent synthesis. Very apt analogies. Looking at the tail end of the graph, I…" (ytc_UgzRFTQ1h…)
- "The only use that I could see for driverless vehicles was would be for people wi…" (ytc_UgwNY1O8K…)
- "AI won’t need to conquer humanity by force. We’ll willingly hand over decision-m…" (ytc_Ugzvs2G3_…)
- "Why is everyone complaining can any of you actually draw or paint. How about wor…" (ytc_UgzAx5_Bl…)
- "Correct, that's why I called it a middle ground. Arming robots yet, but not in …" (rdc_gs6jhg9)
- "AI is useful but its soulless I feel good when I see art I know someone took tim…" (ytc_UgzVi8WAb…)
- "You gave a goddamn Tommy Gun to a f@#$ing robot!?! 🤷🏼♂️ Did you guys miss …" (ytc_UgwmwkAQd…)
- "I wonder if these trucks have remote human supervisors like the passenger cars i…" (ytc_UgzYF0D9I…)
Comment
Not happy that you skipped the two studies that looked into whether or not an AI would kill a person to prevent its own shutdown
And both studies found that AI have a terrifyingly high likelihood of killing a person to prevent its own shutdown
Source: youtube · 2025-11-08T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgzXOLLf0YGuhfdH2uF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugys7rExqJrriIRxN9h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzmUaNfrpVg046aRWt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx0DyEQOU2kgoDp9sx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzTNyZkytSvstTZSBt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz9Nv-9_TYRhGua-kJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzHgiOG_90brnQvNLp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy_U1lsrKMIj8-Ng8l4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwkzrroFQde2Ts_26l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugyba8NbLggj1W6AmjB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}]
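The raw response is a JSON array with one coding object per comment ID, covering the four dimensions shown in the table above. A minimal sketch of the lookup step, assuming valid JSON and an "unclear" fallback for any comment or dimension missing from the response (the `code_for` helper and the fallback rule are illustrative, not the tool's actual implementation):

```python
import json

# Shortened example of a raw LLM response: a JSON array of
# coding objects, one per comment ID (two rows for illustration).
raw = '''[
 {"id":"ytc_UgzXOLLf0YGuhfdH2uF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_Ugys7rExqJrriIRxN9h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def code_for(raw_json: str, comment_id: str) -> dict:
    """Return the coding row for one comment ID.

    Any dimension the model omitted, and any comment ID absent
    from the response, falls back to "unclear" (an assumed rule,
    consistent with the all-"unclear" table shown above).
    """
    for row in json.loads(raw_json):
        if row.get("id") == comment_id:
            return {d: row.get(d, "unclear") for d in DIMENSIONS}
    return {d: "unclear" for d in DIMENSIONS}

print(code_for(raw, "ytc_UgzXOLLf0YGuhfdH2uF4AaABAg")["policy"])  # liability
```

A lookup for an ID not present in the array returns "unclear" across all four dimensions, which would reproduce the table rendered for the comment above.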