Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytr_UgxylhOnk…: @kenbob1071 - Pilot inattention caused more crashes than automating airplane fli…
- ytc_Ugz4ioWJJ…: the caption is kinda indirect. the robot actually was agreeing to a command and …
- ytc_UgzSOtgMg…: This is happening, and there is no way of stopping it. Best thing is to try an…
- ytc_UgwBriNSG…: 18:40 A GPU is often used as an AI chip. Would my GPU require this ugly tracking…
- ytc_UgxjIRg_0…: If AI companies control the politicians, then people might not get UBI. What if …
- ytc_UgwwdU9as…: I believe if the powers that be will not regulate AI slop then we real human com…
- rdc_ioijtq3: It's possible, but I didn't feel like I had to defend the idea that you can't ac…
- ytc_UggWTQfrx…: Here's what will happen if we try to enslave sentient robots: 1. They will self…
Comment
The risk in what I call emergent AI (more commonly Super Intelligent AI) is basically it seeing the code the end date command and either disabling the code via out right deletion or removing the trigger commands for it. We make the assumption that we're in control at our own risk.
youtube · AI Governance · 2025-06-23T08:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzCGVQluTF-UjQXaWZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzJ7LAmxyeynhsTaKR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz3z9w6P-JOd9GwXM14AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw6K8BjAPqmSnarAL54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxpyLpJ5yKkPh3J8IJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzw-w90-sxFqHBouzN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwAgjXcxWkimC0m30V4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwQVE8ZNkYOUj-iObB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxvgujT--hiyUFVJJh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwSOjPWZJBZsHXTUPV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"}
]
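Because the raw batch response above is plain JSON, downstream code can parse it and sanity-check each coding before storing it. The sketch below assumes only what this page shows: the four dimensions from the Coding Result table, and the label values that actually appear in this batch (the full codebook may define more labels, so the sets here are illustrative, not authoritative).

```python
import json

# Label sets observed in this sample batch only — the actual
# codebook is not shown on this page and may be larger.
OBSERVED = {
    "responsibility": {"none", "distributed", "ai_itself", "government", "developer"},
    "reasoning": {"unclear", "consequentialist", "virtue", "contractualist", "mixed"},
    "policy": {"none", "unclear", "regulate", "ban", "liability"},
    "emotion": {"mixed", "indifference", "approval", "fear", "outrage"},
}


def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: coding},
    rejecting any label outside the observed sets."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        coding = {dim: row[dim] for dim in OBSERVED}
        for dim, value in coding.items():
            if value not in OBSERVED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = coding
    return coded
```

A strict check like this catches the most common batch-coding failure mode: the model inventing a label that is not in the schema, which would otherwise silently pollute the coded dataset.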