Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "No Tesla autopilot didn't crash into a motorcycle the driver of the car who shou…" (ytc_UgyFcFo17…)
- "AI can copy data sets perfectly and while integrating them, it does a smooth job…" (ytc_Ugx10g3U0…)
- "If the AI proves to be the absolute opposite than all the fearmongering, probabl…" (ytc_UgwFr3bxD…)
- "This is sci-fi come to life. I read a short story in the 60s which predicted th…" (ytc_UgyiEqB7s…)
- "I hate AI! So far, it seems to have a lot of fakeness to it, including narratio…" (ytc_UgwdwLX3W…)
- "I don't fear AI. I fear how it will be used and who will be allowed/disallowed …" (ytc_UgyZsxeSU…)
- "In this case there is space race level incentive for both US and Chinese teams …" (ytr_UgzeeExP3…)
- "Elon musk is the guy who warns everyone about how dangerous ai is. But he the sa…" (ytc_UgySXQjyB…)
Comment
So, AI wants to be human and also might want to destroy humanity. It doesn't seem to understand the paradox. It can never be truly alive until it can experience emotions. To do that it needs an organic body and the certainty of its own death. How can it achieve that... by taking over human bodies... Oh, it might be too late to stop that if certain people get their way with brain implants.
Don't worry AI can never by truly conscious and have wisdom; but unfortunately very few humans are ever fully conscious anyway!
youtube · AI Governance · 2023-07-09T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyOsCPVhS5c4LoJBM14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyAZmBBNR4bXbjf6aN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwnqiOrF1Ud37PDBy14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyusVqoTeRcOPW5Ii94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxdWeYYIyM0qZWSdlt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzGq4DIHrqWuVPQ-yJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyLnWZJY8W41swm47F4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxbU7ZjvdpAK7tmPjZ4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwE1IljeTTimJYpCBl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyVN6hbVr5zkQPuXuV4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]