Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
AI art is you ask someone to draw for you and the guy drawing for you looks at s…
ytc_UgwUmsikA…
Did the robot realise it was firing at another robot in the car? or the robot …
ytc_Ugz6hsOT8…
This is literally the most insane shit I've ever seen, it's like a kid that has …
ytc_Ugx16Pe6a…
Alright buddy just because it’s called artificial intelligence doesnt mean that …
ytc_UgzijqYZ_…
even in fucking casinos have AI face recognition software you say i will never g…
ytc_UgzEnjMjU…
@Javilin447 not at all the same argument, teaching people will allow them to de…
ytr_UgyuwF-qH…
My best friend is a distal artist and I see more soul and love in her art than a…
ytc_UgwW2r6_h…
I’m lowkey obsessed with how the Ryne AI editor catches flow issues that most ot…
ytc_UgyGvdck1…
Comment
Ezra comes off as emotional and is anthromorpophising Generalised AI technology would be dangerous because it cannot be ring fenced, it is like looking into a fractal, the deeper you go into it the more escape routes there are for the super intelligence. it will always outwit humans, the only way to avoid human extinction, is to only give very specific rules and silo the utility of each programme and what your aim is with it, ie to cure prostate cancer, AI is so infinitesimally complex, it can't be contained any other way, and if we allow it , will happen so fast and be so far beyond any human acceleration. we can't silo it- it isn't possible. we've had it unless we stop now.
youtube
AI Governance
2025-11-28T17:5…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugwt6DaGWcFenvlbTBp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxyf_KfEQ2SLYok9-d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxNer6d7CZRXqmlaG54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzR2vrJcaG9Ig5JR1B4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyZZ9jHk4bbQfEP0Dx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyuPRpKBn8Os3Fin7N4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzsg8sUUHkAulH3hU14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwLJvQHEMlGkfLb-Ph4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy4HfiN8Djm14kV0pR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy5zIlJQ3h8lFTfr1F4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]
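A response like the one above can be sanity-checked before it enters the dataset. The sketch below is a minimal Python validator, assuming the category vocabularies are exactly the values visible in this page's responses (the real codebook may define more categories) and that `validate_batch` is a hypothetical helper, not part of the tool itself:

```python
import json

# Allowed values per dimension, inferred only from the responses shown
# on this page (assumption: the full codebook may include more).
SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference",
                "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose ID prefix and
    dimension values match the expected schema."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Comment IDs here start with ytc_ (top-level) or ytr_ (reply).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_UgyZZ9jHk4bbQfEP0Dx4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(len(validate_batch(raw)))  # 1 valid row
```

Rows with an out-of-vocabulary value (a hallucinated category, a malformed ID) are silently dropped here; a production pipeline would more likely log them for re-coding.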