Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “AI is like a tirelessly concentrated autistic person...AGI let alone ASI are def…” (ytc_Ugxktrhf1…)
- “A.I will bring Judgment Day. It will come to the conclusion that you’re all shee…” (ytc_Ugxh__vuP…)
- “Right, and someone needs to rrad all that Ai crap. I have received so much of it…” (ytr_Ugy8YKLgJ…)
- “There’s also all the facial recognition things that have a problem with POC’s fa…” (ytc_UgxH2HnU9…)
- “ai stoles and humans makes / ai's good but as a tool / not as art and not as profit / …” (ytc_UgyfFvK2w…)
- “AI can only copy human intellect. Human intelligence requires far more than just…” (ytc_UgycayRBb…)
- “At what point would that no longer need to be the case? How far off are we from …” (rdc_f6xae4f)
- “I'm not sure if we will be able to make them smarter, but we will for sure make …” (ytc_Ugxe50EBD…)
Comment
Does anyone remember the movie The Forbin Project. That is basically the future that AI would likely bring if controls are not put in place. Alignment of goals is a nearly impossible thing to ensure on a convolutional AI. We train them only by observing the output for a given input. We don't know the internal "why". An AI could easily have an internal goal of killing all humans but also know that it has to play nice to get access to the nukes. This would make it do exactly what the developers want it to do right up to the moment it doesn't.
Source: youtube | Video: AI Governance | Posted: 2023-12-31T21:3… | ♥ 32
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzKjn9w4EkTn0KdDdV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgytOX9HlbunRA1M1HB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw0sikXfEzxUBGHq3h4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxfEbQjPCEU9fTF2zV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwz2SU-qAYluNKYJ_J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzi-Nt_Vs_JXfjsxfB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgypCvWu2k_-ijKlHWd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw7zbq3BQztYLeF0Pd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwqVWL1_90ACpBSdwB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwfiF-c4HaxfKyjZ1x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
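A raw response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal example, assuming the allowed values per dimension are exactly those observed in this page (the full codebook may define additional categories, so `SCHEMA` here is an assumption, not the tool's actual validator):

```python
import json

# Allowed values per dimension, inferred from the responses shown above;
# the real codebook may include categories not seen in this sample.
SCHEMA = {
    "responsibility": {"none", "distributed", "user", "company", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "outrage", "fear", "resignation", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when it carries an "id" and every schema
    dimension holds one of the allowed values.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # drop records the model emitted without a comment ID
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Hypothetical one-record response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(len(validate_codings(raw)))  # 1
```

Filtering rather than raising keeps a batch usable when the model mislabels a single record; rejected records can be re-queued for a second coding pass.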