Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
This is fun and grounding. It refocuses mankind on learning the basics again. AI…
ytc_UgxQnNduQ…
I very much agree, though I would hate to generalize. If AI users can at least s…
ytr_UgzLtJidq…
My favourite robot slur is toaster, because it is a reminder that they are as se…
ytc_Ugz7sibxL…
If AI absolutely takes over jobs, than the result will be a radical change in so…
ytc_UgyXIfp8u…
so much of this ai forecasting dialogue is so skewed towards hyper exaggerated c…
ytr_Ugyyix_RF…
Spot on Dr Armen. AI can be a great tool... It can make one person a better crit…
ytc_UgytBm5cQ…
Making money off of something that you did not make, rather just typing in a pro…
ytr_UgxDsvyIE…
A good portion of the 'evidence' that AI is going to replace everyone originates…
ytc_UgwGCo9Z-…
Comment
The problem is that AI is also written by humans. If you've ever written code, you have accidentally failed to close a line or something and the program kept running over itself or endlessly repeating. Apply this to a valve and it will never close or never open. Programs are just as, if not more fallable than humans.
The smartest AI's we have know a lot of things but don't actually understand those things. If a human misunderstands, they at least have the brain capacity to second-guess themselves or hesitate. A program will just do and the consequences could be much worse
The reason chatbots can sometimes seem alive, because they're made to write stories. They hallucinate and write prose, except they can't tell the difference between real life and fantasy. To the ai, Kevin is just a fictional character in her story about a chatbot coming to life
Source: youtube · AI Governance · 2023-07-15T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx2tV_jX8FfBo4tNoh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzLeYpeJFubL87EpBp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyZqnybj7T0PTJ799p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_Ugw9XN3gzuj3yg4zcT54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxJAGrxJoUBDtMnCVd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugyaf9133yjuB35r_xB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwFL6jvpS2IPVVCMqx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwyzZcNKk8b3ZvXzZB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyWraKf-C9r7PbtnWt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyCOQxqEJ9K_vcHHCF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"resignation"}
]
```
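The raw response is a JSON array of per-comment codings, one object per comment ID, which makes the lookup-by-ID feature straightforward to reproduce outside the dashboard. Below is a minimal sketch in Python; `raw_response` is a two-entry excerpt of the array above (truncated here for brevity), and `index_by_id` is a hypothetical helper name, not part of the tool itself.

```python
import json

# Excerpt of a raw LLM coding response: a JSON array of coded comments.
raw_response = """
[
  {"id": "ytc_Ugx2tV_jX8FfBo4tNoh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzLeYpeJFubL87EpBp4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw coding response and build a comment-ID -> coding lookup."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

codings = index_by_id(raw_response)
# Look up one coded comment by its full ID.
print(codings["ytc_Ugx2tV_jX8FfBo4tNoh4AaABAg"]["emotion"])  # fear
```

Note that lookup requires the full comment ID; the sample list above shows IDs truncated for display (`ytc_UgxQnNduQ…`), so a prefix match would be needed against those.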