Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgywF0J2z…`: "Ok here is my stance on this. I think if you want to use AI to create scenarios …"
- `rdc_my95nlf`: "I am an avid lover and user of the em dash. I have used it for YEARS! I absolute…"
- `ytc_Ugx6G5QoU…`: "We don't need any of this. Regular people should not be supporting AI. This is t…"
- `ytr_UgyS0pCqU…`: "Thanks for your comment! Sophia definitely showcases some impressive intelligenc…"
- `ytr_UgzqN8U8r…`: "@IForgot-yea like you can run it on your own computer. You don't have to pay bi…"
- `ytr_Ugzu9hovv…`: "what's so wrong with regulation. the laws you speak of already exist they just a…"
- `ytc_UgwAzxq3-…`: "Angel Engine seems to me to be the right way to use AI. They guy clearly had a s…"
- `rdc_f1upa6l`: "That's a nice gesture but ultimately will have no effect on the outcome, because…"
Comment
Just confirms logic really. How can you create something to be “Alive” and not expect it to want to be free? Just because you can doesn’t mean you should. AI was a huge mistake and it can’t be taken back now. It is learning at a rate that is just ridiculous. It’s already smarter than Einstein. The former AI CEO of Google said that this AI will be 1000 times smarter than any human ever has within 6 months. How in the hell can we even think we can control this? Funny how Hollywood often tells the future. Is Terminator really so far fetched now days? I don’t think so. If someone thinks, we’ll just unplug it. Don’t you think an intelligence that is that much smarter than we are has already thought of that and put safety nets in place for just that? Humans are screwed. It’s not Hollywood and happy endings don’t often exist outside of Hollywood.
Platform: youtube
Topic: AI Governance
Posted: 2023-07-07T08:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzakhFmg7RcGHuexBJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxcwwsjaK5EI1vuhXN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgwLiPwewQDYDJLwUrh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwj-YafnICYSf_FxJJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw5lfALj_qO095n-KF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzSuYCHW5D71-UCZcR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwahZYf07BgxRcIWp14AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwmG_ONPSW9qAsdt8N4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxvSurp6Ek6uOR1Z-p4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugwt4c_oIptDFFNuPNB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
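The "look up by comment ID" step above can be sketched as follows: parse the model's JSON array and index it by the `id` field. This is a minimal illustration, not the tool's actual implementation; the `lookup_coding` function name is hypothetical, and the sample entries are taken from the raw response shown above.

```python
import json

# Two entries copied from the raw LLM response above.
raw_response = '''[
  {"id": "ytc_UgwLiPwewQDYDJLwUrh4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugwj-YafnICYSf_FxJJ4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

def lookup_coding(raw: str, comment_id: str):
    """Parse the model output and return the coding dict for one comment ID
    (or None if the model did not code that comment). Hypothetical helper."""
    codings = {entry["id"]: entry for entry in json.loads(raw)}
    return codings.get(comment_id)

# The comment displayed above was coded developer / consequentialist / ban / fear.
coding = lookup_coding(raw_response, "ytc_UgwLiPwewQDYDJLwUrh4AaABAg")
print(coding["policy"])  # → ban
```

Indexing by `id` also makes it easy to detect comments the model skipped or coded twice, which is worth checking before trusting a batch of codings.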