Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Prediction is not enough, what will happen when the AI gets something that is VE…" (ytc_UgyhTLoA2…)
- "Just don't put Ai in charge of dangerous shit. Okay, Boy? Are you still on mind …" (ytc_Ugzwib-zI…)
- "I think about how I was 19 and was working a few bucks over minimum wage and my …" (rdc_ljbkx08)
- "Computation alone will never be conscious. Symbols and rules don’t have a way to…" (ytc_Ugx9wCNB2…)
- "By my calculation the moment we make AGI we get extinct. And there is no chance …" (ytc_UgyYpc5lx…)
- "Agree so much. Intelligence requires consciousness. Period. It is very tiring …" (ytc_UgwsD_6W8…)
- "If AI can take artwork, writing, or even voices without permission, where do we …" (ytc_UgxX8AxD_…)
- "Well we'd like to kind of know what he thinks is so horrific that is going to ha…" (ytc_Ugywz_Yl4…)
Comment
> I wasn't born yesterday, was prepared to learn this digital age, learn codes wanted to get on top and understand it all. Think I was born before colour television and sharing a phone with our neighbours! But now I think its a darker tech, whether by design or otherwise, I dont know but it doesn't look good. When I started understanding this, the tech was changing by half a yr. So I felt by that stage I would not be able to catch up, now I can see by next yr, I want to get off this merry go round and it doesnt look good. I really wish it was something good but the way this AI tech has been promoted over the last two yrs has been in your face. I was never consulted about this new push. Just say hold down just pause a little................

youtube · AI Governance · 2025-09-04T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx2xs16FCUxq262Bs54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy21imotQmT-Of8Es94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxaxIuAErMrcXRrasl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwYVMbdbqLfe0408eB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwzdQNhoFHYMYRuX-B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyaQDZf_wIiH7JmsOx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxM-NpAHOhBCrUeyNd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxVj0rnUP515Oe4Qt14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwlHAv9wtzFMhMLd6B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzpx4uQ7YujtMX6bO54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
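The "look up by comment ID" step can be sketched in a few lines of Python, assuming only the JSON array shape shown in the raw response above (a list of objects keyed by `id`). `index_by_id` is an illustrative helper, not part of the tool itself, and the two entries below are copied from the response for demonstration.

```python
import json

# Raw LLM response: a JSON array with one coding object per comment ID
# (shape taken from the raw response shown above; trimmed to two entries).
raw_response = '''
[
 {"id":"ytc_UgwYVMbdbqLfe0408eB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgwzdQNhoFHYMYRuX-B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"}
]
'''

def index_by_id(response_text: str) -> dict:
    """Parse the model output and key each coding object by its comment ID."""
    codings = json.loads(response_text)
    return {coding["id"]: coding for coding in codings}

# Look up the coding for a single comment, then read one dimension.
codings = index_by_id(raw_response)
print(codings["ytc_UgwYVMbdbqLfe0408eB4AaABAg"]["emotion"])  # prints "fear"
```

Keying the parsed array into a dict up front makes each subsequent ID lookup O(1), which matters when cross-referencing many coded comments against one batched response.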