Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Demons always tell the truth! How things have changed in 10 years. Now he's at t… (ytc_UgySIHfGa…)
My lawyers law firm has 65 legal assistants. Half of them are now fired because … (ytc_UgxHMum4D…)
There is no "thinking" involved here. None at all. There is no such thing as "… (ytc_UgweQtUnP…)
I asked chatgpt to do a postgraduate test that I did, and it failed. It got 50% … (ytc_Ugy4HF-GU…)
well AI is NOT sentient , not intelligent nor would it ever be conscious . None… (ytr_UgzeQ2JBs…)
I find that while ChatGPT generates ideas, Olovka does a much better job of stru… (ytc_Ugygbrb6a…)
What if robots kill us all, but they were programmed to do that and they blame i… (ytc_UgywgIuxm…)
Click Bait!! Click Bait!! Not AI, does not think independently, does not want t… (ytc_UgyPlLURS…)
Comment
I don't know why he doesn't just stick to the relatively obvious way that it will kill us all - Whatever it wants to do it, it can do it better with more computing power. Why would it NOT divert all the worlds energy production to running its own data centres? AI is already consuming more power than some small countries.
youtube · AI Governance · 2025-10-25T06:4… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzuloiXX9NyhPcCerp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxoKATJs_-p_pyisyd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxkMTZpL3o1OVxgbYB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyN8lUbmNWdk2dffs14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwBhtnqoukhPTl8FSd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzzTOouq1je9BWmqSB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx_0e9quQvUALEUqVt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwf5wLWAQ5s-arN28B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgygEzIeTg02bEQwoYt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwRD3gum62tfJxg5lh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}
]
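The raw response above is a JSON array with one object per comment in the batch, each keyed by its comment ID. A minimal sketch of how such a batch can be parsed and a single comment's coding looked up by ID (Python; the variable names are illustrative and not part of the tool itself — the data is an excerpt from the response above):

```python
import json

# Two rows excerpted verbatim from the raw LLM response above.
raw_response = '''
[
  {"id": "ytc_UgzuloiXX9NyhPcCerp4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxkMTZpL3o1OVxgbYB4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
'''

# Index the batch by comment ID so any coded comment can be inspected in O(1).
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's coded dimensions by its ID.
coding = codings["ytc_UgxkMTZpL3o1OVxgbYB4AaABAg"]
print(coding["policy"])   # → ban
print(coding["emotion"])  # → fear
```

The same index supports the "Look up by comment ID" view: a missing ID simply raises `KeyError` (or returns `None` with `codings.get(...)`), which is how an uncoded comment would surface.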