Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I think this is a very silly question. If robots manage to become sentient they …" (ytc_UgiJxrcUr…)
- "At what point could we interview ai or super intelligence.. that is assuming it …" (ytc_UgycyCxgT…)
- "Saying digital and ai are the same because digital is more convenient is like sa…" (ytr_Ugz8y9V49…)
- "Ilya's secret sauce is ISRAEL. Giving that NETANYAHU guy control to AI is the c…" (ytc_Ugwr1R6Le…)
- "AI is simulated human beings. And they should be controlled by all means. Module…" (ytc_UgzsMY8cO…)
- "@lexluthor9509 I mean. With the way current models are built, both "ai" research…" (ytr_UgxUQauBt…)
- "Making excuses for Newsom nixing the first AI regulation bill in Cally. Why do y…" (ytc_UgzfsAn0y…)
- "Ai can be great sometimes just saying like a scripted Minecraft smp like the dre…" (ytc_UgxxEzMHa…)
Comment
I'm not afraid AI will kill us and or destroy society. I'm afraid of who will take control of AI and decide to use it to kill us and or destroy society.
I'm referring to a handful of Billionaires who consider normal people as "useless eaters".
You know, the Libertarian tech-bros like Musk, Thiel, Palmer Lucky etc etc who hate Democracy.
youtube · AI Governance · 2025-10-15T19:0… · ♥ 10
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgyhusB7AZC1eVN4bcB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzLFydI3wfRNar7-op4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxRekLlkKU-uXU70i14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx844MNZVkI1Ho0V8l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz_SaTH9vRJtNNqrS54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy7-AwN5naVzFMIhal4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyAoCHKeaB4Wikdfs54AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxRAbWMwR1al9AGT_t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzsIWHq6WvXAEbFnAt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwOW0iYMCstbyV00aV4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"resignation"}]
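A raw response like the one above can be parsed back into per-comment codes with a few lines of Python. This is a minimal sketch: the four dimension names match the JSON shown, but the allowed-value sets are assumptions inferred only from the values visible here (the full codebook may define more categories).

```python
import json

# Allowed values per dimension, inferred from the responses shown above
# (assumption: the real codebook may include additional categories).
SCHEMA = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "none"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array) into {comment_id: codes}."""
    coded = {}
    for rec in json.loads(raw):
        codes = {dim: rec[dim] for dim in SCHEMA}
        # Flag any value the model emitted outside the expected categories.
        for dim, value in codes.items():
            if value not in SCHEMA[dim]:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
        coded[rec["id"]] = codes
    return coded

# One record copied from the raw response above.
raw = ('[{"id":"ytc_UgxRAbWMwR1al9AGT_t4AaABAg","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"fear"}]')
codes = parse_codes(raw)
print(codes["ytc_UgxRAbWMwR1al9AGT_t4AaABAg"]["emotion"])  # fear
```

Validating each value against a fixed set catches malformed or hallucinated labels at parse time, before they reach the coding table.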