# Raw LLM Responses

Inspect the exact model output for any coded comment. A comment can be looked up by its ID, or selected from the random samples below.
- "It's stories like these that make me wonder if natural selection is starting to …" (ytc_UgzdQsGer…)
- "You can definitely pull it off with some of the plug and play hardware accelerat…" (rdc_mwyyot0)
- "It's an extension of consumer protection legislation in the context of surveilla…" (ytc_Ugw0YEhLv…)
- "I knew Ai or A1 was a powerful destructive force when I got mortal kombat 2 for …" (ytc_UgwUpyHJ1…)
- "I think surveillance capitalism is already dead. Their models dont work. If y…" (ytc_UgwO5q1sr…)
- "I feel so much for this mother and her terrible loss. As a mother of two teenage…" (ytc_UgxtXgPSv…)
- "The \"one important thing to remember\", is that it's ai.... Not human. Do not tre…" (ytc_UgwK6P8M8…)
- "It'd be better if the CEOs were replaced by AI. Why go for the small job that pa…" (rdc_mvb9g2t)
## Comment

> If super intelligence is inevitable the argument it is it has to be you to create it. You can try to align it to your values even if futile. VS waiting for someone else to create and align it to their values or worse, creating evil ai. The dilema means the race for super intelligence is on and won’t stop. Whoever get there 1st, has a slim chance of making it ‘safe’ and can control all future AI attempts

youtube · AI Governance · 2025-09-11T11:2…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response

```json
[
  {"id":"ytc_UgyHROhXaY6aBa__Pn94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugywzn5nDALmFQaSzzd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxiSQTPhXKNs8B5smR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxAwhtiZUZXFRStGtt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw1LOf3P6QPz0h_kmF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwqcKLYlzTlsFZ8mPF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxWVgHbbte-rk90V8h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzSQCibJxPdbyoZXo94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwy7xtsdSUR_6aCZwV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyYr67WZVsRzdAKfSR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
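The raw response is a JSON array of records, one per comment, each carrying the four coded dimensions shown in the table above. A minimal sketch of how such a response could be parsed and indexed for lookup by comment ID (the helper name `parse_codings` and the truncated two-record sample are illustrative, not part of the tool):

```python
import json

# Illustrative raw model output, shortened to two of the records shown above.
raw = """
[
  {"id": "ytc_UgyHROhXaY6aBa__Pn94AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyYr67WZVsRzdAKfSR4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"}
]
"""

# The four coding dimensions plus the comment ID, as seen in the response.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_codings(raw_response: str) -> dict:
    """Parse a raw LLM response and index the coded records by comment ID."""
    records = json.loads(raw_response)
    by_id = {}
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            # Reject malformed records rather than silently storing them.
            raise ValueError(f"record {rec.get('id')!r} missing keys: {missing}")
        by_id[rec["id"]] = rec
    return by_id


codings = parse_codings(raw)
print(codings["ytc_UgyYr67WZVsRzdAKfSR4AaABAg"]["policy"])  # regulate
```

Indexing by ID mirrors the "look up by comment ID" workflow: once parsed, retrieving any coded comment is a single dictionary access.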