Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> As far as I know at present, AI machinery needs electricity. The simplest thing to say would be: switch off the electric supply. Of course, that is too simplistic. But since natural events can also destroy sources of electricity, floods, storms, fires, solar flares, hopefully AI will consider it essential to maintain and protect human existence as tools for repairing and maintaining its "food supply".

youtube · AI Governance · 2026-02-18T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxkeNDMXtbPijAttV94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwTPoh-QPj3qPPkkyx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz1UmXow4W-2336UO14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzbdvGDZ67AohYivOd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxH1Ba9ufiwtUJkPPF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyzVbxuWfYt_2-wmPd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzyQrLssUQmiXhJhWl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyGqulGF4_qvNvD1d54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxITGIhQwHBneNT2Cp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgypAAVXd0Tm7mwtt1d4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
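A raw batch response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal, hypothetical validator: the allowed label sets are inferred only from the values visible in this output (the full coding scheme may include other labels), and the function name `parse_codings` is illustrative, not part of any real pipeline.

```python
import json

# Label vocabularies inferred from the visible output (assumption:
# the real scheme may define additional labels per dimension).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "indifference", "resignation", "outrage"},
}

def parse_codings(raw: str) -> dict[str, dict[str, str]]:
    """Parse an LLM batch response and index codings by comment ID.

    Raises ValueError if a record lacks an id or uses a label
    outside the ALLOWED vocabularies.
    """
    records = json.loads(raw)
    by_id: dict[str, dict[str, str]] = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {value!r}")
        by_id[cid] = {dim: rec[dim] for dim in ALLOWED}
    return by_id
```

Validating eagerly like this surfaces malformed or off-vocabulary model output at ingest time, rather than letting a stray label silently skew downstream counts.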