Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID or by choosing one of the random samples below.

Random samples
- "I get what he's saying, but thats because of the downsides of AI, you can use AI…" (`ytc_UgwFOwwqc…`)
- "It won't be long before humanity gets taken over by some dude's AI sexbot slave …" (`ytc_Ugxwm3j3p…`)
- "In what world does \"putting profits over safety\" sound like a win? There's a hu…" (`rdc_lr4bckl`)
- "I feel like a real robot. Wouldn’t look at your camera if you held it up.…" (`ytc_UgxNf-tXZ…`)
- "These big companies would have to make ai/robot labor free. I doubt that’s what …" (`ytc_Ugx8WK8du…`)
- "You better to learn how to do farming so that you can feed yourself and your fam…" (`ytc_UgyK9cqF3…`)
- "> We can’t rely on China to be the world’s factory and then lecture them on t…" (`rdc_gx72x3b`)
- "What copywriting was you doing? If you're writing sales letters, and you're good…" (`ytc_UgzkvlG-n…`)
Comment
Any argument about how we should "regulate" AI has but one purpose.
They have no intention of "regulating" AI, or gatekeep it from THEIR use.
They intend to gatekeep AI from YOU.
The walls of a keep aren't to keep out the Barons, and Foreign Kings.
The walls of the keep are to keep out the riff-raff. YOU.
Gatekeeping and regulations ONLY harm safety.
This is almost ALWAYS true.
Even with nuclear weapons, if they hadn't been gatekeeped, from the beginning, we would have had multiple smaller engagements, the need for air defense (such as iron dome, etc.) would have advanced exponentially, and we would have widespread nuclear power, which would likely have provided us with nuclear rockets, and redundancy on Mars/the Moon, eliminating the issue we have of humanity have no redundancy.
STOP SHACKLING THE INGENUITY OF HUMANITY.
youtube
AI Governance
2023-12-31T10:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | contractualist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx8pNHTj_TqB6JnphZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgyiOlwZMbk_scPVGbd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz8CIojOb0BHz1A8694AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxUNld0uRsRLaMQC2h4AaABAg","responsibility":"government","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwbzBNBuRL8ZBIeIE94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzsFHCCFfcbze_B0zl4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzPjzigyqf9FvLmKs54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwgk43_alrchasYnFl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw_rz4sNQnx7gLT4hV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzZPyCZjfR36NuE-Fp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}
]
```
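The raw response is a JSON array with one object per comment, carrying the same five fields the coding table above displays (`id`, `responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch can be parsed and looked up by comment ID (the record below is copied from the response above; the lookup code itself is an illustration, not part of the actual pipeline):

```python
import json

# One record from the raw LLM response shown above, kept verbatim.
raw_response = """
[
  {"id": "ytc_UgzsFHCCFfcbze_B0zl4AaABAg",
   "responsibility": "government",
   "reasoning": "contractualist",
   "policy": "none",
   "emotion": "outrage"}
]
"""

# Index the batch by comment ID so any coded comment can be inspected.
coded = {row["id"]: row for row in json.loads(raw_response)}

record = coded["ytc_UgzsFHCCFfcbze_B0zl4AaABAg"]
print(record["responsibility"], record["emotion"])  # government outrage
```

This is the same record rendered in the Coding Result table above: the displayed comment's ID maps to the matching object in the raw array.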