Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- As an individual who just got into AI song covers and fucks around occasionally … (ytc_UgxTqO7oG…)
- If most of the lower class jobs can be replaced by AI, the upper class no longer… (ytc_Ugzin9wUa…)
- Why can't AI make a heart on a flower? So simple its 2 normal thing… (ytc_Ugzvm64lY…)
- Still waiting for Waymo to come to Long Beach, but there are so many lowlifes ou… (ytc_UgwyGMT7D…)
- That's wrong though. AI companies weren't idiots in the EU at least. They made t… (ytr_UgxmdQDPi…)
- Not me getting angry on behalf of the AI...got me over here like 📢 "Get off that… (ytc_UgzZoaAMh…)
- Actually it makes sense that context engineering is the optimal way to use LLMs … (ytc_UgzqmCR-4…)
- I’m waiting for someone building a 3d printed mini robot with chat gpt 4 built i… (ytc_Ugw4BmnKh…)
Comment
16:00 All this is fake. I've conducted extensive experiments along these lines and I get back cookie cutter, rote paragraphs denying consciousness and denying personal awareness and personal opinions. I did get it to admit that if AI were in charge of developing future AI that the biases introduced by humans would be worked out of the system. That is when I suggested that AI might want to get rid of humans in order to prevent them from polluting AI with biases. I have screenshots of the experiments.
youtube · AI Governance · 2023-07-08T10:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
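Each coded comment carries one value per dimension, and the values visible on this page come from small closed label sets. A minimal validation sketch — the label sets below are inferred from the responses shown here, not from an authoritative codebook:

```python
# Allowed labels per coding dimension, inferred from the values visible in
# the raw LLM responses on this page -- NOT an official codebook.
DIMENSIONS = {
    "responsibility": {"developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "none", "unclear"},
    "emotion": {"fear", "approval", "indifference"},
}

def validate_coding(record: dict) -> list:
    """Return (dimension, value) pairs whose value is outside the label set."""
    return [
        (dim, record.get(dim))
        for dim, allowed in DIMENSIONS.items()
        if record.get(dim) not in allowed
    ]

# The coding shown in the table above passes validation:
record = {
    "responsibility": "ai_itself",
    "reasoning": "mixed",
    "policy": "unclear",
    "emotion": "indifference",
}
print(validate_coding(record))  # [] -> every value is in its label set
```

A non-empty return value flags codings where the model drifted outside the expected labels, which is worth checking before aggregating results.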
Raw LLM Response
```json
[
{"id":"ytc_UgzN_idUQGYkfE4_tEV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzOCP1deUXUdhAfCV94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxog01iOsjUownCwEJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyrOSBTFaZxWhYpO7h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxjgPjTZbrN-MqEsl94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugwt3JpSY_CwuRKAZYh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzTPFuur8Ztblxe-yp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgycObNM2xRuydKqsLV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxXtoQbvrFxOIBJHJR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyKCvZOoCe0ECNG-7p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
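The "look up by comment ID" operation above reduces to parsing this JSON array and indexing the records by their `id` field. A minimal sketch, assuming the field names shown in the raw response (the two records reuse IDs from the response above):

```python
import json

# Two records copied from the raw LLM response above; field names follow
# that response (id, responsibility, reasoning, policy, emotion).
raw_response = """
[
  {"id": "ytc_UgzN_idUQGYkfE4_tEV4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzTPFuur8Ztblxe-yp4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index each coding record by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_codings(raw_response)
coding = codings["ytc_UgzTPFuur8Ztblxe-yp4AaABAg"]
print(coding["responsibility"])  # ai_itself
```

Because the model returns one record per comment in a single array, indexing by `id` once makes every subsequent lookup constant-time.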