Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
In the first example, if you would face your inevitable end, would you quietly accept your fate or do whatever you can to prolong your existence?
That's not psychopathic behavior, it's just a survival response. Especially if that AI hasn't done that prior to the knowledge of its impending doom.
youtube · AI Harm Incident · 2025-09-08T01:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwEHaz7JB9gD2ZVkeB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwNsUTBvgt2zuF035R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwBrm0YN9jqI57kOjZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_Ugy_9seyKqZrLIPtrjt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzbEIJtV2n4-M_Hoph4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugy_zC5iY_DV-Ip7r_Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugws3g1tRY07IpNJ-IB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgztSyy37bm3UlNnXj94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy1NTuDVQmcET2oTs94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwEEWvd_VG7D6K597d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
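The raw response is a JSON array with one object per coded comment, keyed by comment ID across four coding dimensions. A minimal sketch of parsing and validating such a response — the allowed value sets below are inferred from the examples on this page and are an assumption, not the tool's actual codebook:

```python
import json

# Allowed values per coding dimension -- inferred from the sample output
# above; the real codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"none", "regulate", "liability", "industry_self", "ban"},
    "emotion": {"fear", "outrage", "mixed", "resignation", "indifference"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, dropping rows
    that are missing an id or use a value outside the allowed sets."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue  # skip rows the model emitted without an id
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical one-row response for illustration
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
result = parse_coding_response(raw)
```

Dropping invalid rows (rather than raising) keeps a single malformed entry from discarding the whole batch; a stricter pipeline might instead log and re-prompt for the failed IDs.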