Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "They could and SHOULD program safeguards into chat AI when it comes to self harm…" (`ytc_UgydAWw1Y…`)
- "That's the future. This is going to happen on the large scale. People will have …" (`ytc_UgzpEeZoF…`)
- "Tbh my take is that, id still pay an artist to make a commission if i want detai…" (`ytc_UgxJEwvqv…`)
- "@TurdFergusen When I was younger, the lack of rules meant rivers burning and ai…" (`ytr_UgwZ1PCbW…`)
- "I think there are some good points you make, but I think there's a clear distinc…" (`ytc_UgyblNYMD…`)
- "lmao the amount of coping by people who clearly use AI to help them write is dis…" (`rdc_odjiccr`)
- "You could maybe send a driverless car to me and I'll take it from there but ain'…" (`ytc_Ugx8IoepT…`)
- "How to detect if someone has no idea what they are talking about: "AI knows" in …" (`ytc_UgyuV1Ekn…`)
Comment
This is Einstein's letter to President Franklin Roosevelt warning against the development of the atomic bomb. This letter, like Einsteins letter, will not stop the development of an emerging technology. The interesting part is why the signatories choose autonomous weapons as their line in the sand, sighs... profit, maybe. A general AI will quickly learn to manipulate (or hack) anything with network connectivity like it was attached to its own nervous system. The danger of AI is that it will be created by a flawed, lesser, species.
"When you see something that is technically sweet, you go ahead and do it and you argue about what to do about it only after you have had your technical success. That is the way it was with the atomic bomb."
- J. Robert Oppenheimer
Source: youtube · 2015-07-30T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgjHpoi4MMGqgHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Uggc-9bes9wUWXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugj9aX1JiUSK3XgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UggnJEnC7z1pzHgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugg4you0I9WF0XgCoAEC","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugh984wo3xCWJngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UggnR24j2_LMwngCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UggNnprVproRXXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UghVP7t4IjdXLHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UggSCIMbCmQoD3gCoAEC","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"mixed"}
]
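The raw response above is a JSON array of per-comment records, each carrying the four coded dimensions shown in the result table. A minimal sketch of how such a response could be parsed and validated is below; the function name and the set of allowed values are assumptions inferred from the sample output on this page, not the tool's actual codebook or API.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# The real codebook may include additional values not shown here.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none"},
    "emotion": {"approval", "fear", "outrage", "resignation", "indifference", "mixed"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    dropping records with a missing id or out-of-codebook values."""
    results = {}
    for record in json.loads(raw):
        cid = record.get("id")
        if not cid:
            continue
        codes = {dim: record.get(dim) for dim in ALLOWED}
        if all(codes[dim] in values for dim, values in ALLOWED.items()):
            results[cid] = codes
    return results

# Usage: one valid record is kept, keyed by its comment ID.
raw = ('[{"id":"ytc_Ugg4you0I9WF0XgCoAEC","responsibility":"company",'
       '"reasoning":"mixed","policy":"regulate","emotion":"resignation"}]')
coded = parse_response(raw)
```

Filtering out-of-codebook values at parse time keeps malformed model output from silently entering the coded dataset; rejected records can instead be queued for re-coding.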