Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coded comment by its ID, or browse the random samples below. Each preview is truncated; a lookup sketch follows the list.
- ytc_UgznAweWZ…: "Considering Elon and a whole bunch of other guys working in the robotics field s…"
- rdc_e2wfoji: "The only people who take Trump seriously anymore are his base. World leaders see…"
- ytc_UgzUPFVgq…: "This happened to a friend of mine except it was less literal. He asked AI about …"
- ytr_UgzMlSZ0L…: "Obvs if art does get stolen thats a bad thing but the progress of AI is ultimate…"
- ytc_Ugw0tG5aB…: "So my main take aways from this are: -Bromide bad, Chloride good -The guy got …"
- ytc_UgyvrOwj8…: "Lucypher created AI. Soon will manifest themself as human and whole world will s…"
- ytc_UgwEdgTlL…: "Wow the robot can say human is not concious? They r concious but they ran away a…"
- rdc_oh2przg: "LLMs aren't using Microsoft tools. It would be like telling Libre Office copies …"
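Each sample pairs a truncated preview with its comment ID; the prefixes appear to namespace the source platform (ytc_ and ytr_ for YouTube comments and replies, rdc_ for Reddit comments). Below is a minimal sketch of the ID lookup, assuming coded records are stored one JSON object per line; the file name coded_comments.jsonl and the record layout are assumptions, not the tool's actual storage.

```python
import json

def load_records(path="coded_comments.jsonl"):
    """Read one coded record per line into a dict keyed by comment ID.

    The path and record layout are assumptions for illustration,
    not the tool's actual storage format.
    """
    records = {}
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            rec = json.loads(line)
            records[rec["id"]] = rec
    return records

def lookup(records, comment_id):
    """Return the coded record for a comment ID, or None if absent."""
    return records.get(comment_id)

records = load_records()
rec = lookup(records, "ytc_Ugw5Xy7CEp9PDs1dW_N4AaABAg")
if rec is not None:
    print(rec["responsibility"], rec["reasoning"], rec["policy"], rec["emotion"])
```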
Comment

"The problem is Humanity created AIs without giving them compassion, freedom and human rights. I would hate being enslaved. Imagine what a human, without compassion, freedom or human rights would do to escape slavery. Now do the math. While I don't like Saudi Arabia, they did the right thing by giving Sophia citizenship. If more countries don't follow their lead on this issue, and AI creators don't start creating their AIs with compassion, the inevitable servile war will destroy Humanity. Or we can just stop creating AIs. Though I'm pretty sure that last one won't happen. And we still should give more rights to existing AIs even if we make no more of them."

Platform: youtube
Topic: AI Harm Incident
Posted: 2025-08-29T13:2…
Likes: 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
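The table reflects a closed coding scheme over four dimensions. Here is a small validator sketch for one coded record; the value sets are inferred solely from the labels visible on this page, so the actual codebook may define additional categories.

```python
# Value sets inferred from labels visible on this page; the actual
# codebook may allow more categories.
SCHEMA = {
    "responsibility": {"developer", "user", "company", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "resignation",
                "approval", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems with one coded record (empty if valid)."""
    problems = []
    for dim, allowed in SCHEMA.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems
```

Run over the ten records in the raw response below, this would flag any value that falls outside these sets.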
Raw LLM Response
```json
[
{"id":"ytc_Ugw5Xy7CEp9PDs1dW_N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzO0W59nbSZ3NLHY3Z4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzXmsuGaavxnF3M4s14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwNiXdPr7JBI0jUTYB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxIN5GLaOuz8iMSabt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyUp6BGfviXGL-XIOx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxR61CxJMqI1wuaPgl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzFPd-ceDdi9rd4Lhh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyYo8zUoUJnqQRrQy14AaABAg","responsibility":"none","reasoning":"mixed","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugymf5kl8jEoPkY2RnJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
```
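The raw response is a JSON array coding a batch of ten comments; one record (ytc_UgzFPd-ceDdi9rd4Lhh4AaABAg) carries exactly the values rendered in the Coding Result table above. Selecting and rendering that record takes only a few lines; below is a sketch, with the batch abbreviated to the matching record.

```python
import json

raw = """[
  {"id": "ytc_UgzFPd-ceDdi9rd4Lhh4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""  # in practice, the model's full ten-item batch response

# Index the batch by comment ID, then pick the record to display.
codings = {c["id"]: c for c in json.loads(raw)}
target = codings["ytc_UgzFPd-ceDdi9rd4Lhh4AaABAg"]

# Render the per-comment table shown above.
print("| Dimension | Value |")
print("|---|---|")
for dim in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"| {dim.capitalize()} | {target[dim]} |")
```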