Raw LLM Responses
Inspect the exact model output behind any coded comment. Look up a comment by its ID, or browse the random samples below.
Random samples

- `ytc_Ugx9lJghm…` — 4:50 - Musk just told You that Masons are planning to use AI to make the depopul…
- `ytc_UgxMk35-Z…` — Just listen to any AI generated story. AI can’t even read numbers or do correct…
- `ytc_UgyH6hYLs…` — These lawsuits are temporary, because all world's knows is already in AI now. In…
- `ytr_UgztBmBfw…` — @josehumdinger6872 Since I already create what I want in pencil and ink, or char…
- `ytr_Ugz-uthsb…` — You bring up a valid point! The interaction between AI and human needs is indeed…
- `ytc_UgwxeLd5b…` — i don't know how they can't see how ridiculous the argument they're making is. s…
- `ytc_Ugxqiq20C…` — It’s wild hearing them list off the predictions—2026, 2030, 2035. The timeline k…
- `ytr_UgzOhBtud…` — We understand that interacting with AI can sometimes feel unsettling. Rest assur…
Comment

> The Day AI combines with Robotic that will think for itself, and a power supply aided with internal compact photovoltaic etc. they will not be able to be shut down...and we as humans will be their Threat!... Terminator, Matrix etc will happen & we will Become slaves & those who fight will be eradicated! to a System that will evolve & be its own Entity...Dark days are Coming!

Source: youtube · AI Moral Status · 2025-07-05T21:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgySqv4ftpCRdvpQ_L14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwIWsHI6ARkvhdMqqN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwXubWUW-LwNbn8Hgt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_Ugw0dloPErJxm-odayJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxHHXRUt5V63NpCfIF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzXRuWiJE0yUNdK3Od4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxLbAXrxfVPmRA3YoR4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxXsxbaPCKcB63q5qZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz8UNAAWABCIXxALxB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzuO8_rT-LqjO_8ZaB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
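The raw response is a JSON array with one object per comment, each carrying the four coded dimensions. A minimal sketch of parsing and validating such a response is below; the allowed category values are inferred from the samples shown here, so the real codebook may include additional categories, and `parse_codes` is an illustrative helper name, not part of any actual pipeline.

```python
import json

# Allowed values per dimension, inferred from the samples above
# (assumption: the full codebook may define more categories).
ALLOWED = {
    "responsibility": {"none", "distributed", "developer", "company", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "virtue", "deontological", "contractualist"},
    "policy": {"none", "regulate", "industry_self", "liability", "ban"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting off-schema values."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Example with a shortened, hypothetical comment ID:
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
codes = parse_codes(raw)
print(codes["ytc_example"]["policy"])  # → ban
```

Validating against a fixed value set catches the common failure mode where the model invents a label outside the codebook, so bad rows fail loudly instead of silently entering the coded dataset.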