Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or browse these random samples:

- `ytc_UgxZwk2GX…` "Im Still Waiting for someone to explain to me (i have an idea) who and how they …"
- `ytr_Ugxs4kVog…` "Me in a gay scene and it being made public to my friends and family would be hil…"
- `ytc_UgybFxpGs…` "Lets see what AI can do when I cut the power supply its host machine is connecte…"
- `ytc_UgznzX1SN…` "ai detection software says the us constitution is made by ai. it doesn't effing …"
- `ytc_Ugx5DhW4O…` "Simulation? No, life is quite simple, and let nature guide you. ta da! Fabulous …"
- `ytc_UgzQTq0sp…` "AI could be a family and individual therapist dispensing of good mental health g…"
- `ytr_UgyKPaaae…` "@Pickle_Rick007 just saw a clip of a robot attempting to hang rock. It shut itse…"
- `ytc_UgwslIcni…` "The problem I see with AI is it will never say "I don't have enough data or trai…"
Comment

> After 1-2k years, human civilization will be extinct by robots where high logical ai's will be present in them. They will be non living but will be in motive to build more of themselves and more powerful, of course without reproduction. In this way, a new civilization will get started who will lead the planet for next several 100 years or more.

Source: youtube · Topic: AI Harm Incident · Posted: 2023-12-01T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgzPxsdTlC3-TA8_KiR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgywPWERZYCaHXNwjsN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxwUcB3kFMZ5Q0yJFZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugye0kgT9mMVgkERjCx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugywj-gv22P3kPRCG_N4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwYvheIYyfEpgqjcGl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw6F3yVipj4ES57grF4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwOT60ptHQTXK6FVYt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxrtZi37MGVu2iSnqB4AaABAg","responsibility":"media","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxdICYsyncFx01mci14AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"mixed"}]
```
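The raw response is a JSON array with one record per comment ID and one value per coding dimension (responsibility, reasoning, policy, emotion). A minimal sketch, using only Python's standard library, of how such a response could be parsed and a dimension tallied; the two records embedded below are copied from the response above, and the snippet assumes nothing beyond the response being valid JSON in this shape:

```python
import json
from collections import Counter

# Two records copied verbatim from the raw LLM response above.
raw = """[{"id":"ytc_UgzPxsdTlC3-TA8_KiR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgywPWERZYCaHXNwjsN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]"""

# Parse the array into a list of dicts, then index by comment ID
# so a single coded comment can be looked up directly.
records = json.loads(raw)
by_id = {r["id"]: r for r in records}

# Tally one dimension across all records, e.g. the emotion codes.
emotions = Counter(r["emotion"] for r in records)
print(emotions["outrage"])  # 2
```

The same pattern extends to any of the four dimensions; swapping `"emotion"` for `"responsibility"` would count attributions instead.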