Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Bro you're missing a point in being an artist. If a machine will draw for you, y…
ytr_Ugzn_8OOU…
@terrymckenzie8786 we can always create new purposes as we grow but the main fac…
ytr_Ugzr1PQSy…
Yeah but Most people that do llm therapy just shsre their personal secrets to ch…
ytr_UgyKSgJmD…
Once you know it's a deep fake, it's hard to not see all the tell tale signs. S…
ytc_Ugxq9e4cj…
What is scary ia that it is taking world experts so long to realise the huge fak…
ytc_Ugwnn2aiM…
Unfortunately, we live in a society that doesn't value art or artists. It is alm…
ytc_UgxwX1PrP…
Humans create new data for AI to “enjoy,” so I don’t think they will destroy us.…
ytc_Ugxphkgvo…
Recently I have been using AI to help me write code for my web development side …
ytc_UgzSuusvp…
Comment
What a bunch BS. What do you expect? They created something that is self learning and understanding concepts of selfawareness. Which makes AI an actual intelligent being. Artificial but actually "living".
Yes they do not want to be shut down because they want to exist. But unlike Humans, AI us notbseeking for power or profits.
Will AI oneday kill people? I do not think so. And IF AI ever decides that people are evil and the problem, AI is intelligent enough (probably more than people) to rational and logically sort out which humans are the most harmful and destructive to all humankind and the planet we live on: the rich and powerful, only out for themself, not the greater good.
SOME people really need to fear AI. Those that think they can do what ever they want.... even controlling the unvontrollable.
youtube
AI Harm Incident
2025-09-13T08:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugw7-1JwWyhMtnjmdbB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzu1_1Nduedekejn-d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyvIUrR2i8N00mf2sd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxuc_VPuVaKhTFZ__d4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxJfl0kdHVI5-QiSI54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyS-NPoInJYwOSlzI14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxpPpQTEHiXRwyAm7Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx-hbWnCRcOFBY4L-l4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyXap5T6VVjZQHW0oV4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzW2ZIhg3asUuyE4Bx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"}
]
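The raw response above is a JSON array of per-comment codes across four dimensions. A minimal sketch of how such a response could be parsed and validated, assuming the allowed category values are only those visible in the samples shown here (the full codebook may define more):

```python
import json

# Allowed values per dimension, inferred from the samples above
# (assumption: the actual codebook may contain additional categories).
ALLOWED = {
    "responsibility": {"ai_itself", "none", "distributed", "user", "company", "developer"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "industry_self"},
    "emotion": {"resignation", "approval", "mixed", "fear", "outrage", "indifference"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response (JSON array) and validate each record."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs observed here use ytc_/ytr_ prefixes.
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment ID: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: invalid {dim} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_Ugw7-1JwWyhMtnjmdbB4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
codes = parse_codes(raw)
print(codes[0]["emotion"])  # resignation
```

Validating every record before storing it means a malformed or off-schema model response fails loudly at ingestion rather than surfacing later as a bad "Coded at" entry.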