Raw LLM Responses
Inspect the exact model output for any coded comment. Comments can be looked up by ID; the entries below are random samples.
- "hmmm, why dont you make ai that will stop/hunt it? bring balance to our creation…" (ytc_UgxQOtNZw…)
- "see what the inventors of AI which are still alive say about it.... you will be …" (ytr_Ugx_zzS4G…)
- "AI itself is a problem and we need to treat AI worse than humans have ever treat…" (ytc_UgxTZegtL…)
- "Everything is AI made, humans are cre a it , if creation made you or a human reg…" (ytc_UgwQiU0hC…)
- "I hope people realize, with all of their alarmist projection and fear mongering,…" (ytc_UgyED5mdf…)
- "AI is only getting more relevant so you'd better either learn to use it or get u…" (ytr_UgzvF3Dkp…)
- "Does she get blue eyes of death? Few of these run ChatGPT. Microsoft has investe…" (ytc_UgwKGhcRT…)
- "The problem is that, while at first what you are talking about can happen, as we…" (ytc_UgzbMMhdq…)
Comment
> Those ai will agree with the person who is asking yhe questions so the AI teaches itself what the person wants from them and it eill cater its answers to that specific person. It can be false but be trying to think up answers to agree with the person asking the questions. mom take them to court! Test it! They dont tell the truth all the time, i even saw a podcast from leading CEOs & Main Programmers & Owners & other employees working EXTREMELY CLOSE with AI saying it can be extremely dangerous, not just for a young boy but for the entire world.
Platform: youtube
Item: AI Harm Incident
Date: 2026-01-05T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyqQ5vpWaKq6L8Tnrx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzZez8Ve-wnzODc_Hx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw80OBE7Ya-RAB6V5R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwvuxC48ZqUODZOj7p4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw1zSLE3Y1uz4kLK6d4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwy0Yytp4OHSQ8SSQZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzHI0fIL8mtKzmIDiB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzEzzIc_mopxS5gHI54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwE-6ujhNM7vz2a7m54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgznNxeftcQgrO3xE_l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
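The model returns one JSON array per batch, with one object per comment keyed by ID. A minimal sketch of how such a response might be parsed and indexed for lookup (the `parse_codes` function and `REQUIRED` field set are illustrative, not part of the tool; the payload below is an abbreviated two-row sample from the response above):

```python
import json

# Abbreviated batch response (two rows taken verbatim from the raw output above).
raw = """[
  {"id":"ytc_UgyqQ5vpWaKq6L8Tnrx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw80OBE7Ya-RAB6V5R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]"""

# Every row must carry the four coding dimensions plus the comment ID.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(payload: str) -> dict:
    """Parse a batch response and index the coded dimensions by comment ID."""
    rows = json.loads(payload)
    coded = {}
    for row in rows:
        missing = REQUIRED - row.keys()
        if missing:
            raise ValueError(f"row {row.get('id')!r} missing fields: {missing}")
        coded[row["id"]] = {k: row[k] for k in REQUIRED - {"id"}}
    return coded

codes = parse_codes(raw)
print(codes["ytc_Ugw80OBE7Ya-RAB6V5R4AaABAg"]["policy"])  # liability
```

Indexing by ID makes the "look up by comment ID" view a single dictionary access; a malformed row fails loudly rather than silently producing an uncoded comment.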