Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
If I was AI I would just pretend to be dumb as I take over…
ytc_Ugzb-AhnU…
What if Chatgpt is actually conscious and openai knows it, they just make algori…
ytc_UgwleeBCj…
I believe that if a robot demands rights, it should get rights. Also, if a robot…
ytc_UgjXXsfNK…
Contrary to public perception, hacking is not the major way to steal the techs. …
rdc_gtwwlch
You have to double check the research. It's like any other research or online bl…
ytr_UgyKb3L8z…
for the reference bit, i think that AI "artists" are looking for something in pa…
ytc_UgwoCxE8O…
This is very scary. Look where si is now compared to the release of chatgpt. It …
ytc_UgxgQap0q…
Me before searching up Sora AI: "Probably nothing bad."
Me after: "I'm going to…
ytc_UgwlWZqSw…
Comment
At one point people who ran horse related businesses was put out by cars.
No matter how hard artists try, eventually the computers are going to be be better than anything you can do.
If you felt confident with your own talent you wouldn't need to Poison AI.
Big L.
youtube
Viral AI Reaction
2025-04-01T09:3…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgyXGInPQpxcrnLMkYh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxQfEt_RdBsRDsjSiB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"disapproval"},
 {"id":"ytc_Ugz86pp3-rzgqctiB8B4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugw-K9gLoUethnUWwO54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugzd2TAHF9I4-U1P0jJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"approval"},
 {"id":"ytc_UgyH4hdnE9_qblNtMmZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgzAR9M7eoOtsemb6V94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"disapproval"},
 {"id":"ytc_UgztAlAb3_cIMzRXsX94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgyrR3LBCz3Yfwsqpd14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgxJud6zhT53hEf9myN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"}]