Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Humanity problem is not nuclear or AI…but GREED…when there wont be nothing left …" (ytc_Ugw25PMhs…)
- "Agreed that last robot said good riddance to the audience these things are going…" (ytr_UgzKaiyfN…)
- "The person might not have the talent to create the art but he created inspiratio…" (ytc_Ugy_2r3ej…)
- "What kind of dumbass hits the gas while not looking where they are going? That …" (ytc_Ugyf9NbHD…)
- "Ai has still a long way to go in order to apply to my field of practice but I my…" (rdc_m844rma)
- "Holy shit that courage the cowardly dog one.. Muriel looks savage. She about to…" (ytc_UgzZWxBlr…)
- "Because capitalism only incentivizes short term profits, and if some shitty AI s…" (ytc_UgwlFt6iG…)
- "I THINK THE BIGGEST FEAR HUMANS SHOULD HAVE IS IF GOVERNMENTS USES AI ROBOTS FOR…" (ytr_UgwMGh49o…)
Comment
So far, artificial intelligences lie all the time, the AI stock market would not be immune to this and probably crash all the time as new lies get found out.
AIs are not cheap to run and require an insane amount of power
We've been trying to replace janitors with AIs for decades, best we can do so far is a roomba that can't handle stairs. Replacing human hands is still a pipe dream in a ton of complex applications.
Same as above, just inventing a self driving car has been "just 10 years from now" for the last 20 years. It's incredibly difficult to accomplish in perfect conditions, never mind rain or snow.
Source: youtube · Video: Viral AI Reaction · Posted: 2025-12-02T02:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzpQ7gXM2Ad-2FEotd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugznr6989MGF4fPjRa94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzc10s1ADCr617oznF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzxQUDmugGHvZaqvc14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyx-3Rt7JH3BnGtvi54AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzZYISWrj0Mu7DPecF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxHApS1XABXcS0zo3h4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxsDGnBdL3Sjgbvi5R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxb2X-eo6IjDP6aKDl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyDm-ziZQq87P2sSGp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"}
]
```
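The raw response is a JSON array of per-comment records, one object per coded comment. A minimal sketch of how such a batch might be parsed and validated before use — the allowed code values below are inferred only from the samples shown on this page, not from the project's actual codebook, so treat them as placeholders:

```python
import json

# Allowed codes per dimension, inferred from the coding-result table and the
# sample response above; the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"none", "company", "user", "government", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "fear", "approval", "resignation", "outrage"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown codes."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

# Hypothetical single-record batch for illustration.
raw = ('[{"id":"ytc_example","responsibility":"none","reasoning":"unclear",'
       '"policy":"none","emotion":"fear"}]')
print(parse_coded_batch(raw)[0]["emotion"])  # fear
```

Validating against a fixed code set like this catches the common failure mode where the model invents a label outside the scheme, so malformed batches fail loudly instead of polluting the coded dataset.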