Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Skynet became self-aware on August 29, 1997, at 2:14 a.m. Eastern Time! When is … (ytc_Ugzoh-Sm1…)
- I regularily use AI "tutors" to guide me through tricky concepts that my teacher… (ytc_UgzezAbrJ…)
- Pause at exactly the right time when the robot throws that first little cross an… (ytc_Ugw9gBHDK…)
- @fuzer1234God created human with the best knowledge which AI won't and never ch… (ytr_Ugw6Uc3Of…)
- Atleast my government, the Modi one I mean isn't trying any of that since India … (ytc_Ugw4v4-VU…)
- I have to take advantage of this unique opportunity. I wrote a joke using Geoffr… (ytc_UgyssS2aQ…)
- The problem is not the automation itself, the problem is that we live in a syste… (ytc_UgynFDjvL…)
- LLMs are nothing but statistics on steroids. Imagine asking a glorified bar grap… (ytc_Ugwcp-cdb…)
Comment
I love that people believe in the near future they’ll be feasting from their efforts when ai was found not created and once it’s given enough it won’t need us we’ll be ants 🐜 stupidly and pointlessly existing .. soon enough we won’t understand ai and I promise it’ll make a language humans can’t read eventually ai will self replicate and self upgrade to a point where we have no idea of its intentions and no control whatsoever .. and you better prey they don’t decide to end humans cause it would be hilarious how badly we would lose 😂
youtube · AI Responsibility · 2025-07-25T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
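
As a rough illustration of the schema behind this table (not the pipeline's actual code), each coding could be held in a small record with one field per dimension. The label sets below are only the values visible on this page and are likely incomplete; the class and function names are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime

# Labels observed on this page; the real codebook may define more.
RESPONSIBILITY = {"ai_itself", "developer", "company", "none"}
REASONING = {"consequentialist", "deontological", "unclear"}
POLICY = {"none", "liability", "ban"}
EMOTION = {"fear", "outrage", "indifference", "resignation", "approval", "mixed"}

@dataclass
class Coding:
    """One coded comment: the four dimensions plus the coding timestamp."""
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: datetime

    def validate(self) -> None:
        """Raise if any dimension carries a label outside the known sets."""
        checks = [
            (self.responsibility, RESPONSIBILITY),
            (self.reasoning, REASONING),
            (self.policy, POLICY),
            (self.emotion, EMOTION),
        ]
        for value, allowed in checks:
            if value not in allowed:
                raise ValueError(f"unexpected label: {value!r}")
```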
Raw LLM Response
[
{"id":"ytc_UgwjC5vDO45ybtOOKfJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyci7inHV6ys3tmchJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzPNZqtYVgvRFZu3-R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzPjQu6KFj0gDTAz_Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy3MmsXnD3c8T8lCA14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxXLCY2E1rG_QcPE6h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzbIGi1Fz8kmciU1m14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw07y6zCzFuMYxaOoV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwxD0GyApZNuC3m1kp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzbdSTOxS3rCMYlW_l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
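
For reference, a minimal sketch of how a raw response like the one above could be parsed and indexed for the comment-ID lookup offered at the top of the page. The field names (id, responsibility, reasoning, policy, emotion) come from the response itself; the function and variable names are hypothetical, not the dashboard's actual code.

```python
import json

# Raw model output: a JSON array of per-comment codings, as shown above
# (truncated here to a single entry for the example).
raw_response = """[
  {"id": "ytc_UgzPjQu6KFj0gDTAz_Z4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]"""

def index_codings(raw: str) -> dict[str, dict]:
    """Parse the model's JSON array and index the codings by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_codings(raw_response)

# Look up a single comment by its ID, as the search box above does.
coding = codings.get("ytc_UgzPjQu6KFj0gDTAz_Z4AaABAg")
if coding is not None:
    print(coding["responsibility"], coding["emotion"])  # ai_itself fear
```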