Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "hahaha, then you don't understand machines. Not that peoples violent and greedy …" (ytr_UgxXr6p3-…)
- "There are a few things here: 1) it doesn’t work exactly as you think it does, it…" (ytc_UgxjUFLU8…)
- "You know for a fact Oscar was screaming and crying at his phone while typescream…" (ytc_UgxTfGJmG…)
- "As a person who has a “self driving car” this is peak SillyCon valley solution t…" (rdc_nt03k18)
- "No one can Replace Unwra because AI cannot turn children to Terrorists .only Un…" (ytc_UgwtKUl3H…)
- "Dunno. Over the last year or so, I've noticed that AI images have very funky lig…" (ytc_UgzpsT43a…)
- "Frankly I think the better use for an AI coding assistant would be a verbal sear…" (rdc_ndzqour)
- "I would much rather talk to some ai generated posh sounding amazonian red-head i…" (ytc_UgxhzcmSz…)
Comment
1:28 I really dislike the "tool" line, that famous sentence that will surely age well... The definition of AGI is GENERAL, which means ANY task. We are going to be obsolete as humans soon, at least in the generating economic value department. They don't need us, but the narrative is always the lie of "yeah there'll be more future jobs that the AI will totally not be able to do just cuz".
Also I think this video misses the point that products also become cheaper as AI and automation enables more abundance, so less money will be able to buy more in the long run.
Platform: youtube · Topic: AI Harm Incident · Posted: 2024-08-17T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwoRbU4PmQp0G4qGcp4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwDsD27QzD_JlX5hhl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwUGeae2nIjCP-nF2x4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwj8UrBCHiOagZ5Qdt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwgGy1Aj4r-3ky9b3N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzQYV5r6a8kVeT-Phd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwQyv2srhuFLv2DCrN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwyHSyVfBdb8frGeft4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwhlXcJlb_rTX2pRxJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw9ADs5MdP_q3WL_xZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
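The batch above follows a simple per-comment schema: an `id` plus the four coded dimensions shown in the Coding Result table. A minimal sketch of how such a response could be parsed and sanity-checked, assuming the value sets observed in this one batch are representative (the full codebook may allow more values):

```python
import json

# Dimension values observed in this sample batch only; the actual
# codebook is assumed, not documented here.
OBSERVED = {
    "responsibility": {"company", "none", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"none", "unclear", "liability", "ban", "regulate"},
    "emotion": {"indifference", "mixed", "fear", "approval", "outrage"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw batch response into {comment_id: codes},
    warning on any value outside the observed sets."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        codes = {k: v for k, v in row.items() if k != "id"}
        for dim, val in codes.items():
            if val not in OBSERVED.get(dim, set()):
                print(f"warning: {cid}: unexpected {dim}={val!r}")
        coded[cid] = codes
    return coded

# Look up one comment by ID, as the inspector does.
raw = ('[{"id":"ytc_Ugwj8UrBCHiOagZ5Qdt4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"fear"}]')
batch = parse_batch(raw)
print(batch["ytc_Ugwj8UrBCHiOagZ5Qdt4AaABAg"]["emotion"])  # fear
```

The lookup-by-ID flow then reduces to a dictionary access on the parsed batch; malformed JSON would surface as a `json.JSONDecodeError` rather than a silent miscoding.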