Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytr_UgwdIVEgU…`: @daydreamer8373 Tesla’s "no LIDAR" strategy might cut costs, but it’s a safety g…
- `ytc_UgwT4BN1q…`: I want to add my 2 cents. Since getting 14.2.1 I’ve never had to make an interve…
- `ytc_UgwIs3aRs…`: Huh?! 🤔 Amazon CEO basically using AI to THREATEN his Employees to do more with …
- `ytc_UgzgA2yIi…`: well a chatbot wouldn't know what it talked about unless you are logged in to th…
- `ytc_UgxB8bPyi…`: No cause why am i crying rn I got told face to face with my friend that does ai …
- `ytr_Ugxxo_fUl…`: It's thieves vs poor. AI could be a boon to all mankind. 20 hour work weeks, e…
- `ytc_UgyCJ0jxF…`: The problem is that AI has no common sense. A certain number of times, communic…
- `ytr_Ugy8ADoU7…`: If the technology exists to make a toaster than very soon someone will invent t…
Comment
> To be honest, I'm pretty sure this comment is going to be buried in the comment section but I don't care. I just want to voice my thoughts out into the void.
> Hear me out, I'm pretty sure if someone had red the arc of the scythe series they'd know what I'm talking about.
> Anyways, what if we had artificial intelligence take over? But it refused to would take any jobs that humans could do? There would also be an Universal base salary for people that don't work also.
> Anyways, I recommend reading the book that I based my thoughts off off of cuz it's pretty good.
youtube · AI Harm Incident · 2024-08-04T22:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy9U2F0nwbPsL5hDIl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyfARGIv5mVaCmE2SF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgymGq8SkdYPlkB5SAB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxHenMC_oRAn5-e1vx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz6y7AgO3Aay6yXkxh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzDLMmKNeEekORa2l54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugysi2GSn0vQ7BGa0194AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyIO4B2P_Fbhwv0rTZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugx2Byrtd2LgIB4JDHF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw2ISZuxNj8D94dL3Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
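The raw response is a JSON array with one record per coded comment, each carrying the four coding dimensions alongside the comment ID. A minimal sketch of how such a response could be parsed and a record looked up by comment ID (the function and variable names are illustrative, not the tool's actual code; the example records are taken from the response above):

```python
import json

# Abridged example of a raw model response for a batch of coded comments.
raw_response = '''
[
  {"id": "ytc_UgxHenMC_oRAn5-e1vx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugy9U2F0nwbPsL5hDIl4AaABAg", "responsibility": "company",
   "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"}
]
'''

# The four coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse the model output and index codings by comment ID,
    skipping any record that is missing a dimension."""
    records = json.loads(raw)
    return {
        r["id"]: {d: r[d] for d in DIMENSIONS}
        for r in records
        if all(d in r for d in DIMENSIONS)
    }

codings = index_codings(raw_response)
print(codings["ytc_UgxHenMC_oRAn5-e1vx4AaABAg"])
# {'responsibility': 'ai_itself', 'reasoning': 'mixed', 'policy': 'unclear', 'emotion': 'mixed'}
```

Indexing by ID makes each inspected comment's coding a constant-time lookup, which matches how this page resolves a selected comment to the row shown in the Coding Result table.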