Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
| Comment (truncated) | ID |
|---|---|
| Elon is SPED in the head AI should be regulated like anything else people can ge… | ytc_Ugz-CFWmw… |
| If my sibling acted like Shad acted about art around Jazza i`d just up and hit t… | ytc_UgyKCYLn7… |
| When Italian data-protection authority asked a question to ChatGPT about if all … | ytc_Ugz9XQgUa… |
| Using AI to replace human artists is obviously bad, but attacking people who jus… | ytc_UgykWgWrO… |
| Well at least we all know now, that if we are injured due to a Tesla self drivin… | ytc_Ugx-rE8ig… |
| its boring theres no effort in AI art you just type in a few words and BOOM... i… | ytc_Ugxe5JKQv… |
| My friend Billie found out about my ai chats and now she says she will tell my p… | ytc_UgxWTedoA… |
| allah created humans as highly powerful on all over jeans so it cant ever happen… | ytc_UgzluMUjr… |
Comment
I stand up and applaud you, Steven. It seems art is the area they figured tech companies wouldn't be held accountable when something went wrong. But it's just the 1st step. Now they are letting us know they are going to be using our data for AI training. Soooo ethical of them. I'm touched. Greed and lack of understanding of the bigger picture have been defining traits of Silicon Valley, particularly in the last 20 years. Ultimately they are sabotaging themselves. No money in people's pockets equals no market. Replace humans and the whole system comes crumbling down. We are all, through tech companies, currently building a better world for machines.
youtube · Viral AI Reaction · 2024-09-13T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyRXND9YcZnMN4WgMd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxLQTAe3OZF1wNfrI54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz4jgNMer8SKzqmSWV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyV9D6HD405mbN1Vhp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyReDPK8r5XRzB62hB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyMDa3z3G1bh36mQ154AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzTcbHKutFC3PMiFMd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwXsWe_scbw6sahWUt4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyNPUp3zPR1wS0AS2J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxIBa_iepBhZyAqOg94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
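A raw batch response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal example, not the tool's actual pipeline; the allowed value sets are only those observed in this sample batch (the full codebook may define more categories), and the example record is taken from the response above.

```python
import json

# Allowed values per dimension, as observed in this sample batch only.
# Assumption: the full codebook may define additional categories.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "indifference", "approval", "resignation", "fear"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID,
    rejecting any record with a missing or unknown dimension value."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value for {dim!r}: {rec.get(dim)!r}")
        by_id[cid] = {dim: rec[dim] for dim in ALLOWED}
    return by_id

# Example: the first record from the batch above.
raw = ('[{"id":"ytc_UgyRXND9YcZnMN4WgMd4AaABAg",'
       '"responsibility":"company","reasoning":"consequentialist",'
       '"policy":"ban","emotion":"outrage"}]')
codings = parse_batch(raw)
print(codings["ytc_UgyRXND9YcZnMN4WgMd4AaABAg"]["emotion"])  # outrage
```

Indexing by ID is what makes the "inspect the exact model output for any coded comment" lookup cheap: the stored raw response only needs to be parsed once.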