Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "> We shouldn't be looking for loopholes to make it okay / What is or isn't OK …" (rdc_kzf40su)
- "Why do this video made me think that Stephanie Soo's might also have a deepfake …" (ytc_UgwjtDrKz…)
- "@titankronos65173 what even is your point, man? / People you are talking about, wo…" (ytr_UgwD-NkWF…)
- "A lot of BS. No AI or robot can repair my car or do maintenance on my AC in the …" (ytc_Ugx67GTu7…)
- "If peaceful protesting fails, just remember that all it takes to knock these dat…" (ytc_UgxrPHpZ0…)
- "When it comes LLMS the US is winning hands down, there's no question about it…" (ytr_UgzVkF_Nr…)
- "Using an argument supposedly fighting against some imaginary class of artists is…" (ytc_UgwlDDaTp…)
- "i understand but there's more reason to dislike ai. one, it burns a lot of fossi…" (ytr_Ugx8c_YQ-…)
Comment

> lol that interviewer watching things from a rich retirees pov ,who have no idea how greedy company owners are nor what ai can do.
> Richest people pay next to o taxes , full time workers are homeless... UBI lol ... even the idea is absurd even in a socialist country, in a capitalist its just pure comedy...
> Rich will get their self driving Tanks and military security robots, recycling human bodies for more energy to the data centers...
> Companies / politicians do not give a D about long term it is too risky/ uncertain > all want the short term profits/ success.
> CEOs like : in 20yrs no people can afford any products > who cares I will be owning whole towns of real estates and have 100 b*tches to have fun with approach..

youtube · AI Harm Incident · 2025-06-19T08:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugy05clI0gQiaMtARZJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzCWUiD54zhBytfWih4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyAKEBG9v_v4BrphkB4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwuHdKE3SJzKNuWuS14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzblSATp53kryPPkzh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzNPT7kKXSHMlLQGzR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwPqUfBCiFUujMVH4l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxhrLbIdJHa6ecPArR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzSnnDx1iRR4dHFjGJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyVqos0Bg6K84ZyhsJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"})