Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "If all jobs are taken by AI, they where will humans go and who will be the consu…" (ytc_UgzC77TPF…)
- "First of all that pukie pukie as absolutely amazing! And 2nd of all I studied ar…" (ytc_UgzC2VnZR…)
- "Oh crap! I don’t think AI is the problem whit this kinda thing and others. It t…" (ytc_UgzRh7JB2…)
- "We are going to flip the house this year and are almost guaranteed a Democratic …" (rdc_oi0brx0)
- "Facial recognition software is being used for also finding missing people and ch…" (ytc_UgyWFDxL0…)
- "It's funny how the Ai version looks better than all the artwork drawn by artist…" (ytc_UgztLn2w0…)
- "The same artificial intelligence that big companies will use to make more money …" (ytc_UgwyCqqjA…)
- "Do you remember everyone throwing a fit about bitcoin mining energy use? AI use…" (ytc_Ugz6Hv84v…)
Comment
"AI are gonna unalive people!!"
...No. And, frankly, this hysterics only weakens your argument. I was fully onboard until you brought this one out.
If someone removes themselves from the gene pool because of AI: No one is at fault for that. The person was simply severely mentally ill, and trying to regulate AI is the wrong way to solve that problem.
If AI is going to be used in robots that can engage humans autonomously, then the fault lies with whomever deployed the autonomous drones.
If AI is negligent then fault lies with whomever put AI in a situation where its negligence can cause harm to others.
We already have laws that cover these things, and employers and officers are responsible for the problems of AI.
youtube · AI Governance · 2025-07-03T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz7zBigqh-fnTFMM7d4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwlUZmg9swaAyfWYqt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw152fUHFASB0lHhox4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxntbcdZIb6NxZSEp54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz0OSfEsfF2wWKOVex4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw-srun5I76akZO4Fh4AaABAg","responsibility":"user","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzvtKr3t_CRqmKjPL14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyp728_nknIxd_ozrd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx6ydBUY3KNt-_E9ZB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy6OjrNzb6sIcAAiBB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
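A batch response like the one above can be parsed and indexed by comment ID for lookup. The sketch below is a minimal example using two records copied from the response shown; the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from that output, while the required-key check is an assumption about the coding schema, not a documented contract.

```python
import json

# Two records copied verbatim from the raw model output above.
raw = '''[
{"id":"ytc_Ugz7zBigqh-fnTFMM7d4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx6ydBUY3KNt-_E9ZB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"}
]'''

# Assumed schema: every record must carry these coding dimensions.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw_json: str) -> dict:
    """Parse a batch response and index records by comment ID,
    dropping any record that is missing a coding dimension."""
    records = json.loads(raw_json)
    return {r["id"]: r for r in records if EXPECTED_KEYS.issubset(r)}

codes = index_by_id(raw)
print(codes["ytc_Ugx6ydBUY3KNt-_E9ZB4AaABAg"]["emotion"])  # resignation
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: one parse per batch, then constant-time dictionary lookups per inspection.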