Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgxsfEBTE… — “My thoughts: We’re not doomed. The trailer with the monsters was visibly messy, …”
- ytc_Ugy4UqfME… — “Ai will just be an all knowing version of the waste of oxygen that we are…”
- ytc_Ugwv66HdT… — “This is similar to homeschooling except for the AI apps work. I prefer less scre…”
- ytc_UgxAT_ecu… — “The question I keep asking is ‘why are we doing this?’ I know the easy answer i…”
- ytr_UgxIgUACE… — “@PrincessCrystalRose That's improper advice as well. Doctors prescribe things t…”
- ytc_UgzG5weeS… — “the Ai people when they say "why would Ai be bad if you get time to do other stu…”
- ytc_Ugj_mgcN0… — “well if people actually believe the robot is doing it then that's sad. it's call…”
- ytc_UgzdaUoVa… — “"You know what if...the Robot ratio gets dangerous to human in terms of survival…”
Comment
meh... the hype is a bit overblown. It's dangerous, but in the way gas engines are. The only reason gas engines are a huge problem is because we built society completely based on using them. Now that we realize there are these huge issues with using them so much, we CANT stop using them without starving huge swaths of the population. That's what will happen with AI. We'll use it for all these great things, it'll be a great success... we'll be completely dependant on it... and THEN we'll find out some subtle thing that makes us realize we need to scale it way back... but by then it'll be too late. We wont be able to turn it off without causing just as much damage as will be caused by leaving it on.
youtube · AI Governance · 2025-06-30T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgzeyG0tu03ztgwfzbl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzS9uczRc_YF1dm8Q94AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw0s9FMT_ajQj-K-jR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugww9OxDYbV-MMr222B4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxi8UkVi6mD2JcFHQ54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx2sHRuCGaOYYD6CLR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy6iEFrmU7euDqh8ZJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy68XQOV3FqAmCZf9h4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugwr3wLUmOPgGJfXvW14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzR6vjoZn9iwZZEWcN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
```
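A raw response in this shape can be parsed and indexed by comment ID to recover the per-comment codes shown in the table above. A minimal Python sketch (the `raw_response` literal is an abbreviated two-row excerpt of the output above, and `index_codes` is a hypothetical helper, not part of any tool shown here):

```python
import json

# Abbreviated two-row excerpt of the raw LLM response shown above.
raw_response = """[
  {"id": "ytc_UgzeyG0tu03ztgwfzbl4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugy6iEFrmU7euDqh8ZJ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]"""

# The four coding dimensions from the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse a raw response array and index the coded rows by comment ID,
    keeping only the expected dimensions and defaulting missing ones to
    "unclear" so malformed rows don't break lookups."""
    indexed = {}
    for row in json.loads(raw):
        indexed[row["id"]] = {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
    return indexed

codes = index_codes(raw_response)
print(codes["ytc_Ugy6iEFrmU7euDqh8ZJ4AaABAg"]["emotion"])  # prints "resignation"
```

Indexing by ID rather than list position matters because the model is not guaranteed to return rows in the order the comments were sent.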