Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- *"1990's Plot Twist"...* The guy with the sandals and hat is the only robot on s… (ytc_UgwSO1dAp…)
- We may become interplanetary species with the power of AI, or we can be doomed b… (ytc_UgzlGyIEa…)
- Moral of the story is: AI advice is not always perfect and is best taken with a … (ytc_UgxyLFFkz…)
- @Stefanoabed05 In my opinion it even hinders human evolution. Global investment… (ytr_Ugz_vEeF5…)
- The driver wasn't paying attention I know these things would happen with self d… (ytc_Ugxu6rorH…)
- That actually likely is not even the explanation here. As with many aspects of t… (ytr_Ugy2Q-KaI…)
- All due respect, your guest is right, but he is being too theoretical for everyd… (ytc_UgwCvgVoJ…)
- Interviewer looks like a scary robot, so unfeelingly and smiling all the time. … (ytc_UgyxELS-N…)
Comment
I can guarantee this is not gonna happen. The thing is there is a law in terms of automation and AI will not break it, its sort of a natural law.
It goes like this, there is a balancepoint where adding more automation will add more work to keep the automation in check updated and in working condition.
So automation will solve up to a certain point if you try to automate more than that you will need more work to keep it going. And the workers will be specialists in AI to keep it going.
At that point it makes more sense to just get it back a bit and keep the optimum level of automation.
Source: youtube · Viral AI Reaction · 2025-11-28T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugz9LM1joVps_sv_POJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwB_cBnVdGMpi4CYKB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzNetCFx7RbfglciRt4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwUeLy3LDiitosR7Qx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyKD_TxZ_OwDv3WZLN4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz7U2LlbJX6jClodD54AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyfYVErS99XmhauiXx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxlAUvcb5XuWvtdO3V4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyeGy3teDmfRlw3yw54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyHwap8eVdiDcJFrk14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
```
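A raw batch response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the field names shown in the JSON; the allowed value sets are inferred only from the labels observed in this batch, so the real coding scheme may permit more.

```python
import json

# Label vocabularies inferred from the values observed in this batch
# (assumption -- the full coding scheme may define additional labels).
ALLOWED = {
    "responsibility": {"none", "distributed", "developer", "government", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none"},
    "emotion": {"indifference", "mixed", "outrage", "approval", "resignation", "fear"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM batch response and validate every coded record."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing comment id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {value!r}")
    return records

# One record from the batch shown above, used as a smoke test.
raw = ('[{"id":"ytc_Ugz9LM1joVps_sv_POJ4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"indifference"}]')
coded = parse_batch(raw)
print(coded[0]["emotion"])  # indifference
```

Validating against a fixed vocabulary catches the most common failure mode of LLM coders: an off-schema label (e.g. "neutral" instead of "indifference") that would otherwise silently fragment the category counts.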