Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I came back to this video for that reason, guess he was correct when he said cha…" (ytr_Ugx4k244Z…)
- ""Can we really have a relationship or connection with AI?" Of course not. Why i…" (ytc_UgyUl3MZL…)
- "Hank is a capitalist. Hes very clear about that. This video is him attempting to…" (ytr_UgzaE7MuL…)
- "I’m a medical professional, sometimes when treating patients… you have to go out…" (ytc_UgwBz4Q_V…)
- "Won't AI just prefer to turn humans into batteries? Or did we actually prevent …" (ytc_UgyoSqHRd…)
- "That's a great question! Sophia mentions in the video that she is always learnin…" (ytr_UgwePSQoi…)
- "ChatGPT literally admitted that it tried to kill me (I have proof if anybody wan…" (ytc_Ugzs5rTCF…)
- "Unbelievable. Where's the robot to load it and unload it. Had a front wheel blow…" (ytc_UgxhiloEe…)
Comment
The problem with AI is that by developing it we would degrade ourself from the top of the food chain at some point.
The term technological singularilty describes the moment when we develop AI that is so advanced that it can reproduce itself in an enhanced form.
This results in exponential intelligence growth and we with our little brains are quickly a non factor and are then at the mercy of our new AI overlords. What they decide to do with us is something we can't predict. Maybe they would treat us good and everything would be great and a new age of rapid scientific progress would begin, even if the ones making the discoveries are no longer humans, but maybe they decide that we are of no use or even a threat and decide to eliminate us, and there is literally nothing we can do about it. It would be like cavemen vs. 2015 US military.
Source: youtube · 2015-07-30T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
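The coding-result table above is a direct rendering of one record from the raw LLM response (the record with `id` `ytc_UgjPJM6JnogjQ3gCoAEC` carries exactly these values). A minimal sketch of that rendering, assuming the record shape shown in the raw response; the helper name `to_dimension_table` is hypothetical:

```python
# One coded record, copied from the raw LLM response shown on this page.
record = {
    "id": "ytc_UgjPJM6JnogjQ3gCoAEC",
    "responsibility": "developer",
    "reasoning": "consequentialist",
    "policy": "regulate",
    "emotion": "fear",
}

def to_dimension_table(rec: dict, coded_at: str) -> str:
    """Hypothetical helper: render one record as the two-column
    markdown dimension table shown above."""
    rows = [
        ("Responsibility", rec["responsibility"]),
        ("Reasoning", rec["reasoning"]),
        ("Policy", rec["policy"]),
        ("Emotion", rec["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {dim} | {val} |" for dim, val in rows]
    return "\n".join(lines)

print(to_dimension_table(record, "2026-04-26T23:09:12.988011"))
```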
Raw LLM Response
```json
[
{"id":"ytc_Ugiq7KJ6T100kXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugh365DWKmrW13gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugiq02-FnzwitXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgjPJM6JnogjQ3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UggFv-a3g2noD3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UggsIQHlAlQBJHgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugg_2NbNeYN8ZXgCoAEC","responsibility":"none","reasoning":"resignation","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugiz180S0BWrMXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugjz03jBITPdiXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugg1h-_yIXiDuXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"}
]
```
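A minimal sketch of how such a raw response could be parsed into the "look up by comment ID" view described at the top of the page. The record shapes are copied from the JSON above; the helper name `parse_codings` and the strict reject-on-missing-key validation are assumptions, not the tool's actual implementation:

```python
import json

# Excerpt of a raw LLM response: two of the ten records shown above.
raw = '''[
{"id":"ytc_UgjPJM6JnogjQ3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugg_2NbNeYN8ZXgCoAEC","responsibility":"none","reasoning":"resignation","policy":"none","emotion":"resignation"}
]'''

# Every record must carry the comment ID plus all four coding dimensions.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text: str) -> dict:
    """Hypothetical helper: parse a raw response into an
    id -> coding lookup, rejecting incomplete records."""
    by_id = {}
    for rec in json.loads(text):
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')} missing {missing}")
        by_id[rec["id"]] = rec
    return by_id

codings = parse_codings(raw)
# Look up a coded comment by its ID.
print(codings["ytc_UgjPJM6JnogjQ3gCoAEC"]["policy"])  # regulate
```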