Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples — click to inspect
There will be a weakness in the power supply to these machines. Power stations n…
ytc_UgyrK6o83…
This guy is high, no way in 5 years will ai take over ai is no where close to an…
ytc_UgwukmOFB…
Yes this is fake, however, reality is this type of technology mixed with AI is m…
ytc_Ugx75cHjh…
Germany laughed at Trump when he told them they’d be dependent on Russian oil. B…
rdc_ibfen2u
No chance..need a human brain to make that subjective decision whether to swerve…
ytc_UgyHfwUiI…
To be honest, AI as they are (and the way they are heading) are really just calc…
ytr_Ugwbt_rTc…
There is still skill involved in using all the other software and hardware you m…
ytr_UgzUip_uk…
I think it got a little frustrating towards the end. Stephen kept on saying "I t…
ytc_UgwBmtcWI…
Comment
I agree wholeheartedly. This is not about planning for AI domination of humanity. However, there are other, potentially worse dangers we should be looking out for. AI will help propel scammers, thieves and other bad actors light years into the future and using tools to create text to speech or video deep fakes will continue to get better and easier to use. I can only imagine what Russia, Korea and/ or China plan on doing to interfere with the next U.S. presidential election. Carbon footprints notwithstanding, we need to be wary of bad actors.
youtube
AI Responsibility
2023-11-18T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy9GIq7u3cF4CgnN4R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwI8HgK2aXkyCSJ3Wx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugza8zGRkNc2m3u-CDV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxTJtdGgVQyqHC6Kc14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzkm-k9OVFneEaNdl94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx33c4NdMdI4YOFCX14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwdyJx3GX9fk6pZ0xB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwo8Hn-KQ8tvJxt4Kh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwVhoZ0THH-AYd7FFV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgwZ3bVGiCasQQCmXXp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
```
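A raw response in this shape can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal validator, assuming the allowed value sets inferred from the samples above (the real codebook may include other values); `parse_codings` and the `ALLOWED` mapping are hypothetical names, not part of the pipeline itself.

```python
import json

# Allowed values per dimension, inferred from the sampled responses
# and the Coding Result table; the actual codebook may be larger.
ALLOWED = {
    "responsibility": {"none", "company", "user", "distributed", "ai_itself", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"ban", "liability", "none", "regulate", "unclear", "industry_self"},
    "emotion": {"outrage", "fear", "indifference", "resignation", "mixed", "approval"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Each record needs an id plus a recognized value for every dimension.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(len(parse_codings(raw)))  # → 1
```

Dropping malformed records rather than raising keeps a single bad line in the model output from discarding the whole batch; rejected IDs could be logged for re-coding instead.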