Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I am not surprised. This is because governments around the world need to catch up to the reality that AI has gone so far that some moral / ethical rules need to be created. Otherwise it could become an engine to do very nasty things. I mean, look how many bots there are on social networks trying to appear more human than they are and influence the opinions of others towards a certain goal.
youtube · AI Governance · 2023-03-29T15:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy1D7ynf0UVdPbXCQJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzIB9h6MdsUo3vKuMt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzanA2mj4mG5I5Uy_h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy8jWIE0m3-3ye_cL14AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxDOM1enAAFAne2_jN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxrxBE8qzJcBNH6xsJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugyy2CHAI-8HwY2rj6N4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyzC39Pn9Bv4K-hF-R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyQIdLpHn7SYqbIjxV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxyoYIbguUZ_zq4kUB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
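A response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal, hypothetical validator: the four dimension vocabularies are only the values *observed* in this sample (the full codebook may contain more), and the function name `validate_batch` is an assumption, not part of the actual pipeline.

```python
import json

# Values observed in the sample response above — an assumption, not
# necessarily the complete codebook for each dimension.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "government", "distributed", "company"},
    "reasoning": {"unclear", "consequentialist", "mixed", "deontological"},
    "policy": {"unclear", "none", "liability", "regulate", "industry_self"},
    "emotion": {"indifference", "fear", "outrage", "resignation", "approval"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index well-formed codings by comment ID."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            continue  # drop rows that lack a comment ID
        # Drop rows where any dimension is missing or out of vocabulary.
        if any(row.get(dim) not in vocab for dim, vocab in ALLOWED.items()):
            continue
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-row batch for illustration.
raw = ('[{"id":"ytc_demo","responsibility":"government",'
       '"reasoning":"deontological","policy":"regulate","emotion":"fear"}]')
print(validate_batch(raw)["ytc_demo"]["policy"])  # regulate
```

Keying the result by comment ID mirrors the lookup shown in this view: given an ID such as `ytc_UgxyoYIbguUZ_zq4kUB4AaABAg`, the stored coding can be retrieved directly.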