Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> 1:23:00 There is one argument i would allow against AI regulations, which is that if we overdo them or at least not regularly check back if circumstance have changed and we therefor also need to update regulations thoroughly, we could also deny us a potential to what we could become, excluding ethics here as those may in itself deny us potential which at some point we would need depending on what other threats the future may hold.

youtube · AI Governance · 2023-06-27T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyEhL4ch47VLdP9gNJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_Ugy54_8cttHpxZSJiJd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzoUkud1w7TAbQHNYJ4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugx4ml_9jq-QphGs3QN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwt2RbzurF3SGpPwPB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx8yUV9CM49pTu14AR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy8fQDWMBP-0LRsOAB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyPmsCuJ23rvS19wY54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxAAEp9lz-G1mKP3sl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxcuDNaybYEsp5vnLZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```