Raw LLM Responses
Inspect the exact model output for any coded comment: look it up by comment ID, or browse the random samples below.
Random samples
- `rdc_mck11et`: "bollocks. trained on mid-2024-publically-available level of info that is a wikip…"
- `ytc_Ugzg4yPuj…`: "I don't understand why people hate AI so much. Like, we literally use AI to help…"
- `ytc_Ugy8OWvLz…`: "use Ai to solve , improve Tesla inventions, if you think it is so smart…"
- `ytr_UgwxNP_Zx…`: "@keepthatawayfromme7471 not "art night". "Art AT night before the exam". I can …"
- `ytc_UghGUSpj2…`: "My perspective is that it is unethical to starve people deliberately. Creating …"
- `ytc_UgywAQ8eQ…`: "Lethal autonomous weapons' are not possible, they tested some of them during bot…"
- `ytc_Ugy_HBzp_…`: "As we shift toward a “personalized medicine” the use of AI in healthcare is inev…"
- `ytc_UgyTHJluv…`: "1. Everything A. C. Clark said about future was true. 2. He said in 2001 (24 yea…"
Comment
but likely would just be ignored.
At the most, governments will create some laws to limit AI but they would be too slow to create those laws. By that time, AI has already developed further and those laws would be even more difficult to implement.
Teach your children to think on their own from a young age.
Limit cellphone and TV time when they’re young so they don’t grow up with brains dependent on technology. They may be the last remaining thinking humans in the future.
Source: youtube · Topic: AI Governance · Posted: 2023-04-22T03:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_UgwZntqPGIgWYv4htSJ4AaABAg.9ow0efnHoNI9qiWgGaEqtE","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgwqkUO2pH5Q2anArJR4AaABAg.9oocO4ie34E9owB1rg0A58","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwqkUO2pH5Q2anArJR4AaABAg.9oocO4ie34EA-CG17EpDfl","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytr_UgyGJ3LjyktJjh8NnXx4AaABAg.9okqp90Faxd9oqab8FTt_7","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgwU3WzFZY1FquYg9ZJ4AaABAg.9ojTbLtn21d9ojmj15TuHo","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytr_UgzoaRU8n-Fbyv0CqZR4AaABAg.9ojIF3Z2kSB9ojsO5AWtmj","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytr_Ugx_BTGqAyd3Iu3XYr54AaABAg.9oijMsbKCOe9omdSHwFWBd","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"fear"},
  {"id":"ytr_Ugz6XGBag01yn2fU-6h4AaABAg.9oi1CtEXWke9onALH073Tm","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytr_UgyQNT_vcfhkVVT3iep4AaABAg.9ohvHpOl2ss9p0MXs5vNI-","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugw95KH2IrfyfsjhXSZ4AaABAg.9ohrSbehnaL9ohu-UNbFo5","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
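Before storing coded values, a raw batch response like the one above can be parsed and sanity-checked against the codebook. A minimal sketch in Python, assuming the value sets visible in these samples are the full set of allowed codes (the real codebook may define more):

```python
import json

# Allowed values per coding dimension, inferred from the sample output
# above; this is an assumption, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"government", "company", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

def validate_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index valid rows by comment ID.

    Raises ValueError if any row carries a value outside the codebook,
    so malformed model output is caught before it reaches the database.
    """
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim} value {row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

Indexing by ID is what makes the "look up by comment ID" view possible: `validate_response(raw)["ytr_Ugz6XGBag01yn2fU-6h4AaABAg.9oi1CtEXWke9onALH073Tm"]` would return the four coded dimensions for that comment.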