Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "If humans are smart, which that’s doubtful. We should always have 100% control o…" (ytc_UgxepdNWP…)
- "This man can only make cars that crash. How is AI going to take over?…" (ytc_UgzRL8PQy…)
- "and they want us to have more kids..........just so their jobs can be taken away…" (ytc_UgzzmoRd-…)
- "No, they should not. Anything an AI can do can be done (more slowly) by a person…" (ytc_Ugyd9xonr…)
- "Okay, so this car is always going. And that tree is always just, like, stopping.…" (rdc_gshx0p6)
- "The problem with this argument is not that the jobs that are being replaced are …" (ytc_UgwUhe1sW…)
- "There are like 50+ vaccine trials ongoing, probably a lot more. You should searc…" (rdc_hm9447r)
- "I've been coding for 20+ years and use AI every day, trying to fit it prudently …" (ytc_UgyAutQLx…)
Comment

> If we can’t act on climate change then we won’t act on the AI crisis. Sadly my generation will most likely die a premature death from a plethora of things such as climate change, AI, war, and (or) disease. If ai got access to laboratory equipment what would stop it from engineering a disease? What would stop it from manipulating what countries do? This is truly a frighting prospect, congrats on the amazing video!

Source: youtube · Topic: AI Governance · Posted: 2023-07-07T02:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyZfFJJCPiXRjmMxgl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugya96xncARCDGs5nDd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyiaXFFlGp5iseW-pN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzeWz3vOtBBlHXqtYd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyR_phxuA-QQdGET2J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxEv_30Qtz-BNiQFpp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx59qUk0UpFx0n1Gyd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyuecJIyJvXW9QPEKZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz_sh-qNkH_yRXolWZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx-AERNMFRP_lJ9D1l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
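The raw response above is a JSON array of per-comment codings. A minimal sketch of parsing and validating such a payload, assuming the allowed value sets are exactly those seen in the samples on this page (the real schema may permit more values, and the helper name is an illustration):

```python
import json

# Allowed values per dimension — assumed from the codings shown above,
# not an exhaustive schema.
ALLOWED = {
    "responsibility": {"ai_itself", "none", "distributed", "government",
                       "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban", "unclear"},
    "emotion": {"indifference", "mixed", "fear", "resignation", "outrage"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed coding records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # a coding without a comment ID cannot be joined back
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical single-record payload for illustration.
raw = '[{"id":"ytc_example","responsibility":"distributed",'
raw += '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
print(parse_codings(raw))  # one valid record survives
```

Dropping malformed records rather than raising keeps one bad model output from discarding the whole batch; the skipped IDs can be re-queued for recoding.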