Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I have a car with drivers assist, didnot cost $10,000 and does pretty much the s…" — ytc_UgzWltwy_…
- "I don't understand why AI developers feel the need to harass and physically acco…" — ytc_Ugwryk8hd…
- "There are still no AI tools. LLMs are fancy autocomplete with no concept of real…" — ytc_Ugyttd0zo…
- "Doomsday is closer. AI and VR and deception is getting worse over time. You'll b…" — ytc_UgwM1gm4n…
- "Data centres use less than 1% of fresh water / Data centres also use around 1–2…" — ytr_Ugz6hAxwZ…
- "For the world to survive we must stop all wars and manufacturing arms and nuclea…" — ytc_Ugy6R6b0U…
- "The facial recognition camera was created by white people to be prejudiced again…" — ytc_UgwXIPw_T…
- "One time i discussed content idea with chatgpt..it was quite rare... A few days …" — ytc_UgyjvnrTF…
Comment
I think the dangers Elon Musk didn't really want to talk about are AI-controlled state police and AI-controlled mass surveillance.
Imagine Stalin's totalitarian hell, but instead of corrupt people (that you might be able to bribe, or reason with, or pray for their compassion) you have an emotionless superintelligent machine controlling every aspect of your life.
youtube · AI Governance · 2023-04-20T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgysLscr16tq0tu71Q14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwrlL1eNSre2Pda3754AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyPZUNbAdFsVIpEPwp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwg2tav6471ERAg7eN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw7uWq6XWzD3Ruv5NV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"disapproval"},
{"id":"ytc_Ugypk6WeTtl_k5PMbq54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxVUnZc_HAd9RWgqpl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyO56CidKyonctrUcZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwRo1B2I24pGiljDs14AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyBsMMlv_oAY--WBcF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"}
]
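A batch response like the one above can be checked before its codes are stored. The sketch below is a minimal validator, assuming each entry carries exactly the four dimensions shown in the Coding Result table; the allowed values are inferred from the labels visible in this response (the full codebook may contain more).

```python
import json

# Allowed values per coding dimension, inferred from the labels seen in
# this raw response plus "unclear"; the real codebook may define more.
ALLOWED = {
    "responsibility": {"government", "company", "developer", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "unclear"},
    "emotion": {"fear", "outrage", "approval", "disapproval",
                "indifference", "mixed", "unclear"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and return {comment_id: codes},
    raising ValueError on any value outside the assumed codebook."""
    coded = {}
    for entry in json.loads(raw):
        cid = entry["id"]
        for dim, allowed in ALLOWED.items():
            if entry[dim] not in allowed:
                raise ValueError(f"{cid}: {dim}={entry[dim]!r} not in codebook")
        coded[cid] = {dim: entry[dim] for dim in ALLOWED}
    return coded
```

Keying the result by comment ID also supports the "Look up by comment ID" view directly: `validate_batch(raw)["ytc_UgwRo1B2I24pGiljDs14AaABAg"]` would return that comment's four codes.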