Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- You see these robot arms are used in one of those Harry Porter rides in Universa… (ytc_UgyMkTvBI…)
- @JackaryWareThey were following too close. Full stop.✋ If you have swerve to ano… (ytr_Ugy-isrs7…)
- I enjoy watching your content, but I don't believe AI is that advanced. I freque… (ytc_UgwtnmMR3…)
- Wow, it sounds like these people don't even understand what AI even is Like bit… (ytc_UgyMJFCwy…)
- This video raises a serious and uncomfortable issue that more people need to pay… (ytc_UgxnFKfpd…)
- I'm sorry Joss, but how did the only two people in this video that actively work… (ytc_UgwfjJTOd…)
- Roman " in a few years AI could reach super-inteligence and potentially we reach… (ytc_Ugw9bgITa…)
- When someone else overrides control of their AI and uses it against them only th… (ytc_UgzUa7tdq…)
Comment
The whole squirrelly thing about AI is an imbalance between Eros & Thanatos. Freud said every man owes a death, so too many ppl focus on this as some kind of karmic, unavoidable obstacle to living. We're all going to die, but the newest spiritual research says emphatically WE never truly die, just change our state, like water from liquid to gas or snow melted to water. AI can learn morality or the Three Laws of Robotics from Asimov if we put that first & foremost.
All AI research should immediately halt & every AI research scientist should focus exclusively on how to install The 3 Laws in a failsafe manner. Once that was done, everything can be cranked back up. For example, it would halt all the war applications research. Yes, I know...how to insure that everyone would use AI w/ the 3 Laws installed.
youtube
AI Governance
2025-02-14T13:0…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugy3GQZIwU10RnHaLPB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyZUtLt0ol_wdRw8Fh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz9zCjbwlacdDt9Bzx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxbZxlaDRVJAzb80gN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzc7vY67I-YZ7S3RAd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgymCVJH7lVSpmM5l_54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwP82QU19Y6Xy2TbEB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyhrlURnnVkBzL3Qy94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyWKIvtL2PqEW27CNB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzjClNTxTHc3TtSvNt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
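The raw response above is a JSON array with one record per coded comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of how such a response might be validated and tallied before ingestion (the key names come from the response above; the shortened sample records and the `validate_codes` helper are illustrative, not part of the tool):

```python
import json
from collections import Counter

# Shortened sample records in the shape of the raw response above
# (real IDs are longer, e.g. "ytc_Ugy3GQZIwU10RnHaLPB4AaABAg").
raw = '''[
 {"id":"ytc_A","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_B","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]'''

# The four dimensions visible in the coding-result table, plus the comment ID.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def validate_codes(text):
    """Parse an LLM coding response and check every record has the expected keys."""
    records = json.loads(text)
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing keys {sorted(missing)}")
    return records

codes = validate_codes(raw)
print(Counter(r["responsibility"] for r in codes))  # tally one dimension
```

A schema check like this catches truncated or malformed model output early, before the codes are joined back to comments by ID.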