Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Having AI do your sponsor as a segue from the topic at hand was done exper…
ytc_UgwAnAVwo…
The real question is: can we tell misinformation better, between AI news and mai…
ytc_Ugz8dPflp…
I think the best test to recognize a robot as a person is if it is non-prescript…
ytc_Uggc9eL6N…
we will come back 2030 hopefully to see how far humanity and Ai has gone.…
ytc_UgzpCS6ND…
Turn it off and stop using AI , it should be used like a microwave not an emotio…
ytc_UgxLZL4Kb…
@AmecchiiYurithat’s what we call stealing. Ai doesn’t make things with more soul…
ytr_UgxJBuWVZ…
Does work with major exceptions. It has to work nearly 100%. Tesla's self drivi…
rdc_cw0dta7
This was an entertaining episode but GPT isn't AGI or ASI, it's not thinking on …
ytc_UgwanBxa3…
Comment
AI cannot take over anytime soon and probably never, because of its dependency on human logistics. Much like a domestic dog, AI is entirely dependent on humans for its hardware, maintenance and food (electricity). Your dog can turn on you and bite you, but at the end of the day you're the one who decides if it will eat and have shelter.
youtube
AI Governance
2023-11-18T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxPOr1HLJYp77r9X7h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwNiWdReKA7GkINkkV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz2PRyAB44ChR6Dct94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwpaIDsKPSzdVV5SuB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzXIS95yHhqQcWU7YJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzJmmldlxD_ycP9ZwJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxoGoxBSrFIhwY6aZN4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzsMypp4TfnGDI2hIV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxqT0Q2v7YdYFKqKR94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxJ7q_nMY3C_tqy1-F4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
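Responses like the one above can be checked programmatically before the codings are stored. A minimal sketch, assuming the four dimensions and the category labels observed in this sample (the actual codebook may define additional values):

```python
import json

# Allowed values per dimension, inferred from the sample response above
# (assumption: the real codebook may include more categories).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "resignation", "approval", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record needs a comment ID and a known value for each dimension.
        if "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical usage with one valid and one malformed record:
raw = (
    '[{"id":"ytc_A","responsibility":"developer","reasoning":"deontological",'
    '"policy":"ban","emotion":"outrage"},'
    '{"id":"ytc_B","responsibility":"robot","reasoning":"unclear",'
    '"policy":"unclear","emotion":"fear"}]'
)
print([r["id"] for r in validate_codings(raw)])  # → ['ytc_A']
```

Records that fail validation (an unknown label, a missing dimension) can then be flagged for re-coding rather than silently written to the database.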