Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> Unfortunately they didn't dig deeper into the statement "what does understand mean: You've got to be conscious of it". I'm neither an expert in AI nor the functioning of the brain, but this seems like a small step to achieve. I looked up Gödel's theorems, and quickly came to the conclusion that it's too much for my intoxicated brain... Still, I think we people overestimate our brain's capabilities and uniqueness....for now...

Source: youtube | AI Moral Status | 2025-05-29T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgwMeihC1a54PHDvegx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzFxLRxiZhYylZMAq94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwKGcsA_P0WfZwia3l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxysP1mM2rEuoAc8j94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyYnLqbc1V5WkC8N4R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgziJc7AHEFXkB2Lr3t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwL655vyL6BBDETDjF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgwinSEoqk214h7P8QZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxIwL6yg_RQypPM1NZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwO8vT2Omz8ha2IJqx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}]
```
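A response in this shape can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal, hypothetical validator (not part of the project's pipeline): the allowed values per dimension are only those observed in the sample above, so the real codebook may define more.

```python
import json

# Values observed in the sample response above; the full codebook
# may allow additional categories (this validator is an assumption).
OBSERVED_VALUES = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"resignation", "fear", "indifference", "outrage", "approval", "mixed"},
}


def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each record's shape."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset carry a source prefix (ytc_ / rdc_).
        if not rec["id"].startswith(("ytc_", "rdc_")):
            raise ValueError(f"unexpected id format: {rec['id']!r}")
        for dim, allowed in OBSERVED_VALUES.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec[dim]!r}")
    return records


# Usage: validate a single-record response.
raw = ('[{"id":"ytc_UgwMeihC1a54PHDvegx4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"fear"}]')
codings = validate_codings(raw)
```

Rejecting malformed records here, rather than at query time, keeps the "look up by comment ID" view free of partially coded entries.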