Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
it doesn't matter ... in the long run, they will get the same facial recognition…
rdc_ektdbez
Honestly, i think this is a great use of ai, for inspiration. Like, yea they wer…
ytc_Ugxg-eZGA…
You don’t even have to be a ‘big’ artist; I have only slightly over 1k on Instag…
ytc_Ugwel0Pqt…
AI will NEVER be conscious. Consciousness is a biological faculty, each cell of …
ytc_UgzkZQUSo…
A.I and Art are two words that I think, shouldn't be put together ..like Mayonna…
ytc_UgxWAYyOS…
The reason humans underestimate this type of change it that they view evolution …
ytc_Ugyp7s5M2…
Always treat everyone the same way we want to be treated inclusive machines no m…
ytc_UgwelB636…
Didn't they skewed the training data so that ChatGPT makes less worse statements…
ytc_UgxUuWo3T…
Comment
Are we sure that "solving alignment" will help us avoid disaster? If AI is a powerful tool maybe it's safer to be left on "auto" mode.
If humanity was a 7 year old kid and he had to fly a supersonic jet, then what is the chance it ends well? Probably around 0%. But if the jet could fly automatically I would feel much better about it.
Needless to say I'd rather not fly at all if I could help it.
youtube
AI Governance
2025-10-16T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugw87q9zbZpHndyBToh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwgqUoJfgOROg0z8v94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxHMLXRr5iWJK8TPC14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxBIFZaYUxl9uHUwgR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy87F2xB863izA0VL54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxmWvMLtxhgNNtrt4V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxdNMxsQXIFl-zWOat4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_UgxJfABiBpM6-L5_NN14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwBi8g74GabhQa4XVp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwaSWTdB9heV1BMBnd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
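The raw response above is a JSON array of per-comment records, each carrying the four coded dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of parsing such a response and looking up a comment by ID, assuming the allowed category values are exactly those that appear in this sample (the real codebook may define more):

```python
import json

# Allowed values inferred only from the sample output above; this is an
# assumption, not the full codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "industry_self", "liability"},
    "emotion": {"fear", "outrage", "mixed", "approval"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: dimensions},
    dropping any record with an out-of-vocabulary value."""
    coded = {}
    for rec in json.loads(raw):
        dims = {k: v for k, v in rec.items() if k != "id"}
        if all(dims.get(k) in vals for k, vals in ALLOWED.items()):
            coded[rec["id"]] = dims
    return coded

raw = ('[{"id":"ytc_UgxdNMxsQXIFl-zWOat4AaABAg",'
       '"responsibility":"none","reasoning":"consequentialist",'
       '"policy":"industry_self","emotion":"fear"}]')
coded = parse_response(raw)
print(coded["ytc_UgxdNMxsQXIFl-zWOat4AaABAg"]["emotion"])  # fear
```

Keying the results by comment ID mirrors the "Look up by comment ID" view above: the record for `ytc_UgxdNMxsQXIFl-zWOat4AaABAg` matches the coding-result table (none / consequentialist / industry_self / fear).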