Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "If this George guy was putting his foot onto the gas pedal to speed the car up, …" (ytc_UgzePyt56…)
- "Super AI will be able to breeze through security and have control over the world…" (ytc_UgxRFwJ-r…)
- "I mean in Canada at the age of 16 you can get a job and move out of your parents…" (ytc_UgzOf1mc6…)
- "This was a very interesting interview. I particularly enjoyed the bits about the…" (ytc_Ugz5PVUYE…)
- "Alexisthebest ever Yes, but what if that task is to kill anyone who could remote…" (ytr_Ugghx3Nm4…)
- "I use ai since I can’t afford an artist for the projects I’m working on but once…" (ytc_UgwBkzjm-…)
- "Born with the ability to draw? What's this guy on? Becoming a professional paint…" (ytc_UgwBkYKfS…)
- "That's exactly what I wanted to comment.After reading yours ,I refrained my self…" (ytr_Ugz6r4qR1…)
Comment
AI isn't going to become conscious anytime soon because that's not how LLMs (which have adopted the title of AI despite not fitting the classic definition) work. They can't "learn" anything. They can't form opinions or new ideas. All they can do is regurgitate words from their training data in the most likely order. Which is good for sounding like a human, but not for much else.
I'm no scientist, but the way I see it, AI isn't going to be _real_ AI until we can make artificial neurons at a quantity near enough to us. Nature nailed this; cells which can form connections to eachother to create more and more complex ideas.
youtube · AI Moral Status · 2023-07-12T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
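A coding result like the one above can be held as a small record and rendered back into the same two-column table. A minimal sketch, assuming a simple dataclass representation (the `Coding` class and its field names are illustrative, not the project's actual schema; the dimension names come from the table above):

```python
from dataclasses import dataclass


@dataclass
class Coding:
    """One coded comment: four analytic dimensions plus a timestamp.

    The class and field names are assumptions for illustration; only the
    dimension names are taken from the table on this page.
    """
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: str

    def as_markdown(self) -> str:
        # Render the coding as the two-column markdown table used on this page.
        rows = [
            ("Responsibility", self.responsibility),
            ("Reasoning", self.reasoning),
            ("Policy", self.policy),
            ("Emotion", self.emotion),
            ("Coded at", self.coded_at),
        ]
        lines = ["| Dimension | Value |", "|---|---|"]
        lines += [f"| {dim} | {val} |" for dim, val in rows]
        return "\n".join(lines)
```

Keeping the values as plain strings mirrors how they appear in the raw responses; a stricter version could constrain each dimension to the vocabulary observed in the batch output.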
Raw LLM Response
```json
[
  {"id": "ytc_Ugy5VPTTYZYHTJ3TVlF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgzBkRuYyFfgSU_RHFJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwnKpbcQoZGE4bNrCp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwVonILuww8ji8YONZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgxruvYm2hmT_jeNAwZ4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyoinQEGNhjBOndaLV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugx_L3uTVDRQwev7bG54AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzVSa1Eaj99x8SziKB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzUCmL1Z7sfmFJMtMR4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxOmxuqOg7CM98Tv3p4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
```
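A raw response like the one above is a JSON array of per-comment codings, which makes lookup by comment ID straightforward. A minimal sketch, assuming the response is valid JSON in exactly this shape (the two-record excerpt below is taken from the batch above):

```python
import json

# A short excerpt of a raw batch response (two records from the sample
# above); a real response may contain many more.
raw = (
    '[{"id":"ytc_UgzUCmL1Z7sfmFJMtMR4AaABAg","responsibility":"user",'
    '"reasoning":"deontological","policy":"liability","emotion":"fear"},'
    '{"id":"ytc_UgxOmxuqOg7CM98Tv3p4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]'
)

records = json.loads(raw)

# Index the records by comment ID so a single coding can be pulled out
# directly, mirroring the "Look up by comment ID" feature of this page.
by_id = {rec["id"]: rec for rec in records}

coding = by_id["ytc_UgzUCmL1Z7sfmFJMtMR4AaABAg"]
print(coding["emotion"])  # fear
```

Because each object carries its own `id`, the batch can be coded in any order and still joined back to the source comments without positional bookkeeping.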