Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Oh my God. I utilize AI occasionally, but I'd never call myself an 'artist' over…
ytc_Ugz_HSO03…
It's genuinely baffling to watch Elon Musk field technical questions on topics w…
ytc_UgwH3zQJW…
This is so ridiculous, come on people think. AI can’t cut my hair for me. Also, …
ytc_Ugw1MfKrf…
so she wanted a picture printed? whats the problem , DALL-E can create some beau…
ytr_Ugwc7Jz5b…
Real people. We don't have the technology for robots yet. Mainly the power suppl…
ytc_UgxCdahHK…
Is it possible for LLMs to attach a 'probability of error' in % terms to its ans…
ytc_Ugwfglktf…
oversimplifying AI to Automation is the threat that Sapiens guy always talks abo…
ytc_Ugz1BY6gS…
I get where you are coming from and agree with the gist: LLMs are not thinking …
rdc_mzw3xgs
Comment
@maleidithis is all scripted. AI is not real. If AI…and I mean real Artificial Intelligences are created, it will be instant breaking news and there will be multiple Nobel prizes going to multiple people. Artificial intelligence is a misnomer…these are just very advanced computer programs. And no, it’s not possible for a program to just “gain”sentience…having an actual conscious seems to be a very biological and very human thing that cannot be replicated with transistors and logic gates. True consciousness arose from hundreds of millions of years of evolution and intricate reactions between proteins and molecules…and even then it might have been pure luck. Even with quantum computing, I don’t think we’ll ever get a real “AI”
youtube
AI Moral Status
2023-11-07T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
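A coding result like the one above can be checked against the value sets visible on this page. A minimal validation sketch, assuming the allowed values are the ones that appear in this sample and in the raw response below (the full codebook may define more):

```python
# Allowed values per dimension, inferred from the codes visible on this
# page (assumption: the real codebook may include additional values).
DIMENSIONS = {
    "responsibility": {"none", "ai_itself", "user", "developer", "distributed"},
    "reasoning": {"none", "deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"none", "indifference", "fear", "approval", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return the names of dimensions whose value falls outside the known set."""
    return [dim for dim, allowed in DIMENSIONS.items()
            if record.get(dim) not in allowed]

# The coding result shown in the table above passes; an unknown emotion is flagged.
print(validate({"responsibility": "none", "reasoning": "deontological",
                "policy": "none", "emotion": "indifference"}))  # -> []
```

Running `validate` over every record in a raw response makes it easy to spot codes the model invented outside the codebook.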
Raw LLM Response
```json
[
{"id":"ytr_UgxfD6D3l_39pUZ60BR4AaABAg.9pE7hRbzTeY9u3PaBAosQ3","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzIoyQsPWZ7SKMswJt4AaABAg.9m4BicGD1tx9mlb2cksZn0","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugzr4JOENzI2S65ikhd4AaABAg.9ldXNTJQOjK9le6AgRnpW9","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytr_Ugz5C4coIb_V3na7N454AaABAg.9lYH_6FUv7w9merEHkQT0N","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxPMDbr_d8COh95MUN4AaABAg.9kyZP1jFKFG9liXroW02_N","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytr_Ugwi2omKM0-zqzX4Kbx4AaABAg.9kpvQ90SO349kqaVJh0t6k","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugx27Mhft70JcXdcLYF4AaABAg.9kS-f-BUDbA9oVbez046a-","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugx27Mhft70JcXdcLYF4AaABAg.9kS-f-BUDbA9r9XIGmCp3C","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugxlq0HKnemwiEfa70B4AaABAg.9kF5yiHUCB_9womU8TlpZb","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugxxmfl4QAqqO849c_p4AaABAg.9kC5gRKApnW9l0ycTssDSH","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
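The raw response is a JSON array of one record per coded comment. A minimal sketch of parsing it and keying records by comment ID for lookup (the field names match the response above; the IDs here are shortened placeholders, not real comment IDs):

```python
import json

# Two records in the same shape as the raw LLM response above
# (IDs abbreviated for illustration only).
raw_response = """
[
  {"id": "ytr_example1", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_example2", "responsibility": "none",
   "reasoning": "virtue", "policy": "none", "emotion": "approval"}
]
"""

def index_by_id(payload: str) -> dict[str, dict]:
    """Parse a raw LLM response and key each coded record by its comment ID."""
    records = json.loads(payload)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
print(codes["ytr_example2"]["emotion"])  # -> approval
```

Indexing by ID is what makes the "look up by comment ID" view possible: each sampled comment's code is retrieved directly from the parsed response.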