# Raw LLM Responses

Inspect the exact model output for any coded comment.
## Random samples
- In good humour sans offence: Google CEO must be exploring the ways to prevent "A… (`ytc_UgyMHvxTh…`)
- This is complete bullshit. Telling that people will be more productive, but stil… (`ytc_UgxCMJxKO…`)
- I remember reading an article in 2023 that explained how we will never have self… (`ytc_UgzxUQoKQ…`)
- Nice dialog. Can I suggest a topic? Ask it if what happened at Tiananmen Square … (`ytc_UgyED_oEO…`)
- IMO the key word is "artificial": A.I. today *appears* to do things in an intell… (`ytc_UgyNLpOBC…`)
- a.i is simply next step in human evoloution. think of having ur conciousness upl… (`ytc_UgxJjVSnb…`)
- All I know ai will never be able to wipe a person or take care of a sick person… (`ytc_UgwSiKa7w…`)
- Yeah, ethics shouldn’t get in the way of humans becoming a more advanced civiliz… (`ytr_UgxmqADW-…`)
## Comment

> I have also noticed that side to side movement attracts attention when i'm on the bike and this was reinforced by a highway patrol officer who told me that a driver weaving in and out of lanes gets his attention faster than speed differential. Not sure why this is, but animals often attack in a weaving approach, so maybe this is getting at some part of our instinctive defence mechanisms. I wonder if that kind of instinctive reaction could be taught to an AI? Perhaps not unless the human teachers were exposed to some kind of surprise visual stimulation to trigger it. We're going to need psychologists and behaviourists involved sooner or later on the AI journey.

youtube · AI Harm Incident · 2022-12-28T05:1… · ♥ 1
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response

```json
[{"id":"ytc_UgxIYtMKgczcqi80WP94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgfGFA3gwkcGpjisZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwl_hPP_MFr2jRpgfV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwTon5T35b9e_tRRTF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzKp3T8pvKbxYZ3EMF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwQg-kbAoVjtGZWPP54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzzH2uuodXCgwmz0kF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx3FfYVAZsd1eOF0u54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzWQxsLxtdvM1qnlx14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxmGYRbmFAqcnQm6jd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
```
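The raw response is a JSON array of per-comment codes keyed by comment ID, so looking up any comment's coding reduces to parsing the array and indexing by `id`. A minimal sketch of that lookup, using two rows copied verbatim from the response above (the variable names are illustrative, not part of the tool):

```python
import json

# Raw model output: a JSON array of code objects, one per comment.
# These two rows are copied verbatim from the response above.
raw = '''[
{"id":"ytc_UgxIYtMKgczcqi80WP94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzWQxsLxtdvM1qnlx14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]'''

# Index the parsed rows by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Fetch one comment's coding by its ID and read a single dimension.
row = codes["ytc_UgxIYtMKgczcqi80WP94AaABAg"]
print(row["emotion"])  # indifference
```

The same dictionary supports the "look up by comment ID" flow for every dimension in the coding-result table (responsibility, reasoning, policy, emotion).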