Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgyUa1oZR…`: "I had breaken chat gpt so many times about morality and consistency that if AI s…"
- `ytc_UgxaqLy0O…`: "Not here for the ai part, but the talent argument is valid, if you take 1000 peo…"
- `ytc_Ugxg5Fybt…`: "AI needs some humans to make electricity and mine minerals to turn into more AI.…"
- `ytc_UgyCwJh0_…`: "13:27 The problem with LLMs is that they reinforce such whack ideas and studies …"
- `ytc_UgyEe7gxv…`: "It's incredibly tone. Deaf, if not hypocritical, for workers in this industry to…"
- `ytc_UgzRnRlwN…`: "this is already a thing very much possible, but its easy to overcome. ai is more…"
- `ytc_UgywxcWMU…`: "I'm not afraid because while I'm not paranoid and influenced by Terminator, I co…"
- `ytc_Ugwx_DtJO…`: "It would be hard to really change anything. Most people using AI generated art j…"
Comment
Another reason nations might be hesitant to commit to a ban on the development of autonomous drone technology is just how low the industrial base required to field the most basic versions is shaping up to be. The powers that be lose potential advantage if they cease all research into a technology that only really takes a guy in his shed to get off the ground.
Source: youtube · 2024-07-12T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwND01huVrrVx_lj7N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzGV-1YwIiRe3KYNQ54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyY0eTJR82eqQ6sQ2d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxx0_H2-Us8DC2xd714AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwUaEYd0mnJ3gzOAkx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgytnwCYWoxdzNhgtZN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwaIi_roGHDzbjZs314AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzO65ibccFcK3AbnBF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugz9abVLWtihdOB01uF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzimHjXUFoosB0cRZF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
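A batched response like the one above can be checked before the codes are stored. The sketch below is a minimal validator assuming the four dimensions shown on this page and only the code values actually observed here (the real codebook may allow additional values, and `validate_batch` is a hypothetical helper, not part of the pipeline):

```python
import json

# Allowed values per dimension, as observed in the responses on this page.
# Assumption: the full codebook may define more codes than appear here.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban", "unclear"},
    "emotion": {"indifference", "outrage", "fear", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record must be an object with a comment ID.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Every dimension must be present and hold an allowed code.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"none","reasoning":"unclear",'
       '"policy":"unclear","emotion":"indifference"}]')
print(len(validate_batch(raw)))  # 1
```

Dropping malformed records rather than raising keeps one bad line in a batch from discarding the other nine codings.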