my god, some of the useful idiots there are galling
It looks like it’s reasoning pretty well to me. It came up with a correct way to count the number of r’s, got the number right, and then compared it with what it had learned during pre-training. The model seems to make a mistake towards the end: it writes STRAWBERY with two R’s and concludes it has two.
says the tedious poster, entirely ignoring that this is an extremely atypical baseline response, and thus that the model is clearly operating under prior instructions as to which methods to employ to “check its logic”
fucking promptfans. at least I have that paper from earlier to soothe me
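for the record, the deterministic version of the thing the model is being praised for is a three-liner (plain Python, obviously just a sketch and not a claim about what the model does internally):

    word = "strawberry"
    # str.count does the whole job; no chain-of-thought required
    print(f"{word!r} has {word.count('r')} r's")  # -> 'strawberry' has 3 r's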