I suspect this is the direct result of AI-generated content just overwhelming any real content.

I tried DDG, Google, Bing, and Qwant, and none of them really help me find the information I want these days.

Perplexity seems to work, but I don’t like the idea of an AI giving me “facts”, since those are mostly based on other AI posts.

  • bitjunkie@lemmy.world · 52 minutes ago

    It’s not just you. At some point, search’s primary purpose went from “finding the information you’re looking for” to “getting paid to put links in front of you”. Then they kept iterating on it, quarter by quarter, for a very long time.

  • DancingBear@midwest.social · 4 hours ago

    The other day I googled “how long should I broil a ribeye steak” and the Google AI told me to broil it for 45 minutes.

    Broil is the hottest setting on the oven and you’re supposed to broil the meat as close to the burner as possible. This would probably burn down your house.

    • Appoxo@lemmy.dbzer0.com · 3 hours ago

      Huh… I can’t replicate that claim (though I’d believe it happened).

      On September 20th I asked my Google Home if it would be raining.
      It responded that it would rain. When I asked when it would rain,
      it responded with “Today it won’t rain.”

      Like, what? Five seconds ago you said it would. No weather report shows rain. Where did you get the first response from?
      And that one I could even replicate (I have it on video).

      • DancingBear@midwest.social · 45 minutes ago

        I can’t get it to repeat it either, but it was definitely an AI auto-response from Google’s AI Overview, or whatever it’s called.

        Now it’s giving distance from burner and everything lol. It’s learning 👀

  • raspberriesareyummy@lemmy.world · 5 hours ago

    The whole internet is in the process of being filled with garbage content. Search engines are bad, but there’s also not much good content left to find (as a percentage of the total).

    • bitjunkie@lemmy.world · 48 minutes ago

      And the AI is trained on the shitty search results. It just parses them many times faster than a human reader can, which does at least make it better at getting to the fucking point. Once paid advertising is fully integrated with LLMs, they’ll be as shitty and useless as traditional search. And then the entire world will collectively hop to the next trend so it can get hyper-monetized/enshittified, too.

  • stealth_cookies@lemmy.ca · 4 hours ago

    I’ve been trying to use DDG and I just find it infuriating that it never finds what I need, especially if I’m looking for local information about something. Google always seems to prioritize those types of results when I need them (probably because it makes it easier to sell me something).

  • Sterile_Technique@lemmy.world · 6 hours ago

    There’s an extension that filters out websites from every engine. So like when you see Quora or other digital garbage in your results, block it once and you’ll never see another Quora article again.

    Idr the name of the extension - I’ll check when I get home and follow up.
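
    (The extension isn’t named above, so purely as a hypothetical sketch of the idea: a tiny userscript that hides results whose links point at domains you’ve blocked. The domain list and the result-container selector below are made up for illustration, not taken from any particular extension.)

    ```typescript
    // Hypothetical sketch: hide search results that link to blocked domains.
    const blockedDomains: string[] = ["quora.com", "pinterest.com"]; // example list

    function hideBlockedResults(): void {
      document.querySelectorAll<HTMLAnchorElement>("a[href]").forEach((link) => {
        let host: string;
        try {
          host = new URL(link.href).hostname;
        } catch {
          return; // skip hrefs that aren't absolute URLs
        }
        if (blockedDomains.some((d) => host === d || host.endsWith("." + d))) {
          // Hide the nearest element that looks like a whole result entry.
          const result = link.closest("li, article, div");
          if (result instanceof HTMLElement) result.style.display = "none";
        }
      });
    }

    hideBlockedResults();
    // Many engines load more results dynamically, so re-run on page changes.
    new MutationObserver(hideBlockedResults).observe(document.body, {
      childList: true,
      subtree: true,
    });
    ```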

  • Zink@programming.dev · 7 hours ago

    My experience is that search engines are still decent at finding niche information that would normally be hard to find. But for anything mainstream, like any household product that should be easy to find information about, you instead get 300 pages of top-10 lists of Amazon affiliate links buried under AI-generated filler.

  • JackbyDev@programming.dev · 8 hours ago

    I’m going to be honest with you. They feel no worse today than they have for the past five-plus years. SEO blog spam with a dozen paragraphs to tell you exactly one line of information has been around for quite a while. Many of those articles felt generated, either by crappy writers or by “AI” tools predating the LLMs we have now.

  • Blackmist@feddit.uk · 8 hours ago

    It is, and it’s not just the search engines to blame.

    The content out there is incredibly spammy. It doesn’t pay to create good content. It pays to make a pool of AI gunge based on what people search for and then stick ads on it.

    • A_Random_Idiot@lemmy.world · 6 hours ago

      Spam sites laden with keywords and massive SEO to farm advertising dollars from clicks long predated AI.

      It doesn’t help that big search engines like Google have realized people will go as far as page 2 or 3 to find results, so they intentionally worsen their search results to serve more ads.

  • catastrophicblues@lemmy.ca · 7 hours ago

    I’ve found that using Kagi, then DDG, then Google always gets me the results I need. But 95% of the time, Kagi gets it.

  • Voroxpete@sh.itjust.works · 10 hours ago

    It’s not just you. Search got worse, and it did so intentionally.

    Ed Zitron lays it all out really well, with all the receipts, but the basic version is this: Google has an incentive to make you search more for the same things, because then they can show you more ads. And Google is, first and foremost, an ad delivery company. Every “product” they own is an ad delivery vehicle. It’s not just AI slop that made search bad; Google made search bad, and everyone else followed suit to a greater or lesser degree.

  • ContrarianTrail@lemm.ee · 6 hours ago

    I honestly don’t even remember the last time I googled something. Nowadays I just ask ChatGPT.

    • zarkanian@sh.itjust.works · 5 hours ago

      The problem with getting answers from AI is that if they don’t know something, they’ll just make it up.

      • OlinOfTheHillPeople@lemmy.world · 5 hours ago

        “If I have to create stories so that the American media actually pays attention to the suffering of the American people, then that’s what I’m going to do.”

        • VanceGPT

      • ContrarianTrail@lemm.ee · 5 hours ago

        LLMs have their flaws, but for my use they’re usually good enough. It’s rarely mission-critical information that I’m looking for. It satisfies my thirst for an answer, and even if it’s wrong I’m probably going to forget it in a few hours anyway. If it’s something important, I’ll start with ChatGPT and then fact-check it by looking up the information myself.

        • zarkanian@sh.itjust.works · 5 hours ago

          So, let me get this straight…you “thirst for an answer”, but you don’t care whether or not the answer is correct?

          • ContrarianTrail@lemm.ee · 4 hours ago

            Of course I care whether the answer is correct. My point was that even when it’s not, it doesn’t really matter much because if it were critical, I wouldn’t be asking ChatGPT in the first place. More often than not, the answer it gives me is correct. The occasional hallucination is a price I’m willing to pay for the huge convenience of having something like ChatGPT to quickly bounce ideas off of and ask about stuff.

            • zarkanian@sh.itjust.works · 2 hours ago

              I agree that AI can be helpful for bouncing ideas off of. It’s been a great aid in learning, too. However, when I’m using it to help me learn programming, for example, I can run the code and see whether or not it works.

              I’m automatically skeptical of anything they tell me, because I know they could just be making something up. I always have to verify.