I'm trying to figure out how the online search function works… I haven't had much luck so far. A general discussion about the app would also be very helpful for everyone.

  • Teppichbrand@feddit.org · 1 day ago

    This is off-topic, but I switched from Alpaca to duck.ai. I try not to use AI too often, and even though I like the idea of running it locally, duck.ai is way easier to use.

    • MickeyMice@lemm.ee (OP) · 21 hours ago

      I use it sometimes too, but that is an online AI: everything you send goes to someone's servers, so it's not private. Alpaca uses Ollama to run the AI locally on your machine, so everything you do with it stays private. The two are completely different, and comparing which one is easier to use misses the point. Ease of use is not the point here; privacy is.

  • vermaterc@lemmy.ml · 1 day ago

    Taking advantage of the fact that this thread became popular, a question to all of you: can you recommend any other open source LLM front ends?

    • Domi@lemmy.secnd.me · 17 hours ago

      LM Studio is by far my favorite. It supports all GPUs out of the box on Linux and has tons of options.

        • Domi@lemmy.secnd.me · 14 hours ago

          Looks like you’re right.

          I switched to it when Alpaca stopped working on AMD GPUs, and I was under the impression that it was open source.

          • ozymandias117@lemmy.world · 5 hours ago

            Depending on how you had it installed: Alpaca split hardware support into separate Flatpak packages.

            If you want AMD support, you need to install com.jeffser.Alpaca.Plugins.AMD
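            If Alpaca came from Flathub, installing that plugin would look something like this (a sketch; `flathub` is the usual remote name, but yours may differ):

```shell
# Add AMD (ROCm) acceleration support to the Alpaca Flatpak
flatpak install flathub com.jeffser.Alpaca.Plugins.AMD

# Confirm the plugin shows up next to the main app
flatpak list | grep com.jeffser.Alpaca
```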

            • Domi@lemmy.secnd.me · 8 minutes ago

              Doesn’t work for me, unfortunately; it has always fallen back to the CPU ever since the packages were split up.

    • juipeltje@lemmy.world · 19 hours ago

      So far I’ve really liked just using ollama in the terminal, since it just spits out text anyway.
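      For anyone who hasn't tried it, the terminal workflow is just this (the model name is only an example; it needs the ollama daemon running):

```shell
# Download a model once, then chat with it from the terminal
ollama pull llama3.2
ollama run llama3.2 "Explain what a quantized model is in one sentence."
```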

      • vermaterc@lemmy.ml · 19 hours ago

        Of course, I could even send raw API requests, but sometimes it’s good to have a nice GUI that “just works”.

        Specifically, I’m looking for something that can handle not only text responses, but also attachments, speech recognition, and MCP support.
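        For reference, a raw request to Ollama's local HTTP API can be sketched like this in Python (stdlib only; the model name `llama3.2` and the default port 11434 are assumptions, adjust for your setup):

```python
import json
import urllib.request

def build_generate_request(model: str, prompt: str) -> dict:
    """JSON body for Ollama's /api/generate endpoint (non-streaming)."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt: str, model: str = "llama3.2",
             base_url: str = "http://localhost:11434") -> str:
    """Send one generate request and return the model's text response."""
    data = json.dumps(build_generate_request(model, prompt)).encode()
    req = urllib.request.Request(
        f"{base_url}/api/generate",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Needs a running ollama server, e.g.:
# print(generate("Why is the sky blue?"))
```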

        • juipeltje@lemmy.world · 15 hours ago

          Yeah, in that case you probably want something else. So far I’ve only ever used it for text-based questions. I think I remember seeing that there is also a web UI out there, but I don’t remember the name.

  • Shape4985@lemmy.ml · 1 day ago

    I used Alpaca, but they made some changes recently that made it confusing and a pain to use. I deleted it after that, as I don’t use AI much anyway.

  • astro_ray@piefed.social · 1 day ago

    There are still active accounts on lemm.ee?

    I am not certain what you mean by online search function. It can connect to the internet but it doesn’t exactly function like a search engine from what I can understand.

  • Snot Flickerman@lemmy.blahaj.zone · 1 day ago (edited)

    https://github.com/Jeffser/Alpaca

    This will probably help anyone unfamiliar with it, since the first search result for “Alpaca AI” is an unrelated paid online AI service that does something entirely different: AI image generation.

    The main question I have is, since Ollama is optional: if you do use it, is it still sharing data with Facebook/Meta?

    • MickeyMice@lemm.ee (OP) · 1 day ago (edited)

      I didn’t know that Ollama shares data with Facebook… Why would it do something like that? Wouldn’t that be the opposite of what it was created for, namely privacy? Where did you get that info?

      • Snot Flickerman@lemmy.blahaj.zone · 1 day ago (edited)

        From the comments, it looked like that’s why he made the Ollama integration optional: some people were concerned because Ollama was built by Meta. It can run without Ollama, it seems.

        EDIT: Doing more research on Ollama itself, I’m unconvinced that it’s sharing any data, despite being built by Meta.

        • MickeyMice@lemm.ee (OP) · 1 day ago

          I didn’t know that Ollama was built by Meta; where did you find that out? It’s also an open source project, so it shouldn’t have malicious code like that…

          • spencer@lemmy.ca · 1 day ago

            Meta trained and published the model, but it’s an open model. I’m not an expert, but I don’t believe it shares data with Meta, since it’s just the model they trained: you can download it and run it offline. You’re just using the output of all their training, on your own compute.