What’s up, what’s down and what are you not sure about?

Let us know what you set up lately, what kind of problems you currently think about or are running into, what new device you added to your homelab or what interesting service or article you found.

I finally finished the first iteration of my Minilab yesterday, including a very smooth migration from the old server, so I can turn to the service side of things again. I plan to set up some kind of self-hosted VPN for external access to services that aren’t exposed to the internet; I’ll have to investigate which one.
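For anyone weighing the same decision: WireGuard is one of the simpler self-hosted options, and Tailscale/NetBird wrap the same protocol with key management and NAT traversal on top. A minimal client-config sketch, where the server name, keys, and subnets are all made-up placeholders:

```ini
# /etc/wireguard/wg0.conf -- client side (every value here is a placeholder)
[Interface]
PrivateKey = <client-private-key>
Address = 10.8.0.2/24

[Peer]
PublicKey = <server-public-key>
Endpoint = vpn.example.com:51820
# Route only the VPN and homelab subnets through the tunnel, not all traffic
AllowedIPs = 10.8.0.0/24, 192.168.1.0/24
PersistentKeepalive = 25
```

If the lab sits behind CGNAT and can’t accept inbound connections, that’s the case where the managed wrappers (Tailscale, NetBird) earn their keep.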

  • cron@feddit.org · 19 days ago

    I installed the Coraza web application firewall with the OWASP ruleset this weekend. I must admit it wasn’t as easy as I expected, but it now (mostly) works. I had to give up on Nextcloud, though.
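For reference, Coraza consumes ModSecurity-style (Seclang) directives, so enabling the Core Rule Set mostly comes down to a config like the sketch below. The file paths are placeholders for wherever the CRS lives in your deployment, and the Nextcloud exclusion is only an illustration of the usual workaround for apps that trip CRS rules:

```
# Illustrative Seclang fragment (directives are ModSecurity/Coraza standard;
# paths are placeholders)
SecRuleEngine On
SecRequestBodyAccess On
Include /etc/crs/crs-setup.conf
Include /etc/crs/rules/*.conf

# Example exclusion: disable a specific CRS rule for one path instead of
# turning the engine off globally (rule ID chosen for illustration)
SecRule REQUEST_URI "@beginsWith /nextcloud" \
    "id:1001,phase:1,pass,nolog,ctl:ruleRemoveById=920420"
```

Running in `DetectionOnly` mode first and watching the audit log is the usual way to find which rules an app like Nextcloud trips before writing exclusions.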

  • anotherandrew@mbin.mixdown.ca · 19 days ago

    A couple of things I’ve been working on:

    First, I spun up a larger VPS to consolidate two smaller ones. This time I dockerized almost everything (I’m still a Docker newb): Karakeep, Redmine, mbin, Lemmy (still deciding which of those two I want), and DAViCal. Asterisk and Postfix/Dovecot are probably gonna stay on the VPS root. I’m using ZFS with compression. Interestingly, the Postgres database that everything uses seems to get better compression than the mail spool.
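That compression observation is easy to quantify, since ZFS tracks the achieved ratio per dataset. A quick sketch, with hypothetical dataset names:

```shell
# Compare achieved compression per dataset (dataset names are placeholders)
zfs get compression,compressratio tank/postgres tank/mail

# Text-heavy data like a Postgres DB usually compresses well with lz4/zstd;
# a mail spool full of already-compressed attachments gains much less
zfs set compression=zstd tank/postgres
```

Changing the `compression` property only affects newly written blocks, so ratios shift gradually as data is rewritten.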

    A couple of weeks ago I picked up a NetApp 7-bay disk shelf for $30. It uses Fibre Channel (AT-FCX) controllers, which I’ve never used before. I grabbed a $7 FC HBA (a QLE2560), a 2 m cable, and an M.2-to-PCIe adapter meant for an eGPU. The idea is to see if I can’t get the RK3588 board I’m playing with to see it. I did something similar with a $50 Dell 12-drive bay and my old C6100.

  • chirospasm@lemmy.ml · 18 days ago

    Hello! I recently deployed GPUStack, a self-hosted GPU resource manager.

    It helps you deploy AI models across clusters of GPUs, regardless of network or device. Got a Mac? It can toss a model on there and route it into an interface. Got a VM on a server somewhere? Same. How about your home PC with that beefy gaming GPU? No prob. GPUStack is great at scaling what you have on hand, without having to deploy a bunch of independent instances of Ollama, llama.cpp, etc.

    I use it to route pre-run LLMs into Open WebUI, another self-hosted interface for AI interactions, via the OpenAI API that both GPUStack and Open WebUI support!
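Since both ends speak the OpenAI API, scripting against a setup like this is just standard chat-completion requests. A minimal sketch; the base URL, API key, and model name are made-up placeholders for whatever your GPUStack instance actually serves:

```python
import json
import urllib.request

# Placeholder endpoint and key -- substitute your GPUStack host and a model
# you've actually deployed there.
BASE_URL = "http://gpustack.local/v1"
API_KEY = "sk-placeholder"

def chat_payload(model: str, prompt: str) -> dict:
    """Build an OpenAI-compatible chat completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def build_request(payload: dict) -> urllib.request.Request:
    """Wrap the payload in a POST to the /chat/completions endpoint."""
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
    )

req = build_request(chat_payload("llama-3.1-8b-instruct", "Hello!"))
print(req.full_url)  # http://gpustack.local/v1/chat/completions
```

The same request shape is what Open WebUI sends under the hood when you register GPUStack as an OpenAI-compatible backend.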

  • nfreak@lemmy.ml · 19 days ago

    I started this about a month ago with absolutely no idea what I was doing, and in that short time this little box has grown a ton. Got the basics for cloud storage, Jellyfin with the arr suite, Navidrome to replace Spotify/Tidal, etc. Got my scanner going right into Paperless, finally started a budget planner with Actual Budget, and even set up Homebox to maybe eventually keep track of my collections of random bullshit. Spent three days fighting with WireGuard and gluetun to make a single VPN connection that hooks me into my LAN but also routes all my outbound traffic through Mullvad, using Pi-hole as my DNS. I should get Unbound set up at some point too, but that’s a project for another day.
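For anyone attempting the same split-tunnel pattern: the usual trick is to run gluetun as the network namespace for the containers whose traffic should exit via the VPN. A rough compose sketch; the environment variable names are from gluetun’s docs as best I recall (verify against its wiki), and the key, addresses, and second service are placeholders:

```yaml
services:
  gluetun:
    image: qmcgaw/gluetun
    cap_add: [NET_ADMIN]
    environment:
      VPN_SERVICE_PROVIDER: mullvad
      VPN_TYPE: wireguard
      WIREGUARD_PRIVATE_KEY: "<placeholder>"
      WIREGUARD_ADDRESSES: "10.64.0.2/32"
      DNS_ADDRESS: "192.168.1.10"   # point DNS at the Pi-hole

  qbittorrent:
    image: lscr.io/linuxserver/qbittorrent
    network_mode: "service:gluetun"  # all traffic exits via the Mullvad tunnel
```

Anything sharing gluetun’s network namespace loses connectivity when the tunnel drops, which doubles as a crude kill switch.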

    Today I learned about Home Assistant, and while I’m not one to care about IoT shit or whatever, just dabbling with NFC tags for the lights and such has been pretty neat.

    This week I’m getting a second machine in that I’m going to use exclusively as a NAS and stop relying on USB external hard drives.

    I really just wanted a little 24/7 Bob Ross box with a bit of cloud storage, and this project blew up a lot more than I thought it would LOL

  • hobbsc@lemmy.sdf.org · 19 days ago

    I just moved almost all of my containers (except for my Omada controller) off my VM running Ubuntu and Docker and onto a VM running Fedora and Podman. Why? I was in a product sales call (being sold to) and didn’t have any actual work tasks to do during that time. Now there’s an additional VM on the network.

    Trying to decide if I’ll move Omada as well or just shift everything back. I shouldn’t have fiddled with the stack while I was bored; a video game or something would’ve been a better idea.

  • RagingHungryPanda@lemm.ee · 19 days ago

    Sweet!

    What’s up is everything I’ve been running and down is what I haven’t.

    not working

    I haven’t been able to get Friendica to connect to MariaDB, so I’ll eventually try plain MySQL. Grafana isn’t running because I’d need to change a lot of things to get an exporter into each container, and the TrueNAS apps don’t really allow that configuration. It’s fine if you have Docker Compose, though, which I’ve started using more and more.
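One common culprit with app-to-MariaDB connection failures is the app starting before the database is ready, or the host/credential variables not lining up between the two services. A compose-style sketch with a healthcheck; the names and passwords are placeholders, and Friendica’s exact environment variable names are an assumption that should be checked against its image documentation:

```yaml
services:
  db:
    image: mariadb:11
    environment:
      MARIADB_DATABASE: friendica
      MARIADB_USER: friendica
      MARIADB_PASSWORD: "<placeholder>"
      MARIADB_RANDOM_ROOT_PASSWORD: "yes"
    healthcheck:
      test: ["CMD", "healthcheck.sh", "--connect", "--innodb_initialized"]
      interval: 10s
      retries: 5

  friendica:
    image: friendica
    depends_on:
      db:
        condition: service_healthy   # don't start until MariaDB answers
    environment:
      MYSQL_HOST: db                 # variable names: check the image docs
      MYSQL_DATABASE: friendica
      MYSQL_USER: friendica
      MYSQL_PASSWORD: "<placeholder>"
```

The `healthcheck.sh` script ships inside the official MariaDB image, so the `depends_on` condition actually waits for InnoDB to finish initializing rather than just for the container to exist.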

    new

    I just got up and running with Stirling PDF, a free (and paid) PDF editor. It looks pretty sweet.

    But I’m now also using 15 GB of the 32 in the system, which still leaves plenty for the ZFS ARC for me.
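If the ARC ever needs reining in as services multiply, ZFS exposes a module parameter for the cap. On TrueNAS this is normally set through the UI’s tunables rather than by hand; the plain-Linux form is sketched below, with the value (in bytes) as a placeholder:

```
# /etc/modprobe.d/zfs.conf -- cap the ARC at 16 GiB (placeholder value)
options zfs zfs_arc_max=17179869184

# Or apply at runtime without a reboot:
# echo 17179869184 > /sys/module/zfs/parameters/zfs_arc_max
```

By default the ARC shrinks under memory pressure anyway, so an explicit cap mostly matters when other services need guaranteed headroom.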

    what I want

    I want to rent a VPS to host various Fediverse apps, probably Lemmy, Pixelfed, and WriteFreely to start, for the nomad/expat communities. I’ve been looking at netcup, and they have some decent ARM offerings.

    I’d like to put Talos Linux on it so I can get some Kubernetes experience. They have a good-sized server for €10, so I could expand to add a DB server or one specifically for logging and metrics.

    I was looking at Hetzner, but I’ve read that their block storage is super slow and causes timeouts on database workloads.

    Of course, can I even run these apps on ARM? I guess I’ve got to find that out.

    One thing I’d like to do is make a web page that makes signups super easy and would ideally create an account on all services. Not a huge deal if that isn’t reasonable, but it’d be nice to do it once rather than multiple times. If I could get SSO, that’d be good, but I don’t know how well supported that is.

  • Plebcouncilman@sh.itjust.works · 19 days ago

    I know next to nothing about using the command line, so I’ve been relying pretty heavily on ChatGPT to set my stuff up, and so far it has reliably helped me overcome every issue. The problem, of course, is that I often don’t even understand what the issue was in the first place, so I don’t know whether the fix the AI spits out is, let’s say, correct. I don’t really want to become an IT expert; I just want to be able to host some services on my own to depend less on corps. Is it alright if I continue to rely on the AI, or do you guys think I just have to learn this stuff or else I might mess up?

    I don’t have great security concerns, btw: my ISP doesn’t allow port forwarding, so I access my server exclusively through Tailscale.

    • tofu@lemmy.nocturnal.garden (OP) · 19 days ago

      Most of the stuff will somewhat work, but sooner or later you’ll introduce side effects by using commands that happen to work but aren’t the proper ones and alter unrelated things. At some point those will likely bite you, and you’ll have no idea where the problem is coming from. I’d suggest at least checking what the commands you’re copying actually do.

    • milicent_bystandr@lemm.ee · 19 days ago

      I’ve had some amusingly mixed experiences with ChatGPT for this. When I asked about iptables rules to restrict Podman, it was great. When I asked about Podman quadlets, though, which I first misspelled as ‘quartlets’, it completely made them up, and even sent me a fake link to nonexistent documentation when I challenged it!

      • It’s more helpful if you ask the right questions.
      • Its answers often give you ideas of what to google.
      • Old stuff that has been written about many times over is more likely to get a proper answer.
      • Sometimes the gist of a wrong command/answer can still help me understand what to do with the right one.

      Try to understand whatever you use from AI: at least the general picture of what it means, and a basic idea of “this flag is for this; this option is for that”. AI can also help you with that understanding, but again, beware of it making up something logically coherent but wrong.
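For the record (since this is exactly the kind of thing the chatbot invented), a real Quadlet unit is just an INI file that Podman’s systemd generator turns into a service. A minimal sketch; the image, port, and volume path are placeholders:

```ini
# ~/.config/containers/systemd/web.container (rootless) -- values are placeholders
[Unit]
Description=Example web container

[Container]
Image=docker.io/library/nginx:alpine
PublishPort=8080:80
Volume=%h/www:/usr/share/nginx/html:Z

[Service]
Restart=always

[Install]
WantedBy=default.target
```

After `systemctl --user daemon-reload`, the generator produces a `web.service` you can start and enable like any other unit; the authoritative key list is in the podman-systemd.unit(5) man page, which is a better reference than anything a chatbot improvises.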

      • Plebcouncilman@sh.itjust.works · 19 days ago

        Yes, this happened to me as well. I don’t remember what I was asking about, but I remember I made a typo and it just ran with it as if it were a real thing. I let it keep going to see if it would ever realize it was talking about something that didn’t exist, but nope, it kept going until I pointed it out.

        I ask it to explain what each command did, and I did manage to wrap my head around a few concepts, but in the end I feel like I’m trusting it not to insert any vulnerabilities into the system, and I don’t like that. Mistrust is the whole reason I’m doing this. But yeah, I’ll pay close attention and maybe even ask about all the implications of the changes we make.

    • gonzo-rand19@moist.catsweat.com · 19 days ago

      What you can probably do to build some knowledge, if you’re going to be using AI anyway, is ask it to explain some of the concepts to you. You can also ask clarifying questions about anything you don’t understand.

      • Plebcouncilman@sh.itjust.works · 19 days ago

        Yes, I do that, and it does help me a lot to understand what I’m doing. It’s just that I’m a top-down type of guy: I don’t like messing with anything unless I fully understand it, which often makes me very unproductive. I decided not to be that way with this self-hosting thing, because I realized I’d never get around to it with that mentality. Better to break shit as I go.