

Dealing with a DS-Lite connection from a fiber optic provider on the latest OPNsense.
It is excruciating how difficult it is to run it reliably.
Some clarifications:
The 3-2-1 rule applies only to the data, not to the backups. In my case I have the real/live data, then a daily snapshot in the same volume/pool, and an external off-site backup.
For the databases you got misleading information: you can copy the files as they are, BUT you need to be sure that the database is not running (you could copy the data in the middle of a transaction, leading to problems down the road), AND when you restore, you need to restore to the exact same database version.
Using the export functionality you ensure that the data is not corrupted (the database enforces the correctness of the data), and you keep the option of restoring to another database version.
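For example, a logical export from a dockerized database looks like this (container, user and database names are placeholders, adjust them to your setup):

```shell
#!/bin/sh
# Hypothetical container/database names -- adjust to your setup.
# A logical dump is consistent (the server snapshots it for you) and
# can usually be restored into a newer server version.

# PostgreSQL:
docker exec my-postgres pg_dump -U myuser mydb | gzip > /backups/mydb-$(date +%F).sql.gz

# MariaDB/MySQL equivalent (--single-transaction gives a consistent InnoDB snapshot):
docker exec my-mariadb mysqldump -u myuser -p'secret' --single-transaction mydb \
  | gzip > /backups/mydb-$(date +%F).sql.gz
```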
My suggestion: use borgbackup or any other backup system with deduplication, stop Docker to ensure no corruption, and save everything. A downtime of one minute every day is usually not a deal breaker for home users.
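A minimal sketch of that nightly routine (the repo path and compose file location are assumptions, and the borg repo must have been created once with `borg init`):

```shell
#!/bin/sh
# Nightly stop -> backup -> start sketch; all paths are examples.
set -e
export BORG_REPO=/mnt/backup/borg-repo     # assumed, created once with: borg init -e repokey

docker compose -f /srv/docker-compose.yml stop   # ~1 minute of downtime, files are now quiescent
borg create --stats ::'{hostname}-{now}' /srv    # deduplicated archive of the whole app directory
docker compose -f /srv/docker-compose.yml start

borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6   # thin out the old archives
```

Run it from cron at night and the downtime goes unnoticed.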
Well, I don’t know how many online interactions you have. But have you ever considered that the feedback you received here might actually be the average opinion?
These days it is quite difficult to gauge the crowd’s feeling, given the insane amount of echo chambers, the mass media creating opinion instead of informing, and personal biases.
Usually, every one of us is quite biased by our own interactions and our knowledge of how the world works around us. But those interactions are mainly defined by our social status, place of birth, or simply where we live. Given the typical profile of the people who debate on the internet (probably well-off urban people), it is possible that those ideas lean in a specific direction. But then again, it could be that you are simply realizing you are not with the majority.
If you already have a will, the most secure, idiot-proof way is to add that key plus instructions to the will. Get some lawyers on board for that and it will work.
If you still have concerns about having the full key in a single place, add a TOTP or a second identification factor and distribute it between your heirs.
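If you want to go a step further, Shamir secret sharing does exactly that kind of split; a sketch with the `ssss` tool (available as a Debian package; the share count and threshold are just examples):

```shell
# Split a secret into 3 shares; any 2 of them reconstruct it, 1 alone is useless.
echo "my-master-key" | ssss-split -t 2 -n 3 -q

# Hand one share to each heir (plus the lawyer); recombine later with:
# ssss-combine -t 2
```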
Sometimes, the old-fashioned way is the best one by far.
Try Kasm.
Honestly, you don’t feel the lag of the connection (unless it is a severely limited one), and it also allows simultaneous multi-user connections.
Check it out and come back with your feedback!
I would recommend an LDAP server for user auth.
There you can create/authenticate users against a central repository in a machine-independent fashion. Also, being able to allow or deny specific services from the central database is a big plus.
It seems difficult at the very beginning, but it quickly pays off. Give it a try!
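As a sketch of how little it takes to get started (the image, domain and password here are only examples):

```shell
# Throwaway OpenLDAP server using the osixia/openldap image.
docker run -d --name ldap \
  -e LDAP_ORGANISATION="Home" \
  -e LDAP_DOMAIN="home.lan" \
  -e LDAP_ADMIN_PASSWORD="changeme" \
  -p 389:389 osixia/openldap

# Any LDAP-capable service can then authenticate against it; quick smoke test:
ldapsearch -x -H ldap://localhost -b dc=home,dc=lan \
  -D cn=admin,dc=home,dc=lan -w changeme '(objectClass=*)'
```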
Yes, you will definitely get a better deal going with a homemade solution here.
Buuuut, there is an important point to highlight: the probability of Synology fucking up your data is much lower than that of the average self-hoster. Unless you already know the pros, the cons, and how to solve problems without data loss almost perfectly, you are not better than the average.
As an example, I went with a Synology box even though I consider myself better than the average, because the data in my NAS is extremely (but really extremely) important to me and my wife. And the price was a reasonable fee to keep that data safe.
So, evaluate yourself: if the data is really important and you are not a really good sysadmin, then go with a professional solution. If not, go with a DIY solution and learn in the process.
Just my two cents
Every country is sovereign in its territory. This means that it can:
As an example, the EU did exactly this when the GDPR came into play.
Totally overkill. If you cut the specs in half, I have the feeling they are still overkill.
The only open point is the HDDs and the mass storage; I cannot decide if it is a lot or not, but from your list I would say you could even go one order of magnitude down. It mainly depends on the number of Linux ISOs you want to archive.
My points go totally in the other direction:
And then, as a second league, the things that tip the balance:
That’s all from my side
Most of those moderators just step in the moment a lie enters the chat.
And no, your freedom of speech does not allow you to lie without facing the consequences.
Totally agree with the first point: it is a limitation, and tying the guest wifi to an eth port is just a patch. One that works, but still a patch.
But I don’t see the point of the prefixes. What do you mean? I also have a custom domain and a local DNS server, so I can use the domain even internally. I just simply ignore that…
True, but I would expect those to reach end of life soon too.
Fritzbox boxes.
They tick all the checkboxes
It is a well-known brand in Germany but pretty unknown outside that country. Honestly, it is the best bang for the buck I was able to get.
I would spend 10 minutes checking them out.
A nas is your friend.
And there is plenty of space in the Caribbean Sea.
No idea at all, but I am highly interested in your experience. It would be great if you could come back here to share it with us!
This is the answer
Yes, it will be enough if your services are not exposed via port forwarding; Tailscale/ZeroTier are super convenient for this.
Honestly, if I were you I would start thinking about a small computer acting just as a proxy/firewall for your Synology, or even better, run the applications on that computer and let the NAS only serve files and data.
It is much easier to support, maintain, and harden a minimal Debian installation than any Synology box, simply because of the amount of resources available to do so. This way you could extend the life of your NAS far beyond the end of life of its software.
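A sketch of the proxy part on such a Debian box (the hostname, certificate paths and NAS IP are placeholders for whatever your network uses):

```shell
# Install nginx and proxy an internal hostname to the Synology (DSM listens on 5000 by default).
apt install -y nginx
cat > /etc/nginx/sites-available/nas <<'EOF'
server {
    listen 443 ssl;
    server_name nas.example.lan;
    ssl_certificate     /etc/ssl/nas.crt;    # bring your own certificate
    ssl_certificate_key /etc/ssl/nas.key;
    location / {
        proxy_pass http://192.168.1.10:5000;  # NAS IP, placeholder
        proxy_set_header Host $host;
    }
}
EOF
ln -s /etc/nginx/sites-available/nas /etc/nginx/sites-enabled/
nginx -t && systemctl reload nginx
```

This way the NAS itself never faces the network directly.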
I use the tchapi docker image for the CalDAV server (due to the LDAP support for user auth) and DAVx5 for the Android integration.
On desktop, Thunderbird already has native integration, and with iPhone it is also working fine.
No problems so far in almost a year; they work reliably and smoothly. The only thing I somewhat miss is push notifications from the server to the devices, but that is not a deal breaker for me.
You will need to explain that statement a bit further to this mildly knowledgeable internet stranger…
Because the point of a WAF is exactly about reducing the exposed surface…