deepdive

joined 1 year ago
[–] deepdive@lemmy.world 2 points 10 months ago (1 children)

This post was about browsers, but my feelings when I wrote it were a more general "conclusion". I only recently found out about some "hidden" privacy concerns with browsers (WebRTC leaking your real IP, font fingerprinting...). But when I found out about Android's default keyboard sending samples, IoT weaknesses, smart devices hoarding data... it really feels like a losing battle while being connected to the world...

[–] deepdive@lemmy.world 2 points 10 months ago (1 children)

Do not overthink they want to know about you everything.

That's true, they probably already have everything they need... It's not only about my personal data, and my example only points at web technology, but all around us are data-hoarding devices that are used for targeted ads, campaigns, profiling, AI dataset feeding... whatever!

It feels like we have already lost our right to privacy, and lost control over how personal data and telemetry are used in our society as a whole...

68
submitted 10 months ago* (last edited 10 months ago) by deepdive@lemmy.world to c/privacy@lemmy.ml
 

Heyha !

This is probably going to be a long take and it's late here in Europe... So for those who bear with me and are ready to read through my broken English, thank you.

I'm personally concerned about how my data and my identity are used against my will while surfing the web or using/hosting services. As a self-hoster and networking enthusiast, I have an entry-to-medium-level security infrastructure.

Ranging from a self-hosted ad blocker, DNS, router, VLANs, containers, server, firewall, WireGuard, VPN... you name it! I was pretty happy to see, in Wireshark, all my traffic being encrypted, and to have what I consider a solid homelab.

I also have most undesired DNS/ads blocked with AdGuard, Firefox with a custom configuration blocking everything, and some about:config options changed (a user.js sketch follows the list):

  • privacy.resistFingerprinting
  • privacy.trackingprotection.fingerprinting.enabled
  • ...
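
For persistence across sessions, those prefs can also go in a user.js file in the Firefox profile folder; the profile path below is a placeholder, but the pref names are the real ones from about:config:

# Append hardening prefs to user.js (profile folder name varies per install)
cat >> ~/.mozilla/firefox/<profile>/user.js <<'EOF'
user_pref("privacy.resistFingerprinting", true);
user_pref("privacy.trackingprotection.fingerprinting.enabled", true);
EOF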

I thought I had pretty hardened security and a safe browsing experience, but oh my, I was wrong...

From pixel tracking, to WebRTC leaking your real IP, font fingerprinting, canvas fingerprinting, audio fingerprinting, Android's default keyboard sending samples, SSL certificates with known vulnerabilities...

And most of these are not even new tracking techniques... I mean, even Firefox 54 was aware of most of these ways of fingerprinting users, and it makes me feel like Firefox is just another evil corp hiding behind a fancy privacy facade! Uhhg...

And even if you somehow randomize those fingerprints and your user-agent and block most of those things, that makes you stand out from the mass, which makes you even easier to track or fingerprint. That's something I read recently and it actually makes sense... the best way to be somewhat invisible is to blend into the mass. If you stand out, you can be pretty sure you'll be noticed and identified (if that makes sense :/).

This really makes me depressed right now... It feels like a losing battle where my energy is just wasted trying to get some privacy and anonymity on the web... while fighting against the new laws knocking on our doors, and big tech companies always being two steps ahead...

I'm really asking myself if it matters and if it actually makes sense to use hardened technology or browsers like Arkenfox or the Tor Browser, whose exit nodes are reportedly often monitored by private and governmental institutions...

I'm probably overthinking and falling into a deep hole... But the more I dig into security and privacy, the more I get the feeling that this is an already lost battle against big tech...

A recent source:

https://avoidthehack.com/firefox-privacy-config

[–] deepdive@lemmy.world 1 points 10 months ago

RethinkDNS is probably your best bet! Right now it's missing an important feature: taking WireGuard's DNS configuration into account, which makes it unusable for those who have a private DNS in a local environment with an upstream DNS!

Can't wait for version 0.5.6 😄

[–] deepdive@lemmy.world 1 points 10 months ago (1 children)

Hey don't worry :)

Yeah, this could be a time saver in case you need to revoke certificates in your homelab setup! Imagine changing the root CA store on 20 devices... Ugh!

Happy reading/tweaking ! Have fun !

[–] deepdive@lemmy.world 2 points 10 months ago* (last edited 10 months ago) (3 children)

Certificate chain of trust: I assume you’re talking about PKI infrastructure and using root CAs + Derivative CAs? If yes, then I must note that I’m not planning to run derivative CAs because it’s just for my lab and I don’t need that much of infrastructure.

An intermediate CA isn't strictly needed with a self-signed CA, but it can be really useful. If you ever have to revoke your root CA, you have to replace that certificate on all your devices, which becomes a lot of hassle if you share that trusted root CA with family/friends. By adding an intermediate CA and hiding your root CA's private key somewhere offline, you take away that overhead: you just revoke the intermediate CA, sign a new one, and serve the new intermediate + server certificate bundle through the proxy. (Hope that makes sense? :|)
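
A minimal openssl sketch of that layout, with made-up file names (root.key/root.crt, int.key/int.crt) and assuming an openssl recent enough for -addext:

# Root CA: created once; root.key then goes offline (USB stick, safe...)
openssl req -x509 -newkey rsa:4096 -sha256 -days 3650 -nodes \
  -keyout root.key -out root.crt -subj "/CN=Home Lab Root CA" \
  -addext "basicConstraints=critical,CA:TRUE" \
  -addext "keyUsage=critical,keyCertSign,cRLSign"

# Intermediate CA: the only CA key that stays on the server
openssl req -newkey rsa:4096 -nodes -keyout int.key -out int.csr \
  -subj "/CN=Home Lab Intermediate CA"
openssl x509 -req -in int.csr -CA root.crt -CAkey root.key \
  -CAcreateserial -days 1825 -sha256 -out int.crt \
  -extfile <(printf 'basicConstraints=critical,CA:TRUE,pathlen:0\nkeyUsage=critical,keyCertSign,cRLSign')

# If int.key ever leaks you only replace int.crt and re-issue the leaf
# certificates -- root.crt in everyone's trust store stays untouched.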

I do not know what X.509 extensions are and why I need them. Could you tell me more?

This will probably give you a better explanation than I could :| I have everything written down in a markdown file, and reading through my notes I remember I had to set basicConstraints = CA:TRUE in my certificates to make them work in my Android root store! Some extensions are necessary for your root CA to work properly (like CA:TRUE). Also, if you want SAN (multi-domain) certificates, you have to put the domain names in your X.509 extensions.
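
To make that concrete, here are the two kinds of extension files I mean; the file names (ca.ext, leaf.ext) and domains are placeholders of mine, not an official recipe:

# ca.ext -- extensions for the CA certificate itself
# (Android refuses to import CA certs without basicConstraints CA:TRUE)
cat > ca.ext <<'EOF'
basicConstraints = critical, CA:TRUE
keyUsage = critical, keyCertSign, cRLSign
EOF

# leaf.ext -- extensions for a multi-domain (SAN) server certificate
cat > leaf.ext <<'EOF'
basicConstraints = CA:FALSE
keyUsage = digitalSignature, keyEncipherment
extendedKeyUsage = serverAuth
subjectAltName = DNS:home.lab, DNS:*.home.lab
EOF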

I'm also considering client certificates as an alternative to SSO, am I right in considering them this way?

Ohhh, I don't know... I haven't installed or used any SSO service; I'm thinking of MFA/SSO with Authelia in the future! My guess would be that those are two different technologies and could work together? A self-signed CA plus 2FA could probably work in a homelab, but I have no idea how, because I haven't tested it. One thing to consider if you want client certificates for your family/friends: use an intermediate CA, so that in case of revocation you don't have to replace the certificate in their root store every time you sign a new intermediate CA.

I’ll mention that I plan to run an instance of HAProxy per podman pod so that I terminate my encrypted traffic inside the pod and exclusively route unencrypted traffic through local host inside the pod.

I have no idea about HAProxy and podman and how they work to encrypt traffic. All my traffic passes through a WireGuard tunnel to my docker containers/proxy, which I consider safe enough? Listening to all my traffic with Wireshark, it seemed to do exactly what I'm expecting, but I'm not an expert :L so I cannot help you further on that topic. But I will keep your idea in my notes, to see if HAProxy and podman could further improve my setup compared to docker and traefik through a WireGuard tunnel.
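
For what it's worth, here is roughly how I'd double-check that nothing bypasses the tunnel; the interface name and port are assumptions about your setup:

# Show any packet on the WAN interface that is NOT WireGuard traffic
# (51820/udp is WireGuard's default port -- adjust to yours).
# Ideally only local chatter (DHCP, NDP...) shows up, never plaintext app traffic.
sudo tcpdump -ni eth0 'not (udp and port 51820) and not arp'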

Of course, that means that every pod on my network (hosting an HAProxy instance) will be given a distinct subdomain, and I will be producing certificates for specific subdomains, instead of using a wildcard.

OpenSSL SAN certificates are going to be a life/time saver in your setup! One certificate for multiple domains!


I'm just a hobby homelabber/tinkerer, so take everything with caution and always double-check with other sources! :) Hope it helps!


Edit

Thinking of your use case, I would personally create a root CA and an intermediate CA + certificate bundle. Put the root CA in the trusted store on all your devices and serve the intermediate CA/certificate bundle with your proxy of choice, signing the certificate with a SAN X.509 extension covering all your domains. Save your root CA's key somewhere offline to keep it safe!
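
In command form, those last steps might look like this (continuing the made-up file names from my earlier examples; a sketch, not gospel):

# Sign one server certificate with the intermediate, using the SAN extensions
openssl req -newkey rsa:2048 -nodes -keyout server.key -out server.csr \
  -subj "/CN=home.lab"
openssl x509 -req -in server.csr -CA int.crt -CAkey int.key \
  -CAcreateserial -days 365 -sha256 -out server.crt -extfile leaf.ext

# The proxy serves leaf + intermediate; clients complete the chain with
# root.crt from their own trust store.
cat server.crt int.crt > fullchain.pem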

The links I gave you are very useful, but every bit of information is a bit scattered and you have to combine it yourself. Still, it's a gold mine of information!

[–] deepdive@lemmy.world 2 points 10 months ago

Yeah... and sometimes you find some utterly shitty people who use multiple accounts to comment-shame you, or who think they are better than you while having a conversation with themselves on your post! Uhhhg!

[–] deepdive@lemmy.world 12 points 10 months ago* (last edited 10 months ago) (7 children)

If you want to run your own PKI with self-signed certificates in your homelab, I really encourage you to read through this tutorial. There is a lot to process and read, and it will take you some time to set everything up and understand all the terminology, but after that you get:

  • Your own self-signed certificates with SAN wildcards (https://*.home.lab)
  • Certificate chain of trust
  • CSR with your own configuration
  • CRL and certificate revocation
  • X.509 extensions

After everything is in place, you can write your own script that revokes, renews, and generates your certificates, but that is another story!
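
As a hedged illustration of what such a script could look like, assuming an existing intermediate CA with made-up file names int.crt/int.key (proper revocation also needs an openssl ca database and a CRL, which I left out for brevity):

#!/usr/bin/env bash
# newcert.sh -- hypothetical helper: issue one SAN certificate per service
# Usage: ./newcert.sh grafana  ->  grafana.home.lab, signed by the intermediate
set -euo pipefail
NAME="$1"

openssl req -newkey rsa:2048 -nodes \
  -keyout "${NAME}.key" -out "${NAME}.csr" -subj "/CN=${NAME}.home.lab"

openssl x509 -req -in "${NAME}.csr" -CA int.crt -CAkey int.key \
  -CAcreateserial -days 365 -sha256 -out "${NAME}.crt" \
  -extfile <(printf 'subjectAltName=DNS:%s.home.lab' "${NAME}")

# Bundle for the reverse proxy (leaf + intermediate)
cat "${NAME}.crt" int.crt > "${NAME}.fullchain.pem"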

Put everything behind your reverse proxy of choice (traefik in my case) and serve all your docker services with your own self-signed wildcard certificates ! It's complex but if you have spare time and are willing to learn something new, it's worth the effort !

Keep in mind to never expose such certificates to the wild wild west (the public internet)! Keep those certificates in a closed homelab that you only access through a secure tunnel on your LAN!

edit

Always take notes to keep track of what you did and how you solved issues, and always make some visuals to get a better understanding of how things work!

[–] deepdive@lemmy.world 2 points 10 months ago (1 children)

Step CA is really nice if you want to learn more about how a real CA works. I had some fun playing with it, but yeah, it's a bit overkill for a homelab xD.

You can achieve the same result with openssl with less complexity !

[–] deepdive@lemmy.world 3 points 10 months ago (1 children)

Then, I tried ownCloud for the first time. Wow, it was fast! Uploading an 8GB folder took just 3 minutes compared to the 25 minutes it took with Nextcloud. Plus, everything was lightning quick on the same machine. I really loved using it. Unfortunately, there’s currently a vulnerability affecting it, which led me to uninstall it.

I have no idea how you access your self-hosted services, but WireGuard could help you access all your services from all your devices, with fewer security risks and only one exposed entry point (the WireGuard port). This also takes away most of the vulnerabilities you could be exposed to, because you access all your home services through a secure tunnel without directly exposing the API ports on your router!
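
For anyone curious, a minimal client-side sketch (run as root); the keys, addresses, and endpoint are placeholders, and the server needs a matching [Peer] entry:

# /etc/wireguard/home.conf -- hypothetical client config
cat > /etc/wireguard/home.conf <<'EOF'
[Interface]
PrivateKey = <client-private-key>
Address = 10.8.0.2/32
DNS = 10.8.0.1                # e.g. AdGuard Home, reachable inside the tunnel

[Peer]
PublicKey = <server-public-key>
Endpoint = my-home.example.org:51820
AllowedIPs = 10.8.0.0/24, 192.168.1.0/24  # tunnel subnet + home LAN
PersistentKeepalive = 25
EOF

wg-quick up home  # 51820/udp stays the only port exposed on the router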

I personally run all my services with docker-compose + traefik + self-signed CA certificates + AdGuard Home DNS rewrites, and access all my services through https://service.home.lab on all my devices! It took me some time to set everything up nicely, but right now I'm pretty happy with how everything works!

About the current ownCloud vulnerability: they already took some measures, and the new docker image has the phpinfo fix (uhhg). Also, while I wouldn't take their word for granted:

"The importance of ownCloud’s open source in the enterprise and public-sector markets is embraced by both organizations.”

[–] deepdive@lemmy.world 7 points 10 months ago (1 children)

That's why exposing your home services to the internet is a bad idea. Accessing them through a secure tunnel is the way to go.

Also, they already "fixed" the docker image with an update, something to do with phpinfo...

[–] deepdive@lemmy.world 1 points 10 months ago (2 children)

I used Nextcloud for a year or so, but found the web GUI/apps slow, bloated, and sometimes way too buggy! I switched to ownCloud for the simplicity of having just a cloud file system without too much bloat.

I just read through the Seafile documentation, and yeah, that is also not going to happen. Maybe I should switch to a simple WebDAV server...

[–] deepdive@lemmy.world -1 points 10 months ago (4 children)

Hummm... This kinda sucks! Moving to Seafile then! I hope their native apps are as good as ownCloud's!

22
submitted 11 months ago* (last edited 10 months ago) by deepdive@lemmy.world to c/privacy@lemmy.ml
 

Hi everyone !

Right now I use:

  • Firefox's full protection with everything blocked by default
  • AdGuard adblocker extension
  • Adguardhome DNS blocker
  • ProtonVPN through WireGuard
  • Self-hosted SearXNG instance (a metasearch engine)

While this gives me a reasonable level of protection/privacy, it blocks me from interacting with FOSS projects on GitHub, which kinda sucks!! I don't want to accept GitHub's long list of tracking and statistics cookies, but not being able to interact with and help FOSS projects thrive, improve, and get some visibility will hurt FOSS projects in the long term.

I'm aware of GitHub's cookie management preferences, but I don't trust them to manage and choose what should be accepted or not !

Firefox only lets you block or accept everything, and extensions can only delete cookies after the fact. I couldn't find any workaround for this issue.

Q: Is there any way to accept only the cookies that let me log in and interact with repos, without accepting the tracking and analytics cookies?

If you have any solution/workaround to share, I'm all ears !


Edit

I learned a few new things today:

  • The AdGuard AdBlocker extension for Firefox allows you to block cookies before they enter your system
  • User-agent spoofing addons exist
  • Firefox's privacy.fingerprintingProtection is not activated by default for everything

– How to block specific cookies with the Adguard Adblocker extension

⚠️ This can and will cause websites to malfunction if you block the wrong cookies ⚠️

To find out which specific cookie you want to block, you first need to know its name. In Firefox, open the application menu -> More Tools -> Web Developer Tools, OR right-click -> Inspect (keyboard shortcuts depend on your system).

In the Web Developer Tools window, go to Storage -> Cookies.

(screenshot: GitHub's cookies shown in the Storage tab)

Once you've found the additional non-essential cookies you want to block, add them to the AdGuard user rules:

! Block individual GitHub cookies by name ($cookie=<name>)
||github.com/$cookie=tz
||github.com/$cookie=preferred_color_mode
||github.com/$cookie=color_mode
||github.com/$cookie=saved_user_sessions
! Block requests to github.com made as a third party from other sites
||github.com/^$third-party

To read more about how to create your own ad filters, read the official documentation.

– User Agent spoofing

User agent string switcher

This extension allows you to spoof your browser's "user-agent" string to a custom value, making it much harder for websites to learn specific details about your browsing setup.

– Firefox about:config privacy.fingerprintingProtection = true

Firefox's documentation is pretty straightforward, but here is what they say about it:

However, the Canvas Permission Prompt is not the only thing that Fingerprinting Protection is doing. Fingerprinting Detection changes how you are detected online:

  • Your timezone is reported to be UTC
  • Not all fonts installed on your computer are available to webpages
  • The browser window prefers to be set to a specific size
  • Your browser reports a specific, common version number and operating system
  • Your keyboard layout and language is disguised
  • Your webcam and microphone capabilities are disguised
  • The Media Statistics Web API reports misleading information
  • Any Site-Specific Zoom settings are not applied
  • The WebSpeech, Gamepad, Sensors, and Performance Web APIs are disabled

Type about:config in the address bar and press Enter/Return. A warning page may appear; click "Accept the Risk and Continue" to go to the about:config page. Search for privacy.resistFingerprinting and set it to true. You can double-click the preference or click the Toggle button to change the setting.

If it is bolded and already set to true, you, or an extension you installed, may have enabled this preference. If you discover the setting has become re-enabled, it is likely a Web Extension you have installed is setting it for you.


Closing thoughts

This may seem overkill to some people, and I get it, but if you are really concerned about your privacy/security, there is no such thing as "one-click/done" privacy. It's hard work and an everyday battle with E-corp and other hidden institutions that gather every bit of fingerprint/trace you leave behind! I hope this long edit will help some people have more private and safer web browsing!

 

Hi everybody !

While I really like Google Calendar's simple and sleek web GUI and functionality, I'm more and more concerned about my data and privacy. Even if I have nothing to hide, I no longer agree to freely and consciously hand over my data to any GAFAM.

Has anyone any alternative to google calendar?

  • Free and, if possible, open source? It can have some discreet sponsors/ads, as long as they aren't too intrusive.
  • Todoist integration
  • Sync between devices
  • The GUI doesn't have to be PERFECT, but a bare minimum to please my candy eyes!
  • Can be an API, web app... doesn't matter, as long as it syncs between devices (Android, Mac, Windows, Linux)

I already searched the web but couldn't find any conclusive alternative; maybe someone knows of a hidden gem :)

Thank you !


EDIT: The solution and compromise: Nextcloud. It took me some time (2 days) to set it up correctly and make it work as intended.

  • Android calendar sync with DAVx5
  • Calendar notifications in Android's native calendar app
  • Two-way sync between the Android calendar and the Nextcloud calendar
  • Push notifications in the Nextcloud web interface

A few things to keep in mind:

1 — If you build your Nextcloud instance with docker-compose:

2 — Android permissions to sync with your calendar

  • DAVx5 mentions how to allow syncing seamlessly
    • It's different for every android phone
    • Battery power mode
    • Work in the background
    • ...

3 — It won't work with todoist

  • Todoist is proprietary and won't work with DAVx5 and Nextcloud
  • Alternative: jtx Board! (built by the same devs as DAVx5; seems to work similarly)

Conclusion: Nextcloud isn't as good as the cloud sync provided by Google/Todoist and the other GAFAM clouds. It has its quirks and needs some attention to make it work as intended. It takes some time, reading, and tinkering, but those are compromises I'm willing to make :)
