• 1 Post
  • 140 Comments
Joined 1 year ago
Cake day: July 18th, 2023


  • Opt-in should be mandatory for all services and data sharing. I would start my transition to Linux today if this were opt-out, though the way Apple handles this for other services makes me believe opt-in will be temporary.

    Currently, when you set up any device as new, even with an offline/local user on macOS, the moment you log into iCloud it opts almost every app and service into iCloud, even ones you have never used and have always disabled on every device. There’s seemingly no way to prevent this behavior on any device, let alone at the account level.

    Currently, even though my iPhone and language support offline (on-device) Siri, and I’ve disabled all analytics sharing options, I must still agree to Apple’s data sharing and privacy policy to use Siri. Why would I need to agree to a privacy policy if I only want to use Siri offline, locally on my device, with access to Apple’s servers or anything external to the content on my phone disabled? Likely because if you enable Siri, it is auto-enabled (opted in) for every app and service on your device. Again, there’s no way to disable this behavior.

    I understand the majority of users do not care about privacy or surveillance capitalism, but for me to trust and use a personal AI assistant baked into my device’s OS, I need the ability to make it 100% offline, plus fine-grained network control for individual apps and processes, including all of the OS’s processes. It would not be difficult to add a toggle at login to “enable iCloud/Siri for all apps/services” or “let me choose which apps/services to use with iCloud/Siri, individually”. Apple needs stronger and clearer offline controls in all its software, period.



  • Lucky! I finally got my mum to use the password manager I admin, but she still reuses the same dozen passwords for everything and manually enters them in… sigh. I’ve set strong passwords and 2FA for all critical accounts, so I just let her be a moron with the rest of them.

    Computers break her brain. She literally responds with questions like “it’s IN the computer?” Zoolander style. I just do most of her shit myself because it’s less painful than trying to teach her.





  • WhatAmLemmy@lemmy.world to Asklemmy@lemmy.ml · Search engines down?
    I was thinking about this and imagined the federated servers handling the index DB, search algorithms, and search requests, but leveraging each user’s browser/compute to do the actual web crawling/scraping/indexing; the server simply performs CRUD operations to store the clients’ processed data in the index DB. This approach would target the core reason why search engines fail (the cost of scraping and processing billions of sites), reduce the cost of hosting a search server, and spread the expense across the user base.

    It may also have the added benefit of hindering surveillance capitalism, thanks to a sea of junk queries coming from every client, especially if the crawler requests were made from the same browser (obviously isolated from the user’s own data, extensions, queries, etc.). The federated servers would also probably need to operate as lighthouses that orchestrate the domains and IP ranges to crawl, and efficiently distribute the workload to client machines.
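    A minimal sketch of that split, assuming a toy in-memory “lighthouse” server; every name here (Lighthouse, CrawlAssignment, IndexEntry, submit_index_entries) is hypothetical and just stands in for whatever API a real federated server would expose:

    ```python
    # Hypothetical sketch of the lighthouse/client split described above.
    from dataclasses import dataclass
    from typing import Dict, List

    @dataclass
    class CrawlAssignment:
        """A small unit of work handed to a client's browser/compute."""
        client_id: str
        domains: List[str]  # domains or IP ranges the client should crawl

    @dataclass
    class IndexEntry:
        """Processed result a client sends back; the server only stores it."""
        url: str
        terms: Dict[str, int]  # term -> frequency, extracted client-side

    class Lighthouse:
        """Federated server: orchestrates crawl work and does CRUD on the index DB."""

        def __init__(self, crawl_frontier: List[str]):
            self.frontier = list(crawl_frontier)    # domains still to be crawled
            self.index: Dict[str, IndexEntry] = {}  # toy stand-in for the index DB

        def assign_work(self, client_id: str, batch_size: int = 5) -> CrawlAssignment:
            # Hand out the next slice of the frontier; clients do the expensive
            # fetching/parsing/indexing so the server never pays that cost.
            batch, self.frontier = self.frontier[:batch_size], self.frontier[batch_size:]
            return CrawlAssignment(client_id=client_id, domains=batch)

        def submit_index_entries(self, entries: List[IndexEntry]) -> None:
            # "CRUD on processed data": upsert whatever the client already parsed.
            for entry in entries:
                self.index[entry.url] = entry

        def search(self, term: str) -> List[str]:
            # Server-side search over the aggregated index.
            return [url for url, entry in self.index.items() if term in entry.terms]

    # Usage: a client asks for work, "crawls" locally, and pushes back processed entries.
    lighthouse = Lighthouse(crawl_frontier=["example.org", "example.net"])
    work = lighthouse.assign_work(client_id="client-42")
    lighthouse.submit_index_entries(
        [IndexEntry(url=f"https://{d}/", terms={"search": 3}) for d in work.domains]
    )
    print(lighthouse.search("search"))  # ['https://example.org/', 'https://example.net/']
    ```

    The point is the division of labor: the server only hands out slices of the crawl frontier and upserts whatever the clients have already fetched and parsed, so the expensive work never lands on the host.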





  • a) why the fuck would they go to that effort for a filthy commoner like yourself, and b) what are the chances that 0.01% of recoverable data contains anything useful!?!

    Nobody is gonna bother doing advanced forensics on 2nd-hand storage, digging into megabytes of reallocated sectors on the off chance they find something financially exploitable. That’s a level of paranoia no data supports.

    My example applies to storage devices which don’t default to encryption (most non-OS external storage). It’s analogous to changing your existing encrypted disk’s password to a random-ass, unrecoverable throwaway.




    For all average user requirements that just involve backups, PII docs, your sex vids, etc. (i.e. not someone who could be persecuted, prosecuted, or murdered for their data), your best bet (other than physical destruction) is to encrypt every usable bit on the drive.

    1. Download VeraCrypt.
    2. Format the SSD as exFAT.
    3. Create a new VeraCrypt volume on the mounted exFAT partition that uses 100% of available space (any format).
    4. Open a notepad and type out a long, random-ass throwaway password, e.g. $-963,;@82??/@;!3?$.&$-,fysnvefeianbsTak62064$@/lsjgegelwidvwggagabanskhbwugVg, copy it, then close/delete it without saving (a scripted alternative is sketched below).
    5. Paste that password for the new VeraCrypt volume, and follow the prompts until it starts encrypting your SSD. It’ll take a while as it encrypts every available bit one-by-one.

    Even if VeraCrypt hits a free-space error at the end of the task, the job is done. Maybe not 100%, but 99.99+% of the space on the SSD is overwritten with indecipherable gibberish. Maybe advanced forensics could recover some bits, but a) why the fuck would they go to that effort for a filthy commoner like yourself, and b) what are the chances that 0.01% of recoverable data contains anything useful!?! You don’t really need to bother destroying the header encryption key (as Apple and Android products do when you wipe a device), since you don’t know the password and there isn’t a chance in hell you or anyone else is gonna guess it, nor brute force it.
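    If you’d rather not trust keyboard mashing for step 4, here’s a minimal sketch that does the same job with a CSPRNG; the 80-character length is an arbitrary choice, and nothing about it is specific to VeraCrypt:

    ```python
    # Sketch: generate a long throwaway password with a CSPRNG instead of keyboard mashing.
    # Paste it into VeraCrypt once, then close the terminal; never store it anywhere.
    import secrets
    import string

    alphabet = string.ascii_letters + string.digits + string.punctuation
    throwaway = "".join(secrets.choice(alphabet) for _ in range(80))  # 80 chars is arbitrary
    print(throwaway)
    ```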




  • More specifically — by matching all of your purchases with every data point about you, and the people you associate with, they can build out statistical models to determine the data points that most strongly correlate to people like you and what motivates or persuades them to GENERATE MORE PROFIT — direct purchases, attention, clicks, discussion, word of mouth, lead generation, etc, etc…

    Or… You know… What motivates/persuades you to support a dictator or vote for a specific political candidate or party, as well as what demotivates/dissuades you from the opposition — including which types of psychological warfare, disinformation, deepfakes, “fake news”, and lies are most successful in achieving these goals — all of which benefits the most horrifically immoral and unethical criminals, corporations, and individuals (the “good guys” aren’t the ones using deception and lies to profit and win).



  • The moment of divergence between the clone and the original is instantaneous. The only way it could not be instantaneous is if both were just brains connected to the exact same simulation, experiencing the exact same inputs. If they didn’t respond the same, then they aren’t an exact clone. Even then, the brains would be sustained by different blood, made up of trillions of slightly different atoms — although similar, not 100% identical due to quantum mechanics — with slightly different fluid dynamics. Actually, the only way they could be identical is if they weren’t brains at all but identical code, running in an identical simulation, with the exact same boundaries, and no possibility of probability, chaos, or divergence from that code… Oh no, I’ve gone cross-eyed.