Description

Connecting to common websites with the FUTURE crypto policy profile
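
For reference, you can check which system-wide crypto policy is currently active at any point during the test; after step 6 below switches the policy, this should report FUTURE:
  update-crypto-policies --show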


How to test

It would be great if you could try connecting to the sites you normally visit, but I understand if you have privacy concerns; in that case the well-known sites from step 2 are enough.

  1. For some of the optional steps below, the sqlite package might be needed
    dnf install -y sqlite
  2. Let's start with some well-known sites
    echo google.com youtube.com facebook.com wikipedia.org yahoo.com amazon.com live.com vk.com twitter.com instagram.com reddit.com linkedin.com |tr " " "\n" >>sites.txt
  3. Firefox - collect the sites you actually visit
    1. Export https sites from history (Firefox may keep the database locked while it is running, so work on a copy, as in the Chrome step below)
      for f in $(find ~/.mozilla/firefox/ -name places.sqlite); do
      cp -f $f ./tmp.db && \
      sqlite3 tmp.db 'select distinct substr(replace(url, "https://", ""), 0, instr(replace(url, "https://", ""), "/")) from moz_places where url like "https://%";' >>sites.txt
      rm -f tmp.db
      done
    2. Alternatively/additionally, get https sites from bookmarks (Bookmarks -> Show All Bookmarks -> Import and Backup -> Export Bookmarks to HTML)
      cat bookmarks.html |grep -io 'href="https://[^ ]*' |cut -d\" -f2 |sed 's|https://\([^/]*\).*|\1|' >>sites.txt
  4. Chrome - collect the sites you actually visit
    1. Export https sites from history (Chrome keeps the database locked while it is running, hence the copy)
      for f in $(find ~/.config/ -name History); do
      cp -f $f ./tmp.db && \
      sqlite3 tmp.db 'select distinct substr(replace(url, "https://", ""), 0, instr(replace(url, "https://", ""), "/")) from urls where url like "https://%";' >>sites.txt
      rm -f tmp.db
      done
    2. Alternatively/additionally, get https sites from bookmarks (Bookmarks -> Bookmark manager -> Organize -> Export bookmarks to HTML file...)
      cat bookmarks.html |grep -io 'href="https://[^ ]*' |cut -d\" -f2 |sed 's|https://\([^/]*\).*|\1|' >>sites.txt
  5. Filter out duplicates (and any empty lines the history export may have produced)
    cat sites.txt |grep -v "^$" |sort |uniq >sites.new; mv -f sites.new sites.txt
  6. Try connecting to these sites with the FUTURE profile (setting the policy requires root; a short cleanup sketch for switching back follows this list)
    update-crypto-policies --set FUTURE
    for site in $(cat sites.txt); do
    wget -q -O /dev/null https://$site || echo "FAIL wget $site"
    curl -s https://$site >/dev/null || echo "FAIL curl $site"
    (sleep 5; echo -e "GET / HTTP/1.1\n\n") |openssl s_client -connect ${site}:443 -servername $site &>/dev/null || echo "FAIL s_client $site"
    done
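
Once the run is finished you will probably want to switch the machine back to the stock policy. This is a minimal cleanup sketch, assuming DEFAULT is the policy you had before the test (it is the out-of-the-box policy on Fedora); adjust the name if you had something else set:
  update-crypto-policies --set DEFAULT
  update-crypto-policies --show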

Expected Results

Ideally, all sites connect without problems, i.e. the test loop in step 6 prints no FAIL lines.
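
If a FAIL line does appear for a site, a quick way to see why (a debugging sketch, not part of the pass/fail criteria; the site name below is a placeholder for whichever host failed) is to run openssl s_client by hand and look at the negotiated protocol and cipher, or the handshake error, in its output:
  site=failing.example.com   # placeholder: replace with the host that failed
  openssl s_client -connect ${site}:443 -servername $site </dev/null 2>&1 |grep -e "Protocol" -e "Cipher" -e "alert" -e "error"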