From Fedora Project Wiki


Latest revision as of 06:54, 30 March 2017

Description

Connecting to common websites with the FUTURE crypto policy profile


How to test

It would be great if you could try connecting to sites that you normally connect to, but I understand if you have privacy concerns.

  1. For some of the optional steps, the sqlite package might be needed
    dnf install -y sqlite
  2. Let's start with some well-known sites
    echo google.com youtube.com facebook.com wikipedia.org yahoo.com amazon.com live.com vk.com twitter.com instagram.com reddit.com linkedin.com |tr " " "\n" >>sites.txt
  3. Firefox - gather the sites you actually visited
    1. Export https sites from history
      for f in $(find ~/.mozilla/firefox/ -name places.sqlite); do
      sqlite3 $f 'select distinct substr(replace(url, "https://", ""), 0, instr(replace(url, "https://", ""), "/")) from moz_places where url like "https://%";' >>sites.txt
      done
    2. Alternatively/additionally, get https sites from bookmarks (Bookmarks -> Show All Bookmarks -> Import and Backup -> Export Bookmarks to HTML)
      cat bookmarks.html |grep -io 'href="https://[^ ]*' |cut -d\" -f2 |sed 's|https://\([^/]*\).*|\1|' >>sites.txt
  4. Chrome - gather the sites you actually visited
    1. Export https sites from history
      for f in $(find ~/.config/ -name History); do
      cp -f "$f" ./tmp.db &&
      sqlite3 tmp.db 'select distinct substr(replace(url, "https://", ""), 0, instr(replace(url, "https://", ""), "/")) from urls where url like "https://%";' >>sites.txt
      rm -f tmp.db
      done
    2. Alternatively/additionally, get https sites from bookmarks (Bookmarks -> Bookmark manager -> Organize -> Export bookmarks to HTML file...)
      cat bookmarks.html |grep -io 'href="https://[^ ]*' |cut -d\" -f2 |sed 's|https://\([^/]*\).*|\1|' >>sites.txt
  5. Filter possible duplicates
    cat sites.txt |sort |uniq >sites.new; mv -f sites.new sites.txt
  6. Try connecting to these sites with FUTURE profile
    update-crypto-policies --set FUTURE
    for site in $(cat sites.txt); do
    wget -q -O /dev/null https://$site || echo "FAIL wget $site"
    curl -s https://$site >/dev/null || echo "FAIL curl $site"
    (sleep 5; echo -e "GET / HTTP/1.1\n\n") |openssl s_client -connect ${site}:443 -servername $site &>/dev/null || echo "FAIL s_client $site"
    done
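The host-extraction query used in the history steps above can be sanity-checked against a throwaway database before running it on a real profile. A minimal sketch, assuming the sqlite package is installed; the table name moz_places and the substr/instr logic come from the Firefox step, rewritten here with standard single-quoted SQL string literals (test.db is just a scratch file):

```shell
# Build a scratch DB shaped like Firefox's places.sqlite and run the
# same distinct-host extraction as in the history steps above.
sqlite3 test.db <<'EOF'
CREATE TABLE moz_places (url TEXT);
INSERT INTO moz_places VALUES
  ('https://example.com/some/page'),
  ('https://example.com/other'),
  ('http://plain.example/x');
SELECT DISTINCT substr(replace(url, 'https://', ''), 0,
                       instr(replace(url, 'https://', ''), '/'))
  FROM moz_places WHERE url LIKE 'https://%';
EOF
rm -f test.db
```

Note that a URL with no "/" after the host makes instr() return 0 and yields an empty string; in practice history entries are stored normalized with at least a trailing slash, so this rarely matters.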

Expected Results

Ideally, all connections succeed and no FAIL lines are printed.
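One follow-up worth noting: update-crypto-policies --set FUTURE changes the system-wide policy persistently, so once testing is finished you will probably want to switch back. A short sketch using the standard crypto-policies commands (run as root):

```shell
# Restore the default system-wide crypto policy after testing
update-crypto-policies --set DEFAULT
# Confirm which policy is currently active
update-crypto-policies --show
```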