
Description

Connecting with FUTURE profile to common websites


How to test

It would be great if you could try connecting to sites that you normally connect to, but I understand if you have privacy concerns.
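
Before collecting and testing sites, make sure the FUTURE policy is actually active. A minimal sketch, assuming the update-crypto-policies tool from Fedora's crypto-policies package (adjust if your setup differs):

      # Show the currently active system-wide crypto policy
      update-crypto-policies --show

      # Switch to the FUTURE policy (as root), then restart the browser
      # so it picks up the stricter back-end configuration
      sudo update-crypto-policies --set FUTURE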

  1. TODO provide default file
  2. Firefox - collect sites you have actually visited
    1. Export https sites from history
      # Close Firefox first (or work on a copy, as in the Chrome step below),
      # otherwise places.sqlite may be locked and sqlite3 will fail
      for f in $(find ~/.mozilla/firefox/ -name places.sqlite); do
        sqlite3 "$f" 'select distinct substr(replace(url, "https://", ""), 0, instr(replace(url, "https://", ""), "/")) from moz_places where url like "https://%";' >>sites.txt
      done
    2. Alternatively/additionally, get https sites from bookmarks (Bookmarks -> Show All Bookmarks -> Import and Backup -> Export Bookmarks to HTML)
      grep -io 'href="https://[^ ]*' bookmarks.html | cut -d\" -f2 | sed 's|https://\([^/]*\).*|\1|' >>sites.txt
  3. TODO export from Chrome
  4. Chrome - collect sites you have actually visited
    1. Export https sites from history
      # Copy the database first: Chrome keeps History locked while it is running
      for f in $(find ~/.config/ -name History); do
        cp -f "$f" ./tmp.db && \
          sqlite3 tmp.db 'select distinct substr(replace(url, "https://", ""), 0, instr(replace(url, "https://", ""), "/")) from urls where url like "https://%";' >>sites.txt
      done
    2. Alternatively/additionally, get https sites from bookmarks (Bookmarks -> Bookmark Manager -> Organize -> Export bookmarks to HTML file)
      grep -io 'href="https://[^ ]*' bookmarks.html | cut -d\" -f2 | sed 's|https://\([^/]*\).*|\1|' >>sites.txt
    3. Filter possible duplicates
      sort -u sites.txt >sites.new && mv -f sites.new sites.txt
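
With sites.txt assembled by the steps above, one possible way to exercise the FUTURE profile is to attempt an HTTPS connection to every collected host. This is only a sketch and assumes curl is available; the test case does not prescribe a particular client:

      while read -r site; do
        # skip empty entries (URLs without a path produce empty host names above)
        [ -z "$site" ] && continue
        if curl -sS --max-time 10 -o /dev/null "https://$site"; then
          echo "OK   $site"
        else
          echo "FAIL $site" >>failed.txt
        fi
      done <sites.txt

Hosts listed in failed.txt are the ones worth re-checking in the browser, to see whether the failure comes from the stricter FUTURE settings or from an unrelated network problem.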

Expected Results

  1. TODO