Solved: Migrating Chrome search engines to Vivaldi
-
Thank you very much for the super quick reply @Pathduck, you're a true hero!
After copying the table, using
UPDATE main.keywords
SET position = substr(hex(randomblob(30)), 1, 60) -- generates a random hex value 60 characters long
WHERE position IS NULL;
did the trick. Order doesn't matter to me.
Thanks for the additional information as well.
Seems a strange data format for such a value, but OK...
Oh hell yes, I have no idea how that even maps.
In the sense of reproducibility and to make it easier for others, do you think this will become a built-in feature of the import tool in the near future, or should we document the necessary steps somewhere in the documentation? Or maybe this forum post is enough for people looking to do the same to stumble over?
P.S. I was using JetBrains DataGrip to import data from one table to the other, so I don't have the exact SQL statement lying around for that step. Unfortunately I'll have to leave this as an exercise for the next reader
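For anyone without DataGrip, a plain-SQL alternative (just a rough sketch, untested on my side) would be to attach Chrome's database from within the Vivaldi one and copy the table across - adjust the Chrome path for your system:
-- Rough, untested sketch: run against Vivaldi's Web Data; the Chrome path is only an example
ATTACH DATABASE 'C:\Users\<user>\AppData\Local\Google\Chrome\User Data\Default\Web Data' AS chrome;
CREATE TABLE keywords_temp AS SELECT * FROM chrome.keywords;
DETACH DATABASE chrome;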
-
@JosXa No worries, glad it helped
I wasn't even sure if it would work, as I've never done an import like that. I mostly use a GUI SQLite tool to modify existing entries and do a backup once in a while.
Looks like you know your SQL and the right tools to use. This isn't something I'd expect general users to be able to do at all, even with documentation.
In the sense of reproducibility and to make it easier for others, do you think this will become a built-in feature of the import tool in the near future
No idea, and doing stuff this way is definitely not supported. Unless you know how to use the proper tools, there's no way to do this other than manually. AFAIK no browser does search engine imports.
or should we document the necessary steps somewhere in the documentation?
Trying to make "fail-proof" documentation on how to do this would be practically impossible, and after all it's not something users would be expected to do often. There are just too many variables and steps that could fail.
And how many users even have that amount of custom search engines needing import?
I have about 70 custom search engines; I just back up Web Data regularly and do an SQL export just to be sure.
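The SQL export can be as simple as something like this (with the browser closed; the output file name is just an example):
sqlite3.exe "%LOCALAPPDATA%\Vivaldi\User Data\Default\Web Data" ".dump keywords" > keywords-backup.sql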
There's also a backup tool, no idea if it still works since Vivaldi migrated to using Web Data instead of LDB for custom engines.
https://forum.vivaldi.net/topic/35443/backup-search-engines
It just grabs the engines as JSON to the clipboard, and they can be imported later.
Search engines are in Sync now at least, so you don't really have to think about it now that Chrome is out of the picture for good.
-
@Pathduck Alright, so at least for the technically versed we have this thread to fall back on
-
Thank you @JosXa and @Pathduck for doing all the hard work. This is lucky timing for me since I just decided to switch from Chrome to Vivaldi yesterday and was already dreading moving ~150 search engines/keywords.
I've summarized the steps below to hopefully help the next person. As suggested above, it would be great if this were automated, but I'm not sure whether there are other challenges there.
Steps:
- Download your preferred SQLite client.
- Close both Chrome and Vivaldi. Web Data databases are locked when browsers are open.
- Using your SQLite client, connect to the Web Data databases for both Chrome and Vivaldi.
- Chrome DB file (Windows):
%LOCALAPPDATA%\Google\Chrome\User Data\Default\Web Data
- Vivaldi DB file (Windows):
%LOCALAPPDATA%\Vivaldi\User Data\Default\Web Data
- Copy the keywords table from Chrome to keywords_temp in Vivaldi.
  - The technique for this varies by client. If using DataGrip, connect to both and use the "Import" feature. For other clients, consider exporting the table from Chrome in a suitable format (e.g., DDL and DML, CSV, etc.) and then importing it as keywords_temp in Vivaldi.
- Run the below on Vivaldi's database:
begin transaction;

insert into keywords
select
  (select max(k.id) from keywords k) + row_number() over (order by kt.id) as id,
  kt.short_name,
  kt.keyword,
  kt.favicon_url,
  kt.url,
  -- Do not auto-delete engines on browser update
  0 as safe_for_autoreplace,
  kt.originating_url,
  kt.date_created,
  kt.usage_count,
  kt.input_encodings,
  kt.suggest_url,
  kt.prepopulate_id,
  kt.created_by_policy,
  kt.last_modified,
  -- Ensure sync_guid is unique
  printf('%s-%s-%s-%s-%s',
         lower(hex(randomblob(4))), lower(hex(randomblob(2))), lower(hex(randomblob(2))),
         lower(hex(randomblob(2))), lower(hex(randomblob(6)))) as sync_guid,
  kt.alternate_urls,
  kt.image_url,
  kt.search_url_post_params,
  kt.suggest_url_post_params,
  kt.image_url_post_params,
  kt.new_tab_url,
  kt.last_visited,
  kt.created_from_play_api,
  kt.is_active,
  -- Ensure position is unique; increment over the highest existing position.
  -- It's unclear if the order of this value actually matters; the runtime position
  -- in Vivaldi appears to rely on the ID only.
  -- Alternative: substr(hex(randomblob(30)), 1, 60)
  printf('%s%d', (select max(k.position) from keywords k), row_number() over (order by kt.id)) as position,
  kt.starter_pack_id,
  kt.enforced_by_policy
from keywords_temp kt;

drop table keywords_temp;

commit transaction;
- Open Vivaldi and verify search engines are properly configured: vivaldi://settings/search/
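Optionally, before opening Vivaldi, you can run a quick sanity check in SQL (just a rough check, not exhaustive) to confirm the inserted rows kept sync_guid and position unique:
select count(*) as total_rows,
       count(distinct sync_guid) as distinct_sync_guids,
       count(distinct position) as distinct_positions
from keywords;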
-
@mike12489 That's awesome
I admit my SQL is a bit rusty, but I think I understand at least some of what that does
What would be even better for users who are not comfortable with technical stuff is a batch/shell script to perform the same - export from Chrome, import to Vivaldi. It would have to use the sqlite3 command-line client, I guess.
While researching earlier, I did find:
https://ludovic.chabant.com/blog/2011/01/18/poor-mans-search-engines-sync-for-google-chrome-version-2/
This is only for Chrome -> Chrome obviously, but it might be possible to adapt it to Vivaldi.
-
Also @JosXa you might want to mark this topic as "Solved" if you're satisfied with the answer(s)?
I think there's no point keeping it "Unsolved" until Vivaldi implements a Search Engine import/export - after all, this is already in Sync. It might take a long time or possibly "never" happen
-
@mike12489 Many thanks!
I was able to import my search engines from Opera with sqlite-tools, thanks to your code.
-
@josemuk Hello and Welcome to the Vivaldi Community
You did this only with the command-line sqlite3 tool? If so, would you care to share your steps and command lines for doing it?
-
- Download sqlite-tools from the SQLite Download Page.
- Create a folder and unpack the archive there.
- Get your Chrome, Opera... Web Data file and copy it to that folder as chrome.db.
- Be sure Vivaldi is closed, get your Vivaldi Web Data file (c:\Users\<user>\AppData\Local\Vivaldi\User Data\Default\Web Data) and copy it to that folder as vivaldi.db.
- Open a shell, go to that folder and execute:
sqlite3.exe -header -csv chrome.db "select * from keywords;" > keywords.csv
- Run sqlite3.exe and you'll see something like this:
SQLite version 3.42.0 2023-05-16 12:36:15
Enter ".help" for usage hints.
Connected to a transient in-memory database.
Use ".open FILENAME" to reopen on a persistent database.
sqlite>
- Enter the following commands:
.open vivaldi.db
.mode csv
.import keywords.csv keywords_temp
.schema keywords_temp
- You'll see something like this:
CREATE TABLE IF NOT EXISTS "keywords_temp"( "id" TEXT, "short_name" TEXT, "keyword" TEXT, "favicon_url" TEXT, "url" TEXT, "safe_for_autoreplace" TEXT, "originating_url" TEXT, "date_created" TEXT, "usage_count" TEXT, "input_encodings" TEXT, "suggest_url" TEXT, "prepopulate_id" TEXT, "created_by_policy" TEXT, "last_modified" TEXT, "sync_guid" TEXT, "alternate_urls" TEXT, "image_url" TEXT, "search_url_post_params" TEXT, "suggest_url_post_params" TEXT, "image_url_post_params" TEXT, "new_tab_url" TEXT, "last_visited" TEXT, "created_from_play_api" TEXT, "is_active" TEXT, "starter_pack_id" TEXT, "enforced_by_policy" TEXT);
- Execute the code (copy and paste, and press [ENTER]):
begin transaction;

insert into keywords
select
  (select max(k.id) from keywords k) + row_number() over (order by kt.id) as id,
  kt.short_name,
  kt.keyword,
  kt.favicon_url,
  kt.url,
  -- Do not auto-delete engines on browser update
  0 as safe_for_autoreplace,
  kt.originating_url,
  kt.date_created,
  kt.usage_count,
  kt.input_encodings,
  kt.suggest_url,
  kt.prepopulate_id,
  kt.created_by_policy,
  kt.last_modified,
  -- Ensure sync_guid is unique
  printf('%s-%s-%s-%s-%s',
         lower(hex(randomblob(4))), lower(hex(randomblob(2))), lower(hex(randomblob(2))),
         lower(hex(randomblob(2))), lower(hex(randomblob(6)))) as sync_guid,
  kt.alternate_urls,
  kt.image_url,
  kt.search_url_post_params,
  kt.suggest_url_post_params,
  kt.image_url_post_params,
  kt.new_tab_url,
  kt.last_visited,
  kt.created_from_play_api,
  kt.is_active,
  -- Ensure position is unique; increment over the highest existing position.
  -- It's unclear if the order of this value actually matters; the runtime position
  -- in Vivaldi appears to rely on the ID only.
  -- Alternative: substr(hex(randomblob(30)), 1, 60)
  printf('%s%d', (select max(k.position) from keywords k), row_number() over (order by kt.id)) as position,
  kt.starter_pack_id,
  kt.enforced_by_policy
from keywords_temp kt;

drop table keywords_temp;

commit transaction;
- Quit sqlite with .q
- Be sure you have a backup of c:\Users\<user>\AppData\Local\Vivaldi\User Data\Default\Web Data, then overwrite that file with the vivaldi.db you just created.
- Run Vivaldi and check at vivaldi://settings/search/ that the migration went well.
-
@josemuk Brilliant!
Now we only need someone to pack that into bat/sh scripts and we've got something even grandma could use (with a massive disclaimer!)
-
With the chrome.db and vivaldi.db files in the same folder as sqlite3.exe, these two commands give us the final vivaldi.db:
sqlite3.exe -header -csv chrome.db "select * from keywords;" > keywords.csv
sqlite3.exe vivaldi.db -cmd ".mode csv" -cmd ".import keywords.csv keywords_temp" "begin transaction;" "insert into keywords select (select max(k.id) from keywords k) + row_number() over (order by kt.id) id, kt.short_name, kt.keyword, kt.favicon_url, kt.url, 0 as safe_for_autoreplace, kt.originating_url, kt.date_created, kt.usage_count, kt.input_encodings, kt.suggest_url, kt.prepopulate_id, kt.created_by_policy, kt.last_modified, printf('%s-%s-%s-%s-%s', lower(hex(randomblob(4))), lower(hex(randomblob(2))), lower(hex(randomblob(2))), lower(hex(randomblob(2))), lower(hex(randomblob(6)))) as sync_guid, kt.alternate_urls, kt.image_url, kt.search_url_post_params, kt.suggest_url_post_params, kt.image_url_post_params, kt.new_tab_url, kt.last_visited, kt.created_from_play_api, kt.is_active, printf('%s%d', (select max(k.position) from keywords k), row_number() over (order by kt.id)) as position, kt.starter_pack_id, kt.enforced_by_policy from keywords_temp kt;" "drop table keywords_temp;" "commit transaction;"
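To turn that into the .bat script mentioned above, a rough, untested sketch could look like this (it assumes sqlite3.exe, chrome.db and vivaldi.db sit in the current folder, and that a migrate.sql file holds the insert/drop/commit block posted earlier; copying vivaldi.db back over Web Data is still a manual step):
@echo off
rem Untested sketch: export Chrome's keywords and import them into vivaldi.db.
rem chrome.db, vivaldi.db and migrate.sql are assumed to exist in this folder.
sqlite3.exe -header -csv chrome.db "select * from keywords;" > keywords.csv
sqlite3.exe vivaldi.db -cmd ".mode csv" -cmd ".import keywords.csv keywords_temp" < migrate.sql
echo Done - now copy vivaldi.db back over your Vivaldi "Web Data" file, with Vivaldi closed.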
-
@josemuk Great! I had a look - not sure if the code for position works as expected.
The values of position for the existing engines are like this:
2225ffffffff0000000a9a6b5444766e5a616859623075647948612b746d54577a35736f78383d
The inserted engines have just:
2226ffffffff31
In my SQLite tool they show as binary BLOBs
And if the position value fails checks, it will reset the DB...
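If you want to check your own database, something like this shows how the position values are actually stored (text vs blob):
select typeof(position), count(*) from keywords group by typeof(position);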
-
Same here with DB Browser for SQLite, and it works without problems with Vivaldi. The code is exactly the same as @mike12489 posted, so I don't know why it shows as a blob.
edit: The original Vivaldi database also has the blobs, so I guess it's fine.
-
@josemuk Well, I didn't test @mike12489's script before - so the fault is probably there.
But you're right - it does work with a clean Vivaldi Web Data file. But when using my own, with 60 custom engines, as the Vivaldi template, it fails. I'll have to check more...
-
@josemuk @mike12489
If you wouldn't mind, please check the following DB file and see if you can work out what goes wrong on the insert:
https://ttm.sh/B7G.db
I have removed all autofill data (hopefully...)
-
The first position value in your B7G.db is a blob instead of text, and that's why it fails.
I started from a clean database, manually added more than 100 search engines, imported the ones from Chrome, and the final database was fine - so I really don't know what happened with yours.
-
OK, I found the possible culprit - the prepopulate_id cannot conflict with any other entry. And since I tested with a basically clean Chromium profile, some of them were duplicates. I think (but not 100% sure on this) that it can be set to 0.
Also, I think the Chromium version might play a role here, as Snapshot is on Chromium 116, so it might cause some issues. At least I can't just copy the Snapshot Web Data to a Stable profile - it will just reset.
Edit: Yeah, the Snapshot <=> Stable mismatch was the main problem. I simply can't copy a Snapshot db over to Stable and expect it to work, which is what I did for testing.
It seems to work now with prepopulate_id set to 0 to avoid conflicts with existing entries.
I think prepopulate_id is always set to 0 for added search engines anyway. So if the import sets it to 0, you'll just end up with duplicates of the existing ones, I guess... so it should be safe to leave it as it was. It was only in my testing that I saw some engines not get imported.
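If anyone does hit prepopulate_id conflicts anyway, one simple way to zero the column out in the temp table before running the insert (just a sketch; as said, leaving it as-is should normally be fine) would be:
update keywords_temp set prepopulate_id = 0;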
-
Pathduck has marked this topic as solved