@samuvuo Thanks for verifying. And in such a Linux'y way!
So I guess it doesn't require flushing the DNS cache or a clean profile, just a restart. But I still can't see those requests from Vivaldi, and it annoys me.
Maybe Avast is playing up again, will have to check.
Update: Figured out why I didn't see them. I tested with the lappy and it has the "Connection-specific DNS Suffix" set by the DHCP server/router. There I saw the DNS requests:
qlwtgybqauloyxp.bb.online.no A Name Error
uqfevwxtkp.bb.online.no A Name Error
vebmiwinddfihii.bb.online.no A Name Error
My desktop uses a static IP and has no DNS suffix set, while most laptops use DHCP (for obvious reasons).
There's no mention of DHCP in the article(s), but maybe Chromium just assumes a static IP wouldn't need any checking for "NXDOMAIN hijacking"? I'm not even sure what the DNS suffix is good for anyway. My networking knowledge stops at this point, however.
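For context, the probes in question are Chromium's intercept-detection lookups: at startup it resolves a few random hostnames that should not exist, and if the resolver *answers* anyway, it concludes the ISP or router is hijacking NXDOMAIN responses. A minimal Python sketch of the idea (the function names, hostname length and attempt count are illustrative, not Chromium's actual values):

```python
import random
import socket
import string

def random_hostname(length: int = 15) -> str:
    """Generate a random lowercase hostname, like the probes in the logs above."""
    return "".join(random.choices(string.ascii_lowercase, k=length))

def probe_nxdomain_hijacking(attempts: int = 3) -> bool:
    """Return True if the resolver answers for hostnames that should not exist,
    which suggests NXDOMAIN hijacking by the ISP or router."""
    for _ in range(attempts):
        try:
            socket.gethostbyname(random_hostname())
            return True  # got an address for a random name: likely hijacked
        except socket.gaierror:
            continue  # "Name Error" (NXDOMAIN) is the expected, healthy result
    return False
```

With a connection-specific DNS suffix set, each random name gets the suffix appended (as in `qlwtgybqauloyxp.bb.online.no`), which is presumably why the probes only showed up on the DHCP-configured laptop.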
As to whether the Vivaldi team has any plans to do anything about it - I have no idea and I would think they'd leave the underlying network code alone as much as possible to avoid issues.
@michuu Thanks for the link to W3Schools. Strange that I couldn't find it with the DuckDuckGo engine.
//EDIT: Oh, OK, I don't think I searched for the keyword CSS. Or I missed this link in the search results.
Anyway, your link helps!
And I will save it as a Note in Vivaldi.
@Komposten, it saves at least one line compared to what you see of the page. For the same reason I've also moved all the status bar icons up next to the search bar and hidden the status bar, which gains another line of page.
The title bar, search bar and bookmarks bar I've kept as narrow as possible, using small text and icons, so that's three more lines of the page that I can see.
On a web page I need height on a widescreen monitor, not width. Widescreen is fine for videos or games, but not for a web page, where bars at the top and bottom always cut off lines.
Well, everyone has their preferences, and I'm very grateful that Vivaldi offers these ways of gaining readable space for those who want it.
On web pages I have plenty of space on the sides.
That is called HTTP over QUIC.
More to read:
ZDNet: Cloudflare, Google Chrome, and Firefox add HTTP/3 support
ZDNet: HTTP-over-QUIC to be renamed HTTP/3
YouTube video on HTTP/3
IETF Protocol description
Test it at HTTP/3 test servers
It's not implemented in release browsers yet, but some daily builds (e.g. Chrome Canary) of Chromium and Firefox are testing this protocol.
@dude99 said in Attention: Google Chrome is now the new IE6!:
Watch this: https://www.youtube.com/watch?v=ELCq63652ig
Basically, Google is up to no good again. Google is getting more and more aggressive about breaking Google web services in other browsers, which slowly nudges Google service users to drop alternative browsers (including other Chromium-based browsers) and adopt Google Chrome. This anti-competitive tactic, combined with the continuous sabotage of 3rd-party ad blockers, is basically testing the limits of what counts as an illegal monopoly under antitrust law.
If this continues, we are going to repeat the infamous Microsoft antitrust debacle...
But this time it will be Google vs Everyone!
Google's approach is ridiculous. I followed Microsoft's $1.3 billion lawsuit on Techgara two or three years ago, and it was quite unexpected that they won. If Google is sued, what will the result be?
People have done something about it, but nobody can be bothered to use any of the options.
Until websites protect themselves from spoofing by using DNSSEC and TLS/DANE validation, nothing the browser has or does can guarantee a safe download.
Vivaldi does not protect its site with DNSSEC, so yes, you can have an encrypted download, but is it really from the real Vivaldi?
All browsers could easily have a box in the download requester where you paste the hash from the site you get the file from, but as you cannot see whether the site is being spoofed, that alone solves nothing.
There are several options for automatically including hashes with a clickable web link, but browser vendors can't be bothered until a security issue becomes critical.
Magnet links are only ever used on torrent sites, but they are a universal standard that supports many URL/URI and hash types.
You can have the file protected by including multiple sources and hashes, just like P2P downloads.
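As a sketch of what such a link carries, here is how a magnet URI with a hash plus multiple direct HTTP fallback sources could be assembled (using the "as" acceptable-source parameter from the magnet draft spec; the file name, digest and mirror URLs are made up for illustration):

```python
from urllib.parse import urlencode

def build_magnet(name: str, sha256_hex: str, sources: list[str]) -> str:
    """Assemble a magnet URI: 'xt' (exact topic) carries the hash,
    'dn' the display name, and each 'as' (acceptable source) a
    direct HTTP mirror the client may download and verify from."""
    params = [("xt", f"urn:sha256:{sha256_hex}"), ("dn", name)]
    params += [("as", url) for url in sources]
    return "magnet:?" + urlencode(params)

link = build_magnet(
    "distro.iso",
    "ab" * 32,  # placeholder digest, not a real hash
    ["https://mirror1.example/distro.iso", "https://mirror2.example/distro.iso"],
)
```

A client that honours the hash will reject a file fetched from a compromised mirror, exactly as described for P2P downloads.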
Metalinks also support multiple sources, networks and hashes, but are mostly just used by open-source Linux projects for distributing ISOs.
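For illustration, a minimal Metalink 4 (RFC 5854) file might look like this (the file name, mirrors and digest are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<metalink xmlns="urn:ietf:params:xml:ns:metalink">
  <file name="distro.iso">
    <!-- downloaders verify the finished file against this digest -->
    <hash type="sha-256">0000000000000000000000000000000000000000000000000000000000000000</hash>
    <!-- multiple mirrors; clients can fail over or segment between them -->
    <url priority="1">https://mirror1.example/distro.iso</url>
    <url priority="2">https://mirror2.example/distro.iso</url>
  </file>
</metalink>
```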
There is also a proposed standard, "Trusted Linker Download Redirection".
When the Linux Mint site was distributing from a compromised mirror, anyone who used P2P or the hashes on the main site was protected, as the bad ISO would have failed validation.
Anyone using the Firefox extension DownThemAll! would have had the option to automatically validate the file with multiple hashes and use multiple sources.
If one of the sources was the bad mirror, it would corrupt the file and fail validation.
However, as I keep pointing out, all that protection is worthless if you are getting your download via a faked site, because the hashes will also be changed. So until all visitors and sites are using DNSSEC, and the sites have configured it for validation, it is only a partial solution.
Browsers have the ability to check certificates for domain names, but none have the ability to verify the domain is on the correct IP address.
TLS/DANE validation needs to be added to browsers; otherwise the user has no notification that the DNS or the site is being spoofed.
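For the curious: DANE works by publishing a TLSA record in DNS (protected by DNSSEC) at a name derived from the port, protocol and hostname, against which the browser could check the certificate it received. The record-name construction is simple string logic, per RFC 6698 (the function name is my own):

```python
def tlsa_qname(host: str, port: int = 443, proto: str = "tcp") -> str:
    """Build the DNS name where a DANE TLSA record lives, per RFC 6698.
    For HTTPS on example.com this is _443._tcp.example.com."""
    return f"_{port}._{proto}.{host}"
```

A DANE-aware client would fetch that record over a validating DNSSEC resolver and compare the certificate association data against the server's presented certificate; no mainstream browser does this today.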
For now, the best you can do is use one of the auto-scanning VirusTotal extensions.
But be warned: VirusTotal is often up to a month behind in recognising new malware.
I saw him talking about this on a BBC interview today. I very much look forward to seeing how this progresses.
I have some big concerns with it:
It would have to be easier to use than current solutions - a lot of user-focused or privacy-centric tools have initial setup costs (time or otherwise)
There needs to be a very easy (ideally one-click) way to set it up and import existing data
And possibly the biggest issue:
If 3rd-party services do build on it, or somehow hook into it, how are you going to prevent them from just "copying" the data to use themselves?
I like the idea of segregated data pods and a decentralised assistant, and for privacy-savvy folks that will be a massive boon. But how do you convince Jenny Public to also think privacy first and avoid linking potentially harmful 3rd parties?
An ecosystem like this will live or die based on how usable it is, and 3rd-party integrations could be a big factor. It would only take one bad actor and your private data is out there for good.
I'm sure they are thinking about this, so I highly anticipate their proposed solutions.
There have also recently (within the past week or so) been articles on Ars Technica and The Register.
Both are highly critical. Basically, Google is telling web developers they don't know what they are doing, as pages must be formatted Google's way to be visible in Google search.
Yet another reason to drop Google search from your list of search engines.
The articles also imply that Bing is a big supporter... as well as some others.
[EDIT] Today on Ars: https://arstechnica.com/gadgets/2018/09/bing-starts-serving-amp-pages-as-google-prepares-to-reduce-its-control/
Both PHP and Node.js are great; you should really take the requirements of your project into account to choose the best one. In any case, I've been looking into Node.js and found out that it is used by Netflix, SAP, IBM and more - https://bit.ly/2FlKMOF. On the other hand, PHP has a long history and a really big community, and it's believed to be the most affordable solution for the back-end. So, it all depends on what you are planning to build with them.
I had never read about this vision, but it makes so much sense from a reader's perspective. Lines of text should be kept to a maximum of about 8 to 12 words, depending on language and type of content, for optimal reading speed and comfort. Horizontal rather than vertical scrolling aids this idea, as illustrated by the website you linked.