
Url regex changes #302

Open · wants to merge 2 commits into master
Conversation

@s-rah (Member) commented Nov 9, 2015

Combined PR with new regex changes to fix some URL-parsing issues with odd control characters and to prevent phishing through Unicode bidi characters in links. Will close the old PRs.

s-rah added 2 commits November 8, 2015 16:24
The old URL regex had a few issues which were revealed by fuzzing,
the biggest being that it accepted non-printable characters (e.g.
0x00 or 0x01) as part of the URL.

This created a scenario where a URL of https://example.com/[0x00]
would be rendered as %2, and attempting to open the link would give
a value like "https://example.com https://example.com " due to some
odd interaction with the regex that I haven't quite worked out.

The new regex appears to work with all the iterations I have tried
and rejects non-printable characters.
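The difference between the two patterns can be sketched outside the project. This is an illustrative translation to JavaScript, not the project's actual code: Ricochet uses Qt's QRegularExpression, but a `/u`-mode RegExp understands the same `\p{L}`, `\p{N}`, and `\p{Zs}` property classes, so the behavior shown here should carry over.

```javascript
// Old pattern: excludes whitespace and some trailing punctuation, but
// says nothing about control characters, so it swallows them.
const oldLinkRegex = /([a-z]{3,9}:|www\.)([^\s,.);!>]|[,.);!>](?!\s|$))+/i;

// New pattern from this PR: whitelists letters, digits, and a small set
// of URL punctuation, so non-printable characters end the match.
const newLinkRegex =
  /([a-z]{3,9}:|www\.)((([\p{L}\p{N}?#\/~=]+)|([\-._:&%][^\p{Zs}])+)+)/iu;

const sample = "https://example.com/\u0000";

// The old pattern accepts the NUL byte as part of the URL...
const oldMatch = oldLinkRegex.exec(sample)[0];
// ...while the new one stops matching at the first non-printable character.
const newMatch = newLinkRegex.exec(sample)[0];

console.log(oldMatch.includes("\u0000")); // true
console.log(newMatch);                    // "https://example.com/"
```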
Prevent attempts at phishing through Unicode direction controls
by forcing left-to-right display for links through the HTML
entity &#8234; (U+202A, LEFT-TO-RIGHT EMBEDDING).

This is a fairly minor risk, as a victim would have to jump through many
hoops and overlook the obvious URL issues. But better fixed than not.
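A minimal sketch of the mitigation described in this commit, assuming the link is emitted as HTML. The helper names `escapeHtml` and `renderLink` are hypothetical, not the project's actual rendering code; the point is only that the displayed text is prefixed with entity &#8234; (U+202A) so bidi override characters in surrounding text cannot visually reverse the URL.

```javascript
// Hypothetical helper: standard HTML escaping for attribute/text contexts.
function escapeHtml(s) {
  return s.replace(/&/g, "&amp;").replace(/</g, "&lt;")
          .replace(/>/g, "&gt;").replace(/"/g, "&quot;");
}

// Hypothetical helper: render a detected link, forcing left-to-right
// display of the link text with U+202A (LEFT-TO-RIGHT EMBEDDING).
function renderLink(url) {
  const safe = escapeHtml(url);
  return '<a href="' + safe + '">&#8234;' + safe + "</a>";
}

console.log(renderLink("https://example.com/"));
// <a href="https://example.com/">&#8234;https://example.com/</a>
```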
@s-rah (Member, Author) commented Nov 9, 2015

Snapshot of new behavior.

[screenshot: sample]

```diff
@@ -41,7 +41,7 @@ LinkedText::LinkedText(QObject *parent)
     : QObject(parent)
 {
     // Select things that look like URLs of some kind and allow QUrl::fromUserInput to validate them
-    linkRegex = QRegularExpression(QStringLiteral("([a-z]{3,9}:|www\\.)([^\\s,.);!>]|[,.);!>](?!\\s|$))+"), QRegularExpression::CaseInsensitiveOption);
+    linkRegex = QRegularExpression(QStringLiteral("([a-z]{3,9}:|www\\.)((([\\p{L}\\p{N}\\?#/~=]+)|([\\-\\._:&%][^\\p{Zs}])+)+)"), QRegularExpression::CaseInsensitiveOption);
```
A project member commented on the diff:
This expression isn't quite right; some common symbols won't be included in the URL, and the trailing punctuation logic worked better before. I can also break it with some combining characters or emoji (although this is going far beyond normal).

What do you think of blacklisting \p{C} to remove control characters instead? Is that sufficient?

Something like:

([a-z]{3,9}:|www\\.)([^\\p{Z}\\p{C}<>"',.)\\[\\];!]|[,.)\\[\\];!](?!\\s|$))+
See https://regex101.com/r/tL7wM3/1 - but I didn't try to get BIDI or control characters in there yet.
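The proposed blacklist pattern can be transliterated to a `/u`-mode JavaScript RegExp for experimentation (a sketch, not the project's code; Qt's QRegularExpression and `/u`-mode RegExp both read `\p{Z}` and `\p{C}` as the Unicode separator and control categories, so the behavior should carry over). The `extract` helper is illustrative.

```javascript
// Blacklist approach: exclude separators (\p{Z}), controls (\p{C}), and a
// set of delimiters; still allow ". , ) [ ] ; !" mid-URL via the lookahead.
const proposed =
  /([a-z]{3,9}:|www\.)([^\p{Z}\p{C}<>"',.)\[\];!]|[,.)\[\];!](?!\s|$))+/iu;

// Illustrative helper: first URL-like match in a piece of text, or null.
function extract(text) {
  const m = proposed.exec(text);
  return m ? m[0] : null;
}

// Control characters terminate the match instead of being swallowed:
console.log(extract("https://example.com/\u0000tail")); // "https://example.com/"
// Trailing sentence punctuation is still dropped:
console.log(extract("see https://example.com. for details")); // "https://example.com"
```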

@s-rah (Member, Author) replied:

Yeah, it looks like my proposed one misses certain URL params, which is not good. The new one looks better, and control characters appear to be blocked as expected, but I will confirm through recoil later this evening.

On a related note, can we get a formalized spec of how we think this should work, for future reference?

  • URLs are detected if they start with a protocol specifier or with "www." and are followed by at least one non-Unicode character.
  • A space always terminates the URL.
  • ". " always terminates the URL and excludes the final period.
  • "! " always terminates the URL and excludes the final exclamation mark.
  • ", " always terminates the URL and excludes the final comma.
  • Unicode control characters are not permitted inside the URL and will terminate it.

This spec seems incomplete, as www.?())@($3423#@$@#$# qualifies as a URL, although it is likely better to err on the side of forgiveness; we can always refine it later.
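The spec bullets above can be written as executable checks against the proposed blacklist regex. This is a sketch, assuming the reviewer's pattern; note that the ". ", "! ", and ", " terminator rules all fall out of the same negative lookahead, and the last case demonstrates the spec gap just mentioned.

```javascript
const linkRegex =
  /([a-z]{3,9}:|www\.)([^\p{Z}\p{C}<>"',.)\[\];!]|[,.)\[\];!](?!\s|$))+/iu;

// Illustrative helper: the matched URL, or null if none.
const grab = (t) => { const m = linkRegex.exec(t); return m && m[0]; };

console.log(grab("https://a.example/x. next")); // final period dropped
console.log(grab("https://a.example/x! next")); // final exclamation dropped
console.log(grab("https://a.example/x, next")); // final comma dropped
console.log(grab("https://a.example/x y"));     // space terminates
console.log(grab("www.?())@($3423#@$@#$#"));    // still counts as "a URL"
```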
