I think patterns started off as generally recognized best solutions for common problems.
But now that they have been around for a while, and we have seen applications made ten times more complicated than they need to be because people try to cram in all the patterns they have read about ("my application is well architected, because it is loaded to the gills with patterns"), my impression of the value of patterns has shifted a bit.
I now think that the primary purpose of patterns is to expand our vocabulary. I can say "singleton" and we all know what that means. There may be a dozen different ways to implement a singleton - it doesn't have to be done exactly the way the GoF book or some web page shows it.
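To make that concrete, here are two of those dozen ways, sketched in Java - the class names are invented for the example:

    // Illustrative sketch only; Config and Registry are made-up names.

    // 1. The classic GoF shape: private constructor + static accessor.
    final class Config {
        private static final Config INSTANCE = new Config();

        private Config() { }  // block instantiation from outside

        static Config getInstance() {
            return INSTANCE;
        }
    }

    // 2. An enum-based singleton: same intent, completely different
    //    mechanics, and still "a singleton" to anyone who knows the word.
    enum Registry {
        INSTANCE;

        void register(String name) {
            System.out.println("registered: " + name);
        }
    }

    public class SingletonDemo {
        public static void main(String[] args) {
            Config a = Config.getInstance();
            Config b = Config.getInstance();
            System.out.println("same instance? " + (a == b));  // true

            Registry.INSTANCE.register("widget");
        }
    }

Both of those deserve the name "singleton." The word buys us the shared understanding, not any one implementation.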
I think that any pattern being used in an application could/should(/must!) be trumped by "the simplest thing that could possibly work."
And one last thing ... puh-leeze, let's not get suckered by some putz who creates his own web page declaring a pattern. It's only a pattern when it is generally recognized by the greater minds of computer science. And let's add to that: there are a lot of intern engineers working for big companies we trust who have the ability to post content ("IBM Department of Uber Genius Types Declares 'The Squeaky Duck Pattern'"). And there are books that print some of this garbage too.