IIS Website asks for Username and Password

We see it almost every week, someone posting a question in a forum asking “Why does my web site prompt me for my user name and password?”  And yes, there are a lot of reasons this can happen, but we see an awful lot of posts where someone is at a loss for why, even after they’ve reconfigured the permissions and authentication several times.  And it doesn’t matter what version of IIS they’re using, just that it’s Windows Authentication.  Crazy, but the solution has nothing to do with the server, it’s all on the client side.

Several years ago, hackers got smarter.  (Okay, they get smarter every day, usually quicker than most of us…)  To keep up with the hackers, programmers got smarter too.  And browser programmers decided that they wouldn’t pass an authentication request to an untrusted domain.  Which is very smart.  But can lead to a double authentication issue.  What happens is a user logs into their Windows system.  Then they visit a web site, usually on an intranet, that requires them to be authenticated through Windows.  And the danged site asks them to authenticate again.  All because the browser copped an attitude and won’t let the web site know that the user is already logged in, simply because the browser doesn’t trust the web site.

The solution is simple.  Tell the browser to trust the web site.  How you do that may be a bit less than simple.  For example, in Internet Explorer (all versions from 5 up…), open the Tools menu and choose Internet Options.  On the Security tab, choose the Intranet Zone and click the Sites button.  In the Sites dialog, click the Advanced button.  And in the dialog box, enter the web site by server name, fully qualified domain name or IP address.  Click OK and accept everything to save it and you’re golden.  Other than having to do this on every single client.

Internet Explorer Group Policy

Fortunately, Windows Group Policy allows you to handle this across your Active Directory domain.  Create a group policy that applies to Authenticated Users, and set the following policy:

User Configuration > Administrative Templates > Windows Components > Internet Explorer > Internet Control Panel > Security Page

Enable the Site to Zone Assignment List and add your intranet domain to the list in the following format:

{Host/Domain} {Zone}

Where {Host/Domain} is the FQDN, server name, domain name or IP address of your site and {Zone} is a number as follows:

1 – Intranet Zone
2 – Trusted Sites Zone
3 – Internet Zone
4 – Restricted Sites Zone

So, to add the http://www.sample.com web site to the Intranet Zone so Internet Explorer will pass credentials, create your list as such:

http://www.sample.com 1
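Under the hood, the manual per-client steps boil down to registry values under the ZoneMap key.  A sketch of the equivalent entry for the example above, as a .reg file (this is from our own registry spelunking, so verify on your own systems before deploying it):

```reg
Windows Registry Editor Version 5.00

; Assign http://www.sample.com to zone 1 (Intranet):
; subkey = domain, sub-subkey = host, value name = scheme, data = zone number
[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings\ZoneMap\Domains\sample.com\www]
"http"=dword:00000001
```

Handy for scripting the change on clients that sit outside your Active Directory domain.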

Firefox

To set Firefox to pass authentication through to your web site is a little less direct.  You need to edit a hidden configuration preference, as follows:

Open Firefox and in the address bar type about:config and press Enter.  In the config preferences, find the line for network.automatic-ntlm-auth.trusted-uris and double-click it.  Enter the web site URL in the dialog box, click OK and restart Firefox.  Now your browser will also pass credentials to a web site.
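If you manage more than a couple of machines, the same preference can be pushed out in a user.js file rather than clicked in by hand.  The value is a comma-separated list of trusted URIs (the host names here are made up):

```javascript
// user.js -- trust these sites for automatic NTLM authentication
user_pref("network.automatic-ntlm-auth.trusted-uris",
          "http://intranet.sample.com,https://portal.sample.com");
```

Drop the file into the user’s Firefox profile folder and the preference is set at startup.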

Disclaimer

As usual, any knowledge of Mr. Phelps or his IMF team is denied.  Other than the cool TV shows and the adequate Tom Cruise movies of course.  (Although Thandi Newton was hot in MI II…).  We also, quite naturally, disavow any knowledge of this post if you foolishly follow our advice and break your system.  Or someone else’s.  Except Mr. Phelps’ system, since he and his team don’t exist…

IIS 7 URL Rewrite and Canonical URLs

What the heck is a “canonical URL?”  It sounds vaguely like a website the Pope would decree as required reading.  At least in Biblical terms.  But in computer terms, “canonical” simply means “the normal way stuff is.”  It comes from mathematics, where a standard format for writing equations is important, but it’s rarely used that way anymore.  And who really cares anyway?

For our use, a “canonical URL” is simply a standard URL for accessing your web site.  In search engine optimization it means that no matter how a user gets to your web site, it displays the exact same page with the exact same URL so that search engines know the page is a single page, no matter how many URLs point to it.  A basic example:

http://www.foo.bar and foo.bar are the same site.  You and I can tell that, because both URLs bring up the same page.  And everyone knows that the “www” host name can be dropped from a URL and still get to the site, right?  After all, your business card doesn’t include the www, and neither does your phone book listing.  That’s the normal way this stuff works.  The canonical way.  Except they are two different pages.

Computers don’t make leaps of faith and assume that whether or not a www precedes a URL makes no difference.  To a computer, they are two separate URLs, one with the www and one without.  You and I know they’re really the same thing, but computers are stubborn.  And picky.  And for SEO, stubborn and picky are not the best attributes because SEO is for people, some of whom may be stubborn and picky (often referred to as “ex-girlfriends”) but most of whom are not.  At least to the totally anal extent of a computer.  So we fix this by providing canonical URLs, ones that, to a computer, are exactly the same.  In simple terms, we convert any request for foo.bar into http://www.foo.bar (or vice-versa) before we let the search engine see it.  That way, a search engine sees one page even though there are two URLs that can reach it.  Instead of splitting the popularity of the page into two separate pages, the combined popularity score reflects what happens in real life.

And an easy way to do this is with the URL Rewrite module for IIS 7.  We’re not going to run through the setup and use of this module (you can find that information at Microsoft’s IIS web site), just give a quick listing of the rewrite rule you enter in your web.config file.  It looks something like this:

<rule name="Redirect URL to WWW version"
        stopProcessing="true">
  <match url=".*" />
  <conditions>
    <add input="{HTTP_HOST}" pattern="^foo\.bar$" />
  </conditions>
  <action type="Redirect" url="http://www.foo.bar/{R:0}"
        redirectType="Permanent" />
</rule>

Some quick notes:

stopProcessing=”true” — This stops processing any rewrite rules after this one is triggered.  Normally you would want to do this, but if you have further rules that need processing, like changing a query string to a friendly URL, set this to false.

redirectType=”Permanent” — This provides a 301 HTTP response to the requesting client, indicating that the foo.bar URL has been permanently redirected, or moved, to the http://www.foo.bar URL.  Search engine spiders are smart enough to make this a permanent change in their search results, in essence they correct older search links that didn’t have the www in the URL.
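One thing the snippet doesn’t show is where it lives.  The rule sits inside the URL Rewrite module’s <rewrite> section of web.config, nested like this (same hypothetical foo.bar domain as above):

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- 301-redirect any host matching foo.bar to the www version -->
        <rule name="Redirect URL to WWW version" stopProcessing="true">
          <match url=".*" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^foo\.bar$" />
          </conditions>
          <action type="Redirect" url="http://www.foo.bar/{R:0}"
                  redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

Note the escaped dot in the pattern; to a regular expression, an unescaped dot matches any character, not just a period.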

Canonical URLs will help your page ranking in search results, but this is not the only SEO technique that can be handled with the URL Rewrite module.  But it is a simple function you can configure once for the site and never worry about again.  To find other possibilities for SEO use of the IIS 7 URL Rewrite module, be sure to check Carlos Aguilar’s blog and the URL Rewrite Module forum at www.iis.net.

Disclaimer

SEO is not rocket science and it’s hard to screw up using the IIS 7 Rewrite module when you follow the documentation, but if you do it’s not our fault.  If you’re an offended ex-girlfriend who is angry at this post, might we suggest counseling to get past the anger and hurt and move on with your life.  There are plenty of guys out there who aren’t dirt bags…  Well, aren’t usually dirt bags.  After all, we’re guys.  And by definition we’re pretty much scum one way or the other.  And unfortunately, we stopped maturing in our early teens, though many of us hide our immaturity well enough to hold stable relationships.  At least until we do something stupid again.  That’s why florists and jewelers make such a good living.

Observations on the iPhone 3GS

Okay, we succumbed to the pressure and switched to iPhones.  Yep, the 3GS.  Our contract with Sprint was ending and we had the choice of upgrading to some sort of touch-screen PDA-like device after a few years of Motorola RAZRs.  We looked at them all, Palm Pre, HTC Touch Pro 2, Sprint’s Android, CrackBerry – Every operating system, piece of glitzy hardware and voice/data plan out there.  And the iPhone 3GS won.  Hands down.

Set aside the fact that switching to a Windows Mobile 6 device on Sprint would cost us the same, with a $150 loyalty discount, as if we had walked off the street as a new customer (Wireless companies have never really understood customer loyalty).  And forget that a new, discounted, Palm Pre or CrackBerry would set us back more than the non-discounted iPhone 3GS.  Or that the data plan from AT&T was less expensive than the competition.  Or that the Android and CrackBerry, in spite of the ability to develop third-party apps, really don’t have anything useful available on them.  The iPhone is just simply designed better.

Nobody ever said Apple couldn’t design sexy hardware.  Even without drinking the KoolAid™, Apple hardware has always been a designer’s wet dream.  And the proprietary software has always been designed from the user’s perspective, not the twisted mind of some geeky developer who has arcane keyboard shortcuts embedded in his DNA.  No, we don’t buy the “Macs always run perfect” routine — over the years we’ve seen more than our share of Sad Macs and Mac Bombs, and we struggle with the fact that the only user-solution available is to rebuild the desktop.  And we know that statistically it’s less likely to get a virus on a Mac, but we don’t get viruses, trojans or malware on a PC either — we know how to prevent that.  Even with all that, the iPhone just happens to be a user’s wet dream too.

First, it fits your hand.  All the touch-screen competitors make lousy phones, they don’t fit your hand next to your ear.  Sure, we could join the Borg and get Bluetooth implants on our cheekbones, but that’s not how we use a phone.  Yes, we have used the devices, the Jawbone is absolutely awe-inspiring, but sticking a plug in our ear will have to wait until we have lost our natural hearing or we are assigned to a Special Forces detail guarding the President.

More importantly, the iPhone fits your pocket.  One reason we loved the RAZR phones is the clamshell devices were sleek and slipped into your jeans pocket beautifully.  Sure, the shiny coating was rubbed raw, but the phone was protected while banging around with keys, coins and the occasional gore and grit that slip into everyone’s life.  The iPhone, with its exposed touch screen and rather expensive repair policies, is far scarier to slip into your pocket.  But it’s just as capable.  A rubberized slider case from Incase, another group of wet-dream designers, and a cheap screen protector have kept the iPhone from major, and even minor, damage.  The oleophobic glass screen technology Apple chose is far superior to any touch-screen we’ve seen in the area of shedding potential dust, damage or dinginess.  If possible, the glass even appears to shed finger prints and smudges, a technology we’d like to see on our drinking glasses, eyeglasses and windshields.

And the iPhone has apps.  We have Zunes for our MP3 source, and we’re not enamored enough with music to waste our money with iTunes, but there are other apps available for the 3GS.  Like the Geocaching app.  Yes, we’re weekend Geocachers (Sunweasels), and the iPhone makes those park and grabs a bit easier.  We no longer have to plan a Geocaching expedition and load the GPS coordinates, we can thumb up a local cache for a lunch time break.  It doesn’t replace a real GPS, for one thing it’s far less accurate, but the iPhone 3GS, and the Geocaching app, is always tucked in our pocket.

Or mounted on the dash mount in the Wrangler.  It doesn’t bounce like a dedicated GPS (Garmin GPSMap 60CSx) does, plus with a simple cable it becomes the stereo source.  Satellite radio cuts out under trees, but the free Pandora app for the iPhone shines.  We have our own dedicated station playing our own range of hits, with no monthly/annual fee, and better reception than any satellite radio.  All as a by-product of a pretty decent phone.

And that may be the key to our pleasure with the iPhone 3GS.  First and foremost, we need a phone.  Not a MP3 player, not a gaming system, not a mapping device and not a way to find the nearest public rest room.  A list of contacts, voice dialing and some sort of voice mail are the primary needs.  And, well served by the iPhone 3GS, these needs somehow fade into the background.  They’re not afterthoughts in this device, as they appear to be with the texting-scion CrackBerry or the sliding-keyboard Palm Pre, which has a more convoluted keyboard than even the CrackBerry.  They’re not overly complicated, as in most implementations of Windows Mobile.  They’re not even as difficult as they were with the phone-only RAZR.

We’re satisfied.  We tried the competitors, even the Go Phone and other disposables (actually quite expensive to use…).  We tried to stay with Sprint, we were happy with the coverage and service even though their pricing kinda ticked us off.  We had even left AT&T for Sprint years ago, when Cingular took over AT&T and their service took a nose dive.  We’ve been moderately satisfied with AT&T, coverage is okay for where we travel, but we probably would be more satisfied with other carriers.  AT&T got us simply because of the iPhone.  And the iPhone 3GS got us simply because, for our needs, it was the best.  Hands down.  Even if the Mac Geniuses are kind of annoying.

Disclaimer

We like our iPhones, you may not.  We really don’t care, but if you buy an iPhone because of us and regret your decision, don’t blame us.  We have a limited amount of exposure to them so far, and we use our phones differently than you may, so it’s your fault if you blindly follow the advice of some schmuck with a blog.

On the other hand, while we are iPhone converts, we don’t really like Macs for computers (we’ve been using them for a decade for specific tasks).  It may be the devil we know, but we’re comfortable with the PC, it goes above and beyond what we can get from a Mac at a third the price.  Or less.  A $300 Acer Aspire One Netbook running Windows 7 beats a MacBook Air any day of the week.  And we could break about five of them before we spent as much as we would have on the Air.  While we think the Mac versus PC commercials are cute, Windows, for us, is the better choice.  Though we have to admit, the Mac makes a good hardware platform for running Windows.

Making a Link, Checking it Twice…

It’s not even Christmas yet and I find myself with a checklist, this one for Federal web site requirements.  If you run a government web site, be it at the Federal, State or local level, or even outside the US, it’s hard to go wrong when you adhere to guidelines and practices developed by the Federal Government.  Although, in many ways, it’s also hard to do it right.

One checklist item mentioned is an external links review, something sorely lacking in many government web sites.  Most webmasters are capable of doing a decent job of ensuring the major links within their site are working, but do you have an easy-to-follow policy for reviewing external links?  Sure, you can run a link-checker to see if the link is broken (Xenu happens to be a current favorite around here…) but what if the link is still valid, only now it doesn’t point to where you intended?  This could be anywhere from annoying to disastrous.

For example, your link to www.sample.com/Default.aspx?Document=1234 is great, until the nice folks at Sample.com renumber their documents and instead of bringing up the schedule for the Bluegrass Festival it now brings up the requirements for applying for a permit to transport live frogs via personal watercraft.  No link checker will tell you that, you have to actually click on the link and make sure it goes where you think it should.  But hey, at least your Bluegrass fans are only a bit disappointed.  It’s not like you linked to a list of the local houses of worship and now that link points to Kandi’s Kastle of Kink.

It’s amazing how many sites have menu items that point to dead links but even more amazing is how many of those same sites have no method of reporting broken links.  Yes, running link checkers can help, analyzing the site’s log files for 404 responses may be even better, but what’s wrong with letting the viewers help?  Are you afraid they just might do so?

The best thing a webmaster can hope for is a dedicated viewer willing to report a broken link.  After all, it means they want to use your site and, even better, they probably have found the place the link should point to.  They have done 90% of your job, you just need to make a quick edit to fix a problem.  A simple “Report a Problem” link on your site that is either a mailto: link or backed by a response form goes a long way toward appeasing a viewer already frustrated by an incorrect or broken link.  A custom 404 message can make this even easier.
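A link checker can only prove a URL answers; a human still has to confirm it answers with the right page.  As a starting point for building your review list, here is a short Python sketch (the function names are our own invention, and a real tool would fetch pages over HTTP rather than take them as strings) that pulls out the external links worth eyeballing:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse


class LinkCollector(HTMLParser):
    """Gathers every href value found in anchor tags."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def external_links(html, own_hosts):
    """Return links pointing outside own_hosts -- the ones a human
    needs to click to confirm they still go where intended."""
    collector = LinkCollector()
    collector.feed(html)
    return [link for link in collector.links
            if urlparse(link).netloc and urlparse(link).netloc not in own_hosts]
```

Feed it each page of your site and you have a review checklist; anything on it should be clicked, not just pinged.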

No link checking program, service or analyzer can ensure you don’t have problem links.  It is up to the webmaster, or to designated parties in your organization, to stay on top of these seemingly trivial things.  To a viewer, a site with broken or erroneous links appears less maintained, less important and less trustworthy.  And being less trustworthy is not something any government site needs to aspire to.

Disclaimer

The drivel posted here may or may not make a difference in your environment, so please feel free to pass it by at any time.  On the other hand, you know we’re right, so just listen up and follow the advice.

Book Review: Professional DotNetNuke 5

Professional DotNetNuke 5: Open Source Web Application Framework for ASP.NET

Professional DotNetNuke 5, published by Wrox, is the first major book to cover the new 5.x release of DotNetNuke, and it’s assuredly the best book on the subject.  With the exception of the first chapter, which covers the evolution of the DotNetNuke framework and appears to be in the book only to justify Shaun Walker, Wrox DotNetNuke Series Editor, getting a portion of the royalties, this book is packed with useful information.  From installing DotNetNuke through administering the portals to using DotNetNuke as a development framework, the materials are covered in detail and in suitable depth for the audience.

The audience for this book is not those users or administrators who simply want an out-of-the-box product, and to be honest, the DotNetNuke framework isn’t really intended for that audience either.  The framework is just that, a development framework, aimed at developers.  And for that audience this book shines.  The second half of this book is almost exclusively aimed at developers writing custom modules for the framework and, if you’re going to get much out of this book, there is an assumption that the reader is a reasonably proficient ASP.NET developer.

Much of the development that programmers would do within the DotNetNuke framework hasn’t changed since version 4.x of DotNetNuke, but the limited changes are significant to developers.  This book dives deep into the use of the DotNetNuke core APIs, especially as they apply to module developers.  It also extensively covers the use of the ASP.NET membership functions, including roles and profiles, as well as the additional attributes added by the DotNetNuke framework.  The latest versions of the DotNetNuke framework incorporate ASP.NET 3.5 features and, in addition to the membership and security enhancements in the ASP.NET 3.5 framework, LINQ now plays a prominent role in module development.

The last quarter of the book is a walk-through of developing a simple module for the DotNetNuke framework.  While this module could easily be developed in the DotNetNuke 4.x framework, the walk-through does cover changes found in the new version.  The book finishes with a simple explanation of skinning the DotNetNuke portal, although the explanation is too simplified and reads like it was stripped from the DotNetNuke documentation.

And that’s the beauty of this book and those like it.  DotNetNuke has always had woefully pitiful documentation, rarely updated and released well behind the updates to the framework.  Books like these are often the only documentation available to users of the DotNetNuke framework.  And unfortunately, in a few months much of this book may be out of date as the framework is updated.

Details

Professional DotNetNuke 5: Open Source Web Application Framework for ASP.NET
By Shaun Walker, Brian Scarbeau, Darrell Hardy, Stan Schultes and Ryan Morgan

Published February 2009 by Wrox
ISBN: 978-0-470-43870-1
Paperback, 600 pages

Disclaimer

This review is solely based on the opinions of one, somewhat insignificant, blog.  While we would love to believe that we are the most important influence in your life’s decisions, in reality we probably don’t know what we’re talking about.  So if you rely on this review to purchase, or reject, this book, don’t blame us if you think you made a bad decision.  We’re not threatening to give you a wedgie if you don’t see things our way.

Everyone isn’t Everyone…

Microsoft has made some unfortunate naming blunders in the past — unfortunate in that they have become standards even though they don’t make sense.  One of these is the Windows security group Everyone.  It sure sounds, from the name at least, that this group is composed of every account on the server.  After all, shouldn’t Everyone really mean every one?

Not to Microsoft.  Originally, this security group, along with the account Guest, was pretty useful.  It didn’t include every account, but it did include every account you would normally want to grant access to.  Fortunately with the Guest account, Microsoft saw the error of its ways and just stopped using the account, to the point of disabling it by default, but not to the point of eliminating it entirely.  Not so with the Everyone group.  That little bugger is still around to, well, bug us.

There are significant accounts that don’t belong to the security group Everyone, and really shouldn’t belong to that group.  One in particular causes no end of confusion for ASP.NET developers and aspiring Windows or IIS administrators.  That’s the ASP.NET process account.  By default this account is NT AUTHORITY\NETWORK SERVICE in Windows Server 2003 and Vista (and newer operating systems such as Server 2008 and Windows 7).  And frequently, this account needs access to files/folders through Windows NTFS permissions.

The problem for many developers and other users who aren’t well-versed in Windows security, is that the first indication that this account needs access is a generic “Access Denied” error.  The confused developer, thrust into the role of server manager, grants access to one account after another, resulting in the same error, and out of frustration grants access to the Everyone group.  That still results in the same aggravating error because, unbeknownst to the developer, the account that needs access — the ASP.NET process account — isn’t in that group.  Even though, through an unfortunate choice of names by Microsoft, it really sounds like it should be.

The solution to this is not to add the NETWORK SERVICE account to the Everyone group, or worse, make it a member of the Local Administrators group.  The NETWORK SERVICE account is a restricted account on purpose, and should stay that way.  The solution is to add the necessary file/folder access for the specific account, NETWORK SERVICE, that needs this access.
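For example, granting the account modify rights on a folder is a one-liner from an elevated command prompt using the icacls tool that ships with Vista and Server 2008 and later (the path here is hypothetical, substitute your own):

```
icacls "C:\inetpub\wwwroot\MyApp\App_Data" /grant "NT AUTHORITY\NETWORK SERVICE":(OI)(CI)M
```

The (OI)(CI) flags make the grant inherit to child files and folders, and M grants modify rather than full control, which is all most web applications need.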

But we’re not going to go into detail on how to do this.  This blog isn’t the place to teach you Windows NTFS permissions, you can easily find that information elsewhere, and frankly, we just don’t have the room.  While you’re at it, remove the permissions you granted to the Everyone group.  After all, it didn’t work, did it?  Leaving that unwanted access intact is a security hole you don’t want to have to explain after North Korean hackers just downloaded all your clients’ credit card numbers.

FileMon

By the way, there is a great little tool for figuring out what accounts are being denied access to what files.  This is FileMon, from Sysinternals, now owned by Microsoft.  For full details, see the TechNet page.

Disclaimer

Messing with NTFS permissions can break your server.  If it does, it’s your own fault for foolishly following advice you got over the Internet.  If you aren’t sure of what you’re doing, find a qualified professional to do it for you.  And while you’re at it, stop running with scissors.

Project to Watch: IIS SEO Toolkit

It’s only the first beta, but the recently-released IIS Search Engine Optimization Toolkit from the IIS team at Microsoft is a project that needs watching.  Currently it has only three features, Site Analysis, Robots Exclusion and Sitemaps, all of which can be handled through other means.  But the SEO Toolkit provides an easy, mostly intuitive and extensible interface to handle these functions.

Robots Exclusion, one of the least understood aspects of SEO for many beginners, really isn’t hard.  Notepad on the server can handle your needs, provided you know what needs you have, and the process should only take a few minutes.  Per server.  Per change.  And that’s where this tool becomes useful.  The SEO Toolkit can run against any IIS server, local or remote, and makes changes to the robots.txt file simple for anyone to do.  Yes, you really do need to know why you’re making the changes, but this is a toolkit, not a Wizard.
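For reference, the file the tool is managing is nothing more than a plain text file named robots.txt at the root of the site.  A minimal example (the disallowed paths are made up for illustration):

```
User-agent: *
Disallow: /admin/
Disallow: /scratch/

Sitemap: http://www.sample.com/sitemap.xml
```

That’s the whole format: which robots the block applies to, what they shouldn’t crawl, and optionally where to find your sitemap.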

Sitemaps are also just simple text files, and anyone with Notepad can create and edit one as well.  But Sitemaps have a lot of code to them (alright, it’s just an XML file, but to non-coders that really does look like difficult code…) and are incredibly easy to break by forgetting a bracket.  The toolkit not only makes editing a breeze, but lets you browse through the site graphically and choose the URL for the Sitemap entry.  Frankly, there’s no other sitemap editing or creation tool that’s this easy to use.
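And the “difficult code” in question is just this, per the public sitemaps.org protocol (URL, date and the optional hint values here are made up):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.sample.com/</loc>
    <lastmod>2009-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

One missing angle bracket anywhere in that file and the whole thing stops parsing, which is exactly the failure mode the toolkit’s editor saves you from.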

But the crowning glory of the SEO Toolkit is its site analysis features.  Sure, there are tools that can do this, but nothing this easy and free.  Not only can you crawl the site, seeing it the way search robots and browsers do, but you get very detailed reports on the site and general guidelines on fixing common SEO problems.  You will receive an analysis of your site that includes links and references within and to each URL, an analysis of keywords on pages and even the routing a robot takes to get to your pages.  While it doesn’t fix your site to make it more friendly to search engines, it does give you the information you need to make the changes, which is 90% of SEO anyway.

Remember, this is just a beta.  That means the feature list can only grow.  Microsoft’s IIS developers have taken to releasing many tools like this, and even major updates to existing features, completely outside the Windows release schedule.  These out-of-band releases, including the SEO Toolkit, can be downloaded at no charge from the IIS support site.  We’ll be watching this toolkit, you go check out the others.