Rather than using the IIS URL Rewrite module (which runs at a lower priority), I'm trying to use request filtering to block some persistent unwanted GET requests from an Akamai customer who keeps hitting my IP address from several remote IPs (apparently for route-optimization purposes).
Here's an excerpt of the relevant fields from a typical IIS log entry:
... GET /akamai-test-object.html ... HTTP/1.1 FirstFlowAgent…
I used IIS Manager to set up request-filtering rules against BOTH the request URL and the User-Agent string, so that a match on either one would be denied. This added the following to web.config:
<system.webServer>
  <security>
    <requestFiltering>
      <filteringRules>
        <filteringRule name="Block Probing URLs" scanUrl="true" scanQueryString="false">
          <denyStrings>
            <add string="/akamai-test-object.html" />
          </denyStrings>
        </filteringRule>
        <filteringRule name="Block User-Agents" scanUrl="false" scanQueryString="false">
          <scanHeaders>
            <add requestHeader="User-Agent" />
          </scanHeaders>
          <denyStrings>
            <add string="FirstFlowAgent" />
          </denyStrings>
        </filteringRule>
      </filteringRules>
    </requestFiltering>
  </security>
</system.webServer>
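For what it's worth, request filtering also has a built-in denyUrlSequences collection, which might be a simpler way to block the URL part without a custom rule. A sketch of what I believe that would look like (not yet tried on my server):

```xml
<system.webServer>
  <security>
    <requestFiltering>
      <!-- Sketch: rejects any request whose URL contains this sequence -->
      <denyUrlSequences>
        <add sequence="akamai-test-object.html" />
      </denyUrlSequences>
    </requestFiltering>
  </security>
</system.webServer>
```

From what I can tell, requests denied by request filtering are still written to the IIS log, just with status 404 and a distinguishing substatus (e.g. 404.19 for a filtering rule), so I would expect the log entries to change rather than disappear if the rules were working.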
Yet the requests apparently continue to make it through and show up unchanged in the IIS log.
Documentation of this feature is pretty scarce (beyond listing each parameter and restating its general/obvious purpose).