May 18, 2010 By Joanna Smith
There are plenty of resources and methods out there you can use to reduce the number of submissions to a form on your website by non-humans. Some website administrators get hundreds or thousands of submissions from bogus people. These are often automated bots searching for open holes in your website to inject content for whatever their purpose is. Many times it is to promote their own website, content, or services. Sometimes they are attempts to crash the website or webserver, inject viruses, or spread maliciousness to anyone using the internet. Often the intent is to drive porn traffic.
One person asked me, “Why would someone do that?” I answered, “The same reason people create world-wide system-debilitating viruses. Usually, the reason is ‘cause they can’.”
I recently received comments on a blog post I wrote, Optimizing Search Engine PageRanking by Using Dynamic 301 Redirects. After I weeded out the spam submissions and cross-site linking, I was honored by some of the comments, like “A very useful info. thanks” and “like this information good work thanks”. Then I realized they were coming from all over the place, from different bogus people.
A popular tool developers have used to prevent automated systems from making automated hits to a form is CAPTCHA. I don’t intend to cover this tool in detail; for more information, see CAPTCHA: Telling Humans and Computers Apart Automatically.
One advantage of CAPTCHA is that the average internet user is used to going through this security check on forms all over the internet. This is also a disadvantage, because it is really annoying. Another advantage of CAPTCHA is that it is easy to implement and requires very little code. You can run into problems when you use CAPTCHA on an SSL-secured page, as the control may not be served securely. Another disadvantage is that SPAM programs get smarter and smarter and have learned how to get past that roadblock. They “kick over the pebble” and SPAM the form anyway.
I have an alternative approach.
Try adding a timer to your form that doesn’t allow submissions before a set amount of time has passed.
- The first thing you do is add a server-side textbox and set its Visible property to false.
- Second, add a custom validator for the textbox above.
- Next, record the start time when the page is loaded, in Page_Load.
- Then add the OnServerValidate method.
This method checks whether the current time is greater than <n> seconds past the start time recorded in Page_Load. Page.IsValid is then true or false based on the time difference you’ve pre-set. In my example, I set the time to 10 seconds.
Before the data from the form is saved, first check that Page.IsValid is true.
<asp:TextBox runat="server" ID="txtStartTime" Visible="false"></asp:TextBox>
<asp:CustomValidator runat="server" ID="countSeconds" ControlToValidate="txtStartTime" OnServerValidate="CheckSeconds" ErrorMessage="Please try your submission again."></asp:CustomValidator>

protected void Page_Load(object sender, EventArgs e)
{
    if (!IsPostBack) // only record the start time on the first load, not on postback
        txtStartTime.Text = DateTime.Now.ToString();
}

protected void CheckSeconds(object sender, ServerValidateEventArgs args)
{
    // Valid only if at least 10 seconds have passed since the page was first loaded.
    DateTime startDate = Convert.ToDateTime(txtStartTime.Text);
    args.IsValid = DateTime.Now > startDate.AddSeconds(10);
}

protected void btnSubmit_Click(object sender, EventArgs args)
{
    if (Page.IsValid)
    {
        //Your save data here...
    }
}
In conclusion, we require the “human” to spend at least 10 seconds on the page before submitting the form. If the form is submitted before 10 seconds have passed, the user gets the message “Please try your submission again.” The average contact form probably takes at least 30 seconds to complete, so adjust the time accordingly for more protection. Like anything out there, it may not be foolproof, but it is an alternative.
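If you want a little more protection, the same check can be done without round-tripping the start time through the page at all, by keeping it in Session state on the server. This is only a sketch, assuming ASP.NET session state is enabled; the "FormStartTime" key name is my own choice, not anything from the framework.

// Session-based variant: the start time never leaves the server,
// so a bot cannot tamper with or replay the value from the form.
protected void Page_Load(object sender, EventArgs e)
{
    if (!IsPostBack)
        Session["FormStartTime"] = DateTime.Now; // arbitrary session key
}

protected void CheckSeconds(object sender, ServerValidateEventArgs args)
{
    // Treat a missing start time (expired session) as an invalid submission.
    if (Session["FormStartTime"] == null)
    {
        args.IsValid = false;
        return;
    }
    DateTime startDate = (DateTime)Session["FormStartTime"];
    args.IsValid = DateTime.Now > startDate.AddSeconds(10);
}

The trade-off is that this variant depends on session state (and usually cookies) being available, while the hidden-textbox version works on any plain postback.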
I hope you have better luck getting value added leads and information from real human beings.