Sometimes when you're running an SSIS package with hundreds of data flow tasks, you simply don't want the frustration of one error at pre-validation ruining the whole package. It might just be a tiny error, but it can cost you hours of debugging and, more importantly, may even slow down the business process.
Recently I came across a similar problem with one of my packages, and after a little digging I found a few approaches that can cater for this kind of issue.
Approach 1: In my case, I have dozens of data flow tasks running in parallel, and I want to allow certain file and format changes to raise errors while the package keeps running; I can then get back to the errors later, after I receive the error email I set up for each individual task.
Doing this is really simple: just change the package's MaximumErrorCount property to whatever number you like other than the default of 1. If you wish, you can also set individual data flow tasks' MaximumErrorCount as well. But all of this assumes you have a very good understanding of what your package is doing and what sort of errors you're expecting. If you're not sure of the errors, you should also add some error handling in the tasks' event handlers.
Approach 2: A more sophisticated solution for this kind of issue is to write a Script Task that does the file validations yourself before the package does, and then set variables that control whether the downstream tasks should run, based on the script's result. Here's a link showing an example: http://dichotic.wordpress.com/2006/11/01/ssis-test-for-data-files-existence/
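As a rough sketch of that idea (the variable names User::SourceFilePath and User::FileIsValid are placeholders I've made up for illustration, not taken from the linked post), the body of a C# Script Task might look like this:

```csharp
// Inside the SSIS Script Task's Main() method.
// Assumes User::SourceFilePath is listed in the task's ReadOnlyVariables
// and User::FileIsValid in its ReadWriteVariables.
public void Main()
{
    string path = Dts.Variables["User::SourceFilePath"].Value.ToString();

    // Basic validation: the file exists and is not empty.
    bool isValid = System.IO.File.Exists(path)
                   && new System.IO.FileInfo(path).Length > 0;

    Dts.Variables["User::FileIsValid"].Value = isValid;
    Dts.TaskResult = (int)ScriptResults.Success;
}
```

The precedence constraint from the Script Task to the downstream data flow task can then be set to "Expression and Constraint" with an expression like @[User::FileIsValid] == true, so the data flow only runs when validation passes.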
Personally, I found the first approach really simple, but as far as I'm aware, if your error handling isn't good enough you might end up spending even more time when a weird error occurs.