10. Look At Descriptive Statistics First.
Many people put the cart before the horse, so to speak, and take on complex analyses before they spend time examining the data from a basic perspective. Often the descriptive statistics provide CRITICAL context for your complex analyses, making them much more interpretable and easier to understand.
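As a quick illustration of the kind of first pass worth doing, here is a minimal Python sketch using only the standard library (the variable and values are invented for the example; in SPSS you would reach for a descriptives procedure instead):

```python
import statistics

# Hypothetical sample: scores on some measured variable
scores = [12, 15, 14, 10, 18, 21, 13, 16, 14, 95]  # note the suspicious 95

print("n      :", len(scores))
print("mean   :", statistics.mean(scores))     # 22.8
print("median :", statistics.median(scores))   # 14.5
print("stdev  :", round(statistics.stdev(scores), 2))
print("min/max:", min(scores), max(scores))
```

The large gap between the mean (22.8) and the median (14.5) flags the outlier before any complex model ever sees the data.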
9. Trim Your Data Prior To Analysis, Making It Easier To Focus.
You can either manually delete your unneeded variables (after saving your dataset as a separate copy; see #8) or use the "Define Variable Sets" function.
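Outside SPSS, the same idea of keeping only the variables you need can be sketched in plain Python; the column names and values below are invented for the example:

```python
import csv
import io

# Hypothetical raw data with more variables than the analysis needs
raw = io.StringIO(
    "id,age,score,notes,staff_initials\n"
    "1,34,88,ok,AB\n"
    "2,29,91,recheck,CD\n"
)

keep = ["id", "age", "score"]  # the analysis variables only

reader = csv.DictReader(raw)
trimmed = [{k: row[k] for k in keep} for row in reader]
print(trimmed)
```

Everything downstream now sees only the three analysis variables, so there is less to scroll past and less to select by mistake.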
8. Never Perform Analysis On The Master Copy Of Your Data.
In general, there is nothing to be afraid of when running analyses, as it is very hard to actually "mess up" your data in the process. With that said, however, NEVER use your master copy.
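The working-copy habit can be enforced in a couple of lines. This Python sketch uses invented file names and a temporary folder purely for illustration:

```python
import shutil
import tempfile
from pathlib import Path

# Hypothetical master data file, created in a temp folder for this demo
folder = Path(tempfile.mkdtemp())
master = folder / "study_master.sav"
master.write_bytes(b"demo contents")

# Always copy first, then point every analysis at the working copy
working = folder / "study_working.sav"
shutil.copyfile(master, working)

print(working.read_bytes() == master.read_bytes())  # the copy matches
```

If an analysis step ever mangles the working file, the master is untouched and the copy can simply be remade.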
7. Base Your Hypothesis In Theory, Not On A Hunch (Or On The Data).
There is nothing worse than trying to explain a statistical anomaly that isn't supported in the literature and may simply be due to random error.
6. Accept That You May Not Find "Significance".
Accept that you may not find "significance" and devote some time, ahead of time, to thinking about what that might mean. Sometimes the most interesting stories come from something that didn't happen, or a finding that didn't pan out!
5. Check Assumptions BEFORE You Analyze Your Data.
Although this is a pain in the rear end, it can save you a HUGE amount of time during analysis, because violations of assumptions can cause strange outcomes in the data that can then lead you to try to explain a strange finding that may not even be valid.
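One concrete example: many common tests assume roughly normal data, and a quick skewness check can flag trouble before you run anything. This is a hand-rolled Python sketch of the standard adjusted sample-skewness formula with invented data, not SPSS's own procedure:

```python
import statistics

def sample_skewness(xs):
    """Adjusted Fisher-Pearson sample skewness."""
    n = len(xs)
    mean = statistics.mean(xs)
    s = statistics.stdev(xs)  # sample standard deviation (n - 1)
    g1 = sum((x - mean) ** 3 for x in xs) / n / s ** 3
    return g1 * (n * (n - 1)) ** 0.5 / (n - 2)

symmetric = [4, 5, 5, 6, 6, 6, 7, 7, 8]   # balanced around the mean
skewed = [1, 1, 2, 2, 3, 3, 4, 9, 30]     # long right tail

print(round(sample_skewness(symmetric), 2))  # near zero
print(round(sample_skewness(skewed), 2))     # strongly positive
```

A skewness far from zero is exactly the kind of red flag worth catching before a t-test or ANOVA, rather than puzzling over an odd result afterward.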
4. Carefully Select Your Analysis.
Look it up, read blogs about it, ask your professor, or call your friendly stats consultant. Whatever you need to do, be sure you are picking the appropriate analysis to answer your research questions, as it will save you a lot of headaches later.
3. Try To Remember That There Is NO SUCH THING AS "BAD RESULTS".
At the risk of sounding preachy, just let the stats tell your data's story. While this is easier said than done, it will save you a lot of work trying to "rationalize" a finding later or trying to make a result "fit" with your preconceived notion of what it should be.
2. Use Syntax To Automate Repetitive Analyses.
This can save you TONS of time and decrease the likelihood of analysis errors, compared to running the analysis over and over again manually.
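In SPSS that means writing the syntax once and rerunning it; the same automation pattern in Python, with invented variable names and values, looks like this:

```python
import statistics

# Hypothetical dataset: several variables measured on the same cases
data = {
    "pretest":  [55, 61, 58, 64, 70],
    "posttest": [62, 66, 64, 71, 78],
    "followup": [60, 65, 63, 69, 75],
}

# One loop replaces running the same menu-driven analysis per variable
for name, values in data.items():
    print(f"{name:8s} mean={statistics.mean(values):6.2f} "
          f"sd={statistics.stdev(values):5.2f}")
```

Because the loop applies the identical computation to every variable, there is no chance of forgetting an option on the third manual run.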
1. Form Clear, Specific, And Concise Hypotheses BEFORE Analysis.
It is much easier to test a theory if you know exactly what you expect to see happen (or not happen). This also helps to prevent so-called "data fishing expeditions", which carry with them a whole set of their own problems and complications.