# Kickstarter Stats 101: Does the Month I Launch Matter?

Many people have asked the question: “Are certain months of the year bad months to launch or end a Kickstarter project, and other months good?”

Well, let me start by saying… no one can actually answer this question definitively, because it’s far too subjective a question. So humor me for a minute and let me ask a question that the data may actually be able to answer.

Do the average success rates of projects on Kickstarter change significantly based upon the month they are launched or the month they end?

**The answer is a definitive YES…**

Before we continue I should say that this data did influence my thinking when I created my current Kickstarter project, so hopefully it can add some value to yours as well.

### A Visual Summary of Monthly Success Rates

Below is a graph that overlays the success rates of projects that launch (blue line) and end (orange line) in a given month. The trend is pretty clear to see.

### Kickstarter Success Rates Based Upon Month Launched

Success rates (on average) are significantly lower if you launch your campaign during the months of July or August and significantly higher if you launch your project during the months of January, February or April.

You also see a small drop in the success rates of projects launched during November and December, but it is not significant enough to say definitively that this drop matters. Further analysis needs to be done to determine whether these small drops affect all projects launched during November and December, or just those that incorporate Thanksgiving and Christmas.

### Kickstarter Success Rates Based Upon Month Ended

Success rates (on average) are significantly lower if you end your campaign during the months of August, September or December and significantly higher if you end your project during the months of February, March or May. Again, further analysis is needed to determine how the holiday seasons affect success rates.

The variation across all other months is not strong enough to conclude a significant difference exists.

You may have a lot of questions at this point. If you really want to understand the data, the process I took to arrive at these results, the statistical analysis I used and the thinking behind that analysis, you should read my series of blog posts called **Kickstarter Statistics 101 – A Rough Introduction to Stats via Kickstarter** on the **Kickstarter Statistics 101** landing page. If you just want to know what the data says, keep reading.

### Determining Statistical Significance

All the data has been compiled and laid out in an easy-to-understand fashion (you can read a blog about how the data was prepared for analysis here). The line graph and bar graphs above show the average success rates for each month. From this we can see that these success rates differ from month to month, but what we really want to know is whether they differ “significantly”.

### Performing the Statistical Analysis – ANOVA

To find out whether these differences are significant, I ran a simple ANOVA test, which stands for **AN**alysis **O**f **VA**riance. The ANOVA determines how much variation exists within each group of data, and then compares that to the amount of variation that exists across all the groups analyzed. I will post a blog later to describe this in more detail, but for now a Google or YouTube search should suffice to provide some basic understanding.

The ANOVA test returns many pieces of information, three of which are important to our conversation here: the F-critical, the F-value and the P-value. Without getting drowned in detail, here is what each of these means.

### The F-critical, The F-value and the P-value

F-critical – the F-critical is a threshold. If the F-value calculated from the ANOVA test is greater than the F-critical, then we have reason to believe that our results are significant. If our F-value does not exceed this threshold, then we cannot be sure that our results are significant. The P-value, on the other hand, gives us an idea of just how significant our result is. In other words, it tells us how often we could get the same result by pure chance.

So we want our P-value to be as low as possible (at most 0.05, but the lower the better) and we want our F-value to surpass the F-critical by as much as possible!
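As a rough illustration of what this test looks like in practice, here is a minimal one-way ANOVA sketch using SciPy. This is not my original workflow, and the monthly success rates below are made-up numbers, not the real Kickstarter data:

```python
# A minimal one-way ANOVA sketch using SciPy. The samples are hypothetical
# monthly success rates (e.g. one value per year, per month).
from scipy import stats

# Made-up success rates (as fractions) for three months across five years.
january = [0.52, 0.55, 0.50, 0.58, 0.54]
july = [0.38, 0.35, 0.41, 0.36, 0.39]
october = [0.45, 0.47, 0.44, 0.48, 0.46]

f_value, p_value = stats.f_oneway(january, july, october)

# The critical F comes from the F distribution with (k - 1, N - k) degrees
# of freedom, where k is the number of groups and N the total sample size.
k, n = 3, 15
f_critical = stats.f.ppf(0.95, dfn=k - 1, dfd=n - k)

print(f"F = {f_value:.2f}, F-critical = {f_critical:.2f}, p = {p_value:.2e}")
# If F > F-critical (equivalently, p < 0.05), the monthly means differ
# significantly.
```

With samples this far apart, F easily exceeds F-critical and the P-value comes out tiny, which is the same shape of result reported below for the real data.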

### The Initial Kickstarter Stats Results

When I performed the ANOVA on the success rates of the launch data I found an F-critical of 0.953, an F-value of 13.6 and a P-value of 2.74E-11 (a very tiny number!). Simply put, the differences between the success rates of each month are HIGHLY SIGNIFICANT.

For the end dates we had an F-critical of 1.99, an F-value of 12.56 and a P-value of 1.04E-10 (another very tiny number). Again, the differences between the success rates of each month are HIGHLY SIGNIFICANT.

But now the question remains: are certain months more significant than others? We can answer this by determining which months in particular are causing such extreme results.

### Digging into the Kickstarter Statistics a Bit Further

What I decided to do next was to remove each month one by one and re-run the ANOVA test. Once the ANOVA test found that the differences in success rates were no longer significant, we would have a pretty good idea of which months really diverged from the norm.

For the launch dates, those months were January, February, April, July and August: July and August afforded lower success rates, while January, February and April afforded higher success rates. For the end dates, those months were February, March, May, August, September and December: August, September and December afforded lower success rates, while February, March and May afforded higher success rates.

The ANOVA test only showed that no significant difference existed for the launch dates and end dates once all of the above-mentioned months were removed from the analysis. Statistically, this means we can assume there’s really not much of a difference between the other months (at least not one that couldn’t have occurred by chance 5% of the time – which in my opinion is still quite an extreme threshold for this case), even though their average success rates do still differ by a little bit.
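The remove-one-month-at-a-time procedure can be sketched roughly like this. The numbers are again invented for illustration (only four months, three years each), not the actual dataset:

```python
# A sketch of the "remove one month at a time" idea: re-run the ANOVA with
# each month excluded and watch how the p-value changes. All values here
# are made up for illustration.
from scipy import stats

monthly_rates = {
    "Jan": [0.55, 0.58, 0.54],  # hypothetical yearly success rates
    "Apr": [0.56, 0.53, 0.57],
    "Jul": [0.36, 0.39, 0.35],
    "Oct": [0.46, 0.45, 0.47],
}

p_without = {}
for excluded in monthly_rates:
    groups = [rates for month, rates in monthly_rates.items() if month != excluded]
    f_value, p_value = stats.f_oneway(*groups)
    p_without[excluded] = p_value
    print(f"without {excluded}: F = {f_value:.1f}, p = {p_value:.5f}")
# The months whose removal raises the p-value the most are the ones pulling
# the overall result toward significance.
```

In this toy data, dropping the outlying low month (July) weakens the result far more than dropping a middling month, which is the same logic used above to single out the diverging months.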

### My Personal Interpretation

Please allow me to blabber in pure ignorance about my opinions. Nothing I say here should be construed as truth or fact, but rather as a good reason for you to leave a comment explaining how big of a dummy I am.

If you look at the months that produce higher and lower success rates, I think it’s apparent that they overlap quite closely with the U.S. academic calendar as well as the traditional popular U.S. holidays. I think the summers afford many people the time to think up and prepare for an increasing number of Kickstarter launches (students and teachers probably make up a significant proportion of the Kickstarter community – demographic stats needed here). But once the school year starts back up, that number begins to taper off again.

The bar graph below shows the total number of tabletop game projects launched each month since the beginning of Kickstarter, with the success rates of each of those months overlaid as an orange line graph. I have not done any testing to identify the significance of this claim, but there does appear to be an inverse correlation between the number of projects launched in a given month and that month’s success rates.

If there is an inverse correlation between the number of projects launched in a given month and that month’s success rates, it would suggest that projects of a similar type (tabletop games, in my example) are fighting for pieces of the same pie. In other words, the pool of potential backers is limited and projects will likely be fighting over it. Again, this is just my assumption, based upon the graph above.
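One quick way to put a number on that suspected inverse relationship is a Pearson correlation between monthly launch counts and monthly success rates. The twelve values below are illustrative placeholders, not the actual tabletop-game figures:

```python
# Testing the suspected inverse correlation between the number of projects
# launched in a month and that month's success rate. Values are illustrative.
from scipy import stats

launches = [120, 115, 130, 125, 140, 180, 210, 205, 160, 150, 135, 125]
success_rate = [0.55, 0.56, 0.52, 0.57, 0.50, 0.44,
                0.38, 0.39, 0.47, 0.48, 0.51, 0.49]

r, p = stats.pearsonr(launches, success_rate)
print(f"r = {r:.2f}, p = {p:.4f}")
# An r close to -1 with a small p would support the "limited pool of
# backers" hypothesis; correlation alone still wouldn't prove causation.
```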

Here’s something else quite interesting. Since most projects run roughly one month, or roughly 30 days (a valid assumption according to Kickstarter), we can assume that a project launched in one month (say, February) will end the next month (say, March). If I overlay the success rates of projects launched in a given month with the success rates of projects that ended the following month, the graphs seem to match up quite precisely.

The small bars show a 5% error in the estimates. It’s clear that for most months, the success rates of both projects launched that month and projects ending the following month fall within a 5% error of each other. I’m not sure what to conclude about this, except that they are largely measuring the same projects – with the exception of December and January. Any ideas why this is the case?
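The one-month shift described above can be sketched in a few lines. The twelve launch-month and end-month rates here are made up, with the December/January pair deliberately set to diverge the way the real graph does:

```python
# Sketch of the one-month "shift left": since a ~30-day campaign launched in
# month m ends in month m + 1, the end-month series shifted back by one
# month should roughly track the launch-month series. Values are invented.
launch_rates = [0.56, 0.57, 0.52, 0.56, 0.51, 0.46,
                0.40, 0.41, 0.47, 0.49, 0.48, 0.45]  # Jan..Dec launches
end_rates = [0.38, 0.57, 0.56, 0.53, 0.55, 0.50,
             0.45, 0.39, 0.42, 0.48, 0.49, 0.47]     # Jan..Dec endings

# Shift end rates left by one month: index i now holds month i+1's end rate
# (December wraps around to the following January).
shifted_end = end_rates[1:] + end_rates[:1]

within_5pct = [abs(a - b) <= 0.05 for a, b in zip(launch_rates, shifted_end)]
print(within_5pct)  # True where the two series agree within 5 points
```

In this toy data, every month lines up within the 5% band except the December-launch/January-end pair, mirroring the divergence noted above.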

Boy I wish I had more time to really dig deeper into the analysis.

### Nerdy Statistics Information

If you want to know more about how I got the data I am using and hear about some of the complications I had in getting it and using it, the whole data collection process is described in a previous blog post here.

If you want to know more about how I prepared the data for analysis, to try and minimize erroneous results and other complications, you can read a blog about how the data was prepared for the analysis here.

### Testing for a Normal Distribution

Before we perform any statistical test, we will want to know whether the data is normally distributed. This is very important. I go into more detail about what exactly this means in an earlier blog post here. But for the sake of completeness:

Normally distributed simply means that results derived from the data will be driven predominantly by the average of the data (what’s “normal” about the data), and not by values which are abnormal to the data and could, all by themselves, skew those results.

In case you’re curious, here are my histograms and P-P plot.
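Alongside histograms and P-P plots, a formal normality check such as the Shapiro-Wilk test can be run in a couple of lines. This wasn't part of my original write-up, and the sample below is simulated rather than the real success-rate data:

```python
# A lightweight normality check to complement histograms and P-P plots:
# the Shapiro-Wilk test. The sample here is simulated, not the real data.
import random

from scipy import stats

random.seed(42)
sample = [random.gauss(0.45, 0.05) for _ in range(60)]  # fake monthly rates

w_stat, p_value = stats.shapiro(sample)
print(f"W = {w_stat:.3f}, p = {p_value:.3f}")
# A p-value above 0.05 would give us no evidence against normality,
# which is what ANOVA's assumptions call for.
```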

### Please Leave a Comment!

Please comment if you learned anything, if you want to correct anything, or if you have any questions!

This is a great post, John! I linked to it in my old “Timing and Length” Kickstarter Lesson.

Hey Jamey, thanks for the kind words and thanks for linking it to your blog post! I hope people can find the information very useful.

Thank you John.

The analysis is very informative.

Now I subscribed & I am waiting for more 🙂

Thanks Konrad. Hopefully others find these helpful as well and hopefully I will have more posted soon!

A ton of data out there says to stay away from Dec / Jan / Feb. Your article shows that Jan / Feb is actually OK. Makes me feel a little better since I want to do a Dec 2014 / Jan 2015 campaign.

Hey Helen. Well I think this data still shows that launching in December is pretty bad, but a January or February launch is not too bad. But also remember, the data is just that – data. It’s really hard to extract “advice” from the data but hopefully it at least adds to the conversation in a constructive way!

Hi,

Your data shows that August is the worst month. But some other statistics I saw show it vice versa – how can that be?

I don’t think you can compare stats here with other stats you’ve read about Kickstarter – unless those other stats are also only looking at board games. It’s nearly impossible to say for sure but it would be quite easy to suppose that

a) Board games in the summer months were less funded because pledges were spread across the large number of projects launched in those months, and

b) People play a lot of board games in the winter months (we certainly do!). A lot of people will be out on summer vacation and not in playing games, if games are less on people’s minds they’re maybe less likely to back them.

Other Kickstarter stats we see often don’t make market-specific observations and instead look at Kickstarter projects generally.

This is a fantastic analysis and very helpful, thanks for putting this together! I was actually looking for this exact data when someone from the Tabletop Game Kickstarter Advice group on Facebook posted this link.

Thanks Mike, I am glad you found it useful! I went back and saw that thread on the Tabletop Game Kickstarter Advice group on Facebook, and I think that is a very interesting question James Mathe brought up. I launched a campaign in April of this year (2014) and it was highly successful with no pre-campaign buzz. I launched my most recent one on October 15th of this year (2014) and we struggled even with bringing a few hundred backers from our previous campaign. Something is happening, but I’m not sure what.

Good stuff, is there any data on what works best? A video, a special seasonal offer – e.g. a Valentine’s Day gift, etc.?

This is really hard to identify with data. I think experience and research is the best indicator here and that’s where I would turn to http://stonemaiergames.com/kickstarter/

Another theory about why July/August sees lots of failure: because that is when bored college students on summer break get high and pitch stupid kickstarter projects.

Well Lester, I guess we do have to consider all our options here.

John, really fascinating data. Interestingly, I did similar analysis a couple of months ago but looked at number of funded projects (rather than success rate) and made some very different observations from yours. I think your hypothesis about an inverse correlation between number of projects started and success rate may have something to do with it.

I do the Dice Tower News Kickstarter report every Friday, which means I review all the tabletop boardgame projects that are approaching their funding dates every week. I ignore projects that are obviously not going to fund, and I think these projects might be influencing the “success rate” data that you calculate. Last December, I noticed a drop in the number of projects that I considered likely to fund from one week to the next, and that’s what prompted me to do my own statistical analysis. It was considerably less formal than yours, and as I said, it was based on the number of funded projects by the month in which the campaign ended (i.e. the funding date). Also, I only collected data in 2013 and 2014 because I didn’t have a bot to collect the data automatically; I used an entirely manual process. My observations were that:

(1) every month in 2014 had more successful projects than the corresponding month in 2013 (i.e. Kickstarter is growing year over year), and

(2) December, January, and February were low-funding months, while May and November were higher-funding. (I didn’t do ANOVA analysis on my data, but perhaps I’ll go back and give that a try.)

So the trends I observed appear very different from yours, apparently because I’m looking at number of successes, not success rate. By the way, I tried looking at total funding dollars (on the notion that perhaps there is a seasonal “money supply” chasing projects), but I got wildly variable results and could reach no conclusion other than that popular projects bring out dollars that wouldn’t otherwise fund any project.

In your “Success Rates of Projects Launched with Month Ended Shifted Left,” you observe a divergence between projects launched in December and completed in January. Did you verify that your “population of Januaries” comes from the years following your “population of Decembers”? (For example, if your project-launch Decembers are 2009-2014, are your project-ended Januaries 2010-2015?)

And one last question: I can’t figure out what you are testing for normality in the last section of your article. What do the x-axes represent in your two histograms?

Hi pdowen, I found this really useful, thanks!

Fantastic Post!! Thanks John! Great Data!

Hey thank you so much for this information. It really helps me more with the launch date for my Kickstarter campaign.

Thanks John for the useful post! We are in the process of launching our Kickstarter campaign and your post gave us great help. Thanks for sharing all this information.

Very Good stuff John, thanks!

I really appreciate this hard data and analysis. At the same time, though, I recognize that correlation doesn’t prove causation, so I still wonder to what extent launch months actually affect overall success.

For what it’s worth, I understand that summer brings both more projects and more traffic, which means that some projects may get more easily buried among intense competition while other projects that rise above that competition to prominence may enjoy increased total pledges.

I can also say from experience that December brings less public interest in crowdfunding projects, along with intense competition for advertising on social media, so I’d highly recommend against any campaigning between about Thanksgiving Day and New Year’s Day.

Hello John,

Thank you so much for your work. I am really happy to read all of your statistic reports.

Do you have any reports regarding location? I’m from Barcelona, and the campaigns that are similar to mine are mostly from the USA, so I cannot figure out if it would be a bad idea to publish from Spain. I could also publish from the USA with my friend’s help. If you have data based on geography I would be so happy to see it.

Thank you.

Thanks so much for this post! It helped my husband and me a ton! We just launched our Kickstarter campaign for a card game we created! Whoop Whoop!

Thanks again!

(and go check out “OH NUTS!” Card game on Kickstarter!) 😉