As developers, we want to know how popular Toil is, who is using it, and how much they are using it.
Having these statistics would let us show what the dominant use cases are (are people still using old Python 2 Toil and never updating? Is everyone using AWS? Is everyone using Slurm?), which could serve as a health indicator and let us predict what will matter to users. Or, if nobody is actually using Toil and we’re wasting our time writing it, we would want to know that: we might have bad product-market fit and not know it. Right now we only get feedback from people frustrated enough to contact us with a bug or feature request.
If we had statistics showing that a lot of people are running Toil workflows, we could point to them when trying to convince funders to fund us.
We could use statistics to guide development towards popular features and away from unpopular ones. We could also build A/B testing machinery, e.g. to find the best way to present command-line options, although that would cause trouble with documentation and with consistency across scripts.
Having statistics on non-Dockstore Toil runs could let us see what fraction of people are actually using Dockstore to host workflows.
We have met a “surprising” number of Toil users at conferences, and some of them say they never change their Toil version and it never breaks.
The main problem to solve here is determining whether users are actually willing to send us statistics: what workflows they are running, what features they are using, and what errors they are encountering. A rough sketch of what an opt-in mechanism could look like is shown below.
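As a minimal sketch of the opt-in question, here is what a consent-gated usage report might look like. None of these names (the `TOIL_TELEMETRY` environment variable, `send_usage_report`, the endpoint) exist in Toil today; they are hypothetical and only illustrate the idea that statistics are sent only when the user has explicitly opted in and never interfere with a workflow run.

```python
# Hypothetical sketch only: these names and this mechanism are not part of Toil.
import json
import os
import platform
import urllib.request


def telemetry_opted_in() -> bool:
    """Only report statistics if the user has explicitly opted in."""
    return os.environ.get("TOIL_TELEMETRY", "").lower() in ("1", "true", "yes")


def send_usage_report(endpoint: str, toil_version: str, batch_system: str) -> None:
    """Send a small, anonymous usage record; silently do nothing on failure."""
    if not telemetry_opted_in():
        return
    payload = json.dumps({
        "toil_version": toil_version,
        "batch_system": batch_system,  # e.g. "slurm" or "aws_batch"
        "python_version": platform.python_version(),
    }).encode("utf-8")
    try:
        request = urllib.request.Request(
            endpoint,
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(request, timeout=2)
    except Exception:
        # Statistics collection must never break or slow down a workflow run.
        pass
```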
┆Issue is synchronized with this Jira Story
┆Issue Number: TOIL-1671