American/Israeli entrepreneur, research scientist, and coach. Founder of Signal Patterns (acquired 2011), Chief Scientist at bLife Inc. Passionate about using methods of coaching and psychology to bring clarity and facilitate transformation and success for companies and individuals.
When thinking of a new venture (either a startup or a new initiative within a larger company), I would always suggest focusing on a single offering first, with the following caveats:
1. In my view, the purpose of the first offering is to test the business hypothesis. That hypothesis covers not only the offering itself but also the target audience, the need, the specifics of the offering as the solution, and its differentiation. Be laser-focused on all of these.
2. The only way to test that complete hypothesis is to develop, in full, an offering for a well-defined target audience.
3. Once it is out and you measure adoption and success, it's critical to assess which of the assumptions have proven valid - it could be that the need is not strong for this audience, or that there's not enough differentiation; the offering itself is only part of the hypothesis. You can only test this well if you have developed a good offering for a well-defined audience solving a well-defined problem.
4. Then - iterate. You may need to change your offering, but you may also discover that your audience has a stronger need elsewhere (they all email you asking for it). The key here is to drop everything and be flexible - now is the time to lose the former focus, and you may need to make dramatic changes.
In my experience, companies that went through this cycle (of building a deep offering and then iterating based on the data it provides) have been the most successful.
Several potential disadvantages:
1. A commitment to use the corporate parent's resources (even if they are inadequate)
2. Lack of flexibility to deviate from the corporate parent's plans and roadmaps when the market changes
3. Difficulty in bringing other types of investors in down the road.
Conflicts are a natural part of forming and solidifying any team, and definitely so in a startup environment.
I have experienced a number of conflicts, both as a CEO and as an employee. In all cases the conflict was a great opportunity, because it was very telling about the team, the business's direction, and the way it was managed.
The important thing is to identify and distill the insights and make decisions quickly so that things do not continue to brew or escalate.
Glad to share more of my experience and insights on a call.
My experience with the different aspects of confidentiality ranges from military/government, through corporate R&D (at IBM Research) and as an executive in the private sector (as CEO and founder of a tech company).
Of course, if the disclosure of any details could harm anyone outside yourself or your organization (as is the case in government applications), then the information should not be disclosed.
If the information is an idea, my experience has been that people are too worried about sharing ideas. I've had many situations where I disclosed the details of my idea in full, to people who had the resources to compete against me with it if they wanted, and no one ever did. To make an idea a reality you need a thorough mental image of what the final product will look like, and chances are only you have that image at the necessary level of detail.
Regarding specific product information (when it is not publicly available) or quantitative business data (like forecasts): I have always been very careful with those, and only disclosed them after signing an NDA. I also viewed the disclosure as a point of no return in the relationship, and assessed how much I personally trusted the person I was sharing the information with.
I'd be glad to chat further and provide more input based on the specifics of your debate.
There are three major challenges with the scenario you describe:
1) The data is usually not in a format that is ready to be chewed upon (less critical)
2) The data-science tasks are usually not well-defined by the people who need them (very critical)
3) The process tends to be iterative and not one-shot.
The only successful situations I'm aware of that are close to the one you describe are competitions/benchmarks where the task is crystal clear and the data is ready-made (like the Netflix Prize or many others run in the research community and by the government).
In these competitions, issues (1) and (2) are addressed, and you may then hire a person so that they can iterate on the solution and continue the work (issue 3).
In certain domains and projects there is less iteration needed, so if the problem to be solved is well-defined and the data is well-prepared, it can be done successfully. I've been in several situations where I "ordered" a data-driven algorithm and plugged it into a live system.