Hello Paradox, I'll take a stab at your points...
Paradox wrote: Well, not being able to initially clone without an account is still a bit of a put-off... why not mirror on a host like bitbucket?
I hope it's not too much of one. As JW has mentioned elsewhere, we just didn't know what the demand would be. It's not often that a full-featured game engine transitions from private to open source. We tried hard to make things work, the first time, for everyone. And, as part of the choice to use Mercurial, we expected that after the first few clones were retrieved, they would themselves be used to offer clones, rather than having everyone depend on the OpenUru repository.

There are also some topology considerations that make it desirable to keep a repository resident on the Foundry system. The Foundry was conceived to be much more than just a repository host (or even the bug-tracking, source-browsing, and source-review site). I'm hoping to have the Foundry make it easier for people to contribute to the open Uru project even if they're uninterested or unwilling to become skilled in the entire universe of tools and processes needed to create an instance of Uru. I'm doing this by providing automated build facilities for several platforms, from various Windows flavors to *nix, and maybe even MacOS. Building isn't confined to compiling code; it could also cover batch model processing and/or rendering, automated testing, documentation generation such as Doxygen, assembling an entire shard into an installer, etc.

All of this starts from a repository (which is more efficient locally) and ends up with publicly downloadable build artifacts, browsable documents, and so on. If the community has the opportunity to obtain a license to an expensive tool that would be helpful, I'd be happy to put it in an automated flow that anyone can use. It's a community resource.
The whole point of automation is making a process reproducible. Binaries are always built against the same libraries, with the same compilers. The same version of Blender is used to render something. And so on.
Paradox wrote: On that note, there are a bunch of changes to make it compile in Visual Studio 2010... what's the process to get those merged? Without a forking/pull request system, I don't see how contributions can be dealt with in a sane manner (and I don't consider massive patches attached to a bug tracker as "sane").
Branan has already started a dialog on the OU forums about possible processes, and it both (1) meshes with some of our thoughts on a hierarchy of work/reviewed/qualified stages, and (2) uses Bitbucket for its simplicity. But I do want to see if we can address some possible constraints on changes. I would assume there is interest in keeping some changes compatible with the current Cyan installation, so that Cyan can just drop fixes in and the general population would see the improvements. But at the same time, as you point out, people are interested in moving to more modern tools, libraries, etc. So we're going to need some guidelines about where various types of changes are appropriate. I'm hoping to put out a reply to Branan later tonight and see what we can start to refine and propose.
On that topic, which do you see as more important: setting up something shared immediately, because contributions are already backlogged, and perhaps having to change the process flow in a little while; or spending some time (days, not weeks) trying to anticipate these various configurations and coming up with a fairly clear generalization? We could do either (and I don't think anything we do would really delay anyone's work right now).
There are so many items to cover, and it must be done with input from all interested parties. I'd be glad to learn from and integrate your thoughts about reviewing contributions, about when more or less process rigor is necessary, and yes, even about repository organization.
_R