Wikipedia talk:MediaWiki future directions
Is wiki a good P2P application?
(from the village pump)
With the hardware upgrade still fresh in everyone's mind, it seems that moving Wikipedia to some sort of P2P network configuration might be a good idea. That way volunteers could host some articles, and as Wikipedia's user base grows, so too does the processing power. Of course there would have to be many redundant copies of each article, but that would give lots of bandwidth and fault tolerance. Articles in different languages could actually be hosted in their country of origin, which would reduce bandwidth. It would also give P2P networks a legitimate use.
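To make the "many redundant copies of each article" idea a bit more concrete, here is a minimal sketch (in Python, with made-up peer names and an arbitrary replication factor) of how articles could be assigned to volunteer peers using rendezvous hashing, so every article deterministically lands on a few independent hosts. This is only an illustration of one possible placement scheme, not a proposal for how Wikipedia actually works.

# Sketch: assign each article to a few volunteer peers with redundancy.
# Peer names and the replication factor are hypothetical.
import hashlib

PEERS = ["peer-tokyo", "peer-berlin", "peer-boston", "peer-sydney"]  # hypothetical volunteers
REPLICAS = 3  # keep three redundant copies of every article

def score(peer: str, title: str) -> int:
    # Deterministic pseudo-random weight for the (peer, article) pair.
    return int.from_bytes(hashlib.sha256(f"{peer}|{title}".encode()).digest()[:8], "big")

def hosts_for(title: str) -> list[str]:
    # The REPLICAS highest-scoring peers host this article.
    return sorted(PEERS, key=lambda p: score(p, title), reverse=True)[:REPLICAS]

if __name__ == "__main__":
    for title in ["P2P", "MySQL", "Wikipedia"]:
        print(title, "->", hosts_for(title))

Any peer can recompute the same assignment locally, so no central lookup table is needed; when a peer joins or leaves, only the articles whose top-scoring peers change would have to move.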
- I have wondered whether a cluster-type solution would be a good idea. Maybe if we could move towards hosting different languages on separate machines, we could eventually end up where those machines didn't all have to be in the same room, or even country, and then we could have what you're talking about. Unfortunately I don't know enough about how MySQL works, or how Mediawiki drives it, to know how feasible this sort of thing might be. --Phil 17:02, Feb 12, 2004 (UTC)
- There's a whole page somewhere with loads of weird and wonderful proposed server architectures, involving various degrees of distributedness. Can't for the life of me remember how I got there though - it wasn't here, on meta, or even on OpenFacts, but I drifted there while stuff was down once. Ooh, here we go, I found it: http://www.aulinx.de/oss/code/wikipedia/ (linked to from m:Cache_strategy, which may also have relevant discussion). Enjoy :-/ IMSoP 18:45, 12 Feb 2004 (UTC)
- That was on a wikipedia server error message I believe. --Ed Senft! 00:24, 13 Feb 2004 (UTC)
- Personally, I would love to see wikipedia be more decentralized. Imagine you create a new article and it is simply stored on your local drive. As time goes on, it spreads to other computers (a toy sketch of this follows below). In other words, there is no special step like submitting articles. You may wonder what's nice about this? As wikipedia grows, I am quite confident that it cannot be sustained by the central scheme we have now. Edit conflicts become more common and thus more annoying. It would become normal for the same problem to be resolved in different places in different ways. That divergence may sound like a detriment to coherence, but it also makes articles more durable, because vandalism becomes more difficult and disputes can be reduced.
- Well, this sounds more like a P2P-based Internet idea. But this is what I think the future of wikipedia is. -- Taku 00:42, Feb 13, 2004 (UTC)
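A toy simulation of the "articles spread among other computers" idea above. The peer count and round structure are invented purely for illustration: each peer keeps its articles locally and periodically merges newer revisions from a random neighbour, so a new article eventually reaches everyone without any central submit step.

# Toy gossip simulation: peers hold articles locally and merge with random neighbours.
import random

def gossip_round(peers):
    # peers: list of dicts mapping title -> (revision_timestamp, text)
    for store in peers:
        other = random.choice(peers)
        for title, (ts, text) in other.items():
            if title not in store or store[title][0] < ts:
                store[title] = (ts, text)  # adopt the newer revision

if __name__ == "__main__":
    peers = [dict() for _ in range(8)]
    peers[0]["Some village in Japan"] = (1, "stub text")  # created on one local drive
    rounds = 0
    while any("Some village in Japan" not in p for p in peers):
        gossip_round(peers)
        rounds += 1
    print(f"article reached all 8 peers after {rounds} gossip rounds")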
- I see a big problem in the loss of coherence. If wikipedia is distributed, everyone can have their own true version of a disputed article -- but there would be no need to discuss and converge on one "peer-reviewed" article. That wikipedia is centralized (or at least that the edit function and every DB transaction following it has to be more or less centralized) seems to me the only possible topology, if you don't want to mess around with cached edit conflicts (X on distributed Wikipedia server A changes article #123, Y on distributed server B does the same, and only five days later, in some move upwards the net, the different versions of #123 meet and clash. And now?). Wikipedia has to be centralized (but can be mirrored) for the same reasons that the DNS hierarchy and Sourceforge are in a way centralized. -- till we *) 18:37, 13 Feb 2004 (UTC)
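To illustrate the clash described above: if every distributed server records which revision an edit was based on, the two copies of article #123 can at least be recognised as conflicting when they finally meet, although the software still cannot decide whose text should win. The names and structure below are purely illustrative.

# Sketch: detect whether two revisions of the same article genuinely conflict.
from dataclasses import dataclass

@dataclass
class Revision:
    rev_id: str
    parent_id: str | None  # revision this edit was based on
    text: str

def merge(local: Revision, remote: Revision) -> Revision | None:
    """Return the winning revision, or None if a human has to merge."""
    if remote.parent_id == local.rev_id:
        return remote          # remote is a plain successor of ours
    if local.parent_id == remote.rev_id:
        return local           # we already extended the remote version
    return None                # both edited the same parent: genuine conflict

if __name__ == "__main__":
    base = Revision("r1", None, "Article #123, original text")
    on_a = Revision("r2a", "r1", "Article #123, as changed by X on server A")
    on_b = Revision("r2b", "r1", "Article #123, as changed by Y on server B")
    print(merge(base, on_a).rev_id)   # r2a: a clean fast-forward
    print(merge(on_a, on_b))          # None: the five-days-later clash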
- Using PGP signatures and timestamps you could certainly have an officially sanctioned version of an article. Of course, that makes the software that much harder to write.... My feelings are that it's possible, and it'd be great, but it's a lot of work, and I don't think there's enough interest from dedicated programmers to get it done. Anthony DiPierro 15:51, 14 Feb 2004 (UTC)
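A rough sketch of what the signed "officially sanctioned version" might look like. This uses Ed25519 keys from the third-party Python cryptography package as a stand-in for PGP, and the function names are invented; it only shows the shape of the idea, not a real design.

# Sketch: the project signs a (title, timestamp, text) triple; any mirror can verify it.
import time
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

signing_key = Ed25519PrivateKey.generate()      # held by the central project
verify_key = signing_key.public_key()           # shipped to every mirror/peer

def sanction(title: str, text: str) -> tuple[bytes, float]:
    """Sign an article revision, returning (signature, timestamp)."""
    ts = time.time()
    message = f"{title}\n{ts}\n{text}".encode()
    return signing_key.sign(message), ts

def is_sanctioned(title: str, text: str, signature: bytes, ts: float) -> bool:
    message = f"{title}\n{ts}\n{text}".encode()
    try:
        verify_key.verify(signature, message)
        return True
    except InvalidSignature:
        return False

if __name__ == "__main__":
    sig, ts = sanction("P2P", "the blessed text")
    print(is_sanctioned("P2P", "the blessed text", sig, ts))   # True
    print(is_sanctioned("P2P", "a tampered copy", sig, ts))    # False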
- I guess my point is that the wikipedia database can be distributed like Usenet. For example, I doubt every single user is interested in an article about some village in Japan. As wikipedia articles resemble web pages on the Internet, some sort of divergence is inevitable. Also, why is it important to have one version? I don't mean more than one "official" version. Imagine some article is created by a certain group of people, even if they use offline means. And in some places we debate which version is best and should be included. I think having more versions can be beneficial when so many people are engaged in editing. My point is that having one authorized true version is good, but it doesn't necessarily imply that development should occur on one version. I see Wikipedia 1.0 as evidence of this path.
- Of course, I am talking about the era when the English wikipedia hits one or ten billion articles. Just brainstorming for fun.
-- Taku 00:00, Feb 14, 2004 (UTC)
It seems there are several issues here. The first is just serving existing articles that have stabilized and don't see many changes. The second is consistency of articles being edited. Does someone know what the ratio is? I would guess many articles are fairly stable by now. If so, these could be distributed with little impact. As a first cut, the articles still being edited could remain in a central repository. Editing articles that are distributed sounds a lot like distributed databases, replication, and two-phase commits to me, but I'm no expert.
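For the last sentence above, a bare-bones two-phase commit sketch: a coordinator asks every replica of an article to prepare an edit and commits only if all of them vote yes. Real distributed databases add write-ahead logging, timeouts, and coordinator recovery; none of that is modelled here, and the class and function names are invented.

# Sketch: apply an edit to all replicas of an article atomically, or not at all.
class Replica:
    def __init__(self, name: str, text: str):
        self.name, self.text, self.pending = name, text, None

    def prepare(self, base: str, new_text: str) -> bool:
        # Vote yes only if our copy still matches the revision the edit is based on.
        if self.text == base:
            self.pending = new_text
            return True
        return False

    def commit(self):
        self.text, self.pending = self.pending, None

    def abort(self):
        self.pending = None

def two_phase_commit(replicas, base: str, new_text: str) -> bool:
    if all(r.prepare(base, new_text) for r in replicas):   # phase 1: prepare
        for r in replicas:
            r.commit()                                      # phase 2: commit
        return True
    for r in replicas:
        r.abort()
    return False

if __name__ == "__main__":
    copies = [Replica(n, "old text") for n in ("A", "B", "C")]
    print(two_phase_commit(copies, "old text", "new text"))    # True: all replicas agree
    copies[1].text = "locally diverged"                        # one stale/diverged copy
    print(two_phase_commit(copies, "new text", "even newer"))  # False: edit aborted everywhere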
Shouldn't the page be on meta? --Michael Snow 22:50, 15 Apr 2004 (UTC)