Mailing List Archive: Wikipedia: Foundation

How much of Wikipedia is vandalized? 0.4% of Articles

 

 



rarohde at gmail

Aug 20, 2009, 3:06 AM

Post #1 of 37 (4192 views)
Permalink
How much of Wikipedia is vandalized? 0.4% of Articles

I am supposed to be taking a wiki-vacation to finish my PhD thesis and
find a job for next year. However, this afternoon I decided to take a
break and consider an interesting question recently suggested to me by
someone else:

When one downloads a dump file, what percentage of the pages are
actually in a vandalized state?

This is equivalent to asking, if one chooses a random page from
Wikipedia right now, what is the probability of receiving a vandalized
revision?

Understanding what fraction of Wikipedia is vandalized at any given
instant is obviously of both practical and public relations interest.
In addition it bears on the motivation for certain development
projects like flagged revisions. So, I decided to generate a rough
estimate.

For the purposes of making an estimate I used the main namespace of
the English Wikipedia and adopted the following operational
approximations: I considered that "vandalism" is that thing which
gets reverted, and that "reverts" are those edits tagged with "revert,
rv, undo, undid, etc." in the edit summary line. Obviously, not all
vandalism is cleanly reverted, and not all reverts are cleanly tagged.
In addition, some things flagged as reverts aren't really addressing
what we would conventionally consider to be vandalism. Such caveats
notwithstanding, I have had some reasonable success with using a
revert heuristic in the past. With the right keywords one can easily
catch the standardized comments created by admin rollback, the undo
function, the revert bots, various editing tools, and commonly used
phrases like "rv", "rvv", etc. It won't be perfect, but it is a quick
way of getting an automated estimate. I would usually expect the
answer I get in this way to be correct within an order of magnitude,
and perhaps within a factor of a few, though it is still just a crude
estimate.
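
To make the heuristic concrete, here is a minimal sketch in Python of
that kind of edit-summary matcher (the keyword pattern and names are
illustrative guesses, not the actual code used for this analysis):

import re

# Illustrative pattern for revert-tagged edit summaries; a real list
# would be tuned against the standardized rollback/undo comments.
REVERT_RE = re.compile(
    r"\b(revert(ed|s)?|rv[tv]?|undo|undid|rolled back|rollback)\b",
    re.IGNORECASE)

def is_tagged_revert(edit_summary):
    """True if the edit summary looks like a revert, e.g. "rvv" or
    "Undid revision 123456 by Example"."""
    return bool(REVERT_RE.search(edit_summary or ""))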

I analyzed the edit history up to the mid-June dump for a sample of
29,999 main namespace pages (sampling from everything in main
including redirects). This included 1,333,829 edits, from which I
identified 102,926 episodes of reverted "vandalism". As a further
approximation, I assumed that whenever a revert occurred, it applied
to the immediately preceding edit and any additional consecutive
changes by the same editor (this is how admin rollback operates, but
is not necessarily true of tools like undo).

With those assumptions, I then used the timestamps on my identified
intervals of vandalism to figure out how much time each page had spent
in a vandalized state. Over the entire history of Wikipedia, this
sample of pages was vandalized during 0.28% of its existence. Or,
more relevantly, focusing on just this year, vandalism was present
0.21% of the time, which suggests that 0.21% of mainspace pages in
any recent enwiki dump will be in a vandalized state (i.e. 1 in 480).

(Note that since redirects represent 55% of the main namespace and are
rarely vandalized, one could argue that 0.37% [1 in 270] would be a
better estimate for the portion of actual articles that are in a
vandalized condition at any given moment.)
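
As a sketch of the accounting involved (hypothetical structure and
names; the real analysis presumably handled page creation times and
overlapping intervals more carefully):

def vandalized_fraction(pages):
    """Fraction of total page-time spent in a vandalized state.
    `pages` is a list of (page_age_seconds, vandal_intervals) pairs,
    where vandal_intervals holds (start_ts, revert_ts) for each
    identified episode of reverted vandalism on that page."""
    vandal_time = sum(revert - start
                      for _, intervals in pages
                      for start, revert in intervals)
    total_time = sum(age for age, _ in pages)
    return vandal_time / total_time

# Example: a page 1000 hours old, vandalized for 2 hours of its life,
# gives 2/1000, i.e. 0.2% of its existence.
print(vandalized_fraction([(1000 * 3600, [(0, 2 * 3600)])]))  # 0.002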

I also took a look at the time distribution of vandalism. Not
surprisingly, it has a very long tail. The median time to revert over
the entire history is 6.7 minutes, but the mean time to revert is 18.2
hours, and my sample included one revert going back 45 months (though
such very long lags also imply the page had gone years without any
edits, suggesting an obscure topic that was almost never visited). In
the recent period these figures become 5.2 minutes and 14.4 hours for
the median and mean respectively. The
observation that nearly 50% of reverts are occurring in 5 minutes or
less is a testament to the efficient work of recent changes reviewers
and watchlists.

Unfortunately the 5% of vandalism that persists longer than 35 hours
is responsible for 90% of the actual vandalism a visitor is likely to
encounter at random. Hence, as one might guess, it is the vandalism
that slips through and persists the longest that has the largest
practical effect.

It is also worth noting that the prevalence figures for February-May
of this year are slightly lower than at any time since 2006. There is
also a drop in the mean duration of vandalism coupled with a slight
increase in the median duration. However, these effects mostly
disappear if we limit our considerations to only vandalism events
lasting one month or less. Hence those changes may in significant
part reflect cut-off bias from longer-term vandalism events that
have yet to be identified.
in the year is somewhat surprising as the AbuseFilter was launched in
March and was intended to decrease the burden of vandalism. One might
speculate that the simple vandalism amenable to the AbuseFilter was
already being addressed quickly in nearly all cases and hence its
impact on the persistence of vandalism may already have been fairly
limited.

I've posted some summary data on the wiki at:

http://en.wikipedia.org/wiki/Wikipedia:Vandalism_statistics

Given the nature of the approximations I made in doing this analysis I
suspect it is more likely that I have somewhat underestimated the
vandalism problem rather than overestimated it, but as I said in the
beginning I'd like to believe I am in the right ballpark. If that's
true, I personally think that having less than 0.5% of Wikipedia be
vandalized at any given instant is actually rather comforting. It's
not a perfect number, but it would suggest that nearly everyone still
gets to see Wikipedia as intended rather than in a vandalized state.
(Though to be fair I didn't try to figure out whether the vandalism
occurred in more frequently visited parts of the encyclopedia or not.)

Unfortunately, that's it for now as I need to get back to my thesis /
job search.

-Robert Rohde



susanpgardner at gmail

Aug 20, 2009, 8:40 AM

Post #2 of 37 (4103 views)
Permalink
Re: How much of Wikipedia is vandalized? 0.4% of Articles [In reply to]

Robert, thanks for this. I have long wanted that number: it is really interesting.

-----Original Message-----
From: Robert Rohde <rarohde [at] gmail>

Date: Thu, 20 Aug 2009 03:06:06
To: Wikimedia Foundation Mailing List<foundation-l [at] lists>; English Wikipedia<wikien-l [at] lists>
Cc: Sean Moss-Pultz<sean [at] openmoko>; <suh [at] parc>
Subject: [Foundation-l] How much of Wikipedia is vandalized? 0.4% of Articles


[snip]


gmaxwell at gmail

Aug 20, 2009, 9:34 AM

Post #3 of 37 (4102 views)
Permalink
Re: How much of Wikipedia is vandalized? 0.4% of Articles [In reply to]

On Thu, Aug 20, 2009 at 6:06 AM, Robert Rohde<rarohde [at] gmail> wrote:
[snip]
> When one downloads a dump file, what percentage of the pages are
> actually in a vandalized state?

Although you don't actually answer that question, you answer a
different question:

[snip]
> approximations:  I considered that "vandalism" is that thing which
> gets reverted, and that "reverts" are those edits tagged with "revert,
> rv, undo, undid, etc." in the edit summary line.  Obviously, not all
> vandalism is cleanly reverted, and not all reverts are cleanly tagged.


Which is interesting too, but part of the problem with calling this a
measure of vandalism is that it isn't really one, and we don't really
have a good handle on how solid an approximation it is beyond gut
feelings and arm-waving.

The study of Wikipedia activity is a new area of research, not
something that has been studied for decades. Not only do we not know
many things about Wikipedia, but we don't know many things about how
to know things about Wikipedia.


There must be ways to get a better understanding, but we may not know
of them, and the ones we do know of are not always used. For example,
we could increase our confidence in this type of proxy-measure by
taking a random subset of that data and having humans classify it
based on some agreed-on established criteria. By performing the review
process many times we could get a handle on the typical error of both
the proxy-metric and the meta-review.

The risk here is that people will mistake these shorthand metrics for
the real deal, and the risk is increased when we encourage it by
using language which suggests that the simplistic understanding is
the correct one. IMO, highly uncertain and/or outright wrong
information is worse than not knowing at all when you aren't aware of
how reliable the information is.

We can't control how the press chooses to report on research, but when
we actively encourage misunderstandings by playing up the significance
or generality of our research, our behaviour is unethical. Vigilance
is required.

This risk of misinformation is increased many-fold in comparative
analysis, where factors like time are plotted against indicators
because we often miss confounding variables
(http://en.wikipedia.org/wiki/Confounding).

Stepping away from your review for a moment, because it wasn't
primarily a comparative one, I'd like to make some general points:

For example, if research finds that edits are more frequently
reverted over time, is this because there has been a change in the
revert decision process, or have articles become better and more
complete over time, with edits to long and high-quality articles
always having been more likely to be reverted? Both are probably
true, but how does the contribution break down?

There are many other possibly significant confounding variables.
Probably many more than any of us have thought of yet.

I've always been of the school of thought that we do research to
produce understanding, not just to generate numbers. "Wikipedia
becomes more complete over time, leaving less work for new people to
do" and "Wikipedia is increasingly hostile towards new contributors"
are pretty different understandings, but both may be supported by the
same data, at least until you've controlled for many factors.

Another example— because of the scale of Wikipedia we must resort to
proxy-metrics. We can't directly measure vandalism, but we can measure
how often someone adds "is gay" over time. Proxy-metrics are powerful
tools but can be misleading. If we're trying to automatically
identify vandalism for a study (either to include it or exclude it) we
have the risk that the vandals are adapting to automatic
identification: If you were using "is gay" as a measure of vandalism
over time you might conclude that vandalism is decreasing when in
reality "cluebot" is performing the same kind of analysis for its
automatic vandalism suppression and the vandals have responded by
vandalizing in forms that can't be automatically identified, such as
by changing dates to incorrect values.

Or, keeping the goal of understanding in mind, sometimes the
measurements can all be right but a lack of care and consideration can
still cause people to draw the wrong conclusions. For example,
English Wikipedia has adopted a much stronger policy about citations
in articles about living people than it once had. It is
*intentionally* more difficult to contribute to those articles than
it once was, especially for new contributors who do not know the
rules.

Going back to your simple study now: The analysis of vandalism
duration and its impact on readers makes an assumption about
readership which we know to be invalid. You're assuming a uniform
distribution of readership: That readers are just as likely to read
any random article. But we know that the actual readership follows a
power-law (long-tail) distribution. Because of the failure to consider
traffic levels we can't draw conclusions on how much vandalism readers
are actually exposed to.

Interestingly— you've found a power-law distribution in vandalism
lifetime. Is it possible that readership and vandalism life are
correlated, that more widely read articles tend to get reverted
faster? That doesn't sound unreasonable to me and if it's true it
means that readers are exposed to far less vandalism than a uniform
model would suggest.

In any case— I don't say any of this to criticize the mechanics of
your work. I'm able to point these things out because you were clear
about what you measured, more so than some other analysis has been
(including my own, at times). But I do think that it's important that
we are careful not to describe our work in ways that will cause
laymen to over-generalize, and that we keep in mind that most readers
are not researchers, and that they desperately want the kind of pat
open-and-shut answers that we won't be able to even begin providing
until the study of Wikipedia is far better understood.


Likewise, users of Wikipedia research should be forewarned that
researchers are apt to use simple words like "vandalism" when they are
really measuring something far more specific and that surprising
correlations between what is actually being measured and the things it
is being measured against may produce misleading conclusions.


Cheers!



chiesa.marco at gmail

Aug 20, 2009, 9:37 AM

Post #4 of 37 (4103 views)
Permalink
Re: How much of Wikipedia is vandalized? 0.4% of Articles [In reply to]

On Thu, Aug 20, 2009 at 12:06 PM, Robert Rohde<rarohde [at] gmail> wrote:
>
> Given the nature of the approximations I made in doing this analysis I
> suspect it is more likely that I have somewhat underestimated the
> vandalism problem rather than overestimated it, but as I said in the
> beginning I'd like to believe I am in the right ballpark. If that's
> true, I personally think that having less than 0.5% of Wikipedia be
> vandalized at any given instant is actually rather comforting. It's
> not a perfect number, but it would suggest that nearly everyone still
> gets to see Wikipedia as intended rather than in a vandalized state.
> (Though to be fair I didn't try to figure out if the vandalism
> occurred in more frequently visited parts or not.)
>
Thanks for the excellent analysis, Robert. Just to give an idea of
what 0.4% means in practice, you can think of it in terms of one
country, 12 US counties, 33 Italian municipalities, 147 French
municipalities, or 1 Pope.

Cruccone



jwales at wikia-inc

Aug 20, 2009, 9:38 AM

Post #5 of 37 (4096 views)
Permalink
Re: How much of Wikipedia is vandalized? 0.4% of Articles [In reply to]

Robert Rohde wrote:
> When one downloads a dump file, what percentage of the pages are
> actually in a vandalized state?
>
> This is equivalent to asking, if one chooses a random page from
> Wikipedia right now, what is the probability of receiving a vandalized
> revision?

Is there a possibility of re-running the numbers to include traffic
weightings?

I would hypothesize from experience that if we adjust the "random page"
selection to account for traffic (to get a better view of what people
are actually seeing) we would see slightly different results.

I think we would see a lot less (percentagewise) vandalism that persists
for a really long time for precisely the reason you identified: most
vandalism that lasts a long time, lasts a long time because it is on
obscure pages that no one is visiting. That doesn't mean it is not a
problem, but it does change some thinking about what kinds of tools are
needed to deal with that problem.

I'm not sure what else would change.





thekohser at gmail

Aug 20, 2009, 9:59 AM

Post #6 of 37 (4107 views)
Permalink
Re: How much of Wikipedia is vandalized? 0.4% of Articles [In reply to]

While the time and effort that went into Robert Rohde's analysis is
certainly extensive, the outcomes are based on so many flawed
assumptions about the nature of vandalism and vandalism reversion
that one publicizes the key "finding" of a 0.4% vandalism rate at
one's peril.

http://en.wikipedia.org/w/index.php?title=John_McCain&diff=169808394&oldid=169720853
11 hours
Reverted with no tags.

http://en.wikipedia.org/w/index.php?title=Maria_Cantwell&diff=prev&oldid=160400298
46 days
Reverted with note: "Undid revision 160400298 by 75.133.82.218"
By the way, there was a two-minute vandalism incident in the interim,
so in many cases, just because an analyst finds a "recent and short"
incident, he or she may be completely missing a longer-term incident.

http://en.wikipedia.org/w/index.php?title=Ted_Stevens&diff=prev&oldid=170850508
There goes your "rvv" theory. In this case, "rvv" was a flag for even more
preposterous vandalism.

The notion that these are lightly-watched or lightly-edited articles is a
bit difficult to swallow, since they are the biographical articles about
three United States senators. These articles were analyzed by an
independent team of volunteers, and we found that the 100 senatorial
articles were in deliberate disrepair about 6.8% of the time, a figure
vastly different from Rohde's. Certainly, one could argue that
articles about political figures may be vandalized more often, but one might
also counter that argument with the assumption that "more eyes" ought to be
watching these articles and repairing them. More detail here:

http://www.mywikibiz.com/Wikipedia_Vandalism_Study

Admittedly, there were some minor flaws with our study's methodology, too.
These are reviewed on the Discussion page. But, as with Rohde's assessment,
if anything, we may have underrepresented the problem at 6.8%.

I remain unimpressed with Wikipedia's accuracy rate, and I am bewildered why
"flagged revisions" have not been implemented yet.

Greg


nawrich at gmail

Aug 20, 2009, 10:14 AM

Post #7 of 37 (4110 views)
Permalink
Re: How much of Wikipedia is vandalized? 0.4% of Articles [In reply to]

On Thu, Aug 20, 2009 at 12:59 PM, Gregory Kohs <thekohser [at] gmail> wrote:

> While the time and effort that went into Robert Rohde's analysis is
> certainly extensive, the outcomes are based on so many flawed assumptions
> about the nature of vandalism and vandalism reversion, publicize at one's
> peril the key "finding" of a 0.4% vandalism rate.
>
>
> http://en.wikipedia.org/w/index.php?title=John_McCain&diff=169808394&oldid=169720853
> 11 hours
> Reverted with no tags.
>

The best part about that little exchange is:
http://en.wikipedia.org/w/index.php?title=John_McCain&diff=next&oldid=169906715

wherein a revert was made returning the vandalism, followed by another when
the editor noticed his error.

I don't think Robert made any firm conclusions on the meaning of his
data; he notes all the caveats that others have since emphasized, and
admits to likely underestimating vandalism. I read the 0.4% as
representing the approximate proportion of articles containing
vandalism in an English Wikipedia snapshot; that is quite different
from the amount of time specific articles stay in a "vandalized"
state. Given the difficulty of accurately analyzing this sort of
data, no firm conclusions can be drawn; but certainly its more
informative than a Wikipedia Review analysis of a relatively small
group of articles in a specific topic area.

Nathan


erikzachte at infodisiac

Aug 20, 2009, 10:23 AM

Post #8 of 37 (4096 views)
Permalink
How much of Wikipedia is vandalized? 0.4% of Articles [In reply to]

There is another way to detect 100% reverts. It won't catch manual
reverts that are not 100% accurate, but most vandal patrollers will
use undo and the like.

For every revision calculate the md5 checksum of the content. Then
you can easily look back, say, 100 revisions to see whether this
checksum occurred earlier. It is efficient and unambiguous.

This will work for any Wikipedia for which a full archive dump is
available.
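
A minimal sketch of that scan in Python (illustrative names, assuming
the revisions of a page are available in order as wikitext strings):

import hashlib

def find_identity_reverts(revision_texts, window=100):
    """Yield (i, j) pairs where revision i is byte-identical to an
    earlier revision j within `window` steps, i.e. a perfect revert;
    revisions j+1 .. i-1 are the ones that were undone."""
    digests = []  # md5 of each revision, in order
    for i, text in enumerate(revision_texts):
        digest = hashlib.md5(text.encode("utf-8")).hexdigest()
        for j in range(i - 1, max(0, i - window) - 1, -1):
            if digests[j] == digest:
                yield (i, j)
                break
        digests.append(digest)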




Erik Zachte





thekohser at gmail

Aug 20, 2009, 10:30 AM

Post #9 of 37 (4099 views)
Permalink
Re: How much of Wikipedia is vandalized? 0.4% of Articles [In reply to]

Nathan said:

"...but certainly its (sic) more informative than a Wikipedia Review
analysis of a relatively small group of articles in a specific topic area."

And you are certainly entitled to a flawed opinion based on incorrect
assumptions, such as ours being a "Wikipedia Review" analysis. But, nice
try at a red herring argument.

Greg


andrew.gray at dunelm

Aug 20, 2009, 10:35 AM

Post #10 of 37 (4092 views)
Permalink
Re: How much of Wikipedia is vandalized? 0.4% of Articles [In reply to]

2009/8/20 Erik Zachte <erikzachte [at] infodisiac>:
> There is another way to detect 100% reverts. It won't catch manual reverts
> that are not 100 accurate but most vandal patrollers will use undo, and the
> like.
>
> For every revision calculate md5 checksum of content. Then you can easily
> look back say 100 revisions to see whether this checksum occurred earlier.
> It is efficient and unambiguous.

A slightly less effective method would be to use the page size in
bytes; this won't give precise one-to-one matching, but since I
believe it's already calculated in the data it might well be quicker.

One other false positive here: edit warring where one or both sides is
using undo/rollback. You'll get the impression of a lot of vandalism
without there necessarily being any.
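
As a toy sketch of that variant (hypothetical names; size acts as a
cheap prefilter and a full-text comparison confirms the match):

def candidate_reverts(revisions, window=100):
    """revisions: ordered list of (size_in_bytes, text) tuples.
    Yields (i, j) where revision i is byte-identical to an earlier
    revision j within `window` steps. Note the caveat above: edit
    wars using undo/rollback also produce identical revisions."""
    for i, (size, text) in enumerate(revisions):
        for j in range(i - 1, max(0, i - window) - 1, -1):
            old_size, old_text = revisions[j]
            if old_size == size and old_text == text:  # cheap reject first
                yield (i, j)
                break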

--
- Andrew Gray
andrew.gray [at] dunelm



removed at example

Aug 20, 2009, 10:43 AM

Post #11 of 37 (4099 views)
Permalink
Re: How much of Wikipedia is vandalized? 0.4% of Articles [In reply to]

On Thu, Aug 20, 2009 at 11:23 AM, Erik Zachte <erikzachte [at] infodisiac>wrote:

> There is another way to detect 100% reverts. It won't catch manual reverts
> that are not 100 accurate but most vandal patrollers will use undo, and the
> like.
>
> [snip]
>
> Erik Zachte

Luca's WikiTrust could easily reveal this info.


nawrich at gmail

Aug 20, 2009, 10:55 AM

Post #12 of 37 (4098 views)
Permalink
Re: How much of Wikipedia is vandalized? 0.4% of Articles [In reply to]

On Thu, Aug 20, 2009 at 1:30 PM, Gregory Kohs <thekohser [at] gmail> wrote:

> Nathan said:
>
> "...but certainly its (sic) more informative than a Wikipedia Review
> analysis of a relatively small group of articles in a specific topic area."
>
> And you are certainly entitled to a flawed opinion based on incorrect
> assumptions, such as ours being a "Wikipedia Review" analysis. But, nice
> try at a red herring argument.
>
> Greg
>

Well, you can understand where I would get that idea - since the URL you
provided had "Wikipedia Review members" performing the research, until you
changed it a few minutes ago.

http://www.mywikibiz.com/index.php?title=Wikipedia_Vandalism_Study&diff=90806&oldid=89479

My point (which might still be incorrect, of course) was that an analysis
based on 30,000 randomly selected pages was more informative about the
English Wikipedia than 100 articles about serving United States Senators.

Nathan


thekohser at gmail

Aug 20, 2009, 11:43 AM

Post #13 of 37 (4092 views)
Permalink
Re: How much of Wikipedia is vandalized? 0.4% of Articles [In reply to]

Apologies to Nathan regarding the "Wikipedia Review" description. The
analysis team was, indeed, recruited via Wikipedia Review; however,
almost all of the participants in the research have since departed or
reduced their participation in Wikipedia Review to such a degree that
I don't personally consider it to have been a "Wikipedia Review"
effort at all. I allowed my personal opinions to interfere with my
recollection of the facts, though, and that's not kosher. Again, I
hope you'll accept my apology.

I still maintain, however, that any study of the accuracy of or the
vandalized nature of Wikipedia content will be far more reliable and
meaningful if human assessment is the underlying mechanism of analysis,
rather than a "bot" or "script" that will simply tally up things. I think
that Rohde's design was inherently flawed, and I'm happy that Greg Maxwell
and I both immediately recognized the danger of running off and "reporting
the good news", as Sue Gardner was apparently ready to do immediately.

As I said, I feel that Rohde proceeded with research based on several
highly questionable assumptions, while the "100 Senators" research
rather carefully outlined a research plan that carried very few
assumptions, other than trusting the analysts to intelligently
recognize vandalism. Nathan, by praising Rohde's work and disparaging
my own, you seem to be suggesting that you would prefer to live
inside a giant mountain composed of sticks and twigs, rather than in
a small, pleasantly furnished 12' x 12' room. I just don't understand
that line of thinking. I'd rather have a small bit of reliable data
based on a stable premise than a giant pile of data based on an
unstable premise.

Greg


wikimail at inbox

Aug 20, 2009, 2:10 PM

Post #14 of 37 (4094 views)
Permalink
Re: How much of Wikipedia is vandalized? 0.4% of Articles [In reply to]

On Thu, Aug 20, 2009 at 1:55 PM, Nathan <nawrich [at] gmail> wrote:
>
> My point (which might still be incorrect, of course) was that an analysis
> based on 30,000 randomly selected pages was more informative about the
> English Wikipedia than 100 articles about serving United States Senators.


Any automated method of finding vandalism is doomed to failure. I'd say its
informativeness was precisely zero.

Greg's analysis, on the other hand, was informative, but it was targeted at
a much different question than Robert's.

"if one chooses a random page from Wikipedia right now, what is the
probability of receiving a vandalized revision" The best way to answer that
question would be with a manually processed random sample taken from a
pre-chosen moment in time. As few as 1000 revisions would probably be
sufficient, if I know anything about statistics, but I'll let someone with
more knowledge of statistics verify or refute that. The results will depend
heavily on one's definition of "vandalism", though.

On Thu, Aug 20, 2009 at 12:38 PM, Jimmy Wales <jwales [at] wikia-inc> wrote:
>
> Is there a possibility of re-running the numbers to include traffic
> weightings?
>

definitely should be done


> I would hypothesize from experience that if we adjust the "random page"
> selection to account for traffic (to get a better view of what people
> are actually seeing) we would see slightly different results.
>

I think we'd see drastically different results.


> I think we would see a lot less (percentagewise) vandalism that persists
> for a really long time for precisely the reason you identified: most
> vandalism that lasts a long time, lasts a long time because it is on
> obscure pages that no one is visiting.


Agreed. On the other hand, I think we'd also see that pages with more
traffic are more likely to be vandalized.

Of course, this assumes a valid methodology. Using "admin rollback,
the undo function, the revert bots, various editing tools, and
commonly used phrases like "rv", "rvv", etc." to find vandalism is
heavily skewed toward vandalism that doesn't last very long (or at
least doesn't last very many edits). It's basically useless.


andrew.gray at dunelm

Aug 20, 2009, 2:57 PM

Post #15 of 37 (4098 views)
Permalink
Re: How much of Wikipedia is vandalized? 0.4% of Articles [In reply to]

2009/8/20 Gregory Maxwell <gmaxwell [at] gmail>:

> Going back to your simple study now: The analysis of vandalism
> duration and its impact on readers makes an assumption about
> readership which we know to be invalid. You're assuming a uniform
> distribution of readership: That readers are just as likely to read
> any random article. But we know that the actual readership follows a
> power-law (long-tail) distribution. Because of the failure to consider
> traffic levels we can't draw conclusions on how much vandalism readers
> are actually exposed to.

We're also assuming a uniform distribution of vandalism, as it were.
There are a number of different types of vandalism: obscene
defacement, malicious alteration of factual content, meaningless test
edits of a character or two, schoolkids leaving messages for each
other...

...and each has a different impact on the reader.

This has two implications:

a) It seems safe to assume that replacing the entire article with
"john is gay" is going to get spotted and reverted faster, on average,
than an edit providing a plausible-sounding but entirely fictional
history for a small town in Kansas. So, any changes in the pattern of
the *content* of vandalism are going to lead to changes in the
duration and thus overall frequency of it, even if the number of
vandal edits is constant.

b) We can easily compare the difference in effect for vandalism left
on differently trafficked pages for various times - roughly speaking,
time * traffic = number of readers affected. If some vandalism is
worse than others, we could thus also calculate some kind of
intensity metric - one hundred people viewing enormous genital
piercing images on [[Kitten]] is probably worse than ten thousand
people viewing "asdfdfggfh" at the end of a paragraph in the same
article.

I'm not sure how we'd go ahead with the second one, but it's an
interesting thing to think about.
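
A toy illustration of that arithmetic, with entirely hypothetical
numbers and an invented severity scale:

def intensity(duration_hours, views_per_hour, severity):
    # time * traffic = readers exposed; weight by how bad it is
    return duration_hours * views_per_hour * severity

# One hour of a shock image on a busy page vs. four days of
# "asdfdfggfh" on the same page:
print(intensity(1, 100, 100.0))  # 10000.0
print(intensity(96, 100, 0.1))   # 960.0

By this metric the brief shock-image incident still dominates.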

--
- Andrew Gray
andrew.gray [at] dunelm



rarohde at gmail

Aug 20, 2009, 3:36 PM

Post #16 of 37 (4095 views)
Permalink
Re: How much of Wikipedia is vandalized? 0.4% of Articles [In reply to]

On Thu, Aug 20, 2009 at 2:10 PM, Anthony<wikimail [at] inbox> wrote:
> On Thu, Aug 20, 2009 at 1:55 PM, Nathan <nawrich [at] gmail> wrote:
>>
>> My point (which might still be incorrect, of course) was that an analysis
>> based on 30,000 randomly selected pages was more informative about the
>> English Wikipedia than 100 articles about serving United States Senators.
>
>
> Any automated method of finding vandalism is doomed to failure. I'd say its
> informativeness was precisely zero.
>
> Greg's analysis, on the other hand, was informative, but it was targeted at
> a much different question than Robert's.
>
> "if one chooses a random page from Wikipedia right now, what is the
> probability of receiving a vandalized revision" The best way to answer that
> question would be with a manually processed random sample taken from a
> pre-chosen moment in time. As few as 1000 revisions would probably be
> sufficient, if I know anything about statistics, but I'll let someone with
> more knowledge of statistics verify or refute that. The results will depend
> heavily on one's definition of "vandalism", though.

Only in dreadfully obvious cases can you look at a revision by itself
and know it contains vandalism. If the goal is really to characterize
whether any vandalism has persisted in an article from any time in the
past, then one really needs to look at the full edit history to see
what has been changed / removed over time.

Even at the level of randomly sampling 1000 revisions, doing a real
evaluation of the full history is likely to be impractical for any
manual process.

If however you restrict yourself to asking whether 1000 edits
contributed vandalism, then you have a relatively manageable task, and
one that is more closely analogous to the technical program I set up.
If it helps one can think of what I did as trying to characterize
reverts and detect the persistence of "new vandalism" rather than
"vandalism" in general. And of course, only "new vandalism" could be
fixed by an immediate rollback / revert anyway.

Qualitatively I tend to think that vandalism that has persisted
through many intervening revisions is in a rather different category
than new vandalism. Since people rarely look at or are aware of an
article's ancient past, such persistent vandalism is at that point
little different than any other error in an article. It is something
to be fixed, but you won't usually be able to recognize it as a
malicious act.

> On Thu, Aug 20, 2009 at 12:38 PM, Jimmy Wales <jwales [at] wikia-inc> wrote:
>>
>> Is there a possibility of re-running the numbers to include traffic
>> weightings?
>>
>
> definitely should be done

Does anyone have a nice comprehensive set of page traffic aggregated
at, say, a month level? The raw data used by stats.grok.se, etc. is
binned hourly, which opens one up to issues of short-term
fluctuations, but I'm not at all interested in downloading 35 GB of
hourly files just to construct my own long-term averages.
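
The aggregation itself would be straightforward if one could stomach
the download. A sketch, assuming the standard pagecounts line format
of "project page_title count bytes" and gzipped hourly files (names
illustrative):

import gzip
from collections import Counter

def monthly_views(hourly_files, project="en"):
    """Sum hourly pagecount files into per-page monthly totals.
    Each line is assumed to look like:
        en Main_Page 242332 4737756101
    (project, page title, request count, bytes transferred)."""
    totals = Counter()
    for path in hourly_files:  # e.g. all files for one month
        with gzip.open(path, "rt", errors="replace") as f:
            for line in f:
                parts = line.split(" ")
                if len(parts) == 4 and parts[0] == project:
                    totals[parts[1]] += int(parts[2])
    return totals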

>> I would hypothesize from experience that if we adjust the "random page"
>> selection to account for traffic (to get a better view of what people
>> are actually seeing) we would see slightly different results.
>>
>
> I think we'd see drastically different results.

If I had to make a prediction, I'd expect one might see numerically
higher rates of vandalism and shorter average durations, but otherwise
qualitatively similar results given the same methodology. I agree
though that it would be worth doing the experiment.

>> I think we would see a lot less (percentagewise) vandalism that persists
>> for a really long time for precisely the reason you identified: most
>> vandalism that lasts a long time, lasts a long time because it is on
>> obscure pages that no one is visiting.
>
> Agreed. On the other hand, I think we'd also see that pages with more
> traffic are more likely to be vandalized.
>
> Of course, this assumes a valid methodology. Using "admin rollback, the
> undo
> function, the revert bots, various editing tools, and commonly used
> phrases like "rv", "rvv", etc." to find vandalism is heavily skewed toward
> vandalism that doesn't last very long (or at least doesn't last very many
> edits). It's basically useless.

Yes, as I acknowledged above, "new vandalism". My personal interest
is also skewed in that direction. If you don't like it and don't find
it useful, feel free to ignore me and/or do your own analysis.
Vandalism that has persisted through many revisions is a qualitatively
different critter than most new vandalism. It's usually hard to
identify, even by a manual process, and is unlikely to be fixed except
through the normal editorial process of review, fact-checking, and
revision. When vandalism is "new" people are at least paying
attention to it in particular, and all vandalism starts out that way.
Perhaps it would be more useful if you think of this work as a
characterization of revert statistics?

Anyway, I provided my data point and described what I did so others
could judge it for themselves. Regardless of your opinion, it
addressed an issue of interest to me, and I would hope others also
find some useful insight in it.

-Robert Rohde



thomas.dalton at gmail

Aug 20, 2009, 3:49 PM

Post #17 of 37 (4092 views)
Permalink
Re: How much of Wikipedia is vandalized? 0.4% of Articles [In reply to]

2009/8/20 Robert Rohde <rarohde [at] gmail>:
> I am supposed to be taking a wiki-vacation to finish my PhD thesis and
> find a job for next year.  However, this afternoon I decided to take a
> break and consider an interesting question recently suggested to me by
> someone else:
> [snip]

That's an interesting bit of research, but, as you say, it is very
crude. This study seems to have a better methodology, although it has
a much smaller sample:

http://en.wikipedia.org/wiki/User:Aetheling/Vandalism_survival

If we could do that survey again with a large sample, it would be very
interesting indeed.



thomas.dalton at gmail

Aug 20, 2009, 3:51 PM

Post #18 of 37 (4091 views)
Permalink
Re: How much of Wikipedia is vandalized? 0.4% of Articles [In reply to]

2009/8/20 Jimmy Wales <jwales [at] wikia-inc>:
> Robert Rohde wrote:
>> When one downloads a dump file, what percentage of the pages are
>> actually in a vandalized state?
>>
>> This is equivalent to asking, if one chooses a random page from
>> Wikipedia right now, what is the probability of receiving a vandalized
>> revision?
>
> Is there a possibility of re-running the numbers to include traffic
> weightings?

I'd like to see that data too. I'm sure you are right, vandalism
doesn't last as long on popular pages, but it would be very
interesting to see how much quicker it is reverted and how popular a
page needs to be for that to apply (or if it is a gradual
improvement).



wikimail at inbox

Aug 20, 2009, 3:53 PM

Post #19 of 37 (4099 views)
Permalink
Re: How much of Wikipedia is vandalized? 0.4% of Articles [In reply to]

On Thu, Aug 20, 2009 at 6:36 PM, Robert Rohde <rarohde [at] gmail> wrote:

> On Thu, Aug 20, 2009 at 2:10 PM, Anthony<wikimail [at] inbox> wrote:
> > "if one chooses a random page from Wikipedia right now, what is the
> > probability of receiving a vandalized revision" The best way to answer
> that
> > question would be with a manually processed random sample taken from a
> > pre-chosen moment in time. As few as 1000 revisions would probably be
> > sufficient, if I know anything about statistics, but I'll let someone
> with
> > more knowledge of statistics verify or refute that. The results will
> depend
> > heavily on one's definition of "vandalism", though.
>
> Only in dreadfully obvious cases can you look at a revision by itself
> and know it contains vandalism. If the goal is really to characterize
> whether any vandalism has persisted in an article from any time in the
> past, then one really needs to look at the full edit history to see
> what has been changed / removed over time.
>

I wouldn't suggest looking at the edit history at all, just the most recent
revision as of whatever moment in time is chosen. If vandalism is found,
then and only then would one look through the edit history to find out when
it was added.


> > Of course, this assumes a valid methodology. Using "admin rollback, the
> > undo
> > function, the revert bots, various editing tools, and commonly used
> > phrases like "rv", "rvv", etc." to find vandalism is heavily skewed
> toward
> > vandalism that doesn't last very long (or at least doesn't last very many
> > edits). It's basically useless.
>
> Yes, as I acknowledged above, "new vandalism".


"New vandalism" which has not yet been reverted wouldn't be included.


> My personal interest
> is also skewed in that direction. If you don't like it and don't find
> it useful, feel free to ignore me and/or do your own analysis.


I do. I also feel free to criticize your methods publicly, since you
decided to share them publicly.


> Anyway, I provided my data point and described what I did so others
> could judge it for themselves. Regardless of your opinion, it
> addressed an issue of interest to me, and I would hope others also
> find some useful insight in it.


And I presented my criticism, which hopefully others will find some
useful insight in as well.


thomas.dalton at gmail

Aug 20, 2009, 3:57 PM

Post #20 of 37 (4108 views)
Permalink
Re: How much of Wikipedia is vandalized? 0.4% of Articles [In reply to]

2009/8/20 Anthony <wikimail [at] inbox>:
> I wouldn't suggest looking at the edit history at all, just the most recent
> revision as of whatever moment in time is chosen.  If vandalism is found,
> then and only then would one look through the edit history to find out when
> it was added.

That only works if the article is very well referenced and you have
all the references and are willing to fact-check everything. Otherwise
you will miss subtle vandalism like changing the date of birth by a
year.



wikimail at inbox

Aug 20, 2009, 4:09 PM

Post #21 of 37 (4099 views)
Permalink
Re: How much of Wikipedia is vandalized? 0.4% of Articles [In reply to]

On Thu, Aug 20, 2009 at 6:57 PM, Thomas Dalton <thomas.dalton [at] gmail>wrote:

> 2009/8/20 Anthony <wikimail [at] inbox>:
> > I wouldn't suggest looking at the edit history at all, just the most
> recent
> > revision as of whatever moment in time is chosen. If vandalism is found,
> > then and only then would one look through the edit history to find out
> when
> > it was added.
>
> That only works if the article is very well referenced and you have
> all the references and are willing to fact-check everything. Otherwise
> you will miss subtle vandalism like changing the date of birth by a
> year.


No need for the article to be referenced at all, but yes, it would be
time consuming, or at least person-time consuming. On the other hand,
it'd answer the question in a way that an automated process never
could (assuming I've got my statistical analysis right, anyway:
http://www.raosoft.com/samplesize.html seems to suggest a 99%
confidence level for 664 random samples out of 3 million, but I'm not
sure what "response distribution" means).


rarohde at gmail

Aug 20, 2009, 4:13 PM

Post #22 of 37 (4090 views)
Permalink
Re: How much of Wikipedia is vandalized? 0.4% of Articles [In reply to]

On Thu, Aug 20, 2009 at 3:57 PM, Thomas Dalton<thomas.dalton [at] gmail> wrote:
> 2009/8/20 Anthony <wikimail [at] inbox>:
>> I wouldn't suggest looking at the edit history at all, just the most recent
>> revision as of whatever moment in time is chosen. If vandalism is found,
>> then and only then would one look through the edit history to find out when
>> it was added.
>
> That only works if the article is very well referenced and you have
> all the references and are willing to fact-check everything. Otherwise
> you will miss subtle vandalism like changing the date of birth by a
> year.

It's not just facts. There are many ways to degrade the quality of an
article (such as removing entire sections) that would be invisible if
one looks at only one revision.

Anthony seems to be talking about a question of article accuracy
(unless I am misreading him). That is an overlapping issue with
addressing vandalism, but there are a significant number of ways to
commit vandalism that nonetheless have nothing to do with impairing
the resulting article's accuracy.

-Robert Rohde



thomas.dalton at gmail

Aug 20, 2009, 4:20 PM

Post #23 of 37 (4094 views)
Permalink
Re: How much of Wikipedia is vandalized? 0.4% of Articles [In reply to]

2009/8/21 Anthony <wikimail [at] inbox>:
> On Thu, Aug 20, 2009 at 6:57 PM, Thomas Dalton <thomas.dalton [at] gmail>wrote:
>
>> 2009/8/20 Anthony <wikimail [at] inbox>:
>> > I wouldn't suggest looking at the edit history at all, just the most
>> recent
>> > revision as of whatever moment in time is chosen.  If vandalism is found,
>> > then and only then would one look through the edit history to find out
>> when
>> > it was added.
>>
>> That only works if the article is very well referenced and you have
>> all the references and are willing to fact-check everything. Otherwise
>> you will miss subtle vandalism like changing the date of birth by a
>> year.
>
>
> No need for the article to be referenced at all, but yes, it would be time
> consuming, or at least person-time consuming.

You mean you could go and find references for the information
yourself? I suppose you could, but that is completely impractical.

>On the other hand, it'd
> answer the question, in a way that an automated process never could do
> (assuming I've got my statistical analysis right, anyway:
> http://www.raosoft.com/samplesize.html seems to suggest a 99% confidence
> level for 664 random samples out of 3 million, but I'm not sure what
> "response distribution" means).

The site looks like it is for surveys made up of yes/no questions; I
don't think it is going to apply to this.



wikimail at inbox

Aug 20, 2009, 4:37 PM

Post #24 of 37 (4096 views)
Permalink
Re: How much of Wikipedia is vandalized? 0.4% of Articles [In reply to]

On Thu, Aug 20, 2009 at 7:13 PM, Robert Rohde <rarohde [at] gmail> wrote:

> On Thu, Aug 20, 2009 at 3:57 PM, Thomas Dalton<thomas.dalton [at] gmail>
> wrote:
> > 2009/8/20 Anthony <wikimail [at] inbox>:
> >> I wouldn't suggest looking at the edit history at all, just the most
> recent
> >> revision as of whatever moment in time is chosen. If vandalism is
> found,
> >> then and only then would one look through the edit history to find out
> when
> >> it was added.
> >
> > That only works if the article is very well referenced and you have
> > all the references and are willing to fact-check everything. Otherwise
> > you will miss subtle vandalism like changing the date of birth by a
> > year.
>
> It's not just facts. There are many ways to degrade the qualify of an
> article (such as removing entire sections) that would be invisible if
> one looks at only one revision.


I guess that's true. People could be removing facts, for instance,
which wouldn't be apparent by looking at one revision. So such an
analysis would potentially understate actual vandalism. But at least
we'd know in which direction the percentage is potentially wrong. And
anecdotally, I don't think the understatement would be significant.

There's also the question of whether or not we want to count an article
which had a fact removed a few years ago and never re-added to be a
"vandalized revision".

Anthony seems to be talking about a question of article accuracy
> (unless I am misreading him).


I'm attempting to best answer the question "if one chooses a random page
from Wikipedia right now, what is the probability of receiving a
vandalized revision", which I take to have nothing whatsoever to do with the
number of reverts.


> That is overlapping issue with
> addressing vandalism, but there are a significant number of ways to
> commit vandalism that nonetheless have nothing to do with impairing
> the resulting article's accuracy.


Significant number? I can only think of a handful.


wikimail at inbox

Aug 20, 2009, 4:43 PM

Post #25 of 37 (4092 views)
Permalink
Re: How much of Wikipedia is vandalized? 0.4% of Articles [In reply to]

On Thu, Aug 20, 2009 at 7:20 PM, Thomas Dalton <thomas.dalton [at] gmail>wrote:

> 2009/8/21 Anthony <wikimail [at] inbox>:
> > On Thu, Aug 20, 2009 at 6:57 PM, Thomas Dalton <thomas.dalton [at] gmail
> >wrote:
> >
> >> 2009/8/20 Anthony <wikimail [at] inbox>:
> >> > I wouldn't suggest looking at the edit history at all, just the most
> >> recent
> >> > revision as of whatever moment in time is chosen. If vandalism is
> found,
> >> > then and only then would one look through the edit history to find out
> >> when
> >> > it was added.
> >>
> >> That only works if the article is very well referenced and you have
> >> all the references and are willing to fact-check everything. Otherwise
> >> you will miss subtle vandalism like changing the date of birth by a
> >> year.
> >
> >
> > No need for the article to be referenced at all, but yes, it would be
> time
> > consuming, or at least person-time consuming.
>
> You mean you could go and find references for the information
> yourself? I suppose you could, but that is completely impractical.
>

My God. If a few dozen people couldn't easily determine to a relatively
high degree of certainty what portion of a mere 0.03% of Wikipedia's
articles are *vandalized*, how useless is Wikipedia?

>On the other hand, it'd
> > answer the question, in a way that an automated process never could do
> > (assuming I've got my statistical analysis right, anyway:
> > http://www.raosoft.com/samplesize.html seems to suggest a 99% confidence
> > level for 664 random samples out of 3 million, but I'm not sure what
> > "response distribution" means).
>
> The site looks like it is for surveys made up of yes/no questions, I
> don't think it is going to apply to this.
>

"Is this article vandalized?" is a yes/no question...
