Mailing List Archive: Wikipedia: Mediawiki

How to export all articles on an existing wiki.

 

 



jfoster81747 at gmail

Aug 9, 2013, 9:13 AM

Post #1 of 4
How to export all articles on an existing wiki.

So far I've tried dumpBackup.php, and that only gets part of the wiki. It
has been suggested that it's a PHP script timeout issue, and that's
possible. It is a large site with over 5000 articles, so the dump will be
large. I would appreciate any tips on how to do this. I've also looked
at XCloner, as someone else suggested, and it does not appear to provide
the functionality I need. It does fine on the existing static pages in
the /html directory, but it does not seem to be able to pull pages out of
a MediaWiki and place them into an .xml file for importing. Even something
that could break the backup up into pieces so that it gets everything
would help. Dumping the database (MySQL) will not work, as that is part
of the issue: the existing database is somewhat cluttered with old,
no-longer-relevant tables that slow it way down.
Any tips please?
Thanks
John
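
For reference, dumpBackup.php can also be run from the shell on the wiki
host, where the web server's PHP time limit normally does not apply, and
it accepts --start/--end page-id bounds that let a large dump be produced
in pieces. A minimal sketch, assuming a standard maintenance/ layout; the
install path, page-id ranges and file names below are placeholders:

    cd /path/to/wiki                          # hypothetical install path
    # Full history for pages with page_id below 2000, compressed output:
    php maintenance/dumpBackup.php --full --quiet \
        --start=1 --end=2000 --output=gzip:dump-part1.xml.gz
    # Next chunk; repeat until the highest page_id is covered:
    php maintenance/dumpBackup.php --full --quiet \
        --start=2000 --end=4000 --output=gzip:dump-part2.xml.gz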




emijrp at gmail

Aug 9, 2013, 9:25 AM

Post #2 of 4
Re: How to export all articles on an existing wiki. [In reply to]

If your MediaWiki version is not too old and the hosting is not very slow,
this may work: http://code.google.com/p/wikiteam/
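
For context, the WikiTeam dumper works from outside the wiki, crawling the
API, so the server's own PHP limits do not apply; weak hosting mainly means
a slow crawl, and interrupted runs can be resumed. A rough sketch of its
documented invocation, with the API URL as a placeholder:

    # Grab an XML dump of every page via the wiki's API (add --images for uploaded files):
    python dumpgenerator.py --api=http://yourwiki.example.org/w/api.php --xml
    # --curonly would dump only the latest revision of each page instead of full history.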




tom at hutch4

Aug 9, 2013, 10:35 AM

Post #3 of 4
Re: How to export all articles on an existing wiki. [In reply to]

Are you using importDump.php or the web interface, i.e. Special:Import?

Tom
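
For reference, importDump.php runs from the shell on the target wiki and
sidesteps the upload-size and timeout limits that apply to Special:Import.
A minimal sketch, with the install path and dump file name as placeholders:

    cd /path/to/target-wiki                    # hypothetical target install
    php maintenance/importDump.php dump.xml    # read the XML export produced by dumpBackup.php
    php maintenance/rebuildrecentchanges.php   # customary follow-up after a command-line import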



jfoster81747 at verizon

Aug 10, 2013, 6:24 AM

Post #4 of 4
Re: How to export all articles on an existing wiki. [In reply to]

On Fri, 2013-08-09 at 13:35 -0400, Tom wrote:
> Are you using importDump.php or the web interface, i.e. Special:Import?
>
> Tom
---------------------------------
I have tried both, and now a new issue is showing up. It seems the version
of MediaWiki on the remote server is not able to upload the .xml files from
the old server. An error message pops up saying:

> Expected <Pages>, got <mediawiki>
> Expected <Page>, got <siteinfo>
> Expected <Page>, got <sitename>

Don't know what's up now.
Thanks
John
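
For what it's worth, that error pattern suggests the import path on the new
server expects a different XML structure (<Pages>/<Page>) than what
MediaWiki's exporter actually writes (a <mediawiki> root containing
<siteinfo> and then <page> elements), for example because the file is being
fed to an extension's importer or to a tool built for a different schema.
A quick way to see what the dump really contains, with the file name as a
placeholder:

    head -n 4 dump.xml
    # A standard MediaWiki export begins roughly like:
    #   <mediawiki xmlns="http://www.mediawiki.org/xml/export-0.8/" version="0.8" ...>
    #     <siteinfo>
    #       <sitename>...</sitename>
    # If the importer on the target wiki reports that it expected <Pages>/<Page>
    # instead, the dump is being read by something that does not understand this format.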


