User:Reza1615/BOT/new interwiki.py

From Wikidata

Before running this code, please do the localization; otherwise it will edit your wiki in Farsi! :)

  • Important Note: Because the API URL has changed, you should make this edit in the API.py file!

Versions

  • Version 1: developing the bot
  • Version 2.00:
Makes items for multi-language links in one edit
Adds a namespace limitation
Escapes some API errors
Solves some minor bugs
This version is tested.
  • Version 2.10:
Solves some minor bugs
  • Version 3.00:
Adds the ability to merge single-language items, empty the old item, and save the list in zzinterwiki_import_merged_need_deleted.txt
  • Version 3.50:
Improves the removal of interwikis from the local wiki (fixes some minor bugs)
Removes non-existent interwikis which cause interwiki conflicts
  • Version 4.00:
After deploying the new JS tool, it is much better to leave NoInterwiki and NoItem pages alone, and most conflicts are caused by single-language items, so this version doesn't make new items for them.
  • Version 5.00:
Adds many features such as {{Pages with no Item}}, solves some minor bugs, and improves the bot's performance
  • Version 6.00:
Solved many minor bugs which stopped the bot!
  • Version 7.00:
Solved a bug with Wikivoyage and another bug in creating non-existent items (now it creates them :) )!
  • Version 8.00:
Fixes the label language name for some wikis, based on User:Ladsgroup/Labels
  • Version 9.00:
Uses the merge API and solves some minor bugs
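
Since version 9.00 the merge step goes through Wikidata's wbmergeitems API module instead of emptying one item by hand. As a minimal sketch (the endpoint and parameter names come from the public MediaWiki API; a real call must also be a POST with an edit token from a logged-in session), the request can be built like this:

```python
# Build a wbmergeitems request for merging one item into another.
# Illustrative only: a real call needs a CSRF token and an
# authenticated session, which are omitted here.
from urllib.parse import urlencode

WIKIDATA_API = "https://www.wikidata.org/w/api.php"

def build_merge_request(from_id, to_id):
    """Return the request URL for merging item from_id into item to_id."""
    params = {
        "action": "wbmergeitems",
        "fromid": from_id,
        "toid": to_id,
        "format": "json",
    }
    return WIKIDATA_API + "?" + urlencode(params)

print(build_merge_request("Q42", "Q222"))
```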

localization part

  • Orgine_lang= your local wiki's language prefix
  • Ignore_Templates=[u'template:speedy delete',u'template:merge'] is a list of templates (tags); the bot skips pages that carry them, like {{Speedy delete}} or {{Merge}}. You can find your wiki's equivalents by following their interwiki links to Farsi.
  • Confilict_Template=u'\n{{Interwiki conflict}}' is the conflict template, which is listed in Q6099744
  • Skip_new= the number of newest pages (articles) the bot should skip, to let patrolling users tag {{Speedy delete}} or {{Merge}} first, so that fewer language links are missed in Wikidata
  • Local_Remove_IWs_Summary=u'Bot: removing exist language links in [[wp:wikidata]]: '
  • Interwwiki_name=u' Interwiki(s)'
  • Orgine_Category_Name= the name prefix of Category: in your wiki, like رده: for fa.wiki
  • Orgine_Template_Name= the name prefix of Template: in your wiki, like الگو: for fa.wiki
  • Orgine_NUM= for languages that use another numbering system, like Farsi (u'۰۱۲۳۴۵۶۷۸۹'). If your language uses digits like English, leave it as u'0123456789'. These digits are used in the local wiki's edit summaries.
  • Remove_IWs_confilict_Summary=u'Bot: Removing Interwiki conflict template'
  • Confilict_Template_No_Item=u'\n{{Pages with no Item}}'
  • bot_edits=u'Bot: ' #Bot prefix for edit summary
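
To illustrate the Orgine_NUM setting: when the bot removes more than ten interwikis, it rewrites the count digit by digit into the local numeral system before putting it in the edit summary. A minimal sketch of that conversion, mirroring the replace loop in the script:

```python
# Convert Western digits in a count to local numerals, as the bot does
# with Orgine_NUM when building edit summaries.
Orgine_NUM = u'۰۱۲۳۴۵۶۷۸۹'  # Farsi digits, the script's default

def localize_number(count, local_digits=Orgine_NUM):
    text = str(count)
    for i in range(10):
        text = text.replace(u'0123456789'[i], local_digits[i])
    return text

print(localize_number(14))  # -> ۱۴
```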

for Toolserver accounts

In line 58 of the code, set your Toolserver username (YourUsername):

password_wiki = open("/home/YourUsername/pywikipedia/passfile", 'r')
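
The script expects the passfile to contain a single ('username', 'password') tuple, which login_wiki() parses by stripping the quotes and splitting on the parentheses and comma. A sketch of that parsing, with a hypothetical file content inlined in place of the open() call:

```python
# Parse a passfile of the form ("username", "password"),
# mirroring the split logic in login_wiki(). The string below is a
# hypothetical example of what open(passfile).read() would return.
raw = '("MyBot", "s3cret")'
raw = raw.replace('"', '').strip()                  # (MyBot, s3cret)
username = raw.split('(')[1].split(',')[0].strip()  # MyBot
password = raw.split(',')[1].split(')')[0].strip()  # s3cret
print(username, password)
```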

for enwiki

Orgine_lang='en'
Ignore_Templates=[u'template:speedy delete',u'template:merge']
Confilict_Template=u'\n{{Interwiki conflict}}'
Confilict_Template_No_Item=u'\n{{Pages with no Item}}'
bot_edits=u'Bot: '
Skip_new=200
Editing_Pause=3 # sleeping time in seconds between edits on Wikidata
Orgine_Category_Name=u'Category:'
Orgine_Template_Name=u'Template:'
Orgine_NUM=u'0123456789'
Local_Remove_IWs_Summary=u'Bot: removing exist language links in [[wp:wikidata]]: '
Interwwiki_name=u' Interwiki(s)'
Remove_IWs_confilict_Summary=u'Bot: Removing Interwiki conflict template'

code

  1 #!/usr/bin/python
  2 from __future__ import unicode_literals
  3 # -*- coding: utf-8  -*-
  4 # Reza (User:reza1615)
  5 # Distributed under the terms of the CC-BY-SA 3.0 .
  6 # -*- coding: utf-8 -*-
  7 import wikipedia,pagegenerators,config
  8 import query,time,login,codecs,re,string
  9 import wikidata
 10 from pywikibot import textlib
 11 version=u'9.01'
 12 
 13 #-------------------------------localization--------------------   
 14 Orgine_lang='fa'
 15 Ignore_Templates=[u'الگو:ویکی‌داده نه',u'الگو:رده بهتر',u'الگو:حذف سریع',u'الگو:حذف زمان‌دار/پیغام',u'الگو:پیشنهاد حذف ۲',u'الگو:ادغام با',u'الگو:ادغام شد در',u'الگو:درخواست ادغام',u'الگو:رده پنهان',u'الگو:Hiddencat',u'الگو:انتقال رده',u'الگو:تغییر مسیر رده',u'الگو:DB',u'الگو:DELETE',u'الگو:Db',u'الگو:Db-Reason',u'الگو:Db-because',u'الگو:Db-reason',u'الگو:Delbecause',u'الگو:Delete because',u'الگو:Delete',u'الگو:Deletebecause',u'الگو:Speedy delete',u'الگو:Speedy',u'الگو:Speedydelete',u'الگو:حذف فوری',u'الگو:حس',u'الگو:کاندید حذف',u'الگو:بدون منبع مدت‌دار',u'الگو:بدون منبع ۲',u'الگو:بدون منبع۲',u'الگو:Prod',u'الگو:نامزد حذف',u'الگو:Proposed deletion',u'الگو:حذف زماندار']
 16 Orgine_Category_Name=u'رده:'
 17 Orgine_Template_Name=u'الگو:'
 18 Orgine_NUM=u'۰۱۲۳۴۵۶۷۸۹'
 19 Confilict_Template=u'\n{{تداخل میان‌ویکی}}'
 20 Confilict_Template_No_Item=u'\n{{صفحه بدون آیتم}}'
 21 bot_edits=u'ربات: '  
 22 Skip_new=200
 23 Editing_Pause=0 # Pause in second between edits
 24 Local_Remove_IWs_Summary=u'ربات: حذف میان‌ویکی موجود در [[وپ:ود|ویکی‌داده]]: '
 25 Interwwiki_name=u' میان‌ویکی'
 26 Remove_IWs_confilict_Summary=u'ربات:حذف الگوی تداخل رفع شده'
 27 rep_label= {"no":"nb",
 28             "als":"gsw",
 29             "fiu-vro":"vro",
 30             "bat-smg":"sgs",
 31             "be-x-old":"be-tarask",
 32             "roa-rup":"rup",
 33             "zh-classical":"lzh",
 34             "zh-min-nan":"nan",
 35             "zh-yue":"yue",
 36             "crh":"crh-latn",
 37             "simple":"en"}
 38 #----------------------------------------------------------------
 39 
 40 #------------------- Summaries used on Wikidata; they should be in English ----
 41 creat_summary="Bot: Import page from {0}wiki".format(Orgine_lang)    
 42 update_link_summary="Bot: Update site links from {0}wiki".format(Orgine_lang)
 43 update_link_labels_summary="Bot: Update site links and labels from {0}wiki".format(Orgine_lang)
 44 update_Labels_summary=u'Bot: Update of labels.'
 45 #-------------------------------------------------------------------------------
 46 
 47 Orgine_Site=wikipedia.getSite(Orgine_lang,fam='wikipedia')
 48 repo = Orgine_Site.data_repository()
 49 SafeWork=True
 50 mysite=wikipedia.getSite('wikidata','wikidata')
 51 
 52 def login_wiki(mode):
 53     if mode==1:
 54         dataSite=wikipedia.getSite('wikidata','wikidata')    
 55     if mode==2:
 56         dataSite=wikipedia.getSite(Orgine_lang,'wikipedia')
 57     try:
 58         password_wiki = open("/home/YourUsername/pywikipedia/passfile", 'r')
 59     except:
 60         password_wiki = open(wikipedia.config.datafilepath(config.password_file), 'r')
 61     password_wiki=password_wiki.read().replace('"','').strip()    
 62     passwords=password_wiki.split(',')[1].split(')')[0].strip()
 63     usernames=password_wiki.split('(')[1].split(',')[0].strip()
 64     botlog=login.LoginManager(password=passwords,username=usernames,site=dataSite)
 65     botlog.login()
 66     
 67 def remove_old_IW (page,item_page):
 68     removed_iw,text,old_text=[],u'',u''
 69     global item_is_updtaed
 70     wikipedia.output(str(item_is_updtaed))
 71     if not item_is_updtaed:
 72        wikipedia.output("\03{lightred}++Because of some problem the item wasn't updated, so we need the traditional interwikis++\03{default}")
 73        return False
 74     My_Template=Confilict_Template
 75     WD_Data = wikipedia.DataPage(item_page)
 76     if not WD_Data.exists(): 
 77         return True
 78     WD_Dict=WD_Data.get()
 79     WD_Dict= WD_Dict['links']
 80     try:    
 81         text=page.get()
 82     except wikipedia.IsRedirectPage:
 83         page = page.getRedirectTarget()
 84         try:
 85             text=page.get()
 86         except:
 87             return True
 88     except:
 89         return True
 90     text=text.replace(u'[[ ',u'[[').replace(u' ]]',u']]').replace(u'[[ ',u'[[').replace(u' ]]',u']]').replace(u"\u200E"+u']]',u']]').replace(u'[['+u"\u200E",u'[[')
 91     text=text.replace(u'Template: ',u'Template:').replace(u'Category: ',u'Category:').replace(u'Wikipedia: ',u'Wikipedia:').replace(u'Portal: ',u'Portal:').replace(u'category: ',u'Category:')
 92     our_en_interwiki=u''
 93     if '[[en:' in text:
 94         our_en_interwiki = u'[[en:'+text.split(u'[[en:')[1].split(u']]')[0].strip()+u']]'
 95     text=text.replace(our_en_interwiki,our_en_interwiki.replace(u'_',u' '))
 96     if u'{{noexternallanglinks}}' in text:
 97         return True
 98     
 99     if not text:
100         save_file(u'[['+page.title()+u']]','error')    
101         return False
102         
103     text=text.replace(u'\r',u'')
104     interwikis=textlib.getLanguageLinks(text)
105     
106     if item_page!=page:
107         My_Template=Confilict_Template_No_Item
108         My_Template2=Confilict_Template
109     else:
110         My_Template=Confilict_Template 
111         My_Template2=Confilict_Template_No_Item  
112     old_text=text
113     if interwikis:    
114             for iw in interwikis:
115                     IW_site=interwikis[iw].site().language()
116                     IW_link=interwikis[iw].title()
117                     Lower_IW_link=IW_link[0].lower()+IW_link[1:]
118                     Text_IW_link=u'[['+IW_site+u':'+IW_link+u']]'
119                     Text_IW_link2=u'[['+IW_site+u': '+IW_link+u']]'
120                     Text_IW_link3=u'[['+IW_site+u' :'+IW_link+u']]'
121                     lower_Text_IW_link=u'[['+IW_site+u':'+Lower_IW_link+u']]'
122                     IW_prf=IW_site.replace(u'-',u'_')+u'wiki' 
123                     ineterwiki_row_link=IW_link.replace(u'_',u' ').strip()
124                     ineterwiki_row_link_target=redirect_find( ineterwiki_row_link,IW_site).replace(u'_',u' ').strip()
125                     if (IW_prf in WD_Dict) or ((IW_prf in WD_Dict) and (('#' in IW_link) or (u'#' in ineterwiki_row_link_target) or ineterwiki_row_link!=ineterwiki_row_link_target)):
126                         wikipedia.output('\03{lightred}- \03{default}'+Text_IW_link)
127                         text=text.replace(Text_IW_link+u'\n',u'').replace(Text_IW_link,u'').replace(Text_IW_link.replace(u' ',u'_'),u'').replace(Text_IW_link.replace(u'[[',u'[[‌'),u'').replace(Text_IW_link.replace(u']]',u'‌]]'),u'')
128                         text=text.replace(Text_IW_link2+u'\n',u'').replace(Text_IW_link2,u'').replace(Text_IW_link2.replace(u' ',u'_'),u'')
129                         text=text.replace(Text_IW_link3+u'\n',u'').replace(Text_IW_link3,u'').replace(Text_IW_link3.replace(u' ',u'_'),u'')
130                         text=text.replace(our_en_interwiki+u'\n',u'').replace(our_en_interwiki,u'').replace(our_en_interwiki.replace(u' ',u'_'),u'')
131                         text=text.replace(Text_IW_link.replace('Category:',u'category:').replace(' ',u'_')+u'\n',u'')
132 
133                         text=text.replace(lower_Text_IW_link+u'\n',u'').replace(lower_Text_IW_link,u'')       
134                         removed_iw.append(IW_site)
135     else:
136             if (My_Template.strip() in text) or (My_Template2.strip() in text):
137                     text=text.replace(My_Template.strip()+u'\n',u'').replace(My_Template,u'').replace(My_Template.strip(),u'').replace(My_Template2.strip()+u'\n',u'').replace(My_Template2,u'').replace(My_Template2.strip(),u'')
138                     wikipedia.output('\03{lightred} -'+My_Template+' \03{default}')
139                     text=text.replace(My_Template.strip()+u'\n'+My_Template2.strip(),My_Template.strip()).replace(My_Template2.strip()+u'\n'+My_Template.strip(),My_Template.strip())
140                     page.put(text.strip(),Remove_IWs_confilict_Summary)
141                     return True
142     interwikis=textlib.getLanguageLinks(text)
143     if  old_text!=text:   
144         if interwikis:
145             if not My_Template.strip() in text:    
146                 if page.namespace()!=10 : 
147                     text+=My_Template    
148                 else:
149                     if text.find(u'</noinclude>')!=-1:
 150                         if string.count(text,'</noinclude>')<2:
151                             text=text.replace(u'</noinclude>',My_Template+u'\n</noinclude>')
152                         else:
153                             text+=u'<noinclude>'+My_Template+u'\n</noinclude>'
154                     else:
155                         
156                         text=u'<noinclude>'+My_Template+u'\n</noinclude>'
157                 wikipedia.output('\03{lightblue} +'+My_Template+' \03{default}')
158                 wikipedia.output(u'+++++++++++++++++++')    
159                 try:
160                     text=text.replace(My_Template.strip()+u'\n'+My_Template2.strip(),My_Template.strip()).replace(My_Template2.strip()+u'\n'+My_Template.strip(),My_Template.strip())
161                     page.put(text.strip(),bot_edits+' + '+My_Template)
162                     
163                     return True
164                 except wikipedia.LockedPage:
165                     save_file(u'[['+page.title()+u']]','LockedPage')
166                     wikipedia.output(u'Skipping (locked page)')
167         else:
168             old_text2=text
169             text=text.replace(My_Template.strip()+u'\n',u'').replace(My_Template,u'').replace(My_Template.strip(),u'').replace(My_Template2.strip()+u'\n',u'').replace(My_Template2,u'').replace(My_Template2.strip(),u'')
170             if old_text2!=text:    
171                 wikipedia.output('\03{lightred} -'+My_Template+' \03{default}')
172                 text = re.sub('[\r\n]{3,}', "\n\n",text)    
173                 text=text.replace(u'<noinclude></noinclude>',u'').replace(u'<noinclude>\n</noinclude>',u'').replace(u'<noinclude>\n\n</noinclude>',u'')
174                 text=text.replace(My_Template.strip()+u'\n'+My_Template2.strip(),My_Template.strip()).replace(My_Template2.strip()+u'\n'+My_Template.strip(),My_Template.strip())
175                 page.put(text.strip(),Remove_IWs_confilict_Summary)
176                 return True    
177         
178         text = re.sub('[\r\n]{3,}', "\n\n",text)    
179         text=text.replace(u'<noinclude></noinclude>',u'').replace(u'<noinclude>\n</noinclude>',u'').replace(u'<noinclude>\n\n</noinclude>',u'')
180         if removed_iw:
181             if len(removed_iw)>10:
182                 removed_iw_NUM=str(len(removed_iw))
183                 for i in range(0,10):
184                      removed_iw_NUM = removed_iw_NUM.replace(u'0123456789'[i], Orgine_NUM[i])
185 
186                 removed_links_summary=removed_iw_NUM+Interwwiki_name
187             else:
188                 removed_links_summary=u', '.join(removed_iw)
189             if interwikis:
190                 text=text.replace(My_Template.strip()+u'\n'+My_Template2.strip(),My_Template.strip()).replace(My_Template2.strip()+u'\n'+My_Template.strip(),My_Template.strip())
191                 page.put(text.strip(),Local_Remove_IWs_Summary+removed_links_summary+u' + '+My_Template)
192             else:
193                 try:
194                     text=text.replace(My_Template.strip()+u'\n'+My_Template2.strip(),My_Template.strip()).replace(My_Template2.strip()+u'\n'+My_Template.strip(),My_Template.strip())
195                     page.put(text.strip(),Local_Remove_IWs_Summary+removed_links_summary)
196                 except wikipedia.LockedPage:
197                     save_file(u'[['+page.title()+u']]','LockedPage')
198                     wikipedia.output(u'Skipping(locked page)')
199             return True
200         else:
201             text=text.replace(My_Template.strip()+u'\n'+My_Template2.strip(),My_Template.strip()).replace(My_Template2.strip()+u'\n'+My_Template.strip(),My_Template.strip())
202             page.put(text.strip(),Remove_IWs_confilict_Summary)
203             return True 
204     else:
205         if interwikis:    
206             if not My_Template.strip() in text:    
207                 if page.namespace()!=10:    
208                     text+=My_Template
209                 else:
210                     if text.find(u'</noinclude>')!=-1:
211                         text=text.replace(u'</noinclude>',My_Template+u'\n</noinclude>')
212                     else:
213                         return False
214                 wikipedia.output('\03{lightblue} +'+My_Template+' \03{default}') 
215                 try:
216                     text=text.replace(My_Template.strip()+u'\n'+My_Template2.strip(),My_Template.strip()).replace(My_Template2.strip()+u'\n'+My_Template.strip(),My_Template.strip())
217                     page.put(text.strip(),bot_edits+' + '+My_Template)
218                     return True
219                 except wikipedia.LockedPage:
220                     save_file(u'[['+page.title()+u']]','LockedPage')
221                     wikipedia.output(u'Skipping (locked page)')
222 
223     return False
224 
225     
226 def save_file(case,type):
227     if type=='merge':
228         file = 'zzinterwiki_import_merged_need_deleted.txt'
229     elif type=='error':
230         file = 'zzinterwiki_import_errors.txt'    
231     elif type=='LockedPage':
232         file = 'zzinterwiki_Locked_Pages.txt'
233     else:
234         file = 'zzinterwiki_conflicts.txt'
235     try:        
236         file_text = codecs.open(file,'r' ,'utf8' )
237         file_text = file_text.read().strip()
238     except:
239         file_text=u''
240     if not case in file_text:    
241         with codecs.open(file ,mode = 'a',encoding = 'utf8' ) as f:
242                             f.write(u'\n'+case)
243 
244 def templatequery(pagelink):
245     temps=[]
246     pagelink=pagelink.split(u'#')[0].strip()
247     if pagelink==u'':
248         return False    
249     pagelink=pagelink.replace(u' ',u'_')
250     params = {
251             'action': 'query',
252             'prop':'templates',
253             'titles': pagelink,
254             'redirects': 1,
255             'tllimit':500,
256     }
257     try:
258         categoryname = query.GetData(params,Orgine_Site)
259         for item in categoryname[u'query'][u'pages']:
260             templateha=categoryname[u'query'][u'pages'][item][u'templates']
261             break
262         for temp in templateha:
263             temps.append(temp[u'title'].replace(u'_',u' '))         
264         return temps
265     except: 
266         return False
267         
268 def redirect_find( page_link,wiki):
269     page_link=page_link.replace(u' ',u'_')
270     site = wikipedia.getSite(wiki.replace(u'_',u'-'))
271     params = {
272         'action': 'query',
273         'redirects':"",
274         'titles': page_link
275     }
276     query_page = query.GetData(params,site)
277     try:
278         redirect_link=query_page[u'query'][u'redirects'][0]['to']
279         try:
280            hashpart=query_page[u'query'][u'redirects'][0]['tofragment']
281            return redirect_link+u'#'+hashpart
282         except:
283            return redirect_link
284     except:
285         if 'missing=""' in str(query_page):
286             return u''
287         else:
288             return page_link.replace(u'_',u' ')
289         
290 def Check_Page_Exists(page_link, wiki):
291     page_link=page_link.replace(u' ',u'_')
292 
293     site = wikipedia.getSite(wiki.replace(u'_',u'-'))
294     params = {
295         'action': 'query',
296         'prop':'info',
297         'titles': page_link
298     }
299     query_page = query.GetData(params,site)
300     try:
301         for i in query_page[u'query'][u'pages']:    
302             redirect_link=query_page[u'query'][u'pages'][i]['pageid']  
303             return True# page existed
304     except:
305         return False# page not existed
306         
307             
308 def get_interwikis(link,lang):
309     Orgine_Site=wikipedia.getSite(lang.replace(u'_',u'-'),fam='wikipedia')
310     if link.find('#')!=-1:
311         return False
312     if link==u'':
313         return False    
314     link=link.replace(u' ',u'_')
315     appenddict={}
316     try:
317         params = {
318             'action': 'query',
319             'prop': 'langlinks',
320             'titles': link,
321             'redirects': 1,
322             'lllimit':500,
323         }
324         pagename = query.GetData(params,Orgine_Site)    
325         for item in pagename[u'query'][u'pages']:
326             case=pagename[u'query'][u'pages'][item][u'langlinks']
327         for lang in case:
328             L_lang=lang['lang']
329             L_link=lang['*']
330             if not (L_lang in appenddict):
331                     if L_lang=='nb':
332                         L_lang='no'    
333                     appenddict[L_lang]=L_link 
334         return appenddict
335     except: 
336         return appenddict
337 
338 def check_item(wiki,link):
339     try:
340         site=wikipedia.getSite(wiki.replace(u'_',u'-'),fam='wikipedia')
341         page=wikipedia.Page(site,link)    
342         data=wikipedia.DataPage(page)
343         items=data.get()
344         #return False
345     except wikipedia.NoPage:
346         return True
347     except:
348         wikipedia.output("\03{lightred}Item has been created. Skipping...\03{default}")
349         return False
350 
351 def set_lable(data,new_langs,item):    
352     dic_item=data.get()
353     old=dic_item['links']
354     changes=False
355     for cases in new_langs:
356         if cases=='nbwiki':
357             cases='nowiki' 
358         dic_item['links'][cases]=new_langs[cases]    
359         if old!=dic_item['links']:    
360             wikipedia.output('added '+cases+'......................')    
361     for langs in dic_item['links']:
362         if ('voyage' in langs) or ('commons' in langs) or ('source' in langs) or ('book' in langs) or ('quote' in langs) or ('species' in langs)  or ('versity' in langs) or ('tionary' in langs) or ('news' in langs):
 363             wikipedia.output("--> \03{lightred}"+langs+"\03{default}\03{lightblue} is skipped because it isn't supported!\03{default}")
364             continue
365         if langs=='nbwiki':
366             langs='nowiki'
367         try:
368            value=dic_item['links'][langs]['name'].strip()
369         except:
370            value=dic_item['links'][langs].strip()
371         lang=langs.replace('wiki','').replace('_','-')
372         try:
373            value=unicode(value,'UTF8')
374         except:
375            pass
376         value=value.replace(u"_",u" ")
377         if lang !='fa':
378                 value = value.split(u'(')[0].strip()    
379         if lang =='es' or lang=='pt' or lang=='pt-br':
380             value = value.replace(u"Anexo:",u"")
381         if lang == 'cs':
382             value = value.replace(u"Príloha:",u"")
383         if lang == 'de-ch':
384             value = value.replace(u"ß",u"ss")
385 
386         if lang in rep_label:
387             lang=rep_label[lang]
388 
389         try :
390             a=dic_item['label'][lang]
391         except:
392             item.labels[lang] = value
393             changes=True
394             wikipedia.output('\03{lightgreen}for '+value+' added as label of '+lang+'\03{default}')
395     if changes:
396         changes=True
397     else:
398         wikipedia.output("Doesn't need any update!")    
399         changes=False   
400     return item,changes    
401 
402 def Update_data(data_add,appenddict):
403         item = wikidata.api.getItemById(data_add.title())
404         summary=''
405         confilict={}
406         new_langs={}
407         for lang in appenddict:
408             if lang=='nb':
409                 lang='no'                     
410             site_lang=lang
411             interlangLinks=appenddict[lang]
412             status=check_item(site_lang,interlangLinks)
413             if not status:
414                 wikipedia.output(site_lang+' has confilict!')
415                 confilict[site_lang]=interlangLinks
416                 continue
417             summary=update_link_summary
418             item.sitelinks[lang+"wiki"] = interlangLinks
419             new_langs[lang+"wiki"] = interlangLinks
420 
421         if confilict:
422             item_confilict=u'* [['+data_add.title()+u']] Confilict > '
423             for i in confilict:
424                 item_confilict+=u'[[:'+i+u':'+confilict[i]+u'|'+i+u'wiki]]-'
425             save_file(item_confilict[:-1],'conflict')
426             if SafeWork:
427                     wikipedia.output('\03{lightred}-->'+data_add.title()+' Passed! because of safe mode and conflict\03{default}')
428                     return False
429         global item_is_updtaed
430         if summary:
431             item,changes=set_lable(data_add,new_langs,item)    
432             if changes:
433                 summary=update_link_labels_summary
434             try:
435                 wikidata.api.save(item, summary)
436                 wikipedia.output('\03{lightblue}Page '+data_add.title()+' : '+summary+'\03{default}')
437                 return True
438             except Exception,e:
439                 try:
440                     wikipedia.output('\03{lightred}Page '+data_add.title()+' Passed! error was : '+str(e)+' \03{default}')
441                     item_is_updtaed=False
442                 except:
443                     wikipedia.output('\03{lightred}Page '+data_add.title()+'Passed!\03{default}')
444                     item_is_updtaed=False
445         return False
446 
447 
448 def find_diff(my_data,interwiki_links,namespace):
449         dictionary = my_data.get()  
450         dictionary= dictionary['links'] 
451         appenddict={}
452         for lang in interwiki_links:
453             if lang=='nb':
454                 lang='no'
455             L_lang=lang.replace(u'-',u'_')
456             L_link=interwiki_links[lang].replace(u'category:',u'Category:')
457             if not (L_lang in appenddict):    
458                 if not ((L_lang+'wiki') in dictionary):    
459                     appenddict[L_lang]=L_link
460                     wikipedia.output('\03{lightblue}+ '+L_lang +u' > '+L_link+' \03{default}')
461         if appenddict:
462                 done=Update_data(my_data,appenddict)
463                 if done:
464                    return True           
465         else:
466                 wikipedia.output(str(appenddict))
467                 item = wikidata.api.getItemById(my_data.title())
468                 new_langs={}
469                 item,changes=set_lable(my_data,new_langs,item)    
470                 if changes:
471                     summary=update_Labels_summary
472                     wikidata.api.save(item, summary)
473                     wikipedia.output('\03{lightblue}Page '+my_data.title()+' : '+summary+'\03{default}')    
474                     return True
475         return False
476 def merge_items(a,b):
477     #https://www.wikidata.org/w/api.php?action=wbmergeitems&fromid=Q42&toid=Q222
478     print a
479     print b
480     params = {
481             'action': 'wbmergeitems',
482             'fromid':a,
483             'toid': b,
484     }
485     mysite=wikipedia.getSite('wikidata','wikidata')
486     #if mysite:
487     try:
 488         merge_result = query.GetData(params,mysite)
 489         try:
 490             error = merge_result[u'error']
491             return error
492         except:
493             return True
494     except: 
495         return False
496         
497 def check_one_lang(my_data,interwiki_links,Orgine_lang):
498     other_wikis=False
499     Orgine_title=my_data.title()    
500     item = wikidata.api.getItemById(Orgine_title)
501     if len(item.sitelinks) == 1 and len(interwiki_links)>1:    
502         for iw in interwiki_links:
503             if iw==Orgine_lang:
504                 continue
505             other_wiki_link=redirect_find( interwiki_links[iw],iw)
506             if not other_wiki_link.strip():
507                 continue
508             other_wiki_page = wikipedia.Page(iw, other_wiki_link)
509             other_wiki_data = wikipedia.DataPage(other_wiki_page)
510             if other_wiki_data.exists():    
511                 other_wikis=True
512                 break
513         if other_wikis:
514             #----------------------------------------------Merge case---------------
515             wikipedia.output(u'\03{lightgreen}====Merging Items!====\03{default}')
516             merge_result=merge_items(Orgine_title,other_wiki_data.title())
517             if merge_result:
518                 return True 
519             else:
520                 return False 
521     return False   
522     
523 def run(preloadingGen,hasnew,CreatNew,Remove_IW):
524     if not CreatNew:
525        Our_Ignore_Templates=[]
526     else:
527        Our_Ignore_Templates=Ignore_Templates
528 
529     if hasnew:
530         pagecount=0
531     else:
532         pagecount=Skip_new+1
533     for Orgin_page in preloadingGen:
534         global item_is_updtaed
535         item_is_updtaed=True
536         item_page=Orgin_page
537         pagecount+=1      
538         other_wiki=False
539         Origin_namespace=Orgin_page.namespace()    
540         if not Origin_namespace in [0,4,10,12,14,100,102]:
541             continue
542         Orgine_title=Orgin_page.title()
543         wikipedia.output(u'Page: -------- '+Orgine_title+u' --------' )  
544         if not Orgin_page.exists():
 545              wikipedia.output(u'Page '+Orgine_title+ u' does not exist, so it is skipped!')
546              continue
547         # checking for ignore templates
548         if Orgine_title:
549         #try:
550             do_work=True
551             page_templates=templatequery(Orgine_title)
552             if page_templates:
553                 for template in Our_Ignore_Templates:
554                     if template in page_templates:
555                          do_work=False
 556                          wikipedia.output(u'\03{lightred}Page '+Orgine_title+u' had {{'+template.replace(Orgine_Template_Name,u'')+'}} so the bot skipped it!\03{default}')
557                          
558                          break    
559             if not do_work:
560                 continue
561             #----
562             interwiki_links=get_interwikis(Orgine_title,Orgine_lang)# get site's interwikis
563             interwiki_links[Orgine_lang]=Orgine_title    
564 
565             #------------------
566             my_page = wikipedia.Page(Orgine_Site, Orgine_title)    
567             my_data = wikipedia.DataPage(my_page)
568             if len(interwiki_links)==1:
569                 Remove_NOIW=True        
570                 if my_data.exists():# creating new item  
571                     #------------------------------------------------case 0 (NoIW-Item)---------------
572                     wikipedia.output(u'NoIW-Item= '+Orgine_title)
573                     wikipedia.output(u"Doesn't need update!")    
574                     continue    
                else:
                    #------------------------------------------------case 1 (NoIW-NoItem)---------------
                    if CreatNew:
                        if Origin_namespace>0 and Orgine_title.find(u'/')!=-1:
                            wikipedia.output(u'It is a subpage!')
                            continue
                        if pagecount < Skip_new:
                            wikipedia.output('Page new rank is \03{lightred}'+str(pagecount)+'\03{default} so the bot skips it!')
                            continue
                        wikipedia.output(u'NoIW-\03{lightred}NoItem\03{default}= '+Orgine_title)

                        for i in range(0,2):# retry once after re-login
                            try:
                                my_data.createitem(creat_summary)
                            except Exception:
                                if i<1:
                                    login_wiki(1)
                            time.sleep(1)
                            if my_data.exists():
                                break
                        if my_data.exists():
                            wikipedia.output(u'Created item= '+creat_summary)
                        else:# switch to mode 2 (temporary item)
                            save_file(u'[['+Orgine_title+u']]','error')
                            wikipedia.output(u'\03{lightred}Item creation failed..!\03{default}')
                        wikipedia.output(u'Sleeping for '+str(Editing_Pause)+' Seconds ....')
                        time.sleep(Editing_Pause)
                    else:
                        wikipedia.output(u"Page doesn't have an item or interwikis. After the new JS tool it is better not to create this kind of item!")
                        continue

            else:
                Remove_NOIW=True
                if my_data.exists():# normal update of an existing item
                    #------------------------------------------------case 2 (IW-Item) ---------------
                    wikipedia.output(u'IW-Item= '+Orgine_title)
                    try:
                        done=find_diff(my_data,interwiki_links,Origin_namespace)
                        if done:
                            wikipedia.output(u'\03{lightgreen}Item updated!\03{default}')
                        else:
                            wikipedia.output(u'\03{lightyellow}Item was not updated...!\03{default}')
                            done=check_one_lang(my_data,interwiki_links,Orgine_lang)
                            if not done:
                                wikipedia.output(u'\03{lightyellow}Item was not merged..!\03{default}')
                    except Exception:
                        save_file(u'[['+Orgine_title+u']]','error')
                        wikipedia.output(u'\03{lightyellow}Item was not updated and the bot skips it..!\03{default}')

                else:
                    other_wiki=False# reset so a stale value from the previous page is not reused
                    for iw in interwiki_links:
                        other_wiki_link=redirect_find(interwiki_links[iw],iw)
                        if not other_wiki_link.strip():
                            continue

                        other_wiki_Site=wikipedia.getSite(iw.replace(u'_',u'-'),fam='wikipedia')
                        other_wiki_page = wikipedia.Page(other_wiki_Site, other_wiki_link)
                        try:
                            other_wiki_data = wikipedia.DataPage(other_wiki_page)
                        except Exception:
                            continue
                        if other_wiki_data.exists():# the item already exists on another wiki
                            other_wiki=True
                            item_page=other_wiki_data
                            break
                    if other_wiki:
                        #----------------------------------------------case 3 (IW-OtherWikiItem)---------------
                        wikipedia.output(u'IW-\03{lightgreen}OtherWikiItem\03{default}= '+Orgine_title)
                        wikipedia.output(u'Updating item...')
                        wikipedia.output(iw+u':'+other_wiki_link)
                        try:
                            done=find_diff(other_wiki_data,interwiki_links,Origin_namespace)
                            if done:
                                wikipedia.output(u'\03{lightgreen}Item updated!\03{default}')
                            else:
                                done=check_one_lang(other_wiki_data,interwiki_links,Orgine_lang)
                                if not done:
                                    wikipedia.output(u'\03{lightyellow}Item was not merged..!\03{default}')
                        except Exception:
                            save_file(u'[['+Orgine_title+u']]','error')
                            wikipedia.output(u'\03{lightyellow}Item was not updated and the bot skips it..!\03{default}')
                    else:
                        #----------------------------------------------case 4 (IW-NoItem)---------------
                        wikipedia.output(u'IW-\03{lightred}NoItem\03{default}= '+Orgine_title)
                        sitelinks,labels = [],[]
                        for iw in interwiki_links:
                            inter_value=interwiki_links[iw].replace(u' ',u'_')
                            if iw !='fa':
                                inter_value = inter_value.split(u'(')[0].strip()
                            if iw =='es' or iw=='pt' or iw=='pt-br':
                                inter_value = inter_value.replace(u"Anexo:",u"")
                            if iw == 'cs':
                                inter_value = inter_value.replace(u"Príloha:",u"")
                            if iw == 'de-ch':
                                inter_value = inter_value.replace(u"ß",u"ss")
                            iw_l=iw
                            if iw in rep_label:# some wikis use a different label language code
                                iw_l=rep_label[iw]

                            sitelinks.append({"site": iw+"wiki", "title": interwiki_links[iw]})
                            labels.append({"language": iw_l, "value": inter_value.replace(u'_',u' ')})
                        values = {"sitelinks": sitelinks,"labels": labels}
                        wikipedia.output(u"Creating item for %s" % (Orgin_page.title()))
                        # iw and inter_value keep the values of the last loop iteration
                        my_page2 = wikipedia.Page(wikipedia.getSite(iw,fam='wikipedia'), inter_value)
                        my_data2 = wikipedia.DataPage(my_page2)
                        try:
                            my_data2.createitem(value = values, summary = creat_summary)
                        except Exception:
                            wikipedia.output(u"\03{lightred}Bot couldn't make an item for "+iw+" : "+inter_value+u" so it is passed!\03{default}")
                            save_file(u'[['+Orgine_title+u']]','error')
                        wikipedia.output(u'Sleeping for '+str(Editing_Pause)+' Seconds ....')
                        time.sleep(Editing_Pause)

            #---- Removing interwikis from the local wiki (the bot needs a flag on the local wiki!)

            if Remove_IW and Remove_NOIW:
                remove_result=remove_old_IW(Orgin_page,item_page)
                if not remove_result:
                    wikipedia.output(u"\03{lightpurple}Bot couldn't remove interwikis from the local wiki!\03{default}")
def main():
    wikipedia.config.put_throttle = 0
    gen=None
    wikipedia.put_throttle.setDelay()
    hasnew,preloadingGen,CreatNew,Remove_IW=False,False,False,True
    PageTitles,namespaces = [],''
    genFactory = pagegenerators.GeneratorFactory()
    for arg in wikipedia.handleArgs():
        if arg.startswith('-newcat'):
            arg=arg.replace(':','')
            if len(arg) == 7:
                genfa = pagegenerators.NewpagesPageGenerator(200, False, Orgine_Site,14)
            else:
                genfa = pagegenerators.NewpagesPageGenerator(int(arg[7:])+Skip_new, False, Orgine_Site,14)
            preloadingGen = pagegenerators.PreloadingGenerator(genfa,60)
            hasnew=True
            break
        elif arg.startswith('-newspace'):
            arg=arg.replace(':','')
            if len(arg) == 9:
                genfa = pagegenerators.NewpagesPageGenerator(2000, False, Orgine_Site,10)
            else:
                genfa = pagegenerators.NewpagesPageGenerator(200+Skip_new, False, Orgine_Site,int(arg[9:]))
            preloadingGen = pagegenerators.PreloadingGenerator(genfa,60)
            hasnew=True
            break
        elif arg.startswith('-new'):
            arg=arg.replace(':','')
            if len(arg) == 4:
                genfa = pagegenerators.NewpagesPageGenerator(200, False, Orgine_Site,0)
            else:
                genfa = pagegenerators.NewpagesPageGenerator(int(arg[4:])+Skip_new, False, Orgine_Site,0)
            preloadingGen = pagegenerators.PreloadingGenerator(genfa,60)
            hasnew=True
            break
        elif arg.startswith('-remove'):
            Remove_IW=False
        elif arg == '-autotitle':
            autoTitle = True
        elif arg.startswith('-page'):
            if len(arg) == 5:
                PageTitles.append(wikipedia.input(u'Which page do you want to change?'))
            else:
                PageTitles.append(arg[6:])
            break
        elif arg.startswith('-namespace:'):
            namespaces=int(arg[11:])
        elif arg.startswith('-force'):
            SafeWork = False
        elif arg.startswith('-file'):
            textfilename = arg[6:]
            if not textfilename:
                textfilename = wikipedia.input(
                    u'Please enter the local file name:')
            gen = pagegenerators.TextfilePageGenerator(textfilename,site=Orgine_Site)
        elif arg.startswith('-CreatNew'):
            CreatNew = True
        else:
            generator = genFactory.handleArg(arg)
            if generator:
                gen = generator
    if not gen and not preloadingGen and not PageTitles:# nothing to work on
        wikipedia.stopme()
        return
    if PageTitles:
        pages = [wikipedia.Page(Orgine_Site,PageTitle) for PageTitle in PageTitles]
        gen = iter(pages)
    if namespaces:
        gen = pagegenerators.NamespaceFilterPageGenerator(gen,namespaces)
    if not preloadingGen:
        preloadingGen = pagegenerators.PreloadingGenerator(gen,pageNumber = 60)
    run(preloadingGen,hasnew,CreatNew,Remove_IW)

if __name__ == "__main__":
    wikipedia.output(u'\03{lightpurple}      *******************************\03{default}')
    wikipedia.output(u'\03{lightpurple}      *     Code version is '+version+u'    *\03{default}')
    wikipedia.output(u'\03{lightpurple}      *******************************\03{default}')
    login_wiki(1)
    login_wiki(2)
    item_is_updtaed=True
    main()
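For reference, the `values` dictionary assembled in case 4 (IW-NoItem) pairs a `sitelinks` list with a `labels` list, as the compat `DataPage.createitem` wrapper expects. A minimal standalone sketch of that shape (the helper name, the example titles, and the `rep_label` entry are illustrative, not part of the bot):

```python
def build_item_values(interwiki_links, rep_label=None):
    """Assemble a sitelinks/labels payload like case 4 above.

    interwiki_links: dict mapping a language code to a page title.
    rep_label: optional site-code -> label-language map (cf. User:Ladsgroup/Labels).
    """
    rep_label = rep_label or {}
    sitelinks, labels = [], []
    for lang, title in interwiki_links.items():
        label_lang = rep_label.get(lang, lang)  # e.g. label 'als' pages in 'gsw'
        sitelinks.append({"site": lang + "wiki", "title": title})
        labels.append({"language": label_lang, "value": title.replace("_", " ")})
    return {"sitelinks": sitelinks, "labels": labels}

payload = build_item_values({"fa": "Tehran", "en": "Tehran"})
print(payload["sitelinks"][0])  # {'site': 'fawiki', 'title': 'Tehran'}
```

The real code additionally strips disambiguators and namespace prefixes such as `Anexo:` from the label value before appending it; this sketch only shows the payload structure.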