Last modified: 2014-04-13 14:39:22 UTC
I observed that Page(site, title).interwiki() doesn't work. I think this is because DataPage().get()['links'] returns something like {"nowiki": {"name": "Title", "badges": []}} where something like {"nowiki": "Title"} was expected. Another bug I observed is that Wikidata also returns the Commons link. I modified wikipedia.py line 4719 from "links[code]) for code in links]" to "links[code]['name']) for code in links if "commons" not in code]" and it works fine, but I think the condition 'if "commons" not in code' is not *always* a good solution.
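The shape change described above can be sketched with mock data (illustrative titles and codes only, not real API output):

```python
# Minimal mock of the nested sitelink structure the Wikidata API returns
# (dicts with 'name' and 'badges'), versus the flat {code: title} form
# the old wikipedia.py code expected.
links = {
    "nowiki": {"name": "Tittel", "badges": []},
    "enwiki": {"name": "Title", "badges": []},
    "commonswiki": {"name": "Title", "badges": []},
}

# Old assumption: links[code] is a plain title string.
# With the nested form, the title lives under 'name', and non-language
# entries such as 'commonswiki' have to be skipped:
titles = [links[code]["name"] for code in links if "commons" not in code]

print(sorted(titles))  # ['Title', 'Tittel']
```

This is the same list comprehension as the proposed line-4719 change, just run against a standalone dict.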
I assert that "" is the only correct solution, and it is right because that function keeps all links related to Wikipedias and does not expect a Commons link: getSite(code.replace('wiki', '').replace('_', '-'), fam='wikipedia') raises a NoSuchSite error if "commons" appears, because that language does not exist in the family. I evaluated the following result: https://www.wikidata.org/w/api.php?action=query&format=jsonfm&prop=revisions&rvprop=content&titles=Q90&pllimit=max and saw that the Commons link and the Wikivoyage links are included in the content. That means it is necessary to exclude "commons" and "voyage" links to avoid further errors being raised. So I propose the following change for line 4719 of wikipedia.py: links[code]['name']) for code in links if "commons" not in code and "voyage" not in code]
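Why the unfiltered loop raises can be shown with a small sketch. The language set and the getSite stand-in below are made up for illustration; the real resolution happens in the wikipedia family file:

```python
# Sketch of why sitelink dbnames for sister projects break getSite():
# the dbname is mapped to a language code, and 'commons'/'voyage'
# entries produce codes that are not Wikipedia languages.
KNOWN_LANGS = {"en", "no", "fr"}  # illustrative subset, not the real list

class NoSuchSite(Exception):
    pass

def getSite(code, fam="wikipedia"):
    # stand-in for wikipedia.getSite(); the real one returns a Site object
    if code not in KNOWN_LANGS:
        raise NoSuchSite("%s:%s" % (fam, code))
    return code

def lang_code(dbname):
    # the transformation quoted above
    return dbname.replace("wiki", "").replace("_", "-")

print(lang_code("enwiki"))        # 'en'       -> resolves fine
print(lang_code("commonswiki"))   # 'commons'  -> NoSuchSite
print(lang_code("enwikivoyage"))  # 'envoyage' -> NoSuchSite
```

Note that str.replace removes every "wiki" occurrence, so "enwikivoyage" becomes "envoyage", which is why filtering on 'if "voyage" not in code' catches those entries too.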
I meant to write: I assert that "links[code]['name']) for code in links if "commons" not in code]"
I made a dirty hack weeks ago in https://gerrit.wikimedia.org/r/#/c/85839/ when I found that problem. Maybe we should solve it in a generic way: filter the sites in the interlanguage method but keep them for some interwiki links.
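One possible shape for such a generic filter (a sketch, not the actual gerrit change; the sister-project list is illustrative and would need to be completed):

```python
# Keep only sitelinks that are Wikipedia language editions, instead of
# blacklisting "commons" and "voyage" substrings one by one. Wikivoyage
# dbnames like 'enwikivoyage' do not end in 'wiki', so the suffix check
# drops them; the explicit set drops sister wikis whose dbname does.
NON_LANGUAGE_WIKIS = {"commonswiki", "specieswiki", "metawiki",
                      "wikidatawiki", "mediawikiwiki"}  # assumed list

def wikipedia_titles(links):
    """Return {dbname: title} restricted to Wikipedia language editions."""
    return {code: data["name"] for code, data in links.items()
            if code.endswith("wiki") and code not in NON_LANGUAGE_WIKIS}

links = {
    "enwiki": {"name": "Paris", "badges": []},
    "commonswiki": {"name": "Paris", "badges": []},
    "enwikivoyage": {"name": "Paris", "badges": []},
}
print(wikipedia_titles(links))  # {'enwiki': 'Paris'}
```

The interlanguage method could then iterate over the filtered dict, while other callers that do want Commons or Wikivoyage links keep reading the unfiltered sitelinks.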