I get the impression that (perhaps even more than Bluemix) this is what the Wolfram Language is looking to offer in the longer term.
Seems to me that the main pros and cons are two sides of the same coin:
With Wikipedia, there's no 'search filter' between you and the text. Adding an algorithmic layer of indirection between the user and the knowledge they're looking for opens the door to hidden biases.
If those biases are designed in your best interests, and the search is context-sensitive enough to present information in the form that is most useful and digestible to you, then that's a good thing. Otherwise, not. As with many problems in AI, the difficulty is that we're simply not very good at modelling human context yet.
Of course, we're already subject to this filter-bubble effect via search engines and social media. The current consensus seems to be that even more of it would not be a good thing for society.