python · search · lucene · nlp · lsa

How to build a conceptual search engine?


I would like to build an internal search engine (I have a very large collection of thousands of XML files) that is able to map queries to concepts. For example, if I search for "big cats", I would want highly ranked results to return documents with "large cats" as well. But I may also be interested in having it return "huge animals", albeit at a much lower relevancy score.

I'm currently reading through the Natural Language Processing with Python book, and it seems WordNet has some word mappings that might prove useful, though I'm not sure how to integrate that into a search engine. Could I use Lucene to do this? How?

From further research, it seems "latent semantic analysis" (LSA) is relevant to what I'm looking for, but I'm not sure how to implement it.

Any advice on how to get this done?


Solution

  • I'm not sure how to integrate that into a search engine. Could I use Lucene to do this? How?

    Step 1. Stop.

    Step 2. Get something to work.

    Step 3. By then, you'll understand more about Python and Lucene and other tools and ways you might integrate them.

    Don't start by trying to solve integration problems. Software can always be integrated. That's what an Operating System does. It integrates software. Sometimes you want "tighter" integration, but that's never the first problem to solve.

    The first problem to solve is to get your search or concept thing or whatever it is to work as a dumb-old command-line application. Or a pair of applications knit together by passing files around, or knit together with OS pipes, or something.

    Later, you can try to figure out how to make the user experience seamless.

    But don't start with integration and don't stall because of integration questions. Set integration aside and get something to work.
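To make "dumb-old command-line application" concrete, here is a minimal sketch of what that first working thing could look like. It is built entirely on assumptions: a hand-rolled synonym table stands in for WordNet, whitespace tokenization stands in for real XML parsing, and the weights below 1.0 are an arbitrary way of marking weaker conceptual matches ("huge animals" for "big cats"), not anything from a real library.

```python
import sys

# Hand-rolled synonym table standing in for WordNet; weights below 1.0
# mark weaker conceptual matches ("huge" for "big", "animals" for "cats").
SYNONYMS = {
    "big": {"big": 1.0, "large": 1.0, "huge": 0.5},
    "cats": {"cats": 1.0, "felines": 1.0, "animals": 0.5},
}

def expand(term):
    """Return the weighted alternatives for a query term."""
    return SYNONYMS.get(term, {term: 1.0})

def score(query, text):
    """Sum, per query term, the best-weighted alternative found in the text."""
    words = set(text.lower().split())
    return sum(
        max((w for alt, w in expand(term).items() if alt in words), default=0.0)
        for term in query.lower().split()
    )

if __name__ == "__main__" and len(sys.argv) > 2:
    # Dumb CLI: score every file named on the command line against the query.
    query, paths = sys.argv[1], sys.argv[2:]
    for path in paths:
        with open(path) as f:
            print(score(query, f.read()), path)
```

Because it prints one `score path` line per document, it knits to other tools exactly as described above, e.g. `python search.py "big cats" docs/*.xml | sort -rn | head`.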
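The latent semantic analysis idea from the question can also be prototyped as just such a standalone script. This is a toy sketch using only numpy: build a term-document count matrix, take a truncated SVD, and rank documents by cosine similarity in the reduced "concept" space. The four-document corpus and the rank k=2 are illustrative assumptions; a real collection would need XML tokenization, TF-IDF weighting, and a sensibly chosen rank.

```python
import numpy as np

# Toy corpus; each "document" is just a string.
docs = [
    "big cats are large cats",
    "large cats hunt at night",
    "huge animals roam the plains",
    "the stock market fell today",
]

# Term-document count matrix (rows = terms, columns = documents).
vocab = sorted({w for d in docs for w in d.split()})
index = {w: i for i, w in enumerate(vocab)}
A = np.zeros((len(vocab), len(docs)))
for j, d in enumerate(docs):
    for w in d.split():
        A[index[w], j] += 1

# Truncated SVD: keep k latent "concept" dimensions.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T  # each row: one document in concept space

def query_vec(q):
    """Fold a query into the same k-dimensional concept space."""
    v = np.zeros(len(vocab))
    for w in q.split():
        if w in index:
            v[index[w]] += 1.0
    return v @ U[:, :k]

def cosine(a, b):
    na, nb = np.linalg.norm(a), np.linalg.norm(b)
    return 0.0 if na == 0.0 or nb == 0.0 else float(a @ b) / float(na * nb)

def rank(q):
    """Return document indices, best conceptual match first."""
    qv = query_vec(q)
    return sorted(range(len(docs)), key=lambda j: cosine(qv, doc_vecs[j]), reverse=True)
```

`rank("big cats")` returns document indices ordered by similarity in the latent space rather than by raw term overlap, which is the "concept" behaviour the question asks about. Get something like this working end to end first; Lucene, WordNet, and integration can come later.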