We were expecting that we'd save a non-trivial amount of time by skipping the parsing phase in the server... but Matthew tells me that, as far as he can tell, it's slightly slower than when we were running the recogniser without semantics and creating the semantic forms on the server side! I don't understand this at all. Will run some offline tests tomorrow and see if I can spot any obvious time-sink.
Tuesday, 13 April 2010
Success, of sorts! I have a first version of the Nuance 9/server integration checked in, and Matthew tells me it runs correctly on his machine. (I don't yet have a full Nuance 9 installation here.) Everything appears to do what it's supposed to: the original grammar is compiled into a Nuance 9 GrXML grammar, with the semantics transformed into string-concatenation semantics that put together a string representation of the semantic form. This is passed from the MRCP process to the dialogue server, which unpacks the strings, reconstructs the real semantic forms, and then passes them to downstream processing.
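To make the unpacking step concrete, here's a minimal sketch of what the dialogue server's side of the bargain looks like. The "[key=value,...]" encoding and the function name are invented for illustration; Regulus's actual string format for semantic forms is richer than this.

```python
# Hypothetical sketch of the server-side unpacking step: the MRCP
# process hands over a flat string that the grammar assembled by
# string concatenation, and the dialogue server reconstructs a
# structured semantic form from it. The encoding here is invented.

def decode_semantics(encoded):
    """Turn a string like '[action=switch_on,object=light]'
    back into a structured form (here, a plain dict)."""
    inner = encoded.strip("[]")
    pairs = (item.split("=", 1) for item in inner.split(","))
    return {key: value for key, value in pairs}

print(decode_semantics("[action=switch_on,object=light]"))
```

The important property is that the recogniser only ever manipulates strings, so the grammar side needs nothing beyond concatenation; all the real structure is rebuilt on the server.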
Monday, 12 April 2010
The last couple of weeks, I've been working with Matthew Fuchs (paideia.com) on getting Regulus to work with Nuance 9. We've had a bunch of problems, but we're making good progress and are nearly at the point of having things up and running.
I added code to check for recursivity, and it turns out to be easy enough, at least in the cases we've looked at so far, to fix the operationality criteria in grammar specialisation so that the generated grammars are non-recursive. It wasn't so easy, though, to use the 8.5 to 9 conversion tool, since it turned out that it didn't handle the 'concat' operator, completely central to Regulus semantics.
We wondered for a while if we'd either have to give up on using semantics in Nuance 9, and parse everything in Regulus, or else have Regulus directly generate Nuance 9 grammars - possible, but non-trivial. But I thought of a cute work-around over the weekend, which seems to solve the problem for now. Instead of generating the actual semantics, we generate strings which encode the semantics, and put them together with 'strcat' rather than 'concat' - the strcat operator is handled by the conversion tool.
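The encoding side of the trick can be sketched in a few lines. This is only an illustration of the principle, with an invented term syntax: each grammar rule contributes string fragments, and the only operation the grammar ever performs is plain concatenation (the analogue of Nuance's strcat), which the 8.5-to-9 conversion tool does handle.

```python
# Hypothetical sketch of the string-encoding workaround: rather than
# building the semantic form itself with 'concat', the grammar builds
# a string *representation* of it using only string concatenation.
# The "[key=value,...]" encoding is invented for illustration.

def strcat(*fragments):
    # Stand-in for the grammar-level strcat operator.
    return "".join(fragments)

# What the compiled rules would assemble during recognition of,
# say, "switch on the light":
encoded = strcat("[action=", "switch_on", ",object=", "light", "]")
print(encoded)  # -> [action=switch_on,object=light]
```

Since strcat survives the conversion tool intact, the Nuance 9 grammar can deliver this string unchanged, and the real semantic form is reconstructed downstream.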
Another reason why we were reluctant to generate Nuance 9 semantics directly is that, as far as we can make out, Nuance 9 doesn't have a tool for doing PCFG tuning, which is essential to good performance. But, with the current scheme, we can generate Nuance 8.5 grammars, do the PCFG tuning in 8.5, and then translate into 9. The conversion tool correctly carries across the generated probabilistic weights.
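The tuning itself is done by Nuance's 8.5 tooling, but for the record, the standard idea behind PCFG tuning is simple enough to sketch: estimate each rule's weight as its relative frequency among rules sharing the same left-hand side, with counts taken from parsed training utterances. The rule names and counts below are invented.

```python
# Sketch of relative-frequency PCFG weight estimation (the standard
# maximum-likelihood scheme), not Nuance's actual implementation.
# Rules are (lhs, rhs) pairs; counts come from a training corpus.

from collections import Counter

rule_counts = Counter({
    ("NP", ("DET", "N")): 30,
    ("NP", ("NP", "PP")): 10,
    ("PP", ("P", "NP")):  10,
})

# Total count for each left-hand side.
lhs_totals = Counter()
for (lhs, _), count in rule_counts.items():
    lhs_totals[lhs] += count

# Weight of a rule = its count / total count of its LHS.
weights = {rule: count / lhs_totals[rule[0]]
           for rule, count in rule_counts.items()}
print(weights[("NP", ("DET", "N"))])  # 30 / 40 = 0.75
```

It's exactly these per-rule weights that the conversion tool carries across into the Nuance 9 grammar.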
It's a bit of a Frankenstein's monster, but it does all seem to work! Matthew just told me that he was able to run the Nuance 9 grammar successfully in MRCP. Now we just need to integrate everything with the dialogue server, and we'll have the first version fully running. More soon, I hope...