Package edu.berkeley.nlp.lm.cache

Interface ContextEncodedLmCache

All Superinterfaces:
    java.io.Serializable

All Known Implementing Classes:
    ContextEncodedDirectMappedLmCache

public interface ContextEncodedLmCache extends java.io.Serializable
Method Summary

All Methods · Instance Methods · Abstract Methods

Modifier and Type    Method and Description

int      capacity()

float    getCached(long contextOffset, int contextOrder, int word, int hash, ContextEncodedNgramLanguageModel.LmContextInfo outputPrefix)
         Should return Float.NaN if the requested n-gram is not in the cache.

void     putCached(long contextOffset, int contextOrder, int word, float prob, int hash, ContextEncodedNgramLanguageModel.LmContextInfo outputPrefix)
Method Detail

getCached

float getCached(long contextOffset,
                int contextOrder,
                int word,
                int hash,
                ContextEncodedNgramLanguageModel.LmContextInfo outputPrefix)

Should return Float.NaN if the requested n-gram is not in the cache.

Parameters:
    contextOffset -
    contextOrder -
    word -
    hash -
    outputPrefix -
Returns:
putCached

void putCached(long contextOffset,
               int contextOrder,
               int word,
               float prob,
               int hash,
               ContextEncodedNgramLanguageModel.LmContextInfo outputPrefix)
capacity

int capacity()
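To illustrate the contract documented above (a fixed-capacity cache queried by context offset and word, returning Float.NaN on a miss), here is a minimal array-backed sketch. The LmContextInfo stub, the key-packing scheme, and selecting a slot from the caller-supplied hash are illustrative assumptions for this example only; they are not the berkeleylm implementation (see ContextEncodedDirectMappedLmCache for the real one).

```java
import java.io.Serializable;

// Hedged sketch of the ContextEncodedLmCache contract: not the
// library's implementation, just a direct-mapped illustration.
public class TinyLmCache implements Serializable {

    // Stand-in for ContextEncodedNgramLanguageModel.LmContextInfo
    // (assumed fields, for illustration only).
    public static class LmContextInfo {
        public long offset;
        public int order;
    }

    private final long[] keys;   // packed (contextOffset, word) keys; -1 = empty
    private final float[] probs; // cached log-probabilities
    private final int capacity;

    public TinyLmCache(int capacity) {
        this.capacity = capacity;
        this.keys = new long[capacity];
        this.probs = new float[capacity];
        java.util.Arrays.fill(keys, -1L);
    }

    public int capacity() { return capacity; }

    // Illustrative key packing only; collisions are simply overwritten.
    private static long key(long contextOffset, int word) {
        return (contextOffset << 20) ^ word;
    }

    // Returns Float.NaN on a miss, per the interface contract.
    public float getCached(long contextOffset, int contextOrder, int word,
                           int hash, LmContextInfo outputPrefix) {
        int slot = Math.floorMod(hash, capacity);
        return keys[slot] == key(contextOffset, word) ? probs[slot] : Float.NaN;
    }

    public void putCached(long contextOffset, int contextOrder, int word,
                          float prob, int hash, LmContextInfo outputPrefix) {
        int slot = Math.floorMod(hash, capacity);
        keys[slot] = key(contextOffset, word);
        probs[slot] = prob;
    }

    public static void main(String[] args) {
        TinyLmCache cache = new TinyLmCache(16);
        int hash = 7;
        // Miss before insertion: NaN signals "not cached".
        System.out.println(Float.isNaN(cache.getCached(42L, 2, 5, hash, null))); // prints true
        cache.putCached(42L, 2, 5, -1.5f, hash, null);
        // Hit after insertion.
        System.out.println(cache.getCached(42L, 2, 5, hash, null)); // prints -1.5
    }
}
```

Note the direct-mapped design: each n-gram hashes to exactly one slot, so a colliding putCached silently evicts the previous entry, which keeps lookups O(1) at the cost of a lower hit rate.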