Processing a large text file with ByteStrings and lazy Text

Tags:
Big Data
I'm looking to process a large Unicode text file that is over 6 GB and count the frequency of each unique word. I'm currently using Data.Map to track the count of each word, but it's taking far too much time and memory. Here's the code:
import Data.Text.Lazy (Text)
import qualified Data.Text.Lazy as T
import qualified Data.Text.Lazy.IO as TI
import Data.Map.Strict (Map, fromListWith, toList)
import Data.Word (Word16)
import System.Environment (getArgs)

-- Pair every word with a count of 1 and let fromListWith
-- sum the duplicates into a frequency map.
dictionate :: [Text] -> Map Text Word16
dictionate = fromListWith (+) . (`zip` repeat 1)

main :: IO ()
main = do
    [file, out] <- getArgs
    contents <- TI.readFile file
    writeFile out . show . toList . dictionate . T.words $ contents
I've tried Data.HashMap.Strict and it's a bit better, but still not good enough. What am I doing wrong here? What's the best way to do this in terms of speed and memory usage?
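For reference, here is a minimal sketch of the Data.HashMap.Strict variant mentioned above, assuming a strict left fold over the word stream instead of fromListWith (the Int count type and the countWords name are my own choices, not the original code):

import qualified Data.HashMap.Strict as HM
import Data.List (foldl')
import Data.Text.Lazy (Text)
import qualified Data.Text.Lazy as T
import qualified Data.Text.Lazy.IO as TI
import System.Environment (getArgs)

-- Fold the word list into a strict HashMap; insertWith in the
-- strict variant forces each updated count, so no chains of
-- unevaluated (+) thunks build up behind the map.
countWords :: [Text] -> HM.HashMap Text Int
countWords = foldl' (\m w -> HM.insertWith (+) w 1 m) HM.empty

main :: IO ()
main = do
    [file, out] <- getArgs
    contents <- TI.readFile file
    writeFile out . show . HM.toList . countWords . T.words $ contents

Because the file is read lazily and the fold consumes the words as they are produced, only the map itself (one entry per unique word) should stay resident, not the whole 6 GB input.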