
Is There A Really Efficient (FAST) Way To Read Large Text Files In Python?

I am looking to open and fetch data from a large text file in Python as fast as possible (it has almost 62,603,143 lines and is about 550 MB in size). As I don't want to stress my computer, I am processing it line by line.

Solution 1:

No, there is no faster way to process a file line by line, at least not from Python.

Your bottleneck is your hardware, not how you read the file. Python is already doing everything it can: it reads the file in larger buffered chunks and then splits those chunks on newlines.
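For reference, a minimal sketch of the buffered line-by-line pattern the answer describes. The file path, search string, and the count_matching_lines helper are hypothetical, purely for illustration; the key point is that iterating over the file object is already chunked and buffered under the hood.

    import sys

    def count_matching_lines(path, needle):
        """Scan a large text file line by line, counting lines containing `needle`."""
        count = 0
        # Iterating the file object reads the file in buffered chunks and yields
        # lines from each chunk; a larger buffer (here 1 MiB) may help slightly
        # on some systems, but the disk remains the limiting factor.
        with open(path, "r", encoding="utf-8", errors="replace",
                  buffering=1024 * 1024) as f:
            for line in f:
                if needle in line:
                    count += 1
        return count

    if __name__ == "__main__":
        # Usage: python scan.py bigfile.txt some_string
        print(count_matching_lines(sys.argv[1], sys.argv[2]))

Whatever per-line work you do inside the loop costs the same regardless of how the lines were read, so optimizing the read path beyond this buys very little.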

I suggest upgrading your disk to an SSD.

