In the book Algorithms, fourth edition, by Robert Sedgewick, page 200 says: "for example, if you have 1GB of memory on your computer (1 billion bytes), you cannot fit more than about 32 million int values."
This confused me, because my own calculation gives a different number: 1,000,000,000 bytes / 4 bytes per int = 250 million int values.
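For reference, here is the same arithmetic as a minimal Java sketch (the class name and the reading of 1 GB as 10^9 bytes are my own assumptions, following the book's "1 billion bytes" parenthetical):

```java
// Sketch: how many 4-byte int values fit in 1 GB, taking 1 GB = 10^9 bytes.
public class IntCapacity {
    public static void main(String[] args) {
        long memoryBytes = 1_000_000_000L;              // 1 GB, per the book's "1 billion bytes"
        long intsThatFit = memoryBytes / Integer.BYTES; // Integer.BYTES == 4
        System.out.println(intsThatFit);                // prints 250000000, i.e. 250 million
    }
}
```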
How did the author get 32 million?
The author has acknowledged this as an error on the book's website; see the errata page: http://algs4.cs.princeton.edu/errata/errata-printing3.php