With more than 950 million users, Facebook is collecting a lot of data. Every time you click a notification, visit a page, upload a photo, or check out a friend’s link, you’re generating data for the company to track. Multiply that by 950 million people, who spend on average more than 6.5 hours on the site every month, and you have a lot of information to deal with.
Here are some of the stats the company provided Wednesday to demonstrate just how big Facebook’s data really is:
- 2.5 billion content items shared per day (status updates + wall posts + photos + videos + comments)
- 2.7 billion Likes per day
- 300 million photos uploaded per day
- 100+ petabytes of disk space in one of FB’s largest Hadoop (HDFS) clusters
- 105 terabytes of data scanned every 30 minutes via Hive, Facebook’s SQL-like query layer for Hadoop
- 70,000 queries executed on these databases per day
- 500+ terabytes of new data ingested into the databases every day
“If you aren’t taking advantage of big data, then you don’t have big data, you have just a pile of data,” said Jay Parikh, Facebook’s VP of infrastructure, on Wednesday. “Everything is interesting to us.”
Parikh said the company is constantly trying to figure out how to better analyze and make sense of the data, including running extensive A/B tests on all potential updates to the site and making sure it responds to user input in real time.
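At its core, an A/B test like the ones Parikh describes compares how two buckets of users respond to two versions of a feature. As a rough illustration (not Facebook’s actual tooling; the function name and all the numbers below are hypothetical), a two-proportion z-test can tell you whether a difference in click-through rates is likely real or just noise:

```python
from math import sqrt, erf

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?
    Illustrative sketch only -- hypothetical numbers, not any
    real Facebook experiment."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # pooled conversion rate under the null hypothesis (no difference)
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# 100,000 users per bucket; B converts slightly more often than A
z, p = ab_test_z(conv_a=1200, n_a=100_000, conv_b=1310, n_b=100_000)
print(z, p)
```

At Facebook’s scale even tiny effects clear the significance bar quickly, which is part of why the company can afford to test nearly every change.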
“We’re growing fast, but everyone else is growing faster,” he said.