
Why should insurers pursue Big Data in the Cloud?

You can store all of the books available in the world in the cloud for less than a million dollars. One hundred terabytes of data, probably enough to hold all of the data in a midsized insurance company, costs about $2,600 per month. The point is that for big-data analysis, loading data into cloud storage on demand and analyzing it with powerful tools like Google BigQuery makes economic sense. When I visited Google, I was impressed when they ran a BigQuery query across all of Shakespeare's works and found, in less than a second, that the word "love" appears 2,135 times.
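A query of the kind described can be written against BigQuery's public Shakespeare sample table. The sketch below shows the query text (standard SQL) and a thin Python wrapper; actually running it requires the google-cloud-bigquery client library and GCP credentials, which are assumed and not shown here.

```python
# Word count over BigQuery's public Shakespeare sample table.
# The table name is the public sample dataset; the client object
# is assumed to be a google.cloud.bigquery.Client.
QUERY = """
SELECT SUM(word_count) AS occurrences
FROM `bigquery-public-data.samples.shakespeare`
WHERE LOWER(word) = 'love'
"""

def count_love(client):
    # Submit the query and return the single aggregate value.
    rows = client.query(QUERY).result()
    return next(iter(rows)).occurrences
```

The sample table stores one row per (word, corpus) pair with a precomputed `word_count`, which is why a simple SUM answers the question in well under a second.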
Loading data to the cloud

According to Wikibon's big-data statistics, 48 hours of video are uploaded to YouTube every minute, which translates to about 15 terabytes per hour. This got me to investigate how large amounts of data can be loaded into Google cloud storage. You can stream data in real time directly into Google BigQuery, which stores data in a columnar format, or you can load it as CSV or JSON files in batch jobs of up to 1 terabyte, with a limit of 10,000 load jobs per project per day. This is a lot of data storage capacity at an economical cost, and insurers should surely look at doing data analytics in the cloud.
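A batch load of the kind described is configured as a load job. The sketch below builds the JSON request body that BigQuery's jobs.insert REST endpoint accepts for loading a CSV file from Cloud Storage; the project, dataset, table, and bucket names are placeholders, not real resources.

```python
import json

def make_load_job_config(source_uri, project, dataset, table):
    """Build the request body for a CSV batch-load job
    (the `configuration.load` section of jobs.insert)."""
    return {
        "configuration": {
            "load": {
                "sourceUris": [source_uri],       # gs:// path(s) to load
                "sourceFormat": "CSV",            # CSV or NEWLINE_DELIMITED_JSON
                "skipLeadingRows": 1,             # skip the header row
                "autodetect": True,               # infer schema from the file
                "destinationTable": {
                    "projectId": project,
                    "datasetId": dataset,
                    "tableId": table,
                },
            }
        }
    }

# Placeholder names for illustration only.
body = make_load_job_config("gs://my-bucket/claims.csv",
                            "my-project", "insurance", "claims")
print(json.dumps(body, indent=2))
```

In practice the same configuration is usually built through a client library (for example, `LoadJobConfig` in the Python client) rather than raw JSON, but the fields map one-to-one.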
