Configuring MongoDB for Processing Big Data


Recently we moved from MySQL to MongoDB for our big-data workload.

Overview:

We have 275 GB (550 million records) of data per collection; after inserting it into MongoDB, it occupies 140 GB (storage size including indexes).

We have 25 such collections in a single database. Each collection has 30 fields, of which 16 are indexed.

Daily, we import up to 20 million records (10 GB) into each collection. From this, we need to fetch 100 million records across our collections, and we regularly need to run aggregate (group) queries over these 550 million records to generate live reports.
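The aggregate (group) reporting described above maps to MongoDB's aggregation pipeline. The sketch below builds such a pipeline as plain Python dictionaries; the collection, field names (`imported_at`, `category`, `amount`), and connection details are assumptions for illustration, not from the posting.

```python
# Hypothetical connection setup -- uncomment and adapt for a real deployment:
# from pymongo import MongoClient
# coll = MongoClient("mongodb://localhost:27017")["reports_db"]["events"]

# A minimal sketch of a grouped "live report" query. Putting an indexed
# $match first lets MongoDB narrow 550M records before grouping.
pipeline = [
    # Filter on an indexed field first (assumed field name).
    {"$match": {"imported_at": {"$gte": "2018-11-01"}}},
    # Group by a hypothetical category field: count rows and sum a value.
    {"$group": {
        "_id": "$category",
        "records": {"$sum": 1},
        "total": {"$sum": "$amount"},
    }},
    # Largest groups first for the report.
    {"$sort": {"records": -1}},
]

# On a live collection you would run, allowing spill-to-disk for big groups:
# results = list(coll.aggregate(pipeline, allowDiskUse=True))
```

`allowDiskUse=True` matters at this scale: without it, a `$group` stage that exceeds 100 MB of memory aborts the query.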


1. Suitable server configuration and MongoDB tuning (cores, RAM, version).

2. Aiming to generate reports in under 1 minute.
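For requirement 1, the server-side tuning usually starts with the `mongod` configuration file. The fragment below is an illustrative sketch, assuming a dedicated machine with around 64 GB of RAM; the paths and cache size are assumptions to adapt, not recommendations from the posting.

```yaml
# Illustrative mongod.conf -- all values are assumptions, tune to your hardware.
storage:
  dbPath: /data/mongodb
  wiredTiger:
    engineConfig:
      cacheSizeGB: 28          # WiredTiger default is ~50% of (RAM - 1 GB)
net:
  port: 27017
  bindIp: 127.0.0.1
operationProfiling:
  mode: slowOp
  slowOpThresholdMs: 100       # log queries slower than 100 ms to spot missing indexes
```

With 16 of 30 fields indexed and 10 GB of inserts per collection per day, index maintenance competes with queries for the WiredTiger cache, so RAM sizing and reviewing which indexes are actually used are the first levers for the sub-minute report target.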

Skills: Database Administration, Hadoop, Linux, NoSQL Couch & Mongo


About the Employer:
( 0 reviews ) Chennai, India

Project ID: #18099043

2 freelancers are bidding on average ₹24,549 for this job


Document database expert

₹25000 INR in 1 day
(12 Reviews)

Hi. Interesting use case you have. Let me first start by saying that if you are going to continue to import 10 GB worth of data to each collection on a daily basis, then there is no way that you can improve your perf More

₹24098 INR in 10 days
(0 Reviews)